Android gestures over interactive widgets

My application contains an area filled with buttons. I would like to implement the activity in such a way that a fling gesture over the button area switches it to one of two other areas (using a ViewFlipper).

I tried two approaches to detecting the gestures. The first one uses a GestureDetector. However, touch events that occur over a Button never reach the activity's onTouchEvent method, so I could not forward them to the GestureDetector. In short, a failure.
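Roughly, that first attempt looked like this (a simplified sketch; the onFling body is just a placeholder). Touches that start on the Button never reach the activity's onTouchEvent, so the detector never sees them:

    package spk.sketchbook;

    import android.app.Activity;
    import android.os.Bundle;
    import android.view.GestureDetector;
    import android.view.MotionEvent;

    public class Main extends Activity {
        private GestureDetector gestureDetector;

        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main);
            gestureDetector = new GestureDetector(this,
                    new GestureDetector.SimpleOnGestureListener() {
                @Override
                public boolean onFling(MotionEvent e1, MotionEvent e2,
                                       float velocityX, float velocityY) {
                    // Would switch the ViewFlipper here.
                    return true;
                }
            });
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            // Never called while the finger is over the Button,
            // because the Button consumes the touch events itself.
            return gestureDetector.onTouchEvent(event);
        }
    }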

The second approach uses a GestureOverlayView. This time I hit the opposite extreme: not only is the gesture detected, but the button under which the gesture is performed also reports a click.

I want the interface to behave as follows: if the user touches the button and releases it (or moves the finger only slightly), the button reports a click and no gesture is detected. If, on the other hand, the user touches the screen and makes a longer movement, the gesture must be detected and the button must not report a click.

I have implemented a small proof-of-concept application. Here is the layout XML:

    <?xml version="1.0" encoding="utf-8"?>
    <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:orientation="vertical">

        <android.gesture.GestureOverlayView
            android:id="@+id/overlay"
            android:layout_width="match_parent"
            android:layout_height="match_parent">

            <LinearLayout
                android:layout_width="match_parent"
                android:layout_height="match_parent"
                android:orientation="vertical">

                <TextView
                    android:id="@+id/display"
                    android:layout_width="match_parent"
                    android:layout_height="wrap_content" />

                <Button
                    android:id="@+id/button"
                    android:layout_width="match_parent"
                    android:layout_height="match_parent" />

            </LinearLayout>
        </android.gesture.GestureOverlayView>
    </LinearLayout>

And here is the Java activity code:

    package spk.sketchbook;

    import android.app.Activity;
    import android.gesture.GestureOverlayView;
    import android.gesture.GestureOverlayView.OnGestureListener;
    import android.os.Bundle;
    import android.view.MotionEvent;
    import android.view.View;
    import android.view.View.OnClickListener;
    import android.widget.Button;
    import android.widget.TextView;

    public class Main extends Activity implements OnGestureListener, OnClickListener {

        private void SetupEvents() {
            GestureOverlayView ov = (GestureOverlayView) findViewById(R.id.overlay);
            ov.addOnGestureListener(this);

            Button b = (Button) findViewById(R.id.button);
            b.setOnClickListener(this);
        }

        /** Called when the activity is first created. */
        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main);
            SetupEvents();
        }

        @Override
        public void onGesture(GestureOverlayView overlay, MotionEvent event) {
            TextView tv = (TextView) findViewById(R.id.display);
            tv.setText("Gesture");
        }

        @Override
        public void onGestureCancelled(GestureOverlayView overlay, MotionEvent event) {
        }

        @Override
        public void onGestureEnded(GestureOverlayView overlay, MotionEvent event) {
        }

        @Override
        public void onGestureStarted(GestureOverlayView overlay, MotionEvent event) {
        }

        @Override
        public void onClick(View v) {
            TextView tv = (TextView) findViewById(R.id.display);
            tv.setText("Click");
        }
    }

The question is: how do I implement the interface so that it can decide whether the user's action should be treated as a gesture or as a button click?

Regards - Spook.

+8
android user-interface interface clickable gestures
2 answers

I solved it myself. The idea is to capture the gesture events from the GestureOverlayView and, in addition to passing them to the GestureDetector, measure the distance the user's finger travels. The distance should be stored in a private field of the activity (accessible to all event handlers). Finally, the fling and click handlers check the value of this field: if it is below a certain threshold (for example, 10 pixels works reasonably well), the action should be interpreted as a click; otherwise, as a gesture.
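A minimal sketch of the idea, filling in the empty listener methods of the activity from the question (the field names and the 10-pixel threshold are only examples):

    private float gestureDistance;                    // total distance travelled, visible to all handlers
    private float lastX, lastY;
    private static final float CLICK_THRESHOLD = 10f; // pixels; tune for screen density

    @Override
    public void onGestureStarted(GestureOverlayView overlay, MotionEvent event) {
        gestureDistance = 0f;
        lastX = event.getX();
        lastY = event.getY();
    }

    @Override
    public void onGesture(GestureOverlayView overlay, MotionEvent event) {
        // Accumulate how far the finger has moved since the previous event.
        gestureDistance += Math.abs(event.getX() - lastX) + Math.abs(event.getY() - lastY);
        lastX = event.getX();
        lastY = event.getY();
    }

    @Override
    public void onGestureEnded(GestureOverlayView overlay, MotionEvent event) {
        if (gestureDistance >= CLICK_THRESHOLD) {
            // A long move: treat it as a gesture, e.g. flip the ViewFlipper here.
        }
        // A short move: do nothing here; the Button will report a click,
        // and the click handler can check gestureDistance as well.
    }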

Note that the button will still show its pressed state and its click handler will still be called. I decided to create one low-level click handler (attached to all buttons under the GestureOverlayView) that performs the gesture/click check and, if the result is a click, dispatches to the appropriate higher-level click handler.

The solution works for me; however, if you also want to suppress the button's pressed state and prevent its handler from being called at all, you will probably have to subclass the Button and/or the GestureOverlayView.
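Building on the sketch above, the low-level click handler could look roughly like this (ClickDispatcher is just an example name, not part of my original code):

    // One low-level OnClickListener attached to every button; it forwards the
    // click only when the preceding touch travelled less than the threshold.
    private class ClickDispatcher implements View.OnClickListener {
        private final View.OnClickListener realHandler;

        ClickDispatcher(View.OnClickListener realHandler) {
            this.realHandler = realHandler;
        }

        @Override
        public void onClick(View v) {
            if (gestureDistance < CLICK_THRESHOLD) {
                realHandler.onClick(v); // short touch: a real click
            }
            // Otherwise the touch was a gesture, so the click is swallowed.
        }
    }

    // Usage: button.setOnClickListener(new ClickDispatcher(myRealClickListener));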

+3

Another solution: create this function:

    public void SetSwipe(View v, final GestureDetector gestureScanner) {
        // Forward every touch event on this view to the gesture detector.
        v.setOnTouchListener(new OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                return gestureScanner.onTouchEvent(event);
            }
        });
    }

And call it for each widget like this:

 scfSetTag.SetSwipe(svAllView, gestureScanner); 
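Here gestureScanner is just an ordinary GestureDetector; a minimal sketch of how it could be created (the onFling body is only an example):

    // A GestureDetector whose onFling distinguishes left and right swipes.
    final GestureDetector gestureScanner = new GestureDetector(this,
            new GestureDetector.SimpleOnGestureListener() {
        @Override
        public boolean onDown(MotionEvent e) {
            // Return true so the detector keeps receiving the rest of the gesture.
            return true;
        }

        @Override
        public boolean onFling(MotionEvent e1, MotionEvent e2,
                               float velocityX, float velocityY) {
            if (velocityX < 0) {
                // Swipe to the left, e.g. viewFlipper.showNext();
            } else {
                // Swipe to the right, e.g. viewFlipper.showPrevious();
            }
            return true;
        }
    });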

If you need to do this from many activities or fragments, you can move this function into its own class and create an instance of that class in each activity that needs swipe detection. This way both clicks and swipes are detected on any widget. Hope it helps!

+2
