My application contains an area filled with buttons. I would like to implement the activity so that a fling gesture over the button area switches it to one of the other two areas (using a ViewFlipper).
I have tried two approaches to detecting gestures. The first one uses a GestureDetector. However, touch motion events over a Button never reach the activity's onTouchEvent method, so I could not forward them to the GestureDetector. In short, failure.
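For reference, this is roughly how the first approach was wired (a minimal sketch, not my exact code): the detector is fed from the activity's onTouchEvent, which simply never gets called for touches that start on the Button, because the Button consumes the event stream itself.

    package spk.sketchbook;

    import android.app.Activity;
    import android.os.Bundle;
    import android.view.GestureDetector;
    import android.view.MotionEvent;

    public class Main extends Activity {

        private GestureDetector detector;

        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main);
            detector = new GestureDetector(this, new GestureDetector.SimpleOnGestureListener() {
                @Override
                public boolean onFling(MotionEvent e1, MotionEvent e2,
                                       float velocityX, float velocityY) {
                    // Switch the ViewFlipper here.
                    return true;
                }
            });
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            // Never reached when the touch starts on the Button,
            // because the Button handles the touch itself.
            return detector.onTouchEvent(event) || super.onTouchEvent(event);
        }
    }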
The second approach uses a GestureOverlayView. This time, however, I hit the opposite extreme: not only is the gesture detected, but the button over which the gesture is performed also reports a click.
I want the interface to work as follows: if the user touches the button and releases it (or moves the finger only a little), the button reports a click and no gesture is detected. On the other hand, if the user touches the screen and makes a longer movement, the gesture must be detected and the button must not report a click.
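One idea I am considering, but have not verified, is to ask the overlay to intercept the motion events from its children once it decides the user is drawing a gesture, so the Button would be cancelled instead of completing its click. Roughly like this (a sketch only; whether these flags give exactly the behaviour described above is part of my question):

    GestureOverlayView ov = (GestureOverlayView) findViewById(R.id.overlay);
    // Let the overlay steal events from its children once it recognizes
    // that the user is gesturing; the Button should then be cancelled
    // instead of reporting a click.
    ov.setEventsInterceptionEnabled(true);
    ov.setGestureStrokeType(GestureOverlayView.GESTURE_STROKE_TYPE_SINGLE);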
I have implemented a small proof-of-concept application. Here is the activity layout XML:
    <?xml version="1.0" encoding="utf-8"?>
    <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:orientation="vertical">

        <android.gesture.GestureOverlayView
            android:id="@+id/overlay"
            android:layout_width="match_parent"
            android:layout_height="match_parent">

            <LinearLayout
                android:layout_width="match_parent"
                android:layout_height="match_parent"
                android:orientation="vertical">

                <TextView
                    android:id="@+id/display"
                    android:layout_width="match_parent"
                    android:layout_height="wrap_content" />

                <Button
                    android:id="@+id/button"
                    android:layout_width="match_parent"
                    android:layout_height="match_parent" />

            </LinearLayout>

        </android.gesture.GestureOverlayView>

    </LinearLayout>
And here is the activity Java code:
    package spk.sketchbook;

    import android.app.Activity;
    import android.os.Bundle;
    import android.view.MotionEvent;
    import android.view.View;
    import android.view.View.OnClickListener;
    import android.widget.Button;
    import android.widget.TextView;
    import android.widget.Toast;
    import android.gesture.*;
    import android.gesture.GestureOverlayView.OnGestureListener;

    public class Main extends Activity implements OnGestureListener, OnClickListener {

        private void SetupEvents() {
            GestureOverlayView ov = (GestureOverlayView) findViewById(R.id.overlay);
            ov.addOnGestureListener(this);
            Button b = (Button) findViewById(R.id.button);
            b.setOnClickListener(this);
        }

        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main);
            SetupEvents();
        }

        @Override
        public void onGesture(GestureOverlayView arg0, MotionEvent arg1) {
            TextView tv = (TextView) findViewById(R.id.display);
            tv.setText("Gesture");
        }

        @Override
        public void onGestureCancelled(GestureOverlayView arg0, MotionEvent arg1) {
        }

        @Override
        public void onGestureEnded(GestureOverlayView overlay, MotionEvent event) {
        }

        @Override
        public void onGestureStarted(GestureOverlayView overlay, MotionEvent event) {
        }

        @Override
        public void onClick(View v) {
            TextView tv = (TextView) findViewById(R.id.display);
            tv.setText("Click");
        }
    }
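Another variant I have sketched but not tested is to also register an OnGesturingListener and suppress clicks with a flag; I am not sure, though, whether the flag is still set by the time the Button's onClick fires. Additions to the class above would look roughly like this:

    private boolean gesturing = false;

    private void SetupEvents() {
        GestureOverlayView ov = (GestureOverlayView) findViewById(R.id.overlay);
        ov.addOnGestureListener(this);
        // Track whether the overlay currently considers the touch a gesture.
        ov.addOnGesturingListener(new GestureOverlayView.OnGesturingListener() {
            @Override
            public void onGesturingStarted(GestureOverlayView overlay) {
                gesturing = true;
            }

            @Override
            public void onGesturingEnded(GestureOverlayView overlay) {
                gesturing = false;
            }
        });
        Button b = (Button) findViewById(R.id.button);
        b.setOnClickListener(this);
    }

    @Override
    public void onClick(View v) {
        if (gesturing) {
            return; // swallow clicks while a gesture stroke is in progress
        }
        TextView tv = (TextView) findViewById(R.id.display);
        tv.setText("Click");
    }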
So the question is: how can I implement the interface so that it decides whether the user's action should be treated as a gesture or as a button click?
Regards - Spook.