Hi folks!
This thread explains a feature I first introduced in the Siyah kernel (available in 4.1beta5) that allows defining finger movement detection and triggering actions when certain gestures are made.
There are also apps on the market that do this, but this approach works at the kernel level.
I welcome your feedback on any advantages and drawbacks you find.
Index
This post - feature explanation and samples
Post #2 - Configuring gestures
Post #3 - Actions
Change log
01.12.2012
Added instructions on how to use camera from the lockscreen (see post #3).
Added link to Flint2's Kernel Gesture Builder (see post #2).
Added index.
27.10.2012
Added 3 additional actions (see items 9, 10 and 11 at the end of this post): v1.2 sample script.
Fixed mDNIe negative toggle for newer JB kernels.
23.08.2012
Added action commands and explanations to the 3rd post, with all that has been identified so far.
18.08.2012
Added sample CWM file for the S3 (different coordinates) - thanks to Gokhanmoral
16.08.2012
CWM-flashable zip with ready to use examples
8 gestures
Actions: invert mDNIe; launch camera (3 different apps detected, including JB); direct dial (must edit script); toggle bluetooth; toggle WiFi; play/pause; simulate power button (save the physical button); simulate home key
Fixed JB / CM10 hanging on boot when the script is present
13.08.2012
Initial post
There are 2 steps required to use this feature:
1. Defining the gestures - in other words, the path that the fingers are expected to make for the gesture to be detected
2. Reacting to detected gestures
Defining gestures
The sysfs entry /sys/devices/virtual/misc/touch_gestures/gesture_patterns provides access to the gesture definitions - the hot spots for the path that each finger must travel for a gesture to be triggered.
"cat /sys/devices/virtual/misc/touch_gestures/gesture_patterns" will show you the current definitions, and some comments on the expected structure:
Code:
# Touch gestures
#
# Syntax
# <gesture_no>:<finger_no>:(x_min|x_max,y_min|y_max)
# ...
# gesture_no: 1 to 10
# finger_no : 1 to 10
# max steps per gesture and finger: 10
# Gesture 1:
...
Choosing the coordinates
Your S2 screen has the following X,Y coordinates:
Code:
+---------------+
|0,0 479,0|
| |
| |
| |
| |
| |
| |
|0,799 479,799|
+---------------+
Each hotspot is a rectangle from X1 to X2 and Y1 to Y2. For example, a hotspot for just the top half of the screen would be X between 0 and 479 and Y between 0 and 399 (~ half of 800).
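As a purely illustrative example, a hypothetical gesture 4 consisting of a single one-finger hotspot covering that top half would be written as below. A single-box gesture like this would fire on nearly every touch, so real gestures should chain several boxes and/or fingers:
Code:
4:1:(0|480,0|400)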
A maximum of 10 gestures can be defined. Each gesture uses 1 or more fingers (up to 10, although in practice more than 4 may not be very feasible), and each finger follows a path of up to 10 consecutive hotspots.
All gestures must be defined in one go by writing multiple lines to /sys/devices/virtual/misc/touch_gestures/gesture_patterns, in the following form:
Code:
gesture_no:finger_no:(min_x|max_x,min_y|max_y)
gesture_no:finger_no:(min_x|max_x,min_y|max_y)
... additional hotspots for the same finger, or additional fingers, or additional gestures ...
Writing to "gesture_patterns" will erase all previous definitions and replace with what you're writing.
Here are some examples that can be used in practice (or you can define your own gestures):
1. swipe one finger near the top and another near the bottom from left to right
Code:
+----+-----------+----+
| | | |
| +-|-----------|-> |
| | | |
+----+ +----+
| |
| |
| |
| |
| |
+----+ +----+
| | | |
| +-|-----------|-> |
| | | |
+----+-----------+----+
Definition (bound to gesture 1; uses fingers 1 and 2):
Code:
1:1:(0|150,0|150)
1:1:(330|480,0|150)
1:2:(0|150,650|800)
1:2:(330|480,650|800)
2. swipe 3 fingers from near the top to near the bottom
Code:
+---------------------+
| |
| + + + |
| | | | |
+---------------------+
| | | | |
| | | | |
| | | | |
| | | | |
| | | | |
+---------------------+
| | | | |
| v v v |
| |
+---------------------+
Definition (bound to gesture 2; uses fingers 1, 2 and 3):
Code:
2:1:(0|480,0|200)2:1:(0|480,600|800)
2:2:(0|480,0|200)2:2:(0|480,600|800)
2:3:(0|480,0|200)2:3:(0|480,600|800)
3. draw a Z with one finger while another is pressed on the middle left of the screen
Code:
+----+-----------+----+
| | | |
| +--|-----------|-> |
+----+ +----+
| +--+ |
+----+ | |
| | +--+ |
| + | | |
| | +--+ |
+----+ | |
| +--+ |
+----+-+ +----+
| <-| | |
| +-|-----------|-> |
+----+-----------+----+
Definition (bound to gesture 3; uses fingers 1 and 2):
Code:
3:1:(0|150,0|150)
3:1:(330|480,0|150)
3:1:(0|150,650|800)
3:1:(330|480,650|800)
3:2:(0|150,300|500)
(notice that I mixed the way the lines are written, in order to show how you can organize the entries)
To wrap it all up, you can use the following in an init.d script - as the definitions aren't persisted across reboots - in order to define all these gestures whenever the device starts:
Code:
echo "
[COLOR="SeaGreen"]# Gesture 1 - swipe 1 finger near the top and one near the bottom from left to right
1:1:(0|150,0|150)
1:1:(330|480,0|150)
1:2:(0|150,650|800)
1:2:(330|480,650|800)
# Gesture 2 - swipe 3 fingers from near the top to near the bottom
2:1:(0|480,0|200)2:1:(0|480,600|800)
2:2:(0|480,0|200)2:2:(0|480,600|800)
2:3:(0|480,0|200)2:3:(0|480,600|800)
# Gesture 3 - draw a Z with one finger while another is pressed on the middle left
3:1:(0|150,0|150)
3:1:(330|480,0|150)
3:1:(0|150,650|800)
3:1:(330|480,650|800)
3:2:(0|150,300|500)
[/COLOR]
" > [COLOR="Blue"]/sys/devices/virtual/misc/touch_gestures/gesture_patterns[/COLOR]
There are 2 important things to keep in mind when defining gestures:
* The touches are still delivered to whatever applications are active. If a certain gesture proves to cause nuisance with the actual apps, change it to something different or use it only in certain situations;
* Whenever you're pressing or moving 2 fingers close together, at some point the screen will start detecting only one of them. For some gesture definitions this might cause the detection to fail or work only very rarely. Make sure to use the "Show pointer location" option in Settings / Developer options so you can track what the device detects while you're setting things up.
Triggering actions
Defining gestures won't do anything by itself. Now you need to check the /sys/devices/virtual/misc/touch_gestures/wait_for_gesture entry to see which gesture is detected and do whatever you want.
Here's an example, also to be run from an init.d script:
Code:
( while [ 1 ]
do
GESTURE=`cat /sys/devices/virtual/misc/touch_gestures/wait_for_gesture`
if [ "$GESTURE" -eq "1" ]; then
mdnie_status=`cat /sys/class/mdnie/mdnie/negative | head -n 1`
if [ "$mdnie_status" -eq "0" ]; then
echo 1 > /sys/class/mdnie/mdnie/negative
else
echo 0 > /sys/class/mdnie/mdnie/negative
fi
elif [ "$GESTURE" -eq "2" ]; then
# Start the camera app
am start --activity-exclude-from-recents com.sec.android.app.camera
elif [ "$GESTURE" -eq "3" ]; then
# Edit and uncomment the next line to automatically start a call to the target number
### EDIT ### service call phone 2 s16 "133"
fi
done ) > /dev/null 2>&1 &
What this will do is:
- for the 1st gesture, toggle mDNIe inverted / normal
- for the 2nd gesture, launch the Camera app no matter what app is active (quick, that chick is almost out of view!)
- for the 3rd gesture - after you edit and uncomment the appropriate line - a call will be established to that number (the wife is impatient, I don't even have time to enter my PIN!!!)
It loops eternally looking for the next detected gesture and triggering the appropriate action.
NOTE - this has been edited to no longer cause hangs on CM10 startup. The problem was with comments inside the script that contained chars like ' ( ) etc.; be careful when changing the script not to introduce these problems.
Reading from "wait_for_gesture" blocks until one of them is detected, and therefore no CPU is consumed nor deep sleep prevented because of the infinite loop.
In some rare occasions (e.g. multiple scripts waiting for gestures, which can be awaken at the same time but only one of them will get each gesture) the script can wake up with a value of 0, which should just be ignored.
If no script is reading "wait_for_gesture", multiple gestures can be detected and buffered (at most one instance of each) and will be sent immediately as soon as something starts reading the entry.
Doing an "echo reset > ..../wait_for_gesture" will flush that buffer so no pending gestures are reported, only future ones.
Sample script
The attached file is a CWM installable package that contains a sample script with all this and more.
It has both the definition of 8 gestures and actions to be performed for each of those.
Remember to edit and uncomment the line with the intended phone number, otherwise it won't do anything when you draw the Z on the screen.
Just flash it on your primary or your secondary ROM and you're good to go, with the behavior described below.
Gestures:
1. one finger on the top left, another on the bottom left; swipe both horizontally to the right edge
triggered action - invert mDNIe
2. swipe 3 fingers from the top of the screen to the bottom
triggered action - launch the camera app
(currently recognizes the apps from stock Sammy 4.0.*, AOKP 4.0.4 and JellyBean / CM10)
3. press one finger on the middle left of the screen; with another finger draw a Z starting on the top left edge
triggered action - immediately dial a number predefined in the script (you must edit the script to put in the number you want, or it won't do anything as-is)
WARNING: This has a nice bonus but you need to be aware of it - it will work even on a locked screen. Anyone that knows the gesture will be able to dial that destination even without knowing your PIN or Unlock Pattern. They won't however be able to press any of the other phone buttons like Contacts, etc.
4. hold one finger on the bottom right while another goes from top-left to the middle of the screen and back
triggered action - toggle Bluetooth on/off (will also vibrate for 100ms to provide feedback)
5. hold one finger on the bottom left while another goes from top-right to the middle of the screen and back
triggered action - toggle WiFi on/off (will also vibrate for 100ms to provide feedback)
6. hold one finger on the top left and another on the bottom left, move both to the middle right
triggered action - Media play / pause
7. draw an X on the screen - top-left, bottom-right, top-right, bottom-left - while holding another finger on the middle left
triggered action - Power button (to spare the physical button)
8. swipe one finger from the bottom left to the bottom right, then again bottom left (5 times)
triggered action - Home button (to spare the physical button)
9. hold one finger on the bottom left and with another swipe from the top right to top left and back to top right
triggered action - Toggle between the last 2 activities, excluding the TW Launcher (edit the script if you use another launcher)
10. hold one finger on the middle left and with another swipe top-right, bottom-right, top-right (3 times)
triggered action - force closes the current activity
11. press 3 fingers in the positions: top-left, top-right, bottom-left
triggered action - temporarily disables finger detection by the apps (or re-enables) so you can then swipe other gestures without causing effects in the apps
All other gestures automatically re-enable detection after it has been disabled by this gesture.
These gestures and actions are already an evolution over the original sample I shared, as a result of people posting their suggestions and ideas on the thread.
It's your turn now - think of what is useful to you and make sure to share it with others
Configuring gestures
Refer to [GUIDE] Defining/Creating Triggering Actions Gestures easier by janreiviardo for a great visual explanation on how to set up gesture coordinates.
Also, for those who are not so fond of editing script files, be sure to check out Flint2's Kernel Gesture Builder app.
Actions
Here's a collection of the several types of actions that have been identified so far. They're mostly ready to use as-is, but do read the script code and edit where necessary to suit your needs.
Please refer to the previous posts for instructions on how to include this in the gesture detection loop.
For test purposes you can simply execute these from the ADB shell, but for them to be part of your daily usage they must be included in your personal script.
Key presses
With these, your gestures can simulate that certain keys were pressed: usually hard keys that you may wish to avoid wearing out, or special keys that the device may not even have but that the ROM can react to.
Examples: HOME, Power, Volume up/down, Media play/pause, Media stop, Media next/previous, Volume mute/unmute, Recent Apps, etc.
Script code:
Code:
input keyevent 26
This has the same effect as pressing the Power key.
For other key codes, check here for ICS or here for JB. Some examples:
3 - HOME
24/25 - Volume up/down
26 - Power
84 - Search
85 - Media play/pause
86 - Media stop
87/88 - Media next/previous
164 - Toggle volume mute
187 or 214 - Recent apps
220 - Voice search
212/213 - Brightness up/down
215 - App drawer
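Binding one of these codes to a gesture uses the same loop from the first post. A minimal sketch, mapping a hypothetical gesture 4 to Media play/pause:
Code:
( while [ 1 ]
do
  GESTURE=`cat /sys/devices/virtual/misc/touch_gestures/wait_for_gesture`
  if [ "$GESTURE" -eq "4" ]; then
    # Simulate the Media play/pause key
    input keyevent 85
  fi
done ) > /dev/null 2>&1 &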
As an alternative to executing "input keyevent <code>", it is also possible to inject key press events and even choose the delay between the press and the release to simulate long presses.
Example for a HOME key long press:
Code:
sendevent /dev/input/event1 1 102 1
sendevent /dev/input/event1 0 0 0
usleep 500000
sendevent /dev/input/event1 1 102 0
sendevent /dev/input/event1 0 0 0
102 is the scan code for the HOME key and it will have a delay of 500ms between pressing and releasing.
Possible scan codes (for the physical buttons):
102 - Home
116 - Power
115 / 114 - Volume up / down
For the touchkeys "menu" and "back", instead of using event1 (gpio-keys as stated by "getevent"), send the scan codes to event7 (sec_touchkey):
139 - Menu
158 - Back
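If you use several of these, the press / release sequence can be wrapped in a small helper function. A sketch ("press_key" is a made-up name; event1 / event7 are the devices reported by "getevent" on the S2 and may differ on other devices or firmwares):
Code:
# Usage: press_key <input device> <scan code> <hold time in ms>
press_key() {
  sendevent "$1" 1 "$2" 1   # key down
  sendevent "$1" 0 0 0      # sync
  usleep $(($3 * 1000))
  sendevent "$1" 1 "$2" 0   # key up
  sendevent "$1" 0 0 0      # sync
}
press_key /dev/input/event1 102 500   # long-press HOME
press_key /dev/input/event7 158 50    # short Back touchkey press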
Invoking services
There are quite a few services running on the device, which expose interfaces that can be invoked using the "service call <name> <transaction> <params>..." syntax.
How to explore existing services
To list running services:
Code:
# service list
Found 95 services:
0 sip: [android.net.sip.ISipService]
1 phoneext: [com.android.internal.telephony.ITelephonyExt]
2 phone: [com.android.internal.telephony.ITelephony]
3 iphonesubinfo: [com.android.internal.telephony.IPhoneSubInfo]
4 simphonebook: [com.android.internal.telephony.IIccPhoneBook]
5 isms: [com.android.internal.telephony.ISms]
6 voip: [android.os.IVoIPInterface]
7 FMPlayer: [com.samsung.media.fmradio.internal.IFMPlayer]
8 mini_mode_app_manager: [com.sec.android.app.minimode.manager.IMiniModeAppManager]
9 tvoutservice: [android.os.ITvoutService]
10 motion_recognition: [android.hardware.motion.IMotionRecognitionService]
11 samplingprofiler: []
...
Search sources (using for instance grepcode.com) for the interfaces, such as ITelephony in this example.
Here you can find the existing transactions for the "phone" service on 4.0.3 (in some cases it may be slightly different in JB).
If we want to invoke the TRANSACTION_call operation on this service, we'll need to indicate transaction code 2 (FIRST_CALL_TRANSACTION, which is 1, plus 1 for this method's position in the interface) and check what parameters it expects. The code for that is in this line, which shows that this particular call needs a string (with the number to call).
So, in conclusion, to make the device call a certain number, one only has to issue this command:
Code:
service call phone 2 s16 "123456789"
replacing the destination number with the one you want.
Note that "service call" accepts arguments of type "i32 <number>" and "s16 <string>", which can be joined together as many times as needed. For transactions expecting "long", you'll need to pass in 2 i32's to make a long value.
In some of them - such as when asking for the current state of something like bluetooth - you'll need to analyze the output (using grep, for instance) to find whether the result is "00000000 00000000" vs "00000000 00000001", or some other value.
Collected service calls so far
Calling a phone number
Code:
service call phone 2 s16 "123456789"
(replace 123456789 by the destination number you want to call)
Toggling bluetooth enabled/disabled
This involves 3 transactions on the bluetooth service: isEnabled, enable, disable (the last one changes from ICS to JB); the output of isEnabled must also be analyzed in order to know what to do next.
Code:
service call bluetooth 1 | grep "0 00000000"
if [ "$?" -eq "0" ]; then
service call bluetooth 3
else
[ "$is_jb" -eq "1" ] && service call bluetooth 5
[ "$is_jb" -ne "1" ] && service call bluetooth 4
fi
The "is_jb" variable should have been set to 1 prior to this in the case of a JB rom (the script on the OT includes it)
Toggling data connection
Similar to bluetooth, but on the connectivity service.
Code:
service call connectivity 18 | grep "0 00000000"
if [ "$?" -eq "0" ]; then
service call connectivity 19 i32 1
else
service call connectivity 19 i32 0
fi
Toggling WiFi on/off
Similar to bluetooth, but on the wifi service.
Code:
service call wifi 14 | grep "0 00000001"
if [ "$?" -eq "0" ]; then
service call wifi 13 i32 1
else
service call wifi 13 i32 0
fi
Vibration
Transaction "vibrate" can be called in the vibrator service (it requires a long parameter, which maps to 2 i32 entries)
It is asynchronous, meaning that the instruction ends but the device will continue vibrating for the requested duration. This is important in case you wish to insert pauses between multiple vibrations; in that case you'll need to call "usleep" (to have times smaller than 1s) but pause for the duration of the first vibration + the non-vibration time you want, before invoking it again.
Code:
service call vibrator 2 i32 300 i32 0
usleep 600000
service call vibrator 2 i32 300 i32 0
This starts the vibration with a timeout of 300ms, pauses for 600ms (enough for it to stop and stay off for another 300ms) and vibrates a second time.
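If you want several pulses, the same idea can be wrapped in a small loop. A sketch ("buzz" is a made-up helper; it sleeps for twice the pulse duration so each vibration is followed by an equal pause):
Code:
# Usage: buzz <count> <pulse duration in ms>
buzz() {
  i=0
  while [ $i -lt $1 ]; do
    service call vibrator 2 i32 $2 i32 0
    usleep $(($2 * 2 * 1000))
    i=$(($i + 1))
  done
}
buzz 3 200   # three 200ms pulses with 200ms pauses in between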
Expand / collapse the status bar
Not particularly useful, but the transactions "expand" and "collapse" can be called on the statusbar service.
Code:
service call statusbar 1
Code:
service call statusbar 2
Enable / disable the touch screen
This can be very useful to prevent the finger movements of gestures from triggering side-effects in the active app, which also receives those events (moving icons on the launcher, etc.).
For the best experience, map these to a simple gesture such as pressing 2 or 3 fingers in the screen corners, preferably without movement so that active apps don't react.
For ICS:
Code:
# Disable
service call window 18 i32 0
# Enable
service call window 18 i32 1
For JB use transaction code 15 instead of 18.
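In a script that must work on both, the transaction code can be chosen with the same "is_jb" variable used earlier; a minimal sketch:
Code:
[ "$is_jb" -eq "1" ] && touch_txn=15 || touch_txn=18
service call window $touch_txn i32 0   # disable; use "i32 1" to re-enable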
Force-stopping an activity
One of the ways to do this is by invoking FORCE_STOP_PACKAGE_TRANSACTION on the activity service.
Code:
service call activity 79 s16 com.swype.android.inputmethod
This will stop the Swype package if it's running.
For a more dynamic script that stops whatever the foreground app is, it can be combined with the output of "dumpsys activity":
Code:
service call activity 79 s16 `dumpsys activity top | grep '^TASK.*' | cut -d ' ' -f2`
Toggling between the last 2 applications / windows
For this, invoke MOVE_TASK_TO_FRONT_TRANSACTION on the activity service with the task id to activate and with the MOVE_TASK_NO_USER_ACTION flag.
Again, "dumpsys activity" can be used to identify the next-to-last activity which will be brought to front:
Code:
service call activity 24 i32 `dumpsys activity a | grep "Recent #1:" | grep -o -E "#[0-9]+ " | cut -c2-` i32 2
Since the launcher is just an app like any other, if the previous app was the launcher that's where you'll switch to. If you'd like to exclude it from this logic, a slightly more elaborate script is required:
Code:
dumpsys activity a | grep "Recent #1:.* com.sec.android.app.twlauncher"
if [ "$?" -eq "0" ]; then
service call activity 24 i32 `dumpsys activity a | grep "Recent #2:" | grep -o -E "#[0-9]+ " | cut -c2-` i32 2
else
service call activity 24 i32 `dumpsys activity a | grep "Recent #1:" | grep -o -E "#[0-9]+ " | cut -c2-` i32 2
fi
Basically, if the last app was ...twlauncher, switch to the one before that (#2) instead of the last (#1). You'll obviously need to edit the package name to match your launcher.
Launching applications (and other intents)
The "am" command can be used to launch applications, much like what happens when their icons are pressed in the launcher. In fact, this command is so powerful that it can be a challenge to know what to do with it.
The most usual scenario is to merely execute "am start <packagename>/<activity>".
To find out which values to pass as the package and activity, you can launch whatever apps you're interested in and then execute "dumpsys activity a" to see what's running and what their associated activities are:
Code:
# dumpsys activity a
ACTIVITY MANAGER ACTIVITIES (dumpsys activity activities)
Main stack:
* TaskRecord{41d938c8 #48 A com.android.email}
numActivities=1 rootWasReset=false
...
Running activities (most recent first):
TaskRecord{41d938c8 #48 A com.android.email}
Run #3: ActivityRecord{41bbeda0 com.android.email/.activity.MessageListXL}
TaskRecord{41dfff40 #46 A com.seasmind.android.gmappmgr}
Run #2: ActivityRecord{41950668 com.seasmind.android.gmappmgr/.GmUserAppMgr}
TaskRecord{41d9a498 #2 A com.sec.android.app.twlauncher}
Run #1: ActivityRecord{41dd14c0 com.sec.android.app.twlauncher/.Launcher}
TaskRecord{41f12788 #40 A com.cooliris.media}
Run #0: ActivityRecord{4157e130 com.cooliris.media/.Gallery}
...
Recent tasks:
* Recent #0: TaskRecord{41d938c8 #48 A com.android.email}
...
intent={act=android.intent.action.MAIN cat=[android.intent.category.LAUNCHER] flg=0x10400000 cmp=com.android.email/.activity.Welcome}
realActivity=com.android.email/.activity.Welcome
...
* Recent #1: TaskRecord{41dfff40 #46 A com.seasmind.android.gmappmgr}
...
intent={act=android.intent.action.MAIN cat=[android.intent.category.LAUNCHER] flg=0x10200000 cmp=com.seasmind.android.gmappmgr/.GmUserAppMgr}
realActivity=com.seasmind.android.gmappmgr/.GmUserAppMgr
...
From here you can see that good candidates for the "am start" command would be "com.cooliris.media/.Gallery", "com.android.email/.activity.Welcome", etc.
Invoking the Gallery app, for instance, will output this:
Code:
# am start com.cooliris.media/.Gallery
Starting: Intent { act=android.intent.action.MAIN cat=[android.intent.category.LAUNCHER] cmp=com.cooliris.media/.Gallery }
Warning: Activity not started, its current task has been brought to the front
In this case, the application was already running so it was merely brought to the foreground. Otherwise, it would have been launched.
On the script in the OP you can find the following for the gesture that launches the Camera app:
Code:
result=`am start com.sec.android.app.camera/.Camera 2>&1 | grep Error`
[ "$result" != "" ] && result=`am start com.android.camera/.Camera 2>&1 | grep Error`
[ "$result" != "" ] && result=`am start com.android.gallery3d/com.android.camera.CameraLauncher 2>&1 | grep Error`
Since different ROMs have different camera apps, this code tries 3 different activities, moving on to the next whenever the previous one fails. This still doesn't cover all possibilities, but at least it works not only on stock 4.0.3 ROMs but also on e.g. some JB ROMs.
An interesting use can be obtained if you're using the Android 4.2 Camera. It supports taking pictures even from a locked phone, in a secure manner, i.e. without requiring the phone to be unlocked but also without allowing the gallery contents to be browsed (while still allowing you to see the pictures taken in that session). Here's the am command:
Code:
am start -a android.media.action.STILL_IMAGE_CAMERA_SECURE
Check this post for more details.
Finally, there are some topics that you can explore on the "am" command:
1. additional intent details for "am start": instead of merely passing <package>/<activity>, there are many more options to use if you know the action to launch, the category, any data it uses, extra parameters, etc. (see the example after this list)
2. optional arguments such as "--activity-exclude-from-recents" (which prevents the task from being added to the recents list in the task manager). Just explore the available options by running "am" alone.
3. actions other than "start": force-stop, kill, broadcast, etc.
Again, the "am" command allows lots of things to be done related with Intents, but a lot of investigation is required on what intents exist, what parameters they take, etc.
In pretty much all cases the standard "am start <package>/<activity>" will be the syntax to use.
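For instance, starting an activity by action and data instead of by component uses the standard Intent arguments of "am start"; a simple sketch that opens a URL in the default browser (the URL is just an example):
Code:
am start -a android.intent.action.VIEW -d "http://forum.xda-developers.com"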
Sweet jesus. Nice going man
Incredible amount of work, well done
### EDIT ### service call phone 2 s16 "133"
To call "111-111-1111", should "133" alone be replaced and the line uncommented, like below?
service call phone 2 s16 "111-111-1111"
rav4kar said:
### EDIT ### service call phone 2 s16 "133"
To call "111-111-1111", should "133" alone be replaced and the line uncommented, like below?
service call phone 2 s16 "111-111-1111"
Something like that, yes. I'm not sure whether you need to remove the dashes but I guess not. Just open a shell and run that command directly to see what happens.
Tungstwenty said:
Something like that, yes. I'm not sure whether you need to remove the dashes but I guess not. Just open a shell and run that command directly to see what happens.
Thanks so much, it worked as-is (with a different number, though), dashes and all, from the shell. Greatly appreciate your efforts.
service call phone 2 s16 "111-111-1111"
This is awesome. For a sample, can I ask for a pinch screen with a five finger gesture? And to really p off Apple, let me pull fingers away from the screen and use my fingers, now holding a virtual joystick, to move the gallery in three dimensions. Lol
Seriously though, interested in how you'd go about the five finger pinch.
Thank you
Edit: something like this...?
Code:
1:1:(219|259,0|40)
1:1:(219|259,359|399)
1:2:(0|40,379|420)
1:2:(199|239,379|420)
1:3:(440|480,379|420)
1:3:(240|280,379|420)
1:4:(219|259,860|800)
1:4:(219|259,400|440)
Is there a margin allowed?
Awesome feature and tutorial! Much faster than any app since it's handled at the kernel level; a UI for this would be perfect though. Keep up the good work!
On my wife's phone, drawing a heart on the screen calls my number
Code:
4:1:(200|280,699|799)
4:1:(0|150,300|500)
4:1:(200|280,300|500)
4:1:(330|480,300|500)
4:1:(200|280,699|799)
It is actually a triangle in the lower part of the screen, but it works with almost any kind of heart figure as long as the action starts from the lowest middle part of the screen.
supercool feature to demonstrate to friends
loved your idea and creativity, you should patent this, some enterprise will want it for sure
Can installed (non-system) user apps in /data/data be linked as the application to call in the gesture?
I'm trying to link the Phandroid app but it doesn't seem to want to run.
I obtained the package name (com.blau.android.phandroidnews) and am trying to run this, but nothing happens.
System apps (camera and browser) are opening OK.
One thing I've noticed: if you define, for example, 3 fingers down from top to bottom, and also an identical gesture but with 2 fingers down from top to bottom, then performing 3 fingers down on the screen triggers both gestures (as both the 2-finger and 3-finger patterns get detected).
Thanks mate
rav4kar said:
Thanks so much, it worked as-is (with a different number, though), dashes and all, from the shell. Greatly appreciate your efforts.
service call phone 2 s16 "111-111-1111"
You don't need the dashes; this will also work:
service call phone 2 s16 "1111111111"
You can also use an international dialing code, like this:
service call phone 2 s16 "+271111111111"
Great work... it's amazing
Heroeagle said:
@gokhanmoral
How can I trigger this gesture? How can I add the phone number to the init.d file?
Here's a script that you need to put in your init.d folder (remember, this is an example):
Code:
#!/system/bin/sh
echo "
# Gesture 1 - Heart gesture. Dial favorite number
1:1:(200|280,699|799)
1:1:(0|150,300|500)
1:1:(200|280,300|500)
1:1:(330|480,300|500)
1:1:(200|280,699|799)
" > /sys/devices/virtual/misc/touch_gestures/gesture_patterns
(while [ 1 ];
do
GESTURE=`cat /sys/devices/virtual/misc/touch_gestures/wait_for_gesture`
if [ "$GESTURE" -eq "1" ]; then
# Replace 133 with the number you want to call
service call phone 2 s16 "133"
elif [ "$GESTURE" -eq "0" ]; then
sleep 2
fi
done &);
what a brilliant idea master tungstwenty.. bowing m(-_-)m
PS: if someone has a good template please share
Installed 4.1beta6 on a Galaxy S2 / i9100 and flashed this sample script to try it.
The phone won't get past the Siyah logo if this script is loaded; I had to wipe the init.d folder from recovery to make my phone boot.
Any help?
I just want this script badly (I want to be able to pull the notification bar down even from the locked screen).
BTW, very great work man, this script looks very promising.
Tungstwenty
Wow... what a great idea. Impressive work! :thumbup:
I am still learning and have been reading (intensively) the Siyah thread to learn as much as I can about kernels, which is how I came to this thread. I have two quick questions, to help me understand a bit better how kernels work; I apologize if the questions sound dumb.
- Since this is done at the kernel level, I assume these gestures take precedence over gestures defined in a launcher or app? For instance, if I define a gesture to trigger an action in, say, Apex Launcher, and the same gesture to trigger a different action via your kernel feature, what would happen? Would the kernel action occur and block the launcher action, or would the kernel action happen, and then the launcher action happen?
- Can these actions be triggered from the lockscreen without having to unlock the phone the "normal" way?
Thanks!
crypticc said:
This is awesome. For a sample, can I ask for a pinch screen with a five finger gesture? And to really p off Apple, let me pull fingers away from the screen and use my fingers, now holding a virtual joystick, to move the gallery in three dimensions. Lol
Seriously though, interested in how you'd go about the five finger pinch.
Thank you
Edit: something like this...?
Code:
1:1:(219|259,0|40)
1:1:(219|259,359|399)
1:2:(0|40,379|420)
1:2:(199|239,379|420)
1:3:(440|480,379|420)
1:3:(240|280,379|420)
1:4:(219|259,860|800)
1:4:(219|259,400|440)
Is there a margin allowed?
You're on the right track but still not quite there.
All these lines are for the definition of gesture number 1 (first token) - good.
You're defining the expected movement / hotspots of 4 fingers, not 5 as you mentioned (second token) - good if a 4-finger pinch is OK instead of 5-finger.
For each of them, the first line defines a BOX where the finger must start (or pass through for the tracking to start) and a second one it must reach. Once all 4 fingers have passed from their initial to their final box - even if they then keep moving - the gesture will fire.
Example: the 1st finger (any of them, really) must pass through the box with X between 219 and 259 (about the middle of the width), and Y between 0 and 40 (top of the screen). Perhaps a 40px by 40px box might be too small or restrictive; you'll need to test it out. Just bear in mind that this is about 1/12 of the total width of the screen (480), and 1/20 of the height (800).
The finger that started there must then proceed to a box near the middle of the screen - same horizontal limits / tolerance, but with Y now just above half (400).
The 2nd finger follows the same logic - from the west side to the middle.
The 3rd as well, from east to the middle.
The 4th is not so good - you have a typo in the minimum Y for the starting position. You meant 760|800 instead of 860|800.
Other than that it should work, although depending on your tests you might want to widen the detection boxes a bit more.
I think your final question is also answered - there is a margin allowed, which is the one you define (in this case 40px wide and tall in every box).