My question has been asked before; maybe it was in the wrong forum, or maybe I didn't ask it the right way.
I'll try my best.
I have developed an application; it's almost finished, but there is one more feature I need to add.
Detect keyboard events from a laptop/desktop on an Android smartphone
I have tried Bluetooth, USB, NFC, etc., but I just can't quite nail the solution; I've been at it for around five days now. When the smartphone is plugged into a laptop/desktop, I'd like to be able to detect keyboard events (or any type of event) from the laptop/desktop in my Android app. The device runs Android 4.2.2.
The idea is to log events, as the Android app (installed on a dedicated device) will always be connected to a laptop/desktop during use. The events, depending on what they are, will trigger certain actions on the device and within the application, if that makes sense. So far all is going well... it's just this one feature which I cannot nail.
The phone is rooted, USB host mode is enabled, and it recognises that it has been connected over USB. But I can't for the life of me find the device (in my development setup, a MacBook Pro), nor can I detect any type of event triggered from the laptop.
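For reference, this is roughly what enumerating attached USB devices looks like from an Android app once the phone is in host mode; the activity name here is just for illustration. One thing worth noting: a laptop or desktop normally acts as a USB host itself, so it may simply never appear in this list, which could explain why nothing shows up.

```java
import java.util.HashMap;

import android.app.Activity;
import android.content.Context;
import android.hardware.usb.UsbDevice;
import android.hardware.usb.UsbManager;
import android.os.Bundle;
import android.util.Log;

public class UsbDeviceListActivity extends Activity {
    private static final String TAG = "UsbDeviceList";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // List everything the phone can see while acting as the USB host.
        UsbManager usbManager = (UsbManager) getSystemService(Context.USB_SERVICE);
        HashMap<String, UsbDevice> devices = usbManager.getDeviceList();
        Log.d(TAG, "USB devices visible in host mode: " + devices.size());
        for (UsbDevice device : devices.values()) {
            Log.d(TAG, device.getDeviceName()
                    + " vendor=" + device.getVendorId()
                    + " product=" + device.getProductId());
        }
    }
}
```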
Is this possible? If so, can anyone please point me in the right direction or add links to resources I can read up on?
Many thanks in advance.
This problem is slowly becoming demoralizing, as I cannot finish the app without this feature.
Again, many thanks to anyone willing to shed some light on a possible solution. USB would be perfect, but am I missing something?
Regards
Anyone?
To end these endless, sleepless nights
Much appreciated guys!
D
Related
I was wondering if anyone had thought about the idea of turning your phone into essentially a second touch-screen monitor for your computer. You could do it wirelessly, but I think a USB connection that charges the phone at the same time would be just as good, in some ways better.
Essentially one you could extend your desktop to, and make work like any other monitor, just with touch input.
It's called LogMeIn or VNC.
Unless I am mistaken, you need to reread my post carefully. I do *NOT* want to control my PC through my phone. I want my phone to *BE* a monitor for my PC, with touch input. Last I checked, either of those programs only lets me control my PC through my phone, not act as a secondary display. If I am wrong, feel free to inform me.
LogMeIn Ignition is close, but it's still not what I want. The goal is to have the phone be nothing more than a monitor and HID for the computer it's hooked up to.
I doubt it's possible. If you haven't noticed, on the back of your monitor there are some cables (if you're using a laptop, they're inside); those carry what your computer outputs to the monitor, and they cannot be connected to a phone. It's just not possible.
The resolution wouldn't work either; it's way too small.
It might be possible, but I highly doubt it.
Totally possible through USB; the point is not to have a huge display but a mini display.
USB monitor example:
http://www.thinkgeek.com/computing/usb-gadgets/bfa3/
No offence, but that's typical of this forum apparently. Instead of throwing out FUD, please answer only if you have the knowledge or have read the actual post thoroughly. Thank you.
It could theoretically be done through Linux. The problem would be the resolution conversion, and that would have to be done on the computer side: convert size/resolution/format, then send. The other problem would be getting it to stream without delay or lag. Normal text would probably be OK, but video, I think, would get delayed, so looking at the monitor and the phone side by side you would experience delay and a speed differential.
I do not know of an app that does that, but my brother wrote an executable for Sun that did this. So it can be done... Sorry I am not much help.
Perhaps I am not being clear, or people are just not used to the way multiple-monitor systems work under Windows. I'm not sure which, so I will try to clarify what I am talking about and provide a couple of examples.
I want to know if anyone has developed, or is interested in developing, a way to have an Android device be used as a secondary monitor, with touch input, for your PC (preferably a Windows PC).
What do I mean by secondary monitor?
An independent display that is able to use its native resolution, and not be a duplicate of your monitor.
Examples currently available:
Here is a website that has many different kinds of USB monitors.
http://www.mimomonitors.com/
Final note:
The goal is to have a small display that can be taken advantage of when you want to use it, while charging your phone. Given Android's ability to multitask, you would not lose access to the phone while doing this.
Application purposes:
Display chat output, music, Ventrilo, and web pages while in a game or another landscape-intensive task on your computer.
Why?
Our phones sit beside us while we are on the computer, and for the most part we don't utilize them while they are there. Why not make them useful while they charge? People have been purchasing multiple displays or mini displays for many years now, and I think it would be awesome if we could use our phones for that purpose without having to go out and buy a new device.
Yeah I would love a feature like this.
Sent from my SGH-T959 using XDA App
I'm developing a touch-screen based system for controlling electronic music. As part of the development, we'll be building our own touch screen, but that's not going to be ready for some time. In the meantime, I need to start writing the software (which will be done in Java), and I'm going to need a touch screen to use for testing.
So, I am NOT trying to write an application for the Galaxy Tab. I am writing an application that runs on my desktop, and I'd like it to be able to get touch information from the Galaxy, in any way practical. I've looked into using an iPad for this, but it looks to be too much of a pain to be worth it. All I need is a way for my Java application to receive the list of coordinates of touches from the Tab, in real time. I don't need any higher-level gesture interpretation (as I'll have to do that on my end for the final system anyway), just all the touch coordinates. Does anyone have a suggestion on the best way to go about this? Is there something in existence already to accomplish this easily, or is there any kind of Java library I can use to make calls to a connected Tab from my application? I've been googling around, but haven't found any particularly useful information on the subject, as the Tab is chiefly meant to be a stand-alone item, not a PC peripheral. Any tips on where I might start looking would be a huge help. Thanks!
-cullam
cullambl said:
I am writing an application that runs on my desktop, and I'd like it to be able to get touch information from the Galaxy, in any way practical... All I need is a way for my Java application to receive the list of coordinates of touches from the Tab, in real time.
OK, well, I'm going to try to be brief and not turn this into an Android programming essay, so here goes.
You have a couple of different routes you can take.
1. If you use Eclipse for development and you hook up your tablet, you can watch the log (the LogCat view) and see that it prints useful information constantly, basically debug output that tells you what's going on in the background. If you just want to look at it, you can probably see it there.
2. This would be my choice, but I'm a programmer, so I love a new adventure. I would recommend you just write a quick app for your tablet that pumps out the location of a touch whenever you touch the screen. If you are familiar with sockets and such, you can write a simple server Java app that collects packets of data from your tablet, and have the tablet send out a multicast packet containing the coordinates every time you touch the screen (a rough sketch of this idea follows below).
There are probably some other ways, but if you are already going to be doing the bulk of the project in Java, you aren't looking at a difficult learning curve to write a basic little Android app.
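To make option 2 concrete, here is a rough sketch of the tablet side, assuming plain unicast UDP rather than multicast for simplicity; the activity name, host address, and port are placeholders, and the app needs the INTERNET permission in its manifest.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.Charset;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import android.app.Activity;
import android.view.MotionEvent;

public class TouchSenderActivity extends Activity {
    // Placeholders: use your desktop's LAN address and a free port.
    private static final String DESKTOP_HOST = "192.168.1.10";
    private static final int DESKTOP_PORT = 5005;

    // Network calls must stay off the UI thread, so sends are queued here.
    private final ExecutorService sender = Executors.newSingleThreadExecutor();

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Send every touch position as a plain "x,y" text packet.
        final String message = event.getX() + "," + event.getY();
        sender.execute(new Runnable() {
            @Override
            public void run() {
                try {
                    DatagramSocket socket = new DatagramSocket();
                    byte[] payload = message.getBytes(Charset.forName("UTF-8"));
                    socket.send(new DatagramPacket(payload, payload.length,
                            InetAddress.getByName(DESKTOP_HOST), DESKTOP_PORT));
                    socket.close();
                } catch (Exception e) {
                    // Ignore transient network errors in this sketch.
                }
            }
        });
        return true;
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        sender.shutdown();
    }
}
```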
Thanks! I'll definitely try the Eclipse trick. And yeah, writing an app on the Tab is probably going to be necessary, but MUCH easier than having to learn a new language and get an official license to do one on the iPad. The thing I'm really unsure about is the available communication methods for getting data back and forth between them. I was hoping there might be some sort of Java API to get calls going through the USB connection. So I guess I'll see what the Eclipse hookup shows me.
cullambl said:
I was hoping there might be some sort of Java API to get calls going through the USB connection. So I guess I'll see what the Eclipse hookup shows me.
Apple stuff is crap anyway; leave them to their pretentious commercials and closed-minded development.
As far as the Android SDK goes, I think it will take you a lot less time to just use network communications. Google socket client/server Java tutorials and you should be set to go in about two hours. I have implemented it; it's all straightforward, and IMHO probably an easier app to write than something that pumps data out of the USB port.
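And a matching sketch of the desktop end: a tiny Java program that listens on the same placeholder port as the tablet sketch above and prints each coordinate pair it receives.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;

public class TouchReceiver {
    public static void main(String[] args) throws Exception {
        // Listen on the port the tablet app sends to (placeholder value).
        try (DatagramSocket socket = new DatagramSocket(5005)) {
            byte[] buffer = new byte[64];
            while (true) {
                DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                socket.receive(packet);
                String message = new String(packet.getData(), 0, packet.getLength(), "UTF-8");
                String[] xy = message.split(",");
                System.out.println("touch at x=" + xy[0] + " y=" + xy[1]);
            }
        }
    }
}
```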
Awesome, thanks
I'd like to control an Android mouse cursor with the camera and hand tracking (or a colored LED), just by moving my hands; it should work as if I had plugged in a real mouse.
After a long, extensive search, I found only one app that does exactly what I want, but it's head-tracking based, so it's no use for me: [MouSense for Android]
The funny thing is that there's a plethora of hand-tracking solutions like this for PC/Mac/Linux with full source code available. I'm surprised no one has ever made one for Android; I found some hand gesture recognizers, but I need to control the whole device, not just detect a limited set of hand gestures.
I bet many disabled people would benefit from it, like controlling an Android device connected to a TV. Can anyone help me or find a solution (I tried REALLY hard), please? I'd be very grateful if you can help me or even just suggest a solution, thanks!
Hi guys, I've been an avid reader of the XDA forums, and they're great; so far they have been an awesome source of knowledge, and up until now I had found everything I needed...
Long story short, I want to be able to use the sensors in my smartphone (rooted Moto G XT1032, 16 GB) in the open-source software Processing. Could somebody please point me in the right direction?
Basically I want to:
Connect my phone to my PC via BT or WiFi
Read the values of the sensors within my phone and transmit them to my PC (a rough sketch of this part follows after this list)
Somehow read those values in Processing and create nice graphs and stuff
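For what it's worth, here is a rough Android-side sketch of point 2, assuming you roll your own sender over Wi-Fi instead of using an existing app: it streams accelerometer readings as comma-separated ASCII lines to a TCP socket. The class name, host address, and port are placeholders, and the app needs the INTERNET permission.

```java
import java.io.PrintWriter;
import java.net.Socket;

import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.os.Handler;
import android.os.HandlerThread;

public class SensorStreamActivity extends Activity implements SensorEventListener {
    private static final String PC_HOST = "192.168.1.20"; // placeholder: your PC's LAN address
    private static final int PC_PORT = 6000;              // placeholder port

    private SensorManager sensorManager;
    private HandlerThread sensorThread;
    private volatile PrintWriter out;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);

        // Keep networking and sensor callbacks off the main thread.
        sensorThread = new HandlerThread("sensor-stream");
        sensorThread.start();
        Handler handler = new Handler(sensorThread.getLooper());
        handler.post(new Runnable() {
            @Override
            public void run() {
                try {
                    Socket socket = new Socket(PC_HOST, PC_PORT);
                    out = new PrintWriter(socket.getOutputStream(), true);
                } catch (Exception e) {
                    // No connection: readings below are silently dropped.
                }
            }
        });

        Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        if (accel != null) {
            sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_UI, handler);
        }
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // One comma-separated ASCII line per reading: x,y,z
        if (out != null) {
            out.println(event.values[0] + "," + event.values[1] + "," + event.values[2]);
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        sensorManager.unregisterListener(this);
        sensorThread.quit();
    }
}
```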
I know how to program in Processing. I've looked into the Amarino project, but it requires the Arduino part, and while I know how to interface Arduino and Processing (I've done some PID control and other things with them), I want to skip the whole Arduino side (at least for now).
Could somebody please point me in the right direction?
Thanks in advance, cheers!
Update 17/09/2014: I'm currently looking into the SensoDuino project, which apparently will solve my problems on the Android front.
I tested it following the video on the project's main page, in the section "Pairing and Establishing a Serial Connection between Windows 7 and SensoDuino" (sorry, I still can't post links), and I have been able to read the data using Tera Term.
Just mind the COM ports, and make sure you are actually transmitting over Bluetooth; you can check this in the Services tab of the paired device's properties within Windows. (If anybody wonders why not Linux, which I use the most, where I do my main work and where this would be pretty much a native feature: I intend to use this with Windows-only software.)
Special thanks to TheBeano!
P.S. Depending on what this forum's rules regarding solved threads are, I'll either update this thread or publish whatever the outcome of this project is. Cheers!
Hectormd said:
Long story short, I want to be able to use the sensors in my smartphone (rooted Moto G XT1032, 16 GB) in the open-source software Processing. Could somebody please point me in the right direction?
Look at SensoDuino; they have a demo of connecting to Windows 7 via Bluetooth serial. Once you have a Bluetooth serial connection, you can use Processing's serial comms to read the phone's sensors. You could do the same sort of thing with Amarino, that is, just read from the serial port, but it might be more work to decode the binary protocol it uses from Processing. SensoDuino sends ASCII, I think.
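For anyone following along, here is a minimal Processing sketch of the receiving side. The COM port name and baud rate are placeholders, and the exact field layout of SensoDuino's ASCII lines should be checked against its documentation.

```java
// Processing sketch: read ASCII lines arriving over the Bluetooth serial port.
import processing.serial.*;

Serial phone;

void setup() {
  size(400, 200);
  println(Serial.list());                    // list ports to find the right one
  phone = new Serial(this, "COM5", 115200);  // placeholder port name and baud rate
  phone.bufferUntil('\n');                   // fire serialEvent once per line
}

void draw() {
  // Graphing would go here; the data itself arrives in serialEvent below.
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line == null) return;
  line = trim(line);
  if (line.length() == 0) return;
  // Assume comma-separated fields per line; check SensoDuino's docs for the exact order.
  String[] fields = split(line, ',');
  println(fields);
}
```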
TheBeano said:
Look at SensoDuino; they have a demo of connecting to Windows 7 via Bluetooth serial... SensoDuino sends ASCII, I think.
Thanks a lot; it looks exactly like the thing that could fulfil my needs.
I'll update this post or publish whatever I come up with. Cheers, and thanks again!
Hi guys
I'm not 100% sure if this is the right forum area; please pardon me if it isn't.
I'm coming from Symbian OS (Nokia) and have used their Nokia Ovi Suite/PC Suite for YEARS, and it's been terrific. Obviously this is the end of the road for it, since Nokia/Microsoft is not giving their desktop software much attention. Now I've migrated to Android and I need an alternative to Nokia PC Suite.
Yes, I've googled and searched and found a lot of so-called alternatives, but I wanted to save time instead of installing all these options and testing them myself (and I have tested a few already).
What I need is something like Nokia PC Suite, which means:
1) Very good at phonebook and SMS management (sending/receiving)
2) Can connect via either USB or wireless (Bluetooth preferred)
3) Has a real DESKTOP application, not limited by being browser-based (like a lot of the options out there)
4) NOT cloud-dependent; offline-capable
A lot of the options I've tried, such as AirDroid, fall short on at least one of the above requirements. The only software I tried that came pretty close to what I really want is MoboRobo, but it has some shortcomings, like being unable to send multi-part SMS (the software splits it up), and it needs to scan a QR code every time I want to connect wirelessly (it doesn't automatically remember the last connection).
So I'm continuing the hunt, and hopefully some of you more experienced Android users can shorten the journey for me.
Thanks everyone..
Hmm, I guess I'm stuck with MoboRobo.