Homing device for Kit

Hi guys, a local team of volunteers here in New Zealand is looking at using an Android Wear watch to develop a navigation application for a blind person, Kit, and possibly for more general use by the visually impaired. Kit likes swimming in the sea, but has difficulty finding his way up the beach to the entrance of a track that leads to his house. So we are considering using a watch to help him navigate: initially via GPS and magnetometer (so that he can orient in the right direction), then via a BLE proximity beacon for the last part of the journey (which needs higher accuracy than GPS can provide to find the start of the track). Ironically Kit, who was a renowned sound engineer on many NZ films, uses hearing aids, which he can't wear in the water, so he is also effectively deaf when he emerges!
We have never programmed a watch before so have some fundamental questions:
- Is what we want to do achievable using Android Wear? In other words, is it possible to develop significant applications that access watch resources including GPS, the vibrator, buttons, BLE, and the magnetometer?
- Is there likely to be sufficient ROM/RAM to implement a program that does location and compass calculations to guide Kit between a series of waypoints? Clearly it depends, but it could be quite a complex program with floating-point calculations (a rough sketch of the kind of maths we mean follows after these questions).
- Will we effectively be able to take over the user interface to use just keys and tactile feedback from the vibrator?
Clearly the screen will be fairly useless to Kit, and his hearing problem means that tones won't help!
- Will we be able to program the Bluetooth Smart functionality for proximity beacon detection?
- Has anything like this been done already? If so, references and contacts would be really helpful.
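
To make the floating-point question concrete, below is roughly the calculation we have in mind: a minimal Kotlin sketch assuming the standard Android Location and SensorManager APIs are available on the watch (turnToWaypoint is just our name for it).

```kotlin
import android.hardware.SensorManager
import android.location.Location

// Hypothetical helper: how far (in degrees) Kit needs to turn to face
// the next waypoint. Positive = turn right, negative = turn left.
fun turnToWaypoint(
    here: Location,
    waypoint: Location,
    rotationVector: FloatArray  // values from a TYPE_ROTATION_VECTOR event
): Float {
    // Great-circle bearing from the current fix to the waypoint, 0..360.
    val target = (here.bearingTo(waypoint) + 360f) % 360f

    // Current heading (azimuth) from the rotation-vector sensor, which
    // fuses the magnetometer with the accelerometer/gyro.
    val rotation = FloatArray(9)
    SensorManager.getRotationMatrixFromVector(rotation, rotationVector)
    val orientation = FloatArray(3)
    SensorManager.getOrientation(rotation, orientation)
    val heading = (Math.toDegrees(orientation[0].toDouble()).toFloat() + 360f) % 360f

    // Signed difference, normalised to -180..180.
    var diff = target - heading
    if (diff > 180f) diff -= 360f
    if (diff < -180f) diff += 360f
    return diff
}
```

The idea would be to map that signed difference onto distinct left/right vibration patterns until it is close to zero.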
The watch requires:
Waterproof, as Kit will be wearing it whilst swimming.
Accurate GPS, for initial navigation between waypoints.
Magnetometer, for orientation and course corrections.
Vibration patterns, to communicate with Kit.
Key controls, preferable to touch screen (although very coarse touch-screen controls might also be useful).
Gyroscope, ideally, for gesture control.
Bluetooth Smart, for beacon scanning and proximity detection during the last part of the journey (this is critical; a rough sketch follows below).
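
To show what we have in mind for the beacon stage, here is a minimal Kotlin sketch, assuming the standard android.bluetooth.le scanning API and the watch's vibrator are both accessible to apps (the RSSI thresholds and the TrackBeaconScanner name are ours, and the Bluetooth/location permissions are omitted):

```kotlin
import android.bluetooth.BluetoothManager
import android.bluetooth.le.ScanCallback
import android.bluetooth.le.ScanFilter
import android.bluetooth.le.ScanResult
import android.bluetooth.le.ScanSettings
import android.content.Context
import android.os.Vibrator

// Hypothetical helper: scans for the track-entrance beacon and buzzes
// more urgently as Kit gets closer (RSSI rises toward zero as he nears).
class TrackBeaconScanner(context: Context) {

    private val vibrator =
        context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    private val scanner =
        (context.getSystemService(Context.BLUETOOTH_SERVICE) as BluetoothManager)
            .adapter.bluetoothLeScanner

    private val callback = object : ScanCallback() {
        override fun onScanResult(callbackType: Int, result: ScanResult) {
            // RSSI is a crude distance proxy: roughly -90 dBm far, -50 dBm near.
            val gap = when {
                result.rssi > -60 -> 200L    // very close: rapid pulses
                result.rssi > -75 -> 600L    // getting warmer
                else              -> 1500L   // still far off
            }
            // Repeating pattern: wait `gap` ms, buzz 150 ms, repeat from index 0.
            // Each new scan result replaces the previous pattern.
            vibrator.vibrate(longArrayOf(gap, 150), 0)
        }
    }

    fun start(beaconMac: String) {
        val filter = ScanFilter.Builder().setDeviceAddress(beaconMac).build()
        val settings = ScanSettings.Builder()
            .setScanMode(ScanSettings.SCAN_MODE_LOW_LATENCY)
            .build()
        scanner.startScan(listOf(filter), settings, callback)
    }

    fun stop() {
        scanner.stopScan(callback)
        vibrator.cancel()
    }
}
```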
Other comments / views on what we are wanting to achieve would be appreciated.
Best wishes, Ron.

Related

[Q] Low OSC/MIDI (no audio) latency possible?

Hello,
I'm new to mobile development, but I have a computer science background. I want to build a concept for a real-time interface for audio interaction, so really low latency is essential. I have read a lot about this topic in the last week, but it's impossible to get a real overview in such a short time. So before I dig really deep, I need to evaluate the feasibility of this project.
I've read a lot about the problems with Android and low-latency audio applications, and I know that iOS is way superior in this field, but I would prefer Android if possible. (I don't have a Mac, which is essential for developing on iOS, and I just like the more 'open' Android more.)
So my questions are:
1. Is the poor latency behaviour only related to actual audio buffering/processing, or does it affect MIDI- or OSC-only applications too? For me the interface is the important part, which means it's fine for me to just send MIDI/OSC commands that are processed on a separate PC (see the sketch after these questions).
2. I need a low overall latency (from the finger touch event to command output). Is this also a problem with Android compared to iOS?
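
For context on question 1: as far as I understand, an OSC command is just a small UDP datagram, so the audio buffer path shouldn't be involved when only control messages leave the device. A minimal hand-rolled Kotlin sketch of what I mean (the host, port, and /fader address are made up):

```kotlin
import java.io.ByteArrayOutputStream
import java.net.DatagramPacket
import java.net.DatagramSocket
import java.net.InetAddress
import java.nio.ByteBuffer

// Pads an OSC string with NULs to a multiple of 4 bytes, per the OSC 1.0 spec.
private fun oscString(s: String): ByteArray {
    val raw = s.toByteArray(Charsets.US_ASCII)
    val padded = ByteArray((raw.size / 4 + 1) * 4)  // always at least one NUL
    raw.copyInto(padded)
    return padded
}

// Sends a single OSC message with one int32 argument over UDP.
fun sendOsc(host: String, port: Int, address: String, value: Int) {
    val out = ByteArrayOutputStream()
    out.write(oscString(address))                            // e.g. "/fader"
    out.write(oscString(",i"))                               // type tag: one int32
    out.write(ByteBuffer.allocate(4).putInt(value).array())  // big-endian payload
    val bytes = out.toByteArray()
    DatagramSocket().use { socket ->
        socket.send(DatagramPacket(bytes, bytes.size, InetAddress.getByName(host), port))
    }
}

// Example: sendOsc("192.168.0.10", 8000, "/fader", 64)
```

So my worry is less about building the packet and more about the touch-to-send path in question 2.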
kind regards,
audio developer

[Q] Remote control Parrot Asteroid Smart via iPad

Hello XDA... I have an Asteroid Smart installed in the fairing of my 2013 Harley-Davidson CVO Road Glide. My fairing also has the option of placing an iPad mini in front of the double DIN to access all things Apple/iPad. The Asteroid Smart is controlled via handlebar controls through the Paser Unika, utilizing the resistive protocols (that was custom, too). The problem comes in when I have the iPad mounted in front of the Asteroid. Obviously I can't see the Asteroid at that point, so I have to try to remember push-button sequences on my handlebars to control media sources, playlists, etc. I also can't see the navigation app on the Smart. Before you suggest that I just use the iPad for navigation: I can't, as I only have the Wi-Fi version of the iPad, which only has A-GPS for location information, not an actual GPS chip, so it's not really usable for navigation. Further, I find that I don't really like controlling the Asteroid without seeing feedback on the screen in front of me.
The ideal situation would be to have VNC remote control of the Asteroid via the iPad. Actually, any remote control of the Asteroid via the iPad would be great, but I haven't been able to find any remote-control solution for an Android device that is controllable from an iPad. Sure, I can find ones that allow control of Android from a PC, Mac, or Linux box, but not from another tablet.
Interestingly enough, in talking with VNC, they have a VNC Mobile Server SDK that would allow for this very type of functionality. It was designed to allow remote control of Android devices via double-DIN head units. However, they don't release it to the general public, only to OEMs, automotive companies, and tier-1 suppliers... ughhh. However, they did inform me that Parrot is a licensee of this technology. (I've posted this on the Parrot forum too, to see if they can help.) Can anyone offer some insight or suggestions on this problem?
Thanks in advance!

[Q] Voice controlled home automation

Hi,
I am looking at installing a home automation system in a client's house using a Bitwise controller. I was thinking about installing Android tablets as an in-wall solution via Wi-Fi, rather than expensive light switches (Rako, Lutron, etc.) and multi-room controllers, as you seem to be able to pick up 7" tablets quite cheaply. Stairs will still have standard switches.
I am aware of systems such as Crestron and AMX (we already install AMX), but I think there is a cheaper way to achieve this.
I understand the tablets aren't going to be the best, but all they will be doing is running a Bitwise GUI designed for Android tablets, so very little load.
I will have to make various changes to settings such as auto-sleep times, wake-up screens, etc., which I will address in a separate post. What I was looking for from this thread was to see how I can harness voice control, which brings two questions.
The first is: what is the typical range of the mic built into these tablets, and is there any way to enhance it?
The second is whether there is an app that can provide voice control of something like the Bitwise BC1 controller, or am I trying to achieve something that isn't particularly easy? (A rough sketch of what I mean follows below.)
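
To make the second question concrete, here is a minimal Kotlin sketch of the kind of glue I am imagining, using Android's stock SpeechRecognizer; how the recognised phrase actually gets delivered to the BC1 is exactly the open part (VoiceCommander is a made-up name, and the RECORD_AUDIO permission is omitted):

```kotlin
import android.content.Context
import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer

// Hypothetical glue: listens for a phrase and hands the recognised text
// to a callback, which would then translate it into a controller command.
class VoiceCommander(context: Context, private val onCommand: (String) -> Unit) {

    private val recognizer = SpeechRecognizer.createSpeechRecognizer(context).apply {
        setRecognitionListener(object : RecognitionListener {
            override fun onResults(results: Bundle) {
                results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                    ?.firstOrNull()
                    ?.let(onCommand)   // e.g. "lights off in the kitchen"
            }
            // Remaining callbacks unused in this sketch.
            override fun onReadyForSpeech(params: Bundle?) {}
            override fun onBeginningOfSpeech() {}
            override fun onRmsChanged(rmsdB: Float) {}
            override fun onBufferReceived(buffer: ByteArray?) {}
            override fun onEndOfSpeech() {}
            override fun onError(error: Int) {}
            override fun onPartialResults(partialResults: Bundle?) {}
            override fun onEvent(eventType: Int, params: Bundle?) {}
        })
    }

    fun listen() {
        val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).putExtra(
            RecognizerIntent.EXTRA_LANGUAGE_MODEL,
            RecognizerIntent.LANGUAGE_MODEL_FREE_FORM
        )
        recognizer.startListening(intent)
    }
}
```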
I would be grateful for any guidance.
Craig

Is there a way to use an Android as a "gesture based" mouse for my PC?

So, as the title says... I randomly developed carpal tunnel in my wrist and it's insanely painful to type, which led me to look for other ways to interact with my PC. Funnily enough, Android is light years ahead of Windows in this respect. In Windows you can enable other methods of input, such as touch screen, voice commands, etc. Then it occurred to me that my Android has a fairly accurate gyroscope in it.
Would it not be feasible to somehow link my phone to my computer and then use gestures (not touching the screen, but moving the phone up, down, left, or right) to control the mouse? Newer devices have fairly accurate facial-recognition technology; I mean, there are apps on the Play Store that let you control your mouse with just your eyeballs, and they work, so why wouldn't gesturing work? (Rough sketch below.)
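
To make it concrete, here is a minimal Kotlin sketch of the phone side of what I'm imagining: read the gyroscope and fire pixel deltas at a small receiver on the PC that would actually move the cursor (the address, port, scale factor, and GyroMouse name are all made up, and the PC-side receiver is not shown):

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.os.Handler
import android.os.HandlerThread
import java.net.DatagramPacket
import java.net.DatagramSocket
import java.net.InetAddress

// Hypothetical phone side: turn gyroscope rotation rates into mouse deltas
// and send them to a listener on the PC as plain "dx,dy" UDP packets.
class GyroMouse(private val sensorManager: SensorManager) : SensorEventListener {

    private val socket = DatagramSocket()
    private val pc = InetAddress.getByName("192.168.1.50")  // made-up PC address

    // Deliver sensor events on a background thread so the UDP send
    // doesn't run on the main thread (Android forbids that).
    private val thread = HandlerThread("gyro-mouse").apply { start() }

    fun start() {
        val gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)
        sensorManager.registerListener(
            this, gyro, SensorManager.SENSOR_DELAY_GAME, Handler(thread.looper)
        )
    }

    override fun onSensorChanged(event: SensorEvent) {
        // values[0]/values[1] are rotation rates (rad/s) around the x/y axes.
        // The scale factor is arbitrary; tune it for pointer speed.
        val dx = (-event.values[1] * 40f).toInt()
        val dy = (-event.values[0] * 40f).toInt()
        if (dx == 0 && dy == 0) return
        val payload = "$dx,$dy".toByteArray()
        socket.send(DatagramPacket(payload, payload.size, pc, 5555))
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}

    fun stop() {
        sensorManager.unregisterListener(this)
        thread.quitSafely()
    }
}
```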
I've looked around and I don't see anything like that. I demand answers.

[DEV] Google Assistant remote control App (headset style) for Amazfit Verge

Hi all,
I'm playing with the thought of developing an app for the Verge that uses the speaker, mic, and Bluetooth connection of the watch to interact with the Google Assistant on the connected phone, similar to the functionality of a Bluetooth headset.
I'm thinking of a widget with just one button that starts the interaction, basically the equivalent of a long button press on a headset (a rough sketch of the phone-side part follows below).
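
On the phone side I'm imagining something as simple as firing the same voice-command intent that a headset's long press effectively triggers. A minimal Kotlin sketch under that assumption (how the watch signals the press over Bluetooth, and the audio routing back to the watch, are exactly the open parts):

```kotlin
import android.content.Context
import android.content.Intent

// Hypothetical phone-side trigger: when the watch app signals a button
// press (transport not shown), bring up the default assistant with the
// stock voice-command intent, roughly what a headset long-press does.
fun launchAssistant(context: Context) {
    val intent = Intent(Intent.ACTION_VOICE_COMMAND)
        .addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
    context.startActivity(intent)
}
```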
My issue is that I am new to Android development and so far have only taken the first infant steps (creating a test app with buttons and text fields) on the Verge. My background is C/C++ microcontroller development...
This is why I'm searching for help here from more experienced devs who might be interested in this as well.
What are your thoughts, is something like this feasible?
Thanks and regards,
Nacken
