Hello all, I have a question about developing an application for my graduation project. The idea is to develop an application in which the various books/information sources get a place and can be organised under different semesters/years. The information has to be easy for teachers to update (probably web-based). The application also has to work on iOS. Can you give me an idea of how to start? I think HTML5 is a good option.
For a cross-platform application you can try PhoneGap (phonegap.com) with Dojo (dojotoolkit.org).
Front end is only half!
killerbee12345 said:
Hello all, I have a question about developing an application for my graduation project. ...
HTML5 will be great, but a native app will be difficult for a school to update. I suggest simply using a mobile-ready website. jQuery Mobile is great for this, and it handles all the cross-platform issues. Have you given any consideration to server-side code? PHP, .NET, and Node.js are all great options. Talk with an administrator about what kind of system they currently use to store files and links. You could fairly simply make a page that scans a file share they maintain to build the page.
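To make the "scan a file share to build the page" idea concrete, here is a minimal shell sketch (the semester/file layout is an assumption; a PHP version on a Linux host would do the same thing with scandir()):

```shell
#!/bin/sh
# Sketch: turn a file share into an HTML list the phone can load.
# Assumed layout: one sub-directory per semester, with the books/sources
# as plain files inside. Paths are illustrative, not a real share.
render_share() {
  share=$1
  echo "<ul>"
  for sem in "$share"/*/; do
    [ -d "$sem" ] || continue
    printf '  <li>%s<ul>\n' "$(basename "$sem")"
    for f in "$sem"*; do
      [ -f "$f" ] && printf '    <li><a href="%s">%s</a></li>\n' "$f" "$(basename "$f")"
    done
    echo '  </ul></li>'
  done
  echo "</ul>"
}
```

The point is that the teachers only ever manage plain files on the share; the page rebuilds itself from the directory tree.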
Are all current (including budget) phones capable of running .NET or PHP? I'll probably host the site/app myself for a while during the test phase, so if I go web-based it will be PHP, since my host is a Linux host. I'll have a look tomorrow. I'm studying to be an officer on a ship, not a programmer, so all programming is new to me; I did some VB/PHP but nothing fancy.
I started with Icenium, but found out that Icenium doesn't let you store the web files on your own server. So that's not an option.
I have to start Android programming for a project.
The project involves using the App to remote control a camera attached to the pc which is running the web-server.
The App does two things:
1) Send user-entered parameters for the photo capture to a web server.
These are relayed to the web server (using gSOAP or another web service).
The web server then calls the program I create on the server, which captures an image from the camera.
The program also has other functionality like image manipulation, making a timelapse movie etc using these captured images.
The program sends back to the app a video file or image.
2) Receive a video file (either saved in Gallery, or viewed inside App).
As you can see, the bulk of my project will be not on the Android side but on the server.
My question then is: how do I approach Android programming?
I want to make the best use of my time and not have to learn the A-Z of Android programming. This is not me being lazy (I have done a few chapters of Sams Teach Yourself Android and completed Google's online Android tutorial), just wanting to spend what little time I have wisely.
So i suppose my question is:
1. What areas of Android should I concentrate on for this app (e.g. web services, drawing buttons, etc.)?
2. Any resources that would help me in this quest (e.g. templates)?
Apologies if this was long and drawn out, and thanks for reading this far.
What are the requirements of Android app development?
What am I supposed to know? Right now I just know C++; is that sufficient?
Any help will be appreciated.
Hi friends,
I need advice on a pressing issue I am facing right now about an Android app I had developed through a freelancer.
I am the webmaster of a website for numerology enthusiasts. On this website, we were offering a numerology calculator (basically a combination of HTML pages with some JavaScript embedded, all compiled into an .exe). Later, on some suggestions, we decided to prepare an Android version of this tiny program by hiring a programmer from South India. It was 2011, and Gingerbread was the prevalent Android platform. The programmer created the app and we published it on Android Market, where it is still available (search for com.namecalculator.lite on the Play Store; the first result, 'Your Lucky Name', is the app in question).
The problem is that this app was not compatible with later versions of Android. As such, after some time, when the ICS version of Android was launched, the app stopped working on ICS devices. As of now, except on some old Android devices, this app is useless.
When I contacted the guy who originally developed this app, he told me that he had not saved the source files of the app, and as such he expressed his inability to do anything about it. He told me that if I wanted him to develop the app again for later versions of Android (like ICS, Jelly Bean, etc.), I would have to pay him the full development fee, as he would have to start again from scratch.
Since my website is only a hobbyist website with negligible revenue, it was not possible for me to hire this programmer again just to develop an upgraded version of the app.
As of now, a very popular part of my website (the app) has become unavailable to its intended users. Against this background, I want guidance on the following:
(1) If an app is already built for an earlier Android version, does making it compatible with future/latest versions of Android require the same amount of energy and effort as developing the app the first time?
(2) Since the app in question is basically a compilation of HTML files with some JavaScript embedded in some pages, will it really be difficult to reconstruct the app if the source files of the earlier app have been lost? (I still have the raw HTML pages with me.)
(3) I am not a programmer but have experience of web design, creating blogs, etc. Can I teach myself to create the above-mentioned app by reading and following online tutorials? If yes, how long in your opinion would it take an average learner (with no programming background)? Also, kindly point me to some good tutorials.
(4) Any other advice some of you might have on the above issue?
Regards
Eklavya
Project Treble is a great help in getting access to vendor-specific HALs; I'll try to explain how, and how to exploit it.
Presentation of vendor HIDLs
Thanks to Project Treble, all HALs must be defined through HIDL.
Standard AOSP HALs being defined through HIDL means that a generic AOSP system works with standard AOSP features.
But, vendor HALs are also going through HIDL!
APIs defined through HIDL are stable, versioned, hashed and easy to access.
Just plug the HIDL inside your build system, and you get easy access from your applications to the HAL behind the HIDL!
Also, APIs defined through HIDL are supposed to be clean, and mostly self-documented, so getting the HIDL can help understand how the HAL works.
To understand how easy it is, here is real-world client usage of an HIDL:
Code:
// Get a handle on the vendor fingerprint HAL, then enable navigation events
IExtBiometricsFingerprint service = IExtBiometricsFingerprint.getService();
service.sendCmdToHal(NAV_ON);
With the HIDL, enabling gestures on the fingerprint sensor on Huawei devices is a two-liner! [1]
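For completeness, hooking the generated client library into an app build is a one-line dependency; a Soong sketch (every name below is an assumption; the real library name comes from the HIDL package and version, typically `<package>-V<version>-java`):

```
// Android.bp sketch; module and library names are assumptions, not the
// real Huawei package. The build system generates the client library
// from the .hal files.
android_app {
    name: "FingerprintGestures",
    srcs: ["src/**/*.java"],
    static_libs: ["vendor.huawei.hardware.biometrics.fingerprint-V2.1-java"],
}
```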
Browsing vendor HIDLs
Now, the problem is that HIDLs are part of the source code, not the firmware.
So if the vendor doesn't publish it, there is some additional work to do.
The Android build system is capable of generating two client APIs for HALs defined with HIDL: a C++ library or a Java library. Not all firmwares contain Java libraries for all APIs, but C++ is almost always available.
Both languages make it fairly easy to reverse engineer the prototypes of the functions, which is almost all of HIDL (C++ is missing the argument names).
So, I made a script to reverse engineer those HALs ( https://github.com/phhusson/treble_experimentations/blob/master/vendor-HAL/reverse-hal.sh ).
It is far from perfect (it can't generate a full-blown HIDL file), but it makes discovery much easier!
Here are a few examples of APIs it gives access to:
- The touch screen HAL gives us access to glove mode and cover mode
- The display HAL gives us access to functions like setColorTemperature or updateRgbGamma
- The infrared HAL gives us access to its learning capability
- The fingerprint sensor gives us access to sendCmdToHal
The first three examples show that things can be easy, but the last one is trickier: there is still a magic value to pass to the sendCmdToHal function.
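To give a feel for the discovery step, here is a rough sketch of pulling HIDL interface descriptors out of a vendor library by scanning it for the `package@version::IName` pattern (the library path and interface name below are assumptions modeled on the Huawei examples; the actual reverse-hal.sh does considerably more):

```shell
#!/bin/sh
# Scan a binary for strings shaped like "vendor.foo.bar@1.0::ISomething".
# -a: treat the .so as text; -o: print only the match; -E: extended regex.
# The pattern is a guess at the descriptor format; tighten as needed.
list_hidl_ifaces() {
  grep -aoE '[a-z][a-zA-Z0-9._]*@[0-9]+\.[0-9]+::I[A-Za-z0-9]+' "$1" | sort -u
}

# Typical use on an extracted firmware (path is an assumption):
# list_hidl_ifaces vendor/lib64/vendor.huawei.hardware.tp@1.0.so
```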
Anyway, that's still quite an improvement compared to before Treble.
Using vendor HALs
So, let's say we've discovered the [email protected]::hwTsSetGloveMode(bool) function, and we want to call it.
As I mentioned in the first section, if we have the HIDL, it's a two-liner.
But, we don't have it, so what do we do?
The Android build system usually generates HIDL client code for C++ and Java, so we just need to piggy-back on this code to be able to call the functions!
Using vendor HALs from Java
For Java, the idea is fairly simple: we copy the code from the original firmware (either from an app or from the framework) into our own environment.
It's a bit more complicated to realize; here is how I did it:
- grep -rF <name of the function> system # to find where the Java symbols live
This pointed me to system/system/framework/oat/arm64/hwServices.vdex
- Deodex it
- Retrieve all symbols inside the proper folder, in this case vendor/huawei/hardware/tp
- Create a mock of the Java class (here is an example)
- Create the code that calls it (here is an example)
- Build the mock and the caller (here is an example)
- Decompile the result into smali
- Replace the mock code with the actual classes from the original firmware
- Recompile the whole thing
- Run the resulting app/dex
Things to improve
The first problem here is that this code is annoying to automate and put into a build system.
Then, we can only partially reverse engineer the HIDL, which leads to two problems:
- We need to piggy-back on a previous firmware's APIs
- Calling C++ is much harder (because ABI behaviour when changing headers is tricky)
The cleanest solution would be to fully reverse engineer the HIDLs.
The current reverse engineering is so rough, though, that it doesn't even list all the available functions, so a lot of work is needed.
Conclusion
Project Treble's HALs are fun to play with; have a look!
[1] I'm lying a bit here. This is enough to enable event reporting, but some additional changes are needed to make sense of those events.
Thanks for your great work here, OP!
BTW, on the Honor 9 stock EMUI, HwCamera2 can't take a photo (checked in landscape and portrait mode with both cameras), but video recording works as expected!
Also, with your solution the home button works like this: when I press the home button, it opens search with an "=" typed into it.
surdu_petru said:
BTW, on the Honor 9 stock EMUI, HwCamera2 can't take a photo (checked in landscape and portrait mode with both cameras), but video recording works as expected!
This is fixed by https://github.com/phhusson/huawei_camera_aosp/commit/177459fdb76f0aa68fa4ffb869b633d9460a2fb0
Also, with your solution the home button works like this: when I press the home button, it opens search with an "=" typed into it.
Yeah, that's basically what my footnote says.
Huawei defines the meaning of the fingerprint evdev in /vendor/usr/keylayout/fingerprint.kl, but the KEYCODE_FINGERPRINT_* codes it uses don't exist in AOSP.
What I'm planning to do is a daemon listening exclusively to the fingerprint's /dev/input/eventX and launching commands based on the events.
This could have performance issues (input keyevent KEYCODE_HOME takes half a second), so I might switch to creating a uinput device.
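The core of such a daemon is just a mapping from fingerprint evdev events to commands. A sketch of that mapping (the KEY_* names and chosen actions are placeholders; the real ones come from fingerprint.kl and from watching `getevent` on the device). It echoes the commands instead of executing them, so the mapping is easy to test off-device:

```shell
#!/bin/sh
# Map a fingerprint evdev key name to the shell command the daemon would run.
# On-device, the input would come from `getevent` on the fingerprint
# /dev/input/eventX node; the KEY_* names below are assumptions.
fp_event_to_cmd() {
  case "$1" in
    KEY_UP)    echo "input keyevent KEYCODE_HOME" ;;
    KEY_LEFT)  echo "input keyevent KEYCODE_BACK" ;;
    KEY_RIGHT) echo "input keyevent KEYCODE_APP_SWITCH" ;;
    *)         return 1 ;;
  esac
}
```

Since `input keyevent` is slow, the production version would inject events through uinput instead of spawning a command per event.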
phhusson said:
This is fixed by https://github.com/phhusson/huawei_camera_aosp/commit/177459fdb76f0aa68fa4ffb869b633d9460a2fb0 ...
Thank you very much, and good luck
surdu_petru said:
Thank you very much, and good luck
Does your fingerprint sensor mechanically click, or is it just a capacitive sensor?
phhusson said:
Does your fingerprint sensor mechanically click, or is it just a capacitive sensor?
No, the only one with a click is on the Honor 8 ... I already saw "key 28 ENTER" in fingerprint.kl ... maybe we can define it as virtual or something like that, if I'm not wrong, but it certainly needs to be tested.
EDIT:
I guess it's only used on the Honor 8, as fingerprint.kl is almost the same on all devices.
surdu_petru said:
No, the only one with a click is on the Honor 8 ...
Well, my device (Mate 9), which doesn't mechanically click, does trigger a click event.
The way I'm doing it doesn't require changing the vendor partition. But yes, changing vendor/usr/keylayout/fingerprint.kl would be much easier.
Digging on my own
Good evening out there!
First of all: thank you very much for all of your effort so far.
I own an Honor 9 Lite and I'm currently investigating how Oreo and Treble work.
So I took your reverse-hal script and tried to improve it a little bit (see attachment).
Maybe it can make things easier?
Does it make sense to read out all the classes in the .so files?
I've uploaded the script (reverse-hal-grork.sh) and example output for vendor.huawei.hardware.biometrics.fingerprint.
Please have a look at it.
Greetings,
vsrookie
PS: If there are any ideas for improvement, just tell me. I thought about automatically creating Java classes? Or smali? Would that work?
vsrookie said:
So i took your reverse_hal and tried to improve it a little bit (see attachment). ...
Looks good!
Could you make a pull request to https://github.com/phhusson/treble_experimentations/ ?
The ideal target would be to generate the original .hal file, so that we can generate the C++ and Java code automatically with hidl-gen.
I don't know how big the gap is to be able to do that, though...
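For reference, the kind of .hal file such a tool would aim to regenerate looks roughly like this (the package, interface name, and method signature are assumptions pieced together from earlier in the thread, not the real Huawei file):

```
// Reconstructed sketch of a vendor touchscreen HIDL, not the real file.
package vendor.huawei.hardware.tp@1.0;

interface ITouchscreen {
    hwTsSetGloveMode(bool status) generates (int32_t ret);
};
```

From a file like this, hidl-gen can emit both the C++ and Java client libraries, which is exactly what removes the need for the piggy-backing steps above.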
Hi.
I don't think it will be earlier than the weekend, but I will do it.
I'll also continue my investigation over the weekend.
Greetings,
Vsrookie
This is interesting... Going to see if I can extract something useful from the Nabi SE, as I'm curious whether it can be Treble'd.
phhusson said:
Project Treble provides a great help in getting access to vendor-specific HALs, I'll try to explain how, and how to exploit it. ...
So my theory is:
Treble is a wrapper for closed-source drivers?
If so, knowing the calls to the drivers gives you what you need to make any driver work with Treble...