How to learn about low-level ROM configurations / tweaks - Android Q&A, Help & Troubleshooting

Hi all,
I'm trying to find a resource that explains what files and systems control various mid- to low-level aspects of an Android ROM.
For example, I'm looking for a resource that explains:
How to change the System UI volume change sound
How to calibrate the accelerometer threshold that controls screen rotation
How to control system app priority (keep something running and awake)
The volume one in particular may seem a little trivial, but I'm researching a project that aims to tighten some aspects of the Android UX, and these elements are critical.
Any direction would be extremely helpful!
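For the third item, one building block worth knowing about at the app level (not strictly a ROM tweak) is a partial wake lock from PowerManager, which keeps the CPU from sleeping while your component does its work. A minimal sketch, assuming you have a Context available; the class and tag names are placeholders, and the app needs the WAKE_LOCK permission declared in its manifest:

import android.content.Context;
import android.os.PowerManager;

public class KeepAwakeHelper {
    private PowerManager.WakeLock wakeLock;

    // Acquire a partial wake lock so the CPU stays on while we work.
    // Requires <uses-permission android:name="android.permission.WAKE_LOCK" /> in the manifest.
    public void keepCpuAwake(Context context) {
        PowerManager pm = (PowerManager) context.getSystemService(Context.POWER_SERVICE);
        wakeLock = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "MyApp:KeepAwake");
        wakeLock.acquire();
    }

    // Always release the lock when finished, or the battery will drain.
    public void release() {
        if (wakeLock != null && wakeLock.isHeld()) {
            wakeLock.release();
        }
    }
}

Keeping the process itself from being killed is a separate concern, typically handled by running the work in a foreground service so the system treats it as high priority.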

Related

Android Hardware Acceleration explained - by a Google engineer

Link to Source
......
• Android has always used some hardware accelerated drawing. Since before 1.0 all window compositing to the display has been done with hardware.
• This means that many of the animations you see have always been hardware accelerated: menus being shown, sliding the notification shade, transitions between activities, pop-ups and dialogs showing and hiding, etc.
• Android did historically use software to render the contents of each window. For example in a UI like http://www.simplemobilereview.com/wp-content/uploads/2010/12/2-home-menu.png there are four windows: the status bar, the wallpaper, the launcher on top of the wallpaper, and the menu. If one of the windows updates its contents, such as highlighting a menu item, then (prior to 3.0) software is used to draw the new contents of that window; however none of the other windows are redrawn at all, and the re-composition of the windows is done in hardware. Likewise, any movement of the windows such as the menu going up and down is all hardware rendering.
• "Full" hardware accelerated drawing within a window was added in Android 3.0. The implementation in Android 4.0 is not any more full than in 3.0. Starting with 3.0, if you set the flag in your app saying that hardware accelerated drawing is allowed, then all drawing to the application’s windows will be done with the GPU. The main change in this regard in Android 4.0 is that now apps that are explicitly targeting 4.0 or higher will have acceleration enabled by default rather than having to put android:hardwareAccelerated="true" in their manifest. (And the reason this isn’t just turned on for all existing applications is that some types of drawing operations can’t be supported well in hardware and it also impacts the behavior when an application asks to have a part of its UI updated. Forcing hardware accelerated drawing upon existing apps will break a significant number of them, from subtly to significantly.) .....
The link above covers more bullet points on the issue; if you're interested, click through and read.
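To make the per-window flag above concrete, here is a minimal Java sketch (the Activity and View names are placeholders, not anything from the post): it requests GPU drawing for a single window, which is roughly the per-window equivalent of setting android:hardwareAccelerated="true" on the application, and logs at draw time whether the canvas it receives really is hardware accelerated.

import android.app.Activity;
import android.content.Context;
import android.graphics.Canvas;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.view.WindowManager;

public class AcceleratedActivity extends Activity {

    // Simple view that logs whether its canvas is hardware accelerated.
    static class ProbeView extends View {
        ProbeView(Context context) { super(context); }

        @Override
        protected void onDraw(Canvas canvas) {
            super.onDraw(canvas);
            Log.d("HwAccel", "hardware canvas: " + canvas.isHardwareAccelerated());
        }
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Ask for GPU drawing for this window (API 11+); roughly what
        // android:hardwareAccelerated="true" does for the whole app.
        getWindow().setFlags(
                WindowManager.LayoutParams.FLAG_HARDWARE_ACCELERATED,
                WindowManager.LayoutParams.FLAG_HARDWARE_ACCELERATED);
        setContentView(new ProbeView(this));
    }
}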
Duplicate thread:
http://forum.xda-developers.com/showthread.php?t=1377519

Some tips on graphics for a coder coming from ActionScript

Hello, I come to ask only for a few tips or merely a bit of guidance on my development for Android.
I'm pretty skilled in ActionScript and new to Java. In this thread I'm asking how I would do some things in Java, providing the equivalent in AS.
I intend to draw shapes on screen and use them as UI - as a deeper layer within the Android XML UI.
In ActionScript, I can create Sprites or DisplayObjects and edit their "graphics" property like this:
// create a Sprite (a DisplayObject that has a graphics property)
var _someDisplayObject:Sprite = new Sprite();
// create a white square inside it
_someDisplayObject.graphics.beginFill(0xFFFFFF);
_someDisplayObject.graphics.drawRect(0, 0, 100, 100);
_someDisplayObject.graphics.endFill();
// place it dynamically in a position relative to the Stage (canvas)
_someDisplayObject.x = stage.stageWidth / 2;
------------------
Also, are there tweening libraries like TweenLite or Tweener? Ones that let you create an animation with one line of code, like:
// move the display object to x = 300 with half transparency, in half a second
Tweener.addTween(_someDisplayObject, { x:300, alpha:0.5, time:0.5, transition:"easeOutSine" });
------------------
Can anyone give me a clue about what I should be looking at to do the same in Java for Android?
Even just a pointer on where to start would be greatly appreciated. :fingers-crossed:
Thank you !
You can create shapes in Android using XML, in your Java code, or with both. You can also create animations, including tween animations (I haven't used animations before, though, so I can't help any further with those).
The first place to go is the Android Developers website (d.android.com/develop), then the User Interface and App Resources sections of the API Guides in the left-hand menu. There are samples included in the Android SDK under API Demos which you can use to get you started.
I can't post links because I haven't made 10 posts yet, sorry.
As you already know, the Android display model is more complex than the one used in ActionScript.
There is no direct equivalent for the graphics property from AS3 in Android.
To draw low-level graphics you can use the Canvas parameter passed to the onDraw method (which you override in your own class inheriting from android.view.View).
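As a rough Java counterpart to the ActionScript snippet above (only a sketch; the class and field names are mine), a custom View that fills a 100x100 white square in onDraw could look like this:

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.view.View;

public class SquareView extends View {
    private final Paint paint = new Paint();

    public SquareView(Context context) {
        super(context);
        paint.setColor(Color.WHITE);        // beginFill(0xFFFFFF)
        paint.setStyle(Paint.Style.FILL);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        // drawRect(0, 0, 100, 100), positioned relative to the view's width,
        // roughly what stage.stageWidth / 2 does on the AS3 Stage.
        float left = getWidth() / 2f;
        canvas.drawRect(left, 0, left + 100, 100, paint);
    }
}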
As for the tweening, like Kavrocks said, try to learn how to use the Animation class and its extensions, because they are built into the Android framework and therefore optimized (that's just my opinion). All the tweening engines in AS3 are based on dynamic property calls (by name); Java uses the reflection API for that, which is much slower than a direct call (from a specialized Animation subclass, in this case).
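For instance, a framework-only tween that roughly matches the Tweener call earlier in the thread (slide 300 px to the right while fading to half alpha over half a second) could be sketched with the classic android.view.animation classes like this; the delta and interpolator are just illustrative:

import android.view.View;
import android.view.animation.AlphaAnimation;
import android.view.animation.Animation;
import android.view.animation.AnimationSet;
import android.view.animation.DecelerateInterpolator;
import android.view.animation.TranslateAnimation;

public final class TweenHelper {
    private TweenHelper() {}

    // Slide a view 300 px to the right and fade it to 50% alpha in 500 ms.
    public static void slideAndFade(View view) {
        Animation move = new TranslateAnimation(0, 300, 0, 0);
        Animation fade = new AlphaAnimation(1.0f, 0.5f);

        AnimationSet set = new AnimationSet(true); // share one interpolator
        set.addAnimation(move);
        set.addAnimation(fade);
        set.setDuration(500);
        set.setInterpolator(new DecelerateInterpolator()); // roughly an ease-out
        set.setFillAfter(true); // keep the end state on screen

        view.startAnimation(set);
    }
}

Keep in mind that these classic view animations only transform how the view is drawn, not its actual position; the property animators added in Android 3.0 (ObjectAnimator and friends) change the real properties and are closer in spirit to Tweener.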
I am also trying to figure out the different ways of applying animations and effects coming from ActionScript. There's a tutorial on Lynda with some tips on migrating from AS to Java and to HTML5; both of these can help you better understand how to apply them in your application.

Recommendations for CPU Control Apps

Using SetCPU at the moment; great program. I like that you can set profiles for specific situations, along with the ability to control voltage, frequency, etc.
I'm looking to branch out. What do you use and like that has these features? I've tried System Tuner Control, NS Tools, and another program whose name escapes me at the moment, but, as nice as they were, they seemed on the heavy side with respect to battery consumption and/or RAM usage. My thinking is that separate apps for each function would also hog space, etc.
IMO Settings > Performance > CPU is enough.

[Q] Modifying in-call screen via AOSP

I've read elsewhere that "developers can't customize the in-call screen because of security concerns."
So, I am trying to understand the functionality of the in-call screen at the source level.
Can any Android devs, especially those that have created their own custom ROMs, tell me which specific classes in the AOSP are responsible for both display and functionality of this screen?

Daydream Elements: Free VR Development Guidebook App

Daydream Elements by Google is a new, free app that serves as a guidebook covering VR development basics. While those familiar with VR development will probably find it too basic, it's a great starting point for those unfamiliar with VR. The app showcases six examples of tips and tricks for VR development, complete with the pros and cons of their use.
According to Upload VR, “three of these [examples] are concerned with locomotion. One details teleportation, another showcases smooth movement with restricted peripheral vision, and another shows third-person gameplay. Interestingly examples of all three of these types of experiences have hit Daydream in the past few months. Teleportation can be seen in the VR port of Layers of Fear, while the excellent Eclipse uses smooth movement. Meanwhile both Lola and the Giant and Along Together both used a third-person camera that followed a main character.”
Google's developer page outlines the following examples:
Locomotion: techniques for navigating a VR environment.
Three ways to achieve locomotion:
Teleportation is a locomotion technique for apps using a first-person perspective that lets the user move near-instantaneously to a target location. This technique reduces the simulator sickness that many users feel when the virtual camera moves.
Tunneling is a technique used with first-person locomotion (such as walking) where, during movement, the camera is cropped and a high-contrast stable grid is displayed in the user's peripheral vision. This is analogous to a user watching first-person locomotion on a television set.
Chase Camera is a technique used with third-person locomotion, where the user is controlling a character. Standard third-person camera implementations are problematic in VR and contribute to simulator sickness. Chase Camera offers predictable motion: camera rotation only occurs under user direction, and small character movements don't move the camera at all.
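As a rough illustration of the Chase Camera behaviour described above (this is not code from Elements, which is built with Unity; the class, constants, and update scheme are made up for the sketch), the core idea reduces to a dead zone plus rotation that only the user can trigger:

// Hypothetical sketch of the Chase Camera dead-zone idea; all names are made up.
public class ChaseCamera {
    private float camX, camZ;          // camera position on the ground plane
    private float yawDegrees;          // rotation, changed only by the user
    private static final float DEAD_ZONE = 0.5f;    // metres the character may move freely
    private static final float FOLLOW_SPEED = 2.0f; // how quickly the camera catches up

    // Called from user input only: the camera never rotates on its own.
    public void rotateBy(float degrees) {
        yawDegrees += degrees;
    }

    // Called every frame with the character position and frame time in seconds.
    public void update(float targetX, float targetZ, float dt) {
        float dx = targetX - camX;
        float dz = targetZ - camZ;
        float dist = (float) Math.sqrt(dx * dx + dz * dz);

        // Small character movements inside the dead zone do not move the camera.
        if (dist > DEAD_ZONE) {
            float overshoot = dist - DEAD_ZONE;
            float step = Math.min(1f, FOLLOW_SPEED * dt);
            camX += (dx / dist) * overshoot * step;
            camZ += (dz / dist) * overshoot * step;
        }
    }

    public float getX() { return camX; }
    public float getZ() { return camZ; }
    public float getYaw() { return yawDegrees; }
}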
Menus and Virtual Controls: The Daydream controller only exposes two buttons to developers: the clickable touchpad and the app button. For many developers, two discrete controls do not provide a rich enough set of commands for the games and applications they would like to create. One solution is to present the user with virtual controls for the app's command scheme.
Click Menu provides the user with a radial menu of commands emanating from the cursor when the menu is invoked. Because users must click directly on options, this menu design trades the speed of a more gestural approach for the control of discrete clicks, and it scales well to complex command hierarchies.
Swipe Menu leverages the Daydream controller touchpad to let the user quickly select from a small set of commands. This menu trades accuracy for speed and does not scale well to a large number of commands.
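To make the radial layout concrete, mapping the cursor's offset from the menu centre onto one of N wedges is only a few lines of math. A hypothetical sketch, not code from Elements:

// Hypothetical helper: map a 2D direction from the menu centre to a sector index.
public final class RadialMenuMath {
    private RadialMenuMath() {}

    // dx, dy: cursor offset from the menu centre; itemCount: number of commands.
    // Returns the index of the wedge the cursor points at, in the range 0..itemCount-1.
    public static int sectorIndex(float dx, float dy, int itemCount) {
        double angle = Math.atan2(dy, dx);                          // -PI..PI, 0 = pointing right
        double normalized = (angle + 2 * Math.PI) % (2 * Math.PI);  // 0..2PI
        double sectorSize = 2 * Math.PI / itemCount;
        // Offset by half a sector so the first item is centred on "right".
        return (int) (((normalized + sectorSize / 2) % (2 * Math.PI)) / sectorSize);
    }
}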
Rendering and Lighting: Performance is critical to VR apps but can be especially challenging on mobile GPUs. Many commonly available mobile shaders and per-pixel lighting solutions provide high quality results but perform poorly on mobile VR systems due to extremely high resolutions, rendering multiple views, distortion and general mobile performance issues.
The Rendering & Lighting demo uses Daydream Renderer to showcase rendering effects that are typically difficult to achieve on mobile hardware. This scene demonstrates Daydream Renderer features like per-pixel lighting, tangent-space normal maps, dynamic shadows, realtime specular highlights, and reflections.
The Rendering and Lighting demo is included as part of Elements as a demonstration of the Daydream Renderer's capabilities.
The app also spells out all known issues, which you can find here.
This app is definitely aimed at newcomers to VR; however, since many people are not yet familiar with the space, it seems like a user-friendly platform that encourages people to try their hand at VR development.
Source: appdevelopermagazine
