Hiya,
I've developed an auto-brightness app and it seems to be a popular prospect among Xperia users.
However, Xperia models don't have properly implemented light sensors, so the app naturally doesn't work correctly on them. Android doesn't recognize the sensor as such, but ROMs still ship the driver, since some form of auto-brightness does work on these phones.
The question(s):
Has anyone managed to gain control over this sensor?
Does some custom kernel implement it properly?
Related
Does the current CM7 build restore functionality to the Proximity and Orientation sensors to the point where they can be used with standard apps?
Does the NT have a proximity sensor? I did not think so.
Sent from my NookTablet using Tapatalk
Just going by statistics from a couple of apps, e.g. Android Assistant, which purports to give Name, Vendor, Range, Resolution and Power data for installed hardware as follows (list not complete):
Proximity Sensor - SFH7741 By OSRAM
Rotation Vector Sensor - Google Inc
Temperature and Pressure Sensor - BMP085 by Bosch
Magnetic Field Sensor - HMC5843 Magnetometer By Honeywell
Linear Acceleration Sensor - Google Inc.
Light Sensor - BH1780gli By ROHM
Gravity Sensor - Google Inc.
Accelerometer Sensor - kxtf9_accel by kxtf9
No Gyroscope Sensor or Orientation Sensor listed
My error on the Orientation Sensor: I should have said, and am most interested in, the Magnetic Field Sensor.
Other apps report similar hardware available.
Besides the holy grail of Bluetooth, I was wondering whether the CM7 versions have been able to natively find and activate them (if they in fact exist), or whether each had to be detected and programming specially written.
Another possibility is that other necessary hardware was never included to utilize all the sensors.
In other words...
Has anyone with the latest CM7 root tried and succeeded in running a compass program? I am happy with my simple Nook and Zergy root with BN access, but would build a new SD card if I had a working compass for some astronomy apps.
Thanks for any additional info.
I tried a compass on CM9 and got a magnetic sensor error.
Screebl Pro doesn't work on CM7 alpha final.
I think you need to wait for next update...
Odd thing that Screebl doesn't work. I would think it would use the same sensor for angle identification that other programs do. For instance, a nice little program called Skeye can show you the night sky; it responds when you change the vertical angle and/or rotate from portrait to landscape, but it does not recognize rotation to different compass points, apparently because there is no driver for the magnetic field sensor.
It appears from my list of sensors above that anything listed with a Google driver is functional; those without are not. But given that the manufacturer of each sensor is listed, there might be drivers available that can activate them.
Thanks to those that did some testing
I am doing a project analyzing sensors, and I am trying to run an experiment using the proximity sensor. I have managed to use apps from the Market to find out some things.
The sensor has a range of 100 cm, meaning it can detect things up to 100 cm away. But it only reports two values, 3 cm and 100 cm, with nothing in between.
Is there any way I can obtain the raw sensor data? I need the in-between values.
For example, the ambient light sensor has a range of about 27,000 lux, and I can get all the in-between values there, but not for the proximity sensor.
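The likely reason: many proximity HALs threshold the raw ADC count inside the driver and only ever hand the framework a binary near/far distance, so the in-between values never reach the Sensor API at all. A hedged C illustration of that driver-side clamping (the threshold and the two distances are illustrative, chosen to match the 3 cm / 100 cm behaviour described above):

```c
/* Many proximity HALs threshold a raw ADC count inside the driver and
 * report only two distances to the framework. The threshold and the
 * 3 cm / 100 cm values here are illustrative, not from a datasheet. */
double proximity_cm(int raw_adc, int near_threshold) {
    return (raw_adc > near_threshold) ? 3.0 : 100.0;
}
```

If that matches your device, getting continuous readings would mean patching the kernel driver (or HAL) to report the raw count, not anything an app can do from the Market.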
Hi
I am doing a project which involves improving the efficiency of barcode scanning using smartphone cameras. One of the aspects I am looking at is scanning in low light. Currently, in low-light conditions, applications such as the ZXing barcode scanner will turn on the camera's LED light, but without the ability to control its brightness it can cause glare. My goal is to control the brightness of the LED, perhaps using pulse-width modulation. How would I go about doing this? The only devices I have at my disposal are a Nexus 4 and a Samsung Galaxy S4, and I have been concentrating mainly on the Nexus 4, as I believe that is the platform where I have the best chance of achieving my goals. So far I am under the impression that I will need to modify the kernel of one of these devices to achieve what I want. Am I on the right track, or is there another way?
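Before modifying the kernel, it may be worth checking whether the existing LED class driver already accepts intermediate brightness values through sysfs. Linux exposes LEDs under /sys/class/leds/<node>/brightness, and on many devices the driver scales the LED current for you, which approximates PWM dimming with no kernel changes. A minimal C sketch (root required on the device; the exact node name for the Nexus 4 torch LED is device-specific and an assumption here):

```c
#include <stdio.h>

/* Write a brightness level to an LED sysfs node (requires root on a
 * real device). On the Nexus 4 the torch LED should be exposed
 * somewhere under /sys/class/leds/ -- the exact node name is
 * device-specific and an assumption here. */
int set_led_brightness(const char *sysfs_path, int level) {
    FILE *f = fopen(sysfs_path, "w");
    if (f == NULL)
        return -1;
    fprintf(f, "%d\n", level);
    fclose(f);
    return 0;
}
```

From a rooted shell the equivalent is `echo N > /sys/class/leds/<node>/brightness`. Only if the driver snaps intermediate values to full-on would you need the kernel-side PWM change you describe.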
Thanks for reading
How would you implement kernel-side auto-detection of different ALS/PS (ambient light / proximity) sensors? Say, for example, we have two sensor drivers, cm3607 and cm36283: how could we set them up in ProjectConfig.mk so the sensor works without building two different kernel images for the two different ALS/PS chips?
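One common pattern: rather than selecting a single driver at build time in ProjectConfig.mk, build both drivers into the one image and decide at probe time, by reading each candidate chip's identification register over I2C and binding only the driver that matches (if a part has no ID register, you can fall back to probing for an ACK at its I2C address). A hedged C sketch of the dispatch logic only; the ID values below are placeholders, not cm3607/cm36283 datasheet values:

```c
#include <stddef.h>
#include <string.h>

/* Placeholder chip IDs -- NOT datasheet values. In a real driver you
 * would read each candidate's ID/WHO_AM_I register over I2C at probe
 * time and bind only the matching driver, so one kernel image
 * supports both parts. */
#define ID_CM3607  0x07
#define ID_CM36283 0x83

const char *pick_alsps_driver(int chip_id) {
    if (chip_id == ID_CM3607)  return "cm3607";
    if (chip_id == ID_CM36283) return "cm36283";
    return NULL; /* no match: fail this probe, try the next candidate */
}
```

With this approach ProjectConfig.mk only has to enable both driver configs; the selection happens at runtime in probe(), so one kernel image covers both hardware variants.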
Hello all,
Any chance someone would be interested in helping to implement fall detection on regular Wear OS watches? Something similar to what the Galaxy and Apple watches have. Am I missing an obvious reason this hasn't been implemented?
Thanks
I have looked into this, and there are some difficulties to it. Most algorithms for fall detection use continuous accelerometer and/or gyroscope readings. There is a significant battery hit for making these recordings continuously.
As far as I know, the only way to make this viable is to use sensor batching with wakeup accelerometer and gyroscope sensors. That way, the device will be able to sleep for a significant portion of the time and be woken up when there are new measurements. Some devices (I'm looking at you ticwatch pro 3) don't even have wakeup accelerometer or gyroscope. The only way I see to get around this is using the significant motion detector which is always a wake-up sensor. Maybe this sensor will be triggered to wakeup the device when a fall occurs and sensors can be recorded at that time.
I think this is how I would go about it, but maybe I'm missing something
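The batching approach above can be sketched concretely: the watch sleeps, the batched wake-up accelerometer delivers a chunk of samples, and a cheap scan over the magnitudes looks for a near-free-fall dip followed shortly by an impact spike. A minimal illustration in C (the thresholds and window length are assumptions for illustration, not a validated algorithm):

```c
/* Minimal fall heuristic over a batch of accelerometer magnitudes
 * (m/s^2, one sample per entry): look for a near-free-fall dip well
 * below 9.81, followed within a short window by an impact spike.
 * Thresholds and window length are illustrative, not tuned values. */
int detect_fall(const double *mag, int n) {
    int freefall_at = -1;
    for (int i = 0; i < n; i++) {
        if (mag[i] < 3.0) {
            freefall_at = i;                   /* free-fall dip */
        } else if (freefall_at >= 0 && i - freefall_at <= 25
                   && mag[i] > 25.0) {
            return 1;                          /* impact after the dip */
        }
    }
    return 0;
}
```

Running a scan like this once per delivered batch is cheap; the battery cost lives almost entirely in keeping the accelerometer sampling, which is why the wake-up/batching support matters so much.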
Sounds like a good plan for starters... cheers
permanentusername22 said:
I have looked into this, and there are some difficulties to it. Most algorithms for fall detection use continuous accelerometer and/or gyroscope readings. There is a significant battery hit for making these recordings continuously.
As far as I know, the only way to make this viable is to use sensor batching with wakeup accelerometer and gyroscope sensors. That way, the device will be able to sleep for a significant portion of the time and be woken up when there are new measurements. Some devices (I'm looking at you ticwatch pro 3) don't even have wakeup accelerometer or gyroscope. The only way I see to get around this is using the significant motion detector which is always a wake-up sensor. Maybe this sensor will be triggered to wakeup the device when a fall occurs and sensors can be recorded at that time.
I think this is how I would go about it, but maybe I'm missing something
I don't have a great understanding of how these all integrate into the system, but I would think that the same sensor used for the wake feature, which I assume is an accelerometer, could also be used for this. Like you said, using the full suite would be prohibitively expensive from an energy standpoint, but I assume the wake accelerometer measures actual acceleration rather than just flagging that the wake motion occurred, so it could also be used for fall detection. If so, there would be no need for the rest of the sensor suite after that, correct? As a side note, it would be pretty cool to have the watch read vitals and report them with the fall notification via a continuous stream of text messages at set intervals.