RAW photo files on Android: why not? - Android Q&A, Help & Troubleshooting

Hello. Re the Samsung Galaxy S5 Neo in my case.
I have read that it is not possible to make the camera shoot RAW photos because the Camera2 API isn't implemented on the device.
Is that something that is never going to change, or is it likely that in the near future this will be overcome and it will be possible to use an app to save photos as .dng or another RAW format, please? (I know PNG is available in some apps.)
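For anyone wanting to see what their own device reports, here is a minimal sketch (class and method names are just illustrative) that logs the Camera2 hardware level of each camera; a LEGACY level means the old camera HAL is merely being wrapped, and the RAW/DNG capability will not be offered:
Code:
import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.util.Log;

public class Camera2LevelCheck {
    private static final String TAG = "Camera2LevelCheck";

    // Logs the Camera2 hardware level reported for every camera on the device.
    // A LEGACY level means the old camera HAL is only wrapped, so the RAW
    // capability (and DNG capture) will not be offered to apps.
    public static void logHardwareLevels(Context context) {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        try {
            for (String id : manager.getCameraIdList()) {
                Integer level = manager.getCameraCharacteristics(id)
                        .get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
                if (level != null
                        && level == CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY) {
                    Log.i(TAG, "Camera " + id + " is LEGACY: no RAW/DNG capture");
                } else {
                    Log.i(TAG, "Camera " + id + " hardware level = " + level);
                }
            }
        } catch (CameraAccessException e) {
            Log.e(TAG, "Could not query cameras", e);
        }
    }
}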

Related

Camera files get lost

After a format using Pocket Mechanic, my device lost the photo and video camera files.
Does anyone know how I can get them back?

[Q] Capture image, save as RAW. Is it possible?

This was already asked here: http://forum.xda-developers.com/showthread.php?t=746878. Leaving aside that one will need software to read the file, is it possible?
I do have a DSLR, but it would be nice if it were possible from the X10.
I am also aware of the file size implications.
I did come across the following: [https://groups.google.com/group/android-developers/browse_thread/thread/521b12fbc6471aff ]
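For reference, on a device whose Camera2 implementation does advertise the RAW capability, a captured frame can be saved as a DNG through DngCreator. The sketch below (helper name and parameters are illustrative) assumes you already have a RAW_SENSOR Image from an ImageReader plus the capture result delivered for that frame:
Code:
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.DngCreator;
import android.media.Image;
import java.io.FileOutputStream;
import java.io.IOException;

public class RawToDng {
    // Writes a RAW_SENSOR Image (from an ImageReader created with
    // ImageFormat.RAW_SENSOR) to a .dng file, embedding the sensor metadata
    // so that desktop raw converters can interpret it.
    public static void saveAsDng(CameraCharacteristics characteristics,
                                 CaptureResult result,
                                 Image rawImage,
                                 String outputPath) throws IOException {
        try (FileOutputStream out = new FileOutputStream(outputPath);
             DngCreator dng = new DngCreator(characteristics, result)) {
            dng.writeImage(out, rawImage);
        } finally {
            rawImage.close();
        }
    }
}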

Create global content filter for files

Hey guys!
I'm new here and I hope you can help me with some questions. I didn't know if this is the right place, so please correct me if I'm doing something wrong.
My "setup":
- Samsung S3 international with CM13
I would like to create a global "content filter" which filters files (pictures, videos, documents, etc.) on my smartphone. What I mean is something like what the UserManager from the Android framework does (sorry, I can't post the link to the Android API):
User A (owner) creates pictures with the camera, downloads files, etc., and if User B (guest) logs in, User B can't access User A's files, and vice versa.
The different part is that I would like to "hide" the files, or rather make them inaccessible, based on custom criteria; for example, "only pictures from the last two hours will be shown to User B". And the main point is that every application gets the same filtered content.
I thought I would have to edit and extend the internal/external storage functions (or a ContentProvider?) with my custom filter. Later there would be a system app for controlling the behavior of this filter.
Now there are some questions I keep asking myself:
- Is it possible to implement this functionality in Android-specific code rather than device-specific code? That is, can it be made portable to other devices and Android versions (custom ROM, manufacturer ROM, or pure stock Android from Google's Nexus phones) without "much" effort?
- Does it make more sense to use stock Android instead of CyanogenMod to achieve that portability?
- Where do I have to start? I downloaded the source code of CM13 for the Samsung S3. But as I asked before: is there a generic way for all devices and Android versions? I started to look into the framework-specific code of Android (system/frameworks/base/...). I thought I could build the filter between the API calls (Java -> JNI bridge -> C/C++), but that would not be the right place, would it?
I hope someone understands my plan and can help me with some information and tips, or point me to where I can find them!
Thanks!!
Fabian
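As an illustration of the criterion itself: a system-wide, every-app filter would have to be enforced inside MediaProvider or the storage layer, which is exactly the device- and version-specific part. At the ordinary ContentResolver level, though, the "last two hours" rule looks roughly like this sketch (class and method names are made up, it needs the usual read-storage permission, and it only affects queries made by your own app):
Code:
import android.content.ContentResolver;
import android.database.Cursor;
import android.provider.MediaStore;

public class RecentPicturesFilter {
    // Returns a cursor over images added within the last two hours.
    // MediaStore stores DATE_ADDED in seconds since the epoch.
    public static Cursor queryLastTwoHours(ContentResolver resolver) {
        long cutoffSeconds = (System.currentTimeMillis() / 1000L) - 2 * 60 * 60;
        String selection = MediaStore.Images.Media.DATE_ADDED + " >= ?";
        String[] selectionArgs = { String.valueOf(cutoffSeconds) };
        return resolver.query(
                MediaStore.Images.Media.EXTERNAL_CONTENT_URI,
                new String[] { MediaStore.Images.Media._ID,
                               MediaStore.Images.Media.DATE_ADDED },
                selection,
                selectionArgs,
                MediaStore.Images.Media.DATE_ADDED + " DESC");
    }
}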
I think you might mean the profile system that Android used to have on ICS?
Unfortunately I can't answer why they removed that feature. Doesn't the phone have a multi-user option or something if you have two or more Google accounts registered on the device?
Beamed in by telepathy.

[HELP] The ultimate photography workflow killer: SNAPSEED [build.prop?] :crying:

Hi there,
I am a photographer and have almost completed my eight-year journey of developing [the ultimate digital photography workflow].
If it were not for this one problem, it would be finished, and I need your help with it.
Everything works, but... : PLEASE HELP.
----------
TL;DR:
Snapseed can't open RAW files. Please let this just be a simple build.prop edit that tricks Snapseed into thinking the emulator is a compatible device.
-----------
Full Story:
Part of the workflow entails editing the RAW files in Android running emulated on a Windows machine.
Every time I try to open a .ARW file with Snapseed running on various Android emulators*, it gives me the following message, which I do not get on my phone (OnePlus 3T):
Snapseed, The ultimate workflow killer
The research so far got me to edit the build.prop with the line 'persist.camera.HAL3.enabled=1',
something that is supposed to make the Camera2 API work.
Link: https://forum.xda...579
This was based on an assumption I drew from the quote below, from a Google help forum, in which a Snapseed developer hints that the error has something to do with the phone's RAW support in general:
Quote:
RAW capture on Android has similar requirements to Snapseed with RAW. That's why I say that if it can capture RAW, it probably supports RAW in Snapseed. Some devices that can capture RAW need a third party camera app to use it.
Bonus tip: Camera apps with RAW will disable it if the phone/tablet doesn't support it.
-Zach
link:https://productforums.google.co...orum/snapseed
-------
Possible solutions?
- This gives me at least some hope that there is something else (build.prop-wise) we can do to make Snapseed open and edit RAW (.ARW) files.
- Maybe someone with the know-how could look at the actual code of the Snapseed APK to figure out what we could change? I am not technical beyond rooting and flashing a ROM once in a while.
You would be of serious help in finding out how to make this work.
For what it's worth, you would be helping me create art; my Instagram page:
www.instagram.com/thi_js
Greetings from Amsterdam,
Thijs.
PS.
Of course I will share the final workflow if it ever sees the light of day in its complete form. (It only works as a whole.)
(*I know there are options like Phoenix OS and Remix OS that do not have the problem described here. They do, however, give rise to the big problem of not being able to calibrate the display with a remote calibration tool (Idisplay). If run in an emulator, the Windows host in the background still provides the true, calibrated colors. Emulators I tried: MEmu, Andy and Nox all show the same error.)
Possible other angle:
Quote:
For the OP problem I was having the same error and it was because my phone doesn't have the RAW capability, meaning the CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_RAW (https://developer.android.com/refere...PABILITIES_RAW), so the mCharacteristics is never set.
You can remove all the RAW references and implementations and it should work.
from:
https://github.com/googlesamples/and...a2Raw/issues/2
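Presumably the sample only keeps a camera whose capability list contains RAW, so on a device or emulator without that capability nothing is ever selected. A sketch of that kind of guard (names are illustrative; this describes the check itself, not a Snapseed fix):
Code:
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;

public class RawCameraPicker {
    // Returns the id of the first camera that lists the RAW capability,
    // or null if no camera (including an emulated one) offers it.
    public static String findRawCapableCamera(CameraManager manager)
            throws CameraAccessException {
        for (String id : manager.getCameraIdList()) {
            CameraCharacteristics chars = manager.getCameraCharacteristics(id);
            int[] caps = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
            if (caps == null) continue;
            for (int cap : caps) {
                if (cap == CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_RAW) {
                    return id;
                }
            }
        }
        return null; // no RAW-capable camera: fall back to JPEG or bail out cleanly
    }
}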
Someone?
Bump.

Using a .so file in an Android app? How can I access the hardware depth sensor on my (rooted) phone?

I'm building an application that requires the use of the depth sensor on my Samsung Galaxy A80. However, it seems to be impossible to access it through Camera2 or ARCore. I asked Samsung directly, and the tech support guy's best guess was that Samsung has locked it from being used by third parties.
I rooted my phone and started digging through the file system, and eventually found a file called 'com.samsung.sensor.imx316.so' located in /vendor/lib/camera (the IMX316 is the depth sensor). There are also some similar files that end in '.bin', but .so files seem to be runnable code, if I understood the Google results correctly.
That file has the same name as the sensor I can't seem to access. Can this file be used somehow? Can I run it in my own app to get access to the depth data? And if not, there should be a way of getting that data, right? I mean, it obviously exists somewhere in the phone, since the pre-installed apps are using it, and a rooted phone has access to everything?
Did you check REQUEST_AVAILABLE_CAPABILITIES_DEPTH_OUTPUT? How do you know it's impossible?
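For reference, a check along those lines might look like the sketch below (class and method names are illustrative). The catch is that if Samsung does not list the ToF camera through the public getCameraIdList for third-party apps, this will simply return null, which would match what their support told you:
Code:
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.util.Size;

public class DepthCameraCheck {
    // Returns the id of a camera that advertises DEPTH_OUTPUT and exposes at
    // least one DEPTH16 output size, or null if the vendor does not expose the
    // ToF sensor through the public Camera2 API.
    public static String findDepthCamera(CameraManager manager)
            throws CameraAccessException {
        for (String id : manager.getCameraIdList()) {
            CameraCharacteristics chars = manager.getCameraCharacteristics(id);
            int[] caps = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
            if (caps == null) continue;
            boolean hasDepth = false;
            for (int cap : caps) {
                if (cap == CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_DEPTH_OUTPUT) {
                    hasDepth = true;
                }
            }
            if (!hasDepth) continue;
            StreamConfigurationMap map =
                    chars.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            if (map == null) continue;
            Size[] depthSizes = map.getOutputSizes(ImageFormat.DEPTH16);
            if (depthSizes != null && depthSizes.length > 0) {
                return id; // open this camera and attach an ImageReader using ImageFormat.DEPTH16
            }
        }
        return null;
    }
}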
Most likely you need to reverse engineer the Camera app from your phone.
Your app can call into com.samsung.sensor.imx316.so; it's really "just" a Linux ELF library.
The problem you face is the exported routines from the library: you won't really know
1) the parameters to the functions inside the library, or
2) any specific order in which to call the functions, i.e. an init function first, releasing memory last, and so on.
You need to disassemble / reverse engineer the library to make some sense of it (see Ghidra, radare2, IDA Pro, etc.).
Use strace on the process/app which currently uses the library to make some sense of the order of calls into it.
The depth data will be coming from a kernel-level driver. You can likely obtain the Samsung kernel source, and the driver source should be there; then it's up to you whether you can write user-space code to read what the driver exposes. The kernel driver source will have a uapi header file to investigate.
I don't know if it helps, but for a Huawei P30 Pro I used this: https://github.com/Nufflee/tof-camera
