If this seems familiar, it’s because this was posted in the pictures thread, but got completely buried. In hindsight, this was out-of-place there.
Macro issue:
I set my phone at a fixed distance from an object – just close enough so that it couldn’t completely focus in “Auto” mode – then switched to “Close up” mode. “Close up” mode could not make it any more focused, nor did anything else in the background change. Is this supposed to be the case? The minimum focus distance seems to be about 2.2 inches, FYI.
Tap-to-focus issue:
I’m finding that I need to place the close-up object in the center of my viewfinder. This camera does not seem to want to focus on anything that is off-center if it is too close to the camera, even if you use tap-to-focus. The only time that tap-to-focus works is if you place the close-up object in the center of the viewfinder, then tap on a distant object. Below are my examples:
I placed the water bottle in the center of the viewfinder and tapped on the label to focus on it-
Then, keeping the water bottle in the center of the viewfinder, I tapped on the Coke bottle in the background which brought that into focus-
Finally, moving the water bottle off-center, I again tried tapping on its label to focus on it, but with no luck. The only way to focus on the bottle with it in the foreground was to bring it to the center of the viewfinder-
The only workaround that I have found is to focus on the close-up object while it is in the center of the viewfinder, then use the shutter button to lock the focus while I move it off-center. This is not the case on my friend’s iPhone 4S, nor do I think this was a problem on my OG EVO.
I actually found that tapping in the lower-left of the screen allowed me to focus on off-center objects (as long as they were on the left side of the frame). Anyone else have these issues?
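For context on why the tap position matters: tap-to-focus implementations typically map the tap point to a small autofocus metering region around it, and a bug in that mapping (or in how the AF routine weights the region) would explain off-center taps failing. Below is a minimal, generic sketch of such a mapping; the function name and the 15% region size are illustrative assumptions, not HTC's actual code.

```python
def tap_to_af_region(tap_x, tap_y, view_w, view_h, region_frac=0.15):
    """Map a tap in viewfinder pixels to a normalized AF metering region.

    Returns (left, top, right, bottom) in [0, 1] frame coordinates,
    clamped so the region never extends past the frame edge.
    """
    half = region_frac / 2.0
    cx, cy = tap_x / view_w, tap_y / view_h
    left = min(max(cx - half, 0.0), 1.0 - region_frac)
    top = min(max(cy - half, 0.0), 1.0 - region_frac)
    return (left, top, left + region_frac, top + region_frac)

# A centered tap on a 1080x1920 viewfinder yields a centered region:
print(tap_to_af_region(540, 960, 1080, 1920))
```

If the camera firmware ignores the supplied region at close focus distances and falls back to the center, you would see exactly the behavior described above.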
Glad I found this thread. I hadn't fooled with the camera on this phone to use macros until today when trying to shoot flowers. I was shooting flowers in the sun and the camera was useless for macro shots. If the background is bright, it always focuses on the background and not what you are tapping on. In fact, even when the flower was dead center, it focused on the background. I can tell it is a software/firmware issue because in lower light, it works OK and it works fine if the entire subject is up close, like a concrete wall. Going to search for a solution... Surprised more people haven't noticed this.
Mike
mikeyxda said:
Glad I found this thread. I hadn't fooled with the camera on this phone to use macros until today when trying to shoot flowers. I was shooting flowers in the sun and the camera was useless for macro shots. If the background is bright, it always focuses on the background and not what you are tapping on. In fact, even when the flower was dead center, it focused on the background. I can tell it is a software/firmware issue because in lower light, it works OK and it works fine if the entire subject is up close, like a concrete wall. Going to search for a solution... Surprised more people haven't noticed this.
Mike
Well, you have to be careful around here. There's a lot of fanboyism when it comes to devices, so as soon as you say anything negative about a phone in its forum, people will be quick to blame you and call you a noob, lol.
With that being said, this phone has an excellent camera, but the camera software itself is very very subpar. I'm hoping an OTA launches sometime in the future addressing issues like this one.
For those of you having issues with your phones.
Try Macro (Close-Up) + Zooming in on the target.*
It should come out ok.
*For a temp work-around
Noiro said:
For those of you having issues with your phones.
Try Macro (Close-Up) + Zooming in on the target.*
It should come out ok.
*For a temp work-around
I actually just found the same thing. If you touch the + magnifying glass to zoom in all the way, half press to focus on that, and then zoom out with the half press held, you can sometimes get a good focus. Seems like the focus just isn't working when in macro situations and zooming in like that gets rid of "clutter" around the subject that seems to confuse the focus algorithm.
Mike
I returned my first LTEvo because of issues it was having with autofocus. Half the time it would autofocus the other time it would sit there blurry and not focus or refocus on anything at all. Not sure if it was a hardware or software issue but it's no longer a problem on the new one I received.
Related
I found this over at therootofallevo.com (source link is here); sorry if it has been mentioned here before.
By Fernando Gonzalez
Obviously one of the biggest aspects of the Evo 3D is its 3D capabilities. Some love it, some hate it, some just don’t care, and some enjoy it but just don’t use it often. Whichever side you’re on, I think we can all agree that sometimes the 3D content can strain your eyes a bit. For some people it’s more than just a strain; it can flat-out cause a serious headache. From my personal experience, most 3D content on YouTube looks great and doesn’t bother me at all. Watching actual 3D movies looks incredible. Most of the 3D videos I’ve taken look great too and don’t bother me at all. However, the pictures can be a different story. I notice the pictures don’t align too well. When they don’t align right, that’s when the eyes start to strain. The more misaligned they are, the more your eyes will strain. Want to test it out? Take a picture in 3D and move the Evo left to right, and you’ll see the alignment of the two pictures. You’ll notice the closer together the pictures are aligned, the easier it is on your eyes…
WELL.. there’s an option with the 3D pictures that a LOT of people don’t know about —-
Adjust 3D alignment
I could tell you how much better it is on your eyes after adjusting (huge difference), but nothing is better than seeing it for yourself firsthand. So here’s a quick guide. It’s extremely easy:
1) Open up your camera (in 3D, of course) and take a picture.
2) Go to the gallery on the bottom left of the camera app
Here’s the demo picture I’m going to use. Notice how insanely misaligned the two pictures are. This is the type of picture that REALLY strains the eyes badly. I can’t look at it for more than 2 seconds…
3) Tap the screen and select the icon that looks like a wand with sparkles:
(btw, I think the image tearing is because of the 3D effect)
4) You’ll get this popup; select Adjust 3D alignment
5) Now you’ll get this screen:
The left and right tabs are what allow you to adjust the two images taken. If you tilt the screen over, you’ll be able to see both images. Now here’s the trick; I’ll try to explain this without being confusing. Because of how the images are taken, it’s impossible to get everything in both images to align perfectly. So the key is to figure out what the focus point of your picture is. In my demo picture, it’s the box. You adjust until the two images are aligned as best you can on the focus point, the box. You’ll notice that things outside the focus point, like the monitor base and the black rag to the right, aren’t aligned as well as the box. But that’s fine, because again the focus point of the image is the box. As long as you align the focus point, the image will not only look much better, it will be a million times easier on your eyes.
Here’s how my demo picture looked before the adjustment, then after..
While I think HTC really should have done a better job with the Evo 3D being able to automatically adjust the images, the fact that they threw this option in there is something to be very happy about. It’s a great option that can bring the fun back to 3D for those that get headaches and strained eyes. So what do you all think? Has it helped at all? If so, will you be taking more 3D pictures?
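The alignment step the guide describes is, at its core, a horizontal shift of one image so the chosen focus point has zero disparity between the left and right views. A minimal numpy sketch of that idea (the function name and the toy one-row "images" are illustrative, not the Evo's actual processing):

```python
import numpy as np

def align_stereo_pair(left_img, right_img, shift_px):
    """Shift the right image horizontally relative to the left and crop
    both to the overlapping region, so a chosen focus point lines up."""
    if shift_px >= 0:
        l = left_img[:, shift_px:]
        r = right_img[:, :right_img.shape[1] - shift_px]
    else:
        l = left_img[:, :shift_px]
        r = right_img[:, -shift_px:]
    return l, r

# Toy one-row "images": the feature sits at column 5 on the left, column 3 on the right.
left = np.zeros((1, 8), dtype=np.uint8)
left[0, 5] = 255
right = np.zeros((1, 8), dtype=np.uint8)
right[0, 3] = 255
l, r = align_stereo_pair(left, right, 2)
print(int(np.argmax(l)), int(np.argmax(r)))  # feature now at the same column in both crops
```

The cropping is also why, as a poster below notes, the adjusted picture can show an "overlap look" at the sides: the shifted images no longer cover exactly the same field of view.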
Wow probably the most helpful post ever lol. Thanks.
Sent from my PG86100 using XDA Premium App
I'm glad it helped you out. I was looking for info on a tripod, since my hands shake so much, and found this; since it helped me, I knew it would help others here.
I have tinkered with this a bit and the only problem I have with it (although it does make for easier focus) is it leaves an overlap look to the sides of the screen where you adjust the pics from their native locations.
Thanks!
Wow, this really helped with a few of the pix I've taken that were hard to focus on!
One other tip that I've found useful when adjusting this setting is to close one eye. I found that my eyes were trying to focus on the main object (the box in your example) and it made it hard to tell when the pictures were actually aligned. By closing one eye and tilting the screen in such a way that both pictures were visible, I was able to align them much quicker.
-Mark
So, I don't have any background or research into this and this is just speculation based on some observations. I might be on a completely other planet with this...
But, I hear a lot of people make the remark about how 3D gives them headaches or hurts their eyes. I'm sure there's a lot to it, but I'm also thinking it has something to do with how our eyes work vs 3d cameras.
Our eyes criss-cross as we look at closer/further objects. If you don't believe me, try looking at your nose, or have someone else look at their nose and watch their eyes. This is what we're used to; it gives us depth from the two varying angles of what we are looking at.
The Evo 3D, Optimus 3D, and virtually every other 3D camera I've seen has two fixed lenses pointing straight out. I understand why: it would be way too difficult to know how much to angle the lenses based on what you're focusing on, and it would be costly and less durable. But because they're pointing straight forward and don't cross, it's really just recording a double image with little difference in angle. Not to mention, this isn't a natural view.
I'm wondering if this has anything to do with the discomfort of people viewing 3D. I don't really have much of a problem with seeing anything in 3D, but I used to. When I first got my E3D, 3D pictures hurt my eyes, but video was fine. I later got glasses, found out my eyes are a hair pointing away from each other (to the point where it's not noticeable). Now, I have more depth from anything I watch in 3D and the photos don't hurt my eyes. But I do notice a double image effect if I look into the background on many things. I know this is because you can't exactly line up the whole image, but it led me to think about the differences.
Now this doesn't offer a solution and is probably a pointless post, but was an interesting idea that someone might enjoy discussing.
It'll strain your eyes some because they have to focus on the light from the screen instead of where it should be if looking past or in front of it. But, most of the headache is from your brain trying to figure out what the hell you're looking at. So basically people with weak minds get headaches more often.
If you can find 3D pics or videos with a slight border around them it helps out a ton with that. It also makes the 3D effect much more pronounced.
The issue with 3D is that the content dictates what is in focus, not you. If you try to only focus on what is already in focus, you should be OK. But it is when you try to focus on distant, out-of-focus objects, which you will never be able to actually bring into focus, that you cause strain.
Don't forget the refresh issue too: you don't get your regular 60 fps, you get (polarized left) (blank) (polarized right) (blank) and so on, and while you won't consciously "see" the blank screen, your brain has to work harder to edit it out, which increases strain by quite a bit.
xHausx said:
So basically people with weak minds get headaches more often.
If only there were a way to OC our minds…
NixZero said:
Don't forget the refresh issue too: you don't get your regular 60 fps, you get (polarized left) (blank) (polarized right) (blank) and so on, and while you won't consciously "see" the blank screen, your brain has to work harder to edit it out, which increases strain by quite a bit.
Our screens aren't polarized. Neither are our eyes equipped to handle polarized content.
3D videos are the standard fps for each eye, and there's a parallax barrier that decides which image goes to which eye. Take a look at any 3D video; you can clearly see that each eye's image is there at the full frame rate.
Two things are happening here.
One reason for the headaches/eye strain is a mismatch between where we focus and where we point our eyes.
As stated above, when we look up close our eyes point in (converge) and we bring the near object into focus (accommodate).
This works really well in the real world and helps our brains decide where things are in space.
With artificial 3D, we continue to focus (accommodate) at the screen, but in order to appreciate the 3D, we need to vary our convergence. This mismatch gives some people (about 20% of the population) varying degrees of eyestrain/headaches.
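The vergence/accommodation mismatch described above can be put in numbers: the eyes' convergence angle depends on target distance, while accommodation stays locked at the screen. A small sketch, assuming a typical ~63 mm interpupillary distance (the distances and variable names are illustrative):

```python
import math

IPD_MM = 63.0  # typical adult interpupillary distance (assumed value)

def convergence_deg(distance_mm):
    """Total angle the two eyes converge by to fixate a target at distance_mm."""
    return math.degrees(2 * math.atan((IPD_MM / 2) / distance_mm))

# Accommodation stays at the screen (~30 cm), but convergence must jump to
# wherever the 3D content appears to be, e.g. popping out at ~15 cm:
screen = convergence_deg(300)
popout = convergence_deg(150)
print(f"screen: {screen:.1f} deg, pop-out: {popout:.1f} deg, "
      f"vergence mismatch: {popout - screen:.1f} deg")
```

The further the apparent depth departs from the actual screen distance, the larger this mismatch, which is consistent with strongly misaligned pictures being the worst offenders.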
Great thread! Lotta nice info here.
I get eye strain with pictures only, but I'm okay with the videos.
Tonight on my way home, the sky above the mountains was a beautiful blood red, and for some reason it comes across as orange/pink from far away and pink/white when zoomed in.
This is not the first cellphone camera to have taken photos where the reds didn't come through. My Galaxy S4, S5, and S7 were not able to take pictures of red either; it always turned out orange.
What am I doing wrong that I can't get reds to translate?
Here are pictures from the different phones. All the red LEDs I used were from the same rolls I bought and were all supposed to be dark red, not orange in any way. The closest I ever got was the last photo, and it's still only about halfway there.
Having a greater understanding of digital cameras, the software used to process the image, and how they work together will help. I'm no expert, but I suspect this link will help:
https://www.dpreview.com/forums/thread/3821409
Could be you need to be using more manual settings when you want to focus on a specific deep color or turn off any automatic processing. Could also be the relatively cheap sensors they're using in cell phones vs "real" camera sensors being used on the higher end in full body shooters.
Looks like you're over-exposing the parts you actually want red by exposing for other parts of the frame. In the mountain shot, there are other spots that are red. In the last shot with the clock and hazard button, the hazard button is close, maybe migrating a little into the pink or magenta shades. (I can see a bit of a magenta cast in other parts, so a WB adjustment needs to be made.) In the shifter photo I can see a bit of red reflecting off the upper shifter shaft, but the actual lights are overexposed if they were red or orange. In the bed-frame shot you have some red at the bottom of the frame, but as you get closer to the light source it washes out. So overall, I'd say it's an exposure issue that isn't really the camera's fault.
In the sky-with-mountains shot you're running into a dynamic range issue, and that'll be a problem for just about any camera. A scene like that would require more than just a couple of frames blended together. You're looking at a background that is still extremely bright compared to the rest of the frame, as it is "indirectly" lit by the sun. The foreground is the next brightest, as it is close to widespread light sources. The mountains themselves and the band of town directly below them are the bottom end of the exposure, and that's where your reds will get to be reds.
The other issue is that sensors all tend to have a bias towards a color they shoot well and one they don't. For a VAST majority of sensors, red will tend to blow out first. This goes for large pro sensors all the way down to cellphones. It's just one of the many things you learn when you really get to know your tools. That's why high-end cameras have had RGB histograms for the longest time, so you can keep an eye on reds clipping. The Camera FV-5 app has the ability to show a live RGB histogram. At this point in my shooting, though, I just shoot, then look at the overall frame and the areas I feel really matter. If I like it or know I can work with it, I keep it. If not, I readjust my settings and reshoot.
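The red-clipping check that an RGB histogram gives you can also be done directly on any saved frame. A minimal numpy sketch; the 250-of-255 "near clipping" threshold is an assumption, not a standard value:

```python
import numpy as np

def clipped_fraction(img, channel=0, threshold=250):
    """Fraction of pixels whose value in one channel is at or above
    `threshold` (i.e. near clipping). `img` is an HxWx3 uint8 RGB array;
    channel 0 is red."""
    ch = img[:, :, channel]
    return float(np.mean(ch >= threshold))

# Synthetic frame: a quarter of the pixels have a blown-out red channel.
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[:50, :50, 0] = 255
print(f"red clipped: {clipped_fraction(img):.0%}")
```

A high clipped fraction in the region you care about is the cue to dial exposure down, which matches the EV-compensation advice given later in this thread.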
you miss to take a photo with google camera, on v30!
Lyvyoo said:
you miss to take a photo with google camera, on v30!
Sorry can you say that again?
slight22 said:
Sorry can you say that again?
I think he's saying that you should have used the Google camera app, word is that it takes far better shots on auto, particularly in situations like this where HDR/HDR+ would be on.
It seems a port was posted a while back; try giving it a go.
Yes, thank you Septfox. I'm sure that Google Camera app will bring much more DR and overall improved results on V30. Don't forget to come back with conclusions!
Example here (V30 vs Pixel 2, and after V30 with Google Camera vs Pixel 2)
There seems to be a lot of variance on the different versions of the Google Camera app. Even different versions of the same port can be a bit hit or miss. I've had a mixed bag of results with it. Last port I used a couple of nights ago, I ended up using the LG camera app shots instead of the Google Camera app ones.
slight22 said:
This is not the first cellphone camera to have taken photos where the reds didn't come through. My Galaxy S4, S5, and S7 were not able to take pictures of red either; it always turned out orange.
What am I doing wrong that I can't get reds to translate?
Reds blow out first. You have to underexpose to prevent that, then brighten the shadows and the rest up in a photo-editing program. Take it at EV -2, say, or lower as required. Do that and you will be able to do better than the Pixel camera.
YMMV; a sunset might be too much to get due to the difference between the sky and the ground. So help the camera: take it when it's darker but still red, so that the difference between brightest and darkest will be less.
It always comes down to one thing: silicon-based computing can't keep up with carbon-based, i.e. YOU.
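The underexpose-then-brighten advice above can be sketched numerically. Working in linear light, negative EV compensation scales the captured signal by a power of two, which can keep a saturated red below the sensor's clip point. The numbers below are illustrative, not measurements:

```python
import numpy as np

def simulate_ev(linear, ev):
    """Scale linear-light values by 2**ev and clip at 1.0, as a sensor clips."""
    return np.clip(np.asarray(linear, dtype=float) * (2.0 ** ev), 0.0, 1.0)

# A red light whose true intensity is 3x the sensor's clip point:
scene_red = 3.0
at_ev0 = simulate_ev([scene_red], 0)     # clips at 1.0 -> hue washes toward pink/white
at_ev_m2 = simulate_ev([scene_red], -2)  # 3 * 1/4 = 0.75 -> the channel survives
print(float(at_ev0[0]), float(at_ev_m2[0]))
# The -2 EV frame is darker overall but can be brightened in post,
# because the red channel never clipped.
```

Once a channel clips there is no information left to recover, which is why brightening in post beats letting the camera overexpose at capture time.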
Thanks everyone for the thoughts, and I will try the Google photo app.
Update: So I tried the Google Pixel photo app (had to download the APK); with HDR on it still doesn't seem to do the job with reds. The elevator button is a solid red, and again it looks pink.
Again, you need to dial back the exposure. Just switching apps isn't going to do the trick.
Yep, try using EV -2 as a quick workaround. If that isn't enough you might have to speed up the shutter to go further.
The idea is to expose it right. Red should appear as red in the photo. Never mind if the photo appears underexposed. It can be brightened later.
You are taking a photo of a light source. The elevator button and everything else is darker in relation. This in itself could be tricking the camera into thinking the scene is too dark, so it brightens it up and blows out the red.
Cameras still don't know what they are taking photos of. They need guidance.
Hi everyone,
I bought my S9+ today and I am so happy. But when I tried super slow motion I noticed a big problem.
If I start the camera in any other mode it looks normal; the room is well lit with 4-5 super strong LED lights. But as soon as I start super slow motion, the screen gets really dark, so dark that it is not usable at all.
Anyone have the same issue?
I am thinking about bringing it back tomorrow, because they have to give me a new phone if it is within the first 8 days.
Here is an example:
Auto mode:
Super Slow Motion
same issue here
arnes_king said:
Hi everyone,
I bought my S9+ today and I am so happy. But when I tried super slow motion I noticed a big problem.
If I start the camera in any other mode it looks normal; the room is well lit with 4-5 super strong LED lights. But as soon as I start super slow motion, the screen gets really dark, so dark that it is not usable at all.
Anyone have the same issue?
I am thinking about bringing it back tomorrow, because they have to give me a new phone if it is within the first 8 days.
Here is an example:
Auto mode:
Super Slow Motion
With more fps, you need more light.
Super Slow Motion requires a lot of lighting in order to be useful, but it also happens with normal videos. Try 4K at 30 or 60 fps and you will notice the same behaviour.
@Galaxo60
Thanks, I know that, so I will try anyway tomorrow with daylight. But my room is very, very well lit, with 4-5 LED lights. I also came across these guys with the same issue :/
https://www.reddit.com/r/tmobile/comments/84q2hx/galaxy_s9s9_super_slow_mo_mode_very_dark/
https://www.youtube.com/watch?v=E9HyqWbpG9A
Could other people write whether their phone is the same or not, just so I know if it really is a defect?
arnes_king said:
Could other people write whether their phone is the same or not, just so I know if it really is a defect?
This is normal behavior for slow motion in a room with LED or any other light except daylight. If you try it outside in daylight, you will notice there isn't any issue. It has to do with slow motion and LED light, which is always flickering (you can't see it with your eyes, but the camera can).
It's normal. The camera takes 1000 pictures per second. You can emulate it by going into pro mode and setting the shutter speed to 1/1000, and you'll see a similar result. If you want a reasonable result inside, you will need professional LED panels; standard LEDs won't be bright enough. Here's a test I did and it looks fine:
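The light cost of that 1/1000 s shutter can be expressed in stops, which makes it clear why indoor lighting falls short. A quick sketch (the 1/60 s baseline is an assumed typical video shutter, not a quoted spec):

```python
import math

def stops_lost(base_shutter_s, fast_shutter_s):
    """Stops of light given up by using a faster shutter than the baseline."""
    return math.log2(base_shutter_s / fast_shutter_s)

# 1/1000 s (the pro-mode emulation above) vs. a typical 1/60 s video shutter:
print(f"{stops_lost(1 / 60, 1 / 1000):.1f} stops less light reaches the sensor")
```

Roughly four stops is a factor of about sixteen, so a room that looks bright to the eye simply cannot feed the sensor enough light at super-slow-motion shutter speeds.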
Thank you all. I tested it now outside with daylight and it works perfectly fine.
I just got my S9+ and it behaves the same, except when I am outside in full light; then it is very well lit and works flawlessly. The slow motion camera is designed to be used in high-light scenarios, so even though your house is highly lit up, you will still see a darker screen and even flickering, as this isn't adequate lighting for this mode.
I'm not sure that's normal, my screen doesn't get so dark when I enter super slow mo, even when I'm not outside
As I'm sure many others have said (feel free to delete this comment if it's just repeating already-said content), the higher the frame rate, the more light you need to capture it, so in some instances it's almost impossible to use. I can't use it in my home at night with just the light on. However, the standard slow-motion setting seems to be fine light-wise (it's 240 fps, so still pretty decent).
It's a novel feature for sure, and if you time it right, very very cool (although the other day I sent a video of my child splashing some water in her swimming pool and my sister said "why do u do that stupid slow motion thing," so I guess not everyone appreciates the coolness of it), but I've found that timing it right is a whole other task in itself.
I also faced the same issue. It happened inside the room, so insufficient light may be the cause. Not only did the phone get stuck, but it also became very hot. Eventually I switched it off and switched it on again after some time.
I have this issue too! And sometimes the slow motion is reversed (end to start).
Clear the data and then apply.
you can clear the data
As someone stated earlier, it's a problem with how LED lights use PWM (Pulse Width Modulation) to simulate the dimming of an incandescent bulb. Our eyes don't perceive this because of POV (Persistence of Vision). The same thing happens with incandescent bulbs due to the frequency (60 hertz in the US) of the AC power. Here are a few links explaining this in detail and why it's important for photographers and high-speed videographers alike: https://www.bhphotovideo.com/explor...er-free-lights-and-why-they-are-important-you
http://www.lovehighspeed.com/lighting-for-high-speed/
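The flicker/banding effect described in the links above falls out of simple arithmetic: mains-powered lights pulse at twice the AC frequency, and a high-speed camera captures many frames within each pulse. The numbers below are illustrative (960 fps is the commonly cited S9 super-slow-motion rate; treat it as an assumption here):

```python
# Illustrative flicker arithmetic for high-speed capture under mains lighting.
AC_HZ = 60               # US mains frequency
FLICKER_HZ = 2 * AC_HZ   # intensity peaks twice per AC cycle -> 120 Hz
FPS = 960                # assumed S9 super-slow-motion capture rate

frames_per_flicker = FPS / FLICKER_HZ
print(f"{frames_per_flicker:.0f} frames span each flicker cycle")
# Each frame samples a different point of the 120 Hz brightness pulse,
# so exposure visibly ramps up and down from frame to frame instead of
# averaging out the way it does at 30 or 60 fps.
```

Daylight has no such pulse, which is why the mode behaves normally outdoors.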
So I got an S9+, and part of the reason I wanted the upgrade was the telephoto lens and having optical zoom. But within an hour of using the phone, playing around with the camera and covering different lenses while browsing through the options, I noticed that I could not get the telephoto lens to activate except a few times, and then only for a picture or two. Everything is being operated through the main sensor with the variable aperture.
I guess my main question regarding the 2x telephoto lens is when exactly is it supposed to come into action? As far as my device goes, I have only consistently been able to get it to work with live focus, with the ability to see both wide angle and telephoto after the fact, so at least I know in some form it does work.
There have been many times when I first enter in the camera app, instantly press the 2x and noticed that the cameras did actually shift, only to have it switch back permanently within seconds.
Shooting video has been a mixed bag as well. I have tried different settings and zooming in and out, and there have only been a couple of times when hitting 2x actually changed to the 2x lens and it took over from there. Besides those few times, there is absolutely no switch, proved both by the lack of a choppy transition between lenses and by the fact that I can cover the sensor and nothing happens in the viewfinder when it says it's at 2x zoom.
Attached are some youtu.be links that I screen-recorded in various situations, using pinch-to-zoom and also the actual 2x zoom icon in the stock camera app, but still not being able to get the 2x to work. It seems like since I've noticed this issue, I can't get the lens to switch unless I'm using live focus, where it has to take a photo from each lens. What's weird is that the phone will digitally zoom all the way from 1x up to 10x and it is horrible, and I could've sworn Samsung advertises it as a 2x optical zoom with added digital zoom to achieve 10x, but how could that be?
Any ideas of what I'm doing wrong, if anything? Or is this something I should bring up with Samsung? Also, I got my phone at Verizon, so should I possibly try to get a replacement?
Thanks to everyone in advance!!
Interesting. Try live focus; the phone switches to that second camera. So does that mean that, in theory, the S9 could also have 2x zoom by way of cropping the image size?
Edit: I should have read your whole post, but I am seeing the same.
Sent from my SM-G965U using Tapatalk
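The crop-zoom idea raised in the post above can be sketched: a 2x "digital zoom" is just the center half of each dimension of the full frame, i.e. a quarter of the pixels, later upscaled for display. A toy numpy illustration (the 12 MP 3000x4000 frame is an assumed sensor size, not a quoted spec):

```python
import numpy as np

def center_crop_2x(frame):
    """Simulate 2x digital zoom: keep the center half of each dimension
    (one quarter of the pixels), which the camera then upscales."""
    h, w = frame.shape[:2]
    return frame[h // 4: h - h // 4, w // 4: w - w // 4]

frame = np.zeros((3000, 4000, 3), dtype=np.uint8)  # assumed ~12 MP main-sensor frame
crop = center_crop_2x(frame)
print(crop.shape[0] * crop.shape[1] / 1e6, "MP left after the 2x crop")
```

That pixel-count loss is the tradeoff: crop-zoom from the bright main sensor versus a true telephoto that keeps full resolution but needs more light, which fits the behavior reported later in this thread.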
Ryan Cordero said:
Interesting. Try live focus; the phone switches to that second camera. So does that mean that, in theory, the S9 could also have 2x zoom by way of cropping the image size?
Sorry, I misspoke a little in my previous post; when I said live photo I meant live focus. And yes, using live focus is the only real way I can see proof the module is working, as after the picture is taken I still have the option to select and edit either the main sensor or the 2x sensor. It's very strange!
The telephoto camera is configured to only work in bright sunlight. Which is annoying.
trenzterra said:
The telephoto camera is configured to only work in bright sunlight. Which is annoying.
That's not true. I have tested it just now in my room, with no bright sunlight or other bright light. When I tap the 2x it zooms optically with the telephoto lens; I can see it when I put my finger in front of it. What I did see is that when I cover the telephoto lens completely, it switches to the other lens, but if it is clear, it works.
So I was able to do some more tests first thing after sunrise, and I am also seeing this as a lighting issue, or rather a lack of light that's causing the main sensor to take over.
Most of my previous tests were done in my room and also in other dimly lit areas. Which explains why I was only able to get it working a couple times and the situation has to be right..
I tried it out last night and I noticed the 2x lens was being used only when I had a background light in the picture, and as soon as the light got out of the viewfinder, it would switch back to the main sensor.
I really appreciate you guys' help on this.
I'm not sure I like how Samsung has this 2x lens set up; it's almost like the 2x sensor is so much worse than the main sensor that in almost all situations the main sensor will end up with a better and cleaner picture. Is this how the Note 8 handles its 2x lens as well?
I've been playing around a little with my S9+, it looks like in auto mode, when set to "X2", it only switches to the secondary (narrow) camera when the light is very bright.
In pro mode (where I would've expected to be able to control the sensor in use), I couldn't get the secondary camera to work at all (even if I manually select F2.4 aperture, that matches both cameras).
I've also noticed the X1/X2 indication in auto mode gets mixed up under certain conditions.
I've been noticing this since day 1. Mostly indoors and low lighting, it's the digital zoom from the main shooter that comes into play. Rarely the Telephoto lens is used for 2x zoom and is used primarily for live focus.
Android 9 tossed the "2x"
I just (02/23/19) upgraded my Galaxy S9 to Android 9 and have found that the camera app no longer has the "2X" on the screen to allow usage of the optical zoom lens. Is there another way to access that?
It looks like this now