Why can my eye pick up more color than a camera sensor at depth?


I've always thought it was cool how quickly the brain color-corrects when you wear rose- or orange-tinted sunglasses, and how everything looks blue after taking them off. Not sure how it works neurologically; maybe someone here can elaborate on that.

I wonder if the brain applies an overall color correction like a filter, or actually adapts in a more complex way and recovers more accurate color in distant objects than a filter could. Dives definitely do seem far less green than cameras make them look afterward.

As for those apps, they'll never work as well as manual correction in Photoshop (or GIMP, etc.). They're not doing anything special that a skilled editor can't do, and because each image is different, you'll get a better result adjusting each slider yourself. Takes some practice, though. The Chinese app is downright sketchy from a privacy/security standpoint.
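For anyone curious what those sliders actually amount to numerically, here's a rough sketch in Python (numpy + Pillow; the file name and gain values are placeholders, not a recipe): manual correction is basically per-channel gains plus a levels stretch, tuned by eye for each image.

```python
# Rough sketch of manual underwater color correction: per-channel gains
# (white balance) plus a levels stretch. The numbers are placeholders; in
# practice you tune them per image, which is the point of doing it manually.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("dive_photo.jpg"), dtype=np.float32) / 255.0

# Boost the red channel (lost first underwater), leave green/blue roughly alone.
gains = np.array([1.8, 1.0, 0.9])          # hypothetical R, G, B slider positions
img = np.clip(img * gains, 0.0, 1.0)

# Simple per-channel levels stretch: map the 1st..99th percentile to 0..1.
lo = np.percentile(img, 1, axis=(0, 1))
hi = np.percentile(img, 99, axis=(0, 1))
img = np.clip((img - lo) / (hi - lo + 1e-6), 0.0, 1.0)

Image.fromarray((img * 255).astype(np.uint8)).save("dive_photo_corrected.jpg")
```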

Check out Derya Akkaynak's work on AI color correction. That algorithm is actually doing things beyond what normal adjustments achieve, though her current method requires 3D (range) data for now. I think there were threads about her work on SB when it was making the rounds in the news.
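Her actual Sea-thru pipeline is more involved, but a toy sketch (made-up coefficients, not her code) shows why range data matters: in a simplified underwater image-formation model, the recorded color depends on the camera-to-object distance, so undoing it needs a range value for every pixel.

```python
# Toy illustration (not Akkaynak's actual Sea-thru code) of why range data helps.
# A simplified underwater image-formation model is
#   I_c(x) = J_c(x) * exp(-beta_c * z(x)) + B_c * (1 - exp(-beta_c * z(x)))
# where z is camera-to-object range. Inverting it for J (the "true" color)
# needs z at every pixel, which is where the 3D data comes in.
import numpy as np

def recover_color(I, z, beta, B):
    """I: HxWx3 image in [0,1], z: HxW range map in meters,
    beta: per-channel attenuation (1/m), B: per-channel backscatter color."""
    t = np.exp(-beta[None, None, :] * z[..., None])   # per-pixel transmission
    J = (I - B[None, None, :] * (1.0 - t)) / np.maximum(t, 1e-3)
    return np.clip(J, 0.0, 1.0)

# Hypothetical numbers: red attenuates much faster than blue in clear water.
beta = np.array([0.40, 0.08, 0.05])
B = np.array([0.05, 0.25, 0.35])
```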
 
Thanks for all the replies! To be clear, I wasn’t asking how to do WB on my camera or to get colors to look “real” in photographs; instead, I was asking how my brain manages to render a shark as brownish grey in my visual field when my camera sensor registers it as blue. A number of comments described how color perception is influenced by mental WB adjusted for preconceptions/expectations, which I find really interesting. I wonder if knowledge of the existence of that mental image processing mechanism will cause sharks to look bluer at depth on my next dive ...
It is possible to capture videos that are very close to what you see. Your sensor needs to take in more light if you want the result to match what you see. I also would not reduce the difference to the brain's auto WB. The human hw/sw combination does auto ISO, auto focus, auto focal length, auto aperture and auto shutter speed, not to mention that our lenses function in a completely different way.
I wonder if knowledge of the existence of that mental image processing mechanism will cause sharks to look bluer at depth on my next dive ...
I have been watching tons of colorful UW videos; unfortunately, I still see everything as before :).
 
If your brain uses prior knowledge to adjust the perceived image, then why does it not apply it to the photograph?
 
Our brain tends to fool us when it comes to color correction. For example, you see greenish shadows after looking at a red light.

This discrepancy between our vision and camera performance can probably be explained by the greater dynamic range of the human eye. In other words, camera sensors have poor color discrimination in the shadows, while the eye still manages. Every camera I've had performed poorly as red and yellow turned darker; the transition is never smooth. Instead of a clean color gradient into dark yellow/red, I see increasing color noise, and then the color simply turns into a mess of colored and black points that looks like dirt. Sensors do much better in green, blue and purple, but those, as it happens, are not the colors lost in deep water.
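To put a rough number on the shadow problem, here's a toy simulation (not a measurement; the photon counts are made up) of why a dim red channel falls apart: with few photons per pixel, shot noise swamps the signal long before the brighter green and blue channels degrade.

```python
# Toy simulation (not a measurement) of why a dim red channel turns to noise:
# with few photons per pixel, shot noise is a large fraction of the signal.
import numpy as np

rng = np.random.default_rng(0)
for photons in (2000, 200, 20):                  # bright -> deep shadow (hypothetical counts)
    signal = rng.poisson(photons, size=100_000)  # shot noise
    snr = signal.mean() / signal.std()
    print(f"{photons:5d} photons/pixel: SNR ~ {snr:.1f}")
# SNR scales with sqrt(photons), so the darkest reds are mostly noise
# long before the greens and blues (which get far more light underwater) are.
```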
 
The biggest issue with comparing photos to what you see underwater is that when you view the image, your brain-eye combination is tuned to the light in your current environment, and that magnifies any deficiency in the way the image was recorded.
The other issue is the dynamic range of the scene: the image you posted goes all the way from deep shadows to near-blown highlights, and there's not much contrast in the midtones. UW your eye compensates for this. Also, as others have mentioned, there's so little red light that the red channel is all noise and turns the whole thing muddy. Have a look at the individual channels to see how little red is recorded.
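If you want to put numbers on that, something like this quick sketch (the file name is a placeholder) will show the red channel sitting far below the others:

```python
# Quick way to check how little red actually got recorded.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("dive_photo.jpg"), dtype=np.float32)
for name, channel in zip(("R", "G", "B"), np.moveaxis(img, -1, 0)):
    print(f"{name}: mean={channel.mean():5.1f}  median={np.median(channel):5.1f}  "
          f"95th pct={np.percentile(channel, 95):5.1f}")
# On a typical uncorrected deep-water shot, the red numbers sit far below
# green and blue; that thin, noisy red channel is what makes the image muddy.
```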
 
If your brain uses prior knowledge to adjust the perceived image, then why does it not apply it to the photograph?
Because the information that is useful to the brain underwater is already lost forever when you look at a photograph.
 
Imagine if we had the same capability as a mantis shrimp, which has the widest color-vision range in the world.

Mantis shrimp have 12–16 types of color photoreceptors, depending on the species. By comparison, humans have only three: red, green and blue.
 
Because the information that is useful to the brain underwater is already lost forever when you look at a photograph.
I don't think prior images are used in color vision; at least, there isn't much mention of it here -> Color vision - Wikipedia. Prior images are used for pattern recognition: when you drive, your brain prioritizes car patterns, so you're more likely to notice a car first and a cyclist later.
 
Sensors do much better in green, blue and purple, but those, as it happens, are not the colors lost in deep water.
Sensors record/store light in the form of electrons, and I don't think electrons have a bias toward a specific color. Reproduction of the colors is done by the camera software.
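Roughly what that software step looks like, as a sketch with made-up numbers (real cameras use calibrated per-model matrices): after demosaicing, white-balance gains and a 3x3 color matrix are applied before gamma encoding.

```python
# Rough sketch of the camera-software step being described: after demosaicing,
# white-balance gains and a 3x3 color matrix are applied before gamma.
# The numbers are placeholders, not any real camera's calibration.
import numpy as np

def develop(raw_rgb, wb_gains, color_matrix, gamma=2.2):
    """raw_rgb: HxWx3 linear sensor data in [0,1]."""
    x = raw_rgb * wb_gains[None, None, :]              # white balance
    x = np.clip(x @ color_matrix.T, 0.0, 1.0)          # sensor RGB -> display RGB
    return x ** (1.0 / gamma)                          # gamma encode

wb_gains = np.array([2.1, 1.0, 1.6])                   # hypothetical underwater WB
color_matrix = np.array([[ 1.6, -0.4, -0.2],
                         [-0.3,  1.5, -0.2],
                         [-0.1, -0.5,  1.6]])
```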
 
I think the OP may be suffering from a less-than-good camera. I did a bunch of dives recently where one of the other divers in the water had a proper DSLR with all the trimmings. Her pictures made me question whether I had been on the same dive; everything was much clearer and more colourful than I remembered. Having a wide lens and strobes flattered the conditions.
 