Interesting development for underwater imagery / photography

Except it doesn’t work on an iPad any more. Apparently the developers are aware of this, and their suggested fix is to switch to an iPhone. Why in the world would I switch to a smaller screen? I’m looking for something else to replace Dive+.
I agree with you. I think Dive+ was made for their phone housing and they want you to buy the housing. The color correction worked well on my underwater photos in the Photos app on my iPhone, but I'm looking for similar correction for my GoPro videos on my PC.
 
I think there are a few points that some of you may not be considering.

- Yes, we can do white balancing in PS/LR. But that balances everything in the image to the same baseline. Part of what she is saying is that an object further from the camera has lost more of its color than an object close to the camera. So, the white balance should really be different depending on the distance from the camera, which is not something we can (easily) do in PS/LR. I think this is why the video shows her placing her card and shooting it from multiple spots, starting at a distance and swimming closer and closer. That lets her algorithm calibrate itself for how much of each color is lost at each distance. (There's a toy code sketch of this distance-dependent idea at the end of this post.)

- Strobes will give you the color for things that are close, but not things that are far. Her approach would let you shoot ambient light and get the natural color for everything (more or less - I think).

- You might think that shooting ambient light all the time and using this algorithm for color would often be unfeasible because of the slow shutter speeds that would be required. I'm not so sure about that, at least not for the more high-end cameras. Google ISO Invariance if you aren't familiar with it. Bottom line: if there is even halfway decent ambient light, you shoot with no strobes, keep the same shutter speed you would have used, and "fix" the exposure in PS/LR, then use this algorithm to fix the colors. (That "fix it in post" step is also sketched at the end of this post.)

Don't get me wrong on that last point. Even with ISO Invariance, the right amount of light is better than too little light. Shooting in ambient and raising the exposure in post is still going to end up with more image noise than having enough light in the first place. I'm just saying that I think there are a fair number of my own pictures where the ambient light was adequate for a good exposure, but I still used strobes anyway to get the best color. Color I would not have gotten by simply adjusting WB in post. Or maybe I've been doing something wrong in post... But I think this algorithm could potentially let a decent chunk of the photos a lot of people now shoot with strobes be shot without one, with the algorithm applied instead.

Bottom line: I think this algorithm is only intended to be used at depths where there simply IS enough ambient light. I'm just suggesting that depth might be a little deeper than you'd expect at first.

- All of that said, it sounds like her algorithm depends on having a bunch of photos of the dive site to use in order to calibrate itself. That's probably not going to really be feasible for a LOT of situations. It is also not clear to me whether the algorithm would be useful for, say, shooting up from the bottom to take a picture of a big fish in blue water. I say that partly because it's unclear if the algorithm is building a 3D model of the dive site and using that to calculate distances to everything in each photo. If it is, then shooting up into blue water would not allow application of the algorithm as there would be nothing in the photo as a reference from which to calculate distances. I'm considering what I know about her algorithm and what I know about 3D Photogrammetry in saying that. Which, admittedly, is not really a lot about either.

- I suspect that in the process of commercialization, there will be a handful of "standard Sea-Thru profiles" developed that can be used when you aren't able to swim around and take a bunch of photos of the site yourself. Something like a green water profile and a blue water profile, maybe a few variations of each. The photographer could pick which one to use (and/or change it in post, if shooting in RAW and using a post-processing application that knows what to do with these different Sea-Thru profiles).

- For my own photography, I think there would still be times where I would want to use strobes. It lets me put the "focus" (ha ha!) of the photo where I want it and everything else is de-emphasized by being blue. Similar to using shallow depth of field in land-based portraiture, to make the photo subject sharp and really stand out by having everything else in the frame appear fuzzy/blurry. But, I can also envision times where having everything in the photo look like it was shot in air would give a more-pleasing result (like you'd normally see done in a land-based landscape photo). In other words, if this algorithm really "makes it" into public use, it would just be another tool in the toolbox.
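
To make the distance point above a bit more concrete, here is a toy sketch (Python/numpy) of the difference between a single global white-balance gain and a per-pixel correction that grows with distance. To be clear, this is not Sea-Thru itself: the attenuation coefficients and the distance map are made-up placeholders, and it ignores backscatter entirely.

```python
# Toy sketch (NOT Sea-Thru): distance-dependent color correction vs. a single
# global white-balance gain. Assumes a linear RGB image and a per-pixel
# distance map in metres -- both are synthetic placeholders here.
import numpy as np

# Rough per-channel attenuation coefficients (1/m) for clear-ish blue water.
# Illustrative guesses, not measured values.
BETA = np.array([0.40, 0.07, 0.04])  # red, green, blue

def global_white_balance(img_linear, gains):
    """One gain per channel for the whole frame (roughly what a WB slider in PS/LR does)."""
    return np.clip(img_linear * gains, 0.0, 1.0)

def distance_aware_correction(img_linear, distance_m):
    """Boost each channel by exp(beta * distance) per pixel, so far-away objects
    get more red restored than close ones. Ignores backscatter, so only a sketch."""
    gain = np.exp(BETA[None, None, :] * distance_m[..., None])
    return np.clip(img_linear * gain, 0.0, 1.0)

# Synthetic example: a 100x100 linear frame and a distance ramp from 1 m to 5 m.
img = np.random.rand(100, 100, 3) * 0.5
dist = np.tile(np.linspace(1.0, 5.0, 100), (100, 1))
globally_balanced = global_white_balance(img, np.array([2.0, 1.2, 1.0]))
distance_corrected = distance_aware_correction(img, dist)
```

The point is just that the red-channel gain at 5 m works out several times larger than at 1 m, which is exactly what a single WB setting can't give you.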
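And for the ISO-invariance point: the "fix the exposure in PS/LR" step is basically just multiplying the linear raw values by 2^stops, which is what the exposure slider does. A minimal sketch, assuming roughly ISO-invariant raw data (a plain array stands in for real demosaiced raw here):

```python
# Toy sketch of pushing exposure in post on linear data. Assumes an
# (approximately) ISO-invariant sensor; the array below is a stand-in for
# real linear raw data out of a converter.
import numpy as np

def push_exposure(linear_img, stops):
    """Multiply linear values by 2**stops -- the same move as the +EV exposure
    slider -- then clip to the displayable range."""
    return np.clip(linear_img * (2.0 ** stops), 0.0, 1.0)

# Example: a frame shot ~3 stops under to keep the shutter speed up.
underexposed = np.random.rand(10, 10, 3) * 0.1
recovered = push_exposure(underexposed, 3)
```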
 
I'm working on a solution for the X-Rite Passport right now and should be able to test in about a week. The X-Rite is not waterproof, so I did some research into resins to coat the card and seal the water out. I looked for the best windshield chip repair product I could find, for the best optical clarity and resistance to yellowing. I've ordered two different-viscosity resins from Delta Kits and a Convoy 2 UV light, which should arrive in a couple of days. The idea is to put a light coat of resin over the top and seal it so it's 100% waterproof at any depth. I also have a vacuum pump that I'll use to purge any bubbles before curing, to maintain optimal clarity.

I've used an X-Rite underwater once, sealed in double-layered vacuum-sealed bags. It has great potential, but that first try was a long time ago, back when I was just starting to use the X-Rite. Now I use color calibration on land for almost everything I do, and it's very simple: create the profile in X-Rite's program and it automatically shows up in Lightroom.

To get a proper calibration, the camera needs the white balance set accurately for each situation. Once white balance is set, take a photo of the X-Rite Passport. If you don't set the white balance first, the calibration will be off and may shift colors away from what you want. The first time I tried a calibration underwater, the white balance was a little off, and when I applied the calibration everything shifted more yellow.

White balance will help adjust color tone but won't make colors accurate. A calibration will usually make blues and reds deeper than they would be otherwise. I focus primarily on portraits of people underwater (caves are my favorite), and I want the skin tones to come out more accurate and warmer than they do otherwise. Once I can test it out, I plan to write an article and show comparisons.

This concept could be used to produce the same result she is trying to get with her algorithm. You would take multiple photos of the X-Rite color chart at different distances, adjusting the white balance each time so white reads correctly at the chart's distance. Once you have the color profiles, you choose the one that matches your primary subject's distance, or use layers and apply one profile to the subject and another for a different distance. This could also work well with strobes by putting filters on the lights instead of on the lens, blending the natural light with the artificial.
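
For what it's worth, here is a rough sketch of what that per-distance profile idea could look like, boiled down to simple per-channel gains taken from a neutral grey patch on the chart. The patch readings and target value below are made-up placeholders, not measurements, and a real workflow would build full camera profiles rather than three gains:

```python
# Toy sketch: per-channel gain "profiles" built from a grey patch on the chart
# shot at several distances, then interpolated for the subject's distance.
# The patch readings and distances are invented placeholders.
import numpy as np

# (distance in metres, measured linear RGB of a patch that should be neutral grey)
chart_shots = [
    (1.0, np.array([0.30, 0.42, 0.45])),
    (3.0, np.array([0.12, 0.35, 0.43])),
    (6.0, np.array([0.04, 0.28, 0.40])),
]
TARGET_GREY = 0.40  # what the neutral patch should read once corrected

# One set of per-channel gains for each chart distance.
profiles = [(d, TARGET_GREY / rgb) for d, rgb in chart_shots]

def gains_for_distance(distance_m):
    """Linearly interpolate the gains between the two nearest chart distances."""
    ds = np.array([d for d, _ in profiles])
    gain_rows = np.stack([gain for _, gain in profiles])  # shape (n_profiles, 3)
    return np.array([np.interp(distance_m, ds, gain_rows[:, c]) for c in range(3)])

def apply_profile(img_linear, distance_m):
    """Apply the interpolated per-channel gains to a linear RGB frame."""
    return np.clip(img_linear * gains_for_distance(distance_m), 0.0, 1.0)

# Example: correct a frame whose main subject sits about 2 m away.
frame = np.random.rand(50, 50, 3) * 0.5
corrected = apply_profile(frame, 2.0)
```

The layers idea would then just be applying two different distances' gains to the same frame and masking between them.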

I use color calibration for every photo shoot I do on land, and it helps the blues and reds become deeper and richer. It won't add red in, but it tells the image what the value of red should be, so it can correct for what isn't there.
 
Interesting, but it looks like it only works for static images.
 
