Making Lemonade in SoCal


I was supposed to have four days diving the USS Oriskany this past week but we canceled last minute due to Hurricane Ian.

Instead, I hastily put together three days of diving here in Southern California to work on some long-standing projects. Oh yeah, we also found and identified a new wreck. :) I also built a couple of cool new photogrammetry models and started on a model of the Yukon.

The conditions were good, with good visibility in deep water. A summary of the week is in the post below. I'll be working on detailed posts for each of the dives and discoveries.


Here is a screenshot of the photogrammetry model of the new wreck we found & identified:

[Screenshot: photogrammetry model of the new wreck]


When life gives you lemons, make lemonade!

Regards,

- brett
 
Nice work.
I need to get back into Southern California waters. I've been out of it way too long.
There is gear that has never seen salt water.
 
Brett, I'm working on some photogrammetry models as well, and I only recently discovered Apple's Object Capture, which is built into both iOS and macOS. It takes away a lot of the pain of Agisoft/Meshroom. I can scan and create a model on my phone in very little time, with all imaging done underwater. If you have not tried it, check out the app PhotoCatch -- it is a good intro to the technology.
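If you'd rather script it than use an app, the same engine is exposed as RealityKit's PhotogrammetrySession (macOS 12+, and now recent iPhones too). Here's a rough sketch of the kind of command-line tool I mean -- the folder and output paths are just placeholders, and I'm assuming a folder of overlapping stills from a swim-around:

```swift
import RealityKit

// Placeholder paths -- point these at your own captures.
let imagesFolder = URL(fileURLWithPath: "/path/to/stills", isDirectory: true)
let outputModel  = URL(fileURLWithPath: "/path/to/wreck.usdz")

var config = PhotogrammetrySession.Configuration()
config.featureSensitivity = .normal      // try .high for low-texture subjects
config.sampleOrdering = .unordered       // swim-around stills arrive in no particular order

let session = try PhotogrammetrySession(input: imagesFolder, configuration: config)

// Watch progress and results as they stream in.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("progress: \(Int(fraction * 100))%")
        case .requestComplete(_, .modelFile(let url)):
            print("model written to \(url.path)")
        case .requestError(_, let error):
            print("failed: \(error)")
        default:
            break
        }
    }
}

// Kick off reconstruction. (In a real tool you'd keep the process alive
// until the session reports completion.)
try session.process(requests: [.modelFile(url: outputModel, detail: .medium)])
```

As far as I can tell, PhotoCatch is essentially a friendly front end over this same framework.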
 

Very interesting, and thanks for the information. I'll have to check it out. What housing do you use with your iPhone underwater? Do you use any artificial light?

- brett
 
I use the DiveVolk housing. I know you dive quite a bit deeper than I normally do so that may be a problem. The DV housing is nice in that it allows full use of the touchscreen.

So far I have not used lighting, but I will be soon. I have been testing at depths of only 10-30', so there is plenty of light. Even at those depths, though, the texture quality is much better when there is more light. One cool thing about Object Capture is that it can take video as an input, so if you have a video light you can just record the subject from all sides/angles and then feed that to the app. It works very well at that level. If you want accurate scaling -- not absolute accuracy, but model-to-model accuracy -- you need to use still photos taken in Apple's Portrait Mode, the mode that creates a depth map. The depth map is then used to compute scale, and it is included in the final model. It's all quite impressive, IMO. The absolute scale is not accurate, though; I think refraction might throw it off, since the phone uses two cameras and then does the required math.
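If you're curious whether a given still actually carries the depth map, you can check with AVFoundation/ImageIO. A quick sketch -- the file path is a placeholder for one of your own Portrait Mode captures:

```swift
import AVFoundation
import ImageIO

// Placeholder path to a Portrait Mode photo.
let photoURL = URL(fileURLWithPath: "/path/to/portrait_still.heic")

guard let source = CGImageSourceCreateWithURL(photoURL as CFURL, nil),
      let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
          source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any],
      let depth = try? AVDepthData(fromDictionaryRepresentation: auxInfo)
else {
    fatalError("no disparity/depth auxiliary data in this image")
}

// Portrait captures usually store disparity; .absolute vs .relative tells you
// whether the values are meant to be trustworthy for real-world scale.
print("depth type:", depth.depthDataType)
print("accuracy:", depth.depthDataAccuracy == .absolute ? "absolute" : "relative")
```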
 

I took a look and it says the depth rating is 80 m, which probably covers 90% of my dives.

I hate "big" phones and mine is an iPhone 12 Mini, so I'm not sure how useful it would be underwater, but maybe I'll play around with it. Could be an interesting tool.

- brett
 
If you don't need accurate scaling, you can just use a video as the source. Swim around/over your object, video it from all sides/angles, then run the video through the app. The video can be captured separately from the app; all you do in the app is tell it what source data to use, either a video or a bunch of stills. So you could test it out using your current imaging rig.
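And if you ever want stills instead of handing the app a video, you can pull frames out yourself with AVFoundation. A sketch -- the paths and the one-frame-per-half-second sampling rate are just assumptions:

```swift
import AVFoundation
import ImageIO
import UniformTypeIdentifiers

// Placeholder paths; the 0.5 s interval is an arbitrary choice.
let videoURL  = URL(fileURLWithPath: "/path/to/wreck_pass.mov")
let framesDir = URL(fileURLWithPath: "/path/to/frames", isDirectory: true)
try FileManager.default.createDirectory(at: framesDir, withIntermediateDirectories: true)

let asset = AVURLAsset(url: videoURL)
let generator = AVAssetImageGenerator(asset: asset)
generator.appliesPreferredTrackTransform = true
generator.requestedTimeToleranceBefore = .zero   // grab the exact frame requested
generator.requestedTimeToleranceAfter  = .zero

let duration = CMTimeGetSeconds(asset.duration)
var index = 0
for t in stride(from: 0.0, to: duration, by: 0.5) {
    let time = CMTime(seconds: t, preferredTimescale: 600)
    let frame = try generator.copyCGImage(at: time, actualTime: nil)
    let frameURL = framesDir.appendingPathComponent(String(format: "frame_%04d.jpg", index))
    if let dest = CGImageDestinationCreateWithURL(
        frameURL as CFURL, UTType.jpeg.identifier as CFString, 1, nil) {
        CGImageDestinationAddImage(dest, frame, nil)
        CGImageDestinationFinalize(dest)
    }
    index += 1
}
print("wrote \(index) frames to \(framesDir.path)")
```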
 

If you want to mess around with a Kraken housing, let me know; I'm not diving with it for a while as I figure out the rEvo more.


Interesting. Does your phone housing allow you to use the native phone camera app? The Kraken housing unfortunately does not.
 
Yes, the DiveVolk housing has a double layer of heavy vinyl that allows full use of the touch screen. I use a few different apps underwater and needed that feature.

I would not use a housing that requires its own app, as the developer can go belly up at any time and then you are stuck with hardware that is not usable.
 

Interesting. Yeah, for iOS you need a third-party app, but for Google Pixel phones Google has its own app that interacts directly with the housing and allows for native camera usage.

I hadn't heard of DiveVolk before. Interesting product. Not a fan of the requirement to remove screen protectors, or of the per-phone fit kit. The newer model, which seems to be the only one that fits the latest iPhones, is also only rated to 60 m. It's all about trade-offs. Seems like a good product, though.
 
