Fine tuning a Decompression Algorithm


DiveNav

ScubaBoard Supporter
ScubaBoard Supporter
Scuba Instructor
Messages
3,889
Reaction score
488
Location
Southern California
# of dives
200 - 499
I have been asked several times what kind of decompression algorithm DiveNav uses in its eDiving and divePAL simulators, and my answer invariably was ...... "Buhlmann" :wink:
But I realized that you probably wanted a more specific answer, and maybe even a comparison of our algorithm against the ones used in real dive computers.

Existing Data

We did some research and found a very interesting article published by ScubaLab: Digging Deep on 2009's New Dive Computers. This article is based on a report generated by our friend Karl Huggins and his staff from the USC Catalina Hyperbaric Chamber.
This article describes the results of a head-to-head comparison among several dive computers during a series of 4 dives performed at the Chamber.

Our Tests
To test our own decompression algorithm we entered into divePAL the series of 4 dives (dive profiles and surface intervals) described in the original report. We used the recently released divePAL Nitrox because it allows you to plan and analyze series of up to 5 dives.
Here are the profiles for the 4 dives:
Dive 1: divepal_scubalab_1.jpg
Dive 2: divepal_scubalab_2.jpg
Dive 3: divepal_scubalab_3.jpg
Dive 4: divepal_scubalab_4.jpg


.... to be continued ....
 
Of course you realize not many people dive straight Buhlmann anymore, don't you? 80% gradient factors are more common.
Do you realize that NO recreational dive computer has GF settings?

And why 80%? I prefer 15/85
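For anyone unfamiliar with the notation being thrown around here: a gradient factor scales down the supersaturation that plain Buhlmann would allow above ambient pressure, and a pair like 15/85 interpolates from 15% at the deepest stop up to 85% at the surface. A minimal single-factor sketch (the a/b values for the 4-minute compartment come from Buhlmann's published ZH-L16A formulas; everything else here is illustrative):

```python
def gf_limit(p_amb_bar, a, b, gf=0.80):
    """Tolerated tissue loading at ambient pressure p_amb_bar when only
    a fraction gf of the Buhlmann supersaturation margin is allowed."""
    m_value = p_amb_bar / b + a                 # plain Buhlmann M-value
    return p_amb_bar + gf * (m_value - p_amb_bar)

# 4-minute compartment, ZH-L16A: a = 2 * t^(-1/3), b = 1.005 - t^(-1/2)
a1 = 2 * 4 ** (-1 / 3)
b1 = 1.005 - 4 ** (-1 / 2)

surface_plain = gf_limit(1.0, a1, b1, gf=1.0)   # pure Buhlmann, GF 100%
surface_gf80 = gf_limit(1.0, a1, b1, gf=0.80)   # GF-high of 80%
```

With a pair like 15/85, `gf` itself becomes a function of depth, linearly interpolated between the low value at the first stop and the high value at the surface; the single-factor version above is the simplest case.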
 
Once the dive profiles were ready, we "dove" them with divePAL, checking the No Deco Time at each pre-defined point (depth and time).
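For readers curious what "diving" a profile means numerically: each constant-depth segment updates 16 parallel nitrogen loadings toward the inspired nitrogen pressure with the Haldane exponential. A minimal sketch, assuming a simple 1 bar per 10 msw conversion and omitting the water-vapour correction (the half-times are Buhlmann's published ZH-L16 values; this is not divePAL's actual code):

```python
import math

# Buhlmann ZH-L16 nitrogen half-times, in minutes
HALF_TIMES_MIN = [4, 8, 12.5, 18.5, 27, 38.3, 54.3, 77, 109,
                  146, 187, 239, 305, 390, 498, 635]
SURFACE_BAR = 1.0
F_N2 = 0.79  # nitrogen fraction in air

def inspired_n2(depth_m):
    # ambient pressure ~ 1 bar per 10 m of sea water;
    # water-vapour correction omitted for simplicity
    return (SURFACE_BAR + depth_m / 10.0) * F_N2

def update_tissues(tissues, depth_m, minutes):
    """One constant-depth segment: exponential uptake/off-gassing
    toward the inspired nitrogen pressure, compartment by compartment."""
    p_alv = inspired_n2(depth_m)
    return [p_alv + (p - p_alv) * math.exp(-math.log(2) / ht * minutes)
            for p, ht in zip(tissues, HALF_TIMES_MIN)]

# start saturated at the surface, then spend 20 minutes at 30 m
tissues = [inspired_n2(0)] * len(HALF_TIMES_MIN)
tissues = update_tissues(tissues, 30.0, 20.0)
```

After 20 minutes the 4-minute compartment has done five half-times and is nearly saturated at depth, while the 635-minute compartment has barely moved off its surface value.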

We then collected the data in a spreadsheet and compared them against the results presented in the ScubaLab article mentioned earlier.

I was not surprised to find out that our own implementation of the Buhlmann ZH-L16C algorithm was the most aggressive of the group, as I suspected that dive computer manufacturers wanted to be cautious with their own algorithms.
Here, as an example, are the comparison results for Dive 2:

algo_comparison_d2_original.png

Just a few notes on how to read this graph:
- the axis on the left side is NDT in minutes for each point measured
- the axis on the bottom is dive time in minutes
- the axis on the top is depth in feet

The RED LINE represents our original algorithm.
Clearly it was the most aggressive of the group, as it gave the highest NDT value at every measured point.

After analysing the results, we decided that we wanted an algorithm that would fit in the "middle of the road". But how could we achieve this?

...... to be continued ....
 
So, our goal was to come up with a decompression algorithm that would fit in the "middle of the road": not too aggressive, but not too conservative either.

The Buhlmann ZH-L16C algorithm is a mathematical representation of gas absorption and release by the human body when scuba diving.
This algorithm includes 16 compartments that are characterized by HALF TIMES and OVER-PRESSURE GRADIENTS (for a more detailed explanation of how this algorithm works, see the online class Introduction to Dive Computers). Half Times range from 4 minutes to 635 minutes, and Over-Pressure Gradients range from 12.7 msw to 32.4 msw.
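As a rough sketch of how these coefficients turn into a No Deco Time: each compartment tolerates a loading of a + P/b at ambient pressure P, and the NDL is the longest stay after which every compartment could still surface without exceeding its surface limit. The a/b values below are the ZH-L16A baseline from Buhlmann's published formulas (a = 2·t^(-1/3), b = 1.005 − t^(-1/2)); the C variant lowers some of the a values, so treat this strictly as an illustration:

```python
import math

HALF_TIMES_MIN = [4, 8, 12.5, 18.5, 27, 38.3, 54.3, 77, 109,
                  146, 187, 239, 305, 390, 498, 635]
# ZH-L16A baseline coefficients (ZH-L16C reduces some 'a' values)
A = [2.0 * ht ** (-1 / 3) for ht in HALF_TIMES_MIN]
B = [1.005 - ht ** (-1 / 2) for ht in HALF_TIMES_MIN]

def ndl_minutes(depth_m, tissues, surface_bar=1.0, f_n2=0.79):
    """Remaining no-deco time: how long each compartment can stay at
    depth before its loading would exceed the surfacing limit a + P/b.
    Returns infinity if no compartment ever becomes limiting."""
    p_alv = (surface_bar + depth_m / 10.0) * f_n2
    limit = float('inf')
    for p0, ht, a, b in zip(tissues, HALF_TIMES_MIN, A, B):
        m0 = a + surface_bar / b        # tolerated loading at the surface
        if p_alv <= m0:
            continue                    # this compartment can never exceed m0
        if p0 >= m0:
            return 0.0                  # already past the no-deco limit
        k = math.log(2) / ht
        # solve p_alv + (p0 - p_alv) * exp(-k t) = m0 for t
        t = -math.log((m0 - p_alv) / (p0 - p_alv)) / k
        limit = min(limit, t)
    return limit
```

For a diver arriving fresh from the surface, the fastest compartments dominate on deep dives and the slower ones take over as the depth decreases, which is exactly why a single linear tweak affects the four chamber dives so unevenly.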

Over the course of a few days, we ran several simulations with different coefficients for the various compartments.
Initially we thought it would be enough to just apply a linear reduction across all the compartments, but we quickly realized that this approach penalized the fast compartments (and dive 1) excessively, making them too conservative, while barely affecting the slow compartments and dives 3 and 4.
So we dropped the linear approach and decided to fine tune each compartment individually - one by one!
This approach would have been impractical on a real dive computer, as it could have taken a huge amount of time and resources; in our case, however, using divePAL it was relatively fast: it takes only a few seconds to change the coefficients and recompile the code, and only a few minutes to check the algorithm against the series of 4 dives.
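The compartment-by-compartment tuning described above can be pictured as a sweep like the following. This is a hypothetical sketch, not divePAL's actual code: `tuned_m0`, the scale factors, and the example coefficients are all illustrative. The key mechanism is that lowering a compartment's a coefficient lowers its surfacing limit, which shortens the NDL on any dive limited by that compartment while leaving the others untouched:

```python
def tuned_m0(a, b, a_scale, surface_bar=1.0):
    """Surfacing limit (a + P_surf/b) with the 'a' coefficient scaled."""
    return a * a_scale + surface_bar / b

# roughly the 8-minute ZH-L16A compartment: a = 1.0, b ~ 0.6514
base_a, base_b = 1.0, 0.6514

# sweep a few candidate scale factors for this one compartment
for a_scale in (1.00, 0.95, 0.90):
    m0 = tuned_m0(base_a, base_b, a_scale)
    print(f"a-scale {a_scale:.2f}: surfacing limit {m0:.3f} bar")
```

In a real tuning pass one would re-run all four chamber profiles after each change and compare the resulting NDT curves against the reference computers, repeating per compartment.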

At the end of the process we obtained a modified algorithm that is more in line with the algorithms used in recreational dive computers; more specifically, it is mid-to-conservative for single deep dives and middle-of-the-road for series of dives.
Here are the results for the series of 4 dives described in the ScubaLab report:

Dive 1
divepal_algo_d1_optimized.png

Dive 2
divepal_algo_d2_optimized.png

Dive 3
divepal_algo_d3_optimized.png

Dive 4
divepal_algo_d4_optimized.png

So, now you have a better idea of how DiveNav's decompression algorithm compares to the algorithms implemented in several dive computers.
If you want to check out our algorithm yourself, just use divePAL (the Nitrox version), load the series of 4 dives (search for "chamber" as the dive site) and analyze them.

One final thing: when you dive for real, follow YOUR dive computer as it knows exactly what your dive profile is!
 
Your original algorithm seemed to have very close agreement (+/- a minute or so) with the Oceanic VEO 250 simulation you also provided. So I assume now the Oceanic Haldane/Spencer algorithm will be quite a bit more aggressive than your modified Buhlmann. Perhaps you could make the aggressiveness of your algorithm a user option with 3 or 4 levels to select from. This link suggests 4 groups of dive computer algorithms: Gear / Accessories | Scuba Diving Magazine
 
...... Perhaps you could make the aggressiveness of your algorithm a user option with 3 or 4 levels to select from.....
Yes. This is in our plans.
Now that we have divePAL, it is quite easy for us to "experiment" with different sets of coefficients.
Maybe we will have 5 groups: very aggressive, aggressive, average, conservative, very conservative

.....This link suggests 4 groups of dive computer algorithms: Gear / Accessories | Scuba Diving Magazine
Thank you for the link. A bit old indeed.
 
This is a very interesting thread, and I very much appreciate the timely information, DiveNav, as I'm doing research now to buy my first computer. I'm impressed at the very significant differences in NDLs produced by the different algorithms in the testing that you referenced, especially as I was really leaning towards the Mares Puck Air (which seems like a great value) until I saw that its algorithm is by far the most conservative of the units tested.

Like everyone else, I would love to maximize bottom time, but now I'm really wondering about the inherent validity of the different algorithms and how to factor safety into the equation. Mares promotes their implementation of the Wienke RGBM algorithms as being "the most evolved algorithm for reducing the formation of micro-bubbles without compromising dive times", and the research into the formation of micro-bubbles, on which the algorithm is based, certainly would appear to a lay person like me to be an advancement in the field. However, it also seems clear that the algorithm is fundamentally more conservative than competing algorithms, based on the results of both of the units that used the RGBM model (Mares and Suunto).

So my question is: should I believe in the claimed superiority of the RGBM model and sacrifice dive time for the sake of safety by buying the Puck Air, or should I instead believe that more liberal algorithms must be acceptably safe since so many divers have been using them on so many dives, and buy a computer that will give me more dive time? Anyone have an opinion?
 
I pass you the caveat that Bill Hamilton always used: "cutting a table involves marking with chalk and chopping with an axe." So the idea of "fine tuning" is, perhaps, a bit strange.
 
..... I pass you the caveat that Bill Hamilton always used: "cutting a table involves marking with chalk and chopping with an axe." So the idea of "fine tuning" is, perhaps, a bit strange .....
"Fine tuning" in the context of this thread means adjusting the original ZH-L16C algorithm in an attempt to produce an algorithm that would fit "in the middle" when compared to other algorithms tested against a pre-defined set of dive profiles. Nothing more, nothing less.

Of course, I do know that any deco algorithm is just a mathematical model that produces ESTIMATES of the nitrogen absorption and release in our bodies ....
 
