I personally don't use lumens to rate a light; watts tend to be more accurate.
I am sure they get their 3800 number; the question is at what distance from the LED it is measured. That's the trick.
Please explain to me how a measurement of power consumption is more accurate for brightness than a measurement of the total amount of visible light. You can back into lumen output, but you need the rating of the emitters (lumens/watt), the total efficiency of the system (losses in the drivers), and then the optical efficiency of the lens itself (BB uses very cheap lenses that are not particularly efficient).
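To put numbers on that, here's a rough sketch of the chain you'd have to work through; the efficacy and efficiency figures in it are placeholder assumptions for illustration, not measured values for any specific light.

```python
# Rough sketch: backing into lumen output from battery-side power draw.
# Every number here is an assumption for illustration only.

def estimate_lumens(watts_at_battery, driver_eff, emitter_lm_per_w, optical_eff):
    """Lumens out the front = battery watts x driver efficiency
    x emitter efficacy (lm/W) x lens/optics efficiency."""
    watts_at_led = watts_at_battery * driver_eff      # driver losses
    lumens_at_led = watts_at_led * emitter_lm_per_w   # emitter efficacy
    return lumens_at_led * optical_eff                # lens losses

# Hypothetical: 20 W at the battery, decent emitter, cheap lens
print(round(estimate_lumens(20, driver_eff=0.85, emitter_lm_per_w=100, optical_eff=0.75)))
# -> 1275 lumens out the front, not the naive 20 W x 100 lm/W = 2000
```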
Now, I gave you a rough watt consumption for that light. It is an average, and there are many assumptions being made, but if you have a 50 Wh pack that burns for 3 hours, then it's a 50/3 = 16.6 W light. Problem is you can't discharge those batteries to 100%, and there are all sorts of efficiency losses that have to be factored in.
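For what it's worth, here's the same 50 Wh / 3 hour arithmetic with a usable-capacity adjustment folded in; the 80% usable figure is just an assumption to show which way it pushes the number.

```python
# Average draw from pack capacity and burn time (back-of-envelope).
pack_wh = 50           # nominal pack capacity
burn_hours = 3         # claimed burn time
usable_fraction = 0.8  # assumption: you can't pull 100% out of the pack

naive_avg_watts = pack_wh / burn_hours                       # 50 / 3 = ~16.7 W
adjusted_avg_watts = pack_wh * usable_fraction / burn_hours  # ~13.3 W actually reaching the light

print(round(naive_avg_watts, 1), round(adjusted_avg_watts, 1))
```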
IF they get their 3800 number, it's for a brief second immediately as the light turns on, and then it falls off like the chart in the picture. The batteries can't supply that much power for very long. If it were a true 3800 lumens, it would have about a 30-minute burn time on high.
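As a sanity check on that burn time, here's the same math run backwards from a constant 3800 lumens; the efficiency numbers are hypothetical, but any reasonable set of them lands you around half an hour, not three hours.

```python
# If the light really held 3800 lumens, how long could the pack run it?
# Efficiency numbers below are assumptions, not Big Blue specs.
target_lumens = 3800
emitter_lm_per_w = 90   # assumed emitter efficacy
driver_eff = 0.85       # assumed driver efficiency
optical_eff = 0.75      # assumed cheap-lens efficiency
usable_wh = 50 * 0.8    # assumed usable energy from a 50 Wh pack

system_lm_per_w = emitter_lm_per_w * driver_eff * optical_eff  # ~57 lm per battery watt
battery_watts = target_lumens / system_lm_per_w                # ~66 W at the battery
burn_minutes = usable_wh / battery_watts * 60                  # ~36 minutes

print(round(battery_watts), round(burn_minutes))
```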
Watt consumption is fine if you are comparing lights from the same manufacturer, e.g. the LD-40 and the LD-20, which are engineered the same, so you have the same variables at play. But it's not fair to use those same values when comparing lights from different manufacturers, and here's why.
Companies like UWLD, Dive Rite, Light Monkey, Light and Motion, etc. all use constant-output drivers, i.e. they consume about the same amount of power from start to finish and as such have a relatively constant light output. You can see this in the chart I posted from the LX20. When you look at lights from manufacturers that aren't quite so honest about their lights' performance, you get lights like the ones from Big Blue, which do not use constant-output drivers, and the output falls off drastically. That is why I used the term "average": you have no idea what the watt consumption is from start to finish. We can back into an average watt draw with some assumptions, but we can only use that as an average over the duration.
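To show why I keep saying "average", here's a toy comparison of a constant-output driver against one whose draw sags over the dive; both curves are made up, but the point is that a single watt figure only describes the falling light as an average.

```python
# Toy example: constant-output driver vs. a sagging, unregulated one.
# Both "curves" below are invented for illustration.
constant_draw = [17] * 6               # ~17 W every half hour, start to finish
sagging_draw = [30, 24, 18, 13, 9, 6]  # starts bright, falls off over the burn

avg = lambda xs: sum(xs) / len(xs)
print(avg(constant_draw))  # 17.0 -- one number really does describe the whole burn
print(avg(sagging_draw))   # ~16.7 -- nearly the same "average", wildly different light over the dive
```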
One last point about why watts are a bad unit to use: LEDs have all sorts of different efficiency values depending on what is being done with them. An LED's efficiency depends on the power being sent to it, and it falls off as the LED gets hot. Some manufacturers under-drive their lights to improve efficiency, and some over-drive them to maximize light output. That value can go all over the place, from as low as 60 lumens/watt all the way up to about 120 lumens/watt, which makes a comparison based on power consumption all but useless when you are comparing manufacturers or, heaven forbid, different generations of lights (the original LM 12W LED, for example, was only about 450 lumens total, which is less than 40 lumens/watt).
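Here's how much that efficacy spread matters in practice; the 15 W draw is an arbitrary example, and the old Light & Motion figure is the one quoted above.

```python
# Same power draw, very different light, depending on emitter efficacy.
watts = 15  # arbitrary example draw
for lm_per_w in (60, 90, 120):
    print(lm_per_w, "lm/W ->", watts * lm_per_w, "lumens")
# 60 lm/W -> 900 lumens
# 90 lm/W -> 1350 lumens
# 120 lm/W -> 1800 lumens

# And the old-generation example from above:
print(450 / 12)  # the original 12 W LM light: 37.5 lm/W, under 40
```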
If a manufacturer claims lumens, it's the right unit to use for comparison, but you have to be smart about it, because lux is also important; that's the intensity of the beam in the middle. Video lights, for example, have very low lux but very high lumens because there is no hotspot and the beam is quite wide. A laser pointer has high lux but low lumens, so it's all a balance. But watts is not the right unit to use when comparing lights to one another because of the wild variation in efficiencies.