In an ideal theoretical world, yes...but in the real world, any constant-current regulated light will drop out of regulation because batteries are not an infinite power source.
How long a light will run out of regulation depends mostly on the power source. If you use protected li-ion cells, they will just cut off...no warning. Unprotected li-ion cells will fall off a short, steep voltage curve. Alkaline cells will decline down a very long, shallow curve. Lithium primaries and NiMH cells lie between these extremes.
The important thing is that, when most lights drop out of regulation, they continue to deliver only imperceptibly lower levels of light for a significant time. And the amount of light we actually see is what matters.
When you suggest "-15% rated lumen output" as the runtime metric cutoff, I'm guessing that you actually mean "-15% perceived light output". They are not the same. The human eye responds non-linearly to light intensity.
Put simply, your eyes/brain cannot see the difference between 100% lumens and 85% lumens in most cases. Lumens must be doubled (200%) or halved (50%) before you will really notice a difference. As a general rule of thumb, a light needs roughly five times the lumens to look twice as bright: a 1000-lumen light looks about twice as bright as a 200-lumen light. Non-intuitive, but that's the way our eyes work.
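The rule of thumb above can be sketched as a simple power-law model of perceived brightness. The exponent here is not a measured constant; it is simply chosen so that 5x the lumens comes out twice as bright, matching the numbers in this post:

```python
import math

def perceived_ratio(lumen_ratio: float,
                    exponent: float = math.log(2) / math.log(5)) -> float:
    """Perceived-brightness ratio for a given lumen ratio, assuming a
    power-law response (exponent ~0.43 makes 5x lumens look 2x as bright)."""
    return lumen_ratio ** exponent

# Dropping to 85% of rated lumens is only ~7% dimmer to the eye:
print(round(perceived_ratio(0.85), 2))  # -> 0.93
# Even half the lumens is only ~26% dimmer to the eye:
print(round(perceived_ratio(0.5), 2))   # -> 0.74
```

So under this model, the jump from "rated output" to the 50%-lumens cutoff is a far smaller step visually than the raw numbers suggest.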
So...assuming you were talking about perceived brightness, 50% of lumen output is right on target.
By the way, this is not my idea. The guys in the flashlight communities like candlepowerforums came to the same conclusion long ago...and they demand this rating from the cutting-edge flashlight manufacturers:
runtime = time to 50% of rated lumens
It makes good sense. Think about it.
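That metric is straightforward to compute from a measured output-over-time trace; a minimal sketch (the sample trace here is invented for illustration):

```python
def runtime_to_half(samples, rated_lumens):
    """Return the first timestamp (minutes) at which measured output falls
    below 50% of rated lumens; samples is a list of (minutes, lumens) pairs."""
    cutoff = 0.5 * rated_lumens
    for minutes, lumens in samples:
        if lumens < cutoff:
            return minutes
    return None  # never dropped below 50% during the test

# Hypothetical trace: regulated near 500 lm, then sagging as the cell empties
trace = [(0, 500), (30, 498), (60, 495), (90, 430), (100, 310), (110, 240)]
print(runtime_to_half(trace, rated_lumens=500))  # -> 110
```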
J
Hi Jaydil,
I think that if you can't see the difference in a light that is 49% less bright, you should just buy a light with 49% less luminous flux...after all, you can't tell the difference.
Seriously though, the human visual system (HVS) responds differently to different wavelengths of light. Between photopic (daylight) and scotopic (night-time) conditions these sensitivities shift even further (towards blue) as the rods take over to perceive lightness.
Every human will have a slightly different response to lightness at a particular wavelength, so how can you generalize that a 49% drop won't be perceived at all? At what wavelength? Under what conditions? How fast is the change? A slow change in light is far less perceptible than a quick one.
At the end of the day you will have fewer lumens illuminating the area you wish to illuminate. That is not what I paid for.
The best idea I have heard yet is to rate the light output (in lumens) at the end of the specified runtime. Then you know what you are at least guaranteed to get throughout the runtime. Granted, you could get this figure by just taking half the quoted lumen value. But stating the runtime to half the lumen value is misleading to those not in the know, and it is not good customer-focused marketing.
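That end-of-runtime rating can be pulled from the same kind of measured trace: report the lowest output seen within the specified runtime window (again, sample data invented for illustration):

```python
def guaranteed_output(samples, runtime_minutes):
    """Lowest measured output (lumens) within the specified runtime,
    i.e. what the buyer is at least guaranteed to get.
    samples is a list of (minutes, lumens) pairs."""
    in_window = [lumens for minutes, lumens in samples
                 if minutes <= runtime_minutes]
    return min(in_window)

trace = [(0, 500), (30, 498), (60, 495), (90, 430), (120, 260)]
print(guaranteed_output(trace, runtime_minutes=90))  # -> 430
```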