Originally Posted by MikeAurelius
Ummm... so do you.
Here is the basic formula for calculating exposure to any type of radiation:

R * (A / (2 * 3.14159 * r^2))

where R = the radiance of the object in watts per square centimeter, A = the area of the radiating surface in square centimeters, and r = the distance from the radiation source to the eye in centimeters.
Let's say, for the sake of discussion, that the radiance of a given object is 2 watts per square centimeter, the area of the object is 1 square centimeter, and the distance is 10 centimeters. Plugging the data into the formula, you get the following:
2 * (1 / (2 * 3.14159 * 10^2))
2 * (1 / (2 * 3.14159 * 100))
2 * (1 / 628.32)
2 * 0.00159
0.00318
At 10 centimeters, an object measuring 1 square centimeter with a radiance of 2 watts per square centimeter delivers about 0.00318 watts per square centimeter to the eye.
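For anyone who wants to run their own numbers, here is a minimal Python sketch of that same calculation (the function and parameter names are just my own labels, not anything standard):

import math

def irradiance_at_eye(radiance_w_per_cm2, area_cm2, distance_cm):
    # Hemispherical spreading estimate: R * (A / (2 * pi * r^2)),
    # returning the approximate power density (W/cm^2) reaching the eye.
    return radiance_w_per_cm2 * (area_cm2 / (2 * math.pi * distance_cm ** 2))

# The 1 square cm object with a radiance of 2 W/cm^2, viewed from 10 cm:
print(irradiance_at_eye(2.0, 1.0, 10.0))  # prints roughly 0.00318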
Now, I haven't been able to find any data suggesting that a television or computer monitor puts out anywhere near 2 watts per square cm. More likely it is in the 1/4 watt range or less.
My Samsung smartphone has a screen measuring 6 x 11 centimeters (66 square cm). Let's assume the output is 0.25 watts per square cm (I believe it is certainly less than that) and that I read at a distance of 30 cm.
From above:
.25 * (66 / (2 * 3.14159 * 30^2))
.25 * (66 / (2 * 3.14159 * 900))
.25 * (66 / 5654.86)
.25 * .01167
0.00292 watts per square centimeter entering the eye
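Running the phone numbers through the same sketch function from above gives the same answer (again, the 0.25 watts per square cm is an assumed upper bound, not a measurement):

# 66 square cm screen, assumed 0.25 W/cm^2 output, read at 30 cm:
print(irradiance_at_eye(0.25, 66.0, 30.0))  # prints roughly 0.00292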
Now, all of this supposes that the radiation emitted from the source is spread roughly evenly across the spectrum, as a broadband black-body emitter is, and we know this is not the case for a display. To be precise, you would need the exact emission at each wavelength across the spectrum you are interested in and plug the numbers in accordingly. But this basic formula gives a close approximation for a "white light" emitter.
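If you did have per-wavelength emission data, the extension is straightforward: apply the same geometric factor to each wavelength band and sum the results. A rough sketch, with the per-band values purely made up for illustration:

import math

def band_irradiance_at_eye(band_radiance_w_per_cm2, area_cm2, distance_cm):
    # band_radiance_w_per_cm2 maps a wavelength band label to the emitted
    # power density in that band (W/cm^2). Each band gets the same
    # geometric spreading factor A / (2 * pi * r^2).
    geometry = area_cm2 / (2 * math.pi * distance_cm ** 2)
    return {band: r * geometry for band, r in band_radiance_w_per_cm2.items()}

# Hypothetical (made-up) per-band output for a 66 square cm screen at 30 cm:
spectrum = {"400-450 nm": 0.05, "450-550 nm": 0.10, "550-700 nm": 0.10}
per_band = band_irradiance_at_eye(spectrum, 66.0, 30.0)
print(per_band)
print(sum(per_band.values()))  # same ~0.00292 total if the bands sum to 0.25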
I've looked at a lot of the white papers written about the so-called "blue light hazard," and there is no detailed reference information about where the underlying data came from, how it was calculated, or what energy source was used. There's plenty of "this is bad," but nothing to back it up.
Yes, photon energy increases as you move from red toward blue; that is scientific fact. The question remains, though: at what point does it become hazardous to the eye? It's been established that 380 nm and below is hazardous. But how did the "researchers" get from 380 nm to 430 nm? Where's the proof that it's "hazardous"? The only thing I've seen is the oft-repeated "we don't need to see it," and I find that to be a lot of hokum.