DragonlensmanWV N.A.O.L.
"There is nothing patriotic about hating your government or pretending you can hate your government but love your country."
I assume a case could be made that any medium through which light passes could cause acuity loss.
Certainly, any "real," homogeneous optical medium will at least 1) reflect some degree of light at boundary interfaces and 2) disperse light, both of which can degrade the quality of the image. Absorption and light scatter can degrade image quality even further. But these effects have to be rather significant before visual acuity is noticeably affected. Keep in mind that the "just noticeable difference" for a typical observer is roughly 0.25 D of defocus, which actually represents a considerably greater reduction in contrast and image quality than the presence of low levels of these other effects.
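The boundary-interface reflection mentioned above can be sketched numerically. This is a minimal illustration, assuming nominal refractive indices for each material (not manufacturer specifications), of single-surface Fresnel reflectance at normal incidence:

```python
# Sketch: single-surface Fresnel reflectance at normal incidence,
# R = ((n2 - n1) / (n2 + n1))**2, using assumed nominal index values.

def fresnel_reflectance(n1: float, n2: float) -> float:
    """Fraction of incident light reflected at a boundary (normal incidence)."""
    return ((n2 - n1) / (n2 + n1)) ** 2

materials = {"CR-39": 1.498, "Polycarbonate": 1.586, "Hi-index 1.67": 1.670}

for name, n in materials.items():
    r = fresnel_reflectance(1.0, n)  # air-to-lens surface
    # A lens has two surfaces, so uncoated loss is roughly double.
    print(f"{name}: {r:.1%} per surface, ~{2 * r:.1%} total uncoated")
```

The loss grows with index, which is part of why anti-reflective coatings become more worthwhile on higher-index materials.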
Darryl J. Meister, ABOM
I assume that we can then make the argument about the above! If you combine the effects of power, material, index, Abbe value, etc., with possible variances in the refractive findings on any given day, we can open up a whole hell of a Mighty Pandora's Box!!
When is Barry going to chime in?
;):cheers::D
Last edited by Fezz; 07-15-2008 at 03:42 PM.
The whole question of losing 2 lines of resolution comes down to what measurement or chart you are using (you have to define your "line"). This entire discussion assumes we are talking about the Snellen chart, which measures visual acuity only, not resolution. There are other charts for resolution, such as the Koren, USAF 1951, and the I3A/ISO.
Poly is unlikely to drop a line on the Snellen chart, but very likely to on tests specifically designed to measure resolution. So if the lab tech in Reply 1 is talking about those charts that refer specifically to resolution, he could be correct.
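The Snellen convention underlying this distinction can be made concrete. A standard optotype subtends 5 minutes of arc overall (with 1-arcmin strokes) at its rated distance; the sketch below computes letter heights from that geometry, assuming the usual 20 ft ≈ 6.096 m test distance:

```python
import math

def snellen_letter_height_mm(denominator: float,
                             distance_m: float = 6.096) -> float:
    """Overall height of a Snellen optotype: 5 arcmin at the rated
    distance, scaled by denominator/20 (20 ft = 6.096 m assumed)."""
    mar_arcmin = denominator / 20.0                   # minimum angle of resolution
    angle_deg = 5.0 * mar_arcmin / 60.0               # 5-arcmin letter, in degrees
    return math.tan(math.radians(angle_deg)) * distance_m * 1000.0

for denom in (20, 40, 200):
    print(f"20/{denom} letter at 20 ft: {snellen_letter_height_mm(denom):.1f} mm")
```

The 20/20 letter works out to just under 9 mm tall at 20 feet, and each line scales linearly with the denominator.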
I think we all know that this question is both subjective and objective. I have a -10.00 pt who loves poly, and a -1.00 pt who doesn't.
We ALL know that Poly is not the best optically, and we ALL know that most patients won't notice the difference.
All I know is that I need another Calculus class.
Sharpstick
In the United States, at least, more often than not we're referring to the classic Snellen chart, unless we're dealing with a research environment. For corrected acuities in the 20/20 range, adjacent Snellen lines would typically represent +/-5 feet. This represents a difference of roughly +/-1 minute of arc, give or take.
This is a good point to make, particularly given the relatively small number of individuals who really fall into the higher prescription category generally associated with meaningful levels of chromatic aberration. Of course, your comment regarding subjective observations also alludes to the potential for symptomatic wear even in lower prescriptions. It's definitely difficult to generalize...
Darryl J. Meister, ABOM
Never overlook the fact that 20/happy is interchangeable with 20/not complaining. This is a very fine line that I don't think gets the attention it deserves. I know OMDs who put as many of their patients as possible in O.U. Rxs for contacts, thinking that their patients appreciate the convenience of not being able to get them mixed up more than the best vision they can get.
Same could be said for poly usage....
Bad medicine, bad bad medicine....
Chip
Well, for me, it's more about some of the factors that are either assumed or not understood:
1. Does the given Rx really represent maximum acuity, i.e., full DV and astigmatism/axis correction... (you never know)
2. What has been the "habitual" vision/Rx? We're keenly sensitive to comparative judgments, and humans often come to erroneous conclusions about what "looks" better (at least at first)
3. Darryl's material on OptiCampus about chromatic aberration is great, but needs to be taken further. Yes, our eye has significant axial chromatism. Yet when even an emmetrope looks at the full moon, does he/she see color fringing? No. Why? Neurological adaptation.
4. Same goes for the inherently decentered optics of the human eye. It has significant coma, but do you notice? No. Why? Neurological adaptation.
5. The wallies of poly: Anyone on the board who has one of Dr. Morrison's Enigma/Contour optics eyewear (Darryl, just why did Zeiss trash this great technology? Was it NIH?) knows that there is no lateral color error with these lenses, made from poly. I really believe it is the industry's fascination with flat lenses that exacerbates lateral color with poly. And yes, I know that higher-index lenses should be flatter in form for maximum axial correction and sharpness.
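The lateral color effect in point 5 is commonly estimated as Prentice-rule prism divided by the material's Abbe value. Here is a minimal sketch under that approximation, using assumed nominal Abbe values (CR-39 ~58, polycarbonate ~30) and an arbitrary example prescription:

```python
def lateral_color_prism_diopters(power_d: float, decentration_mm: float,
                                 abbe: float) -> float:
    """Approximate transverse chromatic aberration, in prism diopters:
    Prentice prism (c * F, with c in cm) divided by the Abbe value."""
    prism = abs(power_d) * decentration_mm / 10.0  # Prentice's rule
    return prism / abbe

# Assumed nominal Abbe values; -5.00 D lens viewed 10 mm off axis.
for name, abbe in [("CR-39", 58.0), ("Polycarbonate", 30.0)]:
    tca = lateral_color_prism_diopters(-5.00, 10.0, abbe)
    # A commonly quoted visibility threshold is on the order of 0.1
    # prism diopters of chromatic prism.
    print(f"{name}: {tca:.3f} prism diopters of lateral color")
```

Under these assumptions, the same gaze angle that stays below the oft-quoted threshold in CR-39 exceeds it in poly, which is consistent with the flat-form complaint above: flatter forms push wearers to use more off-axis lens area.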
It really comes down to contrast: who has it, and who doesn't.
Barry
Sidenote / thread Hi-Jack-
My pair of Enigmas is just about DOA. The lenses have spider-cracked so badly that I can't keep them tight, and the scratches... well, let's just say that I put the AR to the test! I will miss them when they are gone!
:cheers::cheers::cheers:
Adaptation plays a role, but keep in mind that the lateral chromatic aberration of the eye, which is responsible for color fringing, is not as significant as the axial chromatic aberration of the eye, particularly since the sensitivity of the retina drops off rapidly away from the fovea.
Unfortunately, it was extremely difficult and expensive to mold and edge these lenses. So, given the market demand for the product at the time, it wasn't really cost-effective to continue making them.
Darryl J. Meister, ABOM
Thanks again for your thoughtful response. I'll concede that an extended object like the moon is influenced by lateral chromatism. But a star (being a point) is not. And no one I know sees color fringing, naked eye, on stars (axial chromatism).
Shame about the Contour optics. They certainly have a more important place in our field than yet another free-form progressive, IMHO.
Barry
Last edited by Barry Santini; 08-27-2008 at 07:16 PM.
Color fringing from lateral chromatic aberration of the eye is actually visible, but we are generally just unaware of it. If you move a card slowly over your pupil while staring at the edge of a white, brightly illuminated window sill, for instance, you can observe your own lateral chromatic aberration. Although I've never given it too much thought, I suspect that the transverse effects of ocular chromatic aberration would normally be perceived more as a reduction in image contrast, not as color fringing, since the defocused images of each color would still overlap to some extent.
In any event, the eye has several mechanisms in place to minimize this and other aberrations:
1) The sensitivity of the retina drops off rapidly away from the fovea; at only 5 degrees from the fovea, visual acuity has already dropped by roughly 66%, which reduces the chromatic aberration effects for larger or extended objects.
2) The Stiles-Crawford effect results in significantly less sensitivity for rays of light refracted through the periphery of the pupil, which reduces the chromatic aberration effects due to prismatic displacement away from the optical axis of the eye.
3) The relative sensitivity of the eye drops off rapidly away from the center of the visible spectrum, leaving the eye considerably less sensitive to red and blue colors and to chromatic aberration effects in general.
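The acuity falloff in point 1 can be sketched with a simple hyperbolic model in which acuity halves every e2 degrees of eccentricity. The value e2 ≈ 2.5 degrees is an assumed round figure, chosen only because it reproduces the roughly 66% drop at 5 degrees quoted above:

```python
def relative_acuity(eccentricity_deg: float, e2_deg: float = 2.5) -> float:
    """Hyperbolic falloff model: acuity relative to the fovea,
    V(e) = 1 / (1 + e / e2). e2 ~ 2.5 deg is an assumed constant."""
    return 1.0 / (1.0 + eccentricity_deg / e2_deg)

for ecc in (0.0, 2.5, 5.0, 10.0):
    print(f"{ecc:>5} deg from fovea: {relative_acuity(ecc):.0%} of foveal acuity")
```

At 5 degrees this model gives about a third of foveal acuity, a two-thirds drop, matching the figure in point 1.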
Of course, I guess you could think of one or two of these as a form of neurological adaptation in the evolutionary sense, anyway.
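For the Stiles-Crawford effect in point 2, a common parameterization treats the relative luminous efficiency of a ray entering the pupil at distance r from its center as 10^(-rho * r^2). The directionality constant rho ≈ 0.05 per mm² used here is a typical assumed photopic value, not a figure from this thread:

```python
def stiles_crawford_efficiency(r_mm: float, rho: float = 0.05) -> float:
    """Relative luminous efficiency of a ray entering the pupil r_mm
    from its center (Stiles-Crawford effect of the first kind),
    eta = 10**(-rho * r**2). rho ~ 0.05 per mm^2 is assumed."""
    return 10.0 ** (-rho * r_mm ** 2)

for r in (0.0, 1.0, 2.0, 3.0):
    print(f"r = {r} mm from pupil center: {stiles_crawford_efficiency(r):.2f}")
```

Under this assumption a marginal ray 3 mm from the pupil center contributes only about a third as much as an axial ray, which is why peripheral refraction (and its chromatic prism) is weighted down.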
Darryl J. Meister, ABOM
Although I haven't personally done the pupil test you describe, it does make sense. When I worked for Tele Vue optics, we used to regularly *sample* portions of the exit pupil of eyepieces to observe and corroborate design tradeoffs we made. However, anything other than an artificial star point is considered an extended object, and you are right, you would be sampling lateral color. But axial color (and the eye's coma from its decentered optics) has been compensated by evolution as it tries to reduce/prevent neural fatigue.
Interesting discussion...I think you're right, these effects are part of neural adaptation (& I'm thinkin' I'm a little rusty here).
barry
Last edited by Barry Santini; 08-27-2008 at 07:16 PM.
Could any of you Ultra Bright Folks educate a simpleton like me as to where I could read up on neural adaptation?
Thanks!
;):cheers:
I'm just glad to have a few people on this 'Board who are interested enough in this stuff to be able to banter back and forth about it. I often say the same of Harry C. in the various ophthalmic optics threads.
Although it doesn't focus specifically on neurological adaptation, I generally recommend Tunnacliffe's Introduction to Visual Optics for those genuinely interested in vision science and visual optics. Michaels's Visual Optics and Refraction is also a great text, although it is now long out of print.
Darryl J. Meister, ABOM