
Thread: Evaluating Calibration Tolerances

  1. #1
    OptiBoard Professional
    Join Date
    Oct 2009
    Location
    World
    Occupation
    Optical Laboratory Technician
    Posts
    129

    Evaluating Calibration Tolerances

When calibrating a natural diamond turning tool on modern generators, two calibration steps are generally used:
- height calibration (the result should be one straight line)
- a -6.00 curve test

Tolerances are set for both steps, and I would like to know how these tolerances are evaluated.
1. What does a deviation of 3 μm (measured with a spherometer) on the -6.00 curve test mean in terms of power shift?
2. What does a deviation of 0.01 mm on the height test mean in terms of having the center of the lens cut precisely at the center?

I would like to see the calculation behind it - how the deviations translate into real effects.

  2. #2
    Banned
    Join Date
    Jul 2010
    Location
    St. Cloud, Minnesota
    Occupation
    Ophthalmic Technician
    Posts
    3,089
    I've got an older Gerber/Coburn Vector CNC generator to run my glass Rx production on. There are several calibration settings on this machine, and I assume similar settings with newer machines.

    First is the cutter (or diamond wheel): known major and minor diameters of the wheel.
Next is the machine itself; the Vector is calibrated by setting
    1. The Y axis (front to back) of the base slide
    2. The X axis (side to side) of the cross feed
    3. The Z axis (thickness) of the lens feed
    4. Parallelism of the finished lens.

Once all those parameters are set, the machine generates as close to an "ideal" curve as possible for further surfacing.

As far as tolerances are concerned, in the spectacle lens business, microns are never used in tolerances. The smallest increment that standard testing instruments will read is 0.01 mm; a micron is 0.001 mm. Remember that the spherometer is a measure of sagittal depth.

Let's look at the 50 mm bell sag reading for a 6.00 diopter lens. It should be 3.6115. If you apply the tolerance you mentioned, 0.003 mm, that changes the sag to 3.6145, and the curve is now about 6.005. Given that the average surface roughness coming off a generator is in the neighborhood of 0.050 to 0.125 mm (depending on the material type: glass, plastic, etc.), having a tolerance of 0.003 mm is, on its face, silly.
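For anyone who wants the arithmetic behind that, here is a minimal Python sketch of the sag-to-power conversion, assuming the common 1.53 tooling index and the 50 mm bell (those assumptions reproduce the 3.6115 figure above):

```python
import math

TOOL_INDEX = 1.53      # assumed tooling index (reproduces the ~3.6115 mm sag above)
HALF_CHORD = 25.0      # mm, half of the 50 mm bell diameter

def sag_from_power(power_d):
    """Sagittal depth (mm) over the bell for a curve of the given power (D)."""
    radius = (TOOL_INDEX - 1) * 1000.0 / power_d      # radius of curvature, mm
    return radius - math.sqrt(radius**2 - HALF_CHORD**2)

def power_from_sag(sag_mm):
    """Curve power (D) of the sphere whose sag over the bell is sag_mm."""
    radius = (HALF_CHORD**2 + sag_mm**2) / (2.0 * sag_mm)
    return (TOOL_INDEX - 1) * 1000.0 / radius

nominal = sag_from_power(6.00)             # the ~3.6115 mm sag quoted above
shifted = power_from_sag(nominal + 0.003)  # sag off by 3 microns
print(f"nominal sag {nominal:.4f} mm, shifted curve {shifted:.3f} D")
# nominal sag 3.6116 mm, shifted curve 6.005 D
```

So a 3 micron sag error on the -6.00 test moves the cut curve by roughly 0.005 D, buried well inside the +/- 0.04 D you will actually read on a lensometer.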

    Depending on the type of material that is being worked, your tolerances are going to vary from glass to plastic to high index, just due to surface roughness coming off the generator. Yes, you can slow the production cycle down to get a smoother surface, but that usually takes more time than the average grinding cycle on the surfacing machine. So there is your trade-off, time in generating or time in surfacing.

I think you are confusing metal machining for precision metal finishing with glass/plastic machining for spectacle lenses. While the optical machine industry DID start with metal machining equipment, it separated into its own set of standards many years ago.

Ideally, in a perfect production environment, a generator is going to produce the exact curve at the exact lens thickness every single time. However, as the cutting tool wears, thickness and curve variations creep in. You maintain "ideal" settings with relatively frequent calibration runs. Other variations will occur as well: for example, a slightly tipped lens in the blocking operation, or the block not properly seated in the generator chuck. Or, in the worst case, the lens for whatever reason decides to self-destruct during the generating process and de-calibrates the machine.

If you refer back to the ANSI standards for ophthalmic lenses, you will see that there are built-in tolerances for the finished lens. In Z80, the standard tolerance for a plano power lens is +/- 0.12 diopters. I've never seen a plano lens that far off; typically you will see +/- 0.04 diopters.

Here in my shop, we use precision ground optical test masters that are accurate to 0.000001 inch of radius. From those, we make working gauge masters (which are replaced from time to time with new ones due to normal wear and tear). We use a monochromatic light source and count the number of fringes that appear. Our master masters are within 6-8 fringes (2 fringes is 1/2 wavelength, or roughly 0.000050 inch), and our working masters are roughly twice that.

Normal ophthalmic production is nowhere near that precise. But what really matters is the final lens. Does the lens measure within +/- 0.04 diopters on a digital lensometer with no optical defects (such as waves or astigmatism)? If yes, then you are good to go.

Don't overthink the process.

  3. #3
    OptiBoard Professional
    Join Date
    Oct 2009
    Location
    World
    Occupation
    Optical Laboratory Technician
    Posts
    129
Thank you for the thorough reply. But I still do not understand what you mean by "metal machining for precision metal finishing versus glass/plastic machining for spectacle lenses". Do I understand correctly that those micron-level differences relate to metal finishing?
Because converting those micron differences to sagitta and then to a power difference makes no sense.

Do you have anything on the second point from my original post?
    Last edited by essegn; 08-30-2014 at 04:13 PM.

  4. #4
    Banned
    Join Date
    Jul 2010
    Location
    St. Cloud, Minnesota
    Occupation
    Ophthalmic Technician
    Posts
    3,089
2nd point: Thickness also depends to a certain degree on surface roughness. You have to allow for further stock removal in subsequent processing steps. 0.01 mm is an extremely tight thickness tolerance; personally speaking, 0.10 mm is far better in a mass-production setting.

  5. #5
    Banned
    Join Date
    Jul 2010
    Location
    St. Cloud, Minnesota
    Occupation
    Ophthalmic Technician
    Posts
    3,089
    Quote Originally Posted by essegn View Post
Thank you for the thorough reply. But I still do not understand what you mean by "metal machining for precision metal finishing versus glass/plastic machining for spectacle lenses". Do I understand correctly that those micron-level differences relate to metal finishing?
Because converting those micron differences to sagitta and then to a power difference makes no sense.

Do you have anything on the second point from my original post?
    Metal machining typically will have much smaller tolerances for machine settings, for example, when two surfaces are machined so that they fit together as flats, the tolerance for flatness is going to be in the 10-15 micron range. When you have a rod sliding through a piece of metal, you are going to have several tolerances (or fit types): slip fit, friction fit, and press fit. A slip fit will have a larger bore than the rod by maybe 0.5 mm. A friction fit will be larger by 0.05 mm. A press fit might be 0.005 mm (5 microns).

Remember that sagitta is a more precise way to measure a radius of curvature. With a precision ground bell of known diameter and a precision gauge, you can measure a polished surface to probably several microns. It takes a lot of time and very expensive equipment to produce that kind of precision surface. Ophthalmic optics is not precision optics. An average precision optics curve tolerance is going to be measured in nanometers (nm), and microns are considered sloppy. Ophthalmic optics are held to 0.05 mm tolerances, because production equipment cannot meet any tighter tolerances across a production run.

You are correct that micron differences in sagitta result in essentially zero power difference. That's why the 3 micron tolerance is unsuitable.

    Also remember that a metal machined surface is essentially a ready-to-use surface, whereas an ophthalmic lens surface coming off a generator still requires at least one fining step before polishing.

In the ophthalmic world, thickness tolerances are generally +/- 0.10 mm, and this is because the standard measuring equipment available is only capable of measuring in that increment.
It is the same with measuring sagitta. Most gauges used for sagitta read down to 0.01 mm increments, and tool cutting tolerances are typically +/- 0.05 mm. So again, 0.003 mm (3 microns) is far too precise for a mass-production setting.
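To put numbers to those gauge and tool increments, here is a short sketch under the same assumptions as before (1.53 tooling index, 50 mm bell, nominal 6.00 D curve):

```python
# Minimal sketch (assumed: 1.53 tooling index, 50 mm bell) of what common
# gauge/tool increments mean in dioptric power on a nominal 6.00 D curve.
TOOL_INDEX = 1.53
HALF_CHORD = 25.0          # mm, half of the 50 mm bell
NOMINAL_SAG = 3.6115       # mm, the 6.00 D sag figure from earlier in the thread

def power_from_sag(sag_mm):
    """Curve power (D) of the sphere whose sag over the bell is sag_mm."""
    radius = (HALF_CHORD**2 + sag_mm**2) / (2.0 * sag_mm)
    return (TOOL_INDEX - 1) * 1000.0 / radius

for err in (0.003, 0.01, 0.05):            # sag error in mm
    shift = power_from_sag(NOMINAL_SAG + err) - 6.00
    print(f"sag error {err:.3f} mm -> power shift ~{shift:.3f} D")
# sag error 0.003 mm -> power shift ~0.005 D
# sag error 0.010 mm -> power shift ~0.016 D
# sag error 0.050 mm -> power shift ~0.080 D
```

In other words, even a full 0.05 mm sag error on this curve stays inside a tenth of a diopter, which is why chasing microns on the generator buys you nothing.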

    It basically comes down to cost per lens. If I can repeatedly reproduce a plano lens with a +/-0.04 diopter tolerance for $5.00 and you can repeatedly reproduce a plano lens with a +/- 0.01 diopter tolerance for $75.00, which lens is going to sell more units?

    Can you do it? Yes. Will it sell at a reasonable price? No.

  6. #6
    OptiBoard Professional
    Join Date
    Oct 2009
    Location
    World
    Occupation
    Optical Laboratory Technician
    Posts
    129
To the second point: obviously I described it badly. I mean the test where a plano lens is cut with the natural diamond turning tool; after the surface is finished, the chuck axis moves to 0 degrees and a vertical line is cut from the edge to the center, then the chuck axis is rotated by 180 degrees and another cut is made from the edge to the center of the lens. The goal of the calibration test is to get one straight line. I am wondering, when the two lines are offset, how to evaluate whether the offset is within tolerance, and how that tolerance is calculated.
The test is not related to the correct center thickness and its adjustment.

  7. #7
    Banned
    Join Date
    Jul 2010
    Location
    St. Cloud, Minnesota
    Occupation
    Ophthalmic Technician
    Posts
    3,089
    Ok, that is the X Axis slide adjustment on my machine.

    Basically, what it is doing is adjusting the slide so that regardless of the rotation of the lens, the diamond is always centered back to front. Typically, the X Axis slide is locked in place after the adjustment is made.

When the X Axis is out of adjustment, you end up with a "pimple" or "nipple" on the lens where the cutter is not properly centered. Because of the minor diameter of the cutter, you can get awfully close, but you either overshoot or undershoot the spot. The point of the test is to get as close as you can. For example, I've got a 9.52 mm diameter cutter on my machine. When I run the X-Axis calibration, I aim for centration within 0.10 mm on either side, and I test this with a digital micrometer, not the dial gauge on the machine. I use a rectangular lens in the machine, blocked so that the long axis of the rectangle is along the 0-180 line. After the test, I check the deviation about 5 mm away from the center, and then at 15 mm from center, on both sides. I average out the differences and shoot for a 0.10 mm difference.
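To put rough numbers on that target, here is a sketch of the geometry. Two assumptions here are mine, not the machine's documentation: the 9.52 mm figure is treated as the cutter's minor (nose) diameter, and the leftover center nipple is modeled like the uncut nub in single-point facing when the tool misses the spindle axis by the centration error.

```python
import math

# Assumptions (mine, not Coburn documentation): 9.52 mm is the cutter's minor
# (nose) diameter, and the center nipple behaves like the uncut nub left in
# single-point facing when the tool misses the spindle axis by offset_mm.
NOSE_RADIUS = 9.52 / 2.0   # mm

def nipple_height(offset_mm, nose_radius=NOSE_RADIUS):
    """Height (mm) of the uncut peak left at lens center for a given offset."""
    return nose_radius - math.sqrt(nose_radius**2 - offset_mm**2)

for offset in (0.05, 0.10, 0.20):
    print(f"centration error {offset:.2f} mm -> nipple ~{nipple_height(offset)*1000:.1f} microns")
# centration error 0.05 mm -> nipple ~0.3 microns
# centration error 0.10 mm -> nipple ~1.1 microns
# centration error 0.20 mm -> nipple ~4.2 microns
```

On that model, the 0.10 mm target leaves a bump on the order of a micron, far below the surface roughness discussed earlier, and it comes out in fining anyway.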

This is one of those calibrations where you can spend 5 hours trying to get it dead on the money, or spend 20 minutes and get it close enough. It doesn't have to be perfect, just close enough. My current calibration of 0.10 mm gives me a slight prismatic effect of about 0.04 diopters: large enough to measure, small enough not to matter.
