Claims
- 1. In a machine vision system including a camera that captures image data of a sample, a method for inspecting one of a plurality of different workpieces, comprising the steps of:
a) positioning the camera at a prescribed distance from a first side of a platen; b) placing a first sample selected from among the plurality of workpieces on a second side of the platen within the field of view of the camera, the second side of the platen being opposite the first side of the platen and the platen being generally transparent; c) capturing image data of the first sample through the platen; and d) outputting information relating to dimensional characteristics of the first sample.
- 2. The method as in claim 1, wherein there is a product definition for each of the plurality of workpieces and wherein the output information is relative to the product definition corresponding to the first sample.
- 3. The method as in claim 2, including the additional step of processing the captured image data using the product definition corresponding to the first sample prior to the step of outputting information.
- 4. The method as in claim 3, including the additional steps of:
e) placing on the platen a second sample that corresponds to a different workpiece than the first sample and therefore has a different product definition; f) capturing image data of the second sample through the platen; and g) outputting information relating to the dimensional characteristics of the second sample.
- 5. The method as in claim 4, wherein steps e) through g) are performed with the camera at the prescribed distance.
- 6. The method as in claim 2, including the additional step of repositioning the camera in accordance with data in the product definition for the first sample.
- 7. The method as in claim 6, wherein the camera is repositioned automatically.
- 8. The method as in claim 2, including the additional step of maintaining the product definitions in a data store.
- 9. The method as in claim 8, wherein the data store maintains data on the dimensional characteristics of any sample placed on the platen.
- 10. The method as in claim 1, including the additional step of indicating a position of one of the first sample and a feature of the first sample relative to the platen.
- 11. The method as in claim 10, wherein there is a product definition for each of the plurality of workpieces and wherein an orientation of one of the first sample and the feature of the first sample relative to the platen is associated with the product definition for the first sample.
- 12. The method as in claim 11, wherein a location of one of the first sample and the feature of the first sample relative to the platen is associated with the product definition for the first sample.
- 13. The method as in claim 10, wherein the indicating step comprises positioning one or more indicators prior to the step of capturing image data.
- 14. The method as in claim 13, wherein the one or more indicators are positioned by a motor in response to a signal from the machine vision system.
- 15. The method as in claim 14, wherein the one or more indicators are selected from the group of lasers and mechanical pointers.
- 16. The method as in claim 10, including the additional step of indicating an orientation of one of the first sample and the feature of the first sample relative to the platen.
- 17. The method as in claim 16, wherein the orientation is indicated by one of depiction on a display connected to the machine vision system, rotation of a beam and rotation of a grid displayed on or within the platen.
- 18. The method as in claim 1, including the additional steps of:
(e) repeating steps (b) through (d) for a plurality of first samples, all corresponding to the same one of the plurality of workpieces; (f) monitoring changes in the output information for the plurality of first samples; and (g) plotting the output information to display dimensional or rotational variation in one or more features of the plurality of first samples.
- 19. The method as in claim 18, wherein the plot further includes an indication of a standard product specification whereby a trend in the output data can be gauged.
- 20. The method as in claim 1, wherein the step of outputting information includes providing the output information to a machine which is connected to the machine vision system through a network.
- 21. A user interface for interacting with a machine vision system having a computer, an input device, and a display, comprising:
a first form for identifying a sample to be inspected; a second form for selecting a product definition for use in measuring the sample; and a button operatively configured to trigger an indicator that indicates a position and an orientation of the sample on a platen in accordance with the selected product definition.
- 22. The user interface as in claim 21, wherein the sample to be inspected is identified on the first form by one of a part number and a name.
- 23. The user interface as in claim 21, wherein the first and second forms are presented together on a single display.
- 24. The user interface as in claim 21, further comprising a third form for commencing a measurement of the sample.
- 25. The user interface as in claim 24, wherein the third form includes the button.
- 26. The user interface as in claim 25, wherein the indicator indicates the position and the orientation of the sample on the display.
- 27. The user interface as in claim 25, wherein the indicator is associated with the platen and indicates the position and the orientation of the sample on the platen.
- 28. The user interface as in claim 27, wherein the indicator is automatically positioned to guide the position and the orientation of the sample on the platen.
- 29. The user interface as in claim 21, further comprising an information form providing information concerning at least one of a correct camera position, a lens selection, an f-stop setting, and a focal length.
- 30. The user interface as in claim 24, further comprising an output form configured to display the measurement of the sample.
- 31. The user interface as in claim 30, wherein the output form displays one or more data points, each data point comprising a measured value and a time taken for a particular feature of the sample.
- 32. The user interface as in claim 31, wherein data points for plural features of the sample are displayed together on the output form.
- 33. The user interface as in claim 30, wherein the measurement of the sample on the output form displays whether criteria in the product definition were met.
- 34. The user interface as in claim 30, wherein the measurement of the sample on the output form displays whether there was an error in the measurement.
- 35. The user interface as in claim 31, wherein the data points for successive measurements are displayed on a graph.
- 36. The user interface as in claim 35, wherein the vertical axis of the graph is adjusted relative to one of boundaries input by a user and values from the measurement.
- 37. In a machine vision system, a method for characterizing one of a plurality of workpieces, comprising the steps of:
a) selecting a product definition of one of the plurality of workpieces; b) placing a sample on a platen within the field of view of the machine vision system; c) capturing image data of the sample; d) processing the image data for comparison to the product definition; e) comparing the processed image data for conformance to the product definition; f) comparing the processed image data for conformance to historical data concerning prior samples; and g) reporting to the user information concerning the results of the comparing steps.
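The following is a minimal sketch, in Python with OpenCV 4.x, of the inspection flow recited in claim 1: a frame is captured through the transparent platen and the bounding dimensions of the sample are reported. The camera index, the `PIXELS_PER_MM` calibration constant, and the threshold/contour approach are illustrative assumptions and not part of the claims.

```python
# Sketch of claim 1: capture image data through the platen, output dimensions.
import cv2

PIXELS_PER_MM = 10.0   # hypothetical calibration at the prescribed camera distance
CAMERA_INDEX = 0       # camera positioned on the first side of the platen

def capture_through_platen():
    """Step c): capture one frame of the sample resting on the platen."""
    cap = cv2.VideoCapture(CAMERA_INDEX)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("camera did not return a frame")
    return frame

def dimensional_characteristics(frame):
    """Step d): report the bounding width/height of the largest silhouette in mm."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Assumes a back-lit platen, so the sample appears as a dark silhouette.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        raise RuntimeError("no sample found on the platen")
    sample = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(sample)
    return {"width_mm": w / PIXELS_PER_MM, "height_mm": h / PIXELS_PER_MM}

if __name__ == "__main__":
    print(dimensional_characteristics(capture_through_platen()))
```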
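A minimal sketch of the product-definition data store contemplated by claims 2, 8 and 9, assuming definitions are keyed by part number and that each measurement of a sample placed on the platen is appended to the definition's history. The `ProductDefinition` fields and example values are hypothetical.

```python
# Sketch of claims 2, 8 and 9: product definitions maintained in a data store.
from dataclasses import dataclass, field

@dataclass
class ProductDefinition:
    part_number: str
    nominal_mm: dict                       # feature name -> nominal dimension
    tolerance_mm: dict                     # feature name -> allowed +/- deviation
    camera_position: str = "prescribed"    # claim 6: repositioning data, if any
    history: list = field(default_factory=list)  # claim 9: prior sample measurements

definitions = {
    "PN-100": ProductDefinition(
        "PN-100",
        nominal_mm={"width": 25.0, "height": 40.0},
        tolerance_mm={"width": 0.2, "height": 0.3},
    ),
}

def record_measurement(part_number, measured):
    """Store the dimensional characteristics of a sample placed on the platen."""
    definitions[part_number].history.append(measured)
```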
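A minimal matplotlib sketch of the trend reporting of claims 18-19 and 35-36: successive measurements of one feature are plotted, standard-specification lines are drawn so a trend can be gauged, and the vertical axis is scaled to either user-supplied boundaries or the measured values themselves. All data shown are invented.

```python
# Sketch of claims 18-19 and 35-36: plot successive measurements against the spec.
import matplotlib.pyplot as plt

times = [1, 2, 3, 4, 5]                          # sample index or timestamp
width_mm = [25.02, 25.05, 25.01, 25.09, 25.12]   # measured feature values
nominal, tol = 25.0, 0.2                         # standard product specification
user_bounds = None                               # e.g. (24.7, 25.3) if supplied by the user

plt.plot(times, width_mm, marker="o", label="measured width")
plt.axhline(nominal, linestyle="--", label="nominal")
plt.axhline(nominal + tol, linestyle=":", label="upper spec limit")
plt.axhline(nominal - tol, linestyle=":", label="lower spec limit")

# Claim 36: adjust the vertical axis to user boundaries or to the measured values.
if user_bounds:
    plt.ylim(*user_bounds)
else:
    plt.ylim(min(width_mm) - tol, max(width_mm) + tol)

plt.xlabel("sample")
plt.ylabel("width (mm)")
plt.legend()
plt.show()
```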
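A minimal Tkinter sketch of the user interface of claims 21-28: one form identifies the sample by part number, a second selects a product definition, and a button triggers an indicator of the sample's position and orientation on the platen. Here `trigger_indicator()` merely prints; in the claimed system it would drive a laser, mechanical pointer, or displayed grid.

```python
# Sketch of claims 21-28: identification form, definition form, indicator button.
import tkinter as tk
from tkinter import ttk

def trigger_indicator(part_number, definition_name):
    # Stand-in for commanding the indicator per the selected product definition.
    print(f"Indicating position/orientation for {part_number} "
          f"using definition '{definition_name}'")

root = tk.Tk()
root.title("Platen inspection")

# First form: identify the sample to be inspected (claim 22: part number or name).
tk.Label(root, text="Part number").grid(row=0, column=0)
part_entry = tk.Entry(root)
part_entry.grid(row=0, column=1)

# Second form: select the product definition used to measure the sample.
tk.Label(root, text="Product definition").grid(row=1, column=0)
definition_box = ttk.Combobox(root, values=["PN-100 rev A", "PN-200 rev C"])
definition_box.grid(row=1, column=1)

# Button that triggers the indicator showing where and how to place the sample.
tk.Button(root, text="Indicate placement",
          command=lambda: trigger_indicator(part_entry.get(),
                                            definition_box.get())).grid(row=2, column=1)

root.mainloop()
```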
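A minimal sketch of the claim 37 characterization flow: processed measurements are compared against a product definition's tolerances (step e) and against historical data from prior samples (step f, using an assumed drift threshold), and a report is returned to the user (step g). The function signature, `DRIFT_LIMIT_MM`, and the example values are illustrative assumptions.

```python
# Sketch of claim 37: compare a sample to the product definition and to history.
from statistics import mean

DRIFT_LIMIT_MM = 0.05   # hypothetical allowed departure from the historical mean

def characterize(nominal_mm, tolerance_mm, history, measured):
    """Steps e)-g): check spec conformance and drift, then report per feature."""
    report = {}
    for feature, value in measured.items():
        in_spec = abs(value - nominal_mm[feature]) <= tolerance_mm[feature]
        prior = [m[feature] for m in history]
        drifted = bool(prior) and abs(value - mean(prior)) > DRIFT_LIMIT_MM
        report[feature] = {"value": value, "in_spec": in_spec, "drift": drifted}
    history.append(measured)   # the new sample becomes part of the historical data
    return report

history = [{"width": 25.01}, {"width": 25.03}]
print(characterize({"width": 25.0}, {"width": 0.2}, history, {"width": 25.12}))
```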
Parent Case Info
[0001] This application claims the benefit of priority under 35 U.S.C. § 119(e) from U.S. Provisional Application No. 60/275,371, filed Mar. 13, 2001, entitled “System And Method For Machine Vision Inspection Through Platen.”
Provisional Applications (1)
| Number | Date | Country |
| --- | --- | --- |
| 60/275,371 | Mar. 13, 2001 | US |