This application is related to copending and commonly assigned U.S. patent application Ser. No. 10/865,155, entitled METHOD AND APPARATUS FOR VISUAL DETECTION AND INSPECTION OF OBJECTS, by William M. Silver, filed Jun. 9, 2004, the teachings of which are expressly incorporated herein by reference.
This invention relates to automated detection and inspection of objects being manufactured on a production line, and more particularly to setup systems and methods for such automated detection and inspection.
Industrial manufacturing relies on automatic inspection of objects being manufactured. One form of automatic inspection that has been in common use for decades is based on optoelectronic technologies that use electromagnetic energy, usually infrared or visible light, photoelectric sensors, and some form of electronic decision making.
One well-known form of optoelectronic automatic inspection uses an arrangement of photodetectors. A typical photodetector has a light source and a single photoelectric sensor that responds to the intensity of light that is reflected by a point on the surface of an object, or transmitted along a path that an object may cross. A user-adjustable sensitivity threshold establishes a light intensity above which (or below which) an output signal of the photodetector will be energized.
One photodetector, often called a gate, is used to detect the presence of an object to be inspected. Other photodetectors are arranged relative to the gate to sense the light reflected by appropriate points on the object. By suitable adjustment of the sensitivity thresholds, these other photodetectors can detect whether certain features of the object, such as a label or hole, are present or absent. A decision as to the status of the object (for example, pass or fail) is made using the output signals of these other photodetectors at the time when an object is detected by the gate. This decision is typically made by a programmable logic controller (PLC), or other suitable electronic equipment.
Automatic inspection using photodetectors has various advantages. Photodetectors are inexpensive, simple to set up, and operate at very high speed (outputs respond within a few hundred microseconds of the object being detected, although a PLC will take longer to make a decision).
Automatic inspection using photodetectors has various disadvantages, however, including:
Another well-known form of optoelectronic automatic inspection uses a device that can capture a digital image of a two-dimensional field of view (FOV) in which an object to be inspected is located, and then analyze the image and make decisions. Such a device is usually called a machine vision system, or simply a vision system. The image is captured by exposing a two-dimensional array of photosensitive elements for a brief period, called the integration or shutter time, to light that has been focused on the array by a lens. The array is called an imager and the individual elements are called pixels. Each pixel measures the intensity of light falling on it during the shutter time. The measured intensity values are then converted to digital numbers and stored in the memory of the vision system to form the image, which is analyzed by a digital processing element such as a computer, using methods well-known in the art to determine the status of the object being inspected.
In some cases the objects are brought to rest in the field of view, and in other cases the objects are in continuous motion through the field of view. An event external to the vision system, such as a signal from a photodetector, or a message from a PLC, computer, or other piece of automation equipment, is used to inform the vision system that an object is located in the field of view, and therefore an image should be captured and analyzed. Such an event is called a trigger.
Machine vision systems avoid the disadvantages associated with using an arrangement of photodetectors. They can analyze patterns of brightness reflected from extended areas, easily handle many distinct features on the object, accommodate line changeovers through software systems and/or processes, and handle uncertain and variable object locations.
Machine vision systems have disadvantages compared to an arrangement of photodetectors, including:
Machine vision systems have limitations that arise because they make decisions based on a single image of each object, located in a single position in the field of view (each object may be located in a different and unpredictable position, but for each object there is only one such position on which a decision is based). This single position provides information from a single viewing perspective, and a single orientation relative to the illumination. The use of only a single perspective often leads to incorrect decisions. It has long been observed, for example, that a change in perspective of as little as a single pixel can in some cases change an incorrect decision to a correct one. By contrast, a human inspecting an object usually moves it around relative to his eyes and the lights to make a more reliable decision.
Also, the limitations of machine vision systems arise in part because they operate too slowly to capture and analyze multiple perspectives of objects in motion, and too slowly to react to events happening in the field of view. Since most vision systems can capture a new image simultaneously with analysis of the current image, the maximum rate at which a vision system can operate is determined by the larger of the capture time and the analysis time. Overall, one of the most significant factors in determining this rate is the number of pixels comprising the imager.
The availability of new low-cost imagers, such as the LM9630 from National Semiconductor of Santa Clara, Calif., that operate at relatively low resolution (approximately 100×128 pixels), high frame rate (up to 500 frames per second) and high sensitivity allowing short shutter times with inexpensive illumination (e.g., 300 microseconds with LED illumination), has made possible the implementation of a novel vision detector that employs on-board processors to control machine vision detection and analysis functions. A novel vision detector using such an imager, and an overall inspection system employing such a vision detector, are taught in copending and commonly assigned U.S. patent application Ser. No. 10/865,155, entitled METHOD AND APPARATUS FOR VISUAL DETECTION AND INSPECTION OF OBJECTS, by William M. Silver, filed Jun. 9, 2004, the teachings of which are expressly incorporated herein by reference (herein also termed the "above-incorporated-by-reference METHOD AND APPARATUS").
An advantage to the above-incorporated-by-reference detection and inspection METHOD AND APPARATUS is that the vision detector can be implemented within a compact housing that is programmed using a PC or other Human-Machine Interface (HMI) device (via, for example, a Universal Serial Bus (USB)), and is then deployed to a production line location for normal runtime operation. The outputs of the apparatus are (in one implementation) a pair of basic High/Low lines indicating detection of the object and whether that object passes or fails based upon the characteristics being analyzed. These outputs can be used (for example) to reject a failed object using a rejection arm mounted along the line that is signaled by the apparatus' output.
By way of example,
In an alternate example, the vision detector 100 sends signals to a PLC for various purposes, which may include controlling a reject actuator. In another exemplary implementation, suitable in extremely high-speed applications or where the vision detector cannot reliably detect the presence of an object, a photodetector is used to detect the presence of an object and sends a signal to the vision detector for that purpose. In yet another implementation, there are no discrete objects, but rather material flows past the vision detector continuously—for example a web. In this case the material is inspected continuously, and signals are sent by the vision detector to automation equipment, such as a PLC, as appropriate.
Basic to the function of the vision detector 100 in the above-incorporated-by-reference METHOD AND APPARATUS is the ability to exploit the imager's quick frame rate and low-resolution image capture so that a large number of image frames of an object passing down the line can be captured and analyzed in real-time. Using these frames, the apparatus' on-board processor can decide when the object is present and use location information to analyze designated areas of interest on the object that must be present in a desired pattern for the object to "pass" inspection.
With brief reference to
Boxes labeled “c”, such as box 220, represent image capture by the vision detector 100. Boxes labeled “a”, such as box 230, represent image analysis. It is desirable that capture “c” of the next image be overlapped with analysis “a” of the current image, so that (for example) analysis step 230 analyzes the image captured in capture step 220. In this timeline, analysis is shown as taking less time than capture, but in general analysis will be shorter or longer than capture depending on the application details. If capture and analysis are overlapped, the rate at which a vision detector can capture and analyze images is determined by the longer of the capture time and the analysis time. This is the “frame rate”. The above-incorporated-by-reference METHOD AND APPARATUS allows objects to be detected reliably without a trigger signal, such as that provided by a photodetector.
Each analysis step “a” first considers the evidence that an object is present. Frames where the evidence is sufficient are called active. Analysis steps for active frames are shown with a thick border, for example analysis step 240. In an exemplary implementation, inspection of an object begins when an active frame is found, and ends when some number of consecutive inactive frames are found. In the example of
At the time that inspection of an object is complete, for example at the end of analysis step 248, decisions are made on the status of the object based on the evidence obtained from the active frames. In an exemplary implementation, if an insufficient number of active frames were found then there is considered to be insufficient evidence that an object was actually present, and so operation continues as if no active frames were found. Otherwise an object is judged to have been detected, and evidence from the active frames is judged in order to determine its status, for example pass or fail. A variety of methods may be used to detect objects and determine status within the scope of this example; some are described below and many others will occur to those skilled in the art. Once an object has been detected and a judgment made, a report may be made to appropriate automation equipment, such as a PLC, using signals well-known in the art. In such a case a report step would appear in the timeline. The example of
Note in particular that the report 260 may be delayed well beyond the inspection of subsequent objects such as object 110 (
Once inspection of an object is complete, the vision detector 100 may enter an idle step 280. Such a step is optional, but may be desirable for several reasons. If the maximum object rate is known, there is no need to be looking for an object until just before a new one is due. An idle step will eliminate the chance of false object detection at times when an object couldn't arrive, and will extend the lifetime of the illumination system because the lights can be kept off during the idle step.
The processor of the exemplary above-incorporated-by-reference METHOD AND APPARATUS is provided with two types of software elements to use in making its decisions: "Locators" that locate the object and "Detectors" that decide whether an object feature is present or absent. The decisions made by both Locators and Detectors are used to judge whether an object is detected and, if so, whether it passes inspection. In one example, a Locator can be described simply as a one-dimensional edge detector in a region of interest. The vision detector is configured for locating objects by placing Locators at certain positions in an image where an edge feature of the object can be seen when the object is in the field of view. The Locator can be oriented with respect to the direction the object is moving, and sized to ensure that the edge feature of the object can be located at multiple positions while in the field of view. During analysis, the location of the edge feature of the object within the Locator can be reported, as well as a logical output state indicating that the location is known.
Detectors are vision tools that operate on a region of interest and produce a logical output state indicating the presence or absence of features in an image of the object. The vision detector is configured for detecting features of an object by placing Detectors at certain positions in an image where object features can be seen when the object is located by the Locators. Various types of Detectors can be used, such as Brightness Detectors, Edge Detectors, and Contrast Detectors.
Detectors can be linked to the location of the feature determined by a Locator to further refine the presence detection and inspection of the object. Accordingly, in each frame, where the object may be viewed at a different perspective, the location of the object determined by the Locator will be different, and the position of the Detectors in the image can be moved according to the location determined by the Locator. The operation of the vision detector at high frame rates therefore permits the vision detector to capture and analyze multiple images of the object while it passes through the field of view.
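The repositioning of Detectors by a Locator can be illustrated with a brief sketch. The following Python fragment is purely illustrative (the class and function names are hypothetical, not drawn from this description); it assumes a Locator that reports the offset between the edge position found in the current frame and its nominal setup position, and it shifts each linked Detector's ROI center by that offset.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LocatorResult:
    nominal: float   # edge position expected at setup time (pixels)
    found: float     # edge position located in the current frame (pixels)

@dataclass
class DetectorROI:
    x: float         # ROI center, x (pixels)
    y: float         # ROI center, y (pixels)

def reposition(detectors: List[DetectorROI],
               loc_x: LocatorResult, loc_y: LocatorResult) -> List[DetectorROI]:
    """Translate each Detector ROI by the offsets reported by two Locators,
    one measuring motion in x and one in y."""
    dx = loc_x.found - loc_x.nominal
    dy = loc_y.found - loc_y.nominal
    return [DetectorROI(d.x + dx, d.y + dy) for d in detectors]
```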
The above discussion of Locators and Detectors is further illustrated by way of example in
The Locator 320 is used to detect and locate the top edge of the object, and the Locator 322 is used to detect and locate the right edge. A Brightness Detector 330 is used to help detect the presence of the object. In this example the background is brighter than the object, and the sensitivity threshold is set to distinguish the two brightness levels, with the logic output inverted to detect the darker object and not the brighter background. Together the Locators 320 and 322, and the Brightness Detector 330, provide the evidence needed to judge that an object has been detected, as further described below. A Contrast Detector 340 is used to detect the presence of the hole 312. When the hole 312 is absent the contrast would be very low, and when present the contrast would be much higher. A Spot Detector could also be used. An Edge Detector 360 is used to detect the presence and position of the label 310. If the label 310 is absent, mis-positioned horizontally, or significantly rotated, the analog output of the Edge Detector would be very low. A Brightness Detector 350 is used to verify that the correct label has been applied. In this example, the correct label is white and incorrect labels are darker colors.
As the object (110 in
The choice of Gadgets to wire to ObjectDetect is made by a user based on knowledge of the application. In the example of
The logic output of ObjectDetect Judge 400 is wired to AND Gate 470. The logic output of ObjectPass Judge 402 is inverted (circle 403) and also wired to AND Gate 470. The ObjectDetect Judge is set to "output when done" mode, so a pulse appears on the logic output of ObjectDetect Judge 400 after an object has been detected and inspection is complete. Since the logic output of ObjectPass 402 has been inverted, this pulse will appear on the logic output of AND Gate 470 only if the object has not passed inspection. The logic output of AND Gate 470 is wired to an Output Gadget 480, named "Reject", which controls an output signal from the vision detector that can be connected directly to a reject actuator 170 (
To aid the user's understanding of the operation of the exemplary vision detector 100, Gadgets and/or wires can change their visual appearance to indicate fuzzy logic values. For example, Gadgets and/or wires can be displayed red when the logic value is below 0.5, and green otherwise. In
where wi is the ith weight and zi is the corresponding pixel gray level. In this example, the weights approximate a Gaussian function of distance r from the center of the kernel to the center of each weight,
so that pixels near the center are weighted somewhat higher than those near the edge. One advantage of a center-weighted Brightness Detector is that if a bright feature happens to lie near the edge of the Detector's ROI, then slight variations in its position will not cause large variations in the analog output. In
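As a concrete illustration only (the exact kernel size, normalization, and constants used by the vision detector are not specified here), such a center-weighted Brightness Detector could be sketched as a weighted average of gray levels with weights that fall off as a Gaussian of distance from the ROI center:

```python
import numpy as np

def gaussian_weights(size: int, sigma: float) -> np.ndarray:
    """Center-weighted kernel: weights approximate a Gaussian of the distance r
    from the center of the kernel to each element."""
    ys, xs = np.mgrid[0:size, 0:size]
    c = (size - 1) / 2.0
    r2 = (xs - c) ** 2 + (ys - c) ** 2
    return np.exp(-r2 / (2.0 * sigma ** 2))

def weighted_brightness(roi: np.ndarray, weights: np.ndarray) -> float:
    """Analog output as a weighted average gray level, sum(w_i * z_i) / sum(w_i)."""
    z = roi.astype(float)
    return float((weights * z).sum() / weights.sum())
```

With this weighting, a bright feature near the edge of the ROI contributes little, so small positional variations produce only small changes in the analog output, consistent with the behavior described above.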
In one example, b=1.0.
In another exemplary implementation, the analog output is defined by the function C(q), which is the gray level such that:
where q is a percentile chosen by a user. C is the inverse cumulative weighted distribution of gray levels. Various useful values of q are given in the following table:
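A minimal sketch of the inverse cumulative weighted distribution C(q) follows; it assumes the same array of positive weights discussed above is used to weight each pixel's contribution to the distribution, and its names are illustrative rather than taken from this description.

```python
import numpy as np

def inverse_cumulative_gray(roi: np.ndarray, weights: np.ndarray, q: float) -> float:
    """C(q): the gray level at which the cumulative weighted distribution of
    gray levels in the ROI reaches the fraction q (0 <= q <= 1)."""
    z = roi.astype(float).ravel()
    w = weights.astype(float).ravel()
    order = np.argsort(z)                 # sort pixels by gray level
    z, w = z[order], w[order]
    cdf = np.cumsum(w) / w.sum()          # weighted cumulative distribution
    idx = int(np.searchsorted(cdf, q))
    return float(z[min(idx, z.size - 1)])
```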
In one example of a Contrast Detector, the analog output is the standard deviation of the gray levels within the ROI. In an exemplary implementation, the array of positive weights 500 is used to compute a weighted standard deviation:
In another example, the analog output is given by
C(qhi)−C(qlo) (6)
where the q values may be chosen by the user. Useful values are qhi=0.95, qlo=0.05.
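The two Contrast Detector outputs just described can be sketched as follows. The weighted standard deviation uses the standard formula, and the percentile spread reuses the inverse_cumulative_gray helper sketched above; both fragments are illustrative, not a literal implementation.

```python
import numpy as np

def weighted_std(roi: np.ndarray, weights: np.ndarray) -> float:
    """Contrast as the weighted standard deviation of the gray levels in the ROI."""
    z = roi.astype(float)
    w = weights.astype(float)
    mean = (w * z).sum() / w.sum()
    var = (w * (z - mean) ** 2).sum() / w.sum()
    return float(np.sqrt(var))

def percentile_contrast(roi: np.ndarray, weights: np.ndarray,
                        q_hi: float = 0.95, q_lo: float = 0.05) -> float:
    """Contrast as C(q_hi) - C(q_lo), the spread between weighted percentiles."""
    return (inverse_cumulative_gray(roi, weights, q_hi)
            - inverse_cumulative_gray(roi, weights, q_lo))
```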
In the implementation of
The step kernel 600, with values ki, can be considered to be the product of an ideal step edge template ei and a kernel of positive weights wi:
Note that the ideal step edge template values ei are +1 when ki>0, corresponding to the black on white region of step kernel 600, and −1 when ki<0, corresponding to the white on black region of step kernel 600.
Define contrast C and weighted normalized correlation R2 of the step kernel and a like-shaped ROI with pixel values zi as follows:
The contrast C uses the standard formula for weighted standard deviation, and R2 uses the standard formula for weighted normalized correlation, but simplified because for step kernel 600
An orthogonal step kernel 610 with values ki′ is also created that is identical to the step kernel 600 but rotated 90 degrees. The ratio
is a reasonable estimate of the tangent of the angle between the actual and expected direction of an edge, particularly for small angles where D is also a good estimate of the angle itself. Note that an orthogonal step template 610 doesn't need to be created—the values from the step kernel 600 can be used, but corresponding to the pixel values in the ROI in a different order.
A weighted normalized correlation operation 700 using ROI 710 and step kernel 720 computes R2. A contrast operation 730 using ROI 710 and step kernel 720 computes C, which is converted by fuzzy threshold operation 740 into a fuzzy logic value 742 indicating the confidence that the contrast is above the noise level. Weighted correlation operations 750 and 752, using ROI 710, step kernel 720, and orthogonal step kernel 722, and absolute value of arctangent of ratio operation 760, compute D, which is converted by fuzzy threshold operation 770 into a fuzzy logic value 772 indicating the confidence that the angle between the expected and actual edge directions is small.
A fuzzy AND element 780 operates on R2 and fuzzy logic values 742 and 772 to produce the analog output 790 of the Edge Detector. Note that R2, being in the range 0-1, can be used directly as a fuzzy logic value. The analog output 790 is in the range 0-1, but it can be multiplied by some constant, for example 100, if a different range is desired. Note that the logic output of an Edge Detector is derived from the analog output using the sensitivity threshold that all Photos have.
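The flow just described (contrast, weighted normalized correlation, direction estimate, and a fuzzy AND) can be summarized with the following hedged sketch. It uses textbook forms of the weighted standard deviation and weighted normalized correlation, a piecewise-linear fuzzy threshold, and the minimum as the fuzzy AND; the exact formulas, fuzzy operators, and threshold values used by the vision detector may differ, and the default thresholds shown are placeholders.

```python
import numpy as np

def fuzzy_threshold(x: float, t0: float, t1: float) -> float:
    """0 below t0, 1 above t1, linear ramp in between (assumed fuzzy threshold shape)."""
    return float(np.clip((x - t0) / max(t1 - t0, 1e-9), 0.0, 1.0))

def weighted_mean_var(v: np.ndarray, w: np.ndarray):
    mean = (w * v).sum() / w.sum()
    var = (w * (v - mean) ** 2).sum() / w.sum()
    return mean, var

def edge_detector_analog(roi, step_kernel, orth_kernel, weights,
                         contrast_t=(5.0, 10.0), angle_t=(0.17, 0.35)) -> float:
    """Combine correlation (R2), contrast (C), and direction-error (D) evidence
    into a single analog output via a fuzzy AND (taken here as the minimum)."""
    z = roi.astype(float)
    w = weights.astype(float)
    e = np.sign(step_kernel).astype(float)           # ideal step edge template, +1 / -1

    _, var_z = weighted_mean_var(z, w)
    C = float(np.sqrt(var_z))                        # contrast: weighted std deviation

    mz, vz = weighted_mean_var(z, w)
    me, ve = weighted_mean_var(e, w)
    cov = (w * (z - mz) * (e - me)).sum() / w.sum()
    R2 = float(cov * cov / max(vz * ve, 1e-9))       # weighted normalized correlation

    corr = float((step_kernel * z).sum())            # response to the step kernel
    corr_orth = float((orth_kernel * z).sum())       # response to the 90-degree kernel
    D = abs(float(np.arctan(corr_orth / (corr if corr != 0 else 1e-9))))

    f_contrast = fuzzy_threshold(C, *contrast_t)     # confidence contrast exceeds noise
    f_angle = 1.0 - fuzzy_threshold(D, *angle_t)     # confidence the angle error is small
    return min(R2, f_contrast, f_angle)              # fuzzy AND
```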
In
The use of ridge kernel 800 is similar to that for step kernel 600. The contrast C is computed using the same formula, but R2 uses a different formula because the sum of the kernel values is not 0:
Note that this formula reduces to the one used for step edges when the sum of the kernel values is 0.
A different method is used to determine the angle D between the actual and expected edge directions. A positive rotated ridge kernel 810 with values ki+ is created with an edge direction θ+a, and a negative rotated ridge kernel 810 with values ki− is created with an edge direction θ−a. A parabola is fit to the three points
The x coordinate of the minimum of the parabola is a good estimate of the angle D between the actual and expected edge directions.
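A small sketch of the three-point parabola fit follows. It assumes the three responses are measured at edge directions θ−a, θ, and θ+a, and it returns the x coordinate of the vertex of the parabola through (−a, y−), (0, y0), (+a, y+); how the three responses are computed, and whether the vertex is a minimum or a maximum, depends on the quantity being fit.

```python
def parabola_vertex_x(a: float, y_minus: float, y_zero: float, y_plus: float) -> float:
    """x coordinate of the vertex of the parabola through (-a, y_minus), (0, y_zero), (a, y_plus)."""
    denom = y_minus - 2.0 * y_zero + y_plus
    if abs(denom) < 1e-12:
        return 0.0          # no curvature: the three points are (nearly) collinear
    return a * (y_minus - y_plus) / (2.0 * denom)
```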
R2 and the fuzzy logic values are used by fuzzy AND element 980 to produce a ridge analog output 992 for an Edge Detector that can detect ridge edges. For an Edge Detector that can detect either step or ridge edges, the ridge analog output 992 and analog output 990 from a step edge detector 988 can be used by fuzzy OR element 982 to produce a combined analog output 991.
Position control 1020 is used to position a Photo in the field of view. Diameter spinner 1022 is used to change the diameter of a Detector. Direction controls 1024 are used to orient an Edge Detector to the expected edge direction. Position, diameter, and orientation can also be set by manipulation of graphics in an image view, for example the image view of
Edge type checkboxes 1030 are used to select the types of edges to be detected and the edge polarity. Dark-to-light step, light-to-dark step, dark ridge, and light ridge can be selected. Any combination of choices is allowed, except for choosing none.
Jiggle spinner 1040 allows the user to specify a parameter j such that the Edge Detector will be run at a set of positions ±j pixels around the specified position, and the position with the highest analog output will be used. Sensitivity threshold controls 1050 allow the user to set the sensitivity fuzzy threshold of a Photo. Zero-point label 1051 shows value t0 that can be set by zero-point slider 1052. One-point label 1053 shows value t1, which can be set by one-point slider 1054. Analog output label 1055 shows the current analog output of a Photo. The analog output is also shown graphically by the filled-in region to the left of analog output label 1055, which shrinks and grows like a mercury thermometer lying on its side. The filled-in region can be displayed in three distinct colors or patterns corresponding to a first zone 1056 below t0, a second zone 1057 between t0 and t1, and a third zone 1058 above t1.
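The jiggle parameter j described above can be sketched as a small exhaustive search. Whether the search is one- or two-dimensional is not specified here, so this illustrative fragment (with a caller-supplied analog-output function, an assumption) tries a two-dimensional grid of offsets within ±j pixels and keeps the position with the best response.

```python
from typing import Callable, Tuple

def jiggle_search(analog_at: Callable[[int, int], float],
                  x0: int, y0: int, j: int) -> Tuple[Tuple[int, int], float]:
    """Run the Edge Detector at offsets within +/- j pixels of (x0, y0) and
    return the position with the highest analog output."""
    best_pos, best_out = (x0, y0), analog_at(x0, y0)
    for dx in range(-j, j + 1):
        for dy in range(-j, j + 1):
            out = analog_at(x0 + dx, y0 + dy)
            if out > best_out:
                best_pos, best_out = (x0 + dx, y0 + dy), out
    return best_pos, best_out
```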
Contrast threshold controls 1060 allow the user to view the contrast C and set the contrast fuzzy thresholds 740 and 940. These controls operate in the same manner as the sensitivity threshold controls 1050.
Direction error controls 1070 allow the user to view the angle between the actual and expected edge directions D and set the direction fuzzy thresholds 770 and 970. These controls operate in the same manner as the sensitivity threshold controls 1050, except that the thermometer display fills from right-to-left instead of left-to-right because lower values of D correspond to higher fuzzy logic values.
The use of spot kernel 1100 is similar to that for ridge kernel 800. Weighted normalized correlation R2 and contrast C are computed using the same formulas as were used for the ridge kernel.
In one example, a Locator searches a one-dimensional range for an edge, using any of a variety of well-known techniques. The search direction is normal to the edge, and a Locator has a width parameter that is used to specify smoothing along the edge, which is used in well-known ways. The analog output of a Locator depends on the particular method used to search for the edge.
In one example, a Locator searches a one-dimensional range for an edge using the well-known method of computing a projection of the ROI parallel to the edge, producing a one-dimensional profile along the search range. The one-dimensional profile is convolved with a one-dimensional edge kernel, and the location of the peak response corresponds to the location of the edge. An interpolation, such as the well-known parabolic interpolation, can be used if desired to improve the edge location accuracy. In another example, an edge can be located by searching for a peak analog output using the edge detector of
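The projection-and-convolution search just described might look like the following sketch. It is an illustrative implementation with an assumed [-1, 0, +1] edge kernel and the standard three-point parabolic peak interpolation, and it assumes the ROI is oriented so that the edge runs along its rows and the search direction runs along its columns.

```python
import numpy as np

def locate_edge_1d(roi: np.ndarray):
    """Project the ROI parallel to the edge, convolve the 1-D profile with an
    edge kernel, and refine the peak with parabolic interpolation.
    Returns (sub-pixel edge position along the search range, peak strength)."""
    profile = roi.astype(float).mean(axis=0)              # projection parallel to the edge
    kernel = np.array([-1.0, 0.0, 1.0])
    response = np.abs(np.convolve(profile, kernel[::-1], mode="same"))
    i = int(np.argmax(response))
    offset = 0.0
    if 0 < i < response.size - 1:                         # parabolic interpolation around the peak
        y0, y1, y2 = response[i - 1], response[i], response[i + 1]
        denom = y0 - 2.0 * y1 + y2
        if abs(denom) > 1e-12:
            offset = 0.5 * (y0 - y2) / denom
    return i + offset, float(response[i])
```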
In another example, a Locator searches a multi-dimensional range, using well-known methods, which may include translation, rotation, and size degrees of freedom. It will be clear to one skilled in the art how to employ multi-dimensional Locators to position Photos in practicing the example, so the following discussion will be limited to one-dimensional Locators, which are preferred due to their simplicity.
Detector 1310 and Locator 1312 can be moved around in the FOV by clicking anywhere on their border and dragging. Detector 1310 has a resize handle 1320 for changing its diameter, and Locator 1312 has a resize handle 1322 for changing its width and range, and a rotate handle 1324 for changing its direction. All Photos can be moved by dragging the border, and have similar handles as appropriate to their operation.
In the example of
A Locator has a rail 1332, shown in
Every Photo can be linked to zero or more Locators, up to some maximum number determined by the implementation. The number of links determines the number of degrees of freedom that the Locators can control. Degrees of freedom include rotation, size, and the two degrees of freedom of translation. In one example, the maximum number of links is two and only the translation degrees of freedom are controlled.
A linkage defines how a Photo moves as the Locator's plunger moves, following an edge in the image. The movements are defined to keep the Photo at a constant relative distance to the rail or rails of the locators to which it is linked. In this example, the linkages are drawn using a mechanical analogy, such that one could actually build a linkage out of structural elements and bearings and the Photos would move in the same way as forces are applied to the plungers.
In
Every Photo has an emitter, a diamond-shaped handle drawn somewhere on the border. For example, Detector 1310 has emitter 1350 and Locator 1312 has emitter 1352. A link is created by drag-dropping a Photo's emitter to any point on a Locator. If the link already exists, the drag-drop might delete the link, or another mechanism for deleting might be used. The user may not create more than the maximum number of allowable links from any Photo, nor any circular dependencies. To aid the user during an emitter drag over a Locator, a tool tip can be provided to tell the user whether a link would be created, deleted, or rejected (and why). Dragging a Locator does not change the behavior of its plunger—it stays locked on an edge if it can find one, or reverts to the center if not. Thus, dragging a Locator while an edge is detected just changes its search range; the plunger does not move relative to the FOV. More generally, dragging a Locator never changes the position of any Photo to which it is linked. Dragging a Locator will adjust the rod lengths as necessary to ensure that no other Photo moves relative to the FOV.
Any plunger may be dragged manually within the range of its Locator, whether or not it has found an edge, and any linked Photos will move accordingly. This allows users to see the effect of the linkages. As soon as the mouse button is released, the plunger will snap back to its proper position (moving linked Photos back as appropriate).
In
Comparing second image view 1402 with first image view 1400, first plunger 1424 has moved down as it follows a first edge (not shown) in the image, and second plunger 1434 has moved to the left and slightly down as it follows a second edge (not shown). Note that the positions in the FOV of Locator 1420 and the second Locator have not changed, but Detector 1410 has moved down and to the left to follow the plungers, which are following the edges of an object and therefore following the motion of the object itself. In a mechanical analogy, Detector 1410 moves because it is rigidly attached to first rail 1426 by first rod 1422, and to second rail 1436 by second rod 1432. Note that first slider 1428 has slid to the left along first rail 1426, and second slider 1438 has slid down along second rail 1436. The sliders slide along the rails when two non-orthogonal Locators are linked to a Photo.
If a Photo is linked to two nearly parallel Locators, its motion would be unstable. It is useful to set an angle limit between the Locators, below which the linked Photo will not be moved. This state can be indicated in some way in the image view, such as by displaying the two rods using a special color such as red. The ability to have Locators either at fixed positions or linked to other Locators provides important flexibility. In
The Locators are configured to follow the top and right edges of a circular feature 1550. Comparing second image view 1502 with first image view 1500, the circular feature 1550 has moved down, causing rail 1522 to move down to follow it. This moves both Detector 1510 and second Locator 1530 down. Note that Detector 1510 is at the same position relative to the object, and so is second Locator 1530. This is desirable in this case, because if second Locator 1530 were fixed in the FOV, it might miss the right edge of circular feature 1550 as it moves up and down. Note that this would not be problematic if the edge of an object in the image was a straight line.
First Locator 1520 has no Locator to move it left and right so as to find the top edge of circular feature 1550. The first Locator 1520 cannot link to second Locator 1530 because that would create a circular chain of links, which is not allowed because one Locator has to run first and it cannot be linked to anything. Instead, the motion of the object through the FOV insures that first Locator 1520 will find the top edge. In the example of
Accordingly,
Thus, in
Comparing second image view 1702 with first image view 1700, the object (not shown) has moved to the right and rotated counterclockwise, which can be seen by the motion of the Detectors as the Locators follow the object edges. Note that second Locator 1722 and third Locator 1724 are linked to first Locator 1720 so that they stay close to the Detectors.
Having described in detail the setup of Locators and Detectors in accordance with the above-incorporated-by-reference METHOD AND APPARATUS, it should be clear that, while effective, the GUI screen of
Thus, in establishing appropriate Locators and Detectors in an image view of an object during setup, the functionality of the GUI can be highly beneficial. It is desirable that the process for setting up such Locators and Detectors be as easy to use and accurate as possible. By arranging functions of the GUI to facilitate automated setup of Locators and Detectors, the overall performance and ease of use of the vision detector can be greatly enhanced.
This invention provides a system and method for automating the setup of Locators and Detectors within an image view of an object on the HMI of a vision detector by determining detectable edges and best-fitting the Locators and Detectors to a location on the object image view following the establishment of a user-selected operating point on the image view, such as by clicking a GUI cursor. In this manner, the initial placement and sizing of the graphical elements for Locator and Detector ROIs are relatively optimized without excessive adjustment by the user. Locators can be selected for direction, including machine or line-movement direction, cross direction, or an angled direction transverse to the cross direction and movement direction. Detectors can be selected based upon particular analysis tools, including brightness tools, contrast tools and trained templates. The Locators and Detectors are each associated with a particular set of operating parameters, such as activation threshold, which are displayed in a control box within the GUI (and can be accessed by clicking on the specific Locator or Detector). A parameter bar can also be provided adjacent to the depiction of the Detector on the image view for easy reference. Both Locators and Detectors may be manually readjusted, once automatically placed and sized, by drag-and-drop techniques.
In an illustrative embodiment the system includes a GUI screen image view of an object derived from a vision sensor having a field of view in which the object is in relative motion thereto, and a plurality of image frames of the object within the field of view are captured by the vision detector. The image view is accessible by the GUI cursor. An edge detection process determines and analyzes detectable edges in the screen image view and stores edge information. A selector allows a user to select either (a) a Locator or (b) a Detector based upon a predetermined analysis tool for placement on the image view. An automatic placement process then uses that edge information to place the selected (a) Locator or (b) Detector at a position on the image view upon which the cursor points, with a size that is determined based upon a location of adjacent edges of the object image view.
The automatic placement process is constructed and arranged to place the Locator on the image view relative to a nearest adjacent edge of the image view and to adjust the Locator so as to avoid a stronger-magnitude, more-distant edge. This allows a Locator having a predetermined width when originally sized to be finally sized with a cutoff on the side near the stronger edge, thus avoiding confusion between edges as the object moves through the field of view, since the Locator's activation threshold is generally set relative to the magnitude of the nearest adjacent edge.
In addition, the automatic placement process is constructed and arranged to place the Detector on the image view relative to the position at which the cursor points, so that a relative center of the Detector is at the position at which the cursor points and an outer boundary of the Detector extends to a location that is within detected edges of the object image view. The outer boundary is typically circular, and is built from incrementally larger-radius circles until the average score of pixel values of the image within the boundary indicates a change beneath an applicable threshold (based upon brightness or contrast, for example). At this time, the boundary at the largest radius still within the threshold is chosen for the automatically sized ROI of the Detector.
The invention description below refers to the accompanying drawings, of which:
In this embodiment, the GUI 1800 is provided as part of a programming application running on the HMI and receiving interface information from the vision detector. In the illustrative embodiment, a .NET framework, available from Microsoft Corporation of Redmond, Wash., is employed on the HMI to generate GUI screens. Appropriately formatted data is transferred over the link between the vision detector and the HMI to create screen displays and populate screen data boxes, and to transmit back selections made by the user on the GUI. Techniques for creating appropriate screens and transferring data between the
The screen 1800 includes a status pane 1802 in a column along the left side. This pane controls a current status box 1804, the dialogs for controlling general setup 1806, setup of object detection with Locators and Detectors 1808, object inspection tool setup 1810 and runtime/test controls 1812. The screen 1800 also includes a right-side column having a pane 1820 with help buttons.
The lower center of the screen 1800 contains a current selection control box 1830. The title 1832 of the box 1830 relates to the selections in the status pane 1802. In this example, the user has clicked select job 1834 in the general setup box 1806. Note, the general setup box also allows access to an item (1836) for accessing a control box (not shown) that enables setup of the imager (also termed "camera"), which includes entry of production line speed to determine shutter time and gain. In addition, the general setup box allows the user to set up a part trigger (item 1838) via another control box (not shown). This may be an external trigger upon which the imager begins active capture and analysis of a moving object, or it may be an "internal" trigger in which the presence of a part is recognized due to analysis of a certain number of captured image frames (as a plurality of complete object image frames are captured within the imager's field of view).
The illustrated select job control box 1830 allows the user to select from a menu 1840 of job choices. In general, a job is either stored on an appropriate memory (PC or vision detector) or is created as a new job. Once the user has selected either a stored job or a new job, a further screen is accessed with the Next button 1842. These further control boxes can, by default, be the camera setup and trigger setup boxes described above.
Central to the screen 1800 is the image view display 1850, which is provided above the control box 1830 and between the columns 1802 and 1820 (being similar to image view window 198 in
As shown in
Before describing further the setup procedure, reference is made briefly to the bottommost window 1870 which includes a line of miniaturized image frames that comprise a so-called “film strip” of the current grouping of stored, captured image frames 1872. These frames 1872 each vary slightly in bottle position with respect to the FOV, as a result of the relative motion. The film strip is controlled by a control box 1874 at the bottom of the left column.
Reference is now made to
As shown, a cursor 1930 is brought toward an edge 1940 of the object 1852. Once the user “clicks” on the cursor placement, the screen presents the control box 2010, which now displays a parameter box 2012. Briefly, this box sets up the applicable threshold indicator 2014 for machine direction. The nature of the parameter box is highly variable herein. In general, the user can decide how high or low to set a threshold for edge detection.
The click of the cursor 1930 also generates a novel Locator graphic 2020 on the image view 1850 of the object 1852. This graphic 2020 is similar in operation to the Locator 320 (
In this example, the Locator is sized with a height HL1 and width WL1 that are optimized to a given segment of edge 1940 of the object 1852. Likewise, the Locator is positioned at an angle A that allows the above-described plunger bar 2022 to approximately define a straight line within the (curving) edge portion closest to the clicked cursor 1930. In general, the height HL1 of the Locator 2020 is chosen by the process so that the plunger remains within a predetermined deviation of the object edge from a straight line. In other words, the plunger, at its opposing ends 2024 and 2026, deviates from the curving object edge 1940 no more than a predetermined distance; a longer plunger would exceed that distance at the selected edge location. The procedure for determining automatic placement and sizing of the Locator 2020 is described in greater detail below.
The position in the FOV at which the cursor 1930 is clicked typically defines the center of the Locator. The Locator itself remains fixed at the clicked position in the FOV. The moving object image passes through the Locator with the plunger 2022 following the detectable edge transition. In automatic setup, the Locator's width WL1 is determined by the distance from the click point to a detectable edge transition for the object in the setup view. Hence, if the click point of the cursor 1930 were further from the edge 1940, then the Locator graphic would appear longer in the width direction to lie properly upon the object. The extension of the Locator into the body of the object image is sufficient so that the edge transition of the object can be properly detected while the object is placed in the current image view (the illustrated view upon which setup is being made). Again, the height HL1 of the Locator and plunger 2022 is based upon a close fit with the nearest object edge transition. A more detailed procedure for the automated placement of a Locator is described with reference to
The graphical representation of the Locator 2020 is set to a given polarity so that it properly identifies the transition from light background to dark. A polarity selector (not shown) can be provided in the status pane 1802 or control box 2010. In this manner, a Locator can be placed on either edge (see phantom Locator 2030 on opposing edge 2032) and detect the movement of the object through the FOV from either edge. Polarity can be displayed by providing different, unique, opaque shading on each side of the Locator 2020. In this example, shading fill (symbolized by hatch lines) 2040 is used to show a dark-to-light polarity given a prevailing right-to-left machine direction. Likewise, the opposing alternate Locator 2030 would be set for light-to-dark polarity in this example.
It is contemplated that the automated placement of the Locator 2020 may not always yield the best result. Thus, the control box 2010 includes a recreate button 2050 that allows the Locator 2020 to be removed and replaced in another location by a subsequent move and click of the cursor 1930. Alternatively, the clicking of the cursor 1930 on a different position of the object can be adapted to recreate the Locator elsewhere on the image view 1850. Note that a cross direction button 2052 and angle direction button 2054 can still be accessed to generate additional locators as needed, using the same automated and manual placement and resizing procedures as applicable to the locator 2020.
In addition, when a Locator's automatic placement is generally desirable, but its angle, width or height will not necessarily obtain the best results, then the Locator can be manually resized as shown generally in
Having placed and adjusted a Locator 2120, reference is now made to
When a given type of tool is selected, the user may then move the cursor to an appropriate location on the object 1852 (see cursor 1930 shown in phantom). By clicking on the positioned cursor 1930 (phantom) a Detector region of interest (ROI) circle 2240 (shown in phantom) using brightness as a detection criterion is formed on the object in association with the plunger 2122 of the locator 2120. The diameter of the circle is selected automatically from the center click point based upon placement so that it falls within the desired brightness region of the object. In other words, parts of the ROI that are outside a given brightness range cause the circle to be sized so as to avoid these regions. Similarly to the case of the Locator, the threshold level of a given detector is also estimated and automatically set, subject to subsequent adjustment by the user.
In this example, the automatically sized ROI circle 2240 (phantom) covers a majority of the width of the object body 1854. As described above, when the object is located, its presence is verified by the existence of the bright spot within the ROI. However, the user may desire a longer period of detection. Thus, by clicking the cursor 1930 (shown solid), and dragging on the circle edge, the ROI's diameter can be reduced (arrows 2242) from the larger diameter automatically sized circle (phantom) to a reduced-size circle 2250 that allows verification of presence within a larger range of movement across the FOV. Note that a threshold and brightness bar 2260 is automatically appended to the Detector circle 2250 by the GUI. This allows the user to ascertain the current settings and readings of the particular detector. Such data is helpful particularly where a plurality of detectors are present on the image view, and only one Detector's status is currently shown (typically the last Detector clicked) in the control box 2210. Note that by clicking any Detector or Locator in the image view, the relevant control box and associated parameter box is retrieved and displayed in the GUI.
The user may place as many Detectors as he or she desires in association with a given locator. To further verify object presence, a second Detector may be applied as shown in
In the example of
Note that the automatic sizing of a Detector ROI circle is described in further detail with reference to
Briefly, the user may also select Detectors based upon other tools such as template. When selecting template, a control box (not shown) allows the user to lay a dark circle (automatically, with a manual adjustment option) on an object image location. The user activates a training button to store the pattern in the vision detector's memory. Generalized pattern-matching algorithms are used to determine whether a detected ROI on the object matches the pattern. A threshold setting slider is provided to adjust the pattern matching algorithm.
The status pane 1802 also shows a setup inspection box 1810 with an associated button 1884 for inspection tools. In general, inspection occurs within the detector concurrently with detection. In some implementations, simply detecting an object is sufficient. In other applications, the detector can inspect objects for flaws by analyzing ROIs in association with a Locator. Typically, ROIs are placed in areas where flaws will affect the appearance of the object in sufficient degree to be discriminated by the relatively low-resolution capture of the vision detector. Briefly, when the inspection setup button 1884 is clicked, the user is provided with various screens similar to those in
The automatic placement and sizing of a Locator in response to positioning and clicking of a cursor on the image view is now discussed in further detail with reference to
The user desires to place a Locator along the left-side edge portion 2412 and has clicked a cursor at the click point 2414 at a slight spacing from the edge 2412 (step 2512). The procedure locates the closest point on the nearest located edge 2412 and establishes this point as the Locator origin 2416 (step 2514). The origin 2416 is defined in terms of orthogonal x and y axes and a rotation θ relative to the axes, and the closest distance can be determined as the shortest line segment 2419 between the click point 2414 and the origin 2416. In one embodiment, the angle of this segment with respect to the X-Y axes defines θ (the segment being oriented at 90 degrees to θ). The procedure 2500 begins to define increments above and below the origin (steps 2516, 2518, 2520 and 2522), generating a line 2420 that fits along the edge 2412 in each direction from the origin 2416. This forms the basis of the plunger when the creation of the Locator is complete. The increments build as far as they are able until the maximum width (according to a predetermined constant) is achieved (for example, the lower point 2430). The increments may build to less than the maximum width if they deviate from the edge by more than a maximum deviation (MAXDEV), at which point (top point 2432) increments are no longer built. In one embodiment, MAXDEV is approximately 2 pixels. Once the increments are maximized, the maximum height of the Locator is established.
In step 2524, the width of the Locator in both directions from the line 2420 is established (MAXWIDTH1 and MAXWIDTH2). Typically, width is determined by a predetermined ratio of the height and by other factors, such as ensuring that a sufficient portion of the width is located in each of the object side and background side.
The procedure 2500 may attempt to move the Locator line 2420 upwardly or downwardly along the edge to seek a better fit within a predetermined limit (steps 2526 and 2528) that allows a truncated side (due to exceeding MAXDEV) of the Locator to expand in height. Likewise, in an embodiment, the line may be rotated relative to θ, to allow a better fit within certain rotational limits. Once the Locator positioning is established, the procedure 2500 in step 2530 ranks the strength of the transition of all edges within the original width of the Locator's ROI. In this example, a stronger (or equally strong) edge 2440 is identified (step 2532), which may confuse the analysis during runtime. Thus, the procedure 2500 resizes the width boundary 2442 (step 2534 and arrow 2441) to exclude the edge 2440. The amount (ADJWIDTH) of withdrawal of the Locator's width boundary 2442 may be calculated based upon a constant or a ratio relative to the distance between edges 2412 and 2440, or upon another metric. Finally the Locator is completed in step 2536.
Upon completion of the Locator's layout, a threshold value is assigned to the Locator. This value is calculated by deriving a measured magnitude (via the Sobel operator) of the edgelets at the edge line 2420 and multiplying this value by a constant to determine an absolute threshold value for the GUI. In an embodiment, a constant of 0.7 is used to establish a default value for the threshold assigned to the Locator, resulting in allowance of variation of up to 30%.
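Steps 2514 through 2536 and the default threshold can be summarized with a rough sketch. The edge_x_at(y) helper, the growth cap, and the data structures are assumptions made for illustration; only the MAXDEV value (about 2 pixels) and the 0.7 threshold constant come from the description above.

```python
MAXDEV = 2.0          # max deviation (pixels) of the edge from a straight line
MAX_HALF_LEN = 20     # assumed cap on growth in each direction from the origin
THRESH_RATIO = 0.7    # default threshold: 70% of the measured edge magnitude

def grow_plunger(edge_x_at, origin_y: int):
    """Grow the plunger line up and down from the origin row, stopping when the edge's
    x position drifts more than MAXDEV from its value at the origin (or no edge is found)."""
    x0 = edge_x_at(origin_y)
    if x0 is None:
        return origin_y, origin_y                      # no edge at the origin row
    top = bottom = origin_y
    for step in range(1, MAX_HALF_LEN + 1):            # build increments above the origin
        x = edge_x_at(origin_y - step)
        if x is None or abs(x - x0) > MAXDEV:
            break
        top = origin_y - step
    for step in range(1, MAX_HALF_LEN + 1):            # build increments below the origin
        x = edge_x_at(origin_y + step)
        if x is None or abs(x - x0) > MAXDEV:
            break
        bottom = origin_y + step
    return top, bottom                                 # plunger extent along the edge

def default_locator_threshold(sobel_magnitude_at_origin: float) -> float:
    """Assign the Locator's activation threshold as a fixed fraction of the
    Sobel edge magnitude measured at the edge line, allowing roughly 30% variation."""
    return THRESH_RATIO * sobel_magnitude_at_origin
```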
The placement and sizing of a detector in accordance with an embodiment of this invention is now described in further detail with reference to the exemplary object 2410 of
Next, in step 2712, the user moves the cursor to a point on the object image view and clicks the location to establish a center point (click point) 2620 for the Detector ROI circle (step 2714). This click point is established as the origin of the circle, with an initial radius equal to zero within the depicted x-axis and y-axis coordinate system. The procedure then (steps 2716 and 2718) begins to build a series of circles about the origin 2620, successively incrementing (typically by one or two pixels in distance per increment) the radius of the circle and deriving an average magnitude score for all points (or a sum of all magnitudes) in the image view along the circle. In this example, the circles build successively outwardly (radial arrows 2622) from the origin 2620 to radii R1<R2<R3<R4. Each time, the step 2718 decides whether the average or summed score of all image pixels within the given circle is (a) greater-than-or-equal-to, or (b) less-than the desired threshold value. Referring to the graph 2800 in
The GUI thus automatically displays the chosen circle with radius R3 and allows the user the option to increase or decrease the diameter as appropriate (step 2722). As described above, a further graphic image of a threshold and setting bar is provided alongside the completed circle.
The determination of magnitude is based, in part, upon the type of tool used in conjunction with the Detector. In the case of brightness, the tool bases decisions upon pixel intensity versus a constant. The constant can be predetermined or calculated from the average image intensity in a variety of ways. In the case of contrast, the magnitude score may be a differential gradient between intensities and the threshold may be a constant gradient. Where needed, inverse values for these thresholds can be derived through subtraction from a constant. Automatic placement and sizing of a template circle may be based upon contrast or brightness (or both).
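The circle-growing loop of steps 2716 through 2720 can be sketched as follows. The sampling of points along each circle, the score_at helper, and the step and maximum radius values are illustrative assumptions; the stopping rule (keep the last radius whose average score stays at or above the threshold) follows the description above.

```python
import numpy as np

def auto_size_detector(score_at, cx: float, cy: float,
                       threshold: float, r_step: int = 2, r_max: int = 60) -> int:
    """Grow concentric circles about the click point (cx, cy) and return the largest
    radius whose average pixel score along the circle remains at or above the threshold."""
    best_r = 0
    r = r_step
    while r <= r_max:
        n = max(16, int(2 * np.pi * r))                 # roughly one sample per pixel of circumference
        angles = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
        xs = cx + r * np.cos(angles)
        ys = cy + r * np.sin(angles)
        avg = float(np.mean([score_at(x, y) for x, y in zip(xs, ys)]))
        if avg < threshold:                             # circle has crossed out of the bright/contrasty region
            break
        best_r = r                                      # last radius still within the threshold
        r += r_step
    return best_r
```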
Hence, the above description provides useful and highly flexible mechanisms for allowing minimally trained persons to quickly employ a vision detector without the need of intensive human programming or labor in setup. The completed setup may be tested as needed, and by accessing various GUI screens through “Back” buttons and clicks upon the image's Locators and Detectors during test time, adjustments can be made to the Locators and Detectors, or new/replacement Locators and Detectors can be placed on the image view.
The foregoing has been a detailed description of illustrative embodiments of the invention. Various modifications and additions can be made without departing from the spirit and scope thereof. For example, while ROIs for Locators are shown as rectangles and Detectors are shown as circles, their ROIs may each define a different shape or a variety of selectable and/or customized shapes as needed. Likewise, while a particular form of HMI and GUI are shown, a variety of hardware and GUI expressions are expressly contemplated. For example, in alternate embodiments access to operating parameters may be through alternate display screens or boxes. Accordingly, this description is meant to be taken only by way of example, and not to otherwise limit the scope of the invention.
20080205714 | Benkley | Aug 2008 | A1 |
20080219521 | Benkley | Sep 2008 | A1 |
20080285802 | Bramblet et al. | Nov 2008 | A1 |
20100318936 | Tremblay et al. | Dec 2010 | A1 |
Number | Date | Country |
---|---|---|
10012715 | Sep 2000 | DE |
2309078 | Feb 2002 | DE |
10040563 | Feb 2002 | DE |
0815688 | Jan 1998 | EP |
0939382 | Sep 1999 | EP |
0815688 | May 2000 | EP |
0896290 | Oct 2004 | EP |
1469420 | Oct 2004 | EP |
1734456 | Dec 2006 | EP |
2226130 | Jun 1990 | GB |
60147602 | Aug 1985 | JP |
9-288060 | Nov 1997 | JP |
11-101689 | Apr 1999 | JP |
2000-84495 | Mar 2000 | JP |
2000-227401 | Aug 2000 | JP |
2000-322450 | Nov 2000 | JP |
2002-148205 | May 2002 | JP |
WO-9609597 | Mar 1996 | WO |
WO-0141068 | Jun 2001 | WO |
WO-0215120 | Feb 2002 | WO |
WO-02075637 | Sep 2002 | WO |
WO-03102859 | Dec 2003 | WO |
WO-2005050390 | Jun 2005 | WO |
WO-2005124709 | Dec 2005 | WO |
Entry |
---|
PCT/US2008/083191, Search Report, Feb. 17, 2009. |
Response to Written Opinion, Singapore patent No. 200608484-2, Dec. 11, 2009. |
Prosecution file history for U.S. Appl. No. 10/865,155, Jun. 9, 2004 through Jan. 10, 2011. |
Prosecution file history for U.S. Appl. No. 10/979,535, Nov. 2, 2004 through Jan. 29, 2009. |
Prosecution file history for U.S. Appl. No. 10/979,572, Nov. 2, 2004 through Jan. 11, 2001. |
Prosecution file history for U.S. Appl. No. 10/987,497, Oct. 2, 2008 through Mar. 30, 2009. |
Prosecution file history for U.S. Appl. No. 10/988,120, Nov. 12, 2004 through Dec. 31, 2009. |
Prosecution file history for U.S. Appl. No. 11/059,512, Feb. 16, 2005 through Dec. 18, 2008. |
Non-Final Office Action for U.S. Appl. No. 11/094,650, dated Jan. 28, 2009. |
Prosecution file history for U.S. Appl. No. 11/136,019, May 24, 2005 through Oct. 29, 2010. |
Prosecution file history for U.S. Appl. No. 11/136,103, May 24, 2005 through Dec. 18, 2009. |
Prosecution file history for U.S. Appl. No. 11/138,033, Jun. 27, 2007 through Oct. 25, 2007. |
Prosecution file history for U.S. Appl. No. 11/138,025, May 26, 2005 through Jan. 17, 2009. |
Prosecution file history for U.S. Appl. No. 11/616,726, Dec. 27, 2006 through Aug. 20, 2010. |
Prosecution file history for U.S. Appl. No. 11/769,494, Jun. 27, 2007 through Oct. 25, 2007. |
European Patent application No. 05758781, file history Feb. 18, 2006 through Mar. 25, 2009. |
PCT/US2005/019923 International Preliminary Report on Patentability, May 12, 2006. |
European Patent application No. 05763341, file history Jun. 13, 2007 through Sep. 20, 2010. |
Japanese Patent application No. 2007-527637, Office action English translation, dated May 25, 2010. |
Japanese Patent application No. 2007-527637, Response to Office action dated Sep. 1, 2010, English translation. |
European Patent application No. 05756516, file history Feb. 23, 2006 through Feb. 10, 2009. |
International Preliminary Report on Patentability, PCT/US2008/007280 Publication Date Dec. 17, 2009. |
International Search Report, PCT/US2008/007302, Publication Date Nov. 5, 2009. |
Written Opinion of the International Searching Authority PCT/US2008/007302, Publication Date Nov. 5, 2009. |
Written Opinion of the International Searching Authority, PCT/US2008/007280, Publication Date Dec. 15, 2009. |
Apple Computer Inc., Studio Display User's Manual online, retrieved on Nov. 24, 2010, retrieved from the Internet http://manuals.info.apple.com/en/studioDisplay_15inLCDUserManual.pdf, 1998. |
Search Report, PCT/US2008/083191, Publication Date Feb. 17, 2009. |
Cognex Corporation, VisionPro Getting Started, Revision 3.2, 590-6508, copyright 2003. |
National Instruments, IMAQVision Builder Tutorial, IMAQ XP-002356530, http://www.ni.com/pdf/manuals/322228c.pdf, Publication Date Dec. 2000. |
Allen-Bradley, Bulletin 2803 VIM Vision Input Module, Cat. No. 2803-VIM2, Printed USA, (1991) (Submitted in 3 parts). |
Allen-Bradley, Bulletin 5370 CVIM Configurable Vision Input Module, User Manual Cat. No. 5370-CVIM, (1995) (Submitted in 3 parts). |
Allen-Bradley, User's Manual, Bulletin 2803 VIM Vision Input Module, Cat. No. 2803-VIM1, (1987) (Submitted in 2 parts). |
Cognex Corporation, Screen shot of the CheckMate GUI Ver 1.6, (Jan. 2005). |
Cognex Corporation, Sensorpart FA45 Vision Sensor, (Sep. 29, 2006). |
Cognex Corporation, 3000/4000/5000 Vision Tools, revision 7.6, 590-0136, Chapter 13, (1996). |
Cognex Corporation, Cognex 3000/4000/5000, Vision Tools, Revision 7.6, 590-0136, Chapter 10 Auto-Focus, (1996). |
Vietze, Oliver, Miniaturized Vision Sensors for Process Automation, (Jan. 2, 2005). |
“Cognex 3000/4000/5000 Image Processing”, Revision 7.4 590-0135 Edge Detection Tool, (1996). |
“Cognex 4000/5000 SMD Placement Guidance Package, User's Manual”, Release 3.8.00, Chapter 15, 590-6168, (1998). |
“Cognex VisionPro”, Getting Started - QuickStart Tutorial, Cognex Corporation, 590-6560, Revision 3.5, (May 2004), 69-94. |
“CVL Vision Tools Guide”, Cognex MVS-8000 Series, Chapter 5, Symbol Tool, CVL 5.4, (Dec. 1999). |
Demotte, Donald, “Visual Line Tracking”, Application Overview & Issues, Machine Vision for Robot Guidance Workshop, (May 5, 2004). |
Shane C. Hunt, Mastering Microsoft PhotoDraw 2000, SYBEX, Inc., May 21, 1999. |
Integrated Design Tools, High-Speed CMOS Digital Camera, X-Stream Vision User's Manual, 2000. |
IO Industries, High Speed Digital Video Recording Software 4.0, IO Industries, Inc., Ontario, CA, 2002. |
Phillip Kahn, Building Blocks for Computer Vision Systems, IEEE Expert, vol. 8, No. 6, XP002480004, pp. 40-50, Dec. 6, 1993. |
Matrox, Interactive Windows Imaging Software for Industrial and Scientific Applications, Inspector 4.0, Matrox Imaging, pp. 8, Apr. 15, 2002. |
Olympus Industrial, Design Philosophy, i-speed, 2002. |
Olympus Industrial High Speed, High Quality Imaging Systems, i-speed Product Brochure, Publisher Olympus Industrial, 2002. |
RVSI, Smart Camera Reader for Directly Marked Data Matrix Codes, HawkEye 1515 with GUI, 2004. |
Whelan, P. et al., Machine Vision Algorithms in Java, Chapter 1—An Introduction to Machine Vision, Springer-Verlag, XP002480005, 2001. |
Photron, USA, Product information for Fastcam-X 1280 PCI, Copyright 2004, www.photron.com. |
Photron, USA, Product information for Fastcam PCI, Copyright 2004, www.photron.com. |
Photron, USA, Product information for Ultima 1024, Copyright 2004, www.photron.com. |
Photron, USA, Product information for Ultima 512, Copyright 2004, www.photron.com. |
Photron, USA, Product information for Ultima APX, Copyright 2004, www.photron.com. |
KSV Instruments Ltd., HiSIS 2002 - High Speed Imaging System, www.ksvltd.fi, 2004. |
ICS 100, Intelligent Camera Sensor, SICK Product Information, SICK Industrial Sensors, 6900 West 110th St., Minneapolis, MN 55438, www.sickusa.com, Jan. 3, 2002. |
Matsushita Imagecheckers, NAiS Machine Vision, Matsushita Machine Vision Systems, 2003. |
Rohr, K., Incremental Recognition of Pedestrians from Image Sequences, CVPR93, 1993. |
Chang, Dingding et al., Feature Detection of Moving Images using a Hierarchical Relaxation Method, IEICE Trans. Inf. & Syst., vol. E79-D, Jul. 7, 1996. |
Zarandy, A. et al., Vision Systems Based on the 128×128 Focal Plane Cellular Visual Microprocessor Chips, IEEE, Mar. 2003, III-518 to III-521. |
SmartCapture Tool, Feature Fact Sheet, Visionx Inc., www.visionxinc.com, 2003. |
Wilson, Andrew, CMOS/CCD sensors spot niche applications, Vision Systems, 2003. |
Matsushita LightPix AE10, NAiS Machine Vision, Matsushita Machine Vision Systems, 2003. |
Corke, Peter I., et al., Real Time Industrial Machine Vision, Electrical Engineering Congress Sydney, Australia, CSIRO Division of Manufacturing Technology, 1994. |
Marsh, R et al., The application of Knowledge based vision to closed-loop control of the injection molding process, SPIE vol. 3164, Faculty of Engineering University of the West of England United Kingdom, 1997, pp. 605-616. |
Zarandy, Akos et al., Ultra-High Frame Rate Focal Plane Image Sensor and Processor, IEEE Sensors Journal, vol. 2, No. 6, 2002. |
LM9630 100×128, 580 fps UltraSensitive Monochrome CMOS Image Sensor, National Semiconductor Corp., www.national.com, Rev. 1.0, Jan. 2004. |
Analog Devices, Inc., Blackfin Processor Instruction Set Reference, Revision 2.0, Part No. 82-000410-14, May 2003. |
ADSP-BF533 Blackfin Processor Hardware Reference, Analog Devices Inc., Media Platforms and Services Group, Preliminary Revision, Part No. 82-002005-01, Mar. 2003. |
National Instruments, IMAQVision Builder Tutorial, IMAQ XP-002356530, http://www.ni.com/pdf/manuals/322228c.pdf, Dec. 2000. |
Jolivet, Denis, LabVIEW and IMAQ Vision Builder Provide Automated Visual Builder, LabVIEW, National Instruments, XP002356529, http://www.ni.com/pdf/csma/us/JNDESWG.pdf, 2001. |
Chen, Y.H., Computer vision for General Purpose Visual Inspection: a Fuzzy Logic Approach, Optics and Lasers in Engineering 22, Elsevier Science Limited, vol. 22, No. 3, 1995, pp. 182-192. |
Di Mauro, E.C., et al., Check - a generic and specific industrial inspection tool, IEE Proc.-Vis. Image Signal Process, vol. 143, No. 4, Aug. 27, 1996, pp. 241-249. |
Uno, T. et al., A Method of Real-Time Recognition of Moving Objects and its Application, Pattern Recognition: Pergamon Press, vol. 8, pp. 201-208, 1976. |
Haering, N., et al., Visual Event Detection, Kluwer Academic Publishers, Chapter 2, Section 8, 2001. |
IBM, Software Controls for Automated Inspection Device Used to Check Interposer Buttons for Defects, IP.com Journal, IP.com Inc., West Henrietta, NY, US, Mar. 27, 2003. |
Wright, Anne, et al., Cognachrome Vision System User's Guide, Newton Research Labs, Manual Edition 2.0, Documents Software Version 26.0, Jun. 3, 1996. |
Stemmer Imaging GmbH, Going Multimedia with Common Vision Blox, Product News, www.stemmer-imaging.de, Mar. 3, 2004. |
Cordin Company, Electronic Imaging Systems, High Speed Imaging Solutions: 200-500 Series Cameras, www.cordin.com, 2004. |
Bi-i, AnaLogic Computers Ltd., 2003. |
Bi-i, Bio-inspired Real-Time Very High Speed Image Processing Systems, AnaLogic Computers Ltd., http://www.analogic-computers.com/cgi-bin/phprint21.php, 2004. |
Cellular device processes at ultrafast speeds, Vision Systems Design, Feb. 2003. |
LaVision GmbH, High Speed CCD/CMOS Camera Systems, Overview of State-of-the-Art High Speed Digital Camera Systems, UltraSpeedStar, www.lavision.de, Sep. 24, 2004. |
10-K SEC Filing, iQ 180 Products, Adaptive Optics Associates, 900 Coles Road, Blackwood, NJ 08012-4683, Dec. 2003. |
Laser Scanning Product Guide, Adaptive Optics Associates, Industrial Products and Systems, 900 Coles Road, Blackwood, NJ 08012-4683, Industrial Holographic and Conventional Laser ID, Omnidirectional Bar Code Scanners, Mar. 2003. |
CV-2100 Series, Keyence America, http://www.keyence.com/products/vision/cv_2100_spec.html, High-Speed Digital Machine Vision System, Dec. 29, 2003. |
West, Perry C., High-Speed, Real-Time Machine Vision, Imagenation and Automated Vision Systems, Inc., 2001. |
Asundi, A., et al., High-Speed TDI Imaging for Peripheral Inspection, Proc. SPIE vol. 2432, Machine Vision Applications in Industrial Inspection III, Frederick Y. Wu, Stephen S. Wilson, Eds., Mar. 1995, pp. 189-194. |
Baillard, C., et al., Automatic Reconstruction of Piecewise Planar Models from Multiple Views, CVPR, vol. 02, No. 2, 1999, pp. 2559. |
Kim, Zuwhan et al., Automatic Description of Complex Buildings with Multiple Images, IEEE 0-7695-0813, 2000, pp. 155-162. |
Siemens AG, Simatic Machine Vision, Simatic VS 100 Series, www.siemens.com/machine-vision, Apr. 1, 2003. |
Stauffer, Chris et al., Tracking-Based Automatic Object Recognition, Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, MA http://www.ai.mit.edu, pp. 133-134. |
Baumberg, A.M. et al., Learning Flexible Models from Image Sequences, University of Leeds, School of Computer Studies, Research Report Series, Report 93.36, Oct. 1993, pp. 1-13. |
Number | Date | Country | |
---|---|---|---|
Parent | 10987497 | Nov 2004 | US |
Child | 12931504 | US |