This application is a U.S. national filing under 35 U.S.C. §371 of International Application No. PCT/US01/27351, filed Aug. 31, 2001, claiming priority from U.S. Ser. No. 60/229,613, filed Aug. 31, 2000 (which is hereby incorporated by reference).
The present invention relates to sensor and imaging systems, and more particularly to a system for providing and interpreting image data.
Sensor and imaging systems are increasingly in demand in today's technology-driven economy. These systems include a camera for viewing objects within a field of view. The camera generates image data that is analyzed by a computer to determine what, if any, action should be taken in response to the object detected.
Many recognition systems use two or more cameras for viewing objects within one field of view. In addition to the costs associated with using several cameras, these systems require a specific mounting arrangement for each of the cameras. Such systems are also less reliable than single-camera systems because both cameras must function properly for the system to operate.
Single camera systems are typically mounted at a fixed location and look for objects that satisfy, or fail to satisfy, predetermined criteria; for instance, systems that check for structural defects. These systems are incapable of making decisions that are not already specified.
Accordingly, a need exists for a sensor and imaging system that, by using an image provided by a camera, can decide whether a condition has, or has not, been satisfied.
In an embodiment in accordance with the present invention, a system is provided having a camera, a processor, and a user interface. The camera transmits image data responsive to a scene within a field of view. In response to the image data, the processor indicates whether a condition has been satisfied. The user interface is operably connected to the processor and allows a user to select criteria for detection of objects, indicates the criteria selected, and provides visual confirmation that an object has been detected.
In another embodiment, a control interface is also provided for affecting other devices. Further, the system provides signals to influence the operation of other devices.
In yet another embodiment, the system provides a signal to open a door upon a determination by the processor that a condition has been satisfied. The door is then opened by a conventional electromechanical door opener system having a drive motor operably connected to the door.
Other features and advantages of the invention will be apparent from the following specification taken in conjunction with the following drawings.
FIGS. 1a and 1b are a block diagram of an embodiment of a sensor and imaging system in accordance with the present invention;
FIGS. 15a and 15b are a simplified block diagram of one of the object sensor/imaging circuits of
While this invention is susceptible of embodiments in many different forms, there is shown in the drawings and will herein be described in detail preferred embodiments of the invention with the understanding that the present disclosure is to be considered as an exemplification of the principles of the invention and is not intended to limit the broad aspect of the invention to the embodiments illustrated.
Turning to
Turning to
The camera assemblies 112 and 113 can include charge coupled devices (CCD), or the like, preferably having a wide-angle lens, and are capable of transmitting image data to the sensor/imaging circuits 114 and 115, respectively. The image data corresponds, respectively, to a scene within each camera's field of view.
The sensor/imaging circuits 114,115 process the image data to determine whether a user selected condition has been satisfied. The user selected conditions are selected via a man-machine interface comprising the I/O display board 116 and the membrane keypad 118. In an embodiment, the man-machine interface is operably connected to the sensor/imaging circuits 114,115 and allows a user to select criteria for detection of objects, indicates the criteria selected, and provides visual confirmation that an object has been detected.
Turning to
In an embodiment, annular apertures 146 extend through the axis of the ceramic resistor 142 and the conductive terminals 144,145. The apertures 146 have substantially identical outer circumferences and are concentrically aligned with each other.
Preferably, the outer perimeter 148 of conductive terminal 145 includes a plurality of ears 150 extending outwardly therefrom. Extending through each ear 150 is an aperture 152 for extending an attachment screw 154 (
Turning to
The camera assembly housing 131 includes an aperture 158 that allows the camera's field-of-view 136 to extend outside of the housing. The window 132 is mounted over the aperture 158 to prevent contaminants such as dirt and moisture from entering the camera assembly.
Preferably, the window 132 is sandwiched between the thermally conductive terminal ring 145 of the heater assembly 140 and an annular gasket 160 made of a resilient material and abutting the inner surface 162 of the camera assembly housing 131 about the aperture 158. In an embodiment, the window 132 is made of a visually transparent material such as borosilicate glass.
The camera 134 generates image data or electronic pixel data 218 representative of what is observed in the camera's field-of-view 136. In an embodiment, the image data 218 is analyzed by one of the video processors 114,115 (
Within the camera assembly housing 131, the terminals 144,145 of the heater assembly 140 are coupled to a voltage source 166 for maintaining a voltage potential across the ceramic resistor 142. The heat generated by the ceramic resistor 142 as current flows therethrough is dissipated through conductive terminal 145 and onto the window 132. In an embodiment, enough heat is provided to maintain the window 132 at a temperature above the dew-point of the air outside of the camera assembly housing 131. For instance, the heater can activate at about 87° F. and deactivate at about 106° F.
As will be appreciated by those having skill in the art, the use of a ceramic resistor 142 eliminates the need for a mechanical thermostat or the like since the resistor material exhibits a positive-temperature coefficient. The elimination of a thermostat increases the reliability of the heater and reduces the amount of noise placed on voltage supplies as a result of switching. Turning back to
Turning to
Preferably, the safety zone 168 is maintained in an area immediately surrounding the door 121 to prevent the door from closing when a person or object is in the immediate vicinity of the door. Moreover, the activation zone(s) 168 open the door when a person or vehicle approaches the door 121. A failsafe system can also be provided to open the door 121 whenever there is a loss of illumination within the field of view, severe illumination changes, electronics failure, camera knocked ajar, or the camera lens is obscured.
As indicated previously, the system 110, via the user interface, has the capability to define at least a portion of an image received by the camera assemblies as a control zone. In an embodiment, the system 110 has the capability to choose coordinates from all pixel coordinates by direct access within the control zone. Alternatively, the system 110 has the capability to choose from multiple predefined zones. Moreover, the system 110 can have the capability to place real objects in the field of view so as to delineate boundary coordinates, whereupon the real objects become part of the image data.
In an embodiment, the user interface has three modes of operation: parameter edit mode, run mode, and diagnostic mode. In parameter edit mode, a user can input or modify configuration parameters, using touch keypad buttons 610, 622, 624, and 626, such as the door model, English or metric units, and camera height and distance from the door. In the run mode, the system 110 is activated. As such, the system 110 processes images from the cameras 112,113, outputs safety and activation zone indication signals through the I/O board 116, and displays status information on the display LEDs 614 and 616. In the diagnostic mode, additional information regarding the status of the system 110 is made available via an I/O port (not shown).
In an embodiment, the video digitizer 126 receives an analog image signal from one of the cameras, digitizes the analog image signal, and transmits the digitized image signal to the field programmable gate array 124.
As explained in detail further herein, the field programmable gate array 124 is programmed to perform one or more image processing operations in response to the digitized image signal received. In an embodiment, these operations include comparing predetermined traits of the digitized image signal with one or more previously received digitized image signals to provide composite image data. In response to the composite image data, the central processing unit 125 determines whether one or more conditions have been satisfied.
Operably coupled to the video digitizer 126 is a temperature sensor 128 having an output correlating to the temperature of the video digitizer. Upon an indication by the temperature sensor that the video digitizer 126 is not operating within a predetermined temperature range or limit, a reset command is issued by a reset circuit 130 whereby the system 110 is maintained in a reset state until the temperature of the video digitizer returns to within the predetermined temperature range or limit.
In an embodiment, the FPGA 124 performs relatively high-rate pixel processing in order to unburden the CPU 125 and achieve a desired video processing frame rate. This hardware architecture balance reduces overall system cost by removing the cost associated with an adequately fast CPU chip. A further frame-rate speedup can be achieved by using the FPGA and CPU processing simultaneously in parallel. This parallel processing is accomplished by the FPGA pixel processing the next frame during the interval that the CPU is data processing the current frame. Thus, the new FPGA output is immediately available to the CPU process when the CPU finishes the current frame data processing. This process structure requires the ability to maintain two independent sets of data, and is referred to later herein as ping/pong control.
Turning to
In an embodiment, the image processing module 212 receives stored electronic pixel data 218 comprising current image data 220, reference image data 222, reference edges data 224, and previous image data 226. Preferably, the current image data 220 is the most recently taken image (i.e., taken at t), the previous image data 226 is the next most recently taken image data (i.e., taken at t−1), and the reference image data 222 is the oldest of the taken image data (i.e., taken at t−1−x). Moreover, as explained in detail further herein, the reference edges data 224 consists of edge data extracted from the reference image data 222.
The image processing module 212 also receives parameter data 228 from the man-machine interface (i.e., membrane keypad 118 and I/O display board 116 of
As explained in detail further herein, in response to the electronic pixel data 218 and the parameter data 228, the image processing module 212 produces derived image data 230 comprising edge segmentation, motion segmentation, and region segmentation.
The feature processing module 214 receives the derived image data 230 and the parameter data 228. As explained in detail further herein, the feature processing module 214 produces, in response to the image data 230 and parameter data 228, feature data 232 comprising edge features, motion features, region features, and frame features.
The detection processing module 216 receives the feature data 232 and the parameter data 228. In response to the data, the detection processing module 216 produces control signals 234 comprising a detection signal for opening and closing the door 121 (
Turning to
The initialize parameters step 236 includes initialization of the man-machine interface, constant data, and derived parameters. During initialization of the man-machine interface, user entered data is read and stored into memory. Constant data is also loaded into memory along with derived parameters relating to control zones for opening and closing the door.
Thus, as indicated above, upon application of power to the system, the initialize parameter module 236 initiates the initialization of the man-machine interface (i.e., membrane keypad 118 and I/O display board 116 of
The initialize zones module 234 initiates the initialization of the control zones, whereupon data associated with user-defined or predefined safety zones and activation zones is compiled. The initialize FPGA 242 and the initialize video digitizer 244 initiate the initialization of the FPGA 124
The initialize video system 246 initiates the initialization of the CPU 125 (
After the system is initialized, the system operates in a video processing loop depicted in the simplified block diagrams of
Within the video processing loop 250, the CPU 125 process uses the current ping/pong buffer pointer to load and unpack that data into a third database, the user data bank U. This data is used later in the CPU process to generate features and detection decisions on the current frame. Preferably, at the same time, the CPU process starts the FPGA capture and process activity on the FPGA 124. While the CPU is processing features for the current frame, the FPGA is computing image data for the next frame. The detection and control activity sends the safety and activate signals out through the FPGA serial I/O interface. The CPU feature and detection processing takes longer than the FPGA computations. When the CPU finishes the current frame, the FPGA data is retrieved to the opposite bank (e.g., Bank 1 if processing Bank 0). Diagnostic messages can be output at the end of each frame processing, as well as at any point in the video processing. The process then loops to set Bank U to the new current bank (Bank 0 or Bank 1), and the FPGA is again initiated.
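The ping/pong structure can be illustrated with a brief sketch. The following Python fragment is a minimal, hypothetical model of the loop, not the system's actual implementation: the stand-in functions `fpga_process_frame` and `cpu_process_frame` are assumptions used only to show how the CPU works on Bank U for the current frame while data for the next frame accumulates in the opposite bank.

```python
import threading

def fpga_process_frame(frame_index, bank):
    """Stand-in for the FPGA image processing of the next frame."""
    bank["frame"] = frame_index
    bank["derived"] = "derived image data for frame %d" % frame_index

def cpu_process_frame(bank):
    """Stand-in for CPU feature and detection processing of the current frame."""
    return "detection decision for " + bank["derived"]

banks = [{}, {}]                          # ping/pong memory banks (Bank 0 and Bank 1)
current = 0                               # bank holding the current frame's data
fpga_process_frame(0, banks[current])     # initial capture before entering the loop

for frame in range(1, 5):
    bank_u = banks[current]               # Bank U: data the CPU uses this pass
    next_bank = banks[1 - current]

    # Start the FPGA on the next frame while the CPU works on the current one.
    fpga = threading.Thread(target=fpga_process_frame, args=(frame, next_bank))
    fpga.start()

    decision = cpu_process_frame(bank_u)  # feature and detection processing
    print(decision)                       # e.g., send safety/activate signals

    fpga.join()                           # next frame's FPGA output is now ready
    current = 1 - current                 # swap banks; new data becomes Bank U
```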
Turning to
Once the derived image data 230 is produced, it is preferably stored within one of a plurality of memory banks 230a,230b and then provided, via switching, for feature processing. Accordingly, the derived image data 230 provided to the feature processing module 214 is static. However, the FPGA 124 continuously processes the electronic pixel data 218 and loads the results of the image processing, via switching, into the memory bank not currently accessible to the processing module 214. Accordingly, the derived image data 230 within the memory banks is made accessible to the feature processing module by switching between the memory banks 230a,230b on a first-in-first-out basis.
Preferably, two memory banks 230a and 230b are provided. Turning back to
The retrieve FPGA data step 254 provides for obtaining the static data within the memory banks for processing of the static data during the calculating all features step 256. In particular, temporary storage registers and counters are reset, and the static data is unpacked to provide the derived image data 230 for processing by the feature processing module 214 (
In an embodiment, and as explained in detail further herein, the feature processing module 214 (
Further, the detection processing module 216 performs the calculate detection & control step 258, in response to the feature data 232 (
The save FPGA data step 260 occurs once the FPGA 124 (
Turning to
The ACC, when enabled by user input, functions during initialization to find the best starting gain by iterating and testing the image result. When a gain is found which satisfies established criteria, iterating stops, and the process continues to the video loop with the selected gain. The ACC also functions at the beginning of the video loop, but does not iterate to find a satisfactory gain. Only a single gain change is performed in the loop per frame. The gain change and consequent video system initialization take a much shorter time than a frame time (100 ms). The decision to require a gain change in the video loop is controlled by criteria calculated in the detection and control portion of the CPU activity. The criteria can include aging, zone activity, and long and short time-constant filters.
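A minimal sketch of these two ACC behaviors follows. The acceptance criterion, the gain step sizes, and the helper `capture_mean_grey` are hypothetical placeholders, not the criteria calculated by the detection and control portion of the system; the sketch only shows the structural difference between the iterating startup search and the single gain change allowed per frame in the video loop.

```python
def gain_is_acceptable(mean_grey):
    """Hypothetical criterion: mean grey level falls in a usable mid-range band."""
    return 80.0 <= mean_grey <= 170.0

def capture_mean_grey(gain):
    """Stand-in for capturing a frame at the given gain and measuring it."""
    return 0.6 * gain  # placeholder relationship for illustration only

def acc_initialize(gain=100.0, step=10.0, max_iterations=50):
    """At startup, iterate the gain until the image criterion is satisfied."""
    for _ in range(max_iterations):
        mean = capture_mean_grey(gain)
        if gain_is_acceptable(mean):
            break
        gain += step if mean < 80.0 else -step
    return gain

def acc_per_frame(gain, step=5.0):
    """In the video loop, apply at most a single gain change per frame."""
    mean = capture_mean_grey(gain)
    if gain_is_acceptable(mean):
        return gain
    return gain + step if mean < 80.0 else gain - step
```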
Turning to
In an embodiment, the modified Sobel operator module 302 receives current (B) image input 312 and generates the edge image (GEB) 314 from the current input image. A reference image (GER) 316, initialized in the CPU, is subtracted from the current edge image in the positive difference operator module 304, where negative values are set to zero. The grey-level edge image is thresholded 306, eroded 308, and labeled 310. The output of the label I operator 310 is a 16-bit labeled image 318, an equivalence table 320, and counts of the number of labels used 322 and the number of entries in the equivalence table. Counts of the number of set pixels in the binary input 324 and output 326 of the erode operator 308 are also output to the CPU, completing the edge image processing.
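As an illustration of this edge thread, the sketch below applies a standard Sobel gradient magnitude as a stand-in for the modified Sobel operator (whose modification is not reproduced here), subtracts a reference edge image with negative values clipped to zero, thresholds, and erodes with a 3×3 structuring element; the threshold value is an arbitrary placeholder, and the labeling step that would follow is omitted.

```python
import numpy as np
from scipy import ndimage

def edge_thread(current_image, reference_edges, threshold=30):
    """Edge-based presence processing: Sobel edges, positive difference against
    the reference edges, thresholding, and 3x3 erosion (labeling would follow)."""
    img = current_image.astype(np.float32)

    # Standard Sobel gradient magnitude as a stand-in for the modified Sobel (GEB).
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    geb = np.hypot(gx, gy)

    # Positive difference against the reference edge image (GER); negatives -> 0.
    diff = np.clip(geb - reference_edges, 0, None)

    # Threshold to a binary image, then erode with a 3x3 structuring element.
    binary = diff > threshold
    eroded = ndimage.binary_erosion(binary, structure=np.ones((3, 3)))
    return eroded
```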
Label I operator 310 is used in each thread of the image processing. Label I 310 is the first part of a two-step process used to produce the labeling of the connected components of the binary input. Label I 310 passes a 2×2 kernel over the binary input image beginning with the upper left of the image. The elements of the kernel are identified as follows:
If the binary pixel at X is zero, the output is zero. If X is set, the labels B, A, C are scanned in that order. If all of B, A, C are zero, the next value of a label counter is output at X and the counter is incremented. If one or more of B, A, C is non-zero, the first non-zero value is output at X. If any of the remaining non-zero labels is different from the output value, the output value and the different value are written to an equivalence table.
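Since the kernel element layout is not reproduced above, the sketch below assumes the three previously visited neighbors of X are its left, above-left, and above pixels (one plausible 2×2 arrangement); it is a generic first pass of two-pass connected-component labeling, not the system's exact implementation.

```python
import numpy as np

def label_first_pass(binary):
    """First pass of two-pass connected-component labeling over a 2x2 kernel.
    Neighbors of X are assumed to be the already-visited pixels: left (B),
    above-left (A), and above (C). Returns the labeled image, the equivalence
    table, and the label count."""
    rows, cols = binary.shape
    labels = np.zeros((rows, cols), dtype=np.uint16)
    equivalences = []                              # pairs of conflicting labels
    next_label = 0

    for r in range(rows):
        for c in range(cols):
            if not binary[r, c]:
                continue                           # output stays zero
            # Gather neighbor labels in scan order B, A, C.
            neighbors = []
            if c > 0:
                neighbors.append(labels[r, c - 1])        # B: left
            if r > 0 and c > 0:
                neighbors.append(labels[r - 1, c - 1])    # A: above-left
            if r > 0:
                neighbors.append(labels[r - 1, c])        # C: above
            nonzero = [n for n in neighbors if n != 0]

            if not nonzero:
                next_label += 1                    # all neighbors zero: new label
                labels[r, c] = next_label
            else:
                out = nonzero[0]                   # first non-zero label wins
                labels[r, c] = out
                for n in nonzero[1:]:
                    if n != out:                   # record the equivalence
                        equivalences.append((out, n))
    return labels, equivalences, next_label
```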
Turning to
The positive difference of the current grey-level input image (B) 360 and the previous image (A) 362, as well as the positive difference of A and B, 342 and 344, are thresholded 346 and 348 and dilated 350 and 352. The results are inclusively ORed 354. The resulting binary image is labeled as in the edge case (
The grey-level edge image is thresholded, eroded 356, and labeled 358. The output of the label I operator 358 is a 16-bit labeled image 364, an equivalence table 366, and counts of the number of labels used 368 and the number of entries in the equivalence table. Counts of the number of set pixels in the binary input 370 and output 372 of the erode operator 356 are also output to the CPU, completing the motion detector image processing using regions.
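A compact sketch of this motion-region thread is shown below, assuming 3×3 structuring elements and arbitrary placeholder thresholds (the actual values are not given in the text); the labeling step that would follow is omitted.

```python
import numpy as np
from scipy import ndimage

def motion_regions(current, previous, thresh_pos=25, thresh_neg=25):
    """Motion detection using regions: positive differences in both directions,
    thresholding, dilation, and an inclusive OR (labeling would follow)."""
    b = current.astype(np.int16)
    a = previous.astype(np.int16)

    pos_ba = np.clip(b - a, 0, None)      # positive difference of B and A
    pos_ab = np.clip(a - b, 0, None)      # positive difference of A and B

    bin_ba = ndimage.binary_dilation(pos_ba > thresh_pos, np.ones((3, 3)))
    bin_ab = ndimage.binary_dilation(pos_ab > thresh_neg, np.ones((3, 3)))

    return bin_ba | bin_ab                # inclusive OR of the two binary images
```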
Label I operator 358 is used in each thread of the image processing. Label I 358 is the first part of a two step process used to produce the labeling of the connected components of the binary input. Label I 358 passes a 2×2 kernel over the binary input image beginning with the upper left of the image. The elements of the kernel are identified as follows:
If the binary pixel at X is zero, the output is zero. If X is set, the labels B, A, C are scanned in that order. If all of B, A, C are zero, the next value of a label counter is output at X and the counter is incremented. If one or more of B, A, C is non-zero, the first non-zero value is output at X. If any of the remaining non-zero labels is different from the output value, the output value and the different value are written to an equivalence table.
Turning to
The system image processing region analysis detection operation is analogous to the motion detection operation of
Turning to
The output of the label I operator 432 is a 16-bit labeled image 434, an equivalence table 436, and counts of the number of labels used 438 and the number of entries in the equivalence table. Counts of the number of set pixels in the binary input 440 and output 442 of the erode operator 430 are also output to the CPU, completing the system image processing having a motion detector that uses edges.
Turning to
The presence (P) or edge feature module 452 and the shadow and lightbeam (SL) or region feature module 458 calculations are quite similar up to the point of generating the edge/region score discounts. Moreover, within the P feature 452 and SL feature 458 calculations, the global calculations are very similar to the zone calculations. The zone calculations restrict the spatial range of feature calculations for each zone using the associated zone mask. The results of the P and SL feature calculations are stored in a database (feature tables) for use in detection determination 460.
Frame features 456 are calculated differently than P and SL features. Frame features 456 are not features of objects, but of the input grey level image and the current edge image. Frame statistics are computed in order to draw inferences about conditions of the camera and video system integrity. Frame statistics are also used to condition some detection variables that act as adaptive thresholds. Three fault flags can be set by the calculate frame features module 456: illumination fault flag, obscure fault flag, and ajar fault 462. Each of these faults 462 is determined through associated metrics. The illumination fault is controlled by evaluating the modified Kuiper statistic, the uniform centered mean, and the variance of the grey level input. The obscure and ajar faults use the current and archive edges to detect whether the camera and/or video system have become obscured or knocked ajar.
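The illumination test relies on how far the grey-level distribution departs from a uniform one. The exact modified Kuiper statistic is not reproduced in the text; the sketch below computes the standard Kuiper statistic of the grey-level histogram against a uniform distribution, which is the kind of measure that comparison uses, and should be read as an assumption rather than the system's formula.

```python
import numpy as np

def kuiper_statistic(grey_image, levels=256):
    """Standard Kuiper statistic of the grey-level distribution against a
    uniform distribution (stand-in for the modified Kuiper statistic)."""
    hist, _ = np.histogram(grey_image, bins=levels, range=(0, levels))
    empirical_cdf = np.cumsum(hist) / hist.sum()
    uniform_cdf = np.arange(1, levels + 1) / levels
    d_plus = np.max(empirical_cdf - uniform_cdf)   # greatest excess above uniform
    d_minus = np.max(uniform_cdf - empirical_cdf)  # greatest deficit below uniform
    return d_plus + d_minus
```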
The system 450 will not update the reference if any motion is detected in any zone. To determine if there is motion in each zone, the system counts the number of non-zero pixels of the labeled edge image within that zone, as calculated in the presence P feature module 452. The non-zero pixels become motion pixels, calculated in the motion M feature module 454. The system 450 counts the non-zero pixels in the motion labeled image to verify whether the number of zone motion pixels in each zone is greater than zero (0). The system 450 also counts the non-zero pixels in the zone detection mask for accumulation in the count.
Turning to
The label module 472 receives presence input in the form of labeled edge image 478, equivalence tables 480, and label and conflict counts 482. The label module 472 resolves pixel labeling conflicts within the region, replaces labels with region numbers, makes an area call renumbering regions with sequential indices, and re-indexes the regions, passing data related to the number of regions and the regions image to the calculate global presence features module 474.
The calculate global presence features module 474 uses the regions image 484, the number of regions 486, and the current edges (GEB) 488 to create a global feature table. The global feature table is first initialized, and regions are labeled as to area, mean grey level intensity, histogram, and centroid. Each region is then recalculated for variance of grey level and centroid, listing the features (global, safety, . . . ) of the pixels within the region.
The calculate P feature zones module 476 takes the aforementioned regions image 484, number of regions 486, the current edges (GEB) 488 and creates a zone feature table using zone mask and rectangle 490. The system 470 determines motion in zones by calculating detection in a safety zone, a secondary safety zone, a door zone, a first activation zone, a second activation zone, a first guard zone, and a second guard zone.
Turning to
The global P features are calculated first by initializing the edge counts of the feature table 502. The global extent of the image is calculated in the image first pass module 504. Area, centroid, mean, histogram and edge counts are accumulated and put through the image second pass module 506, where second central moments and variance are accumulated. The feature table pass module 508 calculates the derived features including the spread, elongation, orientation, and ellipse shape of the region. The calculate region scores module 510 determines door rejection, edge shape suppression, and edge grey level suppression. A score comes from the grey level variance of the region and a discount is applied to the score. After region scores are calculated 510, the next region is looped through the calculations of the feature table pass module 508.
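For illustration, the following sketch computes this kind of per-region feature set with standard moment-based definitions of centroid, spread, elongation, and orientation. The text does not give the system's exact formulas, so the definitions used here (trace of the second central moments for spread, ratio of principal-axis variances for elongation) are assumptions.

```python
import numpy as np

def region_features(labels, grey, region_id):
    """Per-region first/second pass features: area, centroid, mean, variance,
    and moment-derived shape features (standard definitions assumed)."""
    ys, xs = np.nonzero(labels == region_id)
    values = grey[ys, xs].astype(np.float64)

    area = xs.size
    centroid = (xs.mean(), ys.mean())
    mean_grey = values.mean()
    var_grey = values.var()

    # Second central moments of the region's pixel coordinates.
    dx, dy = xs - centroid[0], ys - centroid[1]
    mu20, mu02, mu11 = (dx * dx).mean(), (dy * dy).mean(), (dx * dy).mean()

    spread = mu20 + mu02                               # total scatter
    common = np.sqrt((mu20 - mu02) ** 2 + 4.0 * mu11 ** 2)
    lam1 = (mu20 + mu02 + common) / 2.0                # major-axis variance
    lam2 = (mu20 + mu02 - common) / 2.0                # minor-axis variance
    elongation = lam1 / lam2 if lam2 > 0 else np.inf
    orientation = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)

    return {"area": area, "centroid": centroid, "mean": mean_grey,
            "variance": var_grey, "spread": spread,
            "elongation": elongation, "orientation": orientation}
```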
Turning to
The zone presence P features are calculated first by initializing the zone counts of the feature table 522. The global extent of the zone is calculated in the zone first pass module 524. The zone is examined to determine if pixels are in the selected zone mask. Area, centroid, mean, and histogram are also accumulated and put through the image second pass module 526, where second central moments and variance are accumulated. The feature table pass module 528 calculates the derived features including the spread, elongation, orientation, and ellipse shape of the region. The calculate region scores module 530 determines door rejection, area proportion suppression, edge shape suppression, and edge grey level suppression. After region scores are calculated 530, the next region is looped through the calculations of the feature table pass module 528.
Turning to
The label module 542 receives presence input in the form of labeled edge image 544, equivalence tables 546, and label and conflict counts 548. The label module 542 resolves pixel labeling conflicts within the region, replaces labels with region numbers, makes an area call renumbering regions with sequential indices, and re-indexes the regions, passing data related to the number of regions and the regions image to the calculate global presence features module 550.
The calculate global presence features module 550 uses the regions image 552, the number of regions 554, and the current difference image to create a global feature table. The global feature table is first initialized, and regions are labeled as to area, mean grey level intensity, histogram, and centroid. The region image is then recalculated for variance of grey level and centroid second moments, listing the shape features of the image within the region.
The calculate SL feature zones module 558 takes the aforementioned regions image 552, number of regions 554, the current edges (GEB) 560 and creates a zone feature table using zone mask and rectangle 562. The system 540 determines motion in zones by calculating detection in a safety zone, a secondary safety zone, a door zone, a first activation zone, a second activation zone, a first guard zone, and a second guard zone.
Turning to
The global SL features are calculated first by initializing the edge counts of the feature table 572. The global extent of the image is calculated in the image first pass module 574. Area, centroid, mean, histogram and edge counts are accumulated and put through the image second pass module 576, where second central moments and variance are accumulated. The feature table pass module 578 calculates the derived features including the spread, elongation, orientation, ellipse shape factor of the region, modified Kuiper statistic, and mapped mean and variance. The calculate region scores module 580 determines the SL score with region suppression from shadow and light beam discount, shape discount, and area discount, and with transient suppression. After region scores are calculated 580, the next region is looped through the calculations of the feature table pass module 578.
Turning to
The zone SL features are calculated first by initializing the zone counts of the feature table 592. The global extent of the zone is calculated in the zone first pass module 594. The zone is examined to determine if pixels or the zone rectangle are in the selected zone mask. Area, centroid, mean, and histogram are also accumulated and put through the image second pass module 596, where second central moments and variance are accumulated. The feature table pass module 598 calculates the derived features including the spread, elongation, orientation, ellipse shape factor, modified Kuiper statistic, and mapped mean and variance of the region. The calculate region scores module 600 determines the SL score with region suppression from shadow and light beam discount, shape discount, and area discount, and with transient suppression. After region scores are calculated 600, the next region is looped through the calculations of the feature table pass module 598.
In an embodiment, an automatic door control and safety system is provided that controls door behavior in accordance with logic that interprets a nominally optically sensed object situation and environment proximate to the door. The system uses a camera sensor sub-system fitted with an appropriate lens in order to generate an image of the desired sensing area. Digital images produced by the camera sub-system are processed using image processing in a processing sub-system in order to develop data used to drive specific decision logic to effect desired door control. Thus, door control is effected by computer interpretation of image content.
In an embodiment, from a processing point of view, the system incorporates several processing stages: 1) image formation; 2) image conditioning; 3) image processing; 4) image content processing; 5) derived data processing; 6) data interpretation processing; and 7) control logic processing.
The door control and safety system is supported by hardware elements including the camera sub-system and a general purpose processor sub-system that can be augmented by a digital signal processing device. The camera sub-system can include a lens system, a charge-coupled device imaging device, amplifiers, and an analog-to-digital conversion element. These elements are commonly found together in home computer applications, for example, which interface with a digital camera to produce digital images on the computer screen for capture and storage for a variety of purposes.
The system uses a selection of image processing operators, implemented in an algorithm, and subsequent derived data processing and interpretation. The selected image processing operators and image content processing are derived from the optical phenomena exhibited by objects within the field of view of the camera. The image processing operates on the numbers contained in the array representative of the scene as determined through the lens and camera mounting geometry. This image processing creates internal arrays of numbers which are the results of the image processing, to be used by subsequent operations, thus forming a sequence of image processing operations.
In an embodiment of the system, the entire image field is processed. Furthermore, there are no prior assumptions about target objects used to develop any processing elements designed to match anticipated object characteristics for the purpose of selecting subsets of the entire image field.
At the beginning of the image processing sequence, the image processing accepts a new input image of the scene (which is a single time sample (“frame”) of the on-going image digitization stream). Storage is provided in order to maintain a previous image frame for comparison to a newly captured image frame (a “background” image). This stored image frame is captured in the same way as a new frame, and, in particular, is a single image frame, not an average of more than one frame.
In an embodiment, each new image frame is filtered to remove speckle noise using a median filter. The median filter removes isolated noise while not blurring the image as averaging does. Such isolated noise may be due to imaging sensor noise, downstream electronics noise, or environmentally-produced scintillation. The image stored for comparison is filtered once with the median filter, as is the current image. The median filter can be implemented as a 3×3 filter kernel that is passed over every pixel in the image array. The value at the center of the kernel is deposited in a new image array, and the value is the median of the nine numbers in the filter kernel.
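The 3×3 median filter just described translates directly into a short sketch. Border handling is not specified in the text; edge replication is assumed here for illustration.

```python
import numpy as np

def median_filter_3x3(image):
    """3x3 median filter: the median of each 3x3 neighborhood is written to the
    corresponding pixel of a new array (borders handled by edge replication)."""
    padded = np.pad(image, 1, mode="edge")
    out = np.empty_like(image)
    rows, cols = image.shape
    for r in range(rows):
        for c in range(cols):
            window = padded[r:r + 3, c:c + 3]   # the nine kernel values
            out[r, c] = np.median(window)       # median deposited at the center
    return out
```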
After image filtering, two new image arrays are generated (i.e.,
After differencing, the images still contain 8-bit values. (Images with multiple bit levels are commonly referred to as grey-scale images). After image differencing, a thresholding operator is applied to each of the resulting positive and negative contrast grey-scale images. The threshold values applied to the two images may be different. The values can be fixed or adaptive wherein changes are made based on downstream image interpretation results. The pixel-by-pixel thresholding operation produces two new images. For each image, when the grey level in the input image exceeds the associated threshold value, a “1” is placed in the output image array, otherwise a “0” is placed. The result of the thresholding operation is thus two “binary” images.
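The differencing and thresholding steps then reduce to a few array operations. The sketch below assumes 8-bit images and uses arbitrary placeholder thresholds (the text notes the two thresholds may differ and may be adaptive).

```python
import numpy as np

def contrast_binary_images(current, background, pos_threshold=20, neg_threshold=20):
    """Positive- and negative-contrast grey-scale difference images, each
    thresholded pixel-by-pixel into a binary image."""
    cur = current.astype(np.int16)
    bg = background.astype(np.int16)

    positive = np.clip(cur - bg, 0, 255).astype(np.uint8)   # brighter than background
    negative = np.clip(bg - cur, 0, 255).astype(np.uint8)   # darker than background

    positive_binary = (positive > pos_threshold).astype(np.uint8)
    negative_binary = (negative > neg_threshold).astype(np.uint8)
    return positive_binary, negative_binary
```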
Turning to
Turning to
Turning to
With a database representative of image content, the features of each region are considered by interpretation logic to develop control logic decisions. In an embodiment, the interpretation logic is implemented as a set of “if-then-else” constructs, and can utilize arithmetic combination of the basic region features in order to determine image content interpretation. For instance, the resulting region area can be used to infer the presence of an object of interest, and the region centroid and bounding rectangle determine the location of that object. (The bounding rectangle is the smallest rectangle that includes all pixels belonging to the region.)
In an embodiment, the operator can define rectangular regions of the image field of view to determine areas for specific control actions. The bounding rectangle coordinates of the computer-derived object regions of interest are compared to the coordinates of the operator-determined decision regions in order to determine subsequent control logic results. If an object is declared to be in the safety zone, for example, the control logic indicates that the door should remain open until the safety zone is clear. Similarly, if an object is determined to be in the activation zone (the binary region bounding rectangle representative of the image object intersects the activation zone decision rectangle), then the signal is sent to open the door. In an embodiment, the image regions selected by the operator for control logic purposes are not used in any way to initialize or otherwise influence the image processing of the entire image in order to determine image content.
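A minimal sketch of this decision step follows: region bounding rectangles are intersected with operator-defined safety and activation rectangles, and simple if-then-else rules produce the door control result. The rectangle representation and rule ordering are assumptions for illustration only.

```python
def rectangles_intersect(a, b):
    """Rectangles given as (x_min, y_min, x_max, y_max); True if they overlap."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def door_control(region_rects, safety_zone, activation_zone):
    """If-then-else interpretation of region bounding rectangles against the
    operator-defined decision rectangles."""
    in_safety = any(rectangles_intersect(r, safety_zone) for r in region_rects)
    in_activation = any(rectangles_intersect(r, activation_zone) for r in region_rects)

    if in_safety:
        return "hold door open"        # safety zone occupied: do not close
    elif in_activation:
        return "open door"             # object approaching: send open signal
    else:
        return "allow door to close"
```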
While the specific embodiments have been illustrated and described, numerous modifications come to mind without significantly departing from the spirit of the invention, and the scope of protection is only limited by the scope of the accompanying claims.
Filing Document | Filing Date | Country | Kind | 371c Date
---|---|---|---|---
PCT/US01/27351 | 8/31/2001 | WO | 00 | 9/15/2003

Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO02/19698 | 3/7/2002 | WO | A
Number | Date | Country
---|---|---
20050074140 A1 | Apr 2005 | US

Number | Date | Country
---|---|---
60229613 | Aug 2000 | US