This invention relates to the detection of objects, whether moving or not, within the path of either elevator car or hoistway doors, and of motion of passengers or objects toward the elevator, by means of a pattern recognition neural network which provides a door open command in appropriate cases.
Typical systems utilized to detect objects in or near the path of an elevator door employ an array of light sources disposed vertically on one edge of a door, which provide light beams that energize a corresponding array of photodetectors disposed on an opposite edge of the door, whereby interruption of a light beam produces a door open command causing the doors to become or remain open. Such systems are generally satisfactory, but they cannot sense objects which are not within the discrete paths of light. Furthermore, because the light arrays lie in a single plane, persons or objects may not be sensed until they have extended some distance into the door opening.
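For purposes of comparison, the interruption logic of such a conventional light-curtain detector can be summarized in a few lines. The following Python sketch is illustrative only; the function and variable names are assumptions and are not taken from any particular prior-art device.

```python
# Minimal sketch of the conventional light-curtain logic described above.
# beam_states[i] is True while photodetector i receives its light beam and
# False when the beam is blocked by an object in the door path.
def door_open_command(beam_states: list) -> bool:
    """Return True (command the doors to open or remain open) if any beam is interrupted."""
    return not all(beam_states)

# Example: the third beam is blocked, so a door open command is issued.
print(door_open_command([True, True, False, True]))  # True
```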
More sophisticated elevator door obstruction detection is disclosed in U.S. Pat. Nos. 5,387,768 and 5,410,149. However, apparatus according to these disclosures senses only motion, and therefore does not sense objects which are static or immobile within the door pathway. Furthermore, the processing of images is highly complex and requires significant software and processing time. The adaptation of such complex devices to elevator landings which have different image responses is also very complex, slow and expensive, due to the nature of the processing involved.
Current light beam door obstruction detectors require flexing cables to provide power to and receive responses from the moving doors.
Objects of the invention include provision of an elevator doorway obstruction sensing system: which can sense not only objects or persons moving toward the elevator, while ignoring other motion, but also non-moving objects or persons in the pathway of the doors; which can be readily adapted to a wide variety of floor landing images, utilizing readily available software in a personal computer which need only be temporarily connected to the apparatus during the learning process, and thereafter removed; which is extremely fast and does not require complex image processing; which can be readily adapted as a retrofit to a wide variety of elevator systems and floor landings; which does not require apparatus mounted on the doors; and which is easily implemented at relatively low cost.
According to the present invention, video images of a volume which includes a portion of the elevator door paths, including the door sills, and a portion of a landing adjacent to the elevator, including the landing floor adjacent to the sills, are converted into single-dimension numerical vectors and passed through a pattern-recognizing neural network to provide a door open signal in response to one or more patterns recognized by the neural network as indicating something moving toward the elevator or something within one or more of the door paths.
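The signal flow just described can be illustrated by the following hedged sketch, in which a nearest-prototype comparison stands in for the pattern-recognizing neural network; the category names, image size and function names are assumptions made for illustration, not elements recited by the invention.

```python
# Illustrative sketch: each video frame is reduced to a single-dimension numerical
# vector and classified against learned patterns; a nearest-prototype comparison
# stands in here for the pattern-recognizing neural network.
import numpy as np

CATEGORIES = ["doors_closed", "doors_open", "person_approaching", "object_in_door_path"]
DOOR_OPEN_CATEGORIES = {"person_approaching", "object_in_door_path"}

def frame_to_vector(frame: np.ndarray) -> np.ndarray:
    """Flatten a 2-D grayscale frame into a single-dimension vector of pixel intensities."""
    return frame.astype(np.float32).ravel()

def classify(vector: np.ndarray, prototypes: dict) -> str:
    """Return the category whose learned prototype vector is closest to the input vector."""
    return min(prototypes, key=lambda cat: np.linalg.norm(vector - prototypes[cat]))

def door_open_signal(frame: np.ndarray, prototypes: dict) -> bool:
    """True when the recognized pattern indicates motion toward the car or something in a door path."""
    return classify(frame_to_vector(frame), prototypes) in DOOR_OPEN_CATEGORIES

# Toy usage with randomly generated "learned" prototypes for a 64x64 camera image.
rng = np.random.default_rng(0)
prototypes = {cat: rng.random(64 * 64).astype(np.float32) for cat in CATEGORIES}
frame = rng.random((64, 64)).astype(np.float32)
print(door_open_signal(frame, prototypes))
```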
Other objects, features and advantages of the present invention will become more apparent in the light of the following detailed description of exemplary embodiments thereof, as illustrated in the accompanying drawing.
Referring to the accompanying drawing, a video camera is positioned so that its field of view encompasses two zones: zone 1, a portion of the landing adjacent to the elevator, and zone 2, a portion of the elevator door paths, including the door sills.
According to the invention, the camera is provided with a suitable objective lens to limit its view to zones 1 and 2. Suitable illumination, to ensure that the zones of interest are adequately lit, may comprise infrared illumination, which will not disturb passengers but will provide reliable image intensity.
According to the invention, a first concept is to determine patterns within zone 2 for open doors (that is, viewing the sills), for closed doors (that is, viewing the tops of the doors), and for doors that are opening or closing. Anything that does not match those images will trigger the generation of a door open command to cause the doors to become or remain open. Within zone 1, patterns are recognized that indicate movement indicative of a desire to enter the elevator. This may include movement toward the elevator, movement of a person sideways in order to get around another person, and other movements which are learned to be indicative of an intent to pass through the doorway onto the elevator car. In this embodiment, zones 1 and 2 do not overlap. However, zone 1 may be extended to include zone 2, if desired, in any given implementation of the invention.
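The two-zone decision logic described above may be summarized as follows; this is a minimal sketch, assuming symbolic pattern labels produced by the recognition stage, and the label names are illustrative rather than taken from the disclosure.

```python
# Illustrative two-zone decision logic: zone 2 must match a learned door state,
# and zone 1 is checked for motion learned to indicate intent to board.
KNOWN_DOOR_STATES = {"doors_open", "doors_closed", "doors_opening", "doors_closing"}
APPROACH_PATTERNS = {"moving_toward_elevator", "sidestepping_toward_doorway"}

def should_open(zone2_pattern: str, zone1_pattern: str) -> bool:
    """Command the doors open if zone 2 shows anything other than a learned door state,
    or if zone 1 shows motion indicative of a desire to enter the elevator."""
    obstruction = zone2_pattern not in KNOWN_DOOR_STATES
    approaching = zone1_pattern in APPROACH_PATTERNS
    return obstruction or approaching

print(should_open("unrecognized_object", "standing_still"))         # True: something in a door path
print(should_open("doors_closing", "sidestepping_toward_doorway"))  # True: passenger intent to board
print(should_open("doors_closed", "walking_parallel_to_doors"))     # False: doors may remain closed
```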
In the illustrated embodiment, images from the camera are applied to a neural network 35 disposed on a processing card 33, which provides the door open command in response to the recognized patterns.
To teach the neural network 35 the intended recognition scheme, a personal computer 42 is connected to receive images from the camera and to provide control over the card 33. The P.C. 42 will have suitable software, such as Zisc Engine for Image Recognition software (ZEIFR), that allows the operator to teach the neural network patterns and to locate differences between an image and some template. Patterns can be based on pixel intensity, color and so forth, and pattern recognition may be based on either Radial Basis Function (RBF) or K-Nearest-Neighbor (KNN) models. Training the image recognition engine is achieved by marking objects on the screen of the P.C. and listing one of up to 200 categories with which the image is to be associated, or the desired outcome from sensing a particular image, and then clicking on the Learn button. Any area of the live video not recognized by the image recognition engine is marked with a colored rectangle. Such rectangles may be selected on the screen, matched with the desired category or outcome, and entered into the system. The learning which occurs in the image recognition software is downloaded to the processing card 33. Learning can be performed on either still or moving images. The recognition engine is able to ignore the motion of the doors, patterns or colors in the environment (floors, walls, etc.), and images and reflections from the clothing worn by people within the field of view.
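The learn-and-recognize cycle described above can be sketched as a K-Nearest-Neighbor memory with an RBF-style influence radius; the internal data structures of the ZEIFR tool and the processing card are not set out in this disclosure, so the class below, its influence_radius parameter and its "not_recognized" handling are assumptions made for illustration.

```python
# Hedged sketch of the teach/recognize cycle: learned pattern vectors are stored with
# operator-assigned categories, and recognition uses K nearest neighbors with an
# RBF-style influence radius; unmatched inputs are reported as not recognized.
import numpy as np

class PatternMemory:
    def __init__(self, influence_radius: float = 10.0):
        self.vectors = []              # learned pattern vectors
        self.labels = []               # one of up to 200 operator-assigned categories
        self.influence_radius = influence_radius

    def learn(self, vector: np.ndarray, category: str) -> None:
        """Analogous to marking a region on the P.C. screen and clicking the Learn button."""
        self.vectors.append(np.asarray(vector, dtype=np.float32))
        self.labels.append(category)

    def recognize(self, vector: np.ndarray, k: int = 3) -> str:
        """Return the majority category among the k nearest learned patterns, or
        'not_recognized' if none fall within the influence radius (the condition
        that would be flagged with a colored rectangle for further teaching)."""
        if not self.vectors:
            return "not_recognized"
        dists = np.linalg.norm(np.stack(self.vectors) - np.asarray(vector, dtype=np.float32), axis=1)
        nearest = np.argsort(dists)[:k]
        within = [i for i in nearest if dists[i] <= self.influence_radius]
        if not within:
            return "not_recognized"
        labels = [self.labels[i] for i in within]
        return max(set(labels), key=labels.count)

# Toy usage: teach two patterns, then query a vector near the first one.
mem = PatternMemory(influence_radius=5.0)
mem.learn(np.array([0.0, 0.0]), "doors_closed")
mem.learn(np.array([10.0, 10.0]), "doors_open")
print(mem.recognize(np.array([0.5, -0.5])))  # doors_closed
```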
Filing Document | Filing Date | Country | Kind | 371(c) Date
---|---|---|---|---
PCT/US02/15658 | 5/14/2002 | WO | 00 | 11/12/2004
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO03/097506 | 11/27/2003 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
5001557 | Begle | Mar 1991 | A |
5182776 | Suzuki et al. | Jan 1993 | A |
5284225 | Platt | Feb 1994 | A |
5298697 | Suzuki et al. | Mar 1994 | A |
5387768 | Izard et al. | Feb 1995 | A |
5410149 | Winston, Jr. et al. | Apr 1995 | A |
5518086 | Tyni | May 1996 | A |
5717832 | Steimle et al. | Feb 1998 | A |
6050369 | Leone et al. | Apr 2000 | A |
6339375 | Hirata et al. | Jan 2002 | B1 |
6386325 | Fujita | May 2002 | B1 |
6386326 | Pustelniak et al. | May 2002 | B1 |
6547042 | Collins | Apr 2003 | B1 |
20010045327 | Shemanske et al. | Nov 2001 | A1 |
20030168288 | Deplazes et al. | Sep 2003 | A1 |
20060037818 | Deplazes et al. | Feb 2006 | A1 |
Number | Date | Country
---|---|---
20050173200 A1 | Aug 2005 | US