The subject matter disclosed herein relates to conveyance systems, such as elevator systems. More specifically, the subject matter disclosed herein relates to an elevator system that uses facial recognition to control elevator dispatching.
Elevator systems can use a variety of techniques to allow a user to request elevator service. In traditional systems, users provide an up or down hall call, and then enter a floor destination upon entering the elevator car. Other existing systems allow a user to enter a destination call at a kiosk, the destination call specifying a particular floor. Other existing systems read a user identifier, such as an employee badge, to determine a destination floor.
An exemplary embodiment is a conveyance system including a camera to generate an image of an area of interest; a dispatch system including a facial recognition unit and a profile unit, the facial recognition unit detecting facial features of a user in the image; the dispatch system determining if the facial features match a profile stored in the profile unit, the dispatch system scheduling car service in response to the facial features matching the profile stored in the profile unit; a system interface including a system interface camera, the system interface camera to generate a second image of the user at the system interface; the facial recognition unit detecting facial features of the user in the second image; the dispatch system determining if the facial features of the user in the second image match the profile stored in the profile unit; the system interface requesting a destination from the user when the facial features of the user in the second image do not match the profile stored in the profile unit; the system interface presenting an anticipated destination from the profile when the facial features of the user in the second image match the profile stored in the profile unit.
Another exemplary embodiment is a method for operating a conveyance system, the method including generating an image of an area of interest; detecting facial features of a user in the image; determining if the facial features match a profile; scheduling conveyance service in response to the facial features matching the profile; generating a second image of the user at a system interface; detecting facial features of the user in the second image; determining if the facial features of the user in the second image match the profile stored; requesting a destination from the user when the facial features of the user in the second image do not match the profile; presenting an anticipated destination from the profile when the facial features of the user in the second image match the profile stored in the profile unit.
Another exemplary embodiment is a computer program product, tangibly embodied on a non-transitory computer readable medium, for operating a conveyance system, the computer program product including instructions that, when executed by a computer, cause the computer to perform operations including: generating an image of an area of interest; detecting facial features of a user in the image; determining if the facial features match a profile; scheduling conveyance service in response to the facial features matching the profile; generating a second image of the user at a system interface; detecting facial features of the user in the second image; determining if the facial features of the user in the second image match the profile stored; requesting a destination from the user when the facial features of the user in the second image do not match the profile; and presenting an anticipated destination from the profile when the facial features of the user in the second image match the profile stored in the profile unit.
These and other advantages and features will become more apparent from the following description taken in conjunction with the drawings.
The detailed description explains embodiments of the invention, together with advantages and features, by way of example with reference to the drawings.
As described herein, the dispatch system 16 obtains an anticipated destination for a user based on facial recognition. Dispatch system 16 includes a facial recognition unit 18 and a profile storage unit 20. Facial recognition unit 18 may be implemented by software executing on dispatch system 16. Profile storage unit 20 may be implemented by a database stored in memory on dispatch system 16. Operation of the facial recognition unit 18 and the profile storage unit 20 are described in further detail herein. While the dispatch system 16 is shown including the facial recognition unit 18 and the profile storage unit 20, one or both of these units, or the functions provided by these units, may be implemented by one or more system(s) (e.g., remote server, cloud computing system) remotely located from dispatch system 16.
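By way of illustration, the relationship between dispatch system 16, facial recognition unit 18, and profile storage unit 20 might be sketched as follows. The class and field names are illustrative assumptions only; the specification does not prescribe any particular software structure, and a real facial recognition unit would perform fuzzy matching rather than the exact lookup used here for brevity.

```python
from dataclasses import dataclass, field

@dataclass
class Profile:
    """Illustrative user profile record (field names are assumptions)."""
    features: tuple                                   # facial-feature vector indexing the profile
    destinations: dict = field(default_factory=dict)  # e.g. (day, hour) -> floor

class ProfileStorageUnit:
    """Stands in for profile storage unit 20: profiles keyed by features."""
    def __init__(self):
        self._profiles = {}

    def store(self, profile):
        self._profiles[profile.features] = profile

    def lookup(self, features):
        return self._profiles.get(features)

class DispatchSystem:
    """Stands in for dispatch system 16, pairing recognition with storage."""
    def __init__(self, storage):
        self.storage = storage

    def match(self, features):
        # A real facial recognition unit 18 would do approximate matching;
        # exact lookup keeps this sketch simple.
        return self.storage.lookup(features)
```

As noted above, either unit could equally run on a remote server or cloud system rather than on dispatch system 16 itself.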
A plurality of cameras 22 are directed to an area adjacent the elevator cars 12, such as a building lobby or along an access route to the elevators. Cameras 22 may be dispersed at various locations so as to acquire images from multiple viewpoints (i.e., simultaneous views of the user that provide more detection opportunities). They may also be positioned at different locations so as to acquire images from multiple positions with respect to the elevators, providing motion estimation of a particular user. Providing images of users from multiple viewpoints simplifies the facial recognition, as it is more likely that a view corresponding to an existing feature profile of each user will be acquired. This allows cameras 22 to be of lower resolution and lower cost.
A system interface 30 includes a system interface camera 32 for acquiring images of users positioned at the system interface 30. System interface 30 may be a kiosk (e.g., in the building lobby) or a wall mounted unit (e.g., at a floor landing). System interface 30 may be implemented using a microprocessor based device (e.g., computer, server) executing a computer program stored in a memory to perform the functions described herein. Alternatively, the system interface 30 may be implemented in hardware (e.g., ASIC) or in a combination of hardware and software. An input/output unit 34 is used to present information to users and receive commands from users. Input/output unit 34 may be implemented using a touchscreen, a display with peripherals (e.g., buttons, mouse, microphone, speaker), or other known input/output devices.
The processing at 102 can also detect direction of travel of a user, based on the viewpoints of cameras 22. User movement may be tracked in the area of interest to determine if a user is heading towards elevators 12 or heading away from elevators 12. Detection of facial features may be limited to users approaching the cameras 22 based on direction of travel.
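The direction-of-travel test can be reduced to a distance trend over the tracked positions of a user. The sketch below is an assumed simplification: a real system would fuse the viewpoints of cameras 22 with proper multi-target tracking, whereas this checks only whether successive positions move closer to the elevators.

```python
import math

def heading_toward(positions, elevator_pos, min_samples=2):
    """Return True when successive (x, y) positions of a tracked user
    trend closer to elevator_pos.

    A simple distance-trend test; assumed stand-in for real multi-camera
    motion estimation. Facial detection can then be limited to users for
    which this returns True.
    """
    distances = [math.dist(p, elevator_pos) for p in positions]
    return len(distances) >= min_samples and distances[-1] < distances[0]
```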
At 104, elevator service is scheduled for any users heading towards the elevators 12 and having an already existing profile in profile storage 20. User profiles in profile unit 20 may be indexed by facial features generated by facial recognition unit 18.
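Indexing profiles by facial features suggests a nearest-neighbor match over feature vectors. The Euclidean metric and the threshold below are assumptions; the specification states only that profiles in profile unit 20 may be indexed by features generated by facial recognition unit 18.

```python
def match_profile(features, profiles, threshold=0.6):
    """Return the profile whose stored feature vector is nearest to
    `features`, provided the Euclidean distance falls under `threshold`;
    otherwise None (no existing profile matched).

    `profiles` maps feature tuples to profile records. Metric and
    threshold are illustrative assumptions.
    """
    best, best_distance = None, threshold
    for stored_features, profile in profiles.items():
        distance = sum((a - b) ** 2
                       for a, b in zip(features, stored_features)) ** 0.5
        if distance < best_distance:
            best, best_distance = profile, distance
    return best
```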
At 106, the user arrives at the system interface 30. System interface camera 32 acquires a second image of the user and facial recognition is used to recognize the user. System interface 30 may be equipped with a facial recognition unit, or the second image from system interface camera 32 may be routed to the dispatch system 16 for facial recognition.
At 108, the facial features of the user at the system interface 30 are compared to facial features in profile storage unit 20 to identify the user and associate the user with a profile. If the user is not identified at 108, flow proceeds to 105 where a probable destination is determined by dispatch system 16. The probable destination may be based on time of day, location of the user, historical elevator usage data, events scheduled in the building for that day/time, etc. At 107, the probable destination is presented to the user through the system interface 30. For example, system interface 30 may present a prompt with the probable destination (e.g., “Are you heading to the seminar on floor 30?”). At 109, the user can override the probable destination and enter a different destination. If no override is received within a certain period of time (e.g., 3 seconds) or if the user expressly accepts the destination through system interface 30, flow proceeds to 112.
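One way to combine those factors into a probable destination is sketched below. Checking scheduled events first and then falling back on the historically most popular floor for that hour and location is an assumed priority order; the specification lists only the factors that may be considered, not how they are weighed.

```python
def probable_destination(hour, day, location, usage_log, events):
    """Estimate a destination floor for an unrecognized user (step 105).

    `events` entries hold "day", "start", "end", "floor"; `usage_log`
    entries hold "hour", "location", "floor". Both schemas are
    illustrative assumptions.
    """
    # A scheduled event at this day/time is a strong hint
    # (e.g. the "seminar on floor 30" prompt above).
    for event in events:
        if event["day"] == day and event["start"] <= hour < event["end"]:
            return event["floor"]
    # Otherwise fall back on the floor most requested at this
    # hour from this location in the historical usage data.
    floors = [entry["floor"] for entry in usage_log
              if entry["hour"] == hour and entry["location"] == location]
    if floors:
        return max(set(floors), key=floors.count)
    return None
```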
If the user overrides the probable destination at 109, flow proceeds to 110 where the system interface 30 prompts the user for a destination. The user enters a destination through the input/output unit 34. The destination may be a specific floor or an indication of up or down. At 112, from either the negative branch of 109 or from 110, the system interface 30 prompts the user to register the destination. If the user selects yes, then at 114 a profile is created in profile storage unit 20 for the user, including the user's facial features, current location, day of week, time of day, and destination floor, and flow proceeds to 116. At 112, if the user declines to register the destination, flow proceeds directly to 116. In another embodiment, the user may be directed to building security to create a user profile.
At 116, an elevator call is created based on the destination entered at 110. The elevator call is an actual command for the elevator controller 14 to provide a car from one floor to another (in the event the destination specifies a floor) or to provide a car for travel in a certain direction (in the event the destination specifies up or down). At 118, the user is directed to the appropriate elevator car 12 through the input/output unit 34 (e.g., please proceed to car A).
If at 108 the user is identified, flow proceeds to 120 where the user profile is accessed from profile storage unit 20. At 122, the anticipated destination is determined based on one or more of the user's current location, day of week, and time of day, and the anticipated destination is presented to the user on the input/output unit 34.
At 124, if the user does not override the anticipated destination within a certain period of time (e.g., 3 seconds) or if the user expressly accepts the destination through system interface 30, flow proceeds to 116 where an elevator call is created based on the anticipated destination in the user profile. At 118, the user is directed to the appropriate elevator car 12 through the input/output unit 34 (e.g., please proceed to car A).
If at 124, the user elects to override the anticipated destination, flow proceeds to 110, where the user is prompted for a destination. Flow proceeds as described above, with the user provided an option to register the destination at 112. If a user with an existing profile registers a destination, their profile is updated with the new destination at 114.
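The decision logic at the system interface (steps 108 through 124) condenses into a small routine. The decomposition below is illustrative only: `ask_override` and `ask_destination` are injected callables standing in for interaction through input/output unit 34, which keeps the flow testable without real hardware.

```python
def interface_flow(matched_profile, anticipated, ask_override, ask_destination):
    """Condensed interface decision flow (an assumed decomposition).

    If a profile matched and an anticipated destination exists, present
    it; unless the user overrides within the timeout (ask_override
    returning True), the anticipated destination stands and the call is
    created on it. Otherwise the user is prompted for a manual
    destination, as at step 110.
    """
    if matched_profile is not None and anticipated is not None:
        if not ask_override(anticipated):   # accepted, or timeout expired
            return anticipated              # elevator call on anticipated floor
    return ask_destination()                # manual entry via input/output unit 34
```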
The embodiments described above relate to a lobby, but similar systems may be employed at each landing. One or more cameras 22 may be installed at each landing and positioned to capture users approaching the elevator door(s). Each landing includes a system interface 30, which may be in the form of a wall mounted device rather than a kiosk. Processing similar to that disclosed above may then be performed at each landing.
The above embodiments refer to a user specifying that a destination be stored in their profile. Dispatch system 16 may also learn user patterns and update the user profile automatically. For example, if every Friday a user travels to the lobby at lunchtime rather than the cafeteria floor, dispatch system 16 can learn this behavior and update the user profile accordingly. Of course, the user would be provided the option to override the anticipated destination as described above. When the user overrides an anticipated destination, the system may provide a list of recent manual destination requests based on travel of that user and/or the system may present a list of popular destination floors in the building.
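Such pattern learning might amount to counting recurring trips per time slot, as sketched below. The repetition threshold and the (day, hour) slot granularity are assumptions; the specification says only that patterns may be learned and profiles updated.

```python
from collections import Counter

def learn_patterns(trips, min_count=3):
    """Infer recurring (day, hour) -> floor habits from a trip history.

    `trips` entries hold "day", "hour", "floor" (an assumed schema).
    A habit is stored once a slot/floor pair recurs `min_count` times,
    keeping the most frequent floor per slot.
    """
    counts = Counter((t["day"], t["hour"], t["floor"]) for t in trips)
    best = {}  # (day, hour) -> (floor, count)
    for (day, hour, floor), n in counts.items():
        slot = (day, hour)
        if n >= min_count and n > best.get(slot, (None, 0))[1]:
            best[slot] = (floor, n)
    return {slot: floor for slot, (floor, _) in best.items()}
```

For the Friday-lunchtime example above, three or more recorded Friday noon trips to the lobby would yield a `("Fri", 12) -> lobby` habit that step 122 could then present as the anticipated destination.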
In certain applications, it may be desirable to erase profiles to reduce storage demand on profile storage unit 20 and reduce the number of profiles that need to be searched when attempting to match facial features to a profile. In a hotel, for example, profiles more than two weeks old, measured from a creation date, can be deleted, as there is a low likelihood that a guest at the hotel will remain longer than two weeks. Profiles may also be deleted a time period (e.g., 24 hours) after a user checks out of a hotel. Further, profiles that have not been matched to a user for a predetermined period of time (e.g., a month) may be deleted, as this indicates the user is no longer visiting the building.
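Those deletion policies can be expressed as independent age and idleness tests, as in the sketch below. The `created` / `last_match` field names are assumptions; the example windows mirror the two-week and one-month periods given above.

```python
from datetime import datetime, timedelta

def prune_profiles(profiles, now, max_age=None, max_idle=None):
    """Drop profiles older than `max_age` (e.g. two weeks in a hotel) or
    not matched to a user within `max_idle` (e.g. a month).

    Pass None to skip either test, since which policy applies depends on
    the application (hotel vs. office building).
    """
    kept = {}
    for uid, profile in profiles.items():
        if max_age is not None and now - profile["created"] > max_age:
            continue   # profile past its creation-age limit
        if max_idle is not None and now - profile["last_match"] > max_idle:
            continue   # user no longer visiting the building
        kept[uid] = profile
    return kept
```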
Embodiments described herein are directed to an elevator system dispatching elevator cars. Embodiments may also include other types of transportation (train, subway, monorail, etc.) and thus embodiments may be generally applied to conveyance systems which dispatch cars.
As described above, exemplary embodiments can be in the form of processor-implemented processes and devices for practicing those processes, such as dispatch system 16. The exemplary embodiments can also be in the form of computer program code containing instructions embodied in tangible media, such as floppy diskettes, CD ROMs, hard drives, or any other computer-readable storage medium, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes a device for practicing the exemplary embodiments. The exemplary embodiments can also be in the form of computer program code, for example, whether stored in a storage medium, loaded into and/or executed by a computer, or transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via electromagnetic radiation, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes a device for practicing the exemplary embodiments. When implemented on a general-purpose microprocessor, the computer program code segments configure the microprocessor to create specific logic circuits.
While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.
| Filing Document | Filing Date | Country | Kind |
|---|---|---|---|
| PCT/US2013/057800 | 9/3/2013 | WO | 00 |

| Publishing Document | Publishing Date | Country | Kind |
|---|---|---|---|
| WO2015/034459 | 3/12/2015 | WO | A |
International Search Report for application PCT/US2013/057800, dated Apr. 24, 2014, 5 pages.

Written Opinion for application PCT/US2013/057800, dated Apr. 24, 2014, 8 pages.

Chinese Office Action and Search Report for application 201380079344.X, dated Mar. 30, 2017, 6 pages.

European Search Report for application 13892938.5, dated May 2, 2017, 7 pages.
| Number | Date | Country |
|---|---|---|
| 20160214830 A1 | Jul 2016 | US |