This application claims the benefit of Indian provisional application no. 201811009167 filed Mar. 13, 2018, which is incorporated herein by reference in its entirety.
The subject matter disclosed herein generally relates to elevator systems and, more particularly, to an augmented reality car operating panel.
Elevator systems typically include a car operating panel (COP) utilized for operation of the elevator car. The car operating panel receives elevator passenger inputs that can designate a desired floor, hold open or close the elevator car doors, sound an alarm, and the like. Car operating panels utilizing buttons and touch-based systems have a variety of issues that present themselves over time such as, for example, wear and tear of buttons, loss of touch sensitivity over time, the need for regular service and cleaning, cost ineffectiveness, a reduced passenger experience, power consumption, and hygiene issues, as physical touch is required.
According to one embodiment, an elevator system is provided. The elevator system includes an elevator car, a sensor, and a projector affixed to the elevator car, wherein the projector is operated by a controller and the controller is configured to receive an indication of a presence of a passenger in the elevator car. A car operating panel is projected, by the projector, in the elevator car, wherein the car operating panel comprises a virtual element. An activation of the virtual element from the passenger is sensed by the sensor, and an action is initiated based at least in part on sensing the activation of the virtual element.
In addition to one or more of the features described above, or as an alternative, further embodiments of the elevator system may include that the action comprises initiating an elevator command for the elevator car.
In addition to one or more of the features described above, or as an alternative, further embodiments of the elevator system may include that the action comprises adding, by the controller, a second virtual element to the car operating panel and projecting, by the projector, the second virtual element in the car operating panel.
In addition to one or more of the features described above, or as an alternative, further embodiments of the elevator system may include that the activation of the virtual element from the passenger comprises engaging, by the passenger, a region in the virtual element for a duration of time.
In addition to one or more of the features described above, or as an alternative, further embodiments of the elevator system may include that the activation of the virtual element from the passenger comprises engaging, by the passenger, a region in the virtual element utilizing a movement pattern.
In addition to one or more of the features described above, or as an alternative, further embodiments of the elevator system may include that the virtual element comprises a first color and responsive to sensing the activation of the virtual element from the passenger, the virtual element changes to a second color.
In addition to one or more of the features described above, or as an alternative, further embodiments of the elevator system may include that the controller is further configured to periodically detect, by the sensor, the presence of the passenger and, responsive to the passenger exiting the elevator car, initiate a power savings mode for the car operating panel.
In addition to one or more of the features described above, or as an alternative, further embodiments of the elevator system may include that the car operating panel is projected onto a surface of the elevator car.
In addition to one or more of the features described above, or as an alternative, further embodiments of the elevator system may include that the car operating panel comprises a first mode and a second mode and wherein the first mode comprises a minimal display of the virtual element and wherein the second mode comprises a descriptive display of the virtual element.
In addition to one or more of the features described above, or as an alternative, further embodiments of the elevator system may include that the initiating the action comprises projecting, by the projector, the second mode for the car operating panel.
In addition to one or more of the features described above, or as an alternative, further embodiments of the elevator system may include that the controller is further operable to determine a location of the passenger in the elevator car and the car operating panel is projected to a second location proximate to the location of the passenger.
In addition to one or more of the features described above, or as an alternative, further embodiments of the elevator system may include that the car operating panel further comprises a news feed.
In addition to one or more of the features described above, or as an alternative, further embodiments of the elevator system may include a microphone operated by the controller, wherein the controller is further configured to receive an audio command from the passenger and, responsive to the audio command from the passenger, initiate a second action for the elevator car.
According to one embodiment, a computer-implemented method for operating an elevator system is provided. The method includes receiving an indication of a presence of a passenger in an elevator car. A car operating panel is projected, by a projector, in the elevator car, wherein the car operating panel comprises a virtual element. An activation of the virtual element from the passenger is sensed by a sensor, and an action is initiated based at least in part on sensing the activation of the virtual element.
In addition to one or more of the features described above, or as an alternative, further embodiments of the method may include that initiating the action comprises initiating an elevator command for the elevator car.
In addition to one or more of the features described above, or as an alternative, further embodiments of the method may include that initiating the action comprises adding a second virtual element to the car operating panel and projecting, by the projector, the second virtual element in the car operating panel.
In addition to one or more of the features described above, or as an alternative, further embodiments of the method may include that the activation of the virtual element from the passenger comprises engaging, by the passenger, a region in the virtual element for a duration of time.
In addition to one or more of the features described above, or as an alternative, further embodiments of the method may include that the activation of the virtual element from the passenger comprises engaging, by the passenger, a region in the virtual element utilizing a movement pattern.
In addition to one or more of the features described above, or as an alternative, further embodiments of the method may include that the virtual element comprises a first color and responsive to sensing the activation of the virtual element from the passenger, the virtual element changes to a second color.
In addition to one or more of the features described above, or as an alternative, further embodiments of the method may include periodically detecting, by the sensor, the presence of the passenger and responsive to the passenger exiting the elevator car, initiating a power savings mode for the car operating panel.
The present disclosure is illustrated by way of example and not limited in the accompanying figures in which like reference numerals indicate similar elements.
As shown and described herein, various features of the disclosure will be presented. Various embodiments may have the same or similar features, and thus the same or similar features may be labeled with the same reference numeral but preceded by a different first number indicating the figure in which the feature is shown. Thus, for example, element “a” that is shown in FIG. X may be labeled “Xa” and a similar feature in FIG. Z may be labeled “Za.” Although similar reference numbers may be used in a generic sense, various embodiments will be described, and various features may include changes, alterations, modifications, etc., whether explicitly described or otherwise, as will be appreciated by those of skill in the art.
The roping 107 engages the machine 111, which is part of an overhead structure of the elevator system 101. The machine 111 is configured to control movement of the elevator car 103 and the counterweight 105. The position encoder 113 may be mounted on an upper sheave of a speed-governor system 119 and may be configured to provide position signals related to a position of the elevator car 103 within the elevator shaft 117. In other embodiments, the position encoder 113 may be directly mounted to a moving component of the machine 111, or may be located in other positions and/or configurations as known in the art.
The controller 115 is located, as shown, in a controller room 121 of the elevator shaft 117 and is configured to control the operation of the elevator system 101, and particularly the elevator car 103. For example, the controller 115 may provide drive signals to the machine 111 to control the acceleration, deceleration, leveling, stopping, etc. of the elevator car 103. The controller 115 may also be configured to receive position signals from the position encoder 113. When moving up or down within the elevator shaft 117 along guide rail 109, the elevator car 103 may stop at one or more landings 125 as controlled by the controller 115. Although shown in a controller room 121, those of skill in the art will appreciate that the controller 115 can be located and/or configured in other locations or positions within the elevator system 101.
The machine 111 may include a motor or similar driving mechanism. In accordance with embodiments of the disclosure, the machine 111 is configured to include an electrically driven motor. The power supply for the motor may be any power source, including a power grid, which, in combination with other components, supplies power to the motor.
Although shown and described with a roping system, elevator systems that employ other methods and mechanisms of moving an elevator car within an elevator shaft, such as hydraulic and/or ropeless elevators, may employ embodiments of the present disclosure.
Referring to
In exemplary embodiments, the processing system 200 includes a graphics processing unit 41. Graphics processing unit 41 is a specialized electronic circuit designed to manipulate and alter memory to accelerate the creation of images in a frame buffer intended for output to a display. In general, graphics processing unit 41 is very efficient at manipulating computer graphics and image processing and has a highly parallel structure that makes it more effective than general-purpose CPUs for algorithms where processing of large blocks of data is done in parallel. The processing system 200 described herein is merely exemplary and not intended to limit the application, uses, and/or technical scope of the present disclosure, which can be embodied in various forms known in the art.
Thus, as configured in
Turning now to an overview of technologies that are more specifically relevant to aspects of the disclosure, elevator car operating panels are typically installed on the inside of elevator cars to allow passengers to select a desired floor or select a number of options while inside the elevator car. The panels typically include electronic and mechanical buttons or switches that require a passenger to physically touch and manipulate the button or switch to activate a desired elevator command. Drawbacks of the physical touch-based system for elevator car operating panels include wear and tear of buttons, loss of touch sensitivity over time, the need for regular service and cleaning, cost ineffectiveness, a reduced passenger experience, power consumption, and hygiene issues.
Turning now to an overview of the aspects of the disclosure, one or more embodiments address the above-described shortcomings of the prior art by providing an elevator system that utilizes augmented reality for the functionality of a car operating panel utilizing three-dimensional (3D) holography. Aspects include a holographic projection device in electronic communication with an elevator controller to receive a passenger input in an elevator car. The holographic projection device can project the car operating panel onto a surface within the elevator car or it can project the COP into the air within the elevator car. A passenger can select a floor by inputting a command into the holographic COP which is then communicated to the elevator controller.
Turning now to a more detailed description of aspects of the present disclosure,
In one or more embodiments, the controller 302 can be implemented on the processing system 200 found in
In one or more embodiments, the controller 302 is operable to control the elevator car 304 within an elevator system. The elevator car 304 can be a part of a larger elevator bank that operates within a multi-story building with the controller 302 controlling the elevator car 304 along with multiple other elevator cars in the same building. In one or more embodiments, when a passenger enters the elevator car 304, the controller 302 can detect the presence of the passenger using any means such as, for example, a motion sensor 312 or the like. The presence of the passenger in the elevator car 304 can cause the projector 306 to project a car operating panel (COP) 314 in the elevator car 304. The COP 314 can be projected onto a surface within the elevator car 304 or the COP 314 can be projected in the air within the elevator car 304. The COP 314 can include one or more virtual elements within the projected COP. The one or more virtual elements can include virtual representations of elevator buttons, display screens, video, images, or other virtual media such as news feeds and the like.
In one or more embodiments, the virtual elements on the car operating panel 314 can be activated to engage the elevator car 304. For example, a passenger enters the elevator car and the projector 306 projects the car operating panel 314 in the elevator car 304 with virtual elevator buttons. The passenger can press the virtual elevator buttons, which causes the controller 302 to activate the elevator car 304 and deliver the passenger to a desired floor.
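The presence-triggered projection and button-press flow described above can be sketched as a small event-driven controller. This is an illustrative sketch only; the class and method names (`CarOperatingPanelController`, `on_passenger_detected`, `on_button_pressed`) are assumptions, not taken from the disclosure.

```python
# Illustrative sketch: a motion-sensor event projects the virtual COP,
# and a virtual button press registers a car call with the controller.
class CarOperatingPanelController:
    def __init__(self):
        self.projected = False  # whether the virtual COP is currently shown
        self.calls = []         # registered floor requests

    def on_passenger_detected(self):
        """Motion sensor event: project the virtual car operating panel."""
        self.projected = True

    def on_button_pressed(self, floor: int):
        """Virtual button activation: register a car call for the floor."""
        if self.projected and floor not in self.calls:
            self.calls.append(floor)
```

A press is ignored unless the panel is projected, mirroring the idea that the COP only appears once a passenger is detected in the car.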
In one or more embodiments, when a virtual element 402 is activated by a passenger, it can change color, size, shape, or shading to indicate that the selection has been received or acknowledged by the car operating panel 314. In some cases, a passenger's hand might be at more than one location. In the illustrated example, the passenger's index finger is at the first location 406 and the passenger's thumb is at a second location 408, which corresponds to the ‘9’ button (e.g., a virtual element). The camera 310 captures media of the passenger's hand and transmits the media to the controller 302. The controller 302, utilizing logic, can determine a confidence interval for the first location 406 and the second location 408 to determine which location (region in the virtual element) is being selected by the passenger. The location with the highest confidence interval can be chosen as the virtual element being selected by the passenger. In the illustrative example, the ‘6’ button is being selected. In one or more embodiments, if the confidence interval is below a confidence threshold, the car operating panel 314 can alert the passenger to adjust their selection by moving their hand or finger to a region more indicative of their selection. The alert could be in the form of a text message on the car operating panel 314, or the projection of the car operating panel 314 can be augmented in size, shape, color, or shading to alert the passenger to make another selection. While the illustrated example utilizes the camera 310 to detect a selection or activation of the virtual elements 402 of the car operating panel 314, any type of sensor 312 can be utilized to detect a passenger's selection of a virtual element 402.
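The confidence-based disambiguation described above can be sketched as follows: among candidate hand locations, pick the button with the highest confidence, and fall back to an alert when no candidate clears a threshold. The names (`Candidate`, `resolve_selection`) and the threshold value are illustrative assumptions, not part of the disclosure.

```python
# Minimal sketch of confidence-based selection among candidate locations.
from dataclasses import dataclass
from typing import List, Optional

CONFIDENCE_THRESHOLD = 0.75  # assumed tuning value

@dataclass
class Candidate:
    button: str        # label of the virtual element, e.g. '6' or '9'
    confidence: float  # confidence score from image analysis

def resolve_selection(candidates: List[Candidate]) -> Optional[str]:
    """Return the button with the highest confidence, or None to signal
    that the passenger should be alerted to adjust their selection."""
    if not candidates:
        return None
    best = max(candidates, key=lambda c: c.confidence)
    if best.confidence < CONFIDENCE_THRESHOLD:
        return None  # below threshold: prompt the passenger to re-select
    return best.button
```

A `None` result would correspond to the alert behavior described above (a text message or an augmented projection prompting another selection).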
In one or more embodiments, the car operating panel 314 can operate in several modes, such as a first mode that displays basic virtual elements 402 such as numbering or one- or two-word buttons as illustrated in
In one or more embodiments, the car operating panel 314 can operate in a third mode or power savings mode. The power savings mode can turn off the projection of the car operating panel 314 or dim the lighting of the projected car operating panel 314. A car operating panel 314 can enter power savings mode when the elevator car 304 has no passengers or is in an idle mode and is not moving.
In one or more embodiments, to activate a virtual element 402, the passenger can maneuver his or her hand to a region at or near the desired virtual element. When the passenger's hand remains in the region at the virtual element for a duration of time, the virtual element is activated and the controller 302 can receive a signal indicating to engage the elevator car 304. In one or more embodiments, the passenger's hand or finger can utilize a movement pattern to activate the virtual element 402. For example, the car operating panel 314 can display multiple virtual elements 402, and a passenger can move his or her finger through a virtual element as if pushing a physical representation of it. This finger movement through the virtual element (e.g., a movement pattern) can be recognized by the controller 302 to indicate activation of the virtual element.
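The dwell-time activation described above can be sketched as a small detector that fires only after the hand has stayed in one region long enough. The class name, dwell value, and update interface are illustrative assumptions.

```python
# Sketch of dwell-time activation: a virtual element fires only after the
# passenger's hand has remained in its region for a minimum duration.
DWELL_SECONDS = 0.8  # assumed hold time

class DwellDetector:
    def __init__(self, dwell_seconds: float = DWELL_SECONDS):
        self.dwell = dwell_seconds
        self.region = None  # region currently under the hand
        self.since = None   # timestamp when the hand entered that region

    def update(self, region, timestamp: float):
        """Feed the region under the passenger's hand (or None).
        Returns the region once it has been held long enough, else None."""
        if region != self.region:
            self.region = region
            self.since = timestamp
            return None
        if region is not None and timestamp - self.since >= self.dwell:
            activated = region
            self.region = None  # reset so one hold fires only once
            return activated
        return None
```

Each activation corresponds to the controller receiving a signal to engage the elevator car; a movement-pattern detector could be layered on top by tracking the sequence of regions over time.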
In one or more embodiments, the car operating panel 314 can include a doodle for the associated theme of the car operating panel 314. The doodle can change periodically (e.g., hourly, daily, etc.) to resemble a specialty of a particular day. The doodle can include a search feature that allows passengers to search for tenants in the building or other searchable features for the building or region.
In one or more embodiments, the voice sensor 308 can be utilized to assist passengers that are unable to utilize the car operating panel 314. For example, a blind passenger will not be able to see the projection and can utilize voice commands to activate the elevator commands. In one or more embodiments, the projector 306 can project the car operating panel 314 onto a specific surface in the elevator car 304. The surface of the elevator car 304 can have Braille engraved at regions that correspond to virtual elements of the car operating panel 314. This can assist blind passengers with activating the virtual elements to indicate a desired floor. In one or more embodiments, the camera 310 can capture hand movements and gestures from passengers that may need assistance. Image processing techniques can be utilized for gesture recognition for passengers.
In one or more embodiments, the elevator car 304 can have a near field communication (NFC) transceiver affixed to it to communicate with the user device 324 that has a corresponding NFC transceiver. The user device 324 can transmit signals to the controller 302 to indicate the passenger can access restricted floors and the projection of the car operating panel 314 can display virtual elements corresponding to the restricted floors for the passenger.
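The restricted-floor behavior described above can be sketched as a filter: the projected panel shows public floors plus any restricted floors the passenger's NFC credential authorizes. The function name and data shapes are illustrative assumptions.

```python
# Sketch: only project virtual elements for floors the passenger may access.
def visible_floors(all_floors, restricted, authorized):
    """Public floors plus restricted floors unlocked by the credential."""
    return [f for f in all_floors if f not in restricted or f in authorized]
```

For example, a credential carrying access to floor 4 would add the floor-4 virtual element to an otherwise public panel.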
In one or more embodiments, the database 322 can store new themes and software updates that can be transmitted to the controller 302 to update the car operating panel 314 in the elevator car 304.
In one or more embodiments, the camera 310 utilizing image recognition techniques can determine physical features of a passenger to project the car operating panel 314 at a location that is convenient for the passenger. For example, passengers that are shorter might require the car operating panel 314 to be projected at a lower location in the elevator car 304 for ease of use.
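The height-adaptive placement described above can be sketched as a clamp: project the panel at a fraction of the passenger's estimated height, bounded to a practical range on the car wall. The scaling factor and bounds are illustrative assumptions, not values from the disclosure.

```python
# Sketch: choose a projection height from an estimated passenger height,
# so shorter passengers get a lower, more convenient panel placement.
def projection_height_m(passenger_height_m: float,
                        min_h: float = 0.9, max_h: float = 1.4) -> float:
    """Clamp a fraction of the passenger's height into a safe panel range."""
    target = 0.75 * passenger_height_m  # assumed comfortable reach fraction
    return max(min_h, min(max_h, target))
```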
Additional processes may also be included. It should be understood that the processes depicted in
A detailed description of one or more embodiments of the disclosed apparatus and method are presented herein by way of exemplification and not limitation with reference to the Figures.
The term “about” is intended to include the degree of error associated with measurement of the particular quantity based upon the equipment available at the time of filing the application.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present disclosure. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, element components, and/or groups thereof.
While the present disclosure has been described with reference to an exemplary embodiment or embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the present disclosure. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the present disclosure without departing from the essential scope thereof. Therefore, it is intended that the present disclosure not be limited to the particular embodiment disclosed as the best mode contemplated for carrying out this present disclosure, but that the present disclosure will include all embodiments falling within the scope of the claims.
Number | Name | Date | Kind |
---|---|---|---|
4969700 | Haines | Nov 1990 | A |
6031519 | O'Brien | Feb 2000 | A |
6161654 | Sirigu | Dec 2000 | A |
7054045 | Mcpheters et al. | May 2006 | B2 |
7881901 | Fein et al. | Feb 2011 | B2 |
8089456 | Grego et al. | Jan 2012 | B2 |
8127251 | Fein et al. | Feb 2012 | B2 |
8212768 | Fein et al. | Jul 2012 | B2 |
8477098 | Fein et al. | Jul 2013 | B2 |
8500284 | Rotschild et al. | Aug 2013 | B2 |
8514194 | Lawrence et al. | Aug 2013 | B2 |
8547327 | Clarkson et al. | Oct 2013 | B2 |
9268146 | Iwasawa et al. | Feb 2016 | B2 |
9773345 | Stirbu et al. | Sep 2017 | B2 |
10732721 | Clements | Aug 2020 | B1 |
20050277467 | Karabin et al. | Dec 2005 | A1 |
20100253700 | Bergeron | Oct 2010 | A1 |
20130220740 | Yoo | Aug 2013 | A1 |
20150266700 | Salmikuukka | Sep 2015 | A1 |
20160147308 | Gelman et al. | May 2016 | A1 |
20160200547 | Nagata | Jul 2016 | A1 |
20160306817 | Heilig et al. | Oct 2016 | A1 |
20170313546 | King | Nov 2017 | A1 |
20180086595 | Guillot | Mar 2018 | A1 |
20180111792 | Schach | Apr 2018 | A1 |
20190066681 | Hsu | Feb 2019 | A1 |
20190284020 | Gireddy | Sep 2019 | A1 |
20210206598 | Roth | Jul 2021 | A1 |
20210214185 | Hiltunen | Jul 2021 | A1 |
Number | Date | Country |
---|---|---|
103373650 | Oct 2013 | CN |
204400366 | Jun 2015 | CN |
204416809 | Jun 2015 | CN |
204802795 | Nov 2015 | CN |
105197702 | Dec 2015 | CN |
105967019 | Sep 2016 | CN |
205634507 | Oct 2016 | CN |
205953247 | Feb 2017 | CN |
2008013299 | Jan 2008 | JP |
2010143683 | Jul 2010 | JP |
20070099783 | Oct 2007 | KR |
05113399 | Dec 2005 | WO |
Entry |
---|
EP Search Report; dated Sep. 24, 2019; Application No. 19162375.0; Filed: Mar. 12, 2019; 6 pages. |
IN Office Action; dated Mar. 25, 2021; Application No. 201811009167; Filed: Mar. 1, 2019; 4 pages. |
European Examination Report for Application No. 19162375.0; dated Feb. 22, 2022; 5 Pages. |
Number | Date | Country | |
---|---|---|---|
20190284020 A1 | Sep 2019 | US |