The invention relates to a system and a method for monitoring a waiting area. The invention further relates to a computer program product comprising instructions for causing a processor system to perform said method.
Waiting areas are common in business, government and healthcare services. Typically, people sit or stand in such a waiting area until an event takes place for which they are waiting. The event may be a scheduled event, such as a doctor's appointment, a radiology examination, a passport renewal appointment, etc. Typically, the waiting area is constituted by a waiting room or a section of a room, and is typically provided with chairs. However, other forms of waiting areas are equally conceivable, e.g., outdoor waiting areas.
It may be desirable to enable a user, such as a business, government or healthcare worker, to monitor the waiting area. It is known to use a closed circuit television (CCTV) system for that purpose, which enables the user to monitor the waiting area by observing the waiting area on a television or other display. Such a CCTV system enables remote monitoring of the waiting area, e.g., from another room or another section of a waiting room. Accordingly, despite being located elsewhere, the user can observe whether a scheduled person has arrived, how many persons are waiting, etc. Advantageously, the user can adapt a workflow, adjust a schedule, etc., to a situation in the waiting area.
It is known to augment a video stream, such as one obtained from a CCTV system, with information associated with a person shown in the video stream.
For example, US 2011/0153341 describes a patient identification system which comprises a data storage storing patient information including patient identifying information associated with one or more patient images. The system further comprises a processor adapted to facilitate identification of a patient, receive a camera feed including an image of a patient, perform facial recognition using the camera feed to identify the patient in comparison with information stored in the data storage, retrieve information associated with the identified patient from the data storage, and display the retrieved information in conjunction with the image of the identified patient on a computer screen.
It would be advantageous to obtain a system, method and/or computer program product which enables a user to better monitor a waiting area.
The following aspects of the invention enable the user to better monitor the waiting area by identifying a person in the waiting area, estimating his/her emotional state, and visually representing the identified person and the emotional state in an output image.
In a first aspect of the invention, a system is provided for monitoring a waiting area, the system comprising:
a database interface for accessing a database comprising identification data of one or more scheduled persons scheduled for an event;
an identification subsystem for i) receiving attribute data indicative of an attribute of a waiting person in the waiting area, and ii) matching the attribute to the identification data, thereby establishing an identified person;
an emotion determining subsystem for j) receiving physiological data indicative of a physiological parameter of the identified person, and jj) based on the physiological parameter, estimating an emotional state of the identified person; and
a display processor for visually representing the identified person and the emotional state in an output image.
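By way of illustration only, the four components of the above aspect may be sketched as Python interfaces as follows. The class and field names, and the use of NumPy arrays for image and sensor data, are hypothetical choices made for the sketch and are not prescribed by the invention.

```python
from dataclasses import dataclass
from typing import Optional, Protocol

import numpy as np


@dataclass
class ScheduledPerson:
    """Identification data for one scheduled person (hypothetical fields)."""
    person_id: str
    name: str
    photo: np.ndarray   # reference photograph from the database
    event_time: str     # time of the scheduled event


class IdentificationSubsystem(Protocol):
    def identify(self, attribute_data: np.ndarray,
                 scheduled: list[ScheduledPerson]) -> Optional[ScheduledPerson]:
        """Match an observed attribute against the identification data."""


class EmotionDeterminingSubsystem(Protocol):
    def estimate(self, physiological_data: np.ndarray) -> str:
        """Estimate an emotional state, e.g., 'calm' or 'anxious'."""


class DisplayProcessor(Protocol):
    def render(self, frame: np.ndarray, person: ScheduledPerson,
               emotional_state: str) -> np.ndarray:
        """Visually represent the identified person and the state in an output image."""
```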
In a further aspect of the invention, a method is provided for monitoring a waiting area, the method comprising:
accessing a database comprising identification data of one or more scheduled persons scheduled for an event;
receiving attribute data indicative of an attribute of a waiting person in the waiting area;
matching the attribute to the identification data, thereby establishing an identified person;
receiving physiological data indicative of a physiological parameter of the identified person;
based on the physiological parameter, estimating an emotional state of the identified person; and
visually representing the identified person and the emotional state in an output image.
In a further aspect of the invention, a computer program product is provided comprising instructions for causing a processor system to perform the method set forth.
The inventors have recognized that a waiting person may react emotionally to a scheduled event and/or the waiting itself. For example, if the scheduled event is a medical examination or scan, the waiting person may be nervous or anxious. Similarly, if the waiting person has been waiting for a prolonged period of time, the waiting person may be irritated. The inventors have further recognized that the emotional state of the waiting person is of relevance to the user, as it may affect the user's workflow, schedule, etc. For example, an anxious person may need to be calmed prior to starting the medical examination or scan. However, said emotional state is typically not easily observed by a user of, e.g., a CCTV system.
The aforementioned measures enable a user to monitor a waiting area. For that purpose, a database is accessed. The database comprises identification data which identifies one or more persons who have an appointment, i.e., are each scheduled for an event and are thus considered a scheduled person. For example, the database may comprise a name of the scheduled person, a photograph, and/or other identification data. A waiting person is identified in the waiting area. For that purpose, attribute data is obtained which allows the identification subsystem to determine an attribute of the waiting person. The attribute distinguishes the waiting person from, e.g., other persons waiting in the waiting area. The attribute may be, e.g., a biometric attribute of the person, a token-based attribute of a token carried by the person, etc. The attribute is associated with the waiting person, e.g., by being a physical feature of the waiting person, by being carried by the waiting person, etc.
The identification data is used to find a scheduled person who sufficiently matches the attribute. Hence, the waiting person is identified in that the waiting person in the waiting area is linked to the scheduled person identified by the identification data. In addition, physiological data is obtained which allows the emotion determining subsystem to determine a physiological parameter of the identified person, such as a body temperature, a pulse rate, a facial muscle position, etc. The physiological parameter is used to estimate an emotional state of the identified person. The emotional state is then visually represented in an output image together with a representation of the identified person.
The above measures have the effect that an output image is provided which, when viewed by the user, informs the user about the identified person in the waiting room and his/her emotional state. Consequently, the output image enables the user to monitor the waiting room, in that the waiting person is automatically identified and the result thereof is shown to the user. Moreover, the user simultaneously obtains feedback about the emotional state of the identified person. Advantageously, the user or other worker can react to the emotional state. For example, if the identified person is shown to be anxious, the identified person can be calmed down before the scheduled event. Advantageously, the user can better maintain the workflow and/or the schedule. Advantageously, it is less likely that a scheduled event is prolonged due to an unsuitable emotional state of the identified person.
Optionally, the system further comprises a video recording subsystem for obtaining a video image of the waiting area showing the identified person, and the display processor is arranged for including the video image in the output image. By providing the video image to the user, an image-based representation of the identified person is provided to the user. Advantageously, the video image enables the user to better monitor the waiting area.
Optionally, the display processor is arranged for visually representing the emotional state in an overlay in the video image. The video image thus shows the identified person with his/her emotional state being visualized as part of an overlay in the video image. Advantageously, the overlay is visually linked to the identified person in the video image to enable the user to intuitively associate the emotional state with the identified person.
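By way of example, such an overlay may be drawn as sketched below using the OpenCV library; the bounding box, name, and state label are assumed to be supplied by the identification and emotion determining subsystems, and the function name is hypothetical.

```python
import cv2
import numpy as np


def draw_emotion_overlay(frame: np.ndarray, box: tuple[int, int, int, int],
                         name: str, state: str) -> np.ndarray:
    """Draw a labelled box around the identified person in a video frame.

    `box` is (x, y, width, height) of the person in the frame; `name` and
    `state` are assumed to come from the identification and emotion
    determining subsystems, respectively.
    """
    out = frame.copy()
    x, y, w, h = box
    color = (0, 0, 255) if state == "anxious" else (0, 255, 0)  # BGR: red vs. green
    cv2.rectangle(out, (x, y), (x + w, y + h), color, 2)
    cv2.putText(out, f"{name}: {state}", (x, max(y - 8, 12)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, color, 2)
    return out
```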
Optionally, the video image constitutes the attribute data, and the identification subsystem is arranged for identifying the attribute of the waiting person based on an analysis of the video image. The attribute is thus identified using the video image, i.e., using the image-based representation of the waiting person in the video image, and in particular by analyzing the video image. Advantageously, there is no need for separate identification sensors in the waiting area. Rather, the video recording sensors of the video recording subsystem are used instead, e.g., an existing CCTV camera.
Optionally, the identification subsystem is arranged for using facial recognition to match a facial attribute of the waiting person to the identification data. Facial attributes are well suited for identifying the waiting person in the video image. Advantageously, the identification data comprises an image-based representation of the scheduled person, i.e., a photograph, thereby facilitating matching the facial attribute of the waiting person to the identification data.
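By way of example, such matching may be sketched using the open-source face_recognition package, assuming the database photographs have been converted to facial encodings beforehand; all names below are hypothetical.

```python
import face_recognition  # open-source wrapper around dlib's facial encodings
import numpy as np


def match_waiting_persons(frame: np.ndarray,
                          photo_encodings: dict[str, np.ndarray],
                          tolerance: float = 0.6) -> list[str]:
    """Match faces found in a video frame against stored photograph encodings.

    `photo_encodings` maps a scheduled person's id to the 128-dimensional
    encoding of his/her database photograph, computed once beforehand via
    face_recognition.face_encodings(). Returns the ids of scheduled persons
    whose encodings sufficiently match a face in the frame.
    """
    if not photo_encodings:
        return []
    ids = list(photo_encodings)
    known = [photo_encodings[i] for i in ids]
    matches = []
    for encoding in face_recognition.face_encodings(frame):  # one per detected face
        distances = face_recognition.face_distance(known, encoding)
        best = int(np.argmin(distances))
        if distances[best] <= tolerance:  # sufficiently close match
            matches.append(ids[best])
    return matches
```

The tolerance value of 0.6 is the package's customary default; a stricter value reduces false matches at the cost of missed identifications.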
Optionally, the video image constitutes the physiological data, and the emotion determining subsystem is arranged for obtaining the physiological parameter of the identified person based on an analysis of the video image. The physiological parameter is thus obtained from the video image, i.e., from the image-based representation of the identified person in the video image. Advantageously, there is no need for separate physiological sensors in the waiting area. Rather, the video recording sensors of the video recording subsystem are used instead, e.g., an existing CCTV camera.
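The invention does not prescribe how the physiological parameter is derived from the video image. One known technique is remote photoplethysmography, which estimates a pulse rate from subtle periodic variations in skin color. A minimal sketch, assuming a per-frame mean of the green channel over a facial region has already been extracted:

```python
import numpy as np


def estimate_pulse_bpm(green_means: np.ndarray, fps: float) -> float:
    """Estimate a pulse rate from the mean green-channel intensity of a facial
    region over time, a much-simplified form of remote photoplethysmography.

    `green_means` holds one sample per video frame and is assumed to span at
    least ~10 seconds of video recorded at `fps` frames per second.
    """
    signal = green_means - green_means.mean()           # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)   # frequency axis in Hz
    band = (freqs >= 0.7) & (freqs <= 4.0)              # plausible 42-240 bpm range
    peak = freqs[band][np.argmax(spectrum[band])]       # strongest periodicity
    return float(peak * 60.0)                           # Hz -> beats per minute
```

A practical implementation would additionally require face tracking, detrending, and motion compensation; the sketch only conveys the principle.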
Optionally, the database comprises further information associated with the identified person, and the display processor is arranged for visually representing the further information from the database in the output image. Advantageously, a more informative output image is obtained.
Optionally, the further information from the database comprises at least one of the group of: a name of the identified person, a time of a scheduled event of the identified person, a remaining time until the scheduled event, a subject matter of the scheduled event, a physical need of the identified person prior to the scheduled event, and a psychological need of the identified person prior to the scheduled event.
Optionally, the identification subsystem is arranged for determining a waiting time of the identified person in the waiting area, and the display processor is arranged for visually representing the waiting time in the output image.
Optionally, the emotion determining subsystem is arranged for estimating a change in the emotional state of the identified person, and the display processor is arranged for visually representing the change in the emotional state in the output image.
Optionally, the emotion determining subsystem is arranged for triggering an alert if the change in the emotional state exceeds a threshold.
Optionally, the database is indicative of a past physiological parameter of the identified person, and the emotion determining subsystem is arranged for estimating the change in the emotional state based on the past physiological parameter.
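A minimal sketch of such threshold-based alerting, using a pulse rate as an exemplary physiological parameter and an arbitrary threshold value:

```python
def emotional_change_alert(current_pulse: float, past_pulse: float,
                           threshold: float = 20.0) -> bool:
    """Return True if the pulse rate, serving here as a proxy for the
    emotional state, deviates from the past value, e.g., one recorded in the
    database during a previous visit, by more than `threshold` beats per
    minute. The threshold value is an arbitrary example."""
    return abs(current_pulse - past_pulse) > threshold
```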
Optionally, the system comprises a mobile display device for displaying the output image. By displaying the output image on a mobile display device, the user can easily recognize the identified person in the waiting room by consulting the output image on the device.
It will be appreciated by those skilled in the art that two or more of the above-mentioned embodiments, implementations, and/or aspects of the invention may be combined in any way deemed useful.
Modifications and variations of the method and/or the computer program product, which correspond to the described modifications and variations of the system, can be carried out by a person skilled in the art on the basis of the present description.
The invention is defined in the independent claims. Advantageous yet optional embodiments are defined in the dependent claims.
These and other aspects of the invention are apparent from and will be elucidated with reference to the embodiments described hereinafter.
It should be noted that items which have the same reference numbers in different Figures, have the same structural features and the same functions, or are the same signals. Where the function and/or structure of such an item has been explained, there is no necessity for repeated explanation thereof in the detailed description.
For that purpose, the identification subsystem 140 receives the identification data 022 from the database 020.
The system 100 further comprises an emotion determining subsystem 160 for receiving physiological data indicative of a physiological parameter of the identified person 012, and for estimating an emotional state of the identified person based on the physiological parameter.
The system 100 further comprises a display processor 180 for visually representing the identified person and the emotional state in an output image 182-188. For that purpose, the display processor 180 is shown to receive identification data 142 from the identification subsystem 140 and emotional state data 162 from the emotion determining subsystem 160.
An operation of the system 100 may be briefly explained as follows. The database interface 120 provides access to the database 020. The identification subsystem 140 identifies a waiting person 012 in the waiting area 010 by receiving attribute data indicative of the attribute of the waiting person in the waiting area and by matching the attribute to the identification data, thereby establishing the identified person. As a result, the waiting person is identified, i.e., the waiting person becomes an identified person 012. The emotion determining subsystem 160 receives physiological data indicative of a physiological parameter of the identified person 012, and based on the physiological parameter, estimates an emotional state of the identified person 012. The display processor 180 then visually represents the identified person and the emotional state in an output image 182-188.
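By way of illustration, this operation may be wired together per video frame along the following lines, reusing the hypothetical interfaces sketched earlier; the function is a sketch under those assumptions, not a prescribed implementation.

```python
def monitor_frame(frame, scheduled, identification, emotion, display):
    """One monitoring cycle over a single video frame (hypothetical wiring).

    `scheduled` is the list of ScheduledPerson records obtained via the
    database interface; the remaining arguments implement the subsystem
    interfaces sketched earlier. Here the video image doubles as both the
    attribute data and the physiological data.
    """
    output = frame.copy()
    identified = identification.identify(frame, scheduled)
    if identified is not None:
        state = emotion.estimate(frame)
        output = display.render(output, identified, state)
    return output
```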
The method 200 comprises, in a step titled “ACCESSING DATABASE”, accessing 210 a database, the database being indicative of one or more scheduled persons scheduled for an event. The method 200 further comprises, in a step titled “OBTAINING ATTRIBUTE OF WAITING PERSON”, receiving 220 attribute data indicative of an attribute of a waiting person in the waiting area. The method 200 further comprises, in a step titled “IDENTIFYING WAITING PERSON”, matching 230 the attribute to the identification data, thereby establishing an identified person. The method 200 further comprises, in a step titled “OBTAINING PHYSIOLOGICAL PARAMETER”, receiving 240 physiological data indicative of a physiological parameter of the identified person. The method 200 further comprises, in a step titled “ESTIMATING EMOTIONAL STATE”, based on the physiological parameter, estimating 250 an emotional state of the identified person. The method 200 further comprises, in a step titled “GENERATING OUTPUT IMAGE”, visually representing 260 the identified person and the emotional state in an output image. It will be appreciated that the steps of the method 200 may be performed in any suitable order. In particular, the steps involved in identifying the waiting person and estimating the emotional state may be performed in a different order, e.g., simultaneously or in a reverse order. For example, the emotional state of a waiting person in the waiting area may be estimated, and only thereafter the waiting person may be identified.
The user may thus learn from the output image 182 which identified persons are present in the waiting area 010 and what their emotional states are.
The identified persons are visually represented in the output image 182 by their name 300, i.e., in the form of a text-based representation of each identified person. In addition, the emotional state 320 of each identified person is graphically represented, i.e., as a smiley. However, the identified person and the emotional state may be visually represented in the output image in various ways, as will be further explained hereinafter.
It is noted that the output image 184 differs from the output image 182 in how the identified person and the emotional state are visually represented.
In general, the database 020 may comprise a name of the identified person, a time of a scheduled event of the identified person 012, a remaining time until the scheduled event, a subject matter of the scheduled event, a physical need of the identified person prior to the scheduled event, a psychological need of the identified person prior to the scheduled event, and/or various other information relevant to the user. The display processor 180 may be arranged for visually representing the information in the output image 186. As such, the user may learn that the identified person 302 may, e.g., have a physical disability and need transportation to the examination room, suffer from claustrophobia, have an allergy, etc.
It will be appreciated that the emotional state 320 may be visually represented in various ways. Hence, even though the above examples show the emotional state 320 as a smiley, other visual representations, e.g., text-based or color-based representations, are equally conceivable.
The above examples distinguish between a negative emotional state, such as nervousness and anxiety, and a positive emotional state, such as calmness. It will be appreciated that the emotion determining subsystem 160 may be arranged for estimating any relevant type of emotional state. In particular, the emotion determining subsystem 160 may be arranged for identifying emotional states which may be of relevance for the scheduled events. Moreover, the display processor 180 may be arranged for visually representing said emotional states in the output image, while omitting visually representing emotional states which are not of relevance for the scheduled event. Hence, the visual representation of emotional states in the output image may essentially serve to warn the user of those persons in the waiting area who need attention from the user or another worker prior to or during the scheduled event. The display processor 180 may thus omit visually representing in the output image the emotional states of persons who do not need attention from the user.
The above examples show the identification subsystem 140 identifying all waiting persons in the waiting area 010. It will be appreciated that the identification subsystem 140 may equally identify only one, or any other suitable number of, waiting persons.
In general, the identification subsystem 140 may be arranged for identifying the attribute from attribute data obtained from an identification sensor in the waiting area. The identification sensor may be, e.g., an image sensor of a video camera of the video recording subsystem 150. Similarly, the emotion determining subsystem 160 may be arranged for determining the physiological parameter of the identified person from physiological data obtained from a physiological sensor in the waiting area. Said physiological sensor may be the same sensor as used for identifying the attribute, e.g., the image sensor of the video camera. However, this is not a limitation, in that the identification sensor and the physiological sensor may be different. Moreover, either or both of the sensors may instead be located outside of the waiting area 010 while being arranged for sensing inside the waiting area 010. For example, if the identification subsystem 140 uses Bluetooth-based discovery of a mobile phone of the waiting person to identify the waiting person, the Bluetooth receiver, i.e., the identification sensor, may be located outside of the waiting area 010 while being able to receive Bluetooth signals from inside of the waiting area 010. The system 100 may comprise said sensor(s) or, alternatively, be connectable to said sensor(s).
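By way of example, such Bluetooth-based discovery may be sketched as follows using the bleak library, which scans for Bluetooth Low Energy devices; the mapping from device addresses to scheduled persons is an assumed database field, and all names are hypothetical. Note that, in practice, modern phones randomize their advertised addresses, so a cooperative application on the phone, as mentioned below, may be required.

```python
import asyncio

from bleak import BleakScanner  # cross-platform Bluetooth Low Energy scanner


async def discover_waiting_persons(known_devices: dict[str, str]) -> list[str]:
    """Identify waiting persons by the presence of their phones (sketch only).

    `known_devices` maps a Bluetooth address, assumed to have been registered
    for a scheduled person in the database, to that person's id. Returns the
    ids of persons whose devices were discovered near the waiting area.
    """
    devices = await BleakScanner.discover(timeout=5.0)
    return [known_devices[d.address] for d in devices if d.address in known_devices]


# Exemplary use:
# asyncio.run(discover_waiting_persons({"AA:BB:CC:DD:EE:FF": "patient-42"}))
```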
Moreover, in general, the emotion determining subsystem 160 may be arranged for estimating a change in the emotional state of the identified person 012, and the display processor 180 may be arranged for visually representing the change in the emotional state in the output image. The change in the emotional state may be a momentary change, e.g., a change occurring while the identified person is waiting in the waiting area 010. The change in the emotional state may also occur with respect to, e.g., a past visit of the identified person. For that purpose, the database 020 may be indicative of a past emotional state of the identified person, e.g., in the form of a past physiological parameter. The emotion determining subsystem 160 may be arranged for triggering an alert if the change in the emotional state exceeds a threshold. For example, an exclamation mark may be included next to the visual representation of the emotional state, or an audio alert may be triggered.
In general, the identification subsystem may be embodied in various ways. For example, the identification subsystem may use face recognition to match a waiting person to a database. For that purpose, the database may comprise photographs of scheduled persons. The photographs may be obtained from medical records of patients. The identification subsystem may also employ a tag-based identification technique in which a tag is provided to a person upon entry to the hospital. The tag may comprise information which allows identification of the person. The tag may be a passive tag or an active tag. The passive tag may be a visual tag, e.g., a card or a piece of paper comprising a machine-readable code, e.g., a waiting number or a QR code. The machine-readable code may be read from the video image which is obtained of the waiting area. The identification subsystem may also identify the waiting person using a personal device of the waiting person, e.g., by sensing a presence of his/her Smartphone, e.g., using Bluetooth discovery. Alternatively or additionally, the waiting person may also signal his/her presence by using an application on the Smartphone.
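By way of example, reading such a machine-readable code from the video image may be sketched with OpenCV's built-in QR code detector; the function name is hypothetical.

```python
import cv2
import numpy as np


def read_waiting_tags(frame: np.ndarray) -> list[str]:
    """Decode machine-readable tags (QR codes) visible in a video frame of the
    waiting area, e.g., printed waiting numbers handed out at the entrance."""
    detector = cv2.QRCodeDetector()
    ok, decoded, points, _ = detector.detectAndDecodeMulti(frame)
    return [text for text in decoded if text] if ok else []
```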
The emotion determining subsystem may be embodied in various ways. For example, as aforementioned, the physiological parameter may be obtained from a video image of the waiting area. Alternatively or additionally, the physiological parameter may be obtained from a personal monitor worn by the identified person. The personal monitor may be provided by a Smartphone of the person, e.g., in the form of an application running on the Smartphone which uses the Smartphone's sensors to measure the physiological parameter.
The video recording subsystem may comprise a video camera for obtaining a video stream of video images. The video stream may be a continuous video stream or an interval video stream. The video camera may be located in the waiting area. Additionally, video cameras may be provided in other areas, such as corridors, wards, patient rooms, etc.
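By way of illustration, an interval video stream may be obtained by sampling a continuous camera feed, e.g., as sketched below with OpenCV; the sampling rate is an arbitrary example.

```python
import cv2


def interval_frames(source=0, every_n=30):
    """Yield every `every_n`-th frame from a camera, turning a continuous
    video stream into an interval video stream (sampling rate is exemplary)."""
    capture = cv2.VideoCapture(source)
    index = 0
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            if index % every_n == 0:
                yield frame
            index += 1
    finally:
        capture.release()
```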
It will be appreciated that the present invention may be advantageously used in a healthcare setting. For example, many radiology control rooms are provided with a display showing a live video of a waiting area. This enables a technologist to see if patients have arrived yet, are waiting, etc. A radiological examination is a stressful and in some cases even frightening event for patients. As a consequence, the technologist may frequently deal with anxious or phobic patients. Such patients require more attention and may need more time for scanning. This in turn may affect the workflow of the technologist and the schedule for the following patients. For example, when a patient is very anxious and the technologist becomes aware of the anxiousness, the patient may be immediately offered some water or a chair so as to calm the patient. It is therefore desirable for the technologist to become aware of the emotional state of a patient. The present invention may be advantageously used for this purpose. Advantageously, additional useful information may be displayed to the technologist, such as important allergy information, claustrophobia, transport needs, etc. This may allow the technologist to speed up the patients' preparation for the examination.
It will be appreciated that the invention also applies to computer programs, particularly computer programs on or in a carrier, adapted to put the invention into practice. The program may be in the form of a source code, an object code, a code intermediate source and an object code such as in a partially compiled form, or in any other form suitable for use in the implementation of the method according to the invention. It will also be appreciated that such a program may have many different architectural designs. For example, a program code implementing the functionality of the method or system according to the invention may be sub-divided into one or more sub-routines. Many different ways of distributing the functionality among these sub-routines will be apparent to the skilled person. The sub-routines may be stored together in one executable file to form a self-contained program. Such an executable file may comprise computer-executable instructions, for example, processor instructions and/or interpreter instructions (e.g. Java interpreter instructions). Alternatively, one or more or all of the sub-routines may be stored in at least one external library file and linked with a main program either statically or dynamically, e.g. at run-time. The main program contains at least one call to at least one of the sub-routines. The sub-routines may also comprise function calls to each other. An embodiment relating to a computer program product comprises computer-executable instructions corresponding to each processing step of at least one of the methods set forth herein. These instructions may be sub-divided into sub-routines and/or stored in one or more files that may be linked statically or dynamically. Another embodiment relating to a computer program product comprises computer-executable instructions corresponding to each means of at least one of the systems and/or products set forth herein. These instructions may be sub-divided into sub-routines and/or stored in one or more files that may be linked statically or dynamically.
The carrier of a computer program may be any entity or device capable of carrying the program. For example, the carrier may include a storage medium, such as a ROM, for example, a CD ROM or a semiconductor ROM, or a magnetic recording medium, for example, a hard disk. Furthermore, the carrier may be a transmissible carrier such as an electric or optical signal, which may be conveyed via electric or optical cable or by radio or other means. When the program is embodied in such a signal, the carrier may be constituted by such a cable or other device or means. Alternatively, the carrier may be an integrated circuit in which the program is embedded, the integrated circuit being adapted to perform, or used in the performance of, the relevant method.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. Use of the verb “comprise” and its conjugations does not exclude the presence of elements or steps other than those stated in a claim. The article “a” or “an” preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
This application is the U.S. national stage of International Application No. PCT/IB2013/060787 (WO), filed Dec. 11, 2013, which claims the benefit of U.S. Provisional Application No. 61/739,782, filed in December 2012.