This application claims the priority benefit of Taiwan application serial no. 110104980, filed on Feb. 9, 2021. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.
The disclosure relates to a health caring system and health caring method.
As the aging population grows, an increasing number of elderly people need to receive care services. Currently, there are many health caring systems on the market that can monitor the health condition of users. Most of these systems require the user to wear a wearable device so that a sensor on the device can sense the user's physiological state. However, the discomfort caused by the wearable device often makes the user refuse to put it on. Accordingly, practitioners in the related field are making efforts to find a method for monitoring the user's state without using a wearable device.
The disclosure provides a health caring system and a health caring method that can monitor the status of persons in a target space.
In the disclosure, a health caring system is adaptable for monitoring the state of a person in a target space. The health caring system includes a processor, a storage medium, a transceiver and an image capturing device. The image capturing device captures image data of the target space, where the image data includes time information. The storage medium stores the space division configuration corresponding to the target space. The processor is coupled to the storage medium, the transceiver, and the image capturing device, and is configured to: obtain a posture of a person according to the image data; determine a space division where the person is located according to the image data and the space division configuration; determine a behaviour of the person according to the posture, the space division, and the time information; determine that an event has occurred according to the behaviour, the space division, and the time information; and output an alarm message corresponding to the event through the transceiver.
In an embodiment of the disclosure, the processor creates a virtual identification code corresponding to the person based on the image data, and determines behaviour based on the virtual identification code.
In an embodiment of the disclosure, the processor determines the time period during which the person leaves the space division based on the image data, the space division, and the time information, and determines that the event has occurred in response to the time period being greater than the time threshold.
In an embodiment of the disclosure, the processor determines the time period during which the person performs a behaviour based on the time information, and determines that the event has occurred based on the time period.
In an embodiment of the disclosure, the processor determines that the image data is usable in response to the brightness of the image data being greater than the brightness threshold, and determines that the event has occurred based on the image data in response to the image data being usable.
In an embodiment of the disclosure, the image data includes a first image corresponding to a first time point and a second image corresponding to a second time point, wherein the processor determines that the image data is usable according to the similarity between the first image and the second image, and determines that the event has occurred according to the image data in response to the image data being usable.
In an embodiment of the disclosure, the behaviour includes a first behaviour and a second behaviour, wherein the processor determines the proportion of the first behaviour and the second behaviour in the time period according to the behaviour and the time information, and determines that the event has occurred according to the proportion.
In an embodiment of the disclosure, the processor generates at least one of the following based on the virtual identification code, the behaviour, the space division, and the time information: spatial heatmap, temporal heatmap, trajectory map, action proportion chart, time record of entering space division and time record of leaving space division.
In an embodiment of the disclosure, the storage medium stores historical behaviours corresponding to the person, and the processor determines that the event has occurred based on the historical behaviours and the behaviour.
A health caring method of the disclosure is adaptable for monitoring the status of a person in a target space, including: obtaining the image data of the target space and the space division configuration corresponding to the target space, wherein the image data includes time information; obtaining a posture of a person according to the image data; determining a space division where the person is located according to the image data and the space division configuration; determining a behaviour of the person according to the posture, the space division, and the time information; determining that an event has occurred according to the behaviour, the space division, and the time information; and outputting an alarm message corresponding to the event.
Based on the above, the health caring system of the disclosure can determine the state of the person in the target space by analyzing the image data, without using a wearable device.
In order to make the content of the present disclosure more comprehensible, the following embodiments are provided as examples based on which the present disclosure can indeed be implemented. In addition, wherever possible, elements/components/steps with the same reference numbers in the drawings and embodiments represent the same or similar components.
The processor 110 is, for example, a central processing unit (CPU), or other programmable general-purpose or specific-purpose micro control unit (MCU), a microprocessor, a digital signal processor (DSP), a programmable controller, an application specific integrated circuit (ASIC), a graphics processing unit (GPU), an image signal processor (ISP), an image processing unit (IPU), an arithmetic logic unit (ALU), a complex programmable logic device (CPLD), a field programmable gate array (FPGA) or other similar components or a combination of the above components. The processor 110 may be coupled to the storage medium 120, the transceiver 130, and the image capturing device 140, and access and execute a plurality of modules and various applications stored in the storage medium 120, thereby realizing the functions of the health caring system.
The storage medium 120 is, for example, any type of fixed or removable random access memory (RAM), a read-only memory (ROM), a flash memory, a hard disk drive (HDD), a solid state drive (SSD) or similar components or a combination of the above components, and configured to store multiple modules or various applications that can be executed by the processor 110 to realize the functions of the health caring system.
The transceiver 130 transmits and receives signals in a wireless or wired manner. The transceiver 130 may also perform operations such as low-noise amplification, impedance matching, frequency mixing, up or down frequency conversion, filtering, amplification, and the like.
The image capturing device 140 can be configured to capture image data of the target space. The target space may be a space where the monitored person often stays. For example, the image capturing device 140 may be installed on the ceiling of the home or office of the person being monitored, so as to capture the image data corresponding to the target space (i.e., home or office). The image data may include images and time information corresponding to the images. In an embodiment, the image capturing device 140 can capture the image data of the target space through a fisheye lens.
In step S201, the processor 110 of the health caring system 100 may capture image data of the target space through the image capturing device 140, wherein the image data may include the image and time information corresponding to the image.
Referring to
In an embodiment, the processor 110 may determine whether the image data is usable according to the brightness of the image data. Specifically, the processor 110 may determine that the image data is usable in response to the brightness of the image data being greater than the brightness threshold, and may determine that the image data is not usable in response to the brightness of the image data being less than or equal to the brightness threshold. In this way, when the image data is not clear due to the low brightness, the processor 110 will not use the image data to determine the status of the person in the target space 40.
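The disclosure does not prescribe how the brightness check is implemented; the following is a minimal Python sketch, in which the threshold value, the function name, and the use of mean grayscale intensity are illustrative assumptions rather than part of the disclosure:

```python
import numpy as np

# Illustrative threshold on a 0-255 grayscale scale; the disclosure does not
# fix a concrete value.
BRIGHTNESS_THRESHOLD = 60

def image_is_usable(gray_frame: np.ndarray,
                    threshold: float = BRIGHTNESS_THRESHOLD) -> bool:
    """Treat a frame as usable only if its mean brightness exceeds the threshold."""
    return float(gray_frame.mean()) > threshold

# A dark frame is rejected; a well-lit frame is accepted.
dark = np.full((4, 4), 20, dtype=np.uint8)
bright = np.full((4, 4), 120, dtype=np.uint8)
```

In practice the frame would come from the image capturing device 140 and be converted to grayscale before the check.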
In an embodiment, the processor 110 may determine whether the image data is usable according to the similarity between different frames of the image data. Specifically, the image data may include a first image corresponding to a first time point and a second image corresponding to a second time point, wherein the first time point may be different from the second time point. The processor 110 may calculate the similarity between the first image and the second image. The disclosure provides no limitation to the method of calculating the similarity. After obtaining the similarity between the first image and the second image, the processor 110 may determine that the image data is usable in response to the similarity being greater than the similarity threshold, and may determine that the image data is not usable in response to the similarity being less than or equal to the similarity threshold. In this way, if the difference between the different image data is too large, the processor 110 will not use the image data to determine the status of the person in the target space 40.
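Since the disclosure expressly leaves the similarity calculation open, the sketch below uses one simple candidate measure, the normalized mean absolute pixel difference; the threshold value and function names are illustrative assumptions:

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.8  # illustrative value in [0, 1]

def frame_similarity(first: np.ndarray, second: np.ndarray) -> float:
    """Return 1.0 for identical frames, approaching 0.0 as they diverge."""
    diff = np.abs(first.astype(float) - second.astype(float))
    return 1.0 - float(diff.mean()) / 255.0

def frames_usable(first: np.ndarray, second: np.ndarray,
                  threshold: float = SIMILARITY_THRESHOLD) -> bool:
    """Usable only if the two frames do not differ too much."""
    return frame_similarity(first, second) > threshold
```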
In step S203, the processor 110 may create a virtual identification code for the person in the target space according to the image data. For example, if a person A and a person B are located in the target space 40, the processor 110 may create a corresponding virtual identification code A for the person A, and may create a corresponding virtual identification code B for the person B.
In step S204, the processor 110 can obtain the posture of the person according to the image data, and can determine the space division where the person is located according to the image data and the space division configuration, wherein the posture of the person is, for example, associated with the articulation point of the person.
Specifically, the storage medium 120 may prestore the space division configuration corresponding to the target space 40. The space division configuration can be adopted to divide the target space 40 into one or more regions.
The processor 110 may set the acquired posture or space division and related information to be associated with the virtual identification code. For example, information such as the acquired posture or space division is set to be associated with the virtual identification code A, thereby indicating that the posture or the space division corresponds to the person A.
Referring to
In step S206, the processor 110 may determine whether an event corresponding to the monitored person has occurred. If an event has occurred, go to step S207. If no event has occurred, return to step S201. Specifically, the processor 110 can determine whether an event corresponding to the monitored person has occurred based on information such as behaviour, space division, or time information.
In an embodiment, the processor 110 may determine the time period during which the person is away from the target space 40 or a space division based on the behaviour, the space division, or the time information. If the time period is greater than the time threshold, the processor 110 may determine that the event has occurred. For example, the processor 110 may determine, based on the behaviour, the space division, or the time information, that the monitored person left the target space 40 through the space division 44 representing the bathroom door more than 1 hour ago, which means that the person has been in the bathroom for more than 1 hour and might have passed out there. Therefore, the processor 110 can determine that an event of "person might have passed out in the bathroom" has occurred.
In an embodiment, the processor 110 may determine the time period during which a person performs a specific behaviour based on the time information, and determine that the event has occurred based on the time period. For example, the processor 110 may determine, based on the time information, that the person has performed the "lying" behaviour in the space division 41 representing the aisle for more than 5 minutes, which means that the person might have fallen down on the aisle and cannot get up on his own. Therefore, the processor 110 can determine that a "person has fallen" event has occurred.
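A minimal sketch of such duration-based event detection, assuming a time-stamped behaviour log; the log format, function name, and thresholds are illustrative, not part of the disclosure:

```python
def duration_event(behaviour_log: list[tuple[float, str]],
                   behaviour: str,
                   threshold_seconds: float) -> bool:
    """behaviour_log: (timestamp_seconds, behaviour_label) pairs sorted by time.

    Return True when the most recent uninterrupted run of `behaviour`
    lasts longer than the threshold.
    """
    run_start = None
    last_t = None
    for t, label in behaviour_log:
        if label == behaviour:
            if run_start is None:
                run_start = t  # a new run of the behaviour begins
            last_t = t
        else:
            run_start = None  # the run was interrupted
    return run_start is not None and (last_t - run_start) > threshold_seconds
```

For instance, "lying" samples spanning 6 minutes against a 5-minute threshold would trigger the "person has fallen" event, while a run interrupted by "walking" would not.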
In an embodiment, if the person performs multiple behaviours including a first behaviour and a second behaviour, the processor 110 may determine the proportions of the first behaviour and the second behaviour in a specific time period based on the multiple behaviours and the time information, and determine that the event has occurred according to the proportions. For example, if the person has performed various behaviours such as "walking" and "lying", the processor 110 may determine that the person often lies down and lacks exercise in response to the proportion of the "lying" behaviour being high relative to that of the "walking" behaviour. Based on this, the processor 110 can determine that an event of "person's activity status is different from normal status" has occurred.
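This proportion check can be sketched as follows, assuming behaviour labels sampled at regular intervals over the time period; the sampling scheme and the 0.7 ratio threshold are illustrative assumptions:

```python
from collections import Counter

def behaviour_proportions(samples: list[str]) -> dict[str, float]:
    """samples: behaviour labels sampled at regular intervals over a period."""
    counts = Counter(samples)
    total = len(samples)
    return {behaviour: n / total for behaviour, n in counts.items()}

def low_activity_event(samples: list[str],
                       lying_ratio_threshold: float = 0.7) -> bool:
    """Flag the event when the 'lying' proportion dominates the period."""
    return behaviour_proportions(samples).get("lying", 0.0) > lying_ratio_threshold
```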
In an embodiment, the storage medium 120 may prestore historical behaviours corresponding to the monitored person. The processor 110 can determine that the event has occurred according to the historical behaviours and the current behaviour. For example, the processor 110 can determine that the person's historical daily lying time is about 10 hours based on the person's historical behaviour, and can determine that the person's current daily lying time is about 12 hours based on the person's current behaviour. Accordingly, the processor 110 can determine that the person's lying time has increased. Therefore, the processor 110 can determine that an event of "decrease of person's activity" has occurred.
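One way to sketch this baseline comparison, with a hypothetical tolerance margin that the disclosure does not specify:

```python
def activity_decrease_event(historical_lying_hours: float,
                            current_lying_hours: float,
                            margin_hours: float = 1.0) -> bool:
    """Flag a 'decrease of person's activity' event when current daily lying
    time exceeds the historical baseline by more than the margin."""
    return current_lying_hours > historical_lying_hours + margin_hours
```

With the example from the text, a rise from about 10 to about 12 hours of daily lying would trigger the event, while minor day-to-day variation within the margin would not.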
In step S207, the processor 110 may output an alarm message corresponding to the event through the transceiver 130. For example, when the processor 110 determines that the monitored person in the target space 40 has fallen down, the processor 110 may send an alarm message to the family or caregiver of the person through the transceiver 130 to notify the family or caregiver to help the monitored person as soon as possible.
In an embodiment, the processor 110 may generate various charts based on virtual identification codes, behaviours, spatial divisions, or time information, wherein the various charts may include, but are not limited to, a spatial heatmap, a temporal heatmap, a trajectory map, an action proportion chart, time record of entering space division and time record of leaving space division. The processor 110 may output the generated chart through the transceiver 130. For example, the processor 110 may transmit the generated chart to the user's terminal device through the transceiver 130. The user can view the chart through the display of the terminal device.
The spatial heatmap can be used to determine the frequency with which the monitored person appears in different locations. For example, the user of the health caring system 100 can determine that the monitored person frequently appears in the space division 42 within a specific time period according to the spatial heatmap, thereby determining that the person often rests on the sofa.
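A spatial heatmap of this kind can be sketched as a grid of per-cell visit counts; the grid size and (row, column) position format are illustrative assumptions:

```python
import numpy as np

def spatial_heatmap(positions: list[tuple[int, int]],
                    shape: tuple[int, int] = (10, 10)) -> np.ndarray:
    """Accumulate per-cell visit counts from (row, col) position samples."""
    heat = np.zeros(shape, dtype=int)
    for row, col in positions:
        heat[row, col] += 1
    return heat
```

Cells with high counts correspond to locations where the monitored person appears often, such as the sofa in the example above.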
The temporal heatmap can be used to determine how long the monitored person stays at each location.
The trajectory map can be used to determine the movement trajectory of the monitored person in the target space 40.
The action proportion chart can be used to determine the proportion of different behaviours performed by the monitored person.
The time record of entering the space division and the time record of leaving the space division can be used to determine the time at which the monitored person enters or leaves the space division. For example, the user of the health caring system 100 can determine that the monitored person leaves the space division 44 at 20:00 and returns to the space division 44 at 20:10 according to the time record of entering the space division and the time record of leaving the space division.
In summary, the health caring system of the disclosure can determine the status of the person in the target space by analyzing the image data obtained by the image capturing device, without requiring the monitored person to wear a wearable device. The health caring system can determine the posture, position, and behaviour of the person in the target space through the image data, and determine whether a specific event has occurred based on the above determining results and the time information. If a specific event has occurred, the health caring system can output an alarm message to notify other persons to help the monitored person. The health caring system can also generate corresponding charts for the monitored person. The user can use the charts to determine whether the status of the monitored person is abnormal.
Number | Date | Country | Kind |
---|---|---|---|
110104980 | Feb 2021 | TW | national |