This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2011-094288, filed on Apr. 20, 2011, the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to a behavior estimation apparatus, a behavior estimation method, and a computer readable medium.
There has been known a technique of estimating the behavior of a worker carrying out predetermined work and displaying a work history of the worker by using the result of the estimation.
However, in the conventional art, many sensors are required to estimate the behavior of the worker, which is a combination of basic actions such as “walking”, “running”, and “sitting”. The configuration therefore becomes complicated, entailing a problem of increased production cost.
According to one embodiment, a behavior estimation apparatus includes: a detection unit that is attached to a user and is configured to detect sensing information used to estimate a plurality of basic actions of the user; a first estimation unit configured to estimate the basic actions based on the sensing information; a second estimation unit configured to estimate behavior data in which a plurality of higher-level behaviors each made of a combination of the basic actions is arranged in chronological order, based on basic action data in which the plurality of basic actions is arranged in chronological order; a work data assignment unit configured to assign any one of a plurality of work behaviors, each indicating a behavior related to a work, to each of the higher-level behaviors, so as to acquire work data in which the plurality of work behaviors is arranged in chronological order; an evaluation unit configured to evaluate whether the work data satisfies a predetermined standard; a determination unit configured to determine, when the work data satisfies the predetermined standard, the name of the work behavior assigned to each of the higher-level behaviors constituting the behavior data as the work behavior corresponding to the higher-level behavior; and a display unit configured to display information according to the names determined by the determination unit.
An embodiment will be described below in detail with reference to the attached drawings. The present embodiment describes an example of estimating the behavior of a doctor or a nurse engaged in medical service; however, the present invention is not limited thereto.
The portable terminal apparatus 10 is worn by a user (doctor or nurse) who is the subject of the behavior estimation. Here, as illustrated in
The detection unit 14 is a unit for detecting sensing information used to estimate the basic actions of the user. The basic actions include simple actions such as “walking”, “running”, “sitting”, “standing”, and “lying”. In the embodiment, acceleration information indicating the acceleration of the portable terminal apparatus 10 (user) and position information indicating the position of the portable terminal apparatus 10 are employed as the sensing information. The sensing information is not limited thereto, and the type of the sensing information is optional. In short, the sensing information may be any information that can be used to estimate the basic actions of the user.
In the embodiment, the detection unit 14 is configured to include an acceleration sensor for detecting the acceleration of the portable terminal apparatus 10, and a position information detection unit for detecting the position information (access point information) that indicates the position of the portable terminal apparatus 10 (i.e., the position of the user) in a hospital. The acceleration sensor is configured to be capable of detecting the acceleration in each of three axial directions of x, y, and z, for example.
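As a purely illustrative sketch (not part of the embodiment), one sensing record combining the three-axis acceleration and the access point information detected by the detection unit 14 might be represented as follows; the field names and units are assumptions:

```python
from dataclasses import dataclass

@dataclass
class SensingRecord:
    """One sample from the detection unit 14 (hypothetical field layout)."""
    timestamp: float     # seconds since the start of measurement (assumption)
    accel_x: float       # acceleration along the x axis [m/s^2]
    accel_y: float       # acceleration along the y axis [m/s^2]
    accel_z: float       # acceleration along the z axis [m/s^2]
    access_point: str    # ID of the nearest in-hospital access point

# Example: a sample taken while the user stands still near access point "AP-03".
sample = SensingRecord(timestamp=0.02, accel_x=0.0, accel_y=0.0,
                       accel_z=9.8, access_point="AP-03")
```

A stream of such records, one per sampling tick, is what the communication unit would forward to the server apparatus 20 in this reading.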
As illustrated in
The control unit 50 is a unit to control the respective units of the server apparatus 20, and it is, for example, a computer provided with a CPU, a read only memory (ROM), a random access memory (RAM), and the like. Functions of the control unit 50 include a registration unit 51, a first estimation unit 52, a second estimation unit 53, an assignment unit 54, an evaluation unit 55, and a determination unit 56. These functions are implemented when the CPU of the control unit 50 reads a control program stored in the ROM onto the RAM and executes the same. Alternatively, at least some of these functions may be implemented by individual circuits (hardware). The control program executed by the control unit 50 can be stored, in a file of an installable or executable format, in a computer-readable recording medium such as a compact disk read only memory (CD-ROM), a compact disk recordable (CD-R), a memory card, a digital versatile disk (DVD), or a flexible disk (FD).
The registration unit 51 registers the sensing information into the storage unit 60 when the communication unit 30 receives the sensing information from the portable terminal apparatus 10. In the embodiment, the registration unit 51 registers the acceleration information, received by the communication unit 30, to an acceleration information storage unit 62 included in the storage unit 60, while registering the position information, received by the communication unit 30, to a position information storage unit 63 included in the storage unit 60.
The first estimation unit 52 estimates the basic actions of the user based on the sensing information. In the embodiment, the first estimation unit 52 estimates the basic actions of the user based on a basic action estimation rule stored in a basic action estimation rule storage unit 61 included in the storage unit 60, the acceleration information stored in the acceleration information storage unit 62, and the position information stored in the position information storage unit 63. The details will be described later. The first estimation unit 52 registers the estimated basic action to a basic action storage unit 64 included in the storage unit 60.
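The concrete form of the basic action estimation rule is not fixed by this excerpt. As one hedged illustration, the rule could take the form of thresholds on simple acceleration statistics; the thresholds, the features, and the `moving_ap` flag below are assumptions introduced only for this sketch:

```python
import math

def estimate_basic_action(accel_samples, moving_ap=False):
    """Estimate one basic action from 3-axis acceleration within a period t1.

    accel_samples: list of (ax, ay, az) tuples in m/s^2.
    moving_ap: True if the position information shows the user changed
               access points during the period (an assumed feature).
    All thresholds below are illustrative, not taken from the embodiment.
    """
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in accel_samples]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)

    if var > 8.0:
        return "running"          # large, fast magnitude swings
    if var > 1.0 or moving_ap:
        return "walking"          # moderate swings, or the position changed
    # Nearly static: use the gravity direction to separate postures.
    az_mean = sum(az for _, _, az in accel_samples) / len(accel_samples)
    if abs(az_mean) < 3.0:
        return "lying"            # gravity mostly off the device z axis
    return "standing"             # a tilt rule would be needed for "sitting"
```

A full rule set would also separate “sitting” from “standing”, for example from the device tilt; that distinction is omitted in this sketch.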
The second estimation unit 53 estimates behavior data in which a plurality of higher-level behaviors made of combinations of the basic actions is arranged in chronological order, based on the basic action data in which the plurality of basic actions estimated by the first estimation unit 52 is arranged in chronological order. The details will be described later. The second estimation unit 53 registers a parameter (described later) generated for estimating the behavior data to a parameter storage unit 65 in the storage unit 60, and registers the estimated behavior data to a behavior data storage unit 66 included in the storage unit 60.
The assignment unit 54 assigns any one of a plurality of work behaviors, each indicating a behavior related to a work, to each of the plurality of higher-level behaviors constituting the behavior data estimated by the second estimation unit 53, thereby obtaining work data in which the plurality of work behaviors is arranged in chronological order. The evaluation unit 55 evaluates whether the work data satisfies a predetermined standard. When the work data satisfies the predetermined standard, the determination unit 56 determines the names of the work behaviors respectively assigned to the plurality of higher-level behaviors constituting the behavior data as the work behaviors corresponding to the higher-level behaviors. Thus, the work data corresponding to the behavior data estimated by the second estimation unit 53 is determined. The details will be described later. The determination unit 56 stores the determined work data and a name (e.g., “bedsore round work”) of the work specified by the work data into a work data storage unit 69 included in the storage unit 60 while associating them with each other.
The storage unit 60 is a unit for storing therein various kinds of data. The storage unit 60 includes the basic action estimation rule storage unit 61, the acceleration information storage unit 62, the position information storage unit 63, the basic action storage unit 64, the parameter storage unit 65, the behavior data storage unit 66, a work knowledge storage unit 67, an evaluation rule storage unit 68, and the work data storage unit 69. The basic action estimation rule storage unit 61 stores therein the basic action estimation rule used for estimating the basic action of the user. The acceleration information storage unit 62 stores therein the acceleration information of the user (the portable terminal apparatus 10). The position information storage unit 63 stores therein the position information of the user (the portable terminal apparatus 10). The basic action storage unit 64 stores therein the basic action estimated by the first estimation unit 52. The parameter storage unit 65 stores therein the parameters generated by the second estimation unit 53. The behavior data storage unit 66 stores therein the behavior data estimated by the second estimation unit 53. The work knowledge storage unit 67 stores therein work knowledge. The evaluation rule storage unit 68 stores therein the evaluation rule used for the evaluation by the evaluation unit 55. The work data storage unit 69 stores therein the work data and the name of the work specified by the work data while associating them with each other.
When the communication unit 30 receives the sensing information from the portable terminal apparatus 10, the registration unit 51 of the server apparatus 20 registers the received sensing information to the storage unit 60 (step S3). More specifically, the registration unit 51 executes pre-processing such as a noise removing process for removing noise data from the acceleration information and the position information received by the communication unit 30 and/or a central value calculating process for calculating a central value in a fixed period. Thereafter, the registration unit 51 registers the acceleration information to the acceleration information storage unit 62 included in the storage unit 60, and registers the position information to the position information storage unit 63 included in the storage unit 60. The noise removing process and the central value calculating process need not necessarily be executed; the content of the pre-processing is optional. For example, an average calculation for calculating an average in a fixed period, a frequency calculation, or a Fourier transformation can be performed on the acceleration information and the position information received by the communication unit 30 as the pre-processing.
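The noise removing process and the central value calculating process described above could, for example, be realized with a sliding window; the window length in the sketch below is an assumption:

```python
import statistics

def preprocess(values, window=5):
    """Sliding-window pre-processing of a stream of sensor readings.

    Returns (medians, means): the central value (median) and the average
    of each full window, as examples of the pre-processing steps above.
    The window length is an assumption.
    """
    medians, means = [], []
    for i in range(len(values) - window + 1):
        chunk = values[i:i + window]
        medians.append(statistics.median(chunk))  # robust to spike noise
        means.append(sum(chunk) / window)         # average in a fixed period
    return medians, means

# A single-sample spike is suppressed by the median but not by the mean.
meds, avgs = preprocess([1.0, 1.0, 50.0, 1.0, 1.0], window=5)
```

Here `meds` is `[1.0]` while `avgs` is `[10.8]`, which illustrates why a median-based central value is a natural choice for removing spike noise.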
When a predetermined period t1 has elapsed from the start of the behavior estimation process (step S4: YES), the first estimation unit 52 estimates the basic action of the user within the predetermined period t1 (step S5). More details will be described below. The first estimation unit 52 reads the acceleration information within the predetermined period t1 from the acceleration information storage unit 62, and reads the position information within the predetermined period t1 from the position information storage unit 63. Then, the first estimation unit 52 reads the basic action estimation rule from the basic action estimation rule storage unit 61, and estimates the basic action of the user by referring to the read basic action estimation rule.
Referring back to
More specifically, assuming that higher-level behaviors $z_{-i}$ have already been assigned to the basic actions other than a specific basic action $w_i$, the second estimation unit 53 calculates the conditional probability $P(z_i = j \mid z_{-i}, w_i, d_i)$ that the higher-level behavior $j$ is assigned to the specific basic action $w_i$. With this calculation, the second estimation unit 53 estimates the parameters $\theta$ and $\phi$. The conditional probability $P(z_i = j \mid z_{-i}, w_i, d_i)$ is calculated with the following equation (1):

$$P(z_i = j \mid z_{-i}, w_i, d_i) = \frac{C^{WT}_{w_i j} + \beta}{\sum_{w=1}^{W} C^{WT}_{wj} + W\beta} \times \frac{C^{DT}_{d_i j} + \alpha}{\sum_{t=1}^{T} C^{DT}_{d_i t} + T\alpha} \quad (1)$$

In equation (1), $C^{WT}$ indicates the $W \times T$ frequency matrix of the basic actions and the higher-level behaviors, and $C^{DT}$ indicates the $D \times T$ frequency matrix of the behavior data and the higher-level behaviors. $W$, $D$, and $T$ indicate the number of basic actions, the number of pieces of behavior data, and the number of higher-level behaviors, respectively. $\alpha$ and $\beta$ are parameters given by a system designer.

The second estimation unit 53 first initializes the values of the elements in the matrices $C^{WT}$ and $C^{DT}$ with arbitrary numerical values, then randomly changes the assignments so as to calculate the conditional probability $P$ with equation (1), and sequentially updates the matrices $C^{WT}$ and $C^{DT}$ according to the calculation result. This process is repeated a predetermined number of times, whereby $C^{WT}$ and $C^{DT}$ are determined. The parameters $\phi$ and $\theta$ are estimated by using the determined $C^{WT}$ and $C^{DT}$ according to the following equations (2) and (3), where $\phi_{ij}$ represents the probability of the basic action $w_i$ in the higher-level behavior $j$, and $\theta_{jd}$ represents the probability of the higher-level behavior $j$ in the behavior data $d$:

$$\phi_{ij} = \frac{C^{WT}_{ij} + \beta}{\sum_{k=1}^{W} C^{WT}_{kj} + W\beta} \quad (2)$$

$$\theta_{jd} = \frac{C^{DT}_{dj} + \alpha}{\sum_{t=1}^{T} C^{DT}_{dt} + T\alpha} \quad (3)$$
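Equations (1) to (3) have the same form as collapsed Gibbs sampling for a latent-topic model, with basic actions in the role of words and higher-level behaviors in the role of topics. A minimal runnable sketch under that reading is shown below; the hyperparameters, iteration count, and data layout are assumptions:

```python
import random

def gibbs_estimate(docs, W, T, alpha=0.1, beta=0.1, iters=200, seed=0):
    """Estimate phi (W x T) and theta (D x T) following equations (1)-(3).

    docs: list of behavior data, each a list of basic-action IDs in [0, W).
    W: number of basic actions; T: number of higher-level behaviors.
    alpha, beta: the designer-given parameters of equation (1).
    """
    rng = random.Random(seed)
    D = len(docs)
    CWT = [[0] * T for _ in range(W)]  # frequency: basic action x behavior
    CDT = [[0] * T for _ in range(D)]  # frequency: behavior data x behavior
    z = []                             # current assignment of every token
    for d, doc in enumerate(docs):     # random initialization
        zs = []
        for w in doc:
            t = rng.randrange(T)
            CWT[w][t] += 1
            CDT[d][t] += 1
            zs.append(t)
        z.append(zs)

    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                # Remove the current assignment: counts now reflect z_{-i}.
                CWT[w][z[d][i]] -= 1
                CDT[d][z[d][i]] -= 1
                # Equation (1) for every candidate higher-level behavior j.
                p = [(CWT[w][j] + beta) / (sum(CWT[k][j] for k in range(W)) + W * beta)
                     * (CDT[d][j] + alpha) / (sum(CDT[d]) + T * alpha)
                     for j in range(T)]
                # Draw a new assignment proportionally to p.
                r, acc, new_j = rng.uniform(0, sum(p)), 0.0, T - 1
                for j, pj in enumerate(p):
                    acc += pj
                    if r <= acc:
                        new_j = j
                        break
                z[d][i] = new_j
                CWT[w][new_j] += 1
                CDT[d][new_j] += 1

    # Equations (2) and (3) on the determined frequency matrices.
    phi = [[(CWT[w][j] + beta) / (sum(CWT[k][j] for k in range(W)) + W * beta)
            for j in range(T)] for w in range(W)]
    theta = [[(CDT[d][j] + alpha) / (sum(CDT[d]) + T * alpha)
              for j in range(T)] for d in range(D)]
    return phi, theta

# Two pieces of behavior data over three basic actions, two higher-level behaviors.
phi, theta = gibbs_estimate([[0, 0, 1], [2, 2, 1]], W=3, T=2)
```

Note that the right-hand side of equation (1) is unnormalized over $j$; the sketch normalizes implicitly by drawing from the cumulative sum of the candidate probabilities.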
In the example of
Returning to
Then, the assignment unit 54 assigns any one of the plurality of work behaviors to each of the plurality of higher-level behaviors Z constituting the behavior data estimated in step S8 (step S10). The case where the work behavior is assigned to each of the plurality of higher-level behaviors Z constituting the behavior data of nurse Sato will be described below; however, the same applies to the other users.
As illustrated in
As one example, the evaluation of the work data “A1→A2→A3→A4→A2” of nurse Sato, obtained through the assignment, will be described. The first evaluation item of the evaluation rule in
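The concrete contents of the evaluation rule are not fixed in this excerpt. As one hedged illustration, each evaluation item could be read as an ordering constraint between two work behaviors; the rule semantics and the example rules below are assumptions:

```python
def satisfies_rules(work_data, rules):
    """Evaluate work data (a chronological list of work behaviors) against
    evaluation items of the form (earlier, later): whenever 'later' occurs,
    'earlier' must occur before its first occurrence. These semantics are
    an assumption made for this sketch.
    """
    for earlier, later in rules:
        if later not in work_data:
            continue                      # this item does not apply
        if earlier not in work_data:
            return False                  # required predecessor never occurs
        if work_data.index(earlier) > work_data.index(later):
            return False                  # predecessor occurs too late
    return True

# The sequence discussed above, checked against two hypothetical items.
work_data = ["A1", "A2", "A3", "A4", "A2"]
rules = [("A1", "A2"), ("A3", "A4")]     # e.g. "A1 must precede A2"
ok = satisfies_rules(work_data, rules)
```

For this sequence both hypothetical items are satisfied, so `ok` is `True`; swapping A1 and A2 would violate the first item and the evaluation would fail.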
As illustrated in
Next, the control unit 50 controls the display unit 40 so as to display the information according to the determination made by the determination unit 56 (step S14). For example, as illustrated in
On the other hand, when the evaluation unit 55 determines in step S12 that the work data obtained through the assignment in step S10 does not satisfy the standard (step S12: NO), the process returns to the above-mentioned step S8, where the second estimation unit 53 estimates the behavior data again. Specifically, the second estimation unit 53 performs the process again, starting from the determination of the frequency matrices CWT and CDT.
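The control flow of steps S8 to S12 (estimate, assign, evaluate, and re-estimate on failure) can be sketched as follows; the callables and the retry limit are stand-ins, not elements disclosed by the embodiment:

```python
def estimate_work_data(estimate_behavior, assign_work, evaluate, max_retries=10):
    """Overall flow of steps S8-S12: re-estimate until the evaluation passes.

    The three callables are stand-ins for the second estimation unit 53,
    the assignment unit 54, and the evaluation unit 55; the retry limit
    is an assumption (the embodiment loops until the standard is met).
    """
    for _ in range(max_retries):
        behavior_data = estimate_behavior()      # step S8: fresh estimation
        work_data = assign_work(behavior_data)   # step S10: assign work behaviors
        if evaluate(work_data):                  # step S12: check the standard
            return work_data                     # accepted: names are determined
    return None                                  # no acceptable estimate found

# Demo with stand-in callables: the third estimate passes the evaluation.
attempts = iter(["estimate-1", "estimate-2", "estimate-3"])
result = estimate_work_data(
    estimate_behavior=lambda: next(attempts),
    assign_work=lambda behavior: behavior,       # identity stand-in
    evaluate=lambda work: work == "estimate-3",
)
```

Because each estimation run starts from a fresh random determination of the frequency matrices, repeated runs can yield different behavior data, which is what makes this retry loop meaningful.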
As described above, in the embodiment, the behavior data in which the plurality of higher-level behaviors made of combinations of basic actions is arranged in chronological order is estimated based on the basic action data in which the plurality of basic actions is arranged in chronological order. Therefore, the behavior estimation apparatus needs only the sensor(s) for detecting the sensing information used for the estimation of the basic actions. Consequently, the configuration can be simplified and the production cost can be reduced.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
Number | Date | Country | Kind |
---|---|---|---
2011-094288 | Apr 2011 | JP | national |