The above and other objects, features and advantages of the present invention will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings, in which:
The construction and operation of the present invention are described in detail with reference to the accompanying drawings below. The same reference numerals, shown in the respective drawings, designate the same elements having the same functions.
Referring to
In the case where the user 201 uses an active guide system, it is necessary to detect the position of the user 201. For this purpose, 14×24 pressure sensors are attached to the top surface of the mattress of the bed. The pressure sensors have advantages over temperature sensors or the like in that they have a fast reaction speed and high resolution. In this case, it is preferred that the intelligent bed robot according to the present invention further include a control box, so that the control box can sequentially scan the pressure sensors attached to the mattress using a multiplexer, read the values of the selected pressure sensors, and deliver the pressure value data to a main computer in the form of a pressure measurement distribution image.
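By way of illustration only, the following Python sketch shows one way such a sequential multiplexer scan could be organized into a 14×24 pressure distribution image; the function read_adc, the scanning order, and the data type are assumptions made for this example and are not taken from the disclosure.

```python
import numpy as np

ROWS, COLS = 14, 24  # pressure sensor grid on the mattress (14 x 24)


def read_adc(row, col):
    """Hypothetical read of the sensor selected by the multiplexer.

    A real control box would switch the multiplexer to the channel for
    (row, col) and sample the corresponding pressure sensor.
    """
    raise NotImplementedError("hardware-specific")


def scan_pressure_image():
    """Sequentially scan all sensors and return a 2-D pressure image."""
    image = np.zeros((ROWS, COLS), dtype=np.float32)
    for r in range(ROWS):
        for c in range(COLS):
            image[r, c] = read_adc(r, c)
    return image  # delivered to the main computer as a distribution image
```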
Furthermore, since the bed has a triple folding structure, it is preferred that a processor be attached to each section so that the hardware is not damaged when the shape of the bed changes. In this case, the construction may be such that three sensing processors are attached to the respective sections and transmit data to the main computer.
An algorithm capable of recognizing user information must be applied to the mattress 204. In order to support stochastic real-time signal processing, the Principal Component Analysis (PCA) algorithm is used as the information processing algorithm. The PCA algorithm recognizes sensor information from the bed in the form of images, and simultaneously employs a filtering technique and a stochastic approach, which are commonly used in image processing.
An algorithm with a small computational load is necessary because the input data provided by the sensors contain a great deal of noise and because the trajectory generation, as well as the control of the robot arm, must be processed in real time. The work that must precede the application of the present algorithm is the construction of a database for classifying the incoming bed image input. Accordingly, a database containing information about the positions and postures of the user on the bed must be constructed. Meanwhile, a method of applying the PCA algorithm in the present invention is described below.
First, several primary pressure patterns of the mattress are determined in order to perform pattern classification. That is, a recognition target pattern, such as the lying posture or sitting posture of a human, is determined, and then the pressure sensor data are extracted.
Thereafter, feature points are extracted by calculating the covariance of pressure sensor data. The feature points are used as important information that is used to distinguish the patterns of respective postures.
Thereafter, the number of feature points is determined such that the classification success rate among the respective pieces of pattern data reaches a predefined level.
Thereafter, pattern recognition is performed by comparing new incoming pressure sensor data with the predetermined feature points. Through this process, the current posture of the user can be determined, and, using variation in the posture of the user based on recognition results, motion information can be predicted.
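The steps described above can be summarized in the following minimal PCA sketch, in which labeled pressure images from the posture database are used to compute the covariance of the sensor data, extract the principal feature directions, and classify a new image by nearest-neighbor matching in the feature space. The function names, the nearest-neighbor matching step, and the choice of the number of components are illustrative assumptions, not the specific implementation of the invention.

```python
import numpy as np


def train_pca(train_images, num_components):
    """train_images: (N, 14*24) flattened pressure images from the posture database."""
    mean = train_images.mean(axis=0)
    centered = train_images - mean
    cov = np.cov(centered, rowvar=False)            # covariance of the pressure sensor data
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:num_components]
    basis = eigvecs[:, order]                       # principal directions used as feature points
    features = centered @ basis                     # feature coordinates of each training image
    return mean, basis, features


def classify(image, mean, basis, features, labels):
    """Project a new pressure image and return the label of the nearest training sample."""
    f = (image.ravel() - mean) @ basis
    idx = np.argmin(np.linalg.norm(features - f, axis=1))
    return labels[idx]
```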
The transfer rails 208 function to guide the robot arm 202, 206 and 210 through a path so as to provide service to a person on the bed.
The intelligent robot arm 202, 206 and 210 includes vertical bars 202(a) and 202(b) configured to have a predetermined length and placed on two opposite sides of the bed, and a horizontal bar 206 configured to connect the vertical bars 202(a) and 202(b) with each other. Torque sensors are placed at respective lower ends of the vertical bars 202(a) and 202(b) to measure the horizontal and vertical force applied by the user (see
One or more grippers 210(a) and 210(b) move horizontally along the horizontal bar 206, and are mounted on the horizontal bar 206 so that they can rotate around the horizontal bar 206. Furthermore, fingers 212 and 214 are included in the gripper 210 to grip an object. The gripper 210 may move along the horizontal bar 206 to the center of the bed, and may move outside the mattress 204 when the gripper 210 is not necessary. Furthermore, the gripper 210 can rotate 360°, so that it can conduct the work of fetching a shelf from below the bed, or gripping, spreading and removing bedclothes on the bed.
The working procedure of the gripper 210, for example, the procedure of the work of folding bedclothes after a user has woken up, is described below. In this case, the operation of the gripper 210 may be performed both in an automatic mode and in a command mode.
In the case where the command mode is set, the user 201 issues a command using his or her voice or a bed manipulator. In this case, the current position and posture of the user are detected by the pressure sensors attached to the mattress 204. Thereafter, the robot arm moves to a location suitable for the posture and position of the user, and conducts the work of folding the bedclothes.
Meanwhile, in the case where the automatic mode is set, for example, when the user 201 wakes up and sits up on the bed, the variation in the posture of the user 201 is detected by the sensors attached to the mattress 204. In this case, the work of folding the bedclothes can be performed, together with control of the action of moving the folding structure of the bed, without a separate command from the user.
When the intelligent bed robot 200 of the present invention works in the command mode, a voice command or a command issued via the manipulator is received, and then a service is provided. In this case, the user 201 can issue a command using his or her voice or the manipulator, for example, to prepare a meal or to prepare to go out. At this time, the user 201 does not issue commands for the respective actions that constitute the time series of actions, but transmits only a higher-level control command (preparation of a meal, or preparation for going out) that is classified according to category.
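As a purely illustrative sketch of how a single higher-level category command might be expanded into a time series of actions, the mapping below associates each command with an ordered action sequence; the command names and the individual actions are hypothetical placeholders, not the actual command set of the invention.

```python
# Hypothetical mapping from higher-level category commands to action sequences.
COMMAND_SEQUENCES = {
    "prepare_meal": [
        "detect_user_posture",          # read the pressure image and estimate position/posture
        "move_arm_to_user",
        "fetch_shelf_from_below_bed",
        "place_shelf_in_front_of_user",
    ],
    "prepare_going_out": [
        "detect_user_posture",
        "move_arm_to_user",
        "grip_bedclothes",
        "fold_bedclothes",
    ],
}


def execute_command(command, run_action):
    """Expand one category command into its actions and run them in order."""
    for action in COMMAND_SEQUENCES[command]:
        run_action(action)
```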
It is preferred that the gripper 210 include two fingers (a first finger 214, and a second finger 212), in which case the first finger 214 is fastened, thereby preventing a joint from being bent when raising and pulling a heavy object. The overall shape of the gripper 210 is curved to be similar to that of a human hand, and thus can reduce the risk of injury to the user (201 shown in
It is preferred that the first finger 214 be provided with an accommodation part so that the second finger 212 can pass through the first finger 214 when the first finger 214 and the second finger 212 overlap each other. In this case, the path formed in the first finger 214 may be a through hole that passes through the first finger 214, or a path that is formed by opening a side of the first finger 214, as illustrated in
When the second finger 212 is maximally closed, the first finger 214 and the second finger 212 overlap each other. Through this overlapping action, the size of the contact area can be increased in the case where cloth, such as the cloth of bedclothes, is gripped. In this case, it is preferred that the insides of the fingers be formed of rubber or the like so as to have high frictional force.
The gripper 210 of the intelligent bed robot according to the present invention includes first, second and third gear structures 217, 219, and 220.
The first gear structure 217 is provided with threads that engage a ball screw 209 mounted inside the horizontal bar 206 along the length thereof, and that guide the gripper 210 in linear movement. The second gear structure 219 is formed of a worm gear, which works in conjunction with the first gear structure 217 to rotate around the horizontal bar 206. The third gear structure 220 is configured such that the fingers 212 and 214 can conduct the work of gripping an object. Since the first gear structure 217 functions as a support when the second gear structure 219 rotates, the first gear structure 217 does not rotate around the horizontal bar 206, but guides the worm gear of the second gear structure 219 as it rotates.
Referring to
As shown in
The third gear structure 220 receives driving force from the third motor 218, and functions to control the motions of the fingers 212 and 214 when the motion of picking up an object is performed.
The sensors attached to the mattress recognize the positional state of the user on the bed, and the vertical bars (202 shown in
From
Although existing intelligent bed robots function to support the body weight of users or to provide shelves for meals, the function of picking up objects using one or more grippers is added to the intelligent bed robot of the present invention, so that it actively assists a user in his or her physical activities.
Two torque sensors are attached to each transfer platform, and measure horizontal and vertical forces that are applied to the vertical bar by the user.
If the force applied to the vertical bar has only a horizontal component fx, torques τA and τB, which are measured by respective sensors, have the same direction. In contrast, the vertical component fy of a force applied to the vertical bar causes torques τA and τB, which are measured by respective sensors, to have opposite directions.
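This sign relationship suggests the simple decomposition sketched below, in which the sum of the two torque readings estimates the horizontal force component and their difference estimates the vertical component; the calibration constants k_x and k_y are assumptions that depend on the sensor geometry, which is not specified here, and would have to be identified experimentally.

```python
def estimate_forces(tau_a, tau_b, k_x=1.0, k_y=1.0):
    """Rough decomposition of the user's force from the two torque readings.

    A purely horizontal force produces torques of the same sign, while a
    purely vertical force produces torques of opposite signs, so the sum
    and the difference of the readings separate the two components.
    k_x and k_y are calibration constants (assumed values).
    """
    fx = k_x * (tau_a + tau_b)
    fy = k_y * (tau_a - tau_b)
    return fx, fy
```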
The robot arm attached to the intelligent robot of the present invention can operate both in a follow mode and in a support mode. A follow mode is an operation mode in which a user can control the robot arm using a command so as to secure a support means using the robot, while a support mode is an operation mode in which a user fastens and supports his or her body on the secured support means.
In the follow mode, a user can move the robot arm to a desired position with reference to data detected by the above-described torque sensors. In contrast, in the support mode, the body weight of the user can be supported without the influence of data detected by the torque sensors, so that the user can rest his or her body on a support means that is provided by the robot arm.
As an example, when a user desires to change his or her posture on a bed, how the follow mode and the support mode are performed is described below.
First, a user locates the horizontal bar near his or her chest using a voice command. In this case, the position of the user is detected by the pressure sensors attached to the mattress, and the horizontal bar moves to a location suitable for the user based on the results of the detection (see
However, the preferred location of the horizontal bar for a given user position and set of measurement results varies with the characteristics of the individual user.
Accordingly, in the follow mode, the user can adjust the location and height of the horizontal bar by pushing or pulling the horizontal bar while holding it. In this case, the force applied to the horizontal bar is detected by the torque sensors, and respective vertical bars are moved. Finally, when the user is placed at a desired position, the process may proceed to the support mode. The user may switch the operation mode from the follow mode to the support mode using a voice command.
In the support mode, the robot arm operates based on kinematics control. The bed system estimates the position of the user, and then controls the robot arm based on a scheduled path.
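A minimal sketch of such a scheduled path is given below, assuming that the position of the horizontal bar can be described by two coordinates and that a straight-line schedule between the current and target positions is sufficient; the real system would substitute the estimated position of the user and the full kinematics of the robot arm.

```python
def scheduled_path(start, goal, steps):
    """Generate intermediate set-points along a straight line between two bar positions.

    start, goal: (x, z) positions of the horizontal bar (illustrative coordinates).
    """
    return [
        (
            start[0] + (goal[0] - start[0]) * i / steps,
            start[1] + (goal[1] - start[1]) * i / steps,
        )
        for i in range(steps + 1)
    ]
```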
In contrast, in the follow mode, the user's intention is detected through the torque sensors. Referring to
The relationship between the torque sensor signals and the user's intention can be expected to be intuitive, while the structure of the robot arm system is not easily described mathematically. Accordingly, it is preferable to use a fuzzy network to infer the user's intention. The following Table 1 and Table 2 are rule tables used for the fuzzy network, where P indicates ‘positive’, N indicates ‘negative’, L indicates ‘large’, S indicates ‘small’, and ZO indicates ‘zero’. Thus, motor A and motor B are controlled according to the rule tables.
Using Table 1 and Table 2, the robot arm can be moved to a desired location through a simple computation using the signs and intensities of the signals.
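Since Table 1 and Table 2 are not reproduced here, the following sketch only illustrates the general mechanism of such a rule-table lookup: each torque signal is fuzzified into a label built from P, N, L, S and ZO, and a table maps the pair of labels to a motor command. The membership thresholds and the rule entries shown are hypothetical placeholders, not the contents of Table 1 or Table 2.

```python
def fuzzify(tau, small=0.5, large=2.0):
    """Map a torque signal to a linguistic label (thresholds are assumed values)."""
    if abs(tau) < small:
        return "ZO"
    sign = "P" if tau > 0 else "N"
    size = "L" if abs(tau) >= large else "S"
    return sign + size


# Hypothetical rule entries (placeholders, NOT the actual Table 1 or Table 2):
# each (label_A, label_B) pair maps to a velocity command for motor A.
RULES_MOTOR_A = {
    ("PL", "PL"): "PL",
    ("PS", "PS"): "PS",
    ("ZO", "ZO"): "ZO",
    ("NS", "NS"): "NS",
    ("NL", "NL"): "NL",
}


def motor_a_command(tau_a, tau_b):
    """Look up the motor A command for the fuzzified pair of torque readings."""
    return RULES_MOTOR_A.get((fuzzify(tau_a), fuzzify(tau_b)), "ZO")
```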
The intelligent bed robot according to the present invention may be implemented such that the length of the horizontal bar varies with the relative locations of the vertical bars. When data are received from the sensors attached to the mattress and the vertical bars move to locations suitable for the user, the length of the horizontal bar may vary with the relative locations of the vertical bars. As a result of the movement of the robot arm based on the sensor measurements, the length of the horizontal bar is shortest in the case of
The intelligent bed robot according to the present invention has 10 Direct Current (DC) motors for operating the robot arm and 2 Alternating Current (AC) motors for raising an object. Accordingly, it is preferred that the control system be based on a Controller Area Network (CAN) so as to control these devices simultaneously. Referring to
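As a rough sketch of how a CAN-based controller might broadcast set-points to the individual motor drivers, the following example uses the python-can package; the interface name, the arbitration IDs, and the message layout are assumptions made for illustration and are not specified by the disclosure.

```python
import can  # python-can package

# Assumed arbitration IDs: 10 DC motor drivers and 2 AC motor drivers on one bus.
MOTOR_IDS = {f"dc_{i}": 0x100 + i for i in range(10)}
MOTOR_IDS.update({"ac_lift_0": 0x200, "ac_lift_1": 0x201})


def send_setpoint(bus, motor, value):
    """Send a 16-bit signed set-point to one motor driver over CAN."""
    data = int(value).to_bytes(2, byteorder="big", signed=True)
    msg = can.Message(arbitration_id=MOTOR_IDS[motor], data=data,
                      is_extended_id=False)
    bus.send(msg)


if __name__ == "__main__":
    # The SocketCAN interface name is an assumption; adjust to the actual hardware.
    with can.Bus(interface="socketcan", channel="can0") as bus:
        send_setpoint(bus, "dc_0", 150)
```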
The present invention provides an intelligent bed robot equipped with one or more grippers, which analyzes the posture and action of a bed user and is provided with a robot arm having multiple degrees of freedom.
Furthermore, the present invention provides an intelligent robot, in which one or more grippers capable of picking up objects are added to a conventional robot equipped with a pressure sensor-provided mattress and a supporting robot arm, thereby providing more familiar and convenient services to a user.
Moreover, the present invention provides an intelligent bed robot that can be used to perform health monitoring and evaluate a rehabilitation procedure.
Although the preferred embodiments of the present invention have been disclosed for illustrative purposes, those skilled in the art will appreciate that various modifications, additions and substitutions are possible, without departing from the scope and spirit of the invention as disclosed in the accompanying claims.
Number | Date | Country | Kind |
---|---|---|---|
10-2006-0094678 | Sep 2006 | KR | national |