INTELLIGENT BED ROBOT EQUIPPED WITH PRESSURE SENSOR-PROVIDED MATTRESS AND GRIPPER-PROVIDED SUPPORTING ROBOT ARM

Information

  • Patent Application
  • Publication Number
    20080078030
  • Date Filed
    August 03, 2007
  • Date Published
    April 03, 2008
Abstract
Disclosed herein is an intelligent bed robot. The intelligent bed robot includes a pressure sensor-provided mattress, an intelligent robot arm, transfer rails, and at least one gripper. The pressure sensor-provided mattress monitors the position, posture and motion of a user on a bed in real time, and assists the user. The intelligent robot arm includes vertical bars disposed on two opposite sides of the bed and configured to have a predetermined length, a horizontal bar configured to connect the vertical bars to each other, and torque sensors disposed at the lower ends of the vertical bars, which measure the horizontal and vertical forces applied by the user. The transfer rails guide the intelligent robot arm along a path of movement. The gripper is coupled to the horizontal bar of the intelligent robot arm, and is provided with a finger unit capable of picking up an object.
Description

BRIEF DESCRIPTION OF THE DRAWINGS

The above and other objects, features and advantages of the present invention will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings, in which:



FIG. 1 is a perspective view of a prior art intelligent bed robot;



FIG. 2 is a view showing a preferred embodiment of an intelligent bed robot according to the present invention;



FIGS. 3A and 3B are a perspective view showing a gripper mounted on the intelligent bed robot according to the present invention, and a view showing the mounting of the gripper on the horizontal bar, respectively;



FIG. 4 is a perspective view showing an example in which grippers are mounted on the robot arm of the intelligent bed robot according to the present invention;



FIGS. 5A and 5B are enlarged photos respectively showing the portion of FIG. 4 in which the gripper is attached to the horizontal bar, and a longitudinal hole which is formed along the horizontal bar;



FIGS. 6A and 6B show torque sensors that are attached to the vertical bar;



FIGS. 7A and 7B show the different locations of the horizontal bar and the vertical bars; and



FIG. 8 is a diagram showing the control system of the intelligent bed robot according to the present invention.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

The construction and operation of the present invention are described in detail below with reference to the accompanying drawings. The same reference numerals in the respective drawings designate the same elements having the same functions.



FIG. 2 shows a preferred embodiment of an intelligent bed robot 200 according to the present invention. That is, FIG. 2 shows the intelligent bed robot 200 equipped with grippers.


Referring to FIG. 2, the intelligent bed robot 200 according to the present invention includes a mattress 204, an intelligent robot arm 202, 206 and 210, and transfer rails 208. In this case, pressure sensors are attached to the mattress 204 so as to monitor the position, posture and motion of a user 201 on a bed in real time and to assist him or her. The mattress 204 needs to detect the intention of the user 201 when the user 201 lies down on the bed and then sits up, and to assist the user 201 in sitting up using the triple folding structure of the intelligent bed robot.


In the case where the user 201 uses an active guide system, it is necessary to detect the position of the user 201. For this purpose, 14×24 pressure sensors are attached to the top surface of the mattress of the bed. The pressure sensors have advantages over temperature sensors or the like in that they have fast reaction speed and high resolution. In this case, it is preferred that the intelligent bed robot according to the present invention further include a control box, so that the control box can sequentially scan the pressure sensors, attached to the mattress, using a multiplexer, read the values of selected pressure sensors, and deliver pressure value data to a main computer in the form of a pressure measurement distribution image.
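The scanning and delivery steps described above can be outlined in a short sketch. The multiplexer/ADC interface (here a read_sensor callable), the frame format and the network address of the main computer are illustrative assumptions, not part of the disclosure.

import socket
import struct

ROWS, COLS = 14, 24  # pressure sensors attached to the top surface of the mattress

def scan_mattress(read_sensor):
    """Sequentially scan every sensor via the multiplexer; read_sensor(row, col)
    is a hardware-specific callable that selects a channel and returns its value."""
    return [[read_sensor(r, c) for c in range(COLS)] for r in range(ROWS)]

def send_pressure_image(image, host="main-computer.local", port=9000):
    """Deliver the pressure-distribution image to the main computer as one flat frame."""
    payload = struct.pack(f"!{ROWS * COLS}H", *(v for row in image for v in row))
    with socket.create_connection((host, port)) as sock:
        sock.sendall(payload)

# Usage sketch: image = scan_mattress(lambda r, c: 0); send_pressure_image(image)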


Furthermore, since the bed has a triple folding structure, it is preferred that a separate processor be attached to each section so that the hardware is not damaged when the shape of the bed changes. In this case, the bed may be constructed such that three sensing processors, one per section, transmit data to the main computer.


An algorithm capable of recognizing user information must be applied to the mattress 204. In order to support stochastic real-time signal processing, the Principal Component Analysis (PCA) algorithm is used as the information processing algorithm. The PCA algorithm recognizes sensor information from the bed in the form of images, and simultaneously employs a filtering technique and a stochastic approach, which are commonly used in image processing.


An algorithm with a small computational load is necessary because the input data provided by the sensors contain considerable noise and because the trajectory generation, as well as the control of the robot arm, must be processed in real time. The work that must precede the application of the present algorithm is the construction of a database for classifying the incoming pressure images from the bed. Accordingly, a database containing information about the positions and postures of the user on the bed must be constructed. Meanwhile, a method of applying the PCA algorithm to the present invention is described below.


First, several primary pressure patterns of the mattress are determined in order to perform pattern classification. That is, a recognition target pattern, such as the lying posture or the sitting posture of a human, is determined, and the corresponding pressure sensor data are then extracted.


Thereafter, feature points are extracted by calculating the covariance of pressure sensor data. The feature points are used as important information that is used to distinguish the patterns of respective postures.


Thereafter, the number of feature points is determined such that the classification success rate among the respective pieces of pattern data reaches a predefined level.


Thereafter, pattern recognition is performed by comparing new incoming pressure sensor data with the predetermined feature points. Through this process, the current posture of the user can be determined, and, using variation in the posture of the user based on recognition results, motion information can be predicted.
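The four steps above amount to a standard PCA pipeline, and a brief sketch using only numpy is given below. The training-data layout (one flattened 14×24 pressure image per row) and the nearest-template matching step are illustrative assumptions.

import numpy as np

def fit_pca(train_images, n_components):
    """Extract feature points (principal components) from labelled pressure images."""
    X = np.asarray(train_images, dtype=float)          # shape: (samples, 14 * 24)
    mean = X.mean(axis=0)
    cov = np.cov(X - mean, rowvar=False)               # covariance of the pressure sensor data
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:n_components]   # keep the strongest components
    return mean, eigvecs[:, order]

def project(image, mean, components):
    """Project a pressure image onto the feature points."""
    return (np.asarray(image, dtype=float).ravel() - mean) @ components

def classify(image, mean, components, templates):
    """Compare new incoming pressure data with stored posture templates in feature space."""
    z = project(image, mean, components)
    return min(templates, key=lambda label: np.linalg.norm(z - templates[label]))

# templates maps posture labels ('lying', 'sitting', ...) to the mean projection of that
# posture's training images; the label returned by classify() is the recognized posture.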


The transfer rails 208 function to guide the robot arm 202, 206 and 210 through a path so as to provide service to a person on the bed.


The intelligent robot arm 202, 206 and 210 includes vertical bars 202(a) and 202(b) configured to have a predetermined length and placed on two opposite sides of the bed, and a horizontal bar 206 configured to connect the vertical bars 202(a) and 202(b) with each other. Torque sensors are placed at the respective lower ends of the vertical bars 202(a) and 202(b) to measure the horizontal and vertical forces applied by the user (see FIGS. 6A and 6B).


One or more grippers 210(a) and 210(b) move horizontally along the horizontal bar 206, and are mounted on the horizontal bar 206 so that they can rotate around the horizontal bar 206. Furthermore, fingers 212 and 214 are included in the gripper 210 to grip an object. The gripper 210 may move along the horizontal bar 206 to the center of the bed, and may move outside the mattress 204 when the gripper 210 is not necessary. Furthermore, the gripper 210 can rotate 360°, so that it can conduct the work of fetching a shelf from below the bed, or gripping, spreading and removing bedclothes on the bed.


The working procedure of the gripper 210, for example, the procedure of the work of folding bedclothes after a user has woken up, is described below. In this case, the operation of the gripper 210 may be performed both in an automatic mode and in a command mode.


In the case where the command mode is set, the user 201 issues a command using voice or a bed manipulator. In this case, the current position and posture of the patient are detected by the pressure sensors attached to the mattress 204. Thereafter, the robot arm moves to a location suitable for the posture and position of the user, and conducts the work of folding the bedclothes.


Meanwhile, in the case where an automatic mode is set, for example, in the case where the user 201 wakes up and sits up on the bed, the variation in the posture of the user 201 is detected by the sensors attached to the mattress 204. In this case, the work of folding bedclothes can be performed together with the control of the action of moving the folding structure of the bed so as to conduct the work without a separate command from the user.


When the intelligent bed robot 200 of the present invention works in the command mode, a voice command or a command via the manipulator is received, and a service is then provided. In this case, the user 201 can issue a command, for example, to conduct the action of preparing a meal or going out, using voice or the manipulator. At this time, the user 201 does not issue commands for the respective actions that constitute a time series of actions, but transmits only higher-level control commands (preparation of a meal, and preparation for going out) that are classified according to category.
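One way to realize such category-level commands is a simple dispatch table that expands each higher-level command into a time series of actions, as sketched below; the command strings and action names are hypothetical and serve only to illustrate the idea.

MEAL_PREPARATION = [
    "detect_user_posture",      # read the pressure-sensor mattress
    "move_arm_to_user",         # position the vertical bars and the horizontal bar
    "fold_bedclothes",          # gripper work
    "fetch_shelf",              # bring the shelf from below the bed
]
GOING_OUT_PREPARATION = [
    "detect_user_posture",
    "fold_bedclothes",
    "raise_bed_section",        # triple-folding structure assists the user in sitting up
    "move_arm_for_support",     # follow mode, then support mode
]

COMMANDS = {
    "prepare meal": MEAL_PREPARATION,
    "prepare to go out": GOING_OUT_PREPARATION,
}

def dispatch(command: str) -> list[str]:
    """Return the action sequence for a higher-level voice or manipulator command."""
    return COMMANDS.get(command.lower().strip(), [])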



FIGS. 3A and 3B are a perspective view showing a gripper mounted on the intelligent bed robot according to the present invention, and a view showing the mounting of the gripper on the horizontal bar, respectively. For the convenience of the user (201 shown in FIG. 2), the gripper 210 is mounted on the horizontal bar to conduct the work of spreading or removing bedclothes in addition to the work of conveying the shelf.


It is preferred that the gripper 210 include two fingers (a first finger 214 and a second finger 212), in which case the first finger 214 is fixed, so that its joint is not bent when a heavy object is raised or pulled. The overall shape of the gripper 210 is curved to be similar to that of a human hand, and thus the risk of injury to the user (201 shown in FIG. 2) that may be caused upon collision with the user can be reduced.


It is preferred that the first finger 214 be provided with an accommodation part so that the second finger 212 can pass through the first finger 214 when the first finger 214 and the second finger 212 overlap each other. In this case, the path formed in the first finger 214 may be a through hole that passes through the first finger 214, or a path that is formed by opening one side of the first finger 214, as illustrated in FIG. 3A.


When the second finger 212 is maximally closed, the first finger 214 and the second finger 212 overlap each other. Through this overlapping action, the size of a contact area can be increased for the case where cloth, such as the cloth of bedclothes, is gripped. In this case, it is preferred that the insides of the fingers be formed of rubber or the like, thereby having great frictional force.



FIG. 3B shows the mounting of the gripper 210 on the horizontal bar 206. That is, FIG. 3B shows the mounting of the gripper 210 on the horizontal bar 206 via gear structures.


The gripper 210 of the intelligent bed robot according to the present invention includes first, second and third gear structures 217, 219, and 220.


The first gear structure 217 is provided with threads that are engaged with a ball screw 209 mounted inside the horizontal bar 206 along the length thereof, and that guide the gripper 210 through linear movement. The second gear structure 219 is formed of a worm gear, which works in conjunction with the first gear structure 217 to rotate around the horizontal bar 206. The third gear structure 220 is configured such that the fingers 212 and 214 can conduct the work of gripping an object. Since the first gear structure 217 functions as a support when the second gear structure 219 rotates, the first gear structure 217 does not rotate around the horizontal bar 206, but guides the worm gear of the second gear structure 219 through rotation.


Referring to FIGS. 3A and 3B, a first motor 215 is mounted in the horizontal bar and provides driving force to the ball screw 209. The ball screw 209, which receives driving force from the first motor 215, rotates inside the horizontal bar 206, and the first gear structure 217 engages with spiral threads and moves linearly along the horizontal bar 206. In this case, the first gear structure 217 is provided with threads and thus works in conjunction with the ball screw 209. Meanwhile, in the region where the ball screw 209 is placed, the horizontal bar 206 has longitudinal holes, so that the inside of the horizontal bar 206 communicates with the outside of the horizontal bar 206. Using the above-described connection structure, the ball screw works in conjunction with the first gear structure, so that the rotation of the first gear structure 217 can be prevented. That is, the first gear structure 217 only moves linearly along the horizontal bar 206, does not rotate around the horizontal bar 206, and functions as a support when the second gear structure 219 rotates.


As shown in FIGS. 3A, 3B and 4, the second gear structure 219 has a worm gear form, and, as the worm gear works in conjunction with the first gear structure 217, the gripper 210 can rotate around the horizontal bar 206. The reason why the second gear structure 219 has a worm gear form is that unnecessary motion can be prevented in a non-control state (an unpowered state, etc.), because the gripper 210 is subjected to high loads, for example, while moving various objects.


The third gear structure 220 receives driving force from the third motor 218, and functions to control the motions of the fingers 212 and 214 when the motion of picking up an object is performed.
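The mapping from the three motors to the gripper's three degrees of freedom can be summarized in a short sketch. The ball-screw pitch, worm-gear ratio and finger-joint ratio used here are purely illustrative assumptions; only the assignment of motors 215, 216 and 218 to linear travel, rotation and gripping comes from the description above.

from dataclasses import dataclass

@dataclass
class GripperDrive:
    ball_screw_pitch_mm: float = 5.0     # linear travel per ball-screw revolution (assumed)
    worm_gear_ratio: float = 40.0        # motor revolutions per gripper revolution (assumed)
    finger_deg_per_rev: float = 10.0     # finger-joint angle per third-motor revolution (assumed)

    def linear_move(self, distance_mm: float) -> float:
        """First motor (215): ball-screw revolutions needed to travel distance_mm along the bar."""
        return distance_mm / self.ball_screw_pitch_mm

    def rotate(self, angle_deg: float) -> float:
        """Second motor (216): revolutions needed to rotate the gripper by angle_deg around the bar."""
        return (angle_deg / 360.0) * self.worm_gear_ratio

    def grip(self, closing_deg: float) -> float:
        """Third motor (218): revolutions needed to close the fingers by closing_deg."""
        return closing_deg / self.finger_deg_per_rev

# Usage sketch: GripperDrive().rotate(180) gives the second-motor command for turning
# the gripper half-way around the horizontal bar.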


The sensors attached to the mattress recognize the positional state of the user on the bed, and the vertical bars (202 shown in FIG. 2) move upon the performance of a necessary motion, in which case the moving distances of the right and left vertical bars (202 shown in FIG. 2) may not be the same (see FIG. 7B). According to the present invention, the length of the horizontal bar (206 shown in FIG. 2) is automatically changed in response to a change in the distance between the vertical bars (202 shown in FIG. 2) based on the relative positions of the vertical bars (202 shown in FIG. 2), which have moved. As an example, the horizontal bar (206 shown in FIG. 2) may be configured such that the length thereof can be automatically changed around a central separation element (211 shown in FIG. 4) (see FIG. 4).
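Under the simple geometric assumption that the horizontal bar spans diagonally between the two vertical bars when they sit at different positions along their rails, the required bar length can be computed as sketched below; the bed-width value is a hypothetical example.

import math

def required_horizontal_bar_length(left_pos_mm: float, right_pos_mm: float,
                                   bed_width_mm: float = 900.0) -> float:
    """Length needed when the left and right vertical bars sit at positions
    left_pos_mm and right_pos_mm along their respective transfer rails."""
    offset = right_pos_mm - left_pos_mm          # longitudinal mismatch (compare FIG. 7B)
    return math.hypot(bed_width_mm, offset)      # equals the bed width when offset == 0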



FIG. 4 shows an example in which grippers are mounted on the robot arm of the intelligent bed robot according to the present invention. That is, FIG. 4 is a perspective view showing parts of the vertical bars and the horizontal bar equipped with the grippers.


From FIG. 4, it can be found that the present invention provides a structure in which the vertical bar 202 is connected to both ends of the horizontal bar 206 and the grippers 210 can move linearly along the horizontal bar 206 and rotate around the horizontal bar 206.



FIGS. 5A and 5B are enlarged photos showing the portion of FIG. 4, in which a gripper 210 is attached to the horizontal bar 206, and a longitudinal hole 206-1 or 206-2, which is formed in the horizontal bar 206 along the length thereof, respectively. Referring to FIGS. 5A and 5B, the robot arm of the intelligent bed robot according to the present invention has two grippers 210(a) and 210(b). Each of the grippers has 3 Degrees of Freedom (3 DOF). One degree of freedom is used to move along the horizontal bar 206, another degree of freedom is used to rotate around the horizontal bar 206, and the remaining degree of freedom is used to control the work of picking up. In this case, respective degrees of freedom are controlled by the first, second and third motors (215, 216 and 218 shown in FIGS. 3A and 3B), which have been described in conjunction with FIGS. 3A and 3B.


Although the existing intelligent bed robots function to support the body weight of users or to provide shelves for meals, the function of picking up objects using one or more grippers is added to the intelligent bed robot of the present invention, so that the intelligent bed robot of the present invention actively assists a user in his or her physical activities.



FIGS. 6A and 6B show the torque sensors that are attached to the vertical bar. Referring to FIG. 6A, a transfer platform, which enables the robot arm to move across the entire area of the bed along the transfer rails, is connected to the lower end of the support bar, and two Direct Current (DC) motors, which are installed on the two sides of the bed, drive the transfer platforms.


Two torque sensors are attached to each transfer platform, and measure horizontal and vertical forces that are applied to the vertical bar by the user.


If the force applied to the vertical bar has only a horizontal component fx, torques τA and τB, which are measured by respective sensors, have the same direction. In contrast, the vertical component fy of a force applied to the vertical bar causes torques τA and τB, which are measured by respective sensors, to have opposite directions.
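This sign relationship allows the two force components to be separated from the two torque readings. A minimal sketch is given below, assuming hypothetical, known lever arms a and b between each sensor and the point where the user applies force; the patent itself does not specify this calculation.

def decompose_force(tau_a: float, tau_b: float, a: float, b: float):
    """Return (fx, fy) from the torques tau_A and tau_B, where the horizontal
    component fx produces same-direction torques and the vertical component fy
    produces opposite-direction torques, as described above."""
    fx = (tau_a + tau_b) / (2.0 * a)   # common-mode part: horizontal force
    fy = (tau_a - tau_b) / (2.0 * b)   # differential part: vertical force
    return fx, fy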


The robot arm attached to the intelligent robot of the present invention can operate both in a follow mode and in a support mode. The follow mode is an operation mode in which the user controls the robot arm using commands so as to bring a support means into position, while the support mode is an operation mode in which the user rests and supports his or her body on the secured support means.


In the follow mode, a user can move the robot arm to a desired position with reference to data detected by the above-described torque sensors. In contrast, in the support mode, the body weight of the user can be supported without the influence of data detected by the torque sensors, so that the user can rest his or her body on a support means that is provided by the robot arm.


As an example, when a user desires to change his or her posture on a bed, how the follow mode and the support mode are performed is described below.


First, a user locates the horizontal bar near his or her chest using a voice command. In this case, the position of the user is detected by the pressure sensors attached to the mattress, and the horizontal bar moves to a location suitable for the user based on the results of the detection (see FIGS. 7A and 7B).


However, the location of the horizontal bar that is determined from the desired position and the measured data varies with the characteristics of the individual user.


Accordingly, in the follow mode, the user can adjust the location and height of the horizontal bar by pushing or pulling the horizontal bar while holding it. In this case, the force applied to the horizontal bar is detected by the torque sensors, and respective vertical bars are moved. Finally, when the user is placed at a desired position, the process may proceed to the support mode. The user may switch the operation mode from the follow mode to the support mode using a voice command.


In the support mode, the robot arm operates based on kinematics control. The bed system estimates the position of the user, and then controls the robot arm based on a scheduled path.


In contrast, in the follow mode, the user's intention is detected through the torque sensors. Referring to FIGS. 6A and 6B, when the user pushes or pulls the horizontal bar, the robot arm measures the variations in the torque sensor readings, and then controls the motor A and the motor B. After the user's intention has been detected through the torque sensors, the position desired by the user is calculated with reference to the combination of the measurement results of the torque sensors.


The relationship between the torque sensor readings and the user's intention is intuitive, but the structure of the robot arm system is not easily described mathematically. Accordingly, it is preferable to use a fuzzy network to find the user's intention. The following Table 1 and Table 2 are rule tables that are used for the fuzzy network, where P indicates ‘positive’, N indicates ‘negative’, L indicates ‘large’, S indicates ‘small’, and ZO indicates ‘zero’. Thus, the motor A and the motor B are controlled according to the rule tables.











TABLE 1
(Motor A)

  τA \ τB    PL    PS    ZO    NS    NL
  PL         ZO    ZO    PS    PL    PL
  PS         ZO    ZO    PS    PS    PL
  ZO         NS    NS    ZO    PS    PS
  NS         NL    NS    NS    ZO    ZO
  NL         NL    NL    NS    ZO    ZO


TABLE 2
(Motor B)

  τA \ τB    PL    PS    ZO    NS    NL
  PL         PL    PL    PS    ZO    ZO
  PS         PL    PS    PS    ZO    ZO
  ZO         PS    PS    ZO    NS    NS
  NS         ZO    ZO    NS    NS    NL
  NL         ZO    ZO    NS    NL    NL


Using Table 1 and Table 2, the robot arm can be moved to a desired location through a simple computation based on the signs and magnitudes of the signals.
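A crisp-lookup sketch of these rule tables is given below: the measured torques are first mapped to the linguistic labels (NL, NS, ZO, PS, PL), and Tables 1 and 2 then give the command labels for the motor A and the motor B. The numeric thresholds are assumptions, and a full fuzzy network would additionally blend neighbouring rules by membership degree rather than choosing a single label.

TABLE_1 = {   # motor A command, indexed by (tau_A label, tau_B label)
    "PL": {"PL": "ZO", "PS": "ZO", "ZO": "PS", "NS": "PL", "NL": "PL"},
    "PS": {"PL": "ZO", "PS": "ZO", "ZO": "PS", "NS": "PS", "NL": "PL"},
    "ZO": {"PL": "NS", "PS": "NS", "ZO": "ZO", "NS": "PS", "NL": "PS"},
    "NS": {"PL": "NL", "PS": "NS", "ZO": "NS", "NS": "ZO", "NL": "ZO"},
    "NL": {"PL": "NL", "PS": "NL", "ZO": "NS", "NS": "ZO", "NL": "ZO"},
}
TABLE_2 = {   # motor B command, indexed by (tau_A label, tau_B label)
    "PL": {"PL": "PL", "PS": "PL", "ZO": "PS", "NS": "ZO", "NL": "ZO"},
    "PS": {"PL": "PL", "PS": "PS", "ZO": "PS", "NS": "ZO", "NL": "ZO"},
    "ZO": {"PL": "PS", "PS": "PS", "ZO": "ZO", "NS": "NS", "NL": "NS"},
    "NS": {"PL": "ZO", "PS": "ZO", "ZO": "NS", "NS": "NS", "NL": "NL"},
    "NL": {"PL": "ZO", "PS": "ZO", "ZO": "NS", "NS": "NL", "NL": "NL"},
}

def to_label(torque: float, small: float = 1.0, large: float = 3.0) -> str:
    """Crisp labelling of a torque value (the thresholds are illustrative)."""
    if torque <= -large:
        return "NL"
    if torque <= -small:
        return "NS"
    if torque < small:
        return "ZO"
    if torque < large:
        return "PS"
    return "PL"

def motor_commands(tau_a: float, tau_b: float) -> tuple[str, str]:
    """Return the (motor A, motor B) command labels for the measured torques."""
    la, lb = to_label(tau_a), to_label(tau_b)
    return TABLE_1[la][lb], TABLE_2[la][lb]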



FIGS. 7A and 7B show different locations of the horizontal bar and the vertical bars, comparing the case where the length of the horizontal bar is identical to that of the minor axis of the bed with the case where the length of the horizontal bar is longer than that of the minor axis of the bed.


The intelligent bed robot according to the present invention may be implemented such that the length of the horizontal bar varies with the relative locations of the vertical bars. When data are received from the sensors attached to the mattress and the vertical bars move to locations suitable for the user, the length of the horizontal bar may vary with the relative locations of the vertical bars. As a result of the movement of the robot arm based on the sensor measurements, the horizontal bar assumes its shortest length in the case of FIG. 7A, while it becomes longer than the minor axis of the bed in the case of FIG. 7B.



FIG. 8 is a diagram showing the control system of the intelligent bed robot according to the present invention.


The intelligent bed robot according to the present invention has 10 DC motors for operating the robot arm and 2 Alternating Current (AC) motors for raising an object. Accordingly, it is preferred that the control system be based on a Controller Area Network (CAN) so as to control these devices simultaneously. Referring to FIG. 8, the 10 DC motors installed in the vertical bars and the grippers are connected to the CAN, and the 2 AC motors are controlled via RS232 using a microcontroller. The microcontroller stores the data that are detected by the pressure sensors attached to the mattress.
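As a rough sketch of how a DC motor set-point might be sent over the CAN bus, the snippet below uses the python-can package; the arbitration-ID scheme and the payload layout are assumptions, since the patent does not specify a CAN protocol.

import struct
import can  # pip install python-can

def send_motor_setpoint(bus: can.BusABC, node_id: int, velocity: int) -> None:
    """Send a signed 16-bit velocity set-point to the motor controller at node_id."""
    msg = can.Message(
        arbitration_id=0x200 + node_id,     # hypothetical ID scheme
        data=struct.pack("<h", velocity),
        is_extended_id=False,
    )
    bus.send(msg)

# Usage sketch:
# bus = can.interface.Bus(channel="can0", bustype="socketcan")
# send_motor_setpoint(bus, node_id=3, velocity=150)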


The present invention provides an intelligent bed robot equipped with one or more grippers, which analyzes the posture and action of a bed user and is provided with a robot arm having multiple degrees of freedom.


Furthermore, the present invention provides an intelligent robot, in which one or more grippers capable of picking up objects are added to a conventional robot equipped with a pressure sensor-provided mattress and a supporting robot arm, thereby providing more familiar and convenient services to a user.


Moreover, the present invention provides an intelligent bed robot that can be used to perform health monitoring and evaluate a rehabilitation procedure.


Although the preferred embodiments of the present invention have been disclosed for illustrative purposes, those skilled in the art will appreciate that various modifications, additions and substitutions are possible, without departing from the scope and spirit of the invention as disclosed in the accompanying claims.

Claims
  • 1. An intelligent bed robot, comprising: a pressure sensor-provided mattress configured to monitor a position, posture and motion of a user on a bed in real time, and to assist the user; an intelligent robot arm configured to comprise vertical bars disposed on two opposite sides of the bed and configured to have a predetermined length, a horizontal bar configured to connect the vertical bars to each other, and torque sensors disposed at lower ends of the vertical bars, and to measure horizontal and vertical forces applied by the user; transfer rails configured to guide the intelligent robot arm along a path of movement; and at least one gripper coupled to the horizontal bar of the intelligent robot arm so that it can move horizontally along the horizontal bar and rotate around the horizontal bar, and provided with a finger unit capable of picking up an object.
  • 2. The intelligent bed robot as set forth in claim 1, further comprising a control box for reading pressure values of the pressure sensors by sequentially scanning the pressure sensors using a multiplexer, and delivering data about the pressure values to a main computer in a pressure measurement distribution image form.
  • 3. The intelligent bed robot as set forth in claim 1, wherein the robot arm has two operation modes, including a follow mode, in which the user can control the robot arm using his or her command so as to obtain support means using the robot, and a support mode, in which the user's body is secured and supported using the acquired support means.
  • 4. The intelligent bed robot as set forth in claim 3, wherein the follow mode conducts work using a fuzzy network.
  • 5. The intelligent bed robot as set forth in claim 1, wherein the horizontal bar has a central separation element so that a length of the horizontal bar can vary with relative locations of the vertical bars.
  • 6. The intelligent bed robot as set forth in claim 1, wherein the pressure sensor-provided mattress monitors a person on a bed using a Principal Component Analysis (PCA) algorithm, the PCA algorithm performing: a first step of determining a human's recognition target pattern and the related pressure sensor data for pattern classification; a second step of extracting feature points by calculating covariance of the pressure sensor data; a third step of determining a number of feature points capable of maximizing information classification between respective pieces of pattern data; and a fourth step of performing pattern recognition by comparing new incoming pressure sensor data with predetermined feature points.
  • 7. The intelligent bed robot as set forth in any one of claims 1 to 6, wherein the robot arm works in a command mode of conducting operations in response to the user's command, which includes user's voice, and in an automatic mode of conducting work according to a predetermined work command.
  • 8. The intelligent bed robot as set forth in claim 1, wherein: the horizontal bar has a ball screw disposed therein along a length of the horizontal bar; and the gripper includes: a first gear structure provided with threads so that the first gear structure can guide the gripper through linear movement in conjunction with the ball screw; a second gear structure formed of a worm gear that works in conjunction with the first gear structure so that the gripper can rotate around the horizontal bar; and a third gear structure configured such that the finger unit can conduct work of picking up an object.
  • 9. The intelligent bed robot as set forth in claim 8, wherein the ball screw and the first gear structure work in conjunction with each other along a longitudinal hole that is formed along the length of the horizontal bar.
  • 10. The intelligent bed robot as set forth in claim 1, wherein the finger unit has a shape of a human hand, includes a stationary first finger and a movable second finger, and performs work of gripping an object through overlapping of the first and second fingers.
  • 11. The intelligent bed robot as set forth in claim 10, wherein the first finger is provided with an accommodation part so that the first finger can accommodate the second finger when the first finger and the second finger overlap each other.
  • 12. The intelligent bed robot as set forth in claim 10, wherein the first finger and the second finger have respective rubber portions that come into contact with an object.
  • 13. The intelligent bed robot as set forth in claim 1, wherein the intelligent bed robot is controlled using a Controller Area Network (CAN) device.
Priority Claims (1)
Number: 10-2006-0094678    Date: Sep 2006    Country: KR    Kind: national