The present disclosure generally relates to robots, and particularly to a smart robotic walking assistant that can provide walking assistance and training and a method for controlling the robotic walking assistant.
Walking is one of the most important abilities that enable people to remain independent and healthy throughout their lives. Unfortunately, there are numerous people who lose their walking ability because of accidents or diseases. As society ages, the number of seniors who suffer from walking dysfunctions grows rapidly. Additionally, older people have the highest risk of death or serious injury arising from a fall and the risk increases with age.
Recent advances in robotics provide an innovative solution to alleviate these challenges by improving the quality of life of the elderly and prioritizing their dignity and independence. As such, robotic walking assistants have attracted significant attention in recent years. One type of robotic walking assistant can be designed to help support a portion of the user's bodyweight to reduce the load on the user's legs while walking, leading to reduced fatigue and less physical exertion. For example, robotic walking assistants typically include wheels for movement and a vertical body having handles that allow users to push the robotic walking assistants while walking.
However, because of the fixed nature of the wheels and the vertical body, these robotic walking assistants may lack sufficient stability when they provide a seat for users to sit on. In addition, these robotic walking assistants may suffer from the problem that people with a large stride tend to kick the back of the robotic walking assistants while walking.
Therefore, there is a need to provide a robotic walking assistant and a method for controlling the robotic walking assistant to overcome the above-mentioned problems.
Many aspects of the present embodiments can be better understood with reference to the following drawings. The components in the drawings are not necessarily drawn to scale, the emphasis instead being placed upon clearly illustrating the principles of the present embodiments. Moreover, in the drawings, all the views are schematic, and like reference numerals designate corresponding parts throughout the several views.
The disclosure is illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like reference numerals indicate similar elements. It should be noted that references to “an” or “one” embodiment in this disclosure are not necessarily to the same embodiment, and such references can mean “at least one” embodiment.
Although the features and elements of the present disclosure are described as embodiments in particular combinations, each feature or element can be used alone or in other various combinations within the principles of the present disclosure to the full extent indicated by the broad general meaning of the terms in which the appended claims are expressed.
In one embodiment, the robotic walking assistant 100 may include a wheeled base 10, a body 20 positioned on the wheeled base 10, an elevation mechanism 30 (see
With reference to
In one embodiment, the body 20 is positioned on the top of the wheeled base 10 and disposed in a vertical direction. The body 20 includes at least one handle 21. A user may hold the at least one handle 21 while walking/standing, which allows the robotic walking assistant 100 to provide an upward support force to the user, thereby helping the user to maintain balance during his/her walking/standing. The robotic walking assistant 100 is like a walking cane with the at least one handle 21, which can help ensure the stability of the user while walking.
In one embodiment, the elevation mechanism 30 is connected between the wheeled base 10 and the body 20. Referring to
In one embodiment, the robotic walking assistant may include sensors that enable the robotic walking assistant 100 to perceive the environment where the robotic walking assistant 100 operates. In one embodiment, the sensors may include ranging sensors that require no physical contact with objects being detected. They allow the robotic walking assistant 100 to perceive an obstacle without actually having to come into contact with it. As shown in
The control system 40 (see
The wheeled base 10 may be a differential drive platform, in one example. With reference to
In one embodiment, the base 11 may include a base body 110 (see
The wheel mechanisms 12 are respectively connected to the distal ends of the output shafts 143. In the embodiment, each output shaft 143 (see
Referring to
When the two wheels 122 and the wheel 133 are in contact with the surface S, three support points are formed between the wheels 122, 133 and the surface S. For example, when the wheel mechanisms 12 are in the retracted positions, two support points A (see
Since the wheels 122 can move with respect to the base 11, the distances between the wheels 122, 133 are adjustable. Specifically, as shown in
The robotic walking assistant 100 as described in embodiments above is a machine that stands on a triangular footprint and has an adjustable height. When the body 20 moves up and down or the robotic walking assistant 100 supports a portion of the bodyweight of a user pushing the robotic walking assistant 100 or sitting on a seat (which will be described later) of the robotic walking assistant 100, the center of gravity of the robotic walking assistant 100 is shifted. However, as long as the center of gravity of the robotic walking assistant 100 remains inside the supporting polygon formed by connecting the three support points between the wheels 122, 133 and the surface S, the robotic walking assistant 100 remains upright and will not tip over. Although the center of gravity of the robotic walking assistant 100 moves when the body 20 moves up or a user sits on the seat of the robotic walking assistant 100, the supporting polygon formed by connecting the three support points between the wheels 122, 133 and the surface S has a larger area after the wheels 122 move from the retracted positions to the extended positions, and the center of gravity of the robotic walking assistant 100 can still fall within the confines of the supporting polygon. Additionally, when the wheels 122 are moved to their extended positions, the distance between a user supported by the robotic walking assistant 100 and the back of the robotic walking assistant 100 is increased, compared to when the wheels 122 are moved to their retracted positions, which can prevent a user with a large stride from kicking the back of the robotic walking assistant 100.
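The tip-over criterion described above can be sketched as a simple point-in-triangle test. The following is an illustrative sketch only; the coordinates and the function names are assumptions for demonstration, not values from the disclosure.

```python
# Hypothetical sketch of the tip-over criterion: the robot remains
# upright while the projection of its center of gravity (CoG) lies
# inside the triangle formed by the three wheel contact points.

def sign(p, a, b):
    # Signed area term indicating which side of edge a-b point p lies on.
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def cog_inside_support_triangle(cog, p1, p2, p3):
    """Return True if point cog lies inside (or on) triangle p1-p2-p3."""
    d1, d2, d3 = sign(cog, p1, p2), sign(cog, p2, p3), sign(cog, p3, p1)
    has_neg = (d1 < 0) or (d2 < 0) or (d3 < 0)
    has_pos = (d1 > 0) or (d2 > 0) or (d3 > 0)
    return not (has_neg and has_pos)

# Extending the wheels enlarges the support triangle, so a rearward-shifted
# CoG that would tip the retracted robot is still supported (illustrative
# coordinates in meters).
retracted = [(-0.2, 0.0), (0.2, 0.0), (0.0, 0.4)]
extended = [(-0.3, -0.1), (0.3, -0.1), (0.0, 0.4)]
cog = (0.0, -0.05)
```

With these illustrative numbers, `cog_inside_support_triangle(cog, *retracted)` is false while `cog_inside_support_triangle(cog, *extended)` is true, mirroring why the extended positions improve stability.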
Referring to
Referring to
Referring to
In one embodiment, the seat 50 may include a seat cover 51 and a seat body 52 arranged within the seat cover 51. The seat body 52 is a planar structure and substantially square. Two opposite sides of the seat body 52 are rotatably connected to the inner frame 23. In one embodiment, two angled bars 233 are connected to the inner frame 23 and located above the wheels 122. Each angled bar 233 includes a horizontal bar 2331 protruding from one vertical bar 231 of the inner frame 23, and a vertical bar 2332. Two seat mounting members 24 are respectively fixed to the vertical bars 2332, and each include a vertical tab 241. The opposite sides of the seat body 52 are rotatably connected to inner sides 2411 of the vertical tabs 241. With such a configuration, the seat body 52 can be rotated to the folded position where the seat 50 is slightly inclined with respect to the body 20, and can be rotated to the unfolded position where the seat 50 is substantially perpendicular to the body 20.
In one embodiment, a seat motor 53 is fixed to the outer side of one vertical tab 241, and is configured to actuate rotational movement of the seat body 52. The seat motor 53 can be a rotary DC motor that directly drives the seat body 52 to rotate. In another embodiment, a transmission mechanism can be arranged between the seat motor 53 and the seat body 52 to transmit rotary motion from the seat motor 53 to the seat body 52. In one embodiment, a limit switch may be arranged on the seat body 52 and the vertical tab 241. After the seat body 52 moves to the folded/unfolded positions, the limit switch is activated and the control system 40 stops rotation of the seat 50 according to signals from the limit switch. The limit switch may be a mechanical, optical, or magnetic limit switch. In one embodiment, a stop member may be fixed to the seat body 52, and a groove may be defined in the vertical tab 241 adjacent to the stop member. An end of the stop member is received in the groove and slides in the groove when the seat body 52 rotates. When the stop member comes into contact with one of the opposite ends of the groove, the rotation of the seat body 52 is stopped.
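The limit-switch stop logic described above can be sketched in a few lines. This is a minimal illustration; the signal names and the +1/-1 command convention are assumptions, not from the disclosure.

```python
# Hypothetical sketch of the limit-switch logic: the control system
# forces the seat motor command to zero once the switch at the
# commanded end of travel is activated.

def seat_motor_command(direction, folded_switch, unfolded_switch):
    """direction: +1 to unfold, -1 to fold, 0 to hold.
    Returns the motor command, forced to 0 at either end of travel."""
    if direction > 0 and unfolded_switch:   # reached unfolded position
        return 0
    if direction < 0 and folded_switch:     # reached folded position
        return 0
    return direction
```

In this sketch the motor keeps turning only while the switch at the commanded end of travel has not yet been triggered, matching the behavior the control system 40 derives from the limit-switch signals.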
Referring to
In one embodiment, two actuator mounting members 28 are fixed to the inner frame 23 of the body 20 and the motor mounting members 25. The actuator mounting members 28 are disposed at opposite sides of the seat body 52, under the motor mounting members 25, and opposite the two armrests 60. Two linear actuators 61 are fixed to the actuator mounting members 28. In one embodiment, each linear actuator 61 may include a motor 62, a tube 63, and an output shaft 64 that is slidably connected to the tube 63. Via actuation of the motor 62, the output shaft 64 can slide with respect to the tube 63. The armrests 60 are respectively rotatably connected to the distal ends of the output shafts 64. When the output shafts 64 slide with respect to the tubes 63, the armrests 60 are pushed by the output shafts 64 and can thus rotate with respect to the armrest mounting members 27.
Referring to
Referring to
In one embodiment, the range of motion of the camera 71 can be set to 180 degrees. Since the camera 71 is rotatable and can move up and down together with the body 20, the camera can have a large field of view (FOV). In addition, a visual servoing algorithm could be adopted to enable the camera to track certain objects.
Referring to
The robotic walking assistant 100 further includes a base motion controller 101 electrically connected to the processor 41, foot motor drivers 153, wheel motor drivers 102, wheel mechanism motor drivers 103, and an elevation motor driver 104 that are electrically connected to the base motion controller 101. The foot motor drivers 153 are configured to drive the motors 151 of the actuated feet 15. The wheel motor drivers 102 are configured to drive the motors 1201 that are configured to actuate rotational movement of the wheels 122. The wheel mechanism motor drivers 103 are configured to drive the motors 141 that are configured to actuate movement of the wheel mechanisms 12. The elevation motor driver 104 is configured to drive the motor 31 of the elevation mechanism 30.
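Because the wheeled base 10 is described above as a differential drive platform, the wheel motor drivers 102 ultimately receive separate speed commands for the two wheels 122. A common way to derive those commands, shown here only as a hedged sketch, is standard differential-drive kinematics; the wheel radius and track width values are illustrative assumptions, not dimensions from the disclosure.

```python
# Hypothetical differential-drive kinematics for a base like the
# wheeled base 10. wheel_radius and track_width are assumed example
# values in meters.

def wheel_speeds(v, omega, wheel_radius=0.08, track_width=0.45):
    """Convert a body velocity command (v in m/s, omega in rad/s)
    into left/right wheel angular speeds (rad/s)."""
    v_left = v - omega * track_width / 2.0   # linear speed of left wheel
    v_right = v + omega * track_width / 2.0  # linear speed of right wheel
    return v_left / wheel_radius, v_right / wheel_radius

# Driving straight commands equal wheel speeds; turning in place
# commands equal and opposite wheel speeds.
left, right = wheel_speeds(0.5, 0.0)
```

A base motion controller of this kind would then forward the two wheel speeds to the respective motor drivers, while the actuated feet and wheel mechanism motors are handled by their own drivers as described above.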
The robotic walking assistant 100 further includes a body motion controller 301 electrically connected to the processor 41, a seat motor driver 501, a camera motor driver 713, armrest motor drivers 601, and handle motor drivers 210 that are electrically connected to the body motion controller 301. The seat motor driver 501 is configured to drive the seat motor 53 of the seat 50. The camera motor driver 713 is configured to drive the motor 711. The armrest motor drivers 601 are configured to drive the motors 62. The motor drivers 210 are configured to drive the motors 215.
Referring to
In one embodiment, the robotic walking assistant 100 further includes a power system 81 that powers all key components of the robotic walking assistant 100. The power system 81 is mounted in the wheeled base 10, and may include a battery management system (BMS), one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of electrical power. The power system 81 may further include a self-charging unit that can be engaged with a docking charging station in a fixed location, which allows the robotic walking assistant 100 to be charged. The battery management system manages a rechargeable battery, such as by protecting the battery from operating outside its safe operating area, monitoring its state, calculating secondary data, reporting that data, controlling its environment, authenticating it and/or balancing it.
In one embodiment, the robotic walking assistant 100 may further include a front display 82 and a rear display 83. The front display 82 and the rear display 83 may each be a touch-sensitive display device that provides an input interface and an output interface between the robotic walking assistant 100 and a user. The front display 82 and the rear display 83 display visual output to the user. The visual output may include graphics, text, icons, video, and any combination thereof. In one embodiment, the front display 82 faces the front of the robotic walking assistant 100 to display general information, or allow telepresence of a user who is not actively using the walking function. The rear display 83 can display walking-related information.
In one embodiment, the robotic walking assistant 100 may further include a speaker 84 and a microphone 85 that provide an audio interface between a user and the robotic walking assistant 100. The microphone 85 receives audio data and converts the audio data to an electrical signal that is transmitted as a command to the control system 40. The speaker 84 converts the electrical signal to human-audible sound waves. The speaker 84 and the microphone 85 enable voice interaction between a user and the robotic walking assistant. The speaker 84 may play music or other audio contents to users for entertainment purposes. The robotic walking assistant 100 may further include wireless communication interfaces 86, such as WIFI and BLUETOOTH modules. The robotic walking assistant 100 may further include an NFC subsystem 89 that may include an NFC chip and an antenna that communicates with another device/tag, which allows the NFC subsystem 89 to have an NFC reading function. The NFC subsystem 89 can be used for authorization purposes. That is, the NFC subsystem 89 can serve as a security mechanism to determine user privileges or access levels related to system resources.
It should be noted that
Step S101: Receive command instructions. The processor 41 of the control system 40 receives command instructions. For example, the processor 41 may receive a command instruction from a user (e.g., care seeker) that requests the robotic walking assistant 100 to fetch an object from one location and deliver the object to another location.
Step S201: Move the wheeled base 10 in response to a first command instruction. The processor 41 may analyze each command instruction and move the wheeled base 10 to a determined location in response to a first command instruction. The first command instruction may include descriptions of locations where the robotic walking assistant 100 needs to reach. For example, when a user (e.g., care seeker) requests the robotic walking assistant 100 to fetch and deliver an object, the first command instruction may include descriptions of a starting location where the object is stored and a target location where the object needs to be delivered. The processor 41 may execute software programs and/or sets of instructions stored in the storage 42 to perform localization, motion planning, and trajectory tracking such that the wheeled base 10 can determine its real-time position in a known map during movement along a planned path. If there is a dynamic obstacle on the planned path, the processor 41 can plan a new path to avoid the obstacle. In other words, the wheels 122 may be controlled to follow a prescribed path, which will be adjusted if there are obstacles on the path. The wheeled base 10 can autonomously move first to the starting location and then to the target location. Additionally, the wheels 122 can be controlled with commands on the screen or with control inputs inferred from the handles, which may be equipped with load cells. This allows a user to directly control movement of the wheels 122.
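One plausible way to infer control inputs from load cells in the handles, sketched here only as an illustration, is an admittance-style mapping: a symmetric push drives the base forward and an asymmetric push steers it. The gains, force units, and velocity limits below are assumptions, not values from the disclosure.

```python
# Hypothetical admittance-style mapping from handle load-cell readings
# to a base velocity command. Gains and limits are assumed example
# values.

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def velocity_from_handle_forces(f_left, f_right,
                                gain_v=0.01, gain_w=0.05,
                                v_max=0.8, w_max=1.0):
    """f_left/f_right: forward push force (N) on each handle.
    Returns (v, omega): linear (m/s) and angular (rad/s) commands."""
    push = f_left + f_right    # symmetric push drives forward
    twist = f_right - f_left   # asymmetric push steers
    v = clamp(gain_v * push, 0.0, v_max)      # walker only rolls forward
    omega = clamp(gain_w * twist, -w_max, w_max)
    return v, omega

v, w = velocity_from_handle_forces(20.0, 20.0)  # even push: straight ahead
```

Clamping the linear command at zero means a pull on the handles simply stops the base rather than reversing it, which is one conservative design choice for a walking aid.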
Step S301: Move the wheel mechanisms 12 with respect to the base 11 in response to a second command instruction. The processor 41 may analyze each command instruction and move the wheel mechanisms 12 to the retracted positions or the extended positions according to the second command instruction. The processor 41 may receive the second command instruction from a user (e.g., care seeker) to move the wheel mechanisms 12 to the extended positions such that the user can grab the handles 21 and push the robotic walking assistant 100, or the user can sit on the seat 50. Additionally, the processor 41 may move the wheel mechanisms 12 to the retracted positions when certain conditions are met, for example when the robotic walking assistant 100 moves to the determined position and there is no further physical task.
Step S401: Rotate the seat 50 in response to a third command instruction. The processor 41 may analyze each command instruction and rotate the seat 50 to the folded or unfolded position according to the third command instruction. The processor 41 may receive the third command instruction from a user (e.g., care seeker) to rotate the seat 50 to the unfolded position such that the user can sit on the seat 50. The processor 41 may receive the third command instruction from the user to rotate the seat 50 back to the folded position such that the robotic walking assistant 100 is ready to be pushed by the user. Additionally, the processor 41 may rotate the seat 50 when certain conditions are met. For example, when the processor 41 determines that the user is tired according to the output from the camera 71, the processor 41 can rotate the seat 50 to the unfolded position such that the user can sit on the seat 50.
Step S501: Rotate the armrests 60 in response to a fourth command instruction. The processor 41 may analyze each command instruction and rotate the armrests 60 to the folded or unfolded positions according to the fourth command instruction. The processor 41 may receive the fourth command instruction from a user (e.g., care seeker) to rotate the armrests 60 to the unfolded positions such that the user can put his/her arms on the armrests 60 when the user sits on the seat 50. Additionally, the processor 41 may rotate the armrests 60 when certain conditions are met. For example, when the seat 50 has been rotated to the unfolded position, the processor 41 rotates the armrests 60 to the unfolded positions; when the seat 50 has been rotated to the folded position, the processor 41 rotates the armrests 60 to the folded positions. The armrests 60 and the seat 50 can be rotated simultaneously to their folded positions or unfolded positions. However, they can be controlled to rotate separately when needed.
Step S601: Move the handles 21 in response to a fifth command instruction. The processor 41 may analyze each command instruction and move the handles 21 according to the fifth command instruction. The processor 41 may receive the fifth command instruction from a user (e.g., care seeker) to move the handles 21 to the extended positions such that the user can grab the handles 21 to push the robotic walking assistant 100 while walking. Additionally, the processor 41 may move the handles 21 when certain conditions are met. For example, when the wheel mechanisms 12 are moved to their extended positions, the processor 41 moves the handles 21 to the extended positions; when the wheel mechanisms 12 are moved to their retracted positions, the processor 41 moves the handles 21 to their retracted positions.
Step S701: Rotate the camera 71 in response to a sixth command instruction. The processor 41 may analyze each command instruction and rotate the camera 71 according to the sixth command instruction. For example, the processor 41 may receive a command instruction from a user (e.g., care seeker) and control the robotic walking assistant 100 to move autonomously between determined positions. In this scenario, the processor 41 rotates the camera 71 to face forward to detect objects in front of the robotic walking assistant 100 such that the robotic walking assistant 100 can perceive the environment. When the processor 41 receives a command instruction from a user (e.g., care seeker) requesting the robotic walking assistant 100 to provide assistance while the user is walking, the processor 41 rotates the camera 71 to face backward to detect the facial expressions or other biometric characteristics of the user. As a result, the robotic walking assistant 100 can monitor the tiredness of the user.
Step S801: Control the elevation mechanism 30 to move the body 20 up and down in response to a seventh command instruction. The processor 41 may analyze each command instruction and control the elevation mechanism 30 to move the body 20 up and down in response to the seventh command instruction. For example, the processor 41 may receive a command instruction from a user (e.g., care seeker) and control the robotic walking assistant 100 to move autonomously between determined positions. In this scenario, the processor 41 controls the elevation mechanism 30 to move the body 20 down to the retracted position such that the robotic walking assistant 100 can have a limited height, which facilitates stability during movement and travel of the robotic walking assistant 100. When the processor 41 receives a command instruction from a user (e.g., care seeker) requesting the robotic walking assistant 100 to provide assistance while the user is walking, the processor 41 can then determine the height of the user and move the body 20 up to an extended position according to the height of the user. In this scenario, the extended position is not a fixed position and may change depending on the height of the user. With such a configuration, the robotic walking assistant 100 can have the flexibility to adapt to users of different heights, which allows different users to walk and push the robotic walking assistant 100 in a substantially upright pose.
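Deriving the extended position from the user's height, as in step S801, might be sketched as follows. The wrist-height ratio and the travel limits of the elevation mechanism are illustrative assumptions; the disclosure does not specify these values.

```python
# Hypothetical sketch of height adaptation for step S801: the handle
# height is derived from the user's height and clamped to the assumed
# travel range of the elevation mechanism. The 0.53 ratio (roughly
# wrist level) and the 0.80-1.10 m limits are example values only.

def target_handle_height(user_height_m, ratio=0.53,
                         min_h=0.80, max_h=1.10):
    """Return a handle height (m) roughly at the user's wrist level,
    clamped to the elevation mechanism's travel."""
    h = user_height_m * ratio
    return max(min_h, min(max_h, h))
```

Because the result is clamped, very tall or very short users still receive a handle height the mechanism can actually reach, while users in between get a height proportional to their stature.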
In one embodiment, the robotic walking assistant 100 can operate in different modes. For example, as shown in
The robotic walking assistant 100 can operate in a third mode or standing assistive mode. In this mode, the wheel mechanisms 12 and the handles 21 are moved to their extended positions, which enables the robotic walking assistant 100 to serve as a stable structure where the user can grab the handles 21 and stand up from a sitting position. After the robotic walking assistant 100 in the first mode approaches the user who is sitting, the robotic walking assistant 100 can be switched to the third mode. When there is no physical task, the robotic walking assistant 100 in the third mode can be switched to the first mode. The robotic walking assistant 100 can operate in a fourth mode or walking assistive mode. In response to a walking assistive mode command instruction, the wheel mechanism 12 and the handles 21 are moved to their extended positions, the feet 152 are moved up away from the surface S, and the body 20 is moved up to an extended position according to the height of the user. In this mode, the robotic walking assistant 100 is ready to be pushed by the user and helps support a portion of the bodyweight of the user when the user is walking. After the robotic walking assistant 100 in the first mode approaches the user who is standing, the robotic walking assistant 100 can be switched to the fourth mode. When there is no physical task, the robotic walking assistant 100 in the fourth mode can be switched to the first mode.
The robotic walking assistant 100 can operate in a fifth mode or walking training mode. In response to a walking training mode command instruction, the wheel mechanisms 12 and the handles 21 are moved to their extended positions, the feet 152 are moved up away from the surface S, and the body 20 is moved up to an extended position according to the height of the user. In this mode, the robotic walking assistant 100 is ready to be pushed by the user and helps support a portion of the bodyweight of the user when the user is walking. After the robotic walking assistant 100 in the first mode approaches the user who is standing, the robotic walking assistant 100 can be switched to the fifth mode. When there is no physical task, the robotic walking assistant 100 in the fifth mode can be switched to the first mode. The difference between the walking training mode and the walking assistive mode is that the robotic walking assistant 100 in the walking training mode can exert extra resistance on the user so that he/she has to make extra efforts to push the robotic walking assistant forward or around, thus increasing muscle strength and coordination capability given enough training sessions. In one embodiment, the wheeled base 10 may further include brakes. When the robotic walking assistant is switched to the walking training mode, the processor 41 controls the brakes to press against the moving wheels 122 to create friction. In this case, the user needs to apply more pushing force to the robotic walking assistant 100, thereby increasing muscle strength and coordination capability given enough training sessions.
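The mode transitions described in this section can be summarized as a small state machine. The sketch below is illustrative only: the mode labels are shorthand for the numbered modes in the text, the transition table covers only the switches explicitly stated here, and a real controller would gate each switch on additional conditions (user commands, detected posture, pending tasks).

```python
# Hypothetical state machine over the operating modes. Only the
# transitions stated in the text are allowed: the first (autonomous)
# mode can enter the standing assistive, walking assistive, or
# walking training modes and each can return; the walking assistive
# mode can also switch to the rest mode and back.

TRANSITIONS = {
    "first": {"third", "fourth", "fifth"},  # autonomous mode
    "third": {"first"},                     # standing assistive mode
    "fourth": {"first", "sixth"},           # walking assistive mode
    "fifth": {"first"},                     # walking training mode
    "sixth": {"fourth"},                    # rest mode
}

def switch_mode(current, requested):
    """Return the new mode, or keep the current mode if the
    requested switch is not permitted by the transition table."""
    if requested in TRANSITIONS.get(current, set()):
        return requested
    return current
```

For example, a rest-mode request is honored from the walking assistive mode but ignored from the autonomous mode, matching the text's description that the user first walks with the assistant and then rests.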
The robotic walking assistant 100 can operate in a sixth mode or rest mode. In response to a rest mode command instruction, the wheel mechanisms 12 are moved to their extended positions, the feet 152 are moved down to be in contact with the surface S, and the seat 50 and the armrests 60 are rotated to their unfolded positions. The robotic walking assistant 100 is thus ready for the user to take a seat for rest. The robotic walking assistant 100 in the fourth mode can be switched to the sixth mode after receiving a command from the user or detecting that the user is tired. The robotic walking assistant 100 in the sixth mode can be switched to the fourth mode after receiving a command from the user. It should be noted that
The sixth scenario shows that the walking assistant 100 is switched to the rest mode such that the user can sit on the seat 50. The seventh scenario shows that the robotic walking assistant 100 continues to escort the user toward the destination after the user takes a break. The eighth scenario shows that the walking assistant 100 has detected obstacles/hazards in front of the walking assistant 100, and guides the user to walk around the obstacles/hazards. The walking assistant 100 may report the obstacles/hazards to the central platform. The ninth scenario shows that the robotic walking assistant 100 continues to escort the user until they reach the planned destination.
Step S171: Receive a walking schedule from the central platform. The processor 41 of the control system 40 receives the walking schedule from the central platform. In one embodiment, the walking schedule is created on the central platform by a healthcare professional. The schedule may include descriptions of start time of walk, duration of walk, starting location, destination location, walking route, location of the user, identifying information of the user, and the like.
Step S172: Move autonomously to a location of the user (e.g., a care seeker or a patient) according to the walking schedule. After step S171, the robotic walking assistant 100 is switched to the autonomous mode and moves toward the location of the user specified in the walking schedule.
Step S173: Locate and identify the user. In one embodiment, the robotic walking assistant 100 may locate and identify the user using face recognition technology.
Step S174: Request confirmation from the user about the walking schedule. The robotic walking assistant 100 may display the walking schedule on the front display 82, and may read out the walking schedule. The robotic walking assistant 100 may further provide one or more user interfaces for the user to accept or modify the walking schedule.
Step S175: Send a confirmation result to the central platform. After the user accepts or modifies the walking schedule, the robotic walking assistant 100 sends the confirmation result to the central platform.
Step S181: Move autonomously to a location of a user. In one embodiment, the robotic walking assistant 100 may move autonomously to the location of the user according to a pre-planned walking schedule or in response to command instruction from the user.
Step S182: Locate and identify the user. In one embodiment, the robotic walking assistant 100 may locate and identify the user using face recognition technology.
Step S183: Determine whether the user is standing. If the user is standing, the procedure goes to step S184.
Step S184: Switch the robotic walking assistant 100 to the walking assistive mode, with the body 20 moved up to an extended position. In one embodiment, the robotic walking assistant 100 may receive a user profile that includes the height of the user from the central platform. The body 20 may be moved up to the extended position according to the height of the user such that the handles 21 are at a comfortable height for the user. The robotic walking assistant 100 may further provide a user interface for the user to adjust the height of the handles 21. In this case, the processor 41 may control the elevation mechanism 30 to move the body 20 up/down according to a height value inputted by the user.
Step S186: Request confirmation from the user about a current walking event. In one embodiment, the walking schedule may include a number of walking events, and the robotic walking assistant 100 may determine a current walking event corresponding to the current time. The walking event may include descriptions of a destination, a walking route, duration of walk, etc. In another embodiment, the robotic walking assistant 100 may plan a walking route according to the destination specified in the walking schedule. The robotic walking assistant 100 may display the destination, the planned walking route, walking speed, and duration of walk on the front display 82. The robotic walking assistant 100 may further provide one or more user interfaces for the user to accept or modify the displayed parameters.
Step S187: Move toward the destination. After the user confirms or modifies the current walking event, the robotic walking assistant 100 escorts the user and moves toward the destination according to the accepted/modified walking event. In one embodiment, the robotic walking assistant 100 can move autonomously and guide the user to walk along a planned path toward the destination. In another embodiment, the robotic walking assistant 100 moves only when being pushed/pulled by the user. In this case, the rear display 83 may display navigation information to guide the user to walk along a planned path toward the destination.
If the user is not standing, the procedure goes to step S185. Step S185: Switch the robotic walking assistant 100 to the standing assistive mode. In this mode, the robotic walking assistant 100 can help the user to stand up. The procedure then goes to Step S184.
It should be appreciated that the above disclosure detailed several embodiments of the robotic walking assistant 100 that can provide walking assistance and fall prevention. As mentioned above, the robotic walking assistant 100 can be employed in assisted living facilities or healthcare facilities. However, the disclosure is not limited thereto. In other exemplary usage scenarios, the robotic walking assistant 100 may be used in hospitals.
With the configuration described above, the robotic walking assistant can promote an active lifestyle for elderly people. The robotic walking assistant can allow them to do more exercise to maintain their mobility. Moving around also provides more chances for elderly people to interact with other people (particularly in an elderly care facility or assisted living facility) so that they feel less isolated. The robotic walking assistant also has features to prevent falls. For instance, the robotic walking assistant will issue a tripping hazard warning to the user if it detects a water puddle or a slipper on the way.
Referring to
Step S191: Detect whether two hands of a user have held the two handles of the robotic walking assistant.
In one embodiment, each handle 21 may include a sensor to detect whether two hands of a user have held the two handles of the robotic walking assistant. For example, one electrocardiogram (ECG) sensor 77 may be embedded in each handle 21. The ECG sensors 77 measure the electrical activity of the heart of the user. After the two hands of the user hold the two handles 21, the two ECG sensors 77 will send signals to the control system 40. The control system 40 can then determine that the two hands have held the handles 21. It should be noted that other types of sensors (e.g., force sensors) may be used to detect whether two hands of a user have held the two handles of the robotic walking assistant.
In another embodiment, object recognition technology may be employed to determine whether two hands of a user have held the two handles of the robotic walking assistant. Specifically, the camera 71 may be rotated to face backward to capture images of the handles, and send the images to the control system 40. The control system 40 may perform object recognition based on these images to determine whether two hands of a user have held the two handles of the robotic walking assistant. Various object recognition algorithms are known and are not detailed here.
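The sensor-based grip check described above can be sketched as follows. This is an illustrative sketch, not the disclosed implementation; the function name, signal representation, and amplitude threshold are assumptions for illustration only.

```python
# Hypothetical sketch of the two-handle grip check: the control system 40
# treats the handles as held only when both embedded ECG sensors 77 report
# a valid signal. The threshold value is an assumed placeholder.
ECG_AMPLITUDE_THRESHOLD = 0.05  # assumed minimum level indicating skin contact

def both_handles_held(left_ecg_amplitude: float, right_ecg_amplitude: float) -> bool:
    """Return True only when both ECG sensors report a signal above the
    contact threshold, i.e. both hands are on the handles."""
    return (left_ecg_amplitude >= ECG_AMPLITUDE_THRESHOLD
            and right_ecg_amplitude >= ECG_AMPLITUDE_THRESHOLD)
```

The same two-input AND structure applies if force sensors are substituted for the ECG sensors, with a force threshold in place of the amplitude threshold.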
Referring to
Step S1911: Prompt the user to hold the two handles.
In one embodiment, the control system 40 may display a visual prompt (e.g., “Please hold your hands on the handles.”) on the rear display 83 to prompt the user to hold the two handles. The control system 40 may output an audio prompt to the user while displaying the visual prompt on the rear display 83.
Step S1912: Detect force exerted on the two handles to determine whether the two hands of the user have held the two handles.
In the embodiment, two force sensors embedded in the two handles 21 can detect the force exerted on the two handles by the hands of the user. If the output from the force sensors indicates that no hands hold the two handles 21 or that only one hand holds one of the handles, the procedure goes back to step S1911. If the output from the force sensors indicates that two hands have held the two handles 21, the procedure goes to step S192.
In another embodiment, the control system 40 may determine whether the two hands of the user have held the two handles based on output from other types of sensors, such as the ECG sensors 77, the camera 71, and the like.
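The prompt-and-detect loop of steps S1911 and S1912 can be sketched as below. The function names, the force threshold, and the retry limit are illustrative assumptions; the disclosure does not specify these values.

```python
# Hypothetical sketch of steps S1911/S1912: repeat the prompt until the force
# sensors in both handles 21 report a grip, then proceed (to step S192).
def wait_for_grip(read_forces, prompt, min_force=1.0, max_attempts=None):
    """read_forces() -> (left_force, right_force) in newtons (assumed units);
    prompt() displays/speaks the "please hold the handles" message.
    Returns True once both handles report force above min_force."""
    attempts = 0
    while True:
        prompt()
        left, right = read_forces()
        if left >= min_force and right >= min_force:
            return True  # both hands detected; continue to mode selection
        attempts += 1
        if max_attempts is not None and attempts >= max_attempts:
            return False  # give up after the assumed retry limit
```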
Step S192: Receive a command from the user to select an operation mode in response to detection of the two hands holding the two handles.
After determining that the two hands of the user have held the two handles 21, the control system 40 may display a user interface on the rear display 83. The rear display 83 may be a touch sensitive display and can receive a manual operation of the user on the display 83, which allows the user to select an operation mode of the robotic walking assistant. The operation mode may include a walking assistive mode, a walking training mode, and a static training mode. In this case, the rear display 83 may display user interface elements corresponding to the three operation modes. After detection of a touch operation on one of the user interface elements, the control system 40 may control the robotic walking assistant to operate in the selected operation mode.
In one embodiment, the control system 40 may use speech recognition to control the robotic walking assistant by voice. Voice commands are captured through the microphone 85 and processed by the control system 40, and the robotic walking assistant then acts accordingly. For example, the control system may extract a key word “walking assistive mode” from a voice command from the user, and control the robotic walking assistant to operate in the walking assistive mode. Accordingly, the control system 40 may receive a command to select a corresponding operation mode of the robotic walking assistant through the rear display 83 and the microphone 85.
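The keyword-based mode selection described above can be sketched as a simple lookup over the recognized transcript. This is an illustrative sketch; the mode identifiers and matching strategy are assumptions, and a real system would sit downstream of an actual speech recognizer.

```python
# Hypothetical sketch: map key words extracted from a recognized voice command
# to one of the three operation modes (walking assistive, walking training,
# static training). Mode identifier strings are assumed for illustration.
MODE_KEYWORDS = {
    "walking assistive mode": "walking_assistive",
    "walking training mode": "walking_training",
    "static training mode": "static_training",
}

def mode_from_transcript(transcript: str):
    """Return the operation mode whose key word appears in the recognized
    speech, or None if no known key word is found."""
    text = transcript.lower()
    for keyword, mode in MODE_KEYWORDS.items():
        if keyword in text:
            return mode
    return None
```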
Step S193: Control the wheeled base to move in response to a walking assistive mode being selected.
In one embodiment, in response to the selection of the walking assistive mode, the handles 21 are moved to their extended positions, the feet 152 are moved up away from the surface S, and the body 20 is moved up to an extended position according to the height of the user. In this mode, the robotic walking assistant 100 is ready to be pushed by the user and helps support a portion of the bodyweight of the user when the user is walking.
In one embodiment, the robotic walking assistant could be customized in shape/size to adapt itself to different users. This could be done by manual control, by voice control, or automatically based on the user's profile including but not limited to height, weight, gender, age, etc. After the customization of the configuration of the robotic walking assistant, the personalized configuration could be associated with the specific user and reused for the next session. In this embodiment, the robotic walking assistant may receive a profile corresponding to a specific user from a remote cloud database, and customize the shape/size of the robotic walking assistant accordingly.
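The profile-based customization above can be sketched as a mapping from a stored user profile to a body-height setpoint. The profile fields, the 0.55 height ratio, and the extension limits are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical sketch: derive the body/handle extension from a user profile
# fetched for a specific user. All numeric values are assumed placeholders.
from dataclasses import dataclass

@dataclass
class UserProfile:
    user_id: str
    height_cm: float
    # weight, gender, age, etc. omitted for brevity

def body_extension_for(profile: UserProfile,
                       min_extension_cm: float = 60.0,
                       max_extension_cm: float = 110.0) -> float:
    """Map the user's height to a handle/body extension, clamped to the
    mechanism's assumed travel range. The 0.55 ratio is illustrative."""
    target = 0.55 * profile.height_cm
    return max(min_extension_cm, min(max_extension_cm, target))
```

The computed setpoint would then be stored with the profile so the same configuration can be restored in the user's next session.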
Step S194: Provide resistance to at least one of the one or more wheels according to selection of the user, in response to a walking training mode being selected.
The walking training mode is similar to the walking assistive mode. In response to the selection of the walking training mode, the handles 21 are moved to their extended positions, the feet 152 are moved up away from the surface S, and the body 20 is moved up to an extended position according to the height of the user. The difference between the walking training mode and the walking assistive mode is that the robotic walking assistant 100 in the walking training mode can exert extra resistance to the user so that he/she has to make extra efforts to push the robotic walking assistant forward or around, thus increasing the muscle strength and coordination capability given enough training sessions. In one embodiment, the wheeled base 10 may further include brakes. When the robotic walking assistant operates in the walking training mode, the processor 41 controls the brakes to press against the moving wheels 122 to create friction. In this case, the user needs to apply more pushing force to the robotic walking assistant 100, thereby increasing the muscle strength and coordination capability given enough training sessions.
Step S195: Lock the one or more wheels in response to a static training mode being selected.
In response to the selection of the static training mode, the wheels 122 are locked. The robotic walking assistant is thus locked and cannot move, which allows a user to do static training. For example, when the robotic walking assistant operates in the static training mode, a user may do a static squat hold while holding the two handles 21. When the two hands of the user hold the two handles 21, the robotic walking assistant 100 can provide an upward support force to the user, thereby helping the user to maintain balance during his/her static training.
The method above enables the robotic walking assistant to have customization capability based on the user preference. Therefore, a wider range of customers, even with different heights and limb lengths can benefit from the customizable shape of the robotic walking assistant for different walking scenarios. The robotic walking assistant is controlled to operate in a selected operation mode after detecting the two hands of a user holding the two handles, which can ensure the safety of the user during walking or exercising.
Referring to
Step S196: Detect a push or a pull from the user.
In one embodiment, after the user has selected the walking assistive mode, the control system 40 may control the wheeled base to move according to a profile corresponding to the user. The profile may include a default speed of the wheeled base that is set by the user or a healthcare professional. The user can then walk together with the robotic walking assistant that moves at the default speed. In one embodiment, the user is allowed to change the speed of the robotic walking assistant by pulling or pushing the handles 21. In the embodiment, a first force sensor and a second force sensor may be embedded in one of the two handles 21. The first force sensor is configured to detect the push from the user, and the second force sensor is configured to detect the pull from the user. In one embodiment as shown in
Step S197: Increase speed of the wheeled base in response to detection of the push from the user.
When the moving speed of the wheeled base is slower than expected, the user may apply a pushing force to the handle 21. After determining that the user has applied a pushing force to the handle, the control system 40 increases the moving speed of the wheeled base. For example, the control system 40 may increase the moving speed of the wheeled base by increasing the rotational speed output by the motors 1201 that are configured to actuate rotational movement of the wheels 122.
Step S198: Reduce speed of the wheeled base in response to detection of the pull from the user.
When the moving speed of the wheeled base is faster than expected, the user may apply a pulling force to the handle 21. After determining that the user has applied a pulling force to the handle, the control system 40 reduces the moving speed of the wheeled base. For example, the control system 40 may reduce the moving speed of the wheeled base by reducing the rotational speed output by the motors 1201 that are configured to actuate rotational movement of the wheels 122.
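Steps S196 through S198 can be sketched as a simple proportional speed adjustment driven by the two force sensors. The gain and the speed limits are illustrative assumptions; the disclosure does not specify a control law.

```python
# Hypothetical sketch of steps S196-S198: increase the wheeled base's speed
# in proportion to the detected push force, decrease it in proportion to the
# detected pull force, and clamp to an assumed safe range (m/s).
def adjust_speed(current_speed: float, push_force: float, pull_force: float,
                 gain: float = 0.05, min_speed: float = 0.0,
                 max_speed: float = 1.2) -> float:
    """Return the new speed setpoint sent to the wheel motors 1201.
    Forces are in newtons; gain/limits are assumed placeholder values."""
    new_speed = current_speed + gain * push_force - gain * pull_force
    return max(min_speed, min(max_speed, new_speed))
```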
With such method, it is convenient and intuitive for a user to adjust the moving speed of the robotic walking assistant such that the speed of the robotic walking assistant can adapt to the walking speed of the user.
Referring to
Step S1941: Prompt the user to select a level of difficulty.
In the embodiment, the level of difficulty is an indicator that reflects the amount of pushing force that is required to push the robotic walking assistant to move. The higher the level of difficulty, the greater the pushing force required. After the selection of the walking training mode by a user, the control system 40 may display a user interface on the rear display 83. The rear display 83 may be a touch sensitive display and can receive a manual operation of the user on the display 83, which allows the user to select a desired level of difficulty. In this case, the rear display 83 may display user interface elements corresponding to different levels of difficulty.
In one embodiment, the control system 40 may use speech recognition to control the robotic walking assistant by voice. Voice commands are captured through the microphone 85 and processed by the control system 40, and the robotic walking assistant then acts accordingly. For example, the control system may extract a key word “intermediate level” from a voice command from the user, and determine that an intermediate level of difficulty is selected by the user.
Step S1942: Provide a level of resistance corresponding to the level of difficulty selected by the user to the at least one of the one or more wheels.
Referring to
In one embodiment, each brake 124 is a contactless braking system that is believed to have a longer lifetime and require less maintenance. For example, each brake 124 may be an eddy current brake (ECB), an electric braking system employing the eddy current principle. Referring to
In one embodiment, the brake 124 may further include a current amplifier 1245 which is used as a power source for the coil 1242. The current amplifier 1245 is electrically coupled to the control system 40. The control system 40 controls the current amplifier 1245 to apply an AC current with different phases to the coil 1242. The use of electromagnets allows the resistance provided by the brake 124 to be set to any desired level.
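Steps S1941 and S1942 can be sketched as a lookup from the selected difficulty level to a coil-current setpoint for the amplifier 1245. The level names and current values are illustrative assumptions; the disclosure only states that electromagnets allow the resistance to be set to any desired level.

```python
# Hypothetical sketch of steps S1941/S1942: the control system 40 commands
# the current amplifier 1245 to drive coil 1242 at a level-dependent current.
# Braking resistance grows with coil current; values below are placeholders.
DIFFICULTY_CURRENT_A = {
    "beginner": 0.5,       # assumed amperes
    "intermediate": 1.0,
    "advanced": 2.0,
}

def coil_current_for(level: str) -> float:
    """Return the coil-current setpoint for the selected difficulty level."""
    try:
        return DIFFICULTY_CURRENT_A[level]
    except KeyError:
        raise ValueError(f"unknown difficulty level: {level}")
```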
Referring to
Referring to
Step S1991: Detect fatigue of the user when the robotic walking assistant operates in the walking assistive mode, the walking training mode, or the static training mode.
In one embodiment, the fatigue of the user is determined based on output from the ECG sensors 77. When a user is tired, his/her heart function, nerve function, respiratory function, and other related functions change accordingly. Therefore, the fatigue status can be reflected by the electrocardiogram. The ECG signals may be measured at a sampling rate of 100 Hz from the user's palms as he/she holds the handles 21. The ECG signals may be measured and transmitted to the control system 40. The user's health condition, such as normal, fatigued, and drowsy states, is analyzed by evaluating the heart rate variability in the time and frequency domains.
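One common time-domain heart rate variability measure that such an analysis could use is RMSSD (root mean square of successive RR-interval differences). The sketch below, and in particular the fatigue threshold, is an illustrative assumption; the disclosure does not specify which HRV metric or threshold is used.

```python
# Hypothetical sketch of a time-domain HRV fatigue check: compute RMSSD over
# RR intervals (ms) derived from the ECG signal, and flag fatigue when HRV
# falls below an assumed threshold.
import math

def rmssd_ms(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (milliseconds),
    a standard time-domain HRV measure."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def is_fatigued(rr_intervals_ms, threshold_ms: float = 20.0) -> bool:
    """Illustrative classification: RMSSD below the assumed threshold is
    treated as reduced HRV, which can indicate fatigue."""
    return rmssd_ms(rr_intervals_ms) < threshold_ms
```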
In another embodiment, the fatigue of the user may be evaluated based on the walking time and/or walking distance. Specifically, the robotic walking assistant may check with the user if he/she is tired after a preset walking time and/or walking distance. The control system may determine the fatigue of the user based on the response from the user.
Step S1992: Rotate the foldable seat to an unfolded position according to a command from the user in response to detection of fatigue of the user.
After the detection of the fatigue of the user, the robotic walking assistant may check with the user if he/she needs a rest. After receiving a command indicating that the user needs to take a rest, the control system 40 rotates the foldable seat 50 to an unfolded position such that the user can sit on the seat 50. It should be noted that the feet 152 are moved down to be in contact with the surface S, and the armrests 60 are rotated to their unfolded positions, after the control system 40 receives a command indicating that the user needs to take a rest.
The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the present disclosure to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the present disclosure and its practical applications, to thereby enable others skilled in the art to best utilize the present disclosure and various embodiments with various modifications as are suited to the particular use contemplated.
This application is a continuation-in-part of and claims priority to co-pending application Ser. No. 17/359,672, which was filed on Jun. 28, 2021. That application is incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
8627909 | Chang | Jan 2014 | B2 |
10251805 | Morbi | Apr 2019 | B2 |
20200281801 | Karlovich | Sep 2020 | A1 |
20210053222 | Offengenden | Feb 2021 | A1 |
20220110818 | Orrell-Jones | Apr 2022 | A1 |
20220409469 | Shen | Dec 2022 | A1 |
20230270618 | Gong | Aug 2023 | A1 |
Number | Date | Country |
---|---|---|
6199380 | Sep 2017 | JP |
6393879 | Sep 2018 | JP |
6620326 | Dec 2019 | JP |
Number | Date | Country | |
---|---|---|---|
20220409469 A1 | Dec 2022 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17359672 | Jun 2021 | US |
Child | 17528158 | US |