Method and device for control of a mobility device

Abstract
A system for control of a mobility device comprising a controller for analyzing data from at least one sensor on the mobility device, wherein the data is used to determine the gait of a user. The gait data is then used to provide motion commands to an electric motor on the mobility device.
Description
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH

Not applicable.


BACKGROUND OF THE INVENTION

The invention relates to a mobility device. More specifically, the invention relates to a control system and method of controlling a mobility device having an electric motor that is worn on the feet of a user to provide mobility assistance.


Commuters and other travelers often have to walk the final leg of their trip, regardless of whether they travelled by car, bus, train, or other means. Depending on the distance, the time needed to complete this final leg of the journey can comprise a significant amount of the total duration of the trip. While bikes or scooters can be used, they are bulky and require skill and a minimum level of fitness to operate. Powered systems, such as moving walkways, suffer from a lack of mobility. Other mobility solutions suffer the same drawbacks or lack the ability to adapt to a particular user. Therefore, it would be advantageous to develop a control system for a mobility device that does not require any special skills or user training and can adapt to the individual needs of a particular user.


BRIEF SUMMARY

According to embodiments of the present invention is a system and method of controlling a pair of mobility devices, wherein the mobility devices are worn on each foot of a user. A sensor in each mobility device obtains data about the gait of the user and transmits the data to a processor. The processor analyzes the gait of the user and then uses the gait data to develop motion commands for each mobility device. Each mobility device may comprise a motor, gearing, and wheels. When worn on the feet of a user, the mobility devices allow the user to walk at an increased rate of speed for a given cadence and stride length, as compared to their speed without the mobility devices. Further, the control system adapts to the user, so no learning or other control inputs are required of the user.





BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS


FIG. 1 depicts a mobility device with an embedded controller, according to one embodiment.



FIG. 2 is a block diagram of a control system according to one embodiment.



FIG. 3 shows the steps of the method of control, utilizing the controller depicted in FIG. 2.





DETAILED DESCRIPTION

As shown in FIG. 1, a mobility device 100, according to one embodiment, comprises a plurality of wheels 101, with at least one of the wheels 101 connected to an electric motor 102. Further shown in FIG. 1 is an onboard controller 111 and an optional remote controller 112. During typical use, a user will wear two mobility devices 100, one on each foot. The mobility device 100 enables a pedestrian to walk faster than a normal walking pace by adding torque to the wheels 101 of the mobility device 100 worn on the foot in contact with the ground. In this manner, the user experiences an effect similar to that of walking on a moving walkway. More specifically, the control system 110 of the present invention enables a user to maintain a normal walking motion by adapting the control of the motor 102 to the movements of the user. As will be discussed in greater detail, the speed at which the wheels 101 spin, through a torque applied by the motor 102, is controlled in part by an analysis of the user's gait.



FIG. 2 depicts the components of the onboard controller 111, which comprises at least one inertial measurement unit 113, a processor 114, a motor driver 115, and a wireless communication module 116. Two onboard controllers 111 are shown in FIG. 2 since each mobility device 100 (i.e., one for each foot of the user) will house an onboard controller 111. In an alternative embodiment, the control system 110 may also include a remote controller 112, which is capable of sending commands to each of the onboard controllers 111. In this particular embodiment, both the left and right mobility devices 100 receive command speeds from the remote controller 112, which can be a hand-held controller, a computer, or a mobile phone, and actuate at the specified command speeds.
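
The specification describes the categories of data exchanged between the onboard controllers 111 and the remote controller 112 but not a concrete format. The following is a minimal sketch of one possible representation; the field names, units, and enumeration values are assumptions made only for illustration.

```python
# Illustrative sketch only: field names and units are assumptions, not taken
# from the specification, which lists the data categories (acceleration,
# angular rates, orientation, quaternion) without prescribing a format.
from dataclasses import dataclass
from enum import Enum, auto


@dataclass
class GaitDynamicData:
    """One sample reported by the inertial measurement unit 113."""
    accel: tuple          # (ax, ay, az) linear acceleration, m/s^2
    angular_rate: tuple   # (wx, wy, wz) gyroscopic data, rad/s
    quaternion: tuple     # (w, x, y, z) orientation estimate
    timestamp: float      # seconds


class MotionType(Enum):
    ACCELERATE_TO_SPEED = auto()
    DECELERATE_TO_SPEED = auto()
    HOLD_SPEED = auto()
    BRAKE = auto()


@dataclass
class MotionCommand:
    """Command returned to an onboard controller 111."""
    motion: MotionType
    target_speed: float   # m/s; ignored for BRAKE


# Example: the remote controller 112 might reply to a data packet with a
# command to hold a constant wheel speed of 0.8 m/s.
command = MotionCommand(MotionType.HOLD_SPEED, target_speed=0.8)
```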


The control system 110 is used to collect data and analyze the gait of a user. When a pair of mobility devices 100 is worn by a user, each mobility device 100 will have a control system 110. For example, the onboard processor 114 reads gait dynamic data, which may comprise acceleration, angular rates, orientation, gyroscopic data, or quaternion data of each mobility device 100, from the inertial measurement unit 113. In one embodiment, both onboard controllers 111 send the gait dynamic data to the remote controller 112 and, in return, receive a motion command from the remote controller 112. The motion command comprises, for example, acceleration to a set speed, braking, deceleration to a set speed, or holding at a constant speed. In alternative embodiments, additional data can be included in the motion command. Upon receiving the motion command, the onboard processor 114, along with the motor driver 115, converts the motion command into a motor driving signal and drives the motor 102, thereby affecting the speed of the wheels 101. In one embodiment, the motor driver 115 receives a speed command and drives the motor 102 at the command speed via a feedback loop control.
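
The specification calls for feedback loop control of the motor speed but does not name a control law. The sketch below assumes a simple proportional-integral loop; the gains, saturation limit, and structure are illustrative assumptions, not the claimed implementation.

```python
# A minimal sketch of the feedback loop mentioned above, assuming a simple
# proportional-integral law; gains and limits are illustrative only.
class SpeedController:
    def __init__(self, kp=1.0, ki=0.1, max_duty=1.0):
        self.kp = kp
        self.ki = ki
        self.max_duty = max_duty
        self._integral = 0.0

    def update(self, command_speed, measured_speed, dt):
        """Return a motor drive signal (duty cycle) from the speed error."""
        error = command_speed - measured_speed
        self._integral += error * dt
        duty = self.kp * error + self.ki * self._integral
        # Saturate the drive signal to what the motor driver 115 can accept.
        return max(-self.max_duty, min(self.max_duty, duty))


# Usage: drive toward a 0.8 m/s command with a measured wheel speed of 0.6 m/s.
controller = SpeedController()
drive_signal = controller.update(command_speed=0.8, measured_speed=0.6, dt=0.01)
```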


The flow diagram shown in FIG. 3 depicts the method of gait-based motion control, according to one embodiment, comprising the steps of receiving gait dynamic data 301, detecting the stance/swing phase 302, computing the gait trajectory vector 303, determining the user gait 304, and determining the motion command 305.


In step 301, the control system 110 receives gait dynamic data from both onboard controllers 111. The gait dynamic data includes data collected from the inertial measurement unit 113 in each mobility device 100. Next, in step 302, the control system 110 determines the status of each mobility device 100 as being ‘in stance’ (i.e. on the ground) or ‘swing’ (i.e. in the air). Then, in step 303, if the mobility device 100 is in the stance phase, a gait trajectory vector is set to zero. The gait trajectory vector may comprise an estimated foot velocity, stride length, orientation, and elevation, among other parameters. For example, acceleration in the x direction can be integrated over a period of time to determine forward velocity. Similarly, acceleration in the z direction can be used to derive elevation. By way of further example, if the elevation is positive, this could indicate that a user is climbing stairs. A negative elevation can indicate a user is travelling down a set of stairs. Acceleration in the y direction (i.e. side-to-side) can be used to derive orientation, which may be indicative of a turning motion by the user. If the mobility device 100 is in swing phase, a gait speed and trajectory vector are calculated based on the gait dynamic data. For example, in one embodiment, the acceleration data acquired from the inertial measurement units 113 is integrated to provide a velocity for each mobility device 100. The average of the velocity of both mobility devices 100 can be used to calculate the user's overall speed.
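
The following is a minimal sketch of steps 302 and 303 as described above. It assumes stance is detected when the foot is nearly still (acceleration close to gravity and low angular rate) and uses simple rectangular integration; the thresholds and this particular detection rule are assumptions, not taken from the specification.

```python
# Sketch of stance/swing detection (step 302) and trajectory integration
# (step 303); thresholds and the detection rule are illustrative assumptions.
import math

ACCEL_THRESHOLD = 0.5     # m/s^2, allowed deviation from gravity when "still"
GYRO_THRESHOLD = 0.3      # rad/s
GRAVITY = 9.81            # m/s^2


def in_stance(accel, angular_rate):
    """Return True when the mobility device appears to be on the ground."""
    accel_mag = math.sqrt(sum(a * a for a in accel))
    gyro_mag = math.sqrt(sum(w * w for w in angular_rate))
    return abs(accel_mag - GRAVITY) < ACCEL_THRESHOLD and gyro_mag < GYRO_THRESHOLD


def integrate_trajectory(samples, dt):
    """Integrate acceleration over a swing to estimate velocity and elevation.

    `samples` is a list of (ax, ay, az) readings with gravity already removed;
    forward velocity comes from x, elevation change from z.
    """
    velocity_x = 0.0
    velocity_z = 0.0
    elevation = 0.0
    for ax, _, az in samples:
        velocity_x += ax * dt
        velocity_z += az * dt
        elevation += velocity_z * dt
    return velocity_x, elevation


def user_speed(left_velocity, right_velocity):
    """Overall user speed: the average of both devices' forward velocities."""
    return (left_velocity + right_velocity) / 2.0
```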


Next, at step 304, the gait speed and trajectory vectors are compared against a pre-configured gait model (or profile), which comprises a range of speeds for walking and different ranges of elevation for level walking, climbing hills, or stepping on stairs. Based on the result of these comparisons, the user gait is determined. Once the gait is determined, at step 305 the motion command is generated based on the determined gait. For example, if the average velocity of the two mobility devices 100 is calculated to be 1.2 m/s, then the gait is determined to be 'middle' (or any other assigned profile based on the average velocity) and requires a motion command for a wheel speed of 0.8 m/s. A lower average velocity may require a motion command with a lower wheel speed.
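
A sketch of steps 304 and 305 follows, built around the worked example above (a 1.2 m/s average velocity mapping to the 'middle' profile and a 0.8 m/s wheel-speed command). The other profile names, boundaries, and wheel speeds are assumptions chosen only to make the table complete.

```python
# Sketch of gait determination (step 304) and motion command generation
# (step 305); all profile values other than the 'middle' example are assumed.
GAIT_PROFILES = [
    # (name, minimum average velocity in m/s, commanded wheel speed in m/s)
    ("slow",   0.0, 0.4),
    ("middle", 1.0, 0.8),   # e.g. a 1.2 m/s average falls in this profile
    ("fast",   1.6, 1.2),
]


def determine_gait(average_velocity):
    """Return the profile whose lower bound the average velocity meets."""
    selected = GAIT_PROFILES[0]
    for profile in GAIT_PROFILES:
        if average_velocity >= profile[1]:
            selected = profile
    return selected


def motion_command(average_velocity):
    """Map the determined gait to a commanded wheel speed."""
    name, _, wheel_speed = determine_gait(average_velocity)
    return {"gait": name, "wheel_speed": wheel_speed}


# The example from the description: a 1.2 m/s average yields the 'middle'
# gait and a wheel-speed command of 0.8 m/s.
assert motion_command(1.2) == {"gait": "middle", "wheel_speed": 0.8}
```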


In optional step 306, the remote controller 112 checks if any user input has been registered. The user input can take various forms, such as pressing a button or moving the remote controller 112 in a certain trajectory. For example, the user may press a button indicating that the user wants forward motion. Thus, the forward motion command received from the user can override the motion command provided by the remote controller 112 or the onboard controllers 111. After checking for a user input at step 306, a motion command is generated and sent by the remote controller 112 to both onboard controllers 111. If a user input was received at step 306, the final motion command is replaced with the user input before being sent to the onboard controllers 111.
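
A short sketch of the override logic in optional step 306 follows, assuming a registered user input is represented as an optional command value; the representation is an assumption made for illustration.

```python
# Sketch of optional step 306: a registered user input overrides the
# gait-derived motion command before it is sent to the onboard controllers.
def final_motion_command(gait_command, user_command=None):
    """Return the command to send to both onboard controllers 111."""
    return user_command if user_command is not None else gait_command


# No user input: the gait-derived command is forwarded unchanged.
assert final_motion_command({"wheel_speed": 0.8}) == {"wheel_speed": 0.8}
# A button press requesting forward motion at 1.0 m/s overrides the command.
assert final_motion_command({"wheel_speed": 0.8}, {"wheel_speed": 1.0}) == {"wheel_speed": 1.0}
```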


In an alternative embodiment, each onboard controller 111 determines the gait in step 304 and generates a motion command in step 305. To prevent inconsistent commands from each onboard controller 111, each sends its motion command signal to the other for cross-validation in step 307. The motion command may include acceleration to a set speed, braking, deceleration to a set speed, and holding at a constant speed. Upon validating the motion command, the processor 114, along with the motor driver 115, converts the motion command into a motor driving signal and drives the motor 102. Stated differently, in step 307, cross-validation compares the motion commands generated by each of the two mobility devices 100. For example, the motor driver 115 will only command motor speed when both commands are similar and will brake when the speed commands are inconsistent.
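
The sketch below illustrates the cross-validation of step 307, assuming "similar" means the two speed commands agree within a tolerance; the tolerance value and the decision to average consistent commands are assumptions, not taken from the specification.

```python
# Sketch of step 307: command speed only when both devices agree, else brake.
SPEED_TOLERANCE = 0.1  # m/s, assumed tolerance for "similar" commands


def cross_validate(own_speed, other_speed, tolerance=SPEED_TOLERANCE):
    """Return a drive action when both commands agree; otherwise brake."""
    if abs(own_speed - other_speed) <= tolerance:
        # Consistent commands: drive the motor at the (averaged) command speed.
        return {"action": "drive", "speed": (own_speed + other_speed) / 2.0}
    # Inconsistent commands: the motor driver 115 brakes instead.
    return {"action": "brake", "speed": 0.0}


# Consistent commands from both mobility devices 100 result in motion.
assert cross_validate(0.8, 0.82)["action"] == "drive"
# Inconsistent commands result in braking.
assert cross_validate(0.8, 0.3)["action"] == "brake"
```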


While the disclosure has been described in detail and with reference to specific embodiments thereof, it will be apparent to one skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the embodiments. Thus, it is intended that the present disclosure cover the modifications and variations of this disclosure provided they come within the scope of the appended claims and their equivalents.

Claims
  • 1. A mobility device comprising: a motor; at least one inertial measurement unit; and a controller configured to: receive gait dynamic data from the at least one inertial measurement unit, wherein the gait dynamic data comprises gyroscopic data, determine a gait of a user based on the gait dynamic data, and generate a motion command based on the gait.
  • 2. The mobility device of claim 1, wherein the gait dynamic data further comprises at least one of acceleration data, angular rate data, orientation data, or quaternion data.
  • 3. The mobility device of claim 1, wherein the controller is further configured to detect a stance/swing phase of the mobility device based on the gait dynamic data.
  • 4. The mobility device of claim 1, wherein the controller is further configured to compute a gait trajectory vector, wherein the gait trajectory vector comprises a foot velocity.
  • 5. The mobility device of claim 4, wherein the gait trajectory vector further comprises a stride length of the user.
  • 6. The mobility device of claim 4, wherein the gait trajectory vector further comprises an orientation of the mobility device.
  • 7. The mobility device of claim 4, wherein the gait trajectory vector further comprises an elevation of the mobility device.
  • 8. The mobility device of claim 1, further comprising a remote control configured to override the motion command based on a user input.
  • 9. The mobility device of claim 8, wherein the remote control is a mobile phone.
  • 10. The mobility device of claim 1, wherein the controller is further configured to compare the gait of the user against a pre-configured gait model.
  • 11. The mobility device of claim 10, wherein the pre-configured gait model comprises a range of velocities for walking on various inclines.
  • 12. The mobility device of claim 10, wherein the pre-configured gait model comprises a range of velocities for walking on stairs.
  • 13. A method of controlling a mobility device comprising: receiving gait dynamic data from at least one inertial measurement unit of the mobility device, wherein the gait dynamic data comprises gyroscopic data; determining a gait of a user based on the gait dynamic data; and generating a motion command for the mobility device based on the gait.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 16/629,252, filed Jan. 7, 2020, which is the U.S. national phase under 35 U.S.C. § 371 of International Application No. PCT/US2018/041345, filed Jul. 9, 2018, which claims the benefit under 35 U.S.C. § 119 of U.S. Provisional Application No. 62/530,177, filed Jul. 8, 2017, each of which is incorporated herein by reference in its entirety.

US Referenced Citations (80)
Number Name Date Kind
833100 Wells Oct 1906 A
1672700 Vass Jun 1928 A
1801205 Mirick Apr 1931 A
2857008 Pirrello Oct 1958 A
3392986 Ryan Jul 1968 A
4334690 Klamer et al. Jun 1982 A
4417737 Suroff Nov 1983 A
4553767 Robjent et al. Nov 1985 A
RE32346 Klamer et al. Feb 1987 E
4932676 Klamer Jun 1990 A
5056802 Piotrowski Oct 1991 A
5236058 Yamet et al. Aug 1993 A
5400484 Gay Mar 1995 A
5413380 Fernandez May 1995 A
5730241 Shyr et al. Mar 1998 A
5797466 Gendle Aug 1998 A
6059062 Staelin May 2000 A
6322088 Klamer et al. Nov 2001 B1
6425587 Moon Jul 2002 B1
6497421 Edgerley et al. Dec 2002 B1
6517091 Fisher et al. Feb 2003 B1
6645126 Martin et al. Nov 2003 B1
7163210 Rehkemper et al. Jan 2007 B1
7204330 Lauren Apr 2007 B1
9027690 Chavand May 2015 B2
9295302 Reed et al. Mar 2016 B1
9925453 Tuli Mar 2018 B1
10456698 Chen et al. Oct 2019 B2
10709961 Zhang et al. Jul 2020 B2
10933298 Zhang et al. Mar 2021 B2
10933299 Zhang et al. Mar 2021 B2
11364431 Zhang et al. Jun 2022 B2
20010022433 Chang Sep 2001 A1
20030047893 Pahis Mar 2003 A1
20030141124 Mullet Jul 2003 A1
20040239056 Cho et al. Dec 2004 A1
20050046139 Guan Mar 2005 A1
20050082099 Tuli Apr 2005 A1
20060027409 Adams et al. Feb 2006 A1
20070090613 Lyden Apr 2007 A1
20070273110 Brunner Nov 2007 A1
20080093144 Manor Apr 2008 A1
20090120705 McKinzie May 2009 A1
20100207348 Othman Aug 2010 A1
20120285756 Treadway Nov 2012 A1
20130025955 Chavand Jan 2013 A1
20130123665 Mariani et al. May 2013 A1
20130226048 Unluhisarcikli et al. Aug 2013 A1
20130274640 Butters et al. Oct 2013 A1
20130282216 Edney Oct 2013 A1
20140196757 Goffer Jul 2014 A1
20150196403 Kim et al. Jul 2015 A1
20150196831 Treadway et al. Jul 2015 A1
20150352430 Treadway et al. Dec 2015 A1
20160045385 Aguirre-Ollinger et al. Feb 2016 A1
20160058326 Winfree et al. Mar 2016 A1
20160113831 Hollander Apr 2016 A1
20160250094 Amundson et al. Sep 2016 A1
20160331557 Tong et al. Nov 2016 A1
20170055880 Agrawal et al. Mar 2017 A1
20170181917 Ohta et al. Jun 2017 A1
20170182397 Zhang Jun 2017 A1
20170259162 Mo Sep 2017 A1
20170259811 Coulter et al. Sep 2017 A1
20170296116 McCarthy et al. Oct 2017 A1
20180008881 Mo Jan 2018 A1
20180015355 Desberg et al. Jan 2018 A1
20180333080 Malawey et al. Nov 2018 A1
20190061557 Quick et al. Feb 2019 A1
20190184265 Micacchi Jun 2019 A1
20190314710 Zhang et al. Oct 2019 A1
20190351315 Li Nov 2019 A1
20200000373 Agrawal et al. Jan 2020 A1
20200061444 Zhang et al. Feb 2020 A1
20200061445 Zhang et al. Feb 2020 A1
20200129843 Zhang et al. Apr 2020 A1
20200129844 Zhang et al. Apr 2020 A1
20200197786 Artemev Jun 2020 A1
20210015200 Tuli Jan 2021 A1
20210113914 Zhang Apr 2021 A1
Foreign Referenced Citations (33)
Number Date Country
2759524 Feb 2006 CN
201423154 Mar 2010 CN
201565096 Sep 2010 CN
101912680 Dec 2010 CN
101912681 Dec 2010 CN
102167117 Aug 2011 CN
102805928 Dec 2012 CN
203389316 Jan 2014 CN
104689559 Jun 2015 CN
204364838 Jun 2015 CN
204395401 Jun 2015 CN
105214299 Jan 2016 CN
106039689 Oct 2016 CN
205627021 Oct 2016 CN
106390428 Feb 2017 CN
106390430 Feb 2017 CN
106582003 Apr 2017 CN
0686412 Dec 1995 EP
0834337 Apr 1998 EP
0894515 Feb 1999 EP
3629925 Apr 2020 EP
2452563 Mar 2009 GB
2005081038 Mar 2005 JP
2013111118 Jun 2013 JP
2011092443 Aug 2011 WO
2018082192 May 2018 WO
2018082193 May 2018 WO
2018082194 May 2018 WO
2018082195 May 2018 WO
2019014152 Jan 2019 WO
2019014154 Jan 2019 WO
2019212995 Nov 2019 WO
2020146680 Jul 2020 WO
Non-Patent Literature Citations (10)
Entry
European Supplementary Search Report for EP 18831335.7 dated Feb. 3, 2021.
International Search Report and Written Opinion for PCT/CN2017/000499 dated Oct. 20, 2017.
International Search Report and Written Opinion for PCT/CN2017/000500 dated Oct. 20, 2017.
International Search Report and Written Opinion for PCT/CN2017/000501 dated Nov. 3, 2017.
International Search Report and Written Opinion for PCT/CN2017/000502 dated Oct. 13, 2017.
International Search Report and Written Opinion for PCT/US2018/041343 dated Sep. 7, 2018.
International Search Report and Written Opinion for PCT/US2018/041345 dated Sep. 7, 2018.
International Search Report and Written Opinion for PCT/US2019/029742 dated Aug. 26, 2019.
International Search Report and Written Opinion for PCT/US2020/012992 dated Apr. 1, 2020.
International Search Report and Written Opinion for PCT/US2021/056014 dated Jan. 18, 2022.
Related Publications (1)
Number Date Country
20220314103 A1 Oct 2022 US
Provisional Applications (1)
Number Date Country
62530177 Jul 2017 US
Continuations (1)
Number Date Country
Parent 16629252 US
Child 17843153 US