This disclosure relates to a perception system for a lower body powered exoskeleton.
Patients with impaired lower body sensation (e.g., due to spinal cord injuries) may use a lower body powered exoskeleton device to assist with standing and walking. The exoskeleton device may include one or more joints with actuators that enable a user of the device to take steps. The actuators may be controlled based on gait parameters manually input to a computer system in communication with the exoskeleton device. One or more helpers may assist the user of the exoskeleton device to facilitate balance and prevent falls while the user is using the device.
Lower body powered exoskeleton devices are designed to allow paraplegic patients to stand and walk. The usability of such devices may pose challenges, for example, because gait parameters (e.g., step length, step height, step timing, etc.) typically must be adjusted manually. Additionally, the user of a lower body powered exoskeleton device (also referred to herein as a “rider”) does not typically receive feedback on foot placement or balance from the device during use. As such, it is impractical for the rider to use the exoskeleton device without assistance from one or more helpers to support their movements as they walk. Some embodiments of the present disclosure relate to a perception system for a lower body powered exoskeleton system. The perception system may be configured to detect aspects of the terrain in front of the rider and provide automatic gait adjustments based on the terrain. In some embodiments, the perception system may be further configured to provide feedback to the rider. For instance, the feedback may include information about an operation of the exoskeleton device (e.g., footstep and/or crutch target locations) and/or balance information associated with the device and/or rider.
In some embodiments, the invention features a perception system for a lower body powered exoskeleton device. The perception system includes a camera configured to capture one or more images of terrain in proximity to the exoskeleton device, and at least one processor. The at least one processor is programmed to perform footstep planning for the exoskeleton device based, at least in part, on the captured one or more images of terrain, and issue an instruction to perform a first action based, at least in part, on the footstep planning.
In one aspect, issuing an instruction to perform a first action includes issuing an instruction to provide feedback to a user of the exoskeleton device based, at least in part, on the footstep planning. In another aspect, issuing an instruction includes sending information to a visualization system configured to provide the feedback to the user. In another aspect, the visualization system includes an augmented reality (AR) system. In another aspect, performing footstep planning for the exoskeleton device includes determining a location of one or more footstep targets, and sending information to the visualization system includes sending information associated with the location of the one or more footstep targets. In another aspect, performing footstep planning for the exoskeleton device includes determining a location of one or more crutch targets to place a crutch configured to be used with the exoskeleton device, and sending information to the visualization system includes sending information associated with the location of the one or more crutch targets.
In another aspect, issuing an instruction to provide feedback includes issuing an instruction to an audio system to output audio including the feedback. In another aspect, issuing an instruction to provide feedback includes issuing an instruction to a haptic system to output haptic feedback to the user. In another aspect, issuing an instruction to provide feedback includes issuing an instruction to provide at least two of visual, audio, and haptic feedback to the user. In another aspect, issuing an instruction to perform a first action includes issuing an instruction to a controller of the exoskeleton device to set one or more gait parameters of the exoskeleton device based, at least in part, on the footstep planning. In another aspect, the one or more gait parameters of the exoskeleton are selected from the group consisting of step length, step height, and step timing.
In another aspect, performing footstep planning includes determining at least one footstep target location for the exoskeleton device in which a foot of the exoskeleton device has full contact with a surface of the terrain. In another aspect, the terrain includes a step, and performing footstep planning includes determining at least one footstep target location for the exoskeleton device on the step.
In another aspect, the at least one processor is further programmed to perform balance estimation associated with the exoskeleton device and issue an instruction to perform a second action based, at least in part, on the balance estimation. In another aspect, issuing an instruction to perform the second action includes issuing an instruction to provide feedback to a user of the exoskeleton device based, at least in part, on the balance estimation. In another aspect, issuing an instruction to provide feedback includes sending information to a visualization system configured to display the feedback to the user. In another aspect, the visualization system includes an augmented reality (AR) system. In another aspect, performing balance estimation associated with the exoskeleton device includes determining a balance state associated with the exoskeleton device, and sending information to the visualization system includes sending information associated with the balance state. In another aspect, determining a balance state associated with the exoskeleton device includes determining a numerical value for the balance state associated with the exoskeleton device, and the information associated with the balance state includes information associated with the numerical value. In another aspect, the information associated with the numerical value includes a balance meter that indicates to the user whether the balance state is sufficient to perform a next leg swing of the exoskeleton device. In another aspect, information associated with the balance state further includes information on how to improve the balance state when it is determined that the balance state is not sufficient to perform a next leg swing of the exoskeleton device. In another aspect, determining a balance state of the exoskeleton device is performed based, at least in part, on force information received from at least one sensor. In another aspect, the at least one sensor includes a foot force sensor located on a foot portion of the exoskeleton device, and the force information includes force information received from the foot force sensor. In another aspect, the at least one sensor includes a force sensor arranged on a crutch configured to be used with the exoskeleton device, and the force information includes force information received from the force sensor arranged on the crutch.
In another aspect, issuing an instruction to provide feedback includes issuing an instruction to an audio system to output audio including the feedback. In another aspect, issuing an instruction to provide feedback includes issuing an instruction to a haptic system to output haptic feedback to the user. In another aspect, issuing an instruction to provide feedback includes issuing an instruction to provide at least two of visual, audio, and haptic feedback to the user. In another aspect, the perception system further includes an inertial measurement unit (IMU), and issuing an instruction to provide feedback to the user of the exoskeleton device includes issuing an instruction to provide feedback based, at least in part, on an output of the IMU.
In some embodiments, the invention features a method of providing assistive feedback to a user of a lower body powered exoskeleton device. The method includes receiving one or more images of terrain in front of the exoskeleton device, performing, by at least one processor, footstep planning for the exoskeleton device, the footstep planning being performed based, at least in part, on the one or more images of terrain, and providing assistive feedback to the user of the exoskeleton device based, at least in part, on the footstep planning.
In one aspect, providing assistive feedback to the user includes sending information to a visualization system configured to provide the assistive feedback to the user. In another aspect, the visualization system includes an augmented reality (AR) system. In another aspect, performing footstep planning for the exoskeleton device includes determining a location of one or more footstep targets, and providing assistive feedback to the user includes displaying the location of the one or more footstep targets. In another aspect, performing footstep planning for the exoskeleton device includes determining a location of one or more crutch targets to place a crutch configured to be used with the exoskeleton device, and providing assistive feedback to the user includes displaying the location of the one or more crutch targets.
In another aspect, providing assistive feedback to the user includes outputting audio including the assistive feedback. In another aspect, providing assistive feedback to the user includes outputting haptic feedback to the user. In another aspect, providing assistive feedback to the user includes providing at least two of visual, audio, and haptic feedback to the user. In another aspect, providing assistive feedback to the user includes setting one or more gait parameters of the exoskeleton device based, at least in part, on the footstep planning. In another aspect, the one or more gait parameters of the exoskeleton are selected from the group consisting of step length, step height, and step timing.
In another aspect, performing footstep planning includes determining at least one footstep target location for the exoskeleton device in which a foot of the exoskeleton device has full contact with a surface of the terrain. In another aspect, the terrain includes a step, and performing footstep planning includes determining at least one footstep target location for the exoskeleton device on the step.
In another aspect, the method further includes performing balance estimation associated with the exoskeleton device, and providing assistive feedback to the user of the exoskeleton device is further based, at least in part, on the balance estimation. In another aspect, providing assistive feedback to the user includes sending information to a visualization system configured to provide the assistive feedback to the user. In another aspect, the visualization system includes an augmented reality (AR) system. In another aspect, performing balance estimation associated with the exoskeleton device includes determining a balance state associated with the exoskeleton device, and providing assistive feedback to the user includes providing an indication of the balance state. In another aspect, determining a balance state associated with the exoskeleton device includes determining a numerical value for the balance state associated with the exoskeleton device, and providing an indication of the balance state includes providing an indication of the numerical value. In another aspect, the indication of the numerical value includes a balance meter that indicates to the user whether the balance state is sufficient to perform a next leg swing of the exoskeleton device. In another aspect, providing an indication of the balance state further includes providing an indication of how to improve the balance state when it is determined that the balance state is not sufficient to perform a next leg swing of the exoskeleton device. In another aspect, determining a balance state of the exoskeleton device is performed based, at least in part, on force information received from at least one sensor. In another aspect, the at least one sensor includes a foot force sensor located on a foot portion of the exoskeleton device, and the force information includes force information received from the foot force sensor. In another aspect, the at least one sensor includes a force sensor arranged on a crutch configured to be used with the exoskeleton device, and the force information includes force information received from the force sensor arranged on the crutch.
In another aspect, providing assistive feedback to the user includes outputting audio including the assistive feedback. In another aspect, providing assistive feedback to the user includes outputting haptic feedback to the user. In another aspect, providing assistive feedback to the user includes providing at least two of visual, audio, and haptic feedback to the user. In another aspect, providing assistive feedback to the user of the exoskeleton device includes providing assistive feedback based, at least in part, on an output of an inertial measurement unit (IMU).
In some embodiments, the invention features a system. The system includes a lower body powered exoskeleton device configured to be worn by a user, a perception system configured to capture one or more images of terrain in proximity to the exoskeleton device, an augmented reality (AR) system configured to be worn by the user, and at least one processor. The at least one processor is programmed to perform footstep planning for the exoskeleton device based, at least in part, on the one or more images of terrain and send first information based on a result of the footstep planning to the AR system for presentation by the AR system to the user.
In one aspect, the system further includes a foot force sensor configured to sense foot force information, and the at least one processor is further programmed to perform balance estimation based, at least in part, on the foot force information. In another aspect, the at least one processor is further programmed to send second information based on a result of the balance estimation to the AR system for presentation by the AR system to the user. In another aspect, performing balance estimation associated with the exoskeleton device includes determining a balance state associated with the exoskeleton device, and the second information includes information associated with the balance state. In another aspect, determining a balance state associated with the exoskeleton device includes determining a numerical value for the balance state associated with the exoskeleton device, and the second information includes information associated with the numerical value. In another aspect, the AR system is configured to display a balance meter to the user, the balance meter indicating, based at least in part on the second information, whether the balance state is sufficient to perform a next leg swing of the exoskeleton device. In another aspect, the AR system is further configured to provide the user with an instruction on how to improve the balance state when it is determined that the balance state is not sufficient to perform a next leg swing of the exoskeleton device.
In another aspect, the perception system is mounted to a pelvis of the exoskeleton device. In another aspect, performing footstep planning for the exoskeleton device includes determining a location of one or more footstep targets, and sending information to the AR system includes sending information associated with the location of the one or more footstep targets. In another aspect, performing footstep planning for the exoskeleton device includes determining a location of one or more crutch targets to place a crutch configured to be used with the exoskeleton device, and sending information to the AR system includes sending information associated with the location of the one or more crutch targets.
In another aspect, the at least one processor is further programmed to set one or more gait parameters of the exoskeleton device based, at least in part, on the footstep planning. In another aspect, the one or more gait parameters of the exoskeleton are selected from the group consisting of step length, step height, and step timing. In another aspect, performing footstep planning includes determining at least one footstep target location for the exoskeleton device in which a foot of the exoskeleton device has full contact with a surface of the terrain. In another aspect, the terrain includes a step, and performing footstep planning includes determining at least one footstep target location for the exoskeleton device on the step. In another aspect, the system further includes an inertial measurement unit (IMU), and the at least one processor is further programmed to send second information based on output of the IMU to the AR system.
The advantages of the invention, together with further advantages, may be better understood by referring to the following description taken in conjunction with the accompanying drawings. The drawings are not necessarily to scale, and emphasis is instead generally placed upon illustrating the principles of the invention.
A lower body powered exoskeleton device (also referred to herein simply as an “exoskeleton device”) is a robotic device that enables patients with lower limb impairments to stand and walk. As described above, some conventional exoskeleton devices present safety and/or usability challenges to the extent that the devices typically cannot be used without the support of one or more helpers. Additionally, the gait parameters of the exoskeleton device are typically configured manually, making usability cumbersome and slow. Some embodiments of the present disclosure are directed to systems and techniques for improving the usability of an exoskeleton device by providing feedback to the system and/or the user of the system based on perception of one or more aspects of the user's environment.
The inventors have recognized and appreciated that widespread adoption and/or overall usability of exoskeleton devices may be hindered by their non-autonomous operation and lack of feedback provided to the user about their operation.
Based, at least in part, on an output of processing the image(s), compute device 230 may be configured to initiate one or more actions. For instance, compute device 230 may send information based on the processed images to controller 110 onboard the exoskeleton device 200 to control operation of one or more actuators of the exoskeleton device, as described above. For example, when the compute device 230 determines that the terrain in front of the exoskeleton device 200 includes a step, compute device 230 may be configured to provide information (e.g., control instructions) to controller 110, which may in turn automatically adjust one or more gait parameters of the exoskeleton device 200 to enable the exoskeleton device to step onto the step. More generally, feedback provided by compute device 230 may be provided to controller 110 to facilitate navigation of exoskeleton device 200 across terrain in the environment of the exoskeleton device by automatically selecting gait parameter(s) appropriate for the detected terrain. In some embodiments, compute device 230 may be configured to update gait parameters used by a controller of exoskeleton device 200, but control of the exoskeleton device (e.g., initiating a next step) may be left to the user. For instance, as described in more detail below, a user may be provided with feedback that enables them to determine when it is safe to take a next step.
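The disclosure does not specify how a step is detected from the processed images. By way of a non-limiting sketch in Python, a forward height profile extracted from the terrain data might be scanned for an abrupt rise, with the detected rise then used to update the step-height gait parameter; the function names, grid resolution, and threshold below are assumptions rather than details from this disclosure.

```python
import numpy as np

def detect_step(height_profile, cell_size=0.05, min_rise=0.05):
    """Scan a forward height profile (terrain height sampled at increasing
    distance ahead of the lead foot) for an abrupt rise resembling a stair
    step. Returns (distance_to_step_m, step_height_m), or None if flat."""
    rises = np.diff(height_profile)
    for i, rise in enumerate(rises):
        if rise >= min_rise:
            return (i + 1) * cell_size, float(rise)
    return None

# Flat ground followed by a 17 cm riser roughly 20 cm ahead:
profile = np.array([0.0, 0.0, 0.0, 0.0, 0.17, 0.17])
print(detect_step(profile))  # (0.2, 0.17) -> raise the step-height parameter
```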
In some embodiments, compute device 230 may be configured to provide feedback to the user of the exoskeleton device 200 to facilitate the user's understanding of how the exoskeleton device 200 may be controlled. For instance, visual feedback determined based, at least in part, on the processed images may be provided via a visualization device 220 communicatively coupled to the compute device 230. In some embodiments, visualization device 220 may be implemented as an augmented reality (AR) or mixed reality (MR) device configured to be worn by the user of the exoskeleton device 200. In some embodiments, visualization device 220 may be implemented as a display on a computing device (e.g., a display of a smartphone). When implemented as an AR device, visualization device 220 may be configured to display visual feedback, examples of which include planned footstep locations and/or crutch placements, overlaid on a view of the environment of the exoskeleton device 200. For instance, the AR device may be implemented as see-through glasses that project visualizations onto the scene observed by the user. Additionally or alternatively, other visual feedback (e.g., balance information) may be provided from compute device 230 to visualization device 220 for display to the user.
In some embodiments, compute device 230 may be configured to perform footstep planning based, at least in part, on processing the images captured by perception system 210. The footstep planning may involve generating one or more models of the environment of the exoskeleton device 200. For instance, the compute device 230 may be configured to generate a terrain map of the environment, and the footstep planning process may determine one or more candidate locations within the terrain map for the exoskeleton device 200 to take a next step and/or for a user to place a crutch.
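One plausible realization of such a terrain map, offered only as a hedged sketch, is a 2D elevation grid rasterized from depth points already expressed in the exoskeleton frame; the cell size and grid extents below are assumptions.

```python
import numpy as np

def build_terrain_map(points, cell_size=0.05, x_range=(0.0, 2.0), y_range=(-1.0, 1.0)):
    """Rasterize 3D points (iterable of (x, y, z), exoskeleton frame) into an
    elevation grid, keeping the highest z observed per cell; NaN marks cells
    with no observations."""
    nx = int((x_range[1] - x_range[0]) / cell_size)
    ny = int((y_range[1] - y_range[0]) / cell_size)
    heights = np.full((nx, ny), np.nan)
    for x, y, z in points:
        i = int((x - x_range[0]) / cell_size)
        j = int((y - y_range[0]) / cell_size)
        if 0 <= i < nx and 0 <= j < ny and (np.isnan(heights[i, j]) or z > heights[i, j]):
            heights[i, j] = z
    return heights

# Two depth returns: flat ground at 0.4 m ahead, a 17 cm riser at 0.6 m.
grid = build_terrain_map([(0.4, 0.0, 0.0), (0.6, 0.0, 0.17)])
```

Candidate footstep or crutch locations could then be drawn from cells whose neighborhood is level enough to support the device, as discussed further below.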
In some embodiments, compute device 230 may be configured to provide feedback in addition to, or as an alternative to, visual feedback. For instance, compute device 230 may be configured to provide audio feedback and/or haptic feedback that informs the user about one or more operations of the exoskeleton device 200. For example, compute device 230 may be configured to provide audio feedback to one or more speakers to inform the user that the exoskeleton device is ready to take a next step, that the user should lean to the right or the left to improve balance of the exoskeleton device, or to provide any other suitable audio output. Similarly, compute device 230 may be configured to communicate with one or more haptic devices (e.g., a vibratory device located on a crutch used by the user of exoskeleton device 200) to provide haptic feedback regarding an operation of the exoskeleton device 200. In some embodiments, compute device 230 is configured to provide feedback to the user using at least two modalities (e.g., at least two of audio, visual, and haptic feedback).
In some embodiments, one or more of exoskeleton device 200, perception system 210, visualization device 220 or another device (e.g., a headset) used in combination with one of those devices, may include one or more microphones configured to capture audio data (e.g., speech information such as speech commands) provided by a user of exoskeleton device 200. The received audio data may be provided to compute device 230 for processing for use in performing one or more actions (e.g., sending a control instruction to a controller of the exoskeleton device 200 to take a next step).
System 300 may also include processing components 320. Processing components 320 may receive input from perception components 310 and provide output to exoskeleton device 330 and/or visualizer 340, as described further below. In some embodiments, processing components 320 may be implemented on compute device 230, described above.
Terrain map 326 may be provided as input to a footstep planning and/or balance estimation process 328. As described above, to enable a user to walk with a lower body powered exoskeleton device, gait parameters (e.g., step length, step height, step timing, etc.) for the exoskeleton device are typically entered manually by a helper. In some embodiments, one or more (e.g., all) gait parameters for the exoskeleton device 330 may be determined automatically (e.g., without user input) based, at least in part, on terrain map 326. Some embodiments implement a footstep planning process that determines footstep target locations for a next step of the exoskeleton device based on the terrain map 326, and corresponding gait parameters may be determined based, at least in part, on the planned footstep target locations.
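As a hedged, non-limiting sketch of how gait parameters might be derived from a planned footstep target (the clearance margin and timing heuristic below are assumptions, not values from this disclosure):

```python
def gait_parameters(current_foot, target_foot, nominal_swing_time=1.2):
    """Derive step length, height, and timing from a planned footstep target.
    current_foot and target_foot are (x, y, z) positions in a common frame;
    the swing is assumed to slow as the required foot clearance grows."""
    step_length = target_foot[0] - current_foot[0]
    step_height = max(0.0, target_foot[2] - current_foot[2]) + 0.05  # assumed 5 cm clearance
    step_timing = nominal_swing_time * (1.0 + 2.0 * step_height)
    return {"step_length": step_length,
            "step_height": step_height,
            "step_timing": step_timing}

# A 19 cm step up onto a 17 cm stair tread:
print(gait_parameters((0.0, 0.1, 0.0), (0.19, 0.1, 0.17)))
```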
The footstep target locations may be determined based on information about the exoskeleton device (e.g., the size of the feet of the exoskeleton device, the allowable step size range of the exoskeleton device), and information about the terrain. The inventors have recognized that stability of the exoskeleton device on the terrain may be improved by ensuring that most or all of the foot of the exoskeleton device is in contact with the terrain. For instance, when the terrain includes a step, it may be more stable for the exoskeleton device to place most or all of its foot on the step rather than having a portion of the foot hanging off the edge of the step. Accordingly, in some embodiments, a footstep target location for the exoskeleton device may be determined such that the foot of the exoskeleton has full contact with the surface of the terrain as specified in terrain map 326. Determining whether a foot of the exoskeleton device has full contact with the surface of the terrain may be performed in any suitable manner. In some embodiments, each of a plurality of points on the foot of the exoskeleton device may be modeled and the distance between those points and the terrain represented in terrain map 326 may be minimized. If it is determined that a candidate footstep target location does not place the foot in a location on the terrain with full contact, a different candidate footstep target location that does have full contact with the surface of the terrain may be selected. For instance, the candidate footstep location may be moved forward, backward or in another direction until a location in which the foot has full contact with the surface is found.
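The following sketch illustrates one way the full-contact test and the shift-until-supported search described above might look; the sole dimensions, sample density, and tolerance are assumptions, and the elevation grid is assumed to start at the origin of the exoskeleton frame.

```python
import numpy as np

def full_contact(candidate_xy, heights, cell_size=0.05, foot_len=0.26,
                 foot_wid=0.10, tol=0.01):
    """Sample a grid of points over the foot outline and require that the
    terrain under every sampled point lies on a single level surface
    (within tol), i.e., no part of the sole hangs off an edge."""
    zs = []
    for dx in np.linspace(-foot_len / 2, foot_len / 2, 6):
        for dy in np.linspace(-foot_wid / 2, foot_wid / 2, 3):
            i = int(np.floor((candidate_xy[0] + dx) / cell_size))
            j = int(np.floor((candidate_xy[1] + dy) / cell_size))
            if not (0 <= i < heights.shape[0] and 0 <= j < heights.shape[1]):
                return False
            if np.isnan(heights[i, j]):
                return False  # unobserved terrain treated as unsupported
            zs.append(heights[i, j])
    return max(zs) - min(zs) <= tol

def adjust_target(candidate_xy, heights, step=0.02, max_shift=0.20):
    """Slide a candidate footstep forward or backward until a fully
    supported placement is found, as described above."""
    for shift in np.arange(0.0, max_shift, step):
        for sign in (1.0, -1.0):
            xy = (candidate_xy[0] + sign * shift, candidate_xy[1])
            if full_contact(xy, heights):
                return xy
    return None  # no fully supported placement within the search window
```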
After performing footstep planning in process 328, one or more gait parameters output from the footstep planning process may be provided to a controller of exoskeleton device 330. For instance, information about the step size, step height, and/or step timing may be provided to exoskeleton device 330 via communications interface 332. In some embodiments, communications interface 332 is implemented as a wireless communications interface (e.g., a WiFi interface) between processing components 320 and a controller of exoskeleton device 330. When processing components 320 are implemented onboard exoskeleton device 330, communications interface 332 may be implemented as a wired communications interface between processing components 320 and a controller of exoskeleton device 330. In this way, one or more gait parameters may be determined and updated based, at least in part, on perceived information about the terrain in which the exoskeleton device is operating rather than relying on a helper to manually determine and input the values. Such an automated approach may have several advantages compared to a manual entry approach. For example, when inputting step size, a helper typically must select from a discrete set of step sizes (e.g., 10 cm, 15 cm, 18 cm, 22 cm) despite the exoskeleton device being capable of taking steps at any step size within an allowable range (e.g., 1-35 cm). By requiring the helper to select from among a discrete set of step sizes, the footstep target locations are considerably limited. By contrast, when the step size is automatically determined based, at least in part, on terrain map 326, any step size within the allowable step size range may be selected subject to certain constraints (e.g., full foot contact with the surface of the terrain). Additionally, automatic updating of gait parameters using the techniques described herein may enable footstep planning that incorporates a level of dynamics that is not possible when manual input of gait parameters is required. For instance, an automatic update of gait parameters based on a perception of the terrain may enable the user to step faster and/or with more balance compared to the conventional manual input approach.
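The difference between the discrete manual-entry model and continuous automatic selection can be made concrete with a short sketch; the preset values mirror the examples above.

```python
DISCRETE_STEPS = (0.10, 0.15, 0.18, 0.22)  # example manual presets, meters
ALLOWED_RANGE = (0.01, 0.35)               # example allowable range, meters

def manual_step(desired):
    """Manual-entry model: the helper snaps to the nearest preset."""
    return min(DISCRETE_STEPS, key=lambda s: abs(s - desired))

def automatic_step(desired):
    """Perception-driven model: any value within the allowable range."""
    lo, hi = ALLOWED_RANGE
    return min(max(desired, lo), hi)

# A 0.19 m step is needed for full foot contact on a stair tread:
print(manual_step(0.19))     # 0.18 -- off-target by 1 cm
print(automatic_step(0.19))  # 0.19 -- exact
```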
In some embodiments, the footstep planning process may be informed, at least in part, on a balance estimation of the exoskeleton device. For instance, in planning a next step for the left foot, a balance estimation associated with a left leg swing to accomplish the next left foot placement may be used to determine a footstep target location for the left foot. The balance estimation may be determined based, at least in part, on static forces applied to the parts of the exoskeleton device in contact with the ground (e.g., the exoskeleton device feet) and/or one or more crutches used to provide stability to the user of the exoskeleton device. In some embodiments, the balance estimation may be determined based, at least in part, on a sequence of movements with the exoskeleton device feet and/or crutches (e.g., left foot->left crutch->right foot->right crutch, or some other sequence). In some embodiments, exoskeleton device 330 includes at least one foot sensor arranged on a foot of the device. Using information from the foot sensor(s) and information from the perception components 310, the center of mass of the entire system (e.g., the exoskeleton device and the user) may be determined. Information from the perception components 310 may be used to localize a support polygon in the environment, and it may be determined whether the center of mass of the entire system is within the localized support polygon. When it is determined that the center of mass is within the support polygon, it may be determined that the current balance state is good and it is safe for the user to swing the leg of the exoskeleton to execute a next step. Information regarding the current balance state may be provided to the user, as described in more detail below.
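A minimal sketch of the support-polygon test described above follows; it assumes the contact points have already been projected to the ground plane and ordered around the polygon boundary, and it uses a standard ray-casting point-in-polygon test.

```python
def point_in_polygon(p, poly):
    """Ray-casting test: True if 2D point p lies inside polygon poly,
    given as vertices (x, y) ordered around the boundary."""
    x, y = p
    inside = False
    for k in range(len(poly)):
        x1, y1 = poly[k]
        x2, y2 = poly[(k + 1) % len(poly)]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def safe_to_swing(com_xy, support_polygon):
    """Balance check: the projected center of mass of the user plus
    exoskeleton must lie inside the polygon formed by the contacts that
    remain grounded during the swing (stance foot and crutch tips)."""
    return point_in_polygon(com_xy, support_polygon)

# Right-foot swing with the left foot and both crutch tips grounded:
polygon = [(0.0, 0.1), (0.25, 0.1), (0.3, -0.4), (-0.05, -0.4)]
print(safe_to_swing((0.12, -0.1), polygon))  # True -> safe to take the step
```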
In some embodiments, information determined by footstep planning and/or balance estimation process 328 may be provided as feedback to the user of exoskeleton device 330 to enable the user to better understand the planned movements of the exoskeleton device. For instance, the feedback may be provided to visualizer 340, which in some embodiments, may be implemented as an augmented reality (AR) or mixed reality (MR) device that includes a display. The output of footstep planning and/or balance estimation process 328 may include one or more footstep target locations and/or one or more crutch target locations. This information may be provided to visualizer 340, which may be configured to display a representation of the footstep target locations and/or the crutch target locations projected onto a scene of the environment as observed through the visualizer 340.
In some embodiments, a sequence and amount of leg and crutch placement to improve or maximize stability (e.g., balance) may be determined. The inventors have recognized and appreciated that paraplegics who may use the exoskeleton device to stand and walk do not typically have a good sense of their balance. For example, they may not know what will happen regarding their balance when they swing their leg in the exoskeleton device. Indeed, the only information a user may receive about balance is how much pressure they exert on the crutches. For instance, they may perceive that more pressure exerted on the crutch corresponds to their center of mass being more forward (e.g., closer to the crutch edge), and in such a situation they may be sufficiently balanced to perform a leg swing.
In some embodiments, an indication of the balance estimation of the user and exoskeleton device may be provided as visual feedback by visualizer 340.
Process 700 then proceeds to act 714, where an action is performed based on the footstep planning and/or the balance estimation performed in act 712. As described above, in some embodiments, the action performed may include setting one or more gait parameters of the exoskeleton device used to execute a next step. Such information may be provided to a controller of the exoskeleton device, wherein the controller is configured to control one or more actuators of the exoskeleton device to execute the next step. In some embodiments, the action performed in act 714 includes providing feedback to the user. For example, the feedback may enable the user to understand a current balance state of the exoskeleton device and/or how to improve the balance state to enable execution of a next step using the exoskeleton device. Additionally or alternatively, the feedback may provide the user with a better understanding of future movements of the exoskeleton device based on the footstep planning. For instance, information associated with the footstep planning process (e.g., footstep target locations and/or crutch target locations) may be provided to a visualizer (e.g., an AR system) that enables the user to visualize where the exoskeleton device should step next to provide a good balance state. The visualization may also include crutch placement locations that show where the user should place the crutches to maintain a good balance state.
To enable the determined footstep target locations and/or crutch target locations to be visualized in the proper locations in the environment, process 800 proceeds to act 812, where the reference frame of the visualizer worn by the user and the reference frame of the exoskeleton device are aligned. The reference frame of the visualizer and the exoskeleton device may be aligned in any suitable way; one generic possibility is sketched below.
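Because the specific alignment mechanism of the referenced figure is not reproduced here, the following sketch shows only one generic possibility: relating the two frames through the pose of a single shared landmark (e.g., a fiducial on the exoskeleton) observed in both frames. Homogeneous 4x4 transforms are assumed, and all names are illustrative.

```python
import numpy as np

def align_frames(T_marker_in_viz, T_marker_in_exo):
    """Given the pose of one shared landmark expressed in both the visualizer
    frame and the exoskeleton frame, return T_exo_viz, the transform that
    maps visualizer coordinates into exoskeleton coordinates."""
    return T_marker_in_exo @ np.linalg.inv(T_marker_in_viz)

def target_in_visualizer(p_exo, T_exo_viz):
    """Re-express a footstep or crutch target (exoskeleton frame) in the
    visualizer frame so it can be rendered at the proper location."""
    p = np.append(np.asarray(p_exo, dtype=float), 1.0)
    return (np.linalg.inv(T_exo_viz) @ p)[:3]
```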
An orientation may herein refer to an angular position of an object. In some instances, an orientation may refer to an amount of rotation (e.g., in degrees or radians) about three axes. In some cases, an orientation of a robotic device may refer to the orientation of the robotic device with respect to a particular reference frame, such as the ground or a surface on which it stands. An orientation may describe the angular position using Euler angles, Tait-Bryan angles (also known as yaw, pitch, and roll angles), and/or quaternions. In some instances, such as on a computer-readable medium, the orientation may be represented by an orientation matrix and/or an orientation quaternion, among other representations.
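For concreteness, a standard conversion from Tait-Bryan angles to the quaternion representation mentioned above is sketched here; the Z-Y-X rotation convention and radian units are assumptions.

```python
import math

def quaternion_from_ypr(yaw, pitch, roll):
    """Convert yaw-pitch-roll (Z-Y-X Tait-Bryan) angles, in radians, to a
    unit quaternion (w, x, y, z)."""
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    return (cr * cp * cy + sr * sp * sy,
            sr * cp * cy - cr * sp * sy,
            cr * sp * cy + sr * cp * sy,
            cr * cp * sy - sr * sp * cy)

print(quaternion_from_ypr(0.0, 0.0, math.pi / 2))  # a 90-degree roll
```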
In some scenarios, measurements from sensors on the base of the robotic device may indicate that the robotic device is oriented in such a way and/or has a linear and/or angular velocity that requires control of one or more of the articulated appendages in order to maintain balance of the robotic device. In these scenarios, however, it may be the case that the limbs of the robotic device are oriented and/or moving such that balance control is not required. For example, the body of the robotic device may be tilted to the left, and sensors measuring the body's orientation may thus indicate a need to move limbs to balance the robotic device; however, one or more limbs of the robotic device may be extended to the right, causing the robotic device to be balanced despite the sensors on the base of the robotic device indicating otherwise. The limbs of a robotic device may apply a torque on the body of the robotic device and may also affect the robotic device's center of mass. Thus, orientation and angular velocity measurements of one portion of the robotic device may be an inaccurate representation of the orientation and angular velocity of the combination of the robotic device's body and limbs (which may be referred to herein as the “aggregate” orientation and angular velocity).
In some implementations, the processing system may be configured to estimate the aggregate orientation and/or angular velocity of the entire robotic device based on the sensed orientation of the base of the robotic device and the measured joint angles. The processing system has stored thereon a relationship between the joint angles of the robotic device and the extent to which the joint angles of the robotic device affect the orientation and/or angular velocity of the base of the robotic device. The relationship between the joint angles of the robotic device and the motion of the base of the robotic device may be determined based on the kinematics and mass properties of the limbs of the robotic devices. In other words, the relationship may specify the effects that the joint angles have on the aggregate orientation and/or angular velocity of the robotic device. Additionally, the processing system may be configured to determine components of the orientation and/or angular velocity of the robotic device caused by internal motion and components of the orientation and/or angular velocity of the robotic device caused by external motion. Further, the processing system may differentiate components of the aggregate orientation in order to determine the robotic device's aggregate yaw rate, pitch rate, and roll rate (which may be collectively referred to as the “aggregate angular velocity”).
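The stored relationship is not specified in detail; the sketch below assumes the simplest linear form, in which a hypothetical effect matrix (derived offline from limb kinematics and mass properties) maps joint rates to their contribution to base rotation.

```python
import numpy as np

def aggregate_angular_velocity(omega_base, joint_rates, J_effect):
    """Estimate the aggregate angular velocity: the base angular velocity
    (from the base sensors) corrected by the contribution of limb motion,
    where J_effect is an assumed 3 x n effect matrix."""
    return omega_base + J_effect @ joint_rates

# Toy example: a hip swinging forward partly cancels an apparent pitch rate.
omega_base = np.array([0.0, 0.05, 0.0])       # rad/s measured at the base
joint_rates = np.array([1.0])                 # rad/s hip joint rate
J_effect = np.array([[0.0], [-0.08], [0.0]])  # assumed values
print(aggregate_angular_velocity(omega_base, joint_rates, J_effect))
```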
In some implementations, the robotic device may also include a control system that is configured to control the robotic device on the basis of a simplified model of the robotic device. The control system may be configured to receive the estimated aggregate orientation and/or angular velocity of the robotic device, and subsequently control one or more jointed limbs of the robotic device to behave in a certain manner (e.g., maintain the balance of the robotic device).
In some implementations, the robotic device may include force sensors that measure or estimate the external forces (e.g., the force applied by a limb of the robotic device against the ground) along with kinematic sensors to measure the orientation of the limbs of the robotic device. The processing system may be configured to determine the robotic device's angular momentum based on information measured by the sensors. The control system may be configured with a feedback-based state observer that receives the measured angular momentum and the aggregate angular velocity, and provides a reduced-noise estimate of the angular momentum of the robotic device. The state observer may also receive measurements and/or estimates of torques or forces acting on the robotic device and use them, among other information, as a basis to determine the reduced-noise estimate of the angular momentum of the robotic device.
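A minimal per-axis sketch of such a feedback-based observer follows, assuming the simple model dL/dt = tau_external and a fixed correction gain (both assumptions; the disclosure does not give the observer's form).

```python
def observer_step(L_hat, L_measured, tau_external, gain=0.2, dt=0.002):
    """One update of a feedback-based state observer for angular momentum:
    predict from the measured external torque, then correct toward the
    noisy measurement. A larger gain tracks faster but passes more noise."""
    L_pred = L_hat + tau_external * dt            # model prediction
    return L_pred + gain * (L_measured - L_pred)  # measurement correction

# Filtering a noisy momentum measurement over a few control ticks:
L_hat = 0.0
for L_meas in (0.11, 0.09, 0.12, 0.10):
    L_hat = observer_step(L_hat, L_meas, tau_external=0.5)
print(round(L_hat, 3))  # reduced-noise estimate moving toward the ~0.1 readings
```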
In some implementations, multiple relationships between the joint angles and their effect on the orientation and/or angular velocity of the base of the robotic device may be stored on the processing system. The processing system may select a particular relationship with which to determine the aggregate orientation and/or angular velocity based on the joint angles. For example, one relationship may be associated with a particular joint being between 0 and 90 degrees, and another relationship may be associated with the particular joint being between 91 and 180 degrees. The selected relationship may more accurately estimate the aggregate orientation of the robotic device than the other relationships.
In some implementations, the processing system may have stored thereon more than one relationship between the joint angles of the robotic device and the extent to which the joint angles of the robotic device affect the orientation and/or angular velocity of the base of the robotic device. Each relationship may correspond to one or more ranges of joint angle values (e.g., operating ranges). In some implementations, the robotic device may operate in one or more modes. A mode of operation may correspond to one or more of the joint angles being within a corresponding set of operating ranges. In these implementations, each mode of operation may correspond to a certain relationship.
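One way such mode-dependent relationships might be organized is illustrated below, with hypothetical operating ranges and relationship identifiers; a single joint is shown for brevity.

```python
# Hypothetical operating ranges (degrees) mapped to stored relationships.
RELATIONSHIPS = (
    ((0.0, 90.0), "relationship_A"),    # joint between 0 and 90 degrees
    ((90.0, 180.0), "relationship_B"),  # joint between 90 and 180 degrees
)

def select_relationship(joint_angle_deg):
    """Select the stored joint-angle/orientation relationship whose operating
    range covers the current joint angle; the first matching range wins at a
    shared boundary."""
    for (lo, hi), model in RELATIONSHIPS:
        if lo <= joint_angle_deg <= hi:
            return model
    raise ValueError("joint angle outside all modeled operating ranges")

print(select_relationship(45.0))   # relationship_A
print(select_relationship(120.0))  # relationship_B
```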
The angular velocity of the robotic device may have multiple components describing the robotic device's orientation (e.g., rotational angles) along multiple planes. From the perspective of the robotic device, a rotational angle of the robotic device turned to the left or the right may be referred to herein as “yaw.” A rotational angle of the robotic device upwards or downwards may be referred to herein as “pitch.” A rotational angle of the robotic device tilted to the left or the right may be referred to herein as “roll.” Additionally, the rate of change of the yaw, pitch, and roll may be referred to herein as the “yaw rate,” the “pitch rate,” and the “roll rate,” respectively.
Processor(s) 902 may operate as one or more general-purpose or special-purpose processors (e.g., digital signal processors, application specific integrated circuits, etc.). The processor(s) 902 can be configured to execute computer-readable program instructions 906 that are stored in the data storage 904 and are executable to provide the operations of the robotic device 900 described herein. For instance, the program instructions 906 may be executable to provide operations of controller 908, where the controller 908 may be configured to cause activation and/or deactivation of the mechanical components 914 and the electrical components 916. The processor(s) 902 may operate and enable the robotic device 900 to perform various functions, including the functions described herein.
The data storage 904 may exist as various types of storage media, such as a memory. For example, the data storage 904 may include or take the form of one or more computer-readable storage media that can be read or accessed by processor(s) 902. The one or more computer-readable storage media can include volatile and/or non-volatile storage components, such as optical, magnetic, organic or other memory or disc storage, which can be integrated in whole or in part with processor(s) 902. In some implementations, the data storage 904 can be implemented using a single physical device (e.g., one optical, magnetic, organic or other memory or disc storage unit), while in other implementations, the data storage 904 can be implemented using two or more physical devices, which may communicate electronically (e.g., via wired or wireless communication). Further, in addition to the computer-readable program instructions 906, the data storage 904 may include additional data such as diagnostic data, among other possibilities.
The robotic device 900 may include at least one controller 908, which may interface with the robotic device 900. The controller 908 may serve as a link between portions of the robotic device 900, such as a link between mechanical components 914 and/or electrical components 916. In some instances, the controller 908 may serve as an interface between the robotic device 900 and another computing device. Furthermore, the controller 908 may serve as an interface between the robotic device 900 and a user(s). The controller 908 may include various components for communicating with the robotic device 900, including one or more joysticks or buttons, among other features. The controller 908 may perform other operations for the robotic device 900 as well. Other examples of controllers may exist as well.
Additionally, the robotic device 900 includes one or more sensor(s) 910 such as force sensors, proximity sensors, motion sensors, load sensors, position sensors, touch sensors, depth sensors, ultrasonic range sensors, and/or infrared sensors, among other possibilities. The sensor(s) 910 may provide sensor data to the processor(s) 902 to allow for appropriate interaction of the robotic device 900 with the environment as well as monitoring of operation of the systems of the robotic device 900. The sensor data may be used in evaluation of various factors for activation and deactivation of mechanical components 914 and electrical components 916 by controller 908 and/or a computing system of the robotic device 900.
The sensor(s) 910 may provide information indicative of the environment of the robotic device for the controller 908 and/or computing system to use to determine operations for the robotic device 900. Further, the robotic device 900 may include other sensor(s) 910 configured to receive information indicative of the state of the robotic device 900, including sensor(s) 910 that may monitor the state of the various components of the robotic device 900. The sensor(s) 910 may measure activity of systems of the robotic device 900 and receive information based on the operation of the various features of the robotic device 900, such as the operation of extendable legs, arms, or other mechanical and/or electrical features of the robotic device 900. The sensor data provided by the sensors may enable the computing system of the robotic device 900 to determine errors in operation as well as monitor overall functioning of components of the robotic device 900.
For example, the computing system may use sensor data to determine the stability of the robotic device 900 during operations as well as measurements related to power levels, communication activities, and components that require repair, among other information. As an example configuration, the robotic device 900 may include gyroscope(s), accelerometer(s), and/or other possible sensors to provide sensor data relating to the state of operation of the robotic device. Further, sensor(s) 910 may also monitor the current state of a function that the robotic device 900 may currently be operating. Additionally, the sensor(s) 910 may measure a distance between a given robotic limb of a robotic device and a center of mass of the robotic device. Other example uses for the sensor(s) 910 may exist as well.
Additionally, the robotic device 900 may include one or more power source(s) 912 configured to supply power to various components of the robotic device 900. Among possible power systems, the robotic device 900 may include a hydraulic system, electrical system, batteries, and/or other types of power systems. As an example illustration, the robotic device 900 may include one or more batteries configured to provide power to components via a wired and/or wireless connection. Within examples, components of the mechanical components 914 and electrical components 916 may each connect to a different power source or may be powered by the same power source. Components of the robotic device 900 may connect to multiple power sources as well.
Within example configurations, any type of power source may be used to power the robotic device 900, such as a gasoline and/or electric engine. Further, the power source(s) 912 may charge using various types of charging, such as wired connections to an outside power source, wireless charging, combustion, or other examples. Other configurations may also be possible. Additionally, the robotic device 900 may include a hydraulic system configured to provide power to the mechanical components 914 using fluid power. Components of the robotic device 900 may operate based on hydraulic fluid being transmitted throughout the hydraulic system to various hydraulic motors and hydraulic cylinders, for example. The hydraulic system of the robotic device 900 may transfer a large amount of power through small tubes, flexible hoses, or other links between components of the robotic device 900. Other power sources may be included within the robotic device 900.
Mechanical components 914 can represent hardware of the robotic device 900 that may enable the robotic device 900 to operate and perform physical functions. As a few examples, the robotic device 900 may include actuator(s), extendable leg(s), arm(s), wheel(s), one or multiple structured bodies for housing the computing system or other components, and/or other mechanical components. The mechanical components 914 may depend on the design of the robotic device 900 and may also be based on the functions and/or tasks the robotic device 900 may be configured to perform. As such, depending on the operation and functions of the robotic device 900, different mechanical components 914 may be available for the robotic device 900 to utilize. In some examples, the robotic device 900 may be configured to add and/or remove mechanical components 914, which may involve assistance from a user and/or other robotic device.
The electrical components 916 may include various components capable of processing, transferring, and providing electrical charge or electric signals, for example. Among possible examples, the electrical components 916 may include electrical wires, circuitry, and/or wireless communication transmitters and receivers to enable operations of the robotic device 900. The electrical components 916 may interwork with the mechanical components 914 to enable the robotic device 900 to perform various operations. The electrical components 916 may be configured to provide power from the power source(s) 912 to the various mechanical components 914, for example. Further, the robotic device 900 may include electric motors. Other examples of electrical components 916 may exist as well.
In some implementations, the robotic device 900 may also include communication link(s) 918 configured to send and/or receive information. The communication link(s) 918 may transmit data indicating the state of the various components of the robotic device 900. For example, information read in by sensor(s) 910 may be transmitted via the communication link(s) 918 to a separate device. Other diagnostic information indicating the integrity or health of the power source(s) 912, mechanical components 914, electrical components 916, processor(s) 902, data storage 904, and/or controller 908 may be transmitted via the communication link(s) 918 to an external communication device.
In some implementations, the robotic device 900 may receive information at the communication link(s) 918 that is processed by the processor(s) 902. The received information may indicate data that is accessible by the processor(s) 902 during execution of the program instructions 906, for example. Further, the received information may change aspects of the controller 908 that may affect the behavior of the mechanical components 914 or the electrical components 916. In some cases, the received information indicates a query requesting a particular piece of information (e.g., the operational state of one or more of the components of the robotic device 900), and the processor(s) 902 may subsequently transmit that particular piece of information back out over the communication link(s) 918.
In some cases, the communication link(s) 918 include a wired connection. The robotic device 900 may include one or more ports to interface the communication link(s) 918 to an external device. The communication link(s) 918 may include, in addition to or alternatively to the wired connection, a wireless connection. Some example wireless connections may utilize a cellular connection, such as CDMA, EVDO, GSM/GPRS, or 4G telecommunication, such as WiMAX or LTE. Alternatively or in addition, the wireless connection may utilize a Wi-Fi connection to transmit data to a wireless local area network (WLAN). In some implementations, the wireless connection may also communicate over an infrared link, radio, Bluetooth, or a near-field communication (NFC) device.
A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure.
This application claims the benefit under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 63/453,212, filed on Mar. 20, 2023, and titled, “PERCEPTION SYSTEM FOR A LOWER BODY POWERED EXOSKELETON,” the entire contents of which are incorporated by reference herein.