This disclosure relates generally to methods of and systems for estimating a topography of at least two parts of a body.
Some applications may involve monitoring a topography of parts of a body. However, some methods of monitoring a topography of parts of a body may require high power consumption, have a limited field of view, be uncomfortable to wear, have low accuracy, or depend on complex algorithms.
According to at least one embodiment, there is disclosed a method of estimating a topography of at least first and second parts of a body, the method comprising: causing at least one processor circuit to receive at least one signal representing at least one measurement of deformation of at least a portion of the body; causing the at least one processor circuit to associate the deformation with relative positions of at least the first and second parts of the body; and causing the at least one processor circuit to produce at least one output signal representing the relative positions of at least the first and second parts of the body.
According to at least one embodiment, there is disclosed a system for estimating a topography of at least first and second parts of a body, the system comprising: a means for receiving at least one signal representing at least one measurement of deformation of at least a portion of the body; a means for associating the deformation with relative positions of at least the first and second parts of the body; and a means for producing at least one output signal representing the relative positions of at least the first and second parts of the body. According to at least one embodiment, there is disclosed a system for estimating a topography of at least first and second parts of a body, the system comprising at least one processor circuit configured to, at least: receive at least one signal representing at least one measurement of deformation of at least a portion of the body; associate the deformation with relative positions of at least the first and second parts of the body; and produce at least one output signal representing the relative positions of at least the first and second parts of the body.
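A minimal sketch, in Python, of the processing steps described above; the array shapes, the model object, and the function name are illustrative assumptions and are not part of the disclosure:

```python
import numpy as np

def estimate_topography(deformation_signal, model):
    """Associate a deformation measurement with relative positions of two
    body parts and produce an output signal (here, a plain dict).

    deformation_signal : 1-D array of readings from the deformation sensors.
    model              : any object with a predict() method that maps the
                         readings to relative positions (e.g. joint angles).
    """
    # Receive the measurement of deformation of a portion of the body.
    features = np.asarray(deformation_signal, dtype=float).reshape(1, -1)

    # Associate the deformation with relative positions of the body parts.
    relative_positions = model.predict(features)[0]

    # Produce an output signal representing those relative positions.
    return {"relative_positions": relative_positions.tolist()}
```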
Other aspects and features will become apparent to those ordinarily skilled in the art upon review of the following description of illustrative embodiments in conjunction with the accompanying figures.
Referring to
Display Device
In the embodiment shown, the display device 105 is a television screen. However, display devices of alternative embodiments may vary. For example, a display device of an alternative embodiment may be a virtual-reality goggle, an augmented-reality goggle, a mixed-reality goggle, a mobile phone, a smartphone, a tablet, a projected image on a screen, or any display device of a visual interactive system.
Sensor
Referring to
Further, the resiliently deformable material 104 is sized to be received tightly on (or conform to) a forearm 106 of a body, and configured to surround the forearm 106. The sensor 102 may therefore be referred to as a sensor textile. The sensor 102 includes a plurality of deformation sensors, such as deformation sensors 108 and 110, for example. When the sensor 102 is worn on the forearm 106, the deformation sensors of the sensor 102 are positioned against an external surface of the forearm 106 and positioned to measure deformations of the forearm 106 that may be caused by movement of muscles, bones, tendons, or other tissues in the forearm 106.
In the embodiment shown, the deformation sensors of the sensor 102 are positioned in the sensor 102 in a two-dimensional array including a row of deformation sensors shown generally at 112, a row of deformation sensors shown generally at 114, a row of deformation sensors shown generally at 116, and a row of deformation sensors shown generally at 118. The rows of deformation sensors 112, 114, 116, and 118 are spaced apart from each other such that, when the sensor 102 is worn on the forearm 106, the rows of deformation sensors 112, 114, 116, and 118 are spaced apart from each other in a direction along the forearm 106, and each of the rows of deformation sensors 112, 114, 116, and 118 includes a plurality of deformation sensors spaced apart from each other in an anterior-posterior direction when worn on the forearm 106. Therefore, the deformation sensors of the sensor 102 are spaced apart from each other in at least two directions and therefore form a grid or two-dimensional array.
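One hypothetical way to hold readings from such a grid in memory, with four rows corresponding to the rows of deformation sensors 112, 114, 116, and 118 and an arbitrary, assumed number of sensors per row:

```python
import numpy as np

N_ROWS, N_PER_ROW = 4, 8                   # rows along the forearm x sensors around it (assumed counts)
readings = np.zeros((N_ROWS, N_PER_ROW))   # one deformation value per sensor

# Example: update the sensor in the second row, third position around the arm.
readings[1, 2] = 0.037                     # arbitrary strain value, arbitrary units
```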
The sensor 102 is an example only, and alternative embodiments may differ. For example, in alternative embodiments, deformation sensors may be positioned in other ways, such as an irregular pattern over two dimensions that may correspond to anatomical features. For example, to detect radial artery pulsations, a high-density array of sensors can be placed close to a radial artery and other sensors on the forearm for movement detection.
The sensor 102 also includes a data processing unit 120 in communication with the deformation sensors of the sensor 102. Each of the rows of deformation sensors may include a respective plurality of stretchable wire lines, such as the stretchable wire line 122 shown in the row of deformation sensors 112, and a stretchable bus line 124 may connect the stretchable wire lines (such as the stretchable wire line 122, for example) to the data processing unit 120.
In the embodiment shown, the data processing unit 120 is configured to communicate wirelessly with the computing device 103, for example according to a Bluetooth™, WiFi, Zigbee™, near-field communication (“NFC”), or 5G protocol, or according to another protocol for wireless communication. However, in alternative embodiments, the data processing unit 120 may communicate with the computing device 103 using one or more wires or in other ways. Additionally, the data processing unit 120 may implement functions including but not limited to analog signal conditioning and amplification, analog to digital conversion, signal filtering and processing, signal classification and recognition, machine learning, and wireless data transfer. The data processing unit 120 may also include battery and storage devices or wireless charging or other energy harvesting components such as energy generation from movement or environmental light, for example.
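As a small, hedged illustration of the kind of digital signal conditioning such a data processing unit might apply to raw samples before transmission (the sampling rate, cutoff frequency, and filter order below are assumptions, not values from the disclosure):

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 100.0       # assumed sampling rate, Hz
CUTOFF = 10.0    # assumed low-pass cutoff, Hz

def condition(raw_samples):
    """Low-pass filter one channel of raw deformation samples."""
    b, a = butter(N=4, Wn=CUTOFF / (FS / 2), btype="low")
    return filtfilt(b, a, np.asarray(raw_samples, dtype=float))
```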
In general, information (such as information representing measurements of deformations by the sensor 102, for example) may be transferred wirelessly or otherwise to the computing device 103 in real time. Alternatively, such information can be stored in the data processing unit 120 or elsewhere, and transferred to the computing device 103 at a later time.
Further, a communication rate between the processing unit 120 and the computing device 103 may be about a few megabytes per second, about a few thousand bytes per second, about a few bytes per second, about a few bytes every hour, or about a few bytes every day, depending for example on energy-usage requirements or accuracy or refresh rates of data that may be needed for a specific application. Such a communication rate may, for example, be high in gaming and sports applications and may be much lower in other applications. Such a communication rate can be adaptively modified to save energy, for example increasing when demand is high and decreasing when there is little or no need for data.
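An illustrative sketch, not taken from the disclosure, of how a reporting interval could be adapted to demand as described above:

```python
def update_report_interval(current_interval_s, demand_high,
                           min_interval_s=0.001, max_interval_s=86_400.0):
    """Halve the interval when demand is high (e.g. gaming) and double it
    otherwise, within bounds, trading accuracy against energy use."""
    if demand_high:
        return max(min_interval_s, current_interval_s / 2.0)
    return min(max_interval_s, current_interval_s * 2.0)
```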
The data processing unit 120 may also include one or more inertial measurement units (“IMUs”) such as one or more accelerometers, one or more gyroscopes, one or more magnetometers, or a combination of two or more thereof, which may detect orientation and angles of movement as a spatial reference point for tissue, for example. The processing unit 120 may fuse measurements of deformation (or topography data) with data from one or more such IMUs, which may improve accuracy and functionality. The data processing unit 120 may also include one or more global positioning system (GPS) capabilities (or one or more other locating devices), which may facilitate identifying one or more locations of the sensor 102 or long-range movements of the sensor 102.
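One minimal, assumed way to fuse a deformation (topography) map with IMU orientation data is simple feature concatenation for a downstream model; more elaborate fusion schemes are of course possible:

```python
import numpy as np

def fuse_features(deformation_map, imu_orientation):
    """Combine a flattened deformation map with IMU angles (e.g. roll,
    pitch, yaw) into one feature vector for a downstream model."""
    deformation = np.asarray(deformation_map, dtype=float).ravel()
    orientation = np.asarray(imu_orientation, dtype=float).ravel()
    return np.concatenate([deformation, orientation])
```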
The data processing unit 120 or the sensor 102 may also include one or more haptic devices, or other devices which may apply tactile or other feedback to a person wearing the sensor 102.
Deformation sensors such as those described herein may be similar to sensors that are described in U.S. Pat. No. 9,494,474. For example, referring to
Referring to
The sensor 102 is an example only, and sensors of alternative embodiments may differ. For example, a sensor of an alternative embodiment may not be worn on a body, and such a sensor may be a furniture cover or bedding, for example.
Further, the embodiment shown includes one sensor 102, but alternative embodiments may include more than one sensor on one body or (as shown in
Computing Device
In general, the computing device 103 may include a personal computer, a laptop, a tablet, a stand-alone computing device, or any computing hardware for a virtual-reality goggle, an augmented-reality goggle, a mixed-reality goggle, a mobile phone, a smartphone, a television screen, a gaming device, a projector for projecting images on a screen, or any display device of a visual interactive system.
Also, although
Referring to
In general, the storage memory 154 includes stores for storing storage codes as described herein, for example. In general, the program memory 156 stores program codes that, when executed by the microprocessor 152, cause the processor circuit 150 to implement functions of the computing device 103 such as those described herein, for example. The storage memory 154 and the program memory 156 may be implemented in one or more of the same or different computer-readable storage media, which in various embodiments may include one or more of a read-only memory (“ROM”), a random access memory (“RAM”), a hard disc drive (“HDD”), a solid-state drive (“SSD”), a remote memory such as one or more cloud or edge cloud storage devices, and other computer-readable and/or computer-writable storage media.
The I/O module 158 may include various signal interfaces, analog-to-digital converters (“ADCs”), receivers, transmitters, and/or other circuitry to receive, produce, and transmit signals as described herein, for example. In the embodiment shown, the I/O module 158 includes an input signal interface 160 for receiving signals (for example according to one or more protocols such as those described above) from the data processing unit 120 of the sensor 102, and an output signal interface 162 for producing one or more output signals and for transmitting the one or more output signals to the display 105 to control the display 105.
The I/O module 158 is an example only and may differ in alternative embodiments. For example, alternative embodiments may include more, fewer, or different interfaces. Further, the I/O module 158 may connect the computing device 103 to a computer network (such as an internet cloud or edge cloud, for example), and such a computer network may facilitate real-time communication with other computing devices. Such other computing devices may interact with the computing device 103 to permit remote interaction, for example.
More generally, the processor circuit 150 is an example only, and alternative embodiments may differ. For example, in alternative embodiments, the computing device 103 may include different hardware, different software, or both. Such different hardware may include more than one microprocessor, one or more alternatives to the microprocessor 152, discrete logic circuits, an application-specific integrated circuit (“ASIC”), or a combination of one or more thereof, for example. As a further example, in alternative embodiments, some or all of the storage memory 154, of the program memory 156, or both may be cloud storage or still other storage.
The storage memory 154 includes a musculoskeletal model store 164, which stores codes representing one or more musculoskeletal models of a body. For example, such a musculoskeletal model may represent bones, muscles (such as the flexor digitorum superficialis muscle bundles, for example), tendons, fascia, arteries, and other tissues, including representations of how positions of muscles or other tissues (and movements, contractions and rotations thereof) may be associated with relative positions of body parts, or with angles of flexion, extension, or rotations of joints of the body. In some embodiments, the deformation sensors of the sensor 102 may be positioned to measure deformation of particularly important body parts of the musculoskeletal model.
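One hypothetical way to represent entries of such a musculoskeletal model in code; the class name, fields, and example values below are assumptions for illustration only, and a real model would be far richer:

```python
from dataclasses import dataclass, field

@dataclass
class MuscleEntry:
    name: str              # e.g. "flexor digitorum superficialis"
    attached_joints: list  # joints whose angles this muscle influences
    # Assumed mapping from a normalized contraction level (0..1) to a
    # joint-angle contribution in degrees.
    angle_per_contraction: dict = field(default_factory=dict)

model_store = [
    MuscleEntry("flexor digitorum superficialis",
                ["proximal interphalangeal"],
                {"proximal interphalangeal": 90.0}),
]
```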
Program Memory
In general, the program memory 156 may include program codes that, when executed by the microprocessor 152, cause the processor circuit 150 to implement machine learning or artificial intelligence algorithms such as deep neural networks, deep learning, or support vector machines, for example. Further, the program memory 156 may cause the processor circuit 150 to implement cloud virtual machines.
The program memory 156 includes program codes 166, which are illustrated schematically in
The deformation measurements measured by the deformation sensor 110 may, for example, represent a moving tissue dynamic topography (MTDT) map, which may provide relative changes (in percentage, for example) in one or more signals produced by the deformation sensors at different locations on the forearm 106. The topography examples shown in
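A minimal sketch of computing such a relative-change (percentage) map from the sensor readings; the use of a baseline frame (for example, a relaxed posture) is an assumption for illustration:

```python
import numpy as np

def mtdt_map(current_frame, baseline_frame, eps=1e-9):
    """Relative change, in percent, of each deformation sensor reading
    versus a baseline frame."""
    current = np.asarray(current_frame, dtype=float)
    baseline = np.asarray(baseline_frame, dtype=float)
    return 100.0 * (current - baseline) / (np.abs(baseline) + eps)
```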
Referring back to
After block 194, or if at block 190 the positions of the deformation sensors are calibrated relative to the anatomical features, the program codes 166 continue at block 198, which includes codes that, when executed by the microprocessor 152, cause the processor circuit 150 to infer, according to the deformation measurement as received at block 168 and as stored in the input buffer 170, positions of one or more body parts underlying the deformation sensors of the sensor 102. In general, such underlying body parts may include one or more muscles, one or more bones, one or more tendons, one or more other body parts, or a combination of two or more thereof. The codes at block 198 may involve a statistical learning algorithm trained to associate deformation of a portion of the body with positions of one or more muscles. The program codes 166 then continue at block 200, which includes codes that, when executed by the microprocessor 152, cause the processor circuit 150 to store codes representing the inferred muscle positions in an underlying body part position store 202 in the storage memory 154. Such information regarding such a body part may be stored in the storage memory 154, in cloud storage, or elsewhere for later retrieval. Such information may indicate, for example, size or activity of a muscle, form or fitness of a muscle, size of the body part, the fit and stretch of the sensor around the body part, or a combination of two or more thereof.
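As one possible instance of the statistical learning algorithm mentioned above, a regressor could be trained offline to map deformation frames to positions of underlying tissue. The library choice, network size, and placeholder training data below are assumptions, not the disclosed method:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# X: (n_samples, n_sensors) deformation frames; y: (n_samples, n_outputs)
# positions of underlying body parts, both from a labelled calibration session.
X = np.random.rand(200, 32)   # placeholder training data
y = np.random.rand(200, 3)    # placeholder positions

model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500)
model.fit(X, y)

inferred_positions = model.predict(X[:1])   # infer positions for a new frame
```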
Referring to
As another example, referring to
Referring back to
As the embodiment shown illustrates, embodiments such as those described herein may infer, from deformation of one part of a body (the forearm 106 in the embodiment shown), one or more joint angles between a first part of the body (the forearm 106 in the embodiment shown) where deformation is measured and a second part of the body (such as the hand 186 or one or more fingers of the hand 186) that is not within a sensor (the sensor 102 in the embodiment shown) but that rather may be outside of (or spaced apart from) a part of the body (the forearm 106 in the embodiment shown) where deformation is measured and that may be movable relative to the part of the body where deformation is measured.
Referring back to
Such joint angles between body parts or anatomical positions of body parts may more generally be referred to as a topography of such body parts. In general, a topography of body parts may refer to relative positions or orientations of the body parts. Further, as the embodiment shown illustrates, embodiments such as those described herein may infer, from deformation of one part of a body (the forearm 106 in the embodiment shown), one or more joint angles, one or more anatomical positions, or (more generally) a topography of one or more body parts (the hand 186 and fingers of the hand 186) that are not within a sensor (the sensor 102 in the embodiment shown) but that rather may be outside of (or spaced apart from) a part of the body (the forearm 106 in the embodiment shown) where deformation is measured and that may be movable relative to the part of the body where deformation is measured.
As another example, movement of an elbow adjacent the forearm 106, of one or more fingers of the hand 186, of a shoulder on a same arm as the forearm 106, or of still other body parts may be inferred from measurements of deformation of the forearm 106.
An anatomical position, or a sequence of anatomical positions at respective different times, stored in the anatomical positions store 240 may represent a gesture or a user input.
Therefore, after block 238, the program codes 166 continue at block 242, which includes codes that, when executed by the microprocessor 152, cause the processor circuit 150 to determine whether an anatomical position, or a sequence of anatomical positions at respective different times, stored in the anatomical positions store 240 may represent a gesture or a user input.
An example of a sequence of anatomical positions at respective different times is illustrated in
If at block 242 an anatomical position, or a sequence of anatomical positions at respective different times, stored in the anatomical positions store 240 may represent a gesture or a user input, then the program codes 166 continue at block 252, which includes codes that, when executed by the microprocessor 152, cause the processor circuit 150 to store, in a gesture or user input store 254 in the storage memory 154, one or more codes representing the gesture or user input identified at block 242.
After block 252, or if at block 242 a gesture or user input is not identified, the program codes 166 continue at block 256, which includes codes that, when executed by the microprocessor 152, cause the processor circuit 150 to cause the output signal interface 162 to produce one or more output signals in response to respective positions of one or more underlying body parts stored in the position of underlying body part store 202, one or more joint angles stored in the joint angles store 224, one or more anatomical positions stored in the anatomical positions store 240, one or more gestures or user inputs stored in the gesture or user input store 254, or a combination of two or more thereof.
After block 256, the program codes 166 may return to block 168 as described above, so that measurements and inferences may be handled iteratively over a period of time.
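A simple, hedged sketch of the kind of check block 242 might perform, comparing a recent sequence of anatomical positions (here, a single joint angle over time) against a stored template; the template, tolerance, and gesture name are assumptions for illustration:

```python
import numpy as np

def matches_gesture(angle_sequence, template, tolerance_deg=15.0):
    """Return True if a sequence of joint angles stays within a tolerance
    of a stored gesture template of the same length."""
    seq = np.asarray(angle_sequence, dtype=float)
    tmpl = np.asarray(template, dtype=float)
    if seq.shape != tmpl.shape:
        return False
    return bool(np.all(np.abs(seq - tmpl) <= tolerance_deg))

# Example: a hypothetical "finger flick" going from extended to flexed and back.
flick_template = [0.0, 30.0, 60.0, 30.0, 0.0]
print(matches_gesture([2.0, 28.0, 55.0, 33.0, 1.0], flick_template))  # True
```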
Other inferences may be made. For example, speed, force, or both of movement may be detected or inferred, for example from one or more measurements or inferences of how forcefully or how fast a muscle contracts. Fit of the sensor 102 (or of another wearable or of other clothing) and volume of a muscle for a specific user may also be measured and inferred. Such measurements or inferences may indicate whether a size of a muscle changes over a period of time.
In general, the one or more output signals may control the display device 105 or one or more other display devices in different applications depending on the inferences such as those described above or calculations based on deformation measured by the sensor 102. For example, the one or more output signals may control the display device 105 in a gaming application, or the one or more output signals may control a virtual-reality, augmented-reality, or mixed-reality display. As another example, the one or more output signals may control one or more robotic devices. As another example, the one or more output signals may cause the display device 105 to display one or more anatomical positions stored in the anatomical positions store 240 at one or more different times, and such displays may facilitate analysis of body movements for analysis of sports performance, medical diagnosis, or other purposes. In alternative embodiments, program codes may cause the processor circuit 150 to predict gestures or user inputs based on movement of specific muscle bundles, bones, or tendons.
Also, in general, such control of the display device 105 may be real-time or may be delayed. For example, control of the display device 105 responsive to measurements of deformations by the sensor 102 may involve controlling a gaming application, a virtual-reality, augmented-reality, or mixed-reality display, or one or more robotic devices in real-time, or may display anatomical positions inferred from measurements of deformations by the sensor 102 in real time. Alternatively, such control of the display device 105 may be delayed. For example, anatomical positions inferred from measurements of deformations by the sensor 102 may be stored and accumulated over time, and may be displayed later.
In summary, in the embodiment described above, when the user moves fingers of the hand 186, the hand 186, or the forearm 106, deformation measurements by the deformation sensors may be used to form a time-dependent MTDT of the forearm 106, which may represent movement (such as gradual movement, for example) of specific muscle bundles, bones, tendons, or two or more thereof within the forearm 106, and such movement can be related (in real time, for example) to movements (such as gradual movements, for example) of the hand 186 or of one or more fingers of the hand 186, including transitions between gestures.
Referring to
In this embodiment, the sensor 258 may provide MTDT monitoring for accurate detection and monitoring of walking patterns, gait, or running habits. Referring to
Although the sensor 258 is shown on a lower leg 260, sensors of other embodiments may sense movements of body parts, such as a thigh, a hip, one or more buttocks, or a combination of two or more thereof.
Referring to
Accurate placement of the plurality of deformation sensors, such as deformation sensors 272 and 274, on both anterior and posterior sides of the torso 270 (on a chest, abdomen, and back, for example) may enable measuring MTDT data from some or all of the upper body. The deformation sensors placed on the torso 270 (or for example the chest and epigastrium) may, in addition, measure respiratory rate, respiratory pattern, heart rate, heart rate variability, or other vital signs. The plurality of deformation sensors can measure MTDT from both the anterior and posterior sides of the torso 270, which can be associated with body movement such as shoulder stretch and/or rotational movements of the torso 270.
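A small illustrative sketch of estimating a respiratory rate from a single chest-deformation channel by peak counting; the sampling rate and minimum breath spacing are assumptions, and a practical implementation would likely involve more robust processing:

```python
import numpy as np
from scipy.signal import find_peaks

def respiratory_rate_bpm(chest_signal, fs_hz=25.0):
    """Estimate breaths per minute from one chest deformation channel by
    counting peaks at least 1.5 s apart (an assumed minimum spacing)."""
    signal = np.asarray(chest_signal, dtype=float)
    peaks, _ = find_peaks(signal, distance=int(1.5 * fs_hz))
    duration_min = len(signal) / fs_hz / 60.0
    return len(peaks) / duration_min if duration_min > 0 else 0.0
```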
Sensors of other embodiments may be in a shirt, a top, a vest, or other upper-body garments or wearables.
The system 100 is an example only, and alternative embodiments may differ. For example, referring to
As shown in
In general, different embodiments may include multiple sensors on the same body, which may be in communication with each other, and which may facilitate measurements more accurately or more comprehensively than a single sensor. Further, one or more sensors on multiple bodies (as shown in
Further, multiple computing devices such as those described herein may execute the same or complementary programs, and may interact with each other using a computer network (such as the Internet, for example).
In summary, sensors such as those described herein may be worn on one or more parts of a body, and may measure deformations that may be associated with movements of one or more other parts of the body. Such associations may provide input for applications such as virtual reality, augmented reality, mixed reality, robotic control, other human-computer interactions, health monitoring, rehabilitation, sports and wellness, or gaming, for example.
Although specific embodiments have been described and illustrated, such embodiments should be considered illustrative only and not as limiting the invention as construed according to the accompanying claims.
Filing Document: PCT/CA2019/050493; Filing Date: 4/18/2019; Country: WO; Kind: 00

Number: 62660168; Date: Apr 2018; Country: US