The present technology relates to a connected device software application. More specifically, but not by way of limitation, the present technology relates to an application capable of assessing a user's real-time fall risk when installed on a commercially available mobile device equipped with inertial measurement capabilities, Internet and/or cellular connectivity, and voice communication technology.
The approaches described in this section could be pursued, but are not necessarily approaches that have previously been conceived or pursued. Therefore, unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section.
In response to the numerous risks associated with aging, and the fact that the population of the United States is rapidly aging, the effort to maintain independence has led to the development of a number of applications focused on various aspects of health monitoring. Most of these applications include capabilities for monitoring biological factors such as blood pressure, heart rate, blood glucose levels, and/or sleep. While evidence suggests that these biological signals are associated with overall health and that consistent monitoring of such parameters can contribute to improved health, currently available health applications do not provide the capability to consistently monitor a user's capacity for producing motion. Additionally, these health monitoring applications are generally not self-contained and often require hardware in addition to the device on which they have been installed. The present technology provides a self-contained, comprehensive method of evaluating a user's movement capabilities and provides non-invasive methods to directly monitor and identify declines in functional capacity. The results of these critical motion assessments can be easily accessed by the user and displayed on the user's mobile device in various formats.
In some embodiments the present disclosure is directed to a system of one or more computers that can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination thereof installed on the system that in operation causes the system to perform the actions and/or method steps described herein.
According to some embodiments the present technology is directed to a method for monitoring movement capabilities of a user using clinical mobility based assessments, the method comprising: (a) providing, using a mobile device comprising an inertial measurement device, a clinical mobility based assessment to a user; (b) generating, using the inertial measurement device, inertial data of the user that is indicative of movement capabilities of the user based on the clinical mobility based assessment; (c) logging the inertial data of the user locally to the mobile device resulting in locally logged inertial data of the user; (d) processing in real-time the locally logged inertial data of the user to determine position and orientation of the mobile device during the clinical mobility based assessment; (e) determining, using the position and the orientation of the mobile device during the clinical mobility based assessment, a physical movement assessment of the user associated with the clinical mobility based assessment; and (f) displaying, using the mobile device, at least a portion of the physical movement assessment to the user.
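By way of a non-limiting illustration, the following Python sketch outlines how steps (a) through (f) could be orchestrated on a device. The helper names (read_imu_samples, display) and the naive integration used for step (d) are assumptions introduced purely for illustration and do not represent the application's actual implementation.

```python
import numpy as np

def process_inertial(t, accel, gyro):
    # Placeholder for step (d): naive integration of acceleration to velocity
    # and position, and of angular rate to orientation; fuller pipelines
    # (gravity counterbalancing, drift compensation) are sketched later.
    dt = np.gradient(t)[:, None]
    velocity = np.cumsum(accel * dt, axis=0)
    position = np.cumsum(velocity * dt, axis=0)
    orientation = np.cumsum(gyro * dt, axis=0)
    return position, orientation

def run_assessment(read_imu_samples, display):
    display("Perform the selected mobility test, then press stop.")   # (a)
    t, accel, gyro = read_imu_samples()                                # (b)
    np.savez("assessment_log.npz", t=t, accel=accel, gyro=gyro)        # (c)
    position, orientation = process_inertial(t, accel, gyro)           # (d)
    assessment = {                                                     # (e)
        "test_duration_s": float(t[-1] - t[0]),
        "total_displacement_m": float(np.linalg.norm(position[-1])),
    }
    display(f"Test duration: {assessment['test_duration_s']:.1f} s")   # (f)
    return assessment
```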
In various embodiments the method includes displaying a representation of the clinical mobility based assessment via an interactive animated conversational graphical user interface displayed by the mobile device.
In some embodiments of the method, the clinical mobility based assessment includes one or more of a test duration, a turning duration, a sit-to-stand duration, a stand-to-sit duration, a number of sit-to-stand repetitions completed within a predetermined period of time, and a number of stand-to-sit repetitions completed within a predetermined period of time.
In various embodiments the inertial data of the user that is indicative of movement capabilities of the user based on the clinical mobility based assessment comprises gyroscope data generated using a gyroscope; and accelerometer data generated using an accelerometer.
In some embodiments the processing in real-time the locally logged inertial data of the user to determine position and orientation of the mobile device during the clinical mobility based assessment comprises: segmenting and aligning the locally logged inertial data of the user resulting in segmented and aligned inertial data of the user; gravitational acceleration counterbalancing of the segmented and aligned inertial data of the user resulting in counterbalanced inertial data of the user; determining velocity of the mobile device during the clinical mobility based assessment using the counterbalanced inertial data of the user; drift compensating the velocity of the mobile device during the clinical mobility based assessment resulting in drift compensated velocity data; and determining the position and the orientation of the mobile device during the clinical mobility based assessment using the drift compensated velocity data.
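By way of a non-limiting example, a minimal numerical sketch of this pipeline is shown below, assuming uniformly sampled data, a device that is approximately at rest at the start and end of the segment, and gravity estimated as the per-axis mean of the segment; the function and parameter names are illustrative rather than the application's actual implementation.

```python
import numpy as np

def process_segment(accel, gyro, fs=100.0):
    """accel, gyro: (N, 3) arrays (m/s^2, rad/s) for one aligned test segment."""
    dt = 1.0 / fs
    # Gravitational acceleration counterbalancing: subtract the per-axis mean
    # of the segment as an estimate of the static gravity/bias component.
    linear_accel = accel - accel.mean(axis=0)
    # Velocity by trapezoidal integration of the counterbalanced acceleration.
    velocity = np.vstack([np.zeros(3),
                          np.cumsum((linear_accel[1:] + linear_accel[:-1]) * 0.5 * dt, axis=0)])
    # Drift compensation: remove a linear trend so the velocity returns to
    # zero at the end of the segment (zero-velocity start/end assumption).
    ramp = np.linspace(0.0, 1.0, len(velocity))[:, None]
    end_velocity = velocity[-1].copy()
    velocity = velocity - ramp * end_velocity
    # Position from the drift-compensated velocity; orientation from the
    # gyroscope angular rates (small-angle integration).
    position = np.vstack([np.zeros(3),
                          np.cumsum((velocity[1:] + velocity[:-1]) * 0.5 * dt, axis=0)])
    orientation = np.cumsum(gyro * dt, axis=0)
    return position, orientation
```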
In various embodiments the processing in real-time the locally logged inertial data of the user to determine position and orientation of the mobile device during the clinical mobility based assessment comprises: segmenting and aligning the locally logged inertial data of the user resulting in segmented and aligned inertial data of the user; integrating angular orientation of the segmented and aligned inertial data of the user resulting in counterbalanced inertial data of the user; determining velocity of the mobile device during the clinical mobility based assessment using the counterbalanced inertial data of the user; drift compensating the velocity of the mobile device during the clinical mobility based assessment resulting in drift compensated velocity data; and determining the position and the orientation of the mobile device during the clinical mobility based assessment using the drift compensated velocity data.
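For the angular-orientation variant, a sketch of integrating the gyroscope into a rotation estimate and using that estimate to remove gravity in a world frame is given below; the first-order rotation update, the world-axis convention, and the function names are simplifying assumptions for illustration only.

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix of a 3-vector of angular rates."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def orientation_counterbalance(accel, gyro, fs=100.0, g=9.81):
    """accel (m/s^2) and gyro (rad/s) as (N, 3) arrays; returns world-frame
    linear acceleration with gravity removed, plus the rotation matrices."""
    dt = 1.0 / fs
    R = np.eye(3)                      # device-to-world rotation estimate
    gravity = np.array([0.0, 0.0, g])  # assume the world +Z axis points up
    rotations, world_accel = [], []
    for a, w in zip(accel, gyro):
        # First-order update of the rotation matrix from the angular rate.
        R = R @ (np.eye(3) + skew(w) * dt)
        # Re-orthonormalize to limit numerical drift of the estimate.
        u, _, vt = np.linalg.svd(R)
        R = u @ vt
        rotations.append(R)
        world_accel.append(R @ a - gravity)
    return np.array(world_accel), np.array(rotations)
```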
In some embodiments the method further comprises: determining features of functional movements of the user based on the position and the orientation of the mobile device during the clinical mobility based assessment, the features of functional movements including one or more of: time to completion of a task, rate to completion of a task, total repetitions of a task completed within a predetermined period of time, decay of repetitions of a task completed within a predetermined period of time, turn rate, anteroposterior sway, mediolateral sway, gait characteristics, total magnitude of displacement, vertical displacement, mediolateral displacement, and resultant displacement.
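Continuing the illustrative Python examples above, the following sketch computes a few of the listed features from the position and orientation traces; the axis convention (mediolateral, anteroposterior, vertical) and the feature names are assumptions for the purpose of illustration.

```python
import numpy as np

def movement_features(t, position, orientation):
    """t: (N,) seconds; position: (N, 3) metres with assumed columns
    (mediolateral, anteroposterior, vertical); orientation: (N, 3) integrated
    angles in radians."""
    yaw_rate = np.gradient(orientation[:, 2], t)  # turn rate about vertical axis
    return {
        "time_to_completion_s": float(t[-1] - t[0]),
        "mediolateral_sway_m": float(position[:, 0].max() - position[:, 0].min()),
        "anteroposterior_sway_m": float(position[:, 1].max() - position[:, 1].min()),
        "vertical_displacement_m": float(position[:, 2].max() - position[:, 2].min()),
        "resultant_displacement_m": float(np.linalg.norm(position[-1] - position[0])),
        "peak_turn_rate_rad_s": float(np.abs(yaw_rate).max()),
    }
```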
In various embodiments the physical movement assessment of the user includes one or more of a static stability of the user, dynamic stability of the user, postural stability of the user, balance of the user, mobility of the user, fall risk of the user, lower body muscular strength of the user, lower body muscular endurance of the user, lower body muscular flexibility of the user, upper body muscular strength of the user, and upper body muscular endurance of the user.
In some embodiments the method further comprises: receiving the locally logged inertial data of the user and the physical movement assessment of the user; conducting a longitudinal physical movement assessment analysis using the physical movement assessment of the user associated with the clinical mobility based assessment; and displaying at least a portion of the longitudinal physical movement assessment analysis to the user.
Certain embodiments of the present technology are illustrated by the accompanying figures. It will be understood that the figures are not necessarily to scale and that the technology is not necessarily limited to the particular embodiments illustrated herein.
Detailed embodiments of the present technology are disclosed herein. It should be understood that the disclosed embodiments are merely exemplary of the invention, which may be embodied in multiple forms. The details disclosed herein are not to be interpreted as limiting, but rather as a basis for the claims.
In various embodiments an object of the present technology is a software application to provide monitoring and assessment of functional motion capacity of a user through simple interaction with an inertial measurement unit equipped mobile device. As such, the software application functions to consistently evaluate the motion characteristics of a user and report how those motion characteristics relate to the real-time functional capacity of the user. The software application also provides a user with the capability for assessing performance on a variety of fundamental movement tests. Additionally, the capacity of the software application to utilize cloud-based storage and compute functionality provides the capability for quick storage, retrieval and assessment of multiple tests in such a manner that real-time declines in functional movement capacity can be identified and reported. Additional advantages of the software application are apparent from the detailed embodiment descriptions and accompanying drawings, which set forth embodiments of the present technology.
In various embodiments the application 155 is a mobile application developed by Electronic Caregiver that is capable of monitoring the movement capabilities of the user 110. When in use, the application 155 provides capabilities for the collection, processing, storage, and analysis of data describing motion characteristics of the user 110 during various clinical mobility based assessments. For example, a clinical mobility based assessment may be a motion task. In various embodiments a clinical mobility based assessment may include one or more of a test duration, a turning duration, a sit-to-stand duration, a stand-to-sit duration, a number of sit-to-stand repetitions completed within a predetermined period of time, and a number of stand-to-sit repetitions completed within a predetermined period of time.
In various embodiments the user 110 interacts with the mobile device 120 through a representation of the clinical mobility based assessment displayed via an interactive animated conversational graphical user interface of the mobile device 120. Embodiments of the present technology include providing, using the mobile device 120 comprising the inertial measurement device 130, a clinical mobility based assessment to a user and generating, using the inertial measurement device 130, inertial data of the user 110 that is indicative of movement capabilities of the user 110 based on the clinical mobility based assessment. Embodiments comprise logging the inertial data of the user 110 locally to the mobile device 120, resulting in locally logged inertial data of the user 110. In various embodiments the inertial data of the user 110 that is indicative of movement capabilities of the user 110 based on the clinical mobility based assessment comprises gyroscope data generated using the gyroscope 140 and accelerometer data generated using the accelerometer 150.
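A minimal sketch of locally logging the gyroscope 140 and accelerometer 150 samples is shown below; the per-sample callback, the CSV format, and the file name are assumptions introduced for illustration and do not represent the mobile platform's actual sensor API.

```python
import csv
import time

class InertialLogger:
    """Appends timestamped accelerometer and gyroscope samples to a local file."""

    def __init__(self, path="inertial_log.csv"):
        self.file = open(path, "w", newline="")
        self.writer = csv.writer(self.file)
        self.writer.writerow(["t", "ax", "ay", "az", "gx", "gy", "gz"])

    def on_sample(self, accel, gyro):
        # Called once per IMU sample with (x, y, z) tuples from the
        # accelerometer and gyroscope.
        self.writer.writerow([time.time(), *accel, *gyro])

    def close(self):
        self.file.close()

# Example usage with one synthetic sample:
logger = InertialLogger()
logger.on_sample(accel=(0.02, -0.01, 9.81), gyro=(0.001, 0.0, -0.002))
logger.close()
```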
In various embodiments the inertial data processing algorithm 200 is used for monitoring movement capabilities of the user 110 using clinical mobility based assessments. Embodiments of the present technology include processing in real-time the locally logged inertial data of the user 110 to determine position and orientation of the mobile device 120 during the clinical mobility based assessment. In some embodiments this processing comprises segmenting and aligning the locally logged inertial data of the user 110, resulting in segmented and aligned inertial data of the user 110.
Embodiments of the present technology include processing in real-time the locally logged inertial data of the user 110 to determine position and orientation of the mobile device 120 during the clinical mobility based assessment. In some embodiments the processing in real-time the locally logged inertial data of the user 110 to determine position and orientation of the mobile device during the clinical mobility based assessment comprises: segmenting and aligning the locally logged inertial data of the user 110 resulting in segmented and aligned inertial data of the user 110; integrating angular orientation of the segmented and aligned inertial data of the user 110 resulting in counterbalanced inertial data of the user 110; determining velocity of the mobile device during the clinical mobility based assessment using the counterbalanced inertial data of the user 110; drift compensating the velocity of the mobile device during the clinical mobility based assessment resulting in drift compensated velocity data; and determining the position and the orientation of the mobile device during the clinical mobility based assessment using the drift compensated velocity data.
In general, the cloud computing network 320 is a cloud-based computing environment, which is a resource that typically combines the computational power of a large grouping of processors (such as within web servers) and/or that combines the storage capacity of a large grouping of computer memories or storage devices.
The cloud computing network 320 may be formed, for example, by a network of web servers that comprise a plurality of computing devices, such as the computer system 700, with each server (or at least a plurality thereof) providing processor and/or storage resources. These servers may manage workloads provided by multiple users (e.g., cloud resource customers or other users).
In various embodiments, the method 600 optionally includes receiving 670 the locally logged inertial data of the user and the physical movement assessment of the user; conducting 680 a longitudinal physical movement assessment analysis using the physical movement assessment of the user associated with the clinical mobility based assessment; and displaying 690 at least a portion of the longitudinal physical movement assessment analysis to the user.
In various embodiments the conducting of the longitudinal physical movement assessment analysis comprises: receiving a predetermined threshold of change in physical movement associated with a domain from a cloud-based normative data storage; comparing the physical movement assessment of the user with the predetermined threshold of change in physical movement; determining, based on the comparing, that the physical movement assessment exceeds the predetermined threshold of change in physical movement; and displaying, if the physical movement assessment exceeds the predetermined threshold of change in physical movement, a longitudinal mobility assessment to the user.
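A compact sketch of this comparison logic is given below; the baseline definition (mean of prior scores), the example domain name, and the threshold callable stand in for the cloud-based normative data storage and are assumptions for illustration only.

```python
def exceeds_change_threshold(history, current, domain, get_threshold):
    """history: list of prior scores for this domain; current: latest score;
    get_threshold: callable returning the normative change threshold."""
    if not history:
        return False
    baseline = sum(history) / len(history)
    change = abs(current - baseline)
    return change > get_threshold(domain)

# Example: flag a timed up-and-go duration that has slowed markedly relative
# to the user's prior assessments (threshold value is illustrative).
flagged = exceeds_change_threshold(
    history=[9.8, 10.1, 10.0], current=13.2,
    domain="timed_up_and_go_duration_s", get_threshold=lambda d: 2.0)
```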
In various embodiments, systems and methods of the present technology described herein are capable of performing the same assessment as a clinician on demand. As such, the user 110 assumes a seated position in a standard chair, opens the application 155 (e.g., Electronic Caregiver application), and selects a clinical mobility based assessment (i.e., the timed up-and-go clinical mobility based assessment) from the drop-down menu on the mobile device 120. Upon test selection, the inertial measurement device 130 is activated and begins collecting inertial data of the user 110. After a 5-second countdown, the user 110 performs the timed up-and-go test from beginning to end. After returning to the seated position, the user selects the end test icon to terminate collection of inertial data. As the timed up-and-go test is completed, the signal segmentation algorithm segments the inertial data into a standing phase 415, an outbound phase 420 (i.e., outbound walking), a 180° turn phase 425 (i.e., turning), an inbound phase 430 (i.e., inbound walking), and a sitting phase 435. Following segmenting and aligning the locally logged inertial data of the user, a variety of features (e.g., time to test completion, magnitude of vertical acceleration during standing, and magnitude of vertical acceleration during sitting) are used to identify characteristics of functional decline of the user 110. For example, characteristics of functional decline may include an increase in the time to complete the timed up-and-go test, a decline in the peak and/or overall magnitude of vertical acceleration during the standing phase 415, or an increase in the peak and/or overall magnitude of vertical acceleration during the sitting phase 435.
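The following sketch illustrates one simplified way such a segmentation could be performed from the vertical acceleration and yaw-rate signals; the thresholds and phase-detection heuristics are assumptions for illustration and are not the application's signal segmentation algorithm.

```python
import numpy as np

def segment_tug(t, vert_accel, yaw_rate, accel_thresh=1.0, turn_thresh=1.0):
    """Returns approximate (start, end) times for each timed up-and-go phase.
    vert_accel: vertical linear acceleration (m/s^2); yaw_rate: rad/s."""
    # First and last vertical-acceleration bursts mark sit-to-stand and
    # stand-to-sit; the sustained yaw-rate burst marks the 180-degree turn.
    stand_idx = int(np.argmax(np.abs(vert_accel) > accel_thresh))
    sit_idx = len(vert_accel) - 1 - int(np.argmax(np.abs(vert_accel)[::-1] > accel_thresh))
    turn_idx = np.flatnonzero(np.abs(yaw_rate) > turn_thresh)
    turn_start, turn_end = ((turn_idx[0], turn_idx[-1]) if turn_idx.size
                            else (stand_idx, stand_idx))
    return {
        "standing": (t[0], t[stand_idx]),
        "outbound": (t[stand_idx], t[turn_start]),
        "turn_180": (t[turn_start], t[turn_end]),
        "inbound": (t[turn_end], t[sit_idx]),
        "sitting": (t[sit_idx], t[-1]),
    }
```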
Another common functional test utilized in a geriatric care provision setting is the postural stability test. The postural stability test requires the user 110 to maintain a static standing position for a period of time during which postural sway measurements are collected. As the postural stability test is completed, a clinician typically records the observed stability of the user 110 completing the postural stability test as well as the various magnitudes of acceleration that are indicative of postural sway. Again, systems and methods of the present technology including the application 155 (e.g., Electronic Caregiver application) are capable of performing the same assessment as the clinician on demand. As such, the user 110 assumes a standing position, opens the application 155 (e.g., Electronic Caregiver application), and selects the postural stability test from a drop-down menu. Upon selection of the postural stability test, the inertial measurement device 130 in the mobile device 120 is activated and begins collecting inertial data of the user 110. After a 5-second countdown, the user 110 performs the postural stability test for a temporal period specified by the application 155. As the postural stability test is completed, the inertial data of the user 110 is processed and transposed into anteroposterior, mediolateral, and resultant magnitudes (i.e., accelerometer data) and angular motion magnitudes about the anteroposterior, mediolateral, and transverse axes (i.e., gyroscopic data). The accelerometer data and the gyroscopic data are analyzed to quantify the magnitude of sway along and about each bodily axis, which can be used as an indicator of overall static stability and potential risk of falling of the user 110.
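By way of example, root-mean-square sway magnitudes along and about each bodily axis could be computed as sketched below; the column-to-axis assignment and the use of RMS as the sway magnitude are illustrative assumptions rather than the application's exact computation.

```python
import numpy as np

def sway_metrics(accel, gyro):
    """accel, gyro: (N, 3) arrays; columns assumed (mediolateral,
    anteroposterior, vertical) for accel and (about ML, about AP, about
    transverse) for gyro."""
    lin = accel - accel.mean(axis=0)          # remove gravity/offset per axis
    ml, ap = lin[:, 0], lin[:, 1]
    resultant = np.linalg.norm(lin[:, :2], axis=1)
    ang = gyro - gyro.mean(axis=0)
    return {
        "rms_ml_sway": float(np.sqrt(np.mean(ml ** 2))),
        "rms_ap_sway": float(np.sqrt(np.mean(ap ** 2))),
        "rms_resultant_sway": float(np.sqrt(np.mean(resultant ** 2))),
        "rms_angular_sway_per_axis": [float(np.sqrt(np.mean(a ** 2))) for a in ang.T],
    }
```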
The example computer system 700 includes a processor or multiple processors 705 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), and a main memory 710 and a static memory 715, which communicate with each other via a bus 720. The computer system 700 can further include a video display unit 725 (e.g., a liquid-crystal display (LCD), organic light emitting diode (OLED) display, or a cathode ray tube (CRT)). The computer system 700 also includes at least one input device 730, such as an alphanumeric input device (e.g., a keyboard), a cursor control device (e.g., a mouse), a microphone, a digital camera, a video camera, and so forth. The computer system 700 also includes a disk drive unit 735, a signal generation device 740 (e.g., a speaker), and a network interface device 745.
The disk drive unit 735 includes a machine-readable medium 750 (also referred to as a computer-readable medium 750), which stores one or more sets of instructions and data structures (e.g., instructions 755) embodying or utilized by any one or more of the methodologies or functions described herein. The instructions 755 can also reside, completely or at least partially, within the main memory 710, the static memory 715, and/or within the processor(s) 705 during execution thereof by the computer system 700. The main memory 710, the static memory 715, and the processor(s) 705 also constitute machine-readable media.
The instructions 755 can further be transmitted or received over a communications network 760 via the network interface device 745 utilizing any one of a number of well-known transfer protocols (e.g., Hyper Text Transfer Protocol (HTTP), CAN, Serial, and Modbus). The communications network 760 can include the Internet, a local intranet, a Personal Area Network (PAN), a Local Area Network (LAN), a Wide Area Network (WAN), a Metropolitan Area Network (MAN), a virtual private network (VPN), a storage area network (SAN), a frame relay connection, an Advanced Intelligent Network (AIN) connection, a synchronous optical network (SONET) connection, a digital T1, T3, E1 or E3 line, a Digital Data Service (DDS) connection, a Digital Subscriber Line (DSL) connection, an Ethernet connection, an Integrated Services Digital Network (ISDN) line, a cable modem, an Asynchronous Transfer Mode (ATM) connection, a Fiber Distributed Data Interface (FDDI) connection, or a Copper Distributed Data Interface (CDDI) connection. Furthermore, the communications network 760 can also include links to any of a variety of wireless networks including Wireless Application Protocol (WAP), General Packet Radio Service (GPRS), Global System for Mobile Communication (GSM), Code Division Multiple Access (CDMA) or Time Division Multiple Access (TDMA), cellular phone networks, Global Positioning System (GPS), cellular digital packet data (CDPD), Research in Motion, Limited (RIM) duplex paging network, Bluetooth radio, or an IEEE 802.11-based radio frequency network.
While the machine-readable medium 750 is shown in an example embodiment to be a single medium, the term “computer-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “computer-readable medium” shall also be taken to include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies of the present application, or that is capable of storing, encoding, or carrying data structures utilized by or associated with such a set of instructions. The term “computer-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media. Such media can also include, without limitation, hard disks, floppy disks, flash memory cards, digital video disks, random access memory (RAM), read only memory (ROM), and the like.
The example embodiments described herein can be implemented in an operating environment comprising computer-executable instructions (e.g., software) installed on a computer, in hardware, or in a combination of software and hardware. The computer-executable instructions can be written in a computer programming language or can be embodied in firmware logic. If written in a programming language conforming to a recognized standard, such instructions can be executed on a variety of hardware platforms and interface with a variety of operating systems. Although not limited thereto, computer software programs for implementing the present method can be written in any number of suitable programming languages such as, for example, Hypertext Markup Language (HTML), Dynamic HTML, XML, Extensible Stylesheet Language (XSL), Document Style Semantics and Specification Language (DSSSL), Cascading Style Sheets (CSS), Synchronized Multimedia Integration Language (SMIL), Wireless Markup Language (WML), Java™, Jini™, C, C++, C#, .NET, Adobe Flash, Perl, UNIX Shell, Visual Basic or Visual Basic Script, Virtual Reality Markup Language (VRML), ColdFusion™, or other compilers, assemblers, interpreters, or other computer languages or platforms.
Thus, technology for monitoring movement capabilities of a user using clinical mobility based assessments is disclosed. Although embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes can be made to these example embodiments without departing from the broader spirit and scope of the present application. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
This application claims the priority benefit of U.S. Provisional Application Ser. No. 62/645,053, filed on Mar. 19, 2018 titled “Consumer Application for Mobile Assessment of Functional Capacity and Falls Risk,” which is hereby incorporated by reference herein in its entirety including all appendices and all references cited therein.
Number | Date | Country
---|---|---
20190282130 A1 | Sep 2019 | US

Number | Date | Country
---|---|---
62645053 | Mar 2018 | US