The present technology pertains to remote physical therapy. In particular, but not by way of limitation, the present technology provides systems and methods of remote physical therapy and assessment of patients.
In various embodiments, the present disclosure is directed to methods carried out on a system and executed on one or more computing devices, which can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination thereof installed on the system and/or computing devices that in operation causes or cause the system to perform actions and/or method steps to enable remote physical therapy.
In some embodiments the present technology is directed to a system for remote physical therapy and assessment of patients, the system comprising: a) at least one on-location sensor to capture and transmit visual data; b) at least one on-location client device to display personalized instructions; c) an interactive graphical user interface to display the personalized instructions on the at least one on-location client device; d) a server system that includes: an AI virtual game engine (also referred to herein as “AI”, “AI game engine”, or “AI engine”) to analyze the visual data and produce updated personalized instructions; at least one processor; and a memory communicatively coupled to the at least one processor, the memory storing instructions executable by the processor to run the AI virtual game engine; and e) a network, wherein the network is connected to the server system, the at least one on-location sensor, and the at least one on-location client device. The on-location sensor may be a depth-of-field camera. The remote physical therapy system may include, connect to, and/or integrate with electronic medical record system(s). The AI virtual game engine may also use data from the electronic medical record system to produce updated personalized instructions, routines, movements, or physical therapy plans, separately or in addition to data the system collects itself. The network in this system may also be a content delivery network. The network may also connect to additional or fewer devices and systems.
Embodiments of the present technology may also be directed to systems and methods for physical therapy training and delivery (referred to herein as “PTTD”). PTTD is an artificial intelligence virtual physical therapy exergame application that remotely delivers clinically prescribed fitness regimens through a graphical user interface (GUI) with instructional animation by a virtual caregiver avatar. PTTD integrates artificial intelligence into its motion analysis software that allows a user/patient to measure their physical progress. The application may be used to supplement home fitness programs that are prescribed by the user's clinician(s) for preventive and rehabilitative physical therapy or by the user's physical trainers for sports cross-training. Clinicians and fitness/sports organizations may register for PTTD. This allows clinicians to remotely promote their care plan to the patient who is in the home and provides further insight into patient outcome progress. PTTD does this by integrating machine learning algorithms (MLA) into joint tracking motion analysis for user-compliance detection and kinesthetic progress. Moreover, PTTD includes clinically validated exercise animations, and allows for user or third-party access to joint-tracking data for external validation of users' regimens.
In the description, for purposes of explanation and not limitation, specific details are set forth, such as particular embodiments, procedures, techniques, etc. to provide a thorough understanding of the present technology. However, it will be apparent to one skilled in the art that the present technology may be practiced in other embodiments that depart from these specific details.
The accompanying drawings, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments of concepts that include the claimed disclosure and explain various principles and advantages of those embodiments.
The methods and systems disclosed herein have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
Physical therapy is provided as a primary care treatment or in conjunction with other forms of medical services. It is directed to addressing illnesses, injuries and trauma that affect an individual's ability to move or perform functional tasks. It can be an important component of many treatment regimens and is utilized in the treatment and long-term management of chronic conditions, illnesses and even the effects of ageing. It is also widely used in the treatment and rehabilitation of injuries, short-term pain and physical trauma.
Physical therapy may be composed of a number of components including the monitoring and/or assessment of patients, prescribing and/or carrying out physical routines or movements, instructing patients to perform specific actions, movements or activities, and scheduling short or long-term physical routines for patients; all these components designed to rehabilitate and treat pain, injury or the ability to move and perform functional tasks. Physical therapy may also contain an educational component directed to the patient and the patient's care circle.
Embodiments of the present technology provide systems and methods that enable physical therapy, in some or all of its different forms, to be undertaken and carried out remotely, from different locations and without any physical therapist or other individual being present with the patient.
While the present technology is susceptible of embodiment in many different forms, there is shown in the drawings and will herein be described in detail several specific embodiments with the understanding that the present disclosure is to be considered as an exemplification of the principles of the present technology and is not intended to limit the technology to the embodiments illustrated.
By making physical therapy remotely available to patients, the present technology enables a wide range of applications capable of addressing unmet needs in the areas of medical diagnostics, patient education, treatment, rehabilitation, communication and information sharing, and the like. There is generally a common need to find more cost-effective and scalable methods of assessing, monitoring and treating patients based on their medical history, future and long-term prospects, and their current physical status and abilities, as well as a need to deliver physical therapy in an accessible and standardized format to all patients, and in a variety of locations, with or without physical therapists or other individuals being physically present with the patient.
The term ‘patient’ is used to describe any individual that is using, intending to use, or is prescribed the use of any embodiment of the systems and methods described herein; it may be used synonymously with the term ‘user’. The term ‘care circle’ is used to describe individuals, organizations or entities that may be assigned, either by the patient or by other means, to be notified and informed of the patient's status, progress or need for immediate attention and/or help.
The remote physical therapy system provides and can incorporate and utilize different forms of motion detection, monitoring and tracking capabilities, in conjunction with analysis that may be executed and provided by artificial intelligence (AI) to both enhance the capture of audio-visual and other motion data as well as to provide analysis of the captured motion detection data and other available data. The AI engine may utilize, amongst other factors, performance metrics from a patient carrying out physical therapy routines, or patient movement measurements, as variables to determine or calculate performance factors and metrics, patient health status, or other indicators of the patient's physical or mental state or wellbeing (all these collectively referred to as “patient state”). This system may also be able to integrate with, and read and/or write data to, electronic medical record systems of patients, and incorporate or utilize these records in analysis and other computations. A patient's historical data, both captured by the system and from external records, may serve as a baseline that the system and AI engine use to determine past, current, and future performance metrics or to provide insights into the health status of a patient. The assessment and analysis may also be carried out by analyzing metrics of performance, recovery, and fitness. The use of an AI engine is not strictly necessary.
Some embodiments of the system deliver routines to the patient's client device and display them with a graphical user interface, which may be that of an interactive avatar. The client device may be an Addison powered Artificial Intelligence based gait and physical therapy console, or a display device such as a cellular phone, tablet, computing device, and/or augmented reality or virtual reality headset/device or other audiovisual technology device. The interactive avatar may be able to perform routines or movements and provide instructions, feedback, information, communication, engagement or perform examples of physical therapy movements or routines for the patient or their care circle.
Some embodiments of the system may also be incorporated into smart home technologies, homecare, or other software applications or electronic devices that may assist in monitoring, observing, and ensuring compliance, or otherwise assist in detecting movement, measuring performance of patients, or even displaying the graphical user interface and/or interactive avatar. One or more plug-and-play input and output devices may also be used and be incorporated into the system.
Motion capture, skeletal tracking, detection of sounds, or the capture of infrared data may be undertaken by components embedded within a dedicated console or by plug-and-play or other devices that can be added to the system as desired; these devices may include microphones; cameras, including depth-of-field, infrared or thermal cameras; or other motion or image capture devices and technology. Some embodiments may incorporate air handling, air correction, skeletal tracking software and other detection and monitoring technologies.
Embodiments of the system may be able to detect movement, general health status and indicators, adverse reactions to drugs, sudden or rapid movements including but not limited to seizures, falls or heart attacks, or other changes in the physical or mental state of the patient based on visual, motion detection, skeletal tracking, sound, or other forms of captured data. It may detect or determine the patient state based on long or short-term analysis and/or calculations in conjunction with motion detection and/or analysis of a patient. It may also detect and/or determine the patient's state from data not directly collected or obtained from the physical therapy monitoring and assessment system. For example, data from the patient's records, medical history and of drug use may all be used.
In some embodiments, the system may be able to detect specific illnesses, diseases, deformities, or ailments suffered by the patient. One example could be detecting a parkinsonian shuffle from the gait velocity, time in swing, time in double support, cadence, inverted pendulum measurement, or festination of a patient.
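By way of a non-limiting illustration, such a gait-based screen might be sketched as follows. The metric names and threshold values below are illustrative assumptions only, not clinically validated criteria:

```python
# Hypothetical sketch of a rule-of-thumb screen for a shuffling gait
# pattern. Thresholds and field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class GaitSample:
    velocity_m_s: float        # forward gait velocity, meters/second
    cadence_steps_min: float   # steps per minute
    stride_length_m: float     # average stride length, meters
    double_support_pct: float  # percent of gait cycle in double support

def flag_shuffling_gait(sample: GaitSample) -> bool:
    """Flag a gait pattern for clinician review when several
    shuffle-associated indicators co-occur: slow velocity, short
    strides, and prolonged double support."""
    indicators = [
        sample.velocity_m_s < 0.8,
        sample.stride_length_m < 0.9,
        sample.double_support_pct > 30.0,
    ]
    return sum(indicators) >= 2
```

In practice, the thresholds, the metrics consulted, and the number of co-occurring indicators required would be chosen and validated by clinicians, or learned by the AI engine from captured data.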
In various embodiments a notification or form of communication is sent to the patient's care circle, to notify them of changes in the patient state, non-compliance with scheduled routines or when certain movement(s) are detected. The form of notification or communication may be set or may be designated by the system depending on the severity of the detected motion or status of the patient. Notification may be carried out using any form of communication including but not limited to digital, electronic, cellular, or even voice or visual, and may be delivered through a monitor, television, electronic device, or any other interface that is able to communicate with or notify the patient or their care circle.
In various embodiments, the system may be deployed in hospitals, medical offices and other types of clinics or nursing homes, enabling on-site and live motion detection and analysis of patients. One example may be the detection of a patient that walks into a hospital, where the characteristics and datapoints collected from the patient's motion inside the premises are captured and analyzed, and a report is produced and transmitted/communicated to the designated physician. The physician seeing the patient will then have up-to-date information prior to the visit, alerting the physician to certain symptoms or possible health issues that were detected or red-flagged from the analysis of the patient's motion and/or other captured characteristics. This enables the physician to undertake tests or ask more detailed questions based on the indicators and report provided by the system.
In some embodiments, the system may prescribe specific movements or physical therapy routines, from a library containing a catalogue of physical therapy movements, routines, and regimens, to treat or rehabilitate patients or reduce their pain from injuries, to reduce future health risks, falls risk and other physical problems or potential for injuries. This library may be stored in a database accessible by users and other stakeholders. The prescribed movements and/or routines may be personalized for each person based on both captured personal data as well as the patient's external medical records or data. Artificial intelligence may be utilized to prescribe or provision specific movements, routines, or regimens. Artificial intelligence or machine learning may be used to detect and assess the patient's or user's health status, general health indicators and patient state, and prescribe and alter movements, routines and/or physical therapy plans accordingly. Artificial Intelligence may also access databases or other external information repositories and incorporate that data when tailoring, customizing, and provisioning movements, routines or plans according to the patient's goals or needs.
In some embodiments artificial intelligence uses the total information captured from all patients to devise new physical therapy movements, routines or plans it assesses to be beneficial to a specific patient; these movements or routines may be created and then added to a library containing a catalogue of physical therapy routines or movements. Devising new routines or movements allows for specific and more precise treatment plans to be delivered to each patient. These treatment plans may then be collected or organized into standardized plans that are delivered to other patients that possess certain common factors or indicators. Prescribed changes may then be sent to the patient's care circle.
A library containing all movements and routines/regimens may be updated by adding new routines or movements and may be accessed to update individual patient routines. Each movement or routine comes with its own preset and calculated assessment metrics and performance variables, as well as associated notifications and instructions.
Various embodiments of the invention utilize an artificial intelligence virtual physical therapy exergame, physical therapy training and delivery (PTTD). PTTD delivers an exercise regimen or routine through a GUI. The exercise regimen can be individually tailored to improve a user's health status or condition by providing a customized approach to provisioning the virtual exercise programs. Provisioning of the user's regimen can be done by the individual or a qualified third party. A qualified third party can be, but is not restricted to, a user's physician, caregiver, or any other responsible party or member of the care circle. Provisioning of the regimen or routine includes, but is not restricted to, the selection of exercise(s), the number of sets, the number of repetitions, and the regimen schedule. The user may choose their programmed exercises from a non-relational database that connects to the GUI. The objects in the database serve as inputs to a schema that is programmed by the individual or third party. The schema pulls appropriate animations of the virtual caregiver to the user interface based on provisioned user inputs. Once a regimen is selected or added to the user's goals, the regimen is stored to the user's profile. The objective of PTTD is to improve patient mobility in a clinical setting; however, it is not limited to healthcare-based settings or outcomes for patients with ailments. For instance, PTTD can also be used in a cross-training environment to improve the athletic performance of individuals and/or athletes.
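As a non-limiting sketch, a provisioned regimen stored as a document in a non-relational database might resemble the following; all field names, keys, and values are hypothetical:

```python
# Hypothetical document-style record for a provisioned regimen; the
# schema and identifiers shown are illustrative assumptions only.
regimen = {
    "user_id": "user-001",
    "provisioned_by": "clinician-042",  # individual or qualified third party
    "exercises": [
        {
            "exercise_id": "hip-abduction-standing",
            "animation_key": "avatar/hip_abduction_v1",  # virtual caregiver animation
            "sets": 3,
            "repetitions": 10,
        }
    ],
    "schedule": {"days": ["Mon", "Wed", "Fri"], "time": "09:00"},
}

def animations_for(regimen: dict) -> list[str]:
    """Resolve which virtual caregiver animations the GUI should
    stream for this provisioned regimen."""
    return [ex["animation_key"] for ex in regimen["exercises"]]
```

On this sketch, the schema maps provisioned inputs (exercise, sets, repetitions, schedule) to the animation assets pulled to the user interface, and the whole record is stored to the user's profile.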
Physical therapists (PTs) collaborate with PTTD developers to continually build and improve upon an exercise directory and animation motion capture footage. A licensed PT provides developers with accredited preventative and rehabilitative exercises for virtual instruction. The exercises are stored to a database for users' individualized provisioning. Each exercise in the database can be based on, but is not restricted to, anatomical or injury classification (e.g., hip strengthening or rib fracture rehabilitation).
The database couples each exercise with its appropriate virtual caregiver animation. This animation serves as the set of exergame instructions for the user to follow. The avatar leads individuals through exercise. The motion capture (mocap) process for the animation is done utilizing mocap software. A developer in a mocap suit is recorded by cameras while performing exercises provided by the licensed PT. The PT supervises the developer through the motion capture process. The motion capture footage serves as the skeletal blueprint onto which the virtual caregiver's animation can be rendered. This ensures that the virtual caregiver instructs the patient with proper clinical form and modifications.
PTTD records and measures a user's compliance with their provisioned regimen or routine. This is accomplished by joint tracking and motion analysis software that is integrated with a depth camera. The user's movements are recorded via a non-invasive depth tracking software that collects real-time spatial location of joints across a three-dimensional axis. Changes in the spatial location are calculated into a multitude of variables such as, but not limited to, the number of repetitions, cadence, gait velocity, stride length, range of motion, and time to completion. Prior to beginning the regimen, the user will go through a calibration routine that collects the user's anatomical ground-truth data. This allows the motion analysis software to track the user's movements with greater precision and accuracy. The user's ground-truth data is processed through the motion analysis software and labeled appropriately for future analysis. The user is prompted according to the provisioned regimen to begin the prescribed activity. The user begins the activity, and their movements are recorded simultaneously as the provisioned regimen streams. All footage of users is made non-identifiable through a grayscale filter.
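By way of illustration, one of the listed variables, range of motion, can be derived from the real-time three-dimensional joint locations; the sketch below computes the angle at a joint from three tracked points (e.g., hip-knee-ankle for knee flexion) and the resulting range of motion over one activity stream. This is an assumed, simplified formulation, not the system's actual algorithm:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by 3-D points a-b-c,
    e.g. hip-knee-ankle for knee flexion, via the dot product of
    the vectors b->a and b->c."""
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(ba[i] * bc[i] for i in range(3))
    norm = math.dist(a, b) * math.dist(c, b)
    # Clamp to [-1, 1] to guard against floating-point drift.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def range_of_motion(angles):
    """Range of motion (degrees) over one recorded activity stream."""
    return max(angles) - min(angles)
```

Repetitions, cadence, and the other variables would follow analogously from changes in joint positions over time.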
Each activity stream that is recorded by the depth camera is appended to the user's profile in a non-relational database. Each time a new activity stream is stored to the user profile, a machine learning algorithm (MLA) is triggered for analysis on joint-tracking data. The MLA compiles the user's calibrated ground truth data as a baseline to compare future joint-tracking data. The MLA can be, but is not limited to, supervised or unsupervised models that interpret when the user's kinesthetics are anomalous to their usual patterns. Anomalies detected by the MLA are classified as improvements or declines that are visualized on a GUI. Another option for additional insight is access to the user's stored activity data without MLA analysis. Upon user authorization, stored activity data can be available for playback. This allows the user or an authorized third-party to review streaming footage to validate compliance and kinesthetic progress at their own discretion and provides another source of ground-truth for the models.
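A minimal sketch of this baseline comparison, substituting a simple z-score rule for a trained model, is shown below; the threshold, and the assumption that a higher value (as with, e.g., gait velocity) represents improvement, are illustrative:

```python
# Minimal, non-limiting sketch: flag a new measurement as anomalous when
# it deviates from the user's calibrated baseline distribution by more
# than a chosen number of standard deviations. Threshold is illustrative.
from statistics import mean, stdev

def classify(baseline: list[float], new_value: float,
             z_threshold: float = 2.0) -> str:
    """Return 'improvement', 'decline', or 'typical' relative to the
    user's own baseline; assumes higher values are better."""
    mu, sigma = mean(baseline), stdev(baseline)
    z = (new_value - mu) / sigma
    if z > z_threshold:
        return "improvement"
    if z < -z_threshold:
        return "decline"
    return "typical"
```

An actual MLA could be a supervised or unsupervised model as the text describes; the structure of the decision (compare against a per-user baseline, classify the anomaly's direction) would be the same.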
Another valuable component of PTTD is the cloud-based data store. Gait and posture are unique to an individual based on personal characteristics and features such as medical history, age, and gender. One impediment in machine-learning-based motion analysis is obtaining ground-truth data. What could be interpreted as anomalous movement for one user could be classified as an improvement for another if compared to a general population average. To address this, PTTD implements two methods of analysis: an individualized model, and a population-based model. Each time motion analysis is triggered for an individual, the data is stored to their user database. The individualized model has a user database that restricts its analysis to ground-truth data supplied only by the individual. The progress reports from the individual's database are compared to their personal ground-truth data set. As the user increases their PTTD usage, the motion analysis tailors insights to their individual baselines with the new datapoints, allowing the algorithm to gain further insight into developments or progress in the user's kinesthetics. This allows the motion analysis algorithm to continually retrain itself for improved accuracy.
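The distinction between the two analysis modes can be illustrated with a minimal sketch; the function and field names are hypothetical:

```python
# Non-limiting sketch contrasting the individualized and population-based
# comparisons described above; names and values are illustrative.
def progress_report(value: float, personal_baseline: float,
                    population_baseline: float) -> dict:
    """Compare one metric against the user's own baseline and against a
    population average, illustrating how the two readings can disagree."""
    return {
        "vs_personal": "improved" if value > personal_baseline else "declined",
        "vs_population": ("above average" if value > population_baseline
                          else "below average"),
    }
```

For example, a stride length of 1.0 m reads as "improved" against a personal baseline of 0.8 m while remaining "below average" against a population baseline of 1.2 m, which is precisely the misclassification the individualized model avoids.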
From these individual databases, data is pulled into a general population data lake to provide further macro-scale or big data insights. Authorization to access the data lake can be granted to any individual or organization through an ordering process. This makes ground-truth data available to those who do not have access to the physical hardware and the infrastructure that goes into collection of motion analysis data but wish to perform research and development using the data lake's features. All data is encrypted at rest and in transit.
The remote physical therapy system 100 includes a sensor 111 enabled to capture data, a client device 110 that displays an interactive avatar through a graphical user interface 112, a server system 105 that includes an AI Virtual Game Engine 120 which provides the functionality of the system and all of its embodiments as described throughout this document. The system may also include an Electronic Medical Record system 125. The different components of the system are connected via a network 115.
The system 205 may communicatively couple with the client 210 via a public or private network, such as network 215. Suitable networks may include or interface with any one or more of, for instance, a local intranet, a PAN (Personal Area Network), a LAN (Local Area Network), a WAN (Wide Area Network), a MAN (Metropolitan Area Network), a virtual private network (VPN), a storage area network (SAN), a frame relay connection, an Advanced Intelligent Network (AIN) connection, a synchronous optical network (SONET) connection, a digital T1, T3, E1 or E3 line, Digital Data Service (DDS) connection, DSL (Digital Subscriber Line) connection, an Ethernet connection, an ISDN (Integrated Services Digital Network) line, a dial-up port such as a V.90, V.34 or V.34bis analog modem connection, a cable modem, an ATM (Asynchronous Transfer Mode) connection, or an FDDI (Fiber Distributed Data Interface) or CDDI (Copper Distributed Data Interface) connection. Furthermore, communications may also include links to any of a variety of wireless networks, including WAP (Wireless Application Protocol), GPRS (General Packet Radio Service), GSM (Global System for Mobile Communication), CDMA (Code Division Multiple Access) or TDMA (Time Division Multiple Access), cellular phone networks, GPS (Global Positioning System), CDPD (cellular digital packet data), RIM (Research in Motion, Limited) duplex paging network, Bluetooth radio, or an IEEE 802.11-based radio frequency network. The network 215 can further include or interface with any one or more of an RS-232 serial connection, an IEEE-1394 (Firewire) connection, a Fiber Channel connection, an IrDA (infrared) port, a SCSI (Small Computer Systems Interface) connection, a USB (Universal Serial Bus) connection or other wired or wireless, digital, or analog interface or connection, mesh or Digi® networking.
The system 205 generally comprises a processor 230, a network interface 235, and a memory 240. According to some embodiments, the memory 240 comprises logic (e.g., instructions) 245 that can be executed by the processor 230 to perform various methods. For example, the logic may include a user interface module 225 as well as an AI Virtual Game Engine 220, which includes data aggregation and correlation (hereinafter application 220), that is configured to provide the functionalities described in greater detail herein, including systems and methods of remote physical therapy.
It will be understood that the functionalities described herein, which are attributed to the system 205 and application 220 may also be executed within the client 210. That is, the client 210 may be programmed to execute the functionalities described herein. In other instances, the system 205 and client 210 may cooperate to provide the functionalities described herein, such that the client 210 is provided with a client-side application that interacts with the system 205 such that the system 205 and client 210 operate in a client/server relationship. In some embodiments, complex computational features may be executed by the system 205, while simple operations that require fewer computational resources may be executed by the client 210, such as data gathering and data display.
The AI game engine creates or is given a physical therapy schedule for a patient 910. The AI game engine sends a reminder to the patient or the patient's client device or console 920. The client device or console delivers the reminder or notification to the patient 930. If the patient does not acknowledge the reminder, the patient's care circle is notified 935. Otherwise, the patient acknowledges the reminder 940. In some embodiments, the patient may have to confirm their identity 945 to the client device or console. Only after confirmation of the patient's identity does the client device or console receive and/or display the patient's routines or other private data. The console or client device may confirm the identity of the patient via facial recognition or via other biometric means. If the patient chooses not to perform the physical therapy activity, an alert is generated, and the patient's care circle is notified 950. Otherwise, the patient chooses, accepts, or confirms that the activity will be performed to the client device or console 955. The movements are demonstrated 960 to the patient via the client device or console, in many embodiments with an avatar. The patient then indicates that he/she is ready to begin demonstrated activity 965. The client device or console requests 970 that the patient move into a correct position or orientation relative to the client device, console, or camera.
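The decision flow above may be sketched as a simple function; the outcome labels are hypothetical names keyed to the numbered steps:

```python
# Non-limiting sketch of the reminder/identity/alert decision flow;
# outcome labels are hypothetical and keyed to the numbered steps.
def handle_reminder(acknowledged: bool, identity_confirmed: bool,
                    accepted: bool) -> str:
    if not acknowledged:
        return "notify_care_circle"            # step 935
    if not identity_confirmed:
        return "withhold_private_data"         # gate at step 945
    if not accepted:
        return "alert_and_notify_care_circle"  # step 950
    return "demonstrate_movements"             # step 960
```

Each branch corresponds to a notification or display decision the client device or console makes before the demonstration and positioning steps proceed.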
The example computer system 1 includes a processor or multiple processor(s) 5 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), and a main memory 10 and static memory 15, which communicate with each other via a bus 20. The computer system 1 may further include a video display 35 (e.g., a liquid crystal display (LCD)). The computer system 1 may also include an alpha-numeric input device(s) 30 (e.g., a keyboard), a cursor control device (e.g., a mouse), a voice recognition or biometric verification unit (not shown), a drive unit 37 (also referred to as disk drive unit), a signal generation device 40 (e.g., a speaker), and a network interface device 45. The computer system 1 may further include a data encryption module (not shown) to encrypt data.
The disk drive unit 37 includes a computer or machine-readable medium 50 on which is stored one or more sets of instructions and data structures (e.g., instructions 55) embodying or utilizing any one or more of the methodologies or functions described herein. The instructions 55 may also reside, completely or at least partially, within the main memory 10 and/or within the processor(s) 5 during execution thereof by the computer system 1. The main memory 10 and the processor(s) 5 may also constitute machine-readable media.
The instructions 55 may further be transmitted or received over a network (e.g., network 115).
One skilled in the art will recognize that Internet service may be configured to provide Internet access to one or more computing devices that are coupled to the Internet service, and that the computing devices may include one or more processors, buses, memory devices, display devices, input/output devices, and the like. Furthermore, those skilled in the art may appreciate that the Internet service may be coupled to one or more databases, repositories, servers, and the like, which may be utilized to implement any of the embodiments of the disclosure as described herein.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
While specific embodiments of, and examples for, the system are described above for illustrative purposes, various equivalent modifications are possible within the scope of the system, as those skilled in the relevant art will recognize. For example, while processes or steps are presented in a given order, alternative embodiments may perform routines having steps in a different order, and some processes or steps may be deleted, moved, added, subdivided, combined, and/or modified to provide alternative or sub-combinations. Each of these processes or steps may be implemented in a variety of different ways. Also, while processes or steps are at times shown as being performed in series, these processes or steps may instead be performed in parallel or may be performed at different times.
While various embodiments have been described above, they are presented as examples only, and not as a limitation. The descriptions are not intended to limit the scope of the present technology to the forms set forth herein. To the contrary, the present descriptions are intended to cover such alternatives, modifications, and equivalents as may be included within the spirit and scope of the present technology as appreciated by one of ordinary skill in the art. Thus, the breadth and scope of a preferred embodiment should not be limited by any of the above-described exemplary embodiments.
The present application claims the priority benefit of U.S. Provisional Patent Application No. 63/114,045, filed on Nov. 16, 2020, and titled “Methods and Systems for Remote Physical Therapy and Assessment of Patients”, which is hereby incorporated by reference in its entirety. The present application is related to U.S. Pat. No. 10,813,572, issued on Oct. 27, 2020, and titled “Intelligent System for Multi-Function Electronic Caregiving to Facilitate Advanced Health Monitoring, Fall and Injury Prediction, Health Maintenance and Support, and Emergency Response”, which is hereby incorporated by reference in its entirety.
U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
5211642 | Clendenning | May 1993 | A |
5475953 | Greenfield | Dec 1995 | A |
6665647 | Haudenschild | Dec 2003 | B1 |
7233872 | Shibasaki et al. | Jun 2007 | B2 |
7445086 | Sizemore | Nov 2008 | B1 |
7612681 | Azzaro et al. | Nov 2009 | B2 |
7971141 | Quinn et al. | Jun 2011 | B1 |
8206325 | Najafi et al. | Jun 2012 | B1 |
8771206 | Gettelman | Jul 2014 | B2 |
9072929 | Rush | Jul 2015 | B1 |
9317916 | Hanina et al. | Apr 2016 | B1 |
9591996 | Chang et al. | Mar 2017 | B2 |
9972187 | Srinivasan | May 2018 | B1 |
10387963 | Leise et al. | Aug 2019 | B1 |
10417388 | Han et al. | Sep 2019 | B2 |
10628635 | Carpenter, II et al. | Apr 2020 | B1 |
10761691 | Anzures et al. | Sep 2020 | B2 |
10813572 | Dohrmann et al. | Oct 2020 | B2 |
10943407 | Morgan et al. | Mar 2021 | B1 |
11113943 | Wright et al. | Sep 2021 | B2 |
11213224 | Dohrmann et al. | Jan 2022 | B2 |
20020062342 | Sidles | May 2002 | A1 |
20020196944 | Davis et al. | Dec 2002 | A1 |
20040109470 | Derechin et al. | Jun 2004 | A1 |
20050035862 | Wildman et al. | Feb 2005 | A1 |
20050055942 | Maelzer et al. | Mar 2005 | A1 |
20070032929 | Yoshioka | Feb 2007 | A1 |
20070238936 | Becker | Oct 2007 | A1 |
20080010293 | Zpevak et al. | Jan 2008 | A1 |
20080186189 | Azzaro et al. | Aug 2008 | A1 |
20090094285 | Mackle et al. | Apr 2009 | A1 |
20100124737 | Panzer | May 2010 | A1 |
20110126207 | Wipfel et al. | May 2011 | A1 |
20110145018 | Fotsch et al. | Jun 2011 | A1 |
20110232708 | Kemp | Sep 2011 | A1 |
20120025989 | Cuddihy et al. | Feb 2012 | A1 |
20120075464 | Derenne et al. | Mar 2012 | A1 |
20120120184 | Fornell et al. | May 2012 | A1 |
20120121849 | Nojima | May 2012 | A1 |
20120154582 | Johnson et al. | Jun 2012 | A1 |
20120165618 | Algoo et al. | Jun 2012 | A1 |
20120179067 | Wekell | Jul 2012 | A1 |
20120179916 | Staker et al. | Jul 2012 | A1 |
20120229634 | Laett et al. | Sep 2012 | A1 |
20120253233 | Greene et al. | Oct 2012 | A1 |
20130000228 | Ovaert | Jan 2013 | A1 |
20130060167 | Dracup | Feb 2013 | A1 |
20130127620 | Siebers et al. | May 2013 | A1 |
20130145449 | Busser et al. | Jun 2013 | A1 |
20130167025 | Patri et al. | Jun 2013 | A1 |
20130204545 | Solinsky | Aug 2013 | A1 |
20130212501 | Anderson et al. | Aug 2013 | A1 |
20130237395 | Hjelt et al. | Sep 2013 | A1 |
20130289449 | Stone et al. | Oct 2013 | A1 |
20130303860 | Bender et al. | Nov 2013 | A1 |
20140074454 | Brown et al. | Mar 2014 | A1 |
20140128691 | Olivier | May 2014 | A1 |
20140148733 | Stone et al. | May 2014 | A1 |
20140171039 | Bjontegard | Jun 2014 | A1 |
20140171834 | DeGoede et al. | Jun 2014 | A1 |
20140232600 | Larose et al. | Aug 2014 | A1 |
20140243686 | Kimmel | Aug 2014 | A1 |
20140257852 | Walker et al. | Sep 2014 | A1 |
20140267582 | Beutter et al. | Sep 2014 | A1 |
20140278605 | Borucki et al. | Sep 2014 | A1 |
20140330172 | Jovanov et al. | Nov 2014 | A1 |
20140337048 | Brown et al. | Nov 2014 | A1 |
20140343460 | Evans, III et al. | Nov 2014 | A1 |
20140358828 | Phillipps et al. | Dec 2014 | A1 |
20140368601 | deCharms | Dec 2014 | A1 |
20150019250 | Goodman et al. | Jan 2015 | A1 |
20150109442 | Derenne et al. | Apr 2015 | A1 |
20150169835 | Hamdan et al. | Jun 2015 | A1 |
20150359467 | Tran | Dec 2015 | A1 |
20160026354 | McIntosh et al. | Jan 2016 | A1 |
20160117470 | Welsh et al. | Apr 2016 | A1 |
20160117484 | Hanina et al. | Apr 2016 | A1 |
20160125620 | Heinrich et al. | May 2016 | A1 |
20160154977 | Jagadish et al. | Jun 2016 | A1 |
20160217264 | Sanford | Jul 2016 | A1 |
20160253890 | Rabinowitz et al. | Sep 2016 | A1 |
20160267327 | Franz et al. | Sep 2016 | A1 |
20160314255 | Cook et al. | Oct 2016 | A1 |
20170000387 | Forth et al. | Jan 2017 | A1 |
20170000422 | Moturu et al. | Jan 2017 | A1 |
20170024531 | Malaviya | Jan 2017 | A1 |
20170055917 | Stone et al. | Mar 2017 | A1 |
20170140631 | Pietrocola et al. | May 2017 | A1 |
20170147154 | Steiner et al. | May 2017 | A1 |
20170189751 | Knickerbocker | Jul 2017 | A1 |
20170192950 | Gaither et al. | Jul 2017 | A1 |
20170193163 | Melle et al. | Jul 2017 | A1 |
20170197115 | Cook et al. | Jul 2017 | A1 |
20170213145 | Pathak et al. | Jul 2017 | A1 |
20170273601 | Wang et al. | Sep 2017 | A1 |
20170337274 | Ly et al. | Nov 2017 | A1 |
20170344706 | Torres et al. | Nov 2017 | A1 |
20170344832 | Leung et al. | Nov 2017 | A1 |
20180005448 | Choukroun et al. | Jan 2018 | A1 |
20180075558 | Hill, Sr. et al. | Mar 2018 | A1 |
20180096504 | Valdivia et al. | Apr 2018 | A1 |
20180154514 | Angle et al. | Jun 2018 | A1 |
20180165938 | Honda et al. | Jun 2018 | A1 |
20180182472 | Preston et al. | Jun 2018 | A1 |
20180189756 | Purves et al. | Jul 2018 | A1 |
20180322405 | Fadell et al. | Nov 2018 | A1 |
20180330810 | Gamarnik | Nov 2018 | A1 |
20180360349 | Dohrmann et al. | Dec 2018 | A9 |
20180365759 | Balzer | Dec 2018 | A1 |
20180368780 | Bruno et al. | Dec 2018 | A1 |
20190029900 | Walton et al. | Jan 2019 | A1 |
20190042700 | Alotaibi | Feb 2019 | A1 |
20190057320 | Docherty et al. | Feb 2019 | A1 |
20190090786 | Kim et al. | Mar 2019 | A1 |
20190116212 | Spinella-Mamo | Apr 2019 | A1 |
20190130110 | Lee et al. | May 2019 | A1 |
20190156575 | Korhonen | May 2019 | A1 |
20190164015 | Jones, Jr. et al. | May 2019 | A1 |
20190196888 | Anderson et al. | Jun 2019 | A1 |
20190220727 | Dohrmann et al. | Jul 2019 | A1 |
20190259475 | Dohrmann et al. | Aug 2019 | A1 |
20190282130 | Dohrmann et al. | Sep 2019 | A1 |
20190286942 | Abhiram et al. | Sep 2019 | A1 |
20190311792 | Dohrmann et al. | Oct 2019 | A1 |
20190318165 | Shah et al. | Oct 2019 | A1 |
20190385749 | Dohrmann et al. | Dec 2019 | A1 |
20200043594 | Miller | Feb 2020 | A1 |
20200101969 | Natroshvili et al. | Apr 2020 | A1 |
20200129107 | Sharma | Apr 2020 | A1 |
20200236090 | De Beer et al. | Jul 2020 | A1 |
20200251220 | Chasko | Aug 2020 | A1 |
20200357256 | Wright et al. | Nov 2020 | A1 |
20200357511 | Sanford | Nov 2020 | A1 |
20210007631 | Dohrmann et al. | Jan 2021 | A1 |
20210016150 | Jeong | Jan 2021 | A1 |
20210110894 | Shriberg et al. | Apr 2021 | A1 |
20210134456 | Posnack | May 2021 | A1 |
20210273962 | Dohrmann et al. | Sep 2021 | A1 |
20210358202 | Tveito et al. | Nov 2021 | A1 |
20210375426 | Gobezie | Dec 2021 | A1 |
20210398410 | Wright et al. | Dec 2021 | A1 |
20220022760 | Salcido et al. | Jan 2022 | A1 |
20220031199 | Hao | Feb 2022 | A1 |
20220117515 | Dohrmann et al. | Apr 2022 | A1 |
20220319696 | Dohrmann et al. | Oct 2022 | A1 |
20220319713 | Dohrmann et al. | Oct 2022 | A1 |
20220319714 | Dohrmann et al. | Oct 2022 | A1 |
20230108601 | Coelho Alves | Apr 2023 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
2949449 | Nov 2015 | CA |
104361321 | Feb 2015 | CN |
106056035 | Oct 2016 | CN |
107411515 | Dec 2017 | CN |
2002304362 | Oct 2002 | JP |
2005228305 | Aug 2005 | JP |
2010172481 | Aug 2010 | JP |
2012532652 | Dec 2012 | JP |
2016137226 | Aug 2016 | JP |
2016525383 | Aug 2016 | JP |
1020160040078 | Apr 2016 | KR |
WO2000005639 | Feb 2000 | WO |
WO2014043757 | Mar 2014 | WO |
WO2017118908 | Jul 2017 | WO |
WO2018032089 | Feb 2018 | WO |
Other Publications

Entry |
---|
Rosen et al., “Slipping and Tripping: Fall Injuries in Adults Associated with Rugs and Carpets,” Journal of Injury & Violence Research, 5(1), (2013), pp. 61-69. |
Bajaj, Prateek, “Reinforcement Learning”, GeeksForGeeks.org [online], [retrieved on Mar. 4, 2020], Retrieved from the Internet :<URL:https://www.geeksforgeeks.org/what-is-reinforcement-learning/>, 7 pages. |
Kung-Hsiang, Huang (Steeve), “Introduction to Various RL Algorithms. Part I (Q-Learning, SARSA, DQN, DDPG)”, Towards Data Science, [online], [retrieved on Mar. 4, 2020], Retrieved from the Internet :<URL:https://towardsdatascience.com/introduction-to-various-reinforcement-learning-algorithms-i-q-learning-sarsa-dqn-ddpg-72a5e0cb6287>, 5 pages. |
Bellemare et al., "A Distributional Perspective on Reinforcement Learning", Proceedings of the 34th International Conference on Machine Learning, Sydney, Australia, Jul. 21, 2017, 19 pages. |
Friston et al., “Reinforcement Learning or Active Inference?” Jul. 29, 2009, [online], [retrieved on Mar. 4, 2020], Retrieved from the Internet :<URL:https://doi.org/10.1371/journal.pone.0006421 PLoS ONE 4(7): e6421>, 13 pages. |
Zhang et al., “DQ Scheduler: Deep Reinforcement Learning Based Controller Synchronization in Distributed SDN” ICC 2019—2019 IEEE International Conference on Communications (ICC), Shanghai, China, doi: 10.1109/ICC.2019.8761183, pp. 1-7. |
Leber, Jessica, “The Avatar Will See You Now”, MIT Technology Review, Sep. 17, 2013, 4 pages. |
Marston et al., “The design of a purpose-built exergame for fall prediction and prevention for older people”, European Review of Aging and Physical Activity 12:13, <URL:https://eurapa.biomedcentral.com/track/pdf/10.1186/s11556-015-0157-4.pdf>, Dec. 8, 2015, 12 pages. |
Ejupi et al., "Kinect-Based Five-Times-Sit-to-Stand Test for Clinical and In-Home Assessment of Fall Risk in Older People", Gerontology (vol. 62), <URL:https://www.karger.com/Article/PDF/381804>, May 28, 2015, 7 pages. |
Festl et al., “iStoppFalls: A Tutorial Concept and prototype Contents”, <URL:https://hcisiegen.de/wp-uploads/2014/05/isCtutoriaLdoku.pdf>, Mar. 30, 2013, 36 pages. |
Prior Publication Data

Number | Date | Country |
---|---|---|
20220157427 A1 | May 2022 | US |
Provisional Application Data

Number | Date | Country |
---|---|---|
63114045 | Nov 2020 | US |