SYSTEMS AND METHODS FOR PREDICTING A FALL

Information

  • Patent Application
  • Publication Number
    20230215568
  • Date Filed
    January 04, 2023
  • Date Published
    July 06, 2023
  • CPC
    • G16H50/20
  • International Classifications
    • G16H50/20
Abstract
Systems, methods, and techniques for training and applying machine learning models to predict whether or not one or more individuals will suffer a fall event. In certain embodiments, a machine learning model can include both a static component and a dynamic component, where each component is associated with different types of medical data. In certain embodiments, an adjustment factor based on the fall history of individuals is applied to the output of the machine learning model to generate a final score predictive of a fall event. In certain embodiments, the machine learning model is both trained on and applied to medical data associated with predetermined forms, where the predetermined forms include a value range associated with a medical condition.
Description
TECHNICAL FIELD

The present disclosure relates generally to systems and methods for predicting a fall, and more particularly, to systems and methods for prioritizing resources based on a fall risk to individuals or groups of individuals.


BACKGROUND

According to the National Council on Aging, falls are the leading cause of fatal and nonfatal injuries for older Americans. Falls threaten seniors' safety and independence and generate enormous economic and personal costs. Falls are the leading cause of fatal injury and the most common cause of nonfatal trauma-related hospital admissions among older adults. Facilities have limited resources and time and need to prioritize who receives therapy first. Providing resources, such as scheduling therapy sessions for the residents of a facility based on the risk of a potential fall (i.e., one that may happen in the future), can help prevent the occurrence of falls. Thus, it would be advantageous to quickly and efficiently prioritize resources based on how likely an individual is to fall. The present disclosure is directed to solving these and other problems.


SUMMARY

According to one or more implementations of the present disclosure, a system for prioritizing resources for a plurality of individuals is provided, the system including a control system configured to implement a method of prioritizing resources for a plurality of individuals or a method of training a fall prediction algorithm as described herein.


According to one or more implementations of the present disclosure, a computer-implemented method is provided. The method comprises: receiving, by one or more computer processors, a first plurality of medical data associated with a first plurality of individuals, and training, by the one or more computer processors, a machine learning model (MLM) based on the received first plurality of data, such that the trained MLM is able to process a second plurality of medical data associated with a second plurality of individuals and output a score associated with each of the second plurality of individuals, wherein the score can be an estimated prediction of a fall event that each of the second plurality of individuals, respectively, could suffer, and wherein at least a portion of the first plurality of data and a portion of the second plurality of data are based on one or more pre-determined fillable forms associated with a value range. In one or more implementations, the output score can be used alone or in conjunction with an individual's information, such as preferences and/or medical information, to assign a resource to the individual in order to prevent a fall.
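By way of a non-limiting illustration, the following sketch shows how the training and scoring described above might be implemented in Python, assuming the form-based medical data has already been reduced to a numeric matrix with one row per individual and a binary fell/did-not-fall label; the gradient-boosting model choice and the variable names are assumptions for illustration only and are not required by the present disclosure.

    # Illustrative sketch only: train a fall-prediction model on numeric form
    # values and score a second plurality of individuals. The model choice is
    # an assumption, not the disclosure's required implementation.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier

    def train_fall_model(form_values: np.ndarray, fell: np.ndarray):
        # form_values: one row per individual, one column per pre-determined form field
        model = GradientBoostingClassifier()
        model.fit(form_values, fell)
        return model

    def score_individuals(model, new_form_values: np.ndarray) -> np.ndarray:
        # Estimated probability of a future fall event for each individual
        # in the second plurality of individuals.
        return model.predict_proba(new_form_values)[:, 1]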


According to one or more implementations of the present disclosure, a computer program product is provided. The computer program product comprises: a computer-readable storage medium storing computer-readable program code executable by a processor to: apply a machine learning model (MLM) to a plurality of data associated with at least one individual, compute an initial fall risk score for the at least one individual based on the application of the MLM, and generate a final fall risk score by applying an adjustment factor to the initial fall risk score. Optionally, the computer program product can be a non-transitory computer program product.


According to one or more implementations of the present disclosure, an apparatus is provided. The apparatus comprises a memory to store instructions; and processing circuitry, coupled with the memory, operable to execute the instructions, that when executed, cause the processing circuitry to: receive a plurality of medical data associated with a plurality of individuals, compute a fall prediction score associated with each of the plurality of individuals based on the received medical data, and allocate one or more resources to each of the plurality of individuals based on the fall prediction computation.


According to one or more implementations of the present disclosure, an apparatus is provided. The apparatus comprises: a memory to store instructions and processing circuitry, coupled with the memory, operable to execute the instructions, that when executed, cause the processing circuitry to: apply a hybrid machine learning model (MLM) to a plurality of data associated with one or more individuals, and compute a fall prediction score for the one or more individuals based in whole or in part on the application of the hybrid MLM.


According to one or more implementations of the present disclosure, a computer-implemented method is provided. The method comprises: receiving, by one or more computer processors, a first plurality of medical data associated with a first plurality of individuals, and training, by the one or more computer processors, a machine learning model (MLM) based on the received first plurality of data, such that the trained MLM is able to process a second plurality of medical data associated with a second plurality of individuals and output a score associated with each of the second plurality of individuals, wherein the score can be an estimated prediction of a fall event that each of the second plurality of individuals, respectively, could suffer, and wherein at least a portion of the first plurality of data and a portion of the second plurality of data are based on one or more pre-determined fillable forms associated with a value range. In one or more implementations, the method can further include: receiving another plurality of medical data for the second plurality of individuals, entirely or partly distinct from the first and second plurality of medical data, inputting the estimated prediction score and the another plurality of data into another model, and outputting yet another estimated prediction of a fall event that each of the second plurality of individuals, respectively, could suffer. In one or more implementations, the output score can be used alone or in conjunction with an individual's information, such as preferences and/or medical information, to assign a resource to the individual in order to prevent a fall.


According to one or more implementations of the present disclosure, a computer program product is provided. The computer program product comprises: a computer-readable storage medium storing computer-readable program code executable by a processor to: apply a machine learning model (MLM) to a plurality of data associated with at least one individual, compute an initial fall risk score for the at least one individual based on the application of the MLM, generate a second fall risk score by applying an adjustment factor to the initial fall risk score (e.g., modifying the score), receive a second plurality of data related to the at least one individual, apply another model to the second fall risk score and the second plurality of data, and generate a final fall risk score based on the application of the another model. Optionally, the computer program product can be a non-transitory computer program product.


According to one or more implementations of the present disclosure, a computer-implemented method is provided. The computer-implemented method includes: receiving a first plurality of data associated with one or more individuals, applying a hybrid machine learning algorithm to the first plurality of data, generating an initial score based on the application of the hybrid machine learning algorithm to the first plurality of data, generating a second score by applying an adjustment factor to the initial score (e.g., modifying the score), wherein the adjustment factor is based on fall history data of the one or more individuals, receiving a second plurality of data associated with the one or more individuals, applying another algorithm to the second plurality of data associated with the one or more individuals, generating a third score based on application of the another algorithm, and generating a final score based on the second score and the third score, where the final score is a prediction as to whether or not the one or more individuals will sustain a fall.
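By way of a non-limiting illustration, the staged flow described above might be sketched as follows, assuming the two models are already trained and supplied as callables; the specific adjustment rule and the equal-weight combination of the second and third scores are assumptions for illustration only.

    # Illustrative sketch only: staged scoring with a fall-history adjustment
    # and a second model applied to a second plurality of data.
    from typing import Callable, Sequence

    def staged_fall_score(
        first_data: Sequence[float],
        second_data: Sequence[float],
        previous_falls: int,
        hybrid_model: Callable[[Sequence[float]], float],
        second_model: Callable[[Sequence[float]], float],
    ) -> float:
        initial = hybrid_model(first_data)              # initial score
        factor = 1.0 + 0.25 * min(previous_falls, 2)    # fall-history adjustment (assumed rule)
        second = initial * factor                       # second score
        third = second_model(second_data)               # third score from another model
        return 0.5 * second + 0.5 * third               # final score (assumed equal weights)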


The above summary is not intended to represent each implementation or every aspect of the present disclosure. Additional features and benefits of the present disclosure are apparent from the detailed description and figures set forth below.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a functional block diagram of a system, according to one or more implementations of the present disclosure;



FIG. 2 is a functional block diagram of a system for allocating resources, according to one or more implementations of the present disclosure;



FIG. 3 is a data entry system, according to one or more implementations of the present disclosure;



FIG. 4A is a system for predicting a fall, according to one or more implementations of the present disclosure;



FIG. 4B is a system for predicting a fall, according to one or more implementations of the present disclosure;



FIG. 5A is a machine learning training architecture, according to one or more implementations of the present disclosure;



FIG. 5B is a machine learning architecture for predicting a fall, according to one or more implementations of the present disclosure;



FIG. 5C is a machine learning architecture for predicting a fall, according to one or more implementations of the present disclosure;



FIG. 6A is a machine learning training architecture, according to one or more implementations of the present disclosure;



FIG. 6B is a machine learning architecture for predicting a fall, according to one or more implementations of the present disclosure;



FIG. 7 is a process flow diagram for a method of training a machine learning model, according to one or more implementations of the present disclosure;



FIG. 8 is a process flow diagram for a method of utilizing a machine learning model, according to one or more implementations of the present disclosure;



FIG. 9 is a process flow diagram for a method of utilizing a machine learning model, according to one or more implementations of the present disclosure;



FIG. 10 is a process flow diagram for a method of utilizing a machine learning model, according to one or more implementations of the present disclosure; and



FIG. 11 is a communications architecture useful with at least one implementation of the present disclosure.





While the present disclosure is susceptible to various modifications and alternative forms, specific implementations and embodiments thereof have been shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that it is not intended to limit the present disclosure to the particular forms disclosed, but on the contrary, the present disclosure is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.


DETAILED DESCRIPTION

The rate of falls increases with age; by one or more estimates, about one out of four Americans over 65 years of age falls each year. Falls are the leading cause of fatal injury and the most common cause of nonfatal trauma-related hospital admissions among older adults. Falls result in more than 2.8 million emergency treatments annually, including over 800,000 hospitalizations and more than 27,000 deaths. Where falls do not cause death, they can result in lasting and critical consequences including injury, long-term disability, reduced activity and mobility levels, and admission to long-term care institutions. Fear of falling itself can lead to self-imposed mobility deterioration and social isolation. All of these decrease the quality of life for individuals at risk of falls or who suffer falls. In addition to the morbidity and deterioration of quality of life falls can cause, falls are also costly. In 2015, the total cost of fall injuries was 50 billion dollars. Projections are that the cost may reach almost 70 billion dollars by the end of 2020 as the population ages. Programs such as Medicare and Medicaid cover about 75 percent of the costs.


The seriousness of falls calls for a management approach that reduces the occurrence of falls, as described by the various implementations of the present disclosure herein. According to one or more implementations, described herein are systems and methods for prioritizing resources to individuals in a population or to groups of individuals. According to one or more other implementations, a method of training an algorithm to predict a fall is described.


Although the risk of falls occurring generally increases with age, the implementations of the present disclosure are directed to individuals of any age. In one or more implementations of the present disclosure the methods described herein are directed to individuals of all ages. In one or more implementations the methods are directed to individuals of working age, 16 years and older. In one or more implementations the methods described herein are directed to adults above about 55 years old, over 62 years old, or over 65 years old. In one or more implementations the methods described herein are directed to patients or residents of a care facility.


In one or more implementations of the present disclosure, the assignment of resources is specific to a particular individual based on one or more of: i) a fall prediction score and ii) a patient or resident profile, e.g. a medical profile of the same, where the specific assignment of resources operates as a prophylaxis to avoid an injury completely or to mitigate the effects of a potential injury.


In one or more implementations of the present disclosure, a hybrid machine learning model that includes both a dynamic and static component increases the overall accuracy and efficiency of a system's capacity to predict a fall of an individual. In one or more implementations, the use of a static component directed to particular data and the use of a dynamic component with respect to other data improves the overall accuracy of a system by reducing or eliminating the variation of weights, e.g. in a machine learning context, that are directed to data of a higher predictive importance or are otherwise less likely to be influenced by other factors in an overall prediction scheme, while also permitting data that is less important or otherwise more likely to be influenced by other factors in a prediction scheme to be subject to variable weights. Accordingly, in one or more implementations, the accuracy of an overall system is improved solely by using a dynamic and static component. Moreover, in one or more implementations, data associated with both the training and/or application of the hybrid machine learning algorithm can be categorized in a manner that eliminates the need for additional processing steps (e.g. data pre-processing), which not only can increase the accuracy of the training and application of the model, but also the efficiency of training the algorithm and the efficiency of generating an output after training.
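By way of a non-limiting illustration, one way to sketch such a hybrid model in Python is shown below, where the static component applies fixed weights to features of higher predictive importance (e.g., fall history and ADL assistance level) and the dynamic component learns variable weights for the remaining features; the feature names, the fixed weight values, and the logistic-regression choice are assumptions for illustration only.

    # Illustrative sketch only: hybrid model with a fixed-weight (static)
    # component and a trainable (dynamic) component.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    STATIC_WEIGHTS = {"previous_falls": 2.0, "adl_assist_level": 2.0}  # fixed, not re-trained

    def static_score(record: dict) -> float:
        # Fixed-weight contribution from high-importance features.
        return sum(w * record[name] for name, w in STATIC_WEIGHTS.items())

    class HybridFallModel:
        def __init__(self):
            self.dynamic = LogisticRegression(max_iter=1000)  # variable-weight component

        def fit(self, records, fell):
            X = np.array([[r["therapy_minutes"], r["pain_score"], r["age"]] for r in records])
            self.dynamic.fit(X, fell)
            return self

        def predict_score(self, record: dict) -> float:
            X = np.array([[record["therapy_minutes"], record["pain_score"], record["age"]]])
            dynamic_part = self.dynamic.predict_proba(X)[0, 1]
            # Combine the fixed-weight and learned contributions into one score.
            return static_score(record) + dynamic_part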


In one or more implementations of the present disclosure, any machine learning technique can be employed to predict a fall (dynamic, static, or hybrid). In one or more implementations, the initial score produced by the algorithm or technique can be adjusted or modified by a factor based on data determined to be of particular significance in relation to whether an individual will fall. For example, an individual's fall history can be the most important factor as to whether or not that individual will fall again, and a static component, separate from the algorithm used to determine the initial score, can generate an adjustment factor; when applied, the initial score is adjusted by the adjustment factor, which in turn increases the accuracy of the overall output of a system in relation to a prediction as to whether or not an individual will suffer a fall in the future.
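By way of a non-limiting illustration, the adjustment described above might be sketched as follows, where the piecewise factors derived from fall history are assumptions for illustration only.

    # Illustrative sketch only: a static, fall-history-based adjustment factor
    # applied to the initial score produced by the machine learning model.
    def adjustment_factor(previous_falls: int) -> float:
        if previous_falls == 0:
            return 1.0
        if previous_falls <= 2:
            return 1.25
        return 1.5

    def adjusted_score(initial_score: float, previous_falls: int) -> float:
        # The initial model score is scaled by a factor computed outside the model.
        return initial_score * adjustment_factor(previous_falls)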


In one or more implementations of the present disclosure, any one of i) data used to train a machine learning system or algorithm directed to predicting a fall, ii) data inputted into a trained machine learning system or algorithm to generate an output related to predicting a fall, or iii) data used in any context to generate a suggestion as to prioritization of resources can be based on pre-determined forms with a selectable value range in relation therewith. A pre-determined form can be an electronic form, paper form, or any other suitable entry available via a graphical user interface that allows a user to select a value from a value range in relation to a patient or individual's profile, such as a medical condition associated therewith. For example, the user may select a value, e.g. the severity of a patient's medical condition, from a range of values, e.g. 1-10. In one or more implementations, the use of predetermined forms with a selectable range of values can further enhance the efficiency and accuracy of both the training of a machine learning algorithm and the application of the same, as fewer errors occur in the processing and training because of i) an alleviated need to perform more significant natural language processing, e.g. as associated with processing handwritten notes, ii) more consistency in the training, testing, and validation sets used in the machine learning process, and iii) the data used to develop or train the machine learning algorithm more closely resembling the data that the trained algorithm processes in application.
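By way of a non-limiting illustration, ingesting such a pre-determined form might be sketched as follows, assuming each form field carries a fixed selectable range (e.g., severity 1-10); the field names, ranges, and min-max scaling are assumptions for illustration only.

    # Illustrative sketch only: validate and encode pre-determined form fields
    # whose values are selected from a fixed range, so no free-text NLP is needed.
    FORM_SCHEMA = {"pain_severity": (1, 10), "dizziness_severity": (1, 10)}

    def encode_form(entries: dict) -> list:
        features = []
        for field, (lo, hi) in FORM_SCHEMA.items():
            value = entries[field]
            if not lo <= value <= hi:
                raise ValueError(f"{field} must be within {lo}-{hi}")
            features.append((value - lo) / (hi - lo))  # scale selection to 0-1
        return features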


In one or more implementations, the various techniques discussed herein and above, can be combined to further increase the overall accuracy and efficiency of a system, method, algorithm or technique directed to predicting a fall, in addition to providing a more appropriate and specific allocation of resources with respect to an individual. For example, utilizing an overall system that employs a hybrid machine learning model that also has an adjustment factor derived from a component distinct from the hybrid machine learning model itself, in addition to being trained with and applied to data associated with predetermined forms, can produce a more accurate and efficiently processed result than the use of any one feature in isolation; this in turn can result in a more accurate portrait of an individual's overall health profile, which in turn can permit a more appropriate and specifically tailored allotment of resources to said individual.


Reference is now made to the drawings, wherein like reference numerals are used to refer to like elements. In the following description, for the purpose of explanation, numerous specific details are set forth in order to provide a thorough understanding thereof. It may be evident, however, that the novel embodiments can be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form to facilitate a description thereof. The intention is to cover all modifications, equivalents, and alternatives within the scope of the claims.


Referring to FIG. 1, a system 100, according to one or more implementations of the present disclosure, is illustrated. The system 100 includes a risk score module 110, a prioritizing resources module 120, a re-evaluate module 130, a control system 140, a memory 144, and an electronic interface 146.


The risk score module 110 includes a fall risk value 112 for each of n subjects or individuals in a plurality n of individuals (e.g., patients or residents). As used herein, for simplicity, a "high" or "higher" value or score will indicate a higher risk of a fall event occurring, while a "low" or "lower" fall risk value or score will indicate, in comparison, a lesser chance of a fall event occurring. It is understood that a score or value can be a number that is directly related to how high or low the risk is, or alternatively indirectly related to how high or low the risk is. In one or more implementations of the present disclosure, the risk score module 110 includes a fall risk value 112 for each of the n subjects or individuals. The fall risk value 112 is indicative of a likelihood of an individual falling. In one or more implementations the fall risk value is a frequency of falling, such as an estimate of the number of falls over a period, such as falls per month, falls per week, or falls per day.


As used herein, "medical-related" data can be any relevant data. Without limitation, medical-related data can include demographic data such as age, gender, ethnicity, marital status, number of children, and education level. In one or more implementations of the present disclosure demographic data includes age and gender. Without limitation, medical-related data can include data from Electronic Health Records (EHR) such as vital signs (e.g., heart rate, respiratory rate, and temperature), diagnostic information (e.g., blood tests, genetic tests, culture results, and imagery such as x-rays), height, weight, and treatments received (e.g., medications, and walking assists such as a cane). In one or more implementations the medical-related data includes height and weight. In one or more implementations the medical-related data includes medication information, pain medication information, and pain frequency information. In one or more implementations the medical-related data includes any one or more of temperature deviation information, pulse deviation information, respiratory deviation information, diastolic deviation information, systolic deviation information, and deviation in blood sugar information. Without limitation, medical-related data can include administrative data (e.g., hospital discharge data), health survey data, observational data, or data from patient registries. In one or more implementations the medical-related data includes pain presence information. Without limitation, the medical data can include data from the Long Term Care Minimum Data Set (MDS) and the Brief Interview for Mental Status (BIMS) score. In one or more implementations the medical-related data includes mental status information. Without limitation, the medical-related data can include Activities of Daily Living (ADL) data, such as balance information associated with moving from sitting to standing, balance information associated with walking, balance information associated with turning, bathing assistance information, bed mobility information, bladder function information, bowel function information, toilet use information, toilet hygiene information, walking assistance information, assistance for walking in a room information, assistance for walking in a corridor information, general assistance when moving, balance information during transitions and walking, functional limitation in range of motion information, and mobility device information. In one or more implementations the medical-related data is historical data of the number of previous falls.


As used herein, a "deviation" is a deviation from a normal value for an individual, where "normal" is the value before a fall occurs. In one or more implementations of the present disclosure, the normal value is calculated as an average of the 3 most recent values. In one or more implementations, the difference from the latest value to the average is the deviation, which can be expressed as a percentage. For example, the last value can be the value just prior to admitting or upon admitting an individual to a facility.
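By way of a non-limiting illustration, the deviation calculation described above might be sketched as follows, assuming the baseline average is taken over the three readings preceding the latest one; the example values are illustrative only.

    # Illustrative sketch only: percentage deviation of the latest reading from
    # the average of the three most recent prior readings.
    def percent_deviation(readings):
        latest = readings[-1]
        baseline = sum(readings[-4:-1]) / 3.0   # average of the 3 readings before the latest
        return 100.0 * (latest - baseline) / baseline

    # Example: pulse readings of 70, 72, and 74 followed by 88 give a deviation of about +22.2%.
    print(round(percent_deviation([70, 72, 74, 88]), 1))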


The fall risk value 112 can be determined using a fall risk estimation algorithm, such as one based on component 118. The fall risk algorithm can be configured to receive as input medical-related data 114 for each of the individuals in the plurality of individuals n and output a fall risk value 112. In one or more implementations, the fall risk algorithm 118 is based in whole or in part on the techniques, systems, methods and/or algorithms discussed with respect to FIGS. 4A-9.


The number of individuals, n, can be any value, such as a value between 2 and 50 million, 2 and ten million, 2 and two million, 2 and one million, 2 and 100,000, 2 and 1000, or 2 and 100. In one or more implementations of the present disclosure, the number of individuals, n, is at least 2, at least 5, at least 10, at least 20, or at least 50. In one or more implementations the number of individuals, n, can include all or a subset of adults over 55 years old, over 62 years old, or over 65 years old, where this subset of adults resides in a political region such as in a country, a state, a province, a county, a parish, or a municipality. In one or more implementations the number of individuals, n, includes all residents in one facility, such as a workplace, a hospital, a group home, or a retirement facility. In one or more implementations of the present disclosure the number of individuals, n, includes all residents in more than one facility, such as more than one workplace, more than one hospital, more than one group home, more than one retirement facility, or a combination of these.


The risk score module 110 includes a total fall risk score 116 for all of the n subjects or individuals. In one or more implementations of the present disclosure the risk score 116 is a combined risk score for a group of individuals, where the group is a subset of the total population or plurality of individuals n. For example, in one or more implementations there are m groups, where m is an integer and each group comprises at least a portion of the plurality of individuals n. Herein, m can be any value less than n. The number of individuals in each group can be the same or can be different. In one or more implementations the groups correspond to members of a political region. In one or more implementations the members of a first group reside in a first facility and the members of a second group reside in a second facility. In one or more implementations, the combined risk score is a function of the individual risk scores for each individual in the group. In one or more implementations the individual risk scores are summed to determine the combined risk score. In one or more implementations the average of the individual risk scores for all the individuals in a group is used as the combined risk score. In one or more implementations the mean of the individual risk scores for all the individuals in a group is used as the combined risk score.
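By way of a non-limiting illustration, combining individual scores into a group-level score might be sketched as follows; the choice between summing and averaging is a configuration assumption here, as either is contemplated above.

    # Illustrative sketch only: combine individual fall risk scores into a
    # single group-level score.
    def combined_group_score(individual_scores, method="sum"):
        if method == "sum":
            return sum(individual_scores)
        if method == "average":
            return sum(individual_scores) / len(individual_scores)
        raise ValueError("unsupported combination method")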


The prioritizing resources module 120 includes prioritizing resources for individual subjects 122 based at least in part on their total risk score 116 or their individual fall risk value 112. In one or more implementations of the present disclosure the prioritizing resources module 120 receives the total risk score 116 and/or fall risk value 112 from the risk score module 110. A priority list can be made as shown in FIG. 1, wherein Subject 1 has the highest priority based on total risk score 116 or fall risk value 112, sequentially followed by Subjects 2-4 having less priority, all the way to Subject n who has the least priority. In one or more implementations a personalized schedule is created such that the individuals in the plurality of individuals n are given resources based not only on their total risk score but also on their individual needs. For example, Subject 1 is assigned a Resource D, Subject 2 is assigned a Resource B, Subject 3 is assigned a Resource A, and Subject 4 is assigned a Resource D. In one or more implementations an available resource is not used, such as Resource C. In one or more implementations two or more individuals can be assigned the same resource, such as Subject 1 and Subject 4 assigned to Resource D, or Subject 2 and Subject 4 assigned to Resource B. In one or more implementations an individual can be assigned to more than one resource, such as Subject 4 assigned to Resource B and Resource D. In one or more implementations only a portion of the individuals are assigned a resource, such that Subjects x to n are not assigned a resource, where x is any number less than n.
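By way of a non-limiting illustration, building the priority list and matching resources to individual needs might be sketched as follows; the need-to-resource mapping and the example subjects are assumptions for illustration only.

    # Illustrative sketch only: rank subjects by fall risk score and assign a
    # resource suited to each subject's individual need.
    def prioritize(subjects):
        # subjects: list of dicts with "name", "score", and "need" keys
        ranked = sorted(subjects, key=lambda s: s["score"], reverse=True)
        resource_for_need = {"gait": "Resource A", "pain": "Resource B", "dementia": "Resource D"}
        return [(s["name"], resource_for_need.get(s["need"])) for s in ranked]

    print(prioritize([
        {"name": "Subject 1", "score": 9, "need": "dementia"},
        {"name": "Subject 2", "score": 7, "need": "pain"},
        {"name": "Subject 3", "score": 5, "need": "gait"},
    ]))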


The prioritizing resources module 120 can also include prioritizing resources for groups 124 based at least in part on the total risk score 116 and/or fall risk value 112 corresponding to an individual or group of individuals. In one or more implementations of the present disclosure the prioritizing resources module 120 receives the combined total risk score 116 or fall risk value 112 from the risk score module 110. A priority list can be made, for example, wherein Group 1 has the highest priority based on Group 1's combined total risk score 116, sequentially followed by Groups 2-4 having less priority, all the way to Group m which has the least priority. In one or more implementations a personalized group schedule is created such that the groups in the plurality of groups m are given resources based not only on their total risk score but also on the group's needs; alternatively, individual fall risk values 112, alone or in combination with individual health needs, can be used for the assignment. For example, Group 1 is assigned a Resource E, Group 2 is assigned a Resource F, Group 3 is assigned a Resource F, and Group 4 is assigned a Resource H. In one or more implementations a group can be assigned to more than one resource. In one or more implementations an available resource is not used, such as a Resource G. In one or more implementations two or more groups can be assigned the same resource, such as Group 2 and Group 3 shown assigned to Resource F. In one or more implementations only a portion of the groups are assigned a resource, such that Groups y to m are not assigned a resource, where y is any number less than m.


As used herein a resource is anything that can be used to avoid or mitigate a future or predicted fall. In one or more implementations of the present disclosure the resources are any one or more of physical therapy sessions, medication, walking aids, personal protection devices, environment modification, monitoring equipment, a scheduled visit to a nurse, a scheduled visit to a doctor, a scheduled visit to a physical therapist, and an allotment of money. In one or more implementations of the present disclosure, a resource is a physical therapy session where the resource can include assignment of a specific physical therapy for an individual, extending a physical therapy treatment already prescribed to the individual, or modifying a physical therapy treatment, such as extending the time spent in physical therapy.


In one or more implementations a resource is a medication wherein a new medication is prescribed, for example to reduce pain, or a medication that causes unwanted side effects, such as dizziness, is substituted with another which does not cause the unwanted side effect. In one or more implementations the medication improves alertness, improves sleep and rest, or decreases agitation.


In one or more implementations of the present disclosure the resource is a walking aid such as prescribing or modifying a walking aid. In one or more implementations the walking aid can include a cane, a walker, a wheelchair, orthopedic shoes, a prosthetic, leg braces or the like. In one or more implementations, the resource is a protective or preventative device worn by the individual such as a helmet, knee pads, elbow pads, gloves, and non-skid shoes.


In one or more implementations of the present disclosure the resource is an environment modification. For example, in one or more implementations the individual is provided with better lighting or automatic lighting, such that when the individual gets out of bed a light is turned on. In one or more implementations, furniture is rearranged to make walking pathways less obstructed. In one or more implementations the modification includes adding floor traction enhancers such as traction strips, traction mats, non-skid mats, or the like on a smooth walking surface. In one or more implementations reflective or brightly colored strips are used to delineate hazards, such as a step. In one or more implementations bars or holding rails are implemented along walls for gripping. In one or more implementations a visual or audible alarm can be implemented with a motion detector or trigger to help an individual avoid bumping into an object or tripping over an object. In one or more implementations a protective device is placed in the environment of the individual, such as fall-arresting or soft pads, and corner protectors. In one or more implementations the environment modification is a change in a room assignment, for example, where a subject is moved closer to a dining hall if they have one or more difficulties walking and are more likely to suffer a fall based on their current walking distances. In one or more implementations the resource is a facility, for example, where an individual is assigned to or moved to a facility more appropriate to their needs with respect to eliminating the occurrence of a fall or mitigating the effects therefrom.


In one or more implementations of the present disclosure the resource is a monitoring resource. For example, one or more of a camera, a motion detector, a pressure sensor, a positioning system such as a GPS tracker, or a care provider, such as one or more care providers assigned to monitor the individual. In one or more implementations the monitoring resource is worn or carried by the individual, such as a motion detector or GPS on a wrist band.


In one or more implementations of the present disclosure the resource is a scheduled visit to a care provider such as a nurse, a doctor, or a physical therapist.


In one or more implementations of the present disclosure the resource is a fungible resource such as an allotment of money. For example, a facility may be allotted an amount of money for use to implement various resources for reducing the number of falls or limiting the effects therefrom, or an individual may be allotted an amount of money to provide for themselves resources for eliminating the number of falls or mitigating the effects therefrom. In one or more implementations the money is earmarked for specific resources such as for providing better lighting, buying walkers, and hiring care providers.


In one or more implementations of the present disclosure the algorithm 118 can determine the kind of resource needed. For example, the algorithm 118 may show an individual has a high risk score and it is determined that this is due to a specific medical-related data input, such as poor gait attributes. A physician can conclude that the individual should be prescribed orthopedic shoes to improve their gait. As another example, the algorithm 118 may show an individual with a high risk score is taking medication that will leave them drowsy and prone to falling out of bed when waking up. In this instance, a soft mat can be placed next to the bed to mitigate any potential serious fall. In another example, the individuals in a facility having a high collective risk score are shown by the algorithm to include a high proportion of individuals suffering from dementia, and the facility might be earmarked to receive funds for hiring more staff to monitor and care for the individuals.


Although in one or more implementations of the present disclosure the risk score module 110 provides information for the prioritizing resources module 120, in one or more implementations information from the prioritized resources 122 is provided back to the risk score module 110. For example, the total risk score 116 can be re-evaluated as indicated by the re-evaluate module 130. In one or more implementations all of the individuals are re-evaluated. In one or more other implementations only a subset of the plurality of individuals is re-evaluated. For example, only a subset of the plurality of individuals with the highest total risk score is re-evaluated as determined by module 130, such as those in the top 90%, top 75%, top 50%, top 25%, or top 10%. In one or more implementations the total risk score 116 for all or a subset of individuals is evaluated at least once a year, at least once every 6 months, at least every three months, at least once a month, or at least once a week. In one or more implementations a risk score is evaluated for an individual when they are assigned to, or soon after (e.g., within a week of) arriving at, a new facility. In one or more implementations the module 130 can be an algorithm, such as an algorithm that determines when and how to implement re-evaluation of the risk score.


The system 100 includes a control system 140, a memory device 144, an electronic interface 146, and an external device 148, e.g. display device 150 or other devices not explicitly shown in FIG. 1.


The control system 140 includes one or more processors 142. The control system 140 is generally used to control and analyze data obtained and/or generated by the modules and components of the system 100. The processor 142 can be a general or special purpose processor or microprocessor. While one processor 142 is shown in FIG. 1, the control system 140 can include any suitable number of processors (e.g., one processor, two processors, five processors, ten processors, etc.) that can be in a single housing, or located remotely from each other. The control system 140 can be coupled to and/or positioned within, for example, a housing of an external device 148. The control system 140 can be centralized (within one such housing) or decentralized (within two or more of such housings, which are physically distinct). In such implementations including two or more housings containing the control system 140, such housings can be located proximately and/or remotely from each other.


The memory device 144 stores machine-readable instructions that are executable by the processor 142 of the control system 140. The machine-readable instructions can include the fall risk algorithm 118, algorithms for combining total risk scores, algorithms for prioritizing resources (e.g., used by prioritizing resources module 120), algorithms for selecting subjects or groups for evaluation (e.g., used by re-evaluate module 130), and information control algorithms for controlling data flow within the various modules. The memory device 144 can be any suitable computer readable storage device or media, such as, for example, a random or serial access memory device, a hard drive, a solid-state drive, a flash memory device, etc. While one memory device 144 is shown in FIG. 1, the system 100 can include any suitable number of memory devices 144 (e.g., one memory device, two memory devices, five memory devices, ten memory devices, etc.). The memory device can be coupled to and/or positioned within a housing of an external device 148. Like the control system 140, the memory device 144 can be centralized (within one such housing) or decentralized (within two or more of such housings, which are physically distinct).


The electronic interface 146 is configured to receive data (e.g., medical-related data) such that the data can be stored in the memory device 144 and/or analyzed by the processor 142 of the control system 140. The electronic interface 146 can include an antenna, a receiver (e.g., an RF receiver), a transmitter (e.g., an RF transmitter), a transceiver, or any combination thereof. The electronic interface 146 can include one or more processors and/or one or more memory devices that are the same as, or similar to, the processor 142 and the memory device 144 described herein. In one or more implementations, the electronic interface 146 is coupled to or integrated in the external device 148. In other implementations, the electronic interface 146 is coupled to or integrated (e.g., in a housing) with the control system 140 and/or the memory device 144.


The external device 148 (FIG. 1) includes a display device 150. The external device 148 can be, for example, a mobile device such as a smart phone, a tablet, a laptop, a desktop computer or the like. Alternatively, the external device 148 can be an external sensing system, a television (e.g., a smart television) or another smart home device (e.g., a smart speaker(s) such as Google Home, Amazon Echo, Alexa etc.). In one or more implementations, the user device is a wearable device (e.g., a smart watch). The display device 150 is generally used to display image(s) including still images, video images, or both. In one or more implementations, the display device 150 acts as a human-machine interface (HMI) that includes a graphic user interface (GUI) configured to display the image(s) and an input interface. The display device 150 can be an LED display, an OLED display, an LCD display, or the like. The input interface can be, for example, a touchscreen or touch-sensitive substrate, a mouse, a keyboard, or any sensor system configured to sense inputs made by a human user interacting with the external device 148. In one or more implementations, one or more user devices can be used by and/or included in the system 100.


While the control system 140 and the memory device 144 are described and shown in FIG. 1 as being separate and distinct components of the system 100, in one or more implementations, the control system 140 and/or the memory device 144 are integrated in the external device 148. Alternatively, in one or more implementations, the control system 140 or a portion thereof (e.g., the processor 142) can be located in a cloud (e.g., integrated in a server, integrated in an Internet of Things (IoT) device, connected to the cloud, be subject to edge cloud processing, etc.), located in one or more servers (e.g., remote servers, local servers, etc.), or any combination thereof.


While system 100 is shown as including all of the components described above, more or fewer components can be included in a system. For example, in one or more implementations a reevaluate module 130 is not used. In one or more other implementations, the fall risk algorithm 118 is not used. In one or more further implementations the prioritizing resources module does not include prioritizing resources for groups.


A determination of a fall risk value is illustrated by Table 1a. Although shown as a table, the information in Table 1a can be processed, entered, and outputted using one or more algorithms or a combination of algorithms and manual inputs as disclosed herein or otherwise suitable. In one or more implementations of the present disclosure, the data is entered or transferred electronically from a database as disclosed herein or otherwise suitable.


In Table 1a, exemplary features are illustrated for the period after a fall: Number of Hospital Days, Therapy Minutes, New Diagnosis & X-Ray results (head fracture, head injury), Presence of Pain, Change of Condition, ADL data, Medication Orders, Resident Calendar, Post-Fall Care Plans, Progress Notes, Function Score from section GG, MDS/PDPM Score, Environmental Factors, General Orders, Other Fall Risk Scores, and Previous Falls.


Some definitions and clarifications for the listed features are as follows. Function Score from section GG refers to functional ability and includes admission and discharge self-care and mobility performance. Progress Notes are the part of a medical record where healthcare professionals record details to document a patient's clinical status or achievements during the course of a hospitalization or over the course of outpatient care. PDPM refers to the Patient-Driven Payment Model. MDS refers to the Minimum Data Set, which is part of a federally mandated process for clinical assessment of all residents in Medicare or Medicaid certified nursing homes. Environmental factors include where the individual was when the fall occurred, such as on stairs, in a shower, getting out of bed, outdoors, or on a smooth/slippery surface.




As shown in Table 1a, the features are each scored 0-4 (Data Points) and have a weight value. The Max Score for each feature is the maximum for the feature (4) multiplied by the weight; for example, the Max Score is 8 for the Number of Hospital Days row, 4 for Therapy Minutes, and 8 for New Diagnosis. Adding all of the Max Scores together gives a total value of 100. The Fall Prediction Score can be adjusted (e.g., modified); for example, as shown in Table 1a, it is a value between 0 and 9. According to other implementations of the present disclosure, other scoring, weights, combining of scores, and final adjustments can also be made.


The features are sourced or determined as noted in the Data Source column of Table 1a. For example, the number of days in the hospital is sourced from MDS data. The days contribute Data Points as follows: no days in hospital = 0, 0-1 days = 1, 2-5 days = 2, 6-10 days = 3, and more than 10 days = 4. As noted, the weight for this row is 2, indicating that it is a more important feature for determining the likelihood of a fall than a feature with a weight of 1. As another example, for the Therapy Minutes feature, the Data Points are allocated as follows: 0 minutes = 0, 1-60 minutes = 1, 60-120 minutes = 2, 120-240 minutes = 3, and more than 240 minutes = 4. For Therapy Minutes the weight is 1.


In the specific example illustrated by Table 1a, data for "Betty" is entered. Betty, a resident at the Sunrise Retirement home, fell on the floor on her way to the dining room. She suffered a mild fracture on her left hand. She spent 3 days in the hospital and needed some assistance with her ADLs for the next 10 days. Betty was ordered non-slip socks afterwards, some pain medication, and also spent about an hour in physical therapy. Betty had one fall in the past. The other features for Betty are assessed and entered into Table 1a as listed. Betty's weighted scores are listed and combined for a total score of 39, which is then normalized to the Fall Prediction Score value of 4.
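By way of a non-limiting illustration, the arithmetic of Table 1a for Betty can be reproduced as follows; the rounding rule used for the final normalization is an assumption chosen to match the published values.

    # Illustrative sketch only: sum the weighted feature scores (maximum 100)
    # and normalize the total onto the 0-9 Fall Prediction Score scale.
    BETTY_WEIGHTED_SCORES = [6, 4, 4, 1, 0, 4, 2, 0, 2, 1, 2, 2, 2, 1, 4, 4]  # from Table 1a

    total = sum(BETTY_WEIGHTED_SCORES)              # 39
    fall_prediction_score = round(total / 100 * 9)  # 4 on the 0-9 scale (assumed rounding)
    print(total, fall_prediction_score)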

    • a. In the following Tables 1a-3, MDS refers to the Minimum Data Set, NX refers to any suitable entry, scheme and/or formula (as applicable) available and/or useable in a daily care or other patient or resident management software or scheme, RCM refers to Revenue Cycle Management, and O/E refers to Observation and Events. The MDS codes are standard codes available from the US Department of Health and Human Services, summarized in the table and also, for example, listed online at https://downloads.cms.gov/files/mds-3.0-rai-manual-v1.17.1_october_2019.pdf, accessed Apr. 30, 2020 and herein incorporated by reference.









TABLE 1a

Fall Prediction Value

Feature | Data Source | Data Points | Weight | Max Score | Betty's Score (Weighted)
No. of Hospital Days right after fall | Fall Event, Post-Fall Observation, MDS, and Census. MDS: A2100 = 03 (Discharge to an acute hospital), A2000 (Actual Discharge Date), A03103 = 2 (Type of Discharge - Planned or Unplanned), J1800 = 1 (New Falls since admission), J1900B = 1 or 2 (Injury), J1900C = 1 or 2 (Major injury). Census: Hospital Leave Discharge | No hospital = 0, 0-1 = 1, 2-5 = 2, 6-10 = 3, >10 = 4 | 2 | 8 | 6
Therapy Minutes after fall | Fall Event, CA, and POC | No minutes = 0, 1-60 = 1, 60-120 = 2, 120-240 = 3, >240 = 4 | 1 | 4 | 4
New Diagnosis & X-Ray results (fracture, head injury) | Face Sheet, Neuro Observation, and Ancillary Orders. Face Sheet Diagnosis Codes and Start Date for code related to date of fall. Ancillary Orders (Lab Soft) Radiology comments | 0-4 | 2 | 8 | 4
Presence of Pain (subsequent to the Fall Event) | MDS, Events/Observations, POC, Vitals (Pain as a Vital) | 0-4 | 1 | 4 | 1
Change of Condition | MDS, Observation, Events | No = 0, Yes = 4 | 1 | 4 | 0
ADL data right after fall | MDS, CA, or POC | Independent = 0, 1 Person Assist = 2, Extensive = 4 | 2 | 8 | 4
Medication Orders | Pain Meds, Antipsych. Meds, etc. | 0-4 | 2 | 8 | 2
Resident Calendar | Follow-up appts | No = 0, Yes = 1 | 1 | 4 | 0
Post-Fall Care Plans | CAAs Care Plan Category for Falls, Care Plans, Goal start dates, New interventions | 0-4 | 2 | 8 | 2
Progress Notes | Fall Predictive Analysis | 0-4 | 1 | 4 | 1
Function Score from section GG | MDS, POC, and CA Function Scores | No Data = 0, Low = 1, Medium = 2, High = 4 | 2 | 8 | 2
MDS/PDPM Score | Score change after the fall | No Data = 0, Low = 1, Medium = 2, High = 4 | 2 | 8 | 2
Environmental Factors | | 0-4 | 1 | 4 | 2
General Orders | Non-slip socks, etc. | 0-4 | 1 | 4 | 1
Other Fall Risk Scores | Johns Hopkins Fall Risk Assessment Tool (JHFRAT) | 0-4 | 2 | 8 | 4
Previous Falls | MDS: J1700 A, B, and C = 1 (C = Fracture) | No Data = 0, 1-2 = 2, >2 = 4 | 2 | 8 | 4
Total Score | | | | 100 | 39
Fall Prediction Score (0-9) | | | | 9 | 4











    • c. As noted in various implementations herein, including with respect to FIGS. 4A-B and FIG. 7, training of a machine learning model can include receiving medical-related data. In various implementations, Table 2 lists possible medical data inputs and how they can be quantified. For example, a particular bodily function activity (e.g. bowel movement), pre-bodily function activity (e.g. toilet seating), or daily activity (e.g. walking) is shown, where each type of activity can be further subdivided into sub-categories (e.g. rotating on a toilet, sitting, standing, etc.). In one or more implementations, the bodily function activities, pre-bodily function activities, and/or daily activities, e.g. as listed by numbers 1-N below in Table 2, can include one or more of sitting on a toilet, bladder output, bowel movement, bed mobility, general mobility, various locomotive activities, showering, bathing, and/or any other relevant activity associated with a residential care or nursing home facility. In one or more implementations, the general activity, e.g. toilet use, can be categorized into sub-categories; for example, type or ease of rotation on a toilet or bed, type or ease of standing, lying, or sitting on a bed, whether an individual was sitting or standing to use a toilet for bladder output, the number of locomotive activities performed, the type of transfer used to move an individual from one place to another (e.g. wheelchair), the speed or nature associated with walking, or any other relevant manner of categorizing a general activity into multiple sub-categories, and attributing a value, e.g. a service value (as discussed in the foregoing), to said sub-category. The value for each of these categories can be calculated based on the number of times a service was provided in a specified time frame, e.g. weekly, bi-weekly, monthly, or any other relevant time frame, where, for example for a 15-day time period, each day on which a service is provided within that time frame is counted as 1, and otherwise it is counted as 0; a sketch of this counting scheme follows these notes. A service can include a care provider giving assistance to the individual for the activity. Therefore, the values for these rows of medical-related data range from 0 to N, where N is the number of days associated with the particular time interval.

    • d. Table 3 lists some additional possible medical-related data (Vitals and Pain) and how these can be quantified. For example, rows 1-6 provide Temperature, Pulse, Respiration, Diastolic, Systolic, and Blood Sugar data. These are provided as deviations from a baseline average using any suitable averaging and deviation scheme as shown. Other data can be entered more directly, such as row 7 (weight) and row 8 (height). Data in rows 9-12 relate to pain and have a binary response, such as row 9 (e.g., does the resident show evidence of pain or possible pain: yes value is 1, no value is 0). Other entries and associated data are described and shown in Table 3, e.g. rows 13-14.
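By way of a non-limiting illustration, the service-count quantification used for the activity rows of Table 2 might be sketched as follows, assuming a 15-day interval; the window length and the record layout are assumptions for illustration only.

    # Illustrative sketch only: within an N-day interval ending on the event
    # date, each day on which the service was provided at least once counts as
    # 1, so the resulting feature value ranges from 0 to N.
    from datetime import date, timedelta

    def service_days(service_dates, event_date, window_days=15):
        start = event_date - timedelta(days=window_days - 1)
        days_with_service = {d for d in service_dates if start <= d <= event_date}
        return len(days_with_service)

    # Example: services on two distinct days within the window yield a value of 2.
    print(service_days({date(2020, 4, 1), date(2020, 4, 3)}, date(2020, 4, 10)))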












TABLE 2

Medical-Related Data

Item # | Column Name | Data Tables | Description
1 | EventID | NX RCM O/E | When a Fall Event has occurred, an Event ID for that Fall is provided; else the Event ID is the same as the Patient ID.
2 | EventDate | NX RCM O/E | Date the Event occurred.
3 | Fall Prediction Score | NX RCM O/E | See Fall Prediction Score.
4 | PatientID | NX | The Patient ID has been taken across different databases; in the case of the same Patient ID with different demographic information, it is a different patient from another facility that has the same Patient ID.
5 | Sex | NX | Gender (M, F, U)
6 | RaceCode | NX | Race of the Patient (Asian, Black, Hispanic, Native, Unknown, White)
7 | MARITALSTATUSCODE | NX | Marital Status of the Patient during the event (D, M, S, U, W, X)
8 | Age | NX | Age of the Patient at the time of the event
9 | NoOfFalls | NX RCM O/E | Number of previous falls the patient has had so far.
10 | Diagnosis | NX | Diagnosis during the event (in the form of ICD-10 codes; multiple diagnoses are specified separated by commas)
11 | Pre-Bodily Function, Bodily Function, and/or Daily Living Activity 1, sub-category 1, without assist | NX | Number of times a service was provided for the N-day interval time period (inclusive of the event occurred/not occurred date); if the service was provided "n" times in a day, it is counted as 1, else 0. Values range from 0 to N.
12 | Pre-Bodily Function, Bodily Function, and/or Daily Living Activity 1, sub-category 1, with assist | NX | As in item 11.
13 | Pre-Bodily Function, Bodily Function, and/or Daily Living Activity 2, sub-category 2, with assist | NX | As in item 11.
14 | Pre-Bodily Function, Bodily Function, and/or Daily Living Activity 2, sub-category 2, without assist | NX | As in item 11.
15 | Pre-Bodily Function, Bodily Function, and/or Daily Living Activity 2, sub-category 2, with assist | NX | As in item 11.
... | ... | ... | ...
30 | Pre-Bodily Function, Bodily Function, and/or Daily Living Activity N, sub-category N, without assist | NX | As in item 11.
31 | Pre-Bodily Function, Bodily Function, and/or Daily Living Activity N, sub-category N, with assist | NX | As in item 11.
















TABLE 3

Medical-Related Data (Vitals and Pain)

Item # | Column Name | Data Tables | Description
1 | Temperature Deviation |  | Average Temperature Value over a specified time period from N1 (day 1) to Nn (day n), or any suitable deviation calculation from a median, average, or other suitable value.
2 | Pulse Deviation |  | Average Pulse Value over a specified time period from N1 (day 1) to Nn (day n), or any suitable deviation calculation from a median, average, or other suitable value.
3 | Respiration Deviation |  | Average Respiration Value over a specified time period from N1 (day 1) to Nn (day n), or any suitable deviation calculation from a median, average, or other suitable value.
4 | Diastolic Deviation |  | Average Diastolic Value over a specified time period from N1 (day 1) to Nn (day n), or any suitable deviation calculation from a median, average, or other suitable value.
5 | Systolic Deviation |  | Average Systolic Value over a specified time period from N1 (day 1) to Nn (day n), or any suitable deviation calculation from a median, average, or other suitable value.
6 | Blood Sugar Deviation |  | Average Blood Sugar Value over a specified time period from N1 (day 1) to Nn (day n), or any suitable deviation calculation from a median, average, or other suitable value.
7 | Weight |  | Weight is measured in Pounds.
8 | Height |  | Height is measured in Inches.
9 | Evidence_of_Pain | NX | Binary response (1 or 0): "Does Resident show evidence of pain (or possible pain)?" = Yes.
10 | J0200 | MDS | Should Pain Assessment Interview be Conducted?: 1 = Yes.
11 | J0300 | MDS | Pain Presence: 1 = Yes.
12 | J0700 | MDS | Should the Staff Assessment for Pain be Conducted?: 1 = Yes.
13 | J0850 | MDS | Frequency of Indicator of Pain or Possible Pain = 1 OR 2 OR 3.
14 | J0400 | MDS | Pain Frequency = 1 OR 2 OR 3.

Referring to FIG. 2, a system 200, according to one or more implementations of the present disclosure, is illustrated. The system 200 can, in one or more implementations, assign or recommend a resource, such as a prophylactic measure, to a resident or patient of a healthcare facility, e.g. a resident home, nursing home, or hospital, based in whole or in part on one or more of the determinations discussed herein, e.g. a fall prediction score. The "modules," "units," or "components" described in the system 200, whether contained in memory or otherwise employed therein, can be any suitable software, logic (hardware or software), or hardware element specifically configured to perform or be used in the performance of one or more tasks or functions as discussed herein.


In one or more implementations, the system 200 can include a resource assignment unit 201 which in turn includes one or more processors 202, memory 203, and a network interface 207. The one or more processors 202 can be any suitable software or hardware computer components for carrying out any operation as discussed herein. The memory 203 can be any suitable component or unit for storing protocols, information, algorithms, and/or instructions for execution by the one or more processors, e.g., the memory 203 may be any volatile and/or non-volatile memory capable of storing information during and/or for execution of instructions. The devices, systems, sources, units and/or components of the resource assignment unit 201 can be coupled to a network 220, e.g., the Internet, via one or more wired and/or wireless network links, and can be accessed by one or more network interfaces 207.


In one or more implementations, the resource assignment unit 201 can interact with one or more users or clients 218 (and associated user/client computing devices 216, e.g. a laptop, mobile phone, tablet, or desktop computer) via a network interface 207 that can access a network 220, and the resource assignment unit 201 can interact with one or more databases (as shown with reference to other implementations) also via the network interface 207 accessing the network 220, where in one or more implementations the one or more data sources can include one or more data sources related to the health, living condition, status, medical profile, or other information associated with one or more patients or residents in a medical facility (e.g. hospital, doctor's office, in-patient facility, out-patient facility, physical therapist's office, etc.) or resident home (e.g. a nursing home, assisted living facility, other residence, etc.). In one or more implementations, the data sources can include demographic data, diagnostic data, active (and/or previous) medication data, activity data, MDS data (and/or any other data mandated by, provided by, or associated with a governmental or regulatory body), vital change information data, and any other available data associated with one or more individuals (e.g. patients or residents). In one or more implementations, as shown, a resident or patient information database 208 can be included in the memory 203 and can be updated by the one or more users 218. In one or more implementations, the resource assignment unit 201 can be local to the user device 216, e.g. contained in, or programmed to be executed on, the user device 216.


In one or more implementations, any information or data as described herein can be part of the patient information database 208. In one or more implementations, the information in the patient or resident information database 208 can include one or more medical condition data for one or more patients or residents, e.g. Subject 1, Subject 2, etc. The condition or conditions, e.g. "Cdt 1," "Cdt 2," and so on, can refer to a particular medical condition; for example, a broken hip, dementia, or any other medical condition. In one or more implementations, the patient information database 208 can further include one or more preferences, e.g. "P1," "P2," and so on, for the one or more patients or residents, where the one or more preferences can indicate a specific designation by either the patients themselves, a medical professional, or another individual familiar to the residents or patients as to a particular treatment, entertainment, media, or other mechanism that may be needed or desired by the patient or resident in times of distress. For example, preference "P1" can be a patient's or resident's request to hear a family member's voice recording in times of distress (e.g. when said patient or resident suffers from dementia), "P2" can be a doctor's or other medical professional's suggestion that when a patient is in distress or in danger he or she is confined to a wheelchair or given a leg brace (e.g. when said patient or resident has a fractured bone or other orthopedic disability), or any other suitable preference or treatment applicable for a particular individual in relation to a designated danger (e.g. potential of a fall) in light of a medical condition (e.g. dementia or an orthopedic issue).
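As a non-limiting illustration, the entries of the patient or resident information database 208 could be organized as sketched below; the subject identifiers, condition labels, and preference text are hypothetical examples only and are not part of the disclosure.

    # A minimal, hypothetical representation of entries in the patient or resident
    # information database 208: each subject carries one or more conditions ("Cdt")
    # and one or more preferences ("P") used downstream by the matching unit.
    resident_database = {
        "Subject 1": {
            "conditions": {"Cdt1": "dementia"},
            "preferences": {"P1": "play family voice recording in times of distress"},
        },
        "Subject 2": {
            "conditions": {"Cdt2": "high blood pressure", "Cdt3": "sprained leg"},
            "preferences": {"P2": "leg brace suggested by a medical professional"},
        },
    }

    def preferences_for(subject_id):
        # Look up the preferences recorded for a given resident or patient.
        return resident_database.get(subject_id, {}).get("preferences", {})

    print(preferences_for("Subject 1"))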


In one or more implementations, the fall prediction unit 204 can be configured to perform any of the fall prediction operations of the systems or flows discussed herein for making a determination as to whether or not an individual will sustain a fall, including but not limited to those of FIG. 1 and FIG. 4A-FIG. 9, where said operations are executed by the one or more processors 202. In one or more implementations, the fall prediction unit 204 can be programmed with the operations of FIG. 8, where said operations are executed by the processor 202. In one or more implementations, as stated herein, the data used to generate a fall prediction by the unit 204 can be based in whole or in part on the resident or patient database 208 or other databases (as shown with reference to other implementations) as described herein.


In one or more implementations, the matching unit 206 can be configured to associate a condition and/or preference to a fall prediction score outputted by the fall prediction unit 204. In some implementations, the matching unit can be configured to output a specific resource to an individual if he or she has a particular condition and, in relation to that condition, the individual's associated fall prediction score exceeds a certain threshold. In some implementations, the matching unit 206 can account for multiple conditions by performing any suitable mathematical operation, e.g. a sum, product, weighting, or factorial increase based on the fall prediction score outputted by the fall prediction unit 204 and the one or more conditions associated with a particular individual. In some implementations, a particular condition will automatically warrant at least one resource allocation irrespective of a fall prediction score, and the addition of other conditions can impact whether additional resources are allocated to a particular individual. In some implementations, each condition can be associated with a particular threshold in relation to the fall prediction score and in relation to whether a particular resource or resources are allocated.


In some implementations, for example, the matching unit 206 can be configured to suggest or allocate a resource based solely on the fall prediction score outputted by the fall prediction unit 204, e.g. via the network 220 and by displaying or otherwise outputting the score via one or more computer devices 216 to one or more users 218. For example, if the score meets or exceeds a first threshold Nt1, then a first resource is suggested, e.g. a soothing voice of a family member or pre-recorded music associated with a family member of a particular resident or patient is played over a loudspeaker (e.g. instructing a resident to stay in place while medical assistance arrives). If the score meets or exceeds a second threshold Nt2, then another or an additional resource can be suggested, e.g. a wheelchair. If the score meets or exceeds a third threshold Nt3, then another or yet an additional resource can be suggested, e.g. immediate medical assistance by a care provider (nurse, therapist, doctor, etc.). In some implementations, since, as discussed with reference to other embodiments and implementations, e.g. FIG. 1, FIG. 4A-FIG. 6B, FIG. 8 and FIG. 9, a fall prediction score is based on data that is in part particular to an individual or patient, the resource allocation based on the fall prediction score is inherently specific to a user. In some implementations, in order to further particularize a resource allocation to a user, the matching unit 206 can compare a fall prediction score associated with a patient or resident to a particular condition associated with said patient or resident. For example, the matching unit 206 can be configured to suggest a particular resource based on fall prediction scores that are specific to particular conditions or are weighted based on the number and/or type of conditions. For example, condition 1, e.g. dementia, can have a weight W1. If a particular individual suffers from dementia and has a fall prediction score FS, then the product of W1 and FS can be compared to particular threshold reference scores, e.g. RS1, RS2, RS3, and so on, where if the product of W1 and FS exceeds the one or more threshold or reference scores, then a resource corresponding to each exceeded reference score, or a resource corresponding to the highest exceeded score, is suggested. If multiple conditions are present, e.g. condition 1 (dementia) and condition 2 (a broken or fractured bone), then multiple weights, each corresponding to one of the conditions, e.g. W1 for dementia and W2 for a broken or fractured bone, can be applied to the fall prediction score FS to determine whether one or more resources should be assigned to the corresponding individual. In some implementations, the weights, e.g. W1 and W2, can be higher or lower based on their relevance as to whether or not an individual will suffer a fall relative to a condition, e.g. a sprained ankle can have a lower weight than a fractured hip.
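The following is a minimal sketch, under stated assumptions, of the weighting and threshold comparison described above; the specific threshold values, weight values, and resource names are illustrative only and not prescribed by the present disclosure.

    def suggest_resources(fall_score, condition_weights, reference_scores):
        # Multiply the fall prediction score FS by the weights of the resident's
        # conditions (e.g. W1 for dementia, W2 for a fractured bone), then compare
        # the weighted score against ascending reference scores RS1, RS2, ...;
        # every resource whose reference score is met or exceeded is suggested.
        weighted = fall_score
        for weight in condition_weights:
            weighted *= weight
        return [resource for threshold, resource in sorted(reference_scores)
                if weighted >= threshold]

    # Illustrative values only.
    reference_scores = [(0.5, "soothing audio recording"),
                        (0.7, "wheelchair"),
                        (0.9, "immediate medical assistance")]
    print(suggest_resources(0.8, [1.2], reference_scores))
    # ['soothing audio recording', 'wheelchair', 'immediate medical assistance']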


In some implementations, each condition can correspond to a particular threshold; if the threshold, e.g. RS1, RS2, etc., for the condition or conditions is not met, then no resource is assigned, and if the threshold for the condition or conditions is met, then each resource associated with the particular condition is suggested by the matching unit 206. In some implementations, the threshold for a given condition can be correlated to the likelihood that an individual will suffer a fall as a result of that condition, e.g. a sprained ankle can have a lower threshold than a fractured hip. In some implementations, the matching unit 206 can coordinate with one or more sensing, audiovisual (e.g. camera), or other suitable devices to make a suggestion as to whether or not a resource should be provided. For example, if a particular resident or patient has a score that exceeds a certain threshold, or exceeds a certain threshold relative to one or more conditions, and the patient or resident already has the suggested allocation (e.g. as determined by conducting a scan of a particular setting and performing any suitable audiovisual operation to detect the patient or resident and his or her surroundings and/or allocated resources), then no further allocation need be made. By way of another example, the matching unit 206 can suggest allocating or not allocating a resource based on a threshold being met or not met relative to a fall prediction score and one or more conditions, in addition to an audiovisual scan of a particular setting. If a resource allocation is suitable only in the case of patient movement, e.g. a leg brace or wheelchair, but the patient or resident is on his or her bed, then the particular resource allocation can be withheld as extraneous based on an audiovisual analysis of the care setting.
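A minimal sketch of withholding a movement-related resource based on an audiovisual mobility check follows; the function name, the mobility flag, and the example values are assumptions used only to illustrate the gating logic described above.

    def allocate_with_mobility_check(weighted_score, threshold, resource,
                                     resident_is_mobile, resource_requires_mobility=True):
        # Withhold a movement-related resource (e.g. a leg brace or wheelchair)
        # when a camera or sensor scan indicates the resident is not mobile,
        # even if the score threshold for that resource is met.
        if weighted_score < threshold:
            return None
        if resource_requires_mobility and not resident_is_mobile:
            return None
        return resource

    # Hypothetical example: threshold met, but the resident is on his or her bed.
    print(allocate_with_mobility_check(0.85, 0.7, "wheelchair", resident_is_mobile=False))
    # None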


In some implementations, a condition can warrant more than one resource allocation, or a fall prediction score that is weighted by multiple conditions can warrant more than one resource allocation. In some implementations, a certain condition can warrant a resource allocation irrespective of a fall prediction score, while a fall prediction score weighted by additional conditions can warrant an additional resource, and/or a scan or other review of a care setting, e.g. by a sensor or camera, can warrant an additional resource assignment.


As shown, and pursuant to one or more implementations, the resource assignment unit 201 can interact with one or more devices in a care setting 210 via the network 220. The care setting 210 can be a room or other area appropriate for a care setting, e.g. a medical facility, nursing home, resident home, etc. The devices in the care setting 210 can include one or more cameras or sensors 212 (e.g. infrared cameras) and/or one or more audio output devices 211, including but not limited to a loudspeaker. In some implementations, other devices, such as televisions, monitors, or any other suitable sensing or audiovisual equipment or media, can also be included. In some implementations, any suitable unit or component of the resource assignment unit 201, e.g. the matching unit 206, can be configured to have programmable instructions, which when executed by the processor 202, output instructions to the one or more devices, e.g. 211 and 212, associated with the care setting 210.


In some implementations, the cameras or sensors 212 can continually monitor residents or patients 214a-214c, where each patient or resident is associated with a particular care provision 213a-213c, and where the care provisions can be any suitable provisions providable by a care setting, e.g. nursing home, resident home, hospital, etc., such as a bed, wheelchair, surgical table, etc. In some implementations, a suitable unit of the resource assignment unit 201, e.g. the matching unit 206, continually updates, e.g. at specified intervals (seconds, minutes, hours, etc.), one or more users 218 via one or more computer devices 216 of the need, or lack thereof, to allocate resources to the one or more patients or residents 214a-214c. For example, patient or resident 214a can correspond to Subject 1 in the resident database 208. The fall prediction unit 204 can automatically and continuously receive data associated with Subject 1 from the resident database 208 and/or other databases as disclosed herein with reference to other embodiments. The fall prediction unit 204 can continuously update the fall prediction score associated with patient or resident 214a based on the data feed from the one or more databases, e.g. database 208. In some implementations, the matching unit 206 can then match the fall prediction score to the one or more conditions associated with patient or resident 214a and suggest a resource allocation as a result. In some implementations, Cdt. 1 can be dementia and P.1 can be a preset audio recording of a song indicated as pleasant by Subject 1 or a recording of a familiar voice, e.g. instructing the patient or resident to stay still while someone arrives. If patient or resident 214a receives a score exceeding a certain threshold relative to Cdt. 1, then the matching unit 206 can automatically instruct the speaker 211 to play the audio recording (e.g. song or voice). In some implementations, the matching unit 206 can notify the user 218 via the network 220 and computer 216, and the user 218 can make the decision as to whether or not to allocate the resource (e.g. play the recording). In some implementations, the matching unit 206 can coordinate with the camera 212 to determine whether or not to allocate the resource P1, e.g. the camera can be configured to scan the care setting 210 and determine whether patient or resident 214a is mobile, and if he or she is not mobile, e.g. he or she is on his or her bed, then no resource is allocated, and vice versa (if the threshold of the fall prediction score relative to Cdt. 1 is met).
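One possible way to structure the continual updating described above is sketched below; the callables standing in for the fall prediction unit 204, the matching unit 206, and the notification path are hypothetical placeholders, and the interval-based scheduling is left to the surrounding system.

    def monitoring_pass(residents, predict_score, match_resource, notify):
        # One pass of the continual monitoring: refresh each resident's fall
        # prediction score, match it against that resident's conditions and
        # preferences, and notify a user or output device if a resource is suggested.
        for resident in residents:
            score = predict_score(resident)
            suggestion = match_resource(resident, score)
            if suggestion is not None:
                notify(resident, suggestion)

    # Hypothetical usage; in practice this pass could be repeated at a specified
    # interval (seconds, minutes, hours, etc.) as described above.
    monitoring_pass(
        ["Subject 1"],
        predict_score=lambda r: 0.92,
        match_resource=lambda r, s: "play recording P1" if s > 0.9 else None,
        notify=lambda r, msg: print(r, "->", msg),
    )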


In some implementations, more than one patient or resident can be processed by the system 200 simultaneously. For example, patient or resident 214b can correspond to Subject 2 in the resident database 208 and patient or resident 214c can correspond to Subject 3 in the resident database 208. The fall prediction unit 204 can automatically and continuously receive data associated with Subjects 2 and 3 from the resident database 208 and/or other databases as disclosed herein with reference to other embodiments. The fall prediction unit 204 can continuously update the fall prediction scores associated with patients or residents 214b and 214c based on the data feed from the one or more databases, e.g. database 208. In some implementations, the matching unit 206 can then match the fall prediction score to the one or more conditions associated with patient or resident 214b and patient or resident 214c and suggest a resource allocation as a result. In some implementations, Cdt. 2 can be high blood pressure, Cdt. 3 can be a sprained leg, and P.2 can be a leg brace suggested by a medical professional as a result of the overall health profile of the patient or resident, e.g. patient 214b. In some implementations, Cdt. 2 and Cdt. 3, individually, might not suffice for a particular resource allocation, but the weighting of both in relation to a fall prediction score can result in a suggestion to allocate resource P.2. In some implementations, if patient or resident 214b receives a score exceeding a certain threshold relative to the product of weights associated with Cdt. 2 and Cdt. 3, then the matching unit 206 can automatically instruct the speaker 211 to call for medical attention. In some implementations, the matching unit 206 can notify the user 218 via the network 220 and computer 216, and the user 218 can make the decision as to whether or not to allocate the resource (e.g. request medical attention from a medical care provider). In some implementations, the matching unit 206 can coordinate with the camera 212 to determine whether or not to allocate the resource P2, e.g. the camera can be configured to scan the care setting 210 and determine whether patient or resident 214b is mobile, and if he or she is not mobile, e.g. he or she is on his or her bed, then no resource is allocated, and vice versa (if the threshold of the fall prediction score relative to Cdt. 2 and/or Cdt. 3 is met). In some implementations, Cdt. 4 can be a broken hip and P3 can be a wheelchair suggested by a medical professional as a result of the overall health profile of the patient or resident, e.g. patient 214c. In some implementations, when the health condition, e.g. Cdt. 4, is of such severity that constant resource allocation is required, the matching unit 206 can suggest a resource allocation irrespective of a fall prediction score, e.g. a wheelchair. In some implementations, the matching unit 206 can suggest additional resources based on a fall prediction score and/or coordination with the camera 212. For example, if the camera 212 detects movement in the care setting 210 by patient 214c, then an additional resource can be requested, e.g. immediate medical assistance. In some implementations, a fall prediction score, which can have a lesser threshold for Cdt. 4 than for Cdt. 1, can suggest an additional resource as a result of exceeding a certain threshold, e.g. immediate medical attention.



FIG. 3 illustrates a document storage system 300 according to one or more implementations of the present disclosure. In one or more implementations, the storage system 300 can be associated with predetermined form entries that can be used to produce data of a consistent format type, which both enhances the training of a machine learning algorithm and the application of the same in real time to generate a prediction as to whether or not a user will sustain a fall. In one or more implementations, pre-determined forms can be used both in training and applying an algorithm, computer operation, system, or method for predicting a fall; for example, as discussed with respect to FIG. 1, FIG. 4A, FIG. 4B, FIG. 8, and other implementations as described herein.


In one or more implementations, system 300 includes data sources 305a . . . 305n, where the data sources 305a . . . 305n can include any information discussed with respect to any other data sources or databases as described herein, including medical information associated with one or more individuals. One or more users 303a . . . 303n can interact with the data sources 305a . . . 305n utilizing any one or more suitable computer devices 302a . . . 302n (as discussed herein or as otherwise suitable) and via communication over any suitable network 310 (as discussed herein or as otherwise suitable). In one or more implementations, the computing devices 302a . . . 302n can be loaded with any suitable software that allows one or more users to enter information, including medical information, related to one or more individuals via predetermined forms 301a . . . 301n. In one or more implementations, the predetermined forms 301a . . . 301n are forms associated with a pre-determined value range for describing or characterizing a medical condition or other information associated with an individual. In one or more implementations, the forms can be associated with program logic running on the devices 302a . . . 302n that allows a user to enter information concerning a condition, upon which a computation is performed to provide a numerical value associated with that condition. In one or more implementations, the pre-determined forms can be based in whole or in part on Tables 1-3 and/or the discussion provided and/or associated herein with the same.
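The following is a minimal sketch of a predetermined form field that constrains entries to a value range and returns a numerical value; the specific field names and the numeric codes assigned to marital status are illustrative assumptions and are not specified by the disclosure.

    # Hypothetical predetermined form fields: each entry is constrained to a
    # predetermined value range so the stored data is already numeric.
    FORM_FIELDS = {
        "Evidence_of_Pain": {"Yes": 1, "No": 0},
        "MARITALSTATUSCODE": {"D": 0, "M": 1, "S": 2, "U": 3, "W": 4, "X": 5},  # assumed encoding
    }

    def encode_form_entry(field, answer):
        # Reject values outside the field's predetermined range and return the
        # numerical value associated with the entered answer.
        mapping = FORM_FIELDS[field]
        if answer not in mapping:
            raise ValueError(f"{answer!r} is outside the allowed range for {field}")
        return mapping[answer]

    print(encode_form_entry("Evidence_of_Pain", "Yes"))  # 1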


In one or more implementations, the use of pre-determined forms of this nature during training and/or application of a machine learning algorithm can enhance both the efficiency of training the machine learning algorithm and the accuracy and/or efficacy of application of the same, including with respect to machine learning algorithms directed to providing a fall prediction. The use of pre-determined forms of this kind can accomplish at least two desirable objectives: 1) reducing pre-processing time during training and/or application of a machine learning algorithm, as integer-type data is preferable in many instances of machine learning, and 2) minimizing the chance of errors when applying an algorithm to make a prediction, since the data used to train the algorithm has a substantially similar format to the data that is inputted into the algorithm to render a prediction.


Referring to FIG. 4A, a system 400a, according to one or more implementations of the present disclosure, is illustrated. The system 400a can, in one or more implementations, provide a user with a prediction as to the likelihood that an individual will suffer a fall. The "modules," "units," or "components" described in the system 400a, whether contained in memory or otherwise employed therein, can be any suitable software, logic (hardware or software), or hardware element specifically configured to perform or be used in the performance of one or more tasks or functions as discussed herein.


In one or more implementations, the system 400a can include a fall prediction unit 401a which in turn includes one or more processors 402a, memory 403a, storage 407a and a network interface 414a. The one or more processors 402a can be any suitable software or hardware computer components for carrying out any operation as discussed herein. The memory 403a can be any suitable component or unit for storing protocols, information, algorithms, and/or instructions for execution by the one or more processors, e.g., the memory 403a may be any volatile and/or non-volatile memory capable of storing information during and/or for execution of instructions. The devices, systems, sources, units and/or components of the fall prediction unit 401a can be coupled to a network 420a, e.g., the Internet, via one or more wired and/or wireless network links, and can be accessed by one or more network interfaces 412a.


In one or more implementations, the fall prediction unit 401a can interact with one or more users or clients 484a . . . 484n (and associated user/client computing devices 480a . . . 480n, e.g. a laptop, mobile phone, tablet, or desktop computer) via a network interface 412a that can access a network 420a, and the fall prediction unit 401a can interact with one or more databases or data sources 440a also via the network interface 412a accessing the network 420a, where in one or more implementations the one or more data sources can include one or more data sources related to the health, living condition, status, medical profile, or other information associated with one or more patients or residents in a medical facility (e.g. hospital, doctor's office, in-patient facility, out-patient facility, physical therapist's office, etc.) or resident home (e.g. a nursing home, assisted living facility, other residence, etc.). In one or more implementations, the data sources 440a can include demographic data 445a, diagnostic data 450a, active (and/or previous) medication data 455a, activity data 460a, MDS data 465a (and/or any other data mandated by, provided by, or associated with a governmental or regulatory body), vital change information data 470a, and any other available data 475a associated with one or more individuals (e.g. patients or residents). In one or more implementations, the data sources 440a can include real-time data (e.g. live data from constant monitoring of one or more individuals and/or continuously updated data concerning the one or more individuals), static or historical data concerning the one or more individuals (e.g. past medical events, injuries, etc.), or any suitable combination of data.


In one or more implementations, at least one or more of the data sources 445a-470a are based in whole or in part on information associated with predetermined forms that are in turn associated with value ranges and/or equations for providing a numerical value in relation to a medical condition; where in one or more implementations, the predetermined forms that provide the data can be based in whole or in part on Tables 1-3 and/or FIG. 3 of the present disclosure. In one or more implementations, utilizing data in this format, e.g. a numerical format where a number represents the severity of a condition, the importance of a medication, gender, age, etc., can both improve the accuracy of a system that trains a machine learning algorithm and the operation and efficiency of the machine learning algorithm itself, because the use of natural language techniques (and errors associated therewith) is reduced or eliminated and the type of data used in training can have a higher symmetry with the data used in applying a machine learning model.


In one or more implementations, the memory 403a can include a machine learning training protocol 404a and an operating system 496a, where the operating system 496a can be any suitable operating system compatible with system 400a. In one or more implementations, the machine learning training protocol 404a and/or the trained algorithm or model 408a can be based in whole or in part on the relevant operations associated with one or more of FIGS. 7-10. In one or more implementations, the machine learning training protocol 404a can train an algorithm 408a, e.g. a trained machine learning model 408a, to intake data related to a patient or resident of a medical facility, nursing home, or other facility. In one or more implementations, the machine learning training protocol 404a can include a determination protocol 406a and a training protocol 405a. In one or more implementations, the data sources 445a-475a include fall history data, e.g. in data source 475a, on one or more residents in a nursing home, medical facility, or assisted living facility, etc. In one or more implementations, the determination protocol 406a applies any suitable correlation, distribution, or other suitable mathematical technique to determine a correlation or relationship between a condition, medication, demographic, daily living activity, or any other suitable information and a fall history of one or more individuals. For example, the determination protocol 406a can tally the number of falls associated with a particular data, e.g. a number of previous falls, a type of medication, etc., associated with one or more individuals. In one or more implementations, the determination protocol 406a can rank order, by frequency and from first to last, the number of falls in relation to the particular data. In one or more implementations, the rank order serves as the basis for selecting which type of data can be processed by one or more static models and/or subject to a static model adjustment, e.g. an adjustment factor, as discussed herein. In one or more implementations, a different scheme can be used to determine which types of data can be processed by a static model and/or subject to an adjustment factor after overall processing by a machine learning algorithm, including random selection, selection by experimentation, application of probabilistic or Gaussian schemes, differential or integration techniques that consider different variables, and/or coordination with the training protocol 405a as discussed below.
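A minimal sketch of the tally-and-rank-order operation described above follows; the record layout and the example medication and diagnosis values are hypothetical and serve only to illustrate how data types could be ranked by their association with fall events.

    from collections import Counter

    def rank_by_fall_association(records):
        # records: iterable of (data_value, fell) pairs, where data_value might be
        # a medication, diagnosis code, or other entry and fell is True/False.
        # Tally falls per data value and rank from most to least associated; the
        # top-ranked entries are candidates for static-model processing and/or an
        # adjustment factor as discussed herein.
        fall_counts = Counter(value for value, fell in records if fell)
        return fall_counts.most_common()

    # Illustrative, invented records only.
    records = [("benzodiazepine", True), ("benzodiazepine", True),
               ("ICD10:S72", True), ("antihistamine", False)]
    print(rank_by_fall_association(records))
    # [('benzodiazepine', 2), ('ICD10:S72', 1)]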


In one or more implementations, training protocol 405a intakes data from data sources 445a-475a and trains an algorithm 408a, e.g. a trained machine learning model based on the data intake. In one or more implementations, the training protocol 405a trains one or more dynamic protocols and determines and/or applies one or more static protocols as part of an overall training of algorithm 408a. In one or more implementations, an algorithm 408a, e.g. a trained machine learning model 408a (including the sub-protocols of algorithm 408a) trained by the machine learning training protocol 404a can be stored in a storage 407a, where the storage can include any suitable memory or other components as described herein, including volatile and/or non-volatile memory for storing a trained machine learning algorithm.


Any suitable dynamic machine learning model scheme can be generated or trained by the training protocol 405a, including a neural network, convolutional neural networks, modular neural networks, etc. In one or more implementations, the dynamic machine learning protocol is a neural network. In one or more implementations, the dynamic machine learning protocol, e.g. neural network, can be trained using any suitable technique, including deep learning techniques, e.g. backpropagation, as described herein with respect to other implementations or as otherwise suitable, with the output and backpropagation being in reference to fall prediction in relation to information from data sources 445a-475a.
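For illustration only, a dynamic protocol of this kind could be sketched as a small feed-forward neural network trained with backpropagation; the library choice, feature layout, and synthetic labels below are assumptions and are not part of the disclosure.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    # Synthetic, invented data standing in for numeric features such as
    # demographics, activity counts, and vital deviations.
    rng = np.random.default_rng(0)
    X = rng.random((200, 6))
    y = (X[:, 0] + X[:, 3] > 1.0).astype(int)   # synthetic fall / no-fall labels

    # A small neural network trained by backpropagation (one of many possible choices).
    model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
    model.fit(X, y)
    initial_scores = model.predict_proba(X[:3])[:, 1]   # probability-like fall scores
    print(initial_scores)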


In one or more implementations, as stated, the training protocol 405a utilizes, determines, and/or applies one or more static protocols with respect to one or more data sources, e.g. data sources 445a-475a, as part of an overall training process for algorithm 408a. In one or more implementations, an output of the one or more static protocols serves as an input to the neural network of the dynamic protocol. In one or more implementations, the one or more static protocols are one or more equations applied to information from some of the data sources 445a-475a, while the remaining data sources feed directly into the neural network. In one or more implementations, the one or more static protocols determine the weights associated with information obtained from part of the data sources 445a-475a and those weights are static, where in one or more implementations, the one or more static protocols perform more than one mathematical operation prior to generating an output to serve as an input to the dynamic protocol, and where one or more weights of the dynamic protocol, e.g. neural network, are variable. In one or more implementations, at least one equation associated with the static model can be any suitable equation that correlates the likelihood of an event in relation to a data entry and/or the probability of an event occurring in relation to a data entry. In one or more implementations, the static protocols can be preset, in whole or in part, by a user based on experimentation on what type of equation, e.g. probabilistic equation and/or correlation equation, enhances the accuracy of an algorithm. In one or more implementations, the determination protocol 406a can partially determine the static protocols by providing an overall mathematical structure, e.g. as shown below with respect to Equation 1, that is derived from a preset grouping of probabilistic and/or correlation equations, where the accuracy of each equation is determined by one or more iterations of backpropagation during training in relation to actual fall events. In one or more implementations, the determination protocol 406a can partially determine the equations that form the basis for the static protocols and one or more users can set the basis for the same; e.g., the determination protocol 406a can determine the overall mathematical structure and the one or more users can determine specific numbers, e.g. factors (multipliers, dividers, etc.) associated with the same.


In one or more implementations, as stated and implied above, the determination protocol 406a can coordinate with the training protocol 405a to determine which data has the highest degree of association with an individual falling or not falling, e.g. during backpropagation and/or other training operations, the determination protocol 406a can log which types of data (and variations of the same) have the largest influence on whether or not an individual will suffer a fall; accordingly, in one or more implementations, a first series of training operations can be utilized without static models to determine which data will be subject to one or more static models and/or adjustment factors. In one or more implementations, training does not commence until the static models are determined, by experimentation or as otherwise discussed herein, as the output of the static models can serve as one or more inputs to the dynamic model, e.g. neural network.


In one or more implementations, the equation associated with one or more static models can be as follows:





(Fall Count/Not Fall Count)−(Not Fall Count/Fall Count)   Equation 1


In one or more implementations, the “Count” of Equation 1 is based on the number of times a particular data appears in entries of a table or matrix with other data in relation to multiple individuals, where “fall count” refers to a number of times a particular data is associated with a fall and “Not Fall Count,” refers to a number of times the same data is part of an entry not associated with a fall. In one or more implementations, the relational table or matrix can be generated by the training protocol 405a. In one or more implementations, for each particular data in a row, Equation 1 determines one or more (static) weights associated with each particular data, and thereafter a transformation operation is performed by the training protocol 405a. In one or more implementations, the transformation operation averages all the weights associated with a particular row, and thereafter the average of the weights is inputted into the dynamic model, e.g. neural network. In one or more implementations, the table and/or matrix of data values, weights and/or average of weights can serve as the basis of a static model dictionary for use during training of the dynamic protocol and/or in application of the same. In one or more implementations, the static dictionary is used in final model training, e.g. first the dictionary is formed, and then training commences with the dynamic protocol.
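A minimal sketch of Equation 1 and of the transformation that averages the per-value weights into a static model dictionary follows; the row keys and counts are illustrative assumptions, and the sketch presumes that neither count is zero.

    def static_weight(fall_count, not_fall_count):
        # Equation 1: (Fall Count / Not Fall Count) - (Not Fall Count / Fall Count).
        return fall_count / not_fall_count - not_fall_count / fall_count

    def build_static_dictionary(rows):
        # rows: mapping from a row key (e.g. a diagnosis or medication field) to a
        # list of (fall_count, not_fall_count) pairs, one pair per particular data
        # value appearing in that row. The transformation averages the per-value
        # weights so a single number per row can be fed into the dynamic model.
        dictionary = {}
        for key, counts in rows.items():
            weights = [static_weight(f, nf) for f, nf in counts]
            dictionary[key] = sum(weights) / len(weights)
        return dictionary

    # Illustrative counts only.
    print(build_static_dictionary({"Diagnosis": [(4, 2), (1, 5)]}))
    # {'Diagnosis': -1.65} (approximately)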


In one or more implementations, the determination protocol 406a determines an adjustment factor (e.g. for modifying a score), e.g. another static model, to apply at the output of the dynamic protocol in order to generate a final risk score. In one or more implementations, utilizing the techniques and operations as discussed herein, the determination protocol 406a can determine the type of data that will be subject to the adjustment factor by determining which type of data correlates to the highest degree with respect to a fall event occurring (or not occurring). In one or more implementations, as stated or implied herein, the adjustment factor (e.g. the equation used to modify the score) or model is pre-determined by a user, e.g. using experimentation and/or other probabilistic techniques. In one or more implementations, the type of data that can be subject to the adjustment factor is a fall history of one or more individuals on a temporal basis, e.g. the number of falls an individual has suffered within a predetermined time range. In one or more implementations, utilizing the techniques and operations as discussed herein, the determination protocol 406a can also determine a mathematical equation that serves as the adjustment factor or adjustment protocol. In one or more implementations, as stated expressly or implicitly herein, both the type of data that will be subject to the adjustment factor and the equation associated with the same can be determined by experimentation.


In one or more implementations, Equation 2, as provided below, can serve as the basis of the adjustment factor:






Nf*2%   Equation 2


“Nf” stands for the number of fall counts within a predetermined time range associated with one or more individuals. For example, an initial fall prediction score outputted by the dynamic protocol (which includes inputs from one or more static protocols) can be multiplied by the number of times an individual has fallen within a predetermined time frame, and in turn multiplied by two percent to produce a final fall prediction score during training.


In one or more implementations, the determination protocol 406a, in coordination with the training protocol 405a, and in order to further enhance the accuracy of a final fall prediction score, can apply a sub-algorithm of conditional static equations to the initial fall prediction score. For example, one or more threshold operations can be applied to an initial score to determine what type of adjustment factor will be applied to the initial fall prediction. For example, in one or more implementations, the one or more threshold operations can be related to the value of the initial score, the number of falls associated with a particular individual, and/or the time frame associated with those falls. By way of a specific example, and according to one or more implementations, the application and operations of the adjustment factor or protocol can be as follows: if a fall event is within the prior 30 days and the initial score is less than 90%, then the final score can be 90+(0.2*Nf over the prior 30 days); if a fall event is within the prior 30-90 days and the initial score is less than 80%, then the final score can be 80+(0.2*Nf over the prior 30-90 days); and if a fall event is within the prior 90-365 days and the initial score is less than 70%, then the final score can be 70+(0.2*Nf over the prior 90-365 days).
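A minimal sketch of the banded adjustment example given immediately above follows; the band boundaries and score floors mirror that example, while the function name and the representation of the score as a 0-100 percentage and of falls as days-ago values are assumptions for illustration.

    def adjusted_score(initial_score, days_since_falls):
        # initial_score: dynamic-protocol output expressed as a percentage (0-100).
        # days_since_falls: list of days elapsed since each of the resident's prior falls.
        falls_0_30   = sum(1 for d in days_since_falls if d <= 30)
        falls_30_90  = sum(1 for d in days_since_falls if 30 < d <= 90)
        falls_90_365 = sum(1 for d in days_since_falls if 90 < d <= 365)

        if falls_0_30 and initial_score < 90:
            return 90 + 0.2 * falls_0_30
        if falls_30_90 and initial_score < 80:
            return 80 + 0.2 * falls_30_90
        if falls_90_365 and initial_score < 70:
            return 70 + 0.2 * falls_90_365
        return initial_score

    print(adjusted_score(62.0, days_since_falls=[12, 45]))   # 90.2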


In one or more implementations, the training protocol 405a can train the dynamic protocol, e.g. via backpropagation, with a feedback loop or iteration sequence that includes the adjustment factor, e.g. each iteration is after the adjustment factor or protocol is applied. In one or more implementations, the adjustment factor or protocol is determined by the determination protocol 406a and is used after training (e.g. during application of a trained algorithm or protocol), but it is not used during training.


In one or more implementations, once the training protocol 405a and determination protocol 406a have trained and/or determined the various protocols to make up an algorithm that can make a fall prediction, the trained algorithm and components can be stored in storage 407a, where storage 407a can include any suitable combination of memory components, including volatile and/or non-volatile memory, and any other suitable component for a processor to retrieve and execute the trained algorithm and its sub-protocols. In one or more implementations, the trained algorithm is trained algorithm 408a, and can include dynamic protocol 410a, static protocols 410b-410n, and/or adjustment protocol 411a, where in one or more implementations, as expressly stated and/or implied above, dynamic protocol 410a is trained by training protocol 405a, static protocols 410b-410n are determined by determination protocol 406a and/or by experimentation, and adjustment protocol 411a is determined by determination protocol 406a and/or by experimentation.


In one or more implementations, trained algorithm 408a can be applied to one or more data sources related to one or more individuals, e.g. a patient or resident of a nursing home, hospital, or other care facility, to provide a prediction as to whether or not an individual will sustain a fall. For convenience, data sources 440a-475a are used to describe some aspects of trained algorithm 408a, e.g. data sources 440a-475a include both the training data sets used to train the algorithm and separate data, including real-time incoming data, for applying the algorithm 408a to render a prediction, where there is no overlap between the data associated with a first group of individuals used to train algorithm 408a and the data set associated with one or more other individuals for whom the algorithm 408a is expected to make a prediction. In one or more implementations, the algorithm can be applied to or intake data from entirely separate data sources, although the data categories can be the same.


In one or more implementations, one or more users 484a, 484n can make a request using one or more computer devices 480a, 480n and over network 420a to the fall prediction unit 401a such that the fall prediction unit 401a outputs one or more predictions 495a, 495n as to whether or not an individual will suffer a fall. In one or more implementations, certain data, e.g. demographic data, activity data, MDS data, and/or vital change information related to the one or more individuals, feeds directly into the dynamic protocol 410a (e.g. a neural network), and other types of data, e.g. diagnostic data and active medication data, feed into the one or more static protocols 410b-410n. In one or more implementations, the data types that fed directly into the neural network and the data types that fed into the static protocols during training correspond to the same input feed flow during application of the trained model. In one or more implementations, the static protocols 410b-410n utilize the static dictionary developed during training to generate weights and weight averages associated with the incoming data stream, and thereafter output the processed results as inputs into the dynamic protocol 410a (e.g. neural network). In one or more implementations, the static protocols 410b-410n perform a matching operation that matches a data type, e.g. medication or diagnosis, associated with an individual for whom a fall prediction is sought to an entry in a static dictionary, and output a weight or weight average associated with the matched entry into the dynamic protocol (e.g. neural network).
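A minimal sketch of that matching operation follows; the dictionary contents, the fallback value for unmatched entries, and the averaging of matched weights are illustrative assumptions consistent with the static-dictionary discussion above.

    def static_input_for(resident_entries, static_dictionary, default=0.0):
        # Match each incoming data value (e.g. a medication or diagnosis code)
        # against the static dictionary built during training, and emit the
        # averaged weight as an input to the dynamic protocol; unmatched entries
        # fall back to an assumed default value.
        weights = [static_dictionary.get(entry, default) for entry in resident_entries]
        return sum(weights) / len(weights) if weights else default

    static_dictionary = {"ICD10:S72": 1.5, "benzodiazepine": 0.9}   # illustrative only
    print(static_input_for(["ICD10:S72", "benzodiazepine"], static_dictionary))  # 1.2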


In one or more implementations, the static protocols 410b-410n can perform a real-time computation on the incoming data stream, e.g. construct one or more new tables or matrices with respect to the incoming data stream, compute one or more new static weights and one or more static weight averages from the incoming data, and apply the one or more averages as inputs to the dynamic protocol 410a. In one or more implementations, an initial score is generated by the dynamic protocol 410a (which includes inputs from the one or more static protocols 410b-410n), and thereafter a final score is outputted to the one or more users 484a, 484n. In one or more implementations, the adjustment protocol 411a is based on a fall history equation (e.g. another static model) and/or fall history protocol; where in one or more implementations, the adjustment protocol 411a is based on Equation 2 and/or one or more threshold operations as discussed above with reference to training operations.


In one or more implementations, the machine learning training protocol 404a and/or trained algorithm 408a can be retrieved from memory or storage and executed by one or more processors, such as processor 402a.


Embodiments are not limited in the above manner, and the above system is merely an exemplary embodiment for implementing one or more features of the present disclosure.


Referring to FIG. 4B, a system 400b, according to one or more implementations of the present disclosure, is illustrated. The system 400b can, in one or more implementations, provide a user with a prediction as to the likelihood that an individual will suffer a fall. The "modules," "units," or "components" described in the system 400b, whether contained in memory or otherwise employed therein, can be any suitable software, logic (hardware or software), or hardware element specifically configured to perform or be used in the performance of one or more tasks or functions as discussed herein.


In one or more implementations, the system 400b can include a fall prediction unit 401b which in turn includes one or more processors 402b, memory 403b, storage 407b and a network interface 414b. The one or more processors 402b can be any suitable software or hardware computer components for carrying out any operation as discussed herein. The memory 403b can be any suitable component or unit for storing protocols, information, algorithms, and/or instructions for execution by the one or more processors, e.g., the memory 403b may be any volatile and/or non-volatile memory capable of storing information during and/or for execution of instructions. The devices, systems, sources, units and/or components of the fall prediction unit 401b can be coupled to a network 420b, e.g., the Internet, via one or more wired and/or wireless network links, and can be accessed by one or more network interfaces 414b.


In one or more implementations, the fall prediction unit 401b can interact with one or more users or clients 484a′ . . . 484n′ (and associated user/client computing devices 480a′ . . . 480n′, e.g. a laptop, mobile phone, tablet, or desktop computer) via a network interface 414b that can access a network 420b, and the fall prediction unit 401b can interact with one or more databases or data sources 440b also via the network interface 414b accessing the network 420b, where in one or more implementations the one or more data sources can include one or more data sources related to the health, living condition, status, medical profile, or other information associated with one or more patients or residents in a medical facility (e.g. hospital, doctor's office, in-patient facility, out-patient facility, physical therapist's office, etc.) or resident home (e.g. a nursing home, assisted living facility, other residence, etc.). In one or more implementations, the data sources 440b can include demographic data 445b, diagnostic data 450b, active (and/or previous) medication data 455b, activity data 460b, MDS data 465b (and/or any other data mandated by, provided by, or associated with a governmental or regulatory body), vital change information data 470b, and any other available data 475b associated with one or more individuals (e.g. patients or residents). In one or more implementations, the data sources 440b can include real-time data (e.g. live data from constant monitoring of one or more individuals and/or continuously updated data concerning the one or more individuals), static or historical data concerning the one or more individuals (e.g. past medical events, injuries, etc.), or any suitable combination of data.


In one or more implementations, at least some of the data sources 445b-470b are based in whole or in part on information associated with predetermined forms that are in turn associated with value ranges and/or equations for providing a numerical value in relation to a medical condition; where in one or more implementations, the predetermined forms that provide the data can be based in whole or in part on Table 1 and/or FIG. 10 of the present disclosure. In one or more implementations, utilizing data in this format, e.g. a numerical format where a number represents the severity of a condition, the importance of a medication, gender, age, etc., can both improve the accuracy of a system that trains a machine learning algorithm and the operation or efficiency of the machine learning algorithm itself, because the use of natural language techniques (and errors associated therewith) is reduced or eliminated and the type of data used in training can have a higher symmetry with the data used in applying a machine learning model.


In one or more implementations, the memory 403b can include a machine learning training unit 404b, a determination unit 406b, and an operating system 496b, where the operating system 496b can be any suitable operating system compatible with system 400b. In one or more implementations, the machine learning training unit 404b and/or the machine learning application component 408b can be based in whole or in part on the relevant operations associated with one or more of FIGS. 7-9 and/or configured to execute the same. The machine learning training unit 404b can train or configure one or more modules that become part of an overall machine learning application component 408b. In one or more implementations, the machine learning model 409b can include a dynamic module 410a′, one or more static modules 410b′-410n′, and/or an adjustment module 412b. In one or more implementations, one or more components of the machine learning model 409b, e.g. a trained machine learning model 409b, are trained by the training unit 405b to intake data related to a patient or resident of a medical facility, nursing home, or other facility and render a fall prediction. In one or more implementations, the data sources 445b-475b can include fall history data, e.g. in data source 475b, on one or more residents in a nursing home, medical facility, or assisted living facility, etc.


In one or more implementations, the determination unit 406b applies any suitable correlation, distribution, or other suitable mathematical technique to determine a correlation or relationship between a condition, medication, demographic, daily living activity, or any other suitable information and a fall history of one or more individuals. For example, the determination unit 406b can tally the number of falls associated with a particular data, e.g. a number of previous falls, a type of medication, etc., associated with one or more individuals. In one or more implementations, the determination unit 406b can rank order, by frequency and from first to last, the number of falls in relation to the particular data. In one or more implementations, the rank order serves as the basis for selecting which type of data can be processed by one or more static models and/or subject to a static model adjustment, e.g. an adjustment factor, as discussed herein. In one or more implementations, a different scheme can be used to determine which types of data can be processed by a static model and/or subject to an adjustment factor after overall processing by a machine learning algorithm, including random selection, selection by experimentation, application of probabilistic or Gaussian schemes, differential or integration techniques that consider different variables, and/or coordination with a training unit 405b as discussed below.


In one or more implementations, training unit 405b intakes data from data sources 445b-475b and trains one or more components or modules of model 409b, e.g. a trained machine learning model based on the data intake. In one or more implementations, the training unit 405b trains one or more dynamic modules or components and determines (in coordination with the determination unit 406b) and/or applies one or more static modules or components as part of an overall training of model 409b. In one or more implementations, a trained model 409b, e.g. a trained machine learning model trained and/or applied by the training unit 405b can be stored in a storage 407b, where the storage can include any suitable memory or other components as described herein, including volatile and/or non-volatile memory for storing a trained machine learning algorithm.


Any suitable dynamic machine learning model scheme can be generated or trained by the training unit 405b, including a neural network, convolutional neural networks, modular neural networks, etc. In one or more implementations, the dynamic model is a neural network. In one or more implementations, the dynamic model, e.g. neural network, can be trained using any suitable technique, including deep learning techniques, e.g. backpropagation, as described herein with respect to other embodiments or as otherwise suitable, with the output and backpropagation being in reference to fall prediction in relation to information from data sources 445b-475b.


In one or more implementations, as stated, the training unit 405b, alone or in coordination with the determination unit 406b, utilizes, determines, and/or applies one or more static techniques with respect to one or more data sources, e.g. data sources 445b-475b, as part of an overall training process for model 409b. In one or more implementations, an output of one or more static modules or components serves as an input to the neural network of the dynamic modules or components. In one or more implementations, the one or more static modules or components are configured to perform one or more static equations applied to information from some of the data sources 445b-475b, while the remaining data sources feed directly into the neural network. In one or more implementations, the one or more static modules or components determine the weights associated with information obtained from part of the data sources 445b-475b and those weights are static, where in one or more implementations, the one or more static components or modules perform more than one mathematical operation prior to generating an output to serve as an input to the dynamic modules or components, and where one or more weights of operations associated with the dynamic module or component, e.g. neural network, are variable. In one or more implementations, at least one equation associated with the static modules or components can be any suitable equation that correlates the likelihood of an event in relation to a data entry and/or the probability of an event occurring in relation to a data entry. In one or more implementations, the operations and/or equations of the static modules or components can be preset, in whole or in part, by a user based on experimentation on what type of equation, e.g. probabilistic equation and/or correlation equation, enhances the accuracy of an algorithm. In one or more implementations, the determination unit 406b can partially determine the operations associated with the static modules or components by providing an overall mathematical structure, e.g. as shown above and below with respect to Equation 1, that is derived from a preset grouping of probabilistic and/or correlation equations, where the accuracy of each equation is determined by one or more iterations of backpropagation during training in relation to actual fall events. In one or more implementations, the determination unit 406b can partially determine the equations that form the basis for the operations of the static modules or components and one or more users can set the basis for the same; e.g., the determination unit 406b can determine the overall mathematical structure and the one or more users can determine specific numbers, e.g. factors (multipliers, dividers, etc.) associated with the same.


In one or more implementations, as stated and implied above, the determination unit 406b can coordinate with the training unit 405b to determine which data has the highest degree of association with an individual falling or not falling, e.g. during backpropagation and/or other training operations, the determination unit 406b can log which types of data (and variations of the same) have the largest influence on whether or not an individual will suffer a fall; accordingly, in one or more implementations, a first series of training operations can be utilized without static operations to determine which data will be subject to one or more static models and/or adjustment factors.


In one or more implementations, the equation associated with one or more static modules or components can be as follows:





(Fall Count/Not Fall Count)−(Not Fall Count/Fall Count)   Equation 1


In one or more implementations, the “count” of Equation 1 is based on the number of times a particular data appears in entries of a table or matrix with other data in relation to multiple individuals, where “fall count” refers to a number of times a particular data is associated with a fall and “Not Fall Count,” refers to a number of times the same data is part of an entry not associated with a fall. In one or more implementations, the relational table or matrix can be generated by the training unit 405b. In one or more implementations, for each particular data in a row, Equation 1 determines one or more (static) weights associated with each particular data, and thereafter a transformation operation is performed by the training unit 405b. In one or more implementations, the transformation operation averages all of the weights associated with a particular row, and thereafter the average of the weights is inputted into the dynamic model, e.g. neural network. In one or more implementations, the table and/or matrix of data values, weights and/or average of weights can serve as the basis of a static model dictionary for use during training or configuration of the dynamic component or module and/or in application of the same. In one or more implementations, the static dictionary is used in final model training, e.g. first the dictionary is formed, and then training commences with the dynamic modules or components.
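By way of illustration only, the following is a minimal Python sketch of one reading of the above operations: per-entry fall and not-fall counts are derived from a relational table, Equation 1 produces a static weight for each entry, and the weights associated with a row are averaged into a single input value for the dynamic model. The function names, the epsilon guard against division by zero, and the example rows are hypothetical assumptions and not part of the disclosure.

from collections import Counter

def equation_1_weight(fall_count, not_fall_count):
    # Equation 1: (Fall Count / Not Fall Count) - (Not Fall Count / Fall Count).
    # A small epsilon guards against division by zero (an assumption; the
    # disclosure does not specify how zero counts are handled).
    eps = 1e-9
    return (fall_count / (not_fall_count + eps)) - (not_fall_count / (fall_count + eps))

def build_static_dictionary(rows):
    # rows: list of (items, event) pairs, where items is a list of data entries
    # (e.g. diagnosis codes) and event is "Fall" or "Not Fall".
    fall_counts, not_fall_counts = Counter(), Counter()
    for items, event in rows:
        (fall_counts if event == "Fall" else not_fall_counts).update(items)
    all_items = set(fall_counts) | set(not_fall_counts)
    # Static dictionary: one Equation 1 weight per data entry.
    return {item: equation_1_weight(fall_counts[item], not_fall_counts[item])
            for item in all_items}

def transform_row(items, static_dict):
    # Transformation operation: average the static weights of a row's entries to
    # form a single input for the dynamic model (e.g. neural network).
    weights = [static_dict.get(item, 0.0) for item in items]
    return sum(weights) / len(weights) if weights else 0.0

# Hypothetical example rows.
rows = [(["N39.0", "R52"], "Fall"), (["K21.9", "R52"], "Not Fall")]
static_dictionary = build_static_dictionary(rows)
print(transform_row(["N39.0", "R52"], static_dictionary))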


In one or more implementations, the determination unit 406b determines an adjustment factor (e.g. for modifying a score), e.g. another static model, to apply at the output of the dynamic protocol in order to generate a final risk score and can configure an adjustment module or component to perform the same. In one or more implementations, utilizing the techniques and operations as discussed herein, the determination unit 406b can determine the type of data that will be subject to the adjustment factor by determining which type of data correlates to the highest degree with respect to a fall event occurring (or not occurring). In one or more implementations, the type of data that can be subject to the adjustment factor is a fall history of one or more individuals on a temporal basis, e.g. the number of falls an individual has suffered within a predetermined time range. In one or more implementations, utilizing the techniques and operations as discussed herein, the determination unit 406b can also determine a mathematical equation that serves as the adjustment factor. In one or more implementations, as stated expressly or implicitly herein, both the type of data that will be subject to the adjustment factor and the equation associated with the same can be determined by experimentation.


In one or more implementations, Equation 2, as provided above and below, can serve as the basis of the adjustment factor:






Nf*2%   Equation 2


“Nf” stands for the number of fall counts within a predetermined time range associated with one or more individuals. For example, an initial fall prediction score outputted by the dynamic protocol (which includes inputs from one or more static protocols) can be multiplied by the number of times an individual has fallen within a predetermined time frame, and in turn multiplied by two percent to produce a final fall prediction score during training.


In one or more implementations, the determination unit 406b, in coordination with the training unit 405b, and in order to further enhance an accuracy of a final fall prediction score, can apply a sub-algorithm of conditional static equations to the initial fall prediction score. For example, one or more threshold operations can be applied to an initial score to determine what type of adjustment factor will be applied to the initial fall prediction. For example, in one or more implementations, the one or more threshold operations can be related to the value of the initial score, the number of falls associated with a particular individual, and/or the time frame associated with those falls. By way of a specific example, and according to one or more implementations, the application and operations of the adjustment factor or protocol can be as follows: If a fall event is within 30 days and an initial score is less than 90%, then the final score can be adjusted as follows: 90+(0.2*Nf on prior 30 days). If a fall event is within 30-90 days and the initial score is less than 80%, then 80+(0.2*Nf on prior 30-90 days). If a fall event is within 90-365 days and an initial score is less than 70%, then 70+(0.2*Nf counts on prior 90-365 days).
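The conditional adjustment protocol described above can be sketched as follows; this is an illustrative Python rendering only, and the ordering of the threshold checks and the behavior when no condition is met are assumptions not specified in the disclosure.

def adjust_initial_score(initial_score, nf_prior_30, nf_prior_30_90, nf_prior_90_365):
    # initial_score is the initial fall prediction score as a percentage (0-100);
    # nf_* are fall counts (Nf) within the stated prior time windows.
    if nf_prior_30 > 0 and initial_score < 90:
        # Fall event within 30 days and initial score below 90%.
        return 90 + (0.2 * nf_prior_30)
    if nf_prior_30_90 > 0 and initial_score < 80:
        # Fall event within the prior 30-90 days and initial score below 80%.
        return 80 + (0.2 * nf_prior_30_90)
    if nf_prior_90_365 > 0 and initial_score < 70:
        # Fall event within the prior 90-365 days and initial score below 70%.
        return 70 + (0.2 * nf_prior_90_365)
    # No condition met: the initial score is left unchanged (an assumption).
    return initial_score

print(adjust_initial_score(65.0, nf_prior_30=0, nf_prior_30_90=0, nf_prior_90_365=2))  # 70.4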


In one or more implementations, the training protocol 405b can train or configure the dynamic module, e.g. via backpropagation, with a feedback loop or iteration sequence that includes the adjustment factor, e.g. each iteration is after the adjustment factor or protocol is applied. In one or more implementations, the adjustment factor or protocol is determined by the determination protocol 406a and is used after training (e.g. during application of a trained algorithm or protocol), but it is not used during training.


In one or more implementations, once the training unit 405b and determination unit 406b have trained, configured and/or determined the various components and/or modules to make up a component or module that can make a fall prediction, the trained and/or configured components and/or modules can be stored in storage 407b, where storage 407b can include any suitable combination of memory components, including volatile and/or non-volatile memory, and any other suitable component for a processor to retrieve and execute the trained algorithm and its sub-protocols. In one or more implementations, the trained component is machine learning application component 408b, and can include machine learning model 409b (which in turn includes dynamic module 410b, static modules 410b′-410n′, and/or adjustment module 412b), where in one or more implementations, as expressly stated and/or implied above, dynamic module 410b is trained by training unit 405b, static modules 410b′-410n′ are determined by determination unit 406b and/or by experimentation, and adjustment module 412b is determined by determination unit 406b and/or by experimentation.


In one or more implementations, the machine learning model 409b can be applied to one or more data sources related to one or more individuals, e.g. a patient or resident of a nursing home, hospital, or other care facility, to provide a prediction as to whether or not an individual will sustain a fall. For convenience, data sources 445b-475b are used to describe some aspects of trained algorithm 408b, e.g. data sources 445b-475b include both the training data sets used to train the algorithm and separate data, including real time incoming data, for applying the component 408b to render a prediction, where there is no overlap between the data associated with a first group of individuals used to train and/or determine the various components and modules of 408b and the data set associated with one or more other individuals where the component 408b is expected to make a prediction. In one or more implementations, the algorithm can be applied to or intake data from entirely separate data sources, although the data categories can be the same.


In one or more implementations, one or more users 484a′, 484n′ can make a request using one or more computer devices 480a′, 480n′ and over network 420b to the fall prediction unit 401b such that the fall prediction unit 401b outputs one or more predictions 495a′, 495n′ as to whether or not an individual will suffer a fall. In one or more implementations, certain data, e.g. demographic data, activity data, MDS data, and/or vital change information related to the one or more individuals feeds directly into the dynamic module 410a′ (e.g. a neural network), and other types of data, e.g. diagnostic data and active medication data, feed into the one or more static modules 410b′-410n′. In one or more implementations, the data types that fed directly into the neural network and the data types that fed into the static modules during training correspond to the same input feed flow in application of the trained model. In one or more implementations, the static modules 410b′-410n′ utilize the static dictionary developed during training to generate weights and weight averages associated with the incoming data stream, and thereafter, output the processed results as inputs into the dynamic module 410a′ (e.g. neural network). In one or more implementations, the static modules 410b′-410n′ perform a matching operation that matches a data type, e.g. medication or diagnosis, associated with an individual where a fall prediction is sought to an entry in a static dictionary, and outputs a weight or weight average associated with the matched entry into the dynamic protocol (e.g. neural network).
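A minimal Python sketch of the matching operation might look as follows; the dictionary contents, the default value for unmatched entries, and the function name are hypothetical.

def match_static_inputs(individual_items, static_dictionary, default=0.0):
    # For each data type (e.g. a medication or diagnosis code) associated with
    # the individual, look up the weight or weight average stored in the static
    # dictionary during training, and pass it on as an input to the dynamic
    # module (e.g. neural network). Unmatched items fall back to a default
    # value, which is an assumption.
    return [static_dictionary.get(item, default) for item in individual_items]

# Hypothetical dictionary produced during training and an incoming data stream.
static_dictionary = {"N39.0": 1.80556, "R52": -0.23}
neural_network_inputs = match_static_inputs(["N39.0", "R52", "I10"], static_dictionary)
print(neural_network_inputs)  # [1.80556, -0.23, 0.0]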


In one or more implementations, the static modules 410b′-410n′ can perform a real time computation of the incoming data stream, e.g. construct one or more new tables or matrices with respect to the incoming data stream, compute one or more new static weights and one or more static weight averages from the incoming data, and apply the one or more averages as inputs to the dynamic module 410a′. In one or more implementations, an initial score is generated by the dynamic module 410a′ (which includes inputs from the one or more static modules 410b′-410n′), and thereafter a final score is outputted to the one or more users 484a′, 484n′. In one or more implementations, the adjustment module 412b is based on and/or configured to run a fall history equation (e.g. another static model) and/or fall history protocol; where in one or more implementations, the adjustment module 412b is based on Equation 2 and/or one or more threshold operations as discussed above with reference to training operations.


Embodiments are not limited in the above manner, and the above system is merely an exemplary embodiment for implementing one or more features of the present disclosure.



FIG. 5A illustrates an architecture of an algorithm 500a, e.g. a machine learning training algorithm, that can train an algorithm to predict a fall of an individual. In one or more implementations, the architecture includes any suitable software and/or hardware units necessary to carry out the logic and/or flow of operations of the architecture 500a. In one or more implementations, the architecture 500a is the basis for training fall risk algorithm 118 and can be stored in a memory of an overall computer system, such as memory 144 of computer system 100, and can be implemented by one or more computer processors, such as processor 142 of computer system 100. In one or more implementations, any other suitable system or component described herein can be configured to execute algorithm 500a. As shown, and pursuant to one or more implementations, algorithm 500a includes a machine learning algorithm or model 570a for training. In one or more implementations, the algorithm utilizes received data with respect to one or more individuals to train the algorithm 570a and, thereafter, other incoming data can be used to output one or more scores 590a that are an initial prediction of whether or not an individual will fall. The data associated with training can be demographic data 510a, diagnostic data 515a, medication data 520a, including active medication data, activity data 525a, MDS data 530a, and vital change information data 535a, where data 510a, 515a, 520a, 525a, 530a, and 535a can be as described above with reference to other implementations, where the data can be related to one or more individuals N (e.g. nursing home facility patients, hospital patients, etc.) as referenced with respect to other implementations as described above, and where the data sources can further be based in whole or in part on predetermined forms as described herein.


The machine learning algorithm 570a can be any suitable machine learning model, including but not limited to a static model, dynamic model, hybrid model, or any other suitable machine learning model.


As used herein, a "static" model or component refers to any one of: i) a pre-determined equation that receives an input of data, requires no training and produces a static factor that can adjust a weight associated with another model or component of a model, ii) a pre-determined equation that receives data, requires no training and produces a static factor that can adjust a score outputted by another model, iii) a pre-determined equation that receives data, requires no training and produces a static factor that can adjust or modify a score outputted by another model (including a dynamic model), iv) a statically trained machine learning model that receives data for training and produces a static, i.e. non-variable, factor that can adjust or affirmatively determine a weight associated with another model, where the adjusted weight remains unaltered during processing by the other model (even if other weights in the model are variable), v) a statically trained machine learning model that receives an input of data for training and produces a static, i.e. non-variable, factor that can adjust a score outputted by another model (including a dynamic model), or vi) a combination of the previously mentioned. In one or more implementations, a "statically trained model" is a model that can be trained using any suitable machine learning technique, but requires no further training once initially trained, e.g. after a certain number of iterations of any suitable machine learning techniques such as backpropagation.


As used herein, a “dynamic” model or component refers to any one of: i) a model that has one or more weights that are variable during application of the model, ii) a model that receives continuous training or updating as it is applied, or iii) a combination of the previously mentioned.


In one or more implementations, as shown, machine learning algorithm 570a is a dynamic model in the form of a neural network. In one or more implementations, algorithm 500a includes a static component 550a and a static component 560a, which can be determined using any suitable technique as described herein, including with respect to FIGS. 4A and 4B, and executed by any suitable component of a computer system, including but not limited to system 100, system 400a, and/or system 400b. In one or more implementations, static component 550a and static component 560a are predetermined equations based on the ratio of individual diagnosis/medication that contributed to a fall event, subtracted by the ratio of individual diagnosis/medication that contributed to a non-fall event. In one or more implementations, static component 550a and static component 560a are based on Equation 1 as provided for above. In one or more implementations, processor 142 can be configured to receive data 114 and execute the operations associated with the static components, where the static components 550a and 560a can be stored in memory 144 and then integrated, upon execution, as part of fall risk algorithm 118.


In one or more implementations, the dynamic machine learning algorithm 570a directly receives input from data 510a, 520a, 525a, 530a, and 535a. In one or more implementations, static components 550a and 560a process, respectively, data from data sources 515a and 520a. In one or more implementations, the static components 550a and 560a form a relational matrix or table associated with the data inputs from sources 515a and 520a, apply Equation 1 to each row of the table, and generate one or more static weights in association therewith; thereafter, for each particular data type, the weights are averaged and are stored as a static dictionary to be inputted into the dynamic model 570a for training.


In one or more implementations, the initial weights of the model 570a, prior to initializing training, can be randomly selected or can be based on a probabilistic and deviation technique from fall and non-fall events in relation to those data points. In one or more implementations, the initialized weights can include a calculated weight bias and/or calculated weights based on any suitable initialization technique.


In one or more implementations, once initial weights are set for the model 570a, the model training can commence. In one or more implementations, the dynamic model 570a, e.g. neural network, receives as a direct input data from data sources 510a, 525a, 530a, and 535a, and the outputs of static components 550a and 560a (which include the processing of data from sources 515a and 520a, e.g. static weight averages from the static dictionary associated therewith). In one or more implementations, the training of the model with the inputted data and outputs from the static components can include any suitable training techniques for training a dynamic model, including modulating and adjusting weights and biases in the input data, and in deeper levels of a deep learning algorithm (e.g., by backpropagation methods), such that the fall prediction becomes increasingly accurate. That is, the deviation between the target fall prediction score and the prediction from the algorithm as it is trained is minimized to an acceptable minimum value. For example, the model can be validated with a Root Mean Squared Error on a train/test ratio of 70:30 (i.e., 70% of the data is used to train the model and 30% of the data is used to test the model). In one or more implementations, processor 142 can be configured to receive data 114 and execute the operations indicated above to facilitate the training of machine learning algorithm 570a, and thereafter, machine learning algorithm 570a can be stored in memory 144 and then integrated, upon execution, as part of fall risk algorithm 118.
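For illustration, a minimal Python sketch of the 70:30 train/test split and Root Mean Squared Error validation described above is given below; the random data, the scikit-learn MLPRegressor standing in for the dynamic model, and its architecture are assumptions and not part of the disclosure.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

# Hypothetical feature matrix (static-weight averages plus directly fed data)
# and labels (1.0 = fall event, 0.0 = no fall event).
rng = np.random.default_rng(0)
X = rng.random((1000, 8))
y = (rng.random(1000) > 0.7).astype(float)

# 70:30 train/test ratio: 70% of the data trains the model, 30% tests it.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# A small neural network stands in for the dynamic model 570a.
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
model.fit(X_train, y_train)

# Validate with Root Mean Squared Error on the held-out 30%.
rmse = mean_squared_error(y_test, model.predict(X_test)) ** 0.5
print(f"RMSE: {rmse:.3f}")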


In one or more implementations, during training, the machine learning model 570a produces one or more initial scores related to fall prediction, e.g. score 590a. In one or more implementations, the one or more initial scores 590a are adjusted by an adjustment component 585a. In one or more implementations, the adjustment component 585a can be a static component, where the static component 585a is a predetermined equation or factor that adjusts the score outputted by machine learning model 570a. In one or more implementations, one of the most important indicators as to whether or not an individual will fall in the future is whether that individual has fallen in the past and the duration since the last fall. As such, in one or more implementations, the adjustment component 585a receives fall history data 580a, e.g. based on a fall history of one or more individuals, and applies an equation, e.g. Equation 2 as provided above, to the initial score 590a to determine or generate a final score 595a. In one or more implementations, the adjustment component utilizes an equation that multiplies the number of days Nd since the last time an individual fell by a predetermined percentage factor, where the percentage factor can be determined by applying probabilistic and deviation techniques on data associated with the falls of other individuals. In one or more implementations, an equation in conjunction with thresholding operations, as discussed above with reference to FIG. 4A and FIG. 4B, is used. In one or more implementations, since the adjustment factor produced or generated by the adjustment component is based on a static equation, the adjustment component is itself a static component that does not require training. In one or more implementations, processor 142 can be configured to receive data 114 and execute the operations indicated above to produce the adjustment component 585a, where the adjustment component 585a can be stored in memory 144 and then integrated, upon execution, as part of fall risk algorithm 118.



FIG. 5B illustrates an architecture of an algorithm 500b, e.g. a trained algorithm with one or more machine learning components, that predicts whether or not an individual will sustain a fall. In one or more implementations, the architecture includes any suitable software and/or hardware units necessary to carry out the logic and/or flow of operations of the architecture 500b. In one or more implementations, the architecture 500b is the basis for a fall risk algorithm 118 and can be stored in a memory of an overall computer system, such as memory 144 of computer system 100, and can be implemented by one or more computer processors, such as processor 142 of computer system 100. In one or more implementations, any other suitable system or component described herein can be configured to execute algorithm 500b. In one or more implementations, various components of algorithm 500b receive data associated with one or more individuals, e.g. demographic data 510b, diagnostic data 515b, medication data 520b, including active medication data, activity data 525b, MDS data 530b, and vital change information data 535b, where data 510b, 515b, 520b, 525b, 530b, and 535b can be as described above with reference to other implementations, where the data can be related to one or more individuals N (e.g. nursing home facility patients, hospital patients, etc.) as referenced with respect to other implementations as described above, and where the data sources can further be based in whole or in part on predetermined forms as described herein.


In one or more implementations, the algorithm 500b includes trained model 568b, which can be any suitable machine learning model, including but not limited to a static model, dynamic model, hybrid model, or any other suitable machine learning model.


In one or more implementations, as shown, trained model 568b includes a dynamic component 575b in the form of a neural network and one or more static components 570b. In one or more implementations, the one or more static components 570b process data from one or more data sources 510b-535b and provide an output to the dynamic component 575b. In one or more implementations, the static components 570b include the static dictionaries developed with respect to FIG. 5A and are configured to apply a matching operation that matches incoming data, e.g. medical or diagnostic data associated with 515b and/or 520b, to entries of the same type in the static dictionaries developed with respect to FIG. 5A; and thereafter, the corresponding weight or weight averages associated with that data entry are inputted into the dynamic component (e.g. neural network) 575b.


In one or more implementations, the one or more static components 570b perform real time calculations, and include predetermined equations based on the ratio of individual diagnosis/medication that contributed to a fall event, subtracted by the ratio of individual diagnosis/medication that contributed to a non-fall event. In one or more implementations, one or more static components 570b are based on Equation 1 as provided for above. In one or more implementations, one or more static components 570b process one or more of the incoming data sources, generate a relational matrix and/or table with respect to the incoming data stream, and generate one or more weights for each data type in relation to a fall event. In one or more implementations, the one or more static components 570b generate an average of weights with respect to each data type and output the average into the dynamic model 575b.


In one or more implementations, the one or more static components 570b only process data from data sources 515b and 520b and the remaining data sources, e.g. 510b, 525b, etc., feed directly into the neural network 575b, and the overall model 568b produces one or more initial scores 590b, where the one or more initial scores 590b relate to whether or not an individual will sustain a fall.


In one or more implementations, the one or more initial scores 590b are adjusted by an adjustment component 585b. In one or more implementations, the adjustment component 585b can be a static component, where the static component 585b is a predetermined equation or factor that adjusts the score outputted by machine learning model 568b. In one or more implementations, one of the most important indicators as to whether or not an individual will fall in the future is whether that individual has fallen in the past and the duration since the last fall. As such, in one or more implementations, the adjustment component 585b receives fall history data 580b, e.g. based on a fall history of one or more individuals, and applies an equation, e.g. Equation 2 as provided above, to the initial score 590b to determine or generate a final score 595b as to whether or not an individual will likely suffer a fall. In one or more implementations, the adjustment component utilizes an equation that multiplies the number of days Nd since the last time an individual fell by a predetermined percentage factor, where the percentage factor can be determined by applying probabilistic and deviation techniques on data associated with the falls of other individuals. In one or more implementations, an equation in conjunction with thresholding operations, as discussed above with reference to FIG. 4A and FIG. 4B, is used. In one or more implementations, processor 142 can be configured to receive data 114 and execute the operations indicated above with respect to the trained machine learning model 568b.


In one or more implementations, as discussed and implied above, the use of a hybrid model with static and dynamic components in an overall algorithm, e.g. algorithm 500b, increases the accuracy of the overall algorithm, which is an overall improvement to a system, e.g. system 100, that employs the algorithm. Since certain factors are identified as having a higher statistical probability or significance with respect to whether or not an individual will fall, and other factors are identified as having a more variable nature in relation to a prediction, e.g. their relevance is more dependent on an overall profile of an individual, an algorithm that employs static weights for the former and dynamic or variable weights for the latter can produce a more accurate and efficient prediction as to whether or not an individual will fall. In one or more implementations, where an adjustment factor is static and based on data determined to be of particular significance, e.g. previous fall history, the application of a factor based on such data, whether alone or in combination with a hybrid model (which increases accuracy by itself), can also increase the accuracy and efficiency of the output of the overall algorithm. Moreover, as stated and implied above, in one or more implementations, the use of both an adjustment factor and a hybrid machine learning model in an overall algorithm can have a compounding effect as to the accuracy of a score. Finally, since the total score 116 associated with system 100 can be based in whole or in part on the prediction score produced by algorithm 500b (e.g. when algorithm 118 is based on algorithm 400), the algorithm 500b can enhance the prophylactic effect of allocating resources to individuals likely to fall in accordance with the implementations of the present disclosure.


In one or more implementations, as stated and implied herein, the data that can be used to train the machine learning algorithm and the data used for application of the trained algorithm to make a prediction can be based on predetermined forms with a score or value range associated therewith, including with respect to the operations associated with FIGS. 5A and 5B.


Beginning in June of 2020, Applicant conducted confidential experiments with multiple healthcare/nursing-home facilities to determine the utility, accuracy, deploy-ability, and suitability of prototype systems and methods that predict a fall. As a result of these confidential activities and further development, additional techniques, systems, features and methods were discovered that further enhance an accuracy of a system or method to determine a fall, in addition to increasing the operability and efficiency of the same in application. FIGS. 5C-5D disclosed below, in addition to other embodiments or implementations disclosed herein, reflect some of these additional techniques, systems, features and methods.



FIG. 5C illustrates an architecture of an algorithm 500c, e.g. a trained algorithm with one or more machine learning components, that predicts whether or not an individual will sustain a fall. In one or more implementations, the architecture includes any suitable software and/or hardware units necessary to carry out the logic and/or flow of operations of the architecture 500c. In one or more implementations, the architecture 500c is the basis for a fall risk algorithm 118 and can be stored in a memory of an overall computer system, such as memory 144 of computer system 100, and can be implemented by one or more computer processors, such as processor 142 of computer system 100. In one or more implementations, any other suitable system or component described herein can be configured to execute algorithm 500c. In one or more implementations, various components of algorithm 500c receive data associated with one or more individuals, e.g. diagnostic data 505c, active and new medication data 510c, ADL, MDS, and Recent Vitals data 515c, balance data 520c, admission data 525c (e.g. admission to a hospital or care facility within a specified time frame, e.g. 24-48 hours), mobility data 530c (e.g. if a wheelchair, mechanical lift, or other assistance is required), psychiatric medication data 535c (e.g. where in one or more implementations, as discussed below, this can be a separate data category from active medication data 510c), vitality data 545c, behavioral data 547c (e.g. evidence of specific erratic behavior such as shaking), continence data 549c (e.g. changes in continence), and fall history data 580c, where data 505c-549c and 580c can be as described above with reference to other implementations, where the data can be related to one or more individuals N (e.g. nursing home facility patients, hospital patients, etc.) as referenced with respect to other implementations as described above, and where the data sources can further be based in whole or in part on predetermined forms as described herein.


In one or more implementations, algorithm 500c includes component 501c, which in turn includes static model 550c, static model 560c, dynamic model 570c and adjustment component 585c. Component 501c functions the same as or substantially the same as the various components described above with respect to 500a and 500b, including training a neural network, e.g. 570c, developing and applying static dictionaries using static models, e.g. 550c and 560c, generating an initial score 590c based on the processing of the dynamic and static models, e.g. 550c, 560c, and 570c, and adjusting the initial score 590c using an adjustment component 585c that receives fall history data 580c as an input to produce a second score 595c. In one or more implementations, unlike implementations with respect to FIG. 5A and FIG. 5B, the final score of 500c is based on additional processing and computations resultant from model 571c to generate final score 599c. In one or more implementations, the data sources associated with component 501c can be in whole or in part as shown with respect to FIG. 5A and FIG. 5B, where in other implementations, as shown, the data ingested by the various components of 501c is more limited, e.g. 505c, 510c, and 515c, where the other data sources are processed by model 571c. In one or more implementations, data sources 505c, 510c, and 515c can be considered a first plurality of data because said sources are associated with 501c, and data sources 520c-549c can be considered a second plurality of data associated with model 571c, where the data sources can actually be from a single source or multiple sources, with overlap as to the origin. In one or more implementations, data sources 520c-549c are specific clinical sources with data that is specific to individuals where a fall prediction is sought, where the data sources can be associated with predetermined weights (as discussed below), and where, with respect to model 571c, no training occurs except as to the determination and application of the predetermined weights.


In one or more implementations, as stated above and implied herein, experimentation and testing at one or more facilities have determined that utilizing an additional model, e.g. 571c, with specific data sources, e.g. real time data, that are specific to a particular individual or individuals can further generate a more accurate and tailored fall prediction score, which in turn can be used to provide a more appropriate prophylactic measure, e.g. as discussed herein (e.g. with respect to FIG. 2), for one or more users. In one or more implementations, model 571c is a static model or series of static models determined by computational component or unit 572c and formula logic 573c. In one or more implementations, formula logic 573c applies one or more of a probabilistic, correlation, or other technique as discussed herein in relation to data sources 520c-549c such that data sources with the highest correlation to a fall receive a higher predetermined weight than other sources. In one or more implementations, the type of data that is most important with respect to whether or not an individual will suffer a fall can be determined by a user through experimentation and/or by applying a correlation and/or probabilistic function to each individual data type and determining which data types correlate the highest with whether or not an individual suffers a fall. In one or more implementations, once the data sources are ranked in terms of correlative value, formula logic 573c determines a weight associated with each data source, where the weights can be based in whole or in part on any suitable function, including iteratively increasing the weight value based on a specific factor, or doing so partially with one iterative factor with respect to some data sources and another with respect to other data sources. For example, if it is determined that a change or lack thereof in behavior is the least important, and the next most important is a continence change, then the weight, e.g., Wc, for continence change can be Wb (the weight for behavioral changes) plus a factor N, and the next most important data source, e.g. nutritional changes (e.g. lack of eating), can be Wc+N. In order to optimize accuracy, "N" can be constant between some data sources, but can change if a data source is particularly important, e.g. the weight for a data source can be Wp+N+N2, where Wp is the weight for the data source immediately preceding in importance, and N2 is an additional additive to the value. In one or more implementations, factorial increases, e.g. N, 2N, 3N, etc. can be used in whole or in part. In one or more implementations, the determination of which data category receives a particular weight value can be based on data associated with other individuals that have sustained a fall, where the data used to make that determination can be of the same type as 520c-549c. In one or more implementations, once the weights are determined, the computational component 572c can generate a clinical score by a summation of the weights for a particular individual, e.g. using the following equation:





Clinical Score=W1*(balance change)+W2*(admitted in the last 24 hours)+W3*(assistance required in the last 24 hours)+W4*(psychotropic medication)+W5*(nutrition)+W6*(evidence of pain)+W7*(vitals out of range)+W8*(behavior changes)+W9*(continence change) . . . other factors.   Equation 3


The individual data categories, e.g. balance change, can be a "1" or "0" depending on whether a change in the particular category is detected, or they can be a number based on threshold equations, e.g. as discussed above in reference to predetermined forms, where the initial number can be higher than one, e.g. where a particular medication is of a certain potency, a psychotic condition is of a more significant severity, etc. In one or more implementations, once the Clinical Score is determined, the formula logic 573c can apply a comparative scheme to determine which scheme will be used to compute a final score. If both the Clinical Score produced by the computational component 572c and the second score or fall risk model score 595c produced by component 501c exceed a certain threshold or value T, e.g. 50, then the following scheme is applied to generate final score 599c (e.g. by the computational component 572c):





Final Score=(Greater of Two Scores[i.e. Clinical and Second Score]+(Lesser of Two Scores[i.e. Clinical and Second Scores]*reducing scaling factor))/second scaling factor   Equation 4


The greater of the two scores is added to the lesser of the two scores adjusted by a scaling factor, where the lesser of the two scores is multiplied by any suitable factor to reduce its value (e.g. based on a product, difference or a product and difference, etc.), and where the sum is divided by another scaling factor.


If the two scores are below the certain threshold T, and are within a certain range R1, e.g. the difference is small, then the final score 599c is the higher of the two scores (e.g. where the computation is performed by the computational component 572c).


If the two scores are beneath the certain threshold T, but the difference is greater than R1, then the final score 599c is an average of the scores (e.g. where the computation is performed by the computational component 572c).
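To illustrate how Equation 3 and the comparative scheme could fit together, the following Python sketch computes a clinical score as a weighted sum of indicators and then selects among the three cases described above; the threshold T, range R1, the two scaling factors, the example weights and indicators, and the handling of the unspecified mixed case (one score above T, one below) are assumptions.

def clinical_score(indicators, weights):
    # Equation 3 sketch: weighted sum of clinical indicators, where each
    # indicator is typically 1 or 0 (change detected or not) or a larger
    # threshold-based value.
    return sum(weights[category] * indicators.get(category, 0) for category in weights)

def final_score(clinical, second_score, threshold_t=50.0, range_r1=10.0,
                reducing_scaling_factor=0.5, second_scaling_factor=1.5):
    greater, lesser = max(clinical, second_score), min(clinical, second_score)
    if clinical > threshold_t and second_score > threshold_t:
        # Equation 4: greater score plus the lesser score reduced by a scaling
        # factor, divided by a second scaling factor.
        return (greater + lesser * reducing_scaling_factor) / second_scaling_factor
    if clinical < threshold_t and second_score < threshold_t:
        if abs(clinical - second_score) <= range_r1:
            # Both below T and close together: take the higher of the two.
            return greater
        # Both below T but far apart: take the average.
        return (clinical + second_score) / 2.0
    # Mixed case (one score above T, one below) is not specified above; falling
    # back to the greater score here is an assumption for this sketch.
    return greater

# Hypothetical weights ordered by importance and binary change indicators.
weights = {"balance change": 20, "admitted in the last 24 hours": 18,
           "psychotropic medication": 16, "continence change": 10}
indicators = {"balance change": 1, "psychotropic medication": 1}
cs = clinical_score(indicators, weights)
print(cs, final_score(cs, second_score=72.0))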


In one or more implementations, since the algorithmic features of algorithm 500c include both a score based on a fall prediction stemming from whether previous falls have occurred and the likelihood that a fall will occur as a result therefrom, e.g. second score 595c, and a clinical factor score produced by computation unit 572c that is based on a scheme that accounts for clinical factors unique to the subject for which a score is to be determined, a more accurate result is generated, where said accurate result can be used to provide a superior prophylactic measure in relation to a potential fall for a patient or resident of a care facility or hospital.


In one or more implementations, as stated and implied herein, the data associated with FIG. 5C can also be based on predetermined forms to enhance accuracy and efficiency in relation to said score.



FIG. 6A illustrates an architecture 600a for training a machine learning algorithm according to one or more implementations of the present disclosure. In one or more implementations, the architecture includes any suitable software and/or hardware units necessary to carry out the logic and/or flow of operations of the architecture 600a. In one or more implementations, the architecture can form the basis for one or more, or all, of the operations associated with machine learning training protocol 405a and/or determination protocol 406a. In one or more implementations, the architecture 600a can serve as the logical or operational architecture for machine learning training unit 405b and/or determination unit 406b. In one or more implementations, architecture 600a can form the basis for training fall risk algorithm 118 and can be stored in a memory of an overall computer system, such as memory 144 of computer system 100, and can be implemented by one or more computer processors, such as processor 142 of computer system 100. In one or more implementations, any other suitable system or component described herein can be configured to execute the architecture (e.g. as an algorithm) of 600a.


In one or more implementations, algorithmic architecture 600a includes a computational unit 601B, which further includes a static transformer 608A, a static transformer 614A, a scaler 620A, an encoder 622A, a neural network 624A, and a fall adjustment component 628A. In one or more implementations, static transformers 608A, 614A include one or more static operations 610A, 616A, where static operations 610A, 616A can include one or more operations based on Equation 1. In one or more implementations, static transformers 608A, 614A intake data from data sources 602A, 602B, respectively. In one or more implementations, the architecture 600a includes a pre-processing unit 603A for performing pre-processing operations on data associated with one or more data sources.


In one or more implementations, data source 602A is data related to medical diagnoses associated with one or more individuals. In one or more implementations, data source 602B is data related to medication information, e.g. active medication, associated with one or more individuals. In one or more implementations, the medical and diagnostic data can be collected for a specified time period, e.g. for fourteen days prior with respect to each of the one or more individuals associated with the data. In one or more implementations, a user (as shown with reference to other implementations) can enter data into the data sources (e.g. databases) associated with 602A and 602B within the specified time range and in one or more implementations, the pre-processing unit 603A can perform a filtering operation, e.g. filter out all data outside the time range (e.g. within fourteen days).


In one or more implementations, static transformers 608A, 614A include relational units 611A, 617A, respectively, where relational units 611A, 617A generate tables associating fall and non-fall events in relation to the respective data sources feeding static transformers 608A, 614A. In one or more implementations, static operations 610A, 616A intake the data from their respective data sources, apply Equation 1, and in coordination with the relational units 611A, 617A, respectively, generate tables with static weights associated with each row of a table, and in relation to a particular data entry; thereafter, the average of the weights of each instance is utilized by the static transformers 608A, 614A to be used in training neural network 624A. An example of these operations can be as follows: a portion of diagnostic data can be categorized into the following Table 4:











TABLE 4

Id    Diagnosis                                               Event

1     Z96.642, N39.0, R52, I10, F32.9, E78.5, N39.0, R25.1    Fall
2     K56.699, N39.0, K59.00, K21.9, R52                      Not Fall
3     N39.0, R52, I10, F32.9, E78.5                           Fall









In one or more implementations, with respect to diagnostic data 602A, the diagnostic data source 602A can include fall information for one or more individuals and/or another data source (as shown with reference to other implementations) can contain the fall or not fall information. In one or more implementations, the relational unit 611A generates one or more rows with multiple data entries "Z96.642," "N39.0," "R52," "I10," "F32.9," etc., where each particular row of data, as shown, is a representation of diagnostic information. In one or more implementations, as stated and implied above, the relational unit 611A can form this relational table or matrix. In one or more implementations, the static operations 610A can be applied as follows, with a specific example being with respect to "N39.0": N39.0 appears two times in the Fall entry of Id 1, and the sequence contains 8 diagnoses, so the row weight for N39.0 is 2/8=0.25 for Fall; N39.0 appears one time in the Not Fall entry of Id 2, and the sequence contains 5 diagnoses, so the row weight for N39.0 is 1/5=0.2 for Not Fall; N39.0 appears one time in the Fall entry of Id 3, and the sequence contains 5 diagnoses, so the row weight for N39.0 is 1/5=0.2 for Fall; and the sum of the row weights for N39.0 on Fall entries resolves as follows: (0.25+0.2)=0.45, with the same for Not Fall being 0.2. With the initial weights computed, application of Equation 1 for a subsequent weight result is as follows: (0.45/0.2)−(0.2/0.45)=1.80556. Accordingly, the static weight for N39.0 is 1.80556. (It is noted that in one or more implementations, the sign of the calculation can also determine whether the particular diagnosis leans towards Fall or Not Fall, e.g. if the sign is negative, it is leaning towards Not Fall; if the sign is positive, it is leaning towards Fall; and if the weight is 0 it is neutral). Thereafter, each weight as calculated using the above process can be saved in a static model dictionary, e.g. static model dictionary 612A, which can be used in training the neural network 624A. In one or more implementations, prior to initializing training, each sequence will be transformed by replacing each entry with its weight from the static model dictionary, and the average of those weights is used in training the neural network 624A. For example, Z96.642, N39.0, R52, I10 will be transformed as follows: (0.1+1.80556+(−0.23)+0.2)/4=0.46889. In one or more implementations, the same process of operations and static dictionary formation can be used by static operations 616A and relational unit 617A to construct static dictionary 618A with respect to medical data provided by data source 602B. In various implementations, a static dictionary can be formed utilizing a combination of data from 602A and 602B and utilizing the above operations.
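To make the arithmetic above concrete, the following Python sketch reproduces the N39.0 computation from Table 4 and the subsequent sequence transformation; the helper name row_fraction is hypothetical, and the three non-N39.0 dictionary weights (0.1, −0.23, 0.2) are taken from the example above.

def row_fraction(code, sequence):
    # Fraction of a row's diagnosis sequence made up of the given code,
    # e.g. N39.0 appears twice in the 8-code sequence of Id 1 -> 2/8 = 0.25.
    return sequence.count(code) / len(sequence)

# Table 4 rows: (diagnosis sequence, event).
rows = [
    (["Z96.642", "N39.0", "R52", "I10", "F32.9", "E78.5", "N39.0", "R25.1"], "Fall"),
    (["K56.699", "N39.0", "K59.00", "K21.9", "R52"], "Not Fall"),
    (["N39.0", "R52", "I10", "F32.9", "E78.5"], "Fall"),
]

code = "N39.0"
fall_sum = sum(row_fraction(code, seq) for seq, event in rows if event == "Fall")          # 0.25 + 0.2 = 0.45
not_fall_sum = sum(row_fraction(code, seq) for seq, event in rows if event == "Not Fall")  # 0.2

# Equation 1 applied to the summed row weights.
static_weight = (fall_sum / not_fall_sum) - (not_fall_sum / fall_sum)
print(round(static_weight, 5))  # 1.80556

# Sequence transformation prior to training: replace each code with its
# dictionary weight and average the result.
static_model_dictionary = {"Z96.642": 0.1, "N39.0": static_weight, "R52": -0.23, "I10": 0.2}
sequence = ["Z96.642", "N39.0", "R52", "I10"]
print(round(sum(static_model_dictionary[c] for c in sequence) / len(sequence), 5))  # 0.46889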


In one or more implementations, as shown, the architecture 600a includes data sources 602C and 602D, where data source 602C includes MDS information, ADL information, demographic information, and age information of one or more individuals. In one or more implementations, the MDS information can be filtered (e.g. by the pre-processing unit 603A) based on a temporal quality, e.g. only MDS information within a certain time frame is collected. In one or more implementations, MDS information can contain an ordinal scale and natural ranking (by virtue of its nature), and can be of an integer nature, which can further enhance the accuracy and efficiency of a model being trained utilizing such information, as pre-processing and natural language processing operations can be minimized before and after the training process in relation therewith. In one or more implementations, where the data source 602C is in whole or in part based on information from medical, nursing home, or assisted care facilities, the values of data source 602C can be set as follows by the pre-processing unit 603C: if a resident refused, −99 is used as a value; if the value is null, −1 can be used as a value; and if the value is unknown, −2 can be used as a value; where in one or more implementations, applying this categorization to the data enhances the efficiency of the training process and/or increases the accuracy of a machine learning model to be trained by this data, e.g. neural network 624A, by ensuring all relevant data can be processed, and processed in an efficiently recognizable form (e.g. integer nature).
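A brief Python sketch of this value categorization is shown below; the sentinel strings used to represent refused and unknown entries in the raw data are assumptions, while the −99, −1, and −2 codes mirror the mapping described above.

def encode_mds_value(raw_value):
    # Map special MDS-style entries to the integer codes described above so
    # every entry reaches the model in an integer-like, recognizable form.
    if raw_value == "REFUSED":
        return -99   # resident refused
    if raw_value is None:
        return -1    # null value
    if raw_value == "UNKNOWN":
        return -2    # unknown value
    return int(raw_value)  # ordinary ordinal/integer MDS value

print([encode_mds_value(v) for v in [3, "REFUSED", None, "UNKNOWN"]])  # [3, -99, -1, -2]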


In one or more implementations, the ADL data of data source 602C can also be restricted to a particular temporal range, e.g. within 14 days, where the pre-processing unit 603C can filter out data outside this range prior to initiating model training, and where the associated assist categories are within a 14-day time period. In one or more implementations, repeated entries for a particular assist category of the ADL data, e.g. on a particular day, can be considered as a count of "1" for that day, with missing values being replaced with "0."


In one or more implementations, data source 602C can include information related to the vitals of one or more individuals. In one or more implementations, the vital information is filtered by the pre-processing unit 603C and/or pre-set by a user to be within an average deviation of the last three entries for each one of the one or more individuals, with missing values being replaced with a "0."


In one or more implementations, data source 602D can include information related to the demographics and age of an individual. In one or more implementations, when an age or gender of an individual is unknown, the value associated therewith is set by the pre-processing unit 603C and/or a user to “0” or “UNKNOWN.”


In one or more implementations, all of the data of data source 602C, after pre-processing by a user and/or pre-processing unit 603C, can be inputted into a suitable scaler, e.g. 620A, for more efficient processing in relation to training neural network 624A. For example, a min/max scaler (0,1) can be applied to further enhance the processing speed of training the neural network 624A, and ultimately improve the accuracy of the trained neural network 624A. Similarly, in one or more implementations, an encoding scheme or encoder, e.g. 622A, such as One Hot Encoding, can be applied to the gender and demographic data associated with data source 602D to achieve the same aim.
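The scaling and encoding steps can be sketched as follows; the example feature values, category labels, and the choice of concatenating the outputs are illustrative assumptions.

import numpy as np
from sklearn.preprocessing import MinMaxScaler, OneHotEncoder

# Hypothetical numeric features from data source 602C (e.g. MDS/ADL/vitals values).
numeric_features = np.array([[3.0, 120.0], [-1.0, 135.0], [5.0, 110.0]])
scaled = MinMaxScaler(feature_range=(0, 1)).fit_transform(numeric_features)

# Hypothetical gender/demographic categories from data source 602D.
categories = np.array([["F"], ["M"], ["F"]])
encoded = OneHotEncoder().fit_transform(categories).toarray()

# The scaled and one-hot encoded blocks are concatenated as inputs to the
# neural network 624A.
neural_network_inputs = np.hstack([scaled, encoded])
print(neural_network_inputs.shape)  # (3, 4)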


In one or more implementations, the outputs of the static transformer 608A, static transformer 614A, scaler 620A, and encoder 622A are inputted into the neural network 624A. Any suitable technique as discussed herein, e.g. deep learning techniques (e.g. backpropagation, etc.), can be used to train the neural network with the received inputs. In one or more implementations, neural network 624A can be initialized with a calculated bias and class weights. In one or more implementations, the neural network 624A can be trained with 100 epochs and a batch size of 512. In one or more implementations, the neural network 624A can be trained offline or online. In one or more implementations, the neural network 624A can be trained offline with a sample size of approximately 2 million, achieving an accuracy of 0.86 and an F1-score of 0.86 on the out-of-sample (unseen) dataset(s) (as associated with each of the relevant data sources).
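For illustration, the paragraph above could correspond to a Keras-style training loop such as the following; the use of TensorFlow/Keras, the network architecture, the synthetic data, and the particular bias and class-weight calculations are assumptions, with only the 100 epochs and batch size of 512 taken from the description above.

import numpy as np
from tensorflow import keras

# Hypothetical training data: rows of static-weight averages, scaled numeric
# features and one-hot encodings; labels are 1 (fall) / 0 (no fall).
rng = np.random.default_rng(0)
X = rng.random((5000, 20)).astype("float32")
y = (rng.random(5000) > 0.9).astype("int32")

# Class weights and an output-bias initializer counteract the imbalance between
# fall and non-fall examples (one common scheme; the exact calculation is an
# assumption).
positives = int(y.sum())
negatives = len(y) - positives
class_weight = {0: len(y) / (2 * negatives), 1: len(y) / (2 * positives)}
initial_bias = np.log(positives / negatives)

model = keras.Sequential([
    keras.layers.Input(shape=(20,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid",
                       bias_initializer=keras.initializers.Constant(initial_bias)),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# 100 epochs with a batch size of 512, as described above.
model.fit(X, y, epochs=100, batch_size=512, class_weight=class_weight, verbose=0)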


In one or more implementations, the computational unit 601b includes a fall adjustment component 628A. In one or more implementations, the fall adjustment component 628A receives data from data source 602E, where data source 602E includes fall history information associated with the one or more individuals associated with data sources 602A-602D. In one or more implementations, the fall adjustment component receives data related to fall history from other data sources, e.g. in implementations where the fall history of the one or more individuals is contained in one or more of data sources 602A-602D. In one or more implementations, the fall adjustment component 628A applies Equation 2 to fall history information associated with the one or more individuals and applies the output of Equation 2 to an initial score or output 626A generated by the neural network 624A to generate a final score 630A related to whether or not an individual will sustain a fall. In one or more implementations, the adjustment component 628A applies one or more additional thresholding operations and computational operations to generate the adjustment factor. For example, one or more threshold operations can be applied to an initial score to determine what type of adjustment factor will be applied to the initial fall prediction. For example, in one or more implementations, the one or more threshold operations can be related to the value of the initial score, the number of falls associated with a particular individual, and/or the time frame associated with those falls. By way of a specific example, and according to one or more implementations, the application and operations of the adjustment factor or protocol can be as follows: If a fall event is within 30 days and an initial score is less than 90%, then the final score can be adjusted as follows: 90+(0.2*Nf on prior 30 days). If a fall event is within 30-90 days and the initial score is less than 80%, then 80+(0.2*Nf on prior 30-90 days). If a fall event is within 90-365 days and an initial score is less than 70%, then 70+(0.2*Nf counts on prior 90-365 days).


In one or more implementations, the training of neural network 624A incorporates the results of the adjustment component 628A (e.g. during application of deep learning techniques), and in one or more implementations the results of the adjustment component are not used.



FIG. 6B illustrates an architecture 600B for applying a (trained) algorithm to predict whether one or more individuals will suffer a fall according to one or more implementations of the present disclosure. In one or more implementations, the architecture includes any suitable software and/or hardware units necessary to carry out the logic and/or flow of operations of the architecture 600b. In one or more implementations, the architecture can form the basis for one or more, or all, of the operations associated with machine learning application algorithm 408a. In one or more implementations, the architecture 600b can serve as the logical or operational architecture for machine learning application component 408b. In one or more implementations, architecture 600b can form the basis for fall risk algorithm 118 and can be stored in a memory of an overall computer system, such as memory 144 of computer system 100, and can be implemented by one or more computer processors, such as processor 142 of computer system 100. In one or more implementations, any other suitable system or component described herein can be configured to execute the architecture (e.g. as an algorithm) of 600b.


In one or more implementations, algorithmic architecture 600b includes a computational unit 601b, which further includes a static transformer 608B, a static transformer 614B, a neural network 624B, and a fall adjustment component 628B. In one or more implementations, computational unit 601b includes one or more components trained and/or determined by architecture 600A; for example, a trained neural network 624B stemming from the training of neural network 624A, and static dictionaries 612B, 618B (stemming from 612A, 618A, respectively). In one or more implementations, static transformers 608B, 614B include one or more static operations 610B, 616B, where static operations 610B, 616B can include one or more operations based on Equation 1. In one or more implementations, static transformers 608B, 614B intake data from data sources 602A′, 602B′, respectively.


In one or more implementations, data source 602A′ is data related to medical diagnoses associated with one or more individuals. In one or more implementations, data source 602B′ is data related to medication information, e.g. active medication, associated with one or more individuals. In one or more implementations, the data sources 602A′ and 602B′ are data sources associated with individuals distinct from the individuals associated with the data sources used for training, e.g. the one or more individuals associated with architecture 600A. In one or more implementations, the data of architecture 600A overlaps with the data of architecture 600B. In one or more implementations, no pre-processing of data occurs with respect to 602A′ and 602B′. In one or more implementations, pre-processing occurs as with respect to FIG. 6A, e.g. the medical and diagnostic data can be collected for a specified time period, e.g. for fourteen days prior with respect to each of the one or more individuals associated with the data. In one or more implementations, a user (as shown with reference to other implementations) can enter data into the data sources (e.g. databases) associated with 602A′ and 602B′ within the specified time range, and in one or more implementations, a pre-processing unit (as shown with reference to other implementations) can perform a filtering operation, e.g. filter out all data outside the time range (e.g. within fourteen days).


In one or more implementations, the static transformers 608B, 614B are the static transformers 608A, 614A, and the dictionaries 612B and 618B are the dictionaries 612A and 618A, respectively, developed as outlined with respect to FIG. 6A. In one or more implementations, static transformers 608B and 614B intake data from data sources 602A′ and 602B′ and perform a matching operation utilizing dictionaries 612B and 618B, respectively. In one or more implementations, the matching units 611B and 617B match data in static dictionaries 612B and 618B, respectively, to data from an incoming data stream associated with one or more individuals. For example, if an individual has a diagnosis type and/or takes a medication that is of the same type as an entry of the one or more static dictionaries 612B, 618B, then the relevant matching unit will output the one or more weights and/or weight averages associated with the relevant static dictionary entry into the neural network 624B. In one or more implementations, no computation associated with static operations 610B and 616B occurs when matching is performed, e.g. the weights and weight averages from the development/training of the static dictionaries during training (e.g. as discussed with respect to FIG. 6A) are used.


In one or more implementations, static transformers 608B, 614B include relational units 611B, 617B, respectively, where relational units 611B, 617B generate tables associating fall and non-fall events in relation to the respective data sources feeding static transformers 608B, 614B. In one or more implementations, static operations 610B, 616B intake the data from their respective data sources, apply Equation 1, and, in coordination with the relational units 611B, 617B, respectively, generate tables with static weights associated with each row of a table and in relation to a particular data entry; thereafter, the average of the weights of each instance is utilized by the static transformers 608B, 614B as an input to the neural network 624B. Accordingly, in one or more implementations, the static dictionaries 612B, 618B include new or updated entries relative to the static dictionaries from training, and/or replace those entries in their entirety.


In one or more implementations, as shown, the architecture 600B includes data sources 602C′ and 602D′, where data source 602C′ includes MDS information and ADL information of one or more individuals, and data source 602D′ includes demographic information and age information of the one or more individuals. In one or more implementations, no filtering or pre-processing of the information occurs.


In one or more implementations, the MDS information can be filtered (e.g. by the pre-processing unit (as shown with reference to other implementations)) based on a temporal quality, e.g. only MDS information within a certain time frame is collected. In one or more implementations, MDS information can contain an ordinal scale and a natural ranking (by virtue of its nature), and can be of an integer nature, which can further enhance the accuracy and efficiency of a trained model being applied to incoming data. In one or more implementations, where the data source 602C′ is in whole or in part based on information from medical, nursing home, or assisted care facilities, the values of the data source 602C′ can be set as follows by a pre-processing unit (as shown with reference to other implementations) or user: if a resident refuses to provide a value, −99 is used as the value; if the value is null, −1 can be used as the value; and if the value is unknown, −2 can be used as the value. In one or more implementations, applying this categorization to the data enhances the efficiency of the application process and/or increases the accuracy of a trained machine learning model applied to this data, e.g. neural network 624B, by ensuring all relevant data can be processed, and processed in an efficiently recognizable form (e.g. of an integer nature).
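

By way of a non-limiting illustration only, such a pre-processing categorization could be sketched as follows, where the sentinel string and the helper name (e.g. encode_mds_value) are hypothetical assumptions:

REFUSED = "refused"  # assumed sentinel used in the source records

def encode_mds_value(raw):
    """Map an MDS field to an integer so every record can be processed uniformly.

    A refused entry becomes -99, a null (missing) entry becomes -1, and an
    otherwise-unknown entry becomes -2; valid entries pass through as integers.
    """
    if raw == REFUSED:
        return -99
    if raw is None:
        return -1
    try:
        return int(raw)
    except (TypeError, ValueError):
        return -2  # present but not interpretable, i.e. unknown

print([encode_mds_value(v) for v in [3, None, "refused", "n/a"]])  # [3, -1, -99, -2]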


In one or more implementations, the ADL data of data source 602C′ can also be fed into neural network 624B without preprocessing, where in one or more implementations it can also be restricted to a particular temporal range, e.g. within 14 days, where a pre-processing unit (as shown with reference to other implementations) can filter out data outside this range prior to initiating model training, and where the associated assist categories fall within the 14-day time period. In one or more implementations, repeated entries for an assist category of the ADL data, e.g. on a particular day, can be counted as a single count of “1” for that day, with missing values being replaced with “0.”
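

By way of a non-limiting illustration only, the 14-day windowing and per-day counting of ADL assist entries could be sketched as follows, where the data layout (a list of (date, category) pairs) and the function name are hypothetical assumptions:

from datetime import date, timedelta

def adl_daily_counts(entries, as_of, window_days=14):
    """Collapse ADL assist entries to at most one count per category per day.

    Repeated entries for the same assist category on the same day count as a
    single "1"; days with no entry contribute "0"; entries outside the window
    are filtered out.  `entries` is a list of (date, category) pairs.
    """
    start = as_of - timedelta(days=window_days)
    seen = {(d, c) for d, c in entries if start <= d <= as_of}
    categories = {c for _, c in entries}
    return {
        category: sum(
            1 if (start + timedelta(days=i), category) in seen else 0
            for i in range(window_days + 1)
        )
        for category in categories
    }

today = date(2023, 1, 4)
sample = [(today, "transfer"), (today, "transfer"), (today - timedelta(days=2), "eating")]
print(adl_daily_counts(sample, as_of=today))  # each category counted once per day: 'transfer' -> 1, 'eating' -> 1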


In one or more implementations, data source 602C′ can include information related to the vitals of one or more individuals. In one or more implementations, the vital information is not filtered. In one or more implementations, the data can be filtered by a pre-processing unit (as shown with reference to other implementations) and/or pre-set by a user to be within an average deviation of the last three entries for each one of the one or more individuals, with missing values being replaced with a “0.”


In one or more implementations, data source 602D′ can include information related to the demographics and age of an individual. In one or more implementations, when an age or gender of an individual is unknown, the value associated therewith is set by a pre-processing unit (as shown with reference to other implementations) and/or a user to “0” or “UNKNOWN.” In one or more implementations, no pre-processing is done.


In one or more implementations, all of the data of data source 602C′ can be inputted directly into neural network 624B. In other implementations (as shown with reference to other implementations), the data of data source 602C′ can be inputted into a suitable scaler for more efficient processing in relation to applying the neural network 624B to obtain a prediction as to whether or not one or more individuals will sustain a fall. For example, a min/max scaler (0, 1) can be applied, e.g. in instances where a significant number of predictions are requested. Similarly, in one or more implementations, an encoding scheme or encoder (as shown with reference to other implementations), such as One Hot Encoding, can be applied to the gender and demographic data associated with data source 602D′ to achieve the same aim; in other implementations, no such encoding scheme is employed.
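

By way of a non-limiting illustration only, such scaling and encoding could be sketched with standard tooling as follows; the use of scikit-learn and the example feature values are assumptions made for the sketch and are not specified by the present disclosure:

import numpy as np
from sklearn.preprocessing import MinMaxScaler, OneHotEncoder

# Hypothetical numeric features drawn from data source 602C' (MDS/ADL/vitals values).
numeric = np.array([[3.0, 120.0], [-1.0, 80.0], [5.0, 95.0]])
scaled = MinMaxScaler(feature_range=(0, 1)).fit_transform(numeric)

# Hypothetical categorical features drawn from data source 602D' (gender, demographic group).
categorical = np.array([["F", "UNKNOWN"], ["M", "GROUP_A"], ["F", "GROUP_A"]])
encoded = OneHotEncoder().fit_transform(categorical).toarray()

# The scaled and encoded arrays can then be concatenated as neural-network inputs.
nn_input = np.hstack([scaled, encoded])
print(nn_input.shape)  # (3, 6): two scaled numeric columns plus four one-hot columns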


In one or more implementations, the outputs of the static transformer 608B and the static transformer 614B, and the data from data sources 602C′ and 602D′, are inputted into the neural network 624B in order to generate a prediction in relation to the one or more individuals for whom a prediction as to whether or not a fall will occur is sought. In one or more implementations, the neural network 624B generates an initial score 626B based on the inputted information, where the initial score 626B is indicative of the likelihood that an individual (associated with data sources 602A′-602D′) will sustain a fall.


In one or more implementations, the computational unit 601B includes a fall adjustment component 628B. In one or more implementations, the fall adjustment component 628B receives data from data source 602E′, where data source 602E′ includes fall history information associated with the one or more individuals associated with data sources 602A′-602D′. In one or more implementations, the fall adjustment component 628B receives data related to fall history from other data sources, e.g. in implementations where the fall history of the one or more individuals is contained in one or more of data sources 602A′-602D′. In one or more implementations, the fall adjustment component 628B applies Equation 2 to the fall history information associated with the one or more individuals and applies the output of Equation 2 to an initial score or output 626B generated by the neural network 624B to generate a final score 630B related to whether or not an individual will sustain a fall. In one or more implementations, the adjustment component 628B applies one or more additional thresholding operations and computational operations to generate the adjustment factor. For example, one or more threshold operations can be applied to an initial score to determine what type of adjustment factor will be applied to the initial fall prediction. For example, in one or more implementations, the one or more threshold operations can be related to the value of the initial score, the number of falls associated with a particular individual, and/or the time frame associated with those falls. By way of a specific example, and according to one or more implementations, the application and operations of the adjustment factor or protocol can be as follows: if a fall event is within the prior 30 days and the initial score is less than 90%, then the final score can be set to 90+(0.2*Nf), where Nf is the number of falls in the prior 30 days; if the most recent fall event is within the prior 30-90 days and the initial score is less than 80%, then the final score can be set to 80+(0.2*Nf), where Nf is the number of falls in the prior 30-90 days; and if the most recent fall event is within the prior 90-365 days and the initial score is less than 70%, then the final score can be set to 70+(0.2*Nf), where Nf is the number of falls in the prior 90-365 days.
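

By way of a non-limiting illustration only, the thresholding protocol of the preceding example could be sketched as follows, where the function name and the interpretation of Nf as the count of falls in the relevant window are assumptions made for the sketch:

def adjust_score(initial_score, days_since_last_fall,
                 falls_0_30, falls_30_90, falls_90_365):
    """Apply the fall-history adjustment protocol to an initial score (0-100 scale).

    If the most recent fall is within 30 days and the initial score is below 90,
    the final score becomes 90 + 0.2 * (falls in the prior 30 days); analogous
    rules apply to the 30-90 day and 90-365 day windows.
    """
    if days_since_last_fall is None:
        return initial_score  # no fall history, so no adjustment
    if days_since_last_fall <= 30 and initial_score < 90:
        return 90 + 0.2 * falls_0_30
    if 30 < days_since_last_fall <= 90 and initial_score < 80:
        return 80 + 0.2 * falls_30_90
    if 90 < days_since_last_fall <= 365 and initial_score < 70:
        return 70 + 0.2 * falls_90_365
    return initial_score

print(adjust_score(62.0, days_since_last_fall=12,
                   falls_0_30=2, falls_30_90=0, falls_90_365=1))  # 90.4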


In one or more implementations, the final score 630B is outputted to a user that made a request (via a computing device accessing algorithm 600B) for a prediction as to whether or not an individual (associated with data sources 602A′-602D′) will suffer a fall.



FIG. 7 illustrates a flow 700 for training a machine learning model in accordance with various techniques associated with the present disclosure. In one or more implementations, the flow includes block 710, where any suitable computer system or component, e.g. system 100 and processors 114, receives a plurality of data, e.g. medical data, associated with a first set of one or more individuals. Here, receiving can mean inputting the data by any means such as keying, or data transfer from a data set (e.g., MDS, ADS, and electronic medical records). In one or more implementations, the received medical data includes: i) a demographic information associated with the one or more individuals, ii) a physical activity information associated with the one or more individuals, iii) a vital change information associated with the one or more individuals, iv) governmental information associated with the one or more individuals, v) a medical diagnostic information associated with the one or more individuals, vi) medication information associated with the one or more individuals, vii) a demographic information associated with the one or more individuals, viii) a physical activity information associated with the one or more individuals, and ix) vital change information associated with the one or more individuals.


In one or more implementations, the flow 700 includes a block 720 for determining one or more static components of an overall algorithm for predicting a fall. In one or more implementations, the one or more static components can be predetermined by a user (e.g. via experimentation) and/or developed utilizing any other probabilistic, correlation, or other technique as discussed herein. In one or more implementations, the one or more static components can be based on particular types of data that are identified as the most important types of data in relation to whether or not an individual will sustain a fall. In one or more implementations, the type of data that is most important with respect to whether or not an individual will suffer a fall can be determined by a user through experimentation and/or by applying a correlation and/or probabilistic function to each individual data type and determining which data types correlate most strongly with whether or not an individual suffers a fall.


In one or more implementations, the data associated with the highest correlation as to whether or not an individual will suffer a fall, e.g. fall history, can be processed according to a first static model or equation, or a series of threshold operations in conjunction with an equation, e.g. Equation 2 and the thresholding operations discussed above, and can serve as an adjustment factor during training. In one or more implementations, the next two most important data types, e.g. diagnostic data and active medication data, can be processed by one or more static models based on Equation 1 and one or more averaging operations, where the outputs of the static equations or models associated with Equation 1 can serve as inputs for training a dynamic component, as discussed below. In one or more implementations, the operations to generate the static models and the other operations of block 720 can be performed by any suitable computer system and its associated components, e.g. the processor 142 of computer system 100 can be configured to determine the equation based on the received data utilizing any suitable technique as described herein.


In one or more implementations, flow 700 can include block 730 for training a machine learning algorithm based on the received data. Block 730 can include an initialization operation for setting initial weights for the machine learning model. The initial weights can be randomized, or can be determined by probabilistic and deviation techniques applied to a test set of other individuals with data types similar to those of the first plurality of individuals. Alternatively, the initial weights can be set to zero, can be set to a random value by application of a random weight generator, or can be provided by another machine learning model or system. In one or more implementations, the training can begin with a calculated bias and class weights, or pursuant to any other suitable initialization operation.


In one or more implementations, block 730 can include inputting the outputs of the determined static models based on Equation 1, together with the remaining data sources not processed by the static models based on Equation 1, e.g. a demographic information associated with the one or more individuals, governmental information associated with the one or more individuals, etc., into a neural network, e.g. a dynamic model. In one or more implementations, block 730 can include training the neural network utilizing any suitable training technique based on the received inputs discussed above, and can include modulating and adjusting weights and biases applied to the medical-related input data, and in deeper levels of a deep learning algorithm (e.g., by back propagation methods), such that the prediction of the fall score value becomes increasingly accurate. That is, the deviation between the fall score value and the prediction from the algorithm, as it is trained, is minimized to an acceptable minimum value. For example, the model can be validated with a Root Mean Squared Error on a train/test ratio of 70:30 (i.e., 70% of the data is used to train the model and 30% of the data is used to test the model). In one or more implementations, the training is based solely on the received inputs from the static models based on Equation 1 and the other data sources. In one or more implementations, the training associated with block 730 can include incorporation of the static model based on Equation 2, e.g. a fall history adjustment factor, where, during training, an initial score is produced by the neural network based on the inputs provided by the static models based on Equation 1 and the other data sources, and a final score is then produced by an adjustment factor based on fall history, where the backpropagation loop runs from the output associated with the final score.


In one or more implementations, any iteration or run rate can be utilized, where in one or more implementations the neural network can be trained with one hundred epochs and a batch size set at 512. In one or more implementations, the model can be trained online or offline, where in one or more implementations the neural network was trained offline. In one or more implementations, any suitable data sample size based on block 710 can be used, where in one or more applications a sample of 2.3 million records was used, yielding an accuracy of 0.86 and an F1-score of 0.86 on the out-of-sample (unseen) dataset.
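

By way of a non-limiting illustration only, a training configuration of the kind described above (class weights, one hundred epochs, a batch size of 512, a 70:30 train/test split, and RMSE validation) could be sketched as follows; the network shape, the synthetic data, and the use of TensorFlow/Keras and scikit-learn are assumptions made for the sketch and are not specified by the present disclosure:

import numpy as np
import tensorflow as tf
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: rows are individuals, columns are the static-model
# outputs plus the directly fed features; y is 1 for a fall event, 0 otherwise.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20)).astype("float32")
y = (rng.random(1000) < 0.1).astype("float32")

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.30, random_state=0)

# Class weights compensate for the imbalance between fall and non-fall samples.
positive_rate = float(y_train.mean())
class_weight = {0: 1.0, 1: (1.0 - positive_rate) / max(positive_rate, 1e-6)}

model = tf.keras.Sequential([
    tf.keras.Input(shape=(X.shape[1],)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X_train, y_train, epochs=100, batch_size=512,
          class_weight=class_weight, verbose=0)

# Validate with a root-mean-squared error on the held-out 30% of the data.
predictions = model.predict(X_test, verbose=0).ravel()
rmse = float(np.sqrt(np.mean((predictions - y_test) ** 2)))
print(f"RMSE on the 30% test split: {rmse:.3f}")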


Accordingly, in one or more implementations, the trained machine learning model is a hybrid model that includes both a static and a dynamic aspect, e.g. static and variable weights, and the trained model can receive a second plurality of medical data associated with a second plurality of individuals and generate a prediction as to whether or not any one of those individuals will suffer a fall by outputting a score or value reflective of said prediction. In one or more implementations, the variable weights associated with nodes can change and/or be updated with each application of the trained model.



FIG. 8 illustrates a flow diagram 800 for applying an algorithm to predict a fall of one or more individuals. In one or more implementations, the flow 800 includes block 810, where any suitable computer system or component, e.g. system 100 and processor 142, receives a plurality of data, e.g. medical data, associated with a first set of one or more individuals. Here, receiving can mean inputting the data by any means such as keying, or data transfer from a data set (e.g., MDS, ADS, and electronic medical records). In one or more implementations, the received medical data includes: i) a demographic information associated with the one or more individuals, ii) a physical activity information associated with the one or more individuals, iii) a vital change information associated with the one or more individuals, iv) governmental information associated with the one or more individuals, v) a medical diagnostic information associated with the one or more individuals, vi) medication information associated with the one or more individuals, vii) a demographic information associated with the one or more individuals, viii) a physical activity information associated with the one or more individuals, and ix) vital change information associated with the one or more individuals.


In one or more implementations, pursuant to block 820, any suitable component or components of a computer system can apply any machine learning algorithm or algorithms to the received data to generate an initial score, with respect to each of the one or more individuals associated with the received data, as to a potential fall event. For example, computer processor 142 can process the received data, e.g. 114, and can provide one or more instructions in relation to system 100, such as to apply algorithm 118 to the received data. Algorithm 118 can be a hybrid machine learning model that includes a neural network and one or more static models. In one or more implementations, the neural network directly receives a portion of the incoming data stream and another portion can be processed by one or more static models, including static models or dictionaries based on Equation 1. In one or more implementations, the static models utilize a static dictionary based on average weights that correspond to a particular type of data that is the same as the type of data received during training of the neural network. In one or more implementations, the application of the static portion of the algorithm involves performance of one or more matching operations that match incoming data to data in a static dictionary (e.g. generated during a training phase and based on Equation 1), and output weights or weight averages associated with entries of the static dictionary into the neural network. In one or more (other) implementations, the static models construct a table or matrix based on the incoming data stream, produce one or more static weights associated with fall or non-fall events, e.g. pursuant to Equation 1, and then average those weights to serve as an input to the neural network.


In one or more implementations, pursuant to block 820, all remaining data sources that are not processed by the static models are fed directly into the neural network.


In one or more implementations, pursuant to block 830, the application of the machine learning model, e.g. a hybrid machine learning model, can produce an initial score associated with each of the plurality of individuals, where the score provides a prediction as to whether or not an individual will experience a fall event.


In one or more implementations, pursuant to block 840, any suitable component or components of a computer system can apply an adjustment factor to each of the initial scores produced for each one of the plurality of individuals. For example, computer processor 142 can process the received data, e.g. 114, and can provide one or more instructions in relation to system 100, such as to apply algorithm 118 to the received data. Algorithm 118 can include an equation, as discussed herein, that adjusts the output of the machine learning algorithm by either adding a factor to the initial score or multiplying the initial output by the factor. In one or more implementations, the adjustment factor is based on Equation 2. In one or more implementations, the adjustment factor is based on Equation 2 and threshold operations as discussed herein.


In one or more implementations, pursuant to block 840 of flow 800, the application of the adjustment factor produces a final score predictive of whether or not an individual will suffer a fall event, e.g. a respective final fall score for each of the plurality of individuals.



FIG. 9 illustrates a flow diagram 900 for assigning a resource to one or more individuals. In some implementations, flow diagram 900 carries out one or more operations associated with flow 800. In some implementations, as shown, flow 900 carries out all of the operations associated with the blocks of flow 800. In some implementations, a resource is assigned to the plurality of individuals, e.g. patients or residents, based in whole or in part on the generated final fall prediction score(s) 905. Any suitable component of the present disclosure, e.g. computer processor 142, can process the received data, e.g. 114, and can be configured to provide one or more instructions in relation to system 100, such as to suggest a resource for one or more individuals associated with the received data, where the resource allocation can be based on a request from another user device or can be automatically made (to another device), e.g. via a network, based on the processing of the received data. A processor or other suitable component, e.g. 142, can be configured to compare the fall prediction score with one or more conditions (if any) associated, respectively, with each of the plurality of individuals. In one or more implementations, a suitable component as described herein, e.g. 142, can be configured to associate a condition and/or preference with the fall prediction score. In some implementations, the suitable component as described herein, e.g. 142, can be configured to output a specific resource to each of the plurality of individuals if he or she has a particular condition and, in relation to that condition, the outputted fall prediction score exceeds a certain threshold. In some implementations, the suitable component as described herein, e.g. 142, can account for multiple conditions by performing any suitable mathematical operation, e.g. a sum, product, weighting, or factorial increase based on the fall prediction scores of the plurality of individuals. In some implementations, a particular condition will automatically warrant at least one resource allocation irrespective of the fall prediction score of each individual, and the addition of other conditions can impact whether additional resources are allocated to a particular individual. In some implementations, each condition can be associated with a particular threshold in relation to the fall prediction score and in relation to whether a particular resource or resources are allocated.


In some implementations, for example, the suitable component as described herein, e.g. 142, can be configured to suggest or allocate a resource based solely on the fall prediction scores, respectively, of each of the plurality of individuals. For example, if the score meets or exceeds a first threshold Nt1, then a first resource is suggested, e.g. a soothing voice of a family member or pre-recorded music associated with a family member of a particular individual of the plurality of individuals is played over a loudspeaker (e.g. instructing a resident to stay in place while medical assistance arrives). If the score meets or exceeds a second threshold Nt2, then another resource or an additional resource can be suggested, e.g. a wheelchair. If the score meets or exceeds a third threshold Nt3, then another or yet an additional resource can be suggested, e.g. immediate medical assistance by a care provider (nurse, therapist, doctor, etc.). In some implementations, since a fall prediction score is based on data that is in part particular to an individual of the plurality of individuals, the resource allocation based on the fall prediction score is inherently specific to that individual. In some implementations, in order to further particularize a resource allocation to each of the plurality of individuals, respectively, a suitable component as described herein, e.g. 142, can be configured to compare a fall prediction score with a particular condition associated with, respectively, each of the plurality of individuals. For example, any suitable component as described herein, e.g. 142, can be configured to suggest a particular resource based on fall prediction scores that are specific to particular conditions or are weighted based on the number and/or type of conditions. For example, condition 1, e.g. dementia, can have a weight W1. If a particular individual of the plurality of individuals suffers from dementia and has a fall prediction score FS, then the product of W1 and FS can be compared to particular threshold reference scores, e.g. RS1, RS2, RS3, and so on, where if the product of W1 and FS exceeds the one or more threshold or reference scores, then a resource corresponding to each exceeded reference score, or a resource corresponding to the highest exceeded score, is suggested. If multiple conditions are present, e.g. condition 1, e.g. dementia, and condition 2, e.g. a broken or fractured bone, then multiple weights, each corresponding to a condition, e.g. W1 for dementia and W2 for a broken or fractured bone, can be applied to the fall prediction score FS to determine whether one or more resources should be assigned to the corresponding individual. In some implementations, the weights, e.g. W1 and W2, can be higher or lower based on their relevance as to whether or not an individual will suffer a fall relative to a condition, e.g. a sprained ankle can have a lower weight than a fractured hip.
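

By way of a non-limiting illustration only, such a threshold- and condition-weighted resource suggestion could be sketched as follows, where the threshold values, condition weights, and resource labels are hypothetical assumptions:

# Hypothetical thresholds, condition weights, and resource labels; illustrative only.
THRESHOLDS = [
    (90.0, "immediate care-provider assistance"),
    (75.0, "wheelchair"),
    (60.0, "pre-recorded family audio guidance"),
]
CONDITION_WEIGHTS = {"dementia": 1.3, "fractured hip": 1.5, "sprained ankle": 1.05}

def suggest_resources(fall_score, conditions):
    """Weight the fall prediction score by each condition, then compare the
    weighted score against the thresholds and return every matching resource."""
    weighted = fall_score
    for condition in conditions:
        weighted *= CONDITION_WEIGHTS.get(condition, 1.0)
    return [resource for threshold, resource in THRESHOLDS if weighted >= threshold]

print(suggest_resources(55.0, ["dementia"]))                    # 71.5 -> family audio only
print(suggest_resources(55.0, ["dementia", "fractured hip"]))   # 107.25 -> all three resources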


In some implementations, each condition can correspond to a particular threshold, and if the threshold, e.g. RS1, RS2, etc., for the condition or conditions is not met, then no resource is assigned, and if the threshold for the condition or conditions is met, then each resource associated with the particular condition is suggested by a suitable component as described herein, e.g. 142, configured as such. In some implementations, the threshold for one condition can be correlated to whether the condition presents a higher likelihood that an individual will suffer a fall as a result of the condition, e.g. a sprained ankle can have a lower threshold than a fractured hip. In some implementations, a suitable component as described herein, e.g. 142, can be configured to coordinate with one or more sensing, audiovisual (e.g. camera), or other suitable devices to make a suggestion as to whether or not a resource should be provided. For example, if an individual of the plurality of individuals has a score that exceeds a certain threshold, or exceeds a certain threshold relative to one or more conditions, and the individual already has the suggested allocation (e.g. as determined by conducting a scan of a particular setting and performing any suitable audiovisual operation to detect the individual and his or her surroundings and/or allocated resources), then no further allocation need be made. By way of another example, any suitable component as described herein, e.g. 142, can be configured to suggest allocating or not allocating a resource based on a threshold being met or not met relative to a fall prediction score and one or more conditions, in addition to an audiovisual scan of a particular setting. If a resource allocation is suitable only in the case of movement, e.g. a leg brace or wheelchair, for example, but the individual of the plurality of individuals is on his or her bed, then that particular resource allocation can be withheld as extraneous based on an audiovisual analysis of the care setting.


In some implementations, a condition warrants more than one resource allocation or a certain fall prediction score that is weighted by multiple conditions warrants more than one resource allocation. In some implementations, a certain condition can warrant a resource allocation irrespective of a fall prediction score, but a fall prediction score weighted by additional conditions can warrant an additional resource and/or a scan or other review of a care setting, e.g. by a sensor or camera, can warrant an additional resource assignment.



FIG. 10 illustrates a flow 1000 for predicting a fall of one or more individuals in accordance with various techniques associated with the present disclosure. In one or more implementations, the flow includes block 1005, e.g. the flow 1000 can begin therefrom. In one or more implementations, the flow 1000 includes block 1010, where any suitable computer system or component, e.g. system 100 and processors 114, receives a first plurality of data, e.g. medical data or other data, associated with a first set of one or more individuals. Here, receiving can mean inputting the data by any means such as keying, or data transfer from a data set (e.g., MDS, ADS, and electronic medical records). In one or more implementations, the first plurality of medical data can include: i) active diagnosis information associated with the one or more individuals, e.g. recent diagnoses or diagnoses that refer to present or current medical conditions associated with one or more individuals, ii) active and new medication data (e.g. medication recently or currently being taken by the one or more individuals), and iii) ADL, MDS, and recent vitals of the one or more individuals.


In one or more implementations, the flow 1000 includes a block 1020 for applying a first algorithm to the first plurality of data. The first algorithm can be a hybrid machine learning algorithm as discussed with respect to FIGS. 7-9, where one or more static components and the static dictionaries associated therewith process the active and new diagnostic data and the active medication data, and a dynamic component, e.g. a neural network, processes the ADL, MDS, and recent vital data. In one or more implementations, the flow includes block 1030, where the application of the first algorithm can generate a first score, where the first score is a prediction as to whether or not one or more of the one or more individuals will sustain a fall. The first score can be generated as described with respect to FIGS. 7-9, where, with respect to one or more implementations, the first score is based on data i)-iii) and not the additional data discussed with some of the one or more implementations associated with FIGS. 7-9. In one or more implementations, an adjustment factor can be applied to the initial score, as shown in block 1040 of flow 1000. In one or more implementations, the adjustment factor can be based on fall history data of the one or more individuals and/or other individuals, which can be part of or distinct from the first plurality of data, and where the adjustment factor can be based on the equations, structures, and techniques discussed with respect to one or more implementations above. In one or more implementations, the flow 1000 can include block 1050, where block 1050 generates a second score based on application of the adjustment factor, and in one or more implementations, the second score corresponds to the final score with respect to the discussion associated with FIGS. 7-9.


In one or more implementations, flow 1000 includes block 1060, where block 1060 receives a second plurality of data associated with the one or more individuals. In one or more implementations, the second plurality of data is clinical data specific to the one or more individuals and within a specific temporal range (e.g. 24-72 hours), where the second plurality of data can include balance data (e.g. balance tests administered to assess how well balanced each of the one or more individuals is), admission data (e.g. admission to a hospital or care facility within a specified time frame, e.g. 24-48 hours), mobility data (e.g. if a wheelchair, mechanical lift, or other assistance is required), psychiatric medication data 535c (e.g. where in one or more implementations, as discussed below, this can be a separate data category from active medication data 510c), vitality data, behavioral data (e.g. evidence of specific erratic behavior such as shaking), and continence data (e.g. changes in continence), as may be referenced with respect to other implementations as described above, and where the data sources can further be based in whole or in part on predetermined forms as described herein.


In one or more implementations, flow 1000 includes applying a second algorithm, or a portion of a second algorithm, as per block 1070. In one or more implementations, as stated above and implied herein, experimentation and testing at one or more facilities has determined that utilizing an additional model or algorithm with specific data sources, e.g. real-time data, that is specific to a particular individual or individuals can further generate a more accurate and tailored fall prediction score, which in turn can be used to provide a more appropriate prophylactic measure, e.g. as discussed herein (e.g. with respect to FIG. 2), for one or more users. In one or more implementations, block 1070 includes all or part of this additional model. The additional model or algorithm can be a model that includes a logic component for determining which formula or scheme to apply and a computational component for computing the formula or scheme. The second algorithm can be determined in part by correlation or probabilistic techniques that evaluated data associated with other individuals, e.g. distinct from the one or more individuals, to determine a weighting scheme for particular data sources, e.g. as discussed with respect to FIG. 5C. In one or more implementations, application of the second algorithm can include summing the data multiplied by weights in relation to each data source. For example, if it is determined that a change (or lack thereof) in behavior is the least important factor, and the next most important is a continence change, then the weight, e.g. Wc, for continence change can be Wb (the weight for behavioral changes) plus a factor N, and the weight for the next most important data source, e.g. nutritional changes (e.g. lack of eating), can be Wc+N. In order to optimize accuracy, “N” can be constant between some data sources, but can change if a data source is particularly important, e.g. the weight for a data source can be Wp+N+N2, where Wp is the weight for the data source immediately preceding it in importance, and N2 is an additional additive to the value. In one or more implementations, factorial increases, e.g. N, 2N, 3N, etc., can be used in whole or in part. In one or more implementations, once the weights are determined, the computational component, pursuant to block 1080, generates a third score (e.g. a clinical score) using the following equation:





Clinical Score=W1*(balance change)+W2*(admitted in the last 24 hours)+W3*(assistance required in the last 24 hours)+W4*(psychotropic medication)+W5*(nutrition)+W6*(evidence of pain)+W7*(vitals out of range)+W8*(behavior changes)+W9*(continence change) . . . other factors.   Equation 3
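

By way of a non-limiting illustration only, Equation 3 could be sketched as follows, where the particular increment N, the factor ordering, and the indicator names are hypothetical assumptions rather than values specified by the present disclosure:

N = 1.0  # assumed constant increment between adjacent factors
ORDERED_FACTORS = [            # least to most important (assumed ordering)
    "behavior_change", "continence_change", "nutrition_change", "evidence_of_pain",
    "vitals_out_of_range", "psychotropic_medication", "assistance_last_24h",
    "admitted_last_24h", "balance_change",
]
WEIGHTS = {name: 1.0 + i * N for i, name in enumerate(ORDERED_FACTORS)}

def clinical_score(indicators):
    """Equation 3 sketch: sum of weight * indicator over the clinical factors.

    Each indicator is 0 or 1 (change detected or not), or a larger number
    produced by a threshold rule tied to a predetermined form.
    """
    return sum(WEIGHTS[name] * indicators.get(name, 0) for name in WEIGHTS)

print(clinical_score({"balance_change": 1, "psychotropic_medication": 1}))  # 9.0 + 6.0 = 15.0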


The individual data categories, e.g. balance change, can be a “1” or “0” depending on whether a change in the particular category is detected, or can be a number based on threshold equations, e.g. as discussed above in reference to predetermined forms, where the initial number can be higher than one. In one or more implementations, the flow includes block 1085. If both the Clinical Score (third score) and the second score (or fall risk model score) exceed a certain threshold or value T, e.g. 50, then the following scheme is applied to generate a final score pursuant to block 1096, where the final score is a hybrid of the second score and the third score:





Final Score=(Greater of Two Scores [i.e. Clinical and Second Score]+(Lesser of Two Scores [i.e. Clinical and Second Scores]*reducing scaling factor))/second scaling factor   Equation 4


The greater of the two scores is added to the lesser of the two scores adjusted by a scaling factor, where the lesser of the two scores is multiplied by any suitable factor to reduce its value (e.g. based on a product, difference or a product and difference, etc.), and where the sum is divided by another scaling factor.


If the two scores are below the certain threshold T, then the flow 1000 includes block 1090. Pursuant to block 1090, if the second score and third score are within a certain range R1, e.g. the difference is small, then the final score, pursuant to block 1095, is the higher of the two scores.


If the two scores are beneath the certain threshold T, but the difference is greater than R1, then the flow 1000 includes block 1097 and the final score is an average of the scores.
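

By way of a non-limiting illustration only, the combination of the second score and the Clinical Score described in blocks 1085-1097 could be sketched as follows; the values of T, R1, the reducing scaling factor, and the second scaling factor, as well as the handling of the case where only one score exceeds T, are assumptions made for the sketch:

def final_score(second_score, clinical, threshold_t=50.0, range_r1=10.0,
                reducing_factor=0.5, second_scaling_factor=1.5):
    """Combine the fall-risk-model (second) score and the clinical (third) score.

    If both scores exceed T, Equation 4 applies: the greater score plus the
    lesser score reduced by a scaling factor, all divided by a second scaling
    factor.  Otherwise the higher score is used when the two scores are within
    R1 of each other, and their average is used when they differ by more than R1.
    (Cases where only one score exceeds T are treated here like the
    below-threshold case, which is an assumption of this sketch.)
    """
    high, low = max(second_score, clinical), min(second_score, clinical)
    if second_score > threshold_t and clinical > threshold_t:
        return (high + low * reducing_factor) / second_scaling_factor  # Equation 4
    if high - low <= range_r1:
        return high
    return (second_score + clinical) / 2.0

print(final_score(72.0, 64.0))  # both above T -> Equation 4: (72 + 32) / 1.5 = 69.33...
print(final_score(45.0, 41.0))  # below T, within R1 -> higher score: 45.0
print(final_score(45.0, 20.0))  # below T, outside R1 -> average: 32.5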


In one or more implementations, since the overall flow 1000 includes both a score based on a fall prediction that accounts for whether previous falls have occurred and how likely a fall is to occur as a result, e.g. the final score of the first algorithm, reflected as the second score of flow 1000, and a clinical factor score based on a scheme that accounts for clinical factors unique to the subject for which a score is to be determined, e.g. the third score of flow 1000 produced by the second algorithm of flow 1000, a more accurate result is generated, where said result can be used to provide a superior prophylactic measure in relation to a potential fall for a patient or resident of a care facility or hospital, and where said result is based on a first hybrid algorithm combined with a second static algorithm forming an overall hybrid algorithm, e.g. flow 1000.


In one or more implementations, as stated and implied herein, the data associated with FIG. 5C can also be based on predetermined forms to enhance accuracy and efficiency in relation to said score.



FIG. 11 illustrates an embodiment of an exemplary computing architecture 1100 suitable for implementing various embodiments as previously described. In one embodiment, the computing architecture 1100 may include or be implemented as part of system 100, system 400a, or system 400b and/or may implement one or more operations associated with any of the Figures of the present application, including flows associated with FIGS. 7-10.


As used in this application, and as may be expressly stated or implied elsewhere herein, the terms “system” and “component” are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution, one example of which is provided by the exemplary computing architecture 1100. For example, a component can be, but is not limited to being, a process running on a processor, a processor, a hard disk drive, multiple storage drives (of optical and/or magnetic storage medium), an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a component. One or more components can reside within a process and/or thread of execution, and a component can be localized on one computer and/or distributed between two or more computers. Further, components may be communicatively coupled to each other by various types of communications media to coordinate operations. The coordination may involve the uni-directional or bi-directional exchange of information. For instance, the components may communicate information in the form of signals communicated over the communications media. The information can be implemented as signals allocated to various signal lines. In such allocations, each message is a signal. Further embodiments, however, may alternatively employ data messages. Such data messages may be sent across various connections. Exemplary connections include parallel interfaces, serial interfaces, and bus interfaces.


The computing architecture 1100 includes various common computing elements, such as one or more processors, multi-core processors, co-processors, memory units, chipsets, controllers, peripherals, interfaces, oscillators, timing devices, video cards, audio cards, multimedia input/output (I/O) components, power supplies, and so forth. The embodiments, however, are not limited to implementation by the computing architecture 1100.


As shown in FIG. 11, the computing architecture 1100 includes a processing unit 1104, a system memory 1106 and a system bus 1108. The processing unit 1104 can be any of various commercially available processors.


The system bus 1108 provides an interface for system components including, but not limited to, the system memory 1106 to the processing unit 1104. The system bus 1108 can be any of several types of bus structure that may further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures. Interface adapters may connect to the system bus 1108 via a slot architecture. Example slot architectures may include without limitation Accelerated Graphics Port (AGP), Card Bus, (Extended) Industry Standard Architecture ((E)ISA), Micro Channel Architecture (MCA), NuBus, Peripheral Component Interconnect (Extended) (PCI(X)), PCI Express, Personal Computer Memory Card International Association (PCMCIA), and the like.


The computing architecture 1100 may include or implement various articles of manufacture. An article of manufacture may include a computer-readable storage medium to store logic. Examples of a computer-readable storage medium may include any tangible media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth. Examples of logic may include executable computer program instructions implemented using any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, object-oriented code, visual code, and the like. Embodiments may also be at least partly implemented as instructions contained in or on a non-transitory computer-readable medium, which may be read and executed by one or more processors to enable performance of the operations described herein.


The system memory 1106 may include various types of computer-readable storage media in the form of one or more higher speed memory units, such as read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDRAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, polymer memory such as ferroelectric polymer memory, ovonic memory, phase change or ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, magnetic or optical cards, an array of devices such as Redundant Array of Independent Disks (RAID) drives, solid state memory devices (e.g., USB memory, solid state drives (SSD)), and any other type of storage media suitable for storing information. In the illustrated embodiment shown in FIG. 11, the system memory 1106 can include non-volatile memory 1110 and/or volatile memory 1112. A basic input/output system (BIOS) can be stored in the non-volatile memory 1110.


The computer 1102 may include various types of computer-readable storage media in the form of one or more lower speed memory units, including an internal (or external) hard disk drive (HDD) 1114, a magnetic floppy disk drive (FDD) 1116 to read from or write to a removable magnetic disk 1118, and an optical disk drive 1120 to read from or write to a removable optical disk 1122 (e.g., a CD-ROM or DVD). The HDD 1114, FDD 1116 and optical disk drive 1120 can be connected to the system bus 1108 by an HDD interface 1124, an FDD interface 1126 and an optical drive interface 1128, respectively. The HDD interface 1124 for external drive implementations can include at least one or both of Universal Serial Bus (USB) and IEEE 1394 interface technologies.


The drives and associated computer-readable media provide volatile and/or nonvolatile storage of data, data structures, computer-executable instructions, and so forth. For example, a number of program modules can be stored in the drives and memory units 1110, 1112, including an operating system 1130, one or more application programs 1132, other program modules 1134, and program data 1136. In one embodiment, the one or more application programs 1132, other program modules 1134, and program data 1136 can include, for example, the various applications, operational blocks and/or components of the system 100, the system 400a, the system 400b, algorithmic architecture 500a, algorithmic architecture 500b, architecture 600a, architecture 600b, and/or flowcharts 700-1000.


A user can enter commands and information into the computer 1102 through one or more wire/wireless input devices, for example, a keyboard 1138 and a pointing device, such as a mouse 1140. Other input devices may include microphones, infra-red (IR) remote controls, radio-frequency (RF) remote controls, gamepads, stylus pens, card readers, dongles, fingerprint readers, gloves, graphics tablets, joysticks, keyboards, retina readers, touch screens (e.g., capacitive, resistive, etc.), trackballs, trackpads, sensors, styluses, and the like. These and other input devices are often connected to the processing unit 1104 through an input device interface 1142 that is coupled to the system bus 1108 but can be connected by other interfaces such as a parallel port, IEEE 1394 serial port, a game port, a USB port, an IR interface, and so forth.


A monitor 1144 or other type of display device is also connected to the system bus 1108 via an interface, such as a video adaptor 1146. The monitor 1144 may be internal or external to the computer 1102. In addition to the monitor 1144, a computer typically includes other peripheral output devices, such as speakers, printers, and so forth.


The computer 1102 may operate in a networked environment using logical connections via wire and/or wireless communications to one or more remote computers, such as a remote computer 1148. The remote computer 1148 can be a workstation, a server computer, a router, a personal computer, portable computer, microprocessor-based entertainment appliance, a peer device or other common network node, and typically includes many or all the elements described relative to the computer 1102, although, for purposes of brevity, only a memory/storage device 1150 is illustrated. The logical connections depicted include wire/wireless connectivity to a local area network (LAN) 1152 and/or larger networks, for example, a wide area network (WAN) 1154. Such LAN and WAN networking environments are commonplace in offices and companies, and facilitate enterprise-wide computer networks, such as intranets, all of which may connect to a global communications network, for example, the Internet.


When used in a LAN networking environment, the computer 1102 is connected to the LAN 1152 through a wire and/or wireless communication network interface or adaptor 1156. The adaptor 1156 can facilitate wire and/or wireless communications to the LAN 1152, which may also include a wireless access point disposed thereon for communicating with the wireless functionality of the adaptor 1156.


When used in a WAN networking environment, the computer 1102 can include a modem 1158, or is connected to a communications server on the WAN 1154 or has other means for establishing communications over the WAN 1154, such as by way of the Internet. The modem 1158, which can be internal or external and a wire and/or wireless device, connects to the system bus 1108 via the input device interface 1142. In a networked environment, program modules depicted relative to the computer 1102, or portions thereof, can be stored in the remote memory/storage device 1150. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers can be used.


The computer 1102 is operable to communicate with wire and wireless devices or entities using the IEEE 802 family of standards, such as wireless devices operatively disposed in wireless communication (e.g., IEEE 802.11 over-the-air modulation techniques). This includes at least Wi-Fi (or Wireless Fidelity), WiMax, and Bluetooth™ wireless technologies, among others. Thus, the communication can be a predefined structure as with a conventional network or simply an ad hoc communication between at least two devices. Wi-Fi networks use radio technologies called IEEE 802.11 (a, b, g, n, etc.) to provide secure, reliable, fast wireless connectivity. A Wi-Fi network can be used to connect computers to each other, to the Internet, and to wire networks (which use IEEE 802.3-related media and functions).


The various elements of the devices as previously described with reference to FIGS. 1-11 may include various hardware elements, software elements, or a combination of both. Examples of hardware elements may include devices, logic devices, components, processors, microprocessors, circuits, processors, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate array (FPGA), memory units, logic gates, registers, semiconductor device, chips, microchips, chip sets, and so forth. Examples of software elements may include software components, programs, applications, computer programs, application programs, system programs, software development programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. However, determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints, as desired for a given implementation.


While the present disclosure has been described with reference to one or more particular embodiments or implementations, those skilled in the art will recognize that many changes may be made thereto without departing from the spirit and scope of the present disclosure. Each of these implementations and obvious variations thereof is contemplated as falling within the spirit and scope of the present disclosure. It is also contemplated that additional implementations according to aspects of the present disclosure may combine any number of features from any of the implementations described herein.

Claims
  • 1. An apparatus, the apparatus comprising: a memory to store instructions; and processing circuitry, coupled with the memory, operable to execute the instructions, that when executed, cause the processing circuitry to: apply a hybrid machine learning model (MLM) to a plurality of data associated with one or more individuals; and compute a fall prediction score for the one or more individuals based in whole or in part on application of the hybrid MLM.
  • 2. The apparatus of claim 1, wherein the hybrid MLM includes both of i) a static component and ii) a dynamic component.
  • 3. The apparatus of claim 2, wherein the plurality of data includes: a first plurality of data associated with the one or more individuals and a second plurality of data associated with the one or more individuals, and wherein the static component determines one or more weights associated with the first plurality of data and the dynamic component determines another one or more weights associated with the second plurality of data.
  • 4. The apparatus of claim 2, wherein one or more outputs of the static component are inputted into the dynamic component.
  • 5. The apparatus of claim 3, wherein the first plurality of data includes at least one of: i) a medical diagnostic information associated with the one or more individuals and ii) a medication information associated with the one or more individuals.
  • 6. The apparatus of claim 3, wherein the second plurality of data includes at least one of: i) a demographic information associated with the one or more individuals, ii) a physical activity information associated with the one or more individuals, iii) a vital change information associated with the one or more individuals, and iv) a governmental information associated with the one or more individuals.
  • 7. The apparatus of claim 3, wherein the first plurality of data includes all of: i) a medical diagnostic information associated with the one or more individuals and ii) a medication information associated with the one or more individuals, and wherein the second plurality of data includes all of: i) a demographic information associated with the one or more individuals, ii) a physical activity information associated with the one or more individuals, iii) a vital change information associated with the one or more individuals, and iv) a governmental information associated with the one or more individuals.
  • 8. The apparatus of claim 3, wherein the processing circuitry is further configured to: generate an adjustment factor based on a previous fall history associated with the one or more individuals, and wherein the adjustment factor adjusts the fall prediction score.
  • 9. The apparatus of claim 8, wherein the adjustment factor is generated by an adjustment component distinct from both of the static component and the dynamic component.
  • 10. The apparatus of claim 1, wherein the processing circuitry is further configured to: assign a specific resource to an individual of the one or more individuals based on the fall prediction score.
  • 11. The apparatus of claim 10, wherein assignment of resources is further based on a medical profile of the individual.
  • 12. The apparatus of claim 1, wherein at least a portion of the plurality of data is based on one or more pre-determined fillable forms.
  • 13. The apparatus of claim 12, wherein the one or more pre-determined fillable forms contain a value range associated with a medical condition of the one or more individuals.
  • 14. A computer-implemented method, the method comprising: receiving, by one or more computer processors, a first plurality of medical data associated with a first plurality of individuals; and training, by the one or more computer processors, a machine learning model (MLM) based on the received first plurality of data, such that the trained MLM is able to process a second plurality of medical data associated with a second plurality of individuals and output a score associated with each of the second plurality of individuals, wherein the score is an estimation of a likelihood, respectively, that each of the second plurality of individuals will suffer a fall, and wherein at least a portion of the first plurality of data and a portion of the second plurality of data are based on one or more pre-determined fillable forms associated with a value range.
  • 15. The computer-implemented method of claim 14, wherein the MLM includes at least a static component and a dynamic component.
  • 16. A non-transitory computer-readable storage medium storing computer-readable program code executable by a processor to: apply a machine learning model (MLM) to a plurality of data associated with at least one individual; compute an initial fall risk score for the at least one individual based on the application of the MLM; and generate a final fall risk score by applying an adjustment factor to the initial fall risk score.
  • 17. The non-transitory computer-readable storage medium of claim 16, wherein the MLM includes a dynamic model and a static model.
  • 18. The non-transitory computer-readable storage medium of claim 17, wherein the plurality of data includes i) one or more diagnostic data associated with the at least one individual and ii) one or more medication data associated with the at least one individual.
  • 19. The non-transitory computer-readable storage medium of claim 17, wherein the dynamic model includes a neural network.
  • 20. The non-transitory computer-readable storage medium of claim 17, wherein one or more outputs of the static model are one or more inputs of the neural network.
  • 21. The non-transitory computer readable storage medium of claim 18, wherein the static model at least partially determines a first set of weights associated with the one or more diagnostic data and the one or more medication data, and wherein the dynamic model at least partially determines another one or more weights associated with the remaining plurality of data.
  • 22. The non-transitory computer-readable storage medium of claim 17, wherein the final fall risk score is generated by another static model that is distinct from the MLM.
  • 23. The non-transitory computer-readable storage medium of claim 22, wherein the another static model is based on a fall history of the at least one individual.
  • 24. An apparatus, the apparatus comprising: a memory to store instructions; and processing circuitry, coupled with the memory, operable to execute the instructions, that when executed, cause the processing circuitry to: receive a plurality of medical data associated with a plurality of individuals; compute a fall prediction score associated with each of the plurality of individuals based on the received medical data; and allocate one or more resources to each of the plurality of individuals based on the fall prediction computation and a medical profile of each of the plurality of individuals.
  • 25. The apparatus of claim 24, wherein the plurality of medical data includes one or more data from one or more predetermined forms, and wherein the one or more predetermined forms include a value range associated with a medical condition.
  • 26. The apparatus of claim 25, wherein the allocated one or more resources includes a brace.
  • 27. A method comprising: receiving a first plurality of data associated with one or more individuals; applying a hybrid machine learning algorithm to the first plurality of data; generating an initial score based on an application of the hybrid machine learning algorithm to the first plurality of data; generating a second score by applying an adjustment factor to the initial score, wherein the adjustment factor is based on a fall history data of the one or more individuals; receiving a second plurality of data associated with the one or more individuals; applying another algorithm to the second plurality of data associated with the one or more individuals; generating a third score based on application of the another algorithm; and generating a final score based on the second score and the third score, where the final score is a prediction as to whether or not the one or more individuals will sustain a fall.
  • 28. The method according to claim 27 further comprising: assigning a resource to prevent a fall to the one or more individuals based on the final score.
  • 29. The method according to claim 28, wherein the second plurality of data is clinical data associated with the one or more individuals.
  • 30. The method according to claim 29, wherein the hybrid machine learning model includes: i) one or more static models and ii) one or more dynamic models and the second algorithm includes one or more static models.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims benefit of co-pending U.S. provisional patent application Ser. No. 63/266,394 filed Jan. 4, 2022. The aforementioned related patent application is herein incorporated by reference in its entirety.

Provisional Applications (1)
Number Date Country
63266394 Jan 2022 US