Implementing Machine Learning For Life And Health Insurance Loss Mitigation And Claims Handling

Information

  • Patent Application
    20210256615
  • Publication Number
    20210256615
  • Date Filed
    September 20, 2018
  • Date Published
    August 19, 2021
Abstract
Techniques for implementing machine learning for insurance loss mitigation or prevention, and claims handling are disclosed. In some scenarios, the insurance loss mitigation and claims handling may be associated with a disability, worker's compensation, life or health insurance policy, and the machine-learning analytics model may be trained in accordance with data that is relevant to identifying appropriate predictions in accordance with these particular types of insurance products. For instance, the machine-learning analytics model may utilize information within a dynamic data set as training data, which may include electronically accessible information. The machine-learning analytics model may additionally be implemented to identify various predictions that are indicative of a risk of insuring an individual as well as one or more actions that, when performed, may reduce the initial calculation of risk.
Description
FIELD OF INVENTION

This disclosure generally relates to implementing machine learning as part of insurance risk assessment and, more particularly, to implementing machine learning to improve upon aspects of life, worker's compensation, disability and health insurance loss mitigation and prevention, and claims handling.


BACKGROUND

Insurance policies may typically require that the risk of insuring a particular person or property be evaluated as part of an initial underwriting process. The underwriting process may typically require an insurer to assess several variables to identify the overall risk of insuring a particular person or asset. Based upon this assessed risk, the insurer may then decide, for example, how much insurance to provide and/or the cost of premiums for a specific type of insurance and amount of insurance coverage. Moreover, once an insurance policy is issued, a claim may be made by an insured, which is then reviewed by a claims handler who partially approves, fully approves, or rejects the claim, allowing the insured to be reimbursed accordingly.


However, conventional underwriting and claims handling may typically rely on manual methods performed by an insurance underwriter and claims handler, respectively, which may be time-consuming, arduous, and prone to human error that may lead to an inaccurate assessment of risk. Furthermore, current insurance pricing and underwriting techniques may identify risks by looking at previous and current medical history, but may be unable to consistently and accurately predict and assess future risks, which may likewise prevent an insurer from effectively managing loss mitigation and prevention.


SUMMARY

The present disclosure generally relates to techniques for implementing machine learning for insurance loss mitigation and prevention, and claims handling. In particular, electronically accessible data may be analyzed that is relevant to specific types of insurance policies. For instance, for life and health insurance policies, as well as disability and worker's compensation policies, electronic medical records, demographic information, insurance records, lifestyle information, psychographic information, etc., may be collected to form part (or all) of a dynamic data set, which may change over time as additional information is collected and/or as additional users contribute to an overall data pool. In some cases, the information collected and/or analyzed may pertain only to humans. In some embodiments, the information collected and/or analyzed may pertain to domesticated animals (e.g., dogs, cats, thoroughbreds, etc.) and/or livestock.


Information contained within this dynamic data set may then be used to train one or more machine-learning analytics models, algorithms, or modules (and/or other artificial intelligence models, algorithms, or modules) such that, when a user requests information regarding a new or existing insurance product, his or her application information, or “user data,” may be analyzed in accordance with a trained machine-learning analytics model, algorithm, or module to predict certain risk variables that may be indicative of risk. These risk variables may include, for instance, predicted medical-related conditions that are likely to occur in accordance with the data analyzed via the trained machine-learning analytics model, algorithm, or module. From these risk variables, an initial risk assessment may be made, which may include a scaled risk score or other suitable indicator to quantify the risk of insuring the user given the likelihood, for example (in the case of a life or health insurance policy), of the various medical-related conditions occurring within some future time horizon that coincides with the insurance coverage.
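
By way of a non-limiting illustration only, the following Python sketch shows one hypothetical way to train per-condition classifiers on a dynamic data set and collapse the predicted medical-related conditions into a scaled risk score. The feature encoding, condition labels, numeric values, and the use of scikit-learn are illustrative assumptions and are not required by the present disclosure.

    # Hypothetical sketch: the feature encoding, condition labels, and model
    # family below are illustrative assumptions, not requirements herein.
    from sklearn.ensemble import RandomForestClassifier

    # Rows of the dynamic data set (hypothetical encoding):
    # [age, bmi, systolic_bp, exercise_hours_per_week]
    TRAINING_FEATURES = [
        [34, 22.1, 118, 5.0],
        [58, 31.4, 145, 0.5],
        [45, 27.8, 132, 2.0],
        [29, 24.0, 121, 4.0],
        [63, 29.9, 151, 1.0],
        [51, 26.2, 128, 3.5],
    ]
    # One binary label per user for each medical-related condition.
    CONDITION_LABELS = {
        "hypertension":    [0, 1, 1, 0, 1, 0],
        "type_2_diabetes": [0, 1, 0, 0, 1, 0],
    }

    def train_condition_models(features, labels_by_condition):
        """Train one classifier per predicted medical-related condition."""
        models = {}
        for condition, labels in labels_by_condition.items():
            model = RandomForestClassifier(n_estimators=50, random_state=0)
            model.fit(features, labels)
            models[condition] = model
        return models

    def predict_conditions(models, user_features):
        """Probability of each condition occurring for one applicant."""
        return {condition: float(model.predict_proba([user_features])[0][1])
                for condition, model in models.items()}

    def scaled_risk_score(condition_probabilities):
        """Collapse the predicted conditions into a 0-100 risk indicator."""
        mean_p = sum(condition_probabilities.values()) / len(condition_probabilities)
        return round(100.0 * mean_p, 1)

    models = train_condition_models(TRAINING_FEATURES, CONDITION_LABELS)
    applicant = [47, 30.2, 140, 1.0]          # "user data" from the application
    probabilities = predict_conditions(models, applicant)
    print(probabilities, scaled_risk_score(probabilities))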


Moreover, the machine-learning analytics models, algorithms, or modules (and/or other artificial intelligence models, algorithms, or modules) may be further implemented to identify one or more loss-mitigation, and/or loss-prevention, variables that represent a reduction in the initial risk assessment. For instance, the loss-mitigation, and/or loss-prevention, variables may include one or more intervening actions (e.g., a type and frequency of exercise, daily nutritional guidance, lifestyle habits, etc.) that, when executed by the user, reduce the initial risk assessment value to a new, reduced risk level. These loss-mitigation, and/or loss-prevention, variables may be shared with a user applying for a new insurance policy and/or renewing an existing policy along with calculated insurance premiums, which may reflect each respective level of risk. In this way, an insurer may mitigate the loss (and/or prevent losses from occurring) of insuring a particular person while providing options to the user to reduce a premium rate as long as the user complies with the actions, such as health improving actions, indicated by the various identified loss-mitigation, and/or loss-prevention, variables.
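
Continuing the illustration, the sketch below shows one hypothetical way to identify such loss-mitigation variables: each candidate intervening action adjusts the applicant's features, the risk model is re-applied, and only actions that lower the score are retained. The risk_score function is a toy stand-in for the trained machine-learning analytics model, and the candidate actions and their assumed feature effects are illustrative only.

    # Hypothetical sketch: risk_score is a toy stand-in for the trained model,
    # and each candidate action's assumed feature effect is illustrative only.
    def risk_score(features):
        """Toy 0-100 risk: higher BMI/blood pressure and less exercise raise it."""
        age, bmi, systolic_bp, exercise_hours = features
        raw = (0.4 * age + 1.5 * max(bmi - 25, 0)
               + 0.5 * max(systolic_bp - 120, 0) - 3.0 * exercise_hours)
        return max(0.0, min(100.0, raw))

    # Candidate intervening actions and the feature change each is assumed to
    # produce if the user executes it within the future time period.
    CANDIDATE_ACTIONS = {
        "exercise three times per week": lambda f: [f[0], f[1], f[2], f[3] + 3.0],
        "nutrition plan (reduce BMI by 2)": lambda f: [f[0], max(f[1] - 2.0, 18.5), f[2], f[3]],
        "blood pressure management program": lambda f: [f[0], f[1], max(f[2] - 10, 110), f[3]],
    }

    def identify_intervening_actions(features):
        """Keep only actions that reduce the first level of risk, best first."""
        first_level = risk_score(features)
        reductions = []
        for action, apply_action in CANDIDATE_ACTIONS.items():
            second_level = risk_score(apply_action(features))
            if second_level < first_level:
                reductions.append((action, first_level, second_level))
        return sorted(reductions, key=lambda r: r[2])

    applicant = [47, 30.2, 140, 1.0]
    for action, first, second in identify_intervening_actions(applicant):
        print(f"{action}: risk {first:.1f} -> {second:.1f}")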


Furthermore, upon insuring a user (via traditional insurance application techniques or those described herein designed to mitigate insurer loss), certain aspects described herein implement trained machine-learning analytics models, algorithms, or modules (and/or other artificial intelligence models, algorithms, or modules) to improve upon the claims handling process. In particular, some aspects include using data contained within the dynamic data set (e.g., insurer data such as previous claims history and insurance records) to streamline the claims handling process. These aspects may additionally or alternatively include predicting a claim amount or settled amount, and/or partially executing portions of electronic claims and/or performing other claims-based actions using these predictions.
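
As one hypothetical illustration of these claims-handling aspects, the sketch below trains a regression model on historical claims to predict a settled amount and routes a new claim either to automatic partial execution or to a claims handler. The claim fields, the approval threshold, and the use of scikit-learn are illustrative assumptions, not requirements of the disclosure.

    # Hypothetical sketch: claim fields, threshold, and model family are
    # illustrative assumptions only.
    from sklearn.ensemble import GradientBoostingRegressor

    # Historical claims from the dynamic data set (hypothetical encoding):
    # [claimed_amount, treatment_days, prior_claims_count] -> settled amount
    HISTORICAL_CLAIMS = [
        [1200.0, 2, 0], [5400.0, 10, 1], [800.0, 1, 0],
        [9800.0, 21, 3], [2500.0, 4, 1], [4300.0, 7, 0],
    ]
    SETTLED_AMOUNTS = [1100.0, 4900.0, 800.0, 7600.0, 2300.0, 4100.0]

    model = GradientBoostingRegressor(random_state=0)
    model.fit(HISTORICAL_CLAIMS, SETTLED_AMOUNTS)

    def handle_claim(claimed_amount, treatment_days, prior_claims_count,
                     auto_execute_ratio=0.9):
        """Predict the settled amount and route the claim accordingly."""
        predicted = float(model.predict([[claimed_amount, treatment_days,
                                          prior_claims_count]])[0])
        if predicted >= auto_execute_ratio * claimed_amount:
            route = "partially execute electronically"
        else:
            route = "refer to claims handler"
        return {"predicted_settlement": round(predicted, 2), "route": route}

    print(handle_claim(3000.0, 5, 1))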


In one aspect, a computer-implemented method for implementing a machine-learning analytics model, algorithm, or module (and/or other artificial intelligence model, algorithm, or module) to calculate a level of risk of insuring a user and/or how to reduce this risk may be provided. The method may include one or more processors and/or associated transceivers (1) accessing a dynamic data set associated with one or more users including electronic medical records, demographic information, insurance records, and/or lifestyle information; (2) training a machine-learning analytics model, algorithm, or module (and/or other artificial intelligence model, algorithm, or module) using the dynamic data set as training data to generate a trained machine-learning analytics model, algorithm, or module (and/or other trained artificial intelligence model, algorithm, or module); (3) receiving user data associated with a user; (4) applying the trained machine-learning analytics model, algorithm, or module (and/or other trained artificial intelligence model, algorithm, or module) to the user data to predict one or more medical-related conditions associated with the user based upon the user data; (5) determining, in accordance with the trained machine-learning analytics model, algorithm, or module (and/or other trained artificial intelligence model, algorithm, or module), a first level of risk associated with insuring the user based upon the one or more predicted medical-related conditions; (6) identifying, in accordance with the trained machine-learning analytics model, algorithm, or module (and/or other trained artificial intelligence model, algorithm, or module), one or more intervening actions that, when executed by the user within a future time period, reduce the first level of risk associated with insuring the user to a second level of risk; and/or (7) transmitting the one or more intervening actions to a computing device to be presented to the user. The method may include additional, less, or alternate actions, including those discussed elsewhere herein.
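
Purely as a non-limiting illustration, the following compact Python sketch maps steps (1)-(7) of the method onto stand-in helpers; every helper is a toy placeholder for the corresponding component described elsewhere herein, and all values are illustrative.

    # Hypothetical end-to-end orchestration of steps (1)-(7); all helpers and
    # numbers are toy stand-ins, not the claimed implementation.
    def access_dynamic_data_set():                      # step (1)
        return [{"age": 58, "bmi": 31.4, "hypertension": 1},
                {"age": 29, "bmi": 24.0, "hypertension": 0}]

    def train_model(data_set):                          # step (2)
        positives = [d for d in data_set if d["hypertension"]]
        return {"mean_bmi": sum(d["bmi"] for d in positives) / len(positives)}

    def predict_conditions(model, user):                # step (4)
        return {"hypertension": user["bmi"] >= model["mean_bmi"] - 2}

    def first_level_of_risk(conditions):                # step (5)
        return 70 if conditions["hypertension"] else 20

    def intervening_actions(risk):                      # step (6)
        return [("join wellness program", max(risk - 25, 0))] if risk > 50 else []

    def transmit(payload):                              # step (7)
        print("to client device:", payload)

    model = train_model(access_dynamic_data_set())      # steps (1)-(2)
    user_data = {"age": 47, "bmi": 30.2}                # step (3): user data received
    risk = first_level_of_risk(predict_conditions(model, user_data))
    transmit(intervening_actions(risk))                 # reduced level shown per action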


In another aspect, a computing device for implementing a machine-learning analytics model (and/or other artificial intelligence model, algorithm, or module) to calculate a level of risk of insuring a user and/or how to reduce this risk may be provided. The computing device may include a communication unit configured to access a dynamic data set associated with one or more users including electronic medical records, demographic information, insurance records, and/or lifestyle information, and to receive user data associated with a user. Additionally, the computing device may include a processing unit that is configured to (1) train a machine-learning analytics model using the dynamic data set as training data to generate a trained machine-learning analytics model; (2) apply the trained machine-learning analytics model to the user data to predict a set of one or more medical-related conditions associated with the user; (3) determine a first level of risk associated with insuring the user based upon the one or more predicted medical-related conditions in accordance with the trained machine-learning analytics model; and/or (4) identify one or more intervening actions in accordance with the trained machine-learning analytics model that, when executed by the user within a future time period, reduce the first level of risk associated with insuring the user to a second level of risk. Moreover, the communication unit may be further configured to transmit the one or more intervening actions to a computing device to be presented to the user. The computing device may include additional, less, or alternate components, including those discussed elsewhere herein.


In yet another aspect, a non-transitory computer readable media may be provided to calculate a level of risk of insuring a user and/or how to reduce this risk. The instructions stored on the non-transitory computer readable media may, when executed by one or more processors, cause the one or more processors to: (1) access a dynamic data set associated with one or more users including electronic medical records, demographic information, insurance records, and/or lifestyle information; (2) train a machine-learning analytics model (and/or other artificial intelligence model, algorithm, or module) using the dynamic data set as training data to generate a trained machine-learning analytics model; (3) receive user data associated with a user; (4) apply the trained machine-learning analytics model to the user data to predict one or more medical-related conditions associated with the user based upon the user data; (5) determine, in accordance with the trained machine-learning analytics model, a first level of risk associated with insuring the user based upon the one or more predicted medical-related conditions; (6) identify, in accordance with the trained machine-learning analytics model, one or more intervening actions that, when executed by the user within a future time period, reduce the first level of risk associated with insuring the user to a second level of risk; and/or (7) transmit the one or more intervening actions to a computing device to be presented to the user. The non-transitory computer readable media may include additional, less, or alternate instructions stored thereon, including those discussed elsewhere herein.


Advantages will become more apparent to those skilled in the art from the following description of the preferred embodiments which have been shown and described by way of illustration. As will be realized, the present embodiments may be capable of other and different embodiments, and their details are capable of modification in various respects. Accordingly, the drawings and description are to be regarded as illustrative in nature and not as restrictive.





BRIEF DESCRIPTION OF THE DRAWINGS

The Figures described below depict various aspects of the system and methods disclosed therein. It should be understood that each Figure depicts one embodiment of a particular aspect of the disclosed system and methods, and that each of the Figures is intended to accord with a possible embodiment thereof. Further, wherever possible, the following description refers to the reference numerals included in the following Figures, in which features depicted in multiple Figures are designated with consistent reference numerals.


There are shown in the drawings arrangements which are presently discussed, it being understood, however, that the present embodiments are not limited to the precise arrangements and instrumentalities shown, wherein:



FIG. 1 illustrates a block diagram of an exemplary computer system 100 implementing machine-learning for insurance loss mitigation and claims handling, in accordance with certain aspects of the present disclosure;



FIG. 2 illustrates a block diagram of an exemplary machine-learning analytics engine 200, in accordance with certain aspects of the present disclosure;



FIG. 3 illustrates an exemplary data set 300 including a dynamic data set and user data, in accordance with certain aspects of the present disclosure;



FIG. 4 depicts an exemplary artificial neural network 400, which may be trained by the machine-learning analytics engine 200 of FIG. 2, in accordance with certain aspects of the present disclosure;



FIG. 5 depicts an exemplary neuron 500 that may correspond to the neuron labeled as “1,1” in hidden layer 404-1 of FIG. 4, in accordance with certain aspects of the present disclosure;



FIG. 6 depicts text-based content of an exemplary electronic claim record 600 that may be processed by a machine-learning analytics engine, in accordance with certain aspects of the present disclosure; and



FIG. 7 illustrates an exemplary computer-implemented method flow 700, in accordance with certain aspects of the present disclosure.





The Figures depict preferred embodiments for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the systems and methods illustrated herein may be employed without departing from the principles of the invention described herein.


DETAILED DESCRIPTION

Artificial Intelligence System for Life & Health Insurance


The present embodiments are directed to, inter alia, machine learning via training a model using electronically accessible data that is associated with various users and/or is relevant to particular types of insurance, such as health and life insurance, for example. The electronically accessible data may include information that changes over time as additional data is collected, aggregated, accessed, and/or retrieved, and thus may be considered a dynamic data set in that it may periodically or continuously change over time.


Techniques are disclosed to train a machine-learning analytics model using various portions of the dynamic data set as inputs, and may include re-training the machine-learning analytics model as information within the dynamic data set changes, thus adapting and improving over time as additional information is received and processed. This machine-learning analytics model, once trained, may then be applied to user data received from one or more users seeking to purchase insurance coverage (e.g., health or life insurance). The machine-learning analytics model may be executed, for example, on an external computing device that receives user data from various computing devices utilized by one or more users requesting insurance coverage and/or from internal insurer systems (e.g., via an agent). Thus, the machine-learning analytics model may automate and improve upon the efficiency and accuracy of existing insurance loss mitigation and prevention, and claims handling processes, in accordance with aspects of the present disclosure. The techniques disclosed herein may allow for real-time loss mitigation, and may be less costly because machines are capable of working around the clock, do not take vacation or observe holidays, and are able to more comprehensively use large volumes (e.g., terabytes or more) of empirical data than are humans.
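
As a non-limiting illustration of this re-training aspect, the sketch below re-trains only when the dynamic data set has changed since the last training pass; the record-count check and the toy_train helper are illustrative stand-ins for the full training flow.

    # Hypothetical sketch: re-train only when the dynamic data set has changed;
    # the record-count check and toy_train helper are illustrative stand-ins.
    class RetrainingScheduler:
        def __init__(self, train):
            self.train = train                 # callable: records -> trained model
            self.trained_record_count = 0
            self.model = None

        def maybe_retrain(self, dynamic_data_set):
            """Re-train whenever new information has been added or removed."""
            if len(dynamic_data_set) != self.trained_record_count:
                self.model = self.train(dynamic_data_set)
                self.trained_record_count = len(dynamic_data_set)
            return self.model

    def toy_train(records):
        return {"trained_on": len(records)}

    scheduler = RetrainingScheduler(toy_train)
    data = [{"user": 1}, {"user": 2}]
    print(scheduler.maybe_retrain(data))       # initial training pass
    data.append({"user": 3})                   # new information arrives over time
    print(scheduler.maybe_retrain(data))       # model is re-trained and "learns"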


The dynamic data set, portions of which (or the entirety of which) may function as inputs to the machine-learning analytics model, may be obtained from various sources. For instance, the dynamic data set may include information harvested from historical claims, electronic medical records, demographic information, insurance records, surveys, collected lifestyle information, application data, etc. Other inputs to the machine-learning analytics model (which may be additionally or alternatively included as part of the dynamic data set) may include health-related data received from fitness trackers, health-related software applications such as weight loss applications, activity loggers, physical sensors (e.g., heart rate monitors, blood pressure monitors, thermometers, weight scales, glucose monitors, baby monitors, pregnancy monitors, sleep monitors, etc.), social media, etc.
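
By way of illustration only, the sketch below assembles one user's slice of the dynamic data set from several of the sources named above; the source names and fields are hypothetical.

    # Hypothetical sketch: the source names and fields below are illustrative.
    EMR_SOURCE = {"user-42": {"bmi": 30.2, "systolic_bp": 140}}
    FITNESS_TRACKER_SOURCE = {"user-42": {"avg_daily_steps": 4200, "resting_hr": 74}}
    INSURANCE_RECORDS_SOURCE = {"user-42": {"prior_claims": 2}}

    def build_dynamic_record(user_id, *sources):
        """Merge whatever each data source knows about the user into one record."""
        record = {"user_id": user_id}
        for source in sources:
            record.update(source.get(user_id, {}))
        return record

    print(build_dynamic_record("user-42", EMR_SOURCE,
                               FITNESS_TRACKER_SOURCE, INSURANCE_RECORDS_SOURCE))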


Thus, the present aspects may facilitate predicting new loss-mitigation variables (and/or loss-prevention variables) that allow an insurer to better mitigate (or prevent) loss by identifying present and future risks that would otherwise not be foreseeable using traditional underwriting. The present aspects may dynamically characterize the risk of providing health and/or life insurance to new applicants, and/or dynamically update pricing models as new information is collected and the machine-learning analytics model is re-trained. As a result, the present aspects allow an insurer to improve upon the accuracy and efficiency in how the insurance underwriting process assesses certain risks, identifies variables to mitigate these risks, prices insurance policies to accurately reflect those risks, and/or streamlines the overall insurance claims process or customer experience, in particular with regard to life and health insurance.


Computer System Overview



FIG. 1 illustrates a block diagram of an exemplary computer system 100 implementing machine-learning for insurance loss mitigation and claims handling, in accordance with certain aspects of the present disclosure. The high-level architecture may include both hardware and software applications, as well as various data communication channels for communicating data between the various hardware and software components. Generally, the system 100 may automatically monitor data (which may be dynamically occurring) associated with various electronic records, data sources, and/or users, and use this data set to implement the various machine-learning implementations discussed herein to facilitate improvements to the insurance loss mitigation and prevention, and claims handling process.


In the present aspect, the computer system 100 may include one or more client devices 102, one or more back-end computing devices 120, one or more health institutions 150, and/or a communication network 180. The system 100 may further include any suitable number X of back-end computing devices 120, which may include back-end computing devices 120.1-120.X, for example. The system 100 may include additional, less, or alternate components, including those discussed elsewhere herein.


For the sake of brevity, the system 100 is illustrated as including a single client device 102, three back-end computing devices 120, two health institutions 150, and a single communication network 180. However, the aspects described herein may include any suitable number of such components. For example, back-end computing devices 120 may include several hundred components, including a server configured to communicate with several client devices 102, each of which may be operated by a separate user. Moreover, several back-end computing devices 120 may receive data from each separate client device 102, from several health institutions 150, and/or transmit data to each separate client device 102 and several health institutions 150, as further discussed herein.


To provide another example, one or more of back-end computing devices 120 may receive user data from one or more client devices 102 such that insurance policy pricing may be calculated and transmitted to each client device 102, where it is then displayed to each respective user. To provide additional examples, client device 102 may represent one client device from several different client devices for the same user or for different users. For example, client device 102 may represent a user's smartphone as well as a user's desktop computer, each of which may collect and communicate with one or more health institutions 150 and/or one or more back-end computing devices 120, as further discussed below.


Communication network 180 may be configured to facilitate communications between one or more client devices 102, one or more health institutions 150, and/or one or more back-end computing devices 120 using any suitable number of wired and/or wireless links, which may be represented as links 117.1-117.3, for example. For example, communication network 180 may include any suitable number of nodes, radio frequency links, wireless or digital communication channels, additional wired and/or wireless networks that may facilitate one or more landline connections, internet service provider (ISP) backbone connections, satellite links, public switched telephone network (PSTN), etc.


To provide additional examples, the present aspects include communication network 180 being implemented, for example, as a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), or any suitable combination of local and/or external network connections. To provide further examples, communications network 180 may include wired telephone and cable hardware, satellite, cellular phone communication networks, base stations, macrocells, femtocells, etc. In the present aspects, communication network 180 may provide one or more client devices 102 with connectivity to network services, such as Internet services, for example, and/or support application programming interface (API) calls between one or more client devices 102, one or more health institutions 150, and/or one or more back-end computing devices 120.


One or more health institutions 150 may include any suitable number and/or type of health institutions that are associated with various medical records. For example, one or more health institutions 150 may include hospitals, physician offices, dentists, mental health providers, pediatric care facilities, emergency care facilities, psychiatric care providers, imaging (e.g., X-ray, ultrasound, MRI, CT) facilities, chiropractors, therapists, nurses, pharmacists, dieticians, laboratories, etc. In various aspects, one or more users (e.g., a user associated with client device 102) may have electronic medical records that are created by one or more health institutions 150 (e.g., via medical staff entering or maintaining electronic medical records), stored locally at the one or more health institutions 150, or otherwise stored on one or more suitable storage devices and accessible via the one or more health institutions 150. Thus, each user's electronic medical records may be held at a single institution (or accessible storage device) that is part of or accessed by the one or more health institutions 150, or spread out across several different health institutions 150 or storage devices. In this way, the health institutions 150 may function, in some instances, as gateways of medical records such that the provisions of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) may be met by obtaining user authorization to access such information.


In one aspect, electronic medical records held at one or more health institutions 150 may be accessible via a secure connection to communication network 180, for example, by client device 102 and/or one or more back-end computing devices 120. For example, one or more health institutions 150 may provide online services that allow a user to access her accounts using client device 102 and/or another suitable computing device. To provide another example, a user may authorize a third party (e.g., an insurer) upon applying or requesting information (and providing proof of user consent) regarding health and/or life insurance policies to access electronic medical records associated with one or more health institutions 150.


In any event, upon receipt of a valid and authenticated request for medical record data, one or more health institutions 150 may transmit medical-related information to client device 102 and/or one or more back-end computing devices 120. The medical-related information may include any suitable data relevant to assessing the current and future risk of insuring a user for various insurance products (e.g., life and health insurance), and/or how to reduce this risk to mitigate insurer loss. For example, the medical-related data transmitted by one or more health institutions 150 may include individual and/or family medical history, details regarding specific medical procedures (e.g., when, how much, where treated), information regarding a user's current health from previous checkups, whether the user and/or the user's family members have any congenital defects, whether the user or the user's family members have been diagnosed with a particular disease or condition, in addition to genomic information (e.g., raw data from consumer-grade and/or personalized medicine genetic testing), etc.


In various aspects, back-end computing devices 120 may include any suitable number and/or type of components configured to receive, send, store, and/or analyze data to facilitate the functionality performed via the various embodiments as described herein. For example, as shown in FIG. 1, back-end computing devices 120 may include one or more machine-learning analytics engines 120.1, one or more databases 120.2, and/or one or more database servers 120.X.


In the present aspects, machine-learning analytics engine 120.1 may be configured to access data from and/or store data to one or more additional data sources that may be included as one or more of back-end computing devices 120. Additionally or alternatively, the machine-learning analytics engine 120.1 may access data from one or more health institutions 150 and/or data provided by one or more users associated with one or more client devices 102. In various aspects, any combination and/or subset of this aforementioned data may form a dynamic data set that changes over time as additional data is collected, which may be stored and/or updated in one or more back-end components 120 and/or accessed by the machine-learning analytics engine 120.1. For example, machine-learning analytics engine 120.1 may use any suitable portion of the dynamic data set as training data to train a machine-learning analytics model.


Once the machine-learning analytics model is trained in this way, the machine-learning analytics model may be applied to received user data to predict various medical-related conditions associated with a new or updated life or health insurance policy. Moreover, once such predictions are made, aspects include the machine-learning analytics engine 120.1 determining an initial level of risk associated with insuring the user based upon the one or more predicted medical-related conditions for a particular life or health insurance policy as part of an artificial intelligence (AI) driven underwriting process. The machine-learning analytics engine 120.1 may then identify one or more intervening actions that, when executed by the user within a future time period, reduce the initial level of risk associated with insuring the user to a second level of risk. Moreover, the machine-learning analytics engine 120.1 may calculate pricing (e.g., premiums) associated with the initial and the reduced level of risk, and transmit this information and/or the one or more intervening actions to a computing device (e.g., client device 102) for presentation to the user.
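
As one hypothetical illustration of the pricing step described above, the sketch below maps the initial and reduced risk levels to monthly premiums; the base rate and loading factor are illustrative assumptions, not actuarial guidance.

    # Hypothetical sketch: base rate and per-point loading are illustrative only.
    def monthly_premium(risk_score, base_rate=25.0, loading_per_point=1.8):
        """Map a 0-100 risk level to an illustrative monthly premium."""
        return round(base_rate + loading_per_point * risk_score, 2)

    first_level_of_risk = 62.0     # initial AI-driven underwriting assessment
    second_level_of_risk = 41.0    # if the intervening actions are executed
    print(monthly_premium(first_level_of_risk))    # premium at the initial risk
    print(monthly_premium(second_level_of_risk))   # reduced premium offered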


To provide additional examples, as shown in FIG. 1, the dynamic data set used to train the machine-learning analytics model may be accessed via the one or more back-end computing devices 120 and/or any other suitable data sources. For instance, the dynamic data set may include data associated with third-party data providers that may not be readily obtained from the user (e.g., via communications with client device 102), and/or from the health institutions 150. For example, the additional data sources may include information such as data mined from social media (which may detail lifestyle and activities, and sporting, fitness, and eating habits), data provided by the insurer (e.g., insurance claim or other history known by the insurer or the user's previous insurers), psychographic information, demographic information, lifestyle information, etc. In other words, the dynamic data set may include, for example, any suitable type of information that is relevant to the calculation of an initial and reduced level of risk and/or the identification of intervening actions taken to reduce the initial level of assessed risk of insuring the user and, in turn, relevant to calculating pricing for a specific type of insurance product.


For example, a user's demographic information may include any suitable type of information that is relevant to the calculation of risks and/or actions to reduce this assessed risk, which may be used to calculate insurance pricing for a particular user and for a particular insurance product. For instance, the demographic information may include a user's age or age bracket, gender, marital status, household size, name and address, a particular region where the user currently lives (e.g., a city, state, zip code, county, etc.), whether the user has any children, total household income, languages spoken, whether the user owns a home, whether the user rents, level of education, employment status, number and type of vehicles (or other assets) owned, where the user works, etc.


In one embodiment, a severe influenza outbreak or epidemic may occur, and a machine learning model may be trained using historical medical claims data to determine those current patients or users who are most at risk. A second model may be used, in conjunction with multiple mitigation approaches (e.g., therapies and/or medications) to determine the relative effectiveness of the approaches. Effectiveness may be measured by comparing a patient's first condition to the same patient's second condition with respect to one or more ailments and/or medical/lifestyle conditions. Each condition may be assigned a weight, which may improve or worsen over time in response to the patient taking action. Effectiveness may be measured with respect to each weight, or aggregated to form an overall effectiveness score. Future mitigation advice may be modified based upon A/B testing of loss mitigation approaches, for example, based upon whether diet, exercise, or some combination of the two was more successful for patients of a particular type.
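
By way of a non-limiting illustration, the sketch below computes the weighted, per-condition effectiveness measure described above and compares two mitigation approaches across hypothetical A/B cohorts; the condition names, weights, and scores are illustrative assumptions.

    # Hypothetical sketch: condition names, weights, and scores are illustrative;
    # lower condition scores indicate improvement.
    CONDITION_WEIGHTS = {"respiratory": 0.5, "cardiovascular": 0.3, "lifestyle": 0.2}

    def effectiveness(first_condition, second_condition, weights=CONDITION_WEIGHTS):
        """Weighted improvement between a patient's first and second condition,
        aggregated into an overall effectiveness score (positive = improved)."""
        return sum(weights[c] * (first_condition[c] - second_condition[c])
                   for c in weights)

    # Cohort A followed a diet plan, cohort B an exercise plan (A/B test).
    cohort_a = [({"respiratory": 7, "cardiovascular": 5, "lifestyle": 6},
                 {"respiratory": 5, "cardiovascular": 4, "lifestyle": 4})]
    cohort_b = [({"respiratory": 7, "cardiovascular": 6, "lifestyle": 6},
                 {"respiratory": 5, "cardiovascular": 4, "lifestyle": 3})]

    def mean_effectiveness(cohort):
        return sum(effectiveness(first, second) for first, second in cohort) / len(cohort)

    print("diet:", mean_effectiveness(cohort_a),
          "exercise:", mean_effectiveness(cohort_b))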


Additionally, a user's psychographic information may also include any suitable type of information that is relevant to the calculation of risks and/or actions to reduce this assessed risk, which may be used to calculate insurance pricing for a particular user and for a particular insurance product. For instance, the psychographic information may include a level of risk tolerance in general and/or for various specific types of activities, aspects of the user's personality, values, opinions, attitudes, interests, lifestyles, etc.


To provide yet another example, a user's lifestyle information may also include any suitable type of information that is relevant to the calculation of risks and/or actions to reduce this assessed risk, which may be used to calculate insurance pricing for a particular user and for a particular insurance product. For instance, the lifestyle information may include data received from fitness trackers, data received via connected (e.g., smart) medical devices, data indicative of a user's frequency and intensity of exercise, information indicative of a user's diet such as caloric intake and/or nutritional information, etc.


In one aspect, the machine-learning analytics engine 120.1 may be implemented as any suitable number and/or type of computing device (e.g., one or more computer servers) configured to communicate with other components such as other back-end components 120.2-120.X, one or more client devices 102, and/or one or more health institutions 150 (or suitable databases and/or storage devices associated therewith), etc. In various aspects, machine-learning analytics engine 120.1 may be configured to process application programming interface (API) service calls, to support one or more applications installed on one or more client devices 102, etc., the details of which are further discussed below.


Furthermore, certain aspects described herein allocate the calculations and functionality for executing the machine-learning based loss mitigation and prevention, and claims handling primarily to machine-learning analytics engine 120.1, for ease of explanation. However, aspects include machine-learning analytics engine 120.1 working in conjunction with any suitable number of other components of system 100 (or others not shown in FIG. 1) to facilitate the functionality associated with the aspects of the disclosure as described herein. For example, machine-learning analytics engine 120.1 may work in conjunction with other servers, databases, cloud-based servers, etc., included as part of the one or more back-end computing devices 120. To provide another example, machine-learning analytics engine 120.1 may work in conjunction with client device 102. Additionally or alternatively, one or more functions described herein with respect to machine-learning analytics engine 120.1 may also be performed via one or more client devices 102, which similarly may work in conjunction with one or more of back-end computing devices 120.


Database 120.2 may include one or more storage devices configured to collect, store, delete, update, and/or modify data in accordance with one or more instructions received from one or more other back-end components 120, one or more client devices 102, and/or one or more health institutions 150. For example, database 120.2 may include any suitable combination of one or more storage mediums such as hard disk drives, solid state memory, cloud-based storage devices, etc. In various aspects, database 120.2 may store data in addition to or instead of data stored locally by machine-learning analytics engine 120.1. In doing so, machine-learning analytics engine 120.1, database 120.2, and/or other back-end components 120 may store any suitable type of data used to facilitate the various functionalities of certain aspects as described herein.


Examples of the data stored among the various components of one or more back-end components 120 include information contained in the dynamic data sets as discussed herein and/or subsets of information included in the dynamic data sets, insurance plan information, one or more intervening actions to reduce the initial assessed level of risk, logs or monitored data used to determine whether a user has executed or will execute the various intervening actions, executable code, algorithms, instructions, etc., used to train, re-train, and otherwise execute the machine-learning analytics model, other calculations as discussed herein, etc. Moreover, data stored in database 120.2 (and/or one or more other back-end components 120) may be accessed via client device 102 and/or the one or more health institutions 150 as needed.


In some aspects, data stored in database 120.2 (and/or one or more other back-end components 120) may include private or confidential information such as electronic medical records, insurance-related information (e.g., a history of insurance claims and/or retrieved insurance information maintained for one or more users, etc.). Thus, some aspects include utilizing secure data storage and access procedures when data is written to or retrieved from database 120.2 (and/or one or more other back-end components 120) via machine-learning analytics engine 120.1. These procedures may include, for example, secure login and authentication procedures and/or the encryption of data stored in database 120.2 (and/or one or more other back-end components 120).


Database server 120.X may be configured as any suitable number and/or type of storage devices to perform substantially similar functions as machine-learning analytics engine 120.1. In some embodiments, machine-learning analytics engine 120.1 and database server 120.X may be implemented as a single device, and thus both machine-learning analytics engine 120.1 and database server 120.X may not be present in some aspects. But in other aspects, database server 120.X may perform dedicated database operations, while machine-learning analytics engine 120.1 may perform communication and analytical-based functions.


For example, machine-learning analytics engine 120.1 may handle communications with client computing devices 102, one or more health institutions 150, and/or one or more other back-end components 120, and perform calculations related to training, re-training, and executing a machine-learning analytics model. Continuing this example, in such a case, database server 120.X may facilitate the receipt of data included in the dynamic data set that is used to train and re-train the machine-learning analytics model. For example, as new information is received over time, database server 120.X may append, substitute, update, or otherwise modify the information contained within the dynamic data set so that it remains up-to-date, allowing the machine-learning analytics model to “learn” over time.
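
As a hypothetical illustration of this append/substitute/update behavior, the sketch below upserts newly received information into the dynamic data set keyed by user, so the next re-training pass sees the current values; the keys and fields are illustrative.

    # Hypothetical sketch of append/substitute/update; keys and fields are
    # illustrative only.
    def upsert(dynamic_data_set, user_id, new_information):
        """Append a record for a new user, or update fields for an existing one."""
        record = dynamic_data_set.setdefault(user_id, {"user_id": user_id})
        record.update(new_information)
        return dynamic_data_set

    data_set = {}
    upsert(data_set, "user-42", {"bmi": 31.0})
    upsert(data_set, "user-42", {"bmi": 29.4, "resting_hr": 72})  # newer reading wins
    print(data_set)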


To provide another example, database 120.2 (and/or one or more other back-end components 120) may store user information, logon credentials, and contact information for one or more users. Machine-learning analytics engine 120.1 may access the information contained within the dynamic data set stored in database 120.2 (and/or one or more other back-end components 120) to correlate data received from various data sources to a particular user to facilitate the application of the machine-learning analytics model for that user.


In the present aspects, client device 102 may include a processing unit 104, a communication unit 106, a user interface 108, a display 110, and a memory unit 114. Client device 102 may include additional, less, or alternate components, including those discussed elsewhere herein. In various aspects, client device 102 may be implemented as any suitable computing device configured to receive user input, display information, and/or communicate with other components of the system 100. For example, client device 102 may be implemented as a smartphone or other suitable mobile computing device. To provide additional examples, client device 102 may be implemented as a personal digital assistant (PDA), a desktop computer, a tablet computer, a laptop computer, a phablet, a GNSS-enabled device, a smart watch, smart glasses, a smart bracelet, wearable electronics, a pager, a computing device configured for wireless communication, etc.


Client device 102 may be configured to communicate using any suitable number and/or type of communication protocols, such as Wi-Fi, cellular, BLUETOOTH, NFC, RFID, Internet Protocols, etc. For example, client device 102 may be configured to communicate with communication network 180 using a cellular communication protocol to send data to and/or receive data from the one or more health institutions 150 and/or the one or more back-end computing devices 120 via communication network 180 using one or more communication links, such as links 117.1-117.3, for example.


To this end, communication unit 106 may be configured to facilitate data communications between client device 102 and one or more of communication network 180, one or more health institutions 150, and/or one or more back-end computing devices 120 in accordance with any suitable number and/or type of communication protocols. In the present aspects, communication unit 106 may be configured to facilitate data communications based upon the particular component and/or network with which client device 102 is communicating.


Such communications may facilitate the transmission of user data collected from client device 102, which is then utilized by machine-learning analytics engine 120.1 in accordance with the execution of the trained machine-learning analytics model to predict a set of one or more medical-related conditions associated with the user, to determine an initial level of risk associated with insuring the user in accordance with the predicted medical-related conditions, to identify one or more intervening actions to reduce the first level of risk, to calculate insurance premiums, etc., as further discussed herein.


In the present aspects, communication unit 106 may be implemented with any suitable combination of hardware and/or software to facilitate this functionality. For example, communication unit 106 may be implemented with any suitable number and type of wired and/or wireless transceivers, network interfaces, physical layers (PHY), ports, antennas, etc.


User interface 108 may be configured to facilitate user interaction with client device 102. For example, user interface 108 may include a user-input device such as an interactive portion of display 110 (e.g., a “soft” keyboard displayed on display 110), an external hardware keyboard configured to communicate with client device 102 via a wired or a wireless connection (e.g., a BLUETOOTH keyboard), an external mouse, or any other suitable user-input device.


Display 110 may be implemented as any suitable type of display that may facilitate user interaction, such as a capacitive touch screen display, a resistive touch screen display, etc. In various aspects, display 110 may be configured to work in conjunction with user-interface 108 and/or processing unit 104 to detect user inputs upon a user selecting a displayed interactive icon or other graphic, to identify user selections of objects displayed via display 110, to display notifications and/or pricing information for specific insurance products, etc.


Processing unit 104 may be implemented as any suitable type and/or number of processors, such as a host processor for the relevant device in which client device 102 is implemented, for example. Processing unit 104 may be configured to communicate with one or more of communication unit 106, user interface 108, display 110, and/or memory unit 114 to send data to and/or to receive data from one or more of these components.


For example, processing unit 104 may be configured to communicate with memory unit 114 to store data to and/or to read data from memory unit 114. In accordance with various embodiments, memory unit 114 may be a computer-readable non-transitory storage device, and may include any suitable combination of volatile (e.g., a random access memory (RAM)), or non-volatile memory (e.g., battery-backed RAM, FLASH, etc.). In the present aspects, memory unit 114 may be configured to store instructions executable by processing unit 104. These instructions may include machine readable instructions that, when executed by processing unit 104, cause processing unit 104 to perform various acts.


In the present aspects, insurer application 115 is a portion of memory unit 114 configured to store instructions that, when executed by processing unit 104, cause processing unit 104 to perform various acts in accordance with applicable aspects as described herein. In certain aspects, a user may utilize insurer application 115 (or other suitable component(s) of client device 102) to begin the process of requesting information regarding various insurance products, such as purchasing new life or health insurance policies or updating existing ones. This may be implemented, for example, upon launching insurer application 115 to facilitate communications with machine-learning analytics engine 120.1 and/or other suitable components.


For example, instructions stored in insurer application 115 may facilitate processing unit 104 performing functions such as displaying various prompts in accordance with an insurer-based application. This may include, for instance, prompts regarding various types of insurance products for which a user desires to obtain pricing information and/or the details associated with such insurance products. Insurer application 115 may thus facilitate the collection of portions of, or the entirety of, user data in conjunction with user interface 108, which may then be transmitted to the one or more back-end components 120 via communication unit 106. Insurer application 115 may also facilitate a user consenting to the insurer accessing electronic medical records or other sensitive information, as well as presenting information associated with the calculated insurance premium pricing via display 110.


In some aspects, insurer application 115 may reside in memory unit 114 as a default application bundle that may be included as part of the operating system (OS) utilized by client device 102. But in other aspects, insurer application 115 may be installed on client device 102 as one or more downloads, such as an executable package installation file downloaded from a suitable application source via a connection to the Internet or other suitable device, network, external memory storage device, etc.


For example, insurer application 115 may be stored in any suitable portions of memory unit 114 upon installation of a package file downloaded in such a manner. Examples of package download files may include downloads via the iTunes store, the Google Play Store, the Windows Phone Store, a package installation file downloaded from another computing device, etc. Once downloaded, insurer application 115 may be installed on client device 102 as part of an installation package such that, upon installation of insurer application 115, memory unit 114 may store executable instructions such that, when executed by processing unit 104, cause client device 102 to implement the various functions of the aspects as described herein.


Exemplary Machine-Learning Analytics Engine



FIG. 2 illustrates a block diagram of an exemplary machine-learning analytics engine 200, in accordance with aspects of the present disclosure. In one aspect, machine-learning analytics engine 200 may be an implementation of machine-learning analytics engine 120.1, as shown and discussed with respect to FIG. 1. In the present aspects, machine-learning analytics engine 200 may include a processing unit 222, a communication unit 224, and a memory unit 226. Machine-learning analytics engine 200 may include additional, less, or alternate components, including those discussed elsewhere herein.


It should be noted that, although only a single machine-learning analytics engine 200 is shown in FIG. 2, this is only one of many aspects. In some aspects, multiple computing devices, servers, etc., may be configured to have a logical presence of a single entity, such as a server bank or an arrangement known as “cloud computing,” for example. These configurations may provide various advantages, such as enabling near real-time uploads and downloads of information as well as periodic uploads and downloads of information. However, for ease of discussion and not limitation, the machine-learning analytics engine 200 is referred to herein using the singular tense.


Machine-learning analytics engine 200 may be configured to communicate using any suitable number and/or type of communication protocols, such as Wi-Fi, cellular, BLUETOOTH, NFC, RFID, Internet Protocols, etc. For example, the machine-learning analytics engine 200 may be configured to communicate via wireless communication or data transmission over one or more radio frequency links or communication channels, and/or communicate with one or more communication networks (e.g., communication network 180) using a cellular communication protocol to send data to and/or receive data from one or more health institutions (e.g., one or more health institutions 150), one or more back-end computing devices (e.g., one or more back-end computing devices 120), and/or one or more client devices 102 (e.g., client device 102) via such communications.


To this end, communication unit 224 may be configured to facilitate data communications between various components in accordance with any suitable number and/or type of communication protocols. In the present aspects, communication unit 224 may be configured to facilitate data communications based upon the particular component and/or network with which machine-learning analytics engine 200 is communicating. In the present aspects, communication unit 224 may be implemented with any suitable combination of hardware and/or software to facilitate this functionality. For example, communication unit 224 may be implemented with any suitable number of wired and/or wireless transceivers, network interfaces, physical layers (PHY), ports, antennas, etc.


Again, such communications may facilitate the receipt of user data and/or data collected from one or more health institutions, one or more back-end computing devices, and/or one or more client devices 102. This collected data may form part of, or the entirety of, the dynamic data set, which may be used by the machine-learning analytics engine 200 to train one or more machine-learning analytics models. Once trained, aspects include the machine-learning analytics engine 200 applying the machine-learning analytics model to the user data to predict various risk-based variables, to assess risk in accordance with the risk-based variables, to identify loss-mitigation variables, to calculate insurance premiums in accordance with varying levels of assessed risk, and/or to assist in the claim handling process.


Processing unit 222 may be implemented as any suitable type and/or number of processors, such as a host processor for the relevant device in which machine-learning analytics engine 200 is implemented, for example. Processing unit 222 may be configured to communicate with one or more of communication unit 224 and/or memory unit 226 to send data to and/or to receive data from one or more of these components.


For example, processing unit 222 may be configured to communicate with memory unit 226 to store data to and/or to read data from memory unit 226. In accordance with various embodiments, memory unit 226 may be a computer-readable non-transitory storage device, and may include any combination of volatile (e.g., a random access memory (RAM)), or a non-volatile memory (e.g., battery-backed RAM, FLASH, etc.). In the present aspects, memory unit 226 may be configured to store instructions executable by processing unit 222. These instructions may include machine readable instructions that, when executed by processing unit 222, cause processing unit 222 to perform various acts.


In the present aspects, machine-learning application 227 is a portion of memory unit 226 configured to store instructions that, when executed by processing unit 222, cause processing unit 222 to perform various acts in accordance with applicable aspects as described herein. For example, instructions stored in machine-learning application 227 may facilitate processing unit 222 executing the various functions described below with respect to each of the modules stored in memory unit 226. Some of these functions may include, for example, collecting and aggregating various types of data, training one or more machine-learning analytics models, predicting one or more medical-related conditions, assessing the risk of insuring one or more users, identifying intervening actions that mitigate the insurer's loss, calculating premiums, transmitting premiums and/or other notifications to relevant computing devices (e.g., client device 102), etc. These functions are further discussed below with respect to each of the additional modules stored in memory unit 226.


Some aspects include machine-learning application 227, data aggregation module 229, machine learning training module 231, prediction module 233, risk assessment module 235, loss-mitigation variable identification module 237, premium calculation module 239, and/or claim handling module 241 being implemented as one or more software applications, sets of computer-executable instructions, algorithms, etc., which are stored on the memory unit 226 and executable by the processing unit 222. For example, memory unit 226 may represent a tangible, non-transitory computer-readable medium, with each of machine-learning application 227, data aggregation module 229, machine learning training module 231, prediction module 233, risk assessment module 235, loss-mitigation variable identification module 237, premium calculation module 239, and/or claim handling module 241 including instructions executable by one or more processors (e.g., processing unit 222) that, when executed by the one or more processors, cause the one or more processors to perform various acts as described herein. To provide another example, the machine-learning application 227, data aggregation module 229, machine learning training module 231, prediction module 233, risk assessment module 235, loss-mitigation variable identification module 237, premium calculation module 239, and/or claim handling module 241 may be implemented at least partially in firmware and/or in hardware of the machine-learning analytics engine 200.
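
Purely as a non-limiting illustration of this modular arrangement, the sketch below composes stand-ins for the modules listed above into a single engine object; the module behaviors shown are toy placeholders, not the claimed implementation.

    # Hypothetical sketch: every module behavior below is a toy placeholder.
    class MachineLearningAnalyticsEngine:
        def __init__(self, data_aggregation, training, prediction, risk_assessment,
                     loss_mitigation, premium_calculation, claim_handling):
            self.data_aggregation = data_aggregation
            self.training = training
            self.prediction = prediction
            self.risk_assessment = risk_assessment
            self.loss_mitigation = loss_mitigation
            self.premium_calculation = premium_calculation
            self.claim_handling = claim_handling

        def quote(self, user_data):
            """Run the underwriting path end to end for one user."""
            data_set = self.data_aggregation()
            model = self.training(data_set)
            conditions = self.prediction(model, user_data)
            risk = self.risk_assessment(conditions)
            actions = self.loss_mitigation(risk)
            return self.premium_calculation(risk), actions

    engine = MachineLearningAnalyticsEngine(
        data_aggregation=lambda: [],
        training=lambda data_set: "trained model",
        prediction=lambda model, user: {"hypertension": 0.3},
        risk_assessment=lambda conditions: 100 * max(conditions.values()),
        loss_mitigation=lambda risk: ["exercise plan"] if risk > 25 else [],
        premium_calculation=lambda risk: round(25.0 + 1.8 * risk, 2),
        claim_handling=lambda claim: "refer to claims handler",
    )
    print(engine.quote({"age": 47}))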


The various applications and modules shown in FIG. 2 and discussed herein may be executed on the same processing unit 222 or on different computer processors (which may also be part of separate components not pictured in FIG. 2) in some aspects, as desired. Further, while the machine-learning application 227, data aggregation module 229, machine learning training module 231, prediction module 233, risk assessment module 235, loss-mitigation variable identification module 237, premium calculation module 239, and/or claim handling module 241 are depicted as separate components of memory unit 226, two or more of these components may be integrated into different integrated applications and/or integrated modules. Moreover, one or more of the machine-learning application 227, data aggregation module 229, machine learning training module 231, prediction module 233, risk assessment module 235, loss-mitigation variable identification module 237, premium calculation module 239, and/or claim handling module 241 may be implemented in conjunction with other applications (not shown) that are stored and executed via the machine-learning analytics engine 200 and/or other components in communication with machine-learning analytics engine 200.


In the present aspects, data aggregation module 229 is a portion of memory unit 226 configured to store instructions that, when executed by processing unit 222, cause processing unit 222 to perform various acts in accordance with applicable aspects as described herein. For example, instructions stored in data aggregation module 229 may facilitate processing unit 222 performing functions associated with receiving, storing, and updating user data and/or other data that is used as part of a dynamic data set. For instance, data aggregation module 229 may include instructions to facilitate machine-learning analytics engine 200 monitoring various data sources and/or receiving data from one or more data sources over time to build dynamic data sets that may change over time as new data is acquired. To provide another example, data aggregation module 229 may include instructions to facilitate machine-learning analytics engine 200 receiving user data prompted from users making inquiries about health and life insurance products.


Again, the data sources may include, for example, data received from one or more financial institutions (e.g., one or more financial institutions 150), one or more back-end computing devices (e.g., one or more back-end computing devices 120), and/or one or more client devices 102 (e.g., client device 102). In various aspects, this data may represent user input (e.g., via client device 102) and/or other types of data that may be acquired with or without user input. For instance, a user may be solicited via a suitable computing device (e.g., via client device 102) in the form of survey questions, prompts, etc., for information that is then aggregated with other data included in the user's data profile. Some examples of data that may be acquired from a user in this way may include, for example, any suitable portion of medical information, demographic information, insurance information, psychographic information, lifestyle information, etc., as further discussed herein.


Exemplary Data Set


To provide an illustrative example with reference to FIG. 3, which illustrates an exemplary data set 300 including a dynamic data set and user data, in accordance with certain aspects of the present disclosure, data aggregation module 229 may facilitate processing unit 222 (e.g., via communication unit 224) aggregating various portions of data to form the dynamic data set. Additionally, data aggregation module 229 may facilitate processing unit 222 receiving and storing user data, which is used to calculate an insurance premium for a specific type of insurance product. The dynamic data set and user data may thus represent data that is received, accessed, and/or stored in any suitable portion of machine-learning analytics engine 200 (e.g., memory unit 226) and/or another suitable storage device that is accessible by machine-learning analytics engine 200 (e.g., one or more back-end components 120).


It will be understood that the examples shown in FIG. 3 associated with the dynamic data set and user data, such as the electronic medical records, demographic information, insurance records, lifestyle information, etc., are but some examples of the types of information that may be relevant to train and execute a machine-learning analytics model for one or more particular users. The dynamic data set and user data may thus include, for example, any suitable number and/or type of information that is useful or otherwise relevant to calculate health and/or life insurance policies for one or more users, including social media information gathered with the user's permission or affirmative consent. For example, although not illustrated in FIG. 3, the dynamic data set may include psychographic information that is relevant to or indicative of the risk of insuring a user for a particular type of insurance policy.


As shown in FIG. 3, electronic medical records for various users may include data such as a history of various symptoms and diagnoses, pre-existing conditions, congenital defects, a history of blood work data (e.g., triglycerides, cholesterol levels, glucose levels, etc.), and other recorded health metrics (e.g., height, weight, BMI, pulse, blood pressure, body temperature, etc.). Thus, the electronic medical records may include any suitable type of information that is relevant to assessing an initial risk of providing health and/or life insurance for a user and/or intervening actions that may be taken by the user to reduce this initially assessed risk, which may represent part of the dynamic data set that is used as training data for a machine-learning analytics model, as further discussed herein.


To provide another example, as shown in FIG. 3, demographic information may include age (or age bracket), gender, location data such as the user's current address or residential region, blood type, etc., for various users. In various aspects, this demographic information may provide various insights when used as training data, such that correlations may be made amongst similar users and compared to future users as part of a machine-learning analytics model, as further discussed herein. For example, the demographic information may allow a correlation to be made among other users with similar demographic data, for which similar risk assessments may thus be identified.


As yet another example, insurance records may include, for various users, a history of medical claims, a history of other types of insurance claims, current insurance policy information for various users (e.g., policy numbers, dates, coverage, premiums, etc.), insurance pricing information, risk tables and/or data mapping various conditions, behaviors, etc., to specific levels of risk, etc. As further discussed herein, this insurance information may be used to train a machine-learning analytics model by establishing an initial correlation between specific types of insurance policy information, user data for specific policies, and the assessed risk and pricing amongst similar insurance plans.


To provide an additional example, lifestyle information may indicate, for several users, each user's general preference regarding various lifestyle choices, which may represent preferences regarding how often each user prefers to travel (and where), how often a user receives a health-related checkup, how often each user exercises (and the type of exercise), self-logged health data (e.g., information from weight loss applications such as caloric intake, data accessed via fitness trackers such as heart rate, etc.), how each user prefers to commute to work, each user's occupation, etc. Again, like the aforementioned demographic information and other data that may form part of the dynamic data set, the lifestyle information may be utilized to identify risk correlations amongst users, which may then be used, for example, to predict future risks for similar users via the machine-learning analytics model.
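

To make the composition of the dynamic data set more concrete, the following is a minimal sketch, in Python, of how a single user record combining electronic medical record, demographic, insurance, and lifestyle information might be represented before being used as training data. The field names and values are illustrative assumptions only and do not reflect any required schema.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class UserRecord:
    """One entry in the dynamic data set; all fields are illustrative."""
    user_id: str
    # Electronic medical record data
    diagnoses: List[str] = field(default_factory=list)
    blood_work: Dict[str, float] = field(default_factory=dict)   # e.g., {"ldl": 110.0}
    bmi_history: List[float] = field(default_factory=list)       # time-ordered BMI values
    # Demographic information
    age: int = 0
    gender: str = ""
    region: str = ""
    # Insurance records
    prior_claims: List[dict] = field(default_factory=list)
    # Lifestyle information
    avg_daily_steps: float = 0.0
    sedentary_occupation: bool = False


# Example record; the values are hypothetical.
record = UserRecord(
    user_id="user-000123",
    blood_work={"ldl": 110.0, "triglycerides": 140.0},
    bmi_history=[24.8, 25.4, 26.0],
    age=25,
    gender="M",
    region="midwest",
    avg_daily_steps=4200,
    sedentary_occupation=True,
)
print(record)
```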


In the present aspects, machine learning training module 231 is a portion of memory unit 226 configured to store instructions that, when executed by processing unit 222, cause processing unit 222 to perform various acts in accordance with applicable aspects as described herein. For example, instructions stored in machine learning training module 231 may facilitate processing unit 222 performing functions associated with identifying, accessing, and using various portions of the dynamic data set as training data for a machine-learning analytics model.


In various aspects, any suitable number and type of machine-learning analytics model may be implemented, and therefore the data selected from the dynamic data set, as well as the particular type of training process, may be adapted to the particular type of a machine-learning analytics model and/or artificial intelligence system that is implemented. For example, a machine-learning analytics model may be implemented in accordance with decision tree learning, association rule learning, an artificial neural network, deep learning, inductive logic programming, support vector machines, clustering, Bayesian networks, reinforcement learning, reinforced learning, combined learning, representation learning, similarity and metric learning, sparse dictionary learning, genetic algorithms, rule-based learning, a MapReduce programming model used in accordance with the HADOOP framework, etc.


Generally, the overall process of training the machine-learning analytics model may include defining the sample inputs, the importance (e.g., weighting) of these inputs, and defining one or more outputs that are determined using the weighted inputs. Based upon this initial training framework, the machine-learning process allows correlations to be made among different subsets of data within the dynamic data set, correlations to be made among data contained in the data set and received user data, and/or specific predictions to be formulated. Over time, as additional data becomes available, or as the dynamic data set is updated, the trained machine-learning model may identify new correlations, vary the weighting of certain inputs, change the inputs, etc., such that different correlations may be made, the accuracy of predictions may be improved, and/or new predictions may be made.
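

As a minimal sketch of this training framework, the following assumes the dynamic data set has already been flattened into numeric feature vectors and a binary target (e.g., whether a given condition developed within a future time horizon), and uses an off-the-shelf logistic regression from scikit-learn purely for illustration; the feature columns, figures, and library choice are assumptions rather than anything required by the model described herein.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical flattened training data drawn from the dynamic data set.
# Columns: [age, bmi, ldl, avg_daily_steps, sedentary_occupation]
X = np.array([
    [25, 26.0, 110.0, 4200, 1],
    [31, 22.5,  95.0, 9800, 0],
    [47, 29.1, 150.0, 3000, 1],
    [39, 24.0, 105.0, 7600, 0],
])
# Defined output: 1 if the condition (e.g., a BMI increase of more than 20%)
# occurred within the future time horizon, 0 otherwise.
y = np.array([1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# As the dynamic data set grows or is updated, the model may simply be
# re-fit so that input weightings and predictions change over time.
X_updated = np.vstack([X, [[28, 25.0, 120.0, 5000, 1]]])
y_updated = np.append(y, 1)
model = LogisticRegression().fit(X_updated, y_updated)

# Predicted likelihood of the condition for a new user.
print(model.predict_proba([[25, 26.0, 110.0, 4200, 1]])[0, 1])
```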


For instance, the data contained as part of the dynamic data set may represent time-series data, with each data point including a particular value and a corresponding indication of time at which the value was collected, observed, or generated by a particular data source. An example of a machine-learning training process is provided below with reference to FIGS. 4 and 5, which uses a neural network as an example. However, as discussed above, aspects include any suitable type of machine-learning model being trained and executed to facilitate the aspects as described herein.


Exemplary Artificial Neural Network


In various aspects, a processor or a processing element (e.g., processing unit 222) may be trained using supervised or unsupervised machine learning, and the machine learning program may employ a neural network, which may be a convolutional neural network, a deep learning neural network, deep learning or reinforced learning model or module, or a combined learning module or program that learns in two or more fields or areas of interest. Other types of artificial intelligence or machine learning may be used. Machine learning may involve identifying and recognizing patterns in existing data in order to facilitate making predictions for subsequent data. Models may be created based upon example inputs in order to make valid and reliable predictions for novel inputs.


Additionally or alternatively, the machine learning programs may be trained by inputting sample data sets or certain data into the programs, such as via images, electronic records, mobile devices, smart or autonomous vehicles, fitness devices, etc. The machine learning programs may utilize deep learning algorithms that may be primarily focused on pattern recognition, and may be trained after processing multiple examples. The machine learning programs may include any suitable number and type of programs in accordance with the particular artificial intelligence system that is implemented, such as Bayesian program learning (BPL), voice recognition and synthesis, image or object recognition, optical character recognition, natural language processing, etc., either individually or in combination. The machine learning programs may also include, for example, natural language processing, semantic analysis, automatic reasoning, etc.


Supervised or unsupervised machine learning techniques may also be used. In supervised machine learning, a processing element (e.g., processing unit 222) may be provided with example inputs and their associated outputs, and may seek to discover a general rule that maps inputs to outputs. As a result, when subsequent novel inputs are provided the processing element may, based upon the discovered rule, accurately predict the correct output. In unsupervised machine learning, the processing element may be required to find its own structure in unlabeled example inputs.
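

The distinction may be sketched as follows, again using scikit-learn purely as an illustrative assumption: supervised learning is given example inputs paired with known outputs and seeks a general rule mapping one to the other, while unsupervised learning is given only the inputs and must discover structure (here, clusters) on its own.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.cluster import KMeans

# Hypothetical inputs; columns: [age, bmi]
X = np.array([[25, 26.0], [31, 22.5], [47, 29.1], [39, 24.0]])

# Supervised: example inputs paired with known outputs (labels).
y = np.array([1, 0, 1, 0])
rule = DecisionTreeClassifier().fit(X, y)
print(rule.predict([[28, 27.0]]))   # predicted output for a novel input

# Unsupervised: no labels; the algorithm finds its own structure.
clusters = KMeans(n_clusters=2, n_init=10).fit(X)
print(clusters.labels_)             # cluster assignment per example
```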


For instance, FIG. 4 depicts an exemplary artificial neural network 400, which may be trained by the machine-learning analytics engine 200 of FIG. 2, in accordance with aspects of the present disclosure. For example, processing unit 222 may execute instructions stored in machine learning training module 231 to facilitate this process. The example neural network 400 may include layers of neurons, including input layer 402, one or more hidden layers 404-1 through 404-n, and output layer 406. Each layer comprising neural network 400 may include any number of neurons (i.e., q and r may be any positive integers). Again, although a neural network is illustrated, aspects include the implementation of any suitable type of machine-learning system to achieve the methods and systems described herein, which may be of a different structure and configuration than those depicted in FIGS. 4 and 5.


Input layer 402 may receive different input data from within the dynamic data set. For example, input layer 402 may include a first input a1 that represents an insurance type (e.g., health or life insurance), a second input a2 representing patterns identified in input data, a third input a3 representing various age groups, a fourth input a4 representing a gender, a fifth input a5 representing a particular level, frequency, and/or type of exercise, a sixth input a6 representing body mass index, and so on. Six inputs are shown in FIG. 4 for purposes of brevity, although aspects include input layer 402 utilizing any suitable number of (e.g., hundreds or thousands or more) inputs. It should be appreciated that life and health insurance policies may relate to humans or animals (e.g., pets, livestock, etc.). Input may also relate to farm/ranch insurance.


In some embodiments, the number of elements used by neural network 400 may change during the training process, and some neurons may be bypassed or ignored if, for example, during execution of the neural network, they are determined to be of less relevance. Furthermore, this training process may be repeated periodically or continuously as additional data is collected and added to the dynamic data set or as data within the dynamic data set is updated. In this way, the neural network 400 may be re-trained over time to continuously provide the most accurate predictions in accordance with the most recent dynamic data set.


Each neuron in hidden layer(s) 404-1 through 404-n may process one or more inputs from input layer 402, and/or one or more outputs from a previous one of the hidden layers, to generate a decision or other output. Output layer 406 may include one or more outputs each indicating a label, confidence factor, and/or weight describing one or more inputs. A label may indicate the presence (e.g., frequent exercise, intensive and/or long duration of exercise, high cholesterol, a high stress job) or absence (e.g., lack of exercise, low cholesterol, low intensity or duration of exercise) of a condition. In some embodiments, however, outputs of neural network 400 may be obtained from a hidden layer 404-1 through 404-n in addition to, or in place of, output(s) from output layer(s) 406.


In some embodiments, each layer may have a discrete, recognizable function with respect to input data. For example, if n=3, a first layer may analyze one dimension of inputs, a second layer a second dimension, and the final layer a third dimension of the inputs, where all dimensions are analyzing a distinct and unrelated aspect of the input data. For example, the dimensions may correspond to aspects of a user considered strongly determinative of risk of insuring a user for health or life insurance, then those that are considered of intermediate importance, and finally those that are of less relevance. In other embodiments, the layers may not be clearly delineated in terms of the functionality they respectively perform. For example, two or more of hidden layers 404-1 through 404-n may share decisions, with no single layer making an independent decision.


In some embodiments, neural network 400 may be constituted by a recurrent neural network, wherein the calculation performed at each neuron is dependent upon a previous calculation. It should be appreciated that recurrent neural networks may be more useful in performing certain tasks. Therefore, in one embodiment, a recurrent neural network may be trained with respect to a specific piece of functionality with respect to system 100 of FIG. 1. For example, in one embodiment, a recurrent neural network may be trained and utilized as part of machine-learning analytics engine 200 to automatically identify specific information that may render a person ineligible for certain types of insurance coverage (e.g., a user may be ineligible for life insurance if he has a specific type of pre-existing condition such as HIV).



FIG. 5 depicts an example neuron 500 that may correspond to the neuron labeled as "1,1" in hidden layer 404-1 of FIG. 4, according to one embodiment. Again, each of the inputs to neuron 500 (e.g., the inputs comprising input layer 402) may be weighted, such that inputs a1 through ap correspond to weights w1 through wp, as determined during the training process of neural network 400.


In some embodiments, some inputs may lack an explicit weight or may be associated with a weight below a relevant threshold. The weights may be applied to an aggregation function, which may be a summation, to produce a value z1, which may be input to a function 520, labeled as f1,1(z1). The function 520 may be any suitable function such as a linear function, a non-linear function, a sigmoid function, etc. In any event, as depicted in FIG. 5, the function 520 may produce multiple outputs, which may be provided to neuron(s) of a subsequent layer, or used directly as an output of neural network 400. For example, the outputs may correspond to various medical-related predictions for a particular user, or may be calculated values used as inputs to subsequent functions.
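

A minimal numeric sketch of the neuron of FIG. 5 follows: weighted inputs are summed to produce z1, which is then passed through an activation function f1,1. The specific input values, weights, and the choice of a sigmoid activation are illustrative assumptions only.

```python
import numpy as np


def neuron_output(inputs: np.ndarray, weights: np.ndarray) -> float:
    """Weighted sum of inputs followed by an activation function."""
    z = float(np.dot(weights, inputs))   # z1 = w1*a1 + w2*a2 + ... + wp*ap
    return 1.0 / (1.0 + np.exp(-z))      # f1,1(z1), here a sigmoid


# Hypothetical inputs a1..a6 (already numerically encoded) and learned weights.
a = np.array([1.0, 0.3, 25.0, 0.0, 2.0, 26.0])
w = np.array([0.10, 0.45, -0.02, 0.05, -0.30, 0.08])
print(neuron_output(a, w))
```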


It should be appreciated that the structure and function of the neural network 400 and neuron 500 depicted are for illustration purposes only, and that other suitable configurations may exist. For example, the output of any given neuron may depend not only on values determined by past neurons, but also on values determined by future neurons.


Exemplary Predictive and Risk Assessment Functions


Referring back to FIG. 2, in the present aspects, prediction module 233 is a portion of memory unit 226 configured to store instructions that, when executed by processing unit 222, cause processing unit 222 to identify various risk-based variables in accordance with applicable aspects as described herein. For example, instructions stored in prediction module 233 may facilitate processing unit 222 applying the trained machine-learning analytics model to identify, from model outputs, various predictions regarding one or more users for whom health or life insurance coverage is sought. These predictions may include, for example, any suitable type of forecasted information that may be derived from the machine-learning analytics model in accordance with the particular type of model, the manner in which the model is trained, and/or the specific portions of the dynamic data set and how they are mapped and weighted in accordance with the trained machine-learning analytics model.


For instance, certain aspects include the predictions being related to one or more medical-related conditions associated with a user based upon his user data. For example, the user data may be received as a result of a user inquiring about a particular type of health and/or life insurance product, coverage associated with a new policy, coverage regarding a renewed policy, etc., and include information about that user and/or information received from that user. As shown in FIG. 3, the user data may include, for example, user identifying information (e.g., a name and contact information), the parameters associated with a particular type of insurance product, such as deductibles, a type of insurance (e.g., HMO health, PPO health, life insurance, etc.), a life insurance term, the specific benefits that are sought (e.g., life insurance benefits), answers to prompts presented via client device 102, etc. Moreover, and as further discussed in the examples below, the user data may include data retrieved from other sources (e.g., the dynamic data set) that is correlated to a particular user.


To provide an illustrative example, assume that a User A inquires about a PPO health insurance plan, and provides user data indicating the user's name, contact information, age, gender, and a desired deductible. User A may also consent to the insurer collecting additional information from various sources that may then be used to supplement this data, such as information obtained via electronic medical records (e.g., prescription history obtained from one's pharmacy records). The type of information to include as part of the user data, as well as the various sources to provide the user data, may be defined as part of the trained machine-learning analytics model and/or a result of the processing unit 222 executing instructions stored in prediction module 233, in various embodiments.


Continuing this example, once an adequate amount of user data is collected for User A, the trained machine-learning analytics model may be applied to the user data to predict specific medical-related conditions that, within some future time period, are likely to be associated with or experienced by User A. Continuing the previous example, assume that the user data indicates that User A is a 25-year old male having a slightly above average BMI of 26. Further assume that the user data includes electronic medical record information, and indicates from recent blood work that User A has normal levels of cholesterol and triglycerides. Moreover, assume that the information obtained via one or more data sources indicates that User A has an occupation that is known as being relatively sedentary, and that although User A lives within walking or bicycling distance from his place of work, User A decides to commute by driving each day.


This information, taken at one snapshot in time, may not reveal anything that indicates an excessively high risk of providing User A with health insurance. In other words, a traditional underwriting process may only consider present health-related information, and may not consider or identify other sources of information as contributors to future risk. On the other hand, because the present aspects use a trained machine-learning analytics model, this model may be leveraged to identify patterns or correlations using data from these additional sources. Continuing the previous example for User A, the machine-learning analytics model may identify that certain behaviors, lifestyles, and medical conditions in the present correlate to a high likelihood of certain medical conditions occurring within a future time horizon, and thus predict their occurrence within some future time period. These medical-related conditions may include, for instance, conditions that may develop, injuries that are likely to occur, etc.


For instance, User A has only a slightly above normal BMI and normal blood work metrics. However, the lifestyle information associated with User A (i.e., that User A could partake in exercise as part of a daily commute but chooses not to) is indicative of a desire to not regularly exercise. Because this trend is likely to be maintained in the future given that User A is now 25 years old, it is also likely that User A's BMI will increase over time as a result of a sedentary lifestyle. In various aspects, these predictions may be made in accordance with the trained machine-learning analytics model. For example, the machine-learning analytics model may identify users from within the dynamic data set that have similar (or identical) metrics as User A, such as other male users that were the same age as User A or within a matching age range, had (at that age) the same BMI or a BMI within a range matching User A, had sedentary occupations, and who did, in fact, not regularly exercise from the ages of 25-35.


The trained machine-learning analytics model may then determine an overall trend by analyzing the users within this subset of users that are similar to User A, and determine that most of these users had a BMI that increased during the next 3 years, 5 years, etc. In other words, aspects include the machine-learning analytics model, upon being executed, identifying similar users having corresponding information matching that of a particular user, and then predicting one or more medical-related conditions associated with that particular user based upon one or more medical-related conditions and/or trends that were observed within the set of similar users.
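

A minimal sketch of this similar-user approach follows. It assumes each historical user in the dynamic data set is represented by a few numeric attributes observed at roughly the same age, together with a flag indicating whether that user's BMI subsequently increased; the nearest neighbours to User A are located and the fraction of those neighbours whose BMI increased is used as the predicted likelihood. The attribute names, the distance metric, and the neighbour count are all illustrative assumptions.

```python
import numpy as np

# Historical users; columns: [age, bmi, avg_daily_steps, sedentary_occupation]
historical = np.array([
    [25, 26.2,  4000, 1],
    [26, 25.8,  4500, 1],
    [25, 21.0, 11000, 0],
    [24, 26.5,  3800, 1],
    [25, 22.3,  9500, 0],
])
# Observed outcome per user: 1 if BMI increased over the following years.
bmi_increased = np.array([1, 1, 0, 1, 0])

user_a = np.array([25, 26.0, 4200, 1])

# Normalize each column so no single attribute dominates the distance.
scale = historical.std(axis=0) + 1e-9
distances = np.linalg.norm((historical - user_a) / scale, axis=1)

# Take the 3 most similar users and use the observed trend among them
# as the predicted likelihood for User A.
k = 3
nearest = np.argsort(distances)[:k]
predicted_likelihood = bmi_increased[nearest].mean()
print(f"Predicted likelihood of BMI increase: {predicted_likelihood:.0%}")
```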


Additionally, certain aspects include re-training the machine-learning analytics model over time as additional information is added to the dynamic data set and/or as information in the dynamic data set is updated. This may include, for instance, defining new inputs to the machine-learning analytics model and/or defining new outputs (e.g., alternative or new predictions). In this way, the machine-learning analytics model may be periodically or continuously re-trained such that the accuracy of predictions determined by the model is increased over time.


For example, it may later be determined that driving to work instead of walking or bicycling, for users similar to User A, is not strongly correlated with a subsequent increase in BMI. Instead, additional information included in the dynamic data set may identify that, in spite of User A's sedentary occupation and lack of exercise during a commute, fitness tracker data shows that User A still maintains 10,000 steps per day, and therefore this additional information may be considered for future users to alter subsequent predictions.


To provide another illustrative example, assume that a User B inquires about a term life insurance plan, and provides user data indicating the user's name, contact information, age, gender, and a desired amount of coverage (i.e., benefit). User B may also consent to the insurer collecting additional information from various sources, which may then be used to supplement this data, such as information obtained via electronic medical records, for example. Again, the type of information to include as part of the user data, as well as the various sources to provide the user data, may be defined as part of the trained machine-learning analytics model and/or a result of the processing unit 222 executing instructions stored in prediction module 233, in various embodiments.


Continuing this example, assume that the user data indicates that User B is a 33-year old female having a normal BMI of 21, and that User B's electronic medical record indicates healthy blood metrics. Thus, this information may not reveal anything that indicates an excessively high risk of providing User B with life insurance. However, the present aspects include the use of a trained machine-learning analytics model to identify other indicators and/or correlations that may be relevant to an increased risk of providing life insurance for User B. For example, further assume that User B's psychographic profile indicates that User B is not particularly risk averse, and User B's lifestyle information indicates that User B often travels to adventuresome destinations and partakes (or is likely to partake) in life-threatening activities, such as base jumping and skydiving. Moreover, assume that the insurance records associated with User B indicate a driving history of several accidents in which User B was at fault, and that police reports associated with User B's insurance records indicate several instances of excessive speeding.


Continuing this example for User B, the machine-learning analytics model may thus predict that, even though User B may be quite healthy, her personality traits and lifestyle represent a high risk with regards to life insurance coverage. As a result, aspects may include the trained machine-learning analytics model predicting a high likelihood of User B dying at an age that is much less than average among users with similar health information. And, if the likelihood of User B dying within a time period matches that of the requested life insurance term, this prediction may be taken into consideration when pricing the life insurance to adequately reflect this risk, as further discussed below.


Again, the machine-learning analytics model may be periodically or continuously re-trained such that the accuracy of predictions determined by the model is increased over time. In this example, it may be later determined that a set of users similar to User B, upon having a child, often cease their high-risk lifestyle. Thus, upon re-training the machine-learning analytics model, it may be determined from information available via the dynamic data set that a new User C (otherwise similar to User B) has a 6-month-old child. Thus, this prediction may later be changed for future users to indicate a lower likelihood of User C (and other future users similar to User C) suffering a premature death.


In the present aspects, risk assessment module 235 is a portion of memory unit 226 configured to store instructions that, when executed by processing unit 222, cause processing unit 222 to perform various acts in accordance with applicable aspects as described herein. For example, instructions stored in risk assessment module 235 may facilitate processing unit 222 performing functions associated with identifying an initial level of risk to insure a user in accordance with a particular insurance policy, based upon the outputs of the trained machine-learning analytics model. This may include, for example, generating a level of risk associated with insuring a user for health or life insurance based upon the one or more predicted medical-related conditions (or other suitable risk-based variables) output by the trained machine-learning analytics model, as discussed above.


This initial level of risk may represent, for example, any suitable type of system that assesses the overall risk of insuring a user, which may include a numeric scoring and/or weighted system. In various aspects, the level of risk may include a weighting applied to each predicted medical-related condition output by the trained machine-learning analytics model. Thus, the level of risk may identify a particular risk associated with individually identified conditions, and/or a level of risk associated with the combined probability of all the conditions occurring in accordance with the specific type of insurance coverage that is to be provided.


To provide an illustrative example, an overall level of risk may indicate, for each prediction that is made, the impact or contribution towards the overall risk of insuring the user. For instance, BMI may be considered a relatively accurate factor in assessing the risk of insuring an individual for health insurance, although a trend indicating increasing LDL cholesterol and triglycerides may represent an even higher risk. Therefore, assuming that the trained machine-learning analytics model projects increases to BMI, LDL cholesterol levels, and triglycerides within the next 3 years, a weight may be assigned to each of these conditions to represent the severity or likelihood of each prediction towards an overall risk assessment of insuring the individual. In various aspects, these weightings may be applied to any suitable number of predicted conditions, and may be combined in any suitable manner to derive an overall risk score.
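

A minimal sketch of one such weighted scoring scheme follows; the condition names, likelihoods, and weights are illustrative assumptions, and the combination rule (a likelihood-weighted sum normalized by the total weight) is just one of many suitable choices.

```python
# Predicted conditions output by the model: (likelihood, severity weight).
# Weights reflect how strongly each condition contributes to overall risk.
predicted_conditions = {
    "bmi_increase":          (0.75, 0.4),
    "ldl_increase":          (0.40, 0.8),
    "triglyceride_increase": (0.35, 0.8),
}


def overall_risk_score(conditions: dict) -> float:
    """Combine per-condition likelihoods and weights into a 0-1 risk score."""
    weighted = sum(p * w for p, w in conditions.values())
    total_weight = sum(w for _, w in conditions.values())
    return weighted / total_weight


print(f"Overall risk score: {overall_risk_score(predicted_conditions):.2f}")
```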


In the present aspects, loss-mitigation variable identification module 237 is a portion of memory unit 226 configured to store instructions that, when executed by processing unit 222, cause processing unit 222 to perform various acts in accordance with applicable aspects as described herein. For example, instructions stored in loss-mitigation variable identification module 237 may facilitate processing unit 222 performing functions associated with identifying various loss-mitigation variables that are directed to reducing the initial risk assessment of the user. In various aspects, the loss-mitigation variables may be any suitable type of behavior, action, and/or decision that, when performed by the user within a certain future time period, may reduce or eliminate the impact of the one or more predicted risk-based variables. In other words, the identified loss-mitigation variables allow an insurer to mitigate loss by better insulating the insurer from certain types of claims being made that are associated with high claim payouts.


In some aspects, this future time period may correspond to the term of insurance coverage, and thus the predictions and loss-mitigation variables may be particularly relevant to the insurer. For instance, the identification of the loss-mitigation variables may proactively attempt to prevent certain claims from occurring at all or, if these claims are made, to reduce their severity (and thus cost). Therefore, in either case, loss to both the insurer and the user is mitigated.


Continuing the previous example with a health and life insurance policy, loss mitigation and/or prevention in this context may include preventing certain injuries or medical conditions from occurring or reducing their severity if their occurrence cannot be entirely prevented. For example, the cost of cancer treatment may be very high if caught in a later stage, but much less if diagnosed early. Thus, the identified loss-mitigation variables may include, in this example, a user agreeing to follow a medical wellness examination schedule that is recommended for similar users. In doing so, cancers more common for people over certain ages may be diagnosed early on, mitigating the loss borne by the insurer regarding the protracted and expensive treatments that would otherwise be needed if diagnosed at a later stage.


To provide an illustrative example continuing the previous one with User A, assume that User A, as noted above, has only a slightly above normal BMI and normal blood work metrics, but that the machine-learning analytics model has identified, as a predicted medical-related condition, that User A's BMI will likely (e.g., a 75% likelihood) increase during the next 2 years by 20%. Continuing this example, loss-mitigation variable identification module 237 may include instructions that, when executed by processing unit 222, identify one or more intervening actions. These actions may include those that, when performed by the user within the next 2 years, reduce this likelihood to less than 50%.


In various aspects, these intervening actions may include suggestions regarding any suitable type of activity, suggestions regarding changes in behavior, suggestions regarding daily nutritional intake, suggestions regarding a type and frequency of regular exercise, suggestions regarding changes to or adopting new lifestyle habits, etc. For example, lifestyle habits may include medical options, such as taking one's medicine regularly, being vaccinated against influenza or another pathogen, visiting a cardiologist, lowering one's glucose levels, etc. Using the present example, these intervening actions may include a suggestion for User A to begin taking a commute to work that involves more walking and the accompanying routes to do so, the use of a fitness monitor to provide additional information in this regard, and a suggestion for User A to install a calorie-tracking application on his mobile computing device.
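

As a minimal sketch of how such intervening actions might be selected, the following assumes each candidate action carries an estimated reduction in the predicted likelihood (learned, for example, from similar users who performed the action); actions are added greedily until the likelihood falls below the target. The action names and effect sizes are purely illustrative assumptions.

```python
# Predicted likelihood of the medical-related condition (e.g., BMI increase).
initial_likelihood = 0.75
target_likelihood = 0.50

# Candidate intervening actions and their estimated likelihood reductions,
# e.g., derived from observed outcomes among similar users (hypothetical values).
candidate_actions = {
    "walk_part_of_commute":    0.15,
    "install_calorie_tracker": 0.10,
    "use_fitness_tracker":     0.05,
    "annual_wellness_exam":    0.05,
}


def select_actions(likelihood, target, candidates):
    """Greedily pick the most effective actions until the target is met."""
    selected = []
    for action, reduction in sorted(candidates.items(), key=lambda kv: -kv[1]):
        if likelihood < target:
            break
        selected.append(action)
        likelihood -= reduction
    return selected, likelihood


actions, reduced = select_actions(initial_likelihood, target_likelihood,
                                  candidate_actions)
print(actions, f"reduced likelihood: {reduced:.2f}")
```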


In various aspects, any suitable number and/or type of loss-mitigation variables may be calculated and transmitted to the user's computing device (e.g., client device 102) for presentation to the user depending upon the specific type of insurance product, the determined risk-based variables, their likelihood of occurrence, etc. In any event, certain aspects may include the machine-learning analytics model calculating these loss-mitigation variables in a manner that reduces the likelihood of the risk-based variables (e.g., one or more predicted medical-related conditions) occurring. Likewise, these aspects may include calculating the loss-mitigation variables such that the initial assessed level of risk (which was calculated assuming that the likelihood of these risk-based variables occurring will remain unchanged over a future time horizon) is reduced. In other words, any suitable number of loss-mitigation variables may be identified to sufficiently reduce the probability of various medical-related conditions occurring over a future time horizon.


These loss-mitigation variables may be calculated in accordance with the machine-learning analytics model in various ways. For instance, in some aspects, the machine-learning analytics model may identify previous suggestions regarding intervening actions, and identify whether these suggestions were followed by the user and/or whether these suggestions were correlated to a reduction in the identified conditions. In other aspects, the machine-learning analytics model may look at data inputs that identify the most successful types of suggestions from third-party databases, psychological profiling data, demographic data correlations, etc., in an attempt to present suggestions that will be the most successful to modify behavior for users similar to, in this example, User A.


To provide another illustrative example using User B as identified above, assume that User B has a normal BMI of 21 and healthy blood metrics, but that the machine-learning analytics model identified, as a predicted medical-related condition, that User B will likely suffer from a premature death (e.g., a 60% chance of accidental death prior to age 45). Continuing this example, loss-mitigation variable identification module 237 may include instructions that, when executed by processing unit 222, identify one or more intervening actions to reduce this likelihood to less than 40%. Using the present example, these intervening actions may include a suggestion for User B to take additional safety or training courses regarding specific high risk activities. To provide additional examples, these intervening actions may include warning User B about the dangers involved in participating in certain activities, or suggesting that User B have a wellness examination to determine the health of her heart. Again, these intervening actions may be transmitted to a computing device associated with User B (e.g., client device 102) in the form of notifications or suggestions, such that they may be presented to User B.


Additionally or alternatively, certain aspects may include loss-mitigation variable identification module 237 including instructions that, when executed by processing unit 222, cause machine-learning analytics engine 200 to monitor data and/or track feedback regarding the actions of a user. Continuing the previous examples, this may include, for instance, machine-learning analytics engine 200 determining whether a user has actually executed, or continued to execute, the previously identified loss-mitigation variables. Continuing the previous example for User A, assume that the loss-mitigation variables include suggestions for User A to average 10,000 steps per day and to have a wellness examination each year. In certain aspects, machine-learning analytics engine 200 may monitor data provided from User A and/or other sources to determine whether User A has actually followed through with these actions.


For instance, machine-learning analytics engine 200 may periodically ask User A for feedback, require that User A link relevant fitness tracking device account information to his insurance policy, or otherwise request access to this information from User A. In this way, machine-learning analytics engine 200 may monitor user activity to determine if the various intervening actions are actually being executed by User A. Certain aspects may include machine-learning analytics engine 200 confirming that User A is performing the suggested activities in this way, and may send updated notifications to a relevant computing device (e.g., client device 102). However, if User A is not executing the actions represented by the loss-mitigation variables, machine-learning analytics engine 200 may send the computing device 102 warnings or notifications that a failure to do so may impact current and future insurance premiums, as further discussed below.
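

A minimal sketch of this monitoring loop follows. It assumes consensually shared fitness-tracker data is available as a simple list of daily step counts and that the agreed loss-mitigation target is an average of 10,000 steps per day; the notification is represented here as a returned message, standing in for a warning or confirmation transmitted to client device 102.

```python
from statistics import mean

DAILY_STEP_TARGET = 10_000   # agreed loss-mitigation target (hypothetical)


def check_adherence(daily_steps):
    """Compare monitored activity data against the agreed target."""
    avg = mean(daily_steps)
    if avg >= DAILY_STEP_TARGET:
        return (f"Average of {avg:,.0f} steps/day meets the agreed target; "
                "keep it up.")
    return (f"Average of {avg:,.0f} steps/day is below the agreed target of "
            f"{DAILY_STEP_TARGET:,}; a continued shortfall may impact current "
            "and future premiums.")


# Hypothetical two weeks of fitness-tracker data shared by User A.
recent_steps = [7200, 6800, 7500, 7100, 6900, 7300, 7000,
                7400, 6600, 7200, 7100, 6800, 7300, 7000]
print(check_adherence(recent_steps))
```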


In any event, the machine-learning analytics engine 200 may re-train the implemented machine-learning analytics model over time as new data is acquired via the monitoring activities described above. For instance, in the event that User A is not executing the suggested actions described immediately above, the machine-learning analytics model may use different inputs, different weights, and/or different portions of the dynamic data set to determine new or alternate actions that are more likely to be carried out by the user.


Continuing the example above with User A, assume that User A is going to annual wellness examinations but only averaging 7000 steps a day for the last 18 months instead of 10,000. In this case, machine-learning analytics engine 200 may re-train the machine-learning analytics model to identify other users similar in age, activity level, and occupation to User A. From this set of users, the machine-learning analytics model may then identify that similar users increased their activity level when they joined a gym that was close (e.g., within 0.25 miles) to their place of work. Using this new information, the machine-learning analytics engine 200 may replace the previous loss-mitigation variable requiring the user to accumulate 10,000 steps a day with a new one that requires the user to attend a gym for 3 hours a week.


And, by continuing to monitor location data consensually shared by User A, machine-learning analytics engine 200 may further track User A's activity to determine whether User A is performing this new intervening activity. In this way, aspects include machine-learning analytics engine 200 continuously receiving feedback regarding the calculated loss-mitigating variables. Moreover, the machine-learning analytics engine 200 may calculate new or alternate loss-mitigating variables as needed to continuously ensure that the likelihood of the predicted risk-based variables occurring is being reduced.


In other words, in the context of health and life insurance, machine-learning analytics engine 200 may facilitate loss mitigation for an insurer by acting as a virtual "health coach assistant." In accordance with such aspects, the machine-learning analytics engine 200 may identify an ongoing health action plan for an insured user. This health action plan may include, for instance, transmitting one or more loss-mitigation variables to the user's computing device periodically. Again, these loss-mitigation variables may include suggestions to improve an insured's health over time, to prevent unnecessary premature deaths, and to ensure that serious medical conditions either do not occur or are diagnosed in their early stages.


In the present aspects, premium calculation module 239 may be a portion of memory unit 226 configured to store instructions that, when executed by processing unit 222, cause processing unit 222 to perform various acts in accordance with applicable aspects as described herein. For example, instructions stored in premium calculation module 239 may facilitate processing unit 222 performing functions associated with calculating a health or life insurance premium based upon different levels of risk of insuring users for a particular insurance policy. Again, these insurance policy types may be, for instance, life and health insurance policies with various details identified by the user (e.g., deductible, term, etc.).


This may include, for example, calculating premium pricing as an output of the trained machine-learning analytics model and/or in accordance with a correlation between premiums and corresponding assessed risk levels. In any event, insurance premiums may be calculated for a particular type of insurance policy in accordance with the details associated with that insurance policy. Of course, in the event that the machine-learning analytics model is used for pricing, the machine-learning analytics model may also be trained using pricing metrics associated with various insurance policies as input, such that the pricing may be modified and become more accurate over time.


In various aspects, this may include calculating different insurance premiums associated with different levels of assessed risk. For instance, one insurance premium may be calculated that is associated with the initially calculated level of risk, i.e., assuming that the determined risk-based variables will occur with a particular likelihood over a future time horizon. A separate, lower, insurance premium may then be calculated in accordance with a reduced level of risk, which assumes that the user will execute the one or more intervening actions within the future time horizon (or sooner, if applicable). In some aspects, each calculated premium may then be transmitted to the user's computing device (e.g., client device 102), and the user may be given an option to purchase an insurance product in accordance with the lower premium conditioned upon the user performing the one or more loss-mitigating variables.
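

A minimal sketch of pricing two premiums from two assessed risk levels follows; the base rate, the linear loading applied per unit of risk, and the specific risk scores are purely illustrative assumptions rather than any actual rating methodology.

```python
def monthly_premium(base_rate: float, risk_score: float,
                    loading: float = 1.5) -> float:
    """Price a premium as a base rate plus a linear loading on assessed risk."""
    return round(base_rate * (1.0 + loading * risk_score), 2)


BASE_RATE = 180.00        # hypothetical base monthly premium

initial_risk = 0.52       # risk assuming no intervening actions are taken
mitigated_risk = 0.34     # risk assuming the loss-mitigation actions are performed

standard = monthly_premium(BASE_RATE, initial_risk)
conditional = monthly_premium(BASE_RATE, mitigated_risk)

# Both quotes may be transmitted to the user's device, with the lower quote
# conditioned on the user agreeing to perform the intervening actions.
print(f"Standard premium:    ${standard:.2f}/month")
print(f"Conditional premium: ${conditional:.2f}/month")
```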


Furthermore, in the event that the user decides to purchase the insurance product at the lower calculated premium (i.e., agreeing to perform the suggested actions), some aspects include machine-learning analytics engine 200 monitoring the user's activity monitoring data once an insurance policy is actually issued (at this premium) to determine whether the user is performing the suggested actions. Again, the user activity monitoring data may include any suitable type of information relevant to make this determination, such as tracking the user's location, requesting feedback from the user, requesting proof of medical examinations, periodically accessing medical records, monitoring fitness tracker data, etc. Thus, any portion of the user activity monitoring data may be stored as part of the dynamic data set and/or the user data, as described herein, and accessed via the machine-learning analytics engine 200 as needed.


As discussed above, the machine-learning analytics model may also be re-trained based upon the collection of the user activity monitoring data. In some aspects, as described above, this may include the determination of new or alternative loss-mitigation variables. Additionally or alternatively, aspects may include re-training the machine-learning analytics model to determine the likelihood that the user will continue to perform the one or more intervening actions during the future time horizon.


To provide an illustrative example using User A, assume that User A is provided with one premium for health insurance associated with an initial level of assessed risk, i.e., assuming that there is a 75% likelihood that User A's BMI will increase during the next 2 years by 20%. Further assume that a second health insurance premium is calculated based upon the user performing various actions that will decrease the likelihood of the user's BMI increasing to less than 50%. It is then assumed that User A agrees to perform the suggested activities and elects to purchase the health insurance at the second, lower premium (as it reflects a lower risk to the insurer). Then, the machine-learning analytics engine 200 may continue to monitor the aforementioned user data to determine not only that User A is performing the suggested acts, but the likelihood that User A will continue to do so.


Continuing this example, assume that the user activity monitoring data associated with User A for the first 2 weeks after the health insurance policy issued indicates that User A is consuming an average of only 60% of the recommended daily caloric intake. Furthermore, assume that the user activity monitoring data indicates that User A has averaged 18,000 steps per day, well in excess of the recommended 10,000. In one aspect, the machine-learning analytics engine 200 may re-train the machine-learning analytics model (or use a different machine-learning analytics model) to calculate the likelihood of the user continuing to perform the suggested actions during the next two years (less the initial two weeks). In this example, the machine-learning analytics model may identify one or more patterns amongst users similar to User A (e.g., a similar age, gender, occupation, interests, etc.) who have not performed similar suggested activities. This may include, for instance, a pattern indicating that most of these similar users shared a common trait of excessively partaking in diet and exercise initially, but then failed to maintain this activity over the course of the calculated future time horizon.


Assuming that User A fits into this pattern given this example of user activity monitoring data, then certain aspects may include the machine-learning analytics engine 200 identifying new loss-mitigation variables, alternative loss-mitigation variables, and/or updating the existing loss-mitigation variables, which may then be transmitted to a suitable computing device associated with User A (e.g., client device 102). Additionally or alternatively, certain aspects may include the machine-learning analytics engine 200 transmitting notifications to User A regarding updates or adjustments to the loss-mitigation variables.


For instance, machine-learning analytics engine 200 may generate a message informing User A that users with similar habits usually fail to maintain this activity in the long run, and suggesting that User A reduce his daily activity and increase his caloric intake to more sustainable levels. In other words, certain aspects may include attempting to further change User A's behavior to achieve the overall goal of preventing weight gain.


It should be appreciated that in some embodiments, a machine learning model may be constructed as discussed above to mitigate and/or prevent loss with respect to worker's compensation, disability, life, health, or other types of insurance. Similarly, User A above may be, in some embodiments, a corporation or other legal entity.


In various aspects, the machine-learning analytics engine 200 may continuously or periodically use the user activity monitoring data to update the likelihood of the user continuing to perform the suggested actions, and provide updated and/or new suggestions as needed. In this way, machine-learning analytics engine 200 may dynamically update the loss-mitigation variables and/or generate new loss-mitigation variables to adapt to changes in the user's behavior. In doing so, the machine-learning analytics engine 200 helps to ensure that the user continues to perform the suggested actions, thereby minimizing the insurer's loss for health and life-related insurance claims.


In the present aspects, claim handling module 241 is a portion of memory unit 226 configured to store instructions that, when executed by processing unit 222, cause processing unit 222 to perform various acts in accordance with applicable aspects as described herein. For example, instructions stored in claim handling module 241 may facilitate processing unit 222 performing functions associated with implementing streamlined claims handling processes and/or improving upon traditional claims handling by leveraging various aspects of machine-learning.


Exemplary Electronic Claim Record



FIG. 6 depicts text-based content of an exemplary electronic claim record 600 that may be processed by a machine-learning analytics engine in accordance with various aspects of the present disclosure, such as machine-learning analytics engine 200 of FIG. 2, for example. The term "text-based content" as used herein includes printing characters (e.g., characters A-Z and numerals 0-9), in addition to non-printing characters (e.g., whitespace, line breaks, formatting, and control characters). Text-based content may be in any suitable character encoding, such as ASCII or UTF-8, and text-based content may include HTML.


Although text-based content is depicted in the embodiment of FIG. 6, as discussed above, data input and/or used as part of the electronic claim file may include images, including hand-written notes, and the AI platform (e.g., machine-learning analytics engine 200) may include a neural network trained to recognize handwriting and to convert handwriting to text. Further, "text-based content" may be formatted in any acceptable data format, including structured query language (SQL) tables, flat files, hierarchical data formats (XML, JSON, etc.), or other suitable electronic objects. In some embodiments, image and audio data may be fed directly into the neural network(s) without being converted to text first.


With respect to FIG. 6, electronic claim record 600 includes two sections 610a-610b, which respectively represent policy information and loss information (i.e., the cost of health-related services rendered). Policy information 610a may include information about the insurance policy under which the claim has been made, including the person to whom the policy is issued, contact information, the type of plan, deductibles, maximum payouts per year, etc. Policy information 610a may be read, for example, by machine-learning analytics engine 200 analyzing claim data and/or individual claims.


Additional information about the insured (e.g., location, if the issue was related to a pre-existing condition, historical claim data, historical telematics data, family medical history, etc.) may be obtained from various data sources to supplement the input data included with the electronic claim record 600. For example, additional data may be obtained from the dynamic data set and/or the user data (e.g., insurance records/data, as shown in FIG. 3). In some embodiments, in addition to policy information 610a, the electronic claim record 600 may include loss information 610b. In the context of health insurance, the loss information generally corresponds to costs associated with a particular medical condition or accident that necessitated some type of medical treatment for which a claim amount is initially submitted. In the context of a life insurance claim, the loss information 610b may correspond to the total payout in accordance with the life insurance policy benefit.


In any event, the loss information 610b may include the total fees, the date and time the services were rendered, whether personal injury occurred, whether medical professionals made any statements in connection with the loss, etc. For instance, the loss information 610b may include (for health insurance, as shown in FIG. 6) a medical diagnosis, services rendered, details associated with the procedures required, the length of a hospital stay, etc. For life insurance policy claims (not shown), the loss information 610b may include, for instance, additional details such as a time and cause of death, whether an autopsy was performed, etc.


In addition to the loss information 610b, the electronic claim record 600 may include additional information such as linked data 620a-g. It should be appreciated that although only links 620a-g are shown in FIG. 6, more or fewer links may be included, in some embodiments. For example, electronic claim record 600 may link to notice of loss 620a, one or more photographs 620b, one or more audio recordings 620c, one or more investigator's reports 620d, one or more forensic reports 620e, one or more diagrams 620f, and/or one or more payments 620g. Data in links 620a-620g may be ingested by an AI platform, such as machine-learning analytics engine 200, for example. Moreover, as described above, each insurance claim (or various details associated with each claim) may be used as inputs to a neural net as part of training a machine-learning analytics model.


Instructions stored in claim handling module 241 may cause processing unit 222 to retrieve, for each link 620a-620g, all available data or a subset thereof. The data represented by each of links 620a-620g may be included as part of the dynamic data set, as part of the user data, and/or as part of any other suitable additional data sources. Each of links 620a-620g may also be processed, weighted, and/or analyzed according to the type of data contained therein. For instance, machine-learning analytics engine 200 may analyze images included and/or associated with photograph link 620b using any suitable type of image processing to recognize, classify, and/or categorize images (e.g., endoscopic images, ultrasound images, etc.) for use in a health or life insurance claim. To provide another example, machine-learning analytics engine 200 may analyze audio recordings (e.g., doctor's notes, annotations, telephone calls, etc.) included and/or associated with audio recording link 620c using a speech-to-text algorithm to translate audio to text for use in a health or life insurance claim.


In various aspects, a relevance order may be established for each of the links 620a-620g, and processing of the data associated with each respective link may be completed according to that order. For example, portions of a claim that are identified as most dispositive of risk may be identified and processed first. If, in that example, they are dispositive of pricing, then processing of further claim elements may be abated to save processing resources. In one embodiment, once a given number of labels is generated (e.g., 50) processing may automatically abate.
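

A minimal sketch of this relevance-ordered processing with abatement follows. Each link is paired with a hypothetical relevance rank and a stub processor that returns text labels; processing proceeds in rank order and halts once a fixed label budget (50, as noted above) is reached. The link names, ranks, and stub processors are illustrative assumptions.

```python
MAX_LABELS = 50   # abate processing once this many labels have been generated

# (relevance rank, link name, processor); a lower rank is more dispositive.
# Processors here are stubs returning illustrative labels.
linked_data = [
    (1, "notice_of_loss",       lambda: ["inpatient_stay", "appendectomy"]),
    (2, "investigators_report", lambda: ["no_preexisting_condition"]),
    (3, "photographs",          lambda: ["post_operative_image"]),
    (4, "audio_recordings",     lambda: ["physician_dictation"]),
]

labels = []
for rank, name, process in sorted(linked_data):
    if len(labels) >= MAX_LABELS:
        break                          # abate to save processing resources
    labels.extend(process())

print(labels)
```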


Once the various input data comprising electronic claim record 600 have been processed, instructions stored in claim handling module 241 may cause processing unit 222, in one aspect, to execute a text-based analysis of that information, which is then further utilized by the machine-learning analytics engine 200. For example, if the machine-learning analytics model is being trained, then the output of the text-based analysis may be passed to the particular model as part of the training process. Using the aforementioned neural network as an example, the neurons comprising a first input layer of the neural network may be trained such that each neuron receives particular input(s) that may correspond, in one aspect, to one or more pieces of information from policy information 610a and loss information 610b. Similarly, one or more input neurons may be configured to receive particular input(s) from links 620a-620g.
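
A minimal sketch of mapping policy information, loss information, and link availability onto a fixed-order numeric vector (one value per input neuron) might look like the following; the particular fields and their encodings are hypothetical.

```python
def build_input_vector(policy_info, loss_info, links):
    """Map claim fields onto a fixed-order numeric vector, one value per input neuron."""
    return [
        float(loss_info.get("total_fees", 0.0)),
        float(loss_info.get("hospital_stay_days", 0)),
        1.0 if policy_info.get("product") == "health" else 0.0,
        1.0 if "photograph" in links else 0.0,
        1.0 if "audio_recording" in links else 0.0,
    ]

vector = build_input_vector(
    {"product": "health"},
    {"total_fees": 12214.0, "hospital_stay_days": 2},
    {"photograph": "imaging/endoscopy_001.png"},
)
print(vector)  # [12214.0, 2.0, 1.0, 1.0, 0.0]
```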


In various aspects, the data inputs provided by the electronic claim record 600 and/or other information used to train and apply the machine-learning analytics model may be useful to make various predictions associated with insurance claims (e.g., life and health insurance claims). For example, the total cost of a new claim may be predicted by applying the machine-learning analytics model trained using historical electronic claim data from the dynamic data set, of which electronic claim record 600 may be one data point.


In other words, by correlating similar claims, users, policies, diagnoses, etc., machine-learning analytics engine 200 may predict, with a particular probability, the total payout on new claims. Continuing this example, the trained model may be configured so that inputting sample parameters, such as those in the example electronic claim record 600, may accurately predict, for example, the estimate of total costs ($12,214) and the settled amount ($9,500). Initially, random weights may be chosen for all input parameters. Certain aspects may include the machine-learning analytics model being dynamically re-trained as additional electronic claim data is collected, such that the predicted dollar values and the correct dollar values converge.
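
Purely as an illustration of this idea, the sketch below fits a small multilayer perceptron regressor (whose weights start at random values) to a handful of made-up historical claims and then predicts the settled amount for a new claim. The feature choices, dollar figures, and library (scikit-learn) are assumptions; in practice the model would be re-trained as additional claim data accumulates so that predictions converge toward observed settlements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Fabricated historical claims: [estimated total fees, hospital days, claimant age]
X_train = np.array([
    [12214.0, 2, 45],
    [3500.0, 0, 29],
    [48000.0, 7, 63],
    [900.0, 0, 34],
])
y_train = np.array([9500.0, 3100.0, 41000.0, 850.0])  # settled amounts

# Weights start at random values; re-training on additional claims is what drives
# the predicted dollar values toward the observed settlements.
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X_train, y_train)

new_claim = np.array([[12214.0, 2, 45]])
print(model.predict(new_claim))  # predicted settled amount for the new claim
```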


Moreover, certain aspects may include the machine-learning analytics engine 200 performing certain actions in response to various predictions being made about the claims. To provide an illustrative example, assume that a particular electronic claim includes a set of information that correlates to other claims that have a high rate of being flagged, rejected, and/or manually reviewed before they are paid. In one aspect, the machine-learning analytics engine 200 may automatically route, flag, or otherwise manage the claim handling process to ensure that a particular claim is expedited, taking into account the likelihood of the claim requiring further processing.
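
A toy example of routing a claim based on a model-predicted probability that the claim will require manual review is shown below; the threshold, queue names, and probability value are hypothetical.

```python
def route_claim(claim_id, p_needs_review, review_threshold=0.7):
    """Route a claim using the model's predicted probability of needing manual review."""
    if p_needs_review >= review_threshold:
        return {"claim_id": claim_id, "queue": "manual_review", "flagged": True}
    return {"claim_id": claim_id, "queue": "straight_through", "flagged": False}

print(route_claim("CLM-0001", p_needs_review=0.82))
```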


In one aspect, the machine-learning analytics engine 200 may also modify the information available within an electronic claim record. For example, the machine-learning analytics engine 200 may predict a series of labels (i.e., text) as described above that pertain to a given claim. The labels may then be appropriately weighted in accordance with their relevance, or contribution, toward the claim loss value. Next, the labels and corresponding weights, in one embodiment, may be used in conjunction with base rate information to predict a claim loss value. In any event, once the claim loss value is computed, it may be associated with the claim by, for example, writing the amount to the loss information section of the electronic claim record (e.g., to the loss information section 610b of FIG. 6).
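
The following sketch illustrates, under assumed label names, weights, and base rate, how weighted labels might be combined with base rate information to compute a claim loss value and write it back to the loss information section of a claim record.

```python
def predict_claim_loss(labels, label_weights, base_rate):
    """Combine predicted labels, their weights, and a base rate into a claim loss value."""
    multiplier = 1.0 + sum(label_weights.get(label, 0.0) for label in labels)
    return base_rate * multiplier

label_weights = {"surgery_required": 2.5, "hospital_stay": 0.8, "minor_injury": 0.1}
loss_value = predict_claim_loss(
    ["surgery_required", "hospital_stay"], label_weights, base_rate=2000.0
)

claim_record = {"loss_information": {}}
claim_record["loss_information"]["predicted_loss"] = loss_value  # write back to the record
print(claim_record)  # {'loss_information': {'predicted_loss': 8600.0}}
```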


Exemplary Computer-Implemented Methods



FIG. 7 illustrates an exemplary computer-implemented method flow 700, in accordance with certain aspects of the present disclosure. In the present aspects, one or more portions of method 700 (or the entire method 700) may be implemented by any suitable device, and one or more portions of method 700 may be performed by more than one suitable device in combination with one another. For example, one or more portions of method 700 may be performed by machine-learning analytics engine 200, as shown in FIG. 2. In one aspect, method 700 may be performed by any suitable combination of one or more processors, instructions, applications, programs, algorithms, routines, etc. In one embodiment, method 700 may be performed via processing unit 222 executing instructions stored in memory unit 226, as shown in FIG. 2, in conjunction with data collected, received, and/or generated via one or more health institutions (e.g., one or more health institutions 150), one or more back-end computing devices (e.g., one or more back-end computing devices 120), and/or one or more client devices 102 (e.g., client device 102).


Method 700 may start when one or more processors access (block 702) a dynamic data set. This dynamic data set may include, for instance, the dynamic data set shown and discussed herein with reference to FIG. 3, which may include electronic medical records, demographic information, insurance records, lifestyle information, family medical history information, etc. Again, the information accessed from the dynamic data set may include any suitable type of information that is relevant to determine the level of risk associated with insuring a user for a particular type of insurance product.


Method 700 may include one or more processors training (block 704) a machine-learning analytics model. This may include, for example, training any suitable number and type of machine-learning analytics models based upon the specific type of insurance product that is sought. For example, the machine-learning analytics model may include neural networks (e.g., as shown in FIGS. 4-5) that are trained in accordance with specific inputs accessed via the dynamic data set (block 704). Training the machine-learning analytics model may include defining the sample inputs, the importance (e.g., weighting) of the various inputs, and defining the outputs that are determined using the weighted inputs (block 704).
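
As a non-limiting sketch of block 704, the snippet below trains a small neural network (scikit-learn's MLPClassifier standing in for any suitable neural net) on a few fabricated rows drawn from a dynamic data set; the features, labels, and network shape are illustrative assumptions, and the weighting of the inputs is learned during training rather than specified by hand.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Fabricated rows from a dynamic data set: [age, BMI, smoker flag, family history flag]
X = np.array([
    [52, 31.0, 1, 1],
    [34, 22.5, 0, 0],
    [61, 28.4, 0, 1],
    [45, 35.2, 1, 0],
])
# Illustrative label: whether a cardiac-related condition occurred within five years
y = np.array([1, 0, 1, 0])

# The importance (weighting) of each input is learned during training.
model = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
model.fit(X, y)
```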


Method 700 may include one or more processors receiving (block 706) user data. This may include, for example, receiving data entered by a user in response to various prompts (e.g., via client device 102). In various aspects, the user data may be received from one or more sources, and may include data extracted from the dynamic data set that is correlated to a particular user once data is received from that user identifying him or her (block 706). For instance, the user data may represent any suitable number and/or type of information that is useful or otherwise relevant to calculate health and/or life insurance policies for one or more users (block 706).


Method 700 may include one or more processors applying (block 708) the trained machine-learning analytics model to the user data to predict one or more medical-related conditions. This may include, for instance, identifying similar users within the dynamic data set compared to the user represented by the user data, identifying correlations among the user data and/or among the information contained within the dynamic data set, etc., to formulate various predictions regarding the likelihood of various medical-related conditions occurring within some future time period (block 708).
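
One way to picture block 708 is a helper that applies previously trained classifiers (for example, one binary classifier per medical-related condition, such as the model sketched above) to a single user's features and keeps only the conditions predicted above a probability threshold; the per-condition model arrangement and threshold are assumptions.

```python
import numpy as np

def predict_conditions(models_by_condition, user_features, threshold=0.5):
    """Apply one trained binary classifier per medical-related condition to one user."""
    features = np.array([user_features])
    likely = {}
    for condition, model in models_by_condition.items():
        probability = model.predict_proba(features)[0][1]  # P(condition occurs)
        if probability >= threshold:
            likely[condition] = float(probability)
    return likely

# Usage (assuming `model` was trained as in the previous sketch):
# print(predict_conditions({"cardiac_event": model}, [58, 30.1, 1, 1]))
```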


Method 700 may include one or more processors determining (block 710) an initial level of risk associated with insuring the user. This may include, for instance, assigning a weight or importance to each predicted risk-based variable (e.g., medical-related condition) output by the trained machine-learning analytics model. This may additionally include, for instance, calculating an overall risk level associated with insuring a user for a specific type of insurance policy (e.g., a health or life insurance policy), assuming that the predicted risk-based variables will occur over a particular future time horizon in accordance with a specific probability.
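
A minimal sketch of block 710, assuming the predicted conditions arrive as probabilities and each condition carries an assigned weight, simply forms a weighted sum as the initial risk level; the condition names, probabilities, and weights below are illustrative.

```python
def initial_risk_level(predicted_conditions, condition_weights):
    """Weight each predicted medical-related condition and sum into an overall risk score."""
    return sum(
        probability * condition_weights.get(condition, 1.0)
        for condition, probability in predicted_conditions.items()
    )

predicted = {"cardiac_event": 0.30, "type_2_diabetes": 0.55}
weights = {"cardiac_event": 5.0, "type_2_diabetes": 2.0}
print(initial_risk_level(predicted, weights))  # 0.30*5.0 + 0.55*2.0 = 2.6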


Method 700 may include one or more processors identifying (block 712) one or more loss-mitigating variables (or loss-prevention variables) that reduce the initial determined (block 710) level of risk. This may include, for instance, the identification of various actions that, when performed by a user, may reduce the likelihood of the various risk-based variables occurring within a future time period. Again, these actions may be determined, for example, in accordance with a re-trained or alternate machine-learning analytics model (compared to the model that was applied to determine the risk-based variables), based upon a correlation to other similar actions and/or other similar users, such that actions that have previously been known to work for similar users are selected.


Method 700 may include one or more processors calculating (block 714) an insurance premium corresponding to the initially determined (block 710) level of risk and/or a reduced level of risk that is associated with an assumption that the user will perform the one or more identified (block 712) actions. This may include, for example, executing another trained machine-learning analytics model to calculate pricing and/or correlating each calculated level of risk to an insurance premium, as discussed herein.
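
For block 714, one illustrative (and deliberately simplistic) mapping from a risk level to a premium is a linear rate applied on top of a base premium, as sketched below; the base premium and rate are hypothetical, and an actual embodiment might instead execute another trained model for pricing, as noted above.

```python
def premium_for_risk(risk_score, base_premium=50.0, rate_per_risk_unit=40.0):
    """Map a risk score to a monthly premium using an illustrative linear rate."""
    return base_premium + rate_per_risk_unit * risk_score

initial = premium_for_risk(2.6)  # premium at the initially determined level of risk
reduced = premium_for_risk(1.8)  # premium assuming the loss-mitigating actions are performed
print(round(initial, 2), round(reduced, 2))  # 154.0 122.0
```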


Method 700 may include one or more processors transmitting (block 716) an insurance premium corresponding to the initially determined (block 710) level of risk and/or a reduced level of risk that is associated with an assumption that the user will perform the one or more identified (block 712) actions. Additionally or alternatively, this may include transmitting the one or more loss-mitigating variables (e.g., one or more actions to be taken by the user). In any event, the calculated insurance premiums and/or one or more loss-mitigating variables, upon being transmitted, may be received by a suitable computing device (e.g., client device 102), and presented to a user on a suitable display (e.g., display 110). The method may include additional, less, or alternate functionality or actions, including those discussed elsewhere herein.


Technical Advantages


The aspects described herein may be implemented as part of one or more computer components such as a client device and/or one or more back-end components, such as one or more machine-learning analytics engines 120.1 and/or machine-learning analytics engine 200, for example. Furthermore, the aspects described herein may be implemented as part of a computer network architecture that facilitates communications between various other devices and/or components. Thus, the aspects described herein address and solve issues of a technical nature that are necessarily rooted in computer technology.


For instance, aspects include analyzing various sources of data to train a machine-learning analytics model and to execute the trained model to make various predictions. In doing so, the aspects overcome issues associated with the inconvenience of manual and/or unnecessary monitoring of such data. Moreover, because of the nature of machine-learning systems, juxtapositions of data and/or correlations of information may be made that would not be possible within the confines of traditional insurance underwriting. Furthermore, because the machine-learning analytics model may be re-trained as additional information is added to the dynamic data set, the accuracy and efficiency of the system are improved over time given the inherent nature of machine learning systems. Without the improvements suggested herein, additional time, processing resources, and memory usage would be required to achieve these results and, in some instances, the results would be otherwise unachievable.


Furthermore, the machine-learning techniques described herein improve upon existing technologies by more accurately forecasting and mitigating conditions representative of future risk to an insurer, and allow for large data sets to be monitored from a larger number of sources than would otherwise be feasible or practical. Due to these improvements, the aspects address computer-related issues regarding efficiency over the traditional amount of processing power and models used to assess risk and/or price insurance in a manner that accurately reflects insurer risk and mitigates the loss borne by an insurer in the event that a claim is made. Still further, the improvements discussed herein leverage machine-learning techniques to streamline the electronic claim process by accessing a history of claims. These improvements further increase the speed with which an insurer may process insurance claim data, and improve the overall insurance claim process as compared to traditional claims handling.


Thus, the aspects may also improve upon computer technology by requiring fewer calculations due to the increased efficiency provided, for example, via the combination of processes, steps, elements, and/or components described herein. In other words, the specific combination of elements and/or components working in conjunction with one another (e.g., via networked communications) in and of itself represents a significant improvement to the overall technology involved.


Exemplary Computer-Implemented Method for Implementing Machine Learning to Calculate and Mitigate Insurer Risk


In one aspect, a computer-implemented method for implementing a machine-learning analytics model to calculate a level of risk of insuring a user, and/or how to reduce this risk may be provided. The method may include one or more processors and/or associated transceivers (1) accessing a dynamic data set associated with one or more users including electronic medical records, demographic information, insurance records, and/or lifestyle information; (2) training a machine-learning analytics model (or other artificial intelligence model) using the dynamic data set as training data to generate a trained machine-learning analytics model; (3) receiving user data associated with a user; (4) applying the trained machine-learning analytics model to the user data to predict one or more medical-related conditions associated with the user based upon the user data; (5) determining, in accordance with the trained machine-learning analytics model, a first level of risk associated with insuring the user based upon the one or more predicted medical-related conditions; (6) identifying, in accordance with the trained machine-learning analytics model, one or more intervening actions that, when executed by the user within a future time period, reduce the first level of risk associated with insuring the user to a second level of risk; and/or (7) transmitting the one or more intervening actions to a computing device to be presented to the user. The method may include additional, less, or alternate actions, including those discussed elsewhere herein.


For instance, in various aspects, the first and second levels of risk associated with insuring the user may represent insuring the user for a health insurance or a life insurance policy. Moreover, the machine-learning analytics model may include a neural net, such that training the machine-learning analytics model includes training a neural net.


In some aspects, premiums may be calculated for each of the first and the second level of risk, and these premium calculations may additionally be transmitted to the computing device for presentation to the user. In some instances, the first and second levels of risk, and their respective calculated premiums, may be associated with a risk of insuring a user for a health or life insurance product. In such aspects, upon insuring the user for the health or the life insurance policy in accordance with the calculated health or life insurance premium, the method may additionally include accessing the dynamic data set to collect user activity monitoring data and/or applying the trained machine-learning analytics model to the user activity monitoring data to determine a likelihood of whether the user will continue to execute the one or more intervening actions during the future time period. Still further, when the insurance product is a life or health insurance product, the future time period may correspond to a period of insurance coverage for a health or a life insurance policy.


Additionally, certain aspects may include the intervening actions including various suggestions that, when executed by the user, may reduce the initial level of risk borne by the insurer. These intervening actions may include, for instance, suggestions regarding a type and/or frequency of exercise, daily nutrition, lifestyle habits, etc.


Exemplary Computing Device for Implementing Machine Learning to Calculate and Mitigate Insurer Risk


In another aspect, a computing device for implementing a machine-learning analytics model to calculate a level of risk of insuring a user, and/or how to reduce this risk may be provided. The computing device may include a communication unit configured to access a dynamic data set associated with one or more users including electronic medical records, demographic information, insurance records, and lifestyle information, and to receive user data associated with a user. Additionally, the computing device may include a processing unit that is configured to (1) train a machine-learning analytics model using the dynamic data set as training data to generate a trained machine-learning analytics model; (2) apply the trained machine-learning analytics model to the user data to predict a set of one or more medical-related conditions associated with the user; (3) determine a first level of risk associated with insuring the user based upon the one or more predicted medical-related conditions in accordance with the trained machine-learning analytics model; and/or (4) identify one or more intervening actions in accordance with the trained machine-learning analytics model that, when executed by the user within a future time period, reduce the first level of risk associated with insuring the user to a second level of risk. Moreover, the communication unit may be further configured to transmit the one or more intervening actions to a computing device to be presented to the user. The computing device may include additional, less, or alternate components, including those discussed elsewhere herein.


For instance, in various aspects, the first and second levels of risk associated with insuring the user may represent insuring the user for a health insurance or a life insurance policy. Moreover, the machine-learning analytics model may include a neural net, such that training the machine-learning analytics model includes training a neural net.


In some aspects, the processing unit may be configured to calculate the premiums for each of the first and the second level of risk, and the communication unit may be further configured to transmit these premium calculations to the computing device for presentation to the user. In some instances, the first and second levels of risk, and their respective calculated premiums, may be associated with a risk of insuring a user for a health or life insurance product. In such aspects, upon insuring the user for the health or the life insurance policy in accordance with the calculated health or life insurance premium, the processing unit may be further configured to access the dynamic data set to collect user activity monitoring data and/or apply the trained machine-learning analytics model to the user activity monitoring data to determine a likelihood of whether the user will continue to execute the one or more intervening actions during the future time period. Still further, when the insurance product is a life or health insurance product, the future time period may correspond to a period of insurance coverage for a health or a life insurance policy.


Additionally, in some aspects, the intervening actions may include various suggestions that, when executed by the user, may reduce the initial level of risk borne by the insurer. These intervening actions may include, for instance, suggestions regarding a type and/or frequency of exercise, daily nutrition, lifestyle habits, etc.


Exemplary Computer-Readable Media for Implementing Machine Learning to Calculate and Mitigate Insurer Risk


In yet another aspect, a non-transitory computer readable media may be provided to calculate a level of risk of insuring a user and/or how to reduce this risk. The instructions stored on the non-transitory computer readable media may, when executed by one or more processors, cause the one or more processors to: (1) access a dynamic data set associated with one or more users including electronic medical records, demographic information, insurance records, and/or lifestyle information; (2) train a machine-learning analytics model using the dynamic data set as training data to generate a trained machine-learning analytics model; (3) receive user data associated with a user; (4) apply the trained machine-learning analytics model to the user data to predict one or more medical-related conditions associated with the user based upon the user data; (5) determine, in accordance with the trained machine-learning analytics model, a first level of risk associated with insuring the user based upon the one or more predicted medical-related conditions; (6) identify, in accordance with the trained machine-learning analytics model, one or more intervening actions that, when executed by the user within a future time period, reduce the first level of risk associated with insuring the user to a second level of risk; and/or (7) transmit the one or more intervening actions to a computing device to be presented to the user. The non-transitory computer readable media may include additional, less, or alternate instructions stored thereon, including those discussed elsewhere herein.


For instance, in various aspects, the instructions may cause the one or more processors to calculate the first and second levels of risk as those associated with a risk of insuring the user for health insurance or life insurance. Moreover, the machine-learning analytics model may include a neural net, and the instructions may cause the one or more processors to train the machine-learning analytics model by training a neural net.


In some aspects, the instructions may cause the one or more processors to calculate the premiums for each of the first and the second level of risk, and these premium calculations may additionally be transmitted to the computing device for presentation to the user. In some instances, the first and second levels of risk, and their respective calculated premiums, may be associated with a risk of insuring a user for a health or life insurance product. In such aspects, upon insuring the user for the health or the life insurance policy in accordance with the calculated health or life insurance premium, the instructions may cause the one or more processors to access the dynamic data set to collect user activity monitoring data and/or apply the trained machine-learning analytics model to the user activity monitoring data to determine a likelihood of whether the user will continue to execute the one or more intervening actions during the future time period. Still further, when the insurance product is a life or health insurance product, the future time period may correspond to a period of insurance coverage for a health or a life insurance policy.


Furthermore, certain aspects may include the instructions causing the one or more processors to determine intervening actions that include various suggestions that, when executed by the user, may reduce the initial level of risk borne by the insurer. These intervening actions may include, for instance, suggestions regarding a type and/or frequency of exercise, daily nutrition, lifestyle habits, etc.


Additional Considerations


As discussed herein, data may be collected from various sources to generate, update, and/or modify a dynamic data set that is used to train and apply a machine-learning analytics model. As described herein, the collection of data may be performed after the user provides their affirmative consent or permission, in some aspects. Furthermore, there are several references herein to re-training the machine-learning analytics models to perform different calculations, make different types of predictions, utilize different types of information as inputs, use different weightings, etc. In such cases, it will be understood that, as an alternative to re-training a machine-learning analytics model, more than one machine-learning analytics model may be implemented, each with a particular function and/or structure. These machine-learning analytics models may be of the same type or different types depending on the available information and/or the particular objective that is sought to be achieved by each one.


Moreover, in many instances throughout the present disclosure the example insurance types are life or health insurance policies. However, the aspects described herein are applicable to any suitable type of insurance policy that may implement risk assessment as part of the pricing structure.


This detailed description is to be construed as exemplary only and does not describe every possible embodiment, as describing every possible embodiment would be impractical, if not impossible. One may implement numerous alternate embodiments, using either current technology or technology developed after the filing date of this application.


Furthermore, although the present disclosure sets forth a detailed description of numerous different embodiments, it should be understood that the legal scope of the description is defined by the words of the claims set forth at the end of this patent and equivalents. The detailed description is to be construed as exemplary only and does not describe every possible embodiment since describing every possible embodiment would be impractical. Numerous alternative embodiments may be implemented, using either current technology or technology developed after the filing date of this patent, which would still fall within the scope of the claims.


The following additional considerations apply to the foregoing discussion. Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.


Additionally, certain embodiments are described herein as including logic or a number of routines, subroutines, applications, or instructions. These may constitute either software (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware. In hardware, the routines, etc., are tangible units capable of performing certain operations and may be configured or arranged in a certain manner. In exemplary embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.


In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.


Accordingly, the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.


Hardware modules may provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and may operate on a resource (e.g., a collection of information).


The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.


Similarly, the methods or routines described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment or as a server farm), while in other embodiments the processors may be distributed across a number of locations.


In some exemplary embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a vehicle, within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.


Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or a combination thereof), registers, or other machine components that receive, store, transmit, or display information.


As used herein any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.


Some embodiments may be described using the expressions “coupled” and “connected” along with their derivatives. For example, some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. The embodiments are not limited in this context.


As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).


In addition, the terms “a” or “an” are employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the description. This description, and the claims that follow, should be read to include one or at least one and the singular also includes the plural unless it is obvious that it is meant otherwise.


The patent claims at the end of this patent application are not intended to be construed under 35 U.S.C. § 112(f) unless traditional means-plus-function language is expressly recited, such as “means for” or “step for” language being explicitly recited in the claim(s).


The various systems and methods described herein are directed to an improvement to computer functionality, and improve the functioning of conventional computers, as described, for example, in the “Technical Advantages” Section and elsewhere herein.

Claims
  • 1. A computer-implemented method, comprising: accessing, via one or more processors, a dynamic data set associated with one or more users including electronic medical records, demographic information, insurance records, and lifestyle information; training, via one or more processors, a machine-learning analytics model using the dynamic data set as training data to generate a trained machine-learning analytics model; receiving, via the one or more processors, user data associated with a user; applying, via the one or more processors, the trained machine-learning analytics model to the user data to predict a set of one or more medical-related conditions associated with the user; determining, via the one or more processors in accordance with the trained machine-learning analytics model, a first level of risk associated with insuring the user based upon the one or more predicted medical-related conditions; identifying, via the one or more processors in accordance with the trained machine-learning analytics model, one or more intervening actions that, when executed by the user within a future time period, reduce the first level of risk associated with insuring the user to a second level of risk; transmitting, via the one or more processors, the one or more intervening actions to a computing device to be presented to the user; monitoring user activity associated with the one or more identified intervening actions to collect user activity monitoring data; re-training, via the one or more processors, the trained machine-learning analytics model using the user activity monitoring data; and applying, via the one or more processors, the trained machine-learning analytics model to the user activity monitoring data to determine a likelihood of whether the user will continue to execute the one or more intervening actions during the future time period.
  • 2. The computer-implemented method of claim 1, wherein the first and second levels of risk associated with insuring the user represent insuring the user for a health insurance, worker's compensation, disability or life insurance policy.
  • 3. The computer-implemented method of claim 1, further comprising: calculating, via the one or more processors, a first insurance premium associated with insuring the user in accordance with the first level of risk; calculating, via the one or more processors, a second insurance premium associated with insuring the user in accordance with the second level of risk; and transmitting, via the one or more processors, the first and the second insurance premium to the computing device for presentation to the user.
  • 4. The computer-implemented method of claim 1, further comprising: calculating, via the one or more processors, a health or life insurance premium associated with insuring the user in accordance with the second level of risk; and upon insuring the user for the health or the life insurance policy in accordance with the calculated health or life insurance premium, accessing, via one or more processors, the dynamic data set to collect user activity monitoring data.
  • 5. The computer-implemented method of claim 1, wherein the act of training the machine-learning analytics model includes training a neural net.
  • 6. The computer-implemented method of claim 1, wherein the one or more intervening actions include suggestions regarding (i) a type and frequency of exercise, (ii) daily nutrition, and (iii) lifestyle habits.
  • 7. The computer-implemented method of claim 1, wherein the future time period corresponds to a period of insurance coverage for a health, worker's compensation, disability or life insurance policy.
  • 8. A computing device, comprising: a communication unit configured to access a dynamic data set associated with one or more users including electronic medical records, demographic information, insurance records, and lifestyle information, and to receive user data associated with a user; and a processing unit configured to: train a machine-learning analytics model using the dynamic data set as training data to generate a trained machine-learning analytics model; apply the trained machine-learning analytics model to the user data to predict a set of one or more medical-related conditions associated with the user; determine a first level of risk associated with insuring the user based upon the one or more predicted medical-related conditions in accordance with the trained machine-learning analytics model; identify one or more intervening actions in accordance with the trained machine-learning analytics model that, when executed by the user within a future time period, reduce the first level of risk associated with insuring the user to a second level of risk; transmit, via the communication unit, the one or more intervening actions to a computing device to be presented to the user; monitor user activity associated with the one or more identified intervening actions to collect user activity monitoring data; re-train the trained machine-learning analytics model using the user activity monitoring data; and apply the trained machine-learning analytics model to the user activity monitoring data to determine a likelihood of whether the user will continue to execute the one or more intervening actions during the future time period.
  • 9. The computing device of claim 8, wherein the first and second levels of risk associated with insuring the user represent insuring the user for a health, worker's compensation, disability or life insurance policy.
  • 10. The computing device of claim 8, wherein the processing unit is further configured to: calculate a first insurance premium associated with insuring the user in accordance with the first level of risk; calculate a second insurance premium associated with insuring the user in accordance with the second level of risk; and transmit the first and the second insurance premium to the computing device for presentation to the user.
  • 11. The computing device of claim 8, wherein the processing unit is further configured to: calculate a health or life insurance premium associated with insuring the user in accordance with the second level of risk; and upon insuring the user for the health or the life insurance policy in accordance with the calculated health or life insurance premium, access the dynamic data set to collect user activity monitoring data.
  • 12. The computing device of claim 8, wherein the processing unit is further configured to train the machine-learning analytics model by training a neural net.
  • 13. The computing device of claim 8, wherein the one or more intervening actions include suggestions regarding (i) a type and frequency of exercise, (ii) daily nutrition, and (iii) lifestyle habits.
  • 14. The computing device of claim 8, wherein the future time period corresponds to a period of insurance coverage for a health, worker's compensation, disability or life insurance policy.
  • 15. A non-transitory computer readable media having instructions stored thereon that, when executed by one or more processors, cause the one or more processors to: access a dynamic data set associated with one or more users including electronic medical records, demographic information, insurance records, and lifestyle information; train a machine-learning analytics model using the dynamic data set as training data to generate a trained machine-learning analytics model; receive user data associated with a user; apply the trained machine-learning analytics model to the user data to predict a set of one or more medical-related conditions associated with the user; determine, in accordance with the trained machine-learning analytics model, a first level of risk associated with insuring the user based upon the one or more predicted medical-related conditions; identify, in accordance with the trained machine-learning analytics model, one or more intervening actions that, when executed by the user within a future time period, reduce the first level of risk associated with insuring the user to a second level of risk; transmit the one or more intervening actions to a computing device to be presented to the user; monitor user activity associated with the one or more identified intervening actions to collect user activity monitoring data; re-train the trained machine-learning analytics model using the user activity monitoring data; and apply the trained machine-learning analytics model to the user activity monitoring data to determine a likelihood of whether the user will continue to execute the one or more intervening actions during the future time period.
  • 16. The non-transitory computer readable media of claim 15, wherein the first and second levels of risk associated with insuring the user represent insuring the user for a health insurance or a life insurance policy, and wherein the future time period corresponds to a period of insurance coverage for the health or the life insurance policy.
  • 17. The non-transitory computer readable media of claim 15, further including instructions that, when executed by one or more processors, cause the one or more processors to (i) calculate a first insurance premium associated with insuring the user in accordance with the first level of risk, (ii) calculate a health or life insurance premium associated with insuring the user in accordance with the second level of risk, and (iii) transmit the first and the second insurance premium to the computing device for presentation to the user.
  • 18. The non-transitory computer readable media of claim 15, further including instructions that, when executed by one or more processors, cause the one or more processors to (i) calculate a health or life insurance premium associated with insuring the user in accordance with the second level of risk, and (ii) upon insuring the user for the health or the life insurance policy in accordance with the calculated premium, access the dynamic data set to collect user activity monitoring data.
  • 19. The non-transitory computer readable media of claim 15, wherein the instructions to train the machine-learning analytics model further include instructions that, when executed by one or more processors, cause the one or more processors to train the machine-learning analytics model by training a neural net.
  • 20. The non-transitory computer readable media of claim 15, wherein the one or more intervening actions include suggestions regarding (i) a type and frequency of exercise, (ii) daily nutrition, and (iii) lifestyle habits.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to and the benefit of: U.S. Application No. 62/564,055, filed Sep. 27, 2017 and entitled "REAL PROPERTY MONITORING SYSTEMS AND METHODS FOR DETECTING DAMAGE AND OTHER CONDITIONS;" U.S. Application No. 62/580,655, filed Nov. 2, 2017 and entitled "REAL PROPERTY MONITORING SYSTEMS AND METHODS FOR DETECTING DAMAGE AND OTHER CONDITIONS;" U.S. Application No. 62/610,599, filed Dec. 27, 2017 and entitled "AUTOMOBILE MONITORING SYSTEMS AND METHODS FOR DETECTING DAMAGE AND OTHER CONDITIONS;" U.S. Application No. 62/621,218, filed Jan. 24, 2018 and entitled "AUTOMOBILE MONITORING SYSTEMS AND METHODS FOR LOSS MITIGATION AND CLAIMS HANDLING;" U.S. Application No. 62/621,797, filed Jan. 25, 2018 and entitled "AUTOMOBILE MONITORING SYSTEMS AND METHODS FOR LOSS RESERVING AND FINANCIAL REPORTING;" U.S. Application No. 62/580,713, filed Nov. 2, 2017 and entitled "REAL PROPERTY MONITORING SYSTEMS AND METHODS FOR DETECTING DAMAGE AND OTHER CONDITIONS;" U.S. Application No. 62/618,192, filed Jan. 17, 2018 and entitled "REAL PROPERTY MONITORING SYSTEMS AND METHODS FOR DETECTING DAMAGE AND OTHER CONDITIONS;" U.S. Application No. 62/625,140, filed Feb. 1, 2018 and entitled "SYSTEMS AND METHODS FOR ESTABLISHING LOSS RESERVES FOR BUILDING/REAL PROPERTY INSURANCE;" U.S. Application No. 62/646,729, filed Mar. 22, 2018 and entitled "REAL PROPERTY MONITORING SYSTEMS AND METHODS FOR LOSS MITIGATION AND CLAIMS HANDLING;" U.S. Application No. 62/646,735, filed Mar. 22, 2018 and entitled "REAL PROPERTY MONITORING SYSTEMS AND METHODS FOR RISK DETERMINATION;" U.S. Application No. 62/646,740, filed Mar. 22, 2018 and entitled "SYSTEMS AND METHODS FOR ESTABLISHING LOSS RESERVES FOR BUILDING/REAL PROPERTY INSURANCE;" U.S. Application No. 62/617,851, filed Jan. 16, 2018 and entitled "IMPLEMENTING MACHINE LEARNING FOR LIFE AND HEALTH INSURANCE PRICING AND UNDERWRITING;" U.S. Application No. 62/622,542, filed Jan. 26, 2018 and entitled "IMPLEMENTING MACHINE LEARNING FOR LIFE AND HEALTH INSURANCE LOSS MITIGATION AND CLAIMS HANDLING;" U.S. Application No. 62/632,884, filed Feb. 20, 2018 and entitled "IMPLEMENTING MACHINE LEARNING FOR LIFE AND HEALTH INSURANCE LOSS RESERVING AND FINANCIAL REPORTING;" U.S. Application No. 62/652,121, filed Apr. 3, 2018 and entitled "IMPLEMENTING MACHINE LEARNING FOR LIFE AND HEALTH INSURANCE CLAIMS HANDLING;" the entire disclosures of which are hereby incorporated by reference herein in their entireties.

Provisional Applications (15)
Number Date Country
62652121 Apr 2018 US
62646729 Mar 2018 US
62646735 Mar 2018 US
62646740 Mar 2018 US
62632884 Feb 2018 US
62625140 Feb 2018 US
62622542 Jan 2018 US
62621797 Jan 2018 US
62621218 Jan 2018 US
62618192 Jan 2018 US
62617851 Jan 2018 US
62610599 Dec 2017 US
62580655 Nov 2017 US
62580713 Nov 2017 US
62564055 Sep 2017 US