Any and all applications for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application are hereby incorporated by reference under 37 CFR 1.57.
This document relates to systems and techniques for electronic notebooks.
Conventional techniques for recording information include physical notebooks and simple electronic notebooks. However, such notebooks tend to be static in nature, recording manually entered information and manually selected locations and then displaying such information as entered.
The following presents a simplified summary of one or more aspects in order to provide a basic understanding of such aspects. This summary is not an extensive overview of all contemplated aspects, and is intended to neither identify key or critical elements of all aspects nor delineate the scope of any or all aspects. Its sole purpose is to present some concepts of one or more aspects in a simplified form as a prelude to the more detailed description that is presented later.
This document describes systems, processes and techniques that may be used to manage and process the recording, arrangement, text processing, word recognition, and/or review of information for or in an electronic notebook, such as a patient, psychiatrist, psychologist, or other medical professional electronic notebook. For example, an electronic or digital notebook may optionally be managed by a hosted, secure, cloud-based system comprised of co-located and/or geographically distributed server systems. The electronic notebook may be accessed over a network by one or more users via one or more user terminals. For example, the users may comprise one or more medical professionals (e.g., a psychiatrist, a family physician, a neurologist, a geriatrician, a therapist, etc.), a patient, a family member of the patient, a caretaker, etc. The electronic notebook may enable two or more users to collaborate over a network with respect to a patient's data and care. Optionally, a user of the electronic notebook may issue an invitation to one or more other users to collaborate.
Embodiments will now be described with reference to the drawings summarized below. These drawings and the associated description are provided to illustrate example aspects of the disclosure, and not to limit the scope of the invention.
This document describes systems, processes and techniques that may be used to manage and process the recording, arrangement, text processing, word recognition, and/or review of information in an electronic notebook, such as an electronic psychiatrist, psychologist, or other medical professional notebook. For example, a notebook may optionally be managed by a hosted, secure, cloud-based system comprised of co-located and/or geographically distributed server systems. The electronic notebook may be accessed over a network by one or more users via one or more user terminals (e.g., via a desktop computer, laptop computer, tablet, smart phone, networked television, or network-connected wearable device). By way of illustration, the electronic notebook may be accessed as a web document via a browser and/or via a dedicated application (sometimes referred to herein as an "app") installed and hosted on a user device. Optionally, some or all of the information processing described herein may be performed by a system remote from the user terminal (e.g., the cloud system), by the user terminal itself, or split between the two, with some processing performed remotely and some performed by the user terminal. The notebook may include multiple sections, as discussed elsewhere herein. Optionally, the various electronic notebook sections may be organized using visual tabs.
Information between a user terminal and the remote system may be synchronized periodically and/or in response to an event (e.g., a detection of a change of data or receipt of new data). By way of example, as will be discussed in greater detail herein, a system (e.g., the cloud system or a user device hosting an electronic notebook application) may generate, using information recorded or accessed via the electronic notebook, a health timeline for a patient. The health timeline may be updated and the updates may be continuously or periodically synchronized.
The following example relates to an electronic medical information notebook. Optionally, some or all of the information communicated between a user terminal app (e.g., an electronic notebook app) and a remote system is transmitted securely to comply with certain regulatory specifications. For example, in order to ensure the confidentiality of medical information, the information may be handled so as to comply with the Health Insurance Portability and Accountability Act (HIPAA). For example, some or all of the information may be encrypted using an encryption key.
The data may be secured by establishing a virtual private network (VPN) that provides an encrypted transmission path between the user terminal and the remote system. Optionally, Secure Sockets Layer (SSL), a secure transfer tunnel, may be used to encrypt data in transit between the user terminal (e.g., the notebook app and/or browser) and the remote system. Optionally, some or all of the information may be stored on the user terminal and/or the remote system using file encryption. Optionally, the encryption key may be stored physically separate from the data being encrypted (e.g., on different physical servers).
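By way of illustration, the following is a minimal sketch of symmetric file encryption using the Fernet recipe from the open-source Python cryptography package (the library choice and the record contents are illustrative assumptions, not a required implementation):

```python
from cryptography.fernet import Fernet

# Generate a symmetric key; per the above, the key may be stored on a
# physical server separate from the encrypted notebook data.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt a notebook record before writing it to storage.
record = b'{"patient_id": "12345", "note": "office visit summary"}'
encrypted = cipher.encrypt(record)

# Decrypt the record when an authenticated user accesses it.
assert cipher.decrypt(encrypted) == record
```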
Optionally, access to notebook and/or other medical information is restricted through user authentication. User authentication may be received in the form of a password and/or biometrics. For example, the user terminal may be equipped with a fingerprint scanner which may be used to compare a fingerprint of someone attempting to access the user terminal and/or the notebook information with that of an authorized user. If there is a match, access may be granted to the user terminal and/or notebook information. If the fingerprints fail to match, access to the user terminal and/or notebook information may be denied. Another form of biometrics may be facial recognition. For example, the user terminal may be equipped with a camera which may be used to capture an image of someone attempting to access the user terminal and/or notebook information. Features extracted from the image may be compared to stored features of an authorized user. If there is a match, access may be granted to the user terminal and/or notebook information. If the facial features fail to match, access to the user terminal and/or notebook information may be denied. Other authentication techniques may be used, such as voice recognition, secure fobs, and the like.
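A minimal sketch of the feature-comparison step is provided below, assuming facial or fingerprint features have already been extracted into numeric vectors (the cosine-similarity measure and the threshold value are illustrative assumptions):

```python
import math

def cosine_similarity(a, b):
    # Similarity between two feature vectors, in the range [-1, 1].
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def authenticate(candidate_features, stored_features, threshold=0.95):
    # Grant access only if the captured features are sufficiently close
    # to the stored features of an authorized user.
    return cosine_similarity(candidate_features, stored_features) >= threshold
```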
Optionally, the users may comprise one or more medical professionals (e.g., a psychiatrist, a family physician, a neurologist, a geriatrician, a therapist, etc.), patients, patient family members, etc. The electronic notebook may enable two or more users to collaborate over a network. Optionally, a user of the electronic notebook may issue an invitation to one or more other users to collaborate. For example, the collaboration may relate to providing information with respect to a patient (e.g., past or recommended future treatments, changes in the patient's lifestyle, etc.). The invitation may be transmitted from the user's terminal directly to the invitee's terminal, or the invitation may be routed through the remote system to the invitee's terminal. The invitation may be provided to the invitee via a pop-up invitation displayed on the invitee's terminal (e.g., by the notebook app), via an SMS/MMS message, via an email message, via a notebook interface presented via a browser, etc.
A user (who may be a patient, a medical professional, a family member, a caretaker, etc.) may utilize the electronic notebook to record information regarding a patient/client (e.g., a patient with a mental or physical illness, a patient with a physical or cognitive disability, a patient with a drug addiction issue, a patient with aging-related issues, etc.). The notebook may be used to record, process, and reproduce textual information, audio recordings, video recordings (which may include an associated audio recording track), photographs, medical diagnoses, x-rays, MRI scans, CAT scans, PET scans, medical test reports, medical treatment information, and/or other information. For example, textual information may include questions asked by a medical professional of a patient and/or the patient's family members, and responses to such questions. Optionally, a given item of information recorded in the notebook may be stored in association with metadata, such as some or all of the following: an identifier (e.g., name or user ID) associated with the user that recorded the information; an identifier indicating the user function (e.g., psychiatrist, patient, parent of the patient, child of the patient, etc.); geo-location information indicating the physical location of the user when the user entered the information (e.g., GPS location information received from the user terminal, such as a mobile phone); etc.
By way of example, the electronic notebook may be utilized to record which medical professional a patient first encountered when admitted to an emergency room, which other medical professionals the patient was treated by in the emergency room, and who performed which tests (e.g., x-rays, MRIs, other scans, blood tests, etc.). By way of further example, the electronic notebook may be used to list potential diagnoses, and to indicate when a given listed diagnosis has been determined to be no longer a potential diagnosis.
The notebook may also be used to search for and/or display specialists of a specified type that are in the geographic area of the patient (e.g., within a specified region, city, zip code, or a specified number of miles from the patient's residence and/or from the device hosting the notebook, etc.). For example, a search for specialists of a specified type that are in the geographic area of the patient may be executed by a search engine, which will return a list of names that satisfy the search criteria. A given specialist's name may be presented by the notebook app (or a browser) in the form of a link or in association with a link, wherein if the user clicks on the link, the notebook will access and display additional information regarding the specialist, such as the schools attended, the hospitals where the specialist interned, the hospitals where the specialist had a fellowship, the hospitals that the specialist has admission privileges for, ratings from one or more rating sources, etc.
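A minimal sketch of the geographic filtering is provided below, assuming each specialist record carries a latitude/longitude pair (the record fields and the haversine distance computation are illustrative assumptions):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points, in miles.
    earth_radius = 3958.8
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
         * math.sin(dlon / 2) ** 2)
    return 2 * earth_radius * math.asin(math.sqrt(a))

def specialists_near(specialists, specialty, lat, lon, max_miles):
    # Return specialists of the requested type within the given radius
    # of the patient's residence or the device hosting the notebook.
    return [s for s in specialists
            if s["specialty"] == specialty
            and haversine_miles(lat, lon, s["lat"], s["lon"]) <= max_miles]
```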
The electronic medical information notebook may be configured to make it easy for a patient or patient caretaker to access and understand the medical information, and to enter information, appointments, records, and to-do lists. As will be discussed, the electronic notebook may include user interfaces configured to provide some or all of the following: the ability to receive background and biographical information for a patient; the ability to record verbal discussions at an appointment; the ability to convert voice to text; the ability to generate lists of questions that are to be asked at an appointment; the ability to transmit the list of questions to one or more recipients prior to the appointment; the ability to record referral information; the ability to receive and record contact information; the ability to record office visit notes; the ability to share information from the notebook with others; the ability to record treatment plan information; the ability to record medication and prescription information; the ability to record medical procedure information; the ability to record a diary/chronology of appointments, interventions, testing, etc.; the ability to combine the diary with collected biographical, medical, and clinical information; the ability to communicate with medical professionals (e.g., for the purposes of providing check-in information via video conferencing, messaging, text chats, VoIP, or otherwise); the ability to receive updates relevant to a user's area of concern; the ability to record, track, and analyze medical insurance related matters; the ability to search for and access resources by diagnosis; and the ability to calendar events, such as medical appointments.
With respect to the optional Hadoop implementation, other systems may submit tasks to the job tracker, which, in turn, distributes the tasks to available task tracker nodes. Optionally, the job tracker may attempt to distribute a given task to a node in geographic proximity to the needed data. While the foregoing example refers to Hadoop clusters and related components, other distributed platforms may optionally be used in addition or instead to process and store data, such as large amounts of structured, unstructured, and/or semi-structured data (e.g., distributed platforms utilizing Bashreduce, Qizmt, Spark, Disco Project, etc.).
The notebook management system 104 may communicate over one or more wired and/or wireless local and/or wide area networks (e.g., the Internet) 101 with one or more user terminals 118, 120. The user terminals 118, 120 may be wireless mobile devices, such as smart phones, tablets, laptops, wearables, or the like. The wireless mobile devices may optionally be equipped with wireless interfaces to communicate over WiFi, Bluetooth™, other local area wireless networks, other personal area networks, cellular networks, or the like. The wireless mobile devices may optionally be equipped with one or more antennas connected to respective wireless interfaces. The antennas may be located within the housing of the mobile device and/or on the housing surface of the mobile device. The user terminals 118, 120 may be wired or wireless non-mobile devices, such as a desktop computer, a fixed or large networked television, a game console, or the like. The user terminals 118, 120 may include a variety of sensors (e.g., sound, image, orientation, pressure, light, acceleration, and/or other sensors) configured to detect user input and interaction with the user terminals 118, 120. The user terminals 118, 120 may include touch screens configured to display user interfaces and data and receive user input via touch. The user terminals may include physical keyboards. The user terminals 118, 120 may include one or more microphones to receive voice data and commands, and one or more speakers to play audible content. The user terminals 118, 120 may include a camera configured to capture, record, and/or stream video data (which may be stored or streamed in association with captured audio data) to other systems, such as the notebook management system 104. The user terminals 118, 120 may be associated with the various user-types discussed herein, such as patients, family members of patients, patient caretakers, medical personnel, medical facilities, or other members of a support network.
The notebook management system 104 may communicate over one or more wired and/or wireless local and/or wide area networks 102 with one or more remote servers or computing systems 106, 108 that may be associated with medical service providers; one or more medical databases 108, 110; or third party contact and calendar systems 112, 114. The network 101 and the network 102 may be the same or different networks.
The user terminal 200 may include one or more wireless and/or wired interfaces. For example, the user terminal 200 may include a WiFi interface 216, a Bluetooth interface 218, a cellular interface 220, an NFC (near field communication) interface 222, and/or one or more physical connectors 224 (e.g., a USB connector, a LIGHTNING connector, and/or other connector). The user terminal 200 further comprises a processor device (e.g., a microprocessor) 230, volatile memory (e.g., RAM solid state memory), non-volatile memory (e.g., FLASH memory), and a power management device 234.
The electronic notebook application may be provided or accessed in the form of an application obtained/downloaded by the user terminal 200 via a third party application store and/or via the notebook management system 104. As described herein, the electronic notebook user interfaces may include a variety of data entry fields. The fields may be populated via a keyboard, a stylus, via voice entry (provided via the microphone 204) which may be converted to text via a voice-to-text module, or via facial, limb, or finger gestures captured by the camera 206. The keyboard and/or stylus may be included with the user terminal 200. The stylus may optionally be configured with a sensor to determine stylus inclination and/or a sensor to measure the pressure being applied to the stylus by the user. The pressure and inclination information may be transmitted to the user terminal 200 (e.g., via Bluetooth or other wireless or wired protocol) and such information may be used to identify user issues as described elsewhere herein.
The notebook application 304 may be configured to perform some or all of the functions and processes described herein.
When a user initially accesses the electronic notebook application to generate a new electronic notebook, the electronic notebook application may provide a user interface listing various medically-related conditions or categories. By way of non-limiting example, the conditions may include one or more of the following:
The application may access and/or generate an electronic notebook template customized for the selected category. For example, the notebook may include different questions and/or types of questions for different categories, and corresponding different information receiving fields. Examples of such templates will be discussed in greater detail elsewhere herein. Optionally, a free-form text field (e.g., the illustrated “Other” field) may be provided configured to receive a description of a condition that is not listed. Thus, for example, if a patient is suffering from a non-listed condition, a description of the condition, or related keywords, may be entered into the free-form field. The application may utilize natural language processing (sometimes referred to as computational linguistics) to analyze and understand the text entry. Natural language processing may comprise the utilization of machine learning that analyzes patterns in data to improve the natural language processing software's ability to understand the entry. Natural language processing may utilize sentence segmentation, part-of-speech tagging (e.g., subject, object, modification, noun, adjective, number, etc.), parsing, named entity extraction (e.g., locating and classifying elements in text into various categories such as the names of persons, organizations, locations, expressions of times, quantities, monetary values, percentages, etc.), paraphrase recognition (determining when different phrases or sentences have the same meaning), and/or co-reference resolution (finding all expressions that refer to the same entity in a text). Fuzzy logic may be used to generate or select a template that includes suitable questions and fields.
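By way of a minimal sketch, a free-form condition description may be segmented, part-of-speech tagged, and scanned for named entities as follows (the open-source spaCy library is used here as one possible natural language processing toolkit; the library choice is an illustrative assumption):

```python
import spacy

# Requires: pip install spacy && python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

doc = nlp("My father was diagnosed with Parkinson's in March 2015.")

# Sentence segmentation and part-of-speech tagging.
for sent in doc.sents:
    print([(token.text, token.pos_) for token in sent])

# Named entity extraction (persons, dates, organizations, etc.).
for ent in doc.ents:
    print(ent.text, ent.label_)
```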
Optionally, handwritten entries provided via handwritten touch entry (e.g., via a stylus or user finger/digit) may be analyzed to identify user stress. For example, the smoothness or jaggedness of the handwritten entry may be identified (e.g., by identifying discontinuities or abrupt horizontal inputs followed immediately by abrupt vertical inputs) to infer whether the user is undergoing stress. Similarly, stylus/finger pressure and inclination information may be received (e.g., via a wireless interface), stored and analyzed to identify user stress (e.g., pressure or inclination angle above a respective threshold may indicate stress).
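A minimal sketch of these stress-inference heuristics follows (the direction-change measure of jaggedness and the specific thresholds are illustrative assumptions):

```python
def direction_changes(points):
    # Count abrupt horizontal-to-vertical (or vertical-to-horizontal)
    # reversals in a sequence of (x, y) stylus/finger samples.
    changes = 0
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        first_horizontal = abs(x1 - x0) > abs(y1 - y0)
        second_horizontal = abs(x2 - x1) > abs(y2 - y1)
        if first_horizontal != second_horizontal:
            changes += 1
    return changes

def infer_stress(points, pressure, inclination_deg,
                 jaggedness_threshold=20, pressure_threshold=0.8,
                 inclination_threshold=60):
    # Flag possible stress if the stroke is jagged or if the pressure
    # or inclination angle exceeds its respective threshold.
    return (direction_changes(points) > jaggedness_threshold
            or pressure > pressure_threshold
            or inclination_deg > inclination_threshold)
```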
The electronic notebook may include fields for receiving contact and/or demographic information of a patient. For example, as illustrated in
As noted above, a health timeline may be generated. The health timeline may include some or all of the biographical information collected by the application. The health timeline may be utilized to help provide an overview of the patient's issues and potential relationships between such biographical information and the patient's medical issues and/or treatment. Thus, the health timeline may provide a quick overview of the patient and the patient's medical history.
Example questions and response fields presented by the application will now be discussed. Some or all of the collected data may be used to generate a health timeline.
Autistic Spectrum Disorder (see, e.g.,
Developmental Disorders/Learning Disorder (see, e.g.,
Emotional or Psychiatric Disorder (see, e.g.,
Aging (see, e.g.,
Optionally, if the user fails to enter information in response to a given query, the application will automatically notify the patient's primary physician and/or team leader of such failure and may prompt the primary physician (e.g., via an email, text message, phone call, alert within an electronic notebook interface, or otherwise) to follow-up.
Life Altering Illness (see, e.g.,
The electronic notebook may optionally include instructions with respect to voice recording appointments and regarding preparing questions for the appointment (see, e.g.,
The electronic notebook may optionally include fields of user and/or patient questions for the medical service provider (see, e.g.,
The electronic notebook may include a record control, which when activated, enables video and/or audio of a given appointment to be recorded by the device hosting or accessing the application (see, e.g.,
The spoken voice captured via the recording may be converted to text using a voice-to-text module. The voice-to-text module may perform the conversion using one or more of pattern matching, (where a spoken word is recognized in its entirety), pattern and feature analysis (where a spoken word is broken into bits and recognized from key features, such as the vowels it contains), language modeling and statistical analysis (in which a knowledge of grammar and the probability of certain words or sounds following on from one another is used to speed up recognition and improve accuracy), and/or neural networks (trainable brain-like computer models that can reliably recognize patterns, such as word sounds, after training).
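A minimal sketch of invoking such a conversion is shown below (the open-source SpeechRecognition package and its Google Web Speech backend are used as one possible voice-to-text module; the library choice and the file name are illustrative assumptions):

```python
import speech_recognition as sr

recognizer = sr.Recognizer()

# Load a recorded appointment (e.g., a WAV file captured by the app).
with sr.AudioFile("appointment_recording.wav") as source:
    audio = recognizer.record(source)

try:
    # The backend applies acoustic and language modeling to produce text.
    transcript = recognizer.recognize_google(audio)
    print(transcript)
except sr.UnknownValueError:
    # Unintelligible speech; as discussed elsewhere herein, this may
    # itself be a signal warranting attention.
    transcript = None
```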
A characterization module (which may be included in the app and/or the remote system) may be utilized to recognize certain types of content in the recorded content (e.g., in the voice recording or the text file generated by the voice-to-text module). For example, the characterization module may identify certain keywords in a given phrase, sentence or series of sentences that correspond to certain subject matter. The corresponding identified subject matter may then be selected and inserted into a corresponding field/section in the electronic notebook for convenient and organized viewing by a user. Optionally, the complete speech-to-text file for a given recording may be inserted into/accessible from a calendar, where the calendar includes a listing of past and future appointments, an indication as to whether there are one or more appointment recordings, and any associated other files (e.g., test reports (e.g., drug tests, blood tests, psychiatric reports, orthopedic reports, etc.), prescriptions, imaging reports (e.g., x-ray, MRI, CAT scan, etc.)). An example subject matter may include professional recommendations, and example subset subject matter of professional recommendations may include recommended tests, medications, interventions, etc. Thus, for example, if the characterization module identifies certain words or phrases that are typically associated with professional recommendations, such as "recommend" (or variations thereof, such as "recommended" or "recommends"), "prescribe" (or variations thereof, such as "prescription"), "dose" (or variations thereof, such as "dosage"), "refer" (or variations thereof, such as "referral"), "test" (or variations thereof, such as "tests", or equivalents, such as "exams", etc.), "inspect" (or variations thereof, such as "self-inspect"), "avoid," "take," "inject," etc., associated sentences or phrases may be categorized as recommendations. Another example subject matter may include patient concerns. Thus, for example, if the characterization module identifies certain words or phrases that are typically associated with patient concerns, such as "problem," "worry," "concern," "anxiety," "nervous," "agitated," "uneasy," or derivatives or equivalents, associated sentences or phrases may be categorized as concerns.
Another example subject matter may include patient history. Thus, for example, if the characterization module identifies certain words or phrases that are typically associated with patient history, such as "age at first diagnosis," "other family members with the same diagnosis," "other diagnosis," "other family members with another developmental or psychological diagnosis," "the patient has been hospitalized," "taking medication," "been prescribed," "in treatment," other patient history questions or information discussed herein, or derivatives or equivalents, associated sentences or phrases may be categorized as patient history. As similarly discussed above with respect to text entries, natural language processing may be utilized to analyze and understand the text file generated by the voice-to-text module.
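A minimal sketch of the characterization module's keyword matching follows (the keyword lists are abbreviated from those above, and the sentence-splitting and substring-matching details are illustrative assumptions):

```python
import re

CATEGORY_KEYWORDS = {
    "recommendation": ["recommend", "prescrib", "dose", "dosage", "refer",
                       "test", "exam", "inspect", "avoid", "inject"],
    "concern": ["problem", "worry", "concern", "anxiety", "nervous",
                "agitated", "uneasy"],
    "history": ["first diagnosis", "hospitalized", "taking medication",
                "been prescribed", "in treatment"],
}

def categorize(transcript):
    # Split the speech-to-text output into sentences and assign each
    # sentence to every category whose keywords (or variants sharing
    # the same stem, e.g., "recommend"/"recommended") it contains.
    results = {category: [] for category in CATEGORY_KEYWORDS}
    for sentence in re.split(r"(?<=[.?!])\s+", transcript):
        lowered = sentence.lower()
        for category, keywords in CATEGORY_KEYWORDS.items():
            if any(keyword in lowered for keyword in keywords):
                results[category].append(sentence)
    return results

print(categorize("I recommend an MRI. She seems anxious about work."))
```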
If a professional recommendation is identified (e.g., a test, or medication, or intervention), it may be automatically added to a to-do list/reminder. The to-do list may be accessed via a menu entry available on some or all pages of the virtual notebook. The to-do list may optionally be presented each time a user opens the app. Once the recommendation has been implemented (e.g., the test has been performed, the patient has started taking the medication, etc.), the patient, caretaker, or medical professional, can mark the to-do item as completed/implemented. Optionally, items marked as completed/implemented may be moved from a viewable active to-do list to a completed/implemented to-do list.
Optionally, the virtual notebook may include an appointment list function. The list may enable a user (e.g., a patient or patient caretaker) to generate a list of questions to ask a medical professional at an upcoming appointment. For example, the user can enter a name of the medical professional, the medical professional's specialty, and the appointment date. Optionally, the user can send the list to the medical professional prior to the appointment so that the medical professional can be prepared to answer the questions. Optionally, the application will automatically issue a reminder (e.g., via a pop-up alert, an SMS/MMS message, an email message, etc.) to the user a certain period of time (e.g., a certain number of days) prior to the appointment to send the list to the medical professional. The list may be entered via a keyboard, a stylus, or via voice entry which may be converted to text via the voice-to-text module. Optionally, on the day/time of the appointment, the application may pop up the list to ensure the user does not forget to ask the questions. The medical professional's responses can be recorded, converted to text, and entered in the notebook underneath or otherwise in association with the corresponding questions. If a response is determined to correspond to a recommendation, it may be entered into the to-do list as similarly discussed above.
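A minimal sketch of the reminder timing computation follows (the number of days before the appointment and the function names are illustrative assumptions):

```python
from datetime import date, timedelta

def reminder_due(appointment_date, today=None, days_before=3):
    # The "send your question list" reminder fires a set number of days
    # before the appointment, and on each day thereafter until the visit.
    today = today or date.today()
    return today >= appointment_date - timedelta(days=days_before)

# Example: appointment on June 10; the reminder fires from June 7 onward.
assert reminder_due(date(2024, 6, 10), today=date(2024, 6, 7))
assert not reminder_due(date(2024, 6, 10), today=date(2024, 6, 1))
```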
The electronic notebook may also include a referral user interface (see, e.g.,
The following are example referral user interface questions which are associated with corresponding user entry fields:
Optionally, the referral contact information (e.g., phone number, email address, physical address, fax number, Skype® name, office assistant/associate, specialty, etc.) is automatically added to a contact database of the user. Optionally, other information, such as the name of the referral source, dates of appointments/visits, etc., may be recorded in the referral contact record, such as that illustrated in the contacts user interface illustrated in
The electronic notebook may also include an office visit user interface, such as the example user interface illustrated in
The following are example office visit user interface fields:
The electronic notebook may also include a "wall" user interface, an example of which is illustrated in
A search field configured to receive a user search query enabling a user to search through wall postings. For example, the user search query may be submitted to a search engine which will identify and return postings and/or links to postings that match the search query, which may be in turn displayed via the user terminal (e.g., via the notebook app or a browser interface).
A search filter user interface enabling the user to select predefined search filters to define a search or to narrow search results. For example, the filters may include one or more diagnoses or conditions (e.g., drug addiction, aging, dementia, special needs, seizure disorder, Parkinson's, cancer, hyperactivity, etc.), one or more treatments (e.g., anti-anxiety medication, chemotherapy, blood pressure medication, antiviral medications, anti-seizure medication, etc.), treatment side effects (e.g., sleeplessness, nausea, anxiety, dizziness, etc.), favorites (which will cause posts that have been marked as favorites by the user, or that are posted by a posting user that has been marked as a favorite, to be displayed), etc.
A posting field via which the user can post information, experiences, opinions, resources, etc. There may be a field for posting what the user found helpful or useful with respect to the patient's medical issue, and there may be a field for posting what the user did not find helpful or useful.
A tag field via which the user can assign one or more tags to the user's post to make it easier for other users to search for and access the post. For example, the tags may include some or all of the filter terms and/or other terms.
The wall user interface may also include one or more controls, such as a "favorite" control which enables a user to indicate that certain posts or certain posting users are favorites.
The notebook application or the remote system can track the prescribed follow-up dates, follow-up frequency, and/or follow-up intervals, and when a determination is made that it is time for a follow-up (e.g., the day of the scheduled follow-up or a specified number of days prior to the scheduled follow-up), an alert may be generated and provided to the patient, designated caregiver, significant other, specified physician, team leader, and/or other designated person to follow-up. The alert may be presented via a notification transmitted to the recipient's device (e.g., via a pop-up alert, a text message, an MMS message, an email, a calendar alert, or otherwise).
The application or remote system may detect that follow-up has not occurred (e.g., by detecting that the alert recipient and/or patient has not confirmed via a user interface that the follow-up occurred). If it is detected that the follow-up has not occurred in the prescribed time frame, then the treatment professional attached to the follow-up will be alerted to check in with the patient and/or those who were designated to perform the follow-up.
A separate medical procedure record may be created for each medical procedure. For example, the medical procedure user interface may include fields which may be populated with the names of the medical procedures that have been performed for the patient, the date of a given procedure, the name of the person that ordered the procedure, where the procedure was performed, who performed the procedure, an indication as to whether the user has a copy of the procedure results, if the user does not have a copy of the procedure results then the name of the person that has a copy of the procedure results, and/or other information.
The diary may be updated (e.g., via voice entry, a keyboard, a stylus, or using information recorded via another section of the notebook) with each new appointment, intervention, test, etc. Optionally, the diary will sequentially present dates on which an event occurred, and a brief description of the event (e.g., "appointment with neurologist", "prescription of Felodipine to control blood pressure," "MRI scan", etc.). Optionally, an additional information control (e.g., a "more" control) may be provided which, when activated by a user, will cause additional information regarding the event to be accessed and displayed. For example, the additional information may be accessed from another notebook section. By way of illustration, the information may be accessed from the office visit section, the treatment plan section, the medication section, the clinical/therapeutic treatment section, and/or other sections. There may be separate diaries for different on-going health concerns.
FIG. 4S1 illustrates an example health timeline generated by the system using collected data described herein. The generated health timeline may begin at a certain point in time, such as a significant biological date (e.g., date of birth of the patient), and may indicate, in chronological order, the dates each diagnosis was made and the date of each medical intervention/treatment. Optionally, the health timeline may be updated in real time in response to the receipt of new or updated diagnosis and/or treatment data. Optionally, new or updated information may be emphasized (e.g., by color, font, icon, etc.) for a determined or specified period of time (e.g., 7 days, 1 month, 180 days, or other time period). Optionally, the health timeline may be configured so that it is linear (where a unit of distance is equal to a set amount of time) or is non-linear in terms of time. For example, the timeline may be logarithmic, with the time axis plotted on a logarithmic scale so that more recent time periods (and associated diagnosis and/or treatment data) may be provided with relatively more space on the timeline than older time periods.
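A minimal sketch of mapping event dates onto such a logarithmic time axis follows (the anchor dates, axis width, and mapping function are illustrative assumptions):

```python
import math
from datetime import date

def log_position(event_date, birth_date, today, axis_width=1000.0):
    # Plot events by how long ago they occurred; recent events are
    # spread out toward the right edge, older events are compressed
    # toward the left edge (the birth date maps to position 0).
    total_days = (today - birth_date).days
    days_ago = (today - event_date).days
    return axis_width * (1 - math.log1p(days_ago) / math.log1p(total_days))

today = date(2024, 1, 1)
birth = date(1950, 1, 1)
print(log_position(date(2023, 12, 1), birth, today))  # recent: toward the right
print(log_position(date(1960, 1, 1), birth, today))   # old: near the left edge
```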
Optionally, the health timeline may be zoomable to focus on a particular time period and to display additional entries for a given time period. For example, the timeline may be zoomable via a user interface that enables the user to use a lasso or other tool to indicate a beginning and end portion of the timeline that is to be zoomed. Optionally, in addition or instead, the user interface enables the user to numerically specify a numerical zoom factor and/or to use a touch interface to stretch/zoom out a given portion of the health timeline by touching one end of the portion with a thumb, touching the other end of the portion with a finger, and then moving the thumb and finger apart. Optionally, a selected portion of the health timeline may be generated and displayed at a higher time resolution at the same time the original timeline is displayed at the original resolution, as illustrated in
FIGS. 4S2 and 4S3 illustrate an example master combined introduction user interface and timeline user interface. The master/combined user interface merges some or all of the information from the Biographical, Medical, Clinical, Therapeutic, Diary, and/or other sections of the notebook into a unified timeline that lists the events, treatments, and other information with associated dates and/or times. Thus, the master combined user interface provides an overall snapshot for the patient. Optionally, the master combined user interface may be similar in appearance to the health timeline, with additional information. The master combined user interface may include all or a subset of the information included in the health timeline.
The generated master combined user interface may include a timeline that begins at a certain point in time, such as a significant biological date (e.g., date of birth of the patient), and may indicate, in chronological order, significant biographical information (e.g., where and when the patient went to school, when the patient was married, how many children the patient has and when each child was born, when and where the patient has been employed, whether the patient's parents are still alive, and if not, when a given parent died, etc.), diagnosis dates, medical intervention/treatment dates, diary entries, listings of appointments with medical professionals (e.g., therapist, psychiatrist or psychologist, and their name), treatment frequency, psychological and/or other patient testing, etc. Optionally, the master combined user interface may be updated in real time in response to the receipt of new or updated biographical, medical, clinical, therapeutic, and/or diary data. Optionally, new or updated information may be emphasized (e.g., by color, font, icon, etc.) for a determined or specified period of time after being received or initially displayed (e.g., 7 days, 1 month, 180 days, or other time period). Optionally, the master combined user interface timeline may be configured so that it is linear (where a unit of distance is equal to a set amount of time) or is non-linear in terms of time. For example, the timeline may be logarithmic, with the time axis plotted on a logarithmic scale so that more recent time periods (and associated diagnosis and/or treatment data) may be provided with relatively more space on the timeline than older time periods.
Optionally, the master combined user interface timeline (and/or the health timeline) may be zoomable to focus on a particular time period and to display additional entries for a given time period. Optionally, the user interface enables the user to numerically specify a numerical zoom factor and/or to use a touch interface to stretch/zoom out a given portion of the timeline by touching one end of the portion with a thumb or other digit, touching the other end of the portion with a finger or other digit, and then moving the thumb and finger apart. Similarly, the user interface may enable the user to zoom in on a given portion using a pinch gesture, by touching one end of the portion with a first digit (e.g., a thumb), touching the other end of the portion with a second digit (e.g., a finger), and then moving the two digits (e.g., thumb and finger) together. As similarly discussed above with respect to the health timeline, optionally, the timeline may be zoomable via a user interface that enables the user to use a lasso or other tool to indicate a beginning and end portion of the timeline that is to be zoomed. Optionally, a selected portion of the timeline may be generated and displayed at a higher time resolution at the same time the original timeline is displayed at the original resolution, as illustrated in
The check-in user interface may include the following selectable options with respect to who is performing the check-in: Patient; or Significant Other/Care Giver/Family Member. The selected option may be communicated to the check-in information recipient so that the recipient will know who is providing the check-in information. The check-in frequency may be scheduled and the schedule may be determined based at least in part on a recommendation of the treating professional. The schedule may be entered into the notebook calendar (e.g., via the treatment plan section of the notebook). The notebook will then provide an alert or cue (e.g., via a pop-up alert, an SMS/MMS message, an email message, etc.) to one or more designated alert recipients (e.g., the patient, the patient's significant other, child, caretaker, etc.) that it is time to check-in. The alert may be provided on the scheduled check-in day and/or a certain period of time (e.g., a certain number of days) prior to the check-in day. Check-in may also be performed in an unscheduled manner (e.g., based on perceived need by the patient).
A field may be provided configured to receive free-form text via which the check-in information may be provided. Natural language or keyword processing may be utilized to identify (e.g., with a certain likelihood percentage) that words or phrases used in the information provided during the check-in (e.g., by the patient, significant other, caregiver, family member), or slurred or unintelligible speech, indicate an elevated or immediate need for attention. If the processing detects such an elevated or immediate need for attention, an alert may be provided to the treating professional (e.g., via a pop-up alert, an SMS/MMS message, an email message, etc.), where the alert may indicate that the check-in information needs to be urgently reviewed, and that the patient may need immediate attention. For example, words and phrases that indicate urgency may include some or all of the following terms and/or other terms: hopeless, worthless, suicidal, anxious, depressed, afraid, helpless, out-of-control, gun, knife, rage, violent, etc. By way of further example, urgency may be indicated if the system or app detects that the information provided via the check-in user interface is unintelligible, or that the speech (e.g., slurred speech) or text patterns indicate that the user is engaging in substance abuse (e.g., of drugs or alcohol) or is suffering a stroke. The alert may be dynamically generated and composed to include the keywords/terms that triggered the alert, and/or may indicate that unintelligible/slurred speech was detected. Speaker-adaptive, continuous speech recognition may be utilized in converting speech to text.
Optionally, if the app or system detects that the patient has not checked in at the specified interval or schedule, an alert may be generated and provided to the treating professional and/or team leader indicating that the patient failed to check in as specified. An alert may also be provided to the patient and other designated recipients with a request to check in.
With reference to the example user interface illustrated in
If the Significant Other/Caregiver/Family Member option was selected, the check-in user interface may be configured to include fields that prompt the non-patient user to indicate how the patient is generally doing, what are the user's most significant concerns regarding the patient, how the user is feeling, what the user thinks the patient is not telling the treating professional, what the user needs help with, etc.
The check-in section of the notebook may be configured to enable real-time or recorded videos or images (optionally with an associated voice track) of the patient and/or significant other/caregiver/family member to be transmitted to the treatment provider and/or support staff terminals to enable them to visually see and assess the patient and/or significant other/caregiver/family member. Such visual content may provide significant or critical information in making mental and/or physical health assessments (e.g., enable the treatment provider to detect if someone has suffered a stroke or heart attack, or is suffering from drug abuse). Optionally, some or all of the information provided via the fields described above may in addition or instead be provided via the visual content, optionally including a corresponding sound track (of the patient or non-patient speaking). The visual content may optionally be time-stamped indicating what day and time it was recorded and/or transmitted. A record control may be provided to initiate a video or image recording via the camera on the user terminal.
The financial user interface may include fields via which some or all of the following may be specified: an indication as to whether the patient has insurance, and if so, the corresponding insurance information (e.g., the identification number, Rx Bin, Rx PCN, Rx Group, Plan code, Group number, etc.). The financial user interface may also prompt the user (the patient or non-patient user) to indicate whether the user wants to use the notebook for keeping track of insurance billing matters. If the user responds affirmatively, the user may be prompted to provide claim information (e.g., receipts, itemized bills, what the treatment or visit was for, to whom is payment to be made, etc.).
The financial user interface may also prompt the user to indicate whether the patient has disability coverage or Long Term Disability Coverage (LTD), and if so, provide related information (e.g., the insurance provider, the policy number, etc.). If the user does have LTD coverage, the financial user interface may also prompt the user to indicate whether the user wants to utilize the notebook to keep track of the patient's related claims. If the user indicates that the user wants to utilize the notebook to keep track of the patient's LTD claims, the user is requested to provide information regarding the claims (e.g., evidence that the patient is disabled and the nature of the disability, such as a doctor's statement or form regarding the doctor's opinion on the patient's condition, evidence that the patient had been employed when the disability occurred, evidence that any waiting period has expired, etc.).
The financial user interface may also prompt the user to indicate whether the patient has Supplemental Security Income (SSI), and if so, to provide related information, such as the patient's income, the patient's assets, the patient's living arrangements, the patient's citizenship or alien status, the patient's health issues and how they affect the patient's daily activities and ability to work, etc. If the user does have SSI, the financial user interface may also prompt the user to indicate whether the user wants to utilize the notebook to keep track of the patient's related SSI claims. If the user indicates that the user wants to utilize the notebook to keep track of the patient's SSI claims, the user is requested to provide information regarding the claims.
The financial user interface may also prompt the user to indicate whether the patient has Social Security Disability Insurance (SSDI), and if so, to provide related information, such as the patient's Social Security number and proof of the patient's age, names, addresses and phone numbers of doctors, caseworkers, hospitals, and clinics that took care of the patient and the dates of appointments, names and dosages of the medications the patient is taking, medical records, laboratory and test results, a summary of where the patient worked and the kind of work the patient did, the patient's most recent W-2 form or, if self-employed, a copy of the patient's federal tax return, information about the patient's family members, Social Security numbers and proof of age for each family member who may qualify for benefits, etc. If the user does have SSDI, the financial user interface may also prompt the user to indicate whether the user wants to utilize the notebook to keep track of the patient's related SSDI claims. If the user indicates that the user wants to utilize the notebook to keep track of the patient's SSDI claims, the user is requested to provide information regarding the claims.
A grid may be generated and displayed configured to aid in the tracking of claims and payments made to treatment providers, including the payment amounts and payment dates. For example, the grid rows may correspond to respective treatment providers, and the columns may correspond to payment dates (or vice versa). A given grid cell may list a payment amount (e.g., a payment made or due). Thus, the grid may provide an "at a glance" summary of payments made and payments due, enabling a patient/caretaker or other user that has financial responsibility with respect to the patient to quickly review outstanding and completed payments.
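A minimal sketch of building such a grid follows (the pandas library and the column names are illustrative assumptions):

```python
import pandas as pd

payments = pd.DataFrame([
    {"provider": "Dr. Smith", "date": "2024-01-05", "amount": 150.00},
    {"provider": "Dr. Smith", "date": "2024-02-05", "amount": 150.00},
    {"provider": "City Labs", "date": "2024-01-20", "amount": 85.50},
])

# Rows correspond to treatment providers, columns to payment dates,
# and each cell lists the payment amount, as described above.
grid = payments.pivot_table(index="provider", columns="date",
                            values="amount", aggfunc="sum")
print(grid)
```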
At block 306B, natural language processing is optionally performed on the user input (e.g., touch input, keyboard input, or the text generated by the speech-to-text process). At block 308B, keyword identification is performed (e.g., keywords that indicate the topic of the user input or that indicate an emotion or feeling of wellbeing). At block 310B, a determination is made as to whether the identified keywords in the user input indicate a critical/safety related condition. The keyword criticality determination may be performed by comparing the user input against a data store of keywords that indicate a potential critical condition. For example, the keyword data store may include some or all of the following keywords (or key phrases): hopeless, worthless, suicidal, anxious, depressed, afraid, helpless, out-of-control, gun, knife, rage, furious, violent, drinking, drunk, drugged, scream, out-to-get-me, etc. A given keyword may be associated with a condition type, such as a psychological condition, a pharmaceutical condition, an orthopedic condition, etc. Optionally, a criticality determination may weight different keywords differently, and the criticality determination may optionally calculate a criticality score based on the number of keywords and the keyword weighting. If the criticality score exceeds a specified threshold, a determination may be made that a potential critical condition exists. For example, the following formula may be used:
Criticality Score = Weight1 (of Keyword1) + Weight2 (of Keyword2) + . . . + Weightn (of Keywordn),
where a potential critical condition exists if Criticality Score ≥ Criticality Threshold.
If the keyword criticality determination identifies a potential critical condition, at block 312B, an alert may be dynamically generated and transmitted to one or more destinations based on one or more rules accessed from a rule data store. For example, a rule may indicate that all alerts are to be transmitted to a previously identified caretaker/family member and a previously identified primary physician. Another rule may indicate that if a keyword is associated with a psychological condition, then an alert is to be transmitted to a specified psychiatrist or psychologist.
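A minimal sketch of the weighted criticality scoring and rule-based alert routing described above follows (the specific weights, threshold, condition-type assignments, and destinations are illustrative assumptions):

```python
KEYWORD_WEIGHTS = {
    "hopeless": 3, "suicidal": 5, "gun": 5, "knife": 4,
    "depressed": 2, "anxious": 1, "drunk": 2, "rage": 3,
}

CONDITION_TYPE = {"suicidal": "psychological", "depressed": "psychological",
                  "hopeless": "psychological", "drunk": "pharmaceutical"}

CRITICALITY_THRESHOLD = 4

def criticality_score(text):
    # Sum the weights of the keywords detected in the user input.
    matched = [w for w in text.lower().split() if w in KEYWORD_WEIGHTS]
    return sum(KEYWORD_WEIGHTS[w] for w in matched), matched

def route_alert(text):
    score, matched = criticality_score(text)
    if score < CRITICALITY_THRESHOLD:
        return []
    # Rule 1: all alerts go to the caretaker and the primary physician.
    destinations = ["caretaker", "primary_physician"]
    # Rule 2: psychological-condition keywords also alert a psychiatrist.
    if any(CONDITION_TYPE.get(w) == "psychological" for w in matched):
        destinations.append("psychiatrist")
    return destinations

print(route_alert("I feel hopeless and depressed"))  # score 5 >= 4 -> alert
```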
Thus, processes and techniques are described that may be used to receive, manage and process the recording, arrangement, text processing, word recognition, and/or review of information for or in an electronic notebook.
The methods and processes described herein may have fewer or additional steps or states and the steps or states may be performed in a different order. Not all steps or states need to be reached. The methods and processes described herein may be embodied in, and fully or partially automated via, software code modules executed by one or more general purpose computers. The code modules may be stored in any type of computer-readable medium or other computer storage device. Some or all of the methods may alternatively be embodied in whole or in part in specialized computer hardware. The systems described herein may optionally include displays, user input devices (e.g., touchscreen, keyboard, mouse, voice recognition, etc.), network interfaces, etc.
The results of the disclosed methods may be stored in any type of computer data repository, such as relational databases and flat file systems that use volatile and/or non-volatile memory (e.g., magnetic disk storage, optical storage, EEPROM and/or solid state RAM).
The various illustrative logical blocks, modules, routines, and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. The described functionality can be implemented in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosure.
Moreover, the various illustrative logical blocks and modules described in connection with the embodiments disclosed herein can be implemented or performed by a machine, such as a general purpose processor device, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor device can be a microprocessor, but in the alternative, the processor device can be a controller, microcontroller, or state machine, combinations of the same, or the like. A processor device can include electrical circuitry configured to process computer-executable instructions. In another embodiment, a processor device includes an FPGA or other programmable device that performs logic operations without processing computer-executable instructions. A processor device can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Although described herein primarily with respect to digital technology, a processor device may also include primarily analog components. A computing environment can include any type of computer system, including, but not limited to, a computer system based on a microprocessor, a mainframe computer, a digital signal processor, a portable computing device, a device controller, or a computational engine within an appliance, to name a few.
The elements of a method, process, routine, or algorithm described in connection with the embodiments disclosed herein can be embodied directly in hardware, in a software module executed by a processor device, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of a non-transitory computer-readable storage medium. An exemplary storage medium can be coupled to the processor device such that the processor device can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor device. The processor device and the storage medium can reside in an ASIC. The ASIC can reside in a user terminal. In the alternative, the processor device and the storage medium can reside as discrete components in a user terminal.
Conditional language used herein, such as, among others, "can," "could," "might," "may," "e.g.," and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without other input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The terms "comprising," "including," "having," and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term "or" is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term "or" means one, some, or all of the elements in the list.
Disjunctive language such as the phrase “at least one of X, Y, Z,” unless specifically stated otherwise, is otherwise understood with the context as used in general to present that an item, term, etc., may be either X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, or at least one of Z to each be present.
While the phrase “click” may be used with respect to a user selecting a control, menu selection, or the like, other user inputs may be used, such as voice commands, text entry, gestures, etc. User inputs may, by way of example, be provided via an interface, such as via text fields, wherein a user enters text, and/or via a menu selection (e.g., a drop down menu, a list or other arrangement via which the user can check via a check box or otherwise make a selection or selections, a group of individually selectable icons, etc.). When the user provides an input or activates a control, a corresponding computing system may perform the corresponding operation. Some or all of the data, inputs and instructions provided by a user may optionally be stored in a system data store (e.g., a database), from which the system may access and retrieve such data, inputs, and instructions. The notifications/alerts and user interfaces described herein may be provided via a Web page, a dedicated or non-dedicated phone application, computer application, a short messaging service message (e.g., SMS, MMS, etc.), instant messaging, email, push notification, audibly, a pop-up interface, and/or otherwise.
The user terminals described herein may be in the form of a mobile communication device (e.g., a cell phone), laptop, tablet computer, interactive television, game console, media streaming device, head-wearable display, networked watch, etc. The user terminals may optionally include displays, user input devices (e.g., touchscreen, keyboard, mouse, voice recognition, etc.), network interfaces, etc.
While the above detailed description has shown, described, and pointed out novel features as applied to various embodiments, it is to be understood that various omissions, substitutions, and changes in the form and details of the devices or algorithms illustrated can be made without departing from the spirit of the disclosure. As will be recognized, certain embodiments described herein can be embodied within a form that does not provide all of the features and benefits set forth herein, as some features can be used or practiced separately from others. The scope of certain embodiments disclosed herein is indicated by the appended claims rather than by the foregoing description. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.
U.S. Patent Documents
Number | Name | Date | Kind |
---|---|---|---|
3594066 | Cook | Jul 1971 | A |
3649765 | Rabiner | Mar 1972 | A |
4682361 | Selbach | Jul 1987 | A |
5293584 | Brown | Mar 1994 | A |
5579393 | Conner | Nov 1996 | A |
5633910 | Cohen | May 1997 | A |
5699404 | Satyamurti | Dec 1997 | A |
5823948 | Ross, Jr. | Oct 1998 | A |
5924074 | Evans | Jul 1999 | A |
6039688 | Douglas | Mar 2000 | A |
6047254 | Ireton | Apr 2000 | A |
6234964 | Iliff | May 2001 | B1 |
6236968 | Kanevsky | May 2001 | B1 |
6290646 | Cosentino | Sep 2001 | B1 |
6292771 | Haug | Sep 2001 | B1 |
6411933 | Maes | Jun 2002 | B1 |
6544294 | Greenfield | Apr 2003 | B1 |
6726636 | Der Ghazarian | Apr 2004 | B2 |
6941271 | Soong | Sep 2005 | B1 |
7174332 | Baxter | Feb 2007 | B2 |
7222075 | Petrushin | May 2007 | B2 |
7302490 | Gupta | Nov 2007 | B1 |
7647555 | Wilcox | Jan 2010 | B1 |
7770117 | Uy | Aug 2010 | B1 |
7783072 | Work | Aug 2010 | B2 |
7788605 | Shoemaker | Aug 2010 | B1 |
8374992 | Meyyappan et al. | Feb 2013 | B2 |
8533511 | Ma et al. | Sep 2013 | B2 |
8606595 | Udani | Dec 2013 | B2 |
8775213 | Hughes | Jul 2014 | B2 |
8826123 | Audet | Sep 2014 | B2 |
8868436 | Gotthardt | Oct 2014 | B2 |
9158335 | Zheng | Oct 2015 | B2 |
9252962 | Valeti | Feb 2016 | B1 |
9256588 | Moscovich et al. | Feb 2016 | B1 |
9256719 | Berini | Feb 2016 | B2 |
9305155 | Vo | Apr 2016 | B1 |
9619616 | Raduchel | Apr 2017 | B2 |
9658756 | Freeman | May 2017 | B2 |
9733801 | Audet | Aug 2017 | B2 |
9788799 | Wagner | Oct 2017 | B2 |
9899038 | Khaleghi | Feb 2018 | B2 |
9928379 | Hoffer | Mar 2018 | B1 |
9934793 | Bae | Apr 2018 | B2 |
9959556 | Cordell | May 2018 | B1 |
9963033 | Miller | May 2018 | B2 |
10032120 | Collins | Jul 2018 | B2 |
10121345 | Fields et al. | Nov 2018 | B1 |
10235998 | Khaleghi | Mar 2019 | B1 |
10402926 | Kelly | Sep 2019 | B2 |
10484845 | Khaleghi | Nov 2019 | B2 |
10516938 | Zass | Dec 2019 | B2 |
10573314 | Khaleghi | Feb 2020 | B2 |
10657166 | Gorzela | May 2020 | B2 |
10726393 | Kaufman | Jul 2020 | B2 |
11222716 | Vozila | Jan 2022 | B2 |
11386896 | Khaleghi | Jul 2022 | B2 |
20020010679 | Felsher | Jan 2002 | A1 |
20020012526 | Sai | Jan 2002 | A1 |
20020022975 | Blasingame | Feb 2002 | A1 |
20020026329 | Saito | Feb 2002 | A1 |
20020029157 | Marchosky | Mar 2002 | A1 |
20020035486 | Huyn | Mar 2002 | A1 |
20020062225 | Siperco | May 2002 | A1 |
20020082865 | Bianco | Jun 2002 | A1 |
20020116188 | Amir | Aug 2002 | A1 |
20020138271 | Shaw | Sep 2002 | A1 |
20020145742 | Koenig et al. | Oct 2002 | A1 |
20030115054 | Iso-Sipila | Jun 2003 | A1 |
20030140044 | Mok | Jul 2003 | A1 |
20040034869 | Wallace | Feb 2004 | A1 |
20040059599 | Mcivor | Mar 2004 | A1 |
20040133560 | Simske | Jul 2004 | A1 |
20040243443 | Asano | Dec 2004 | A1 |
20050055399 | Savchuk | Mar 2005 | A1 |
20050096906 | Barzilay | May 2005 | A1 |
20050137723 | Liu | Jun 2005 | A1 |
20050147214 | Goerg et al. | Jul 2005 | A1 |
20050149569 | Hariharan et al. | Jul 2005 | A1 |
20050165626 | Karpf | Jul 2005 | A1 |
20050172022 | Brown | Aug 2005 | A1 |
20060001666 | Cake et al. | Jan 2006 | A1 |
20060011399 | Brockway et al. | Jan 2006 | A1 |
20060028556 | Bunn | Feb 2006 | A1 |
20060047497 | Chen | Mar 2006 | A1 |
20060052674 | Eisenstein | Mar 2006 | A1 |
20060053009 | Jeong | Mar 2006 | A1 |
20060085347 | Yiachos | Apr 2006 | A1 |
20060148528 | Jung | Jul 2006 | A1 |
20060293891 | Pathuel | Dec 2006 | A1 |
20070024454 | Singhal | Feb 2007 | A1 |
20070074114 | Adjali | Mar 2007 | A1 |
20070124135 | Schultz | May 2007 | A1 |
20070168413 | Barletta | Jul 2007 | A1 |
20070208800 | Frohlich | Sep 2007 | A1 |
20070216708 | Mackay | Sep 2007 | A1 |
20070276270 | Tran | Nov 2007 | A1 |
20080040151 | Moore | Feb 2008 | A1 |
20080066973 | Furuki | Mar 2008 | A1 |
20080104048 | Surendran | May 2008 | A1 |
20080126426 | Manas | May 2008 | A1 |
20080133233 | Tsubura | Jun 2008 | A1 |
20080195495 | Rubin et al. | Aug 2008 | A1 |
20080244453 | Cafer | Oct 2008 | A1 |
20080301176 | Fanelli et al. | Dec 2008 | A1 |
20080303811 | Van Luchene | Dec 2008 | A1 |
20080313536 | Larsen | Dec 2008 | A1 |
20080319750 | Potter | Dec 2008 | A1 |
20090055735 | Zaleski | Feb 2009 | A1 |
20090077045 | Kirchmeier | Mar 2009 | A1 |
20090117922 | Bell | May 2009 | A1 |
20090292554 | Schultz | Nov 2009 | A1 |
20090313347 | Engel | Dec 2009 | A1 |
20100034639 | Moniz et al. | Feb 2010 | A1 |
20100036871 | Beckey et al. | Feb 2010 | A1 |
20100037219 | Chen et al. | Feb 2010 | A1 |
20100076333 | Burton | Mar 2010 | A9 |
20100169108 | Karkanias | Jul 2010 | A1 |
20100228656 | Wasserblat et al. | Sep 2010 | A1 |
20100262435 | Smith | Oct 2010 | A1 |
20100286490 | Koverzin | Nov 2010 | A1 |
20110034565 | Regan | Feb 2011 | A1 |
20110040155 | Guzak | Feb 2011 | A1 |
20110068934 | Weng et al. | Mar 2011 | A1 |
20110091050 | Hanai | Apr 2011 | A1 |
20110099189 | Barraclough | Apr 2011 | A1 |
20110099490 | Barraclough | Apr 2011 | A1 |
20110118555 | Dhumne | May 2011 | A1 |
20110148668 | Li | Jun 2011 | A1 |
20110184781 | Hussam | Jul 2011 | A1 |
20110191122 | Kharraz Tavakol | Aug 2011 | A1 |
20110202866 | Huang | Aug 2011 | A1 |
20110239158 | Barraclough | Sep 2011 | A1 |
20120005099 | Beckey | Jan 2012 | A1 |
20120112879 | Ekchian | May 2012 | A1 |
20120198385 | Audet | Aug 2012 | A1 |
20120299926 | Hodes | Nov 2012 | A1 |
20120306648 | Karaffa | Dec 2012 | A1 |
20120306925 | Hwang | Dec 2012 | A1 |
20120323589 | Udani | Dec 2012 | A1 |
20120323796 | Udani | Dec 2012 | A1 |
20130009907 | Rosenberg | Jan 2013 | A1 |
20130024206 | Hughes | Jan 2013 | A1 |
20130085781 | Navani | Apr 2013 | A1 |
20130111331 | Rosen et al. | May 2013 | A1 |
20130124192 | Lindmark | May 2013 | A1 |
20130135095 | Stochita | May 2013 | A1 |
20130163956 | Medhurst | Jun 2013 | A1 |
20130185071 | Chen | Jul 2013 | A1 |
20130257777 | Benko | Oct 2013 | A1 |
20130275151 | Moore | Oct 2013 | A1 |
20130325493 | Wong | Dec 2013 | A1 |
20140019119 | Liu | Jan 2014 | A1 |
20140068489 | Wyland | Mar 2014 | A1 |
20140074454 | Brown | Mar 2014 | A1 |
20140081667 | Joao | Mar 2014 | A1 |
20140136233 | Atkinson | May 2014 | A1 |
20140143671 | Kovalick | May 2014 | A1 |
20140164310 | Chen | Jun 2014 | A1 |
20140164784 | Sinderbrand | Jun 2014 | A1 |
20140172707 | Kuntagod | Jun 2014 | A1 |
20140172804 | Kalifmann | Jun 2014 | A1 |
20140195221 | Frank | Jul 2014 | A1 |
20140244277 | Krishna Rao | Aug 2014 | A1 |
20140249860 | Rynchek | Sep 2014 | A1 |
20140253467 | Hicks | Sep 2014 | A1 |
20140304005 | Hughes | Oct 2014 | A1 |
20140304200 | Wall | Oct 2014 | A1 |
20140379374 | Vinals | Dec 2014 | A1 |
20150038123 | Tuukkanen | Feb 2015 | A1 |
20150058013 | Pakhomov | Feb 2015 | A1 |
20150072330 | Rosenberg | Mar 2015 | A1 |
20150100339 | Kim | Apr 2015 | A1 |
20150134346 | Hyde | May 2015 | A1 |
20150149095 | Otvos | May 2015 | A1 |
20150164436 | Maron | Jun 2015 | A1 |
20150169717 | Wang | Jun 2015 | A1 |
20150178457 | Grimley | Jun 2015 | A1 |
20150206544 | Carter | Jul 2015 | A1 |
20150228277 | Anhari | Aug 2015 | A1 |
20150242585 | Spiegel | Aug 2015 | A1 |
20150257681 | Shuster | Sep 2015 | A1 |
20150258892 | Wu | Sep 2015 | A1 |
20150310455 | Vinals | Oct 2015 | A1 |
20150314681 | Riley, Sr. | Nov 2015 | A1 |
20150363657 | Shigemura | Dec 2015 | A1 |
20150379200 | Gifford | Dec 2015 | A1 |
20160004820 | Moore | Jan 2016 | A1 |
20160012196 | Mark | Jan 2016 | A1 |
20160027264 | Choi | Jan 2016 | A1 |
20160078771 | Zhuang | Mar 2016 | A1 |
20160080403 | Cunningham | Mar 2016 | A1 |
20160143594 | Moorman | May 2016 | A1 |
20160297359 | Kirsch et al. | Oct 2016 | A1 |
20160342741 | Chin | Nov 2016 | A1 |
20170007167 | Kostic et al. | Jan 2017 | A1 |
20170060997 | Lee | Mar 2017 | A1 |
20170161439 | Raduchel | Jun 2017 | A1 |
20170166054 | Ayala Rodriguez | Jun 2017 | A1 |
20170190251 | Wu | Jul 2017 | A1 |
20170195637 | Kusens | Jul 2017 | A1 |
20170200449 | Penilla et al. | Jul 2017 | A1 |
20170235888 | Rahman | Aug 2017 | A1 |
20170251985 | Howard | Sep 2017 | A1 |
20170319184 | Sano | Nov 2017 | A1 |
20170333560 | Epshtein | Nov 2017 | A1 |
20170359551 | Shaw | Dec 2017 | A1 |
20180027006 | Zimmermann | Jan 2018 | A1 |
20180032997 | Gordon | Feb 2018 | A1 |
20180060899 | Das | Mar 2018 | A1 |
20180090155 | Moriya | Mar 2018 | A1 |
20180144763 | Khaleghi | May 2018 | A1 |
20180153862 | Coric | Jun 2018 | A1 |
20180193652 | Srivastava | Jul 2018 | A1 |
20180200142 | Freeman | Jul 2018 | A1 |
20180211059 | Aunger | Jul 2018 | A1 |
20180214061 | Knoth | Aug 2018 | A1 |
20180267700 | Kaditz | Sep 2018 | A1 |
20180285542 | Xiao | Oct 2018 | A1 |
20180322265 | Kwok-Suzuki et al. | Nov 2018 | A1 |
20190006040 | Fleming | Jan 2019 | A1 |
20190035132 | Dirksen | Jan 2019 | A1 |
20190051144 | David | Feb 2019 | A1 |
20190060367 | Zhang | Feb 2019 | A1 |
20190095632 | Seinen | Mar 2019 | A1 |
20190155954 | Goyal | May 2019 | A1 |
20190208354 | Raduchel | Jul 2019 | A1 |
20190210607 | Kobayashi | Jul 2019 | A1 |
20190239789 | Jung | Aug 2019 | A1 |
Foreign Patent Documents
Number | Date | Country |
---|---|---|
109102825 | Dec 2018 | CN |
10-2018-0004312 | Jan 2018 | KR |
Other Publications
Entry |
---|
Healow Home Page, dated Apr. 17, 2016, 3 pages—https://web.archive.org/web/20160417210345/https://healow.com/apps/jsp/webview/signIn.jsp. |
Matheson, “Watch Your Tone—Voice-Analytics Software Helps Customer-Service Reps Build Better Rapport with Customers,” MIT News Office, http://news.mit.edu/2016/startup-cogito-voice-analytics-call-centers-ptsd-0120, Jan. 20, 2016, 4 pages. |
Mullin, “Rewriting Life—Voice Analysis Tech Could Diagnose Disease,” https://www.technologyreview.com/s/603200/voice-analysis-tech-could-diagnose-disease/, Jan. 19, 2017, 9 pages. |
Nield, “Scientists Can Now Diagnose Depression Just by Listening to Your Voice,” IEEE Transactions on Affective Computing, Science Alert, https://www.sciencealert.com/this-computer-program-can-tell-when-someone-s-depressed-by-their-speech-patterns, Jul. 11, 2016, 4 pages. |
PCT International Search Report and Written Opinion, regarding International Application No. PCT/US2020/017781, dated Jun. 2, 2020, 12 pages. |
PCT International Search Report and Written Opinion, regarding International Application No. PCT/US2019/019438, dated Jun. 14, 2019, 19 pages. |
Scherer et al., “Investigating Voice Quality as a Speaker-Independent Indicator of Depression and PTSD,” University of Southern California, Institute for Creative Technologies, Los Angeles, California, 2013, 5 pages. |
Shah, Bhakti, “eClinicalWorks Invests $25 Million in Patient Engagement,” dated Feb. 6, 2013, 3 pages—https://www.eclinicalworks.com/pr-eclinicalworks-invests-25-million/. |
Prior Publication Data
Number | Date | Country |
---|---|---|
20230353989 A1 | Nov 2023 | US |
Related U.S. Application Data
Relation | Number | Date | Country |
---|---|---|---|
Parent | 17645547 | Dec 2021 | US |
Child | 18326804 | | US |
Parent | 16682374 | Nov 2019 | US |
Child | 17645547 | | US |
Parent | 16250279 | Jan 2019 | US |
Child | 16682374 | | US |
Parent | 15988191 | May 2018 | US |
Child | 16250279 | | US |
Parent | 15862552 | Jan 2018 | US |
Child | 15988191 | | US |
Parent | 15198762 | Jun 2016 | US |
Child | 15862552 | | US |