Infusion pump system and method

Information

  • Patent Grant
  • Patent Number
    11,464,906
  • Date Filed
    Wednesday, January 15, 2020
  • Date Issued
    Tuesday, October 11, 2022
Abstract
Some embodiments of an infusion pump system may be configured to allow the user to communicate with the infusion pump system using voice or image input. Optionally, particular embodiments can interpret the voice or image input using speech or image recognition capabilities. By incorporating speech or image recognition equipment within the infusion pump system, user interactions with the pump system can be enhanced and simplified.
Description
TECHNICAL FIELD

This document relates to an infusion pump system, such as a portable infusion pump system for dispensing insulin or another medicine.


BACKGROUND

Pump devices are commonly used to deliver one or more fluids to a targeted individual. For example, a medical infusion pump device may be used to deliver a medicine to a patient as part of a medical treatment. The medicine that is delivered by the infusion pump device can depend on the condition of the patient and the desired treatment plan. For example, infusion pump devices have been used to deliver insulin to the vasculature of diabetes patients so as to regulate blood-glucose levels.


Users of infusion pump devices often need to communicate with the infusion pump via a user interface to control the operations of the infusion pump in a safe and effective manner. For example, a user may press a series of buttons on the user interface to enter food intake data into the infusion pump, such as a number of grams of carbohydrates that is indicative of a recently or soon-to-be consumed meal. The food intake data can be used in conjunction with other parameters stored by the infusion pump system to calculate a suggested bolus dosage of insulin based on the grams of carbohydrates entered by the user. In another example, a user may enter information into the infusion pump system via a user interface that indicates that the user is going to perform a level of physical exercise. In some circumstances, the infusion pump system may reduce the amount of a planned dispensation of insulin in response to the exercise information entered by the user.


SUMMARY

Some embodiments of an infusion pump system may be configured to receive user input at the infusion pump system using voice input. Some such embodiments can interpret the user's voice input using speech recognition technology, and in response to the user's voice input, the infusion pump system can automatically perform one or more tasks (e.g., without additional user intervention). By incorporating speech recognition equipment within the infusion pump system, user communications with the pump system can be enhanced and simplified. In particular embodiments, the infusion pump system may further include a capability to perform natural language processing of the user's voice input, thereby providing an infusion pump system configured to correlate any one of a number of spoken phrases with selected tasks. In addition or in the alternative, some embodiments of an infusion pump system may be configured to allow the user to provide input to the infusion pump system using photographic images. For example, the user may take a photo of a soon-to-be-consumed meal, and the photo may be provided as food intake data that is input to the infusion pump system for purposes of performing one or more tasks by the infusion pump system. In response, the infusion pump system may, for example, use image recognition technology to estimate the carbohydrate and other nutritional contents of the food depicted in the photo and then suggest a particular bolus dosage of insulin (or other medicine) corresponding to the food in the photo.


In particular embodiments described herein, a medical infusion pump system may include a portable housing that defines a space to receive a medicine. The system may also include a pump drive system to dispense medicine from the portable housing when the medicine is received in the space. In some embodiments, the system may also include control circuitry that communicates control signals to the pump drive system to control dispensation of the medicine from the portable housing. The system may also include a speech recognition system that is in communication with the control circuitry. The control circuitry may select one or more tasks to be performed by the infusion pump system in response to the speech recognition system receiving a user's voice input.


In some embodiments of the medical infusion pump system that includes the speech recognition system, at least a portion of the speech recognition system may be stored in one or more computer-readable memory devices at a remote server system, and the control circuitry may be configured to communicate with the remote server system to use the speech recognition system. Optionally, at least a portion of the speech recognition system may be disposed in the portable housing. Further, the control circuitry may be housed in a controller housing that is removably attachable to the portable housing, and at least a portion of the speech recognition system may be disposed in the controller housing. In some embodiments, the speech recognition system may optionally comprise a first subsystem and a second subsystem. At least a portion of the first subsystem may be stored in one or more computer-readable memory devices at a remote server system that communicates with the control circuitry. In addition, at least a portion of the second subsystem may be stored in one or more computer-readable memory devices in the portable housing or in a controller device housing in which the control circuitry is housed and that is removably attachable to the portable housing. In some embodiments, the medical infusion pump system may also include a voice synthesizer for outputting audible human language communications from the infusion pump system. In particular embodiments, the medical infusion pump system may include a remote control device that is separate from the portable housing and that houses the control circuitry. The remote control device may be configured to wirelessly communicate with a wireless communication device housed in the portable housing, and the remote control device may include a microphone for receiving the voice input. Further, in some embodiments the medical infusion pump optionally includes a voice synthesizer for outputting audible human language communications from the remote control device.


In particular embodiments described herein, a medical infusion pump system may include a portable housing that defines a space to receive a medicine. The system may also include a pump drive system to dispense medicine from the portable housing when the medicine is received in the space. In some embodiments, the system may also include control circuitry that communicates control signals to the pump drive system to control dispensation of the medicine from the portable housing. The system may also include an image recognition system in communication with the control circuitry. The control circuitry may select one or more tasks to be performed by the infusion pump system in response to the image recognition system receiving user input comprising a user-provided digital image.


In some embodiments of the medical infusion pump system that includes the image recognition system, at least a portion of the image recognition system may be stored in one or more computer-readable memory devices at a remote server system, and the control circuitry may be configured to communicate with the remote server system to use the image recognition system. In particular embodiments, at least a portion of the image recognition system may be disposed in the portable housing. Optionally, the control circuitry may be housed in a controller housing that is removably attachable to the portable housing, and at least a portion of the image recognition system may be disposed in the controller housing. In some embodiments, the image recognition system may comprise a first subsystem and a second subsystem. At least a portion of the first subsystem may be stored in one or more computer-readable memory devices at a remote server system that communicates with the control circuitry, and at least a portion of the second subsystem may be stored in one or more computer-readable memory devices in the portable housing or in a controller device housing in which the control circuitry is housed and that is removably attachable to the portable housing. Optionally, the medical infusion pump system that includes the image recognition system may include a voice synthesizer for outputting audible human language communications from the infusion pump system. The system may also optionally include a remote control device that is separate from the portable housing and that houses the control circuitry. The remote control device may be configured to wirelessly communicate with a wireless communication device housed in the portable housing, and the remote control device may include a camera device for receiving the digital image. Some such embodiments may include a voice synthesizer for outputting audible human language communications from the remote control device.


Some embodiments described herein may include a method of controlling a portable infusion pump system. The method may include receiving a user's voice input that is indicative of a task associated with using a portable infusion pump system, and controlling the portable infusion pump system to change an operation of the portable infusion pump system based upon the user's voice input. The method may optionally include prompting a user via a user interface display to confirm the operation change of the portable infusion pump system in response to receiving the user's voice input. In some embodiments, the operation change may optionally comprise calculating or initiating a bolus dispensation of a medicine from the portable infusion pump system.


Some embodiments described herein may include another method of controlling a portable infusion pump system. The method may include receiving user input comprising a digital image that is indicative of a food item consumed or to be consumed by the user of the portable infusion pump system, and controlling the portable infusion pump system to change an operation of the portable infusion pump system based upon the user input comprising the digital image. The method may optionally include prompting a user via a user interface display to confirm the operation change of the portable infusion pump system in response to receiving the user input comprising the digital image. In some embodiments, the operation change may optionally comprise calculating or initiating a bolus dispensation of a medicine from the portable infusion pump system.


Some or all of the embodiments described herein may provide one or more of the following advantages. First, some embodiments of the infusion pump system may be configured to receive user input via speech recognition technology. Second, some embodiments of the infusion pump system may be configured to receive user input via image recognition technology. Third, some embodiments of an infusion pump system equipped with speech or image recognition technology may facilitate convenient user input of information to the infusion pump system. Fourth, the safety and efficacy of an infusion pump system may be enhanced because the convenient manner of inputting data to the infusion pump using speech or image recognition may facilitate more timely and complete data entry by the user. Fifth, in some circumstances, some users who may be unable (mentally or physically) to reliably operate a conventional push-button user interface of an infusion pump system may instead be served by embodiments of the system described herein, which can permit such users to reliably input data to an infusion pump system using the speech or image recognition communication interface. Sixth, the infusion pump system equipped with speech or image recognition capabilities may be configured to be portable, wearable, and (in some circumstances) concealable. For example, a user can conveniently wear the infusion pump system on the user's skin under clothing or can carry the pump system in the user's pocket (or other portable location) while receiving the medicine dispensed from the pump device.


The details of one or more embodiments of the invention are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the invention will be apparent from the description and drawings, and from the claims.





DESCRIPTION OF DRAWINGS


FIG. 1 is a diagram depicting the use of an infusion pump system equipped with speech recognition capabilities, in accordance with some embodiments.



FIG. 2 is a flowchart describing a process of using an infusion pump system that includes speech recognition equipment, in accordance with some embodiments.



FIG. 3 is a schematic diagram of an infusion pump system that includes speech recognition equipment, in accordance with some embodiments.



FIG. 4 is a diagram depicting the use of an infusion pump system equipped with image recognition capabilities, in accordance with some embodiments.



FIG. 5 is a diagram depicting the use of an infusion pump system equipped with natural language speech recognition capabilities, in accordance with some embodiments.





Like reference symbols in the various drawings indicate like elements.


DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS

Referring to FIG. 1, some embodiments of an infusion pump system 10 can include speech recognition equipment for purposes of receiving and responding to a user's voice input. The infusion pump system 10 may include, among other elements, a control device 50 and a pump device 60 that receives communications from the control device 50. In the embodiment depicted in FIG. 1, the control device 50 wirelessly communicates with the pump device 60, but the system 10 can be implemented using a control device that is removably attached to a corresponding pump device (e.g., for hard-wired electrical communication) or using a control device that is housed together with the pump device (e.g., in a single portable construct). In one example, the infusion pump system 10 can be configured to perform a series of steps A through F in response to a user's voice input 16. By incorporating voice recognition capabilities within the infusion pump system 10, user communications with the portable pump 60 can be enhanced and simplified. As a result, the accuracy and completeness of the data entered by the user into the portable pump 60 can be improved, and the user may experience greater convenience and time efficiency during interactions with the infusion pump system 10.


In this embodiment, the infusion pump system 10 includes the remote control device 50 in communication with the portable pump 60, which is used to dispense insulin or another medication to a user 15 via an infusion set 70 attached to and penetrating the user's skin 20. In some embodiments, the portable pump 60 optionally includes a user interface 62 that includes input devices such as buttons 63a, 63b, 64a, 64b, and 64c and output devices such as the display 65. In addition, in this embodiment the user 15 can communicate with the infusion pump system 10 by providing voice input, such as the example verbal statement 16 depicted in FIG. 1. Such a verbal statement can be received by voice recognition equipment housed in the control device 50, in the pump device 60, or both. In particular embodiments, the portable pump 60 may also include a wireless communications circuit 40 that facilitates short-range wireless communications 45 between the internal control circuitry of the portable pump 60 and the external remote control device 50.


The infusion pump system 10 is configured to controllably dispense a medicine to be infused into the tissue or vasculature of a targeted individual, such as a human or animal patient. In some embodiments, the portable pump 60 includes the housing structure 66 that defines a cavity in which a fluid cartridge (not shown) can be slidably received. For example, the fluid cartridge can be a carpule that is either user-fillable or is preloaded with insulin or another medicine for use in the treatment of diabetes (e.g., Byetta®, Symlin®, or others). Such a cartridge may be supplied, for example, by Eli Lilly and Co. of Indianapolis, Ind. Other examples of medicines that can be contained in the fluid cartridge include: pain relief drugs, hormone therapy, blood pressure treatments, anti-emetics, osteoporosis treatments, or other injectable medicines. The fluid cartridge may have other configurations. For example, in some embodiments the fluid cartridge may comprise a reservoir that is integral with the pump housing structure 66 (e.g., the fluid cartridge can be defined by one or more walls of the pump housing structure 66 that surround a plunger to define a reservoir in which the medicine is injected or otherwise received).


Still referring to FIG. 1, in this embodiment, the portable pump 60 optionally includes a cap device 68 to retain the fluid cartridge in the cavity of the housing structure 66 and to penetrate a septum of the fluid cartridge for purposes of establishing fluid communication with the infusion set 70. The portable pump 60 includes a drive system that advances a plunger (not shown in FIG. 1) in the fluid cartridge so as to dispense fluid therefrom. In some embodiments, the dispensed fluid exits the fluid cartridge, passes through a flexible tube 72 of the infusion set 70 to a cannula housing 74 retained to the user's skin 20 by a skin adhesive patch 78. The dispensed fluid can enter through the skin 20 via a cannula 76 attached to the underside of the cannula housing 74.


In some embodiments, the infusion pump system 10 can be configured to supply scheduled basal dosages of insulin (or another medication) along with user-selected bolus dosages. The basal delivery rate can be selected to maintain a user's blood glucose level in a targeted range during normal activity throughout the day. The user-selected bolus deliveries may provide substantially larger amounts of insulin in particular circumstances in which the user consumed (or will consume) carbohydrates (e.g., during a meal) or in which the user's blood glucose level requires a significant downward correction. In some embodiments, the infusion pump system 10 can suggest a bolus dosage to the user in a manner that accounts for the user's food intake, the user's recent blood glucose level (e.g., manually input into the portable pump 60 by the user, detected from an integral blood test strip analyzer, wirelessly transmitted to the portable pump 60 from an external blood strip reader device, wirelessly transmitted to the portable pump 60 from a body-worn continuous glucose monitoring device, or the like), the rate of change in the user's blood glucose level, and previously delivered insulin that has not yet acted on the user. For example, a user can enter a carbohydrate value indicative of a meal into the portable pump 60, and in response thereto, the portable pump 60 can output a suggested bolus dosage to the user. In another example, as will be described further below, the user can provide a voice input that identifies food items that the user will consume, and the infusion pump system 10 can use speech recognition technology to determine a suggested bolus dosage that corresponds to the food items.
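By way of a non-limiting illustration, the following sketch shows one common form of such a bolus suggestion calculation (the function and parameter names, and all numeric values, are hypothetical and are not drawn from this disclosure):

```python
# Illustrative bolus-suggestion arithmetic; a real pump's calculation and
# safety checks may differ from this simplified form.

def suggest_bolus(carbs_g: float, bg: float, target_bg: float,
                  carb_ratio: float, isf: float, iob: float) -> float:
    """Return a suggested insulin bolus in Units.

    carbs_g    -- grams of carbohydrates consumed or to be consumed
    bg         -- current blood glucose (mg/dL)
    target_bg  -- target blood glucose (mg/dL)
    carb_ratio -- grams of carbohydrate covered per Unit of insulin
    isf        -- insulin sensitivity factor (mg/dL drop per Unit)
    iob        -- insulin on board (Units delivered but not yet acted)
    """
    meal_bolus = carbs_g / carb_ratio
    correction = (bg - target_bg) / isf
    suggested = meal_bolus + correction - iob
    return max(round(suggested, 1), 0.0)  # never suggest a negative bolus

# Example: 74 g of carbohydrates at a 15 g/Unit ratio, blood glucose on
# target, and no insulin on board yields about 4.9 Units (as in FIG. 1).
print(suggest_bolus(74, 110, 110, 15.0, 40.0, 0.0))  # -> 4.9
```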


In some embodiments, the infusion pump system 10 may modify a bolus suggestion (e.g., a bolus delivery in conjunction with a meal) in response to certain circumstances. For example, the infusion pump system 10 may decrease or otherwise modify a post-meal bolus delivery based on a rapidly falling blood glucose level, a current blood glucose level that is below a threshold limit, based on an increased level of physical activity, or the like.


The infusion pump system 10 can be configured to be portable and can be wearable and concealable. For example, a user can conveniently wear some or all of the infusion pump system 10 on the user's skin (e.g., using skin adhesive) underneath the user's clothing or carry the portable pump 60 or remote control device 50 in the user's pocket (or other portable location) while receiving the medicine dispensed from the infusion pump system 10. As such, the pump system 10 can be used to deliver medicine to the tissues or vasculature of the user in a portable, concealable, and discreet manner.


Still referring to FIG. 1, the portable pump 60 includes the user interface 62 that permits a user to monitor and control the operation of the infusion pump system 10. In some embodiments, the user interface 62 includes a display 65 and the user-selectable buttons (e.g., five buttons 63a, 63b, 64a, 64b, and 64c in this embodiment) that are in electrical communication with the control circuitry of the portable pump 60. For example, the display 65 may be used to communicate a number of status indicators, alarms, settings, and/or menu options for the infusion pump system 10. In some embodiments, the user may press one or more of the buttons 63a, 63b, 64a, 64b, and 64c to shuffle through a number of menus or program screens that show particular status indicators, settings, and/or data (e.g., review data that shows the medicine dispensing rate, the amount of medicine delivered during the last bolus, the delivery time of the last bolus, the total amount of medicine dispensed in a given time period, the amount of medicine scheduled to be dispensed at a particular time or date, the approximate amount of medicine remaining in the cartridge, or the like).


In some embodiments, the user can adjust the settings or otherwise program the portable pump 60 by pressing one or more buttons 63a, 63b, 64a, 64b, and 64c of the user interface 62. For example, in embodiments of the infusion pump system 10 configured to dispense insulin, the user may press one or more of the buttons 63a, 63b, 64a, 64b, and 64c to change the dispensation rate of insulin or to request that a bolus of insulin be dispensed immediately, at a scheduled later time, over a period of time, or following a particular time-based profile. In another example, the user may use the buttons 63a, 63b, 64a, 64b, and 64c to manually input information such as the user's current blood glucose level (e.g., as measured by an external blood glucose meter), the current rate of change in the user's blood glucose level, or the like into the portable pump 60.


As an alternative to, or in conjunction with, pressing one or more buttons 63a, 63b, 64a, 64b, and 64c of the user interface 62 to adjust or program the infusion pump system 10, the example infusion pump system 10 can receive voice input from the user. The use of speech recognition equipment (housed in the control device 50, in the pump device 60, or both) provides an additional functionality that can enhance and simplify user interactions with the portable pump 60. For instance, using speech recognition, the need to manually actuate multiple buttons 63a, 63b, 64a, 64b, and 64c in a specific order for purposes of shuffling through menus may be eliminated or otherwise reduced in some circumstances. In one example, as depicted in FIG. 1, the user of the infusion pump system 10 has consumed, or will soon consume, a bagel and orange juice. As such, the user can cause the infusion pump system to initiate a task in response to the voice input (dispensing a corresponding bolus of insulin to counteract the effects of the intake of the bagel and orange juice). The bolus dispensation of insulin may be intended to cause the user's blood glucose level to remain within a target range. To begin the process, the user 15 can speak the statement 16 that identifies the food to be consumed. Such a verbal statement can be received by a component of the voice recognition equipment, such as a microphone device 51 housed in the control device 50 (or, optionally, a component of the voice recognition equipment housed in the pump device 60, such as a microphone device 61). In response to receiving the voice input, the infusion pump system 10 can interpret the statement 16, determine a recommended bolus dispensation, and present the recommendation to the user 15 for confirmation. Upon receipt of user confirmation, the infusion pump system 10 initiates or schedules the bolus dispensation task. By incorporating such voice recognition capabilities within the infusion pump system 10, user communications with the portable pump 60 can be enhanced and simplified. As a result, the accuracy and completeness of the data entered by the user into the portable pump 60 can be improved, and the user can experience greater convenience and time efficiency.


Still referring to FIG. 1, in this example at step A, the user 15 speaks the statement 16 that reflects a task or set of tasks that the user 15 wants the infusion pump system 10 to perform. For example, the user 15 makes the statement 16, “I am going to eat a bagel and orange juice.” As will be described further, the infusion pump system 10 will receive and process the statement 16 and recommend a bolus dispensation of insulin to compensate for the bagel and orange juice to be consumed by the user 15.


In this example, the user 15 has made a statement 16 that identifies types of food that will be consumed, but it should be understood from the description herein that many other types of statements corresponding to other infusion pump tasks can be similarly initiated using voice input. For instance, in other non-limiting examples of the types of statements that can be made to initiate tasks, the user 15 may speak a command to “stop the pump,” “start the pump,” or “stop the bolus.” Further, the user 15 may speak a command to “start a temporary basal rate of 50% for 3 hours,” or “I am going to exercise for 1 hour.” In still further examples, the user 15 may speak commands such as: “prime the infusion set,” “my blood glucose level is 130,” “I am going to sleep now,” “display estimated battery life,” “display estimated medicine expiration time,” “snooze all alerts for 30 minutes,” “how much insulin do I have on board,” “how long have I been using this infusion set,” “what time is it,” “change to basal pattern B,” “change to my weekend basal pattern,” “soccer practice starts in 30 minutes” (which would be linked to a particular pre-programmed temporary basal pattern), “give that bolus as a square-wave bolus,” “give that bolus as a 4-hour combo bolus,” “remind me to bolus in an hour,” “remind me to check my blood sugar in an hour,” “remind me to eat lunch at 11:30,” “blocked set alarm acknowledged,” and the like. It should be recognized that the user 15 can provide a wide variety of types of statements to initiate a wide variety of tasks by the infusion pump system 10, and that the examples provided here are merely illustrative. In some embodiments, as will be described further in reference to FIG. 5, a natural language processing module can be implemented in infusion pump system 10 to further enhance the speech recognition capabilities of the infusion pump system 10.
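By way of a non-limiting illustration, the following sketch shows one simple way a small set of spoken statements could be mapped to pump tasks (the patterns and task names are hypothetical; an actual task database would be far more extensive):

```python
# Minimal phrase-to-task lookup using regular expressions.
import re

TASK_PATTERNS = [
    (re.compile(r"\bstop the pump\b", re.I), "SUSPEND_DELIVERY"),
    (re.compile(r"\bstart the pump\b", re.I), "RESUME_DELIVERY"),
    (re.compile(r"temporary basal rate of (\d+)% for (\d+) hours?", re.I),
     "TEMP_BASAL"),
    (re.compile(r"my blood glucose (?:level )?is (\d+)", re.I), "LOG_BG"),
    (re.compile(r"going to eat (.+)", re.I), "MEAL_BOLUS"),
]

def match_task(transcription: str):
    """Return (task_name, captured_parameters), or None if nothing matches."""
    for pattern, task in TASK_PATTERNS:
        m = pattern.search(transcription)
        if m:
            return task, m.groups()
    return None

print(match_task("I am going to eat a bagel and orange juice"))
# -> ('MEAL_BOLUS', ('a bagel and orange juice',))
```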


Still referring to FIG. 1, in this example, the verbal statement 16 is received by the microphone 51 of the remote control device 50. In some embodiments, the user may press a button on the control device 50 or otherwise prompt the control device 50 to prepare for receiving the voice input. The remote control device 50 can include electronic circuitry for converting the statement 16 to an audio signal (e.g., an “audio file,” “waveform,” or “sample”) that corresponds to the statement 16. The audio signal corresponding to the statement 16 can be saved (temporarily or permanently) in a computer-readable memory device housed in the control device 50, a computer-readable memory device housed in the pump device 60, or both.


In this embodiment the control device 50 is depicted as a smart phone device, but it should be understood from the description herein that, in other embodiments, the control device 50 can be implemented in the form of devices other than a smart phone device. Some other example devices that can be used similarly to the remote control device 50 can include, but are not limited to, a personal computer, a tablet computing device, a blood glucose meter device (e.g., an external blood strip reader), a continuous glucose meter device, a wearable computing device, a PDA, or a custom remote device. In still other embodiments, the control device is not a remote device, but instead is included as part of, or mechanically attached together with, the pump device. For instance, in such embodiments the pump device of the infusion pump system can be equipped with the capabilities to perform the functions described herein in regard to the remote control device 50. Further, in some embodiments certain operations or parts of certain operations may be performed at a remote server system, including a cloud-based server system, rather than completely on a personal computing device such as the remote control device 50. Accordingly, the remote control device 50, or equivalent, can be connected to a network such as the internet or an intranet system. Such a division of tasks may provide better process optimization, computational efficiency, and response time.


Still referring to FIG. 1, in this example at step B, the infusion pump system 10 performs a speech recognition function in response to receiving the voice input 16. In some implementations, the speech recognition function provides a repeatable process of translating a voice utterance to a text transcription using an automated speech recognition (“ASR”) system. In some ASR systems, acoustic and language models can be used by speech recognition engines to statistically analyze an encoded voice utterance in order to create one or more likely text strings that reflect the sounds of the speaker. Some ASR systems may use phonetic dictionaries (e.g., lists of words and their phonetic spellings) when performing speech recognition. Such phonetic dictionaries have been compiled by including pronunciation guides from standard language dictionaries, and by manually labeling acoustic examples of various words spoken by various speakers. In some embodiments, the ASR system can use a language model that includes a large vocabulary statistical language model capable of transcribing complex user utterances. Many speech recognition engines have a group of parameters that may be adjusted to change the way that a voice utterance is analyzed.


Using an ASR system in the remote control device 50, at a remote server in communication with the remote control device 50, or distributed between the remote control device 50 and a remote server, the audio signal from the voice input 16 can be transcribed to one or more candidate text transcriptions correlating to the audio signal of statement 16. In some embodiments, the control device 50 can generate speech recognition confidence values for the candidate transcriptions that are generated. In particular embodiments, the transcription with the highest confidence value may be selected by the ASR system as the designated transcription. Other techniques may also be used to create transcription(s) in response to the voice input 16, and to select which candidate transcription to use as the designated transcription. In some circumstances, no candidate transcription having a confidence value that surpasses a threshold confidence level is identified. In some such circumstances, the control device 50 may request clarification from the user 15, or may request more information from the user 15. Such requests may be presented to the user 15 audibly using voice synthesis at the remote control device 50, or visually by presenting an indication on the display of the remote control device 50, or by a combination of audible and visual indicators.
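By way of a non-limiting illustration, the following sketch shows one way a designated transcription could be selected from scored ASR candidates (the threshold value and data are hypothetical; real ASR engines expose confidence scores in engine-specific ways):

```python
# Pick the highest-confidence candidate, or signal that clarification
# is needed when no candidate is confident enough.

CONFIDENCE_THRESHOLD = 0.80

def designate(candidates):
    """candidates: list of (transcription, confidence) pairs."""
    best_text, best_conf = max(candidates, key=lambda c: c[1])
    if best_conf < CONFIDENCE_THRESHOLD:
        return None  # caller should request clarification from the user
    return best_text

candidates = [
    ("I am going to eat a bagel and orange juice", 0.93),
    ("I am going to eat a table and orange juice", 0.41),
]
print(designate(candidates))  # -> the first transcription
```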


Still referring to FIG. 1, in this example at step C, the text transcription(s) of the speech recognition process from step B are compared to a compilation of tasks in a task database 80 indicative of available tasks to be performed by the infusion pump system 10. Such a comparison function (at step C) can be useful for determining the task to be performed by the pump system 10 that most likely corresponds to the voice input 16 from the user 15. In some embodiments, the task database 80 is stored in a computer-readable memory device housed in the remote control device 50. However, the task database 80 can also be stored in a computer-readable memory device housed in the portable pump 60, stored in a computer-readable memory device housed at a remote server system in communication with the remote control device 50 or the portable pump 60, or at a combination of such computer-readable memory locations (e.g., a distributed database configuration). In this embodiment, the task database 80 is a repository that stores an extensive number of tasks available to be performed by the pump system corresponding to a variety of types of voice input statements, such as statement 16. The transcription(s) of the voice input from step B can be compared to the tasks listed in the task database 80 to find matching task(s) to be performed by the pump system 10. In some embodiments, a confidence level for the match between the transcription(s) and the task(s) can be determined. Optionally, the task with the highest confidence level can be automatically selected for implementation by the control device 50, the pump device 60, or both. In particular embodiments, if no task is determined at step C with a confidence level that surpasses a threshold level, or if multiple tasks have confidence levels that are within a differentiation threshold level of each other, the user 15 is prompted to verify which task should be implemented (e.g., presented in visual and/or audio output via the user interface of the control device 50 or the pump device 60 with a request for clarification or more information as described above). In some such cases, the user 15 may be presented with the task having the highest confidence level and the user 15 may be prompted to confirm that the task should be implemented by the infusion pump system 10 (e.g., the prompt for verification from the user can be presented in visual and/or audio output via the user interface of the control device 50 or the pump device 60).
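By way of a non-limiting illustration, the following sketch scores a transcription against task-database entries using a simple token-overlap measure (illustrative only; an actual system could use statistical language models or other matching techniques):

```python
# Score a transcription against candidate task phrases and rank the tasks.

def overlap_score(transcription: str, task_phrase: str) -> float:
    a = set(transcription.lower().split())
    b = set(task_phrase.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

task_db = {
    "stop the pump": "SUSPEND_DELIVERY",
    "going to eat a meal": "MEAL_BOLUS",
    "prime the infusion set": "PRIME_SET",
}

text = "I am going to eat a bagel and orange juice"
ranked = sorted(((overlap_score(text, phrase), task)
                 for phrase, task in task_db.items()), reverse=True)
print(ranked[0])  # highest-confidence candidate task -> MEAL_BOLUS
```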


Still referring to FIG. 1, in this example at step D, the selected task from step C is characterized in preparation for presentation to the user 15 (for subsequent user confirmation). Depending on the task, additional information may be acquired from a database 90 as a part of the preparation step. The database 90, as with the task database 80, can be stored in one or more computer-readable memory devices housed in various locations including in the remote control device 50, the portable pump 60, a remote server including cloud-based servers, and at a combination of such locations. As depicted by this example in FIG. 1, the database 90 queried by the process at step D can contain nutritional information for a variety of food items. The nutritional information can include, but is not limited to, carbohydrates, fat, protein, and the glycemic index for food items. In some embodiments, the database 90 can also include the user's 15 most current blood glucose reading, an insulin-on-board level, an insulin sensitivity factor for the user 15, bolus delivery preferences, and the like. In particular embodiments, some or all of such nutritional information and other data can be considered when the task is being prepared for presentation to the user 15. For example, in response to the statement 16, the nutritional information for a bagel 92 and orange juice 94 can be queried from the database 90. In some embodiments, the data stored in database 90 is customizable by the user 15. For example, the user 15 may define a particular food item, such as a peanut butter and jelly sandwich 96, as having particular nutritional contents. The user's custom nutritional information can be given preference in the database 90 over the default nutritional information. In some embodiments, as part of the preparation for presenting the task to the user 15, the user 15 may first be presented with a request for additional information. For example, the user 15 may be presented with a request to input a current blood glucose level. After the receipt of such additional information, the preparation for presenting the task to the user 15 can be completed.
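By way of a non-limiting illustration, the following sketch shows a nutritional lookup in which user-defined entries take precedence over default entries (all food names and carbohydrate values are hypothetical):

```python
# Query carbohydrate content, preferring the user's customized entries.

DEFAULT_FOODS = {"bagel": 48, "orange juice": 26, "pb&j sandwich": 45}
USER_FOODS = {"pb&j sandwich": 38}  # user-customized entry is preferred

def carbs_for(items):
    total = 0
    for item in items:
        if item in USER_FOODS:
            total += USER_FOODS[item]
        elif item in DEFAULT_FOODS:
            total += DEFAULT_FOODS[item]
        else:
            raise KeyError(f"unknown food item: {item}")
    return total

print(carbs_for(["bagel", "orange juice"]))  # -> 74, as in FIG. 1
```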


Still referring to FIG. 1, in this example at step E, the finalized task to be performed by the infusion pump system is presented to the user 15 for confirmation. The task may be presented to the user 15 audibly using voice synthesis at the control device 50 (or the pump device 60), or visually by presenting an indication on the display of the control device 50 (or the pump device 60), or by a combination of audible and visual indicators at one or both of the control device 50 and the pump device 60. For example, in response to the statement 16, the user 15 is presented with information indicating that the infusion pump system 10 has identified a task related to the user's 15 intent to consume 74 grams of carbohydrates (48 grams from the bagel and 26 grams from the orange juice), and that the infusion pump system 10 recommends a corresponding bolus dispensation of 4.9 Units of insulin. To confirm that task, the user 15 can select "YES" 52 on the remote control device 50. In response to a selection of the "YES" button 52, the control device 50 can communicate with the pump device 60 so as to initiate the dispensation of the bolus dosage (e.g., 4.9 Units in this example), as described below. Alternatively, to deny that task, the user 15 can select "NO" 54 on the remote control device 50. Optionally, in response to a selection of the "NO" button 54, the control device 50 can present the user with an option to manually input or verbally speak a specific number for a bolus dosage that is different from the suggested dosage displayed on the screen at step E. Alternatively, or in addition to the manual selection of "YES" 52 or "NO" 54, the user 15 may speak "yes" or "no" to the remote control device 50 to confirm or deny the task presented.
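By way of a non-limiting illustration, the following sketch shows the confirm-or-deny exchange of step E (a hypothetical interface; an actual device would route the prompt through its display and/or voice synthesizer):

```python
# Present the finalized task and branch on the user's confirmation.

def confirm_task(carbs_g: int, units: float, answer: str) -> str:
    prompt = (f"Detected {carbs_g} g of carbohydrates. "
              f"Suggested bolus: {units} Units. Confirm?")
    print(prompt)
    if answer.strip().lower() in ("yes", "y"):
        return "DISPATCH_TO_PUMP"    # proceed to step F
    return "REQUEST_ALTERNATE_DOSE"  # offer manual or spoken dose entry

print(confirm_task(74, 4.9, "yes"))  # -> DISPATCH_TO_PUMP
```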


Still referring to FIG. 1, in this example at step F, the remote control device 50 communicates the task to the portable pump 60 for activation of the portable pump 60 in accordance with the task confirmed by the user 15 (e.g., after the user selected the “YES” button 52). In the example, the display 65 of the portable pump 60 indicates that a bolus dispensation of 4.9 Units has been initiated. In this embodiment, communications between the remote control device 50 and the portable pump 60 are conducted by short-range wireless technologies such as, but not limited to, RF, Bluetooth, NFC, IR, Bluetooth low energy, ANT+, and the like. Accordingly, the portable pump 60 can include a wireless communication circuit 40 that sends and receives data in cooperation with the remote control device 50. In alternative embodiments, the communications between the remote control device 50 and the portable pump 60 can be via a hardwired connection therebetween.
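By way of a non-limiting illustration, the following sketch shows one possible shape for the command message sent from the controller to the pump (the schema is hypothetical; actual device protocols are vendor-specific and typically framed and authenticated by the radio layer):

```python
# Serialize a bolus command into a byte payload for the wireless link.
import json

def make_bolus_command(units: float, command_id: int) -> bytes:
    msg = {"id": command_id, "cmd": "BOLUS", "units": units}
    return json.dumps(msg).encode("utf-8")

packet = make_bolus_command(4.9, command_id=17)
print(packet)  # payload handed to the short-range radio for delivery
```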


Referring now to FIG. 2, the control circuitry of a medical device (e.g., a portable infusion pump in this embodiment) that includes speech recognition equipment can implement a process 200 of receiving voice input from a user, and controlling the medical device in accordance with task(s) associated with the voice input. Such a process 200, for example, can be implemented by the control circuitry housed in the control device 50, the portable pump 60, or a combination thereof, as well as by the control circuitry of other embodiments of infusion pump systems described herein (e.g., FIGS. 3, 4, and 5).


In operation 205, the control circuitry of a medical device can receive voice input from a vocal utterance spoken by a user of the medical device. The voice input can be indicative of a task associated with using the medical device. One example of a medical device to perform operation 205 is depicted in FIG. 1, where the infusion pump system 10 includes the control device 50 that is in communication with the portable pump device 60 of the infusion pump system 10. As explained, the control device 50 can receive the voice input via the microphone 51 located in the remote control device 50. In other embodiments, another type of control device 50 (e.g., a tablet computing device, a blood glucose meter device, a body-worn continuous glucose monitoring device, a custom remote, a removably attachable control device, and the like) can perform the same steps as the remote control device 50, which is implemented in FIG. 1 as a smartphone device. In still further embodiments, no remote control device 50 is included in the infusion pump system 10, and the voice input can be received directly at the microphone 61 housed in the portable pump device 60.


In operation 210, the voice input is coded to digital format (e.g., an “audio file,” “waveform,” “sample,” and the like) by the control circuitry of the medical device and saved in memory of the medical device. For example, in the context of the infusion pump system 10 of FIG. 1, the remote control device 50 can convert the voice input to digital format and save the digitized voice input in memory.


In operation 215, the digitized voice input is analyzed by the control circuitry of the medical device to determine one or more candidate textual transcriptions corresponding to the voice input. This step of the process 200 can be optionally performed using an ASR system, as explained above in regard to FIG. 1. In some embodiments, the control circuitry of the medical device communicates with a remote server to perform some or all of the ASR system operations.


In operation 220, the control circuitry of the medical device compares the textual transcription(s) from operation 215 to tasks pertaining to the medical device and that are stored in a task database. In some embodiments, the task database is stored in the memory of the medical device. In alternative embodiments, the task database is stored at a remote server system that is accessible by the medical device over a network such as the internet. One or more tasks that are stored in the task database can be identified as candidates to have a correspondence to the textual transcription(s). A statistical confidence level can be generated in regard to the correspondence between the textual transcription(s) and the candidate task(s).


In operation 225, the control circuitry of the medical device compares the statistical confidence level(s) generated in operation 220 to a predetermined threshold confidence level. If one and only one particular task has a statistical confidence level that surpasses the threshold confidence level, that particular task is selected as the task to present to the user, and the process 200 moves to operation 235. However, if no particular task has a statistical confidence level that surpasses the threshold confidence level, or if multiple tasks have statistical confidence level(s) that surpass the threshold confidence level, then the process 200 moves to operation 230. In alternative embodiments, if multiple tasks have statistical confidence level(s) that surpass the threshold confidence level, then the task with the highest confidence level is selected as the task to present to the user. In some such alternative embodiments, the task with the highest confidence level is only selected if the confidence level of the task is greater than the next highest confidence level by more than a predetermined differential threshold value.
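By way of a non-limiting illustration, the following sketch implements the selection rule of operation 225, including the alternative embodiment that uses a differential threshold (both threshold values are hypothetical):

```python
# Select a task only when confidence clearly supports it; otherwise
# return None so the caller falls through to operation 230.

THRESHOLD = 0.80
DIFFERENTIAL = 0.10

def select_task(scored_tasks):
    """scored_tasks: list of (task, confidence) pairs."""
    above = [(t, c) for t, c in scored_tasks if c > THRESHOLD]
    if len(above) == 1:
        return above[0][0]  # exactly one qualifying task: select it
    if len(above) > 1:      # alternative embodiment: pick a clear winner
        above.sort(key=lambda tc: tc[1], reverse=True)
        if above[0][1] - above[1][1] > DIFFERENTIAL:
            return above[0][0]
    return None             # request clarification (operation 230)

print(select_task([("MEAL_BOLUS", 0.91), ("LOG_BG", 0.35)]))  # -> MEAL_BOLUS
```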


In operation 230, the control circuitry of the medical device requests user clarification in regard to the voice input that was previously provided by the user in operation 205. The request for user clarification can be presented audibly to the user by voice synthesis via the medical device, by displaying information on the user interface display of the medical device, or both. In some circumstances, the clarification requested may be in relation to a candidate task that had a statistical confidence level that was determined to be less than the threshold confidence level. For instance, such a clarification request could be, “Do you want to stop the pump?” In another circumstance, the clarification requested may be general, rather than in relation to a candidate task. For example, in that circumstance the clarification request could be, “Your input was not understood—please try again,” or another indication that the voice input should be restated. After requesting user clarification, the process 200 returns to operation 205 and waits for further voice input from the user.


In operation 235, after selecting a task in operation 225, the control circuitry of the medical device characterizes the selected task, as needed, in preparation for presentation to the user for confirmation of the task. For example, some data may need to be obtained and some calculations may need to be performed to prepare the task for presentation to the user. To provide a more specific example, as described in the context of the infusion pump system 10 of FIG. 1, the nutritional content of the food items (bagel and orange juice) was obtained from a database. The nutritional content of the bagel and orange juice was included in the task as presented to the user for confirmation.


In operation 240, the control circuitry of the medical device presents the task to the user for user confirmation. The task can be presented audibly to the user by voice synthesis via the medical device, by displaying information on the user interface display of the medical device, or both. As described in relation to the infusion pump system 10 of FIG. 1, in some embodiments the presentation of the task can include a description of the task and selectable elements on the user interface of the medical device, such as buttons or soft-keys corresponding to "YES" and "NO"; alternatively, the user can provide "yes" or "no" inputs by speaking to the medical device. The user's responsive input is received by the control circuitry of the medical device in operation 245.


In operation 250, the control circuitry of the medical device determines whether the user input received in operation 245 was a confirmation or a denial of the task that was presented to the user. If the user input was a denial of the task that was presented to the user, the process 200 proceeds to operation 230 where user clarification is requested as described above. If the user input was a confirmation of the task that was presented to the user, the process 200 proceeds to operation 255 where the control circuitry of the medical device communicates the task to other portions of the device so as to implement the task. In this embodiment, the control circuitry communicates the task to the pump controller to implement the task. In the context of the infusion pump system 10 of FIG. 1, the operation is exemplified in step F with the remote control device 50 sending a wireless signal 45 to the portable pump 60 to initiate a bolus of 4.9 units of insulin.


Now referring to FIG. 3, various embodiments of a portable infusion pump system 300 can include a pump controller device 360 that is equipped with speech recognition capabilities. As described further herein, the speech recognition process can take place at the pump controller device 360, at a remote server 310 (which can be multiple servers in a system) in communication with the pump controller device 360, or by a combination of both the pump controller device 360 and the remote server 310. Certain items of the infusion pump system 300 are shown with dashed lines to indicate that they are optional or alternative items, as explained below.


The pump controller device 360 includes a control module 361 that can be made up of one or more components. In this embodiment, the control module 361 is configured to communicate control or power signals to the other components of the infusion pump system 300, and to receive inputs and signals therefrom. In some embodiments, the control circuitry can include a main processor board that is in communication with a power supply board. The control circuitry can include at least one processor that coordinates the electrical communication to and from the control module 361 and other components of the infusion pump system 300. For example, the user interface 362 of the pump controller device 360 can include input components (e.g., buttons, touchscreen, microphone, or a combination thereof) and output components (e.g., display screen, speaker, vibratory device, or a combination thereof) that are electrically connected to the control circuitry of the control module 361. In some embodiments, the control module 361 can receive input commands from a user's button selections (e.g., buttons as shown in FIG. 1, 4, or 5), and thereby cause the display device of the user interface 362 to output a number of menus or program screens that show particular settings and data (e.g., review data that shows the medicine dispensing rate, the total amount of medicine dispensed in a given time period, the amount of medicine scheduled to be dispensed at a particular time or date, the approximate amount of medicine remaining in the cartridge, the amount of battery life remaining, or the like).


The processor of the control module 361 can be arranged on a main processor circuit board of the control module 361 along with a number of other electrical components such as computer-readable memory devices. The control circuitry can be programmable in that the user or a clinician may provide one or more instructions to adjust a number of settings for the operation of the infusion pump system 300. Such settings may be stored in the memory devices of the control module 361. Furthermore, the control module 361 may include one or more dedicated memory devices that store executable software instructions for the processor. The control module 361 may include other components, such as sensors, that are electrically connected to the main processor board. A rechargeable battery pack (not shown) may provide electrical energy to the control module 361, and to other components of the pump controller device 360 (e.g., user interface 362, speech recognition module 363, and others).


Still referring to FIG. 3, the user interface 362 of the pump controller device 360 permits a user to monitor and control the operation of the pump controller device 360. For example, the user interface 362 can include a display device having an active area that outputs information to a user, and buttons (e.g., actuatable buttons as shown in FIG. 1, 4, or 5, or touchscreen soft-key buttons defined on the display device) that the user can use to provide input. The display device can be used to communicate a number of settings or menu options for the infusion pump system 300. The display may include an active area in which numerals, text, symbols, images, or a combination thereof can be displayed (refer, for example, to FIG. 1). For example, the user may press one or more buttons to shuffle through a number of menus or program screens that show particular settings and data (e.g., review data that shows the medicine dispensing rate, the total amount of medicine dispensed in a given time period, the amount of medicine scheduled to be dispensed at a particular time or date, the approximate amount of medicine remaining in the cartridge, or the like). In some embodiments, the user can adjust the settings or otherwise program the control module 361 via the user interface 362. For example, in embodiments of the infusion pump system 300 configured to dispense insulin, the user may press one or more of the buttons of the user interface 362 to change the dispensation rate of insulin or to request that a bolus of insulin be dispensed immediately or at a scheduled, later time.


The user interface 362 can also include components that facilitate voice communications between the pump controller device 360 and a user. In some embodiments, the user interface 362 includes a microphone (refer, for example, to microphone 51 or microphone 61 in FIG. 1). The microphone can receive voice input from the user, such as when the user wants to initiate a task using the speech recognition capabilities of the infusion pump system 300. Further, in some embodiments, the user interface 362 includes a speaker. The speaker can be used to provide audible communications (e.g., synthesized speech, audible beeps or tones, or the like) from the infusion pump system 300 to the user. For example, the infusion pump system 10 of FIG. 1 provided an audible characterization of the task to the user 15 in step E, and the process 200 of FIG. 2 provided an audible request for clarification to the user in operation 230.


Still referring to FIG. 3, the pump controller device 360, the remote server 310, or both the pump controller device 360 and the remote server 310, can optionally include speech recognition modules 363 and 313, task databases 364 and 314, and food and activity databases 365 and 315 respectively. These subsystems can facilitate voice communications between the pump controller device 360 and a user, for example, as described in reference to FIG. 1. The pump controller device 360 and the remote server 310 can be in communication with each other via a network 330, such as a wireless network, WiFi network, wired network, LAN, intranet, internet, telephone network, and so on—and combinations of such networks. The pump controller device 360 can communicate with the network 330 using a wireless connection 320, or a wired connection, or both a wireless connection 320 and a wired connection. Such wireless communication may occur, for example, via a wireless communication module 367 using radio-frequency, Bluetooth, WiFi, or other such wireless communication methods, and combinations of such methods. The remote server 310 can include one or more processors 312 that can execute instructions embodied in a computer program. The processors 312 can include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.


In some embodiments of the infusion pump system 300, the pump controller device 360 includes the speech recognition module 363, task database 364, and food and activity database 365, while the remote server 310 may not have analogous sub-systems. In such embodiments, the speech recognition process and other operations for facilitating voice communications between the pump controller device 360 and a user are performed entirely at the pump controller device 360. In alternative embodiments of the infusion pump system 300, the remote server 310 includes speech recognition module 313, task database 314, and food and activity database 315, while the pump controller device 360 does not have analogous sub-systems. In such embodiments, the speech recognition process and other operations for facilitating voice communications between the pump controller device 360 and a user are performed by the remote server 310.


In particular embodiments, both the pump controller device 360 and the remote server 310 include the sub-systems for performing speech recognition and other operations for facilitating voice communications between the pump controller device 360 and a user. That is, the pump controller device 360 includes the speech recognition module 363, the task database 364, and the food and activity database 365; and in addition the remote server 310 includes the speech recognition module 313, the task database 314, and the food and activity database 315. In alternative embodiments, one or more of the sub-systems are located in both the pump controller device 360 and the remote server 310, but not all of the sub-systems are located in both.


Various techniques can be used to coordinate the activities between the pump controller device 360 and the remote server 310 when some or all of the sub-systems are arranged in both the pump controller device 360 and the remote server 310. For example, in some embodiments the processing can be initiated locally at the pump controller device 360, and if the pump controller device 360 is unable to attain the threshold statistical confidence levels for the textual transcription of the voice signal or the task matching (refer to FIG. 1), then the sub-systems of the remote server 310 can be activated to assist the pump controller device 360. If the remote server 310 attains results with higher statistical confidence levels, then the results from the remote server 310 can be used rather than the results from the pump controller device 360. That technique may be beneficial because, for example, the task database 314 and the food and activity database 315 at the remote server 310 may have a larger library of data than the task database 364 and the food and activity database 365 at the pump controller device 360. In another example, processing in the sub-systems of both the pump controller device 360 and the remote server 310 can be initiated concurrently, and whichever completes processing first can be used for presentation to the user. Or, when processing in the sub-systems of both the pump controller device 360 and the remote server 310 is initiated concurrently, the results having the highest statistical confidence level can be used for presentation to the user. It should be understood that many other arrangements for coordinating the activities between the pump controller device 360 and the remote server 310, when some or all of the sub-systems are arranged in both the pump controller device 360 and the remote server 310, are envisioned and within the scope of this disclosure.
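By way of a non-limiting illustration, the following sketch shows the local-first fallback strategy described above (the engine interfaces are hypothetical stand-ins; networking and recognition details are elided):

```python
# Try on-device recognition first; consult the remote server when the
# local result is not confident enough, then keep the better result.

def recognize_with_fallback(audio, local_engine, remote_engine,
                            threshold=0.80):
    local_text, local_conf = local_engine(audio)
    if local_conf >= threshold:
        return local_text
    remote_text, remote_conf = remote_engine(audio)
    return remote_text if remote_conf > local_conf else local_text

# Example with stand-in engines:
print(recognize_with_fallback(
    b"...",  # encoded audio
    local_engine=lambda a: ("stop the pump", 0.55),
    remote_engine=lambda a: ("stop the pump", 0.97),
))
```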


The speech recognition modules 363 and 313 are in electrical communication with the control module 361. Optionally, the speech recognition modules 363 and 313 can facilitate the operations of an ASR (“automated speech recognition”) system. Using the ASR system, a digitized audio signal of a user voice input can be transcribed to one or more candidate text transcriptions that are correlated to the audio signal. In some embodiments, statistical confidence values for the candidate transcriptions are generated. In particular embodiments, the transcription with the highest confidence value may be selected as the designated transcription by the ASR system. Other ASR techniques may also be used to create the transcription(s) and to select which candidate transcription to use as the designated transcription.
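
Purely for illustration, selecting the designated transcription from scored candidates could look like the following minimal sketch; the candidate phrases and confidence values are invented for the example:

```python
# Candidate text transcriptions paired with statistical confidence values,
# as an ASR system might produce them for a single voice input.
candidates = [
    ("deliver a bolus of four point nine units", 0.91),
    ("deliver a bonus of four point nine units", 0.42),
    ("deliver a bolus of forty-nine units", 0.17),
]

# The transcription with the highest confidence value becomes the designated one.
designated_transcription, confidence = max(candidates, key=lambda c: c[1])
print(designated_transcription, confidence)
```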


The task databases 364 and 314 are in electrical communication with the control module 361. The task databases 364 and 314 are data repositories containing textual tasks and code that relate to the operation of the infusion pump system 300. The textual tasks contained in the task databases 364 and 314 can be compared to the textual transcriptions provided from the ASR system in operation in the speech recognition modules 363 and 313. Accordingly, candidate tasks can be identified as matches with voice inputs provided by a user of the infusion pump system 300. In some embodiments, when no matching task is determined that surpasses a statistical confidence threshold value, the infusion pump system 300 may prompt the user for clarification of the voice input.
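
A minimal sketch of such transcription-to-task matching, assuming a fuzzy string comparison over a toy task repository (the phrases, task identifiers, and 0.6 threshold are all hypothetical):

```python
import difflib

# A toy task repository; the task databases 364 and 314 would be far larger.
TASKS = {
    "deliver a bolus": "TASK_BOLUS",
    "report remaining reservoir volume": "TASK_RESERVOIR_STATUS",
    "suspend basal delivery": "TASK_SUSPEND_BASAL",
}

def match_task(transcription, threshold=0.6):
    """Return the best-matching task, or None to signal that the system
    should prompt the user for clarification of the voice input."""
    scored = [
        (difflib.SequenceMatcher(None, transcription.lower(), phrase).ratio(), task)
        for phrase, task in TASKS.items()
    ]
    best_score, best_task = max(scored)
    return best_task if best_score >= threshold else None
```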


The food and activity databases 365 and 315 are in electrical communication with the control module 361. The food and activity databases 365 and 315 are data repositories containing data and other types of information that can be used to pre-process a task in preparation for presentation to the user and in preparation for implementation of the task. For example, in the infusion pump system 10 of FIG. 1, the database 90 contained the nutritional information for the food items (a bagel and orange juice) that the user 15 identified in the statement 16. That nutritional information was used to populate the task that was presented to the user 15 and communicated to the portable pump 60 for execution.


Still referring to FIG. 3, optionally, the pump controller device 360 may also serve as the pump unit for the infusion pump system 300, thereby dispensing medicine from the same housing that contains the control module 361 and other components. In those particular embodiments, the pump controller device 360 can be optionally equipped with an internally housed medicine reservoir and drive system 368 in hardwired electrical communication with the control module 361. Such embodiments of the portable infusion pump system 300 can employ a reusable pump apparatus. Therefore, in those embodiments, the infusion pump system 300 may optionally serve as a reusable device that houses the control module 361 and the integral reservoir and pump drive system 368 within a single housing construct. In those circumstances, the pump controller device 360 can be adapted to slidably receive a medicine cartridge in the form of a carpule that is preloaded with insulin or another medicine, or alternatively can be adapted to have a refillable internal reservoir. The pump drive system 368 can act upon the fluid cartridge to controllably dispense medicine through an infusion set (refer, for example, to infusion set 70 in FIG. 1) and into the user's tissue or vasculature. In this embodiment, the user can wear the pump controller device 360 on the user's skin under clothing or in the user's pocket while receiving the medicine dispensed through the infusion set.


Still referring to FIG. 3, as an alternative to the internally housed medicine reservoir and drive system 368, the infusion pump system 300 can include a separate pump device 370 (including a reservoir and a drive system) that is in communication (wireless communication or a releasable electrical connection) with the pump controller device 360. In these embodiments, the separate pump device 370 can be configured as a disposable and non-reusable pump component while the controller device 360 is configured to be reused with a series of the pump devices 370. In the depicted embodiment shown in FIG. 3, wireless communications are used between the separate pump device 370 and the pump controller device 360, using the wireless communication module 367 in the pump controller device 360. The wireless communications of the wireless communication module 367 can utilize any of a variety of wireless communication technologies. For example, the wireless communication module 367 can employ NFC (near field communication), Bluetooth, RF (radio frequency), infrared, ultrasonic, electromagnetic induction, and the like, and combinations thereof. Alternatively, a releasable electrical connection can be used between the separate pump device 370 and the pump controller device 360 so as to provide hardwired electrical communication between the control module 361 of the controller device 360 and the drive system of the pump device 370. In such embodiments, the separate pump device 370 can be removably attachable with the controller device 360 so that the two housings are mechanically mounted together during dispensation of the medicine from the separate pump device 370.


In brief, in embodiments of the infusion pump system 300 that include the separate pump device 370, the pump controller device 360 may be configured as a reusable component that provides electronics and a user interface to control the operation of the infusion pump system 300, and the separate pump device 370 can be a disposable component that is discarded after a single use. For example, the separate pump device 370 can be a “one time use” component that is thrown away after the fluid cartridge therein is exhausted. Thereafter, the user can wirelessly connect or removably mount a new separate pump device 370 to the reusable pump controller device 360 for the dispensation of a new supply of medicine from the new pump device 370. Accordingly, the user is permitted to reuse the pump controller device 360 (which may include complex or valuable electronics) while disposing of the relatively low-cost separate pump device 370 after each use. Such an infusion pump system 300 can provide enhanced user safety as a new separate pump device 370 is employed with each new fluid cartridge.


Still referring to FIG. 3, the pump controller device 360 can also optionally include an internal blood strip reader 366 mounted therein and in electrical communication with the control module 361. In such embodiments of the pump controller device 360, test strips (e.g., blood test strips) containing a sample of the user's blood can be inserted into the blood strip reader 366 portion of the pump controller device 360 to be tested for characteristics of the user's blood. The results of the analysis can be used to affect the dosage or schedule of medicine dispensations from the pump controller device 360 to the user, as determined by the control module 361. As an alternative to or in addition to the internal blood strip reader 366 housed in the pump controller device 360, the pump controller device 360 can be configured to communicate with an external blood glucose detection device 380, such as a continuous glucose monitor or a handheld blood glucose meter. For example, the test strips (e.g., glucose test strips) containing a sample of the user's blood can be inserted into the external handheld blood glucose meter 380, which then analyzes the characteristics of the user's blood and communicates the information (via a wired or wireless connection) to the pump controller device 360. In other embodiments, the user interface 362 of the pump controller device 360 can be employed by the user to manually enter the user's blood glucose information as reported on a screen of a handheld blood glucose meter 380. In still other embodiments, the infusion pump system 300 can include a continuous glucose monitor 380 (as an alternative to or in addition to the internally housed blood strip reader 366) that can continuously monitor characteristics of the user's blood and communicate the information (via a wired or wireless connection) to the pump controller device 360.


As shown in FIG. 3, the pump controller device 360 can also optionally include an image recognition module 369. As described in more detail below (e.g., in connection with FIG. 4), the image recognition module 369 can be used as part of an image recognition operation that facilitates efficient communications between the user and the pump controller device 360. The image recognition module can include a digital camera, image storage memory, and one or more programs configured to determine candidate matching images from a user-input image (as described in detail below). In optional embodiments, both the pump controller device 360 and the remote server 310 include the sub-systems for performing image recognition and other operations for facilitating efficient communications between the pump controller device 360 and a user. That is, the pump controller device 360 includes the image recognition module 369, and in addition the remote server 310 includes the image recognition module 319.


Referring now to FIG. 4, some embodiments of an infusion pump system 400 can include image recognition equipment for purposes of receiving and responding to a user's digital image input. The infusion pump system 400 may include, among other elements, a control device 450 and the pump device 60 that receives communications from the control device 450. Similar to the embodiment previously described in connection with FIG. 1, the control device 450 wirelessly communicates with the pump device 60, but the system 400 can be implemented using a control device that is removably attached to a corresponding pump device (e.g., for hard-wired electrical communication) or using a control device that is housed together with the pump device (e.g., in a single portable construct). Optionally, the controller device 450 can be implemented as the same controller device 50 previously described in connection with FIG. 1.


In this example, the infusion pump system 400 can be configured to perform a series of steps A′ through E′, which are illustrated in FIG. 4 and describe operations of an example infusion pump system 400 equipped with image recognition equipment. By incorporating image recognition capabilities within the infusion pump system 400, user communications with a portable pump 60 can be enhanced and simplified. As a result, the accuracy and completeness of the data entered by the user into the portable pump 60 can be improved, and the user can experience greater convenience and time efficiency. In some embodiments of the infusion pump system 400, speech recognition capabilities (e.g., as described in reference to FIG. 1) can be included along with the image recognition capabilities.


As previously described, the infusion pump system 400 can include the remote control device 450 in electrical communication with the portable pump 60, which is used to supply insulin or another medication to a user via an infusion set 70 attached to and penetrating the user's skin 20. In some embodiments, the portable pump 60 includes the user interface 62, which comprises input devices such as buttons 63a, 63b, 64a, 64b, and 64c and output devices such as the display 65. In addition, in this embodiment the user can communicate with the infusion pump system 400 by providing image input, such as the example digital image 440 of a bagel 444 and a serving of orange juice 442. In particular embodiments, the portable pump 60 may also include the wireless communications circuit 40 that facilitates short-range wireless communications 45 between the internal control circuitry of the portable pump 60 and the external remote control device 450. As with the previously described system 10 of FIG. 1, the infusion pump system 400 is configured to controllably dispense a medicine to be infused into the tissue or vasculature of a targeted individual, such as a human or animal patient.


Still referring to FIG. 4, in this example at step A′, the remote control device 450 is used to take a photographic image 440 (e.g., a digital photo) of a bagel 444 and a serving of orange juice 442 that the user is going to consume. As will be described further, the infusion pump system 400 will receive and process the image 440 and recommend a bolus dispensation of insulin to compensate for the bagel 444 and orange juice 442 to be consumed by the user. In some embodiments, a reference object of known size is optionally included in the photographic image 440 to assist with estimating the quantity of food items in the image 440. Examples of such reference objects include the user's hand or finger, a business card, a coin, an insulin pump, and the like.
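
One way such a reference object could be used is to derive the image scale from the object's known dimension; the following sketch assumes a roughly top-down photo and a flat reference object, and every name and number in it is illustrative rather than taken from this disclosure:

```python
def estimate_food_area_cm2(food_pixel_area, ref_pixel_width, ref_width_cm):
    """Convert a segmented food region's pixel area into an approximate
    real-world area, which can then inform a portion-size estimate."""
    pixels_per_cm = ref_pixel_width / ref_width_cm  # scale from the reference object
    return food_pixel_area / (pixels_per_cm ** 2)

# Example: a card-sized reference 8.5 cm wide that spans 340 pixels in the photo.
print(estimate_food_area_cm2(food_pixel_area=52_000, ref_pixel_width=340, ref_width_cm=8.5))
```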


In this embodiment, the image 440 is received by a digital camera system 459 housed in the remote control device 450. The remote control device 450 includes electronic circuitry for digitizing the image 440 into pixels. The digitized image can be stored (permanently or temporarily) in a computer-readable memory device of the remote control device 450. In other embodiments, the image 440 can be received by a digital camera system 69 housed in the pump device 60, and the image can be stored in a computer-readable memory device of the remote control device 450.


In this embodiment the control device 450 is depicted as a smart phone device, but it should be understood from the description herein that, in other embodiments, the control device 450 can be implemented in the form of devices other than a smart phone device. Some other example devices that can be used similarly to the remote control device 450 can include, but are not limited to, a personal computer, a tablet computing device, a blood glucose meter device (e.g., an external blood strip reader), a continuous glucose meter device, a wearable computing device (e.g., glasses equipped with a camera and computer network connectivity), a PDA, a digital camera, or a custom remote device. In still other embodiments, the control device is not a remote device, but instead is included as part of, or mechanically attached together with, the pump device. For instance, in such embodiments the pump device of the infusion pump system can be equipped with the capabilities to perform the functions described herein in regard to the remote control device 450. Further, in some embodiments certain operations or parts of certain operations may be performed at a remote server system, including a cloud-based server system, rather than completely on a personal computing device such as the remote control device 450. Accordingly, the remote control device 450, or equivalent, can be connected to a network such as the internet or an intranet system. Such a division of tasks may provide better process optimization, computational efficiency, and response time.


Still referring to FIG. 4, in this example at step B′, image recognition is performed in response to receiving the digital file of image 440. For example, in response to the receipt of the image 440, the control device 450 can perform an image recognition function so as to determine that the food items depicted in the image 440 include a bagel and a glass of orange juice. In one implementation, the digital file of image 440 is matched to one or more candidate images (e.g., model images of food items or other items) from an image database 480. The image database 480 can be stored in a computer-readable memory device of the remote control device 450, stored in a computer-readable memory device of the portable pump 60, stored in a computer-readable memory device of a remote server system in communication with the remote control device 450, or a combination thereof. The image recognition process can be performed at the remote control device 450 or portable pump 60, at the remote server, or at both the remote control device 450 or portable pump 60 and the remote server. Performing the image recognition at a remote server may provide better process optimization, computational efficiency, and response time due to the high level of data processing power required for efficient image recognition, although performing the image recognition at a remote server is not a requirement.


In this embodiment, the control device 450 is equipped with an image recognition module (refer, for example, to element 369 in FIG. 3) that is configured to compare the digital file of image 440 with digital files of images that are stored in the image database 480. This process can result in finding candidate matching images. Each image is composed of pixels that are expressed as a series of numbers. One approach to matching the images is to use the image recognition module to search for patterns and sequences in the numerical data that make up the digital files. If the image recognition module can identify similar numerical series in multiple images, it can recognize that the images may all be of the same subject. In some embodiments, a statistical confidence level can be calculated in regard to the candidate matching images. In particular embodiments, the image with the highest confidence value may be selected by the image recognition system as the designated matching image. Other techniques may also be used to select which candidate image to use as the designated matching image. For example, in some embodiments statistical priority can be given to foods that the user has previously utilized the image recognition technique to identify. In some circumstances, no candidate image having a confidence value that surpasses a threshold confidence level is identified. In some such circumstances, the remote control device 450 may request clarification from the user, or may request more information from the user (such as another photograph from a different perspective or using different lighting). Such requests may be presented to the user audibly using voice synthesis at the remote control device 450, or visually by presenting an indication on the display of the remote control device 450, or by a combination of audible and visual indicators.
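
For illustration only, a toy nearest-neighbor matcher over grayscale pixel arrays, including the confidence threshold and the statistical priority for previously identified foods described above (the similarity measure, the 0.70 threshold, and the 0.05 priority boost are assumptions, not details from this disclosure):

```python
import numpy as np

def match_image(query, image_db, previously_confirmed, threshold=0.70):
    """Compare equal-sized grayscale arrays by normalized mean absolute pixel
    difference and return the best match, or None when no match is confident."""
    best_label, best_score = None, 0.0
    for label, model in image_db.items():
        diff = np.mean(np.abs(query.astype(float) - model.astype(float)))
        score = 1.0 - diff / 255.0            # similarity in [0, 1]
        if label in previously_confirmed:     # statistical priority for foods the
            score = min(1.0, score + 0.05)    # user has identified before
        if score > best_score:
            best_label, best_score = label, score
    if best_score < threshold:
        return None, best_score  # no confident match: request clarification
    return best_label, best_score
```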


Still referring to FIG. 4, in this example at step C′, nutritional information of the food in the image 440 is obtained from database 490, and a recommended bolus dispensation is calculated. In some embodiments, the calculation of the recommended bolus dispensation can take into account the user's preferred dispensation method, such as a fast bolus, a timed bolus (with preferred time of delivery), or a combination bolus (including a preferred division between a present and an upcoming timed dispensation, and the preferred duration of the upcoming timed dispensation). The database 490, as with the image database 480, can be stored in one or more computer-readable memory devices at various locations including at the remote control device 450, the portable pump 60, a remote server system including cloud-based servers, or at a combination of such locations. As depicted by this example, the database 490 can contain nutritional information for a variety of food items. The nutritional information can include, but is not limited to, carbohydrates, fat, protein, and the glycemic index for food items. In some embodiments, the database 490 can also include the user's most current blood glucose reading, an insulin-on-board level, an insulin sensitivity factor for the user, and the like. In particular embodiments, some or all of such nutritional information and other data can be considered when the task is being prepared for presentation to the user. For example, in response to the receipt of the image 440, the bagel's nutritional information 492 and the orange juice's nutritional information 494 can be queried from the database 490. In some embodiments, the data stored in database 490 is customizable by the user as described above in regard to database 90 of FIG. 1. The user's custom nutritional information can be given preference in the database 490 over the default nutritional information. In some embodiments, as part of the preparation for presenting the task to the user, the user may first be presented with a request for additional information. For example, the user may be presented with a request to input a current blood glucose level. After the receipt of such additional information, the preparation for presenting the task to the user can be completed.


In some embodiments, step C′ can be performed as follows. The candidate matching images selected from the image database 480 as determined by the image recognition process of step B′ can have metadata associated therewith. The metadata can identify the type of food in the image(s) (e.g., a bagel and a serving of orange juice). Using such metadata, the nutritional information for the food types can be queried from the database 490. The nutritional information obtained from the database 490 can be used in computations—along with other parameters such as the user's most current blood glucose reading, an insulin-on-board level, an insulin sensitivity factor for the user, and the like—to determine a recommended bolus dispensation. The descriptions of the food items identified as matching the image 440, and the recommended associated bolus can then be characterized in preparation for presentation to the user (for subsequent user confirmation).
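
The disclosure does not spell out the bolus arithmetic, but the widely used bolus-calculator formula is consistent with the example's numbers (74 g of carbohydrate at an assumed ratio of 15 g per Unit yields roughly 4.9 Units); a minimal sketch, with all parameter values assumed:

```python
def suggest_bolus(carbs_g, bg_mgdl, target_mgdl, carb_ratio_g_per_unit,
                  isf_mgdl_per_unit, iob_units):
    """Common bolus-calculator arithmetic (illustrative; not asserted to be the
    method of this disclosure): carb bolus + correction - insulin on board."""
    carb_bolus = carbs_g / carb_ratio_g_per_unit
    correction = max(0.0, (bg_mgdl - target_mgdl) / isf_mgdl_per_unit)
    return round(max(0.0, carb_bolus + correction - iob_units), 1)

# 74 g of carbohydrate (48 g bagel + 26 g orange juice), blood glucose at
# target, no insulin on board, and an assumed carb ratio of 15 g/Unit:
print(suggest_bolus(74, 110, 110, 15, 40, 0.0))  # -> 4.9
```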


Still referring to FIG. 4, in this example at step D′, the task is presented to the user for confirmation that the task is what the user desires. The task may be presented to the user audibly using voice synthesis at the remote control device 450, or visually by presenting an indication on the display of the remote control device 450, or by a combination of audible and visual indicators. For example, in response to the image 440, the user is presented with information indicating that the infusion pump system 400 has identified a task related to the user's intent to consume 74 grams of carbohydrates (48 grams from the bagel and 26 grams from the orange juice), and that the infusion pump system 400 recommends a corresponding bolus dispensation of 4.9 Units of insulin. To confirm that task, the user can select “YES” 452 on the remote control device 450. In response to a selection of the “YES” button 452, the control device 450 can communicate with the pump device 60 so as to initiate the dispensation of the bolus dosage (e.g., 4.9 Units in this example), as described below. Or, to deny that task, the user can select “NO” 454 on the remote control device 450. Optionally, in response to a selection of the “NO” button 454, the control device 450 can present the user with an option to manually input or verbally speak a specific number for a bolus dosage that is different from the suggested dosage displayed on the screen at step D′. Alternatively, or in addition to the manual selection of “YES” 452 or “NO” 454, the user may speak “yes” or “no” to the remote control device 450 to confirm or deny the task presented.


In this example at step E′, the remote control device 450 communicates the task to the portable pump 60 for activation of the portable pump 60 in accordance with the task confirmed by the user (e.g., after the user selected the “YES” button 452). In the example, the display 65 of the portable pump 60 indicates that a bolus dispensation of 4.9 Units has been initiated. In this embodiment, communications between the remote control device 450 and the portable pump 60 are conducted by short-range wireless technologies such as, but not limited to, RF, Bluetooth, NFC, IR, and the like. Accordingly, the portable pump 60 can include a wireless communication circuit 40 that sends and receives data in cooperation with the remote control device 450. In alternative embodiments, the communications between the remote control device 450 and the portable pump 60 can be via a hardwired connection therebetween.


In another embodiment, rather than (or in addition to) using photographic image recognition to ascertain nutritional information for food to be consumed, a portable spectroscope scanner system can be used to ascertain nutritional information for food to be consumed. In this technique, a user can scan food items to be consumed using a portable spectroscope scanner. The spectroscope scanner creates a spectrograph of the food items that can be analyzed to determine nutritional information of the food items. Some spectroscope scanner systems may utilize a reference material placed next to the food for calibration, either as part of routine use or occasionally.


In some embodiments, the spectroscope scanner transmits the spectrograph data to another processing device that operates a spectrograph analysis application that can be run to determine the nutritional information of the food that was scanned. Such processing devices can include a cloud-based computer system or a local computing device, such as a smartphone, tablet PC, desktop PC, an infusion pump, and the like. In some embodiments, the spectroscope scanner may be able to determine the nutritional information of the food that was scanned without the assistance of another processing device. In particular embodiments, as part of the analysis of the spectrograph, statistical priority can be given to foods that the user has previously utilized the spectrograph analysis technique to identify. The processing device that analyzes the spectrograph can determine the nutritional information and then transmit the nutritional information to the remote control device 450. The remote control device 450 can display the nutritional information to the user, and display a prompt by which the user can initiate a corresponding bolus dispensation via the portable pump device 60, in a manner analogous to that described above.


Referring now to FIG. 5, some embodiments of an infusion pump system 500 can include natural language processing (“NLP”) capabilities for purposes of receiving and responding to a user's voice input. The infusion pump system 500 may include, among other elements, a control device 550 and the pump device 60 that receives communications from the control device 550. Similar to the embodiment previously described in connection with FIG. 1, the control device 550 wirelessly communicates with the pump device 60, but the system 500 can be implemented using a control device that is removably attached to a corresponding pump device (e.g., for hard-wired electrical communication) or using a control device that is housed together with the pump device (e.g., in a single portable construct). Optionally, the controller device 550 can be implemented as the same controller device 50 previously described in connection with FIG. 1.


In this example, the infusion pump system 500 can be configured to perform a series of steps A″ through G″, which are illustrated in FIG. 5 and describe operations of an example infusion pump system 500 equipped with natural language processing (“NLP”) technology. Using NLP, the infusion pump system 500 is capable of receiving instructions from a user 515 via natural language input. One or more NLP algorithms can be stored in the computer-readable memory device as part of a speech recognition module (refer, for example, to module 363 in FIG. 3), including machine learning algorithms for language processing. By incorporating NLP capabilities within the infusion pump system 500, user communications with a portable pump 60 can be enhanced and simplified. As a result, the accuracy and completeness of the data entered by the user 515 into the portable pump 60 can be improved, and the user 515 can experience greater convenience and time efficiency.


Similar to previously described embodiments, the infusion pump system 500 can include the remote control device 550 in electrical communication with the portable pump 60 that is used to supply insulin or another medication to a user 515 via an infusion set 70 attached to and penetrating the user's skin 20. In particular embodiments, the portable pump 60 may also include the wireless communications circuit 40 that facilitates short-range wireless communications 545 between the internal control circuitry of the portable pump 60 and the external remote control device 550.


As an alternative to, or in conjunction with, pressing one or more buttons 63a, 63b, 64a, 64b, and 64c of the user interface 62 to communicate with the infusion pump system 500, the example infusion pump system 500 can receive natural language voice input from the user 515. The use of NLP technology provides additional functionality that can enhance and simplify the user's 515 interactions with the portable pump 60. For instance, using natural language equipment (which may optionally include a microphone 551 or 61 and a corresponding NLP software program implemented by the system 500), the need for user activation of multiple buttons 63a, 63b, 64a, 64b, and 64c for shuffling through menus may be eliminated or otherwise reduced in some circumstances. In addition, using NLP equipment, the capabilities of the infusion pump system 500 can extend beyond those that are accessible via the user interface 62. In one such example, as depicted in FIG. 5, the user 515 of infusion pump system 500 has ascertained that his or her blood glucose level is above normal at 220 mg/dl. As such, the user is concerned and desires to initiate appropriate measures to bring his or her blood glucose back to a normal level.


Still referring to FIG. 5, in this example at step A″, the user 515 speaks a natural language statement 516 that reflects a question or concern that the user 515 wants the infusion pump system 500 to respond to. In this example, the user 515 speaks the statement 516, “My blood glucose is 220, what do I do?” As will be described further, the infusion pump system 500 will receive and process the statement 516 and recommend a bolus dispensation of insulin to correct the user's 515 high blood glucose level.


In this example, the user 515 has made a statement 516 that identifies the user's 515 blood glucose level, but many other types of statements corresponding to other tasks, questions, or concerns can be similarly initiated using natural language voice input. For instance, in other non-limiting examples such statements can include “I am going for a 45 minute jog,” “tell me about my last bolus,” “how long have I been wearing this infusion set,” “what do I do about the current alarm,” or “how much insulin is left in my reservoir?” It should be recognized from the description herein that the user 515 can provide a wide variety of types of statements to initiate a wide variety of responses by the infusion pump system 500, and that the examples provided here are merely illustrative.


The natural language statement 516 is received by the microphone 551 of the control device 550. The remote control device 550 can include electronic circuitry for converting the statement 516 to an audio signal (e.g., an “audio file,” “waveform,” or “sample”) that corresponds to the statement 516. The audio signal corresponding to the statement 516 can be saved in the memory of the remote control device 550. In other embodiments, the natural language statement can be received by the microphone 61 housed in the pump device 60.


In this embodiment the control device 550 is depicted as a smart phone device, but it should be understood from the description herein that, in other embodiments, the control device 550 can be implemented in the form of devices other than a smart phone device. Some other example devices that can be used similarly to the remote control device 550 can include, but are not limited to, a personal computer, a tablet computing device, a blood glucose meter device (e.g., an external blood strip reader), a continuous glucose meter device, a wearable computing device, a PDA, or a custom remote device. In still other embodiments, the control device is not a remote device, but instead is included as part of, or mechanically attached together with, the pump device. For instance, in such embodiments the pump device of the infusion pump system can be equipped with the capabilities to perform the functions described herein in regard to the remote control device 550. Further, in some embodiments certain NLP operations or parts of certain NLP operations may be performed at a remote server system, including a cloud-based server system, rather than completely on a personal computing device such as the remote control device 550. Accordingly, the remote control device 550, or equivalent, can be connected to a network such as the internet or an intranet system. Such a division of tasks may provide better process optimization, computational efficiency, and response time.


Still referring to FIG. 5, in this example at step B″, speech recognition is performed in response to receiving the voice input. This step is performed as described in step B of FIG. 1. A text transcription of the statement 516 is generated and stored (temporarily or permanently) in the computer-readable memory device of the control device 550, of the pump device 60, of the remote server system, or a combination thereof.


In this example at step C″, the text transcription(s) of the speech recognition process from step B″ is processed using an NLP program executed by the control device 550, the pump device 60, the remote server system, or a combination thereof to determine the likely meaning of the statement 516 and how the infusion pump system 500 should respond. In some cases, in addition to processing the text transcription(s) using NLP, the text transcription(s) is compared to a compilation of tasks or queries in a natural language search engine database 580 to determine the task most likely represented by the statement 516. In some embodiments, the natural language search engine database 580 is stored in the computer-readable memory device of the remote control device 550. However, the natural language search engine database 580 can also be stored in the computer-readable memory device in the portable pump 60, stored in the computer-readable memory device of a remote server system in communication with the remote control device 550 or the portable pump 60, or stored in computer-readable memory devices at a combination of such locations. In this embodiment, the natural language search engine database 580 is a storage repository that is programmed to contain an extensive number of tasks and queries that correspond to a variety of types of user voice input statements, such as statement 516. The transcription(s) of the voice input from step B″ can be compared to the tasks stored in the natural language search engine database 580 to find matching tasks or queries. In some embodiments, a confidence level for the match between the transcription(s) and the task(s) or queries can be determined. The task or query with the highest confidence level can be selected. In particular embodiments, if no task or query has a confidence level that surpasses a threshold level, or if multiple tasks or queries have confidence levels that are within a differentiation threshold level of each other, the user 515 is presented with a request for clarification or more information as described above. In some such cases, the user 515 may be presented with the task or query having the highest confidence level and the user 515 may be asked whether that task is what the user 515 wants the infusion pump system 500 to perform.
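
A minimal sketch of that selection logic, assuming scored matches from the natural language search engine database 580 (the 0.75 confidence threshold and 0.05 differentiation threshold are invented for the example):

```python
def select_task(scored_matches, confidence_threshold=0.75, differentiation=0.05):
    """scored_matches: list of (task_or_query, confidence) pairs. Returns the
    selected task or query, or None when the system should instead present
    the user with a request for clarification or more information."""
    ranked = sorted(scored_matches, key=lambda m: m[1], reverse=True)
    best_task, best_conf = ranked[0]
    if best_conf < confidence_threshold:
        return None  # no candidate surpasses the threshold confidence level
    if len(ranked) > 1 and best_conf - ranked[1][1] < differentiation:
        return None  # top candidates fall within the differentiation threshold
    return best_task
```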


Still referring to FIG. 5, in this example at step D″, a response to the selected task or query from step C″ is characterized in preparation for presentation to the user 515 (e.g., for subsequent user confirmation). Depending on the task or query, additional information may be acquired from a disease management database 590 as a part of the preparation step. The disease management database 590, as with the natural language search engine database 580, can be stored in one or more computer-readable memory devices at various locations including at the remote control device 550, the portable pump 60, a remote server including cloud-based servers, or at a combination of such locations. As depicted by this example, the disease management database 590 can contain types of data that are related to the user's 515 health and metabolic status. The data can include, but is not limited to, blood glucose level, insulin sensitivity, weight, insulin on board (“IOB”), and food on board (“FOB”). In some embodiments, the disease management database 590 can also include the user's 515 most current blood glucose reading, an insulin-on-board level, an insulin sensitivity factor for the user 515, and the like. In particular embodiments, some or all of such information and other data can be considered when the task or query is being prepared for presentation to the user 515. For example, in response to the statement 516, the insulin sensitivity, weight, IOB, and FOB can be queried from the disease management database 590. In some embodiments, the data stored in the disease management database 590 is customizable by the user 515. For example, the user 515 may input a particular insulin sensitivity factor that reflects the user's 515 insulin sensitivity. The user's 515 custom data can be given preference in the disease management database 590 over the default data. In some embodiments, as part of the preparation for presenting the task to the user 515, the user 515 may first be presented with a request for additional information. For example, the user 515 may be presented with a request to input nutritional information of food items consumed in the past few hours. After the receipt of such additional information, the preparation for presenting the task to the user 515 can be completed.


In this example at step E″, the task or query is presented to the user 515 for confirmation that the task or query is what the user 515 desires. The task or query may be presented to the user 515 audibly using voice synthesis at the remote control device 550, or visually by presenting an indication on the display of the remote control device 550, or by a combination of audible and visual indicators. For example, in response to the statement 516, the user 515 is presented with information indicating that the infusion pump system 500 recommends a correction bolus dispensation of 5.5 Units of insulin. To confirm that task, the user 515 can select “YES” 552 on the remote control device 550. In response to a selection of the “YES” button 552, the control device 550 can communicate with the pump device 60 so as to initiate the dispensation of the bolus dosage (e.g., 5.5 Units in this example), as described below. Or, to deny that task, the user 515 can select “NO” 554 on the remote control device 550. Optionally, in response to a selection of the “NO” button 554, the control device 550 can present the user with an option to manually input or verbally speak a specific number for a bolus dosage that is different from the suggested dosage displayed on the screen at step E″. Alternatively, or in addition to the manual selection of “YES” 552 or “NO” 554, the user 515 may speak “yes” or “no” to the remote control device 550 to confirm or deny the task presented. At step F″, the remote control device 550 receives such user confirmation.


At step G″, the remote control device 550 communicates the task to the portable pump 60 for activation of the portable pump 60 in accordance with the task confirmed by the user 515 (e.g., after the user selected the “YES” button 552). In the example, the display 65 of the portable pump 60 indicates that a bolus dispensation of 5.5 Units has been initiated. In this embodiment, communications between the remote control device 550 and the portable pump 60 are conducted by short-range wireless technologies such as, but not limited to, RF, Bluetooth, NFC, IR, and the like. Accordingly, the portable pump 60 can include a wireless communication circuit 40 that sends and receives data in cooperation with the remote control device 550. In alternative embodiments, the communications between the remote control device 550 and the portable pump 60 can be via a hardwired connection therebetween.


A number of embodiments of the invention have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the invention. Accordingly, other embodiments are within the scope of the following claims.

Claims
  • 1. A medical infusion pump system, comprising: a portable housing defining a space to receive a medicine; a pump drive system to dispense the medicine from the portable housing when the medicine is received in the space; control circuitry that communicates control signals to the pump drive system to control dispensation of the medicine from the portable housing when the medicine is received in the space; and one or more computing devices comprising a speech recognition system, the one or more computing devices being in communication with the control circuitry, the one or more computing devices including a microphone and being adapted to communicate with the control circuitry, the one or more computing devices being adapted to receive a voice input from a user comprising an instruction to deliver a bolus and a user estimate of an amount of carbohydrates for one or more food items that have been or will be consumed by the user, the one or more computing devices being adapted to: 1) determine a suggested bolus dosage that corresponds to the one or more food items based on at least the user estimate of the amount of carbohydrates; 2) prompt the user to confirm or deny the suggested bolus dosage through manual interaction with a touchscreen or one or more buttons of the one or more computing devices; and 3) communicate the confirmed suggested bolus dosage to the control circuitry once said bolus dosage is confirmed by the user, wherein the one or more computing devices are further configured to analyze the received voice input from the user to determine at least one textual transcript corresponding to the voice input and identify, using the at least one textual transcript, a numerical value of the amount of carbohydrates for the one or more food items that have been or will be consumed by the user, wherein the numerical value of the amount of carbohydrates for the one or more food items that have been or will be consumed by the user is used to determine the suggested bolus dosage.
  • 2. The medical infusion pump system of claim 1, wherein the one or more computing devices comprises a remote server system, and the control circuitry is configured to communicate with the remote server system to receive user voice inputs received by the speech recognition system.
  • 3. The medical infusion pump system of claim 1, wherein at least a portion of the one or more computing devices is disposed in the portable housing.
  • 4. The medical infusion pump system of claim 1, wherein the one or more computing devices are further configured to display the suggested bolus dosage and at least one of (i) a food item indicated by the voice input, and (ii) a numerical value of the user estimate of the amount of carbohydrates, wherein the suggested bolus dosage and the at least one of (i) the food item indicated by the voice input, and (ii) the numerical value of the user estimate of the amount of carbohydrates are displayed simultaneously as part of a single user interface display.
  • 5. The medical infusion pump system of claim 1, wherein information about a recent blood glucose level, insulin sensitivity, insulin-on-board, and food-on-board for the user are used to determine the suggested bolus dosage in addition to the user estimate of the amount of carbohydrates.
  • 6. The medical infusion pump system of claim 1, wherein the one or more computing devices are further configured to, in response to the user denying the suggested bolus dosage, present the user with an option to manually input or verbally speak a specific number of units for a bolus dosage.
  • 7. The medical infusion pump system of claim 1, wherein the one or more computing devices are further configured to present the user with an option to manually input or verbally speak a specific number of units for a bolus dosage.
  • 8. The medical infusion pump system of claim 1, wherein the voice input comprises a specific name or type of food that the user will consume.
  • 9. The medical infusion pump system of claim 1, wherein the one or more computing devices comprises a smartphone.
  • 10. The medical infusion pump system of claim 9, wherein the smartphone is configured to permit the user to adjust the suggested bolus dosage.
  • 11. The medical infusion pump system of claim 9, wherein the smartphone stores information about a recent blood glucose level, an insulin sensitivity, weight, an insulin-on-board, and a food-on-board for the user and uses this information to determine the suggested bolus dosage.
  • 12. The medical infusion pump system of claim 1, wherein the one or more computing devices are further configured to: receive user input comprising a digital image that is indicative of the one or more food items consumed or to be consumed by the user of the portable infusion pump system; and determine another suggested bolus dosage based on the digital image.
  • 13. The medical infusion pump system of claim 1, wherein the one or more computing devices are further configured to: receive user input comprising a digital image of the one or more food items consumed or to be consumed by the user of the portable infusion pump system; identify the one or more food items depicted in the digital image; and determine another suggested bolus dosage based on the identified one or more food items depicted in the digital image.
  • 14. A method of controlling a portable infusion pump system, comprising: receiving a user's voice input that is indicative of a task associated with using a portable infusion pump system, wherein the voice input comprises a user estimate of an amount of carbohydrates that has been or will be consumed by the user and an instruction to deliver a bolus; determining a suggested bolus dosage using at least the user estimate of the amount of carbohydrates; in response to receiving the voice input, displaying, on a display screen, the suggested bolus dosage; prompting a user to confirm or deny the suggested bolus dosage through manual interaction with a touchscreen or one or more buttons of a user interface, wherein if the user denies the suggested bolus dosage, the user is presented with an option to manually input or verbally speak a specific number of units for a bolus dosage; communicating said bolus dosage to control circuitry in the portable infusion pump system in response to confirmation of the suggested bolus dosage by the user; analyzing the received voice input to determine at least one textual transcript corresponding to the voice input; and identifying, using the at least one textual transcript, a numerical value of the amount of carbohydrates that have been or will be consumed by the user, wherein the numerical value of the amount of carbohydrates that have been or will be consumed by the user is used to determine the suggested bolus dosage.
  • 15. The method of claim 14, wherein the display screen comprises the touchscreen.
  • 16. The method of claim 14, further comprising: receiving user input comprising a digital image that is indicative of a food item consumed or to be consumed by the user of the portable infusion pump system; and controlling the portable infusion pump system to change an operation of the portable infusion pump system based upon the user input comprising the digital image.
  • 17. The method of claim 16, further comprising prompting the user via the user interface to confirm the operation change of the portable infusion pump system in response to receiving the user input comprising the digital image.
  • 18. The method of claim 17, wherein the operation change comprises calculating or initiating a bolus dispensation of a medicine from the portable infusion pump system.
  • 19. The method of claim 14, further comprising displaying, on the display screen and along with the suggested bolus dosage, at least one of (i) a food item indicated by the voice input, and (ii) a numerical value of the user estimate of the amount of carbohydrates, wherein the suggested bolus dosage and the at least one of (i) the food item indicated by the voice input, and (ii) the numerical value of the user estimate of the amount of carbohydrates are displayed simultaneously as part of a single user interface display.
  • 20. The method of claim 14, wherein information about a recent blood glucose level, insulin sensitivity, insulin-on-board, and food-on-board for the user are used to determine the suggested bolus dosage in addition to the user estimate of the amount of carbohydrates.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 14/094,185, filed Dec. 2, 2013, now U.S. Pat. No. 10,569,015, issued Feb. 25, 2020, the entire contents of which are hereby incorporated by reference.

8679060 Mernoe et al. Mar 2014 B2
8690820 Cinar et al. Apr 2014 B2
8694115 Goetz et al. Apr 2014 B2
8706691 McDaniel et al. Apr 2014 B2
8718949 Blomquist et al. May 2014 B2
8721585 Brauker et al. May 2014 B2
8727982 Jennewine May 2014 B2
8734422 Hayter May 2014 B2
8734428 Blomquist May 2014 B2
8747315 Brauker et al. Jun 2014 B2
8762070 Doyle et al. Jun 2014 B2
8771222 Kanderian et al. Jul 2014 B2
8777896 Starkweather et al. Jul 2014 B2
8777924 Kanderian et al. Jul 2014 B2
8784364 Kamen et al. Jul 2014 B2
8784369 Starkweather et al. Jul 2014 B2
8784370 Lebel et al. Jul 2014 B2
8795224 Starkweather et al. Aug 2014 B2
8795252 Hayter Aug 2014 B2
8808230 Rotstein Aug 2014 B2
8876755 Taub et al. Nov 2014 B2
8882741 Brauker et al. Nov 2014 B2
8903501 Perryman Dec 2014 B2
8919180 Gottlieb et al. Dec 2014 B2
8920401 Brauker et al. Dec 2014 B2
8926585 Brauker et al. Jan 2015 B2
8945094 Nordh Feb 2015 B2
8956291 Valk et al. Feb 2015 B2
8956321 DeJournett Feb 2015 B2
8977504 Hovorka Mar 2015 B2
8992475 Mann et al. Mar 2015 B2
9034323 Frost et al. May 2015 B2
9050413 Brauker et al. Jun 2015 B2
9056165 Steil et al. Jun 2015 B2
9056168 Kircher et al. Jun 2015 B2
9078963 Estes Jul 2015 B2
9089305 Hovorka Jul 2015 B2
9149233 Kamath et al. Oct 2015 B2
9155843 Brauker et al. Oct 2015 B2
9247901 Kamath et al. Feb 2016 B2
9314566 Wenger et al. Apr 2016 B2
9320471 Hayes et al. Apr 2016 B2
9333298 Kim et al. May 2016 B2
9415157 Mann et al. Aug 2016 B2
9474855 McCann et al. Oct 2016 B2
9480796 Starkweather et al. Nov 2016 B2
9486172 Cobelli et al. Nov 2016 B2
9486578 Finan et al. Nov 2016 B2
9561324 Estes Feb 2017 B2
9878097 Estes Jan 2018 B2
9968729 Estes May 2018 B2
10449294 Estes Oct 2019 B1
10569015 Estes Feb 2020 B2
20010003542 Kita Jun 2001 A1
20010041869 Causey, III et al. Nov 2001 A1
20010056262 Cabiri Dec 2001 A1
20020002326 Causey, III et al. Jan 2002 A1
20020004651 Ljunggreen et al. Jan 2002 A1
20020007154 Hansen et al. Jan 2002 A1
20020013784 Swanson Jan 2002 A1
20020016534 Trepagnier et al. Feb 2002 A1
20020016568 Lebel Feb 2002 A1
20020032402 Daoud et al. Mar 2002 A1
20020040208 Flaherty et al. Apr 2002 A1
20020046315 Miller et al. Apr 2002 A1
20020055845 Ueda et al. May 2002 A1
20020072720 Hague et al. Jun 2002 A1
20020091358 Klitmose Jul 2002 A1
20020107476 Mann Aug 2002 A1
20020126036 Flaherty et al. Sep 2002 A1
20020156462 Stultz Oct 2002 A1
20020164973 Janik et al. Nov 2002 A1
20030028089 Galley Feb 2003 A1
20030055380 Flaherty Mar 2003 A1
20030065308 Lebel et al. Apr 2003 A1
20030088238 Poulsen May 2003 A1
20030104982 Wittmann et al. Jun 2003 A1
20030121055 Kaminski et al. Jun 2003 A1
20030125672 Adair et al. Jul 2003 A1
20030161744 Vilks et al. Aug 2003 A1
20030198558 Nason et al. Oct 2003 A1
20030199825 Flaherty Oct 2003 A1
20030208113 Mault et al. Nov 2003 A1
20030216683 Shekalim Nov 2003 A1
20030216686 Lynch et al. Nov 2003 A1
20040006316 Patton Jan 2004 A1
20040010207 Flaherty et al. Jan 2004 A1
20040019325 Shekalim Jan 2004 A1
20040064088 Gorman et al. Apr 2004 A1
20040064096 Flaherty et al. Apr 2004 A1
20040068230 Estes et al. Apr 2004 A1
20040078028 Flaherty et al. Apr 2004 A1
20040087894 Flaherty May 2004 A1
20040092865 Flaherty et al. May 2004 A1
20040092878 Flaherty May 2004 A1
20040115068 Hansen et al. Jun 2004 A1
20040116866 Gorman et al. Jun 2004 A1
20040127844 Flaherty Jul 2004 A1
20040153032 Garribotto et al. Aug 2004 A1
20040158207 Hunn et al. Aug 2004 A1
20040167464 Ireland et al. Aug 2004 A1
20040171983 Sparks et al. Sep 2004 A1
20040176720 Kipfer Sep 2004 A1
20040176727 Shekalim Sep 2004 A1
20040187952 Jones Sep 2004 A1
20040204673 Flaherty Oct 2004 A1
20040204744 Penner et al. Oct 2004 A1
20040220551 Flaherty et al. Nov 2004 A1
20040235446 Flaherty et al. Nov 2004 A1
20040260233 Garibotto et al. Dec 2004 A1
20050010165 Hickle Jan 2005 A1
20050021005 Flaherty et al. Jan 2005 A1
20050021104 DiLorenzo Jan 2005 A1
20050022274 Campbell et al. Jan 2005 A1
20050033223 Herrera Feb 2005 A1
20050038332 Saidara et al. Feb 2005 A1
20050049179 Davidson et al. Mar 2005 A1
20050065760 Murtfeldt et al. Mar 2005 A1
20050090808 Malave et al. Apr 2005 A1
20050090851 Devlin Apr 2005 A1
20050095063 Fathallah May 2005 A1
20050101933 Marrs et al. May 2005 A1
20050107743 Fangrow, Jr. May 2005 A1
20050113745 Stultz May 2005 A1
20050124866 Elaz et al. Jun 2005 A1
20050137530 Campbell et al. Jun 2005 A1
20050160858 Mernoe Jul 2005 A1
20050171512 Flaherty Aug 2005 A1
20050182366 Vogt et al. Aug 2005 A1
20050192561 Mernoe Sep 2005 A1
20050203461 Flaherty et al. Sep 2005 A1
20050215982 Malave et al. Sep 2005 A1
20050222645 Malave et al. Oct 2005 A1
20050234404 Vilks et al. Oct 2005 A1
20050238507 DiIanni et al. Oct 2005 A1
20050240544 Kil et al. Oct 2005 A1
20050245878 Mernoe et al. Nov 2005 A1
20050251097 Mernoe Nov 2005 A1
20050267402 Stewart et al. Dec 2005 A1
20050273059 Mernoe et al. Dec 2005 A1
20050277890 Stewart et al. Dec 2005 A1
20060036214 Mogensen et al. Feb 2006 A1
20060041229 Garibotto et al. Feb 2006 A1
20060042633 Bishop et al. Mar 2006 A1
20060069351 Safabash et al. Mar 2006 A9
20060069382 Pedersen Mar 2006 A1
20060074381 Malave et al. Apr 2006 A1
20060075269 Liong et al. Apr 2006 A1
20060095014 Ethelfeld May 2006 A1
20060125654 Liao et al. Jun 2006 A1
20060135913 Ethelfeld Jun 2006 A1
20060142698 Ethelfeld Jun 2006 A1
20060151545 Imhof et al. Jul 2006 A1
20060173406 Hayes Aug 2006 A1
20060173410 Moberg et al. Aug 2006 A1
20060178633 Garibotto et al. Aug 2006 A1
20060184104 Cheney et al. Aug 2006 A1
20060184119 Remde et al. Aug 2006 A1
20060200073 Radmer et al. Sep 2006 A1
20060206054 Shekalim Sep 2006 A1
20060214511 Dayan Sep 2006 A1
20060247574 Maule et al. Nov 2006 A1
20060247581 Pedersen et al. Nov 2006 A1
20060253086 Moberg et al. Nov 2006 A1
20060258973 Volt Nov 2006 A1
20060258976 Shturman et al. Nov 2006 A1
20060264835 Nielsen et al. Nov 2006 A1
20060264890 Moberg et al. Nov 2006 A1
20060264894 Moberg et al. Nov 2006 A1
20070016127 Staib et al. Jan 2007 A1
20070060870 Tolle et al. Mar 2007 A1
20070073228 Mernoe et al. Mar 2007 A1
20070073235 Estes Mar 2007 A1
20070073236 Mernoe et al. Mar 2007 A1
20070078818 Zivitz et al. Apr 2007 A1
20070079836 Reghabi et al. Apr 2007 A1
20070088271 Richards Apr 2007 A1
20070093750 Jan et al. Apr 2007 A1
20070093786 Goldsmith et al. Apr 2007 A1
20070100222 Mastrototaro et al. May 2007 A1
20070106218 Yodfat et al. May 2007 A1
20070112298 Mueller et al. May 2007 A1
20070118364 Wise May 2007 A1
20070118405 Campbell et al. May 2007 A1
20070123819 Mernoe et al. May 2007 A1
20070124002 Estes et al. May 2007 A1
20070142776 Kovelman et al. Jun 2007 A9
20070155307 Ng et al. Jul 2007 A1
20070156092 Estes et al. Jul 2007 A1
20070156094 Safabash et al. Jul 2007 A1
20070166170 Nason et al. Jul 2007 A1
20070167905 Estes et al. Jul 2007 A1
20070167912 Causey et al. Jul 2007 A1
20070169607 Keller et al. Jul 2007 A1
20070173761 Kanderian et al. Jul 2007 A1
20070179444 Causey et al. Aug 2007 A1
20070191702 Yodfat et al. Aug 2007 A1
20070219432 Thompson Sep 2007 A1
20070219480 Kamen et al. Sep 2007 A1
20070233521 Wehba et al. Oct 2007 A1
20070239116 Follman et al. Oct 2007 A1
20070248238 Abreu Oct 2007 A1
20070252774 Qi et al. Nov 2007 A1
20070255250 Moberg et al. Nov 2007 A1
20070282299 Hellwig Dec 2007 A1
20070287931 DiLorenzo Dec 2007 A1
20080009824 Moberg et al. Jan 2008 A1
20080015422 Wessel Jan 2008 A1
20080027574 Thomas Jan 2008 A1
20080031481 Warren et al. Feb 2008 A1
20080045891 Maule et al. Feb 2008 A1
20080051697 Mounce et al. Feb 2008 A1
20080051698 Mounce et al. Feb 2008 A1
20080051714 Moberg et al. Feb 2008 A1
20080051716 Stutz Feb 2008 A1
20080051730 Bikovsky Feb 2008 A1
20080051738 Griffin Feb 2008 A1
20080077081 Mounce et al. Mar 2008 A1
20080097326 Moberg et al. Apr 2008 A1
20080097375 Bikovsky Apr 2008 A1
20080097381 Moberg et al. Apr 2008 A1
20080103022 Dvorak et al. May 2008 A1
20080109050 John May 2008 A1
20080125700 Moberg et al. May 2008 A1
20080125701 Moberg et al. May 2008 A1
20080129535 Thompson et al. Jun 2008 A1
20080172027 Blomquist Jul 2008 A1
20080177149 Weinert Jul 2008 A1
20080183060 Steil et al. Jul 2008 A1
20080188796 Steil et al. Aug 2008 A1
20080198012 Kamen Aug 2008 A1
20080201325 Doniger et al. Aug 2008 A1
20080208627 Skyggebjerg Aug 2008 A1
20080215035 Yodfat et al. Sep 2008 A1
20080234630 Iddan et al. Sep 2008 A1
20080255516 Yodfat et al. Oct 2008 A1
20080269683 Bikovsky Oct 2008 A1
20080269687 Chong et al. Oct 2008 A1
20080269714 Mastrototaro et al. Oct 2008 A1
20080269723 Mastrototaro et al. Oct 2008 A1
20080294094 Mhatre et al. Nov 2008 A1
20080294109 Estes et al. Nov 2008 A1
20080294142 Patel et al. Nov 2008 A1
20080300572 Rankers et al. Dec 2008 A1
20080306434 Dobbles et al. Dec 2008 A1
20080306444 Brister et al. Dec 2008 A1
20080312512 Brukalo et al. Dec 2008 A1
20080312634 Helmerson et al. Dec 2008 A1
20080319381 Yodfat et al. Dec 2008 A1
20080319383 Byland et al. Dec 2008 A1
20080319384 Yodfat et al. Dec 2008 A1
20080319394 Yodfat et al. Dec 2008 A1
20080319414 Yodfat et al. Dec 2008 A1
20080319416 Yodfat et al. Dec 2008 A1
20090036760 Hayter Feb 2009 A1
20090036870 Mounce et al. Feb 2009 A1
20090043291 Thompson Feb 2009 A1
20090048584 Thompson Feb 2009 A1
20090069745 Estes et al. Mar 2009 A1
20090069746 Miller et al. Mar 2009 A1
20090069749 Miller et al. Mar 2009 A1
20090069784 Estes et al. Mar 2009 A1
20090069785 Miller et al. Mar 2009 A1
20090069787 Estes et al. Mar 2009 A1
20090076453 Mejlhede et al. Mar 2009 A1
20090076849 Diller Mar 2009 A1
20090082728 Bikovsky Mar 2009 A1
20090093756 Minaie et al. Apr 2009 A1
20090099507 Koops Apr 2009 A1
20090105636 Hayter et al. Apr 2009 A1
20090112333 Sahai Apr 2009 A1
20090118664 Estes et al. May 2009 A1
20090143916 Boll et al. Jun 2009 A1
20090164190 Hayter Jun 2009 A1
20090177142 Blomquist et al. Jul 2009 A1
20090177154 Blomquist Jul 2009 A1
20090192722 Shariati et al. Jul 2009 A1
20090198191 Chong et al. Aug 2009 A1
20090198215 Chong et al. Aug 2009 A1
20090221890 Saffer et al. Sep 2009 A1
20100010330 Rankers et al. Jan 2010 A1
20100049164 Estes Feb 2010 A1
20100057040 Hayter Mar 2010 A1
20100057041 Hayter Mar 2010 A1
20100094078 Weston Apr 2010 A1
20100121167 McGarraugh May 2010 A1
20100165795 Elder et al. Jul 2010 A1
20100168538 Keenan et al. Jul 2010 A1
20100168820 Maniak et al. Jul 2010 A1
20100174229 Hsu Jul 2010 A1
20100174266 Estes Jul 2010 A1
20100179409 Kamath et al. Jul 2010 A1
20100211005 Edwards et al. Aug 2010 A1
20100249530 Rankers Sep 2010 A1
20100273738 Valcke et al. Oct 2010 A1
20100286601 Yodfat et al. Nov 2010 A1
20100286653 Kubel et al. Nov 2010 A1
20100298685 Hayter Nov 2010 A1
20100298765 Budiman et al. Nov 2010 A1
20100324382 Cantwell et al. Dec 2010 A1
20100324977 Dragt Dec 2010 A1
20100325864 Briones et al. Dec 2010 A1
20110009813 Rankers Jan 2011 A1
20110015511 Bousamra et al. Jan 2011 A1
20110040247 Mandro et al. Feb 2011 A1
20110071464 Palerm Mar 2011 A1
20110098637 Hill Apr 2011 A1
20110098674 Vicente et al. Apr 2011 A1
20110105955 Yudovsky et al. May 2011 A1
20110106050 Yodfat et al. May 2011 A1
20110118699 Yodfat et al. May 2011 A1
20110124996 Reinke May 2011 A1
20110130716 Estes et al. Jun 2011 A1
20110163880 Halff et al. Jul 2011 A1
20110199194 Waldock et al. Aug 2011 A1
20110224523 Budiman Sep 2011 A1
20110313390 Roy et al. Dec 2011 A1
20110319813 Kamen et al. Dec 2011 A1
20120016304 Patel et al. Jan 2012 A1
20120029468 Diperna et al. Feb 2012 A1
20120046606 Arefieg Feb 2012 A1
20120065894 Tubb et al. Mar 2012 A1
20120078067 Kovatchev et al. Mar 2012 A1
20120123234 Atlas et al. May 2012 A1
20120150556 Galasso et al. Jun 2012 A1
20120172694 Desborough et al. Jul 2012 A1
20120172802 Blomquist Jul 2012 A1
20120185267 Kamen et al. Jul 2012 A1
20120197207 Stefanski Aug 2012 A1
20120203467 Kamath et al. Aug 2012 A1
20120209208 Stefanski Aug 2012 A1
20120227737 Mastrototaro et al. Sep 2012 A1
20120238851 Kamen et al. Sep 2012 A1
20120238853 Arefieg Sep 2012 A1
20120238999 Estes et al. Sep 2012 A1
20120245448 Shariati et al. Sep 2012 A1
20120245556 Kovatchev et al. Sep 2012 A1
20120245855 Kamath et al. Sep 2012 A1
20120246106 Atlas et al. Sep 2012 A1
20120259191 Shariati et al. Oct 2012 A1
20120277723 Skladnev et al. Nov 2012 A1
20120283694 Yodfat et al. Nov 2012 A1
20120289931 Robinson et al. Nov 2012 A1
20120302991 Blomquist et al. Nov 2012 A1
20120323590 Udani Dec 2012 A1
20120330270 Colton Dec 2012 A1
20130046281 Javitt Feb 2013 A1
20130053818 Estes Feb 2013 A1
20130053819 Estes Feb 2013 A1
20130053820 Estes et al. Feb 2013 A1
20130102867 Desborough et al. Apr 2013 A1
20130116649 Breton et al. May 2013 A1
20130138205 Kushwaha et al. May 2013 A1
20130159456 Daoud et al. Jun 2013 A1
20130165041 Bukovjan et al. Jun 2013 A1
20130204186 Moore et al. Aug 2013 A1
20130204202 Trombly et al. Aug 2013 A1
20130218126 Hayter et al. Aug 2013 A1
20130237932 Thueer et al. Sep 2013 A1
20130245563 Mercer et al. Sep 2013 A1
20130245604 Kouyoumjian et al. Sep 2013 A1
20130253418 Kamath et al. Sep 2013 A1
20130275139 Coleman Oct 2013 A1
20130281965 Kamen et al. Oct 2013 A1
20130297334 Galasso et al. Nov 2013 A1
20130338629 Agrawal et al. Dec 2013 A1
20130338630 Agrawal et al. Dec 2013 A1
20130345663 Agrawal et al. Dec 2013 A1
20140005633 Finan Jan 2014 A1
20140025015 Cross et al. Jan 2014 A1
20140031759 Kouyoumjian et al. Jan 2014 A1
20140039383 Dobbles et al. Feb 2014 A1
20140052091 Dobbles et al. Feb 2014 A1
20140052092 Dobbles et al. Feb 2014 A1
20140052093 Dobbles et al. Feb 2014 A1
20140052094 Dobbles et al. Feb 2014 A1
20140052095 Dobbles et al. Feb 2014 A1
20140066884 Keenan et al. Mar 2014 A1
20140066885 Keenan et al. Mar 2014 A1
20140066886 Roy et al. Mar 2014 A1
20140066887 Mastrototaro et al. Mar 2014 A1
20140066888 Parikh et al. Mar 2014 A1
20140066889 Grosman et al. Mar 2014 A1
20140066890 Sloan et al. Mar 2014 A1
20140066892 Keenan et al. Mar 2014 A1
20140088557 Mernoe et al. Mar 2014 A1
20140094766 Estes et al. Apr 2014 A1
20140107607 Estes Apr 2014 A1
20140114278 Dobbles et al. Apr 2014 A1
20140121635 Hayter May 2014 A1
20140128705 Mazlish May 2014 A1
20140128803 Dobbles et al. May 2014 A1
20140180203 Budiman et al. Jun 2014 A1
20140180240 Finan et al. Jun 2014 A1
20140228627 Soffer et al. Aug 2014 A1
20140228668 Wakizaka et al. Aug 2014 A1
20140235981 Hayter Aug 2014 A1
20140249500 Estes Sep 2014 A1
20140276553 Rosinko et al. Sep 2014 A1
20140276554 Finan et al. Sep 2014 A1
20140276555 Morales Sep 2014 A1
20140276583 Chen et al. Sep 2014 A1
20140309615 Mazlish Oct 2014 A1
20140323959 Lebel et al. Oct 2014 A1
20150018633 Kovatchev et al. Jan 2015 A1
20150025471 Enggaard Jan 2015 A1
20150025495 Peyser Jan 2015 A1
20150030641 Anderson et al. Jan 2015 A1
20150045737 Stefanski Feb 2015 A1
20150073337 Saint et al. Mar 2015 A1
20150080789 Estes et al. Mar 2015 A1
20150120323 Galasso et al. Apr 2015 A1
20150136336 Huang May 2015 A1
20150148774 Yao May 2015 A1
20150157794 Roy et al. Jun 2015 A1
20150164414 Matthews Jun 2015 A1
20150165119 Palerm et al. Jun 2015 A1
20150217051 Mastrototaro et al. Aug 2015 A1
20150217052 Keenan et al. Aug 2015 A1
20150265767 Vazquez et al. Sep 2015 A1
20150265768 Vazquez et al. Sep 2015 A1
20150314062 Blomquist et al. Nov 2015 A1
20150320933 Estes Nov 2015 A1
20150328402 Nogueira et al. Nov 2015 A1
20150351683 Brauker et al. Dec 2015 A1
20150352282 Mazlish Dec 2015 A1
20150352283 Galasso Dec 2015 A1
20160000998 Estes Jan 2016 A1
20160030669 Harris et al. Feb 2016 A1
20160038673 Morales Feb 2016 A1
20160082187 Schaible et al. Mar 2016 A1
20160082188 Blomquist et al. Mar 2016 A1
20160158438 Monirabbasi et al. Jun 2016 A1
20160162662 Monirabbasi et al. Jun 2016 A1
20160213841 Geismar et al. Jul 2016 A1
20160256629 Grosman et al. Sep 2016 A1
20170182248 Rosinko Jun 2017 A1
20170189614 Mazlish et al. Jul 2017 A1
20170203036 Mazlish et al. Jul 2017 A1
Foreign Referenced Citations (60)
Number Date Country
2543545 May 2005 CA
19627619 Jan 1998 DE
10236669 Feb 2004 DE
PA 200401893 Dec 2004 DK
0062974 Oct 1982 EP
0275213 Jul 1988 EP
0496141 Jul 1992 EP
0612004 Aug 1994 EP
0580723 Oct 1995 EP
1045146 Oct 2000 EP
1136698 Sep 2001 EP
1177802 Feb 2002 EP
0721358 May 2002 EP
1495775 Jan 2005 EP
1527792 May 2005 EP
1754498 Feb 2007 EP
1818664 Aug 2007 EP
2585252 Jan 1987 FR
0747701 Apr 1956 GB
2218831 Nov 1989 GB
H09-504974 May 1997 JP
2000-513974 Oct 2000 JP
2002-085556 Mar 2002 JP
2002-507459 Mar 2002 JP
2002-523149 Jul 2002 JP
2003-531691 Oct 2003 JP
2010-502361 Jan 2010 JP
2010-524639 Jul 2010 JP
WO 1990015928 Dec 1990 WO
WO 1997021457 Jun 1997 WO
WO 1998004301 Feb 1998 WO
WO 1998011927 Mar 1998 WO
WO 1998057683 Dec 1998 WO
WO 1999021596 May 1999 WO
WO 1999039118 Aug 1999 WO
WO 1999048546 Sep 1999 WO
WO 2001054753 Aug 2001 WO
WO 2001072360 Oct 2001 WO
WO 2001091822 Dec 2001 WO
WO 2001091833 Dec 2001 WO
WO 2002040083 May 2002 WO
WO 2002057627 Jul 2002 WO
WO 2002068015 Sep 2002 WO
WO 2002084336 Oct 2002 WO
WO 2002100469 Dec 2002 WO
WO 2003026726 Apr 2003 WO
WO 2003103763 Dec 2003 WO
WO 2004056412 Jul 2004 WO
WO 2004110526 Dec 2004 WO
WO 2005002652 Jan 2005 WO
WO 2005039673 May 2005 WO
WO 2005072794 Aug 2005 WO
WO 2005072795 Aug 2005 WO
WO 2006067217 Jun 2006 WO
WO 2006097453 Sep 2006 WO
WO 2006105792 Oct 2006 WO
WO 2006105793 Oct 2006 WO
WO 2006105794 Oct 2006 WO
WO 2007141786 Dec 2007 WO
WO 2008134146 Nov 2008 WO
Non-Patent Literature Citations (11)
Entry
Accu-Chek Spirit, “Pump Therapy Made for You,” Roche, 2006, 6 pages.
Asante Solutions Pearl User Manual, Asante Inc., 2012, 180 pages.
Collins and Lee, "Microfluidic flow transducer based on the measurement of electrical admittance," Lab Chip, 2004, 4 pages.
Debiotech News Release, "Debiotech reveals its new miniaturized Disposable Insulin Nanopump™ for Diabetes therapy," available at http://www.debiotech.com/news/nw_159.html, Apr. 24, 2006, 3 pages.
International Search Report and Written Opinion in International Application No. PCT/US2014/67665, dated Apr. 21, 2015, 13 pages.
Medtronic News Release, “Medtronic Receives FDA Approval for World's First Insulin Pump with Real-time Continuous Glucose Monitoring,” Apr. 13, 2006, 3 pages.
OmniPod Insulin Management System—Investor Relations—Press Release, Feb. 1, 2005, http://investors.insulet.com/phoenix.zhtml?c=209336&p=irol-newsArticle&ID=988708&highlight=, 1 page.
OmniPod Quick Start Guide, 2007, 2 pages.
Patent Abstracts of Japan, vol. 1999, No. 04, Apr. 30, 1999, and JP 11 010036, Jan. 19, 1999, Toray Ind. Inc.
The Medtronic Diabetes Connection, 2006, 6 pages.
Xilas Temp Touch, "The latest in high-tech and convenient devices," DOCNews, vol. 2, No. 7, Jul. 1, 2005, http://docnews.diabetesjournals.org/cgi/content/full/2/7/13, 3 pages.
Related Publications (1)
Number Date Country
20200147305 A1 May 2020 US
Continuations (1)
Number Date Country
Parent 14094185 Dec 2013 US
Child 16743332 US