Apparatus and method for monitoring medication adherence

Information

  • Patent Grant
  • Patent Number
    10,506,971
  • Date Filed
    Friday, November 30, 2018
  • Date Issued
    Tuesday, December 17, 2019
Abstract
A system and method for positioning a pill to be ingested by a user in a field of view of an image acquisition camera. The method includes the steps of determining a desired location of the mouth of a user in a field of view of an image acquisition camera, determining a current position of a pill to be ingested by the user in the field of view of the image acquisition apparatus, and indicating on a display a movement to be made by the user to move the pill towards the mouth of the user.
Description
FIELD OF THE INVENTION

This invention relates generally to a method and apparatus for assisting in the monitoring and improvement of medication adherence, and more particularly to an apparatus and method for providing feedback to a user of a gesture recognition system for monitoring medication adherence, the feedback relating to positioning the user, a medication, or other object within an image acquisition area, and for providing positive feedback related to the position of the user, the medication and the like in order to encourage and improve adherence to medication administration protocols and to shape patient behavior in a positive manner. The invention may also act to ensure and aid in the proper acquisition of various gesture sequences and images when applied to an image acquisition system potentially having a narrow field of view or a fixed direction camera. Failure on any of these fronts may result in additional instruction or encouragement from the system, in either an automated or personal manner, and may be provided as input data to an overall population management system.


BACKGROUND OF THE INVENTION

Gesture recognition systems typically include high sensitivity cameras with a wide field of view, multiple cameras, tracking systems, or any combination of the above. Such systems typically require custom hardware to allow for proper image acquisition. When being employed to assist in monitoring medication adherence, it may not be possible or convenient to provide such a specialized device to each user.


Employing gesture recognition as well as object recognition and tracking to monitor medication adherence may result in the monitoring of individuals and medication. These individuals and the medication they are holding or administering may be improperly positioned within a field of view of the camera. While the higher quality gesture recognition systems noted above may be able to capture necessary images in spite of such variability, standard camera systems that may be provided in a standard computer or laptop configuration may not be sufficient for performing gesture recognition. Furthermore, mobile devices, typically including an even lower resolution camera and lower computing power, may have even more difficulty in implementing such a gesture recognition system.


Therefore, it would be desirable to provide an apparatus and system that allow for proper gesture recognition in such a context, even employing lower quality hardware components and processing power, and that overcome other drawbacks of the prior art.


SUMMARY OF THE INVENTION

In accordance with various embodiments of the invention, a gesture recognition assistance method and apparatus for assisting in monitoring medication adherence may be provided. Such an assistance method may include a directional system for ensuring that a user is properly positioned within an image acquisition space. The directional system may preferably include a color and/or shape based indication system to direct a user to a location in an image acquisition space to ensure proper image acquisition. Alternative embodiments of the invention may further include sound to aid in conveying information to a user. The directional system may also be employed to determine and instruct changes in location of a pill or other medication. In accordance with a preferred embodiment of the invention, when the system is implemented on a mobile device or other reduced processing power device including a fixed, relatively low resolution camera, a user attempting to have their image acquired by the device may be shown an image or other representation of themselves, including various indications of incorrect positioning. The user may further be shown various arrows indicating a direction in which to move. Further, color may be employed to give a further indication to the user of proper location.


Therefore, by providing such guidance to the user, not only is the user guided through a sequence of steps to aid in proper medication administration, but, by effectively reducing the variability in user behavior, the processing required to accurately determine whether the user is properly administering such medication is also reduced.


Furthermore, the present invention may also be applicable to additional forms of medication administration, such as injectables, inhalers, topical applications, ocular applications and the like, as well as various other medical maintenance procedures, such as use of cardiac monitors or other monitoring devices or the like. Indeed, the invention may be applicable to any medical procedure in which reducing the variability of patient behavior allows for remote monitoring employing a gesture recognition system.


Still other objects and advantages of the invention will in part be obvious and will in part be apparent from the specification and drawings.


The invention accordingly comprises the several steps and the relation of one or more of such steps with respect to each of the others, and the apparatus embodying features of construction, combinations of elements and arrangement of parts that are adapted to effect such steps, all as exemplified in the following detailed disclosure, and the scope of the invention will be indicated in the claims.





BRIEF DESCRIPTION OF THE DRAWINGS

For a more complete understanding of the invention, reference is made to the following description and accompanying drawings, in which:



FIG. 1 depicts a user being directed to move closer to a display and image capture apparatus in accordance with an embodiment of the invention;



FIG. 2 depicts a user indicated as being situated at a proper distance from the display and image capture apparatus in accordance with an embodiment of the invention;



FIG. 3 depicts a user being directed to move laterally in relation to a display and image capture apparatus in accordance with an embodiment of the invention;



FIG. 4 depicts a position target in accordance with an embodiment of the invention;



FIG. 5 depicts a position indicator in accordance with an embodiment of the invention;



FIG. 6 depicts a position indicator indicating that a desired object has been identified in accordance with an embodiment of the invention;



FIG. 7 depicts a position indicator and target indicating that the position indicator has reached the desired target to provide positive reinforcement to a user in accordance with an embodiment of the invention;



FIG. 8 depicts various positioning locations of a position indicator in relation to the display and image acquisition apparatus to provide an indication of position relative to the camera and mouth of the user in accordance with an embodiment of the invention;



FIG. 9 depicts a position indicator indicating that a desired object has been identified in relation to a mouth position indicator target in accordance with an embodiment of the invention;



FIG. 10 depicts the position indicator of FIG. 9 in a different position relative to the mouth position indicator target of FIG. 9 in accordance with an embodiment of the invention;



FIG. 11 depicts the position indicator of FIG. 9 and the mouth position indicator of FIG. 9 indicating that the position indicator has reached the desired mouth position indicator target position in accordance with an embodiment of the invention; and



FIG. 12 depicts a display and image capture apparatus that may be employed in accordance with one or more embodiments of the present invention.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

The invention will now be described making reference to the following drawings, in which like reference numbers denote like structure or steps. Referring first to FIG. 12, a display and image capture apparatus 1200 that may be employed in accordance with one or more embodiments of the present invention is shown. Apparatus 1200 preferably includes a housing 1210, an image capture camera 1220 and a display 1230. Image capture camera 1220 may comprise a single camera element, a stereo camera element or other appropriate image capture device. Other elements, as known to one of ordinary skill in the art, including ports, power attachments, processors for processing data, and the like, may also be provided in accordance with apparatus 1200. It is further contemplated in accordance with the various embodiments of the invention that apparatus 1200 may include one or more self-check mechanisms, including mechanisms for determining proper ambient light, camera direction, and the background imaged by the camera, or other environmental issues that may be improved to further aid in the imaging of any images by apparatus 1200. Additionally, if at any time it is determined that lighting conditions are too difficult for imaging by apparatus 1200, it may be possible to provide a light burst from a strobe or display to aid in illuminating the images to be captured. The following descriptions of the various embodiments of the invention will assume that such a display and image capture apparatus 1200 is employed. The invention, however, shall not be so limited and may be employed with any arrangement of image capture camera and display, whether contained in a single apparatus or in multiple apparatuses.
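
By way of illustration, the ambient-light self-check and display light burst described above might be sketched as follows. This is a minimal sketch assuming OpenCV and a single fixed camera; the brightness threshold, burst duration, and function names are illustrative assumptions rather than values or structures taken from the specification.

```python
import cv2
import numpy as np

MIN_BRIGHTNESS = 60.0  # hypothetical floor for usable imaging, on a 0-255 gray scale

def lighting_self_check(frame):
    """Return True when the mean luminance of the frame is adequate for capture."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return float(np.mean(gray)) >= MIN_BRIGHTNESS

def light_burst(shape):
    """Return an all-white frame for the display, standing in for a strobe."""
    h, w = shape[:2]
    return np.full((h, w, 3), 255, dtype=np.uint8)

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok and not lighting_self_check(frame):
    cv2.imshow("display", light_burst(frame.shape))  # illuminate the scene
    cv2.waitKey(100)                                 # brief burst, then re-capture
    ok, frame = cap.read()
cap.release()
cv2.destroyAllWindows()
```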


Referring next to FIG. 1, a face of a user 110 is shown on display 1230 of display and image capture apparatus 1200. As can be seen in FIG. 1, this face is shown as being small, an indication that the user is at greater than a desired distance from the imaging apparatus. This situation of being too far away may be further accentuated by a small head tracing 120 being approximately the size of the user's head image. It has been determined by the inventors of the present invention that proper positioning of the face of a user improves the ability to perform various gesture recognition procedures, and in particular aids in reducing the variability in action that must be processed by image capture apparatus 1200. Therefore, by reducing variability in position, and in turn variability in action, gesture recognition for determining medication adherence may be performed in an easier manner. Furthermore, proper face positioning may be even more important when one considers reduced-resolution cameras. Thus, if the user is positioned too far away, not enough pixels may be imaged to properly perform such gesture recognition, while being positioned too close may result in the image capture device not being able to image the entire face of the user. Further, if lighting conditions are poor, the user may be asked to move closer to aid in image capture. Thus, proper positioning may also be provided as a method for addressing variations in camera resolution, processing power, lighting or other environmental conditions, and the like.
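
By way of illustration, distance guidance derived from apparent face size might be sketched as follows, assuming OpenCV's bundled frontal-face Haar cascade. The acceptable pixel-width band stands in for the full sized head tracing of FIG. 2 and is an illustrative assumption.

```python
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

TARGET_WIDTH = (180, 260)  # hypothetical acceptable face width band, in pixels

def distance_advice(frame):
    """Map apparent face size to a move-closer / move-back directive."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return "no face found"
    x, y, w, h = max(faces, key=lambda f: f[2])  # track the largest face
    if w < TARGET_WIDTH[0]:
        return "move closer"   # face too small: user is too far away
    if w > TARGET_WIDTH[1]:
        return "move back"     # face too large: user is too close
    return "distance ok"
```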


Also provided may be an intermediate sized head tracing 130 showing a direction in which the user is to move, and a full sized head tracing 140. Once the user's head is sized coincident with the full sized head tracing, it may be considered that the user is at a correct distance from the image capture camera and display, as is shown in FIG. 2. It is further contemplated that if the user is positioned closer than desired to image capture camera 1220, the user image may be shown on display 1230 as larger than the full sized head tracing 140. In either case, as is shown in FIG. 1, the desired sized head tracing 140 may be shown in a different line format than the other head tracings, in this exemplary case a dotted line for the full sized head tracing 140 and solid lines for the others. Of course, any desired line formats may be used. Further, in addition to providing such head tracings, an avatar or other indicator showing a direction in which the user is to move may be provided on display 1230. Further, the incorrect head tracings may be provided in a first color, such as red, while the correct head tracing may be provided in a second color, such as green, in order to further provide user feedback and an easy to recognize indication of correct or incorrect action. Any desirable combination of color and line format may be employed.
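
The head tracing overlay itself might be sketched as follows; the ellipse geometry, size tolerance, and the use of line thickness (in place of a true dotted line format) to distinguish tracings are illustrative assumptions.

```python
import cv2

RED, GREEN = (0, 0, 255), (0, 255, 0)  # BGR color order

def draw_head_tracings(frame, face, target_center, target_axes, tol=0.15):
    """Draw the user's current head tracing and the desired full sized tracing.

    face is an (x, y, w, h) box; target_center/target_axes describe the
    desired tracing. Returns True when the sizes roughly coincide.
    """
    x, y, w, h = face
    cur_center = (x + w // 2, y + h // 2)
    cur_axes = (w // 2, int(h * 0.6))
    ok = abs(w / (2.0 * target_axes[0]) - 1.0) < tol  # size match within tolerance
    # Current tracing: red while incorrectly sized, green once correct.
    cv2.ellipse(frame, cur_center, cur_axes, 0, 0, 360, GREEN if ok else RED, 2)
    # Desired tracing drawn in a distinct (thinner) line format.
    cv2.ellipse(frame, target_center, target_axes, 0, 0, 360, GREEN, 1)
    return ok
```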


Referring next to FIG. 3, once a user is positioned a proper distance from display and image capture apparatus 1200 (or, alternatively, while the user is still not positioned a proper distance from the display and image capture apparatus 1200, the present embodiment being described in this order for clarity), he or she may nonetheless be positioned incorrectly in a lateral direction relative to a desired position. Thus, as is shown, display 1230 may show an image of user 110 and may include a full sized head tracing 220 indicating a current position of the user (of course, the smaller and larger head tracings of FIG. 1 may be provided here if the user is also at an incorrect distance from display and image capture apparatus 1200), and a desired position full head tracing 240. As shown, the head tracing and the desired head tracing are preferably drawn employing different line formats to indicate the desired position. An arrow 230 may also be provided indicating a direction in which the user is to move. As with FIG. 1, an avatar or other indicator may also be provided on display 1230, indicating a direction in which the user is to move. Further, the incorrect head tracings may be provided in a first color, such as red, while the correct head tracing may be provided in a second color, such as green. In addition, the same red or other color may be used to illuminate a side of the image if the user moves off the screen in a particular direction, the side of movement being illuminated on display 1230.
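
A sketch of this lateral guidance follows: an arrow from the current head position toward the desired one, and an illuminated screen edge when the user leaves the frame. The tolerance, band width, and colors are illustrative assumptions.

```python
import cv2

RED = (0, 0, 255)  # BGR

def draw_lateral_guidance(frame, cur_center, target_center, in_frame=True):
    """Indicate required lateral movement, or flag an off-screen exit."""
    h, w = frame.shape[:2]
    if not in_frame:
        # User moved off-screen: illuminate the side nearest the exit point.
        x0 = 0 if cur_center[0] < w // 2 else w - 40
        cv2.rectangle(frame, (x0, 0), (x0 + 40, h), RED, thickness=-1)
        return
    if abs(cur_center[0] - target_center[0]) > 20:  # lateral tolerance, pixels
        cv2.arrowedLine(frame, cur_center,
                        (target_center[0], cur_center[1]), RED, 3, tipLength=0.2)
```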


Next, FIGS. 4-7 depict an embodiment of the invention for locating a single object or group of objects, and in accordance with this embodiment of the invention such an object may comprise a pill or other medication to be ingested by a user. Thus, as is shown in FIG. 4, a circular indicator 410 may be provided on display 1230 indicative of a first position at which the user is to place the object, in this situation a pill. Of course, for this circular indicator and any other indicators described in accordance with the invention, any desired indicators may be employed. The invention may be similarly applicable to other medication administration procedures, employing, for example, injectables, inhalers, topical applications, ocular applications and the like, as well as various other medical maintenance procedures, such as use of cardiac monitors or other monitoring devices, or any other medical device that may benefit from remote, automated monitoring. As is then shown in FIG. 5, a pill 505 is shown adjacent indicator 410, as is shown by the outer ring proximity indicator 510. Once the pill is positioned concentric with indicator 410 and the proximity indicator, the pill is shown in a target bulls eye 610, including an outer ring and a center indicator. To properly position pill 505, as the user moves pill 505, target bulls eye 610 moves in display 1230 in a similar manner. Thus, movement in both the lateral direction and in the direction towards and away from image capture apparatus 1200 may be monitored, and feedback provided to the user. For example, as the user moves pill 505 closer to display 1230, the pill and indicator 410 may grow larger, while movement away from the display may cause the indicator to shrink. Thus, in addition to assisting the user in placing the pill properly laterally, the indicator system may also aid in positioning the pill a proper distance from image capture apparatus 1200. Concentric larger rings 710 (as shown in FIG. 7) may be positioned on display 1230 to indicate a desired position to which the user is to move pill 505. When the target bulls eye is positioned concentric to larger rings 710, and is indicated as being a correct size to fit therein, thus providing a concentric set of circles on display 1230, it can be determined by the user that pill 505 has been properly positioned in the various desired directions. As with the earlier embodiments, arrows or other directional indicators may be provided, and color may be used to aid in directing the user. Thus, a red color may be used to indicate a side of the screen off of which the pill may have been moved.
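
This pill tracking loop might be sketched as follows, using a Hough circle transform as a stand-in pill detector (the specification does not prescribe a detection method); the radii, tolerances, and parameters are illustrative assumptions.

```python
import cv2
import numpy as np

def track_pill(frame, target_center, target_radius):
    """Draw a bulls eye over the detected pill and the fixed target rings.

    Returns True when the bulls eye is concentric with, and sized to fit
    inside, the target rings.
    """
    gray = cv2.medianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 5)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=100,
                               param1=100, param2=30, minRadius=8, maxRadius=60)
    if circles is None:
        return False
    x, y, r = (int(v) for v in np.around(circles[0, 0]))  # strongest circle
    # Bulls eye (outer ring plus center dot) follows the pill.
    cv2.circle(frame, (x, y), r + 10, (255, 255, 255), 2)
    cv2.circle(frame, (x, y), 3, (255, 255, 255), -1)
    # Fixed concentric target rings, as in FIG. 7.
    cv2.circle(frame, target_center, target_radius, (0, 255, 0), 2)
    offset = np.hypot(x - target_center[0], y - target_center[1])
    return offset < 15 and r + 10 < target_radius
```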


Referring next to FIG. 8, if a user is positioning a pill, or is to position a pill on the display, indicator 410 may be shown on display 1230 at various vertical locations in order to account for the vertical positioning of the image acquisition camera relative to the user. Thus, if the camera is positioned above the user, the user will be encouraged to hold the pill at a higher location, thus allowing for the imaging of the pill and user at the same time. Similarly, if the camera is positioned lower than the user, the user will be encouraged to hold the pill at a lower location, thus being positioned approximately between the camera and the user's face, as shown by the lower indicator 410 in FIG. 8. Thus, indicator 410 may be positioned at any location along the arrow shown in FIG. 8 to ensure proper pill location to aid in proper imaging of the user and the pill. Such positioning further aids in reducing the possibility of occlusion of the pill or other medication by the hand of the user, or other possible occlusion, and additionally aids in keeping the pill or other medication within the field of view of image capture camera 1220.
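
The vertical re-placement of indicator 410 might be sketched as below, using the face's vertical position in the frame as a rough proxy for where the camera sits relative to the user; the thresholds and offsets are illustrative assumptions.

```python
def indicator_y(face_top, face_bottom, frame_height):
    """Choose a vertical position for the pill indicator (FIG. 8)."""
    face_mid = (face_top + face_bottom) // 2
    if face_mid < frame_height // 3:
        # Face high in frame suggests the camera is above the user:
        # encourage holding the pill higher.
        return max(face_top - 40, 0)
    if face_mid > 2 * frame_height // 3:
        # Face low in frame suggests the camera is below the user:
        # encourage holding the pill lower.
        return min(face_bottom + 40, frame_height - 1)
    return face_mid  # roughly level: hold the pill between camera and face
```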


Referring next to FIG. 9, a method and system for aiding in confirming pill ingestion by a user in accordance with an embodiment of the invention is shown. Once again, a pill 505 is included under a target bulls eye 610 as originally portrayed in FIG. 6. Also shown is a location target 910, similar to the concentric larger rings 710 of FIG. 7. Location target 910, however, is preferably provided to determine and indicate a location of the user's mouth on display 1230. Therefore, in accordance with the invention, the user's mouth location is determined employing the image acquisition camera, and thereafter location target 910 may be provided on display 1230 coincident therewith. As is further shown in FIG. 10, as the user moves pill 505, target bulls eye 610 follows along, indicating a relative position as compared with location target 910 on display 1230. Various techniques noted above, including arrows, color and the like, may be employed in aiding the user in positioning pill 505 concentric with location target 910. Of course, as the position of location target 910 is preferably coextensive with the user's mouth, movement of the pill to the user's mouth may result in the desired interaction between pill 505 and location target 910, thus encouraging the desired action. It is through this continual monitoring that the user is further encouraged to perform actions correctly. As if being watched, the automated system provided in accordance with various embodiments of the present invention provides positive reinforcement to a user, and also may provide the added incentive of consistent and continuous monitoring. Rather than requiring an actual person to continuously review the actions of the user, automatic monitoring and feedback may provide a similar user response without the need for additional personnel. Furthermore, such an automated system may provide even greater consistency in monitoring, thus further encouraging desired behavior and further reducing variability in user action.
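
By way of illustration, location target 910 might be placed by approximating the mouth as the lower portion of a detected face box (a simplifying assumption, not the detector the specification contemplates), with a simple coextensiveness test between the tracked pill and the target.

```python
import cv2
import numpy as np

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def mouth_target(frame):
    """Return ((cx, cy), radius) for an assumed mouth location, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2])
    return (x + w // 2, y + int(h * 0.85)), w // 4  # lower-face approximation

def pill_at_mouth(pill_center, target, tol=0.5):
    """True when the pill center lies well inside the mouth location target."""
    (cx, cy), r = target
    return np.hypot(pill_center[0] - cx, pill_center[1] - cy) < r * tol
```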


Once properly positioned, as is shown in FIG. 11, properly positioned locator 615 is shown in a differently displayed location target 915, thus indicating successful positioning. As noted above, color change, change in line texture or the like may be used to indicate that the user has properly positioned the pill relative to their mouth, thus providing positive feedback to the user upon proper action having been taken.


Therefore, in accordance with the invention, a set of one or more indicators may be provided to a user to aid in proper following of a predetermined script, allowing for the proper acquisition of the various movements of the user, so that the system is able to determine whether the user is properly positioned in front of an image acquisition camera, and whether the user has properly positioned a pill for ingestion. Various positive reinforcement mechanisms may be employed in accordance with the monitoring system of the various embodiments of the invention, thus encouraging the user to properly perform various actions and reducing the variability in such actions, therefore easing the burden on the system in determining proper performance of such actions by the user. The user may also perceive the system as one that is easy to use, thus even further encouraging its use.


It will thus be seen that the objects set forth above, among those made apparent from the preceding description, are efficiently attained and, because certain changes may be made in carrying out the above method and in the construction(s) set forth without departing from the spirit and scope of the invention, it is intended that all matter contained in the above description and shown in the accompanying drawings shall be interpreted as illustrative and not in a limiting sense.


It is also to be understood that this description is intended to cover all of the generic and specific features of the invention herein described and all statements of the scope of the invention which, as a matter of language, might be said to fall there between.

Claims
  • 1. A method for positioning a user in a field of view of an image acquisition camera by providing automatic guidance to the user on how to position the user, the method comprising the steps of: displaying on a display an image of a user within a field of view of an image acquisition camera; determining a desired location of a user in a field of view of an image acquisition camera; determining a current position of the user in the field of view of the image acquisition apparatus; presenting a location target on the display indicating the desired location of the user in the field of view of the image acquisition camera; acquiring an image of the user within the field of view of the image acquisition camera by the image acquisition camera; following movement of the user in the field of view of the image acquisition camera by relative movement of the user image on the display; and confirming interaction between the user and the location target when they are determined to be coextensive on the display.
  • 2. The method of claim 1, wherein if the current position of the user is too far from the image acquisition camera, an image of the user is displayed on the display in a small size.
  • 3. The method of claim 2, further comprising the step of providing a head tracing indicating a proper size that the image of the user should be.
  • 4. The method of claim 3, wherein an indicator is provided when the user is properly positioned, and the image of the user is coincident with the head tracing.
  • 5. The method of claim 4, wherein the step of indicating comprises a change in color.
  • 6. The method of claim 1, wherein the desired location of the user is adjusted in accordance with one or more environmental factors.
  • 7. The method of claim 1, wherein the display further displays an indication if the user moves out of the field of view of the image acquisition camera and is therefore no longer displayed on the display.
  • 8. The method of claim 1, further comprising the step of providing a head tracing indicating a desired proper location of the image of the user.
  • 9. The method of claim 8, wherein an indicator directs the user to move in the direction of the head tracing.
  • 10. A system for tracking positioning of a user by providing automatic guidance to the user on how to position the user, comprising: a display for displaying an image of a user within a field of view of an image acquisition camera; an image acquisition apparatus for acquiring an image of a user; a processor for determining a desired location of a body part of a user in a field of view of the image acquisition apparatus, and determining a current position of the body part of the user in the field of view of the image acquisition apparatus; and a display for displaying the acquired image of the user, for presenting a location target to move the body part of the user to a desired location within the field of view of the image acquisition camera; the processor further following movement of the user in the field of view of the image acquisition camera by relative movement of the user image on the display, and confirming interaction between the user and the location target when they are determined to be coextensive on the display.
  • 11. The system of claim 10, wherein the display further displays an indicator of the current position of body part of the user within the field of view of the image acquisition camera.
  • 12. The system of claim 11, further comprising an indicator on the display indicating movement of the indicator towards the location target within the field of view of the image acquisition camera.
  • 13. The system of claim 12, wherein provision of the indicated movement improves the ability of the user to properly position their body part within the field of view of the image acquisition camera.
  • 14. The system of claim 11, wherein positioning of the body part of the user is confirmed when the indicator and the location target are determined to be coextensive in one or more of the plurality of images.
  • 15. The system of claim 10, wherein the display further displays an indication if the body part moves out of the field of view of the image acquisition camera.
Parent Case Info

This application is a continuation, and claims priority, of co-pending U.S. application Ser. No. 15/840,836, filed Dec. 13, 2017, now U.S. Pat. No. 10,149,648, issued Dec. 11, 2018, which is a continuation of U.S. application Ser. No. 15/337,551, filed Oct. 28, 2016, now U.S. Pat. No. 9,844,337, issued Dec. 19, 2017, which is a continuation of U.S. application Ser. No. 14/073,525, filed Nov. 6, 2013, now U.S. Pat. No. 9,486,720, issued Nov. 8, 2016, which is a continuation of U.S. application Ser. No. 12/899,510, filed Oct. 6, 2010, now U.S. Pat. No. 8,605,165, issued Dec. 10, 2013. The contents of all of the prior applications are incorporated herein by reference in their entirety.

US Referenced Citations (142)
Number Name Date Kind
3814845 Hurlbrink et al. Jun 1974 A
5065447 Barnsley et al. Nov 1991 A
5441047 David et al. Aug 1995 A
5544649 David et al. Aug 1996 A
5619991 Sloane Apr 1997 A
5646912 Cousin Jul 1997 A
5752621 Passamante May 1998 A
5764296 Shin Jun 1998 A
5772593 Hakamata Jun 1998 A
5810747 Brudny et al. Sep 1998 A
5911132 Sloane Jun 1999 A
5961446 Beller et al. Oct 1999 A
6151521 Guo et al. Nov 2000 A
6154558 Hsieh Nov 2000 A
6233428 Fryer May 2001 B1
6283761 Joao Sep 2001 B1
6380858 Yarin et al. Apr 2002 B1
6409661 Murphy Jun 2002 B1
6421650 Goetz et al. Jul 2002 B1
6483993 Misumi et al. Nov 2002 B1
6484144 Martin et al. Nov 2002 B2
6535637 Wootton et al. Mar 2003 B1
6611206 Eshelman et al. Aug 2003 B2
6705991 Bardy Mar 2004 B2
6879970 Shiffman et al. Apr 2005 B2
6988075 Hacker Jan 2006 B1
7184047 Crampton Feb 2007 B1
7184075 Reiffel Feb 2007 B2
7256708 Rosenfeld et al. Aug 2007 B2
7277752 Matos Oct 2007 B2
7304228 Bryden et al. Dec 2007 B2
7307543 Rosenfeld et al. Dec 2007 B2
7317967 DiGianfilippo et al. Jan 2008 B2
7340077 Gokturk Mar 2008 B2
7395214 Shillingburg Jul 2008 B2
7415447 Shiffman et al. Aug 2008 B2
7448544 Louie et al. Nov 2008 B1
7562121 Berisford et al. Jul 2009 B2
7627142 Kurzweil et al. Dec 2009 B2
7657443 Crass et al. Feb 2010 B2
7692625 Morrison et al. Apr 2010 B2
7747454 Bartfeld et al. Jun 2010 B2
7761311 Clements et al. Jul 2010 B2
7769465 Matos Aug 2010 B2
7774075 Lin et al. Aug 2010 B2
7874984 Elsayed et al. Jan 2011 B2
7881537 Ma et al. Feb 2011 B2
7908155 Fuerst et al. Mar 2011 B2
7912733 Clements et al. Mar 2011 B2
7956727 Loncar Jun 2011 B2
7983933 Karkanias et al. Jul 2011 B2
8107672 Goto Jan 2012 B2
8321284 Clements et al. Nov 2012 B2
20010049673 Dulong et al. Dec 2001 A1
20010056358 Dulong et al. Dec 2001 A1
20020026330 Klein Feb 2002 A1
20020093429 Matsushita et al. Jul 2002 A1
20020143563 Hufford et al. Oct 2002 A1
20030164172 Chumas et al. Sep 2003 A1
20030190076 Delean Oct 2003 A1
20030225325 Kagermeier et al. Dec 2003 A1
20040100572 Kim May 2004 A1
20040107116 Brown Jun 2004 A1
20040155780 Rapchak Aug 2004 A1
20050144150 Ramamurthy et al. Jun 2005 A1
20050148847 Uchiyama Jul 2005 A1
20050149361 Saus et al. Jul 2005 A1
20050180610 Kato et al. Aug 2005 A1
20050182664 Abraham-Fuchs et al. Aug 2005 A1
20050234381 Niemetz et al. Oct 2005 A1
20050267356 Ramasubramanian et al. Dec 2005 A1
20060066584 Barkan Mar 2006 A1
20060169294 Kaler et al. Aug 2006 A1
20060218011 Walker et al. Sep 2006 A1
20060238549 Marks Oct 2006 A1
20070008112 Covannon et al. Jan 2007 A1
20070008113 Spoonhower et al. Jan 2007 A1
20070030363 Cheatle et al. Feb 2007 A1
20070041621 Lin et al. Feb 2007 A1
20070118389 Shipon May 2007 A1
20070194034 Vasiadis Aug 2007 A1
20070233035 Wehba et al. Oct 2007 A1
20070233049 Wehba et al. Oct 2007 A1
20070233050 Wehba et al. Oct 2007 A1
20070233281 Wehba et al. Oct 2007 A1
20070233520 Wehba et al. Oct 2007 A1
20070233521 Wehba et al. Oct 2007 A1
20070265880 Bartfeld et al. Nov 2007 A1
20070273504 Tran Nov 2007 A1
20080000979 Poisner Jan 2008 A1
20080093447 Johnson et al. Apr 2008 A1
20080114226 Music et al. May 2008 A1
20080114490 Jean-Pierre May 2008 A1
20080138604 Kenney et al. Jun 2008 A1
20080140444 Karkanias et al. Jun 2008 A1
20080161660 Arneson Jul 2008 A1
20080162192 Vonk et al. Jul 2008 A1
20080178126 Beeck et al. Jul 2008 A1
20080201174 Ramasubramanian et al. Aug 2008 A1
20080219493 Tadmor Sep 2008 A1
20080239104 Koh Oct 2008 A1
20080273097 Nagashima Nov 2008 A1
20080275738 Shillingburg Nov 2008 A1
20080290168 Sullivan et al. Nov 2008 A1
20080297589 Kurtz et al. Dec 2008 A1
20080303638 Nguyen et al. Dec 2008 A1
20090012818 Rodgers Jan 2009 A1
20090018867 Reiner Jan 2009 A1
20090043610 Nadas et al. Feb 2009 A1
20090048871 Skomra Feb 2009 A1
20090095837 Lindgren Apr 2009 A1
20090127339 Needhan et al. May 2009 A1
20090128330 Monroe May 2009 A1
20090159714 Coyne, III et al. Jun 2009 A1
20090217194 Martin et al. Aug 2009 A1
20090245655 Matsuzaka Oct 2009 A1
20090259123 Navab et al. Oct 2009 A1
20090299142 Uchiyama et al. Dec 2009 A1
20100042430 Bartfeld Feb 2010 A1
20100050134 Clarkson Feb 2010 A1
20100057646 Martin et al. Mar 2010 A1
20100092093 Akatsuka et al. Apr 2010 A1
20100130250 Choi May 2010 A1
20100136509 Mejer et al. Jun 2010 A1
20100138154 Kon Jun 2010 A1
20100225773 Lee Sep 2010 A1
20100234792 Dacey, Jr. et al. Sep 2010 A1
20100255598 Melker Oct 2010 A1
20100262436 Chen et al. Oct 2010 A1
20100316979 Von Bismarck Dec 2010 A1
20110004059 Arneson Jan 2011 A1
20110021952 Vallone Jan 2011 A1
20110069159 Soler et al. Mar 2011 A1
20110119073 Hanina et al. May 2011 A1
20110141009 Izumi Jun 2011 A1
20110153360 Hanina et al. Jun 2011 A1
20110161109 Pinsonneault et al. Jun 2011 A1
20110190595 Bennett et al. Aug 2011 A1
20110195520 Leider et al. Aug 2011 A1
20110275051 Hanina et al. Nov 2011 A1
20120046542 Csavoy et al. Feb 2012 A1
20120075464 Derenne et al. Mar 2012 A1
Non-Patent Literature Citations (48)
Entry
Ammouri, et al., “Face and Hands Detection and Tracking Applied to the Monitoring of Medication Intake,” Computer and Robot Vision, 2008. CRV '08. Canadian Conference, 147(154):28-30, (May 2008).
Batz, et al. “A computer Vision System for Monitoring Medication Intake,” in Proc. IEEE 2nd Canadian Conf. on Computer and Robot Vision, Victoria, BC, Canada, 2005, pp. 362-369.
Bilodeau et al. Monitoring of Medication Intake Using a Camera System. Journal of Medical Systems 2011. [retrieved on Feb. 18, 2013] Retrieved from ProQuest Technology Collection.
Chen, Pauline W., "Texting as a Health Tool for Teenagers", The New York Times, Nov. 5, 2009, http://www.nytimes.com/2009/11/05/health/0512/899,510.
Danya International, Inc., “Pilot Study Using Cell Phones for Mobile Direct Observation Treatment to Monitor Medication Compliance of TB Patients”, Mar. 20, 2009, www.danya.com/MDOT.asp.
Global Tuberculosis Control: A short update to the 2009 report, World Health Organization, (2009).
Huynh et al., “Real time detection, tracking and recognition of medication intake.” World Academy of Science, Engineering and Technology 60 (2009), 280-287.
Mintchell, “Exploring the Limits of Machine Vision”, Automating World, Oct. 1, 2011.
Osterberg, Lars and Blaschke, Terrence, “Adherence to Medication”, New England Journal of Medicine 2005; 353:487-97, Aug. 4, 2005.
Super-Resolution, Wikipedia, (Oct. 5, 2010).
University of Texas, GuideView, Mar. 15, 2007, http://www.sahs.uth.tmc.edu/MSriram/GuideView.
Valin, et al. “Video Surveillance of Medication intake”, Int. Conf. of the IEEE Engineering in Medicine and Biology Society, New York City, USA, Aug. 2006.
Wang et al. “Recent Developments in human motion analysis.” Pattern Recognition 36 (220) 585-601 (Nov. 2001).
Whitecup, Morris S., “2008 Patient Adherence Update: New Approaches for Success”, www.guideline.com , The Trend Report Series, (Oct. 1, 2008).
Final Office Action from PTO, (U.S. Appl. No. 12/620,686), (dated May 8, 2012), 1-24.
Final Office Action from PTO, (U.S. Appl. No. 13/558,377), dated May 7, 2013, 1-29.
Final Office Action from PTO, (U.S. Appl. No. 12/646,383), (dated May 8, 2012), 1-31.
Final Office Action from PTO, (U.S. Appl. No. 13/588,380), (dated Mar. 1, 2013), 1-27.
Final Office Action from PTO, (U.S. Appl. No. 12/646,603), (dated Feb. 1, 2012), 1-17.
Final Office Action from PTO, (U.S. Appl. No. 12/728,721), (dated Apr. 12, 2012), 1-31.
Final Office Action from PTO, (U.S. Appl. No. 12/815,037), (dated Sep. 13, 2012), 1-15.
Final Office Action from PTO, (U.S. Appl. No. 12/899,510), (dated Aug. 20, 2013).
Final Office Action from PTO, (U.S. Appl. No. 12/898,338), dated Nov. 9, 2012), 1-12.
Final Office Action from PTO, (U.S. Appl. No. 13/189,518), (dated Jul. 23, 2013), 1-16.
International Preliminary Report on Patentability, (PCT/US2010/056935) (dated May 31, 2012), 1-8.
Non-Final Office Action from PTO, (U.S. Appl. No. 12/620,686), (dated Dec. 21, 2011),1-78.
Non-Final Office Action from PTO, (U.S. Appl. No. 13/558,377), (dated Oct. 22, 2012), 1-21.
Non-Final Office Action from PTO, (U.S. Appl. No. 12/646,383), (dated Dec. 22, 2011),1-78.
Non-Final Office Action from PTO, (U.S. Appl. No. 13/558,380), (dated Oct. 4, 2012), 1-20.
Non-Final Office Action from PTO, (U.S. Appl. No. 12/646,603), (dated Oct. 13, 2011),1-74.
Non-Final Office Action from PTO, (U.S. Appl. No. 12/646,603), (dated Jun. 13, 2013), 1-16.
Non-Final Office Action from PTO, (U.S. Appl. No. 12/728,721), (dated Jan. 6, 2012), 1-31.
Non-Final Office Action from PTO, (U.S. Appl. No. 12/728,721), (dated May 9, 2013), 1-25.
Non-Final Office Action from PTO, (U.S. Appl. No. 12/815,037), (dated Mar. 28, 2012),1-17.
Non-Final Office Action from PTO, (U.S. Appl. No. 12/815,037), (dated Jul. 18, 2013), 1-19.
Non-Final Office Action from PTO, (U.S. Appl. No. 12/899,510), (dated Jan. 23, 2013), 1-20.
Non-Final Office Action from PTO, (U.S. Appl. No. 12/898,338), (dated Jun. 19, 2012), 1-16.
Non-Final Office Action from PTO, (U.S. Appl. No. 13/189,518), (dated Dec. 21, 2012), 1-10.
Non-Final Office Action from PTO, (U.S. Appl. No. 13/235,387), dated Sep. 12, 2013), 1-16.
PCT Search report and written opinion, (PCT/US2010/56935), (dated Jan. 12, 2011), 1-9.
PCT Search report and written opinion, (PCT/US2011/35093), (dated Sep. 12, 2011), 1-8.
PCT Search report and written opinion, (PCT/US11/54666), (dated Feb. 28, 2012), 1-13.
PCT Search report and written opinion, (PCT/US11/54668), dated Feb. 28, 2012, 1-12.
PCT Search report and written opinion, (PCT/US12/41785), (dated Aug. 17, 2012),1-10.
PCT Search report and written opinion, (PCT/US12/42843), (dated Aug. 31, 2012), 1-8.
PCT Search report and written opinion, (PCT/US2012/051554), (dated Oct. 19, 2012), 1-12.
PCT Search report and written opinion, (PCT/US12/59139), (dated Dec. 18, 2012), 1-15.
PCT Search report and written Opinion, (PCT/US13/20026), (dated Aug. 5, 2013), 1-14.
Related Publications (1)
Number Date Country
20190192070 A1 Jun 2019 US
Continuations (4)
Number Date Country
Parent 15840836 Dec 2017 US
Child 16206871 US
Parent 15337551 Oct 2016 US
Child 15840836 US
Parent 14073525 Nov 2013 US
Child 15337551 US
Parent 12899510 Oct 2010 US
Child 14073525 US