In these applications, the technology, embodiments and ergonomics of a miniature vein enhancement system are described that uses scanned lasers and other light sources to acquire a vein pattern and project it back to aid the user in locating a vein for venipuncture. In this application, additional capabilities, and how these capabilities improve the processes associated with venipuncture, are set forth.
Key among these is the use of enhanced embodiments that allow additional data collection and application processes that are critical to the successful performance of venipuncture and related medical and business procedures.
It is known in the art to use an apparatus to enhance the visual appearance of the veins and arteries in a patient to facilitate insertion of needles into those veins and arteries. Such a system is described in U.S. Pat. Nos. 5,969,754 and 6,556,858, incorporated herein by reference, as well as in a paper by Zeman, H. D. et al., “The Clinical Evaluation of Vein Contrast Enhancement”, Department of Biomedical Engineering, University of Tennessee Health Science Center, Memphis, Tenn., 8 Feb. 2007. Luminetx is currently marketing such a device under the name “Veinviewer Imaging System”; other information related thereto is available on their website, www.luminetx.com, which is incorporated herein by reference.
In this application, the use of biometrics to identify the user and/or the patient, to aid in the management and safety of processes involving venipuncture, is described. One such biometric that is unique to this field is the use of vein patterns to identify the individual. For example, U.S. Pat. No. 5,787,185, provides for “A method of verifying the identity of an individual comprises capturing an image of the subcutaneous vein pattern at a predetermined region of the individual, converting the captured image to a plurality of stored values representative of the intensity of said image at specified relative locations, processing the stored values to produce a second plurality of stored values representative of the image of the vein pattern having enhanced contrast and subjecting the second plurality of stored values to a thresholding process to select those above a predetermined value and storing a set of measurements derived from the selected ones of said second plurality of stored values for comparison with a corresponding set of measurements made on the individual.”
Current embodiments are large, fixed mounted devices. The present invention allows this capability to be integrated into a handheld device as well as to be used as part of the management of the venipuncture process and the handling of delivered drugs and drawn blood.
1) It is an object of the invention to integrate data collection capabilities such as bar code decoding and biometrics and use those to control the use of the vein scanner.
2) It is another object of the invention to combine the projection of vein patterns with the projection of data such as text and/or icons to enhance the usability and capabilities of the device.
3) It is another object of the invention to project variable data overlaid on the vein pattern.
4) It is yet another object of the invention to enable payment schemes such that the practitioner can acquire the device at a reduced initial cost and pay per use or pay per user.
5) It is yet another object of the invention to gather and then project back on the patient additional parameters of the patient such as pulse and temperature.
6) It is yet another object of the invention to connect the scanner to a host computer to allow data collected to be stored for future uses such as patient history and billing through wired or wireless means.
7) It is yet another object of the invention to connect to other devices monitoring the patient and to project the data from those monitors back on to the patient.
8) It is yet another object of the invention to project the data collected by the invention and other devices onto an arbitrary surface so that the practitioner can view the data.
9) It is yet another object of the invention to allow the practitioner to avoid blood vessels when performing a procedure on a patient.
10) It is yet another object of this invention to vary the resolution over the working area to allow certain portions of the viewed area to be scanned in higher resolutions than in other areas thereby using the limited bandwidth of the device in an optimized manner.
11) It is yet another object of the invention to provide additional lighting on the working area to aid the practitioner to see the patient's body.
12) It is yet another object of the invention to provide a mechanism for measuring pulse and projecting the pulse rate back on to the body.
13) It is yet another object of the invention to provide a mechanism for measuring blood oxygen levels and projecting the value back on to the body.
14) It is yet another object of the invention to provide a mechanism for reading pulse rate without body contact when the target person is participating in exercise.
15) It is yet another object of the invention to provide the ability for a remote practitioner to perform or assist in the performance of a venipuncture.
In this application a handheld vein detection device is described with capabilities that allow it to enhance the management of venipuncture processes, including blood draws, venous or arterial injections and intravenous starts, through the use of bar codes and biometrics; embodiments that enhance usability; and the ability to detect and record additional data about the patient, such as the location of venous valves and diagnostic data, and to re-project that data on to the patient's body.
The preferred embodiment of the invention simultaneously captures information about the target part of the body (e.g., forearm) such as vein location and then re-projects information about the captured image directly over that area of capture. As previously described, in one embodiment a visible representation of the vein position(s) is projected directly on the vein locations thereby highlighting the veins and making them easier for the practitioner to find.
In enhanced embodiments, digital information in the form of text and/or icons can be included in the information that is re-projected on to the skin. The text and icons can be any useful information to assist the practitioner in performing the procedure. This information may be real time information such as vein depth or it may be patient information such as patient name or it may be mistake proofing information such as the quantity of blood needed to be drawn next so that the vial size can be verified.
Referring to
An alternative embodiment uses variable markings to indicate some value about the patient's status. In
The process of obtaining blood, storing it, transporting it, testing it and returning the test results has many steps, involves multiple persons and is prone to errors. Through the use of the invention many of these mistakes can be eliminated. Referring to
Once an identifier or identifiers have been acquired 1001, they can be compared to a database 1005 of valid users of the device, either in a locally stored database or, in the case of a scanner that is connected either wired or wirelessly, a remote database can be queried. If the user isn't found in the database 1003, or if the user does not have appropriate training or privileges for the procedure, they can be prompted to try again 1002. This prompt could be in a display built into the device, through audio feedback, through visual feedback of an indicator such as an LED, or, in the preferred embodiment, through the projection of text or icons from the scanner's projection field. If the user has been successfully identified 1006 then the process moves on to verifying that the patient is the correct patient.
Verification of the patient 1007 can take all of the forms described above for the user of the device. In addition, it is common in hospital environments for a patient to wear a bar coded wrist band. The device can be used to scan that wrist band as a means of identifying the patient. Once an identifier or identifiers have been acquired 1010, they can be compared to a database 1011 of valid patients of the device either in a locally stored database or in the case of a scanner that is connected either wired or wirelessly, a remote database can be queried. If the patient isn't found in the database 1011 or if there are not orders on file for the procedure, the user can be prompted to try again or to correct the situation 1008. This prompt could be in a display built into the device, through audio feedback, through visual feedback of an indicator such as an LED, or in the preferred embodiment, through the projection of text or icons from the scanner's projection field.
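The following is a minimal sketch of the practitioner/patient lookup step (1001-1011), assuming the identifier has already been decoded (badge bar code, PIN, biometric match or wrist band scan). The database may be local or reached over a wired or wireless link; here it is abstracted as a simple lookup function, and all names are illustrative rather than taken from the flowchart.

```python
# Illustrative sketch of the identification step; names are hypothetical.

def verify_identity(identifier, lookup, required_privilege, prompt):
    """Return the matching record, or None after prompting the user to retry."""
    record = lookup(identifier)          # local table or remote query
    if record is None:
        prompt("ID not found - please try again")   # projected text, LED or audio
        return None
    if required_privilege not in record.get("privileges", []):
        prompt("Not authorized for this procedure")
        return None
    return record

# Example with a locally stored table:
local_db = {"RN-1042": {"name": "A. Nurse", "privileges": ["venipuncture"]}}
user = verify_identity("RN-1042", local_db.get, "venipuncture",
                       prompt=lambda msg: print(msg))
```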
Now that the practitioner and the patient have been validated, a list of blood needed and the equipment needed to perform the procedure 1013 can be presented to the user through one of the modalities described above.
Since the invention can be used for bar code data collection, it can be used to ensure that the appropriate materials are in hand before the venipuncture is performed. As the user selects each piece of equipment 1014 they can use bar code or other means of identification such as RFID to verify that the item 1015 is on the list of needed equipment. The process repeats and prompts the user 1016 to continue with gathering equipment until all of the equipment has been identified 1018.
Once a complete set of equipment has been verified 1019, the user can use the vein scanner to find a vein and perform the venipuncture 1020. As the user fills the vials 1021 they can be identified again as filled and if all of the required vials have not been filled 1023, the user can be prompted to continue 1022.
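As a rough sketch of the gather-and-verify loop (1014-1023), each scanned item (bar code or RFID) can be checked off against the required list until the set is complete. The scan_next_item() call stands in for the device's bar code/RFID read and is hypothetical.

```python
# Illustrative checklist loop; scan_next_item and prompt are hypothetical callables.

def gather_equipment(required_items, scan_next_item, prompt):
    remaining = set(required_items)
    while remaining:
        prompt(f"Scan next item ({len(remaining)} remaining)")
        item = scan_next_item()
        if item in remaining:
            remaining.discard(item)       # item verified against the list
        else:
            prompt(f"'{item}' is not on the list for this procedure")
    prompt("All required equipment verified")
```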
Once all of the vials are filled 1080, the user verifies that the process has completed successfully 1075 and then packages the specimens taken for delivery to the lab 1027/1028. The status of the completed process is reported in the documents forwarded on with the specimens, and in a preferred embodiment this information can also be updated in the medical records database using one of the input and connectivity modalities previously described.
The flowchart and the associated descriptions are shown to highlight how additional input and output modalities integrated into the vein scanner can be used to enhance the process of blood collection. However, the specifics of the processes implemented in different medical environments will require changes to the process flow to meet the unique requirements of the specific environment.
Improper or mistaken delivery of medicines is one of the leading causes of accidental injury or death in the healthcare system. In addition to the use of the vein scanner to help identify the position of veins to aid in the taking of blood specimens, the vein scanner can be used to assist in the delivery of medicines into the patient.
In addition, an enhanced scanner can assist in the management of the injection process in a similar way as described above for taking blood specimens. Referring to
A primary focus of the invention has been to locate blood vessels so that a practitioner can access that vessel with a needle for withdrawing blood or injecting a material into the blood stream. In medical practice, some injectable substances must be injected into the muscle rather than into a vein. Through the use of the invention, veins can be located for this procedure and the area between veins can be targeted by the practitioner thereby reducing the risk of inadvertently injecting the substance into a vein.
In an embodiment of the invention where the laser is scanned in a predictable and repeatable pattern, such as a raster pattern, an image of the targeted area can be captured and stored into memory. This image can be of sufficient quality and resolution to allow image processing techniques that are well known in the art to be applied to this captured image.
As previously described, this stored image can be used for many purposes such as using various image processing techniques to find the vein pattern within the image. In addition to the location of the vein, with appropriate image processing software the scanner can be used to capture and present additional information about objects seen in the field of view.
In a digital camera, a photo detector array that has elements that are separately sensitive to red, green and blue light is used with either ambient light or a broad spectrum white light to capture the image. In a laser based system, the image being captured is based on the target surface's reflection of the light of the specific color (wavelength) of the laser being projected.
For each laser used in the system an additional color can be captured and included in the image. In the preferred embodiment with an infrared and a visible red laser, an image can be captured based on both the reflectivity at infrared and red. Depending on the desired result, one or both of these images can be used. For example, in a bar code application, a bar code could be scanned with the infrared laser that is invisible to the human eye. With the addition of other color lasers, more information about the target surface can be captured.
Since the needle is a large object that is above the surface of the body prior to venipuncture, image processing software can be used to identify the position of the needle using its different characteristics when compared to skin and sub-skin body parts. Since the invention both captures an image and projects an image, the user can be prompted with projected feedback as to the position of the needle in relation to the nearest vein in the field of view. Since it is important that the needle pierces the vein in its center, icons such as cross hairs or arrows can be used to guide the practitioner to the proper position. For example, right/left/forward arrows can be used to accurately guide the practitioner to the center of the vein by showing them the direction to move the needle. The invention is also able to use other feedback mechanisms such as audible feedback or that of projecting text into the field of view with prompting information.
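One possible way to turn the detected needle-tip position and the vein centerline into a projected guidance cue is sketched below. Positions are assumed to be in scan-field pixel coordinates; the geometry, threshold and cue names are illustrative assumptions rather than the invention's fixed method.

```python
import numpy as np

def guidance_cue(needle_tip, vein_centerline, tolerance_px=3):
    """needle_tip: (x, y); vein_centerline: Nx2 array of points along the vein center."""
    pts = np.asarray(vein_centerline, dtype=float)
    d = pts - np.asarray(needle_tip, dtype=float)
    nearest = pts[np.argmin(np.hypot(d[:, 0], d[:, 1]))]   # closest centerline point
    dx, dy = nearest[0] - needle_tip[0], nearest[1] - needle_tip[1]
    if abs(dx) <= tolerance_px and abs(dy) <= tolerance_px:
        return "crosshair"                                  # on target: project cross hairs
    if abs(dx) >= abs(dy):
        return "arrow_right" if dx > 0 else "arrow_left"
    return "arrow_forward" if dy > 0 else "arrow_back"
```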
Once the image is captured from the penetrating infrared laser, many algorithms well known in the art can be used to find the veins. The types of veins that one would wish to select for venipuncture typically have a sizeable segment that is linear; therefore, a line detection algorithm can be used. Another characteristic of the vein image being captured is that there are edge boundaries between the area surrounding the vein and the vein itself. An edge detection algorithm, such as the one described by Canny in IEEE Transactions on Pattern Analysis and Machine Intelligence, Volume 8, Issue 6, November 1986, pages 679-698, ISSN 0162-8828, can be used to find these edges.
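A minimal sketch of this vein-finding step, using standard OpenCV operators, is shown below: Canny edge detection to find the vein/tissue boundaries followed by a probabilistic Hough transform to keep only edges with a sizeable linear segment. The thresholds are illustrative and would be tuned to the captured infrared image.

```python
import cv2
import numpy as np

def find_vein_segments(ir_image):
    """ir_image: 8-bit grayscale frame captured from the infrared laser scan."""
    smoothed = cv2.GaussianBlur(ir_image, (5, 5), 0)        # suppress speckle noise
    edges = cv2.Canny(smoothed, 40, 120)                     # vein/skin edge boundaries
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=30, minLineLength=40, maxLineGap=5)
    return [] if lines is None else [tuple(l[0]) for l in lines]  # (x1, y1, x2, y2)
```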
In a preferred embodiment, the output intensity of the laser (or the gain of the photo detection circuitry, or a combination of both) can be varied from frame to frame as the scanner passes over the target area. The higher the intensity, the deeper the laser will penetrate the body. At a low setting, the veins identified will be close to the surface. At higher settings, more and deeper veins will be identified.
By comparing the images between frames a determination can be made of which veins are close to the surface of the body and which veins are deeper. As the intensity increases and new veins become visible, the image processing software can tag the veins seen with an indicator of the intensity at which they first became visible. This provides an indication of relative depth.
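An illustrative sketch of this depth tagging follows. Each frame is assumed to have been reduced to a binary vein mask (for example by the edge/line detection sketched above); a depth index of 0 means the vein appeared at the lowest laser intensity, and higher values indicate deeper veins.

```python
import numpy as np

def depth_tag(vein_masks_by_intensity):
    """vein_masks_by_intensity: list of boolean masks, ordered from lowest to highest power."""
    depth_index = np.full(vein_masks_by_intensity[0].shape, -1, dtype=int)  # -1 = no vein
    for level, mask in enumerate(vein_masks_by_intensity):
        newly_visible = mask & (depth_index == -1)   # first intensity at which this pixel shows a vein
        depth_index[newly_visible] = level
    return depth_index
```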
Several user interfaces are possible. In a system with multiple, differently colored projection lasers, veins at different depths can be displayed in different colors. In a system with a single projection color, the lines that represent the veins could have patterns superimposed over the vein to indicate depth such as crosshatching for shallow and solid for deep. An alternative embodiment would be to allow the user to set what depth to look at so that only the veins of a specific depth are presented to the user.
There is a significant body of knowledge on the use of image capture devices to read bar codes, such as described in U.S. Pat. No. 6,223,988. Since the invention is capable of capturing an image of the necessary resolution and quality for bar code detection, algorithms well known in the art can be used to find and decode bar codes that are in the field of view.
Bar codes are commonly used to identify users (through ID cards or badges), identify patients (through wrist bands) and to identify equipment (through bar-coded labels) in medical environments. It is an important extension of the invention to have it able to read bar codes so that it can be fully integrated into the work flows of the medical environment.
The invention can be implemented such that it captures images of the field of view, and stores those images into computer memory. Using techniques well known in the art, the captured images can be searched for the presence of bar codes such as those shown in
By re-using the image capture capabilities of the scanner for bar code reading a very small, tightly integrated, low cost embodiment can be created that combines vein location and bar code scanning.
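One way to search a captured frame for bar codes is sketched below, using the open-source ZBar decoder through its Python binding (pyzbar) as an illustrative stand-in for any of the decoding algorithms known in the art; an embedded implementation could equally use its own decoder in firmware.

```python
import cv2
from pyzbar.pyzbar import decode

def read_barcodes(frame):
    """frame: image captured by the scanner (numpy array); returns decoded strings."""
    gray = frame if frame.ndim == 2 else cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return [result.data.decode("ascii", errors="replace") for result in decode(gray)]
```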
An alternative implementation of bar code scanning would be to use a commercially available off-the-shelf bar code scanning engine, such as those available from Hand Held Products, integrated into the housing of the invention. This would have the advantage of keeping the vein detection portion of the design simpler and forgoing the need to develop or integrate bar code reading into the vein detection electronics and software. The disadvantage is that there would be redundancies that might increase the size and cost of the unit.
As previously described, the ability to identify the user of the invention and the patient upon which the invention will be used are very useful to the management of the medical procedures. Simple methods of passwords or PINs can be used for the user identification, and as previously described bar coded wrist bands can be used to identify the patient. However, each of these approaches has its weaknesses in that the password or PINs can be misplaced and the wrist band is only available in a hospital environment.
Once again, the image capture capabilities of the invention can be used for capturing many biometric identifiers such as but not limited to fingerprint, face recognition, iris or retina recognition, and vein pattern recognition through techniques well known in the art.
There are many biometric capture devices available on the market that are designed to be integrated into OEM devices such as laptops. This includes fingerprint modules such as those available from AuthenTec (www.authentec.com) which can be integrated into the invention. This avoids the complexity of the user having to ‘take a picture’ of their finger and allows them to simply swipe their finger on the sensor.
While fingerprint has become a common method of biometric identification, any biometric identification device could be integrated into the invention to allow the user and patient to be identified.
Non-contact biometrics can be valuable in a medical situation. For example, a fingerprint would be difficult to read from the gloved hand of the practitioner, and the device would have to be cleaned between patients if it came in physical contact with them for a fingerprint swipe. Therefore, non-contact biometrics are very desirable. By adding a microphone and associated analog to digital circuitry, sound sampling can be included in the invention and voice recognition techniques well known in the art can be applied to user and patient identification.
The scanning speed of a moving laser and the resolution of the captured image are subject to many physical limitations such as the maximum speed of the mirror, and the bandwidth of the electronic circuitry. However, it is desirable in some embodiments to have a higher resolution of the image on a particular area of the body than the device is inherently capable of. One mechanism to overcome this limitation would be to reduce the area being scanned so that whatever the native resolution of the device is, it is being applied to a smaller physical area. This trades off scanning area vs. resolution.
An alternative embodiment of the device is to change the resolution of the scanned image in a particular area of the field of view. For example, since the practitioner would typically focus their attention on the center of the field of view, if the device can be made to have higher resolution at the center, the information content of the projected image would be higher at that most important point, without reducing the total field of view of the device. In a scanned laser implementation, this is done by varying the amplitude of the drive circuit so that the mirror moves less quickly in the center and more quickly outside of the center. The increase in dwell time at the center of the image provides higher resolution in that area. The decrease in dwell time outside the center would reduce the resolution in that area. The advantage of this over the previous embodiment is that the total field of view is maintained while still gaining an improvement in resolution of the targeted area.
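A numerical sketch of this non-uniform drive follows: the commanded mirror angle is a cubic, rather than linear, function of time, so the mirror sweeps slowly near the center of the field and quickly at the edges; sampling at a fixed clock rate then yields more pixels per degree in the center. The exponent and field angle are illustrative parameters, not values from the specification.

```python
import numpy as np

def scan_angles(samples_per_line=512, full_angle_deg=30.0, exponent=3):
    t = np.linspace(-1.0, 1.0, samples_per_line)                  # uniform sample clock
    return (full_angle_deg / 2.0) * np.sign(t) * np.abs(t) ** exponent  # dense near 0 deg

angles = scan_angles()
center = np.sum(np.abs(angles) < 5.0)    # samples landing in the central 10 degrees
edge = np.sum(np.abs(angles) > 10.0)     # samples landing in the outer 10 degrees
print(center, edge)                      # center count far exceeds a uniform scan's share
```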
Most medical practitioners will have the need to draw blood or otherwise stick the patient in their normal course of practice. However, for many of these practitioners this is not done on a regular basis. For example, in a general practitioner's office this might be only two to three times a day. This presents a dilemma. On one hand, they don't do the procedure often enough to become very skilled, and would therefore gain an extraordinary benefit from a vein location device, but on the other hand they don't perform the procedure often enough to cost justify the acquisition of such a device.
Records of usage can be:
To overcome this dilemma, the device can be modified to provide a per-use licensing model. This can be approached by keeping track of the number of times the device is activated in memory that is built in to the device. Various schemes can be used to activate the device such as:
In the simplest embodiment, a vein scanner is a stand alone device that simply detects and projects a vein pattern back on to the skin of the patient. In this embodiment, no data about the process or the patient is retained for future use. In an enhanced embodiment, the device can be used to collect, store, manipulate and transmit information about the user, the usage of the device, and the patient. Also, updated control software and firmware can be uploaded to the device. In addition, data can be transferred into the scanner that can be used as previously described to manage the processes associated with venipuncture.
The transfer of information can either be batched, so that the device is connected to a computer system for upload and download when needed but is unconnected in normal operation, or the device can be always connected to a computer system through either a wired or wireless connection. There are advantages, disadvantages and costs associated with each of these techniques that are well known.
While a bi-directional connection that allows the systems to acknowledge that communications have been properly received is preferred, a one-way communication scheme can be used for simple embodiments.
The device can be connected to a computer system using a cable, such as is commonly seen in bar code scanners at store checkouts. The advantage of this implementation is that it is typically low cost, doesn't require that a battery be in the handheld, and eliminates the need to charge a battery. The disadvantage is that the unit cannot be moved far from the computer and the cable can get in the way and easily breaks. Approaches well known in the art, such as RS232, USB or Ethernet, can be used as the transport medium for cabled communication.
A wireless implementation eliminates the problems associated with a cable, with the tradeoff that a battery is needed in the device. Wireless can be implemented with well understood approaches such as point-to-point proprietary protocols, as seen in remote control key fobs, or standard protocols such as Bluetooth or WiFi.
Since the device contains both a means of measuring light with its photo detector and emitting light with its lasers, these devices can also be used to transmit and receive data. Through the use of modulation schemes that are well known in the art, the laser can be used to transmit data to a remote receiver that could be integrated into the cradle or into a stand alone receiver. This receiver would have a photo detector system capable of detecting and decoding the modulated laser light. It would also have an output system such as an LED or LEDs that would transmit a modulated light signal at a wavelength that can be detected by the photo detector in the handheld. In this manner a two-way communication session can be established.
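A toy sketch of one such modulation scheme is shown below, using simple on/off keying: each byte is framed with a start and stop bit and sent as laser on/off intervals of fixed duration. A real link would add clock recovery and error checking; set_laser() is a hypothetical hardware call, not part of the described device.

```python
import time

def send_byte(value, set_laser, bit_time_s=0.001, sleep=time.sleep):
    """Transmit one byte by on/off keying the projection laser (illustrative only)."""
    bits = [1] + [(value >> i) & 1 for i in range(7, -1, -1)] + [0]  # start, data MSB-first, stop
    for bit in bits:
        set_laser(bool(bit))     # modulate the laser
        sleep(bit_time_s)
    set_laser(False)             # idle with the laser off
```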
Memory can be added to the system so that system events can be captured and system control information can be kept in that memory. From time to time, the information collected will need to be sent to another computer or new control information will need to be downloaded to the scanner. Any of the communication techniques previously described can be used on an ad-hoc basis to connect the scanner to a computer and to transfer the information. When not transferring information, the scanner is disconnected from the computer system and continues to operate in a stand alone mode.
In many field medical situations, there is limited ambient light available for performing procedures such as venipuncture. One mechanism that a practitioner would use to improve lighting would be to use a handheld flashlight to illuminate a specific area of the patient's body. One embodiment of the invention integrates the scanning capability into a flashlight; the inverse is to integrate a flashlight into the device.
One embodiment is that the invention be used such that the intensity of the projected field is bright enough to illuminate the body area without the need of additional lighting during the venipuncture process. This can be implemented so that the intensity is controllable by the user with the same hand that is holding the scanner. By varying the intensity of both the dark areas that represent veins and the light areas that represent the spaces between veins, and keeping the contrast between the two sufficiently different, the field of work can be kept well lit while maintaining the ability to discern the vein positions.
Most of the embodiments of the invention have been described as stand-alone devices. However, since the scan engine of the invention can be miniaturized, it is uniquely suited to be integrated into other devices already in use by the practitioner. For example, the scan engine can be built into an otoscope such that it becomes a multi-function device for both looking into shaded areas of the body such as the ear canal and then used for vein location. The advantages to the user are that they have a single device to buy and manage and a single battery can be used.
Another device that the invention can be integrated with is a flashlight. Depending on the application, the scanner can be integrated so that the light beam and the scan/projection field are aligned along the same path. In this manner, the area of the body is illuminated by the flashlight and the vein pattern is viewable in the same field. The laser projection will be bright enough for most applications, but the flashlight/scanner combo can contain controls that alternately switch between scan/project only, light only, both, or other combinations of those modes. The intensity of both the flashlight and the projector can be controlled in these modes to ensure that the pattern remains visible without becoming over-bright or too dim.
An alternative approach is to add lighting capabilities to the scanner. Finer control over the modes of the device can be accomplished in this way. In a preferred embodiment, the scanner is implemented so that multiple lasers allow any color to be projected on to the body including white. In this way, the field of view can be illuminated in white light and the other colors can be used to identify the position of the veins and to project data into the field of view. Since there is no competing illumination from a flashlight bulb, the image will be easier to read, while still illuminating the body part.
A further enhancement of the invention would be to implement a non-linear scanning pattern so that the illuminated area can be made arbitrarily wide, thereby increasing the device's utility as a flashlight while maintaining a good vein image in the center of the scan area.
A user interface can be implemented that allows the practitioner to switch between these modes and to modify the parameters such as intensity, field of view and field of high resolution. This user interface can be either controls for user input or based on the distance to the target area or both.
The use of robots to provide fine control of surgical tools is well known in the art. In U.S. Pat. No. 7,155,316, a robot system for use in surgical procedures is described. In U.S. Pat. No. 7,158,859, a mobile robot that provides the ability to remotely control the robot and to see the patient, the patient surroundings and the tools in use is described.
In a normal patient encounter for venipuncture, the practitioner would use several senses to locate a vein and then to perform venipuncture. This would include looking for both visible and tactile indicators of the vein position. The invention enhances the visible feedback of the position of the veins. By using the visual position enhancement, the ability to use a tele-robot to perform venipuncture is enhanced.
The vein scanner engine can be integrated into one of these robots so that the scanning/projection field is positioned in alignment with the end of a robotic arm which contains the needle or catheter for the venipuncture procedure. The engine can be mounted directly to the arm that contains the needle or it can be mounted to a separately controllable arm.
The engine can work in one of three modes. It can work as it normally does such that the acquisition of the vein pattern and the projection of the vein pattern is directly on the patient's skin. In this mode, the cameras that are already on the robot can view the pattern on the skin and send that back to the remote operator on the screen that is already provided for the remote operator.
In a second mode, the detected vein pattern is captured and then rather than projecting the pattern back on to the patient, the pattern is sent to the remote practitioner and is shown to them on a screen. In this mode, there are several possible embodiments. One is that the pattern of veins can be overlaid on the video display of the patient's body. In this manner, what they see on the screen is very much like what they would see on the body. The advantage of this is that the image from the vein scanner can be enhanced and made brighter than it would be in the first mode.
In a third mode, a combination of the first two modes can be used that allows the practitioner to switch between the modes or for both modes to be used simultaneously on two different screens or windows on a single screen.
Referring to
Pulse oximetry is a non-invasive method of measuring arterial oxygen saturation. The use of an infrared light source and a red light source along with a photo detector in a finger clip arrangement to measure pulse and oxygen levels in blood are well known in the art. The basis for the measurement is the differential absorption between oxygenated hemoglobin and non-oxygenated hemoglobin of the two different wavelengths of light. The traditional technique for performing this measurement requires contact with the body and relies on the transmission of the light through an area of the body with a small cross section such as a finger or an earlobe. By alternating between the red and infrared light sources and measuring the level of the light that is passed through a body part, a waveform that is known as a photoplethysmograph (PPG) is captured. Through signal processing techniques that are well known in the art, oxygenation levels and pulse rates can be calculated.
Several articles and patents have been published that use non-contact techniques to measure the necessary waveforms by replacing the photo detector with a remote device such as a CMOS camera. U.S. Pat. No. 6,491,639 describes the use of light sources and a photo detector in the same plane, eliminating the need for a small cross section part of the body to be used for measurement of the necessary waveforms. This design relies on the internal reflections of the body such that the light is directed in to the body and then reflected out of the body rather than completely passing through a small cross section portion of the body.
In the article by Humphreys, K. et al., “A CMOS Camera-Based Pulse Oximetry Imaging System,” Proceedings of the 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference (2005), 7 Feb. 2007, a non-contact pulse oximetry system that uses a CMOS camera as the detector is described. In this design, the detector is held at a distance from the body and a bounding box is selected in the captured image. The average intensity of the bounding box is used as a proxy for the photo detector value that would be found in a contact pulse oximeter. By plotting the changing value of the average intensity of that location over time, a PPG is derived that can be used to measure oxygenation and pulse using the same techniques as previously described.
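A sketch of deriving PPG samples and an oxygenation estimate from two captured image streams (red and infrared), following the bounding-box averaging idea described above, is shown below. The linear SpO2 calibration used here (110 - 25R) is a textbook approximation, not the invention's calibration; production oximeters use empirically calibrated lookup tables.

```python
import numpy as np

def ppg_from_frames(frames, box):
    """frames: iterable of 2-D arrays; box: (x0, y0, x1, y1). Returns mean intensity per frame."""
    x0, y0, x1, y1 = box
    return np.array([frame[y0:y1, x0:x1].mean() for frame in frames])

def spo2_estimate(ppg_red, ppg_ir):
    ac_red, dc_red = ppg_red.std(), ppg_red.mean()   # pulsatile and baseline components
    ac_ir, dc_ir = ppg_ir.std(), ppg_ir.mean()
    r = (ac_red / dc_red) / (ac_ir / dc_ir)          # ratio of ratios
    return 110.0 - 25.0 * r                          # illustrative linear calibration
```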
Our micro vein enhancer invention uses scanned lasers to capture a pattern of veins below the surface of the skin and then re-projects a vein pattern image directly back on to the body. In one embodiment those laser light sources are visible red for projecting the image and infrared for detecting the veins. The vein detection relies on absorption characteristics related to those used in pulse oximetry, in that the infrared light is absorbed differently by hemoglobin than by the surrounding tissues.
Furthermore, we have previously described how a micro vein enhancer can be used to capture images based on the light transmitted by the lasers and reflected back to the micro vein enhancer's photo detector circuitry. Through the combination of these techniques, a miniature vein enhancer can be extended to read pulse and oxygen levels without contact with the patient.
Referring to
A further mechanism is to project using both the visible red and infrared lasers in an alternating pattern. The alternation can be handled in multiple ways. Possibilities include time slicing the lasers so that short pulses of each laser are alternated while the laser scans. Alternatively, the lasers can be switched between scan lines, scanning right to left with one laser and left to right with the other laser. Alternately, the lasers can be switched on some multiple of scan lines or on alternate complete passes of the scan area.
Referring to
Referring back to
Since the preferred embodiment is a handheld device, the motion of the device in relation to the patient must be allowed for. In the simplest embodiment, since the device breaks the scan area down into a series of subsections, generates the PPG from those subsections, and would see similar return signals from each of them, the motion can be ignored. However, if the motion becomes too severe, or if the area being imaged is heterogeneous, then the system can use digital or optical image stabilization well known in the art to maintain alignment from frame to frame. An alternative method would be to keep PPGs from all of the subsections; then, as the hand moves, the system would use the PPGs from areas that still remain in view.
Many commercially available exercise machines such as treadmills and bicycles provide the capability to measure pulse rate. Knowledge of the current pulse rate is very valuable for managing exercise intensity. One commonly seen embodiment relies on hand contact with electrodes that are used to measure heart rate. In many situations, such as running, it is difficult or dangerous to maintain contact with the electrodes. Another common embodiment uses a chest strap that measures heart rate and then wirelessly transmits it to the exercise machine so that the value can be displayed. This requires that the user acquire this separate device in order for pulse rate to be displayed. In a preferred embodiment of the invention, a capture only scan engine is mounted in proximity to the user with the scanning pattern oriented such that it strikes an uncovered portion of the body such as the hands, arms, neck or face. As described previously, the PPG waveform can be captured and the pulse rate can be fed back into the exercise machine and displayed to the user.
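A minimal sketch of extracting a pulse rate from the captured PPG for display on the exercise machine, using simple peak detection, is shown below. The sample rate and peak spacing limits are illustrative assumptions; scipy's find_peaks is used for brevity.

```python
import numpy as np
from scipy.signal import find_peaks

def pulse_rate_bpm(ppg, sample_rate_hz):
    """Estimate beats per minute from a PPG waveform sampled at sample_rate_hz."""
    ppg = np.asarray(ppg, dtype=float)
    ppg = ppg - ppg.mean()                               # remove the DC component
    min_gap = int(sample_rate_hz * 60.0 / 220.0)         # assume no faster than 220 bpm
    peaks, _ = find_peaks(ppg, distance=max(min_gap, 1))
    if len(peaks) < 2:
        return None                                      # not enough beats captured yet
    mean_interval_s = np.mean(np.diff(peaks)) / sample_rate_hz
    return 60.0 / mean_interval_s
```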
Referring to
This application is a continuation of U.S. patent application Ser. No. 15/096,518, which is a continuation of U.S. patent application Ser. No. 13/910,257, now issued as U.S. Pat. No. 9,345,427, which is a continuation of U.S. patent application Ser. No. 11/807,359, now issued as U.S. Pat. No. 8,489,178, which claims priority on U.S. Provisional Patent Application Ser. No. 60/898,506, filed Jan. 31, 2007, and on U.S. Provisional Patent Application Ser. No. 60/817,623 filed Jun. 29, 2006, and which parent application is also a continuation-in-part of U.S. patent application Ser. No. 11/700,729 filed Jan. 31, 2007, and U.S. patent application Ser. No. 11/478,322, filed on Jun. 29, 2006. The disclosures of each of these applications are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
3136310 | Meltzer | Jun 1964 | A |
3349762 | Kapany | Oct 1967 | A |
3511227 | Johnson | May 1970 | A |
3527932 | Thomas | Sep 1970 | A |
3818129 | Yamamoto | Jun 1974 | A |
3984629 | Gorog | Oct 1976 | A |
4030209 | Dreiding | Jun 1977 | A |
4057784 | Tafoya | Nov 1977 | A |
4109647 | Stern et al. | Aug 1978 | A |
4162405 | Chance et al. | Jul 1979 | A |
4182322 | Miller | Jan 1980 | A |
4185808 | Donohoe et al. | Jan 1980 | A |
4213678 | Pomerantzeff et al. | Jul 1980 | A |
4265227 | Ruge | May 1981 | A |
4312357 | Andersson et al. | Jan 1982 | A |
4315318 | Kato et al. | Feb 1982 | A |
4321930 | Jobsis et al. | Mar 1982 | A |
4393366 | Hill | Jul 1983 | A |
4495949 | Stoller | Jan 1985 | A |
4502075 | DeForest et al. | Feb 1985 | A |
4510938 | Jobsis et al. | Apr 1985 | A |
4536790 | Kruger et al. | Aug 1985 | A |
4565968 | Macovski | Jan 1986 | A |
4567896 | Barnea et al. | Feb 1986 | A |
4576175 | Epstein | Mar 1986 | A |
4590948 | Nilsson | Mar 1986 | A |
4586190 | Tsuji | Apr 1986 | A |
4596254 | Adrian et al. | Jun 1986 | A |
4619249 | Landry | Oct 1986 | A |
4669467 | Willett et al. | Jun 1987 | A |
4697147 | Moran et al. | Sep 1987 | A |
4699149 | Rice | Oct 1987 | A |
4703758 | Omura | Nov 1987 | A |
4766299 | Tierney et al. | Aug 1988 | A |
4771308 | Tejima et al. | Sep 1988 | A |
4780919 | Harrison | Nov 1988 | A |
4799103 | Muckerheide | Jan 1989 | A |
4817622 | Pennypacker et al. | Apr 1989 | A |
4846183 | Martin | Jul 1989 | A |
4861973 | Hellekson et al. | Aug 1989 | A |
4862894 | Fujii | Sep 1989 | A |
4899756 | Sonek | Feb 1990 | A |
4901019 | Wedeen | Feb 1990 | A |
4926867 | Kanda et al. | May 1990 | A |
RE33234 | Landry | Jun 1990 | E |
4938205 | Nudelman | Jul 1990 | A |
5074642 | Hicks | Dec 1991 | A |
5088493 | Giannini et al. | Feb 1992 | A |
5103497 | Hicks | Apr 1992 | A |
5146923 | Dhawan | Sep 1992 | A |
5174298 | Dolfi et al. | Dec 1992 | A |
5184188 | Bull et al. | Feb 1993 | A |
5214458 | Kanai | May 1993 | A |
5222495 | Clarke et al. | Jun 1993 | A |
5261581 | Harden, Sr. | Nov 1993 | A |
5293873 | Fang | Mar 1994 | A |
5339817 | Nilsson | Aug 1994 | A |
5371347 | Plesko | Dec 1994 | A |
5406070 | Edgar et al. | Apr 1995 | A |
5418546 | Nakagakiuchi et al. | May 1995 | A |
5423091 | Lange | Jun 1995 | A |
5436655 | Hiyama et al. | Jul 1995 | A |
5445157 | Adachi et al. | Aug 1995 | A |
D362910 | Creaghan | Oct 1995 | S |
5485530 | Lakowicz et al. | Jan 1996 | A |
5487740 | Sulek et al. | Jan 1996 | A |
5494032 | Robinson et al. | Feb 1996 | A |
5497769 | Gratton et al. | Mar 1996 | A |
5504316 | Bridgelall et al. | Apr 1996 | A |
5519208 | Esparza et al. | May 1996 | A |
5541820 | McLaughlin | Jul 1996 | A |
5542421 | Erdman | Aug 1996 | A |
5598842 | Ishihara et al. | Feb 1997 | A |
5603328 | Zucker et al. | Feb 1997 | A |
5608210 | Esparza et al. | Mar 1997 | A |
5610387 | Bard et al. | Mar 1997 | A |
5625458 | Alfano et al. | Apr 1997 | A |
5631976 | Bolle et al. | May 1997 | A |
5655530 | Messerschmidt | Aug 1997 | A |
5678555 | O'Connell | Oct 1997 | A |
5716796 | Bull et al. | Feb 1998 | A |
5719399 | Alfano et al. | Feb 1998 | A |
5740801 | Branson | Apr 1998 | A |
5747789 | Godik | May 1998 | A |
5756981 | Roustaei et al. | May 1998 | A |
5758650 | Miller et al. | Jun 1998 | A |
5772593 | Hakamata | Jun 1998 | A |
5787185 | Clayden | Jul 1998 | A |
5814040 | Nelson et al. | Sep 1998 | A |
5836877 | Zavislan | Nov 1998 | A |
5847394 | Alfano et al. | Dec 1998 | A |
5860967 | Zavislan et al. | Jan 1999 | A |
5929443 | Alfano et al. | Jul 1999 | A |
5946220 | Lemelson | Aug 1999 | A |
5947906 | Dawson, Jr. et al. | Sep 1999 | A |
5966204 | Abe | Oct 1999 | A |
5966230 | Swartz et al. | Oct 1999 | A |
5969754 | Zeman | Oct 1999 | A |
5982553 | Bloom et al. | Nov 1999 | A |
5988817 | Mizushima et al. | Nov 1999 | A |
5995856 | Mannheimer et al. | Nov 1999 | A |
5995866 | Lemelson | Nov 1999 | A |
6006126 | Cosman | Dec 1999 | A |
6032070 | Flock et al. | Feb 2000 | A |
6056692 | Schwartz | May 2000 | A |
6061583 | Ishihara et al. | May 2000 | A |
6083486 | Weissleder et al. | Jul 2000 | A |
6101036 | Bloom | Aug 2000 | A |
6113536 | Aboul-Hosn et al. | Sep 2000 | A |
6122042 | Wunderman et al. | Sep 2000 | A |
6132379 | Patacsil et al. | Oct 2000 | A |
6135599 | Fang | Oct 2000 | A |
6141985 | Cluzeau et al. | Nov 2000 | A |
6142650 | Brown et al. | Nov 2000 | A |
6149061 | Massieu et al. | Nov 2000 | A |
6149644 | Xie | Nov 2000 | A |
6171301 | Nelson et al. | Jan 2001 | B1 |
6178340 | Svetliza | Jan 2001 | B1 |
6179260 | Ohanian | Jan 2001 | B1 |
6230046 | Crane et al. | May 2001 | B1 |
6240309 | Yamashita et al. | May 2001 | B1 |
6251073 | Imran et al. | Jun 2001 | B1 |
6263227 | Boggett et al. | Jul 2001 | B1 |
6272376 | Marcu et al. | Aug 2001 | B1 |
6301375 | Choi | Oct 2001 | B1 |
6305804 | Rice et al. | Oct 2001 | B1 |
6314311 | Williams et al. | Nov 2001 | B1 |
6334850 | Amano et al. | Jan 2002 | B1 |
6353753 | Flock et al. | Mar 2002 | B1 |
6424858 | Williams | Jul 2002 | B1 |
6436655 | Bull et al. | Aug 2002 | B1 |
6438396 | Cook et al. | Aug 2002 | B1 |
6463309 | Ilia | Oct 2002 | B1 |
6464646 | Shalom et al. | Oct 2002 | B1 |
6523955 | Eberl et al. | Feb 2003 | B1 |
6542246 | Toida | Apr 2003 | B1 |
6556854 | Sato et al. | Apr 2003 | B1 |
6556858 | Zeman | Apr 2003 | B1 |
6599247 | Stetten | Jul 2003 | B1 |
6631286 | Pfeiffer et al. | Oct 2003 | B2 |
6648227 | Swartz et al. | Nov 2003 | B2 |
6650916 | Cook et al. | Nov 2003 | B2 |
6689075 | West | Feb 2004 | B2 |
6690964 | Bieger et al. | Feb 2004 | B2 |
6702749 | Paladini et al. | Mar 2004 | B2 |
6719257 | Greene et al. | Apr 2004 | B1 |
6755789 | Stringer et al. | Jun 2004 | B2 |
6777199 | Bull et al. | Aug 2004 | B2 |
6782161 | Barolet et al. | Aug 2004 | B2 |
6845190 | Smithwick et al. | Jan 2005 | B1 |
6882875 | Crowley | Apr 2005 | B1 |
6889075 | Marchitto et al. | May 2005 | B2 |
6913202 | Tsikos et al. | Jul 2005 | B2 |
6923762 | Creaghan, Jr. | Aug 2005 | B1 |
6980852 | Jersey-Willuhn et al. | Dec 2005 | B2 |
7092087 | Kumar et al. | Aug 2006 | B2 |
7113817 | Winchester, Jr. et al. | Sep 2006 | B1 |
7158660 | Gee, Jr. et al. | Jan 2007 | B2 |
7158859 | Wang et al. | Jan 2007 | B2 |
7204424 | Yavid et al. | Apr 2007 | B2 |
7225005 | Kaufman et al. | May 2007 | B2 |
7227611 | Hull et al. | Jun 2007 | B2 |
7239909 | Zeman | Jul 2007 | B2 |
7247832 | Webb | Jul 2007 | B2 |
7280860 | Ikeda et al. | Oct 2007 | B2 |
7283181 | Allen et al. | Oct 2007 | B2 |
7302174 | Tan et al. | Nov 2007 | B2 |
7333213 | Kempe | Feb 2008 | B2 |
D566283 | Brafford et al. | Apr 2008 | S |
7359531 | Endoh et al. | Apr 2008 | B2 |
7376456 | Marshik-Geurts et al. | May 2008 | B2 |
7428997 | Wiklof et al. | Sep 2008 | B2 |
7431695 | Creaghan | Oct 2008 | B1 |
7448995 | Wiklof et al. | Nov 2008 | B2 |
7532746 | Marcotte et al. | May 2009 | B2 |
7545837 | Oka | Jun 2009 | B2 |
7559895 | Stetten et al. | Jul 2009 | B2 |
7579592 | Kaushal | Aug 2009 | B2 |
7608057 | Woehr et al. | Oct 2009 | B2 |
7699776 | Walker et al. | Apr 2010 | B2 |
7708695 | Akkermans et al. | May 2010 | B2 |
7792334 | Cohen et al. | Sep 2010 | B2 |
7846103 | Cannon, Jr. et al. | Dec 2010 | B2 |
7848103 | Zhan | Dec 2010 | B2 |
7904138 | Goldman et al. | Mar 2011 | B2 |
7904139 | Chance | Mar 2011 | B2 |
7925332 | Crane et al. | Apr 2011 | B2 |
7966051 | Xie et al. | Jun 2011 | B2 |
8032205 | Mullani | Oct 2011 | B2 |
8078263 | Zeman et al. | Dec 2011 | B2 |
8187189 | Jung et al. | May 2012 | B2 |
8199189 | Kagenow et al. | Jun 2012 | B2 |
8320998 | Sato | Nov 2012 | B2 |
8336839 | Boccoleri et al. | Dec 2012 | B2 |
8364246 | Thierman | Jan 2013 | B2 |
8467855 | Yasui | Jun 2013 | B2 |
8480662 | Stolen et al. | Jul 2013 | B2 |
8494616 | Zeman | Jul 2013 | B2 |
8498694 | McGuire, Jr. et al. | Jul 2013 | B2 |
8509495 | Xu et al. | Aug 2013 | B2 |
8537203 | Seibel et al. | Sep 2013 | B2 |
8548572 | Crane | Oct 2013 | B2 |
8630465 | Wieringa et al. | Jan 2014 | B2 |
8649848 | Crane et al. | Feb 2014 | B2 |
20010006426 | Son et al. | Jul 2001 | A1 |
20010056237 | Cane et al. | Dec 2001 | A1 |
20020016533 | Marchitto | Feb 2002 | A1 |
20020111546 | Cook | Aug 2002 | A1 |
20020118338 | Kohayakawa | Aug 2002 | A1 |
20020188203 | Smith et al. | Dec 2002 | A1 |
20030018271 | Kimble | Jan 2003 | A1 |
20030037375 | Riley et al. | Feb 2003 | A1 |
20030052105 | Nagano et al. | Mar 2003 | A1 |
20030120154 | Sauer et al. | Jun 2003 | A1 |
20030125629 | Ustuner | Jul 2003 | A1 |
20030156260 | Putilin et al. | Aug 2003 | A1 |
20040015062 | Ntziachristos et al. | Jan 2004 | A1 |
20040015158 | Chen et al. | Jan 2004 | A1 |
20040022421 | Endoh et al. | Feb 2004 | A1 |
20040046031 | Knowles et al. | Mar 2004 | A1 |
20040071322 | Choshi | Apr 2004 | A1 |
20040171923 | Kalafut et al. | Sep 2004 | A1 |
20040186357 | Soderberg | Sep 2004 | A1 |
20040222301 | Willins et al. | Nov 2004 | A1 |
20040237051 | Clauson | Nov 2004 | A1 |
20050017924 | Utt et al. | Jan 2005 | A1 |
20050033145 | Graham et al. | Feb 2005 | A1 |
20050043596 | Chance | Feb 2005 | A1 |
20050047134 | Mueller et al. | Mar 2005 | A1 |
20050085732 | Sevick-Muraca et al. | Apr 2005 | A1 |
20050085802 | Gruzdev et al. | Apr 2005 | A1 |
20050113650 | Pacione et al. | May 2005 | A1 |
20050131291 | Floyd et al. | Jun 2005 | A1 |
20050135102 | Gardiner et al. | Jun 2005 | A1 |
20050141069 | Wood et al. | Jun 2005 | A1 |
20050143662 | Marchitto et al. | Jun 2005 | A1 |
20050146765 | Turner et al. | Jul 2005 | A1 |
20050154303 | Walker et al. | Jul 2005 | A1 |
20050157939 | Arsenault et al. | Jul 2005 | A1 |
20050161051 | Pankratov et al. | Jul 2005 | A1 |
20050168980 | Dryden et al. | Aug 2005 | A1 |
20050174777 | Cooper et al. | Aug 2005 | A1 |
20050175048 | Stern et al. | Aug 2005 | A1 |
20050187477 | Serov et al. | Aug 2005 | A1 |
20050215875 | Khou | Sep 2005 | A1 |
20050265586 | Rowe et al. | Dec 2005 | A1 |
20050281445 | Marcotte et al. | Dec 2005 | A1 |
20060007134 | Ting | Jan 2006 | A1 |
20060020212 | Xu et al. | Jan 2006 | A1 |
20060025679 | Viswanathan et al. | Feb 2006 | A1 |
20060052690 | Sirohey et al. | Mar 2006 | A1 |
20060081252 | Wood | Apr 2006 | A1 |
20060100523 | Ogle et al. | May 2006 | A1 |
20060103811 | May et al. | May 2006 | A1 |
20060122515 | Zeman et al. | Jun 2006 | A1 |
20060129037 | Kaufman et al. | Jun 2006 | A1 |
20060129038 | Zelenchuk et al. | Jun 2006 | A1 |
20060151449 | Warner, Jr. et al. | Jul 2006 | A1 |
20060173351 | Marcotte et al. | Aug 2006 | A1 |
20060184040 | Keller et al. | Aug 2006 | A1 |
20060206027 | Malone | Sep 2006 | A1 |
20060232660 | Nakajima et al. | Oct 2006 | A1 |
20060253010 | Brady et al. | Nov 2006 | A1 |
20060271028 | Altshuler et al. | Nov 2006 | A1 |
20060276712 | Stothers | Dec 2006 | A1 |
20070015980 | Numada et al. | Jan 2007 | A1 |
20070016079 | Freeman et al. | Jan 2007 | A1 |
20070070302 | Govorkov et al. | Mar 2007 | A1 |
20070115435 | Rosendaal | May 2007 | A1 |
20070129634 | Hickey et al. | Jun 2007 | A1 |
20070176851 | Willey et al. | Aug 2007 | A1 |
20070238957 | Yared | Oct 2007 | A1 |
20080045841 | Wood et al. | Feb 2008 | A1 |
20080147147 | Griffiths et al. | Jun 2008 | A1 |
20080194930 | Harris et al. | Aug 2008 | A1 |
20080214940 | Benaron | Sep 2008 | A1 |
20090018414 | Toofan | Jan 2009 | A1 |
20090082629 | Dotan | Mar 2009 | A1 |
20090171205 | Kharin et al. | Jul 2009 | A1 |
20100051808 | Zeman et al. | Mar 2010 | A1 |
20100061598 | Seo | Mar 2010 | A1 |
20100087787 | Woehr et al. | Apr 2010 | A1 |
20100177184 | Berryhill et al. | Jul 2010 | A1 |
20100312120 | Meier | Dec 2010 | A1 |
20110275932 | Leblond et al. | Nov 2011 | A1 |
20130147916 | Bennett et al. | Jun 2013 | A1 |
20140039309 | Harris et al. | Feb 2014 | A1 |
20140046291 | Harris et al. | Feb 2014 | A1 |
20140194747 | Kruglick | Jul 2014 | A1 |
Number | Date | Country |
---|---|---|
2289149 | May 1976 | FR |
1298707 | Dec 1972 | GB |
1507329 | Apr 1978 | GB |
S60-108043 | Jun 1985 | JP |
04-042944 | Feb 1992 | JP |
07-255847 | Oct 1995 | JP |
08-023501 | Jan 1996 | JP |
08-164123 | Jun 1996 | JP |
2000-316866 | Nov 2000 | JP |
2002-328428 | Nov 2002 | JP |
2002-345953 | Dec 2002 | JP |
2004-237051 | Aug 2004 | JP |
2004-329786 | Nov 2004 | JP |
2006-102360 | Apr 2006 | JP |
2003-0020152 | Mar 2003 | KR |
WO 1994 22370 | Oct 1994 | WO |
WO 1996 39925 | Dec 1996 | WO |
WO 1998 26583 | Jun 1998 | WO |
WO 1999 48420 | Sep 1999 | WO |
WO 2001-82786 | Nov 2001 | WO |
WO 2003-009750 | Feb 2003 | WO |
WO 2005-053773 | Jun 2005 | WO |
WO 2007-078447 | Jul 2007 | WO |
Entry |
---|
Wiklof, Chris, “Display Technology Spawns Laser Camera,” LaserFocusWorld, Dec. 1, 2004, vol. 40, Issue 12, PennWell Corp., USA. |
Nikbin, Darius, “IPMS Targets Colour Laser Projectors,” Optics & Laser Europe, Mar. 2006, Issue 137, p. 11. |
http://sciencegeekgirl.wordpress.com/category/science-myths/page/2/ Myth 7: Blood is Blue. |
http://www.exploratorium.edu/sports/hnds_up/hands6.html “Hands Up! To Do & Notice: Getting the Feel of Your Hand”. |
http://www.wikihow.com/See-Blook-Veins-in-Your-Hand-With-a-Flashlight “How to See Blood Veins in Your Hand With a Flashlight”. |
Number | Date | Country | |
---|---|---|---|
20200114092 A1 | Apr 2020 | US |
Number | Date | Country | |
---|---|---|---|
60898506 | Jan 2007 | US | |
60817623 | Jun 2006 | US | |
60757704 | Jan 2006 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15096518 | Apr 2016 | US |
Child | 16674212 | US | |
Parent | 13910257 | Jun 2013 | US |
Child | 15096518 | US | |
Parent | 11807359 | May 2007 | US |
Child | 13910257 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 11700729 | Jan 2007 | US |
Child | 11807359 | US | |
Parent | 11478322 | Jun 2006 | US |
Child | 11700729 | US |