This invention relates generally to the field of blood loss management and more specifically to a new and useful system and method for estimating an amount of a blood component in a volume of fluid in the field of blood loss management.
In one example, a system for assessing a fluid canister is provided, comprising a mounting structure with a canister recess and an imaging device recess, an inter-recess wall between the canister recess and the imaging device recess, a scale coupled to the mounting structure and configured with at least one measurement element in communication with the canister recess, and a scale communication module configured to transmit weight information from the scale to a computing device. The measurement element may comprise a piezoelectric element. The imaging device recess may comprise a data interface in wired communication with the communication module. The system may further comprise a first aperture located in the inter-recess wall. The first aperture may include a window and a seal between the window and the inter-recess wall. The system may further comprise a second aperture located in the inter-recess wall. The inter-recess wall may comprise a curved portion with a concave surface facing the canister recess. The inter-recess wall may further comprise a flat portion facing the imaging device recess. The canister recess may comprise a movable surface. The system may further comprise a fluid canister configured to removably reside in the canister recess, and a reflective insert configured to reside within the fluid canister. The inter-recess wall may comprise a first aperture located at a vertical height corresponding to the reflective insert when the reflective insert is placed at a bottom of the fluid canister and the fluid canister is fully seated in the canister recess. The fluid canister may have a frusto-conical shape. The inter-recess wall may have a vertical angle matching a frusto-conical angle of the fluid canister. The system may further comprise an imaging device configured to be removably inserted into the imaging device recess. The imaging device may be a computing device comprising an imaging assembly configured to acquire canister images from a canister located in the canister recess and a processor configured to receive weight information from the communication module. The processor may be further configured to acquire a canister image with the imaging assembly upon detecting a weight change using the weight information. The computing device may further comprise a computing communication module configured to transmit the canister images and weight information from the computing device. The fluid canister may comprise an inlet and an outlet, wherein the outlet may be configured to be coupled to a vacuum source. The computing device may be configured to acquire canister images at the same acquisition rate at which the processor is configured to acquire weight information. The acquisition rate may be in the range of about one acquisition every 1 to 5 seconds.
In another example, a method of assessing a fluid canister is provided, comprising detecting the weight of a fluid canister attached to a vacuum system, generating an image of the fluid canister, and determining a hemoglobin value of the fluid canister using the image. The imaging may be initiated upon detecting a change in the weight of the fluid canister. The method may further comprise modifying the hemoglobin value using the weight. The method may further comprise draining the fluid canister, and setting a tare weight of the fluid canister after draining the fluid canister.
In still another example, a blood monitoring system is provided, comprising a canister, a mount, a weighing scale, an imaging system, and a processor, wherein the canister defines an internal volume and comprises a translucent section. The blood monitoring system may further comprise a reflective insert arranged within the internal volume and adjacent and offset from the translucent section. The mount may be configured to engage an exterior surface of the canister. The mount may define a first window configured to seal over the exterior surface of the canister proximal the translucent section. The mount may further define a second window adjacent the first window and configured to seal over the exterior surface of the canister proximal the translucent section. The first window may be substantially optically isolated from the second window. The weighing scale may be coupled to the mount and may be configured to output a signal corresponding to a weight of contents in the canister. The imaging system may comprise an optical emitter aligned with the first window and configured to illuminate the reflective insert through the translucent section of the canister. The imaging system may further comprise a camera aligned with the second window. The processor may be configured to transform an image captured by the camera into an estimated concentration of a blood component in a fluid within the canister and to estimate an amount of the blood component in the canister based on the estimated concentration of the blood component and an output of the weighing scale.
The following description of embodiments of the invention is not intended to limit the invention to these embodiments but rather to enable a person skilled in the art to make and use this invention. Variations, configurations, implementations, example implementations, and examples described herein are optional and are not exclusive to the variations, configurations, implementations, example implementations, and examples they describe. The invention described herein can include any and all permutations of these variations, configurations, implementations, example implementations, and examples.
Generally, the system 100 includes a canister 102 configured to collect and hold fluid, an optical emitter 128 that illuminates fluid in the canister 102, a camera 130 that captures images of illuminated fluid, and a processor 132 that transforms color values contained in images captured by the camera 130 into estimations of a quality of fluid contained in the canister 102, such as a concentration of total hemoglobin, free hemoglobin, whole red blood cells, or whole blood in the fluid in the canister 102. The system also includes a weighing scale 106, and the system 100 can generate an estimation of a mass or volume of one or more blood components in the canister by merging an output of the weighing scale 106 with a blood component concentration thus estimated from color values in an image of the canister 102. For example, an estimate of the total hemoglobin content of the fluid in the canister may be calculated by combining the concentration of a blood component estimated from the image with the volume of fluid derived from weight information from the scale.
In one particular example, as shown in
In other variations, the scale 106 may also be used to detect other activity relating to the canister 102 and/or the fluid in the canister 102. For example, removal of the canister 102 may be detected so that the processor 132 can store the last or final concentration and volume information from the removed canister 102 and reset any counter(s) or register(s) for measuring any new canister.
The system 100 can be integrated into a surgical suction system within an operating room, surgical or procedure suite, emergency room, medical clinic, or other medical or health-related setting. In particular, the system can interface with a primary canister and a suction wand in a surgical suction system to intermittently accumulate fluid collected with the suction wand, to capture an image of this fluid, to transform this image into an estimation of a quality of the fluid, and to then release its contents into the primary canister. For example, a vacuum pump 134 and regulator 136 coupled to a primary canister 138 can draw vacuum on the primary canister 138; the primary canister 138 can be fluidly coupled to the (intermediate) canister 102 of the system 100, and the suction wand 140 can be fluidly coupled to the (intermediate) canister 102 of the system 100 such that, when the vacuum pump 134 draws a vacuum on the primary canister 138, vacuum is communicated to the suction wand 140 via the (intermediate) canister 102 of the system 100. A nurse, anesthesiologist, surgeon, or other operator can thus manipulate the suction wand 140 to collect fluids from within and around a patient during a surgery and to dispense these fluids into the (intermediate) canister 102. The system 100 repeatedly captures and processes images of fluid in the canister 102 and samples the weighing scale 106 to generate updated fluid quality and quantity estimations throughout operations or procedures. In this example, once the (intermediate) canister 102 is full, its contents can be dispensed into the primary canister 138 for holding; the (intermediate) canister 102 can then be refilled via the suction wand 140 and its contents analyzed optically and/or by weight.
The system 100 can therefore be implemented in conjunction with a surgical wand and/or a primary (suction) canister within a surgical or other medical, clinical, or hospital setting to collect and image discrete volumes of blood and other bodily fluids. Components in the system that contact hazardous waste (e.g., blood, mucus, urine, etc.) can be disposable, and sensor and processing components of the system can be reusable. For example, the canister and the reflective insert can be used during a single operation or surgery and then disposed of, and the mount, weighing scale, imaging system, and processor can be installed on multiple canisters across multiple surgeries over time to optically analyze qualities of fluids captured in these one-time-use canisters.
In other examples, the suction system may be attached to other vacuum systems, such as a negative pressure wound therapy system or a chest tube system, or an indwelling surgical draining tube, for assessing the amount and/or type of fluid loss or accumulation at those anatomical sites.
As noted previously, the intermediate canister 102 defines an internal volume 112 and a translucent section 114 or sidewall; and may include a reflective insert 116 configured to be inserted or arranged within the internal volume 112 and adjacent and offset from the translucent section 114. Generally, the canister 102 defines a vessel configured to collect fluid over time, includes a translucent or transparent material through which the imaging system 126 can illuminate contents of the vessel and capture images of contents of the vessel, and may include a reflective insert 116 (or reflective surface) that reflects and spreads light output from the imaging system 126 across a local volume of fluid to be imaged. The reflective insert 116 may cooperate with the wall 142 of the canister 102 to constrain a local volume of fluid in the canister 102 to a relatively shallow depth such that the imaging system 126 can capture color data through substantially the full depth of this local volume of fluid, despite a concentration of red blood cells in the canister that may progressively block light transmission at greater depths. In some variations, the reflective insert 116 and the canister 102 may comprise recesses 195 and projections 197 configured to set the rotational orientation of the insert 116 and the canister 102.
In one implementation, the canister has a frusto-conical shape and is comprised of a substantially transparent polymer (e.g., polyethylene terephthalate, polymethyl methacrylate, polycarbonate, cellulose acetate butyrate) and may be configured to hold 3,000 milliliters of fluid. In other examples, the canister may have a capacity in the range of about 500 ml to 10,000 ml, or about 1,000 ml to about 5,000 ml, or about 1,000 ml to 3,000 ml. The reflective insert may be comprised of any suitable material, for example, a polymer (e.g., white nylon, polycarbonate, polyethylene, polymethyl methacrylate) structure configured to sit in, or couple to, the bottom of the canister. In this implementation, the canister can include an engagement feature in its base or in the wall of the vessel proximal its base and configured to retain the reflective insert. In other variations, the canister may comprise a polygonal shape, a cylindrical shape, or other shape, including one with at least one planar side surface or wall.
Alternatively, as depicted in
Referring back to
In the particular example in
The scale 106 may also be used to detect other activity relating to the canister 102. For example, removal of the canister 102 may be detected so that the processor 132 can store the last or final concentration and volume information from the removed canister 102 and reset any counter(s) or register(s) for measuring any new canister. Weight oscillations resulting from intermittent suctioning of fluid when the suction wand 140 is adjacent to a fluid-air interface may occur, and the processor may be configured to omit or correct for transient peaks in the detected weight.
The canister can also include a disposable agitator element configured to be remotely actuated by an agitator driver in the mount. For example, the canister 600 depicted in
In another example depicted in
In another variation, depicted in
The centrally spinning magnetic stirring element 622 may be configured to reside in a central recess 624 of a ring-shaped reflective insert 626. As shown in
In still another example depicted in
However, the canister can include any other suitable type of agitator element configured to be remotely actuated to stir or redistribute contents of the canister. Additional examples are provided below.
Referring back to
In one implementation, the imaging system 126 may include a camera 130 and a flash element or optical emitter 128 integrated into a standalone computing device, such as a smartphone, a tablet, or a personal media player. In this implementation, the computing device can execute a native image processing application that locally performs the method described below. The computing device can also include a display 180, opposite the camera 130 and the optical emitter 128, and configured to display or render a weight or volume of contents of the canister 102, a composition of fluid contained in the canister 102 (e.g., a concentration or volume fraction of hemoglobin, red blood cells, or whole blood, etc. in the canister); and/or notifications, such as a prompt to empty the canister if fluid in the canister is approaching a maximum fill level, a prompt to empty the canister or to stir the contents of the canister if sediment is obscuring the camera, or a prompt to salvage red blood cells from the contents of the canister, such as described below.
As shown in
The mount 104 is configured to receive and support the base 158 of the canister 102. In one example, the mount 104 defines a frusto-conical receptacle 184 sized to fit the canister 102, as shown in
In one implementation, the mount 104 comprises a computing device receptacle 192 that is configured to transiently receive a standalone computing device 131 (as described above) and to support the computing device 131 with its camera 130 and optical emitter 128 or flash element facing the canister 102, as shown in
However, the mount can define any other geometry or function in any other way to transiently couple the optical emitter and the camera to the canister, and vice versa. In some variations, the computing device receptacle may be a modular or adjustable receptacle, to permit the use of different computing devices with the system, e.g., an iOS, Android, Windows, or Linux tablet/cellphone, or camera system. In some other variations, a lens may be provided in the optical path of the second window corresponding to the camera 130. A lens may facilitate focused image capture, which may be used to detect and/or characterize sediment or other materials found in the canister.
The mount 104 defines a first window 120 configured to align with the optical emitter 128 and a second window 124 configured to align with the camera 130. In particular, the first window 120 is configured to pass light from the optical emitter 128 to the wall 142 of the canister 102, which passes light into fluid in the canister 102 and onto the reflective insert 116, thereby illuminating the fluid and the reflective insert 116; the second window 124 is configured to pass light reflected and refracted out of the wall 142 of the canister 102 by the reflective insert 116 and the fluid into the camera 130. The mount 104 can include a first seal 122a around a perimeter of a first side of the first window 120 and configured to seal the first window 120 against the exterior surface 118 of the canister 102 when the canister 102 is installed in the mount 104; a second seal 122b around a perimeter of the opposite side of the first window 120 and configured to seal the first window 120 against an exterior surface of the computing device 131—around the optical emitter 128—when the computing device 131 is installed in the mount 104, as shown in
Referring back to
In some embodiments, the processor may receive weight information from the scale in a continuous or a variable manner. The sampling rate for the weight may be in the range of about 1000 Hz to about once every 5 minutes, or about 60 Hz to about 1 Hz. In some variations, when the detected rate of fluid weight increase is higher or in a certain range, the sampling rate of the scale may be increased, as well as image capture rate or illumination rate of the imaging system.
The scale may also be used to indicate other states or events relating to canister use. For example, the complete unweighting of the scale, or reduction of weight below the tare weight of the canister, may be used to indicate removal of the canister. During use of the vacuum system, the detected weight may increase in a generally linear fashion while suctioning liquid material, but may exhibit some variation when suctioning mixtures of liquid and solid or semi-solid materials or tissue. The weight may also oscillate when the suction device is used at a liquid/air interface, and the processor of the system may be configured to detect such states and to wait for the oscillations to stop before reporting any weight changes.
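As an illustration of this settling logic only, the following is a minimal sketch in Python, assuming the scale is polled into a rolling buffer of readings in grams; the class name, thresholds, and window size are illustrative placeholders rather than values specified by the system described above.

```python
from collections import deque

class WeightMonitor:
    """Tracks recent scale readings and reports a weight change only after
    short-lived oscillations (e.g., suction at a liquid/air interface) have
    settled. Thresholds are illustrative placeholders."""

    def __init__(self, window_size=10, settle_tolerance_g=2.0, removal_threshold_g=-5.0):
        self.samples = deque(maxlen=window_size)   # recent weight samples in grams
        self.settle_tolerance_g = settle_tolerance_g
        self.removal_threshold_g = removal_threshold_g
        self.tare_g = 0.0

    def set_tare(self, grams):
        # e.g., after draining the canister, record the empty (tare) weight
        self.tare_g = grams

    def add_sample(self, grams):
        self.samples.append(grams)

    def canister_removed(self):
        # A reading well below the tare weight suggests the canister was removed.
        return bool(self.samples) and self.samples[-1] - self.tare_g < self.removal_threshold_g

    def is_settled(self):
        # Oscillations have stopped when recent samples agree within tolerance.
        if len(self.samples) < self.samples.maxlen:
            return False
        return max(self.samples) - min(self.samples) <= self.settle_tolerance_g

    def settled_weight(self):
        # Report the contents' weight (above tare) only once readings are stable.
        if not self.is_settled():
            return None
        return sum(self.samples) / len(self.samples) - self.tare_g
```

In practice, the scale might be polled at, say, 1 to 60 Hz, each reading fed to `add_sample`, and a weight change reported only when `settled_weight` returns a value.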
As noted previously, the system typically comprises a processor that may be configured to transform an image captured by the color camera into an estimated concentration of a blood component in a fluid within the canister and to estimate an amount of the blood component in the canister based on the estimated concentration of the blood component and an output of the weighing scale. Generally, the processor functions to locally execute one or more aspects of the method described below.
In the implementation described above in which the optical emitter 128 and the camera 130 are integrated into a standalone computing device 131, as shown in
In another variation, the camera, the optical emitter, the digital display, and the processor are integrated into the mount. However, the system can include any other integrated or discrete elements that cooperate to collect fluid from a suction wand, to weigh the fluid, to image the fluid, to transform images of the fluid into estimations of the quality of the fluid, and to generate estimations of the quantity of one or more blood components in the fluid over time.
Generally, one or more portions of the method may be executed locally by the system 100 described above to automatically capture an image of a (sub)volume of fluid contained in a canister 102 and to transform absolute color values in the image and/or color gradients across pixels in the image into a quantitative estimation of a concentration of a blood component in the canister 102. For example, the system 100 may transform an image into an estimation of a mass per unit volume of hemoglobin, a volume fraction of red blood cells, or a volume fraction of whole blood, etc. of fluid contained in the canister. In particular, the system 100 may actuate an optical emitter to illuminate the volume of fluid in Block S110, trigger a camera to capture an image in Block S120, and process the image to generate a blood component concentration estimation in Block S130. As noted previously, the light source or optical emitter and the camera or optical detector may be provided in the computing device 131, or may be integrated into the mount 104.
The method described herein may be executed locally by the system, e.g. the computing device 131, for estimating an amount of a blood component in a volume of fluid described above. However, portions of the method may additionally or alternatively be executed remotely from the system, such as by another local computing device connected to the system, by a local distributed network, or by a remote server.
The method may further comprise at S110 illuminating an insert 116 within a canister 102 according to an illumination schedule (e.g., using an optical emitter); and at S120, capturing an image of the insert 116 (e.g., using an optical detector). Generally, the system is configured to illuminate the reflective insert 116 within the canister 102—and therefore a volume of fluid between the reflective insert and the camera—and to capture an image of the illuminated volume of fluid located between the insert 116 and the wall 142 of the canister 102.
In one implementation, to capture an image of a volume of fluid in the canister, the system 100 powers on the optical emitter at a static, preset illumination power in S110, triggers the camera to capture an image in S120, and then deactivates the optical emitter. The power level may be in the range of 1 lumen to 1,000 lumens, or about 3 lumens to about 100 lumens, or about 3 lumens to about 50 lumens, or about 5 lumens to about 20 lumens, or about 15 lumens to about 30 lumens, or may be anywhere from 1% to 100%, or about 30% to about 100%, or about 70% to about 100% of the light source's maximum power.
In another implementation, to capture an image of a volume of fluid in the canister 102, the system 100 first activates the optical emitter at a select illumination power, such as by pulse-width modulating the optical emitter at a selected duty cycle, in order to achieve target brightness in an image subsequently captured by the camera. For example, the system 100 can pulse-width modulate the optical emitter at a frequency greater than a fastest shutter speed implemented by the camera (e.g., 500 Hz for a camera operable at a maximum shutter speed of 1/100 s). The system then triggers the camera to capture an image in Block S120, such as 0.002 seconds after the optical emitter is activated in Block S110. Once the image is recorded in Block S120, the system can deactivate the optical emitter.
In some other embodiments, the processor may be configured to initiate image capture upon a signal from the scale indicating a change in the weight of the canister contents.
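A minimal sketch of such weight-triggered capture follows, assuming the scale reading and the camera trigger are exposed as callables; `read_weight_g` and `capture_image` are hypothetical names, and the threshold and polling interval are illustrative placeholders.

```python
import time

def watch_for_inflow(read_weight_g, capture_image, threshold_g=5.0, poll_s=0.5):
    """Trigger image capture only when the canister contents change weight.
    `read_weight_g` and `capture_image` are assumed callables supplied by the
    surrounding system; 5 g and 0.5 s are placeholders."""
    last_weight = read_weight_g()
    while True:
        current = read_weight_g()
        if abs(current - last_weight) >= threshold_g:
            capture_image()          # acquire a canister image on weight change
            last_weight = current
        time.sleep(poll_s)
```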
In the foregoing implementation, the system 100 can progress through a set of duty cycles—such as down from 100% duty or up from 0% in 1%, 5%, 10%, 20% duty increments—and capture an image at each duty until the camera captures an image that meets one or more target color parameters, such as a lightest color limit, a darkest color limit, or target color gradient between the first region and the second region of the image. In one exemplary implementation, the system can increase the duty cycle of the optical emitter—starting at 0%—and capture an image for each duty cycle through the camera until a captured image contains a contiguous horizontal line of pixels containing less than a threshold number of black pixels or pixels darker than a threshold dark color value. For example, once an image is captured by the camera, the system can scan a single horizontal line of pixels centered vertically in the image and count a number of consecutive pixels (or a total number of pixels) along the scan line containing the color black or containing a color value less than (i.e., darker than) a threshold darkness value. In this example, if the number of consecutive pixels (or total number of pixels) along the scan line exceeds a threshold count, the system can reject the image, increase the duty cycle of the optical emitter, capture a subsequent image through the camera, and similarly process the subsequent image. The system can repeat this process until a final image with a number of consecutive pixels (or total number of pixels) along a scan line less than the threshold count is captured. The system can then process this final image in Block S130, as described below.
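The scan-line check and duty-cycle stepping described above might be sketched as follows, assuming the captured image is an RGB array and that `set_duty` and `capture` wrap the optical emitter and camera; the pixel thresholds and duty steps are illustrative, not values specified by the system.

```python
import numpy as np

def scan_line_too_dark(image_rgb, dark_threshold=40, max_dark_run=50):
    """Scan the horizontal row of pixels centered vertically in the image and
    test whether it contains a long run of pixels darker than `dark_threshold`
    (0-255 scale). Thresholds are illustrative placeholders."""
    row = image_rgb[image_rgb.shape[0] // 2]          # center scan line, shape (W, 3)
    dark = row.mean(axis=1) < dark_threshold          # per-pixel darkness test
    longest_run, run = 0, 0
    for is_dark in dark:
        run = run + 1 if is_dark else 0
        longest_run = max(longest_run, run)
    return longest_run > max_dark_run

def capture_with_stepped_duty(set_duty, capture, duty_steps=(0, 10, 20, 40, 60, 80, 100)):
    """Step the optical emitter's duty cycle upward from 0% and keep the first
    image whose center scan line is not dominated by dark pixels.
    `set_duty` and `capture` are assumed hardware wrappers."""
    for duty in duty_steps:
        set_duty(duty)
        image = capture()
        if not scan_line_too_dark(image):
            return image, duty        # suitable image plus the duty that produced it
    return None, None                 # no acceptable image at any duty
```

Returning the duty alongside the image mirrors the tagging of the selected image with its illumination parameters, as described below.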
In another exemplary implementation, the system can decrease the duty cycle of the optical emitter—starting from 100%—and capture an image for each duty cycle through the camera until a captured image contains a contiguous horizontal line of pixels containing less than a threshold number of white pixels or pixels lighter than a threshold light color value.
In the foregoing exemplary implementations, for a subsequent sampling period, the system can repeat the foregoing process, starting with a low duty cycle (e.g., 0%) or a high duty cycle (e.g., 100%) at the optical emitter and then increase or decrease the duty cycle, respectively, until a suitable image is captured at the camera. Alternatively, the system can begin a new imaging period by setting the optical emitter to implement a last duty cycle from the preceding imaging period. The system can then capture a first image in the new imaging period through the camera, either increase or decrease the duty cycle of the optical emitter if the first image contains an excess number of black or dark pixels or if the first image contains an excess number of white or light pixels, respectively, capture and process a subsequent image, and then repeat the foregoing until an image containing color data of suitable quality is achieved. In these implementations, the system can thus vary the illumination power output by the optical emitter and process images captured under various illumination powers in order to identify and record an image containing a suitable quality of color data that can be transformed into a quality (e.g., a blood component concentration) of a volume of fluid in the canister.
Additionally or alternatively, the system may set an illumination power (by setting a duty cycle) of the optical emitter and then vary the shutter speed of the camera—and therefore an exposure of an image captured with the camera—to achieve an image with a quality of color data suitable for transformation into a quality of the volume of fluid in the canister. For example, the system may operate the optical emitter at a duty cycle of 100%; decrease the shutter speed of the camera (e.g., from 1/200 s to 1/100 s, then 1/30 s, 1/20 s, 1/15 s, 1/12 s, etc.); and capture an image through the camera for each shutter speed until a captured image contains a contiguous horizontal line of pixels containing less than a threshold number of black pixels or pixels darker than a threshold dark color value.
In the foregoing implementations, the system may implement any other method or technique to set an illumination power, a shutter speed, or any other illumination or image-capture parameter for the imaging system. Similarly, the system may implement any other method or technique to confirm that an image captured by the camera—for a given set of illumination and image-capture parameters—contains sufficient color data for transformation into a quality of fluid within the canister. The system may also manipulate multiple illumination and image capture parameters—such as both a duty of the optical emitter and a shutter speed of the camera—to achieve a target color quality in an image.
The system can therefore capture multiple images during a single imaging period and discard all but a single image containing sufficient color data for transformation into a quality (e.g., a blood component concentration) of a volume of fluid contained in the canister. The system may tag this select image with illumination and/or image capture parameters executed by the imaging system to capture the select image, such as the duty implemented by the optical emitter and/or the shutter speed implemented by the camera when the select image was captured. In order to transform the select image into a fluid quality in Block S130, the system can then select a set of template images based on these illumination and image capture parameters for comparison to the select image or insert these illumination and image capture parameters into a parametric model that is then applied to color values in select images, as described below.
In one variation, the system captures multiple images at different illumination and/or image-capture parameters in Blocks S110 and S120. For example, in a single imaging period, the system may: set the optical emitter at 0% duty and capture a first image; set the optical emitter at 50% duty and capture a second image; and then set the optical emitter at 100% duty and capture a third image. In another example, the system may: step the duty of the optical emitter upward from a minimum duty (e.g., 0%); capture an image at each duty step; store a first image including a total number of black pixels less than a threshold number of black pixels; and store a last image including a total number of white pixels less than a threshold number of white pixels (or vice versa). In yet another example, the system may: set the duty cycle of the optical emitter (e.g., at a static value of 80%); step the shutter speed downward from a maximum shutter speed (e.g., 1/200 s); capture an image at each shutter speed; store a first image including a total number of black pixels less than a threshold number of black pixels; and store a last image including a total number of white pixels less than a threshold number of white pixels (or vice versa).
However, the system may manipulate any other illumination and/or image capture parameter across a set of images. The system may then process this set of images in Block S130 to estimate a quality of the fluid within the canister during the corresponding imaging period.
Block S130 of the method recites estimating a concentration of a blood component in a fluid within the canister based on color intensities of pixels in the image 300 and a color gradient 302 from a first region 304 to a second region 306 in the image 300, the first region 304 corresponding to proximity to the optical emitter, and the second region 306 corresponding to remoteness from the optical emitter. Generally, in Block S130, the system transforms color values contained in pixels in an image 300 captured by the camera into one or more of: a concentration of red blood cells; a concentration of hemoglobin; a proportion of whole blood cells to lysed red blood cells (or free hemoglobin); a concentration of whole blood; a concentration of plasma; a concentration of white blood cells; etc. in a volume of fluid contained in the canister. In particular, the system can implement parametric and/or non-parametric (e.g., template-matching) techniques to transform color data contained in an image 300 captured by the camera into a blood component concentration value for a volume of fluid contained in the canister, such as described, for example, in U.S. patent application Ser. No. 13/544,664 and Ser. No. 13/738,919.
In one implementation, the system implements template matching techniques to match one or more color values (e.g., intensity in the red color space) in an image captured by the camera to a template image of a fluid of known blood component concentration and stored in (local or remote) memory. In one example implementation, the system can match a color gradient from a first side of the image (corresponding to a shortest distance to the optical emitter) to an opposite side of the image (corresponding to a greatest distance from the optical emitter) to a template gradient of one or more known blood component proportions. In this exemplary implementation, the system may select a single template image containing a lightest color, a darkest color, and/or a linear or non-linear color gradient nearest the lightest color, darkest color, and/or color gradient represented in the current image and assign one or more blood component concentration values associated with the template image to the current image. Similarly, the system may select two or more template images exhibiting lightest colors, darkest colors, and/or color gradients nearest those of the current image and then average blood component concentration values associated with these template images to generate an estimation of a blood component concentration in the canister at a time the current image was captured.
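As a rough illustration of this template-matching approach, the sketch below summarizes an image by its horizontal red-channel gradient and averages the concentrations of the two nearest templates; the feature choice and the `templates` structure (pairs of gradient arrays and known concentrations) are assumptions made for illustration, not the system's prescribed representation.

```python
import numpy as np

def gradient_feature(image_rgb):
    """Summarize an image by the mean red-channel intensity of each pixel
    column, ordered from the side nearest the emitter to the side farthest
    from it; this approximates the horizontal color gradient."""
    return image_rgb[:, :, 0].mean(axis=0)            # shape (W,)

def estimate_concentration_by_template(image_rgb, templates):
    """Match the image's color gradient against template gradients of known
    blood component concentration and average the two nearest matches.
    `templates` is an assumed list of (gradient_array, concentration_g_per_l)."""
    feature = gradient_feature(image_rgb)
    scored = []
    for template_gradient, concentration in templates:
        # Resample the template to the image width before comparing.
        resampled = np.interp(np.linspace(0, 1, feature.size),
                              np.linspace(0, 1, template_gradient.size),
                              template_gradient)
        distance = np.linalg.norm(feature - resampled)
        scored.append((distance, concentration))
    scored.sort(key=lambda pair: pair[0])
    nearest = scored[:2]
    return sum(c for _, c in nearest) / len(nearest)
```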
In the foregoing implementation, the system may apply template images from multiple template image sets—each image template set corresponding to a subset of known blood component concentration values—to the current image in order to generate estimations of multiple blood component concentrations representative of a volume of fluid contained in the canister from a single image of the canister. For example, the system may match a difference between a lightest color value and a darkest color value in the current image to a template image in a first template image set to generate an estimation of the concentration of hemoglobin in the volume of fluid in the canister; the system may then match a non-linear color gradient between the first side of the image and the second side of the image to a template image in a second template image set to generate an estimation of the proportion of lysed red blood cells in the volume of fluid in the canister.
Furthermore, in this implementation, the system may select or filter available template images based on illumination and image-capture parameters implemented by the imaging system to capture the current image. For example, the system may set the duty of the optical emitter at 70%, capture an image, and then select a template image set containing template images captured by similar systems with optical emitters operating at 70% duty. In another example, the system may set the duty of the optical emitter at 100%, set the shutter speed of the camera at 1/20 s, capture an image, and then select a template image set containing template images captured by similar systems with optical emitters operating at 100% duty and cameras operating at a shutter speed of 1/20 s.
However, in this implementation, the system may implement any other method or technique to select a template image of known blood component concentration and to match the template image to a current image captured by the camera to generate an estimation of a blood component concentration in a volume of fluid contained in the canister at a time the current image was captured.
In another implementation, the system passes quantitative data represented in one or more pixels in the current image into a parametric model that outputs a quantitative estimation of the concentration of one or more blood components in a volume of fluid contained in the canister, as shown in
The system may also calculate coefficients of a linear, logarithmic, polynomial, power, or other trendline of the color gradient from the first region of the image (e.g., a pixel or pixel cluster of lightest color) to the second region of the image (e.g., a pixel or pixel cluster of darkest color) and pass these coefficient values into a parametric model. The system may also identify a trendline type (e.g., linear, logarithmic, or polynomial, etc.) that best fits the color gradient represented in the current image, select a parametric model for the identified trendline type, and then pass coefficients of a trendline of the identified trendline type into the selected parametric model to generate an estimation of the concentration of the blood component in the canister.
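One possible shape of this trendline-and-parametric-model step is sketched below, assuming a polynomial trendline fitted to the horizontal color gradient and a simple linear model whose weights would come from a prior calibration against fluids of known concentration; the feature set and model form are illustrative only.

```python
import numpy as np

def gradient_trend_coefficients(image_rgb, degree=2):
    """Fit a polynomial trendline to the horizontal color gradient (mean
    red-channel intensity per column) and return its coefficients."""
    gradient = image_rgb[:, :, 0].mean(axis=0)
    x = np.linspace(0.0, 1.0, gradient.size)     # normalized distance from emitter side
    return np.polyfit(x, gradient, degree)

def concentration_from_model(coefficients, emitter_duty, model_weights, intercept):
    """Evaluate a linear parametric model on the trendline coefficients plus
    the emitter duty cycle. `model_weights` and `intercept` are assumed to be
    obtained from calibration and are not specified by the source."""
    features = np.append(coefficients, emitter_duty / 100.0)
    return float(np.dot(model_weights, features) + intercept)
```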
In this implementation, in addition to color values of pixels in the current image, the system may also pass illumination and/or image-capture parameters implemented by the imaging system to capture the current image—such as a duty of the optical emitter or the shutter speed of the camera—into the parametric model. Alternatively, the system may select a particular parametric model from a set of available parametric models based on the illumination and/or image-capture parameters implemented by the imaging system to capture the current image; the system may then pass color values of pixels in the current image into the selected parametric model to output an estimation of a blood component concentration in the canister.
In the variation above in which the system captures multiple images through the camera in a single imaging period, the system may also implement any of the foregoing methods and techniques to compare absolute color values or color gradients across two or more images in a set of images. For example, the system may capture two images of the volume of fluid in the canister under two distinct lighting conditions (e.g., 20% duty and 80% duty at the optical emitter) and then characterize a difference in the color gradients across both images as a concentration of whole red blood cells and a concentration of free hemoglobin in the volume of fluid in the canister.
However, the system may implement any other parametric or non-parametric techniques to transform color data contained in one or more images captured by the camera into an estimation of a quality of a volume of fluid contained in the canister.
In one variation, the system determines a quality of an image output, as depicted in Block S120, and selectively discards this image or passes this image on to the next step of the process. In one implementation, the system scans the image vertically (e.g., along one or more vertical columns of pixels in the image) for a sharp shift in color value from a lower region of the image to an upper region of the image. The system may then correlate this color shift with collection of sediment on the bottom of the vessel, discard the image, and/or trigger manual or automatic removal of sediment from the field of view of the camera if such a color shift is detected in the image. In particular, due to proximity of the optical emitter to the camera, as sediment collects on the bottom of the canister and obscures the field of view of the camera, sediment may similarly obscure projection of light from the optical emitter onto the reflective insert such that sediment in the field of view of the camera remains substantially dim compared to the reflective insert when the optical emitter is actuated. Therefore, an image captured by the camera after sediment has collected in the field of view of the camera may contain a contiguous column of relatively dark pixels corresponding to sediment, extending upwardly from the bottom of the image, and rapidly transitioning into a contiguous column of relatively bright pixels corresponding to the reflective insert (and to fluid between the wall of the canister and the reflective insert). The system may scan one or more vertical columns of pixels in an image captured by the camera and then discard the image as containing insufficient color data of the fluid if a column of pixels in the image includes a transition from a line of dark pixels to a line of light pixels (or if the image includes more than a threshold number of dark pixels in a vertical column of dark pixels below a line of light pixels).
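One possible form of this vertical-scan sediment check, assuming an RGB image array and illustrative brightness thresholds, is sketched below.

```python
import numpy as np

def sediment_obscures_view(image_rgb, dark_threshold=40, light_threshold=150,
                           min_dark_column_px=30):
    """Scan a few vertical pixel columns for a dark band extending up from the
    bottom of the image that transitions sharply into bright pixels above it,
    a pattern consistent with sediment collecting in front of the camera.
    Thresholds are illustrative placeholders."""
    height, width, _ = image_rgb.shape
    for x in (width // 4, width // 2, 3 * width // 4):
        column = image_rgb[:, x].mean(axis=1)        # brightness from top to bottom
        dark_from_bottom = 0
        for value in column[::-1]:                   # walk up from the bottom row
            if value < dark_threshold:
                dark_from_bottom += 1
            else:
                break
        # A tall dark band capped by bright (insert-lit) pixels suggests sediment.
        if dark_from_bottom >= min_dark_column_px:
            above = column[: height - dark_from_bottom]
            if above.size and above[-1] > light_threshold:
                return True
    return False
```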
In one example, if a color shift is detected in an image, the system can issue an audible or visual prompt (e.g., through the display) to agitate the contents of the vessel. The system can then sample an integrated accelerometer to determine if the canister has been agitated or continue to capture and analyze images to determine if sediment has been removed from the field of view of the camera. Alternatively, if a color shift indicative of obscuration of the camera is detected in an image recently captured by the camera, the system can automatically activate an agitator—as described above—to mix contents and redistribute sediment within the canister prior to capturing a new image. For example, the system can activate the agitator for a preset period of time (e.g., 10 seconds) or until images captured by the camera no longer exhibit such a sharp shift in color value. In this example, once a sharp color value shift is no longer detected in the field of view of the camera, the system can deactivate the agitator, pause for a period of time (e.g., five seconds) to allow fluid within the canister to slow, and then execute Blocks S110, S120, and S130 as described above to capture and process an image of fluid in the canister.
Furthermore, in this variation, if sediment is detected in a current image but the current image contained sufficient color data to provide a reliable estimation of the concentration of a blood component in the canister, the system can remove (e.g., crop) a region of the current image correlated with obscuration by sediment and pass the remainder of the current image to Block S130 for processing. However, the system may implement any other method or technique to confirm the quality of images captured by the camera and selectively pass on and/or reject these images.
While capturing images in Block S120, the system may also sample the weighing scale and apply a value output by the weighing scale to the blood component concentration value to estimate a quantity (e.g., a volume, a mass) of the blood component in the canister in Block S140. In one implementation, the system 100 continuously samples the weighing scale and records outputs of the weighing scale 106 with corresponding timestamps in memory. In this implementation, for an image captured in Block S120 and processed in Block S130, the system 100 can retrieve—from memory—a weighing scale output value (e.g., weight) recorded at a time nearest a time that the image 300 was captured. (The system 100 can also retrieve multiple weighing scale outputs recorded around the time that the image was captured and then average these values.) The system 100 can then divide this weighing scale output value for the imaging period by a static estimated density of fluid collected in the canister 102 (e.g., 1030 kg/m3 for a mixture of saline and blood) to estimate the volume of fluid in the canister. By then multiplying this estimated volume by the blood component concentration, the system can estimate the volume (or mass) of the blood component (e.g., hemoglobin, red blood cells) in the canister, as depicted in Block S150.
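A minimal sketch of this weight-to-volume-to-quantity computation follows, assuming scale readings are logged as (timestamp, weight) pairs and that the fluid density is approximately 1030 kg/m3 as noted above; the averaging window is an illustrative placeholder.

```python
FLUID_DENSITY_KG_PER_L = 1.030   # assumed density of a saline/blood mixture (1030 kg/m^3)

def estimate_component_mass(scale_log, image_timestamp, concentration_g_per_l,
                            window_s=2.0):
    """Combine the blood component concentration estimated from an image with
    scale readings recorded near the time the image was captured.
    `scale_log` is an assumed list of (timestamp_s, weight_kg) tuples."""
    # Average all scale readings within +/- window_s of the image timestamp;
    # fall back to the single nearest reading if none fall in the window.
    nearby = [w for t, w in scale_log if abs(t - image_timestamp) <= window_s]
    if nearby:
        weight_kg = sum(nearby) / len(nearby)
    else:
        weight_kg = min(scale_log, key=lambda tw: abs(tw[0] - image_timestamp))[1]

    volume_l = weight_kg / FLUID_DENSITY_KG_PER_L      # kg divided by kg/L gives liters
    component_mass_g = concentration_g_per_l * volume_l
    return volume_l, component_mass_g
```

For example, 1.5 kg of collected fluid corresponds to roughly 1.46 L at this density; at an estimated hemoglobin concentration of 30 g/L, that would amount to approximately 44 g of hemoglobin in the canister.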
The system can repeat Blocks S110, S120, S130, S140, and S150 throughout an operation—such as at a rate of 1 Hz—in order to update estimations of a volume of fluid in the canister, a quality of the volume of fluid, and/or a quantity of a blood component in the canister over time.
Throughout operation, as shown in Block S100, the system 100 may update an integrated display 180 over time to visually indicate a current estimated volume of fluid in the canister 102, a current estimated quality of the volume of fluid, and/or a current estimated quantity of the blood component in the canister 102, as shown in
The systems and methods described herein can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions. The instructions can be executed by computer-executable components integrated with the application, applet, host, server, network, website, communication service, communication interface, hardware/firmware/software elements of a user computer or mobile device, wristband, smartphone, or any suitable combination thereof. When implemented as a system, such a system may comprise, inter alia, components such as software modules, general-purpose CPU, RAM, etc. found in general-purpose computers, and/or FPGAs and/or ASICs found in more specialized computing devices. In implementations where the innovations reside on a server, such a server may comprise components such as CPU, RAM, etc. found in general-purpose computers. Other systems and methods of the embodiment can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions. The instructions can be executed by computer-executable components integrated with apparatuses and networks of the type described above. The instructions can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory, EEPROMs, optical devices (CD or DVD), hard drives, floppy drives, or any suitable device. The computer-executable component can be a processor, but any suitable dedicated hardware device can (alternatively or additionally) execute the instructions.
In the present description, the terms component, module, device, etc. may refer to any type of logical or functional circuits, blocks, and/or processes that may be implemented in a variety of ways. For example, the functions of various circuits and/or blocks can be combined with one another into any other number of devices. Or, the devices can comprise programming instructions transmitted to a general-purpose computer or to processing/graphics hardware via a transmission carrier wave. Also, the devices can be implemented as hardware logic circuitry implementing the functions encompassed by the innovations herein. Finally, the devices can be implemented using special-purpose instructions (SIMD instructions), field programmable logic arrays, or any mix thereof which provides the desired level of performance and cost.
Aspects of the method and system described herein, such as the logic, may also be implemented as functionality programmed into any of a variety of circuitry, including programmable logic devices (“PLDs”), such as field programmable gate arrays (“FPGAs”), programmable array logic (“PAL”) devices, electrically programmable logic and memory devices and standard cell-based devices, as well as application specific integrated circuits. Some other possibilities for implementing aspects include: memory devices, microcontrollers with memory (such as EEPROM), embedded microprocessors, firmware, software, etc. Furthermore, aspects may be embodied in microprocessors having software-based circuit emulation, discrete logic (sequential and combinatorial), custom devices, fuzzy (neural) logic, quantum devices, and hybrids of any of the above device types. The underlying device technologies may be provided in a variety of component types, e.g., metal-oxide semiconductor field-effect transistor (“MOSFET”) technologies like complementary metal-oxide semiconductor (“CMOS”), bipolar technologies like emitter-coupled logic (“ECL”), polymer technologies (e.g., silicon-conjugated polymer and metal-conjugated polymer-metal structures), mixed analog and digital, and so on.
As a person skilled in the art will recognize from the previous detailed description and from the figures and claims, modifications and changes can be made to the embodiments of the invention without departing from the scope of this invention as defined in the following claims.
This application is a continuation of prior U.S. application Ser. No. 15/389,365, filed on Dec. 22, 2016, and claims priority to U.S. Provisional Patent Application No. 62/387,234, filed on Dec. 23, 2015, both of which are hereby incorporated by reference in their entirety. These applications are also related to U.S. patent application Ser. No. 13/544,664, filed on Jul. 9, 2012, and issued as U.S. Pat. No. 9,652,655 on May 16, 2017, and to U.S. patent application Ser. No. 13/738,919, filed on Jan. 10, 2013, issued as U.S. Pat. No. 8,983,167 on Mar. 17, 2015, both of which are hereby incorporated by reference in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
2707955 | Borden | May 1955 | A |
3182252 | Den | May 1965 | A |
3199507 | Kamm | Aug 1965 | A |
3367431 | Prindle | Feb 1968 | A |
3646938 | Haswell | Mar 1972 | A |
3687209 | Goldberg et al. | Aug 1972 | A |
3832135 | Drozdowski et al. | Aug 1974 | A |
3863724 | Dalia, Jr. | Feb 1975 | A |
3864571 | Stillman et al. | Feb 1975 | A |
3948390 | Ferreri | Apr 1976 | A |
4105019 | Haswell | Aug 1978 | A |
4149537 | Haswell | Apr 1979 | A |
4244369 | Mcavinn et al. | Jan 1981 | A |
4402373 | Comeau | Sep 1983 | A |
4422548 | Cheesman et al. | Dec 1983 | A |
4429789 | Puckett, Jr. | Feb 1984 | A |
4512431 | Bloomfield | Apr 1985 | A |
4562842 | Morfeld et al. | Jan 1986 | A |
4583546 | Garde | Apr 1986 | A |
4642089 | Zupkas et al. | Feb 1987 | A |
4681571 | Nehring | Jul 1987 | A |
4773423 | Hakky | Sep 1988 | A |
4784267 | Gessler et al. | Nov 1988 | A |
4832198 | Alikhan | May 1989 | A |
4917694 | Jessup | Apr 1990 | A |
4922922 | Pollock et al. | May 1990 | A |
4961533 | Teller et al. | Oct 1990 | A |
5014798 | Glynn | May 1991 | A |
5029584 | Smith | Jul 1991 | A |
5031642 | Nosek | Jul 1991 | A |
5048683 | Westlake | Sep 1991 | A |
5119814 | Minnich | Jun 1992 | A |
5119830 | Davis | Jun 1992 | A |
5128036 | Svensson | Jul 1992 | A |
5132087 | Kristen et al. | Jul 1992 | A |
5190059 | Fabian et al. | Mar 1993 | A |
5231032 | Ludvigsen | Jul 1993 | A |
5236664 | Ludvigsen | Aug 1993 | A |
5285682 | Micklish | Feb 1994 | A |
5348533 | Papillon et al. | Sep 1994 | A |
5369713 | Schwartz et al. | Nov 1994 | A |
5458566 | Herrig et al. | Oct 1995 | A |
5492537 | Vancaillie | Feb 1996 | A |
5522805 | Vancaillie et al. | Jun 1996 | A |
5568262 | Lachapelle et al. | Oct 1996 | A |
5595456 | Berg et al. | Jan 1997 | A |
5629498 | Pollock et al. | May 1997 | A |
5633166 | Westgard et al. | May 1997 | A |
5646788 | Bietry | Jul 1997 | A |
5650596 | Morris et al. | Jul 1997 | A |
5709670 | Vancaillie et al. | Jan 1998 | A |
5774865 | Glynn | Jun 1998 | A |
5807358 | Herweck et al. | Sep 1998 | A |
5851835 | Groner | Dec 1998 | A |
5923001 | Morris et al. | Jul 1999 | A |
5931824 | Stewart et al. | Aug 1999 | A |
5944668 | Vancaillie et al. | Aug 1999 | A |
5956130 | Vancaillie et al. | Sep 1999 | A |
5971948 | Pages et al. | Oct 1999 | A |
5984893 | Ward | Nov 1999 | A |
5996889 | Fuchs et al. | Dec 1999 | A |
6006119 | Soller et al. | Dec 1999 | A |
6061583 | Ishihara et al. | May 2000 | A |
6294999 | Yarin et al. | Sep 2001 | B1 |
6359683 | Berndt | Mar 2002 | B1 |
6510330 | Enejder | Jan 2003 | B1 |
6641039 | Southard | Nov 2003 | B2 |
6699231 | Sterman et al. | Mar 2004 | B1 |
6704500 | Takematsu | Mar 2004 | B2 |
6728561 | Smith et al. | Apr 2004 | B2 |
6730054 | Pierce et al. | May 2004 | B2 |
6777623 | Ballard | Aug 2004 | B2 |
6998541 | Morris et al. | Feb 2006 | B2 |
7001366 | Ballard | Feb 2006 | B2 |
7112273 | Weigel et al. | Sep 2006 | B2 |
7147626 | Goodman et al. | Dec 2006 | B2 |
7158030 | Chung | Jan 2007 | B2 |
7180014 | Farber et al. | Feb 2007 | B2 |
7274947 | Koo et al. | Sep 2007 | B2 |
7297834 | Shapiro | Nov 2007 | B1 |
7299981 | Hickle | Nov 2007 | B2 |
7364545 | Klein | Apr 2008 | B2 |
7384399 | Ghajar | Jun 2008 | B2 |
7430047 | Budd et al. | Sep 2008 | B2 |
7430478 | Fletcher-Haynes et al. | Sep 2008 | B2 |
7469727 | Marshall | Dec 2008 | B2 |
7499581 | Tribble et al. | Mar 2009 | B2 |
7557710 | Sanchez et al. | Jul 2009 | B2 |
7641612 | Mccall | Jan 2010 | B1 |
D611731 | Levine | Mar 2010 | S |
7670289 | Mccall | Mar 2010 | B1 |
7703674 | Stewart et al. | Apr 2010 | B2 |
7708700 | Ghajar | May 2010 | B2 |
7711403 | Jay et al. | May 2010 | B2 |
7749217 | Podhajsky | Jul 2010 | B2 |
7795491 | Stewart et al. | Sep 2010 | B2 |
7819818 | Ghajar | Oct 2010 | B2 |
7909806 | Goodman et al. | Mar 2011 | B2 |
7966269 | Bauer et al. | Jun 2011 | B2 |
7995816 | Roger et al. | Aug 2011 | B2 |
8025173 | Michaels | Sep 2011 | B2 |
8105296 | Morris et al. | Jan 2012 | B2 |
8181860 | Fleck et al. | May 2012 | B2 |
8194235 | Kosaka et al. | Jun 2012 | B2 |
8241238 | Hiruma et al. | Aug 2012 | B2 |
8279068 | Morris et al. | Oct 2012 | B2 |
8398546 | Pacione et al. | Mar 2013 | B2 |
8472693 | Davis et al. | Jun 2013 | B2 |
8479989 | Fleck et al. | Jul 2013 | B2 |
8576076 | Morris et al. | Nov 2013 | B2 |
8626268 | Adler et al. | Jan 2014 | B2 |
8693753 | Nakamura | Apr 2014 | B2 |
8704178 | Pollock et al. | Apr 2014 | B1 |
8792693 | Satish et al. | Jul 2014 | B2 |
8797439 | Coley et al. | Aug 2014 | B1 |
8897523 | Satish et al. | Nov 2014 | B2 |
8983167 | Satish et al. | Mar 2015 | B2 |
9047663 | Satish et al. | Jun 2015 | B2 |
9171368 | Satish et al. | Oct 2015 | B2 |
9595104 | Satish et al. | Mar 2017 | B2 |
9646375 | Satish et al. | May 2017 | B2 |
9652655 | Satish et al. | May 2017 | B2 |
9773320 | Satish et al. | Sep 2017 | B2 |
9936906 | Satish et al. | Apr 2018 | B2 |
9981790 | Ost et al. | May 2018 | B1 |
10641644 | Satish et al. | May 2020 | B2 |
10679342 | Nowicki | Jun 2020 | B2 |
10853938 | Sandmann | Dec 2020 | B2 |
20020124017 | Mault | Sep 2002 | A1 |
20030069509 | Matzinger et al. | Apr 2003 | A1 |
20030095197 | Wheeler et al. | May 2003 | A1 |
20030130596 | Goltz | Jul 2003 | A1 |
20040031626 | Morris et al. | Feb 2004 | A1 |
20040129678 | Crowley et al. | Jul 2004 | A1 |
20050051466 | Carter et al. | Mar 2005 | A1 |
20050163354 | Ziegler | Jul 2005 | A1 |
20050209585 | Nord et al. | Sep 2005 | A1 |
20050265996 | Lentz | Dec 2005 | A1 |
20060058593 | Drinan et al. | Mar 2006 | A1 |
20060178578 | Tribble et al. | Aug 2006 | A1 |
20060224086 | Harty | Oct 2006 | A1 |
20060241453 | Nguyen-Dinh et al. | Oct 2006 | A1 |
20070004959 | Carrier et al. | Jan 2007 | A1 |
20070008622 | Sommer | Jan 2007 | A1 |
20070108129 | Mori et al. | May 2007 | A1 |
20070243137 | Hainfeld | Oct 2007 | A1 |
20070287182 | Morris et al. | Dec 2007 | A1 |
20080029416 | Paxton | Feb 2008 | A1 |
20080030303 | Kobren et al. | Feb 2008 | A1 |
20080045845 | Pfeiffer et al. | Feb 2008 | A1 |
20080194906 | Mahony et al. | Aug 2008 | A1 |
20090076470 | Ryan | Mar 2009 | A1 |
20090257632 | Lalpuria et al. | Oct 2009 | A1 |
20090310123 | Thomson | Dec 2009 | A1 |
20090317002 | Dein | Dec 2009 | A1 |
20100003714 | Bachur, Jr. et al. | Jan 2010 | A1 |
20100007727 | Torre-Bueno | Jan 2010 | A1 |
20100025336 | Carter et al. | Feb 2010 | A1 |
20100027868 | Kosaka et al. | Feb 2010 | A1 |
20100066996 | Kosaka et al. | Mar 2010 | A1 |
20100087770 | Bock et al. | Apr 2010 | A1 |
20100150759 | Mazur et al. | Jun 2010 | A1 |
20100280117 | Patrick et al. | Nov 2010 | A1 |
20110066182 | Falus | Mar 2011 | A1 |
20110118647 | Paolini et al. | May 2011 | A1 |
20110144595 | Cheng | Jun 2011 | A1 |
20110192745 | Min | Aug 2011 | A1 |
20110196321 | Wudyka | Aug 2011 | A1 |
20110200239 | Levine et al. | Aug 2011 | A1 |
20110275957 | Bhandari | Nov 2011 | A1 |
20110305376 | Neff | Dec 2011 | A1 |
20110316973 | Miller et al. | Dec 2011 | A1 |
20120000297 | Hashizume et al. | Jan 2012 | A1 |
20120064132 | Aizawa et al. | Mar 2012 | A1 |
20120065482 | Robinson et al. | Mar 2012 | A1 |
20120127290 | Tojo et al. | May 2012 | A1 |
20120210778 | Palmer et al. | Aug 2012 | A1 |
20120257188 | Yan et al. | Oct 2012 | A1 |
20120262704 | Zahniser et al. | Oct 2012 | A1 |
20120271170 | Emelianov et al. | Oct 2012 | A1 |
20120309636 | Gibbons et al. | Dec 2012 | A1 |
20120327365 | Makihira | Dec 2012 | A1 |
20130010094 | Satish et al. | Jan 2013 | A1 |
20130094996 | Janssenswillen | Apr 2013 | A1 |
20130170729 | Wardlaw et al. | Jul 2013 | A1 |
20130245599 | Williams et al. | Sep 2013 | A1 |
20130301901 | Satish et al. | Nov 2013 | A1 |
20130303870 | Satish et al. | Nov 2013 | A1 |
20130308852 | Hamsici et al. | Nov 2013 | A1 |
20140063180 | Sharma | Mar 2014 | A1 |
20140079297 | Tadayon et al. | Mar 2014 | A1 |
20140128838 | Satish et al. | May 2014 | A1 |
20140207091 | Heagle et al. | Jul 2014 | A1 |
20140330094 | Pacione et al. | Nov 2014 | A1 |
20150294460 | Satish et al. | Oct 2015 | A1 |
20150294461 | Satish et al. | Oct 2015 | A1 |
20150310634 | Babcock et al. | Oct 2015 | A1 |
20150354780 | Wang | Dec 2015 | A1 |
20160015602 | Panzini et al. | Jan 2016 | A1 |
20160027173 | Satish et al. | Jan 2016 | A1 |
20160123998 | Macintyre et al. | May 2016 | A1 |
20160228639 | Zin | Aug 2016 | A1 |
20160243314 | Rodiera Olive | Aug 2016 | A1 |
20160327427 | Briones et al. | Nov 2016 | A1 |
20160331282 | Satish et al. | Nov 2016 | A1 |
20170011276 | Mehring et al. | Jan 2017 | A1 |
20170023446 | Rietveld et al. | Jan 2017 | A1 |
20170184442 | Satish et al. | Jun 2017 | A1 |
20170189621 | Rodiera et al. | Jul 2017 | A1 |
20170351894 | Satish et al. | Dec 2017 | A1 |
20170352152 | Satish et al. | Dec 2017 | A1 |
20180104681 | Lee et al. | Apr 2018 | A1 |
20180154088 | Broselow | Jun 2018 | A1 |
20190008427 | Satish et al. | Jan 2019 | A1 |
Number | Date | Country |
---|---|---|
2870635 | Oct 2013 | CA |
101505813 | Aug 2009 | CN |
102009007733 | Aug 2010 | DE |
3393539 | Oct 2018 | EP |
S59161801 | Oct 1984 | JP |
61176357 | Aug 1986 | JP |
62144652 | Jun 1987 | JP |
H03223629 | Oct 1991 | JP |
H06510210 | Nov 1994 | JP |
H07308312 | Nov 1995 | JP |
1137845 | Feb 1999 | JP |
2000227390 | Aug 2000 | JP |
2001050792 | Feb 2001 | JP |
2002331031 | Nov 2002 | JP |
2003075436 | Mar 2003 | JP |
2005052288 | Mar 2005 | JP |
3701031 | Sep 2005 | JP |
2006280445 | Oct 2006 | JP |
2008055142 | Mar 2008 | JP |
2008519604 | Jun 2008 | JP |
2009535639 | Oct 2009 | JP |
2010516429 | May 2010 | JP |
2011036371 | Feb 2011 | JP |
2011515681 | May 2011 | JP |
2011252804 | Dec 2011 | JP |
2014531570 | Nov 2014 | JP |
2019507615 | Mar 2019 | JP |
WO-9217787 | Oct 1992 | WO |
WO-9639927 | Dec 1996 | WO |
WO-9710856 | Mar 1997 | WO |
WO-2006053208 | May 2006 | WO |
WO-2007129948 | Nov 2007 | WO |
WO-2008094703 | Aug 2008 | WO |
WO-2008094703 | Aug 2009 | WO |
WO-2009117652 | Sep 2009 | WO |
WO-2011019576 | Feb 2011 | WO |
WO-2011145351 | Nov 2011 | WO |
WO-2013009709 | Jan 2013 | WO
WO-2013172874 | Nov 2013 | WO
WO-2013173356 | Nov 2013 | WO
WO-2014013213 | Jan 2014 | WO
WO-2013009709 | May 2014 | WO
WO-2014099629 | Jun 2014 | WO
WO-2015161003 | Oct 2015 | WO
WO-2015160997 | Oct 2015 | WO |
WO-2016187071 | Nov 2016 | WO |
WO-2017111324 | Jun 2017 | WO
WO-2017112913 | Jun 2017 | WO
Entry |
---|
“U.S. Appl. No. 13/544,646, Notice of Allowance dated May 12, 2014”, 10 pgs. |
“U.S. Appl. No. 13/544,664, Final Office Action dated Feb. 12, 2016”, 10 pgs. |
“U.S. Appl. No. 13/544,664, Non Final Office Action dated Aug. 2, 2016”, 7 pgs. |
“U.S. Appl. No. 13/544,664, Non Final Office Action dated Aug. 13, 2015”, 9 pgs. |
“U.S. Appl. No. 13/544,664, Notice of Allowance dated Feb. 15, 2017”, 10 pgs. |
“U.S. Appl. No. 13/544,679, Non Final Office Action dated May 9, 2014”, 9 pgs. |
“U.S. Appl. No. 13/544,679, Notice of Allowance dated Sep. 3, 2014”, 8 pgs. |
“U.S. Appl. No. 13/738,919, Non Final Office Action dated Sep. 5, 2014”, 8 pgs. |
“U.S. Appl. No. 13/738,919, Notice of Allowance dated Nov. 10, 2014”, 10 pgs. |
“U.S. Appl. No. 13/894,054, Final Office Action dated Aug. 26, 2016”, 7 pgs. |
“U.S. Appl. No. 13/894,054, Non Final Office Action dated Mar. 30, 2016”, 9 pgs. |
“U.S. Appl. No. 13/894,054, Non Final Office Action dated Apr. 20, 2017”, 7 pgs. |
“U.S. Appl. No. 14/613,807, Non Final Office Action dated Mar. 20, 2015”, 8 pgs. |
“U.S. Appl. No. 14/613,807, Notice of Allowance dated Jun. 25, 2015”, 10 pgs. |
“U.S. Appl. No. 14/687,862, Non Final Office Action dated Mar. 24, 2017”, 22 pgs. |
“U.S. Appl. No. 14/687,862, Notice of Allowance dated Aug. 8, 2017”, 6 pgs. |
“U.S. Appl. No. 14/876,628, Final Office Action dated Jul. 26, 2016”, 5 pgs. |
“U.S. Appl. No. 14/876,628, Non Final Office Action dated Dec. 15, 2015”, 8 pgs. |
“U.S. Appl. No. 14/876,628, Notice of Allowance dated Oct. 26, 2016”, 11 pgs. |
“U.S. Appl. No. 15/154,917, Non Final Office Action dated May 9, 2019”, 7 pgs. |
“U.S. Appl. No. 15/389,365, Final Office Action dated Feb. 4, 2019”, 23 pgs. |
“U.S. Appl. No. 15/389,365, Non Final Office Action dated Jul. 13, 2018”, 17 pgs. |
“U.S. Appl. No. 15/389,365, Notice of Allowance dated Jan. 15, 2020”, 8 pgs. |
“U.S. Appl. No. 15/389,365, Notice of Allowance dated Aug. 28, 2019”, 5 pgs. |
“U.S. Appl. No. 15/389,365, Response filed Jan. 11, 2019 to Non Final Office Action dated Jul. 13, 2018”, 21 pgs. |
“U.S. Appl. No. 15/389,365, Response filed Aug. 1, 2019 to Final Office Action dated Feb. 4, 2019”, 19 pgs. |
“U.S. Appl. No. 15/389,365, Supplemental Notice of Allowability dated Jan. 27, 2020”, 3 pgs. |
“U.S. Appl. No. 15/416,986, Non Final Office Action dated Apr. 11, 2018”, 7 pgs. |
“U.S. Appl. No. 15/594,017, Non Final Office Action dated Feb. 21, 2019”, 24 pgs. |
“Blood loss measurement: Technology opportunity assessment”, Merck for Mothers Program, (2012), 9 pgs. |
“European Application Serial No. 12810640.8, Extended European Search Report dated Apr. 1, 2015”, 8 pgs. |
“European Application Serial No. 13790449.6, Extended European Search Report dated Nov. 17, 2015”, 7 pgs. |
“European Application Serial No. 13790688.9, Extended European Search Report dated Nov. 23, 2015”, 9 pgs. |
“European Application Serial No. 15780590.4, Extended European Search Report dated Sep. 18, 2017”, 8 pgs. |
“European Application Serial No. 16183350.4, Extended European Search Report dated Nov. 4, 2016”, 8 pgs. |
“European Application Serial No. 16880130.6, Extended European Search Report dated Sep. 4, 2019”, 9 pgs. |
“European Application Serial No. 16880130.6, Response filed Mar. 17, 2020 to Extended European Search Report dated Sep. 4, 2019”, 22 pgs. |
“European Application Serial No. 16880130.6, Response to Communication pursuant to Rules 161(2) and 162 EPC filed Mar. 5, 2019”, 9 pgs. |
“European Application Serial No. 19156549.8, Extended European Search Report dated Jul. 12, 2019”, 8 pgs. |
“International Application Serial No. PCT/US2012/045969, International Search Report dated Sep. 17, 2012”, 2 pgs. |
“International Application Serial No. PCT/US2012/045969, Written Opinion dated Sep. 17, 2012”, 4 pgs. |
“International Application Serial No. PCT/US2013/021075, International Search Report dated Mar. 26, 2013”, 2 pgs. |
“International Application Serial No. PCT/US2013/021075, Written Opinion dated Mar. 26, 2013”, 6 pgs. |
“International Application Serial No. PCT/US2013/040976, International Search Report dated Sep. 24, 2013”, 2 pgs. |
“International Application Serial No. PCT/US2013/040976, Written Opinion dated Sep. 24, 2013”, 4 pgs. |
“International Application Serial No. PCT/US2015/026042, International Search Report dated Jul. 8, 2015”, 2 pgs. |
“International Application Serial No. PCT/US2015/026042, Written Opinion dated Jul. 8, 2015”, 4 pgs. |
“International Application Serial No. PCT/US2016/032561, International Search Report dated Aug. 18, 2016”, 2 pgs. |
“International Application Serial No. PCT/US2016/032561, Written Opinion dated Aug. 18, 2016”, 5 pgs. |
“International Application Serial No. PCT/US2016/068452, International Preliminary Report on Patentability dated Jul. 5, 2018”, 11 pgs. |
“International Application Serial No. PCT/US2016/068452, International Search Report dated Mar. 8, 2017”, 3 pgs. |
“International Application Serial No. PCT/US2016/068452, Written Opinion dated Mar. 8, 2017”, 9 pgs. |
“Optimizing protocols in obstetrics”, ACOG, Series 2, (2012), 25 pgs. |
“Quantification of blood loss: AWHONN practice brief No. 1”, AWHONN Practice Brief, (2014), 1-3. |
Adkins, A R, et al., “Accuracy of blood loss estimations among anesthesia providers”, AANA Journal 82, (2014), 300-306. |
Aklilu, A, “Gauss Surgical Measures Blood Loss with a Smartphone”, [Online], Retrieved from the Internet: <http://www.health2con.com/news/2012/06/14/gauss-surgical-measures-blood-loss-with-a-smartphone>, (Jun. 14, 2012). |
Al-Kadri, H M, et al., “Effect of education and clinical assessment on the accuracy of postpartum blood loss estimation”, BMC Pregnancy Childbirth 14, 110, (2014), 7 pgs. |
Bellad, et al., “Standardized Visual Estimation of Blood Loss during Vaginal Delivery with its Correlation Hematocrit Changes—A Descriptive Study”, South Asian Federation of Obstetrics and Gynecology 1.1, (2009), 29-34. |
Bose, P, et al., “Improving the accuracy of estimated blood loss at obstetric haemorrhage using clinical reconstructions”, BJOG 113(8), (2006), 919-924. |
Eipe, N, et al., “Perioperative blood loss assessment—How accurate?”, Indian J. Anaesth. 50(1), (2006), 35-38. |
Habak, P J, et al., “A comparison of visual estimate versus calculated estimate of blood loss at vaginal delivery”, British J. Med. Medical Res. 11(4), (2016), 1-7. |
Holmes, A A, et al., “Clinical evaluation of a novel system for monitoring surgical hemoglobin loss”, Anesth. Analg. 119, (2014), 588-594. |
Jones, R, “Quantitative measurement of blood loss during delivery”, AWHONN, (2015), S41. |
Kamiyoshihara, M, et al., “The Utility of an Autologous Blood Salvage System in Emergency Thoracotomy for a Hemothorax After Chest Trauma”, Gen. Thorac. Cardiovasc. Surg. 56, (2008), 222. |
Lyndon, A, et al., “Blood loss: Clinical techniques for ongoing quantitative measurement”, CMQCC Obstetric Hemorrhage Toolkit, (2010), 1-7. |
Lyndon, A, et al., “Cumulative quantitative assessment of blood loss”, CMQCC Obstetric Hemorrhage Toolkit Version 2.0, (2015), 80-85. |
Manikandan, D, et al., “Measurement of blood loss during adenotonsillectomy in children and factors affecting it”, Case Reports in Clinical Medicine 4, (2015), 151-156. |
Pogorelc, D, “iPads in the OR: New Mobile Platform to Monitor Blood Loss During Surgery”, MedCityNews, [Online], Retrieved from the Internet: <http://medcitynews.com/2012/06/ipads-in-the-or-new-mobile-platform-to-monitor-blood-loss-during-surgery>, (Jun. 6, 2012), 4 pgs. |
Roston, R B, et al., “Chapter 9: Blood loss: Accuracy of visual estimation”, A comprehensive textbook of postpartum hemorrhage: An essential clinical reference for effective management 2nd edition, Sapiens, (2012), 71-72. |
Sant, et al., “Exsanguinated Blood Volume Estimation Using Fractal Analysis of Digital Images”, Journal of Forensic Sciences 57, (2012), 610-617. |
Satish, S, et al., “System and Methods for Managing Blood Loss of a Patient”, U.S. Appl. No. 15/943,561, filed Apr. 2, 2018, 66 pgs. |
Schorn, M N, et al., “Measurement of blood loss: Review of the literature”, J. Midwifery and Women's Health 55, (2010), 20-27. |
Sukprasert, M, et al., “Increase accuracy of visual estimation of blood loss from education programme”, J. Med. Assoc. Thai 89, (2006), S54-S59. |
“European Application Serial No. 16880130.6, Communication Pursuant to Article 94(3) EPC dated Mar. 23, 2021”, 6 pgs. |
“Japanese Application Serial No. 2018-532640, Notification of Reasons for Refusal dated May 18, 2021”, w/ English Translation, 6 pgs. |
“Japanese Application Serial No. 2018-532640, Response filed Feb. 17, 2021 to Notification of Reasons for Refusal dated Nov. 24, 2020”, w/ English claims, 15 pgs. |
“Japanese Application Serial No. 2018-532640, Notification of Reasons for Refusal dated Nov. 24, 2020”, with English translation, 6 pgs. |
“European Application Serial No. 16880130.6, Response filed Jul. 20, 2021 to Communication Pursuant to Article 94(3) EPC dated Mar. 23, 2021”, 17 pgs. |
“Japanese Application Serial No. 2018-532640, Response filed Aug. 6, 2021 to Notification of Reasons for Refusal dated May 18, 2021”, with machine translation, 9 pgs. |
Number | Date | Country | |
---|---|---|---|
20200232841 A1 | Jul 2020 | US |
Number | Date | Country | |
---|---|---|---|
62387234 | Dec 2015 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15389365 | Dec 2016 | US |
Child | 16841464 | US |