Laryngoscope with handle-grip activated recording

Information

  • Patent Grant
  • Patent Number
    12,102,301
  • Date Filed
    Wednesday, November 3, 2021
  • Date Issued
    Tuesday, October 1, 2024
Abstract
Disclosed are embodiments of a laryngoscope that facilitates targeted recording (video, or audio, or both video and audio) of time intervals associated with active laryngoscopy. In accordance with the teachings of the disclosure, a characteristic that reliably defines the interval of active laryngoscopy is used to trigger recording. One such characteristic is that the operator's hand is gripping the handle of the laryngoscope. Accordingly, preferred embodiments implement a laryngoscope having a handle so designed that when the handle is gripped by the operator's hand, recording is initiated and continued for as long as the operator's hand maintains a grip on the laryngoscope handle.
Description
TECHNICAL FIELD

The disclosed subject matter pertains generally to the area of medical devices, and more specifically to the area of video laryngoscopy.


BACKGROUND INFORMATION

Many medical instruments rely on the skill of the caregiver for proper use. To enhance those skills, some medical instruments may be provided with one or more features that give the caregiver more feedback on how well the caregiver is using the medical instrument. One feedback feature that has been popularized with laryngoscopes—a tubular endoscope inserted into the larynx through the mouth to observe the interior of the larynx—is a video camera, which allows a caregiver to see and capture an image during the procedure. These video laryngoscopes, as they are called, are specialized laryngoscopes that provide real-time video of the airway anatomy captured by a small video camera on the blade inserted into the airway. They provide a video image on a small screen on the laryngoscope itself, or on the screen of a device connected to the laryngoscope.


Video laryngoscopes are particularly useful for intubating “difficult airways.” Although video laryngoscopes can be costly to purchase and use, they may speed up the time to successful intubation by allowing the caregiver to better see the airway during the intubation process. Video laryngoscopy is becoming more commonly used to secure an airway in both hospital and pre-hospital environments.


Laryngoscopy to facilitate endotracheal intubation is one of the most challenging critical care procedures widely performed in prehospital emergency medical care. Some laryngoscopes are designed with a camera or fiber optics on the blade to facilitate enhanced visualization of the target structures during laryngoscopy, and some of the video laryngoscopes allow recording of the video data stream. Such products record the entire data stream from power-on to power-off of the device (or stream that entire interval of data to an accessory recording device such as a computer). However, the data of value comprises only a (potentially very small) portion of the entire recorded data stream. The excess data recorded is undesirable because it increases file size and memory requirements, necessitates more time for data transfer and data review, and increases the risk of privacy concerns surrounding the content of the captured data.


There is a long-felt yet unmet need for improvements in the area of video laryngoscopes.


SUMMARY OF EMBODIMENTS

Embodiments are directed to a laryngoscope that facilitates targeted recording (video, or audio, or both video and audio) of time intervals associated with active laryngoscopy. In accordance with the teachings of this disclosure, a characteristic that reliably defines the interval of active laryngoscopy is used to trigger recording. One such characteristic is that the operator's hand is gripping the handle of the laryngoscope. Accordingly, preferred embodiments implement a laryngoscope having a handle so designed that when the handle is gripped by the operator's hand, recording is initiated and continued for as long as the operator's hand maintains a grip on the laryngoscope handle.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a perspective view of a video laryngoscope, according to an illustrative embodiment, in use on a patient.



FIG. 2 is another perspective view of a video laryngoscope, according to an illustrative embodiment.



FIG. 3 is a functional block diagram generally illustrating one possible example of components of a laryngoscope according to various embodiments.



FIG. 4 is a conceptual diagram of electrical signals (e.g., video image data) being stored temporarily in a buffer for later storage in a non-volatile memory, in accordance with one embodiment.



FIG. 5 is a conceptual diagram of yet another embodiment of a laryngoscope system in which components may be distributed over two or more devices, in accordance with the teachings of this disclosure.





DETAILED DESCRIPTION

In describing more fully this disclosure, reference is made to the accompanying drawings, in which illustrative embodiments of the disclosure are shown. This disclosure may, however, be embodied in a variety of different forms and should not be construed as so limited.


Generally described, the disclosure is directed to a laryngoscope that includes a capture device and a grip sensor. The grip sensor is configured to detect that the laryngoscope is being gripped, such as by a medical caregiver. In response to detecting that the laryngoscope is being gripped, data from the capture device is stored in a non-volatile manner. Specific embodiments and implementations are described below.
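
For illustration only, the behavior just described can be captured in a short Python sketch. None of the names below (GripSensor, CaptureDevice, Recorder, run_capture_loop) come from the disclosure; they are placeholders for whatever sensor, capture hardware, and storage a particular implementation uses. The point of the sketch is simply that capture runs continuously while frames are persisted only when the grip sensor reports a grip.

import time


class GripSensor:
    """Placeholder for any grip sensor (pressure, capacitive, switch, etc.)."""

    def is_gripped(self) -> bool:
        raise NotImplementedError


class CaptureDevice:
    """Placeholder for the image and/or audio capture hardware."""

    def read_frame(self) -> bytes:
        raise NotImplementedError


class Recorder:
    """Placeholder for non-volatile storage of captured data."""

    def write(self, frame: bytes) -> None:
        raise NotImplementedError


def run_capture_loop(sensor: GripSensor, camera: CaptureDevice,
                     recorder: Recorder, poll_interval_s: float = 0.03) -> None:
    """Capture frames continuously; persist them only while the handle is gripped."""
    while True:
        frame = camera.read_frame()   # capture and display continue regardless of grip
        if sensor.is_gripped():       # persist only during active laryngoscopy
            recorder.write(frame)
        time.sleep(poll_interval_s)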


Initial Discussion of Operative Environment



FIG. 1 is a conceptual drawing that depicts an example of a patient 105 in whom a laryngoscope 110 is placed to perform a laryngoscopy. A laryngoscope is a tool used to view a patient's airway anatomy when inserting an endotracheal tube (not shown). A laryngoscopy is often performed to facilitate tracheal intubation during both elective and emergency airway management procedures in patients meeting indications for definitive airway management, such as inability to adequately oxygenate or ventilate, inability to maintain airway patency or protect the airway from aspiration, or anticipated clinical course. Tracheal intubation is the insertion of a breathing tube into the trachea for the purpose of protecting the airway and facilitating respiratory gas exchange.


The laryngoscope illustratively has a blade 107 attached by some connector mechanism 114 to a handle 112. The handle allows a caregiver to control the placement of the blade of the laryngoscope into an oral cavity 109 of the patient against the patient's tongue in connection with the laryngoscope procedure.


The laryngoscope shown in FIG. 1 is more particularly a video laryngoscope which includes a video display 120. The video display shown in FIG. 1 is illustrated as integral to the laryngoscope, although it may alternatively be connected to the laryngoscope either wirelessly or through a wired connection (not shown). The video display provides real-time video of the airway anatomy. Video laryngoscopes are used to improve visibility of the airway of the patient in order to make the task of inserting the tube easier and are particularly useful for intubating “difficult airways.”


In the setting of emergency airway management, intubation is usually a time-sensitive procedure, due to the risk of physiologic deterioration from the prolonged apnea resulting from lengthy intubation attempts, or due to the risk of interrupting other life-saving interventions such as chest compressions in a cardiac arrest patient or rapid evacuation and transport of a traumatically injured patient. It is therefore important to accomplish the laryngoscopy task quickly, preferably within a single attempt. Making multiple attempts extends the period without adequate gas exchange, with consequent hypercapnia and hypoxia. For this reason, quality assurance measures are desirable to measure, monitor, and provide feedback on the timing of intubation attempts and the associated change in physiological measures (such as pulse oximetry and capnometry). The video captured by a video laryngoscope provides one form of enhanced quality assurance and quality improvement measure by providing video feedback on the procedure as it is performed and, if the video is recorded, allowing detailed video review after the procedure is over.


Having thus introduced background on laryngoscopy, we now turn to features that are provided by this disclosure.



FIG. 2 is a perspective view of one embodiment of a laryngoscope 200 in accordance with the disclosure. The laryngoscope 200 includes an image capture portion 210 (e.g., a camera), a handle portion 250, and a monitor 225. Internally (and so not shown), the laryngoscope 200 of this embodiment includes a processor, a memory unit, a user interface, and a communication module. These components are described in detail below in conjunction with FIG. 3. The image capture portion 210 is configured to capture one or more of a photo image, a video stream of images, or a coded image. Alternatively (or additionally), the image capture portion 210 may be configured to capture audio signals in addition to, or perhaps in lieu of, the video image(s).


The image capture portion 210 may include a light source, an image reflector unit, a window configured for diffusing light, a camera shutter, a scanner shutter, a lens system, light channels, a photodetector array, and/or a decode module. The photodetector array, in one example, may comprise a photo-sensitive charge-coupled device (CCD) array that, upon exposure to reflected light when enabled, generates an electrical pattern corresponding to the reflected light. Alternatively, a laser scanner and a photodetector may be used, illustrating that photodetectors other than charge-coupled photosensitive element arrays may be used with this disclosure. As is known in the art, the light source (or a separate light source) illuminates a subject and the photodetector array converts light reflected from the subject into corresponding electrical signals. The electrical signals are in turn rendered on the monitor 225 for viewing by a user.


It should be appreciated that the image capture portion 210 may be implemented in any appropriate location or even in multiple locations. As illustrated in FIG. 2, the image capture portion 210 is integrated into the “blade” portion of the laryngoscope, but it could be implemented elsewhere, such as in the handle 250, or as a component of an attached extension, such as monitor 225. Still further, the image capture portion 210 could be located on a remote wired or wirelessly connected component of an electronic monitoring and medical care support system (e.g. a vital signs monitor, a monitor/defibrillator, a tablet device used for purposes such as documentation, or a mechanical chest compression system). Yet even further, more than one image capture portion 210 may be used and implemented in multiple locations.


In this embodiment, the monitor 225 is formed integral with (or coupled to) the handle 250. Alternatively or in addition, the monitor 225 may take the form of a display that is separate from the handle 250. For example, the separate display may be connected to the laryngoscope 200 by a wired or wireless connection. Various monitors may be used with embodiments of this disclosure, such as an LCD screen, an e-paper (or other bi-stable) display, a CRT display, or any other type of visual display. One illustrative embodiment that uses one or more remote monitors is illustrated in FIG. 5 and described below.


The laryngoscope 200 of this particular embodiment further includes power button 216 and a grip sensor 275. The power button 216 operates to activate the laryngoscope 200 when in use and power on its various elements, e.g., the image capture portion 210, including its light source, and the monitor 225. The grip sensor 275 may be any suitable sensor that can detect that an operator is gripping the handle 250, thereby indicating that the laryngoscope 200 is in active use. The grip sensor 275 may be implemented as a pressure sensor, force sensor, temperature sensor, optical sensor, impedance (e.g., capacitive or resistive) sensor, or the like. Alternatively, the grip sensor 275 could be implemented as a tactile or electromechanical switch. In preferred embodiments, when the handle is gripped by the operator's hand (as detected by the grip sensor 275), recording is initiated and continued for as long as the operator's hand maintains a grip on the laryngoscope handle 250. Although shown in FIG. 2 as a single grip sensor 275, alternative embodiments may employ one or more grip sensors or sensor elements. Multiple grip sensors or sensor elements could be used, for example, to help differentiate the circumferential grip of a hand versus stray contact with some other object or objects. In addition, no particular significance should attach to the placement of the grip sensor 275 as shown in FIG. 2. Rather, the placement of the grip sensor 275, either on the handle 250 or elsewhere, should be viewed as a design choice with the purpose of enhancing the determination of an in-use grip by an operator versus stray contact.
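
As a hypothetical sketch of how multiple grip-sensor elements might be combined to distinguish a circumferential hand grip from stray contact, the following Python function assumes several normalized readings from elements spaced around the handle and a simple majority rule; the thresholds and the rule itself are illustrative assumptions, not requirements of the disclosure.

from typing import Sequence


def is_circumferential_grip(element_readings: Sequence[float],
                            activation_threshold: float = 0.5,
                            min_active_fraction: float = 0.6) -> bool:
    """Return True when enough sensor elements around the handle are activated.

    element_readings holds normalized 0..1 outputs from elements spaced around
    the handle circumference. A lone object pressed against one side activates
    few elements; a gripping hand activates most of them.
    """
    if not element_readings:
        return False
    active = sum(1 for reading in element_readings if reading >= activation_threshold)
    return active / len(element_readings) >= min_active_fraction


print(is_circumferential_grip([0.9, 0.8, 0.7, 0.6, 0.9]))  # True: hand wrapped around handle
print(is_circumferential_grip([0.9, 0.0, 0.0, 0.1, 0.0]))  # False: stray one-sided contact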


In operation, an operator may handle the laryngoscope 200 in preparation for its use as described above in conjunction with FIG. 1. As part of preparing to use the laryngoscope 200, the operator will press the power button 216 to initiate the several systems of the laryngoscope 200, including the image capture portion 210 (in particular, its illuminating light source) and, perhaps, the monitor 225. At that point, the image capture portion 210 may be actively capturing and rendering images and/or video to the monitor 225. As discussed above, it may be desirable to record the images being captured by the image capture portion 210 when the laryngoscope 200 is in use. However, and also for the reasons discussed above, it may be undesirable to record all of or even a substantial portion of the images being captured. Accordingly, while in active use, an operator engages the laryngoscope 200 by holding the handle 250 in the area of the one or more grip sensors 275. The grip sensors 275 may passively determine that the laryngoscope 200 is in active use, such as by merely detecting that the operator's hand is in place in such a manner that it is likely the laryngoscope 200 is in active use. Alternatively, the grip sensor 275 may enable the operator to affirmatively indicate that the laryngoscope 200 is in active use, such as through the use of a tactile switch or the like.
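
One way to implement the "passive" determination just described is to require that the grip be held for a short time before recording starts, and to tolerate a brief loss of contact before recording stops. The Python sketch below assumes a polled sensor and illustrative hold and release times; the class name and the specific values are assumptions, not part of the disclosure.

class GripDebouncer:
    """Turns raw grip-sensor samples into a debounced "in active use" state."""

    def __init__(self, hold_time_s: float = 0.2, release_time_s: float = 0.5):
        self.hold_time_s = hold_time_s        # grip must persist this long to start
        self.release_time_s = release_time_s  # contact must be absent this long to stop
        self._gripped_since = None
        self._released_since = None
        self.active = False

    def update(self, raw_gripped: bool, now_s: float) -> bool:
        """Feed one raw sensor sample; return the debounced in-use state."""
        if raw_gripped:
            self._released_since = None
            if self._gripped_since is None:
                self._gripped_since = now_s
            if not self.active and now_s - self._gripped_since >= self.hold_time_s:
                self.active = True
        else:
            self._gripped_since = None
            if self._released_since is None:
                self._released_since = now_s
            if self.active and now_s - self._released_since >= self.release_time_s:
                self.active = False
        return self.active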


In an embodiment where the image capture portion 210 is located remote from the handle 250 (e.g., wirelessly connected to a remote electronic monitoring and medical care support system), the grip sensor 275 and laryngoscope 200 may cause a signal to be sent to the remote component (not shown) to initiate recording on that remote component.



FIG. 3 is a functional block diagram 810 generally illustrating components of one particular implementation of a laryngoscope 300 constructed in accordance with this disclosure. In this embodiment, the laryngoscope 300 includes a processor 820, a memory unit 830, a communication module 840, a user interface 850, an image capture module 860, an optional audio capture module 870, and a power source 880.


Processor 820 may be implemented in any number of ways. Such ways include, by way of example and not of limitation, digital and/or analog processors such as microprocessors and digital-signal processors (DSPs); controllers such as microcontrollers; software running in a machine; programmable circuits such as Field Programmable Gate Arrays (FPGAs), Field-Programmable Analog Arrays (FPAAs), Programmable Logic Devices (PLDs), Application Specific Integrated Circuits (ASICs), any combination of one or more of these, and so on.


Memory unit 830 may be implemented in any number of ways. Such ways include, by way of example and not of limitation, volatile memory (e.g., RAM, etc.), nonvolatile memories (e.g., hard drive, solid state drive, etc.), read-only memories (e.g., CD-ROM, DVD, etc.), or any combination of these. Memory unit 830 may include programs containing instructions for execution by processor 820, including instructions regarding protocols, decision-making analytics, and the like. In addition, memory unit 830 can store rules, configurations, data, etc. Although illustrated as being collocated with the other components of the laryngoscope 300, it should be appreciated that the memory unit 830, or an ancillary memory unit (not shown), could alternatively be located on a remote wired or wirelessly connected component.


Communication module 840 may be implemented as hardware, firmware, software, or any combination thereof, configured to transmit data between the laryngoscope 300 and other devices. In an illustrative embodiment, the communication module 840 may include a wireless module and/or a hardwire connect module. The wireless module may illustratively be a Wi-Fi module. Additionally or alternatively, the wireless module may be a Bluetooth module, a cellular communication module (e.g., GSM, CDMA, TDMA, etc.), a Wi-Max module, or any other communication module that enables a wireless communication link for the bidirectional flow of data between the image capture device and an external device. The hardwire connect module may be a hardware and software based data connector configured to connect with a data outlet of an external device such as a computer. The hardwire connect module may be implemented as one or more ports and associated circuitry and software that allow bidirectional flow of data between the laryngoscope 300 and other devices. Illustratively, the hardwire connect module may be an Ethernet connector, an RS232 connector, a USB connector, an HDMI connector, or any other wired connector. The communication module 840 is capable of enabling transmission of data captured using the image capture module 860 to other devices.


The image capture module 860 is hardware and software configured to provide the optical functionality required for the capture of a photo image, a video stream of images, or a coded image. Examples of the image capture module 860 may include a complementary metal-oxide-semiconductor (CMOS) sensor, an N-type metal-oxide-semiconductor (NMOS) sensor, a charge-coupled device (CCD), an active-pixel sensor (APS), or the like. The image capture module 860 may additionally include video processing software, such as a compression/decompression module, and other components, such as an analog-to-digital converter. The image capture module 860 still further includes a light source (not shown) which is used to illuminate the subject area under view. Still further, the image capture module 860 could be implemented with additional features, such as a zooming and/or panning capability.


Although only a single image capture module 860 is shown in FIG. 3, it should be appreciated that more than one image capture module 860 could be implemented within the laryngoscope 300 or distributed among the laryngoscope 300 and other medical devices with which the laryngoscope 300 is in communication. For example, the laryngoscope 300 could have one image capture module 860 integral to the laryngoscope 300 and also be in active communication with an attachment or extension that includes a second image capture module 860.


An optional audio capture module 870 may also be included for capturing audio signals. In such an embodiment, the optional audio capture module 870 may include a microphone or other audio signal capture hardware (e.g., an analog-to-digital converter, etc.), and may also include audio processing software, such as a compression/decompression module, or the like.


The image capture module 860 and the optional audio capture module 870 could either or both be located somewhere in the body of the laryngoscope 200 (e.g., the handle, the blade, or a directly-attached extension to the laryngoscope 200, such as the monitor). Alternately or in addition, the image capture module 860 and the optional audio capture module 870 could either or both be located on a remote wired or wirelessly connected component of an electronic monitoring and medical care support system (e.g. a vital signs monitor, a monitor/defibrillator, a tablet device used for purposes such as documentation, or a mechanical chest compression system).


Power source 880 is a component that provides power to the several components of the laryngoscope 300. Power source 880 may be a battery, an externally provided power source, or a combination of both.


User interface 850 is hardware and software configured to enable an operator to interact with and control the several components of the laryngoscope 300, such as the processor 820, the memory unit 830, the image capture module 860, the communication module 840, and the optional audio capture module 870. The user interface may include a keypad 852 (either physical or virtual), a monitor 854, a power button 856 for turning on the laryngoscope 300, and a grip sensor 858 for initiating the capture of a media stream (e.g., images, video, audio, or any combination thereof). For example, user interface 850 may include a monitor 854 to display a series of instructions and a touch screen keyboard for the operator to enter data to configure the laryngoscope 300 for a particular operation.


For example, the monitor 854 may initially show a home screen with an activate virtual button to activate the capture of image data. On pressing the activate button, the monitor 854 may display a menu of instructions to navigate the operator to configure the laryngoscope 300 to the desired settings. Alternatively, the menu may pop up automatically on the screen of the display when the power button 856 is pressed.


User interface 850 may also include a speaker (not shown), to issue voice prompts, etc. User interface 850 may additionally include various other controls, such as pushbuttons, keyboards, switches, and so on, for purposes not relevant to the present disclosure.


In operation, and as discussed above, an operator may press (or otherwise trigger) the power button 856 to power on the several components of the laryngoscope 300, including the image capture module 860 (and, in particular, its illuminating light source), the processor 820, and the memory unit 830. In this way, the laryngoscope 300 is operative to begin capturing images from the image capture module 860. Additionally or alternatively, the laryngoscope 300 may also be operative to begin capturing audio from the optional audio capture module 870.


As discussed above, upon detection of the operator's grip by the grip sensor 858, the processor 820 causes data from the image capture module 860 (and/or the audio capture module 870) to be recorded by the memory 830. Conversely, when the grip sensor 858 no longer detects the presence of the operator's grip, the processor 820 causes the data from the image capture module 860 (and/or the audio capture module 870) to cease being recorded, although the image capture module 860 may continue to stream images.


In addition, in embodiments that implement communication between the laryngoscope 300 and other devices, such as via the communication module 840, various functions and features may be distributed or duplicated at those other devices. Turning briefly to FIG. 5, an illustrative environment is shown in which a laryngoscope 599, in accordance with the teachings of this disclosure, is deployed. In this illustrative embodiment, the laryngoscope 599 includes wireless and/or wired communication to other devices. For example, the laryngoscope 599 of this embodiment may be in communication with one or more of a monitor/defibrillator 572, an EMS tablet 526, a handheld device 530, and/or a head-worn display device 531. The laryngoscope 599 may additionally be in communication with remote services, such as a remote hospital computer 528 and/or an online tracking system 501. In other embodiments, the laryngoscope 599 may be implemented as a component or pod of a modular external defibrillator system such as disclosed in aforementioned U.S. patent application Ser. No. 11/256,275, “Defibrillator/Monitor System having a Pod with Leads Capable of Wirelessly Communicating.”


It will be appreciated that the laryngoscope 599 of this embodiment may be in direct communication (e.g., Bluetooth, Wi-Fi, NFC, hard-wire, etc.) with devices that are reasonably co-located with the laryngoscope 599. Alternatively, the laryngoscope 599 may be in indirect communication with other devices (e.g., hospital computer 528) through intermediary communication components. For example, the laryngoscope 599 may connect directly to the defibrillator/monitor 572, which then relays signals to the hospital computer 528.


In the embodiment illustrated in FIG. 5, it will be appreciated that signals (e.g., video or audio data) may be displayed directly on the laryngoscope 599 as well as communicated to the other devices for simultaneous display. In this example, the images being captured by the laryngoscope 599 may be conveyed to, for instance, an EMS tablet 526 in use by another medical technician on the scene. In yet another alternative, the images being captured may be conveyed from the laryngoscope 599 through to the hospital computer 528 at which those images may be reviewed by other medical personnel, such as doctors or nurses, who may be able to render real-time assistance from their remote location.


Similarly, one or more of the remote devices may include recording mechanisms for recording all or a portion of the images being conveyed by the laryngoscope 599. These and other embodiments will become apparent to those skilled in the art from a careful study of this disclosure.


Enhancements to the Data Storage Process


A laryngoscope configured in accordance with the present disclosure may further include logic to enhance or simplify the capture of relevant data during a laryngoscopy while reducing irrelevant data. More specifically, it is envisioned that an operator of a video laryngoscope may be in a stressful and difficult situation and may be occupied or preoccupied with several tasks simultaneously. Accordingly, it may be detrimental to the recording process if data streaming from the image capture module 860 is recorded only from the moment the grip sensor 858 is activated. For instance, an operator of the laryngoscope 300 may need to remove his or her hand briefly or readjust his or her grip on the laryngoscope 300 during operation. Embodiments of the disclosure may be implemented to address this situation. Referring now to FIGS. 3 and 4, the memory unit 830 of the laryngoscope 300 includes both non-volatile memory 410 and volatile memory 420. The volatile memory 420 includes a buffer for temporarily storing data. As shown in FIG. 4, image data being captured by the image capture module 860 is written to the volatile memory 420 and stored there temporarily for some amount of time. The duration that image data is stored will depend on many factors, including the size of the volatile memory, the pixel density of the captured images, any compression applied to the data stream, and the like. Accordingly, image data may be stored in the volatile memory 420 for some time after it is captured. The length of time image data is stored may also be field-configurable by the operator, or perhaps adjustable by the manufacturer or a service representative.
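
The volatile buffer described above behaves like a bounded, time-stamped ring buffer. As a minimal Python sketch (the class name, the retention policy, and the 30-second default are assumptions for illustration only), frames are appended as they are captured and anything older than the retention window is discarded, so the buffer always holds roughly the most recent interval of data:

from collections import deque


class FrameRingBuffer:
    """Holds roughly the last retention_s seconds of captured frames in volatile memory."""

    def __init__(self, retention_s: float = 30.0):
        self.retention_s = retention_s
        self._frames = deque()  # (timestamp_s, frame_bytes) pairs, oldest first

    def push(self, timestamp_s: float, frame: bytes) -> None:
        """Add a newly captured frame and drop anything older than the retention window."""
        self._frames.append((timestamp_s, frame))
        cutoff = timestamp_s - self.retention_s
        while self._frames and self._frames[0][0] < cutoff:
            self._frames.popleft()

    def since(self, start_s: float):
        """Return buffered frames captured at or after start_s."""
        return [(t, f) for (t, f) in self._frames if t >= start_s]

On a grip event at time T, the frames returned by since(T - N) can be copied to non-volatile memory, as described in the next paragraph.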


Upon the grip sensor 858 detecting the operator's grip, rather than only storing image data from that moment forward, prior image data may also be stored to non-volatile memory 410 for later use. For instance, as illustrated in FIG. 4, the grip sensor 858 may detect the operator's grip at some time “T” 41. At that instant, the processor 820 may cause image data from some configurable prior time (“T-N” 413) to be written to non-volatile memory. By way of example, the operator may grip the laryngoscope at a certain moment, thereby activating the grip sensor 858. In response, image data from the prior 30 seconds (or any other configurable time period) may be written from the volatile memory to the non-volatile memory for later use. In this way, the likelihood that relevant data is lost is decreased. For instance, should the operator release his or her grip on the laryngoscope briefly, but re-grip the laryngoscope within the configurable time frame, no image data would be lost.


In yet another enhancement, the laryngoscope 300 could be configured to continue writing image data to non-volatile memory 410 for some fixed or configurable interval of time after the grip sensor detects that the handle is no longer being gripped. Such an enhancement would further ameliorate the possibility of relevant data being lost due to brief or temporary releases of the handle.


In still another enhancement, the amount of data stored from prior to the triggering handle-grip event, and/or after the sensor(s) detects that the handle is no longer being gripped, could be configured to be different for video versus audio data, in a system where both types of data are automatically recorded upon triggering from a handle-grip event. For example, audio data for an interval prior to and after the actual laryngoscopy may provide value for capturing details about the preparation for the procedure, and the confirmation of successful completion of the procedure, whereas video data while the operator's hand is not holding the handle is likely of no explicit value.
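
A small Python sketch of this per-media configuration might look as follows. The field names and the example interval values are assumptions chosen only to illustrate that audio can keep a wider window around the grip and release events than video:

from dataclasses import dataclass


@dataclass
class RetentionConfig:
    pre_grip_s: float      # data kept from before the grip event
    post_release_s: float  # recording continued after the release event


# Illustrative values: audio keeps a wider window (preparation and confirmation
# talk); video outside the grip interval is assumed to have little value.
RETENTION = {
    "video": RetentionConfig(pre_grip_s=5.0, post_release_s=5.0),
    "audio": RetentionConfig(pre_grip_s=60.0, post_release_s=60.0),
}


def recording_window(grip_time_s: float, release_time_s: float, media: str):
    """Compute the [start, end] interval to persist for a given media type."""
    config = RETENTION[media]
    return (grip_time_s - config.pre_grip_s, release_time_s + config.post_release_s)


print(recording_window(100.0, 160.0, "audio"))  # (40.0, 220.0)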


Additional Alternative Embodiments

Certain additional embodiments may become apparent to those skilled in the art based on the preceding disclosure. For example, one alternative embodiment may be implemented in systems in which the amount of storage space is not a constraint, yet identification of the most-relevant image (or audio) data remains desirable. In such an embodiment, image data may be captured and stored to non-volatile memory for later use. In this way, the entirety of the image data stream may be available for later analysis. However, it remains desirable to have easy and quick access to the most-relevant image (or audio) data recorded. Thus, the grip sensor 858 may be used to trigger the recording of a time-stamped event that is synchronized or embedded within the stream of image data. In this way, for example, a user may later quickly jump to those events (e.g., intubation, extraction, etc.) within the full recorded data stream. Still further, the grip and release events could be used to configure the timing of a report of intubation that, in addition to the audio or video recording, also includes physiologic data, queried from a host monitor, from a period of time related to the times of grip and release (for example, 60 seconds before grip to 180 seconds after release).
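
As a hypothetical sketch of this "record everything, tag the relevant interval" variant, the Python below embeds grip and release events as time-stamped markers and derives the report window from them using the example figures given above (60 seconds before grip to 180 seconds after release). The class and method names are illustrative assumptions.

from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class RecordingSession:
    """Tracks time-stamped events embedded alongside a continuously recorded stream."""

    events: List[Tuple[float, str]] = field(default_factory=list)

    def mark(self, timestamp_s: float, label: str) -> None:
        """Embed a time-stamped event (e.g., 'grip', 'release') in the stream metadata."""
        self.events.append((timestamp_s, label))

    def report_window(self, pre_s: float = 60.0, post_s: float = 180.0) -> Tuple[float, float]:
        """Window over which physiologic data would be queried from a host monitor."""
        grip_times = [t for t, label in self.events if label == "grip"]
        release_times = [t for t, label in self.events if label == "release"]
        return (min(grip_times) - pre_s, max(release_times) + post_s)


session = RecordingSession()
session.mark(120.0, "grip")
session.mark(185.0, "release")
print(session.report_window())  # (60.0, 365.0)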


In still another alternative embodiment, a (non-video) laryngoscope may be configured with an audio loop recorder, rather than an image capture device, that captures audio data for an interval related to the grip and release times. Such an alternative embodiment would be low-cost, yet could be used in a system in which responders are trained to “talk to the device” and provide verbal comments describing the event as it progresses.


In yet another alternative embodiment, the grip sensor may be implemented in a manner that not only recognizes the presence versus absence of the operator's hand grip, but also measures the magnitude of the grip force/pressure, and/or the amount of handle surface area that is being directly contacted by the operator's hand. Such quantitative measurements could then be recorded by the laryngoscope system, allowing enhanced review and analysis of these aspects of laryngoscopy technique.


In an extension of the above embodiment, a certain pattern of applied grip pressure/force—such as, for example, two squeezes in rapid succession akin to a double mouse click on a computer—could be used to trigger recording of a discrete snapshot image, separate from the continuous video file that may also be in the process of being recorded. Such an operator-triggered snapshot image capture capability would allow the operator to capture a favorable or optimal image of interesting or noteworthy airway anatomy or airway conditions, for purposes such as documentation, education, and quality improvement. One specific use of such an operator-triggered snapshot image capture capability would be to facilitate transfer or transmission of that image by the laryngoscope system (and/or a communicatively-coupled monitor-defibrillator system) into the patient care report, the patient's medical record, or a handoff report, specifically to alert and illustrate to downstream care providers specific airway features that may make the patient more challenging to intubate. Such a “difficult airway handoff alert and illustration”, forwarded from a first care provider who has already navigated a difficult airway, to a downstream care provider who may otherwise be unaware of the specific challenges posed by the particular patient's airway, would allow downstream care providers to make safer and more informed decisions about additional airway management procedures such as endotracheal tube exchange or extubation.
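
Recognizing the "two squeezes in rapid succession" pattern can be sketched as a simple edge detector on the grip pressure signal: two rising crossings of a squeeze threshold within a short window trigger the snapshot. The threshold and timing values in this Python sketch are assumptions for illustration only.

class DoubleSqueezeDetector:
    """Detects two pressure peaks in rapid succession, analogous to a double mouse click."""

    def __init__(self, squeeze_threshold: float = 0.8, max_gap_s: float = 0.6):
        self.squeeze_threshold = squeeze_threshold
        self.max_gap_s = max_gap_s
        self._last_squeeze_s = None
        self._above = False

    def update(self, pressure: float, now_s: float) -> bool:
        """Feed one pressure sample; return True when a double squeeze completes."""
        rising = pressure >= self.squeeze_threshold and not self._above
        self._above = pressure >= self.squeeze_threshold
        if not rising:
            return False
        if (self._last_squeeze_s is not None
                and now_s - self._last_squeeze_s <= self.max_gap_s):
            self._last_squeeze_s = None   # second squeeze arrived in time: trigger snapshot
            return True
        self._last_squeeze_s = now_s      # treat as the first squeeze of a new pair
        return False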


In yet another alternative embodiment, the grip sensor could be incorporated into a sleeve that fits over/around the handle of the laryngoscope, rather than into the handle itself. Such a sleeve could then also be used on direct (non-video) laryngoscopes as well, wherein the hand grip detected by the sensor could be used for the aforementioned tasks of initiating and terminating recording of audio data, as well as recording specific event times associated with handling of the laryngoscope.


In yet another alternative embodiment, the pressure or force sensor indicative of active use of the laryngoscope could be located elsewhere on the laryngoscope. For example, a pressure or force sensor could be incorporated into the laryngoscope blade, allowing detection of contact and lifting force applied to the anterior structures of the airway. Such pressure or force detection could then be used to similarly control video and/or audio recording as in the earlier-described embodiments. The magnitude of the pressure or force could also be measured and recorded, which would be useful for analysis of laryngoscopy technique and correlation to physiologic responses to the laryngoscopy procedure. In a related embodiment, one or more pressure or force sensors could be incorporated into the rear base of the laryngoscope blade and/or handle, such that the sensors could detect force being applied to the patient's upper (maxillary) teeth, which might result in dental trauma. Detection of such dental contact force could be used to trigger an alert or alarm notifying the operator of the hazard.
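
The dental-contact hazard check described above can be reduced to a threshold comparison on the measured force, as in the following Python sketch; the 10 N threshold and the alert callback are illustrative assumptions, not values taken from the disclosure.

from typing import Callable


def check_dental_contact(force_newtons: float,
                         alert: Callable[[str], None],
                         threshold_newtons: float = 10.0) -> bool:
    """Raise an operator alert when the measured dental contact force is excessive."""
    if force_newtons >= threshold_newtons:
        alert(f"Dental contact force {force_newtons:.1f} N exceeds "
              f"{threshold_newtons:.1f} N: risk of dental trauma")
        return True
    return False


# Example usage with a simple console alert:
check_dental_contact(12.4, alert=print)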


In yet another alternative embodiment, one or more proximity sensors, pointed in one or more directions, could be located in or near the blade of the laryngoscope. These sensors could be used to detect entry of the blade into a confined space such as the patient's mouth, or close approach of the blade-side of the handle to the patient's face. Detection of close proximity to the face or mouth while the video laryngoscope is on could then be used to similarly control video and/or audio recording as in the earlier-described embodiments.


Other embodiments may include combinations and sub-combinations of features described above or shown in the several figures, including for example, embodiments that are equivalent to providing or applying a feature in a different order than in a described embodiment, extracting an individual feature from one embodiment and inserting such feature into another embodiment; removing one or more features from an embodiment; or both removing one or more features from an embodiment and adding one or more features extracted from one or more other embodiments, while providing the advantages of the features incorporated in such combinations and sub-combinations. As used in this disclosure, “feature” or “features” can refer to structures and/or functions of an apparatus, article of manufacture or system, and/or the steps, acts, or modalities of a method.


In the foregoing description, numerous details have been set forth in order to provide a sufficient understanding of the described embodiments. In other instances, well-known features have been omitted or simplified to not unnecessarily obscure the description.


A person skilled in the art in view of this description will be able to practice the disclosed invention. The specific embodiments disclosed and illustrated herein are not to be considered in a limiting sense. Indeed, it should be readily apparent to those skilled in the art that what is described herein may be modified in numerous ways. Such ways can include equivalents to what is described herein. In addition, the invention may be practiced in combination with other systems. The following claims define certain combinations and subcombinations of elements, features, steps, and/or functions, which are regarded as novel and non-obvious. Additional claims for other combinations and subcombinations may be presented in this or a related document.

Claims
  • 1. A laryngoscope comprising: an image capture module configured to capture video as a data stream; a handle having a sensor configured to detect that the handle is gripped by a hand; and a processor configured to: embed a time-stamped event within the data stream based on the sensor detecting that the handle is gripped by the hand, wherein the time-stamped event corresponds to a time of a grip of the handle; and associate the video with physiologic data obtained by a monitor-defibrillator during a period of time associated with the time of the grip.
  • 2. The laryngoscope of claim 1, wherein the period of time starts before the time of the grip.
  • 3. The laryngoscope of claim 1, the time-stamped event being a first time-stamped event, wherein the sensor is further configured to detect that the handle is released by the hand, and wherein the processor is further configured to embed a second time-stamped event within the data stream based on the sensor detecting that the handle is released by the hand.
  • 4. The laryngoscope of claim 3, wherein the second time-stamped event corresponds to a time of a release of the handle, and wherein the period of time is further associated with the time of the release.
  • 5. The laryngoscope of claim 4, wherein the period of time starts before the time of the grip and ends after the time of the release.
  • 6. The laryngoscope of claim 1, wherein the processor is further configured to cause the video to be stored in memory.
  • 7. The laryngoscope of claim 1, wherein the sensor is further configured to detect a force applied to the handle by the hand, and wherein the processor is further configured to: determine that the handle is gripped with a pattern of applied force based on the sensor detecting the force applied to the handle; and cause the image capture module to capture an image and store the image in memory based on determining that the handle is gripped with the pattern of applied force.
  • 8. A method comprising: capturing video as a data stream by an image capture module of a laryngoscope; detecting, by a sensor disposed in a handle of the laryngoscope, that the handle is gripped by a hand; embedding, by a processor of the laryngoscope, a time-stamped event within the data stream based on the detecting, by the sensor, that the handle is gripped by the hand, wherein the time-stamped event corresponds to a time of a grip of the handle; and associating, by the processor, the video with physiologic data obtained by a monitor-defibrillator during a period of time associated with the time of the grip.
  • 9. The method of claim 8, the time-stamped event being a first time-stamped event, the method further comprising: detecting, by the sensor, that the handle is no longer being gripped by the hand; and embedding, by the processor, a second time-stamped event within the data stream based on the detecting, by the sensor, that the handle is no longer being gripped by the hand.
  • 10. The method of claim 8, further comprising causing the video to be stored in memory.
  • 11. The method of claim 8, wherein the period of time starts before the time of the grip.
  • 12. The method of claim 8, further comprising: detecting, by the sensor, an amount of pressure with which the handle is gripped by the hand; and causing, by the processor, the amount of pressure to be stored in memory.
  • 13. The method of claim 8, further comprising: detecting, by the sensor, an amount of a surface area of the handle that is gripped by the hand; and causing, by the processor, the amount of the surface area to be stored in memory.
  • 14. A laryngoscope comprising: a blade; a camera integrated into the blade and configured to capture a stream of image data; a handle having a sensor configured to detect that the handle is gripped by a hand; and a processor configured to: record a time-stamped event within the stream of image data based on the sensor detecting that the handle is gripped by the hand, wherein the time-stamped event corresponds to a time of a grip of the handle; and associate at least a portion of the stream of image data with physiologic data obtained by a monitor-defibrillator during a period of time associated with the time of the grip.
  • 15. The laryngoscope of claim 14, wherein the period of time starts before the time of the grip.
  • 16. The laryngoscope of claim 14, the time-stamped event being a first time-stamped event, wherein the sensor is further configured to detect that the handle is no longer being gripped by the hand, and wherein the processor is further configured to record a second time-stamped event within the stream of image data based on the sensor detecting that the handle is no longer being gripped by the hand.
  • 17. The laryngoscope of claim 16, wherein the second time-stamped event corresponds to a time of a release of the handle, and wherein the period of time is further associated with the time of the release.
  • 18. The laryngoscope of claim 14, wherein the sensor is further configured to detect a pressure applied to the handle by the hand, and wherein the processor is further configured to: determine that the handle is gripped with a pattern of applied pressure based on the sensor detecting the pressure applied to the handle; and cause the camera to capture an image and store the image in memory based on determining that the handle is gripped with the pattern of applied pressure.
  • 19. The laryngoscope of claim 14, wherein the sensor is further configured to detect an amount of pressure with which the handle is gripped by the hand, and wherein the processor is further configured to cause the amount of pressure to be stored in memory.
  • 20. The laryngoscope of claim 14, wherein the sensor is further configured to detect an amount of a surface area of the handle that is gripped by the hand, and wherein the processor is further configured to cause the amount of the surface area to be stored in memory.
CROSS REFERENCE TO RELATED APPLICATIONS

This patent application is a continuation of U.S. application Ser. No. 16/409,301, filed May 10, 2019, which is a continuation of U.S. application Ser. No. 15/423,453, filed Feb. 2, 2017, now issued as U.S. Pat. No. 10,299,668, which claims the benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/290,220 filed on Feb. 2, 2016, entitled “Laryngoscope With Handle-Grip Activated Video/Audio Recording,” the benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/449,922 filed on Jan. 24, 2017, entitled “Laryngoscope With Changeable Blade Shape And Size,” and the benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/290,195 filed on Feb. 2, 2016, entitled “Carbon Monoxide (CO) Environmental Monitoring on an Emergency Medical Device, Monitor or Defibrillator,” the disclosures of which are all hereby incorporated by reference for all purposes. The present disclosure is further related to prior-filed U.S. patent application Ser. No. 14/491,669 filed on Sep. 19, 2014, entitled “Multi-Function Video System,” and to prior-filed U.S. patent application Ser. No. 11/256,275 filed Oct. 21, 2005, entitled “Defibrillator/Monitor System having a Pod with Leads Capable of Wirelessly Communicating,” the disclosures of which are hereby incorporated by reference for all purposes.

US Referenced Citations (164)
Number Name Date Kind
3724455 Unger Apr 1973 A
3865101 Saper et al. Feb 1975 A
4096856 Smith et al. Jun 1978 A
4947245 Ogawa et al. Aug 1990 A
5012411 Policastro et al. Apr 1991 A
5070859 Waldvogel Dec 1991 A
5078134 Heilman et al. Jan 1992 A
5105821 Reyes Apr 1992 A
5311449 Adams May 1994 A
5470343 Fincke et al. Nov 1995 A
5565759 Dunstan Oct 1996 A
5593426 Morgan et al. Jan 1997 A
5685314 Geheb et al. Nov 1997 A
5716380 Yerkovich et al. Feb 1998 A
5724985 Snell et al. Mar 1998 A
5814089 Stokes et al. Sep 1998 A
5827178 Berall Oct 1998 A
5914585 Grabon Jun 1999 A
6134468 Morgan et al. Oct 2000 A
6141584 Rockwell et al. Oct 2000 A
6183417 Geheb et al. Feb 2001 B1
6188407 Smith et al. Feb 2001 B1
6223077 Schweizer et al. Apr 2001 B1
6275737 Mann Aug 2001 B1
6402691 Peddicord et al. Jun 2002 B1
6422669 Salvatori et al. Jul 2002 B1
6441747 Khair et al. Aug 2002 B1
6591135 Palmer et al. Jul 2003 B2
6771172 Robinson et al. Aug 2004 B1
6957102 Silver et al. Oct 2005 B2
6978181 Snell Dec 2005 B1
7110825 Vaynberg et al. Sep 2006 B2
7245964 Moore et al. Jul 2007 B2
7570994 Tamura et al. Aug 2009 B2
7736301 Webler et al. Jun 2010 B1
7912733 Clements et al. Mar 2011 B2
7946981 Cubb May 2011 B1
7978062 LaLonde et al. Jul 2011 B2
D663026 McGrail et al. Jul 2012 S
D669172 McGrail et al. Oct 2012 S
8715172 Girgis May 2014 B1
8740782 McGrath Jun 2014 B2
8986204 Pacey et al. Mar 2015 B2
9095298 Ashcraft et al. Aug 2015 B2
9283342 Gardner Mar 2016 B1
9316698 McGrath et al. Apr 2016 B2
9457197 Aoyama et al. Oct 2016 B2
9468367 Ouyang et al. Oct 2016 B2
9498112 Stewart et al. Nov 2016 B1
9622646 Ouyang et al. Apr 2017 B2
9821131 Isaacs Nov 2017 B2
9880227 McGrath et al. Jan 2018 B2
9943222 Chan Apr 2018 B2
10299668 Walker May 2019 B2
20020022769 Smith Feb 2002 A1
20020116028 Greatbatch et al. Aug 2002 A1
20020116029 Miller et al. Aug 2002 A1
20020116033 Greatbatch et al. Aug 2002 A1
20020116034 Miller et al. Aug 2002 A1
20020128689 Connelly et al. Sep 2002 A1
20020128691 Connelly Sep 2002 A1
20020133086 Connelly et al. Sep 2002 A1
20020133199 MacDonald et al. Sep 2002 A1
20020133200 Weiner et al. Sep 2002 A1
20020133201 Connelly et al. Sep 2002 A1
20020133202 Connelly et al. Sep 2002 A1
20020133208 Connelly Sep 2002 A1
20020133211 Weiner et al. Sep 2002 A1
20020133216 Connelly et al. Sep 2002 A1
20020138102 Weiner et al. Sep 2002 A1
20020138103 Mulhauser et al. Sep 2002 A1
20020138107 Weiner et al. Sep 2002 A1
20020138108 Weiner et al. Sep 2002 A1
20020138110 Connelly et al. Sep 2002 A1
20020138112 Connelly et al. Sep 2002 A1
20020138113 Connelly et al. Sep 2002 A1
20020138124 Helfer et al. Sep 2002 A1
20020143258 Weiner et al. Oct 2002 A1
20020147470 Weiner et al. Oct 2002 A1
20020177793 Sherman et al. Nov 2002 A1
20020183796 Connelly Dec 2002 A1
20020198569 Foster et al. Dec 2002 A1
20030028219 Powers et al. Feb 2003 A1
20030050538 Naghavi et al. Mar 2003 A1
20030088156 Berci May 2003 A1
20030088275 Palmer et al. May 2003 A1
20030100819 Newman et al. May 2003 A1
20030195390 Graumann Oct 2003 A1
20030212311 Nova et al. Nov 2003 A1
20040049233 Edwards Mar 2004 A1
20040096808 Price et al. May 2004 A1
20040122476 Wung Jun 2004 A1
20040162586 Covey et al. Aug 2004 A1
20050075671 Vaisnys et al. Apr 2005 A1
20050124866 Elaz et al. Jun 2005 A1
20050244801 DeSalvo Nov 2005 A1
20060069326 Heath Mar 2006 A1
20060142808 Pearce et al. Jun 2006 A1
20060167531 Gertner et al. Jul 2006 A1
20060217594 Ferguson Sep 2006 A1
20070106121 Yokota et al. May 2007 A1
20070173697 Dutcher Jul 2007 A1
20070179342 Miller Aug 2007 A1
20080208006 Farr Aug 2008 A1
20080294010 Cooper Nov 2008 A1
20090203965 Fujiyama et al. Aug 2009 A1
20090312638 Bartlett Dec 2009 A1
20090318758 Farr et al. Dec 2009 A1
20090322867 Carrey Dec 2009 A1
20100067002 Ishii Mar 2010 A1
20100191054 Supiez Jul 2010 A1
20100198009 Farr et al. Aug 2010 A1
20100249513 Tydlaska Sep 2010 A1
20100274082 Iguchi et al. Oct 2010 A1
20100274090 Ozaki et al. Oct 2010 A1
20100317923 Endo et al. Dec 2010 A1
20110009694 Schultz Jan 2011 A1
20110034768 Ozaki et al. Feb 2011 A1
20110054252 Ozaki et al. Mar 2011 A1
20110144436 Nearman et al. Jun 2011 A1
20110270038 Jiang et al. Nov 2011 A1
20120320340 Coleman, III Dec 2012 A1
20130018227 Schoonbaert Jan 2013 A1
20130024208 Vining Jan 2013 A1
20130060089 McGrath et al. Mar 2013 A1
20130066153 McGrath et al. Mar 2013 A1
20130123641 Goldfain et al. May 2013 A1
20130197312 Miller et al. Aug 2013 A1
20130304142 Curtin et al. Nov 2013 A1
20130304146 Aoyama et al. Nov 2013 A1
20130304147 Aoyama et al. Nov 2013 A1
20130308839 Taylor et al. Nov 2013 A1
20140051923 Mirza et al. Feb 2014 A1
20140073880 Boucher et al. Mar 2014 A1
20140118517 Fueki et al. May 2014 A1
20140160261 Miller et al. Jun 2014 A1
20140272860 Peterson et al. Sep 2014 A1
20140277227 Peterson et al. Sep 2014 A1
20140278463 Merry et al. Sep 2014 A1
20140343359 Farr et al. Nov 2014 A1
20150080655 Peterson Mar 2015 A1
20150109193 Sly et al. Apr 2015 A1
20150112146 Donaldson Apr 2015 A1
20150238692 Peterson et al. Aug 2015 A1
20160022132 Chan Jan 2016 A1
20160051781 Isaacs Feb 2016 A1
20160095506 Dan et al. Apr 2016 A1
20160120394 McGrath et al. May 2016 A1
20160128548 Lai May 2016 A1
20160157700 Nearman et al. Jun 2016 A1
20160206188 Hruska et al. Jul 2016 A1
20160206189 Nearman et al. Jul 2016 A1
20160213236 Hruska et al. Jul 2016 A1
20160242637 Tydlaska et al. Aug 2016 A1
20160256047 Newcomb et al. Sep 2016 A1
20160345803 Mallory Dec 2016 A1
20170105614 McWilliam et al. Apr 2017 A1
20170215720 Walker et al. Aug 2017 A1
20170291001 Rosenblatt et al. Oct 2017 A1
20170300623 Rosenblatt et al. Oct 2017 A1
20170304572 Qiu Oct 2017 A1
20180020902 Merz et al. Jan 2018 A1
20180078318 Barbagli et al. Mar 2018 A1
20180263647 Aljuri et al. Sep 2018 A1
Foreign Referenced Citations (17)
Number Date Country
0801959 Oct 1997 EP
0923961 Jun 1999 EP
1228782 Aug 2002 EP
1250944 Oct 2002 EP
100969510 Jul 2010 KR
WO0070889 Nov 2000 WO
WO0166182 Sep 2001 WO
WO02056756 Jul 2002 WO
WO02060529 Aug 2002 WO
WO03009895 Feb 2003 WO
WO2004093979 Nov 2004 WO
WO2005058413 Jun 2005 WO
WO2005058416 Jun 2005 WO
WO-2009130666 Oct 2009 WO
WO2013056194 Apr 2013 WO
WO2013071153 May 2013 WO
WO2014004905 Jan 2014 WO
Non-Patent Literature Citations (55)
Entry
JedMed Horus, An Imaging System, Apr. 24, 2012, 2 pages.
Office Action from U.S. Appl. No. 10/583,209, mailed Mar. 16, 2010, 10 pp.
Office Action from U.S. Appl. No. 10/583,175, dated Apr. 29, 2010, 8 pp.
Office Action for U.S. Appl. No. 16/409,301, mailed on Apr. 30, 2021, Walker, “Laryngoscope With Handle-Grip Activated Recording”, 13 pages.
Office Action for U.S. Appl. No. 16/572,120, mailed on Jan. 1, 2021, 9 pages.
Office Action from U.S. Appl. No. 10/583,176, mailed Jan. 16, 2013, 6 pp.
Office Action for U.S. Appl. No. 11/256,275, mailed on Jan. 6, 2009, 8 pages.
Office Action for U.S. Appl. No. 14/069,021, mailed on Jan. 7, 2019, 9 pages.
Office Action from U.S. Appl. No. 10/583,175, mailed Oct. 2, 2009, 14 pp.
Office Action for U.S. Appl. No. 10/583,209, dated Oct. 19, 2010 (9 pp).
Office Action for U.S. Appl. No. 10/583,209, mailed on Oct. 19, 2010, 9 pages.
Office Action for U.S. Appl. No. 14/069,021, mailed on Oct. 21, 2014, 9 pages.
Office Action for U.S. Appl. No. 14/310,841, mailed on Nov. 1, 2016, 13 pages.
Office Action for U.S. Appl. No. 14/069,021, mailed on Nov. 16, 2017, 11 pages.
Office Action for U.S. Appl. No. 10/583,176, mailed Nov. 17, 2011, 7 pages.
Office Action from U.S. Appl. No. 10/583,175, mailed Nov. 22, 2013, 8 pp.
Office Action for U.S. Appl. No. 13/103,783, mailed on Nov. 28, 2011, 7 pages.
Office Action for U.S. Appl. No. 13/965,667, mailed on Nov. 5, 2013, 8 pages.
Office Action for U.S. Appl. No. 15/245,450, mailed on Dec. 15, 2017, 8 pages.
Office Action for U.S. Appl. No. 14/069,021, mailed on Dec. 2, 2015, 7 pages.
Office Action for U.S. Appl. No. 13/103,783, mailed on Feb. 1, 2013, 8 pages.
Office Action for U.S. Appl. No. 14/491,669, mailed on Feb. 15, 2019, 10 pages.
Office Action for U.S. Appl. No. 15/829,660, mailed on Feb. 18, 2020, 12 pages.
Office Action from U.S. Appl. No. 10/583,175, mailed Feb. 28, 2013, 18 pp.
Office Action for U.S. Appl. No. 13/965,667, mailed on Feb. 28, 2014, 8 pages.
Office Action for U.S. Appl. No. 11/256,275, mailed on Feb. 3, 2010, 10 pages.
Office Action for U.S. Appl. No. 11/256,275, mailed on Feb. 3, 2011, 10 pages.
Office Action for U.S. Appl. No. 14/310,841, mailed on Feb. 8, 2016, 11 pages.
Office Action for U.S. Appl. No. 10/583,209, mailed on Mar. 16, 2010, 10 pages.
Office Action for U.S. Appl. No. 14/069,021, mailed on Mar. 23, 2015, 7 pages.
Office Action for U.S. Appl. No. 14/069,021, mailed Mar. 9, 2017, 10 pages.
Office Action from U.S. Appl. No. 10/583,176, mailed May 1, 2012, 6 pp.
Office Action for U.S. Appl. No. 15/423,453, mailed May 18, 2018, Walker, “Laryngoscope With Handle-Grip Activated Recording,” 8 pages.
Office Action for U.S. Appl. No. 10/583,176, mailed May 20, 2011, 8 pages.
Office Action for U.S. Appl. No. 14/069,021, mailed on Jun. 14, 2018, 8 pages.
Office Action for U.S. Appl. No. 11/256,275, mailed on Jun. 9, 2008, 5 pages.
Office Action for U.S. Appl. No. 11/256,275, mailed on Jun. 9, 2009, 10 pages.
Office Action for U.S. Appl. No. 15/245,450, mailed on Jul. 13, 2018, 5 pages.
Office Action from U.S. Appl. No. 10/583,175, mailed Jul. 29, 2013, 15 pp.
Office Action for U.S. Appl. No. 15/829,660, mailed on Jul. 30, 2019, 19 pages.
Office Action for U.S. Appl. No. 14/310,841, mailed on Jul. 7, 2016, 13 pages.
Office Action for U.S. Appl. No. 14/491,669, mailed on Aug. 29, 2019, 8 pages.
Office Action for U.S. Appl. No. 14/498,735, mailed on Sep. 15, 2015, 8 pages.
International Search Report and Written Opinion from international application No. PCT/US2004/012421, mailed Sep. 13, 2004, 7 pp.
International preliminary report on patentability from international application No. PCT/US2004/012421, mailed Oct. 28, 2005, 6 pp.
International Search Report and Written Opinion from international application No. PCT/US2004/042376, mailed Mar. 24, 2005, 6 pp.
International preliminary report on patentability from international application No. PCT/US2004/042376, mailed Jun. 20, 2006, 6 pp.
International preliminary report on patentability from international application No. PCT/US2004/042377, mailed Dec. 17, 2004, 6 pp.
International Search Report and Written Opinion from international application No. PCT/US2004/042792, mailed Jul. 20, 2005, 12 pp.
International preliminary report on patentability from international application No. PCT/US2004/042792, mailed Jun. 20, 2006, 8 pp.
International Search Report and Written Opinion from international application No. PCT/US2004/042377, mailed Dec. 17, 2004, 5 pp.
Response to Office Action mailed Oct. 2, 2009, from U.S. Appl. No. 10/583,175, filed Jan. 4, 2010, 14 pp.
Response to Office Action dated Mar. 16, 2010, from U.S. Appl. No. 10/583,209, filed Jun. 16, 2010, 17 pp.
Response to Office Action dated Apr. 29, 2010, from U.S. Appl. No. 10/583,175, filed Jul. 29, 2010, 13 pp.
Response to Office Action for U.S. Appl. No. 10/583,209, dated Dec. 20, 2010, 8 pp.
Related Publications (1)
Number Date Country
20220054000 A1 Feb 2022 US
Provisional Applications (3)
Number Date Country
62449922 Jan 2017 US
62290220 Feb 2016 US
62290195 Feb 2016 US
Continuations (2)
Number Date Country
Parent 16409301 May 2019 US
Child 17518197 US
Parent 15423453 Feb 2017 US
Child 16409301 US