In the past, clinicians have relied on various guidance systems, such as ultrasound systems, for assistance in capturing and rendering an image of a vessel (e.g., vein, artery, etc.). However, conventional ultrasound systems only provide Doppler results to build an object, and such objects are visual images without any data characteristics associated with them. Hence, diagnoses of vessel health have been based solely on manual inspection of low-resolution images, which may lead to an unacceptable level of inaccurate diagnoses.
Hence, a system that leverages artificial intelligence to produce mixed reality and/or virtual reality images is needed.
Disclosed herein is a medical analytic system including an ultrasound probe and a console communicatively coupled to the ultrasound probe. The console comprises an alternative reality (“AR”) anatomy representation logic. The AR anatomy representation logic is configured to initiate a capture of information associated with multiple sub-images at different longitudinal positions of an ultrasound image of an anatomical element. The AR anatomy representation logic is also configured to position and orient each sub-image longitudinally based on usage parameters during emission of the ultrasound signals for capturing of the ultrasound image. Lastly, the AR anatomy representation logic is configured to assemble each of the sub-images to form a virtual representation of the anatomical element for rendering in an AR environment.
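For illustration only, the following Python sketch shows one way the capture-position-assemble pipeline summarized above could be organized. The names (`UsageParams`, `assemble_virtual_element`) and the constant frame interval are assumptions of this example, not drawn from the disclosure.

```python
from dataclasses import dataclass
from typing import List, Tuple

import numpy as np


@dataclass
class UsageParams:
    speed_mm_s: float      # probe speed during emission of the ultrasound signals
    direction: np.ndarray  # unit vector of probe travel


def assemble_virtual_element(sub_images: List[np.ndarray],
                             params: UsageParams,
                             frame_interval_s: float) -> List[Tuple[np.ndarray, np.ndarray]]:
    """Position each sub-image longitudinally from the usage parameters, then
    assemble the positioned sub-images into one virtual representation (here,
    an ordered list of (position, sub-image) pairs)."""
    spacing = params.speed_mm_s * frame_interval_s   # distance between captures
    placed = []
    for i, sub in enumerate(sub_images):
        position = i * spacing * params.direction    # longitudinal placement
        placed.append((position, sub))
    return placed
```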
In some embodiments, the usage parameters include a speed in movement of the ultrasound probe during emission of the ultrasound signals for capturing of the ultrasound image.
In some embodiments, the usage parameters include a direction in movement of the ultrasound probe during emission of the ultrasound signals for capturing of the ultrasound image.
In some embodiments, the AR environment includes a mixed reality. The mixed reality includes the virtual representation of the anatomical element positioned over a real-world setting including a real depiction of a portion of a patient's body having the anatomical element.
In some embodiments, the anatomical element is a vessel within an arm or a leg of a patient.
In some embodiments, the console further includes a communication interface configured to provide a rendering of the virtual representation of the anatomical element to an AR headset.
Also disclosed herein is a medical analytic system including an ultrasound probe and a console communicatively coupled to the ultrasound probe. The console includes a processor and a memory. The memory includes an AR anatomy representation logic including logic selected from the group consisting of visualization logic, virtual slice positioning logic, virtual object assembly logic, and virtual object display logic, provided at least two of the foregoing are selected. The visualization logic is configured to capture information associated with multiple sub-images at different layers of an ultrasound image of an anatomical element. The virtual slice positioning logic is configured to position and orient each sub-image based on usage parameters during emission of the ultrasound signals from the ultrasound probe for capturing of the ultrasound image. The virtual object assembly logic, when executed by the processor, is configured to assemble each of the sub-images to form a virtual representation of the anatomical element. The virtual object display logic, when executed by the processor, is configured to render the virtual representation of the anatomical element in an AR environment.
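The "at least two of the foregoing are selected" grouping can be mirrored directly in code. The sketch below is a minimal illustration under that assumption; all protocol, class, and attribute names are invented for this example and are not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Any, List, Optional, Protocol


class VisualizationLogic(Protocol):
    def capture_sub_images(self) -> List[Any]: ...           # sub-images per layer

class VirtualSlicePositioningLogic(Protocol):
    def position(self, sub_images: List[Any], usage_params: Any) -> List[Any]: ...

class VirtualObjectAssemblyLogic(Protocol):
    def assemble(self, positioned: List[Any]) -> Any: ...    # virtual representation

class VirtualObjectDisplayLogic(Protocol):
    def render(self, virtual_object: Any) -> None: ...       # AR environment output


@dataclass
class ARAnatomyRepresentationLogic:
    """Container mirroring the grouping: any subset of the four logics may be
    present, provided at least two are selected."""
    visualization: Optional[VisualizationLogic] = None
    slice_positioning: Optional[VirtualSlicePositioningLogic] = None
    object_assembly: Optional[VirtualObjectAssemblyLogic] = None
    object_display: Optional[VirtualObjectDisplayLogic] = None

    def __post_init__(self) -> None:
        selected = [logic for logic in (self.visualization, self.slice_positioning,
                                        self.object_assembly, self.object_display)
                    if logic is not None]
        if len(selected) < 2:
            raise ValueError("at least two logic components must be selected")
```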
In some embodiments, the usage parameters include a speed in movement of the ultrasound probe during emission of the ultrasound signals for capturing of the ultrasound image.
In some embodiments, the usage parameters include a direction in movement of the ultrasound probe during emission of the ultrasound signals for capturing of the ultrasound image.
In some embodiments, the AR environment includes a mixed reality. The mixed reality includes the virtual representation of the anatomical element positioned over a real-world setting including a real depiction of a portion of a patient's body having the anatomical element.
In some embodiments, the console further includes a communication interface to provide a rendering of the virtual representation of the anatomical element to an AR headset.
In some embodiments, the anatomical element is a vessel.
In some embodiments, the virtual object display logic is configured to render the virtual representation of the anatomical element as an overlay over an image or a series of images.
In some embodiments, the image includes the ultrasound image and the series of images includes a video of a real-world setting.
In some embodiments, the visualization logic and the virtual slice positioning logic are implemented within the ultrasound probe. In addition, the virtual object assembly logic and the virtual object display logic are executed by the processor and implemented within the console.
In some embodiments, the visualization logic, the virtual slice positioning logic, the virtual object assembly logic, and the virtual object display logic are implemented as software executed by the processor within the console.
Also disclosed herein is a method including an information-capturing operation, a positioning-and-orienting operation, and an assembling operation. The information-capturing operation includes initiating a capture of information associated with multiple sub-images at different longitudinal positions of an ultrasound image of an anatomical element. The positioning-and-orienting operation includes positioning and orienting each sub-image of the multiple sub-images longitudinally based on usage parameters occurring during emission of ultrasound signals for capturing of the ultrasound image. The assembling operation includes assembling each of the sub-images to form a virtual representation of the anatomical element for rendering in an AR environment.
In some embodiments, the usage parameters include a speed in movement of the ultrasound probe during emission of the ultrasound signals for capturing of the ultrasound image. Alternatively, the usage parameters include a direction in movement of the ultrasound probe during emission of the ultrasound signals for capturing of the ultrasound image.
In some embodiments, the AR environment includes a mixed reality. The mixed reality includes the virtual representation of the anatomical element positioned over a real-world setting including a real depiction of a portion of a patient's body having the anatomical element.
In some embodiments, the anatomical element is a vessel within a body of a patient.
These and other features of embodiments of the present invention will become more fully apparent from the following description and appended claims, or may be learned by the practice of embodiments of the invention as set forth hereinafter.
A more particular description of the present disclosure will be rendered by reference to specific embodiments thereof that are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. Example embodiments of the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
Reference will now be made to figures wherein like structures will be provided with like reference designations. It is understood that the drawings are diagrammatic and schematic representations of exemplary embodiments of the invention, and are neither limiting nor necessarily drawn to scale.
Regarding terms used herein, it should be understood the terms are for the purpose of describing some particular embodiments, and the terms do not limit the scope of the concepts provided herein. Ordinal numbers (e.g., first, second, third, etc.) are sometimes used to distinguish or identify different components or operations, and do not supply a serial or numerical limitation. For example, “first,” “second,” and “third” components or operations need not necessarily appear in that order, and the particular embodiments including such components or operations need not necessarily be limited or restricted to the three components or operations. Similarly, labels such as “left,” “right,” “top,” “bottom,” “front,” “back,” and the like are used for convenience and are not intended to imply, for example, any particular fixed location, orientation, or direction. Instead, such labels are used to reflect, for example, relative location, orientation, or directions. Singular forms of “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
The terms “logic” and “component” are representative of hardware and/or software that is configured to perform one or more functions. As hardware, logic (or component) may include circuitry having data processing and/or storage functionality. Examples of such circuitry may include, but are not limited or restricted to a processor, a programmable gate array, a microcontroller, an application specific integrated circuit, combinatorial circuitry, or the like. Alternatively, or in combination with the hardware circuitry described above, the logic (or component) may be software in the form of one or more software modules, which may be configured to operate as its counterpart circuitry. The software modules may include, for example, an executable application, a daemon application, an application programming interface (“API”), a subroutine, a function, a procedure, a routine, source code, or even one or more instructions. The software module(s) may be stored in any type of a suitable non-transitory storage medium, such as a programmable circuit, a semiconductor memory, non-persistent storage such as volatile memory (e.g., any type of random-access memory “RAM”), persistent storage such as non-volatile memory (e.g., read-only memory “ROM”, power-backed RAM, flash memory, phase-change memory, etc.), a solid-state drive, hard disk drive, an optical disc drive, or a portable memory device.
With respect to “alternative reality,” the term alternative reality may pertain to virtual reality, augmented reality, and mixed reality unless context suggests otherwise. “Virtual reality” includes virtual content in a virtual setting, which setting can be a fantasy or a real-world simulation. “Augmented reality” and “mixed reality” include virtual content in a real-world setting such as a real depiction of a portion of a patient's body including the anatomical element. Augmented reality includes the virtual content in the real-world setting, but the virtual content is not necessarily anchored in the real-world setting. For example, the virtual content can be information overlying the real-world setting. The information can change as the real-world setting changes due to time or environmental conditions in the real-world setting, or the information can change as a result of a consumer of the augmented reality moving through the real-world setting; however, the information remains overlying the real-world setting. Mixed reality includes the virtual content anchored in every dimension of the real-world setting. For example, the virtual content can be a virtual object anchored in the real-world setting. The virtual object can change as the real-world setting changes due to time or environmental conditions in the real-world setting, or the virtual object can change to accommodate the perspective of a consumer of the mixed reality as the consumer moves through the real-world setting. The virtual object can also change in accordance with any interactions with the consumer or another real-world or virtual agent. Unless the virtual object is moved to another location in the real-world setting by the consumer of the mixed reality, or some other real-world or virtual agent, the virtual object remains anchored in the real-world setting. Mixed reality does not exclude the foregoing information overlying the real-world setting described in reference to augmented reality.
In the following description, the terms “or” and “and/or” as used herein are to be interpreted as inclusive or meaning any one or any combination. As an example, “A, B or C” or “A, B and/or C” mean “any of the following: A; B; C; A and B; A and C; B and C; A, B and C.” An exception to this definition will occur only when a combination of elements, components, functions, steps or acts are in some way inherently mutually exclusive.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by those of ordinary skill in the art.
Briefly summarized, embodiments disclosed herein are directed to a medical analytic system for representing a region of a patient's body for analysis. One of the embodiments may be directed to monitoring the advancement of a medical component (e.g., needle, introducer, catheter, etc.) by way of sound waves (ultrasound), for example. As disclosed, the medical analytic system may include, in some embodiments, an ultrasound-imaging system and an AR headset for the analysis, where the ultrasound-imaging system includes AR anatomy representation logic.
More specifically, the ultrasound-imaging system includes an ultrasound probe and a console, which may be configured to include the AR anatomy representation logic or a portion thereof. The ultrasound probe is configured to emit ultrasound signals (sound waves) into a patient and receive echoed ultrasound signals (sound waves) from the patient by way of a piezoelectric sensor array or an array of capacitive micromachined ultrasonic transducers (“CMUTs”). According to one embodiment of the disclosure, the ultrasound probe may receive commands from the console to capture information associated with numerous “slices” of an anatomical element (e.g., a vessel, tissue, etc.) during ultrasound scanning; namely information associated with multiple (two or more) sub-images of an ultrasound image of the anatomical element where the sub-images are captured transverse to a longitudinal axis of the anatomical element. Each slice constitutes information associated with a two-dimensional (“2D”) or three-dimensional (“3D”) planar sub-image of the anatomical element, where multiple slices are overlaid to collectively reproduce a 3D representation of the anatomical element. Alternatively, as another embodiment of the disclosure, the ultrasound probe may include visualization logic that automatically captures ultrasound scanning information as to individual slices of an image associated with the anatomical element and provides the same to the console with results of the piezoelectric sensor array or the array of CMUTs.
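As a minimal sketch of how multiple transverse slices might be overlaid to reproduce a 3D representation, the following assumes each slice arrives as an equally spaced 2D array of the same in-plane shape; the function name and the synthetic disk data are hypothetical.

```python
import numpy as np


def slices_to_volume(slices):
    """Overlay equally spaced 2D planar sub-images into a 3D array whose last
    axis runs along the anatomical element's longitudinal axis."""
    slices = [np.asarray(s) for s in slices]
    if len({s.shape for s in slices}) != 1:
        raise ValueError("all slices must share the same in-plane shape")
    return np.stack(slices, axis=-1)            # shape: (rows, cols, n_slices)


# Example: 40 synthetic 64x64 slices of a circular vessel cross-section.
yy, xx = np.mgrid[:64, :64]
disk = ((yy - 32) ** 2 + (xx - 32) ** 2 < 10 ** 2).astype(float)
volume = slices_to_volume([disk] * 40)          # volume.shape == (64, 64, 40)
```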
The console features electronic circuitry including memory and a processor configured to transform the echoed ultrasound signals to produce ultrasound-image segments corresponding to anatomical structures of the patient. These ultrasound-image segments may be combined to form ultrasound frames for display. Additionally, according to one embodiment of the disclosure, the AR anatomy representation logic may be deployed as hardware, software, or a combination of hardware and software. For instance, when deployed as software, the AR anatomy representation logic may include visualization logic configured to issue commands to the ultrasound probe to capture multiple "slices" of the anatomical element image (e.g., a vessel, artery, etc.) during ultrasound scanning. The AR anatomy representation logic may further include virtual slice positioning logic to adjust the orientation and position of each image slice, virtual object assembly logic to orderly assemble the imaged anatomical element, and virtual object display logic to render a virtual object along with the ultrasound-imaged object.
More specifically, as the ultrasound probe scans and moves along an ultrasound imaging area to capture a vessel, for example, the visualization logic controls the capturing of information associated with vertically-oriented portions (slices) of the ultrasound image (hereinafter, "slice images") and returns the slice images to the virtual slice positioning logic. The virtual slice positioning logic is configured to determine the longitudinal orientation and positioning of each slice image based, at least in part, on the direction and speed of the ultrasound probe when in use, where such information is provided to the virtual object assembly logic. The virtual object assembly logic is configured to form the virtual object by longitudinally organizing the slice images and laterally overlaying the virtual object over the ultrasound image. For example, each visualization represented by a slice image would be positioned proximate to a neighboring slice image to construct, in a longitudinal direction, the virtual object such as a vessel virtual object.
Thereafter, the virtual object display logic is configured to display the collective slice images as the anatomical element in a virtual context (i.e., as a virtual object within a virtual reality view, a mixed reality view, or as a 3D model of the vessel).
The alternative-reality headset includes a display screen coupled to a headset frame having electronic circuitry including memory and a processor. The display screen may be configured such that a wearer of the alternative-reality headset can see the patient through the display screen. The display screen is configured to display objects of virtual anatomy over the patient corresponding to the ultrasound-image segments.
In some embodiments, the ultrasound probe is configured with a pulsed-wave Doppler imaging mode for emitting and receiving the ultrasound signals. The console is configured to capture ultrasound-imaging frames in accordance with the pulsed-wave Doppler imaging mode, combine the ultrasound-imaging frames together with an aggregation function, and segment the ultrasound-imaging frames or the aggregated ultrasound-imaging frames into the ultrasound-image segments with an image segmentation function.
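A hedged sketch of this capture-aggregate-segment flow follows. The disclosure does not specify the aggregation or image segmentation functions, so the max-hold aggregation and fixed-fraction threshold below are stand-in assumptions, as are the function names.

```python
import numpy as np


def aggregate_frames(frames):
    """Combine ultrasound-imaging frames; max-hold keeps the strongest Doppler
    response observed at each pixel across the sweep (an assumed choice)."""
    return np.max(np.stack(frames), axis=0)


def segment(aggregated, fraction=0.5):
    """Split the aggregated frame into binary ultrasound-image segments using
    a fixed-fraction intensity threshold (also an assumed choice)."""
    return aggregated >= fraction * aggregated.max()


frames = [np.random.rand(64, 64) for _ in range(8)]  # stand-in Doppler frames
mask = segment(aggregate_frames(frames))
```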
In some embodiments, when the AR anatomy representation logic is activated, the console is configured to generate the virtual object as an aggregate of the ultrasound-image segments overlaid by a collection of virtualizations (image slices) by the virtual object assembly logic. The console is configured to send the objects of virtual anatomy to the alternative-reality headset for display over the patient.
Medical Analytic System Architecture
Referring to
Notwithstanding the foregoing, in some embodiments of the medical analytic system 100, at least a portion of the functionality of the AR anatomy representation logic 150 may be deployed within the AR headset 140 in lieu of the console 120. Herein, the AR headset 140 or another component operating in cooperation with the AR headset 140 may serve as the console or perform the functions (e.g., processing) thereof.
More specifically, as shown in
The console 120 includes a number of components of the medical analytic system 100, and the console 120 can take any of a variety of forms to house the number of components. The one-or-more processors 214 and the memory 212 (e.g., non-volatile memory such as electrically erasable, programmable, read-only memory "EEPROM" or flash) of the console 120 are configured for controlling various functions of the medical analytic system 100 such as executing the AR anatomy representation logic 150 during operation of the medical analytic system 100. A digital controller or analog interface 220 is also included with the console 120, and the digital controller or analog interface 220 is in communication with the one-or-more processors 214 and other system components to govern interfacing between the ultrasound probe 130, the AR headset 140, as well as other system components.
The console 120 further includes ports 222 for connection with additional components such as optional components 224 including a printer, storage media, keyboard, etc. The ports 222 may be implemented as universal serial bus (“USB”) ports, though other types of ports or a combination of port types can be used, as well as other interfaces or connections described herein. A power connection 226 may be included with the console 120 to enable operable connection to an external power supply 228. An internal power supply 230 (e.g., disposable or rechargeable battery) can also be employed, either with the external power supply 228 or exclusive of the external power supply 228. Power management circuitry 232 is included with the digital controller or analog interface 220 of the console 120 to regulate power use and distribution.
A display 234 can be, for example, a liquid crystal display ("LCD") integrated into the console 120 and used to display information to the clinician during a procedure. For example, the display 234 can be used to display an ultrasound image of a targeted internal body portion of the patient attained by the ultrasound probe 130. Additionally, or in the alternative, the display 234 can be used to display the virtual object positioned overlying the ultrasound image without the need for the AR headset 140. The virtual object would provide a more detailed, virtual representation of the internal anatomical element of the patient being imaged (e.g., vessel, tissue, etc.).
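One plausible way to composite the virtual object over the ultrasound image on the console display is simple alpha blending, sketched below; the RGBA representation of the virtual object rendering is an assumption of this example rather than something specified by the disclosure.

```python
import numpy as np


def overlay_virtual_object(ultrasound_gray: np.ndarray,
                           virtual_rgba: np.ndarray) -> np.ndarray:
    """Alpha-blend an RGBA rendering of the virtual object over a grayscale
    ultrasound frame; both inputs are float arrays scaled to [0, 1]."""
    base = np.repeat(ultrasound_gray[..., None], 3, axis=-1)  # gray -> RGB
    alpha = virtual_rgba[..., 3:4]                            # per-pixel opacity
    return (1.0 - alpha) * base + alpha * virtual_rgba[..., :3]
```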
Alternatively, the display 234 can be separate from the console 120 instead of integrated into the console 120; however, such a display is different than that of the AR headset 140. The console 120 can further include a console button interface 236. In combination with control buttons on the ultrasound probe 130, the console button interface 236 can be used by a clinician to immediately call up a desired mode on the display 234 for use by the clinician. For example, two operating modes may include a first mode (e.g., ultrasound mode) and a second mode (e.g., AR enhanced mode), as stated above.
The ultrasound probe 130 is configured to emit ultrasound signals into the patient and receive the echoed ultrasound signals from the patient by way of a piezoelectric sensor array 238 or array of CMUTs. The ultrasound probe 130 can be configured with a continuous wave or a pulsed-wave imaging mode. For example, the ultrasound probe 130 can be configured with the foregoing pulsed-wave Doppler imaging mode for emitting and receiving the ultrasound signals.
The ultrasound probe 130 further includes a button-and-memory controller 240 for governing operation of the ultrasound probe 130 and buttons thereof. The button-and-memory controller 240 can include non-volatile memory such as EEPROM. The button-and-memory controller 240 is in operable communication with an ultrasound probe interface 242 of the console 120, where the ultrasound probe interface 242 includes a piezoelectric input-output (“I/O”) component 244 for interfacing with the piezoelectric sensor array 238 (or CMUT input-output (“I/O”) component for interfacing with the array of CMUTs) of the ultrasound probe 130 and a button-and-memory I/O component 246 for interfacing with the button-and-memory controller 240 of the ultrasound probe 130. Hence, the operating mode of the ultrasound-imaging system 110 may be controlled at the ultrasound probe 130 (via the button-and-memory controller 240) and/or at the console 120 (via the console button interface 236).
As further illustrated in
Alternatively, according to another embodiment of the disclosure, the visualization logic 250 may be configured to produce slice images based on data associated with each ultrasound frame that is generated, where the aggregate of ultrasound frames constitutes the ultrasound image. From the data associated with each ultrasound frame, the visualization logic 250 is configured to generate a virtual representation of a portion of the anatomical element captured by that ultrasound frame.
The virtual slice positioning logic 260 is configured to determine the position/orientation of each slice image generated by the visualization logic 250 based on, at least in part, the direction and speed of the ultrasound probe 130 when in use. For example, the speed in the movement of the ultrasound probe 130 may be relied upon to determine placement of a virtual slice image over the ultrasound image along at least the x and y axes of the virtual object. Similarly, the direction of the ultrasound probe 130 may be relied upon to determine placement of the virtual slice image over the ultrasound image along any or all of the x, y, and z axes.
The virtual object assembly logic 270 is communicatively coupled to the virtual slice positioning logic 260. Based on the positioning of the slice images, the virtual object assembly logic 270 is configured to generate a virtual object by longitudinally positioning each slice image at a determined location, where the virtual object is an aggregate of the positioned slice images. As an illustrative example, each slice image is longitudinally organized adjacent to a neighboring slice image to construct the virtual object, such as a vessel virtual object for example.
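The positioning and assembly described in the two preceding paragraphs can be sketched as an integration of probe motion. Per-frame speed and direction samples, and the dictionary-based slice record, are assumptions of this illustration only.

```python
import numpy as np


def place_and_assemble(slices, speeds_mm_s, directions, dt_s):
    """Integrate probe motion to obtain each slice image's pose, then assemble
    the posed slices, each adjacent to its neighbor, into one virtual object."""
    pos = np.zeros(3)
    virtual_object = []
    for sl, v, d in zip(slices, speeds_mm_s, directions):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)          # unit travel direction (x, y, or z)
        virtual_object.append({"position": pos.copy(), "image": sl})
        pos = pos + v * dt_s * d           # advance to the next slice's pose
    return virtual_object
```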
The virtual object display logic 280 is communicatively coupled to the virtual object assembly logic 270. Herein, the virtual object display logic 280 is configured to display the aggregated slice images as the virtual object that represents an anatomical element under analysis in a virtual context. The virtual context may include, but is not limited or restricted to, a virtual reality view, a mixed reality view, or a 3D model of the anatomical element.
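Where the virtual context is a 3D model of the vessel, one illustrative path is to convert each slice's detected lumen center and radius into a ring of surface points; the circular cross-section assumption and all names below are hypothetical.

```python
import numpy as np


def vessel_surface(centers, radii, spacing_mm, n=32):
    """Turn per-slice lumen centers and radii into rings of wall points; the
    rings stack along z to form a point cloud of the vessel surface."""
    theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    rings = []
    for k, (c, r) in enumerate(zip(centers, radii)):
        ring = np.stack([c[0] + r * np.cos(theta),
                         c[1] + r * np.sin(theta),
                         np.full(n, k * spacing_mm)], axis=1)
        rings.append(ring)
    return np.concatenate(rings)           # (n_slices * n, 3) points
```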
Referring to
As shown in
Referring now to
Method
Referring to
After producing the multiple slices of the anatomical element image during an ultrasound scanning session, the AR anatomy representation logic, based on certain usage parameters associated with the ultrasound probe, determines the positioning of each slice image (operation 530) and the arrangement (assembly) of the slice images to produce the virtual object (operation 540). The positioning/assembly may be directed to a lateral (xz-axis or yz-axis) arrangement of the slice images within the virtual representation of the anatomical element that corresponds to the virtual object. The usage parameters may include, but are not limited or restricted to, speed and/or direction in movement of the ultrasound probe over the ultrasound area.
Thereafter, the AR anatomy representation logic is configured to display the virtual object in an alternative reality (operation 550), such as an overlay over the ultrasound image, where the virtual object may be more prominent when viewed using an AR headset. However, the rendering of the virtual object on the display may be conducted so that the virtual object is visible without the AR headset.
Embodiments of the invention may be embodied in other specific forms without departing from the spirit of the present disclosure. The described embodiments are to be considered in all respects only as illustrative, not restrictive. The scope of the embodiments is, therefore, indicated by the appended claims rather than by the foregoing description. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.
This application claims the benefit of priority to U.S. Provisional Patent Application No. 63/063,709, filed Aug. 10, 2020, which is incorporated by reference in its entirety into this application.