The present disclosure relates to the treatment of patients with lung diseases and, more particularly, to devices, systems, and methods for implementing a dynamic 3D lung map view for tool navigation inside a patient's lungs.
Lung cancer has an extremely high mortality rate, especially if it is not diagnosed in its early stages. The National Lung Screening Trial has demonstrated that a reduction in mortality occurs if diagnostic scans such as computed tomography (CT) scans are used for early detection in those at risk of developing the disease. While CT scans increase the possibility that small lesions and nodules in the lung can be detected, these lesions and nodules still require biopsy and cytological examination before a diagnosis can be rendered and treatment can be undertaken.
To perform a biopsy, as well as many treatments, navigation of tools within the lungs to the point of biopsy or treatment is necessary. Accordingly, improvements to systems and methods of navigating are continually being sought.
Provided in accordance with the present disclosure is a method for implementing a dynamic three-dimensional (3D) lung map view for navigating a probe inside a patient's lungs.
In an aspect of the present disclosure, the method includes loading a navigation plan into a navigation system, the navigation plan including a planned pathway shown in a 3D model generated from a plurality of CT images, inserting the probe into a patient's airways, the probe including a location sensor in operative communication with the navigation system, registering a sensed location of the probe with the planned pathway, selecting a target in the navigation plan, presenting a view of the 3D model showing the planned pathway and indicating the sensed location of the probe, navigating the probe through the airways of the patient's lungs toward the target, iteratively adjusting the presented view of the 3D model showing the planned pathway based on the sensed location of the probe, and updating the presented view by removing at least a part of an object forming part of the 3D model.
In another aspect of the present disclosure, iteratively adjusting the presented view of the 3D model includes zooming in when the probe approaches the target.
In yet another aspect of the present disclosure, iteratively adjusting the presented view of the 3D model includes zooming in when the diameter of an airway within which the probe is sensed to be located is less than a predetermined threshold.
In another aspect of the present disclosure, iteratively adjusting the presented view of the 3D model includes changing the presented view to a view wherein the airway tree bifurcation is maximally spread.
In yet another aspect of the present disclosure, iteratively adjusting the presented view of the 3D model includes aligning the view with the sensed location of the probe to show where the probe is and what lies ahead of the probe.
In another aspect of the present disclosure, iteratively adjusting the presented view of the 3D model includes changing the presented view to be orthogonal to a vector from the probe to the pathway.
In yet another aspect of the present disclosure, iteratively adjusting the presented view of the 3D model includes changing the presented view to be perpendicular to the sensed location of the probe in relation to the 3D model to show the area around the probe.
In another aspect of the present disclosure, iteratively adjusting the presented view of the 3D model includes changing the presented view to be behind the sensed location of the probe in relation to the 3D model to show the area ahead of the probe.
In yet another aspect of the present disclosure, iteratively adjusting the presented view of the 3D model includes changing the presented view to be at the tip of the probe and orthogonal to the direction in which the probe is moving.
In another aspect of the present disclosure, iteratively adjusting the presented view of the 3D model includes changing the presented view to be perpendicular to a vector from the probe to the target to show the alignment of the probe to the target.
In yet another aspect of the present disclosure, iteratively adjusting the presented view of the 3D model includes rotating the presented view around a focal point to improve a 3D perception of the sensed location of the probe in relation to the 3D model.
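By way of a non-limiting illustration of the view adjustments described in the preceding aspects, the following sketch places the view point behind the probe, zooms in when the local airway is narrow, and derives a view orientation from the probe-to-target vector. It is a minimal sketch in Python; the function name, the back-off distance, and the 3 mm diameter threshold are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch only; names and thresholds are assumptions.
import numpy as np

def camera_pose(probe_pos, probe_dir, target_pos, airway_diameter,
                back_off=30.0, zoom_diameter=3.0):
    """Place the view point behind the probe, looking along its direction
    of travel, and zoom in when the airway is narrower than a threshold."""
    probe_dir = probe_dir / np.linalg.norm(probe_dir)
    if airway_diameter < zoom_diameter:
        # Zoom in by moving the view point closer to the probe.
        back_off *= airway_diameter / zoom_diameter
    eye = probe_pos - back_off * probe_dir             # behind the probe
    look_at = probe_pos + 0.5 * back_off * probe_dir   # what lies ahead
    # An up-vector orthogonal to the probe-to-target vector yields a view
    # that shows the alignment of the probe with the target.
    up = np.cross(probe_dir, target_pos - probe_pos)
    if np.linalg.norm(up) < 1e-6:                      # probe aims at target
        up = np.array([0.0, 0.0, 1.0])
    return eye, look_at, up / np.linalg.norm(up)
```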
In a further aspect of the present disclosure, updating the presented view by removing at least part of an object includes removing at least part of an object which is outside of a region of interest.
In yet a further aspect of the present disclosure, updating the presented view by removing at least part of an object includes removing at least part of an object which is obstructing the probe.
In a further aspect of the present disclosure, updating the presented view by removing at least part of an object includes removing at least part of an object which is obstructing the target.
In yet a further aspect of the present disclosure, updating the presented view by removing at least part of an object includes removing at least part of an object which is not relevant to the sensed location of the probe.
In a further aspect of the present disclosure, updating the presented view by removing at least part of an object includes removing at least part of an object which is not relevant to a current selected state of the navigation system.
In another aspect of the present disclosure, the method further includes presenting an alert.
In a further aspect of the present disclosure, presenting an alert includes presenting an alert when the probe is approaching the pleura.
In yet a further aspect of the present disclosure, presenting an alert includes presenting an alert when the tool is approaching major blood vessels.
In a further aspect of the present disclosure, presenting an alert includes presenting an alert when the sensed location of the probe is off of the planned pathway.
Any of the above aspects and embodiments of the present disclosure may be combined without departing from the scope of the present disclosure.
Various aspects and features of the present disclosure are described hereinbelow with reference to the drawings.
Devices, systems, and methods for implementing a dynamic 3D lung map view for tool navigation inside a patient's lungs are provided in accordance with the present disclosure. A location sensor may be incorporated into different types of tools and catheters to track their location and assist in their navigation. The tracked location of the location sensor may be used to visually show the location of a tool on the dynamic 3D lung map. The location of the location sensor within the body of a patient, with reference to a 3D map or 2D images as well as a planned pathway, assists the clinician in navigating the patient's lungs. However, because of the amount of data being presented and the level of airway detail that can be shown, it is desirable to assist the clinician by eliminating unessential data, or data regarding portions of the anatomy that are unrelated to a specific navigation or a specific procedure. In addition, it is desirable to harness this detailed anatomical data to alert the clinician regarding proximity to certain anatomical features. These and other aspects of the present disclosure are detailed herein below.
The dynamic 3D lung map view, as disclosed herein, is one of a variety of views that may be presented by an electromagnetic navigation (EMN) system which may be used by a clinician to perform an ELECTROMAGNETIC NAVIGATION BRONCHOSCOPY® (ENB) procedure. Among other tasks that may be performed using the EMN system are planning a pathway to target tissue, navigating a positioning assembly to the target tissue, and navigating a variety of tools, such as a locatable guide (LG) and/or a biopsy tool to the target tissue.
An ENB procedure generally involves at least two phases: (1) planning a pathway to a target located within, or adjacent to, the patient's lungs; and (2) navigating a probe to the target along the planned pathway. These phases are generally referred to as (1) “planning” and (2) “navigation.” An example of the planning software described herein can be found in U.S. patent application Ser. Nos. 13/838,805, 13/838,997, and 13/839,224, all of which were filed by Covidien LP on Mar. 15, 2013, are entitled “Pathway Planning System and Method,” and are incorporated herein by reference. A further example of the planning software can be found in commonly assigned U.S. Provisional Patent Application No. 62/020,240 entitled “SYSTEM AND METHOD FOR NAVIGATING WITHIN THE LUNG,” the entire contents of which are incorporated herein by reference.
Prior to the planning phase, the patient's lungs are imaged by, for example, a computed tomography (CT) scan, although additional applicable methods of imaging will be known to those skilled in the art. The image data assembled during the CT scan may then be stored in, for example, the Digital Imaging and Communications in Medicine (DICOM) format, although additional applicable formats will be known to those skilled in the art. The CT scan image data may then be loaded into a planning software application (“application”) to be used during the planning phase of the ENB procedure.
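For context, a minimal sketch of loading such CT image data follows. It assumes the open-source pydicom library and a directory of .dcm slice files; the disclosure does not prescribe any particular library or file layout.

```python
# Minimal sketch, assuming the pydicom library; not prescribed by the disclosure.
from pathlib import Path

import numpy as np
import pydicom

def load_ct_volume(dicom_dir: str) -> np.ndarray:
    """Read every DICOM slice in a directory and stack them into a 3D volume."""
    slices = [pydicom.dcmread(p) for p in Path(dicom_dir).glob("*.dcm")]
    # Order slices along the scan axis using the standard DICOM position tag.
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
    return np.stack([s.pixel_array for s in slices])
```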
The application may use the CT scan image data to generate a three-dimensional (3D) model of the patient's lungs. The 3D model may include, among other things, a model airway tree corresponding to the actual airways of the patient's lungs, and showing the various passages, branches, and bifurcations of the patient's actual airway tree. Additionally, the 3D model may include lesions, markers, blood vessels, and/or a 3D rendering of the pleura. While the CT scan image data may have gaps, omissions, and/or other imperfections included in the image data, the 3D model is a smooth representation of the patient's airways, with any such gaps, omissions, and/or imperfections in the CT scan image data filled in or corrected. As described in more detail below, the 3D model may be viewed in various orientations. For example, if a clinician desires to view a particular section of the patient's airways, the clinician may view the 3D model represented in a 3D rendering and rotate and/or zoom in on the particular section of the patient's airways. Additionally, during the navigation phase of an ENB procedure, while a tool is being navigated through the patient's airways, the clinician may want to have the presented view of the 3D model dynamically updated as the tool is navigated. Such a dynamic 3D lung map view is disclosed below.
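The disclosure does not specify how the smooth 3D model is computed from the CT data; one conventional approach is isosurface extraction such as marching cubes, sketched below using scikit-image. The -500 HU threshold separating air-filled airways from surrounding tissue is an illustrative assumption.

```python
# One standard meshing approach (marching cubes); the disclosure does not
# mandate this technique, and the threshold is illustrative.
import numpy as np
from skimage import measure

def airway_surface(ct_volume: np.ndarray, level: float = -500.0):
    """Extract an airway surface mesh from a CT volume given in Hounsfield
    units, returning vertices, triangular faces, and vertex normals."""
    verts, faces, normals, _ = measure.marching_cubes(ct_volume, level=level)
    return verts, faces, normals
```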
Prior to the start of the navigation phase of an ENB procedure, the 3D model is registered with the actual lungs of the patient. One potential method of registration involves navigating a locatable guide into each lobe of the patient's lungs to at least the second bifurcation of the airways of that lobe. The position of the locatable guide is tracked during this registration phase, and the 3D model is iteratively updated based on the tracked position of the locatable guide within the actual airways of the patient's lungs. This registration process is described in commonly-owned U.S. Provisional Patent Application Ser. No. 62/020,220 entitled “Real-Time Automatic Registration Feedback”, filed on Jul. 2, 2014, by Brown et al.
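The registration method itself is detailed in the incorporated application. Purely as an illustration of fitting tracked sensor locations to corresponding points of the 3D model, the sketch below computes a least-squares rigid transform using the Kabsch algorithm, which is one standard technique and not necessarily the one employed.

```python
# Illustrative only: a standard rigid point-set fit (Kabsch algorithm),
# not necessarily the registration method of the incorporated application.
import numpy as np

def rigid_registration(sensor_pts: np.ndarray, model_pts: np.ndarray):
    """Least-squares rotation R and translation t such that
    model_point ~= R @ sensor_point + t, for paired Nx3 point sets."""
    p0, q0 = sensor_pts.mean(axis=0), model_pts.mean(axis=0)
    H = (sensor_pts - p0).T @ (model_pts - q0)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q0 - R @ p0
    return R, t
```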
Catheter guide assemblies 90, 100 including LG 92 and EWC 96 are configured for insertion through a working channel of bronchoscope 50 into the patient's airways (although the catheter guide assemblies 90, 100 may alternatively be used without bronchoscope 50). LG 92 and EWC 96 are selectively lockable relative to one another via a locking mechanism 99. A six degrees-of-freedom electromagnetic tracking system 70, e.g., similar to those disclosed in U.S. Pat. No. 6,188,355 and published PCT Application Nos. WO 00/10456 and WO 01/67035, entitled “Wireless six-degree-of-freedom locator”, filed on Dec. 14, 1998 by Gilboa, the entire contents of each of which are incorporated herein by reference, or any other suitable position measuring system, is utilized for performing navigation, although other configurations are also contemplated. Tracking system 70 is configured for use with catheter guide assemblies 90, 100 to track the position of EM sensor 94 as it moves in conjunction with EWC 96 through the airways of the patient, as detailed below.
Although EM sensor 94 is described above as being included in LG 92, it is also envisioned that EM sensor 94 may be embedded or incorporated within biopsy tool 102, in which case biopsy tool 102 may alternatively be utilized for navigation without need of LG 92 or the tool exchanges that use of LG 92 requires. A variety of usable biopsy tools are described in U.S. Provisional Patent Application Nos. 61/906,732 and 61/906,762, both entitled “DEVICES, SYSTEMS, AND METHODS FOR NAVIGATING A BIOPSY TOOL TO A TARGET LOCATION AND OBTAINING A TISSUE SAMPLE USING THE SAME” and filed Nov. 20, 2013, and U.S. Provisional Patent Application No. 61/955,407 having the same title and filed Mar. 14, 2014, the entire contents of each of which are incorporated herein by reference and usable with the EMN system 10 as described herein.
During procedure planning, workstation 80 utilizes computed tomographic (CT) scan image data for generating and viewing a three-dimensional (3D) model of the patient's airways, enables the identification of target tissue on the 3D model (automatically, semi-automatically or manually), and allows for the selection of a pathway through the patient's airways to the target tissue. The 3D model may be presented on a display monitor associated with workstation 80, or in any other suitable fashion.
Using workstation 80, various views of the 3D model may be presented and may be manipulated by a clinician to facilitate identification of a target and selection of a suitable pathway through the patient's airways to access the target. For example, EMN application 81 may be configured in various states to display the 3D model in a variety of view modes. Some of these view modes may include a dynamic 3D lung map view, as further described below. For each view of the 3D model, the angle from which the 3D model is displayed may correspond to a view point. The view point may be fixed at a predefined location and/or orientation, or may be adjusted by the clinician operating workstation 80.
The 3D model may also show marks of the locations where previous biopsies were performed, including the dates, times, and other identifying information regarding the tissue samples obtained. These marks may also be selected as the target to which a pathway can be planned. Once selected, the pathway is saved for use during the navigation procedure.
Following procedure planning, a procedure may be undertaken in which the EM sensor 94, in conjunction with tracking system 70, enables tracking of EM sensor 94 (and thus the distal end of the EWC or the tool 102) as EM sensor 94 is advanced through the patient's airways following the pathway planned during the procedure planning phase.
At step S304, EMN application 81 determines whether any objects are visible from the current view point but are outside of a region of interest for the current navigation procedure. An example might be other targets, or portions of the patient's physiology, such as blood vessels and the heart, which lie outside of the region in which the pathway is located, such as in other lobes of the patient's lungs or along other branches of airway tree 404 which are not used for the current procedure. If EMN application 81 determines that such objects are visible, those objects may be removed from the view at step S306.
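As an illustration of the region-of-interest test of steps S304 and S306, the following sketch hides objects whose centroids fall outside a spherical region around the pathway. The spherical shape, the object dictionaries, and the function name are illustrative assumptions.

```python
# Illustrative sketch; a spherical region of interest is an assumption.
import numpy as np

def hide_outside_roi(objects, roi_center, roi_radius):
    """Steps S304/S306 sketch: hide any object whose centroid lies outside
    a sphere bounding the region of interest for the current procedure."""
    for obj in objects:
        dist = np.linalg.norm(obj["centroid"] - roi_center)
        obj["hidden"] = dist > roi_radius
    return objects
```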
Thereafter, or if EMN application 81 determines that there are no such objects in the view, processing proceeds to step S308, where EMN application 81 determines whether there are objects obstructing the view of digital probe 406 and/or target 510. For example, depending on the angle of the view point, surrounding airways which do not form part of the planned pathway may lie in the line of sight between the view point and probe 406 or target 510. If EMN application 81 determines that such objects are obstructing the view of probe 406 or target 510, those objects may be removed from the view at step S310.
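As one way to realize the obstruction test of steps S308 and S310, the sketch below approximates each object by a bounding sphere and checks whether it intersects the line of sight from the view point to the probe; the bounding-sphere simplification is an assumption made for illustration.

```python
# Illustrative sketch; bounding spheres are an assumed simplification.
import numpy as np

def occludes_line_of_sight(obj_center, obj_radius, view_point, probe_pos):
    """Steps S308/S310 sketch: True if a sphere-approximated object crosses
    the segment from the view point to the probe (or target) position."""
    seg = probe_pos - view_point
    t = np.clip(np.dot(obj_center - view_point, seg) / np.dot(seg, seg),
                0.0, 1.0)
    closest = view_point + t * seg                 # nearest point on segment
    return np.linalg.norm(obj_center - closest) < obj_radius
```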
Thereafter, or if EMN application 81 determines that there are no such objects in the view, processing proceeds to step S312, where EMN application 81 determines whether any objects visible in the view are unrelated to the position of probe 406, the type of tool being used in the current navigation procedure, or the selected state of EMN application 81. For example, markers indicating the location of previous biopsies at different target locations may be within the view angle from the view point but not relevant to the current procedure; such objects may be removed from the view at step S314.
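The relevance test of steps S312 and S314 can be thought of as a lookup from the current tool type or application state to the kinds of objects worth drawing. The table below is hypothetical; the disclosure does not enumerate tool types or object kinds.

```python
# Hypothetical relevance table; tool names and object kinds are assumptions.
RELEVANT_KINDS = {
    "biopsy": {"airway", "pathway", "current_target"},
    "ablation": {"airway", "pathway", "current_target", "blood_vessel"},
}

def hide_irrelevant(objects, tool_type):
    """Steps S312/S314 sketch: hide objects (e.g., markers of previous
    biopsies at other targets) not relevant to the tool or state in use."""
    keep = RELEVANT_KINDS.get(tool_type, set())
    for obj in objects:
        obj["hidden"] = obj["kind"] not in keep
    return objects
```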
Thereafter, or if EMN application 81 determines that there are no such objects in the view, processing proceeds to step S316, where EMN application 81 determines whether digital probe 406, and thus sensor 94, is approaching the pleural boundaries of the patient's lungs. EMN application 81 may determine that sensor 94 is approaching the pleural boundaries based on, for example, the distance between sensor 94 and the pleura, the angle between sensor 94 and the pleura, the speed at which sensor 94 is moving, and/or any combination thereof. The determination may further take a known or estimated rate of navigational error into account. When sensor 94 is close to the pleura, there is an increased risk of injury, such as pneumothorax, to the patient, and the clinician may want to be aware of this in order to proceed with added caution. Thus, if EMN application 81 determines that sensor 94 is close to the pleura, EMN application 81 may present an alert to the clinician at step S318.
Thereafter, or if EMN application 81 determines that sensor 94 is not approaching the pleura, processing proceeds to step S320, where EMN application 81 determines whether sensor 94 is approaching one or more major blood vessels. As with the pleura, when sensor 94 is close to major blood vessels, particularly where a tool 102, such as a biopsy or microwave ablation tool, is being deployed, there is added risk of injury to the patient, and the clinician may want to be aware that sensor 94 is close to major blood vessels in order to proceed with added caution. Thus, if EMN application 81 determines that sensor 94 is close to major blood vessels, EMN application 81 may present an alert to the clinician at step S322. Additionally, as with the pleura, EMN application 81 may take known or estimated navigational errors into account when determining whether sensor 94 is approaching major blood vessels.
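One possible formulation of the proximity checks of steps S316 through S322 is sketched below: the sensor position is projected ahead along its velocity, and an alert fires when the predicted position comes within a safety margin, inflated by the navigational error, of a critical structure. The margin, the look-ahead horizon, and the point-cloud representation of the pleura and vessels are illustrative assumptions.

```python
# Illustrative sketch; thresholds and representations are assumptions.
import numpy as np

def proximity_alert(structure_pts, sensor_pos, sensor_vel,
                    nav_error=0.0, margin=5.0, horizon=2.0):
    """Steps S316-S322 sketch: alert when the sensor, projected `horizon`
    seconds ahead along its velocity, comes within `margin` mm (plus the
    known or estimated navigational error) of a critical structure such as
    the pleura or a major blood vessel, given as an Nx3 point cloud."""
    predicted = sensor_pos + horizon * sensor_vel
    dists = np.linalg.norm(structure_pts - predicted, axis=1)
    return float(np.min(dists)) < margin + nav_error
```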
Thereafter, or if EMN application 81 determines that sensor 94 is not approaching any major blood vessels, processing proceeds to step S324, where EMN application 81 determines whether probe 406 has arrived at the target. If EMN application 81 determines that probe 406 has not arrived at the target, processing returns to step S302. In this way, the dynamic 3D lung map view is continuously updated and/or adjusted during the navigation procedure. If EMN application 81 determines that digital probe 406 has arrived at the target, processing proceeds to step S326, where EMN application 81 determines whether there are more targets to be visited. If EMN application 81 determines that there are no more targets to be visited, processing ends. Otherwise, processing returns to step S302.
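Tying the steps together, the loop below sketches how the view may be iteratively adjusted until each target is reached, reusing the helper sketches above; the `nav` and `view` objects and all of their attributes are hypothetical stand-ins rather than elements of the disclosed system.

```python
# Sketch of the overall update loop (steps S302-S326), reusing the helper
# sketches above. `nav` and `view` are hypothetical stand-ins for the
# navigation system and the rendered dynamic 3D lung map view.
def run_dynamic_lung_map(nav, view, targets):
    for target in targets:                                    # step S326
        while not nav.arrived_at(target):                     # step S324
            view.adjust_to(nav.probe_pos(), target)           # step S302
            hide_outside_roi(view.objects,                    # steps S304-S306
                             view.roi_center, view.roi_radius)
            for obj in view.objects:                          # steps S308-S310
                if occludes_line_of_sight(obj["centroid"], obj["radius"],
                                          view.eye, nav.probe_pos()):
                    obj["hidden"] = True
            hide_irrelevant(view.objects, nav.tool_type)      # steps S312-S314
            pos, vel, err = nav.sensor_state()
            if proximity_alert(nav.pleura_pts, pos, vel, err):    # S316-S318
                view.alert("approaching pleura")
            if proximity_alert(nav.vessel_pts, pos, vel, err):    # S320-S322
                view.alert("approaching major blood vessel")
```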
The example dynamic 3D lung map view further shows additional targets 718 which have been hidden because they are located outside of the region of interest, as described above with regard to step S304.
By using the dynamic 3D lung map view described above during an ENB procedure, the clinician may be presented with a continuously updated view of the 3D model which is adjusted as the tool, and thus sensor 94, is moved through the patient's airways. The dynamic 3D lung map view presents the clinician with a view of the 3D model from a viewpoint which clearly shows digital probe 406, and removes objects which may obscure digital probe 406, airway tree 404, target 510, and/or other objects which are relevant to the ENB procedure being performed.
Detailed embodiments of devices, systems incorporating such devices, and methods using the same are described herein. However, these detailed embodiments are merely examples of the disclosure, which may be embodied in various forms. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for allowing one skilled in the art to variously employ the present disclosure in appropriately detailed structure. While the preceding embodiments are described in terms of bronchoscopy of a patient's airways, those skilled in the art will realize that the same or similar devices, systems, and methods may be used in other lumen networks, such as, for example, the vascular, lymphatic, and/or gastrointestinal networks as well.
Network interface 208 may be configured to connect to a network such as a local area network (LAN) consisting of a wired network and/or a wireless network, a wide area network (WAN), a wireless mobile network, a Bluetooth network, and/or the internet. Input device 210 may be any device by means of which a user may interact with workstation 80, such as, for example, a mouse, keyboard, foot pedal, touch screen, and/or voice interface. Output module 212 may include any connectivity port or bus, such as, for example, parallel ports, serial ports, universal serial busses (USB), or any other similar connectivity port known to those skilled in the art.
Further aspects of image and data generation, management, and manipulation usable in either the planning or navigation phases of an ENB procedure are more fully described in commonly-owned U.S. Provisional Patent Application Ser. No. 62/020,177 entitled “Methods for Marking Biopsy Location”, filed on Jul. 2, 2014, by Brown; U.S. Provisional Patent Application Ser. No. 62/020,238 entitled “Intelligent Display”, filed on Jul. 2, 2014, by Kehat et al.; U.S. Provisional Patent Application Ser. No. 62/020,242 entitled “Unified Coordinate System for Multiple CT Scans of Patient Lungs”, filed on Jul. 2, 2014, by Greenburg; U.S. Provisional Patent Application Ser. No. 62/020,245 entitled “Alignment CT”, filed on Jul. 2, 2014, by Klein et al.; U.S. Provisional Patent Application Ser. No. 62/020,250 entitled “Algorithm for Fluoroscopic Pose Estimation”, filed on Jul. 2, 2014, by Merlet; U.S. Provisional Patent Application Ser. No. 62/020,261 entitled “System and Method for Segmentation of Lung”, filed on Jul. 2, 2014, by Markov et al.; and U.S. Provisional Patent Application Ser. No. 62/020,258 entitled “Cone View—A Method of Providing Distance and Orientation Feedback While Navigating in 3D”, filed on Jul. 2, 2014, by Lachmanovich et al., the entire contents of all of which are hereby incorporated by reference.
While several embodiments of the disclosure have been shown in the drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Therefore, the above description should not be construed as limiting, but merely as exemplifications of particular embodiments. Those skilled in the art will envision other modifications within the scope and spirit of the claims appended hereto.
This application is a divisional of U.S. patent application Ser. No. 17/199,433 filed Mar. 11, 2021, now allowed, which is a continuation of U.S. patent application Ser. No. 17/068,820, filed Oct. 12, 2020, now U.S. Pat. No. 11,389,247, which is a continuation of U.S. patent application Ser. No. 16/828,947, filed Mar. 24, 2020, now U.S. Pat. No. 10,799,297, which is a continuation of U.S. patent application Ser. No. 16/418,495, filed May 21, 2019, now U.S. Pat. No. 10,646,277, which is a continuation of U.S. patent application Ser. No. 16/148,174, filed Oct. 1, 2018, now U.S. Pat. No. 10,660,708, which is a continuation of U.S. patent application Ser. No. 15/828,551, filed Dec. 1, 2017, now U.S. Pat. No. 10,105,185, which is a continuation of U.S. patent application Ser. No. 15/447,472, filed Mar. 2, 2017, now U.S. Pat. No. 9,848,953, which is a continuation of U.S. patent application Ser. No. 14/751,257, filed Jun. 26, 2015, now U.S. Pat. No. 9,603,668, which claims the benefit of the filing date of provisional U.S. Patent Application No. 62/020,262, filed Jul. 2, 2014.
Number | Name | Date | Kind |
---|---|---|---|
5592939 | Martinelli | Jan 1997 | A |
5611025 | Lorensen et al. | Mar 1997 | A |
5676673 | Ferre et al. | Oct 1997 | A |
5697377 | Wittkampf | Dec 1997 | A |
5699799 | Xu et al. | Dec 1997 | A |
5715836 | Kliegis et al. | Feb 1998 | A |
5729129 | Acker | Mar 1998 | A |
5752513 | Acker et al. | May 1998 | A |
5782762 | Vining | Jul 1998 | A |
5881124 | Giger et al. | Mar 1999 | A |
5891030 | Johnson et al. | Apr 1999 | A |
5913820 | Bladen et al. | Jun 1999 | A |
5920319 | Vining et al. | Jul 1999 | A |
5967980 | Ferre et al. | Oct 1999 | A |
5971767 | Kaufman et al. | Oct 1999 | A |
5987960 | Messner et al. | Nov 1999 | A |
6019725 | Vesely et al. | Feb 2000 | A |
6047080 | Chen et al. | Apr 2000 | A |
6083162 | Vining | Jul 2000 | A |
6138045 | Kupinski et al. | Oct 2000 | A |
6151404 | Pieper | Nov 2000 | A |
6167296 | Shahidi | Dec 2000 | A |
6181348 | Geiger | Jan 2001 | B1 |
6201387 | Govari | Mar 2001 | B1 |
6233476 | Strommer et al. | May 2001 | B1 |
6246784 | Summers et al. | Jun 2001 | B1 |
6266551 | Osadchy et al. | Jul 2001 | B1 |
6332089 | Acker et al. | Dec 2001 | B1 |
6346940 | Fukunaga | Feb 2002 | B1 |
6366800 | Vining et al. | Apr 2002 | B1 |
6381485 | Hunter et al. | Apr 2002 | B1 |
6387092 | Burnside et al. | May 2002 | B1 |
6466815 | Saito et al. | Oct 2002 | B1 |
6496188 | Deschamps et al. | Dec 2002 | B1 |
6501848 | Carroll et al. | Dec 2002 | B1 |
6501981 | Schweikard et al. | Dec 2002 | B1 |
6505065 | Yanof et al. | Jan 2003 | B1 |
6522907 | Bladen et al. | Feb 2003 | B1 |
6526162 | Asano et al. | Feb 2003 | B2 |
6535756 | Simon et al. | Mar 2003 | B1 |
6578579 | Burnside et al. | Jun 2003 | B2 |
6584174 | Schubert et al. | Jun 2003 | B2 |
6603868 | Ludwig et al. | Aug 2003 | B1 |
6611793 | Burnside et al. | Aug 2003 | B1 |
6650927 | Keidar | Nov 2003 | B1 |
6651669 | Burnside | Nov 2003 | B1 |
6694163 | Vining | Feb 2004 | B1 |
6757557 | Bladen et al. | Jun 2004 | B1 |
6783523 | Qin et al. | Aug 2004 | B2 |
6792390 | Burnside et al. | Sep 2004 | B1 |
6829379 | Knoplioch et al. | Dec 2004 | B1 |
6850794 | Shahidi | Feb 2005 | B2 |
6892090 | Verard et al. | May 2005 | B2 |
6898263 | Avinash et al. | May 2005 | B2 |
6909913 | Vining | Jun 2005 | B2 |
6920347 | Simon et al. | Jul 2005 | B2 |
6925200 | Wood et al. | Aug 2005 | B2 |
7006677 | Manjeshwar et al. | Feb 2006 | B2 |
7072501 | Wood et al. | Jul 2006 | B2 |
7085400 | Holsing et al. | Aug 2006 | B1 |
7096148 | Anderson et al. | Aug 2006 | B2 |
7149564 | Vining et al. | Dec 2006 | B2 |
7167180 | Shibolet | Jan 2007 | B1 |
7174202 | Bladen et al. | Feb 2007 | B2 |
7179220 | Kukuk | Feb 2007 | B2 |
7236558 | Saito et al. | Jun 2007 | B2 |
7236618 | Chui | Jun 2007 | B1 |
7301332 | Govari et al. | Nov 2007 | B2 |
7315639 | Kuhnigk | Jan 2008 | B2 |
7324104 | Bitter et al. | Jan 2008 | B1 |
7336809 | Zeng et al. | Feb 2008 | B2 |
7397937 | Schneider et al. | Jul 2008 | B2 |
7428334 | Schoisswohl et al. | Sep 2008 | B2 |
7452357 | Vlegele et al. | Nov 2008 | B2 |
7505809 | Strommer et al. | Mar 2009 | B2 |
7517320 | Wibowo et al. | Apr 2009 | B2 |
7518619 | Stoval et al. | Apr 2009 | B2 |
7630752 | Viswanathan | Dec 2009 | B2 |
7630753 | Simon et al. | Dec 2009 | B2 |
7659912 | Akimoto et al. | Feb 2010 | B2 |
7702153 | Hong et al. | Apr 2010 | B2 |
7751865 | Jascob et al. | Jul 2010 | B2 |
7756316 | Odry | Jul 2010 | B2 |
7788060 | Schneider | Aug 2010 | B2 |
7792565 | Vining | Sep 2010 | B2 |
7805269 | Glossop | Sep 2010 | B2 |
7809176 | Gündel | Oct 2010 | B2 |
7811294 | Strommer et al. | Oct 2010 | B2 |
7822461 | Geiger et al. | Oct 2010 | B2 |
7901348 | Soper et al. | Mar 2011 | B2 |
7907772 | Wang et al. | Mar 2011 | B2 |
7929014 | Akimoto et al. | Apr 2011 | B2 |
7951070 | Ozaki et al. | May 2011 | B2 |
7969142 | Krueger et al. | Jun 2011 | B2 |
7985187 | Wibowo et al. | Jul 2011 | B2 |
8009891 | Vaan | Aug 2011 | B2 |
8049777 | Akimoto et al. | Nov 2011 | B2 |
8055323 | Sawyer | Nov 2011 | B2 |
8102416 | Ito et al. | Jan 2012 | B2 |
8126241 | Zarkh et al. | Feb 2012 | B2 |
8131344 | Strommer et al. | Mar 2012 | B2 |
8170328 | Masumoto et al. | May 2012 | B2 |
8199981 | Koptenko et al. | Jun 2012 | B2 |
8200314 | Bladen et al. | Jun 2012 | B2 |
8202213 | Ito et al. | Jun 2012 | B2 |
8208708 | Homan et al. | Jun 2012 | B2 |
8219179 | Ganatra et al. | Jul 2012 | B2 |
8257346 | Qin et al. | Sep 2012 | B2 |
8267927 | Dalal et al. | Sep 2012 | B2 |
8290228 | Cohen et al. | Oct 2012 | B2 |
8298135 | Ito et al. | Oct 2012 | B2 |
8335359 | Fidrich et al. | Dec 2012 | B2 |
8391952 | Anderson | Mar 2013 | B2 |
8417009 | Mizuno | Apr 2013 | B2 |
8494612 | Vetter et al. | Jul 2013 | B2 |
8509877 | Mori et al. | Aug 2013 | B2 |
8672836 | Higgins et al. | Mar 2014 | B2 |
8682045 | Vining et al. | Mar 2014 | B2 |
8696549 | Holsing et al. | Apr 2014 | B2 |
8698806 | Kunert et al. | Apr 2014 | B2 |
8700132 | Ganatra et al. | Apr 2014 | B2 |
8706184 | Mohr et al. | Apr 2014 | B2 |
8706193 | Govari et al. | Apr 2014 | B2 |
8709034 | Keast et al. | Apr 2014 | B2 |
8730237 | Ruijters et al. | May 2014 | B2 |
8768029 | Helm et al. | Jul 2014 | B2 |
8784400 | Roschak | Jul 2014 | B2 |
8798227 | Tsukagoshi et al. | Aug 2014 | B2 |
8798339 | Mielekamp et al. | Aug 2014 | B2 |
8801601 | Prisco et al. | Aug 2014 | B2 |
8819591 | Wang et al. | Aug 2014 | B2 |
8827934 | Chopra et al. | Sep 2014 | B2 |
8862204 | Sobe et al. | Oct 2014 | B2 |
9008754 | Steinberg et al. | Apr 2015 | B2 |
9129048 | Stonefield et al. | Sep 2015 | B2 |
9603668 | Weingarten et al. | Mar 2017 | B2 |
9770216 | Brown et al. | Sep 2017 | B2 |
9848953 | Weingarten et al. | Dec 2017 | B2 |
9918659 | Chopra et al. | Mar 2018 | B2 |
9974525 | Weingarten | May 2018 | B2 |
10373719 | Soper et al. | Aug 2019 | B2 |
10376178 | Chopra | Aug 2019 | B2 |
10405753 | Sorger | Sep 2019 | B2 |
10478162 | Barbagli et al. | Nov 2019 | B2 |
10480926 | Froggatt et al. | Nov 2019 | B2 |
10524866 | Srinivasan et al. | Jan 2020 | B2 |
10555788 | Panescu et al. | Feb 2020 | B2 |
10610306 | Chopra | Apr 2020 | B2 |
10638953 | Duindam et al. | May 2020 | B2 |
10674970 | Averbuch et al. | Jun 2020 | B2 |
10682070 | Duindam | Jun 2020 | B2 |
10706543 | Donhowe et al. | Jul 2020 | B2 |
10709506 | Coste-Maniere et al. | Jul 2020 | B2 |
10743748 | Gilboa | Aug 2020 | B2 |
10772485 | Schlesinger et al. | Sep 2020 | B2 |
10796432 | Mintz et al. | Oct 2020 | B2 |
10823627 | Sanborn et al. | Nov 2020 | B2 |
10827913 | Ummalaneni et al. | Nov 2020 | B2 |
10835153 | Rafii-Tari et al. | Nov 2020 | B2 |
10885630 | Li et al. | Jan 2021 | B2 |
20030013972 | Makin | Jan 2003 | A1 |
20050182295 | Soper et al. | Aug 2005 | A1 |
20050207630 | Chan | Sep 2005 | A1 |
20080118135 | Averbuch et al. | May 2008 | A1 |
20080123921 | Gielen et al. | May 2008 | A1 |
20080183073 | Higgins et al. | Jul 2008 | A1 |
20090012390 | Pescatore et al. | Jan 2009 | A1 |
20090030306 | Miyoshi et al. | Jan 2009 | A1 |
20090096807 | Silverstein et al. | Apr 2009 | A1 |
20090142740 | Liang et al. | Jun 2009 | A1 |
20100030064 | Averbuch | Feb 2010 | A1 |
20100290693 | Cohen | Nov 2010 | A1 |
20100310146 | Higgins et al. | Dec 2010 | A1 |
20100312094 | Guttman et al. | Dec 2010 | A1 |
20110085720 | Averbuch | Apr 2011 | A1 |
20110137156 | Razzaque et al. | Jun 2011 | A1 |
20110237897 | Gilboa | Sep 2011 | A1 |
20110251607 | Kruecker et al. | Oct 2011 | A1 |
20120203065 | Higgins et al. | Aug 2012 | A1 |
20120249546 | Tschirren et al. | Oct 2012 | A1 |
20120280135 | Bal | Nov 2012 | A1 |
20120287238 | Onishi et al. | Nov 2012 | A1 |
20130023730 | Kitamura et al. | Jan 2013 | A1 |
20130144124 | Prisco et al. | Jun 2013 | A1 |
20130165854 | Sandhu et al. | Jun 2013 | A1 |
20130231556 | Holsing et al. | Sep 2013 | A1 |
20130236076 | Averbuch et al. | Sep 2013 | A1 |
20130303945 | Blumenkranz et al. | Nov 2013 | A1 |
20130317352 | Case | Nov 2013 | A1 |
20140035798 | Kawada et al. | Feb 2014 | A1 |
20140066766 | Stonefield et al. | Mar 2014 | A1 |
20140298270 | Wiemker et al. | Oct 2014 | A1 |
20140343408 | Tolkowsky | Nov 2014 | A1 |
20150148690 | Chopra et al. | May 2015 | A1 |
20150265368 | Chopra et al. | Sep 2015 | A1 |
20150305612 | Hunter | Oct 2015 | A1 |
20150313503 | Seibel et al. | Nov 2015 | A1 |
20160000302 | Brown | Jan 2016 | A1 |
20160000414 | Brown et al. | Jan 2016 | A1 |
20160005220 | Weingarten | Jan 2016 | A1 |
20160073928 | Soper et al. | Mar 2016 | A1 |
20160157939 | Larkin et al. | Jun 2016 | A1 |
20160183841 | Duindam et al. | Jun 2016 | A1 |
20160192860 | Allenby et al. | Jul 2016 | A1 |
20160287344 | Donhowe et al. | Oct 2016 | A1 |
20170112576 | Coste-Maniere et al. | Apr 2017 | A1 |
20170172664 | Weingarten et al. | Jun 2017 | A1 |
20170209071 | Zhao et al. | Jul 2017 | A1 |
20170265952 | Donhowe et al. | Sep 2017 | A1 |
20170311844 | Zhao et al. | Nov 2017 | A1 |
20170319165 | Averbuch | Nov 2017 | A1 |
20180078318 | Barbagli et al. | Mar 2018 | A1 |
20180153621 | Duindam et al. | Jun 2018 | A1 |
20180235709 | Donhowe et al. | Aug 2018 | A1 |
20180240237 | Donhowe et al. | Aug 2018 | A1 |
20180256262 | Duindam et al. | Sep 2018 | A1 |
20180263706 | Averbuch | Sep 2018 | A1 |
20180279852 | Rafii-Tari et al. | Oct 2018 | A1 |
20180325419 | Zhao et al. | Nov 2018 | A1 |
20190000559 | Berman et al. | Jan 2019 | A1 |
20190000560 | Berman et al. | Jan 2019 | A1 |
20190008413 | Duindam et al. | Jan 2019 | A1 |
20190038359 | Weingarten et al. | Feb 2019 | A1 |
20190038365 | Soper et al. | Feb 2019 | A1 |
20190065209 | Mishra et al. | Feb 2019 | A1 |
20190110839 | Rafii-Tari et al. | Apr 2019 | A1 |
20190175062 | Rafii-Tari et al. | Jun 2019 | A1 |
20190183318 | Froggatt et al. | Jun 2019 | A1 |
20190183585 | Rafii-Tari et al. | Jun 2019 | A1 |
20190183587 | Rafii-Tari et al. | Jun 2019 | A1 |
20190192234 | Gadda et al. | Jun 2019 | A1 |
20190209016 | Herzlinger et al. | Jul 2019 | A1 |
20190209043 | Zhao et al. | Jul 2019 | A1 |
20190216548 | Ummalaneni | Jul 2019 | A1 |
20190239723 | Duindam et al. | Aug 2019 | A1 |
20190239831 | Chopra | Aug 2019 | A1 |
20190250050 | Sanborn et al. | Aug 2019 | A1 |
20190254649 | Walters et al. | Aug 2019 | A1 |
20190269462 | Weingarten et al. | Sep 2019 | A1 |
20190269470 | Barbagli et al. | Sep 2019 | A1 |
20190272634 | Li et al. | Sep 2019 | A1 |
20190298160 | Ummalaneni et al. | Oct 2019 | A1 |
20190298451 | Wong et al. | Oct 2019 | A1 |
20190320878 | Duindam et al. | Oct 2019 | A1 |
20190320937 | Duindam et al. | Oct 2019 | A1 |
20190336238 | Yu et al. | Nov 2019 | A1 |
20190343424 | Blumenkranz et al. | Nov 2019 | A1 |
20190350659 | Wang et al. | Nov 2019 | A1 |
20190365199 | Zhao et al. | Dec 2019 | A1 |
20190365479 | Rafii-Tari | Dec 2019 | A1 |
20190365486 | Srinivasan et al. | Dec 2019 | A1 |
20190380787 | Ye et al. | Dec 2019 | A1 |
20200000319 | Saadat et al. | Jan 2020 | A1 |
20200000526 | Zhao | Jan 2020 | A1 |
20200008655 | Schlesinger et al. | Jan 2020 | A1 |
20200030044 | Wang et al. | Jan 2020 | A1 |
20200030461 | Sorger | Jan 2020 | A1 |
20200038750 | Kojima | Feb 2020 | A1 |
20200043207 | Lo et al. | Feb 2020 | A1 |
20200046431 | Soper et al. | Feb 2020 | A1 |
20200046436 | Tzeisler et al. | Feb 2020 | A1 |
20200054399 | Duindam et al. | Feb 2020 | A1 |
20200060512 | Holsing et al. | Feb 2020 | A1 |
20200060771 | Lo et al. | Feb 2020 | A1 |
20200069192 | Sanborn et al. | Mar 2020 | A1 |
20200077870 | Dicarlo et al. | Mar 2020 | A1 |
20200078095 | Chopra et al. | Mar 2020 | A1 |
20200078103 | Duindam et al. | Mar 2020 | A1 |
20200085514 | Blumenkranz | Mar 2020 | A1 |
20200109124 | Pomper et al. | Apr 2020 | A1 |
20200129045 | Prisco | Apr 2020 | A1 |
20200129239 | Bianchi et al. | Apr 2020 | A1 |
20200138515 | Wong | May 2020 | A1 |
20200146588 | Hunter et al. | May 2020 | A1 |
20200155116 | Donhowe et al. | May 2020 | A1 |
20200170623 | Averbuch | Jun 2020 | A1 |
20200170720 | Ummalaneni | Jun 2020 | A1 |
20200179058 | Barbagli et al. | Jun 2020 | A1 |
20200188038 | Donhowe et al. | Jun 2020 | A1 |
20200205903 | Srinivasan et al. | Jul 2020 | A1 |
20200205904 | Chopra | Jul 2020 | A1 |
20200214664 | Zhao et al. | Jul 2020 | A1 |
20200229679 | Zhao et al. | Jul 2020 | A1 |
20200242767 | Zhao et al. | Jul 2020 | A1 |
20200275860 | Duindam | Sep 2020 | A1 |
20200297442 | Adebar et al. | Sep 2020 | A1 |
20200315554 | Averbuch et al. | Oct 2020 | A1 |
20200330795 | Sawant et al. | Oct 2020 | A1 |
20200352427 | Deyanov | Nov 2020 | A1 |
20200364865 | Donhowe et al. | Nov 2020 | A1 |
Number | Date | Country |
---|---|---|
0013237 | Jul 2003 | BR |
0116004 | Jun 2004 | BR |
101877996 | Nov 2010 | CN |
103068294 | Apr 2013 | CN |
486540 | Sep 2016 | CZ |
2709512 | Aug 2017 | CZ |
2884879 | Jan 2020 | CZ |
3413830 | Sep 2019 | EP |
3478161 | Feb 2020 | EP |
3641686 | Apr 2020 | EP |
3644885 | May 2020 | EP |
3644886 | May 2020 | EP |
2002306403 | Oct 2002 | JP |
2009018184 | Jan 2009 | JP |
2011193885 | Oct 2011 | JP |
03005028 | Jan 2004 | MX |
225663 | Jan 2005 | MX |
226292 | Feb 2005 | MX |
246862 | Jun 2007 | MX |
265247 | Mar 2009 | MX |
284569 | Mar 2011 | MX |
2009138871 | Nov 2009 | WO |
2011102012 | Aug 2011 | WO |
2013192598 | Dec 2013 | WO |
Entry |
---|
Canadian Office Action issued in Canadian Application No. 2953390 dated May 20, 2021, 6 pages. |
Chinese Office Action for application No. 201580035779.3 dated Nov. 28, 2017 with English Translation (8 pages). |
Chinese Rejection Decision issued in Chinese Patent Application No. 201580035779.3 dated Aug. 19, 2019, 4 pages. No English translation available. |
Chinese Office Action dated Dec. 3, 2018 issued in corresponding CN Appln. No. 201580035779.3. |
Examination Report No. 1 for standard patent application issued in Australian Patent Application No. 2019204469 dated Oct. 10, 2019, 3 pages. |
Examination Report No. 1 for standard patent application issued in Australian Patent Application No. 2020205248 dated Nov. 6, 2020, 4 pages. |
Extended European Search Report for application No. 15814621.7 dated Mar. 12, 2018 (9 pages). |
Japanese Office Action For application No. 2016-575425 dated Mar. 19, 2019 with English translation. |
Non-Final Office Action issued in U.S. Appl. No. 17/526,933 dated Jan. 27, 2022. |
Non-Final Office Action issued in U.S. Appl. No. 17/531,670 dated Mar. 18, 2022. |
Notice of Allowance issued in U.S. Appl. No. 17/199,433 dated Jul. 27, 2021. |
Notification of the Fourth Office Action issued in Chinese Patent Application No. 201580035779.3 dated Jun. 21, 2019, 17 pages. |
Office Action issued in U.S. Appl. No. 17/199,429 dated May 19, 2021. |
Rejection Decision issued in Chinese Patent Application No. 201580035779.3 dated Oct. 13, 2021, with English google translation. |
Srikantha et al. “Ghost Detection and Removal for High Dynamic Range Images: Recent Advances”. Published in “Signal Processing: Image Communication (2012) 10.1016/j.image.2012.02.001”. DOI : 10.1016/j.image.2012.02.001 (Year: 2012). |
The Fifth Office Action issued in Chinese Patent Application No. 201580035779.3 dated Apr. 1, 2021 with English Translation. |
U.S. Office Action issued in U.S. Appl. No. 16/148,174 dated Aug. 8, 2019, 43 pages. |
US Office Action issued in U.S. Appl. No. 17/068,820 dated Sep. 24, 2021. |
Communication pursuant to Article 94(3) EPC issued in European Patent Application No. 15814621.7 dated May 10, 2023. |
Number | Date | Country | |
---|---|---|---|
20230077714 A1 | Mar 2023 | US |
Number | Date | Country | |
---|---|---|---|
62020262 | Jul 2014 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17199433 | Mar 2021 | US |
Child | 17992549 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17068820 | Oct 2020 | US |
Child | 17199433 | US | |
Parent | 16828947 | Mar 2020 | US |
Child | 17068820 | US | |
Parent | 16418495 | May 2019 | US |
Child | 16828947 | US | |
Parent | 16148174 | Oct 2018 | US |
Child | 16418495 | US | |
Parent | 15828551 | Dec 2017 | US |
Child | 16148174 | US | |
Parent | 15447472 | Mar 2017 | US |
Child | 15828551 | US | |
Parent | 14751257 | Jun 2015 | US |
Child | 15447472 | US |