Camera tracking system identifying phantom markers during computer assisted surgery navigation

Information

  • Patent Grant
  • Patent Number
    12,048,493
  • Date Filed
    Thursday, March 31, 2022
  • Date Issued
    Tuesday, July 30, 2024
Abstract
A camera tracking system for computer assisted navigation during surgery. Operations identify stray markers in a frame of tracking data from tracking cameras, and identify stray markers of a reference array. Stray markers of the reference array are designated assigned status and, otherwise, designated unknown status. The operations designate any other of the assigned status stray markers and any of the unknown status stray markers along a same epipolar line of the tracking cameras as one of the assigned status stray markers as being epipolar ambiguous status. For each one of the epipolar ambiguous status stray markers, the operations estimate 3D locations where phantom markers can appear in the frame based on epipolar ambiguity of the tracking cameras when determining location of the one of the epipolar ambiguous status stray markers. The operations designate the unknown status stray markers within a threshold distance of the estimated 3D locations of phantom markers as being phantom status.
Description
FIELD

The present disclosure relates to medical devices and systems, and more particularly, to camera tracking systems used for computer assisted navigation during surgery.


BACKGROUND

A computer assisted surgery navigation system can provide a surgeon with computerized visualization of how the pose of a surgical instrument relative to a patient correlates to a pose relative to medical images of the patient's anatomy. Camera tracking systems for computer assisted surgery navigation typically use a set of cameras to track pose of a reference array on the surgical instrument, which is being positioned by a surgeon during surgery, relative to a patient reference array (also “dynamic reference base” (DRB)) affixed to a patient. The camera tracking system uses the relative poses of the reference arrays to determine how the surgical instrument is posed relative to the patient and to correlate that pose to the medical images of the patient's anatomy. The surgeon can thereby use real-time visual feedback of the relative poses to navigate the surgical instrument during a surgical procedure on the patient.


During the surgical procedure, a surveillance marker is affixed to the patient to provide information on whether the patient reference array has shifted. If the surveillance marker's location changes relative to the patient reference array, the camera tracking system can display a meter indicating the amount of movement and may display a pop-up warning message to inform the user that the patient reference array may have been bumped. If the patient reference array has indeed been bumped, the registration of the patient reference array to the tracked coordinate system may be invalid and could result in erroneous navigation of the surgical instrument.


In one approach, the surveillance marker is identified to the camera tracking system by pointing with a tool having a pose tracked by the camera tracking system. If the surveillance marker is to be added by a user pointing with a tool, the software waits until a stray candidate marker is within a threshold distance of the tool tip (e.g., defined based on pose tracking tolerance of the tool tip) for longer than a threshold amount of time (e.g., about 2 seconds) and then registers that stray candidate marker as the surveillance marker.
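
For illustration only, a minimal sketch of this dwell-time registration logic follows; the frame attribute `stray_candidate` and the two helper callables are hypothetical names, and the threshold values merely echo the examples above:

```python
import time

DIST_THRESHOLD_MM = 2.0   # e.g., based on pose tracking tolerance of the tool tip
DWELL_THRESHOLD_S = 2.0   # marker must stay near the tip about this long

def register_by_pointing(frames, distance_to_tool_tip, register_surveillance_marker):
    """Register the stray candidate once it dwells near the tracked tool tip."""
    dwell_start = None
    for frame in frames:  # stream of frames of tracking data
        candidate = frame.stray_candidate
        if candidate is not None and distance_to_tool_tip(candidate) < DIST_THRESHOLD_MM:
            dwell_start = dwell_start or time.monotonic()
            if time.monotonic() - dwell_start >= DWELL_THRESHOLD_S:
                register_surveillance_marker(candidate)
                return candidate
        else:
            dwell_start = None  # candidate moved away; restart the dwell timer
    return None
```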


In another approach, the surveillance marker is identified to the tracking system by pressing a button on a display screen. If the surveillance marker is to be added from the display screen with a single button click, there can be only one stray (unregistered) candidate marker so that software of the camera tracking system can properly select the stray candidate marker for registration as the surveillance marker.


In another approach, the surveillance marker is identified to the tracking system by a user interface where the interface shows the user all the possible surveillance marker candidates and the user selects the preferred candidate by clicking on it with a touch screen or other gesture. The system may show the surveillance marker candidates to the user in different ways. One possible way to show the candidates is to provide a 3D view with icons representing each candidate and icons representing other nearby structures such as the DRB or planned screws. Another possible way to show the surveillance marker candidates is to stream a live video to the user where they can clearly see the actual physical marker, which the user then identifies by gesture.


SUMMARY

Some embodiments of the present disclosure are directed to providing operations by the camera tracking system to improve registration of stray markers, such as a surveillance marker, when phantom markers appear in frames of tracking data from tracking cameras.


Some embodiments are directed to a camera tracking system for computer assisted navigation during surgery, which includes at least one processor that is operative to receive a stream of frames of tracking data from tracking cameras configured with a partially overlapping field-of-view. For each of a plurality of the frames in the stream, the operations identify stray markers in the frame, and identify which of the stray markers are part of a reference array. The operations designate stray markers that are part of the reference array as being assigned status, and designate stray markers that are not part of the reference array as being unknown status. For each one of the assigned status stray markers, the operations designate any other of the assigned status stray markers and any of the unknown status stray markers that are along a same epipolar line of the tracking cameras as the one of the assigned status stray markers as being epipolar ambiguous status. For each one of the epipolar ambiguous status stray markers, the operations estimate 3D locations where phantom markers can appear in the frame based on epipolar ambiguity of the tracking cameras when determining location of the one of the epipolar ambiguous status stray markers. The operations designate any of the unknown status stray markers within a threshold distance of the estimated 3D locations of the phantom markers as being phantom status, and include in a candidate registration set the unknown status stray markers that do not have phantom status.


Some embodiments are directed to a related method by a camera tracking system for computer assisted navigation during surgery. The method receives a stream of frames of tracking data from tracking cameras configured with a partially overlapping field-of-view. For each of a plurality of the frames in the stream, the method identifies stray markers in the frame, and identifies which of the stray markers are part of a reference array. The method designates stray markers that are part of the reference array as being assigned status, and designates stray markers that are not part of the reference array as being unknown status. For each one of the assigned status stray markers, the method designates any other of the assigned status stray markers and any of the unknown status stray markers that are along a same epipolar line of the tracking cameras as the one of the assigned status stray markers as being epipolar ambiguous status. For each one of the epipolar ambiguous status stray markers, the method estimates 3D locations where phantom markers can appear in the frame based on epipolar ambiguity of the tracking cameras when determining location of the one of the epipolar ambiguous status stray markers. The method designates any of the unknown status stray markers within a threshold distance of the estimated 3D locations of the phantom markers as being phantom status, and includes in a candidate registration set the unknown status stray markers that do not have phantom status.


Other camera tracking systems and corresponding methods and computer program products according to embodiments of the inventive subject matter will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such additional camera tracking systems, methods, and computer program products be included within this description, be within the scope of the present inventive subject matter, and be protected by the accompanying claims. Moreover, it is intended that all embodiments disclosed herein can be implemented separately or combined in any way and/or combination.





DESCRIPTION OF THE DRAWINGS

Aspects of the present disclosure are illustrated by way of example and are not limited by the accompanying drawings. In the drawings:



FIG. 1 is an overhead view of personnel wearing extended reality (XR) headsets during a surgical procedure in a surgical room that includes a camera tracking system for navigated surgery and which may further include a surgical robot for robotic assistance according to some embodiments;



FIG. 2 illustrates the camera tracking system and the surgical robot positioned relative to a patient according to some embodiments;



FIG. 3 further illustrates the camera tracking system and the surgical robot configured according to some embodiments;



FIG. 4 illustrates a block diagram of a surgical system that includes an XR headset, a computer platform, imaging devices, and a surgical robot which are configured to operate according to some embodiments;



FIG. 5 illustrates a patient reference array (DRB) and a surveillance marker;



FIG. 6 illustrates a flowchart of operations that may be performed by a camera tracking system for computer assisted navigation during surgery according to some embodiments; and



FIG. 7 illustrates the camera tracking system with spaced apart tracking cameras which are viewing actual tracking markers on the same epipolar line as an imaging plane of the tracking cameras.





DETAILED DESCRIPTION

It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the description herein or illustrated in the drawings. The teachings of the present disclosure may be used and practiced in other embodiments and practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless specified or limited otherwise, the terms “mounted,” “connected,” “supported,” and “coupled” and variations thereof are used broadly and encompass both direct and indirect mountings, connections, supports, and couplings. Further, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings.


The following discussion is presented to enable a person skilled in the art to make and use embodiments of the present disclosure. Various modifications to the illustrated embodiments will be readily apparent to those skilled in the art, and the principles herein can be applied to other embodiments and applications without departing from embodiments of the present disclosure. Thus, the embodiments are not intended to be limited to embodiments shown, but are to be accorded the widest scope consistent with the principles and features disclosed herein. The following detailed description is to be read with reference to the figures, in which like elements in different figures have like reference numerals. The figures, which are not necessarily to scale, depict selected embodiments and are not intended to limit the scope of the embodiments. Skilled artisans will recognize the examples provided herein have many useful alternatives and fall within the scope of the embodiments.


Various embodiments of the present disclosure are directed to providing operations by the camera tracking system to improve registration of candidate markers, such as a surveillance marker, when phantom markers appear in frames of tracking data from tracking cameras. Before describing these embodiments in detail, various components that may be used for performing embodiments in a navigated surgery system are described with reference to FIGS. 1-5.



FIG. 1 is an overhead view of personnel wearing extended reality (XR) headsets 150 during a surgical procedure in a surgical room that includes a camera tracking system 200 for navigated surgery during a surgical procedure and which may further include a surgical robot 100 for robotic assistance, according to some embodiments. FIG. 2 illustrates the camera tracking system 200 and the surgical robot 100 positioned relative to a patient, according to some embodiments. FIG. 3 further illustrates the camera tracking system 200 and the surgical robot 100 configured according to some embodiments. FIG. 4 illustrates a block diagram of a surgical system that includes an XR headset 150, a computer platform 400, imaging devices 420, and the surgical robot 100 which are configured to operate according to some embodiments. FIG. 5 illustrates a patient reference array 116 (also “dynamic reference base” (DRB)) and a surveillance marker 500.


The XR headset 150 may be configured to augment a real-world scene with computer generated XR images. The XR headset 150 may be configured to provide an augmented reality (AR) viewing environment by displaying the computer generated XR images on a see-through display screen that allows light from the real-world scene to pass therethrough for combined viewing by the user. Alternatively, the XR headset 150 may be configured to provide a virtual reality (VR) viewing environment by preventing or substantially preventing light from the real-world scene from being directly viewed by the user while the user is viewing the computer-generated XR images on a display screen. The XR headset 150 can be configured to provide both AR and VR viewing environments. Thus, an XR headset may be referred to as an AR headset or a VR headset.


Referring to FIGS. 1-5, the surgical robot 100 may include, for example, one or more robot arms 104, a display 110, an end-effector 112, for example, including a guide tube 114, and an end effector reference array which can include one or more tracking markers. A patient reference array 116 (DRB) has a plurality of tracking markers 117 and is secured directly to the patient 210 (e.g., to a bone of the patient 210). A spaced apart surveillance marker 500 (FIG. 5) has a single marker 502 connected to a shaft that is secured directly to the patient 210 at a spaced apart location from the patient reference array 116. Another reference array 170 is attached or formed on an instrument, surgical tool, surgical implant device, etc.


The camera tracking system 200 includes tracking cameras 204 which may be spaced apart stereo cameras configured with partially overlapping field-of-views. The camera tracking system 200 can have any suitable configuration of arm(s) 202 to move, orient, and support the tracking cameras 204 in a desired location, and may contain at least one processor operable to track location of an individual marker and pose of an array of markers. As used herein, the term “pose” refers to the location (e.g., along 3 orthogonal axes) and/or the rotation angle (e.g., about the 3 orthogonal axes) of markers (e.g., DRB) relative to another marker (e.g., surveillance marker) and/or to a defined coordinate system (e.g., camera coordinate system). A pose may therefore be defined based on only the multidimensional location of the markers relative to another marker and/or relative to the defined coordinate system, based on only the multidimensional rotational angles of the markers relative to the other marker and/or to the defined coordinate system, or based on a combination of the multidimensional location and the multidimensional rotational angles. The term “pose” therefore is used to refer to location, rotational angle, or combination thereof.
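
As an illustration of this definition only (the class and field names below are not part of the disclosed system), a pose combining a multidimensional location and multidimensional rotation angles might be represented and composed as follows:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Pose:
    """Location along 3 orthogonal axes plus rotation about those axes."""
    translation: np.ndarray   # shape (3,), e.g., millimeters in camera coordinates
    rotation: np.ndarray      # shape (3, 3) rotation matrix

    def relative_to(self, other: "Pose") -> "Pose":
        """This pose expressed in another frame, e.g., a marker pose
        relative to the DRB rather than the camera coordinate system."""
        r = other.rotation.T @ self.rotation
        t = other.rotation.T @ (self.translation - other.translation)
        return Pose(t, r)
```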


The tracking cameras 204 may include, e.g., infrared cameras (e.g., bifocal or stereophotogrammetric cameras), operable to identify, for example, active and passive tracking markers for single markers (e.g., surveillance marker 500) and reference arrays which can be formed on or attached to the patient 210 (e.g., patient reference array, DRB), end effector 112 (e.g., end effector reference array), XR headset(s) 150 worn by a surgeon 120 and/or a surgical assistant 126, etc. in a given measurement volume of a camera coordinate system while viewable from the perspective of the tracking cameras 204. The tracking cameras 204 may scan the given measurement volume and detect light that is emitted or reflected from the markers in order to identify and determine locations of individual markers and poses of the reference arrays in three-dimensions. For example, active reference arrays may include infrared-emitting markers that are activated by an electrical signal (e.g., infrared light emitting diodes (LEDs)), and passive reference arrays may include retro-reflective markers that reflect infrared light (e.g., they reflect incoming IR radiation into the direction of the incoming light), for example, emitted by illuminators on the tracking cameras 204 or other suitable device.


The XR headsets 150 may each include tracking cameras (e.g., spaced apart stereo cameras) that can track location of a surveillance marker and poses of reference arrays within the XR camera headset field-of-views (FOVs) 152 and 154, respectively. Accordingly, as illustrated in FIG. 1, the location of the surveillance marker and the poses of reference arrays on various objects can be tracked while in the FOVs 152 and 154 of the XR headsets 150 and/or a FOV 600 of the tracking cameras 204.



FIGS. 1 and 2 illustrate a potential configuration for the placement of the camera tracking system 200 and the surgical robot 100 in an operating room environment. Computer-aided navigated surgery can be provided by the camera tracking system controlling the XR headsets 150 and/or other displays 34, 36, and 110 to display surgical procedure navigation information. The surgical robot 100 is optional during computer-aided navigated surgery.


The camera tracking system 200 may operate using tracking information and other information provided by multiple XR headsets 150 such as inertial tracking information and optical tracking information (frames of tracking data). The XR headsets 150 operate to display visual information and may play-out audio information to the wearer. This information can be from local sources (e.g., the surgical robot 100 and/or other medical equipment), remote sources (e.g., patient medical image server), and/or other electronic equipment. The camera tracking system 200 may track markers in 6 degrees-of-freedom (6DOF) relative to three axes of a 3D coordinate system and rotational angles about each axis. The XR headsets 150 may also operate to track hand poses and gestures to enable gesture-based interactions with “virtual” buttons and interfaces displayed through the XR headsets 150 and can also interpret hand or finger pointing or gesturing as various defined commands. Additionally, the XR headsets 150 may have a 1-10× magnification digital color camera sensor called a digital loupe. In some embodiments, one or more of the XR headsets 150 are minimalistic XR headsets that display local or remote information but include fewer sensors and are therefore more lightweight.


An “outside-in” machine vision navigation bar supports the tracking cameras 204 and may include a color camera. The machine vision navigation bar generally has a more stable view of the environment because it does not move as often or as quickly as the XR headsets 150 while positioned on wearers' heads. The patient reference array 116 (DRB) is generally rigidly attached to the patient with stable pitch and roll relative to gravity. This local rigid patient reference 116 can serve as a common reference for reference frames relative to other tracked arrays, such as a reference array on the end effector 112, instrument reference array 170, and reference arrays on the XR headsets 150.


During a surgical procedure using surgical navigation, the surveillance marker 500 is affixed to the patient to provide information on whether the patient reference array 116 has shifted. For example, during a spinal fusion procedure with planned placement of pedicle screw fixation, two small incisions are made over the posterior superior iliac spine bilaterally. The DRB and the surveillance marker are then affixed to the posterior superior iliac spine bilaterally. If the surveillance marker's 500 location changes relative to the patient reference array 116, the camera tracking system 200 may display a meter indicating the amount of movement and/or may display a pop-up warning message to inform the user that the patient reference array may have been bumped. If the patient reference array has indeed been bumped, the registration of the patient reference array to the tracked coordinate system may be invalid and could result in erroneous navigation which is off target.


When present, the surgical robot 100 (also “robot”) may be positioned near or next to patient 210. The robot 100 can be positioned at any suitable location near the patient 210 depending on the area of the patient 210 undergoing the surgical procedure. The camera tracking system 200 may be separated from the robot system 100 and positioned at the foot of patient 210. This location allows the camera tracking system 200 to have a direct visual line of sight to the surgical area 208. In the configuration shown, the surgeon 120 may be positioned across from the robot 100, but is still able to manipulate the end-effector 112 and the display 110. A surgical assistant 126 may be positioned across from the surgeon 120 again with access to both the end-effector 112 and the display 110. If desired, the locations of the surgeon 120 and the assistant 126 may be reversed. An anesthesiologist 122, nurse, or scrub tech can operate equipment which may be connected to display information from the camera tracking system 200 on a display 34.


With respect to the other components of the robot 100, the display 110 can be attached to the surgical robot 100 or in a remote location. End-effector 112 may be coupled to the robot arm 104 and controlled by at least one motor. In some embodiments, end-effector 112 can comprise a guide tube 114, which is configured to receive and orient a surgical instrument, tool, or implant used to perform a surgical procedure on the patient 210.


As used herein, the term “end-effector” is used interchangeably with the terms “end-effectuator” and “effectuator element.” The term “instrument” is used in a non-limiting manner and can be used interchangeably with “tool” and “implant” to generally refer to any type of device that can be used during a surgical procedure in accordance with embodiments disclosed herein. Example instruments, tools, and implants include, without limitation, drills, screwdrivers, saws, dilators, retractors, probes, implant inserters, and implant devices such as screws, spacers, interbody fusion devices, plates, rods, etc. Although generally shown with a guide tube 114, it will be appreciated that the end-effector 112 may be replaced with any instrumentation suitable for use in surgery. In some embodiments, end-effector 112 can comprise any known structure for effecting the movement of the surgical instrument in a desired manner.


The surgical robot 100 is operable to control the translation and orientation of the end-effector 112. The robot 100 may move the end-effector 112 under computer control along x-, y-, and z-axes, for example. The end-effector 112 can be configured for selective rotation about one or more of the x-, y-, and z-axes, and a Z Frame axis, such that one or more of the Euler Angles (e.g., roll, pitch, and/or yaw) associated with end-effector 112 can be selectively computer controlled. In some embodiments, selective control of the translation and orientation of end-effector 112 can permit performance of medical procedures with significantly improved accuracy compared to conventional robots that utilize, for example, a six DOF robot arm comprising only rotational axes. For example, the surgical robot 100 may be used to operate on patient 210, and robot arm 104 can be positioned above the body of patient 210, with end-effector 112 selectively angled relative to the z-axis toward the body of patient 210.


In some example embodiments, the XR headsets 150 can be controlled to dynamically display an updated graphical indication of the pose of the surgical instrument so that the user can be aware of the pose of the surgical instrument at all times during the procedure.


In some further embodiments, surgical robot 100 can be operable to correct the path of a surgical instrument guided by the robot arm 104 if the surgical instrument strays from the selected, preplanned trajectory. The surgical robot 100 can be operable to permit stoppage, modification, and/or manual control of the movement of end-effector 112 and/or the surgical instrument. Thus, in use, a surgeon or other user can use the surgical robot 100 as part of computer assisted navigated surgery, and has the option to stop, modify, or manually control the autonomous or semi-autonomous movement of the end-effector 112 and/or the surgical instrument.


Reference arrays of markers can be formed on or connected to robot arms 102 and/or 104, the end-effector 112 (e.g., end-effector array 114 in FIG. 2), and/or a surgical instrument (e.g., instrument array 170) to track poses in 6 DOF along 3 orthogonal axes and rotation about the axes. The reference arrays enable each of the marked objects (e.g., the end-effector 112, the patient 210, and the surgical instruments) to be tracked by the tracking camera 200, and the tracked poses can be used to provide navigated guidance during a surgical procedure and/or used to control movement of the surgical robot 100 for guiding the end-effector 112 and/or an instrument manipulated by the end-effector 112.


Referring to FIG. 3, the surgical robot 100 may include a display 110, upper arm 102, lower arm 104, end-effector 112, vertical column 312, casters 314, a table 318, and ring 324 which uses lights to indicate statuses and other information. Cabinet 106 may house electrical components of surgical robot 100 including, but not limited to, a battery, a power distribution module, a platform interface board module, and a computer. The camera tracking system 200 may include a display 36, tracking cameras 204, arm(s) 202, a computer housed in cabinet 330, and other components.


In computer-assisted navigated surgeries, perpendicular 2D scan slices, such as axial, sagittal, and/or coronal views, of patient anatomical structure are displayed to enable user visualization of the patient's anatomy alongside the relative poses of surgical instruments. An XR headset or other display can be controlled to display one or more 2D scan slices of patient anatomy along with a 3D graphical model of anatomy. The 3D graphical model may be generated from a 3D scan of the patient, e.g., by a CT scan device, and/or may be generated based on a baseline model of anatomy which is not necessarily formed from a scan of the patient.


Example Surgical System:



FIG. 4 illustrates a block diagram of a surgical system that includes an XR headset 150, a computer platform 400, imaging devices 420, and a surgical robot 100 which are configured to operate according to some embodiments.


The imaging devices 420 may include a C-arm imaging device, an O-arm imaging device, and/or a patient image database. The XR headset 150 provides an improved human interface for performing navigated surgical procedures. The XR headset 150 can be configured to provide functionalities, e.g., via the computer platform 400, that include, without limitation, any one or more of: identification of hand gesture based commands, and display of XR graphical objects on a display device 438 of the XR headset 150 and/or another display device. The display device 438 may include a video projector, flat panel display, etc. The user may view the XR graphical objects as an overlay anchored to particular real-world objects viewed through a see-through display screen. The XR headset 150 may additionally or alternatively be configured to display on the display device 438 video streams from cameras mounted to one or more XR headsets 150 and other cameras.


Electrical components of the XR headset 150 can include a plurality of cameras 430, a microphone 432, a gesture sensor 434, a pose sensor (e.g., inertial measurement unit (IMU)) 436, the display device 438, and a wireless/wired communication interface 440. The cameras 430 of the XR headset 150 may be visible light capturing cameras, near infrared capturing cameras, or a combination of both.


The cameras 430 may be configured to operate as the gesture sensor 434 by tracking for identification user hand gestures performed within the field of view of the camera(s) 430. Alternatively, the gesture sensor 434 may be a proximity sensor and/or a touch sensor that senses hand gestures performed proximately to the gesture sensor 434 and/or senses physical contact, e.g., tapping on the sensor 434 or its enclosure. The pose sensor 436, e.g., IMU, may include a multi-axis accelerometer, a tilt sensor, and/or another sensor that can sense rotation and/or acceleration of the XR headset 150 along one or more defined coordinate axes. Some or all of these electrical components may be contained in a head-worn component enclosure or may be contained in another enclosure configured to be worn elsewhere, such as on the hip or shoulder.


As explained above, a surgical system includes the camera tracking system 200, which may be connected to a computer platform 400 for operational processing and which may provide other operational functionality including a navigation controller 404 and/or an XR headset controller 410. The surgical system may include the surgical robot 100. The navigation controller 404 can be configured to provide visual navigation guidance to an operator for moving and positioning a surgical tool relative to patient anatomical structure based on a surgical plan, e.g., from a surgical planning function, defining where a surgical procedure is to be performed using the surgical tool on the anatomical structure and based on a pose of the anatomical structure determined by the camera tracking system 200. The navigation controller 404 may be further configured to generate steering information based on a target pose for a surgical tool, a pose of the anatomical structure, and a pose of the surgical tool and/or an end effector of the surgical robot 100, where the steering information is displayed through the display device 438 of the XR headset 150 and/or another display device to indicate where the surgical tool and/or the end effector of the surgical robot 100 should be moved to perform the surgical plan.


The electrical components of the XR headset 150 can be operatively connected to the electrical components of the computer platform 400 through the wired/wireless interface 440. The electrical components of the XR headset 150 may be operatively connected, e.g., through the computer platform 400 or directly connected, to various imaging devices 420, e.g., the C-arm imaging device, the O-arm imaging device, the patient image database, and/or to other medical equipment through the wired/wireless interface 440.


The surgical system may include a XR headset controller 410 that may at least partially reside in the XR headset 150, the computer platform 400, and/or in another system component connected via wired cables and/or wireless communication links. Various functionality is provided by software executed by the XR headset controller 410. The XR headset controller 410 is configured to receive information from the camera tracking system 200 and the navigation controller 404, and to generate an XR image based on the information for display on the display device 438.


The XR headset controller 410 can be configured to operationally process frames of tracking data from the cameras 430 (tracking cameras), signals from the microphone 432, and/or information from the pose sensor 436 and the gesture sensor 434, to generate information for display as XR images on the display device 438 and/or other information for display on other display devices for user viewing. Thus, the XR headset controller 410 illustrated as a circuit block within the XR headset 150 is to be understood as being operationally connected to other illustrated components of the XR headset 150 but not necessarily residing within a common housing or being otherwise transportable by the user. For example, the XR headset controller 410 may reside within the computer platform 400 which, in turn, may reside within the cabinet 330 of the camera tracking system 200, the cabinet 106 of the surgical robot 100, etc.


Identifying Phantom Markers Imaged by Tracking Cameras:


Regardless of the workflow for registering a stray marker, such as the surveillance marker, the presence of “phantom” markers can be problematic. Phantom markers occur as a result of epipolar stereo tracking ambiguity, reflections, and other environmental conditions, and do not represent the 3D location of a stray actual marker. A stray actual marker is a physical marker, e.g., surveillance marker, that appears in frames of tracking data from tracking cameras and is intended to be tracked by the camera tracking system, but which has not yet been registered with the camera tracking system. If a phantom marker also appears in the frames, the presence of the phantom markers can make it infeasible to register the stray actual marker with just a single button press, because the system does not know which of the marker candidates to use. The camera tracking system also becomes susceptible to error if, for example, the actual marker is not present (e.g., obscured from view) while only one phantom is present in the frames, which can cause the single button press to trigger incorrect registration of the phantom marker as the actual marker. Alternatively, if the user happens to accidentally point a tracked tool at a point in space where a phantom marker is closer to the tip than the stray actual marker, or inadvertently gestures to identify a phantom adjacent to the intended candidate, the camera tracking system could incorrectly register the phantom marker as the actual marker.


As explained above, various embodiments of the present disclosure are directed to providing operations by the camera tracking system 200 which may improve registration of stray markers, such as the surveillance marker 500 in FIG. 5, when phantom markers are present in frames of tracking data from the tracking cameras 204. Phantom markers can occur as a result of epipolar stereo tracking ambiguity, reflections and other environmental conditions and do not represent 3D locations of actual stray markers. Other phantom markers occur due to stereo ambiguity between tracking cameras 204 when multiple actual markers appear on the same vertical row of the image sensors of the tracking cameras 204. It has been determined that a characteristic of phantom markers is that they do not move through 3D space like actual markers when the tracking cameras 204 are moved to point toward the scene from different perspectives, because the phantom markers are not anchored to the camera coordinate system in the same way as actual markers. For example, rolling the tracking cameras 204 slightly to cause a small change in perspective of the tracking cameras 204 can break stereo ambiguity and cause many phantoms to disappear when determining 3D locations of stray actual markers in frames of tracking data. Phantom marker visibility is generally less consistent than that of stray actual markers, meaning that phantom markers may be visible only from one particular perspective of the tracking cameras but from no other perspective.


Some embodiments are directed to operations which classify stray markers as “assigned” status, “unknown” status, and/or “phantom” status in a frame of tracking data from tracking cameras on a tracking bar, XR headset, etc. Stray markers may be filtered so that registration is only performed on a stray marker which does not have phantom status, such as when a user registers a surveillance marker. By eliminating phantom status strays from the registration process, the operations may increase the incidence of scenarios where the surveillance marker can be selected through a user interface, e.g., by a user-selectable registration button, instead of requiring tracking of a user-posed pointing tool, which can avoid or prevent various errors such as described above. These operations may utilize the inconsistent visibility and location of phantom markers versus actual stray markers.


In some embodiments, both the patient reference array (e.g., DRB) and surveillance marker are viewed from more than one perspective of the tracking cameras, e.g., arranged in a stereo configuration with partially overlapping field-of-view. More than one perspective can be obtained by moving the tracking cameras to provide rotational and/or linear location offset between frames of tracking data received from the tracking cameras. In one scenario, a user affixes a DRB and a surveillance marker to the patient and then moves the tracking cameras on a camera bar to provide more optimal positioning for tracking of markers during a surgical procedure. While the tracking cameras are being moved and the surveillance marker and DRB are visible, operations are performed to identify candidate markers in frames of tracking data from the tracking cameras, and to process the candidate markers into various different statuses, which are referred to without limitation as, e.g., assigned status, unknown status, and ambiguous status.


In one operational embodiment, when a frame is received from the tracking cameras, the position of the DRB in the camera coordinate system is recorded along with all stray markers (also “strays”) visible from this perspective of the tracking cameras. For easier comparison to other frames, locations of stray markers are recorded relative to the DRB, not in the camera coordinate system, in accordance with one embodiment. Next, the set of previous camera positions and the corresponding sets of stray markers are compared to the current frame. If the current camera orientation for a received frame is the same (e.g., within a tolerance threshold, such as less than 10 mm or 1 degree) as a previous camera orientation for a previously received frame, the stray markers from the current frame are compared to the stored set for this orientation, and any stray markers now present that were not present previously are included in a set for further processing and stored. This operation accounts for actual stray markers that may have been blocked from line of sight in a previous frame. If this camera orientation is different than any previous orientation (e.g., by more than the tolerance threshold), any stray markers common to both sets (e.g., difference in position relative to the DRB of less than 1 mm) can be classified with unknown status for further processing to determine whether they should be designated as phantom status, while any candidate markers not common to both sets may be more likely to be phantom and may not be included in further processing or may be processed to have an increased likelihood of being designated as phantom status. As the tracking cameras continue to move and more orientations are recorded, their respective sets of stray markers are compared to the sets of stray markers for all other orientations. In some embodiments, only stray markers that are present in greater than some minimum number of orientations are included in a candidate registration set which is used to perform registration of one or more markers in the set. With these operations, it is possible for stray markers to have status changed from unknown to phantom or vice versa depending on how many frames from different tracking camera orientations they were identified in.
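
A minimal sketch of this bookkeeping follows, assuming strays are supplied as 3D points already expressed relative to the DRB and that a `same_orientation` predicate implements the orientation tolerance test (e.g., within 10 mm and 1 degree); the `MIN_ORIENTATIONS` value is an assumption, since the text leaves the minimum count open:

```python
import numpy as np

MATCH_TOL_MM = 1.0       # strays within 1 mm relative to the DRB are the same marker
MIN_ORIENTATIONS = 3     # assumed minimum; the text does not fix this count

def record_frame(history, camera_pose, strays_rel_drb, same_orientation):
    """history: list of [camera_pose, strays]; strays are 3D points relative
    to the DRB so frames from different perspectives compare directly."""
    for entry in history:
        if same_orientation(entry[0], camera_pose):
            # Same perspective as before: keep strays that were previously
            # occluded but are visible now.
            new = [s for s in strays_rel_drb if not seen(s, entry[1])]
            entry[1].extend(new)
            return
    history.append([camera_pose, list(strays_rel_drb)])

def candidate_registration_set(history):
    """Strays observed from at least MIN_ORIENTATIONS distinct perspectives."""
    candidates = []
    for _, strays in history:
        for s in strays:
            hits = sum(seen(s, other) for _, other in history)
            if hits >= MIN_ORIENTATIONS and not seen(s, candidates):
                candidates.append(s)
    return candidates

def seen(stray, stray_list, tol=MATCH_TOL_MM):
    return any(np.linalg.norm(np.asarray(stray) - np.asarray(s)) < tol
               for s in stray_list)
```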


The operation for changing stray markers' status, based on how many frames from different tracking camera orientations they are identified in, can be directed to situations where phantom markers are identified in frames from more than one tracking camera perspective by chance and where there may be frames where the actual stray markers are obscured or not visible to the tracking cameras for some reason. It would be undesirable for such frames to cause the actual stray markers to inadvertently be designated as phantom strays thereafter.


Some further embodiments are directed to identifying phantom markers that can arise specifically from epipolar ambiguity of the tracking cameras when determining locations of actual stray markers, such as a surveillance marker. FIG. 7 illustrates the camera tracking system 200 with spaced apart tracking cameras 204, which are viewing actual tracking markers 700a and 700b, and which are assumed to be on the same epipolar, e.g., horizontal, line as an imaging plane of the tracking cameras 204. Phantom markers 710a and 710b arise due to epipolar ambiguity of the tracking cameras 204 when determining locations of stray markers, including the actual tracking markers 700a and 700b, in frames received from the tracking cameras 204, based on the relative spacing and orientation of the imaging planes of the tracking cameras 204.



FIG. 6 illustrates a flowchart of operations that may be performed by a camera tracking system for computer assisted navigation during surgery in accordance with some other embodiments.


Referring to FIG. 6, the operations receive 600 a stream of frames of tracking data from tracking cameras configured with a partially overlapping field-of-view. As the frames are received, the operations process a present frame to identify 600 stray markers in the frame. The identification may determine 3D locations of the stray markers. The operations process the stray markers to identify 602 which of the stray markers are part of a reference array, such as a DRB. The operations designate 604 stray markers that are part of the reference array as being assigned status. The operations designate 606 stray markers that are not part of the reference array as being unknown status. The terms “assigned status” and “unknown status” are used in a non-limiting manner only to differentiate one status from the other. Accordingly, the term assigned status may be interchangeably replaced with the term first status and, similarly, the term unknown status may be replaced with the term second status.


For each one of the assigned status stray markers, the operations designate 606 any other of the assigned status stray markers and any of the unknown status stray markers that are along a same epipolar line of the tracking cameras as the one of the assigned status stray markers as being epipolar ambiguous status. For each one of the epipolar ambiguous status stray markers, the operations estimate 608 3D locations where phantom markers can appear in the frame based on epipolar ambiguity of the tracking cameras when determining location of the one of the epipolar ambiguous status stray markers. The operations designate 610 any of the unknown status stray markers within a threshold distance of the estimated 3D locations of the phantom markers as being phantom status, and include 612 in a candidate registration set the unknown status stray markers that do not have phantom status.
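
A compact sketch of these per-frame operations follows. The three callables are assumed hooks for reference-array matching, epipolar-line testing, and phantom-location estimation, and the 2 mm box size anticipates the threshold discussed below; none of the names come from the disclosure itself:

```python
from dataclasses import dataclass
import numpy as np

PHANTOM_BOX_MM = 2.0   # cubic box centered at each estimated phantom location

@dataclass
class Stray:
    location: np.ndarray        # 3D location in camera coordinates
    status: str = "unknown"     # "assigned", "unknown", "ambiguous", or "phantom"

def classify_frame(strays, in_reference_array, same_epipolar_line, estimate_phantoms):
    """One pass of the FIG. 6 operations over the identified strays of one frame."""
    for s in strays:
        s.status = "assigned" if in_reference_array(s) else "unknown"

    # Anything sharing an epipolar line with an assigned marker is ambiguous.
    for anchor in [s for s in strays if s.status == "assigned"]:
        for other in strays:
            if other is not anchor and same_epipolar_line(anchor, other):
                other.status = "ambiguous"

    # Each ambiguous marker yields estimated 3D phantom locations.
    phantom_locs = [loc for s in strays if s.status == "ambiguous"
                    for loc in estimate_phantoms(s)]

    # Unknown strays falling inside a phantom box are designated phantom.
    for s in strays:
        if s.status == "unknown" and any(
                np.max(np.abs(s.location - loc)) < PHANTOM_BOX_MM / 2
                for loc in phantom_locs):
            s.status = "phantom"

    # Candidate registration set: unknown strays without phantom status.
    return [s for s in strays if s.status == "unknown"]
```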


The threshold distance value may be a function of the epipolar ambiguity of the tracking cameras. In some embodiments, the threshold distance is not greater than 2 millimeters so that, for example, if an unknown status stray marker is within a 2 millimeter cubic box centered at the computed 3D location of the phantom marker, that unknown status stray marker is designated 610 as phantom status.


The operations may further perform registration of one or more of the unknown status stray markers in the candidate registration set. For example, as will be explained in further detail below, when only one unknown status stray marker exists in the candidate registration set, the camera tracking system may display a registration indicia that can be selected by a user to register the unknown status stray marker as a surveillance marker or another defined marker.


Because phantom markers should not be allowed to be registered, the operations may prevent registration of any of the phantom status stray markers.


The camera tracking system may operate to track location of the registered one of the unknown status stray markers relative to the reference array. For example, the operations may track location of the surveillance marker relative to the DRB to determine if the DRB and/or the surveillance marker has moved, such as from being bumped by a user, and may trigger a warning notification to be generated to the user if a threshold movement is identified.
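
A sketch of this movement check might look as follows, assuming the DRB pose is available as a rotation matrix and translation vector, and assuming a 1 mm warning threshold (a value the text does not fix):

```python
import numpy as np

WARN_THRESHOLD_MM = 1.0   # assumed value; the text does not fix this threshold

def drb_to_local(drb_pose, point_camera):
    """Express a camera-coordinate point in the DRB's local frame."""
    return drb_pose.rotation.T @ (point_camera - drb_pose.translation)

def check_surveillance(drb_pose, surveillance_location, registered_offset):
    """Compare the surveillance marker's current offset from the DRB against
    the offset recorded at registration; warn if the threshold is exceeded."""
    current_offset = drb_to_local(drb_pose, surveillance_location)
    movement_mm = np.linalg.norm(current_offset - registered_offset)
    if movement_mm > WARN_THRESHOLD_MM:
        print(f"WARNING: patient reference may have been bumped "
              f"({movement_mm:.2f} mm of relative movement)")
    return movement_mm   # e.g., drives the on-screen movement meter
```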


The operations may limit registration 614 to being performed on only unknown status stray markers that are identified in at least a threshold number of the plurality of frames. The operations may further limit registration 614 to being performed on only unknown status stray markers that are identified in at least the threshold number of the plurality of frames which have been determined to have camera movement offsets greater than a threshold movement offset.


The operations may determine a movement offset of the tracking cameras between receipt of the present frame and receipt of a previous frame or, in a further embodiment, receipt of any of the previously received frames. Thus, the “previous frame” may be the frame received in sequence immediately before the present frame or may be any of the frames that were received in the stream before the present frame. The determination of the movement offset may include determining a rotational offset of the tracking cameras and/or a linear location offset of the tracking cameras.


The threshold movement offset may be, for example, defined as more than 10 millimeters of linear location offset of the tracking cameras and/or defined as more than one degree of rotational offset of the tracking cameras between capturing the present frame and capturing a previous frame. The movement offset may be determined based on comparing the 3D locations of the candidate markers of the present frame set to the 3D locations of the candidate markers of the previous frame set or, in some embodiments, to the 3D locations of the candidate markers in any of the frame sets that were identified from earlier frames in the stream. The decision of whether the movement offset is less than the threshold movement offset may include comparing the 3D locations of the candidate markers of the present frame set to the 3D locations of the candidate markers of any of the previously received frame sets in order to, for example, determine whether the present orientation of the tracking cameras is not sufficiently different from an earlier orientation of the tracking cameras.
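
For example, the offset test might be sketched as below, assuming camera poses carry a rotation matrix and translation vector; the 10 mm and 1 degree values follow the examples above:

```python
import numpy as np

LINEAR_THRESHOLD_MM = 10.0    # example linear offset threshold from the text
ANGULAR_THRESHOLD_DEG = 1.0   # example rotational offset threshold from the text

def movement_offset_exceeded(prev_pose, curr_pose):
    """True when the cameras moved enough between frames for a new perspective."""
    linear_mm = np.linalg.norm(curr_pose.translation - prev_pose.translation)
    # Angle of the relative rotation between the two camera orientations.
    cos_a = (np.trace(prev_pose.rotation.T @ curr_pose.rotation) - 1.0) / 2.0
    angular_deg = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return linear_mm > LINEAR_THRESHOLD_MM or angular_deg > ANGULAR_THRESHOLD_DEG
```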


In one embodiment, the operations determine 3D locations of the stray markers, determine a camera movement offset of the tracking cameras between receipt of the frame and receipt of a previous frame, and, when the camera movement offset is less than a threshold movement offset, do not perform the estimation 608 of 3D locations where phantom markers can appear in the frame based on epipolar ambiguity of the tracking cameras. For example, the operations may wait for the camera movement offset to be at least the threshold movement offset before processing a next frame to perform operations 608-612.


Example Registration Operations:


Some further embodiments are directed to utilizing the operations of FIG. 6 to enable the camera tracking system to automatically register or to provide a simplified user interface for triggering registration of a candidate marker as a defined type of marker, such as a surveillance marker.


For example, after unknown status stray markers that do not have phantom status have been identified (included 612 in the candidate registration set), the operations may generate a user interface through which a user provides at least one command to cause registration 614 of those stray markers for location tracking by the camera tracking system. The operations may then enable tracking of the location of one of the candidate markers as a surveillance marker tracked relative to a reference array.


For example, when the candidate registration set contains a single candidate marker, e.g., a single unknown status stray marker that does not have phantom status, the operations may display a user-selectable indicia that can be selected by a user to trigger registration of the candidate marker as a surveillance marker. Because the surveillance marker should be positioned relatively close to a DRB in order to allow tracking of any movement of the DRB and/or surveillance marker, e.g., due to being bumped, the operations may require that single unknown status stray marker without phantom status to be within a threshold distance of the DRB or another defined reference array before displaying the user-selectable indicia allowing the user to trigger registration of that stray marker as a surveillance marker. The threshold distance from the DRB or other defined reference array may be, for example, less than 30 centimeters, in accordance with some embodiments.


For example, in one embodiment, the camera tracking system determines that a particular one of the stray markers included in the candidate registration set satisfies a defined rule for corresponding to a surveillance marker, such as by being the only stray marker in the candidate registration set. The camera tracking system can respond to the determination by displaying a registration initiation indicia selected by a user to trigger registration of the particular one of the stray markers as the surveillance marker. Once registration of the stray marker as the surveillance marker is complete, the camera tracking system can then track location of the surveillance marker relative to the DRB or other reference array. In a further embodiment, to satisfy the defined rule the particular one of the candidate markers included in the candidate registration set needs to be determined to be within a threshold distance from the reference array. As explained above, because the surveillance marker should be positioned relatively close to a DRB in order to allow tracking of any movement of the DRB and/or surveillance marker, e.g., due to being bumped, the operations may require the single stray marker to be within a threshold distance of the DRB or another defined reference array before displaying the user-selectable indicia allowing the user to trigger registration of the stray marker as a surveillance marker. The threshold distance from the DRB or other defined reference array may be, for example, less than 30 centimeters, in accordance with some embodiments.
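
As an illustrative sketch only, the gating rule for displaying the registration indicia might look like the following, with the 30 cm figure taken from the example above and the DRB represented by a single centroid point (an assumption):

```python
import numpy as np

MAX_DRB_DISTANCE_MM = 300.0   # surveillance marker expected within ~30 cm of DRB

def surveillance_button_enabled(candidate_set, drb_centroid):
    """Show the one-click registration indicia only when exactly one
    non-phantom candidate exists and it is close enough to the DRB."""
    if len(candidate_set) != 1:
        return False
    candidate = candidate_set[0]
    return (np.linalg.norm(np.asarray(candidate) - np.asarray(drb_centroid))
            <= MAX_DRB_DISTANCE_MM)
```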


The camera tracking system may display visual cues to facilitate user involvement in some of the operations described with reference to FIG. 6.


In one embodiment, as part of the operations of FIG. 6, the camera tracking system may display a first type of graphical object at locations of assigned status stray markers and display a second type of graphical object at locations of unknown status stray markers, where the first type of graphical object has a different shape and/or color than the second type of graphical object.


The camera tracking system may provide guidance to a user to begin and/or end movement of the tracking cameras to facilitate registration and tracking of markers. For example, in one embodiment, the system displays an indication to a user that further movement of the tracking cameras is not needed for registration, responsive to determining that the stray marker(s) included in the candidate registration set satisfy a defined rule. The defined rule may correspond to determining that the stray marker(s) have been present in at least a threshold number of previous frames, and may further include determining that those previous frames have at least a threshold offset relative to each other.


In some other embodiments, the camera tracking system may determine locations of the unknown status stray markers, and display graphical indications overlaid on at least one of the frames at the locations of the unknown status stray markers. The graphical indications may be overlaid at locations in at least one of the frames determined based on the determined 3D locations. The operations receive a user selection of one of the graphical indications, and perform registration of the one of the unknown status stray markers with the location corresponding to the selected one of the graphical indications.


The operations may receive the user selection of one of the graphical indications through a touch screen interface, such as by the user touch-selecting one of the graphical indications to register the one of the unknown status stray markers with the location corresponding to the selected one of the graphical indications.


Alternatively or additionally, the operations may display a graphical representation of a tool being tracked by the camera tracking system while the tool is manipulated by the user. The operations receive the user selection of the one of the graphical indications based on determining that a tracked location on the tool is within a threshold selection distance from the location of the one of the graphical indications while a further defined condition is satisfied. Thus, for example, the user can indicate which of the stray markers is to be registered as the surveillance marker by positioning an end of the displayed graphical representation of the tool within the threshold selection distance of the displayed graphical indication associated with the to-be-selected stray marker.
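
A sketch of this proximity-based selection is shown below; the 5 mm selection threshold and the `confirm_condition` hook (standing in for the "further defined condition") are assumptions:

```python
import numpy as np

SELECT_TOL_MM = 5.0   # assumed threshold selection distance

def select_by_tool_tip(tool_tip_location, candidate_locations, confirm_condition):
    """Return the candidate nearest the tracked tool tip, if it is within the
    selection threshold while the further defined condition (e.g., a dwell
    or a button press) is satisfied."""
    if not candidate_locations or not confirm_condition():
        return None
    dists = [np.linalg.norm(np.asarray(c) - np.asarray(tool_tip_location))
             for c in candidate_locations]
    i = int(np.argmin(dists))
    return candidate_locations[i] if dists[i] <= SELECT_TOL_MM else None
```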


As explained above, once the stray marker(s) are registered with the camera tracking system, the camera tracking system may then perform operations for navigated surgical procedures. The operations can include tracking the pose of an instrument relative to the registered marker, and generating steering information based on comparison of the pose of the instrument relative to a planned pose of the instrument. The steering information can indicate where the instrument needs to be moved and angularly oriented to become aligned with the planned pose when performing a surgical procedure.


Another embodiment for a method to eliminate stray markers from consideration is to assess whether the marker is located in an untenable place. For example, does the stray marker appear to be on the surface of the robot arm, inside the patient, or on the bed? If so, it is most likely a phantom stray and can be eliminated from the set of candidates, e.g., not included in the candidate registration set. Determining the position of a stray marker relative to the robot or patient requires knowledge of the locations of these other structures. Because the robot arm is tracked and can have position sensors in each joint, the operations may compute the location of the arm surface based on these tracked parameters and compare locations of the surfaces of the robot arm to locations of each stray marker. To determine whether a tracked location is inside the patient, the operations may use the registration of the CT scan volume to the tracking cameras and use planned implant locations, image processing, machine vision, or a manual identification of surface points to compute where the patient is positioned relative to the cameras.
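
A sketch of this plausibility filter follows, treating the robot-arm test as a point-cloud clearance check and the inside-the-patient test as an assumed callable built from the CT registration, as the passage describes; all names and the clearance value are illustrative:

```python
import numpy as np

def plausible_candidates(candidates, robot_arm_surface_points, patient_inside_test,
                         clearance_mm=10.0):
    """Drop strays sitting on the robot arm surface or inside the patient."""
    kept = []
    for c in candidates:
        # On the robot arm: within clearance of any computed arm-surface point.
        on_robot = any(np.linalg.norm(np.asarray(c) - np.asarray(p)) < clearance_mm
                       for p in robot_arm_surface_points)
        # Inside the patient: assumed test derived from the CT registration.
        if not on_robot and not patient_inside_test(c):
            kept.append(c)
    return kept
```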


Further Definitions and Embodiments

In the above-description of various embodiments of present inventive concepts, it is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of present inventive concepts. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which present inventive concepts belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.


When an element is referred to as being “connected”, “coupled”, “responsive”, or variants thereof to another element, it can be directly connected, coupled, or responsive to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected”, “directly coupled”, “directly responsive”, or variants thereof to another element, there are no intervening elements present. Like numbers refer to like elements throughout. Furthermore, “coupled”, “connected”, “responsive”, or variants thereof as used herein may include wirelessly coupled, connected, or responsive. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Well-known functions or constructions may not be described in detail for brevity and/or clarity. The term “and/or” includes any and all combinations of one or more of the associated listed items.


It will be understood that although the terms first, second, third, etc. may be used herein to describe various elements/operations, these elements/operations should not be limited by these terms. These terms are only used to distinguish one element/operation from another element/operation. Thus, a first element/operation in some embodiments could be termed a second element/operation in other embodiments without departing from the teachings of present inventive concepts. The same reference numerals or the same reference designators denote the same or similar elements throughout the specification.


As used herein, the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof. Furthermore, as used herein, the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item. The common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.


Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits. These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).


These computer program instructions may also be stored in a tangible computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks. Accordingly, embodiments of present inventive concepts may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.


It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated, and/or blocks/operations may be omitted without departing from the scope of inventive concepts. Moreover, although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.


Many variations and modifications can be made to the embodiments without substantially departing from the principles of the present inventive concepts. All such variations and modifications are intended to be included herein within the scope of present inventive concepts. Accordingly, the above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended examples of embodiments are intended to cover all such modifications, enhancements, and other embodiments, which fall within the spirit and scope of present inventive concepts. Thus, to the maximum extent allowed by law, the scope of present inventive concepts is to be determined by the broadest permissible interpretation of the present disclosure including the following examples of embodiments and their equivalents, and shall not be restricted or limited by the foregoing detailed description.

Claims
  • 1. A camera tracking system for computer assisted navigation during surgery, comprising at least one processor operative to:
    receive a stream of frames of tracking data from tracking cameras configured with a partially overlapping field-of-view; and
    for each of a plurality of the frames in the stream,
      identify stray markers in the frame,
      identify which of the stray markers are part of a reference array,
      designate stray markers that are part of the reference array as being assigned status,
      designate stray markers that are not part of the reference array as being unknown status,
      for each one of the assigned status stray markers, designate any other of the assigned status stray markers and any of the unknown status stray markers that are along a same epipolar line of the tracking cameras as the one of the assigned status stray markers as being epipolar ambiguous status,
      for each one of the epipolar ambiguous status stray markers, estimate 3D locations where phantom markers can appear in the frame based on epipolar ambiguity of the tracking cameras when determining location of the one of the epipolar ambiguous status stray markers,
      designate any of the unknown status stray markers within a threshold distance of the estimated 3D locations of the phantom markers as being phantom status, and
      include in a candidate registration set the unknown status stray markers that do not have phantom status.
  • 2. The camera tracking system of claim 1, wherein the at least one processor is further operative to: perform registration of one of the unknown status stray markers in the candidate registration set.
  • 3. The camera tracking system of claim 2, wherein the at least one processor is further operative to: prevent registration of any of the phantom status stray markers.
  • 4. The camera tracking system of claim 2, wherein the at least one processor is further operative to: track location of the registered one of the unknown status stray markers relative to the reference array.
  • 5. The camera tracking system of claim 2, wherein the at least one processor is further operative to: limit registration to being performed on only unknown status stray markers that are identified in at least a threshold number of the plurality of frames which have been determined to have camera movement offsets greater than a threshold movement offset.
  • 6. The camera tracking system of claim 1, wherein the threshold distance is not greater than 2 millimeters.
  • 7. The camera tracking system of claim 1, wherein the at least one processor is further operative to:
    display a first type of graphical object at locations of assigned status stray markers; and
    display a second type of graphical object at locations of unknown status stray markers, wherein the first type of graphical object has a different shape and/or color than the second type of graphical object.
  • 8. The camera tracking system of claim 1, wherein the at least one processor is further operative to:
    determine whether a particular one of the unknown status stray markers included in the candidate registration set satisfies a defined rule for corresponding to a surveillance marker;
    display a registration initiation indicia selectable by a user to trigger registration of the particular one of the unknown status stray markers as the surveillance marker; and
    track location of the surveillance marker relative to the reference array.
  • 9. The camera tracking system of claim 8, wherein the reference array and the surveillance marker are configured to be affixed to a patient, and the at least one processor is further operative to:
    track pose of an instrument relative to the reference array; and
    generate steering information based on comparison of the pose of the instrument relative to a planned pose of the instrument, wherein the steering information indicates where the instrument needs to be moved and angularly oriented to become aligned with the planned pose when performing a surgical procedure.
  • 10. The camera tracking system of claim 8, wherein to satisfy the defined rule the particular one of the unknown status stray markers included in the candidate registration set is within a threshold registration distance from the reference array.
  • 11. The camera tracking system of claim 1, wherein the at least one processor is further operative to:
    determine locations of the unknown status stray markers;
    display graphical indications overlaid on at least one of the frames at the locations of the unknown status stray markers;
    receive a user selection of one of the graphical indications; and
    perform registration of one of the unknown status stray markers with the location corresponding to the selected one of the graphical indications.
  • 12. The camera tracking system of claim 11, wherein the at least one processor is further operative to receive the user selection of one of the graphical indications through a touch screen interface.
  • 13. The camera tracking system of claim 12, wherein the at least one processor is further operative to:
    display a graphical representation of a tool being tracked by the camera tracking system while manipulated by the user; and
    receive the user selection of the one of the graphical indications based on determining whether a tracked location on the tool is within a threshold selection distance from the location of the one of the graphical indications while a further defined condition is satisfied.
  • 14. The camera tracking system of claim 1, wherein the at least one processor is further operative to:
    determine three-dimensional (3D) locations of the stray markers,
    determine a camera movement offset of the tracking cameras between receipt of the frame and receipt of a previous frame, and
    based on when the camera movement offset is less than a threshold movement offset, not perform the estimation of 3D locations where phantom markers can appear in the frame based on epipolar ambiguity of the tracking cameras.
  • 15. The camera tracking system of claim 14, wherein the determination of the camera movement offset determines rotational offset of the tracking cameras.
  • 16. The camera tracking system of claim 14, wherein the determination of the camera movement offset determines linear location offset of the tracking cameras.
  • 17. A method by a camera tracking system for computer assisted navigation during surgery, the method comprising:
    receiving a stream of frames of tracking data from tracking cameras configured with a partially overlapping field-of-view; and
    for each of a plurality of the frames in the stream,
      identifying stray markers in the frame,
      identifying which of the stray markers are part of a reference array,
      designating stray markers that are part of the reference array as being assigned status,
      designating stray markers that are not part of the reference array as being unknown status,
      for each one of the assigned status stray markers, designating any other of the assigned status stray markers and any of the unknown status stray markers that are along a same epipolar line of the tracking cameras as the one of the assigned status stray markers as being epipolar ambiguous status,
      for each one of the epipolar ambiguous status stray markers, estimating 3D locations where phantom markers can appear in the frame based on epipolar ambiguity of the tracking cameras when determining location of the one of the epipolar ambiguous status stray markers,
      designating any of the unknown status stray markers within a threshold distance of the estimated 3D locations of the phantom markers as being phantom status, and
      including in a candidate registration set the unknown status stray markers that do not have phantom status.
  • 18. The method of claim 17, further comprising: performing registration of one of the unknown status stray markers in the candidate registration set.
  • 19. The method of claim 18, further comprising: limiting registration to being performed on only unknown status stray markers that are identified in at least a threshold number of the plurality of frames which have been determined to have camera movement offsets greater than a threshold movement offset.
  • 20. The method of claim 17, further comprising:
    determining whether a particular one of the unknown status stray markers included in the candidate registration set satisfies a defined rule for corresponding to a surveillance marker;
    displaying a registration initiation indicia selectable by a user to trigger registration of the particular one of the unknown status stray markers as the surveillance marker; and
    tracking location of the surveillance marker relative to the reference array.