The present invention relates to robotic surgery and more particularly to ligament balancing in orthopedic surgery and knee arthroplasty with robotic assistance.
Ligaments are important in knee kinematics because, together with the bones, they ensure knee stability. Ligaments also determine knee joint kinematics across the range of motion and provide additional functions such as proprioception (sensory information about the state of the knee).
In order to correctly plan the most appropriate implant position for a specific patient, a surgeon needs to consider how to maintain proper balance of the patient's knee ligaments. The balancing procedure is performed intra-operatively, i.e., after opening the patient's knee. Usually, the surgeon will first remove any osteophytes in order to prevent them from interfering with the ligaments assessment. Next, the surgeon will move the knee across the range of motion with no medio-lateral forces, and then again while applying varus and valgus forces, to feel how the knee moves. The surgeon attempts to identify knee laxity, any irregularities, the force required to move the knee, etc. The procedure provides initial information to consider during the implant planning stage, such as how to adapt implant sizing and placement for this patient, target deformity correction, etc. After performing the necessary cuts and placing a trial implant, the surgeon will typically redo the ligaments balancing process to compare the present balancing to the initial balancing in view of a desired goal to be achieved. Similarly, after placing a final implant, the surgeon may perform a final check of the ligaments balancing.
Limitations of existing processes to assess ligaments balancing can include the following:
Some embodiments of the present disclosure are directed to a surgical robot system that includes a surgical robot having a robot base and a robot arm connected to the robot base. The surgical robot system further includes a joint manipulation arm configured to be attached to the robot arm and to be connected to an appendage of a patient and moved to apply force and/or torque to a joint connecting the appendage through movement of the robot arm. The surgical robot system further includes a force and/or torque sensor apparatus configured to output a feedback signal providing an indication of an amount of force and/or torque that is being applied to the robot arm and/or the joint manipulation arm. At least one controller is configured to determine ligaments balancing at the joint based on a plurality of measurements of the feedback signal, and to output information characterizing the ligaments balancing.
Some other related embodiments are directed to a method by a surgical robot system which includes a surgical robot with a robot arm connected to move a joint manipulation arm to apply force and/or torque to a joint connecting an appendage of a patient. The method includes obtaining, from a force and/or torque sensor apparatus, a feedback signal providing an indication of an amount of force and/or torque that is being applied to the robot arm and/or the joint manipulation arm. The method determines ligaments balancing at the joint based on a plurality of measurements of the feedback signal, and outputs information characterizing the ligaments balancing.
Some other related embodiments are directed to a computer program product including a non-transitory computer readable medium storing program instructions executable by at least one processor of a surgical robot system including a surgical robot with a robot arm connected to move a joint manipulation arm to apply force and/or torque to a joint connecting an appendage of a patient. The program instructions when executed by the at least one processor operate to obtain, from a force and/or torque sensor apparatus, a feedback signal providing an indication of an amount of force and/or torque that is being applied to the robot arm and/or the joint manipulation arm. The operations determine ligaments balancing at the joint based on a plurality of measurements of the feedback signal, and output information characterizing the ligaments balancing.
Other surgical robot systems and corresponding methods and computer program products according to embodiments of the inventive subject matter will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such additional surgical robot systems, methods, and computer program products be included within this description, be within the scope of the present inventive subject matter, and be protected by the accompanying claims. Moreover, it is intended that all embodiments disclosed herein can be implemented separately or combined in any way and/or combination.
Aspects of the present disclosure are illustrated by way of example and are not limited by the accompanying drawings. In the drawings:
The following discussion is presented to enable a person skilled in the art to make and use embodiments of the present disclosure. Various modifications to the illustrated embodiments will be readily apparent to those skilled in the art, and the principles herein can be applied to other embodiments and applications without departing from embodiments of the present disclosure. Thus, the embodiments are not intended to be limited to the embodiments shown, but are to be accorded the widest scope consistent with the principles and features disclosed herein. The following detailed description is to be read with reference to the figures, in which like elements in different figures have like reference numerals. The figures, which are not necessarily to scale, depict selected embodiments and are not intended to limit the scope of the embodiments. Skilled artisans will recognize that the examples provided herein have many useful alternatives that fall within the scope of the embodiments.
It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the description herein or illustrated in the drawings. The teachings of the present disclosure may be used and practiced in other embodiments and practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless specified or limited otherwise, the terms “mounted,” “connected,” “supported,” and “coupled” and variations thereof are used broadly and encompass both direct and indirect mountings, connections, supports, and couplings. Further, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings.
This application incorporates by reference the entire content of U.S. patent application Ser. No. 15/157,444 filed May 18, 2016.
Some more advanced ligaments balancing assessment processes have been proposed, but are subject to various limitations. One sensor-assisted technology product is called VERASENSE from Orthosensor. VERASENSE is a sensor-assisted device used during primary and revision Total Knee Arthroplasty (TKA). This technology sends real-time data to a monitor in the operating room, which a surgeon can reference to make decisions about soft tissue balance and to decide on customization of implant position for a particular patient.
It is important for the ligament balancing process to be transferable between patients while maintaining effectiveness of results. Navigation-based ligaments balancing can include measuring the position of the tibia and femur during a ROM test while a surgeon applies varus and valgus forces. With additional information potentially provided by medical images (e.g., CT scans, segmented 3D bone models) and/or measurements (e.g., condylar/bone surfaces acquired using a pointer tracked by a camera tracking system), useful measurements for assessing ligaments balancing can be identified. The measurements can include gap size, maximal and/or minimal varus and/or valgus angles, contact surfaces and points between bones, etc. In existing approaches the surgeon is entirely responsible for moving the leg across the estimated ROM, which gives rise to the disadvantages described above.
Various embodiments of the present disclosure are directed to operating a surgical robot system to perform the ligaments balancing process. The surgical robot system is also referred to herein as a “robot system” or “robot” for brevity. The robot system can be configured to precisely apply pre-defined constraints (forces and torques) on patient anatomy while measuring the resulting/reaction forces using, e.g., Force and/or Torque (FT) sensor apparatus(es). The robot system can precisely perform the ligaments balancing process in a manner that is repeatable with a particular patient and as a process that is repeatable across different patients.
Three separate example processes are now explained which may be performed by the robot system to carry out ligaments balancing and provide navigated assistance to a surgeon during a surgical procedure. The first process is referred to as Free Knee Motion, through which the robot system operates to match measurements of initial knee ligaments balancing before and after implantation of an implant into a patient knee. The second process is referred to as Ligaments Characteristics Identification, through which the robot system operates to perform measurements of ligaments characteristics (e.g., elasticity, attachment points, etc.) when assisting a surgeon with choosing the best implant position, type, and size for a specific patient. The third process is referred to as Standardized Measurement, through which the robot system accesses a knowledge database defining standard “good” ligaments balancing to obtain baseline balancing measurements, based on which the robot system performs operations after implantation to balance the specific patient's ligaments. These example processes can be used separately or together.
Although various embodiments are described and illustrated in the context of knee ligaments balancing, concepts implemented from these and other embodiments are not limited to the knee. These concepts may be used to evaluate and/or balance ligaments at other joints of the human body.
The Free Knee Motion process is explained in accordance with some embodiments. The robot system operates to determine how ligaments guide knee motion of a specific patient before implant placement and then closely match the ligament balancing to provide desired, e.g., substantially the same, knee motion after implant placement. For patients with advanced arthritis, osteophytes develop around the knee joint.
There are four principal ligaments for correct knee balance (the lateral collateral ligament (LCL), medial collateral ligament (MCL), anterior cruciate ligament (ACL), and posterior cruciate ligament (PCL)), complemented by other tissue in the knee capsule, which are represented in the illustrations of knees in
The robotic arm 104 can be configured to move the patient's lower leg 62 through a defined ROM while performing measurements of force and/or torque applied to the knee manipulation arm 60. Alternatively or additionally, the robotic arm 104 can be configured to perform measurements of force and/or torque applied to the knee manipulation arm 60 while a surgeon moves the patient's leg 62 through a ROM. The measurements can be performed by one or more force and/or torque sensor apparatus(es) in a distal end of the arm 104, such as within the end-effector 112, and/or in the knee manipulation arm 60 to measure force and/or torque at a joint of the arm 104. The force and/or torque sensor apparatus may measure force and/or torque directly at, e.g., a joint of the arm 104 and/or indirectly by measuring the current applied to one or more motors which guide movement of the robotic arm 104.
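The disclosure does not specify how motor current is converted into an estimate of joint torque; a minimal sketch of one conventional approach is shown below, assuming a calibrated per-joint torque constant, gear ratio, and drivetrain efficiency (all of the names and parameter values here are hypothetical, not taken from the disclosure):

```python
def estimate_joint_torque(current_amps, torque_constant_nm_per_a,
                          gear_ratio, efficiency=0.9):
    """Approximate the torque at a robot joint from measured motor current.

    All parameters are hypothetical calibration values; a real system
    would calibrate the torque constant, gear ratio, and efficiency
    per joint on the bench.
    """
    motor_torque = current_amps * torque_constant_nm_per_a
    return motor_torque * gear_ratio * efficiency

# Example: 2 A through a motor with 0.1 N*m/A constant and 100:1 gearing.
torque = estimate_joint_torque(2.0, 0.1, 100)
```

A sensor apparatus measuring force/torque directly at the joint would bypass this estimate; the current-based variant trades accuracy for not needing a dedicated sensor.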
The knee manipulation arm 60 can be configured to be attached to the lower leg 62 in a way that enables the robotic arm 104 to move the lower leg 62 while measuring the reaction forces and/or torques. Operations by the robot system 100 to perform ligaments balancing can be performed at the same point in the surgery as the manual ligaments balancing, or may be performed more often, such as responsive to any one or more of the following: before a surgical procedure to open the knee; after opening the knee; after a surgical procedure to remove osteophytes; after each cut on the knee; before and/or after performing a trial implant placement in the knee; and before and/or after performing a final implant placement in the knee. The robot system 100 can be configured to use the measurements to provide information guidance to the surgeon to assist with selection of implant size for the patient, placement of an implant into the knee of the patient (e.g., medio-lateral positioning of the femoral component, positioning and rotation of the tibial component, size of the polyethylene, etc.), etc.
In the example embodiment of
In a further embodiment, the joint manipulation arm is configured as a knee manipulation arm 60 adapted to be attached to the robot arm and to be connected to the leg 62 of the patient to apply force and/or torque to a knee of the leg 62 through movement of the robot arm 104. The force and/or torque sensor apparatus is configured to output a feedback signal providing an indication of an amount of force and/or torque that is being applied to the knee. The at least one controller is configured to determine ligaments balancing at the knee based on the plurality of measurements of the feedback signal, and to output information characterizing the ligaments balancing.
Also in the example embodiment of
In a further embodiment, the at least one controller is further configured to display to an operator an indication of the defined ROM that the joint needs to be moved through during the pre-implant determination of ligaments, and to display to the operator a further indication of a present tracked location of the joint being moved by the operator and/or by the surgical robot in the defined ROM.
During robot-assisted ligaments balancing, the robot system 100 is configured to perform operations that move the knee manipulation arm 60 via the arm 104 to cause pre-defined movements of the lower leg 62 within the knee ROM and which may be performed to cause defined levels of resultant force and/or torque to be created at the sensor(s) of the arm 104 in order to measure the natural position given by the ligaments for the knee. One or more of the operations may include moving the lower leg 62 along a straight line without applying lateral forces and repeating the same movement when applying varus and/or valgus forces. The robot system 100 may be configured to operate in an impedance control mode through which it applies a substantially constant force and/or torque while performing the defined motion of the lower leg 62 and performing measurements of the natural position provided by the ligaments.
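The impedance control mode referenced above is not detailed in the disclosure; the sketch below illustrates the standard virtual spring-damper form of an impedance control law, under the assumption (ours, not the disclosure's) that the controller commands a force proportional to position and velocity error:

```python
def impedance_force(stiffness, damping, target_pos, pos, target_vel, vel):
    """Classic impedance control law: the commanded force behaves like a
    virtual spring-damper pulling the arm toward the target trajectory.

    Gains and signals are illustrative scalars; a real controller would
    use 6-DOF vectors and gain matrices tuned for the knee manipulation arm.
    """
    return stiffness * (target_pos - pos) + damping * (target_vel - vel)

# Example: 0.1 m position error, no velocity error, spring gain 100 N/m.
force = impedance_force(100.0, 10.0, 1.0, 0.9, 0.0, 0.0)
```

Holding the commanded force substantially constant while the leg moves lets the ligaments, rather than the robot, determine the natural resting position that is being measured.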
In some embodiments, the surgical robot further comprises at least one motor operatively connected to move the robot arm 104 relative to the robot base. The at least one controller is further configured to perform the determination 1400 of ligaments balancing based on controlling the at least one motor to move the robot arm 104 along a path computed to move the joint in the defined ROM while repetitively performing the measurements of the feedback signal indicating the amount of force and/or torque that is being applied to the joint.
In a further embodiment, the at least one controller is further configured to perform the determination 1400 of ligaments balancing based on controlling the at least one motor to move the robot arm 104 to cause the joint to move in the defined ROM without lateral forces being applied to the knee and while performing a first set of measurements of the feedback signal indicating the amount of force and/or torque that is being applied to the joint, and to control the at least one motor to move the robot arm 104 to cause the joint to move in the defined ROM with defined lateral forces being applied to the knee and while performing a second set of measurements of the feedback signal indicating the amount of force and/or torque that is being applied to the joint. The at least one controller is further configured to determine the ligaments balancing based on a combination of the first and second sets of measurements.
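One natural way to combine the two measurement sets, sketched below purely for illustration (the disclosure does not prescribe a specific combination), is to express varus/valgus laxity at each flexion angle as the difference between the stressed and unstressed sweeps:

```python
def laxity_per_angle(no_force_angles, forced_angles):
    """Hypothetical combination of the two measurement sets: laxity at
    each flexion angle as the difference between the varus/valgus angle
    measured with the defined lateral force applied and the angle
    measured without lateral force.

    Both arguments map flexion angle (degrees) to a measured
    varus/valgus angle (degrees); the dict-based representation is an
    assumption for this sketch.
    """
    return {a: forced_angles[a] - no_force_angles[a] for a in no_force_angles}

# Example: a uniform 3-degree opening under stress across two flexion angles.
laxity = laxity_per_angle({0: 1.0, 30: 2.0}, {0: 4.0, 30: 5.0})
```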
The robot system 100 can be configured to operate to determine the size of gaps between bones (e.g. medial and lateral gap between tibia and femur), ligaments mechanical characteristics (e.g. flexibility, attachment points, health state) and other ligaments balance related parameters as a function of knee flexion angle and/or for a ROM, based on using the measurements of reaction forces and/or torque and optionally further based on positions of the robotic arm 104 and/or end-effector 112, positions of the tibia and the femur which can be tracked using a tracking camera as disclosed in U.S. patent application Ser. No. 16/587,203, filed Sep. 30, 2019, (incorporated herein by reference), and/or defined system set-up for tibia and femur measurements.
The determined information can be presented to a surgeon or other user via a display device (e.g., display screen 110, head-mounted display (HMD) 150, etc.) to provide computer assistance for implant selection, implant placement, and/or assessment of estimated surgical procedure outcomes. Software of the robot system 100 may operate to use the determined information to recommend to the surgeon an implant type, size, and placement for the particular patient. The software may operate using defined optimization criteria that estimate the measurements that would result after a particular implant placement configuration, and may operate to adapt the implant type, size, and/or placement to more precisely match the resulting measurements to the initially acquired set of measurements, i.e., those acquired before surgical steps that affect the ligaments balance, and/or to a defined set of measurements where a surgeon seeks to achieve resulting measurements that differ from the initially acquired set of measurements. The software may operate using a machine decision process which is based on one or more of a rule based decision process, artificial intelligence process, neural network circuit, etc. In some embodiments, the computer assistance can result in the structure of the knee after implant placement closely matching the original or other desired knee structure, and thus improve patient feeling and potentially outcomes from the surgical procedure.
In the example embodiment of
Various specific movements that can be performed with assistance from the robot system 100 are now discussed with reference to
In some embodiments, the robot system 100 operates to measure the knee laxity across a ROM using specifically defined robot movements, e.g., of the knee manipulation arm 60.
The robot system 100 may determine the ligaments' mechanical properties by operating to repetitively cause different movements of the lower leg 62 that create differing resultant forces on the knee while performing measurements of forces and/or torques. The ligaments' mechanical properties, such as stiffness, can correspond to the force exerted by the ligament as a function of its stretching.
In a corresponding operational embodiment, the at least one controller is further configured to determine stiffness of the ligaments based on a ratio between measurements of force applied by the ligaments and measurements of stretching of the ligaments while the joint is moved in the defined ROM, such as described with regard to
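The stiffness determination described above amounts to fitting the slope of force versus stretch. A minimal sketch, assuming noisy paired samples and a simple least-squares fit (the disclosure does not specify the fitting method), could look like:

```python
def ligament_stiffness(stretch_mm, force_n):
    """Estimate ligament stiffness (N/mm) as the least-squares slope of
    measured force versus measured stretch across the ROM sweep.

    Inputs are paired lists of stretch (mm) and force (N) samples; both
    the units and the fitting method are assumptions for this sketch.
    """
    n = len(stretch_mm)
    mean_x = sum(stretch_mm) / n
    mean_y = sum(force_n) / n
    num = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(stretch_mm, force_n))
    den = sum((x - mean_x) ** 2 for x in stretch_mm)
    return num / den

# Example: 5 N of force per mm of stretch yields a stiffness of 5 N/mm.
k = ligament_stiffness([0.0, 1.0, 2.0, 3.0], [0.0, 5.0, 10.0, 15.0])
```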
The robot system 100 can determine ligaments balancing based on measurements obtained while specific movements of the knee are performed. As explained above, the robot system 100 may display the determined information to provide computer assistance for implant selection, implant placement, and/or assessment of estimated surgical procedure outcomes for the patient.
Various planning operations are now explained with reference to
For example, the robot system 100 can apply a progressive torque to the lower leg 62 to perform a complete flexion of the knee (e.g., according to the movements illustrated in
The different shifts measured during a movement of the lower leg 62 may not be smooth because of friction. For example, while the head of the femoral bone is sliding on the head of the tibial bone the bones may get stuck because of the cartilage which is worn (due to the osteoarthritis) not being perfectly smooth. The friction forces can be measured by the sensors in the robotic arm 104, the end-effector 112, and/or the knee manipulation arm 60, and/or the friction forces may be determined by analyzing the positions and/or oscillations (e.g., of the measured friction force values) at various positions of the femur and tibia. The measured and/or determined friction forces can be displayed as additional useful information to the surgeon, such as for use in guiding osteophyte removal or informing about knee areas of particular concern due to osteoarthritis or cartilage degradation.
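The disclosure leaves open how oscillations in the measured forces would be detected; one simple illustrative approach (all names and thresholds below are hypothetical) is to flag samples that deviate sharply from a local moving average of the force signal:

```python
def flag_friction_spikes(forces, window=3, threshold=2.0):
    """Hypothetical friction detector: flag sample indices where the
    measured force deviates from a local moving average by more than
    `threshold`, suggesting the joint surfaces briefly stuck due to
    worn, non-smooth cartilage.

    `window` and `threshold` are illustrative tuning parameters.
    """
    flagged = []
    for i in range(len(forces)):
        lo = max(0, i - window)
        hi = min(len(forces), i + window + 1)
        local_mean = sum(forces[lo:hi]) / (hi - lo)
        if abs(forces[i] - local_mean) > threshold:
            flagged.append(i)
    return flagged

# Example: a single force spike mid-sweep is flagged as a possible stick.
spikes = flag_friction_spikes([1, 1, 1, 8, 1, 1, 1])
```

The flagged flexion positions could then be mapped onto the bone model to highlight areas of concern to the surgeon.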
This information generated by the robot system 100 can be used by the robot system 100 to make recommendations to the surgeon and/or may be displayed to the surgeon to facilitate selection of a preferred patient treatment plan, selection of a preferred implant, determination of a preferred implant placement location, selection of an implant type, selection of an implant size, determination of a knee surgical cut plan, prediction of an outcome of a surgical procedure, etc., for the patient.
Example corresponding planning operations are explained with further reference to
The at least one controller may be further configured to access a knowledge database using the pre-implant information to determine the at least one of: the candidate position for the implant in the joint; the candidate size of the implant; and the candidate type of the implant. The knowledge database defines relationships between different sets of ligaments balancing values for a baseline joint and at least one of: different sets of preferred positions for an implant in the baseline joint; different sets of preferred implant sizes; and different sets of preferred implant types.
The at least one controller may be further configured to perform a pre-implant determination of ligaments balancing to output pre-implant information characterizing ligaments balancing at the joint before surgical placement of an implant in the joint, based on movement of the joint in the defined ROM while measuring the feedback signal indication of the amount of force and/or torque. The at least one controller can then process the pre-implant information to determine and output to an operator at least one of a preferred joint surgical cut plan and a prediction characterizing an outcome of the preferred joint surgical cut plan.
Further operations are now explained which can be performed during the Free Knee Motion process to operate the robot system 100 to provide recommendations to the surgeon for cutting angles, implant size, implant type, etc.
The robot system 100 can operate to calculate the distance between the articular surfaces of the knee while performing movement of the knee in a ROM with successively applied varus and valgus stress to the knee and while tracking the pose of both the femur and tibia.
To apply valgus stress to the knee during movement in the ROM, the leg is moved by the robot system 100 in the knee's ROM, e.g., through the entire ROM, while applying valgus “force” to tension the Medial Collateral Ligament and thus ensure contact of the lateral condyle with the lateral tibial plateau in, e.g., throughout, the ROM. By tracking the locations of both the tibia and femur in space with the tracking camera 200 at the same time, the robot system 100 can calculate the distance, with respect to flexion angle, between the condylar surface and the tibial plateau surface on the medial side of the knee.
To apply varus stress to the knee during movement in the ROM, the leg is moved by the robot system 100 in the knee's ROM while applying varus “force” to tension the Lateral Collateral Ligament and thus ensure contact of the medial condyle with the medial tibial plateau in, e.g., throughout, the ROM. By tracking the locations of both the tibia and femur in space with the tracking camera 200 at the same time, the robot system 100 can calculate the distance, with respect to flexion angle, between the condylar surface and the tibial plateau surface on the lateral side of the knee.
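Given tracked poses of both bones, the per-flexion-angle gap calculation reduces to a point-to-point distance in the common tracking frame. A minimal sketch, assuming each sample pairs a flexion angle with one condylar surface point and one tibial plateau point already expressed in the camera frame (a simplification of the full surface-to-surface computation):

```python
import math

def surface_gap(condyle_point, plateau_point):
    """Euclidean distance between a tracked femoral condyle surface point
    and the corresponding tibial plateau point, both expressed in the
    tracking camera's coordinate frame (an assumed simplification)."""
    return math.dist(condyle_point, plateau_point)

def gap_vs_flexion(samples):
    """Map flexion angle (degrees) to the measured articular-surface gap.

    `samples` is a list of (flexion_angle_deg, condyle_xyz, plateau_xyz)
    tuples; the representation is hypothetical.
    """
    return {angle: surface_gap(c, p) for angle, c, p in samples}

# Example: gaps of 2 mm at extension and 5 mm at 30 degrees of flexion.
gaps = gap_vs_flexion([(0, (0, 0, 0), (0, 0, 2)),
                       (30, (0, 0, 0), (0, 3, 4))])
```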
It is noted that by having the robot system 100 apply the varus/valgus stress, the level of stress (forces and/or torques) applied to the knee joint can be specifically controlled, thereby increasing the repeatability of the acquired data.
Once the distances between the articular surfaces have been acquired, a planning program module executed by the at least one controller uses the distances as input data and selects the most appropriate combination of implant sizes, as well as implant positions and orientations with respect to the bone, in order to reproduce as closely as possible the measured distances between the articular surfaces. Once the optimal position and orientation of both implants with respect to their respective bones have been determined by the planning program module, the corresponding resection planes and angles can be calculated accordingly (i.e., by knowing the implants' geometries and fitting rules).
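The selection step in the planning program module can be viewed as an optimization over a catalog of candidates. The sketch below is purely illustrative (the disclosure does not define the objective function): it scores each candidate by the squared error between its predicted post-implant gaps and the measured gaps, and returns the best match.

```python
def select_implant(measured_gaps, candidate_implants):
    """Hypothetical planning step: choose the candidate whose predicted
    post-implant gaps best reproduce the measured gaps (least squares).

    measured_gaps: list of measured articular-surface gaps (mm).
    candidate_implants: dict mapping an implant identifier to its
    predicted gap list for the same flexion angles. Both the data
    layout and the scoring are assumptions for this sketch.
    """
    def error(predicted):
        return sum((m - p) ** 2 for m, p in zip(measured_gaps, predicted))
    return min(candidate_implants,
               key=lambda name: error(candidate_implants[name]))

# Example: the smaller candidate reproduces the measured gaps more closely.
best = select_implant([2.0, 2.1], {"size3": [2.0, 2.0],
                                   "size4": [3.0, 3.0]})
```

A real planner would also search over continuous position and orientation parameters rather than a discrete catalog, as the surrounding text describes.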
After that, the surgeon can still modify the implant positions and orientations, based, for example, on the effective calculated flexion and extension resection gaps.
The robot system 100 can also operate to select an appropriate implant or implant position based on ligament flexibility. The measurement of gaps between articular surfaces, as discussed above, is an indirect measurement of ligament stiffness. Indeed, the more flexible (compliant) the ligaments are, the greater the distance that is consequently measured. As the ligaments can be considered anatomical elastics, by determining the length of these elastics, their attachment points on the bones, and their respective stiffness rates (elongation versus applied force), the robot system 100 can then determine an optimized implant position and orientation to ensure that the ligaments are under tension and/or lax in certain positions of the knee (in flexion, mid-flexion, or extension).
The robot system 100 can also operate to access the knowledge database for planning. A specific controlled leg movement (a combination of ROM movement and specific movements for ligaments assessment, as described above) can be performed with the robot arm 104 while the forces and torques applied to the knee joint are controlled and monitored throughout the ROM. Based on the measurements of reaction forces (by means of the force and/or torque sensor apparatus, e.g., of the robot arm 104) as well as the tracked positions of both bones using the tracking camera 200, the robot system 100 (i.e., the at least one controller thereof) can determine the full biomechanical behavior (external/internal rotation, varus/valgus angles, posterior shift, etc.) of the knee joint and specifically adjust implant placement based on this assessment.
Example Surgical Room Equipment Layout and Operation:
Referring to
The tracking camera 200 may include any suitable camera or cameras, such as one or more infrared cameras (e.g., bifocal or stereophotogrammetric cameras), able to identify, for example, active and passive tracking markers for various reference arrays attached to the patient 210 (patient reference array), end-effector 112 (end-effector reference array), joint manipulation arm, extended reality (XR) headset(s) 150a-150b worn by a surgeon 120 and/or a surgical assistant 126, etc. in a given measurement volume viewable from the perspective of the tracking camera 200. The tracking camera 200 may track markers 170 attached to a joint manipulation arm manipulated by a user (surgeon) and/or the robot system. The tracking camera 200 may scan the given measurement volume and detect the light that is emitted or reflected from the reference arrays in order to identify and determine poses of the reference arrays in three dimensions. For example, active reference arrays may include infrared-emitting markers that are activated by an electrical signal (e.g., infrared light emitting diodes (LEDs)), and passive reference arrays may include retro-reflective markers that reflect infrared light (e.g., they reflect incoming IR radiation into the direction of the incoming light), for example, emitted by illuminators on the tracking camera 200 or other suitable device.
The XR headsets 150a and 150b (also referred to as an XR headset 150) may each include tracking cameras that can track poses of reference arrays within their camera field-of-views (FOVs) 152 and 154, respectively. Accordingly, as illustrated in
An XR headset may be configured to augment a real-world scene with computer generated XR images. The XR headset may be configured to provide an augmented reality (AR) viewing environment by displaying the computer generated XR images on a see-through display screen that allows light from the real-world scene to pass therethrough for combined viewing by the user. Alternatively, the XR headset may be configured to provide a virtual reality (VR) viewing environment by preventing or substantially preventing light from the real-world scene from being directly viewed by the user while the user is viewing the computer generated XR images on a display screen. An XR headset can be configured to provide both AR and VR viewing environments. Thus, the term XR headset can refer to an AR headset or a VR headset.
The camera tracking system 202 may use tracking information and other information from multiple XR headsets 150a and 150b, such as inertial tracking information and optical tracking information as well as (optional) microphone information. The XR headsets 150a and 150b operate to display visual information and play out audio information to the wearer. This information can be from local sources (e.g., the surgical robot 102 and/or other medical equipment), remote sources (e.g., a patient medical image server), and/or other electronic equipment. The XR headsets 150a and 150b track apparatus such as instruments, patient references, and end-effectors in 6 degrees-of-freedom (6DOF), and may track the hands of the wearer. The XR headsets 150a and 150b may also operate to track hand poses and gestures to enable gesture based interactions with “virtual” buttons and interfaces displayed through the XR headsets 150a and 150b, and can also interpret hand or finger pointing or gesturing as various defined commands. Additionally, the XR headsets 150a and 150b may have a 1-10× magnification digital color camera sensor called a digital loupe.
An “outside-in” machine vision navigation bar (tracking cameras 200) may track pose of the joint manipulation arm using monochrome and/or color camera(s). The machine vision navigation bar generally has a more stable view of the environment because it does not move as often or as quickly as the XR headsets 150a and 150b tend to move while positioned on wearers' heads. The patient reference array 116 is generally rigidly attached to the patient with stable pitch and roll relative to gravity. This local rigid patient reference 116 can serve as a common reference for reference frames relative to other tracked arrays, such as a reference array on the end-effector 112, instrument reference array 170, and reference arrays on the XR headsets 150a and 150b.
In some embodiments, one or more of the XR headsets 150a and 150b are minimalistic XR headsets that display local or remote information but include fewer sensors and are therefore more lightweight.
The robot system 100 may be positioned near or next to patient 210. The tracking camera 200 may be separated from the robot system 100 and positioned at the foot of patient 210. This location allows the tracking camera 200 to have a direct visual line of sight to the surgical field 208. It is contemplated that the robot system 100 and the tracking camera 200 can be located at any suitable position. In the configuration shown, the surgeon 120 may be positioned across from the robot 102, but is still able to manipulate the end-effector 112 (and joint manipulation arm) and the display 110. A surgical assistant 126 may be positioned across from the surgeon 120, again with access to both the end-effector 112 and the display 110. If desired, the locations of the surgeon 120 and the assistant 126 may be reversed. The traditional areas for the anesthesiologist 122 and the nurse or scrub tech 124 remain unimpeded by the locations of the robot 102 and camera 200. The anesthesiologist 122 can operate anesthesia equipment, which can include a display 34.
With respect to the other components of the robot 102, the display 110 can be attached to the surgical robot 102; in other example embodiments, the display 110 can be detached from the surgical robot 102, either within a surgical room with the surgical robot 102 or in a remote location. The end-effector 112 may be coupled to the robotic arm 104 and controlled by at least one motor. In example embodiments, the end-effector 112 can be connectable to a joint manipulation arm and, alternatively, a guide tube 114, which is able to receive and orient a surgical instrument, tool, or implant 608 used to perform a surgical procedure on the patient 210.
As used herein, the term “end-effector” is used interchangeably with the terms “end-effectuator” and “effectuator element.” The term “instrument” is used in a non-limiting manner and can be used interchangeably with “tool” and “implant” to generally refer to any type of device that can be used during a surgical procedure in accordance with embodiments disclosed herein. Example instruments, tools, and implants include, without limitation, joint manipulation arms, drills, screwdrivers, saws, dilators, retractors, probes, implant inserters, and implant devices such as screws, spacers, interbody fusion devices, plates, rods, etc. Although generally shown with a guide tube 114, it will be appreciated that the end-effector 112 may be replaced with any instrumentation suitable for use in surgery. In some embodiments, end-effector 112 can comprise any known structure for effecting the movement of the surgical instrument 608 in a desired manner.
The surgical robot 102 is operable to control the translation and orientation of the end-effector 112. The robot 102 is operable to move end-effector 112 under computer control along x-, y-, and z-axes, for example. The end-effector 112 can be configured for selective rotation about one or more of the x-, y-, and z-axis, and a Z Frame axis (such that one or more of the Euler Angles (e.g., roll, pitch, and/or yaw) associated with end-effector 112 can be selectively computer controlled). In some example embodiments, selective control of the translation and orientation of end-effector 112 can permit performance of medical procedures with significantly improved accuracy compared to conventional robots that utilize, for example, a six degree of freedom robotic arm comprising only rotational axes. For example, the surgical robot system 100 may be used to operate on patient 210, and robotic arm 104 can be positioned above the body of patient 210, with end-effector 112 selectively angled relative to the z-axis toward the body of patient 210.
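The selective computer control of the end-effector's Euler angles (roll, pitch, and/or yaw) described above can be illustrated with a standard Euler-angle-to-rotation-matrix conversion. The ZYX (yaw-pitch-roll) convention is assumed here purely for illustration; the document does not specify which convention the controller uses.

```python
import math

def euler_zyx_to_matrix(roll, pitch, yaw):
    """Build a 3x3 rotation matrix from roll (about x), pitch (about y),
    and yaw (about z) angles in radians, using the common ZYX convention.
    Illustrative sketch only; the actual controller's convention may differ."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    # R = Rz(yaw) * Ry(pitch) * Rx(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]
```

Pairing such a rotation with an x-, y-, z-translation gives the full 6DOF command space described for the end-effector 112.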
In some example embodiments, the XR headsets 150a and 150b can be controlled to dynamically display an updated graphical indication of the pose of the surgical instrument so that the user can be aware of the pose of the surgical instrument at all times during the procedure.
As used herein, the term “pose” refers to the position and/or the rotational angle of one object (e.g., dynamic reference array, end-effector, surgical instrument, anatomical structure, etc.) relative to another object and/or to a defined coordinate system. A pose may therefore be defined based on only the multidimensional position of one object relative to another object and/or relative to a defined coordinate system, based on only the multidimensional rotational angles of the object relative to another object and/or to a defined coordinate system, or based on a combination of the multidimensional position and the multidimensional rotational angles. The term “pose” therefore is used to refer to position, rotational angle, or combination thereof.
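The definition above, where a pose may be position only, rotational angles only, or both, can be modeled as a small data structure. The class and field names below are illustrative assumptions, not part of the disclosed system.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Pose:
    """Pose of one object relative to another object or coordinate system.
    Either component may be None when only position or only rotation is
    tracked, matching the definition of 'pose' as position, rotational
    angle, or a combination thereof. Names are illustrative only."""
    position: Optional[Tuple[float, float, float]] = None  # x, y, z
    rotation: Optional[Tuple[float, float, float]] = None  # roll, pitch, yaw

    def is_full_6dof(self) -> bool:
        # True when both the 3-axis position and 3-axis rotation are known.
        return self.position is not None and self.rotation is not None
```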
In some further embodiments, surgical robot 102 can be configured to correct the path of the joint manipulation arm being moved by the surgeon with guidance by the robotic arm 104. In some example embodiments, surgical robot 102 can be configured to permit stoppage, modification, and/or manual control of the movement of end-effector 112 and/or the joint manipulation arm. Thus, in use, in example embodiments, a surgeon or other user can operate the system 100, and has the option to stop, modify, or manually control the autonomous movement of end-effector 112 and/or the joint manipulation arm.
Reference arrays can be formed on or connected to robotic arm 104, end-effector 112, joint manipulation arm 60, patient 210, and/or the surgical instrument to track poses in 6 degrees-of-freedom (e.g., position along 3 orthogonal axes and rotation about the axes). In example embodiments, a reference array including a plurality of tracking markers can be provided (e.g., formed-on or connected-to) on an outer surface of the robot 102, such as on robot 102, on robotic arm 104, and/or on the end-effector 112. A patient reference array including one or more tracking markers can further be provided on the patient 210 (e.g., formed-on or connected-to). An instrument reference array including one or more tracking markers can be provided on surgical instruments (e.g., a screwdriver, dilator, implant inserter, or the like). The reference arrays enable each of the marked objects (e.g., the end-effector 112, the patient 210, and the surgical instruments) to be tracked by the tracking camera 200, and the tracked poses can be used to provide navigation guidance to a surgical procedure and/or used to control movement of the surgical robot 102 for guiding the end-effector 112, joint manipulation arm, and/or an instrument.
Example Surgical System:
The imaging devices may include a C-arm imaging device 1304, an O-arm imaging device 1306, and/or a patient image database 1620. The XR headset 150 provides an improved human interface for performing navigated surgical procedures. The XR headset 150 can be configured to provide functionalities, e.g., via the computer platform 1600, that include without limitation any one or more of: identification of hand gesture based commands and display of XR graphical objects on a display device 1612. The display device 1612 may be a video projector, flat panel display, etc. The user can view the XR graphical objects as an overlay anchored to particular real-world objects viewed through a see-through display screen. The XR headset 150 may additionally or alternatively be configured to display on the display device 1612 video streams from cameras mounted to one or more XR headsets 150 and other cameras.
Electrical components of the XR headset 150 can include a plurality of cameras 1622, a microphone 1620, a gesture sensor 1618, a pose sensor (e.g., inertial measurement unit (IMU)) 1616, the display device 1612, and a wireless/wired communication interface 1624. The cameras 1622 of the XR headset 150 may be visible light capturing cameras, near infrared capturing cameras, or a combination of both.
The cameras 1622 may be configured to operate as the gesture sensor 1618 by tracking user hand gestures performed within the field of view of the camera(s) 1622 for identification. Alternatively, the gesture sensor 1618 may be a proximity sensor and/or a touch sensor that senses hand gestures performed proximately to the gesture sensor 1618 and/or senses physical contact, e.g., tapping on the sensor 1618 or its enclosure. The pose sensor 1616, e.g., IMU, may include a multi-axis accelerometer, a tilt sensor, and/or another sensor that can sense rotation and/or acceleration of the XR headset 150 along one or more defined coordinate axes. Some or all of these electrical components may be contained in a head-worn component enclosure or may be contained in another enclosure configured to be worn elsewhere, such as on the hip or shoulder.
As explained above, a surgical system includes a camera tracking system 202 which may be part of a computer platform 1600 that can also provide functionality of a navigation controller 1604 and/or of a XR headset controller 1610. The surgical system may include the imaging devices and/or a surgical robot 102. The navigation controller 1604 can be configured to provide visual navigation guidance to an operator for moving and positioning a joint manipulation arm during a joint ligaments balancing procedure and/or moving and positioning a surgical tool relative to patient anatomical structure based on a surgical plan, e.g., from a surgical planning function, defining where a surgical procedure is to be performed using the surgical tool on the anatomical structure and based on a pose of the anatomical structure determined by the camera tracking system 202. The navigation controller 1604 may be further configured to generate navigation information based on a target pose for a surgical tool, a pose of the anatomical structure, and a pose of the surgical tool and/or an end-effector of the surgical robot 102, where the navigation information is used to display information through the XR headset 150 to indicate where the surgical tool and/or the end-effector of the surgical robot 102 should be moved to perform the surgical plan.
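One simple form the navigation information could take is the positional offset between the target pose and the current tool pose, displayed as guidance through the XR headset 150. The computation below is a hypothetical sketch under that assumption, not the navigation controller's actual algorithm.

```python
def steering_offset(target_position, tool_position):
    """Compute the displacement from the tool's current position to the
    target position, plus its magnitude, for display as navigation
    guidance. Hypothetical sketch; names are illustrative assumptions."""
    # Per-axis displacement the tool must travel to reach the target.
    dx = [t - c for t, c in zip(target_position, tool_position)]
    # Euclidean distance remaining to the target.
    distance = sum(d * d for d in dx) ** 0.5
    return dx, distance
```

In practice such an offset would be recomputed each tracking frame, so the displayed guidance updates as the tool and anatomy move.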
The electrical components of the XR headset 150 can be operatively connected to the electrical components of the computer platform 1600 through a wired/wireless interface 1624. The electrical components of the XR headset 150 may be operatively connected, e.g., through the computer platform 1600 or directly connected, to various imaging devices, e.g., the C-arm imaging device 1304, the O-arm imaging device 1306, the patient image database 1620, and/or to other medical equipment through the wired/wireless interface 1624.
The surgical system further includes at least one XR headset controller 1610 that may reside in the XR headset 150, the computer platform 1600, and/or in another system component connected via wired cables and/or wireless communication links. Various functionality is provided by software executed by the XR headset controller 1610. The XR headset controller 1610 is configured to receive information from the camera tracking system 202 and the navigation controller 1604, and to generate an XR image based on the information for display on the display device 1612.
The XR headset controller 1610 can be configured to operationally process signaling from the cameras 1622, the microphone 1620, and/or the pose sensor 1616, and is connected to display XR images on the display device 1612 for user viewing. Thus, the XR headset controller 1610 illustrated as a circuit block within the XR headset 150 is to be understood as being operationally connected to other illustrated components of the XR headset 150 but not necessarily residing within a common housing or being otherwise transportable by the user. For example, the XR headset controller 1610 may reside within the computer platform 1600 which, in turn, may reside within a housing of the surgical robot 102, the camera tracking system 202, etc.
In the above-description of various embodiments of present inventive concepts, it is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of present inventive concepts. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which present inventive concepts belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
When an element is referred to as being “connected”, “coupled”, “responsive”, or variants thereof to another element, it can be directly connected, coupled, or responsive to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected”, “directly coupled”, “directly responsive”, or variants thereof to another element, there are no intervening elements present. Like numbers refer to like elements throughout. Furthermore, “coupled”, “connected”, “responsive”, or variants thereof as used herein may include wirelessly coupled, connected, or responsive. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Well-known functions or constructions may not be described in detail for brevity and/or clarity. The term “and/or” includes any and all combinations of one or more of the associated listed items.
It will be understood that although the terms first, second, third, etc. may be used herein to describe various elements/operations, these elements/operations should not be limited by these terms. These terms are only used to distinguish one element/operation from another element/operation. Thus, a first element/operation in some embodiments could be termed a second element/operation in other embodiments without departing from the teachings of present inventive concepts. The same reference numerals or the same reference designators denote the same or similar elements throughout the specification.
As used herein, the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof. Furthermore, as used herein, the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item. The common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits. These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
These computer program instructions may also be stored in a tangible computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks. Accordingly, embodiments of present inventive concepts may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.
It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated, and/or blocks/operations may be omitted without departing from the scope of inventive concepts. Moreover, although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.
Many variations and modifications can be made to the embodiments without substantially departing from the principles of the present inventive concepts. All such variations and modifications are intended to be included herein within the scope of present inventive concepts. Accordingly, the above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended examples of embodiments are intended to cover all such modifications, enhancements, and other embodiments, which fall within the spirit and scope of present inventive concepts. Thus, to the maximum extent allowed by law, the scope of present inventive concepts is to be determined by the broadest permissible interpretation of the present disclosure including the following examples of embodiments and their equivalents, and shall not be restricted or limited by the foregoing detailed description.
This application claims the benefit of U.S. Provisional Patent Application Ser. No. 63/135,068, filed on Jan. 8, 2021, the disclosure and content of which is incorporated by reference herein in its entirety.
8706086 | Glerum | Apr 2014 | B2 |
8706185 | Foley et al. | Apr 2014 | B2 |
8706301 | Zhao et al. | Apr 2014 | B2 |
8717430 | Simon et al. | May 2014 | B2 |
8727618 | Maschke et al. | May 2014 | B2 |
8734432 | Tuma et al. | May 2014 | B2 |
8738115 | Amberg et al. | May 2014 | B2 |
8738181 | Greer et al. | May 2014 | B2 |
8740882 | Jun et al. | Jun 2014 | B2 |
8746252 | McGrogan et al. | Jun 2014 | B2 |
8749189 | Nowlin et al. | Jun 2014 | B2 |
8749190 | Nowlin et al. | Jun 2014 | B2 |
8761930 | Nixon | Jun 2014 | B2 |
8764448 | Yang et al. | Jul 2014 | B2 |
8771170 | Mesallum et al. | Jul 2014 | B2 |
8781186 | Clements et al. | Jul 2014 | B2 |
8781630 | Banks et al. | Jul 2014 | B2 |
8784385 | Boyden et al. | Jul 2014 | B2 |
8786241 | Nowlin et al. | Jul 2014 | B2 |
8787520 | Baba | Jul 2014 | B2 |
8792704 | Isaacs | Jul 2014 | B2 |
8798231 | Notohara et al. | Aug 2014 | B2 |
8800838 | Shelton, IV | Aug 2014 | B2 |
8808164 | Hoffman et al. | Aug 2014 | B2 |
8812077 | Dempsey | Aug 2014 | B2 |
8814793 | Brabrand | Aug 2014 | B2 |
8816628 | Nowlin et al. | Aug 2014 | B2 |
8818105 | Myronenko et al. | Aug 2014 | B2 |
8820605 | Shelton, IV | Sep 2014 | B2 |
8821511 | Von Jako et al. | Sep 2014 | B2 |
8823308 | Nowlin et al. | Sep 2014 | B2 |
8827996 | Scott et al. | Sep 2014 | B2 |
8828024 | Farritor et al. | Sep 2014 | B2 |
8830224 | Zhao et al. | Sep 2014 | B2 |
8834489 | Cooper et al. | Sep 2014 | B2 |
8834490 | Bonutti | Sep 2014 | B2 |
8838270 | Druke et al. | Sep 2014 | B2 |
8844789 | Shelton, IV et al. | Sep 2014 | B2 |
8855822 | Bartol et al. | Oct 2014 | B2 |
8858598 | Seifert et al. | Oct 2014 | B2 |
8860753 | Bhandarkar et al. | Oct 2014 | B2 |
8864751 | Prisco et al. | Oct 2014 | B2 |
8864798 | Weiman et al. | Oct 2014 | B2 |
8864833 | Glerum et al. | Oct 2014 | B2 |
8867703 | Shapiro et al. | Oct 2014 | B2 |
8870880 | Himmelberger et al. | Oct 2014 | B2 |
8876866 | Zappacosta et al. | Nov 2014 | B2 |
8880223 | Raj et al. | Nov 2014 | B2 |
8882803 | Iott et al. | Nov 2014 | B2 |
8883210 | Truncale et al. | Nov 2014 | B1 |
8888821 | Rezach et al. | Nov 2014 | B2 |
8888853 | Glerum et al. | Nov 2014 | B2 |
8888854 | Glerum et al. | Nov 2014 | B2 |
8894652 | Seifert et al. | Nov 2014 | B2 |
8894688 | Suh | Nov 2014 | B2 |
8894691 | Iott et al. | Nov 2014 | B2 |
8906069 | Hansell et al. | Dec 2014 | B2 |
8964934 | Ein-Gal | Feb 2015 | B2 |
8992580 | Bar et al. | Mar 2015 | B2 |
8996169 | Lightcap et al. | Mar 2015 | B2 |
9001963 | Sowards-Emmerd et al. | Apr 2015 | B2 |
9002076 | Khadem et al. | Apr 2015 | B2 |
9044190 | Rubner et al. | Jun 2015 | B2 |
9107683 | Hourtash et al. | Aug 2015 | B2 |
9119656 | Bose | Sep 2015 | B2 |
9125556 | Zehavi et al. | Sep 2015 | B2 |
9131986 | Greer et al. | Sep 2015 | B2 |
9215968 | Schostek et al. | Dec 2015 | B2 |
9308050 | Kostrzewski et al. | Apr 2016 | B2 |
9380984 | Li et al. | Jul 2016 | B2 |
9393039 | Lechner et al. | Jul 2016 | B2 |
9398886 | Gregerson et al. | Jul 2016 | B2 |
9398890 | Dong et al. | Jul 2016 | B2 |
9414859 | Ballard et al. | Aug 2016 | B2 |
9420975 | Gutfleisch et al. | Aug 2016 | B2 |
9492235 | Hourtash et al. | Nov 2016 | B2 |
9592096 | Maillet et al. | Mar 2017 | B2 |
9750465 | Engel et al. | Sep 2017 | B2 |
9757203 | Hourtash et al. | Sep 2017 | B2 |
9795354 | Menegaz et al. | Oct 2017 | B2 |
9814535 | Bar et al. | Nov 2017 | B2 |
9820783 | Donner et al. | Nov 2017 | B2 |
9833265 | Donner et al. | Nov 2017 | B2 |
9848922 | Tohmeh et al. | Dec 2017 | B2 |
9925011 | Gombert et al. | Mar 2018 | B2 |
9931025 | Graetzel et al. | Apr 2018 | B1 |
10034717 | Miller et al. | Jul 2018 | B2 |
20010036302 | Miller | Nov 2001 | A1 |
20020035321 | Bucholz et al. | Mar 2002 | A1 |
20040068172 | Nowinski et al. | Apr 2004 | A1 |
20040076259 | Jensen et al. | Apr 2004 | A1 |
20050096502 | Khalili | May 2005 | A1 |
20050143651 | Verard et al. | Jun 2005 | A1 |
20050171558 | Abovitz et al. | Aug 2005 | A1 |
20060100610 | Wallace et al. | May 2006 | A1 |
20060173329 | Marquart et al. | Aug 2006 | A1 |
20060184396 | Dennis et al. | Aug 2006 | A1 |
20060241416 | Marquart et al. | Oct 2006 | A1 |
20060291612 | Nishide et al. | Dec 2006 | A1 |
20070015987 | Benlloch Baviera et al. | Jan 2007 | A1 |
20070021738 | Hasser et al. | Jan 2007 | A1 |
20070038059 | Sheffer et al. | Feb 2007 | A1 |
20070073133 | Schoenefeld | Mar 2007 | A1 |
20070156121 | Millman et al. | Jul 2007 | A1 |
20070156157 | Nahum et al. | Jul 2007 | A1 |
20070167712 | Keglovich et al. | Jul 2007 | A1 |
20070233238 | Huynh et al. | Oct 2007 | A1 |
20080004523 | Jensen | Jan 2008 | A1 |
20080013809 | Zhu et al. | Jan 2008 | A1 |
20080033283 | Dellaca et al. | Feb 2008 | A1 |
20080046122 | Manzo et al. | Feb 2008 | A1 |
20080082109 | Moll et al. | Apr 2008 | A1 |
20080108912 | Node-Langlois | May 2008 | A1 |
20080108991 | Von Jako | May 2008 | A1 |
20080109012 | Falco et al. | May 2008 | A1 |
20080144906 | Allred et al. | Jun 2008 | A1 |
20080161680 | Von Jako et al. | Jul 2008 | A1 |
20080161682 | Kendrick et al. | Jul 2008 | A1 |
20080177203 | von Jako | Jul 2008 | A1 |
20080214922 | Hartmann et al. | Sep 2008 | A1 |
20080228068 | Viswanathan et al. | Sep 2008 | A1 |
20080228196 | Wang et al. | Sep 2008 | A1 |
20080235052 | Node-Langlois et al. | Sep 2008 | A1 |
20080269596 | Revie et al. | Oct 2008 | A1 |
20080287771 | Anderson | Nov 2008 | A1 |
20080287781 | Revie et al. | Nov 2008 | A1 |
20080300477 | Lloyd et al. | Dec 2008 | A1 |
20080300478 | Zuhars et al. | Dec 2008 | A1 |
20080302950 | Park et al. | Dec 2008 | A1 |
20080306490 | Lakin et al. | Dec 2008 | A1 |
20080319311 | Hamadeh | Dec 2008 | A1 |
20090012509 | Csavoy et al. | Jan 2009 | A1 |
20090030428 | Omori et al. | Jan 2009 | A1 |
20090080737 | Battle et al. | Mar 2009 | A1 |
20090185655 | Koken et al. | Jul 2009 | A1 |
20090198121 | Hoheisel | Aug 2009 | A1 |
20090216113 | Meier et al. | Aug 2009 | A1 |
20090228019 | Gross et al. | Sep 2009 | A1 |
20090259123 | Navab et al. | Oct 2009 | A1 |
20090259230 | Khadem et al. | Oct 2009 | A1 |
20090264899 | Appenrodt et al. | Oct 2009 | A1 |
20090281417 | Hartmann et al. | Nov 2009 | A1 |
20100022874 | Wang et al. | Jan 2010 | A1 |
20100039506 | Sarvestani et al. | Feb 2010 | A1 |
20100125286 | Wang et al. | May 2010 | A1 |
20100130986 | Mailloux et al. | May 2010 | A1 |
20100228117 | Hartmann | Sep 2010 | A1 |
20100228265 | Prisco | Sep 2010 | A1 |
20100249571 | Jensen et al. | Sep 2010 | A1 |
20100274120 | Heuscher | Oct 2010 | A1 |
20100280363 | Skarda et al. | Nov 2010 | A1 |
20100331858 | Simaan et al. | Dec 2010 | A1 |
20110022229 | Jang et al. | Jan 2011 | A1 |
20110077504 | Fischer et al. | Mar 2011 | A1 |
20110098553 | Robbins et al. | Apr 2011 | A1 |
20110137152 | Li | Jun 2011 | A1 |
20110213384 | Jeong | Sep 2011 | A1 |
20110224684 | Larkin et al. | Sep 2011 | A1 |
20110224685 | Larkin et al. | Sep 2011 | A1 |
20110224686 | Larkin et al. | Sep 2011 | A1 |
20110224687 | Larkin et al. | Sep 2011 | A1 |
20110224688 | Larkin et al. | Sep 2011 | A1 |
20110224689 | Larkin et al. | Sep 2011 | A1 |
20110224825 | Larkin et al. | Sep 2011 | A1 |
20110230967 | O'Halloran et al. | Sep 2011 | A1 |
20110238080 | Ranjit et al. | Sep 2011 | A1 |
20110276058 | Choi et al. | Nov 2011 | A1 |
20110282189 | Graumann | Nov 2011 | A1 |
20110286573 | Schretter et al. | Nov 2011 | A1 |
20110295062 | Solsona et al. | Dec 2011 | A1 |
20110295370 | Suh et al. | Dec 2011 | A1 |
20110306986 | Lee et al. | Dec 2011 | A1 |
20120035507 | George et al. | Feb 2012 | A1 |
20120046668 | Gantes | Feb 2012 | A1 |
20120051498 | Koishi | Mar 2012 | A1 |
20120053597 | Anvari et al. | Mar 2012 | A1 |
20120059248 | Holsing et al. | Mar 2012 | A1 |
20120071753 | Hunter et al. | Mar 2012 | A1 |
20120108954 | Schulhauser et al. | May 2012 | A1 |
20120136372 | Amat Girbau et al. | May 2012 | A1 |
20120143084 | Shoham | Jun 2012 | A1 |
20120184839 | Woerlein | Jul 2012 | A1 |
20120197182 | Millman et al. | Aug 2012 | A1 |
20120226145 | Chang et al. | Sep 2012 | A1 |
20120235909 | Birkenbach et al. | Sep 2012 | A1 |
20120245596 | Meenink | Sep 2012 | A1 |
20120253332 | Moll | Oct 2012 | A1 |
20120253360 | White et al. | Oct 2012 | A1 |
20120256092 | Zingerman | Oct 2012 | A1 |
20120294498 | Popovic | Nov 2012 | A1 |
20120296203 | Hartmann et al. | Nov 2012 | A1 |
20130006267 | Odermatt et al. | Jan 2013 | A1 |
20130016889 | Myronenko et al. | Jan 2013 | A1 |
20130030571 | Ruiz Morales et al. | Jan 2013 | A1 |
20130035583 | Park et al. | Feb 2013 | A1 |
20130060146 | Yang et al. | Mar 2013 | A1 |
20130060337 | Petersheim et al. | Mar 2013 | A1 |
20130094742 | Feilkas | Apr 2013 | A1 |
20130096574 | Kang et al. | Apr 2013 | A1 |
20130113791 | Isaacs et al. | May 2013 | A1 |
20130116706 | Lee et al. | May 2013 | A1 |
20130131695 | Scarfogliero et al. | May 2013 | A1 |
20130144307 | Jeong et al. | Jun 2013 | A1 |
20130158542 | Manzo et al. | Jun 2013 | A1 |
20130165937 | Patwardhan | Jun 2013 | A1 |
20130178867 | Farritor et al. | Jul 2013 | A1 |
20130178868 | Roh | Jul 2013 | A1 |
20130178870 | Schena | Jul 2013 | A1 |
20130204271 | Brisson et al. | Aug 2013 | A1 |
20130211419 | Jensen | Aug 2013 | A1 |
20130211420 | Jensen | Aug 2013 | A1 |
20130218142 | Tuma et al. | Aug 2013 | A1 |
20130223702 | Holsing et al. | Aug 2013 | A1 |
20130225942 | Holsing et al. | Aug 2013 | A1 |
20130225943 | Holsing et al. | Aug 2013 | A1 |
20130231556 | Holsing et al. | Sep 2013 | A1 |
20130237995 | Lee et al. | Sep 2013 | A1 |
20130245375 | DiMaio et al. | Sep 2013 | A1 |
20130261640 | Kim et al. | Oct 2013 | A1 |
20130272488 | Bailey et al. | Oct 2013 | A1 |
20130272489 | Dickman et al. | Oct 2013 | A1 |
20130274761 | Devengenzo et al. | Oct 2013 | A1 |
20130281821 | Liu et al. | Oct 2013 | A1 |
20130296884 | Taylor et al. | Nov 2013 | A1 |
20130303887 | Holsing et al. | Nov 2013 | A1 |
20130307955 | Deitz et al. | Nov 2013 | A1 |
20130317521 | Choi et al. | Nov 2013 | A1 |
20130325033 | Schena et al. | Dec 2013 | A1 |
20130325035 | Hauck et al. | Dec 2013 | A1 |
20130331686 | Freysinger et al. | Dec 2013 | A1 |
20130331858 | Devengenzo et al. | Dec 2013 | A1 |
20130331861 | Yoon | Dec 2013 | A1 |
20130342578 | Isaacs | Dec 2013 | A1 |
20130345717 | Markvicka et al. | Dec 2013 | A1 |
20130345757 | Stad | Dec 2013 | A1 |
20140001235 | Shelton, IV | Jan 2014 | A1 |
20140012131 | Heruth et al. | Jan 2014 | A1 |
20140031664 | Kang et al. | Jan 2014 | A1 |
20140046128 | Lee et al. | Feb 2014 | A1 |
20140046132 | Hoeg et al. | Feb 2014 | A1 |
20140046340 | Wilson et al. | Feb 2014 | A1 |
20140049629 | Siewerdsen et al. | Feb 2014 | A1 |
20140058406 | Tsekos | Feb 2014 | A1 |
20140073914 | Lavallee et al. | Mar 2014 | A1 |
20140080086 | Chen | Mar 2014 | A1 |
20140081128 | Verard et al. | Mar 2014 | A1 |
20140088612 | Bartol et al. | Mar 2014 | A1 |
20140094694 | Moctezuma de la Barrera | Apr 2014 | A1 |
20140094851 | Gordon | Apr 2014 | A1 |
20140096369 | Matsumoto et al. | Apr 2014 | A1 |
20140100587 | Farritor et al. | Apr 2014 | A1 |
20140121676 | Kostrzewski et al. | May 2014 | A1 |
20140128882 | Kwak et al. | May 2014 | A1 |
20140135796 | Simon et al. | May 2014 | A1 |
20140142591 | Alvarez et al. | May 2014 | A1 |
20140142592 | Moon et al. | May 2014 | A1 |
20140148692 | Hartmann et al. | May 2014 | A1 |
20140163581 | Devengenzo et al. | Jun 2014 | A1 |
20140171781 | Stiles | Jun 2014 | A1 |
20140171900 | Stiles | Jun 2014 | A1 |
20140171965 | Loh et al. | Jun 2014 | A1 |
20140180308 | von Grunberg | Jun 2014 | A1 |
20140180309 | Seeber et al. | Jun 2014 | A1 |
20140187915 | Yaroshenko et al. | Jul 2014 | A1 |
20140188132 | Kang | Jul 2014 | A1 |
20140194699 | Roh et al. | Jul 2014 | A1 |
20140130810 | Azizian et al. | Aug 2014 | A1 |
20140221819 | Sarment | Aug 2014 | A1 |
20140222023 | Kim et al. | Aug 2014 | A1 |
20140228631 | Kwak et al. | Aug 2014 | A1 |
20140234804 | Huang et al. | Aug 2014 | A1 |
20140257328 | Kim et al. | Sep 2014 | A1 |
20140257329 | Jang et al. | Sep 2014 | A1 |
20140257330 | Choi et al. | Sep 2014 | A1 |
20140275760 | Lee et al. | Sep 2014 | A1 |
20140275985 | Walker et al. | Sep 2014 | A1 |
20140276931 | Parihar et al. | Sep 2014 | A1 |
20140276940 | Seo | Sep 2014 | A1 |
20140276944 | Farritor et al. | Sep 2014 | A1 |
20140288413 | Hwang et al. | Sep 2014 | A1 |
20140299648 | Shelton, IV et al. | Oct 2014 | A1 |
20140303434 | Farritor et al. | Oct 2014 | A1 |
20140303643 | Ha et al. | Oct 2014 | A1 |
20140305995 | Shelton, IV et al. | Oct 2014 | A1 |
20140309659 | Roh et al. | Oct 2014 | A1 |
20140316436 | Bar et al. | Oct 2014 | A1 |
20140323803 | Hoffman et al. | Oct 2014 | A1 |
20140324070 | Min et al. | Oct 2014 | A1 |
20140330288 | Date et al. | Nov 2014 | A1 |
20140364720 | Darrow et al. | Dec 2014 | A1 |
20140371577 | Maillet et al. | Dec 2014 | A1 |
20150039034 | Frankel et al. | Feb 2015 | A1 |
20150085970 | Bouhnik et al. | Mar 2015 | A1 |
20150146847 | Liu | May 2015 | A1 |
20150150524 | Yorkston et al. | Jun 2015 | A1 |
20150196261 | Funk | Jul 2015 | A1 |
20150213633 | Chang et al. | Jul 2015 | A1 |
20150335480 | Alvarez et al. | Nov 2015 | A1 |
20150342647 | Frankel et al. | Dec 2015 | A1 |
20160005194 | Schretter et al. | Jan 2016 | A1 |
20160166329 | Langan et al. | Jun 2016 | A1 |
20160235480 | Scholl et al. | Aug 2016 | A1 |
20160249990 | Glozman et al. | Sep 2016 | A1 |
20160278754 | Todorov | Sep 2016 | A1 |
20160302871 | Gregerson et al. | Oct 2016 | A1 |
20160320322 | Suzuki | Nov 2016 | A1 |
20160331335 | Gregerson et al. | Nov 2016 | A1 |
20170135770 | Scholl et al. | May 2017 | A1 |
20170143284 | Sehnert et al. | May 2017 | A1 |
20170143426 | Isaacs et al. | May 2017 | A1 |
20170156816 | Ibrahim | Jun 2017 | A1 |
20170202629 | Maillet et al. | Jul 2017 | A1 |
20170212723 | Atarot et al. | Jul 2017 | A1 |
20170215825 | Johnson et al. | Aug 2017 | A1 |
20170215826 | Johnson et al. | Aug 2017 | A1 |
20170215827 | Johnson et al. | Aug 2017 | A1 |
20170231710 | Scholl et al. | Aug 2017 | A1 |
20170258426 | Risher-Kelly et al. | Sep 2017 | A1 |
20170273748 | Hourtash et al. | Sep 2017 | A1 |
20170296277 | Hourtash et al. | Oct 2017 | A1 |
20170360493 | Zucher et al. | Dec 2017 | A1 |
20170360512 | Couture | Dec 2017 | A1 |
20180132949 | Merette | May 2018 | A1 |
20180368930 | Esterberg | Dec 2018 | A1 |
Entry |
---|
US 8,231,638 B2, 07/2012, Swarup et al. (withdrawn) |
Number | Date | Country |
---|---|---|
20220218431 A1 | Jul 2022 | US |
Number | Date | Country |
---|---|---|
63135068 | Jan 2021 | US |