The present disclosure relates generally to surgical systems for orthopedic surgeries, for example surgical systems that facilitate joint replacement procedures. Joint replacement procedures (arthroplasty procedures) are widely used to treat osteoarthritis and other damage to a patient's joint by replacing portions of the joint with prosthetic components. Joint replacement procedures can include procedures to replace hips, knees, shoulders, or other joints with one or more prosthetic components.
One possible tool for use in an arthroplasty procedure is a robotically-assisted surgical system. A robotically-assisted surgical system typically includes a robotic device that is used to prepare a patient's anatomy to receive an implant, a tracking system configured to monitor the location of the robotic device relative to the patient's anatomy, and a computing system configured to monitor and control the robotic device. Robotically-assisted surgical systems, in various forms, autonomously carry out surgical tasks, provide force feedback to a user manipulating a surgical device to complete surgical tasks, augment surgeon dexterity and precision, and/or provide other navigational cues to facilitate safe and accurate surgical operations.
A surgical plan is typically established prior to performing a surgical procedure with a robotically-assisted surgical system. Based on the surgical plan, the surgical system guides, controls, or limits movements of the surgical tool during portions of the surgical procedure. Guidance and/or control of the surgical tool serves to assist the surgeon during implementation of the surgical plan. Various features enabling improved planning, improved intra-operative assessments of the patient's biomechanics, intraoperative plan adjustments, etc. for use with robotically-assisted surgical systems or other computer-assisted surgical systems may be advantageous.
One implementation of the present disclosure is a surgical system. The surgical system includes a robotic arm extending from a base, a tracking system configured to track at least one of a first marker attached to a distal end of the robotic arm and a second marker attached to the base, and a controller. The controller is configured to obtain an indication that the base is in position for performing a surgical operation, determine a starting pose for a registration routine for the robotic arm, control the robotic arm to automatically move to the starting pose for the registration routine, and in response to successful automatic movement to the starting pose for the registration routine, perform the registration or calibration routine for the robotic arm.
Another implementation of the present disclosure is a method of controlling a robotic arm mounted on a base. The method includes guiding the base to a position relative to a tracking system and determining a starting pose for a registration or calibration routine for the robotic arm. The starting pose corresponds to an expected position of a surgical field relative to the base. The method includes controlling the robotic arm to automatically move to the starting pose, and, in response to successful automatic movement to the starting pose for the registration or calibration routine, providing the registration or calibration routine for the robotic arm.
Another implementation of the present disclosure is one or more non-transitory computer-readable media storing program instructions that, when executed by one or more processors, cause the one or more processors to perform operations. The operations include obtaining an indication that a base of a robotic device is positioned relative to a tracking system and determining a starting pose for a registration or calibration routine for a robotic arm extending from the base. The starting pose corresponds to an expected position of a surgical field relative to the base. The operations also include controlling the robotic arm to automatically move the robotic arm to the starting pose and, in response to successful automatic movement to the starting pose for the registration or calibration routine, providing the registration or calibration routine for the robotic arm.
Presently preferred embodiments of the invention are illustrated in the drawings. An effort has been made to use the same or like reference numbers throughout the drawings to refer to the same or like parts. Although this specification refers primarily to a robotic arm for orthopedic joint replacement, it should be understood that the subject matter described herein is applicable to other types of robotic systems, including those used for non-surgical applications, as well as for procedures directed to other anatomical regions, for example spinal or dental procedures.
Referring now to
As shown in
A tibia may also be modified during a joint replacement procedure. For example, a planar surface may be created on the tibia at the knee joint to prepare the tibia to mate with a tibial implant component. In some embodiments, one or more pilot holes or other recess (e.g., fin-shaped recess) may also be created in the tibia to facilitate secure coupling of an implant component to the bone.
In some embodiments, the systems and methods described herein provide robotic assistance for creating the planar surfaces 102-110 and the pilot holes 120 at the femur, and/or a planar surface and/or pilot holes 120 or other recess on a tibia. It should be understood that the creation of five planar cuts and two cylindrical pilot holes as shown in
The positions and orientations of the planar surfaces 102-110, pilot holes 120, and any other surfaces or recesses created on bones of the knee joint can affect how well implant components mate to the bone as well as the resulting biomechanics for the patient after completion of the surgery. Tension on soft tissue can also be affected. Accordingly, systems and methods for planning the cuts which create these surfaces, facilitating intra-operative adjustments to the surgical plan, and providing robotic-assistance or other guidance for facilitating accurate creation of the planar surfaces 102-110, other surfaces, pilot holes 120, or other recesses can make surgical procedures easier and more efficient for healthcare providers and improve surgical outcomes.
Referring now to
The robotic device 220 is configured to modify a patient's anatomy (e.g., femur 206 of patient 204) under the control of the computing system 224. One embodiment of the robotic device 220 is a haptic device. “Haptic” refers to a sense of touch, and the field of haptics relates to, among other things, human interactive devices that provide feedback to an operator. Feedback may include tactile sensations such as, for example, vibration. Feedback may also include providing force to a user, such as a positive force or a resistance to movement. One use of haptics is to provide a user of the device with guidance or limits for manipulation of that device. For example, a haptic device may be coupled to a surgical tool, which can be manipulated by a surgeon to perform a surgical procedure. The surgeon's manipulation of the surgical tool can be guided or limited through the use of haptics to provide feedback to the surgeon during manipulation of the surgical tool.
Another embodiment of the robotic device 220 is an autonomous or semi-autonomous robot. “Autonomous” refers to a robotic device's ability to act independently or semi-independently of human control by gathering information about its situation, determining a course of action, and automatically carrying out that course of action. For example, in such an embodiment, the robotic device 220, in communication with the tracking system 222 and the computing system 224, may autonomously complete the series of femoral cuts mentioned above without direct human intervention.
The robotic device 220 includes a base 230, a robotic arm 232, and a surgical tool 234, and is communicably coupled to the computing system 224 and the tracking system 222. The base 230 provides a moveable foundation for the robotic arm 232, allowing the robotic arm 232 and the surgical tool 234 to be repositioned as needed relative to the patient 204 and the table 205. The base 230 may also contain power systems, computing elements, motors, and other electronic or mechanical systems necessary for the functions of the robotic arm 232 and the surgical tool 234 described below.
The robotic arm 232 is configured to support the surgical tool 234 and provide a force as instructed by the computing system 224. In some embodiments, the robotic arm 232 allows a user to manipulate the surgical tool and provides force feedback to the user. In such an embodiment, the robotic arm 232 includes joints 236 and mount 238 that include motors, actuators, or other mechanisms configured to allow a user to freely translate and rotate the robotic arm 232 and surgical tool 234 through allowable poses while providing force feedback to constrain or prevent some movements of the robotic arm 232 and surgical tool 234 as instructed by computing system 224. As described in detail below, the robotic arm 232 thereby allows a surgeon to have full control over the surgical tool 234 within a control object while providing force feedback along a boundary of that object (e.g., a vibration, a force preventing or resisting penetration of the boundary). In some embodiments, the robotic arm is configured to move the surgical tool to a new pose automatically without direct user manipulation, as instructed by computing system 224, in order to position the robotic arm as needed and/or complete certain surgical tasks, including, for example, cuts in a femur 206.
The surgical tool 234 is configured to cut, burr, grind, drill, partially resect, reshape, and/or otherwise modify a bone. The surgical tool 234 may be any suitable tool, and may be one of multiple tools interchangeably connectable to robotic device 220. For example, as shown in
Tracking system 222 is configured to track the patient's anatomy (e.g., femur 206 and tibia 208) and the robotic device 220 (i.e., surgical tool 234 and/or robotic arm 232) to enable control of the surgical tool 234 coupled to the robotic arm 232, to determine a position and orientation of modifications or other results made by the surgical tool 234, and to allow a user to visualize the bones (e.g., femur 206, the tibia 208, pelvis, humerus, scapula, etc. as applicable in various procedures), the surgical tool 234, and/or the robotic arm 232 on a display of the computing system 224. The tracking system 222 can also be used to collect biomechanical measurements relating to the patient's anatomy, assess joint gap distances, identify a hip center point, assess native or corrected joint deformities, or otherwise collect information relating to the relative poses of anatomical features. More particularly, the tracking system 222 determines a position and orientation (i.e., pose) of objects (e.g., surgical tool 234, femur 206) with respect to a coordinate frame of reference and tracks (i.e., continuously determines) the pose of the objects during a surgical procedure. According to various embodiments, the tracking system 222 may be any type of navigation system, including a non-mechanical tracking system (e.g., an optical tracking system), a mechanical tracking system (e.g., tracking based on measuring the relative angles of joints 236 of the robotic arm 232), or any combination of non-mechanical and mechanical tracking systems.
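For purposes of illustration only, the following sketch shows one way the pose tracking just described could be represented in software: each tracked object's pose is treated as a 4x4 homogeneous transform in the tracking system's coordinate frame, and the pose of the surgical tool relative to the femur is obtained by composing transforms. The numeric values and function names are hypothetical assumptions, not taken from the surgical system 200.

```python
import numpy as np

def make_pose(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

# Hypothetical poses reported by the tracking system in its own (camera) coordinate frame.
T_camera_femur = make_pose(np.eye(3), np.array([0.10, 0.05, 1.20]))  # femur fiducial array
T_camera_tool = make_pose(np.eye(3), np.array([0.12, 0.07, 1.18]))   # end effector array

# Express the tool pose relative to the femur so that bone motion can be compensated.
T_femur_tool = np.linalg.inv(T_camera_femur) @ T_camera_tool
print("Tool position in femur frame (m):", T_femur_tool[:3, 3])
```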
In the embodiment shown in
Using the tracking system 222 of
The computing system 224 is configured to create a surgical plan and to control the robotic device 220 in accordance with the surgical plan to make one or more bone modifications and/or facilitate implantation of one or more prosthetic components. Accordingly, the computing system 224 is communicably coupled to the tracking system 222 and the robotic device 220 to facilitate electronic communication between the robotic device 220, the tracking system 222, and the computing system 224. Further, the computing system 224 may be connected to a network to receive information related to a patient's medical history or other patient profile information, medical imaging, surgical plans, surgical procedures, and to perform various functions related to performance of surgical procedures, for example by accessing an electronic health records system. Computing system 224 includes processing circuit 260 and input/output device 262.
The input/output device 262 is configured to receive user input and display output as needed for the functions and processes described herein. As shown in
The processing circuit 260 includes a processor and memory device. The processor can be implemented as a general purpose processor, an application specific integrated circuit (ASIC), one or more field programmable gate arrays (FPGAs), a group of processing components, or other suitable electronic processing components. The memory device (e.g., memory, memory unit, storage device, etc.) is one or more devices (e.g., RAM, ROM, Flash memory, hard disk storage, etc.) for storing data and/or computer code for completing or facilitating the various processes and functions described in the present application. The memory device may be or include volatile memory or non-volatile memory. The memory device may include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present application. According to an exemplary embodiment, the memory device is communicably connected to the processor via the processing circuit 260 and includes computer code for executing (e.g., by the processing circuit 260 and/or processor) one or more processes described herein.
More particularly, processing circuit 260 is configured to facilitate the creation of a preoperative surgical plan prior to the surgical procedure. According to some embodiments, the preoperative surgical plan is developed utilizing a three-dimensional representation of a patient's anatomy, also referred to herein as a “virtual bone model.” A “virtual bone model” may include virtual representations of cartilage or other tissue in addition to bone. To obtain the virtual bone model, the processing circuit 260 receives imaging data of the patient's anatomy on which the surgical procedure is to be performed. The imaging data may be created using any suitable medical imaging technique to image the relevant anatomical feature, including computed tomography (CT), magnetic resonance imaging (MRI), and/or ultrasound. The imaging data is then segmented (i.e., the regions in the imaging corresponding to different anatomical features are distinguished) to obtain the virtual bone model. For example, MRI-based scan data of a joint can be segmented to distinguish bone from surrounding ligaments, cartilage, previously-implanted prosthetic components, and other tissue to obtain a three-dimensional model of the imaged bone.
Alternatively, the virtual bone model may be obtained by selecting a three-dimensional model from a database or library of bone models. In one embodiment, the user may use input/output device 262 to select an appropriate model. In another embodiment, the processing circuit 260 may execute stored instructions to select an appropriate model based on images or other information provided about the patient. The selected bone model(s) from the database can then be deformed based on specific patient characteristics, creating a virtual bone model for use in surgical planning and implementation as described herein.
A preoperative surgical plan can then be created based on the virtual bone model. The surgical plan may be automatically generated by the processing circuit 260, input by a user via input/output device 262, or some combination of the two (e.g., the processing circuit 260 limits some features of user-created plans, generates a plan that a user can modify, etc.). In some embodiments, the surgical plan may be generated and/or modified based on distraction force measurements collected intraoperatively.
The preoperative surgical plan includes the desired cuts, holes, surfaces, burrs, or other modifications to a patient's anatomy to be made using the surgical system 200. For example, for a total knee arthroplasty procedure, the preoperative plan may include the cuts necessary to form, on a femur, a distal surface, a posterior chamfer surface, a posterior surface, an anterior surface, and an anterior chamfer surface in relative orientations and positions suitable to be mated to corresponding surfaces of the prosthetic to be joined to the femur during the surgical procedure, as well as cuts necessary to form, on the tibia, surface(s) suitable to mate to the prosthetic to be joined to the tibia during the surgical procedure. As another example, the preoperative plan may include the modifications necessary to create holes (e.g., pilot holes 120) in a bone. As another example, in a hip arthroplasty procedure, the surgical plan may include the burr necessary to form one or more surfaces on the acetabular region of the pelvis to receive a cup and, in suitable cases, an implant augment. Accordingly, the processing circuit 260 may receive, access, and/or store a model of the prosthetic to facilitate the generation of surgical plans. In some embodiments, the processing circuit 260 facilitates intraoperative modifications to the preoperative plan.
The processing circuit 260 is further configured to generate a control object for the robotic device 220 in accordance with the surgical plan. The control object may take various forms according to the various types of possible robotic devices (e.g., haptic, autonomous). For example, in some embodiments, the control object defines instructions for the robotic device to control the robotic device to move within the control object (i.e., to autonomously make one or more cuts of the surgical plan guided by feedback from the tracking system 222). In some embodiments, the control object includes a visualization of the surgical plan and the robotic device on the display 264 to facilitate surgical navigation and help guide a surgeon to follow the surgical plan (e.g., without active control or force feedback of the robotic device). In embodiments where the robotic device 220 is a haptic device, the control object may be a haptic object as described in the following paragraphs.
In an embodiment where the robotic device 220 is a haptic device, the processing circuit 260 is further configured to generate one or more haptic objects based on the preoperative surgical plan to assist the surgeon during implementation of the surgical plan by enabling constraint of the surgical tool 234 during the surgical procedure. A haptic object may be formed in one, two, or three dimensions. For example, a haptic object can be a line, a plane, or a three-dimensional volume. A haptic object may be curved with curved surfaces and/or have flat surfaces, and can be any shape, for example a funnel shape. Haptic objects can be created to represent a variety of desired outcomes for movement of the surgical tool 234 during the surgical procedure. One or more of the boundaries of a three-dimensional haptic object may represent one or more modifications, such as cuts, to be created on the surface of a bone. A planar haptic object may represent a modification, such as a cut, to be created on the surface of a bone. A curved haptic object may represent a resulting surface of a bone as modified to receive a cup implant and/or implant augment. A line haptic object may correspond to a pilot hole to be made in a bone to prepare the bone to receive a screw or other projection.
In an embodiment where the robotic device 220 is a haptic device, the processing circuit 260 is further configured to generate a virtual tool representation of the surgical tool 234. The virtual tool includes one or more haptic interaction points (HIPs), which represent and are associated with locations on the physical surgical tool 234. In an embodiment in which the surgical tool 234 is a spherical burr (e.g., as shown in
Prior to performance of the surgical procedure, the patient's anatomy (e.g., femur 206) is registered to the virtual bone model of the patient's anatomy by any known registration technique. One possible registration technique is point-based registration, as described in U.S. Pat. No. 8,010,180, titled “Haptic Guidance System and Method,” granted Aug. 30, 2011, and hereby incorporated by reference herein in its entirety. Alternatively, registration may be accomplished by 2D/3D registration utilizing a hand-held radiographic imaging device, as described in U.S. application Ser. No. 13/562,163, titled “Radiographic Imaging Device,” filed Jul. 30, 2012, and hereby incorporated by reference herein in its entirety. Registration also includes registration of the surgical tool 234 to a virtual tool representation of the surgical tool 234, so that the surgical system 200 can determine and monitor the pose of the surgical tool 234 relative to the patient (i.e., to femur 206). Registration allows for accurate navigation, control, and/or force feedback during the surgical procedure.
The processing circuit 260 is configured to monitor the virtual positions of the virtual tool representation, the virtual bone model, and the control object (e.g., virtual haptic objects) corresponding to the real-world positions of the patient's bone (e.g., femur 206), the surgical tool 234, and one or more lines, planes, or three-dimensional spaces defined by forces created by robotic device 220. For example, if the patient's anatomy moves during the surgical procedure as tracked by the tracking system 222, the processing circuit 260 correspondingly moves the virtual bone model. The virtual bone model therefore corresponds to, or is associated with, the patient's actual (i.e. physical) anatomy and the position and orientation of that anatomy in real/physical space. Similarly, any haptic objects, control objects, or other planned automated robotic device motions created during surgical planning that are linked to cuts, modifications, etc. to be made to that anatomy also move in correspondence with the patient's anatomy. In some embodiments, the surgical system 200 includes a clamp or brace to substantially immobilize the femur 206 to minimize the need to track and process motion of the femur 206.
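As a simplified illustration of how linked virtual objects can move in correspondence with the tracked anatomy, the sketch below re-anchors a control object whenever the tracked bone pose updates. The 4x4 transform convention, function name, and example values are assumptions for illustration, not the system's actual update logic.

```python
import numpy as np

def update_linked_object(T_cam_bone_old, T_cam_bone_new, T_cam_object_old):
    """Keep a haptic/control object fixed relative to the bone it is linked to.

    All arguments are 4x4 homogeneous transforms expressed in the tracking (camera) frame.
    """
    # Pose of the object in the bone's own frame; constant while the object is linked to the bone.
    T_bone_object = np.linalg.inv(T_cam_bone_old) @ T_cam_object_old
    # Re-express that fixed relationship using the newly tracked bone pose.
    return T_cam_bone_new @ T_bone_object

# Example: the bone translates 5 mm along x, so the linked object translates with it.
T_old = np.eye(4)
T_new = np.eye(4)
T_new[0, 3] = 0.005
print(update_linked_object(T_old, T_new, np.eye(4))[0, 3])  # -> 0.005
```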
For embodiments where the robotic device 220 is a haptic device, the surgical system 200 is configured to constrain the surgical tool 234 based on relationships between HIPs and haptic objects. That is, when the processing circuit 260 uses data supplied by tracking system 222 to detect that a user is manipulating the surgical tool 234 to bring a HIP in virtual contact with a haptic object, the processing circuit 260 generates a control signal to the robotic arm 232 to provide haptic feedback (e.g., a force, a vibration) to the user to communicate a constraint on the movement of the surgical tool 234. In general, the term “constrain,” as used herein, is used to describe a tendency to restrict movement. However, the form of constraint imposed on surgical tool 234 depends on the form of the relevant haptic object. A haptic object may be formed in any desirable shape or configuration. As noted above, three exemplary embodiments include a line, plane, or three-dimensional volume. In one embodiment, the surgical tool 234 is constrained because a HIP of surgical tool 234 is restricted to movement along a linear haptic object. In another embodiment, the haptic object is a three-dimensional volume and the surgical tool 234 may be constrained by substantially preventing movement of the HIP outside of the volume enclosed by the walls of the three-dimensional haptic object. In another embodiment, the surgical tool 234 is constrained because a planar haptic object substantially prevents movement of the HIP outside of the plane and outside of the boundaries of the planar haptic object. For example, the processing circuit 260 can establish a planar haptic object corresponding to a planned planar distal cut needed to create a distal surface on the femur 206 in order to confine the surgical tool 234 substantially to the plane needed to carry out the planned distal cut.
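The following sketch gives a minimal picture of a planar constraint of this kind: the HIP's excursion beyond the planned cut plane is measured and an opposing spring-like force is commanded. This is an illustrative model only; the stiffness value and force law are assumptions, not the system's actual haptic control law.

```python
import numpy as np

def planar_constraint_force(hip: np.ndarray, plane_point: np.ndarray,
                            plane_normal: np.ndarray, stiffness: float = 2000.0) -> np.ndarray:
    """Return an illustrative restoring force (N) pushing the HIP back toward the cut plane."""
    n = plane_normal / np.linalg.norm(plane_normal)
    signed_distance = float(np.dot(hip - plane_point, n))  # out-of-plane excursion (m)
    return -stiffness * signed_distance * n

# Example: HIP 2 mm above a distal cut plane through the origin with normal +z.
force = planar_constraint_force(np.array([0.0, 0.0, 0.002]),
                                np.array([0.0, 0.0, 0.0]),
                                np.array([0.0, 0.0, 1.0]))
print(force)  # -> approximately [0, 0, -4] N for the assumed stiffness
```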
For embodiments where the robotic device 220 is an autonomous device, the surgical system 200 is configured to autonomously move and operate the surgical tool 234 in accordance with the control object. For example, the control object may define areas relative to the femur 206 for which a cut should be made. In such a case, one or more motors, actuators, and/or other mechanisms of the robotic arm 232 and the surgical tool 234 are controllable to cause the surgical tool 234 to move and operate as necessary within the control object to make a planned cut, for example using tracking data from the tracking system 222 to allow for closed-loop control.
Referring now to
At step 302, a surgical plan is obtained. The surgical plan (e.g., a computer-readable data file) may define a desired outcome of bone modifications, for example defined based on a desired position of prosthetic components relative to the patient's anatomy. For example, in the case of a knee arthroplasty procedure, the surgical plan may provide planned positions and orientations of the planar surfaces 102-110 and the pilot holes 120 as shown in
At step 304, one or more control boundaries, such as haptic objects, are defined based on the surgical plan. The one or more haptic objects may be one-dimensional (e.g., a line haptic), two dimensional (i.e., planar), or three dimensional (e.g., cylindrical, funnel-shaped, curved, etc.). The haptic objects may represent planned bone modifications (e.g., a haptic object for each of the planar surfaces 102-110 and each of the pilot holes 120 shown in
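One possible (hypothetical) data representation of the haptic objects defined in step 304 is sketched below, with a line haptic for a pilot hole and a planar haptic for a planar resection. The field names and structure are illustrative assumptions rather than the system's actual data model.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class LineHaptic:
    """One-dimensional haptic object, e.g., the trajectory of a pilot hole."""
    entry: np.ndarray      # entry point on the bone surface
    direction: np.ndarray  # unit vector along the planned drilling axis
    depth_m: float         # planned depth of the hole

@dataclass
class PlaneHaptic:
    """Two-dimensional haptic object, e.g., a planar resection such as a distal femoral cut."""
    point: np.ndarray      # any point on the planned cut plane
    normal: np.ndarray     # unit normal of the planned cut plane
    extents_m: tuple       # (width, height) bounds of the allowed working region

pilot_hole = LineHaptic(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, -1.0]), 0.03)
distal_cut = PlaneHaptic(np.array([0.0, 0.0, 0.01]), np.array([0.0, 0.0, 1.0]), (0.08, 0.06))
```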
At step 306, a pose of a surgical tool is tracked relative to the haptic object(s), for example by the tracking system 222 described above. In some embodiments, one point on the surgical tool is tracked. In other embodiments (e.g., in the example of
At step 308, the surgical tool is guided to the haptic object(s). For example, the display 264 of the surgical system 200 may display a graphical user interface instructing a user on how (e.g., which direction) to move the surgical tool and/or robotic device to bring the surgical tool to a haptic object. As another example, the surgical tool may be guided to a haptic object using a collapsing haptic boundary as described in U.S. Pat. No. 9,289,264, the entire disclosure of which is incorporated by reference herein. As another example, the robotic device may be controlled to automatically move the surgical tool to a haptic object.
In an embodiment where the robotic device is controlled to automatically move the surgical tool to the haptic object (referred to as motorized alignment or automated alignment), the robotic device may be controlled so that a duration of the alignment is bounded by preset upper and lower time thresholds. That is, across various instances of process 300 and multiple procedures, automated alignment in step 308 may be configured to always take between a first amount of time (the lower time threshold) and a second amount of time (the upper time threshold). The lower time threshold may be selected such that the robotic device moves over a long enough duration to be perceived as well-controlled and to minimize collision or other risks associated with high speed. The upper time threshold may be selected such that the robotic device moves over a short enough duration to avoid user impatience and provide improved usability. For example, the upper time threshold may be approximately five seconds in an example where the lower time threshold is approximately three seconds. In other embodiments, a single duration setpoint is used (e.g., four seconds). Step 308 can include optimizing a path for the robotic device such that step 308 ensures successful alignment to the haptic object while also satisfying the upper and lower time thresholds or duration setpoint.
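As an arithmetic illustration of this time bounding, the sketch below derives a nominal duration from an assumed travel speed and clamps it between the example three- and five-second thresholds. The speed, threshold values, and function are placeholders, not parameters of the surgical system 200.

```python
def alignment_duration(distance_m: float,
                       nominal_speed_m_s: float = 0.05,
                       lower_s: float = 3.0,
                       upper_s: float = 5.0) -> float:
    """Clamp the automated-alignment duration between preset lower and upper time thresholds."""
    nominal = distance_m / nominal_speed_m_s
    return min(max(nominal, lower_s), upper_s)

# A 0.4 m approach at 0.05 m/s would nominally take 8 s but is capped at the 5 s upper bound;
# a 0.05 m approach (1 s nominal) is stretched to the 3 s lower bound.
print(alignment_duration(0.4), alignment_duration(0.05))  # -> 5.0 3.0
```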
At step 310, the robotic device is controlled to constrain movement of the surgical tool based on the tracked pose of the surgical tool and the poses of one or more haptic objects. The constraining of the surgical tool may be achieved as described above with reference to
At step 312, exit of the surgical tool from the haptic object(s) is facilitated, i.e., to release the constraints of a haptic object. For example, in some embodiments, the robotic device is controlled to allow the surgical tool to exit a haptic object along an axis of the haptic object. In some embodiments, the surgical tool may be allowed to exit the haptic object in a pre-determined direction relative to the haptic object. The surgical tool may thereby be removed from the surgical field and the haptic object to facilitate subsequent steps of the surgical procedure. Additionally, it should be understood that, in some cases, the process 300 may return to step 308 where the surgical tool is guided to the same or different haptic object after exiting a haptic object at step 312.
Process 300 may thereby be executed by the surgical system 200 to facilitate a surgical procedure. Features of process 300 are shown in
Referring now to
At step 402, segmented pre-operative images and other patient data are obtained, for example by the surgical system 200. For example, segmented pre-operative CT images or MRI images may be received at the computing system 224 from an external server. In some cases, pre-operative images of a patient's anatomy are collected using an imaging device and segmented by a separate computing system and/or with manual user input to facilitate segmentation. In other embodiments, unsegmented pre-operative images are received at the computing system 224 and the computing system 224 is configured to automatically segment the images. The segmented pre-operative images can show the geometry, shape, size, density, and/or other characteristics of bones of a joint which is to be operated on in a procedure performed using process 400.
Other patient data can also be obtained at step 402. For example, the computing system 224 may receive patient information from an electronic medical records system. As another example, the computing system 224 may accept user input of patient information. The other patient data may include a patient's name, identification number, biographical information (e.g., age, weight, etc.), other health conditions, etc. In some embodiments, the patient data obtained at step 402 includes information specific to the procedure to be performed and the relevant pre-operative diagnosis. For example, the patient data may indicate which joint the procedure will be performed on (e.g., right knee, left knee). The patient data may indicate a diagnosed deformity, for example indicating whether a knee joint was diagnosed as having a varus deformity or a valgus deformity. This or other data that may facilitate the surgical procedure may be obtained at step 402.
At step 404, a system setup, calibration, and registration workflow is provided, for example by the surgical system 200. The system setup, calibration, and registration workflows may be configured to prepare the surgical system 200 for use in facilitating a surgical procedure. For example, at step 404, the computer system 224 may operate to provide graphical user interfaces that include instructions for performing system setup, calibration, and registration steps. The computer system 224 may also cause the tracking system 222 to collect tracking data and control the robotic device 220 to facilitate system setup, calibration, and/or registration. The computer system 224 may also receive tracking data from the tracking system 222 and information from the robotic device 220 and use the received information and data to calibrate the robotic device 220 and define various geometric relationships between tracked points (e.g., fiducials, markers), other components of the surgical system 200 (e.g., robotic arm 232, surgical tool 234, probe), and virtual representations of anatomical features (e.g., virtual bone models).
The system setup workflow provided at step 404 may include guiding the robotic device 220 to a position relative to a surgical table and the patient which will be suitable for completing an entire surgical procedure without repositioning the robotic device 220. For example, the computer system 224 may generate and provide a graphical user interface configured to provide instructions for moving a portable cart of the robotic device 220 into a preferred position. In some embodiments, the robotic device 220 can be tracked to determine whether the robotic device 220 is properly positioned. Once the cart is positioned, in some embodiments the robotic device 220 is controlled to automatically position the robotic arm 232 in a pose suitable for initiation of calibration and/or registration workflows.
The calibration and registration workflows provided at step 404 may include generating instructions for a user to perform various calibration and registration tasks while operating the tracking system 222 to generate tracking data. The tracking data can then be used to calibrate the tracking system 222 and the robotic device 220 and to register the first fiducial tree 240, second fiducial tree 241, and third fiducial tree 242 relative to the patient's anatomical features, for example by defining geometric relationships between the fiducial trees 240-242 and relevant bones of the patient in the example of
In some embodiments, providing the registration workflow includes generating instructions to move the patient's leg to facilitate collection of relevant tracking data that can be used to identify the location of a biomechanical feature, for example a hip center point. Providing the registration workflow can include providing audio or visual feedback indicating whether the leg was moved in the proper manner to collect sufficient tracking data. Various methods and approaches for registration and calibration can be used in various embodiments. Step 404 may include steps performed before or after an initial surgical incision is made in the patient's skin to initiate the surgical procedure.
At step 406, an initial assessment workflow is provided, for example by the surgical system 200. The initial assessment workflow provides an initial assessment of the joint to be operated upon based on tracked poses of the bones of the joint. For example, the initial assessment workflow may include tracking relative positions of a tibia and a femur using data from the tracking system while providing real-time visualizations of the tibia and femur via a graphical user interface. The computing system 224 may provide instructions via the graphical user interface to move the tibia and femur to different relative positions (e.g., different degrees of flexion) and to exert different forces on the joint (e.g., a varus or valgus force). In some embodiments, the initial assessment workflow includes determining, by the surgical system 200 and based on data from the tracking system 222, whether the patient's joint has a varus or valgus deformity, and, in some embodiments, determining a magnitude of the deformity. In some embodiments, the initial assessment workflow may include collecting data relating to native ligament tension or native gaps between bones of the joint. In some embodiments, the initial assessment workflow may include displaying instructions to exert a force on the patient's leg to place the joint in a corrected state corresponding to a desired outcome for a joint arthroplasty procedure, and recording the relative poses of the bones and other relevant measurements while the joint is in the corrected state. The initial assessment workflow thereby results in collection of data that may be useful for the surgical system 200 or a surgeon in later steps of process 400.
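For illustration, a coronal-plane alignment angle of the kind assessed here could be estimated from tracked mechanical-axis directions as sketched below. The axis definitions, sign convention, and function are assumptions rather than the system's actual method.

```python
import numpy as np

def coronal_alignment_deg(femur_axis: np.ndarray, tibia_axis: np.ndarray,
                          coronal_normal: np.ndarray) -> float:
    """Signed angle (degrees) between femoral and tibial mechanical axes in the coronal plane."""
    n = coronal_normal / np.linalg.norm(coronal_normal)

    def project(v):
        v = v - np.dot(v, n) * n  # remove the out-of-plane component
        return v / np.linalg.norm(v)

    f, t = project(femur_axis), project(tibia_axis)
    angle = np.degrees(np.arccos(np.clip(np.dot(f, t), -1.0, 1.0)))
    sign = np.sign(np.dot(np.cross(f, t), n))  # sign distinguishes varus from valgus (assumed convention)
    return float(sign * angle)

# Example: tibial axis tilted 5 degrees from the femoral axis within the coronal plane.
femur = np.array([0.0, 0.0, 1.0])
tibia = np.array([np.sin(np.radians(5.0)), 0.0, np.cos(np.radians(5.0))])
print(round(coronal_alignment_deg(femur, tibia, np.array([0.0, 1.0, 0.0])), 1))  # -> 5.0
```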
At step 408, an implant planning workflow is provided, for example by the surgical system 200. The implant planning workflow is configured to facilitate users in planning implant placement relative to the patient's bones and/or planning bone cuts or other modifications for preparing bones to receive implant components. Step 408 may include generating, for example by the computing system 224, three-dimensional computer models of the bones of the joint (e.g., a tibia model and a femur model) based on the segmented medical images received at step 402. Step 408 may also include obtaining three-dimensional computer models of prosthetic components to be implanted at the joint (e.g., a tibial implant model and a femoral implant model). A graphical user interface can be generated showing multiple views of the three-dimensional bone models with the three-dimensional implant models shown in planned positions relative to the three-dimensional bone models. Providing the implant planning workflow can include enabling the user to adjust the position and orientation of the implant models relative to the bone models. Planned cuts for preparing the bones to allow the implants to be implanted at the planned positions can then be automatically generated based on the positioning of the implant models relative to the bone models.
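One way planned cuts could follow automatically from implant placement is sketched below: a cut plane defined in the implant model's own coordinate frame is carried into the bone model's frame by the planned implant pose. The transform convention, function name, and example numbers are assumptions for illustration.

```python
import numpy as np

def planned_cut_plane(T_bone_implant: np.ndarray,
                      plane_point_implant: np.ndarray,
                      plane_normal_implant: np.ndarray):
    """Express an implant-fixed cut plane in the bone coordinate frame.

    T_bone_implant is the 4x4 planned pose of the implant model relative to the bone model.
    """
    R, p = T_bone_implant[:3, :3], T_bone_implant[:3, 3]
    point_bone = R @ plane_point_implant + p
    normal_bone = R @ plane_normal_implant
    return point_bone, normal_bone / np.linalg.norm(normal_bone)

# Example: implant shifted 9 mm along the bone's z-axis; its distal cut plane shifts with it.
T = np.eye(4)
T[2, 3] = 0.009
print(planned_cut_plane(T, np.zeros(3), np.array([0.0, 0.0, 1.0])))
```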
The graphical user interface can include data and measurements from pre-operative patient data (e.g., from step 402) and from the initial assessment workflow (step 406) and/or related measurements that would result from the planned implant placement. The planned measurements (e.g., planned gaps, planned varus/valgus angles, etc.) can be calculated based in part on data collected via the tracking system 222 in other phases of process 400, for example from initial assessment in step 406 or trialing or tensioning workflows described below with reference to step 412.
The implant planning workflow may also include providing warnings (alerts, notifications) to users when an implant plan violates various criteria. In some cases, the criteria can be predefined, for example related to regulatory or system requirements that are constant for all surgeons and/or for all patients. In other embodiments, the criteria may be related to surgeon preferences, such that the criteria for triggering a warning can be different for different surgeons. In some cases, the computing system 224 can prevent the process 400 from moving out of the implant planning workflow when one or more of certain criteria are not met.
The implant planning workflow provided at step 408 thereby results in planned cuts for preparing a joint to receive prosthetic implant components. In some embodiments, the planned cuts include a planar tibial cut and multiple planar femoral cuts, for example as described above with reference to
At step 410, a bone preparation workflow is provided, for example by the surgical system 200. The bone preparation workflow includes guiding execution of one or more cuts or other bone modifications based on the surgical plan created at step 408. For example, as explained in detail above with reference to
The bone preparation workflow at step 410 can also include displaying graphical user interface elements configured to guide a surgeon in completing one or more planned cuts. For example, the bone preparation workflow can include tracking the position of a surgical tool relative to a plane or other geometry associated with a planned cut and relative to the bone to be cut. In this example, the bone preparation workflow can include displaying, in real-time, the relative positions of the surgical tool, cut plane or other geometry, and bone model. In some embodiments, visual, audio, or haptic warnings can be provided to indicate completion or start of an event or step of the procedure, entry or exit from a state or virtual object, interruptions to performance of the planned cut, deviation from the planned cut, or violation of other criteria relating to the bone preparation workflow.
In some embodiments, step 410 is provided until all bone cuts planned at step 408 are complete and the bones are ready to be coupled to the implant components. In other embodiments, for example as shown in
Following an iteration of the bone preparation workflow at step 410, the process 400 can proceed to step 412. At step 412 a mid-resection tensioning workflow or a trialing workflow is provided, for example by the surgical system 200. The mid-resection tensioning workflow is provided when less than all of the bone resection has been completed. The trialing workflow is provided when all resections have been made and/or bones are otherwise prepared to be temporarily coupled to trial implants. The mid-resection tensioning workflow and the trialing workflow at step 412 provide for collection of intraoperative data relating to relative positions of bones of the joint using the tracking system 222 including performing gap measurements or other tensioning procedures that can facilitate soft tissue balancing and/or adjustments to the surgical plan.
For example, step 412 may include displaying instructions to a user to move the joint through a range of motion, for example from flexion to extension, while the tracking system 222 tracks the bones. In some embodiments, gap distances between bones are determined from data collected by the tracking system 222 as a surgeon places the joint in both flexion and extension. In some embodiments, soft tissue tension or distraction forces are measured. Because one or more bone resections have been made before step 412 and soft tissue has been affected by the procedure, the mechanics of the joint may be different than during the initial assessment workflow of step 406 and relative to when the pre-operative imaging was performed. Accordingly, providing for intra-operative measurements in step 412 can provide information to a surgeon and to the surgical system 200 that was not available pre-operatively and which can be used to help fine-tune the surgical plan.
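A simplified picture of one such intraoperative gap measurement: take a tracked femoral condyle reference point and the tracked tibial resection plane, both expressed in a common frame after registration, and compute the point-to-plane distance. The geometry, units, and values below are illustrative assumptions only.

```python
import numpy as np

def joint_gap_mm(condyle_point: np.ndarray,
                 tibial_plane_point: np.ndarray,
                 tibial_plane_normal: np.ndarray) -> float:
    """Distance (mm) from a tracked femoral condyle point to the tibial resection plane."""
    n = tibial_plane_normal / np.linalg.norm(tibial_plane_normal)
    return 1000.0 * float(np.dot(condyle_point - tibial_plane_point, n))

# Example with a 9 mm gap; the measurement would be repeated in flexion and extension
# as the surgeon moves the leg through its range of motion.
print(joint_gap_mm(np.array([0.0, 0.0, 0.009]), np.zeros(3), np.array([0.0, 0.0, 1.0])))  # -> 9.0
```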
From step 412, the process 400 returns to step 408 to provide the implant planning workflow again, now augmented with data collected during a mid-resection or trialing workflow at step 412. For example, planned gaps between implants can be calculated based on the intraoperative measurements collected at step 412, the planned position of a tibial implant relative to a tibia, and the planned position of a femoral implant relative to a femur. The planned gap values can then be displayed in an implant planning interface during step 408 to allow a surgeon to adjust the planned implant positions based on the calculated gap values. In various embodiments, a second iteration of step 408 to provide the implant planning workflow incorporates various data from step 412 in order to facilitate a surgeon in modifying and fine-tuning the surgical plan intraoperatively.
Steps 408, 410, and 412 can be performed multiple times to provide for intra-operative updates to the surgical plan based on intraoperative measurements collected between bone resections. For example, in some cases, a first iteration of steps 408, 410, and 412 includes planning a tibial cut in step 408, executing the planned tibial cut in step 410, and providing a mid-resection tensioning workflow in step 412. In this example, a second iteration of steps 408, 410, and 412 can include planning femoral cuts using data collected in the mid-resection tensioning workflow in step 408, executing the femoral cuts in step 410, and providing a trialing workflow in step 412. Providing the trialing workflow can include displaying instructions relating to placing trial implants on the prepared bone surfaces, and, in some embodiments, verifying that the trial implants are positioned in planned positions using the tracking system 222. Tracking data can be collected in a trialing workflow in step 412 relating to whether the trial implants are placed in acceptable positions or whether further adjustments to the surgical plan are needed by cycling back to step 408 and making further bone modifications in another iteration of step 410.
In some embodiments, executing process 400 can include providing users with options to jump between steps of the process 400 to enter a desired workflow. For example, a user can be allowed to switch between implant planning and bone preparation on demand. In other embodiments, executing process 400 can include ensuring that a particular sequence of steps of process 400 are followed. In various embodiments, any number of iterations of the various steps can be performed until a surgeon is satisfied that the bones have been properly prepared to receive implant components in clinically-appropriate positions.
As shown in
Referring generally to
Motorized movement of the robotic arm as described below may address various challenges relating to system setup, calibration, and registration under step 404. In the example of
One aspect of this challenge is in finding a proper starting pose of the robotic arm for performing a registration or calibration routine for the robotic arm. For various reasons, for example relating to tracking accuracy, it may be desirable to perform a registration or calibration routine with a distal end of the robotic arm 232 as close as possible to where it will be during use of the robotic arm 232 during the surgical procedure. However, that location may not be readily apparent to a user, especially new users, as it should be identified at an early stage of an operation before other surgical tasks have been initiated. Additionally, a starting pose for a registration or calibration routine preferably corresponds to starting joint angles of the robotic arm 232 which will provide sufficient range of motion and degrees of freedom for the robotic arm 232 in order to complete both the registration or calibration routine and the steps of the surgical procedure. Such constraints may not be clear ahead of time to the user of a surgical system 200. Furthermore, the registration or calibration routine may require a clear line-of-sight between a trackable array coupled to a distal end of the robotic device and the detectors 246 of the tracking system throughout the registration or calibration routine, providing another constraint on selecting a proper placement of the starting pose of the robotic arm for a surgical procedure which may not be readily apparent to a user. Also, because computer-assisted navigation techniques available in later phases of a surgical operation rely upon completion of registration, such techniques are not available to assist in putting the robotic arm in a proper pose for starting a registration or calibration routine. Accordingly, a threshold challenge of providing the robotic device in a proper starting pose should be solved in order to provide a reliable, highly-accurate, user-friendly registration or calibration routine.
Referring now to
At step 502, the detector 246 of the tracking system 222 is positioned, for example in an operating room. That is, the detector 246 is set (parked, locked, braked, fixed, etc.) in the position where it will preferably stay for a duration of the surgical procedure. In some embodiments, the surgical system 200 is configured to provide, via a display screen 264, instructions for positioning the detector 246 of the tracking system 222.
At step 504, the robotic device 220 and the tracking system 222 are positioned and parked relative to one another, for example such that the robotic device 220 and the detector 246 of the tracking system 222 are separated by less than or equal to a preset distance. For example, the mobile base 230 can be rolled, steered, etc. into a desired position relative to the tracking system 222 and relative to other structures in the operating room (e.g., relative to a table/bed on which a patient can be positioned during a surgical procedure). As another example, the mobile base 230 could be parked first and the tracking system 222 (e.g., the detector 246 of the tracking system 222) can be moved toward the mobile base 230. In some cases, the mobile base 230 is positioned such that the patient will be located between the mobile base 230 and the detector 246 of the tracking system 222. In some embodiments, the surgical system 200 is configured to provide, via a display screen 264, instructions for positioning the base 230 relative to the detector 246 of the tracking system 222. In some cases, the tracking system 222 is used to provide live updates of the position of the base 230 relative to a target parking position displayed on the display screen 264. Accordingly, the base 230 can be guided to a parking position relative to other components used in the operating room.
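A minimal check of the base-to-detector separation described above might look like the sketch below; the threshold value, tracked positions, and function name are placeholders rather than system specifications.

```python
import numpy as np

def base_parked_ok(base_position: np.ndarray, detector_position: np.ndarray,
                   max_separation_m: float = 2.5) -> bool:
    """Return True when the robot base and tracking detector are within the preset distance."""
    return float(np.linalg.norm(base_position - detector_position)) <= max_separation_m

# Example with an assumed 2.5 m limit and a 1.9 m measured separation.
print(base_parked_ok(np.array([0.0, 0.0, 0.0]), np.array([1.8, 0.5, 0.3])))  # -> True
```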
At step 506, a trackable array (fiducial tree, end effector array) is coupled to a distal end of the robotic arm 232. For example, a surgical tool 234 may be attached to the distal end of the robotic arm 232 and the trackable array can be attached to the surgical tool 234 so as to be coupled to the distal end of the robotic arm 232. As mentioned above with reference to the example of
At step 508, a starting pose of the robotic arm for a registration or calibration routine is determined. The starting pose may be associated with an expected position of a surgical field in which a surgical procedure will be performed using a surgical tool 234 attached to the robotic arm 232. For example, the starting pose may be representative of cutting poses that will be used during the surgical procedure. In some embodiments, the processing circuit 260 determines the starting pose based on relative positions of the detector 246 and the base 230 of the robotic device 220. For example, the starting pose may be determined to ensure or improve the likelihood that the end effector tracker remains within the line-of-sight of the detector 246 of the tracking system 222 throughout the calibration and registration procedures. In some embodiments, the starting pose is automatically calculated based on one or more of these criteria each time the process 500 is performed (e.g., for each surgical operation). In other embodiments, the starting pose is predetermined or preprogrammed based on the various criteria, for example such that properly parking the base 230 in an acceptable position ensures that the starting pose will be properly situated in the operating room.
In some embodiments of step 508, the starting pose for registration or calibration is determined by performing an optimization process to find a best working volume for cuts in a total knee arthroplasty procedure (or other procedure in other applications). The optimization process may consider factors such as estimated calibration error for the robotic arm, anthropomorphic models of the surgeon/user relating to usability and ergonomics, surgeon height, surgeon preferences, probable position of the patient on the table, and other operating room constraints. The determination may be made using an assumption that the camera is positioned across the knee from the robotic device 220. The starting pose may be selected as the center of the optimized working volume. In some embodiments of step 508, the starting pose is selected to correspond to a working volume where the robotic arm 232 has the lowest calibration error and estimated error due to compliance in the arm during use. Additionally, the starting pose may be selected such that motorized alignment ends in a plane that is parallel to the expected orientation of the cameras 248 of the tracking system 222.
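Such an optimization could, for instance, score a set of candidate starting poses against weighted criteria like those listed above and keep the best candidate that preserves line-of-sight. The criteria, weights, and scoring callbacks in the sketch below are purely illustrative assumptions.

```python
def select_starting_pose(candidates, line_of_sight_ok, calibration_error, ergonomics_penalty,
                         w_error=1.0, w_ergo=0.5):
    """Pick the candidate pose with the lowest weighted cost among those visible to the detector.

    candidates: list of candidate poses (any representation).
    line_of_sight_ok(pose) -> bool; calibration_error(pose), ergonomics_penalty(pose) -> float.
    """
    visible = [p for p in candidates if line_of_sight_ok(p)]
    if not visible:
        raise ValueError("No candidate keeps the end effector array visible to the detector")
    return min(visible,
               key=lambda p: w_error * calibration_error(p) + w_ergo * ergonomics_penalty(p))

# Hypothetical usage with stand-in scoring functions.
best = select_starting_pose(
    ["pose_a", "pose_b"],
    line_of_sight_ok=lambda p: True,
    calibration_error=lambda p: {"pose_a": 0.4, "pose_b": 0.2}[p],
    ergonomics_penalty=lambda p: 0.1,
)
print(best)  # -> "pose_b"
```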
At step 510, an approach area is defined around the starting pose. The approach area defines a space in which motorized movement of the robotic arm to the starting pose can be initiated as described below with reference to steps 512-518. In some embodiments, the approach area is defined by a virtual boundary, for example a sphere centered on the starting pose. In some embodiments, the approach area is defined in a coordinate system of the tracking system 222. In some embodiments, the approach area is defined in terms of joint angles of the robotic arm 232.
The approach area may be defined in various ways in various embodiments. For example, in some embodiments the approach area is defined to balance multiple considerations. Reducing the size of the approach area can reduce the risk of the robotic arm 232 colliding with objects or people in the operating room during motorized movement. Also, determination of the approach area can include ensuring that the approach area is sufficiently large to enable a user to easily move the end effector into the approach area. The approach area can also be defined to ensure that it is consistent with the range of the robotic arm so that the robotic arm is capable of reaching the approach area. The approach area can also be sized and positioned based on a preferred distance and speed for the motorized motion in later steps, i.e., such that the robotic arm enters the approach area at a location which is within an acceptable distance of the starting pose for the registration or calibration procedure and from which the motorized motion can be performed in an acceptable amount of time (e.g., less than a threshold duration) and at an acceptable velocity (e.g., less than a threshold velocity). The approach area may vary based on whether the procedure is to be performed on a right or left side of the patient's body (e.g., right knee vs. left knee).
At step 511, instructions are displayed which instruct a user to move the robotic arm into the approach area. For example, the processing circuit 260 can cause the display screen 264 to display a graphical user interface including a graphic that illustrates movement of the robotic arm into the approach area. The graphical user interface may also include text-based instructions. An example graphical user interface that can be displayed at step 511 is shown in
At step 512, entry of the robotic arm 232 into the approach area is detected. The robotic arm 232 can be moved into the approach area manually by a user. That is, the user can exert a force on the robotic arm 232 to push the robotic arm into the approach area. In some embodiments, detecting entry of the robotic arm 232 into the approach area includes tracking the end effector array (trackable markers) attached to the distal end of the robotic arm 232 with the tracking system 222 and determining whether the distal end of the robotic arm 232 is in an approach area defined in a coordinate system used by the tracking system 222. In other embodiments, detecting entry of the robotic arm 232 includes checking joint angles of the robotic arm 232 (e.g., from encoders at the joints) against one or more criteria which define the approach area in terms of joint angles of the robotic arm 232. In such embodiments, detecting entry of the robotic arm 232 into the approach area can be performed independently of the tracking system 222. Thus, step 512 corresponds to determining that the robotic arm 232 is in a position from which it can be automatically moved to the starting pose determined in step 508.
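Either detection strategy reduces to a simple geometric test, as sketched below: a tracked-position check against a spherical approach area defined in the tracking frame, and a joint-angle check against per-joint tolerances that needs no tracking data. The radius, tolerances, and function names are placeholder assumptions.

```python
import numpy as np

def in_approach_area_tracked(end_effector_pos, starting_pos, radius_m=0.15):
    """Spherical approach area defined in the tracking system's coordinate frame."""
    return bool(np.linalg.norm(np.asarray(end_effector_pos) - np.asarray(starting_pos)) <= radius_m)

def in_approach_area_joints(joint_angles, target_angles, tolerances):
    """Approach area defined directly in joint space, independent of the tracking system."""
    return all(abs(a - t) <= tol for a, t, tol in zip(joint_angles, target_angles, tolerances))

print(in_approach_area_tracked([0.05, 0.0, 0.0], [0.0, 0.0, 0.0]))        # True
print(in_approach_area_joints([0.10, 0.50], [0.12, 0.48], [0.05, 0.05]))  # True
```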
At step 514, instructions are displayed which instruct a user to activate (e.g., engage, disengage, depress, release, etc.) an input device or otherwise input a command to initiate motorized movement of the robotic arm to the starting pose for the registration or calibration routine. For example, the processing circuit 260 may cause the display screen 264 to display a graphical user interface that includes a graphic showing a user engaging an input device, for example depressing a trigger positioned proximate the surgical tool 234, depressing a foot pedal, or otherwise engaging some other input device (e.g., mouse, button, pedal, trigger, switch, sensor). As another example, a microphone may be communicable with the processing circuit 260 such that a voice command can be used to initiate motorized movement. As another example, touchless gesture control could be used, for example using a machine vision approach, to provide a command to initiate automated alignment. As another example, the command can be input by moving the end effector in a particular direction. The command can be provided by a primary user (e.g., surgeon) in the sterile field and/or by a second person, for example a technician or nurse elsewhere in the operating room. An example user interface for display at step 514 is shown in
Accordingly, in step 514, an option is provided for the user to initiate motorized movement of the robotic arm to the starting pose for the registration or calibration routine. In alternative embodiments, steps 514 and 516 are omitted and motorized movement is automatically initiated when the robotic arm 232 enters the approach area without additional input from a user.
At step 516, a determination is made of whether the user is still activating the input device as instructed in step 514. For example, engagement of the input device (e.g., depression of a trigger) may create an electrical signal from the input device to the processing circuit 260. In such an example, the processing circuit 260 can determine whether the user is activating the input device based on whether the electrical signal is received. For example, presence of the signal from the input device may cause the processing circuit 260 to determine at step 516 that the user is engaging the input device, whereas absence of the signal from the input device may cause the processing circuit 260 to determine at step 516 that the user is not engaging the input device.
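One way to picture the check at step 516 is as a deadman-style gate on the motorized movement, as in the simplified sketch below; the helper callables (signal_present, step_motion, at_target) are hypothetical placeholders, not names from the disclosure.

    import time

    def run_while_activated(signal_present, step_motion, at_target, poll_s=0.01):
        """Advance the motorized movement only while the input-device signal is received."""
        while not at_target():
            if not signal_present():    # corresponds to the "No" branch at step 516
                return False            # movement is paused; the user may re-engage
            step_motion()               # corresponds to the "Yes" branch: continue moving
            time.sleep(poll_s)
        return True                     # starting pose reached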
If a determination is made at step 516 that the user is not activating the input device (i.e., “No” at step 516 in
If a determination is made at step 516 that the user is engaging the input device (i.e., “Yes” at step 516 in
Motorized movement of the robotic arm 232 to the starting pose in step 518 can include movement in one to six degrees of freedom, for example including moving a distal end of the robotic arm 232 to a location identified by the starting pose and providing rotations to align with an orientation identified by the starting pose. In some embodiments, motorized movement includes arranging joint angles of the robotic arm 232 in a preferred (e.g., predefined) arrangement, for example an arrangement that facilitates calibration, registration, and/or completion of the surgical procedure. In other embodiments, for example for a seven-degree-of-freedom robot, motorized movement can be performed such that the target starting position of the end effector (surgical tool 234) is defined and used for control without regard to joint angles or other positions of the arm 232.
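For illustration only, a joint-space version of the motorized movement could look like the sketch below, which steps the arm toward a preferred (predefined) joint arrangement; the interface functions and step limit are assumptions.

    import numpy as np

    def step_toward_starting_pose(get_joint_angles, command_joint_angles,
                                  target_joint_angles, max_step_rad=0.01):
        """Command one bounded increment toward the target joint arrangement.
        Returns True once the remaining error fits within a single increment."""
        q = np.asarray(get_joint_angles(), dtype=float)
        dq = np.asarray(target_joint_angles, dtype=float) - q
        command_joint_angles(q + np.clip(dq, -max_step_rad, max_step_rad))
        return bool(np.all(np.abs(dq) <= max_step_rad))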
As illustrated in
If the user continues to engage the input device, motorized movement continues until the robotic arm 232 reaches the starting pose for the registration or calibration routine. At step 520, in response to reaching the starting pose, a registration or calibration routine is initiated. Initiating the registration or calibration routine can include starting one or more data collection processes (for example, tracking of an end effector array and a base array by the tracking system 222, or any other tracking of the robotic device 220), controlling the robotic arm 232 to provide additional motorized movements or to constrain manual movement of the robotic arm 232, and/or providing instructions via the display screen 264 for user actions to support the registration or calibration routine.
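A highly simplified sketch of initiating the data collection at step 520 is shown below; the tracker and display interfaces and the sample count are assumed for illustration and do not reflect the actual registration or calibration routine.

    import time

    def initiate_registration(tracker, display, n_required=50, poll_s=0.02):
        """Collect paired end-effector-array / base-array poses once the starting pose is reached."""
        samples = []
        display.show("Registration started: keep the arrays visible to the camera.")
        while len(samples) < n_required:
            ee_pose = tracker.get_pose("end_effector_array")
            base_pose = tracker.get_pose("base_array")
            if ee_pose is not None and base_pose is not None:
                samples.append((ee_pose, base_pose))
            time.sleep(poll_s)
        display.show("Registration data collection complete.")
        return samples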
For example,
By ensuring that the registration or calibration routine (procedure) is performed from the system-determined starting pose, process 500 can reduce or eliminate potential human-caused variations in initiation of the registration or calibration routine, which may increase the reliability and accuracy of the registration or calibration routine. Additionally, by providing motorized movement to the starting pose, efficiency and usability of the system can be improved. The process 500 thereby provides improvements over alternative approaches to initiating a registration or calibration routine.
Referring now to
The graphical user interface 600 also includes text-based messages 610 that can include instructions, alerts, warnings, updates, etc. with respect to operation of the surgical system 200. In the example shown, the text-based messages 610 include instructions to bring the robotic arm 232 into Approach Mode as shown, i.e., to move the robotic arm 232 into the approach area as illustrated in the graphic 602. The text-based messages 610 also indicate that the surgical tool is successfully connected to the robotic arm, and that the end effector array is not currently visible to the detector 246 of the tracking system. Motorized movement via process 500 can move the end effector from outside the field of view of the detector 246, as indicated by the text-based messages 610, into the field of view of the detector 246 to enable a registration or calibration routine. Information can also be communicated to the user via sounds emitted by a speaker of the surgical system 200 (e.g., acoustic feedback), forces provided via the robotic device 220 (e.g., haptic feedback), or indicator lights positioned on the robotic arm 232 or elsewhere in the surgical system 200. These various types of feedback can be provided at various events in operation of the surgical system 200, for example when the approach area is entered, when motorized movement can be initiated, when motorized movement is initiated, when motorized movement completes successfully, and/or upon successful or unsuccessful completion of various other events and steps described herein.
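As one possible (assumed) way to organize the feedback described above, events could be mapped to acoustic, haptic, and indicator-light outputs, as in the sketch below; the event names and output channels are illustrative placeholders.

    # Assumed mapping from system events to feedback outputs.
    FEEDBACK_MAP = {
        "approach_area_entered":    {"sound": "chime", "light": "blue"},
        "motorized_move_available": {"sound": "beep",  "light": "green"},
        "motorized_move_started":   {"haptic": "pulse", "light": "green_blink"},
        "motorized_move_complete":  {"sound": "chime", "light": "solid_green"},
        "step_failed":              {"sound": "buzz",  "light": "red"},
    }

    def emit_feedback(event, play_sound, pulse_haptics, set_light):
        """Dispatch the configured feedback for a given event via the supplied callables."""
        outputs = {"sound": play_sound, "haptic": pulse_haptics, "light": set_light}
        for channel, value in FEEDBACK_MAP.get(event, {}).items():
            outputs[channel](value)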
A user can follow the graphical and text-based instructions of graphical user interface 600 (and/or acoustic or haptic feedback) to move the robotic arm 232 into the approach area 606. In response, the processing circuit 260 detects entry of the robotic arm 232 into the approach area at step 512, and updates the display screen 264 to display instructions to engage an input device to initiate motorized movement to the starting pose for the registration or calibration routine at step 514 as in
Referring now to
The graphical user interface 700 includes a graphic 702 of the surgical tool 234 attached to the distal end of the robotic arm 232, as is the case when step 514 of process 500 is initiated. The graphic 702 shows an end effector array 704 attached to the surgical tool, and includes a call-out window 706 showing a zoomed-in view of an interface between the end effector array 704 and the surgical tool 234. The graphic 702 shows the end effector array 704 as properly attached to the surgical tool 234, such that the graphic 702 may thereby encourage a user to verify that this connection is properly made at the physical surgical tool. The graphic 702 also shows that the surgical tool 234 includes a trigger 708. In other embodiments, another input device is shown instead of the trigger 708 (e.g., foot pedal, mouse, button, switch, sensor).
The graphical user interface 700 also includes an icon 710 configured to communicate an instruction to press the trigger 708. The icon 710 includes a depiction of the surgical tool 234, including the trigger 708 and a hand holding a grip portion of the surgical tool 234 with a finger of the hand positioned on the trigger 708. An arrow is included in the icon 710 indicating that the finger is depressing the trigger 708. The icon 710 is thereby configured to show depression of the trigger 708 by a user. In embodiments where other types of commands are received to initiate the motorized movement, the icon 710 can be adapted accordingly. For example, the icon 710 can show depression or release of a foot pedal, selection of a button, engagement of some other sensor, verbal statement of a command (e.g., “Okay Robot, Start Movement”), user gesture or body movement, etc. as appropriate in various embodiments to indicate to the user that the system is ready to accept the user input to initiate the motorized movement.
The graphical user interface 700 also includes text-based messages 712 that can include instructions, alerts, updates, statuses, warnings, etc. regarding operation of the surgical system 200. As shown in
When the trigger 708 (or other input device in various embodiments) is engaged or activated, for example by depression of the trigger 708 as instructed in the graphical user interface 700 as shown in
Referring now to
The graphical user interface 800 is also shown as including selectable buttons 808 that allow a user to select to restart the registration or calibration routine, free the robotic arm from the registration or calibration routine, or collect a point (i.e., record a position of the end effector tracker as part of the registration or calibration routine). The graphical user interface 800 thereby provides interactivity with and control over the registration or calibration routine. The graphical user interface 800 also shows text-based instructions 810 explaining how the robotic arm is to be moved through the virtual cube 804 as guided in the graphic 802 while keeping the end effector array visible to the detector 246 of the tracking system 222. A sound can be emitted from the surgical system 200 at each successful capture (e.g., when the end effector meets a vertex of the virtual cube 804). Haptic feedback (for example, a vibration) or an indicator light can also be used to indicate a successful capture during the registration or calibration process.
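The point-capture behavior described above can be pictured with the sketch below, in which a point is recorded (and feedback emitted) whenever the tracked tool tip comes within a tolerance of an uncaptured vertex of the virtual cube; the tolerance value and callback are assumptions for illustration.

    import numpy as np

    CAPTURE_TOLERANCE_M = 0.005   # assumed capture tolerance

    def try_capture_vertex(tip_position, cube_vertices, captured, on_capture):
        """Record the first uncaptured cube vertex within tolerance of the tracked tool tip."""
        tip = np.asarray(tip_position, dtype=float)
        for i, vertex in enumerate(cube_vertices):
            if i in captured:
                continue
            if np.linalg.norm(tip - np.asarray(vertex, dtype=float)) <= CAPTURE_TOLERANCE_M:
                captured.add(i)
                on_capture(i)   # e.g., emit a sound, vibration, or indicator-light change
                return i
        return None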
The text-based instructions 810 may include real-time coordinates of the tracked tip of the surgical tool 234 which may be useful to a user. The graphical user interface 800 is also shown as including an icon 812 which can change colors to indicate whether the end effector array is currently visible to the detector 246 (e.g., red for not visible, green for visible). Other icons may be similarly included to show connectivity, proper operation, etc. of other components of the surgical system 200. The graphical user interface 800 is also shown to include a registration results progress bar 814 which can update to show progress through the registration process and/or indicate a quality of the registration process.
The graphical user interface 800 thereby includes various features which may be helpful in guiding a user through a registration or calibration routine for the surgical system 200 starting at and following step 520 of process 500. In some embodiments, for example the example shown in
The term “coupled” and variations thereof, as used herein, means the joining of two members directly or indirectly to one another. Such joining may be stationary (e.g., permanent or fixed) or moveable (e.g., removable or releasable). Such joining may be achieved with the two members coupled directly to each other, with the two members coupled to each other using a separate intervening member and any additional intermediate members coupled with one another, or with the two members coupled to each other using an intervening member that is integrally formed as a single unitary body with one of the two members. If “coupled” or variations thereof are modified by an additional term (e.g., directly coupled), the generic definition of “coupled” provided above is modified by the plain language meaning of the additional term (e.g., “directly coupled” means the joining of two members without any separate intervening member), resulting in a narrower definition than the generic definition of “coupled” provided above. Such coupling may be mechanical, electrical, magnetic, or fluidic.
References herein to the positions of elements (e.g., “top,” “bottom,” “above,” “below”) are merely used to describe the orientation of various elements in the FIGURES. It should be noted that the orientation of various elements may differ according to other exemplary embodiments, and that such variations are intended to be encompassed by the present disclosure.
The hardware and data processing components used to implement the various processes, operations, illustrative logics, logical blocks, modules and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose single- or multi-chip processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, or any conventional processor, controller, microcontroller, or state machine. A processor also may be implemented as a combination of computing devices, such as a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. In some embodiments, particular processes and methods may be performed by circuitry that is specific to a given function. The memory (e.g., memory, memory unit, storage device) may include one or more devices (e.g., RAM, ROM, Flash memory, hard disk storage) for storing data and/or computer code for completing or facilitating the various processes, layers and modules described in the present disclosure. The memory may be or include volatile memory or non-volatile memory, and may include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present disclosure. According to an exemplary embodiment, the memory is communicably connected to the processor via a processing circuit and includes computer code for executing (e.g., by the processing circuit or the processor) the one or more processes described herein.
The present disclosure contemplates methods, systems and program products on any machine-readable media for accomplishing various operations. The embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Although the figures and description may illustrate a specific order of method steps, the order of such steps may differ from what is depicted and described, unless specified differently above. Also, two or more steps may be performed concurrently or with partial concurrence, unless specified differently above. Such variation may depend, for example, on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations of the described methods could be accomplished with standard programming techniques with rule-based logic and other logic to accomplish the various connection steps, processing steps, comparison steps, and decision steps.
This application claims the benefit of and priority to U.S. Provisional Patent Application No. 63/107,781 filed Oct. 30, 2020, U.S. Provisional Patent Application No. 63/125,481 filed Dec. 15, 2020, U.S. Provisional Patent Application No. 63/131,654 filed Dec. 29, 2020, and U.S. Provisional Patent Application No. 63/189,508 filed May 17, 2021, the entire disclosures of which are incorporated by reference herein.