Systems and methods for navigation and visualization

Information

  • Patent Grant
  • Patent Number
    12,096,995
  • Date Filed
    Thursday, July 13, 2023
  • Date Issued
    Tuesday, September 24, 2024
Abstract
A surgical system and method of operating the same involve a surgical operating table that is controllable and adjustable and that supports a patient thereon. A robotic system includes a moveable arm that supports and moves an end effector relative to a surgical site of the patient. Controller(s) coupled to the surgical operating table and to the robotic system associate a virtual boundary with respect to the surgical site and detect that the end effector is outside of the virtual boundary. In response to detection of the end effector being outside of the virtual boundary, the controller(s) command adjustment of the surgical operating table to reposition the surgical site and the virtual boundary to enable the end effector to be inside the virtual boundary.
Description
BACKGROUND

The field of the disclosure relates generally to visualization and navigation, and more specifically, to methods and systems for visualizing sites that do not have direct line of sight to a user.


Generally, clear visualization is important when performing detailed tasks such as driving, operating machinery, or performing surgery. For example, surgical procedures require a direct line of sight to prepare for and conduct the procedure and to ensure accuracy. To reduce complications during the surgical procedure, surgeons attempt to minimize any disturbances to the body. Minimizing those disturbances can mean making minimal incisions that reduce the size of the surgical site, which in turn can limit the field of view for the surgeon. Accordingly, a need exists for visualization and navigation that provides feedback to a user (e.g., a physician/surgeon) while performing tasks (e.g., preoperatively, intraoperatively, and postoperatively) to increase the accuracy and efficiency of the task.


SUMMARY

In a first aspect, a surgical system is provided that comprises a surgical operating table configured to support a patient thereon, wherein the surgical operating table is controllable and adjustable; a robotic system comprising a moveable arm that is configured to support and move an end effector relative to a surgical site of the patient; and one or more controllers coupled to the surgical operating table and to the robotic system and being configured to: associate a virtual boundary with respect to the surgical site; detect that the end effector is outside of the virtual boundary; and in response to detection of the end effector being outside of the virtual boundary, command adjustment of the surgical operating table to reposition the surgical site and the virtual boundary to enable the end effector to be inside the virtual boundary.


In a second aspect, a method of operating a surgical system is provided, wherein the surgical system includes a surgical operating table configured to support a patient thereon, wherein the surgical operating table is controllable and adjustable, a robotic system including a moveable arm that is configured to support and move an end effector relative to a surgical site of the patient and one or more controllers coupled to the surgical operating table and to the robotic system, the method comprising the one or more controllers performing the following: associating a virtual boundary with respect to the surgical site; detecting that the end effector is outside of the virtual boundary; and in response to detecting that the end effector is outside of the virtual boundary, commanding adjustment of the surgical operating table to re-position the surgical site and the virtual boundary for enabling the end effector to be inside the virtual boundary.


In a third aspect, a surgical system is provided that comprises a surgical operating table configured to support a patient thereon, wherein the surgical operating table is controllable and adjustable; a robotic system comprising a moveable arm that is configured to support and move an end effector relative to a surgical site of the patient; and one or more controllers coupled to the surgical operating table and to the robotic system and being configured to: associate a virtual boundary with respect to the surgical site; control the robotic system to move the end effector relative to the surgical site; permit operation of the end effector in response to detection of the end effector being inside of the virtual boundary; detect that the end effector is outside of the virtual boundary; prohibit operation of the end effector in response to detection of the end effector being outside of the virtual boundary; and command adjustment of the surgical operating table to re-position the surgical site and the virtual boundary to enable the end effector to be inside the virtual boundary.


The features, functions, and advantages that have been discussed can be achieved independently in various embodiments or may be combined in yet other embodiments, further details of which can be seen with reference to the following description and drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of a robotic system 100 used in surgical procedures.



FIG. 2 is a perspective view of an exemplary procedural component that may be used with the system shown in FIG. 1.



FIG. 3 is a perspective view of an alternative procedural component that may be used with the system shown in FIG. 1.



FIG. 4 is a perspective view of an alternative procedural component that may be used with the system shown in FIG. 1.



FIG. 5 is a perspective view of a patient undergoing a procedure using the system shown in FIG. 1.



FIGS. 6A and 6B are exemplary images produced by the system shown in FIG. 1.



FIG. 7 is a perspective view of a portion of a spine having markers that may be used with the system shown in FIG. 1.



FIG. 8 is a cut-away perspective view of an exemplary temperature implant for use with the system shown in FIG. 1.



FIG. 9A is an exemplary image produced by the system shown in FIG. 1.



FIG. 9B is a schematic of the exemplary image shown in FIG. 9A.



FIGS. 10-12 are exemplary data flow diagrams of filtering images performed by the system shown in FIG. 1.





DETAILED DESCRIPTION

The systems and methods described herein enable accurate navigation during surgical procedures. The systems and methods described herein provide landmark information inside the body of a patient during a surgical procedure. As used herein, the term “tissue” or “body tissue” refers to a group or layer of similarly specialized cells that together perform certain special functions and can refer to any type of tissue in the body including, but not limited to, bone, organs, cartilage, muscles, skin, fat, nerves, and scars. As used herein, the terms “procedure” or “surgery” refer to an operation performed on a patient to investigate and/or treat a pathological condition.



FIG. 1 is a block diagram of a robotic system 100 used in surgical procedures. System 100 includes a computing device 102, user input controls 104, a presentation interface 106, a communications interface 108, an imaging device 110, and a procedural component 112 having at least one end effector 114. In some embodiments, system 100 is communicatively coupled (e.g., through an electrical wire or cable, or wirelessly through Bluetooth or Wi-Fi) to additional operating room systems including, but not limited to, surgical room controls 120, surgical monitoring systems 122, and additional surgical systems 124, as well as an operating table or bed 220. The robotic system 100 could attach to the floor, be table mounted, and/or be mounted to the patient. As further described herein, system 100 may be attached to or moving relative to the patient's body tissue. Moreover, system 100 could be a micro-robot that would be ingested or placed within the patient's body, including sensors that could communicate wirelessly and/or recharge via electromagnetic radiofrequency, motion, ultrasound, capacitive coupling, and the like. Non-limiting examples of room controls 120 are HVAC (temperature, humidity), oxygen, nitrogen, carbon dioxide, and lighting; non-limiting examples of surgical monitoring systems 122 include cardiac, hemodynamic, respiratory, neurological, blood glucose, blood chemistry, organ function, childbirth, and body temperature monitoring. Additional surgical systems 124 include, but are not limited to, anesthesia (e.g., oxygen, carbon dioxide, nitrogen, nitrous oxide, etc.), endoscopy, arthroscopic, electromagnetic guidance, oncology, navigation, arthroscopy, tissue ablation, ultrasound ablation, cardiac, stent, valve, cardiovascular, ESW, inhalation, urologic, cerebrospinal fluid, synovial fluid, OB/GYN, ENG, neurosurgery, plastic surgery, pulmonary, gastroenterology, and IV infusion systems. One having ordinary skill in the art will understand that system 100 may be used in a macroscopic manner and/or a microscopic manner, looking at cellular chemistry.


Computing device 102 includes at least one memory device 120 and one or more processors 122 (e.g., in a multi-core configuration) that are coupled to memory device 120 for executing instructions. In some embodiments, executable instructions are stored in memory device 120. Further, processor 122 may be implemented using one or more heterogeneous processor systems in which a main processor is present with secondary processors on a single chip. As another illustrative example, processor 122 may be a symmetric multiprocessor system containing multiple processors of the same type. Processor 122 may perform partial processing and receive partial processing from a processor and/or computing device communicatively coupled to computing device 102 to enable cloud or remote processing. Further, processor 122 may be implemented using any suitable programmable circuit including one or more systems and microcontrollers, microprocessors, reduced instruction set circuits (RISC), application specific integrated circuits (ASIC), programmable logic circuits, field programmable gate arrays (FPGA), and any other circuit capable of executing the functions described herein. In the exemplary embodiment, processor 122 receives imaging information from device 110 and creates co-registered images for display on interface 106, as well as providing movement limitations to component 112 based on the imaging information.


In the exemplary embodiment, memory device 120 is one or more devices that enable information such as executable instructions and/or other data to be stored and retrieved. Memory device 120 may include one or more computer readable media, such as, without limitation, dynamic random access memory (DRAM), static random access memory (SRAM), a solid state disk, and/or a hard disk. Memory device 120 may be configured to store, without limitation, application source code, application object code, source code portions of interest, object code portions of interest, configuration data, execution events and/or any other type of data. In some embodiments, memory device 120 retains or stores limited or no information locally but stores information on a device communicatively coupled to system 100 to enable cloud storage.


In some embodiments, computing device 102 includes a presentation interface 106 that is coupled to processor 122. Presentation interface 106 presents information, such as patient information and/or images (e.g., scans), to a user/surgeon. For example, presentation interface 106 may include a display adapter (not shown) that may be coupled to a display device, such as a cathode ray tube (CRT), a liquid crystal display (LCD), an organic LED (OLED) display, and/or an “electronic ink” display. In some embodiments, presentation interface 106 includes one or more display devices. In the exemplary embodiment, presentation interface 106 displays surgical site data that is received from imaging device 110 and created by processor 122. The surgical site data may be displayed on presentation interface 106 in any format that enables the user to view surgical site information, including, but not limited to, glasses, a heads-up display positioned within a surgical helmet, a retinal display that projects information onto the user's retina, and a monitor located within the operating room or some other remote location. In some embodiments, presentation interface 106 projects images from system 100 directly onto the retina of a surgeon. In some embodiments, surgical site data is provided to the surgeon with audible commands to help direct the surgeon during a procedure.


In the exemplary embodiment, computing device 102 includes a user input interface 104. In the exemplary embodiment, user input interface 104 is coupled to processor 122 and receives input from a user/surgeon. User input interface 104 may include, for example, a keyboard, a pointing device, a mouse, a stylus, a touch sensitive panel (e.g., a touch pad or a touch screen), a gyroscope, an accelerometer, a position detector, and/or an audio user input interface. In some embodiments, user input interface 104 is a haptic feedback system that provides feedback (e.g., pressure, torque) from a procedural component 112. In some embodiments, a single component, such as a touch screen, may function as both a display device of presentation interface 106 and user input interface 104. In one or more embodiments, user input interface 104 is a sensor that senses vibration, heat, thermal properties, and the like.


In one embodiment, user input interface 104 is one or more sensors coupled to a surgeon that are configured to detect muscle movement such that the procedural component 112 and/or end effector(s) 114 will respond to the detected muscle movements. In some embodiments, sensors are positioned on the skin of a surgeon or user so that the sensor can detect either mechanical (e.g., physical movement) or electrical signals of the muscles and/or nerves. Such a system enables a surgeon to perform a procedure remotely without the use of instrumentation directly coupled to the procedural component 112 and/or end effector 114. In some embodiments, a camera (i.e., imaging device 110) is utilized in conjunction with the sensors to determine and/or track surgeon movement patterns to provide a more efficient determination of surgeon movements.


In the exemplary embodiment, computing device 102 includes or is coupled to a communication interface 108 coupled to processor 122. Communication interface 108 communicates with imaging device 110, procedural component 112, and/or remote computing systems (not shown) such as mobile phones and/or tablets. To communicate with imaging device 110, procedural component 112, and/or remote computing systems, communication interface 108 may include, for example, a wired network adapter, a wireless network adapter (e.g., Bluetooth, Wi-Fi), and/or a mobile telecommunications adapter. In the exemplary embodiment, communication interface 108 and presentation interface 106 enable remote conferencing of a procedure with system 100. For example, a surgeon can receive guidance from a remote surgeon, assistant, or medical sales representative during a procedure. Additionally, system 100 includes sensors on components 112 that sense movement of the components and provide feedback to system 100, enabling system 100 to provide signals for tactile feedback and/or alarms to a surgeon through the device with which the surgeon is interacting. Imaging device(s) 110 enable remote users to visualize what is occurring in the procedure in real time, while allowing the surgeon to interactively communicate with those remotely connected. Imaging device(s) 110 may also provide preoperative and/or postoperative images. Moreover, different types of imaging devices (e.g., fiberoptic, light, acoustic, laser, etc.) may be utilized intraoperatively.


Additionally, the remote conferencing described above can also be utilized to enable remote inventory management. For example, a medical device company or representative can utilize imaging device(s) 110 to view the inventory present in an operating room, or outside the operating room in a secure location (e.g., pharmacy, stock room), to determine what devices and/or objects have been utilized during a procedure. In some embodiments, an imaging device 110 (e.g., camera) scans an inventory system (e.g., cart) to determine, via processor 122, which objects are no longer present and were utilized during a procedure. It should be noted that system 100 can determine inventory levels by utilizing additional sensors. For example, in some embodiments, system 100 is coupled to a scale that weighs the inventory to determine missing items. In one embodiment, sensors are utilized to provide feedback to determine missing inventory based on displacement, an empty space in a known inventory location, and/or a changed shape of a stack or collection of inventory. Alternatively, system 100 is configured to track inventory using automatic identification and data capture (AIDC) sensors configured to provide device information by receiving information from the inventory that includes, but is not limited to, bar codes, Radio Frequency Identification (RFID), biometrics, magnetic stripes, Optical Character Recognition (OCR), smart cards, and voice recognition.
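
Conceptually, this kind of usage tracking reduces to comparing a known manifest against what a scan still detects. The following is a minimal sketch of that idea with hypothetical item identifiers and a plain set comparison; the patent does not specify a particular data model.

```python
# Minimal sketch (hypothetical data): flag used inventory by comparing a
# known cart manifest against the item IDs an AIDC scan (e.g., RFID) still
# detects after the procedure.

def items_used(manifest: set[str], detected: set[str]) -> set[str]:
    """Return items present before the procedure but no longer detected."""
    return manifest - detected

# Example usage with made-up identifiers.
cart_manifest = {"implant-A12", "screw-S04", "screw-S05", "drill-bit-D2"}
rfid_scan = {"implant-A12", "screw-S05"}

print(items_used(cart_manifest, rfid_scan))  # {'screw-S04', 'drill-bit-D2'}
```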


The device utilization is processed by system 100 and transmitted, via communication interface 108, to a hospital billing department, medical device supplier, practitioner, clinic, and/or any other entity necessary to track device usage (e.g., insurance, procedure payor, and government reporting entity). It should be noted that system 100 is configured to track inventory systems within the medical facility and/or hospital including, but not limited to, implants, surgical instruments, disposable medical devices, pharmaceuticals, and bracing. Additionally, the inventory control features described herein could be utilized by any system needing inventory control outside of the medical field.


In some embodiments, system 100 utilizes movement patterns for navigation (e.g., surgical navigation, vehicle navigation, etc.). In a surgical context, the actual movement or stimulation of tissue that twitches, moves, or goes through a motion pattern can be tracked by system 100 and used to navigate (i.e., understand where soft tissue is located relative to other soft tissue and/or bone). For example, if electrical stimulation is used to stimulate a muscle or nerve to twitch, system 100 can track these movement patterns to determine where the nerve or muscle is located. In another exemplary embodiment, a spasm in a blood vessel can be created to determine where the blood vessel is located to create patterns of navigation. System 100 can also be used at a microscopic level to create navigation at a cellular level where cell membranes and/or cell receptors are stimulated. As movement patterns are tracked, system 100, using methods described herein, could remove pixels and/or enhance or change a visualization. For example, if tissue, cells, and/or membranes are stimulated to move, system 100 could eliminate or remove those pixels.


In some embodiments, aspects of system 100 utilize extensor optical systems. Numerous optical sensors are known to those of ordinary skill in the art. For example, if there are opacities in the way of the optical sensor such as cloudiness, bleeding, or synovial fluid, an optical sensor may inaccurately note properties, such as pH or pressure. By removing opacities, system 100 improves the functioning of the optical sensors. In some embodiments, system 100 changes light color, frequency, or intensity by strobing, flashing on/off, and/or displaying different intensities as the light reflects (i.e., albedo). System 100 may also remove tissues that reflect differently based on light color, frequency, intensity, and/or strobe.



FIGS. 2-4 are schematic diagrams of exemplary procedural components 210, 220, and 230 that may be used with system 100, shown in FIG. 1. FIG. 2 is a schematic diagram of an exemplary procedural component 112 in the form of a telemanipulator 210. Telemanipulator 210 receives operational instructions from a user/surgeon operating input interface 104. In such an embodiment, the user/surgeon is capable of performing normal surgical movements while arms 212 carry out those movements using end-effectors and manipulators 214 to perform the actual surgery on the patient 200. It should be noted that, utilizing a telemanipulator 210, the surgeon does not have to be present in the operating room but can be anywhere in the world, leading to the possibility of remote surgery. In some embodiments, telemanipulator 210 includes an imaging device 110 (e.g., endoscope) within an arm 212.



FIG. 3 is a schematic diagram of an alternative procedural component 112 in the form of a user movable robotic arm 220. In the exemplary embodiment, a surgeon manipulates arm 220 by moving an end effector 222 into place by movement of handle portion 224. During a procedure, the surgeon utilizes arm 220 and more specifically end effector 222 with limitations that are imposed by computing device 102 and/or processor 122 with information obtained from imaging device 110.



FIG. 4 is a schematic diagram of an alternative procedural component 112 in the form of a manual surgical tool 230. In such an embodiment, tool 230 is manufactured to be portable (i.e., hand-held) such that the surgeon can manipulate the tool 230 during a procedure. Tool 230 includes an end effector 232 capable of performing surgical functions on a patient. In an embodiment, tool 230 can be at least partially sterilized. In another embodiment, tool 230 includes surgical drapes that cover part of tool 230 and keep parts sterile in an operating room, such as sleeves, drapes, and the like.


It should be noted that system 100 is configured to complete an entire surgical procedure utilizing only system 100, inclusive of the procedural component 112, non-limiting examples of which are represented by telemanipulator 210, arm 220, and tool 230. As noted above, the procedural component may include one or more end effectors 214, 222, and 232 to perform the actions needed to complete the surgical procedure. The actions of component 112 can be any surgical action including, but not limited to, sewing, stitching, stapling, cutting, sawing, cauterizing, grasping, pinching, holding, tensioning, moving, implanting, removing, viewing, sensing force, sensing pressure, and tying. The end effectors 214, 222, and 232 can be any end effector needed for performing the surgical actions including, but not limited to, forceps, needles, needle drivers, retractors, clip appliers, probe graspers, cardiac stabilizers, balloons, tissue dissectors, saws, knives, mills, reamers, coagulation devices, lasers, ultrasonic transducers/probes, cautery instruments, scalpels, staplers, scissors, graspers, and sealers.


In the exemplary embodiment, system 100 includes at least one imaging device 110. As shown in FIG. 5, imaging device 110 can substantially surround or be positioned adjacent to patient 200 undergoing a procedure (i.e., intraoperative) performed at least in part by procedural component 112 and/or end effector 114. In the exemplary embodiment, imaging device 110 includes a C-arm 300 coupled to a scanner 302. In one embodiment, scanner 302 is a gamma camera configured to detect radioactive tracers inside the body of patient 200. Alternatively, scanner 302 is any device that is capable of scanning and/or detecting environmental information of patient 200, a surgical site, and/or an operating room including, but not limited to, endoscopy, fluoroscopy (e.g., X-ray, CT, CAT scan), laser or ultrasonic Doppler velocimetry, projection radiography, MRI, SPECT, PET, ultrasound, infrared, elastography, tactile imaging, photoacoustic imaging, thermography, tomography, echocardiography, NIRS, and fNIRS. In an embodiment, environmental information scanned and/or detected via a plurality of techniques may be combined (e.g., combining visible light wavelengths and infrared for imaging technologies). In some embodiments, imaging device 110 may have one or more scanners 302 located at, near, adjacent to, or in a surgical site to perform imaging. In one embodiment, imaging device 110 includes a scanner 302 that is configured to locate procedural component markers 250 positioned on procedural components 112. Marker 250 information is transmitted to computing device 102 to determine component location relative to patient 200, other components of system 100, and/or the operating room. In the exemplary embodiment, system 100 includes an imaging device 110 that provides holistic imaging of a surgical site to provide users with the necessary imaging to complete a procedure. For example, in a knee arthroplasty procedure, the imaging device(s) 110 used in the procedure would provide the surgeon images of the knee as well as related joints (e.g., hip and ankle) to ensure an effective procedure. In such a procedure, system 100 would produce a three-dimensional model of the hip, knee, and ankle to provide views of the procedure from all angles including anterior-posterior (AP) views and medial-lateral (ML) views. While a non-limiting example of an arthroplasty procedure is provided, it should be noted that the holistic imaging could be utilized with any type of procedure utilizing system 100.


In preparation for use of the robotic system 100, a calibration is required to ensure the accuracy of the end effectors 114. Typically, imaging (e.g., CT) of the patient is done preoperatively and the images are loaded into system 100. In some instances, during the calibration of system 100, while patient 200 is in the operating room and on the operating table, patient markers are placed or pinned into the body to provide landmark information. The system 100 is configured to associate the patient landmark information with the images provided preoperatively to provide a map of a surgical site as shown in FIG. 9.
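
The patent does not name a registration algorithm. One common approach, offered here only as an illustrative sketch, is a rigid point-based (Kabsch) registration that maps the preoperative landmark coordinates onto the intraoperative marker positions, assuming the point correspondences between the two sets are already known.

```python
# Minimal sketch of rigid landmark registration (Kabsch algorithm), an
# assumed approach: the patent itself does not specify how landmark
# association is computed.
import numpy as np

def rigid_register(preop: np.ndarray, intraop: np.ndarray):
    """Return rotation R and translation t mapping N x 3 preop points
    onto their paired N x 3 intraop points."""
    p_mean, q_mean = preop.mean(axis=0), intraop.mean(axis=0)
    P, Q = preop - p_mean, intraop - q_mean
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t
```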


In some embodiments, calibration is aided by coupling the robotic system to the surgical table and/or the patient. Coupling system 100 to a surgical table or bed provides system 100 relative patient position information as the patient is moved during a procedure. Referring to FIG. 2, procedural component 112 can be configured to couple directly to table 220 at a base 230 of telemanipulator 210 or at or near arm(s) 212 of telemanipulator 210 via coupling mechanism 232. Referring to FIG. 3, movable robotic arm 220 can be configured to have one or more recesses or docks 240 provided within arm 220 to couple directly into table 220. In addition to being coupled to table 220, components 112 can be coupled directly to the patient. In some embodiments, custom molded or 3-D printed devices 250 can be created for each patient to be worn during a procedure. To create such devices 250, preoperative scans are taken of a surgical site and custom-fit devices 250 would be manufactured to be placed on the patient. In one such non-limiting example, as is shown in FIG. 3, for a brain surgery, a helmet 250 is custom manufactured for patient 200 and includes attachment portions 252 that provide a coupling point for components 112 as well as apertures 254 for performing the procedure. Devices 250 can be manufactured to fit any body portion undergoing a procedure including, but not limited to, an abdomen, leg, knee, foot, ankle, neck, back, torso, arm, hand, head, and face.


It should be noted that procedural components can be configured to dock or electrically couple into surgical tables or beds such that communication from each can be transmitted back and forth via a communications interface. In some embodiments, procedural components 112 rigidly couple to the table or bed while other embodiments provide an electrical wire, cable, or wireless (e.g., Bluetooth or Wi-Fi) coupling between the components 112 and the table. In addition, in some embodiments, procedural components 112 rigidly couple directly to a patient or to surgical instrumentations utilized during a procedure (e.g., surgical robot, instrumentation, visualization system, retractor, electrosurgery system, knife, saw, and mill). In some embodiments, procedural components 112 are used in a transmitter/physician office or other locations such as insurance companies and the like.


In the exemplary embodiment, a radioactive tracer is inserted into the body and landmark information of the patient is determined by a radioactive detector (e.g., scanner 302) of system 100 and provided to a surgeon via presentation interface 106 to increase the efficiency and accuracy of the procedure. As is commonly known, radioactive tracers emit gamma rays from within the body. These tracers are generally short-lived isotopes linked to chemical compounds (e.g., radiopharmaceutical) that enable examination of specific physiological processes and/or anatomical landmarks. In some embodiments, the tracers are given through an intravenous injection (e.g., IV), inhalation, or orally. In some embodiments, the tracers are optical and/or biodegradable.


In the exemplary embodiment, Technetium-99m is used as the radioactive tracer. Technetium-99m emits 140 keV gamma rays with a half-life of approximately 6 hours and exists in the form of the pertechnetate ion (TcO4−). Alternatively, any radioactive tracer could be used with the systems and methods described herein, including but not limited to, Bismuth-213, Calcium-47, Carbon-11, Cesium-137, Chromium-51, Cobalt-57, Cobalt-60, Copper-67, Dysprosium-165, Erbium-169, Fluorine-18, Gallium-67, Holmium-166, Indium-111, Iodine-123, Iodine-125, Iodine-131, Iridium-192, Iron-59, Krypton-81m, Lutetium-177, Molybdenum-99, Nitrogen-13, Oxygen-15, Palladium-103, Phosphorus-32&33, Potassium-42, Rhenium-186, Rhenium-188, Rubidium-82, Samarium-153, Selenium-75, Sodium-24, Strontium-85, Strontium-89, Strontium-92, Sulfur-35, Technetium-99m, Thallium-201, Uranium-235, Xenon-133, Ytterbium-169, and Yttrium-90.


In some embodiments, the tracers are detected by a gamma camera, which recognizes photons, enabling a view of internal landmarks of a patient from many different angles. In such an embodiment, the camera builds up an image from the points from which radiation is emitted, and the image is enhanced by system 100 and viewed by a physician on monitor 106. In an alternative embodiment, positron emission tomography (PET) is performed, in which a PET camera detects the emission of two identifiable gamma rays in opposite directions to identify landmarks. In yet another embodiment, myocardial perfusion imaging (MPI) is performed to identify landmarks. In some embodiments, images having landmarks identified with tracers (e.g., gamma, PET, MPI) are utilized with a computerized tomography (CT) scan, and the images are co-registered (e.g., layered) by system 100 to provide complete landmark information. It should be noted that the tracer images can be co-registered with any other type of imaging (e.g., ultrasound, X-ray, and MRI) to produce landmark information. In an embodiment, the tracers are used for ultrasound navigation with or without radiation. The ultrasound navigation may be utilized for surface mapping or registering anatomical data points.
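
As a purely illustrative sketch of the layering step, the fragment below alpha-blends a tracer map over a CT slice, assuming both images have already been resampled onto the same pixel grid; the alignment itself is a separate registration problem, and the blend weight is an assumed parameter rather than anything specified in the patent.

```python
import numpy as np

def overlay(ct_slice: np.ndarray, tracer: np.ndarray, alpha: float = 0.4) -> np.ndarray:
    """Alpha-blend a normalized tracer map onto a normalized CT slice.

    Assumes both arrays share the same shape and have nonzero maxima.
    """
    ct = ct_slice / ct_slice.max()
    tr = tracer / tracer.max()
    return (1.0 - alpha) * ct + alpha * tr
```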


During robot assisted surgery, such as surgery utilizing system 100, it is necessary to have multiple fixed points for recognition by trackers to allow for navigational computation. Currently, in the case of arthroplasty, invasive pins are inserted into the bones to calibrate the robotic system and to create landmarks. In the exemplary embodiment, the landmark information found via tracers is utilized for calibration and navigational computation. In such an embodiment, after the radiopharmaceutical (e.g., technetium) is introduced into the system, the tracer is taken up by osteoblasts and noted on resulting images. Often, the tracer appears dark on an image, and its location is known as a hot spot. These locations are co-registered with other images by system 100, and the system 100 performs calculations using known algorithms to determine key distances and/or boundary layers for surgery.
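
The patent does not spell out those algorithms. The sketch below, with assumed hot-spot coordinates, only illustrates the geometry involved: pairwise key distances plus one very simple boundary layer around the landmarks.

```python
import numpy as np

# Illustrative hot-spot coordinates (mm) after co-registration; not patent data.
hot_spots = np.array([[0.0, 0.0, 0.0],
                      [42.0, 5.0, 1.0],
                      [40.0, 60.0, 3.0]])

# Key distances between every pair of landmarks.
pairwise = np.linalg.norm(hot_spots[:, None, :] - hot_spots[None, :, :], axis=-1)

# One simple boundary layer: a sphere around the landmarks with a margin.
center = hot_spots.mean(axis=0)
radius = np.linalg.norm(hot_spots - center, axis=1).max() + 10.0  # 10 mm margin, assumed
```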



FIGS. 6A and 6B are illustrations of an exemplary image 400 created, displayed, and/or utilized by system 100. In the exemplary image 400, a bone scan created with the use of a gamma camera is shown with multiple hot spots 402, indicative of the location of the radiopharmaceutical. In some embodiments, a user (e.g., surgeon, physician's assistant, nurse) creates a cut within tissue (e.g., bone) that would promote osteoblast formation and hot spot formation, as the osteoblasts will absorb the tracer. In such an embodiment, the tracer can be injected and/or placed directly on the cut such that the tracer remains substantially in place, avoiding systemic introduction of the tracer. In an embodiment, different tracers may be absorbed into different tissues such as thyroid, liver, kidney, and the like. The system 100 is also capable of creating, displaying, and/or utilizing three-dimensional images using ultrasound, radialized isometry, and/or anterior, posterior, and circumferential three-dimensional views.


In one embodiment, as shown in FIG. 7, in addition to, and/or in substitution of, creating tissue cuts, a user places one or more tissue markers 420 in discrete locations within the body. The tissue marker 420 can be injected, filled, or formed with a tracer to provide location information (e.g., via a thermogram). In one embodiment, the marker 420 is a scaffold that is formed to adhere to the contours of body tissue. The scaffold 420 may be formed to adhere directly to the tissue and/or be affixed with a fixation substance (e.g., surgical adhesive, bio-adhesive, glue, fasteners, ultrasonic welding, other tissue reconstruction repair approaches). Alternatively, tracers can be compounded with an agent (e.g., PLLA, collagen) and positioned on the tissue with or without the use of a fixation substance. It should be noted that the markers can be biodegradable such that the tissue marker can remain in the body after the surgical procedure. Additionally, the markers can be fabricated from a non-biodegradable material. In some embodiments, the markers include a wireless transmitter (e.g., RFID tag, Bluetooth) that provides, at minimum, location information for inventory and/or complication risk assessment. Additionally, the markers can be sensors in the tissue.


In one embodiment, sensors 422 are positioned within a procedure site. The sensors 422 are configured to detect and transmit non-line-of-sight surgical data to the surgeon, through system 100. Although the sensors are configured to communicate with system 100 in a wireless fashion, the sensors 422 can electrically couple to system 100 to communicate directly over a transmission line (e.g., fiber or metallic cable). In one embodiment, the sensors 422 would act as Geiger counters and detect tracers within a particular location. In an embodiment, sensors 422 are powered by capacitors. In another embodiment, sensors 422 are powered by body flow across cell membranes (i.e., turning tissue into a battery) by harvesting electrical energy through the membranes via thermal application. It should be noted that the body flow could be accomplished through synthetic tissue. In an embodiment, sensors 422 measure micro-cellular electrical gradients inside the cell. In some embodiments, sensors 422 are configured to detect force to provide feedback as to the forces exerted on and around tissue. In some embodiments, sensors 422 measure electrical pulses emitted by the body (e.g., nerves) during surgery. In such embodiments, sensors 422 can detect pulses by EEG, EMG, micro electrode arrays, or other electrophysiological recording methods. In one embodiment, sensors 422 are provided such that somatosensory evoked potentials are detected during a procedure. It should be noted that sensors 422 can be any sensor that monitors environmental factors including, but not limited to, force, acoustic, vibratory, density, pressure, optical, chemical, and electric. In some embodiments, sensors 422 are optical and/or biodegradable. Sensors 422 may also be positioned on the skin of a patient or implantable in the body of the patient. In some embodiments, sensors 422 partially degrade. For example, sensors 422 could include a biodegradable coating or slowly degrade over a specific time when exposed to water. The degradation may be hastened by heat, pH, and the like. In some embodiments, sensors 422 are rechargeable, such as through electromagnetic, optical, laser, ultrasound, vibratory, and/or thermal techniques.


In one embodiment, sensors 422 are configured to measure distances of particular tools. For example, as shown in FIG. 7, an end effector 114 (e.g., ultrasonic scalpel) may have an emitter 424 positioned at a fixed location. As end effector 114 is inserted into the body, system 100 can monitor the distance of the tool relative to sensor 422. In the exemplary embodiment, the sensor is placed on at least a portion of the spinal cord 430. As the tool progresses internally towards the spinal cord 430, system 100 and/or processor 122 would lock out (prevent) the end effector from functioning and/or progressing to prevent disruption of spinal cord 430. In some embodiments, sensors detect tools directly without the need for an emitter. For example, a sensor may detect the vibratory or acoustic energy emitted by a tool to determine the location of the tool relative to the sensor.
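
A minimal sketch of such a lockout check follows; the safety margin and the distance-reading interface are assumptions, since the patent does not define them.

```python
SAFETY_MARGIN_MM = 5.0   # illustrative threshold, not from the patent

def effector_permitted(distance_to_sensor_mm: float) -> bool:
    """Permit operation only while the tool stays outside the safety
    margin around the sensor (e.g., one placed on the spinal cord)."""
    return distance_to_sensor_mm > SAFETY_MARGIN_MM

# Example: at 4.2 mm from the sensor the effector would be locked out.
assert not effector_permitted(4.2)
```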


As noted above, system 100 can be calibrated such that the markers 420 and/or sensors 422 provide boundaries such that disturbances to portions of the body not part of the surgical procedure are minimized. Additionally, the markers 420 and/or sensors 422 can provide a secondary layer of protection should a fault occur. In such an embodiment, the markers and/or sensors can require the software of system 100 to perform a first check of calibration and confirm navigation throughout the procedure.


It should be noted that while the methods and systems shown herein are depicted for arthroplasty, the methods and systems described herein can be utilized in any part of the body for any surgical procedure. For example, a tracer could be absorbed into an organ (e.g., heart, gallbladder, etc.), which enables system 100 to create a boundary for procedural component 112 and/or end effector 114, such that the end effector does not work outside of the boundaries created using information obtained from markers 420, sensors 422, and/or scanners 302. Accordingly, as the soft tissue moves or is moved, system 100 can maintain an accurate location of the surgical site. Moreover, the methods and systems described herein can be utilized for robotics and/or haptics guided with preoperative imaging via CT, MRI, ultrasound techniques, and the like. The methods and systems described herein may also be utilized standalone intraoperatively.


In the exemplary embodiment, system 100 receives instructions from software that enables processor 122 to determine boundaries within a surgical site and prevent (e.g., lock out or disable) components 112 and/or end effectors 114 from operating outside of the determined boundaries. To this end, if a component 112 and/or end effector 114 is prevented from continued operation due to a boundary, system 100 and/or processor 122 is configured to determine whether to move the component 112 and/or end effector 114, or a portion (or all) of table 220, to enable further operation of the component 112 and/or end effector 114. In one embodiment, the determination of what object to move is provided to a surgeon to enable manual intervention.


Alternatively, system 100 provides signals to the appropriate object (e.g., effector 114 or table 220) to enable the procedure to continue. The signals provided to the table 220 can be any signals that cause a table to re-position a patient as needed, including but not limited to, manipulating, rotating, torqueing, twisting, distracting, flexing, extending, elevating, descending, or inflating, or utilizing a retractor (i.e., internal or external) or bolster that aids in bringing objects into/out of a surgical site. Such a repositioning of objects enables a surgeon to optimize portions of a body relative to a portal or incision. For example, system 100 is configured to provide instructions to manipulate vessel blood flow, organ position, or the position of any other body part. The signals provided to the table 220 can also be any signals that cause a table to move relative to a body part. The body part could also move relative to the table 220 and/or relative to other systems (e.g., 120, 122, 124, and 220) coupled to system 100. Moreover, the table and the body part could both move together synchronously or independently. In addition to providing instructions to manipulate table 220, system 100 can also provide instructions to other systems (e.g., 120, 122, 124, and 220) coupled to system 100. For example, when system 100 manipulates a change in table 220, system 100 would also transmit a signal to adjust lighting or visualization to re-position to the new location of the surgical site. The signals described herein may enable control and/or activation of aspects of system 100 through voice commands, remote commands, imaging controls, and/or robotic activation. Aspects of system 100 may also be controlled with a robotic mechanism with or without navigation or visualization.
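
Putting the boundary check and the table command together, a minimal control-loop sketch might look like the following; the spherical boundary shape and the effector/table methods are hypothetical stand-ins, since the patent does not define those interfaces.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class SphereBoundary:
    center: np.ndarray   # boundary is tied to the surgical site
    radius: float

    def contains(self, point: np.ndarray) -> bool:
        return float(np.linalg.norm(point - self.center)) <= self.radius

def enforce_boundary(effector_pos: np.ndarray, boundary: SphereBoundary,
                     effector, table) -> None:
    """Permit operation inside the boundary; otherwise disable the effector
    and command a table adjustment that brings the site (and its boundary)
    back under the effector. `effector` and `table` are hypothetical
    hardware interfaces assumed for illustration."""
    if boundary.contains(effector_pos):
        effector.enable()
    else:
        effector.disable()
        offset = effector_pos - boundary.center
        table.translate(offset)                     # re-position the patient
        boundary.center = boundary.center + offset  # boundary moves with the site
```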


Additionally, if processor 122 determines that a respiratory change of the patient would aid in the effectiveness of the procedure, a signal can be generated and transmitted to the anesthesia system and/or anesthetist to alter the anesthetic (e.g., amount or type) given. In an embodiment, processor 122 generates and transmits a signal to direct anesthesia to, for example, increase or decrease inflation of the lungs, or change the blood pressure, heart rate, lung volume, and the like. Moreover, processor 122 is capable of controlling electrical currents for transcutaneous electrical nerve stimulation (TENS) to stimulate muscular activity and/or vascular activity and control position. For example, electrothermal vibrations could be used to stimulate tissue and/or electrical soma.



FIG. 8 is a cut-away perspective view of an exemplary temperature implant 500 for use with system 100 shown in FIG. 1. Implant 500 includes a heating portion 502 that provides heating to tissue in direct contact with implant 500. Implant 500 also includes a cooling portion 504 that surrounds and is adjacent to heating portion 502 and prevents heating of tissue that is not in direct contact with heating portion 502. Heating portion 502 includes a surface 506 that is configured to increase in temperature as a result of a heating element 510 positioned within implant 500. Heating element 510 can be any element that delivers heat to surface 506 including, but not limited to, metal, ceramic, and composite. In one embodiment, heat is produced from the use of vibratory (e.g., ultrasonic) energy. In the exemplary embodiment, surface 506 is fabricated to include a polymer but can include any material that enables heat transfer including, but not limited to, metals and polymer composites. In some embodiments, heating portion 502 includes a magnetic surface. Heating element 510 can also be any element that delivers electrical charges to an implant surface, cell membrane, tumor, or infection, for example through microscopic membranes, cell membranes, nerve fibers, mitochondria, and/or intracellular structures.


To provide cooling to implant 500, cooling portion 504 includes a heat exchanger to reduce heating of tissue that is not in direct contact with surface 506. In one embodiment, implant 500 is fabricated to be modular. In such an embodiment, implant 500 is fabricated to have multiple removable sections such as a base section 520 and first modular section 522. Modular section(s) 522 can be added or removed to increase or reduce the size of the implant 500 thus providing for a variable sized implant.


In the exemplary embodiment, heating element 510 and/or cooling portion 504 is powered by a controller 530 and power source 532. In one embodiment, power source 532 is a battery. Alternatively, power source 532 is a power converter that converts power (e.g., A/C or D/C current) from a power source (e.g., outlet) into electrical signals for use by the heating element 510 and/or heat exchanger 504. Implant 500 also includes sensors 534 configured to monitor environmental factors within the operating site. In one embodiment, a sensor 534 is a temperature sensor configured to monitor the temperature of implant 500 and/or the tissue in contact with surface 506 and/or the tissue adjacent to implant 500. Additionally, the sensors 534 can be any sensor that monitors environmental factors including, but not limited to, force, acoustic, vibratory, density, pressure, optical, chemical, and electric. In some embodiments, implant 500 includes a timer (not shown) coupled to controller 530 that enables the controller to selectively provide power to heating element 510 at predetermined times or intervals. In an embodiment, sensors 534 are internal sensors.
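
As an illustrative sketch only, controller 530 could combine the temperature sensor and timer in a simple bang-bang loop; the set point, polling interval, and hardware callables below are assumptions rather than values from the patent.

```python
import time

SET_POINT_C = 41.0   # illustrative target surface temperature
INTERVAL_S = 1.0     # illustrative polling interval

def regulate(read_temp_c, heater_on, heater_off, duration_s: float) -> None:
    """Hold the surface near the set point for a fixed heating period,
    then shut the heating element off."""
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        if read_temp_c() < SET_POINT_C:
            heater_on()
        else:
            heater_off()
        time.sleep(INTERVAL_S)
    heater_off()
```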


Sensors 534 are coupled to a communication interface 540 coupled to processor 542 and/or controller 530. Communication interface 540 communicates with system 100, shown in FIG. 1. In addition to providing operating site information, communication interface 540 also receives instructions, remotely, for powering and adjusting parameters (e.g., temperature) of implant 500. To communicate with system 100, communication interface 540 may include, for example, a wired network adapter, a wireless network adapter, and/or a mobile telecommunications adapter. Moreover, communication interface 540 may communicate via nerve transport, cellular membrane transport, and/or cellular signals.


In use, implant 500 is placed on tissue. In some embodiments, implant 500 heats tissue for a predetermined amount of time and is then removed. After removal, a thermogram scan can be taken to provide landmark information to system 100. In one embodiment, implant 500 remains positioned within the body to allow for selective heating of tissue throughout the procedure. The resulting thermogram images can be co-registered with other images (e.g., CT, MRI, X-Ray, Gamma) taken of the patient to provide landmark information to system 100 and/or the surgeon. In some embodiments, the thermogram can be utilized as an alternative to the radioisotopes described above. The advantage of the use of implant 500 and the resulting thermogram images is that landmark information can be provided and registered without direct line of sight, enabling the implant 500 to be positioned on the side, back, or in the padding behind the arthroplasty, where a line-of-sight requirement would impede the surgical procedure. In some embodiments, implant 500 uses electrical, thermal, and/or magnetic techniques to stimulate muscle, vessels, nerves, and the like to contract over movement patterns. Landmark information can be detected by creating boundary levels as well as navigation. For example, a stimulator would move to detect where the movement is and then create boundary layers or guidance direction to a specific site.


In one embodiment, system 100 includes a scanner 302 in the form of a vessel determination device. The scanner is configured to locate vessels and flow rates in the body. The scanner 302 can include any flow determination technology for locating vessels including, but not limited to, ultrasonic flow meters and/or laser Doppler velocimeters. Once the vessels are located, positional information can be transmitted to system 100 to be input into images and navigation calibration. In one embodiment, system 100 determines the type of vessel (e.g., artery, vein, capillary) based on the size and/or flow rates of the located vessels. Vessels that are within a procedure site can be used as boundaries such that procedural component 112 and/or end effector 114 will be disabled when approaching a vessel to maintain patient safety during the procedure.



FIG. 9A is an exemplary image 600 created by processor 122, shown in FIG. 1, using information received from scanners 302 for display on interface 106. FIG. 9B is a schematic representation of image 600. Image 600 is created by co-registering multiple images created from scanners 302. As can be seen in the image, a knee joint having a femur 604 and tibia 606 is created from an MRI. Image 600 also displays a vessel 602 positioned behind the femur 604 and tibia 606 from flow determination technology. Radioactive tracer information is shown by hot spots 608 that were derived from a gamma camera or PET, and sensors 422 or markers 610 can be located as well. Image 600 also includes thermography information 612 from the patient. In one embodiment, image 600 includes cutting guides 614 that display portions of tissue that will be removed by the surgeon. It should be noted that any and all of the landmarks, sensors, or markers that can be identified by system 100 can serve as boundaries or limitations on procedural components 112 and/or end effectors 114. Additionally, any of the imaging and/or landmark determination techniques described herein can be combined such that processor 122 can co-register the information to produce images and instructions for system 100.


In addition to image 600, processor and/or system 100 can also generate a composite image of the surgical site in the form of an animation or 3-D rendering based on the information shown in FIG. 9. Such an image would provide a reconstruction of the surgical site enabling the surgeon to rotate through the surgical site and visualize the entire site from any desired angle. As noted above, the images produced would also enable a surgeon to receive a holistic view of a site. For example, while a portion of image 600 displays a portion of the knee, a surgeon's view would provide views of the pelvis, hip, foot, and ankle for relative information. Such information is vital to determine how objects in the surgical site (e.g., knee movement) affect remote portions of the body (e.g., hip, spine, and ankle). In addition to optical changes, the images produced may also enable detection of electrical and/or motion patterns.


In the exemplary embodiment, the images produced by system 100 also provide a surgeon the ability to locate anatomical structures (e.g., bones, organs, arteries, vessels, cartilage) in a surgical site and denote those structures by color coding or labeling. Such structures can also be removed or added to a view, via input interface 104, during a procedure based on the surgeon's needs. Images produced by system 100 may also be utilized for external ablation systems that utilize ultrasound, thermal, and like techniques to refine exact tissue location for ablation of tumors, treatment of infection with antibiotics to enhance growth, neurologic tissue, rewire neurons, and/or repair complex neurological bundles.


To decrease the trauma (e.g., pain, swelling, etc.) resulting from a surgical procedure, surgical markers and/or sensors can be positioned at points of interest for a surgeon. The surgical markers are substances including a dye that fluoresces when exposed to ultraviolet (UV) light. As an alternative to directly placing the surgical markers on tissue, tissue closure devices (e.g., sutures, staples) can be impregnated with the dye such that it is absorbed or transferred to the tissue that is in direct contact with the closure device. For example, in a revision arthroplasty procedure in which an infected joint replacement component is being extracted and replaced, the surgeon can position surgical markers on the incision or open tissue after he/she extracts the joint replacement component and before closing the surgical site to allow the body to eliminate/fight the present infection. When the patient returns for the secondary procedure (e.g., placing a drug locally into the body), ultraviolet light can be used to locate former incision locations. Using the UV-indicated locations, a surgeon can utilize the former incision to open the surgical site. Utilizing a former incision can greatly reduce pain, inflammation, and trauma to the patient, as scar tissue generally forms at locations of former incisions and cutting through it has been found to be less traumatic to a patient than disturbances (i.e., cuts) to muscle tissue.


In addition to the images created in system 100 from imaging devices 110, system 100 can include a software filter for filtering out material and/or objects from an image that can restrict the line of sight of a user (e.g., physician/surgeon, driver, machine operator, etc.). For example, the filters described herein would enhance the surgeon's ability to visualize a surgical site during a procedure (e.g., arthroscopy) by filtering opaque bleeds or blood flow and allowing the surgeon to determine the location or source of the bleed. In another exemplary embodiment, the filters described herein would enhance a driver's ability to visualize upcoming stretches of roadway by filtering fog, rain, and the like and allowing the driver to determine if hazards exist on the upcoming stretches of roadway. When video is digitized, each frame is represented by a two-dimensional array. Each location in the array represents a pixel, and each pixel contains a set of values representing color and other attributes. The filters described herein manipulate these pixel values. For example, the filters described herein may change pixels and/or remove pixels with or without electrical charges or motion changes. It should be noted that these filters could be applied to any video format and coding method including, but not limited to, PAL, NTSC, MP4, and AVI.


In one embodiment, a first filter is utilized within system 100, and the first filter is configured to determine when blood exists in the saline solution of a surgical site during surgery. This is accomplished by monitoring a sequence of frames for a range of target colors that move in a particular pattern. In one embodiment, the first filter lowers the color values of blood to match the surrounding colors. Alternatively, the first filter lowers the intensity of the color of blood (e.g., red) to enable the surgeon to better visualize the intended area. In some embodiments, the first filter lowers the color values as well as lowering the intensity of color. For example, one could see the reflective coefficient (e.g., albedo), albedo with different light sources, and/or different movement creating the changes. In some embodiments, the first filter accomplishes the determination with vibratory changes, acoustic changes, moving cells that change, and/or cells that move or have specific electrical charges. The first filter could remove these pixels in the tissue/bone. In some embodiments, the target (e.g., tissue) may be magnetized. In some embodiments, the filter bounces, changes, and/or reflects light. For example, the filter could enable a user to see around corners with reflective light by using an albedo or reflective coefficient.
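
A minimal sketch of this kind of color suppression on an RGB frame follows; the "blood red" test and damping factor are illustrative assumptions, not values from the patent.

```python
import numpy as np

def suppress_blood(frame: np.ndarray, strength: float = 0.6) -> np.ndarray:
    """Lower the intensity of strongly red pixels in an H x W x 3 uint8 frame."""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    blood = (r > 120) & (r > g + 40) & (r > b + 40)   # crude "blood red" test
    out = frame.astype(float)
    out[blood, 0] *= (1.0 - strength)                 # damp the red channel only
    return out.clip(0, 255).astype(np.uint8)
```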


In another embodiment, a second filter is utilized by system 100 to provide images to a user. The second filter is configured to adjust particular colors in an image by removing pixels in each frame that meet a predetermined criterion (e.g., blood flow). The pixels in the buffer that would be displayed on a standard image without the use of the second filter are then used in place of the obscured pixels. This gives the second filter hysteresis, which allows it to use previous information to help recreate what is behind an object (e.g., blood, synovium, tissue fats, debris, bone fragments) that could obscure the view of other objects of interest (e.g., soft tissue, cartilage, muscle, bone). It should be noted that the first and second filters could be used in combination to provide an accurate rendering of a surgical site enabling a surgeon to selectively eliminate unnecessary objects from their view. For example, multiple filters could be used to change orange pixels to red pixels. In some embodiments, one filter is a software filter and another is a mechanical filter (e.g., film, prism, Fresnel lens, UV lens).
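
A minimal sketch of the hysteresis idea follows: obscured pixels are replaced from a buffer that holds the most recent unobscured value at each location. The obscured-pixel mask is assumed to come from a criterion like the one described above.

```python
import numpy as np

class HysteresisFilter:
    """Replace obscured pixels with the last clear value seen at that location."""

    def __init__(self):
        self.clear_buffer = None   # last known unobscured pixels

    def apply(self, frame: np.ndarray, obscured: np.ndarray) -> np.ndarray:
        """frame: H x W x 3 image; obscured: H x W boolean mask to hide."""
        if self.clear_buffer is None:
            self.clear_buffer = frame.copy()
        out = frame.copy()
        out[obscured] = self.clear_buffer[obscured]       # reuse the prior view
        self.clear_buffer[~obscured] = frame[~obscured]   # refresh clear pixels
        return out
```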


In some embodiments, the second filter is utilized by generating a baseline image and comparing the baseline image to new images taken at predetermined time increments (e.g., 0.1, 0.5, 1, 2, or 30 seconds or minutes). In such embodiments, pixels could be compared to determine flow patterns of fluid or other moving objects. Additionally, system 100 is configured to enhance objects in an image by receiving imaging information, comparing it to known imaging information (e.g., a pre-operative scan), determining the differential, and adding necessary data to the received imaging information.
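
A sketch of the baseline comparison follows, under the assumption that a simple per-pixel change threshold is an acceptable stand-in for the patent's unspecified criterion.

```python
import numpy as np

def motion_mask(baseline: np.ndarray, frame: np.ndarray, threshold: int = 25) -> np.ndarray:
    """Boolean H x W mask of pixels in an H x W x 3 frame whose value changed
    by more than `threshold` relative to the stored baseline image."""
    diff = np.abs(frame.astype(int) - baseline.astype(int)).max(axis=-1)
    return diff > threshold
```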



FIG. 9 is an exemplary flowchart 700 of a method of visualization for use with the system 100 shown in FIG. 1. In the exemplary embodiment, system 100 receives an image of a site. In some embodiments, an image is received by computing device 102 from an imaging device 110 and/or input interface 104. In some embodiments, images are received by computing device 102 from a remote location through communications interface 108.



FIGS. 10-13 are images utilized with the method shown in FIG. 9.


In some embodiments, system 100 utilizes filters to determine the source of the blood flow. Once system 100 and/or processor 122 determines the location or source of a blood flow, system 100 can indicate, on the image, the source of the blood. Additionally, system 100 can indicate a particular concentration of blood in the image at areas where the blood has a stronger concentration. In some embodiments, system 100 can determine the velocity of the blood flow in one or more locations, which may indicate the source of the blood flow. Any of the determinations described above can be indicated on an image, by system 100 and/or processor 122, with indicia, non-limiting examples of which include a circle, a marker, or color differentiation, so that the surgeon can easily locate the area of blood flow to allow for electrocautery, to locate the source of the bleed, and/or to apply pressure for coagulation.
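
As one hypothetical illustration of such indicia, the sketch below overlays a circular marker at an estimated bleed source; the drawing routine and its parameters are assumptions, not part of the specification.

    # Illustrative sketch: draw a circle indicium at an estimated source
    # location (row, col) so the user can locate the area of blood flow.
    import numpy as np

    def draw_circle(frame: np.ndarray, center: tuple, radius: int,
                    color=(0, 255, 0)) -> np.ndarray:
        """Overlay a roughly one-pixel-wide circle at `center` on a copy of frame."""
        out = frame.copy()
        rows, cols = np.ogrid[:frame.shape[0], :frame.shape[1]]
        dist = np.sqrt((rows - center[0]) ** 2 + (cols - center[1]) ** 2)
        out[np.abs(dist - radius) < 1.0] = color
        return out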


In one embodiment, a diagnostic ultrasound or B-Mode ultrasound head is embedded into an imaging device 110 (e.g., camera) to enable overlaying the information from the diagnostic ultrasound with the video image in real time. This provides the surgeon a 3-dimensional view of the surgical site as well as the direct camera view. Such a combination is useful in removing blood or other debris from the surgeon's view with filters. In some embodiments, a second portal or external ultrasound is utilized in conjunction with the camera to enable these filters as well. If either internal or external ultrasound is used, it is possible to use Doppler information to better detect blood for filtering. The filters would work as previously mentioned, either removing the color or using information from previous frames. The Doppler information is useful in a more precise determination of the location of the bleed. The filter should also monitor for a dramatic increase in the blood flow. If system 100 determines an increase in blood flow has occurred, an alert is transmitted to the user that a bleed, which might require attention, is occurring and being filtered out. In one embodiment, the alert is provided as an overlay warning the surgeon of the increase in blood flow. Alternatively, system 100 can be configured to turn off filters when the amount of blood in the field of view exceeds a predetermined threshold. In some embodiments, system 100 is configured to filter out objects that produce a reflection at or above a predetermined threshold.
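
The monitoring logic described above can be sketched as follows; the surge and shutoff thresholds are assumed example values, as the specification does not prescribe them.

    # Sketch of the alert/shutoff logic: track the fraction of the field of
    # view classified as blood, alert on a sharp frame-to-frame increase, and
    # disable filtering when blood exceeds a safety threshold.
    def update_blood_state(blood_fraction: float, previous_fraction: float,
                           surge_delta: float = 0.10, off_threshold: float = 0.50):
        """Return (alert, filters_enabled) for the current frame."""
        alert = (blood_fraction - previous_fraction) > surge_delta
        filters_enabled = blood_fraction < off_threshold
        return alert, filters_enabled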


In one embodiment, system 100 creates a charged particle filter used for filtering out particular objects. In such an embodiment, system 100 projects or transmits a number of charged particles (e.g., ions in a gas) at a site that attach to a particular object. The charge is selected by determining an object to be filtered out and charging particles to a predetermined setting that enables the particles to attach or couple to the object. Once the particles attach to the object, system 100 detects the location of the object and filters out the corresponding information from the images generated.


In some embodiments, physical optical filters are utilized on or with lighting and visualization systems to aid in visualization. For example, physical filters could be applied to the lights to substantially block a predetermined color (e.g., red) from appearing in the imaging. Alternatively, physical filters can be utilized to substantially increase the brightness or coloration of a particular color in an image. As such, the physical filters can be applied to block out additional unwanted properties (e.g., UV, sun photons).


In the exemplary embodiment, system 100 is configured to enable a surgeon and/or user to switch between the filters, combine particular filters, remove particular filters, and turn the filters on and off altogether. While the software filters discussed above were described in the context of blood during surgery, these techniques could also be used to eliminate other unwanted elements of an image including, but not limited to, smoke that is released during electrocautery, or moving objects and debris in the view of the camera. The visualization system described herein is valuable because system 100 enables a surgeon to operate or perform a surgical procedure in a substantially dark room, reducing heat from the lights, which can be detrimental during a procedure and affect tissue healing. Additionally, the system described herein eliminates the necessity for water or carbon dioxide air during an endoscopic procedure.
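
A minimal sketch of such user-controlled filter selection follows, assuming filters are represented as callables applied in sequence; the structure and names are illustrative only.

    # Sketch of filter switching/combining: filters are kept in an ordered
    # list the user can toggle, and active filters are applied in sequence.
    class FilterPipeline:
        def __init__(self):
            self.filters = []  # list of [name, callable, enabled]

        def add(self, name, fn):
            self.filters.append([name, fn, True])

        def toggle(self, name, enabled):
            for entry in self.filters:
                if entry[0] == name:
                    entry[2] = enabled

        def apply(self, frame):
            for _name, fn, enabled in self.filters:
                if enabled:
                    frame = fn(frame)
            return frame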


In some embodiments, imaging device 110 is an ingestible-type camera for examination of the internal systems of a body (e.g., abdomen). Endoscopic applications, colonoscopy, or microsurgery could be used to repair individual nerve fibers or vascular fibers. Aspects of system 100 could be used for cardiac ablation to localize exactly where irregular cardiac rhythms are coming from.



FIG. 10 is a data flow diagram of an exemplary two-input filtering method 700 for use with the system 100 shown in FIG. 1. In the method 700, computing device 102 creates buffered scenes 702 from current scenes captured by a primary camera (e.g., imaging devices 110). The computing device 102 uses scenes captured by a secondary camera (e.g., imaging devices 110) to create a background model 704. The secondary camera may capture images using CT, MRI, infrared, ultrasound, and/or like techniques. The computing device 102 subtracts the background model 704 from the current scene captured by the primary camera utilizing a threshold (T1) to create a foreground mask 706. Moreover, the computing device 102 takes the complement of the foreground mask to generate a foreground mask complement 708 (i.e., a background mask). The computing device 102 subtracts the foreground mask 706 from the current scene captured by the primary camera to generate a background 710. The computing device 102 also generates a foreground 712 by subtracting the foreground mask complement 708 from the current scene captured by the primary camera. Additionally, the computing device 102 subtracts the foreground mask complement 708 from one or more buffered scenes 702 to generate a buffered background 714. The computing device 102 generates a weighted average or threshold (T2) of the foreground 712 and the buffered background 714. The computing device 102 then generates an output frame 716 by determining the absolute difference between the background 710 and the weighted average.
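
The data flow of the method 700 can be sketched as follows, treating each mask subtraction as element-wise selection. The threshold T1 and weight T2 are assumed example values, and the inputs are assumed to be float arrays of equal shape; this is a sketch of the described flow, not a definitive implementation.

    # Sketch of the FIG. 10 data flow with NumPy.
    import numpy as np

    def method_700(primary, background_model, buffered, t1=25.0, t2=0.5):
        """primary, background_model, buffered: (H, W, 3) float32 arrays."""
        # Foreground mask 706: where the primary scene departs from the
        # background model by more than T1 in any channel.
        fg_mask = np.abs(primary - background_model).max(axis=-1) > t1
        bg_mask = ~fg_mask  # foreground mask complement 708

        background = np.where(fg_mask[..., None], 0, primary)      # background 710
        foreground = np.where(bg_mask[..., None], 0, primary)      # foreground 712
        buffered_bg = np.where(bg_mask[..., None], 0, buffered)    # buffered background 714

        # Weighted average (T2) of the foreground and the buffered background.
        blended = t2 * foreground + (1 - t2) * buffered_bg
        # Output frame 716: absolute difference of the background and the blend.
        return np.abs(background - blended)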



FIG. 11 is a data flow diagram of an exemplary two-input filtering method 800 for use with system 100 shown in FIG. 1. In the method 800, the computing device 102 creates buffered primary scenes 802 from current scenes captured by the primary camera and buffered secondary scenes 804 from current scenes captured by the secondary camera. The computing device 102 uses the buffered primary scenes 802 and buffered secondary scenes 804 to find frames where the object in the foreground obstructing the view is not present. In this embodiment, the computing device 102 may create the background model 704 from any combination of the primary input, secondary input, output frame 716, primary buffer 802, and secondary buffer 804. As described above, the computing device 102 utilizes the background model 704 to create a foreground mask 706 from the primary camera. In addition to generating the background 710 as described above, the computing device 102 applies the background mask 708 to the primary camera input to generate a primary foreground image 806. The computing device 102 also applies the background mask 708 to the secondary camera input to generate a secondary foreground image 812. To create a buffered primary foreground image 808, the computing device 102 applies the background mask 708 to images selected from the buffered primary scenes 802. Likewise, the computing device 102 generates a buffered secondary foreground image 810 by applying the background mask 708 to images selected from the buffered secondary scenes 804. The computing device 102 takes a weighted average of the primary foreground image 806 over time T2, the buffered primary foreground image 808 over time T3, the buffered secondary foreground image 810 over time T4, and the secondary foreground image 812 over time T5. The final output frame 716 is generated by the computing device 102 taking the absolute difference between the weighted average and the background image 710.
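
Under the same assumptions as the method 700 sketch, the final combination step of the method 800 can be sketched as a weighted average of the four foreground images; the weights standing in for T2 through T5 are illustrative values only.

    # Sketch of the FIG. 11 combination step.
    import numpy as np

    def method_800(primary_fg, buffered_primary_fg, buffered_secondary_fg,
                   secondary_fg, background, w=(0.4, 0.2, 0.1, 0.3)):
        """All image arguments are (H, W, 3) float32 arrays; w sums to 1."""
        blended = (w[0] * primary_fg + w[1] * buffered_primary_fg +
                   w[2] * buffered_secondary_fg + w[3] * secondary_fg)
        return np.abs(blended - background)  # final output frame 716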



FIG. 12 is a data flow diagram of an exemplary color shift filtering method 900 for use with system 100 shown in FIG. 1. In the method 900, the computing device 102 creates a primary color shift image 902 from the primary camera input and a secondary color shift image 904 from the secondary camera input. In an embodiment, the color shifts are dependent upon one or more wavelengths of interest. The computing device 102 applies the background mask 708 to the color shifted primary image 902 to create a color shift primary foreground image 908 and applies the background mask 708 to the color shifted secondary image 904 to create a color shift secondary foreground image 910. The computing device 102 takes a weighted average of the primary foreground image 806 (T2), the color shift primary foreground image 908 (T3), the color shift secondary foreground image 910 (T4), and the secondary foreground image 812 (T5). The final output frame 716 is generated by the computing device 102 taking the absolute difference between the weighted average and the background image 710.
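
The wavelength-dependent color shift is not specified in detail. As one hypothetical stand-in, the sketch below models the shift as a rotation of the RGB channels; masking and weighted averaging would then proceed as in the method 800 sketch above.

    # Hypothetical stand-in for the color shift of method 900.
    import numpy as np

    def color_shift(frame: np.ndarray, shift: int = 1) -> np.ndarray:
        """Rotate the RGB channels by `shift` positions to emphasize a
        wavelength of interest (an assumed model of the shift)."""
        return np.roll(frame, shift, axis=-1)

    # The shifted primary and secondary images (902, 904) would then be
    # masked with the background mask 708 and combined by weighted average.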


The system and methods described above could be used in tumor, oncology, endoscopic, arthroscopic, and/or tissue ablation procedures such as ultrasound ablation. For example, the system and methods could be used for guidance, direction, and/or location tasks during surgical procedures. In some embodiments, the system and methods could operate on a macroscopic level, and in some embodiments, the system and methods could operate on a microscopic level, looking for specific cells, movement patterns, visualization patterns, and/or electromagnetic patterns. In an embodiment, the system and methods described above could be used to detect cells such as tumor cells or infections, which could be removed with the navigation and visualization provided by aspects of the system and methods. For example, a patient could ingest or intravenously receive a marker that would be absorbed by abnormal cells such as tumor or infectious cells, and then visualization aspects of the system and methods could be utilized to remove pixels or enhance pixels with types of light frequency, vibrations, and the like. The tumor or infectious cells could be removed either by external tissue ablation, such as ultrasonic or thermal guided ablation, or internally. Moreover, the system could guide surgeons during removal of specific cells both on a macroscopic and microscopic level, for example, cells of amyloid deposits associated with Alzheimer's disease and/or cells that create hypertrophic tissue or infectious tissue.


While the system and methods described above have been described in a non-limiting medical setting, it should also be noted that the systems and methods described above (e.g., software filters) could also be used in non-medical applications, such as optimizing a heads-up display on a car when there is fog or rain. In such an embodiment, a vehicle system can be configured to utilize the filters described above to filter objects (e.g., fog, mist, sun, pollution, smoke, or rain) and provide a clear image to vehicle passengers, either in place of a passenger's view or as an addition to a passenger's view. Such a system is also configured to detect objects in the path of the vehicle and alert passengers of the detected objects.


The embodiments described herein enable non-line-of-sight structures and/or landmarks in the body to be observed before, during, and/or after a surgical procedure. As compared to at least some known navigational systems that require objects to be affixed to the body through invasive measures, the systems and methods described herein are capable of providing information to a robotic system to enable calibration and/or boundary layer configuration to assist in creating a more efficient and safe surgical procedure. The methods and systems described herein provide surgeons the ability to calibrate a robotic system and perform surgical procedures without the direct line of sight needed in known systems. The methods and systems described herein could vibrate the visual fields, as particles would have different vibratory frequencies based on their densities, thickness, or movement. The particles with variable movement patterns could then be removed. For example, this could be done through vibratory, acoustic, or electromagnetic means, external compression, internal compression, magnetic frequency, and the like. Although there may be a slight delay, any delay would not affect visualization for surgery treatment or non-surgical tasks.


The embodiments described herein may utilize executable instructions embodied in a non-transitory computer readable medium, including, without limitation, a storage device or a memory area of a computing device. Such instructions, when executed by one or more processors, cause the processor(s) to perform at least a portion of the methods described herein. As used herein, a “storage device” is a tangible article, such as a hard drive, a solid state memory device, and/or an optical disk that is operable to store data.


Although specific features of various embodiments of the invention may be shown in some drawings and not in others, this is for convenience only. In accordance with the principles of the invention, any feature of a drawing may be referenced and/or claimed in combination with any feature of any other drawing. Accordingly, while many procedures described herein relate to arthroplasty or orthopedic surgery, the methods and systems described herein can be utilized in any surgical procedure including, but not limited to, general surgery, cardiothoracic surgery, tissue ablation, ultrasound ablation, arthroscopic procedures, endoscopic procedures, cardiology and electrophysiology procedures, colon and rectal surgery, plastic surgery, anesthesia, pain management procedures, ENT procedures, gastrointestinal surgery, gynecology procedures, neurosurgery, oncology procedures, pediatric procedures, radiosurgery, reconstructive surgery, spine surgery, transplant surgery, urology procedures, and vascular surgery. Additionally, it should be noted that the systems and methods described herein could be utilized to provide encryption technology by determining known patterns and either accepting or rejecting based on a determination that known patterns have been detected.


While system 100 has been described as including a procedural component 112 and at least one end effector 114, it should be noted that system 100 can operate independently to provide visualization and/or navigation to users. For example, system 100 can be utilized in a manual surgical environment where system 100 provides surgical site information to a surgeon operating manually (i.e., without robotic assistance). Additionally, the system 100 described herein can be utilized to provide visualization and/or navigation in other non-medical and/or non-surgical applications. For example, portions of system 100 and method 700 can be installed in vehicles to provide the visualization and/or navigation needed. Portions of system 100 and method 700 can be utilized to enable a driver/passenger to "see through" objects that would limit sight. In the case of a car, truck, motorcycle, bus, or other land vehicle, system 100 and method 700 are utilized to remove fog, cloud cover, rain, sunlight, hail, mist, pollution, smoke, snow, or any other form of debris in air or fluid media that obfuscates the view, to provide a substantially clear image of the path of travel of the vehicle. Consequently, the system 100 is configured to provide the same visualization to air vehicles (e.g., plane, spaceship, rocket, balloon, unmanned aerial vehicle (UAV) (e.g., drone)) and water vehicles (e.g., boats, ships, and submarines). Additionally, portions of system 100 and method 700 can be utilized in any application reproducing video or image feeds including, but not limited to, residential and commercial surveillance systems, television production systems and equipment, telescopes, binoculars, marine applications, and satellite imagery. It should also be noted that the system and method described herein can be utilized with technologies described in U.S. patent application Ser. Nos. 14/451,562, 10/102,413, 13/559,352, and 62/275,436, each of which is hereby incorporated by reference in its entirety.


This written description uses examples to disclose various embodiments, which include the best mode, to enable any person skilled in the art to practice those embodiments, including making and using any devices or systems and performing any incorporated methods. The patentable scope is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.


A robotic system for navigation of a surgical site is provided. The robotic system includes a computing device coupled to a presentation interface, a procedural component, and a communications interface. The computing device is also coupled to a first imaging device configured to provide imaging data of a surgical site. The computing device is also coupled to a second imaging device that is configured to provide a second type of imaging data of the surgical site that is different from the imaging data of the first imaging device. The computing device is configured to co-register the imaging data to create a surgical site image for display to a surgeon on the presentation interface.

Claims
  • 1. A surgical system comprising: a surgical operating table configured to support a patient thereon, wherein the surgical operating table is controllable and adjustable; a robotic system comprising a moveable arm that is configured to support and move an end effector relative to a surgical site of the patient; and one or more controllers coupled to the surgical operating table and to the robotic system and being configured to: associate a virtual boundary with respect to the surgical site; detect that the end effector is outside of the virtual boundary; and in response to detection of the end effector being outside of the virtual boundary, command adjustment of the surgical operating table to reposition the surgical site and the virtual boundary to enable the end effector to be inside the virtual boundary.
  • 2. The surgical system of claim 1, wherein the one or more controllers are configured to prohibit operation of the end effector in response to detection of the end effector being outside of the virtual boundary.
  • 3. The surgical system of claim 1, wherein the one or more controllers are configured to permit operation of the end effector in response to detection of the end effector being inside the virtual boundary.
  • 4. The surgical system of claim 1, wherein the one or more controllers are configured to: control the robotic system to move the end effector relative to the surgical site; permit operation of the end effector in response to detection of the end effector being inside the virtual boundary; detect that the end effector has exited the virtual boundary; prohibit operation of the end effector in response to detection of the end effector exiting the virtual boundary; command adjustment of the surgical operating table to reposition the surgical site and the virtual boundary to enable the end effector to re-enter the virtual boundary; and permit operation of the end effector in response to detection of the end effector re-entering the virtual boundary.
  • 5. The surgical system of claim 1, wherein: the surgical operating table comprises one or more sub-components that are controllable and adjustable; and the one or more controllers are configured to command adjustment of the one or more sub-components of the surgical operating table to reposition the surgical site and the virtual boundary to enable the end effector to be inside the virtual boundary.
  • 6. The surgical system of claim 1, wherein, in response to detection of the end effector being outside of the virtual boundary, the one or more controllers are further configured to control the robotic system to move the end effector towards the virtual boundary.
  • 7. The surgical system of claim 1, wherein the robotic system is electrically coupled to the surgical operating table via a communications interface and wherein the one or more controllers enable bi-directional communication between the robotic system and the surgical operating table.
  • 8. The surgical system of claim 1, wherein the one or more controllers command adjustment of the surgical operating table to perform one or more of the following: rotate the surgical site; torque the surgical site; distract the surgical site; flex the surgical site; extend the surgical site; elevate the surgical site; and/or descend the surgical site.
  • 9. The surgical system of claim 1, wherein the robotic system comprises a cart with wheels that rest on a floor surface, and wherein the moveable arm is coupled to the cart.
  • 10. The surgical system of claim 1, wherein the robotic system is mounted directly to the surgical operating table.
  • 11. The surgical system of claim 1, wherein the robotic system is at least partially mounted to the patient.
  • 12. The surgical system of claim 1, wherein the surgical site is an anatomical joint of the patient comprising one or more bones, and wherein the end effector is configured to remove material from the one or more bones to prepare the one or more bones for receiving an implant.
  • 13. A method of operating a surgical system, the surgical system including a surgical operating table configured to support a patient thereon, wherein the surgical operating table is controllable and adjustable, a robotic system including a moveable arm that is configured to support and move an end effector relative to a surgical site of the patient, and one or more controllers coupled to the surgical operating table and to the robotic system, the method comprising the one or more controllers: associating a virtual boundary with respect to the surgical site; detecting that the end effector is outside of the virtual boundary; and in response to detecting that the end effector is outside of the virtual boundary, commanding adjustment of the surgical operating table to re-position the surgical site and the virtual boundary for enabling the end effector to be inside the virtual boundary.
  • 14. The method of claim 13, comprising the one or more controllers prohibiting operation of the end effector in response to detecting that the end effector is outside of the virtual boundary.
  • 15. The method of claim 13, comprising the one or more controllers permitting operation of the end effector in response to detecting the end effector being inside the virtual boundary.
  • 16. The method of claim 13, comprising the one or more controllers: controlling the robotic system for moving the end effector relative to the surgical site; permitting operation of the end effector in response to detecting the end effector being inside the virtual boundary; detecting that the end effector has exited the virtual boundary; prohibiting operation of the end effector in response to detecting the end effector exiting the virtual boundary; commanding adjustment of the surgical operating table to reposition the surgical site and the virtual boundary for enabling the end effector to re-enter the virtual boundary; and permitting operation of the end effector in response to detecting the end effector re-entering the virtual boundary.
  • 17. The method of claim 13, wherein the surgical operating table comprises one or more sub-components that are controllable and adjustable, and comprising the one or more controllers: commanding adjustment of the one or more sub-components of the surgical operating table to reposition the surgical site and the virtual boundary for enabling the end effector to be inside the virtual boundary.
  • 18. The method of claim 13, comprising, in response to detecting the end effector being outside of the virtual boundary, the one or more controllers further controlling the robotic system to move the end effector towards the virtual boundary.
  • 19. The method of claim 13, comprising the one or more controllers commanding adjustment of the surgical operating table for performing one or more of the following: rotating the surgical site; torquing the surgical site; distracting the surgical site; flexing the surgical site; extending the surgical site; elevating the surgical site; and/or descending the surgical site.
  • 20. A surgical system comprising: a surgical operating table configured to support a patient thereon, wherein the surgical operating table is controllable and adjustable; a robotic system comprising a moveable arm that is configured to support and move an end effector relative to a surgical site of the patient; and one or more controllers coupled to the surgical operating table and to the robotic system and being configured to: associate a virtual boundary with respect to the surgical site; control the robotic system to move the end effector relative to the surgical site; permit operation of the end effector in response to detection of the end effector being inside of the virtual boundary; detect that the end effector is outside of the virtual boundary; prohibit operation of the end effector in response to detection of the end effector being outside of the virtual boundary; and command adjustment of the surgical operating table to re-position the surgical site and the virtual boundary to enable the end effector to be inside the virtual boundary.
CROSS-REFERENCE TO RELATED APPLICATIONS

The subject application is a continuation of U.S. patent application Ser. No. 17/714,191, filed Apr. 6, 2022, which is a continuation of U.S. patent application Ser. No. 16/986,467, filed Aug. 6, 2020, now U.S. Pat. No. 11,317,974, which is a continuation of U.S. patent application Ser. No. 16/113,666, filed Aug. 27, 2018, now U.S. Pat. No. 10,765,484, which is a continuation of U.S. patent application Ser. No. 15/299,981, filed Oct. 21, 2016, now U.S. Pat. No. 10,058,393, which claims the benefit of and priority to U.S. Provisional Patent App. No. 62/369,821, filed Aug. 2, 2016 and U.S. Provisional Patent App. No. 62/244,460, filed Oct. 21, 2015, the disclosures of each of the aforementioned applications being hereby incorporated by reference in their entirety.

US Referenced Citations (1129)
Number Name Date Kind
319296 Molesworth Jun 1885 A
668878 Jensen Feb 1901 A
668879 Miller Feb 1901 A
702789 Gibson Jun 1902 A
862712 Collins Aug 1907 A
2121193 Hanicke Jun 1938 A
2178840 Lorenian Nov 1939 A
2187852 Friddle Jan 1940 A
2199025 Conn Apr 1940 A
2235419 Callahan Mar 1941 A
2248054 Becker Jul 1941 A
2270188 Longfellow Jan 1942 A
2518276 Braward Aug 1950 A
2557669 Lloyd Jun 1951 A
2566499 Richter Sep 1951 A
2621653 Briggs Dec 1952 A
2725053 Bambara Nov 1955 A
2830587 Everett Apr 1958 A
3204635 Voss Sep 1965 A
3347234 Voss Oct 1967 A
3367809 Soloff Feb 1968 A
3391690 Armao Jul 1968 A
3477429 Sampson Nov 1969 A
3513848 Winston May 1970 A
3518993 Blake Jul 1970 A
3577991 Wilkinson May 1971 A
3596292 Erb Aug 1971 A
3608539 Miller Sep 1971 A
3625220 Engelsher Dec 1971 A
3648705 Lary Mar 1972 A
3653388 Tenckhoff Apr 1972 A
3656476 Swinney Apr 1972 A
3657056 Winston Apr 1972 A
3678980 Gutshall Jul 1972 A
3709218 Halloran Jan 1973 A
3711347 Wagner Jan 1973 A
3739773 Schmitt Jun 1973 A
3760808 Bleuer Sep 1973 A
3788318 Kim Jan 1974 A
3789852 Kim Feb 1974 A
3802438 Wolvek Apr 1974 A
3807394 Attenborough Apr 1974 A
3809075 Matles May 1974 A
3811449 Gravlee May 1974 A
3825010 McDonald Jul 1974 A
3833003 Taricco Sep 1974 A
3835849 McGuire Sep 1974 A
3842824 Neufeld Oct 1974 A
3845772 Smith Nov 1974 A
3857396 Hardwick Dec 1974 A
3867932 Huene Feb 1975 A
3875652 Arnold Apr 1975 A
3898992 Balamuth Aug 1975 A
3918442 Nikolaev Nov 1975 A
3968800 Vilasi Jul 1976 A
3976079 Samuels Aug 1976 A
4023559 Gaskell May 1977 A
4064566 Fletcher Dec 1977 A
4089071 Kainberz May 1978 A
4108399 Pilgram Aug 1978 A
4156574 Boben May 1979 A
4164794 Spector Aug 1979 A
4171544 Hench Oct 1979 A
4183102 Guiset Jan 1980 A
4199864 Ashman Apr 1980 A
4200939 Oser May 1980 A
4210148 Stivala Jul 1980 A
4213816 Morris Jul 1980 A
4235233 Mouwen Nov 1980 A
4235238 Ogiu Nov 1980 A
4244370 Furlow Jan 1981 A
4257411 Cho Mar 1981 A
4265231 Scheller, Jr. et al. May 1981 A
4281649 Derweduwen Aug 1981 A
4291698 Fuchs Sep 1981 A
4309488 Heide Jan 1982 A
4320762 Bentov Mar 1982 A
4351069 Ballintyn Sep 1982 A
4364381 Sher Dec 1982 A
4365356 Broemer Dec 1982 A
4388921 Sutter Jun 1983 A
4395798 McVey Aug 1983 A
4409974 Freedland Oct 1983 A
4414166 Carlson Nov 1983 A
4437362 Hurst Mar 1984 A
4437941 Marwil Mar 1984 A
4444180 Schneider Apr 1984 A
4448194 DiGiovanni et al. May 1984 A
4456005 Lichty Jun 1984 A
4461281 Carson Jul 1984 A
4493317 Klaue Jan 1985 A
4495664 Bianquaert Jan 1985 A
4501031 McDaniel Feb 1985 A
4504268 Herlitze Mar 1985 A
4506681 Mundell Mar 1985 A
4514125 Stol Apr 1985 A
4526173 Sheehan Jul 1985 A
4532926 O'Holla Aug 1985 A
4535772 Sheehan Aug 1985 A
4547327 Bruins Oct 1985 A
4556059 Adamson Dec 1985 A
4556350 Bernhardt Dec 1985 A
4566138 Lewis Jan 1986 A
4589868 Dretler May 1986 A
4590928 Hunt May 1986 A
4597379 Kihn Jul 1986 A
4599085 Riess Jul 1986 A
4601893 Cardinal Jul 1986 A
4606335 Wedeen Aug 1986 A
4611593 Fogarty Sep 1986 A
4621640 Mulhollan Nov 1986 A
4630609 Chin Dec 1986 A
4632101 Freedland Dec 1986 A
4645503 Lin Feb 1987 A
4657460 Bien Apr 1987 A
4659268 Del Mundo Apr 1987 A
4662063 Collins May 1987 A
4662068 Polonsky May 1987 A
4662887 Turner May 1987 A
4669473 Richards Jun 1987 A
4681107 Kees Jul 1987 A
4685458 Leckrone Aug 1987 A
4691741 Affa Sep 1987 A
4705040 Mueller Nov 1987 A
4706670 Andersen et al. Nov 1987 A
4708139 Dunbar, IV Nov 1987 A
4713077 Small Dec 1987 A
4716901 Jackson Jan 1988 A
4718909 Brown Jan 1988 A
4722331 Fox Feb 1988 A
4722948 Sanderson Feb 1988 A
4724584 Kasai Feb 1988 A
4738255 Goble Apr 1988 A
4739751 Sapega Apr 1988 A
4741330 Hayhurst May 1988 A
4749585 Greco Jun 1988 A
4750492 Jacobs Jun 1988 A
4768507 Fischell Sep 1988 A
4772286 Goble Sep 1988 A
4776328 Frey Oct 1988 A
4776738 Winston Oct 1988 A
4776851 Bruchman Oct 1988 A
4781182 Purnell Nov 1988 A
4790303 Steffee Dec 1988 A
4792336 Hiavacek Dec 1988 A
4817591 Klause Apr 1989 A
4822224 Carl Apr 1989 A
4823794 Pierce Apr 1989 A
4832025 Coates May 1989 A
4832026 Jones May 1989 A
4834752 VanKampen May 1989 A
4841960 Garner Jun 1989 A
4843112 Gerhart Jun 1989 A
4846812 Walker Jul 1989 A
4862812 Walker Sep 1989 A
4862882 Venturi Sep 1989 A
4869242 Galluzo Sep 1989 A
4870957 Goble Oct 1989 A
4883048 Purnell Nov 1989 A
4890612 Kensey Jan 1990 A
4895148 Bays Jan 1990 A
4898156 Gattuma Feb 1990 A
4899729 Gill Feb 1990 A
4899743 Nicholson et al. Feb 1990 A
4899744 Fujitsuka et al. Feb 1990 A
4901721 Hakki Feb 1990 A
4921479 Grayzel May 1990 A
4922897 Sapega May 1990 A
4924865 Bays May 1990 A
4924866 Yoon May 1990 A
4932960 Green Jun 1990 A
4935026 McFadden Jun 1990 A
4935028 Drews Jun 1990 A
4945625 Winston Aug 1990 A
4946468 Li Aug 1990 A
4954126 Wallsten Sep 1990 A
4955910 Bolesky Sep 1990 A
4957498 Caspari Sep 1990 A
4961741 Hayhurst Oct 1990 A
4963151 Ducheyne Oct 1990 A
4964862 Arms Oct 1990 A
4966583 Debbas Oct 1990 A
4968315 Gattuma Nov 1990 A
4969888 Scholten et al. Nov 1990 A
4969892 Burton Nov 1990 A
4979949 Matsen, III et al. Dec 1990 A
4990161 Kampner Feb 1991 A
4994071 MacGregor Feb 1991 A
4997445 Hodorek Mar 1991 A
4998539 Delsanti Mar 1991 A
5002550 Li Mar 1991 A
5002563 Pyka Mar 1991 A
5009652 Morgan Apr 1991 A
5009663 Broome Apr 1991 A
5009664 Sievers Apr 1991 A
5013316 Goble May 1991 A
5019090 Pinchuk May 1991 A
5021059 Kensey et al. Jun 1991 A
5031841 Schafer Jul 1991 A
5035713 Friis Jul 1991 A
5037404 Gold Aug 1991 A
5037422 Hayhurst Aug 1991 A
5041093 Chu Aug 1991 A
5041114 Chapman Aug 1991 A
5041129 Hayhurst Aug 1991 A
5046513 Gattuma Sep 1991 A
5047055 Gattuma Sep 1991 A
5051049 Wills Sep 1991 A
5053046 Janese Oct 1991 A
5053047 Yoon Oct 1991 A
5059193 Kuslich Oct 1991 A
5059206 Winters Oct 1991 A
5061274 Kensey Oct 1991 A
5061286 Lyle Oct 1991 A
5069674 Farnot Dec 1991 A
5078140 Kwoh Jan 1992 A
5078731 Hayhurst Jan 1992 A
5078744 Chvapil Jan 1992 A
5078745 Rhenter Jan 1992 A
5084050 Draenert Jan 1992 A
5084051 Tormala Jan 1992 A
5085660 Lin Feb 1992 A
5085661 Moss Feb 1992 A
5086401 Glassman et al. Feb 1992 A
5090072 Kratoska Feb 1992 A
5098433 Freedland Mar 1992 A
5098434 Serbousek Mar 1992 A
5098436 Ferrante Mar 1992 A
5100405 McLaren Mar 1992 A
5100417 Cerier Mar 1992 A
5102417 Palmaz Apr 1992 A
5102421 Anspach Apr 1992 A
5120175 Arbegast Jun 1992 A
5123520 Schmid Jun 1992 A
5123914 Cope Jun 1992 A
5123941 Lauren et al. Jun 1992 A
5133732 Wiktor Jul 1992 A
RE34021 Mueller Aug 1992 E
5141520 Goble Aug 1992 A
5147362 Goble Sep 1992 A
5152765 Ross Oct 1992 A
5154720 Trott Oct 1992 A
5156613 Sawyer Oct 1992 A
5156616 Meadows Oct 1992 A
5158566 Pianetti Oct 1992 A
5158934 Ammann Oct 1992 A
5163960 Bonutti Nov 1992 A
5171251 Bregen Dec 1992 A
5176682 Chow Jan 1993 A
5179964 Cook Jan 1993 A
5180388 DiCarlo Jan 1993 A
5183464 Dubrul Feb 1993 A
5192287 Fournier et al. Mar 1993 A
5192326 Bao et al. Mar 1993 A
5197166 Meier et al. Mar 1993 A
5197971 Bonutti Mar 1993 A
5203784 Ross Apr 1993 A
5203787 Noblitt Apr 1993 A
5208950 Merritt May 1993 A
5209776 Bass May 1993 A
5217486 Rice Jun 1993 A
5217493 Raad Jun 1993 A
5219359 McQuilkin Jun 1993 A
5224946 Hayhurst Jul 1993 A
5226899 Lee Jul 1993 A
5230352 Putnam Jul 1993 A
5234006 Eaton Aug 1993 A
5234425 Fogarty Aug 1993 A
5236438 Wilk Aug 1993 A
5236445 Hayhurst Aug 1993 A
5242902 Murphy Sep 1993 A
5246441 Ross Sep 1993 A
5250026 Ehrlich Oct 1993 A
5250055 Moore Oct 1993 A
5254113 Wilk Oct 1993 A
5258007 Spetzler Nov 1993 A
5258015 Li Nov 1993 A
5258016 Di Poto Nov 1993 A
5261914 Warren Nov 1993 A
5266325 Kuzma Nov 1993 A
5269783 Sander Dec 1993 A
5269785 Bonutti Dec 1993 A
5269809 Hayhurst Dec 1993 A
5281235 Haber Jan 1994 A
5282832 Toso Feb 1994 A
5290281 Tschakaloff Mar 1994 A
5304119 Balaban Apr 1994 A
5306280 Bregen Apr 1994 A
5306301 Graf Apr 1994 A
5312438 Johnson May 1994 A
5315741 Dubberke May 1994 A
5318588 Horzewski et al. Jun 1994 A
5320611 Bonutti Jun 1994 A
5324308 Pierce Jun 1994 A
5328480 Melker Jul 1994 A
5329846 Bonutti Jul 1994 A
5329924 Bonutti Jul 1994 A
5330468 Burkhart Jul 1994 A
5330476 Hiot Jul 1994 A
5330486 Wilk Jul 1994 A
5336231 Adair Aug 1994 A
5336240 Metzler Aug 1994 A
5339799 Kami et al. Aug 1994 A
5343385 Joskowicz Aug 1994 A
5349956 Bonutti Sep 1994 A
5352229 Goble Oct 1994 A
5354298 Lee Oct 1994 A
5354302 Ko Oct 1994 A
5366480 Corriveaau Nov 1994 A
5370646 Reese et al. Dec 1994 A
5370660 Weinstein et al. Dec 1994 A
5372146 Branch Dec 1994 A
5374235 Ahrens Dec 1994 A
5376126 Lin Dec 1994 A
5382254 McGarry Jan 1995 A
5383883 Wilk Jan 1995 A
5383905 Golds Jan 1995 A
5391171 Schmieding Feb 1995 A
5391173 Wilk Feb 1995 A
5395308 Fox Mar 1995 A
5397311 Walker Mar 1995 A
5400805 Warren Mar 1995 A
5402801 Taylor Apr 1995 A
5403312 Yates Apr 1995 A
5403348 Bonutti Apr 1995 A
5405359 Pierce Apr 1995 A
5411523 Goble May 1995 A
5411538 Bonutti May 1995 A
5413585 Pagedas May 1995 A
5417691 Hayhurst May 1995 A
5417701 Holmes May 1995 A
5417712 Whittaker May 1995 A
5423796 Shikhman Jun 1995 A
5423860 Lizardi Jun 1995 A
5431670 Holmes Jul 1995 A
5438746 Demarest Aug 1995 A
5439470 Li Aug 1995 A
5441502 Bartlett Aug 1995 A
5441538 Bonutti Aug 1995 A
5443512 Parr Aug 1995 A
5447503 Miller Sep 1995 A
5449372 Schmaltz Sep 1995 A
5449382 Dayton Sep 1995 A
5451235 Lock Sep 1995 A
5453090 Martinez Sep 1995 A
5456722 McLeod Oct 1995 A
5458653 Davison Oct 1995 A
5462561 Voda Oct 1995 A
5464424 O'Donnell Nov 1995 A
5464425 Skiba Nov 1995 A
5464426 Bonutti Nov 1995 A
5464427 Curtis Nov 1995 A
5467911 Tsuruta Nov 1995 A
5470337 Moss Nov 1995 A
5472444 Huebner Dec 1995 A
5474554 Ku Dec 1995 A
5478351 Meade Dec 1995 A
5478353 Yoon Dec 1995 A
5480403 Lee Jan 1996 A
5486197 Le Jan 1996 A
5487216 Demarest Jan 1996 A
5487844 Fujita Jan 1996 A
5488958 Topel Feb 1996 A
5496292 Burnham Mar 1996 A
5496335 Thomason Mar 1996 A
5496348 Bonutti Mar 1996 A
5500000 Feagin Mar 1996 A
5501700 Hirata Mar 1996 A
5504977 Weppner Apr 1996 A
5505735 Li Apr 1996 A
5507754 Green Apr 1996 A
5514153 Bonutti May 1996 A
5518163 Hooven May 1996 A
5518164 Hooven May 1996 A
5520700 Beyar May 1996 A
5522844 Johnson Jun 1996 A
5522845 Wenstrom Jun 1996 A
5522846 Bonutti Jun 1996 A
5527341 Goglewski Jun 1996 A
5527342 Pietrzak Jun 1996 A
5527343 Bonutti Jun 1996 A
5528844 Johnson Jun 1996 A
5529075 Clark Jun 1996 A
5531759 Kensey Jul 1996 A
5534012 Bonutti Jul 1996 A
5534028 Bao et al. Jul 1996 A
5540703 Barker Jul 1996 A
5540718 Bartlett Jul 1996 A
5542423 Bonutti Aug 1996 A
5545178 Kensey Aug 1996 A
5545180 Le Aug 1996 A
5545206 Carson Aug 1996 A
5549630 Bonutti Aug 1996 A
5549631 Bonutti Aug 1996 A
5556402 Xu Sep 1996 A
5562688 Riza Oct 1996 A
5569252 Justin Oct 1996 A
5569305 Bonutti Oct 1996 A
5569306 Thal Oct 1996 A
5573517 Bonutti Nov 1996 A
5573538 Laboureau Nov 1996 A
5573542 Stevens Nov 1996 A
5575801 Habermeyer Nov 1996 A
5580344 Hasson Dec 1996 A
5584835 Greenfield Dec 1996 A
5584860 Goble Dec 1996 A
5584862 Bonutti Dec 1996 A
5591206 Moufarrege Jan 1997 A
5593422 Muijs Van de Moer Jan 1997 A
5593425 Bonutti Jan 1997 A
5593625 Riebel Jan 1997 A
5601557 Hayhurst Feb 1997 A
5601558 Torrie Feb 1997 A
5601595 Schwartz Feb 1997 A
5607427 Tschakaloff Mar 1997 A
5609595 Pennig Mar 1997 A
5618314 Harwin et al. Apr 1997 A
5620461 Muijs Van De Moer et al. Apr 1997 A
5626612 Bartlett May 1997 A
5626614 Hart May 1997 A
5626718 Phillippe May 1997 A
5628446 Geiste May 1997 A
5628751 Sander May 1997 A
5628756 Barker May 1997 A
5630824 Hart May 1997 A
5634926 Jobe Jun 1997 A
5643272 Haines Jul 1997 A
5643274 Sander Jul 1997 A
5643293 Kogasaka Jul 1997 A
5643295 Yoon Jul 1997 A
5643320 Lower Jul 1997 A
5643321 McDevitt Jul 1997 A
5645553 Kolesa Jul 1997 A
5645597 Krapiva Jul 1997 A
5645599 Samani Jul 1997 A
5649940 Hart Jul 1997 A
5649955 Hashimoto Jul 1997 A
5649963 McDevitt Jul 1997 A
5651377 O'Donnell Jul 1997 A
5658313 Thal Aug 1997 A
5660225 Saffran Aug 1997 A
5662658 Wenstrom Sep 1997 A
5665089 Dall Sep 1997 A
5665109 Yoon Sep 1997 A
5665112 Thal Sep 1997 A
5667513 Torrie Sep 1997 A
5669917 Sauer Sep 1997 A
5674240 Bonutti Oct 1997 A
5680981 Mililli Oct 1997 A
5681310 Yuan et al. Oct 1997 A
5681333 Burkhart Oct 1997 A
5681351 Jamiolkowski Oct 1997 A
5681352 Clancy Oct 1997 A
5682886 Delp et al. Nov 1997 A
5683401 Schmieding Nov 1997 A
5683418 Luscombe Nov 1997 A
5685820 Riek Nov 1997 A
5688283 Knapp Nov 1997 A
5690654 Ovil Nov 1997 A
5690655 Hart Nov 1997 A
5690674 Diaz Nov 1997 A
5690676 Dipoto Nov 1997 A
5693055 Zahiri Dec 1997 A
5697950 Fucci Dec 1997 A
5702397 Gonle Dec 1997 A
5702462 Oberlander Dec 1997 A
5707395 Li Jan 1998 A
5713903 Sander Feb 1998 A
5713921 Bonutti Feb 1998 A
5718717 Bonutti Feb 1998 A
5720747 Burke Feb 1998 A
5720753 Sander Feb 1998 A
5725529 Nicholson Mar 1998 A
5725541 Anspach Mar 1998 A
5725556 Moser Mar 1998 A
5725582 Bevan Mar 1998 A
5730747 Ek Mar 1998 A
5733306 Bonutti Mar 1998 A
5735875 Bonutti Apr 1998 A
5735877 Pagedas Apr 1998 A
5735899 Schwartz Apr 1998 A
5741268 Schutz Apr 1998 A
5741282 Anspach Apr 1998 A
5743915 Bertin Apr 1998 A
5748767 Raab May 1998 A
5752952 Adamson May 1998 A
5752974 Rhee May 1998 A
5755809 Cohen May 1998 A
5762458 Wang et al. Jun 1998 A
5766126 Anderson Jun 1998 A
5766221 Benderev Jun 1998 A
5769092 Williamson, Jr. Jun 1998 A
5769894 Ferragamo Jun 1998 A
5772672 Toy Jun 1998 A
5776136 Sahay et al. Jul 1998 A
5776151 Chan Jul 1998 A
5779706 Tschakaloff Jul 1998 A
5779719 Klein Jul 1998 A
5782862 Bonutti Jul 1998 A
5785713 Jobe Jul 1998 A
5792096 Rentmeester Aug 1998 A
5797931 Bito Aug 1998 A
5797963 McDevitt Aug 1998 A
5800537 Bell Sep 1998 A
5806518 Mittelstadt Sep 1998 A
5807403 Beyar Sep 1998 A
5810849 Kontos Sep 1998 A
5810853 Yoon Sep 1998 A
5810884 Kim Sep 1998 A
5814072 Bonutti Sep 1998 A
5814073 Bonutti Sep 1998 A
5817107 Schaller Oct 1998 A
5823994 Sharkey Oct 1998 A
5824009 Fukuda Oct 1998 A
5824085 Sahay et al. Oct 1998 A
5830125 Scribner Nov 1998 A
5836897 Sakural Nov 1998 A
5839899 Robinson Nov 1998 A
5843178 Vanney Dec 1998 A
5844142 Blanch et al. Dec 1998 A
5845645 Bonutti Dec 1998 A
5847394 Alfano et al. Dec 1998 A
5851185 Berns Dec 1998 A
5855583 Wang et al. Jan 1999 A
5865728 Moll Feb 1999 A
5865834 McGuire Feb 1999 A
5866634 Tokushige Feb 1999 A
5868749 Reed Feb 1999 A
5873212 Esteves Feb 1999 A
5873891 Sohn Feb 1999 A
5874235 Chan Feb 1999 A
5876325 Mizuno Mar 1999 A
5879371 Gardiner Mar 1999 A
5879372 Bartlett Mar 1999 A
5891166 Schervinsky Apr 1999 A
5891168 Thal Apr 1999 A
5893880 Egan Apr 1999 A
5897574 Bonutti Apr 1999 A
5899911 Carter May 1999 A
5899921 Casparai May 1999 A
5906579 Vander Salm et al. May 1999 A
5906625 Bito May 1999 A
5908429 Yoon Jun 1999 A
5911449 Daniele Jun 1999 A
5911721 Nicholson Jun 1999 A
5915751 Esteves Jun 1999 A
5918604 Whelan Jul 1999 A
5919193 Slavitt Jul 1999 A
5919194 Andersen Jul 1999 A
5919208 Valenti Jul 1999 A
5919215 Wiklund Jul 1999 A
5921986 Bonutti Jul 1999 A
5924976 Stelzer Jul 1999 A
5925064 Meyers Jul 1999 A
5928244 Tovey Jul 1999 A
5928267 Bonutti Jul 1999 A
5931838 Vito Aug 1999 A
5931869 Boucher Aug 1999 A
5937504 Esteves Aug 1999 A
5940942 Fong Aug 1999 A
5941900 Bonutti Aug 1999 A
5941901 Egan Aug 1999 A
5945002 Bonutti Aug 1999 A
5947982 Duran Sep 1999 A
5948000 Larsen Sep 1999 A
5948001 Larsen Sep 1999 A
5948002 Bonutti Sep 1999 A
5951590 Goldfarb Sep 1999 A
5956927 Daniele Sep 1999 A
5957953 DiPoto Sep 1999 A
5961499 Bonutti Oct 1999 A
5961521 Roger Oct 1999 A
5961538 Pedlick Oct 1999 A
5961554 Janson Oct 1999 A
5964075 Daniele Oct 1999 A
5964765 Fenton Oct 1999 A
5964769 Wagner Oct 1999 A
5967970 Cowan Oct 1999 A
5968046 Castleman Oct 1999 A
5968047 Reed Oct 1999 A
5970686 Demarest Oct 1999 A
5976156 Taylor et al. Nov 1999 A
5980520 Vancaillie Nov 1999 A
5980558 Wiley Nov 1999 A
5980559 Bonutti Nov 1999 A
5983601 Blanch Nov 1999 A
5984929 Bashiri Nov 1999 A
5987848 Blanch Nov 1999 A
5989282 Bonutti Nov 1999 A
5993458 Vaitekunas Nov 1999 A
5993477 Vaitekunas Nov 1999 A
6007567 Bonutti Dec 1999 A
6007580 Lento Dec 1999 A
6010525 Bonutti Jan 2000 A
6010526 Sandstrom Jan 2000 A
6012216 Esteves Jan 2000 A
6014851 Daniele Jan 2000 A
6017321 Boone Jan 2000 A
6032343 Blanch et al. Mar 2000 A
6033415 Mittelstadt et al. Mar 2000 A
6033429 Magovern Mar 2000 A
6033430 Bonutti Mar 2000 A
6045551 Bonutti Apr 2000 A
6050998 Fletcher Apr 2000 A
6056751 Fenton May 2000 A
6056772 Bonutti May 2000 A
6056773 Bonutti May 2000 A
6059797 Mears May 2000 A
6059817 Bonutti May 2000 A
6059827 Fenton May 2000 A
6063095 Wang et al. May 2000 A
6066151 Miyawaki May 2000 A
6066160 Colvin May 2000 A
6066166 Bischoff May 2000 A
6068637 Popov May 2000 A
6068648 Cole May 2000 A
6074409 Goldfarb Jun 2000 A
6077277 Mollenauer Jun 2000 A
6077292 Bonutti Jun 2000 A
6080161 Eaves Jun 2000 A
6081981 Demarest Jul 2000 A
6083244 Lubbers Jul 2000 A
6083522 Chu Jul 2000 A
6086593 Bonutti Jul 2000 A
6086608 Ek Jul 2000 A
6090072 Kratoska Jul 2000 A
6099531 Bonutti Aug 2000 A
6099537 Sugai Aug 2000 A
6099547 Gellman Aug 2000 A
6099550 Yoon Aug 2000 A
6099552 Adams Aug 2000 A
6102850 Wang et al. Aug 2000 A
6106545 Egan Aug 2000 A
6117160 Bonutti Sep 2000 A
6120536 Ding Sep 2000 A
6125574 Ganaja Oct 2000 A
6126677 Ganaja Oct 2000 A
6132368 Cooper Oct 2000 A
6139320 Hahn Oct 2000 A
RE36974 Bonutti Nov 2000 E
6149658 Gardiner Nov 2000 A
6149669 Li Nov 2000 A
6152949 Bonutti Nov 2000 A
6155756 Mericle Dec 2000 A
6159224 Yoon Dec 2000 A
6159234 Bonutti Dec 2000 A
6171307 Orlich Jan 2001 B1
6174324 Egan Jan 2001 B1
6179840 Bowman Jan 2001 B1
6179850 Goradia Jan 2001 B1
6187008 Hamman Feb 2001 B1
6190400 Van De Moer Feb 2001 B1
6190401 Green et al. Feb 2001 B1
6200322 Branch Mar 2001 B1
6200329 Fung Mar 2001 B1
6205411 DiGioia, III et al. Mar 2001 B1
6205748 Daniele Mar 2001 B1
6217591 Egan Apr 2001 B1
6224593 Ryan May 2001 B1
6224630 Bao May 2001 B1
6228086 Wahl May 2001 B1
6231565 Tovey May 2001 B1
6231592 Bonutti May 2001 B1
6238395 Bonutti May 2001 B1
6238396 Bonutti May 2001 B1
6241749 Rayhanabad Jun 2001 B1
6246200 Blumenkranz et al. Jun 2001 B1
6258091 Sevrain Jul 2001 B1
6263558 Blanch Jul 2001 B1
6264675 Brotz Jul 2001 B1
6267761 Ryan Jul 2001 B1
6273717 Hahn Aug 2001 B1
6280474 Cassidy et al. Aug 2001 B1
6286746 Egan et al. Sep 2001 B1
6287325 Bonutti Sep 2001 B1
6293961 Schwartz Sep 2001 B2
6306159 Schwartz Oct 2001 B1
6309405 Bonutti Oct 2001 B1
6312448 Bonutti Nov 2001 B1
6319252 McDevitt Nov 2001 B1
6319271 Schwartz Nov 2001 B1
6322567 Mittelstadt et al. Nov 2001 B1
6327491 Franklin et al. Dec 2001 B1
6331181 Tierney et al. Dec 2001 B1
6338730 Bonutti Jan 2002 B1
6340365 Dittrich Jan 2002 B2
6348056 Bates Feb 2002 B1
6358271 Egan Mar 2002 B1
6364897 Bonutti Apr 2002 B1
6368325 McKinley Apr 2002 B1
6368326 Dakin Apr 2002 B1
6368343 Bonutti Apr 2002 B1
6371957 Amrein Apr 2002 B1
6385475 Cinquin et al. May 2002 B1
6395007 Bhatnagar et al. May 2002 B1
6409742 Fulton Jun 2002 B1
6409743 Fenton Jun 2002 B1
6419704 Ferree Jul 2002 B1
6423072 Zappala Jul 2002 B1
6423088 Fenton Jul 2002 B1
6425919 Lambrecht Jul 2002 B1
6428562 Bonutti Aug 2002 B2
6430434 Mittelstadt Aug 2002 B1
6432115 Mollenauer Aug 2002 B1
6436107 Wang et al. Aug 2002 B1
6447516 Bonutti Sep 2002 B1
6447550 Hunter Sep 2002 B1
6450985 Schoelling Sep 2002 B1
6451027 Cooper et al. Sep 2002 B1
6461360 Adam Oct 2002 B1
6468265 Evans et al. Oct 2002 B1
6468293 Bonutti Oct 2002 B2
6471715 Weiss Oct 2002 B1
6475230 Bonutti Nov 2002 B1
6488196 Fenton Dec 2002 B1
6496003 Okumura et al. Dec 2002 B1
6500195 Bonutti Dec 2002 B2
6503259 Huxel Jan 2003 B2
6527774 Lieberman Mar 2003 B2
6530933 Yeung Mar 2003 B1
6533157 Whitman Mar 2003 B1
6533818 Weber Mar 2003 B1
6535764 Imran Mar 2003 B2
6544267 Cole Apr 2003 B1
6545909 Tanaka et al. Apr 2003 B2
6547792 Tsuji Apr 2003 B1
6551304 Whalen Apr 2003 B1
6554852 Oberlander Apr 2003 B1
6557426 Reinemann, Jr. et al. May 2003 B2
6558390 Cragg May 2003 B2
6562043 Chan May 2003 B1
6565554 Niemeyer May 2003 B1
6568313 Fukui May 2003 B2
6569167 Bobechko May 2003 B1
6569187 Bonutti May 2003 B1
6572635 Bonutti Jun 2003 B1
D477776 Pontaoe Jul 2003 S
6585746 Gildenberg Jul 2003 B2
6585750 Bonutti Jul 2003 B2
6585764 Wright Jul 2003 B2
6592609 Bonutti Jul 2003 B1
6594517 Nevo Jul 2003 B1
6605090 Trieu Aug 2003 B1
6610080 Morgan Aug 2003 B2
6618910 Pontaoe Sep 2003 B1
6623486 Weaver Sep 2003 B1
6623487 Goshert Sep 2003 B1
6626944 Taylor Sep 2003 B1
6632245 Kim Oct 2003 B2
6635073 Bonutti Oct 2003 B2
6638279 Bonutti Oct 2003 B2
6641592 Sauer Nov 2003 B1
6645227 Fallin Nov 2003 B2
6666877 Morgan Dec 2003 B2
6669705 Westhaver Dec 2003 B2
6676669 Charles et al. Jan 2004 B2
6679888 Green Jan 2004 B2
6685750 Plos Feb 2004 B1
6699177 Wang et al. Mar 2004 B1
6699240 Francischelli Mar 2004 B2
6702821 Bonutti Mar 2004 B2
6705179 Mohtasham Mar 2004 B1
6709457 Otte Mar 2004 B1
6712828 Schraft et al. Mar 2004 B2
6714841 Wright Mar 2004 B1
6719765 Bonutti Apr 2004 B2
6719797 Ferree Apr 2004 B1
6722552 Fenton Apr 2004 B2
6731988 Green May 2004 B1
6733506 McDevitt et al. May 2004 B1
6733531 Trieu May 2004 B1
6764514 Li et al. Jul 2004 B1
6770078 Bonutti Aug 2004 B2
6770079 Bhatnagar Aug 2004 B2
6780198 Gregoire Aug 2004 B1
6783524 Anderson et al. Aug 2004 B2
6786989 Torriani Sep 2004 B2
6796003 Marvel Sep 2004 B1
6799065 Niemeyer Sep 2004 B1
6818010 Eichhorn Nov 2004 B2
6823871 Schmieding Nov 2004 B2
6827712 Tovey Dec 2004 B2
6837892 Shoham Jan 2005 B2
6840938 Morley et al. Jan 2005 B1
6843403 Whitman Jan 2005 B2
6860878 Brock Mar 2005 B2
6860885 Bonutti Mar 2005 B2
6869437 Hausen Mar 2005 B1
6878167 Ferree Apr 2005 B2
6884264 Spiegelberg et al. Apr 2005 B2
6890334 Brace May 2005 B2
6893434 Fenton May 2005 B2
6899722 Bonutti May 2005 B2
6913666 Aeschlimann Jul 2005 B1
6916321 TenHuisen Jul 2005 B2
6921264 Mayer Jul 2005 B2
6923824 Morgan Aug 2005 B2
6932835 Bonutti Aug 2005 B2
6942684 Bonutti Sep 2005 B2
6944111 Nakamura Sep 2005 B2
6951535 Ghodoussi et al. Oct 2005 B2
6955540 Mayer Oct 2005 B2
6955683 Bonutti Oct 2005 B2
6958077 Suddaby Oct 2005 B2
6987983 Rosenblatt Jan 2006 B2
6997940 Bonutti Feb 2006 B2
7001411 Dean Feb 2006 B1
7004959 Bonutti Feb 2006 B2
7008226 Mayer Mar 2006 B2
7013191 Rubbert Mar 2006 B2
7018380 Cole Mar 2006 B2
7033379 Peterson Apr 2006 B2
7048755 Bonutti May 2006 B2
7066960 Dickman Jun 2006 B1
7087073 Bonutti Aug 2006 B2
7090111 Egan Aug 2006 B2
7090683 Brock et al. Aug 2006 B2
7094251 Bonutti Aug 2006 B2
7104996 Bonutti Sep 2006 B2
7128763 Blatt Oct 2006 B1
7147652 Bonutti Dec 2006 B2
7153312 Torrie Dec 2006 B1
7160405 Aeschlimann Jan 2007 B2
7179259 Gibbs Feb 2007 B1
7192448 Ferree Mar 2007 B2
7209776 Leitner Apr 2007 B2
7217279 Reese May 2007 B2
7217290 Bonutti May 2007 B2
7241297 Shaolian et al. Jul 2007 B2
7250051 Francischelli Jul 2007 B2
7252685 Bindseil Aug 2007 B2
7273497 Ferree Sep 2007 B2
7297142 Brock Nov 2007 B2
7329263 Bonutti Feb 2008 B2
7331932 Leitner Feb 2008 B2
7335205 Aeschlimann Feb 2008 B2
7445634 Trieu Nov 2008 B2
7477926 McCombs Jan 2009 B2
7481825 Bonutti Jan 2009 B2
7481831 Bonutti Jan 2009 B2
7510895 Rateman Mar 2009 B2
7641660 Lakin et al. Jan 2010 B2
7708741 Bonutti May 2010 B1
7794467 McGinley et al. Sep 2010 B2
7831295 Friedrich Nov 2010 B2
7854750 Bonutti Dec 2010 B2
7879072 Bonutti Feb 2011 B2
7891691 Bearey Feb 2011 B2
7959635 Bonutti Jun 2011 B1
7967820 Bonutti Jun 2011 B2
8046052 Verard et al. Oct 2011 B2
8073528 Zhao Dec 2011 B2
8109942 Carson Feb 2012 B2
8128669 Bonutti Mar 2012 B2
8140982 Hamilton Mar 2012 B2
8147514 Bonutti Apr 2012 B2
8162977 Bonutti Apr 2012 B2
8214016 Lavallee et al. Jul 2012 B2
8221402 Francischelli et al. Jul 2012 B2
8382765 Axelson et al. Feb 2013 B2
8429266 Vanheuverzwyn Apr 2013 B2
8480679 Park Jul 2013 B2
8483469 Pavlovskaia Jul 2013 B2
8500816 Dees Aug 2013 B2
8532361 Pavlovskaia Sep 2013 B2
8560047 Haider et al. Oct 2013 B2
8617171 Park et al. Dec 2013 B2
8702732 Woodard Apr 2014 B2
8715291 Park May 2014 B2
8737700 Park May 2014 B2
8740798 Hamada Jun 2014 B2
8777875 Park Jul 2014 B2
8781193 Steinberg et al. Jul 2014 B2
8781556 Kienzle Jul 2014 B2
8894634 Devengenzo et al. Nov 2014 B2
8968320 Park Mar 2015 B2
9008757 Wu Apr 2015 B2
9014851 Wong et al. Apr 2015 B2
9060797 Bonutti Jun 2015 B2
9119655 Bowling et al. Sep 2015 B2
9186046 Ramamurthy et al. Nov 2015 B2
9195926 Spodak Nov 2015 B2
9456765 Odermatt Oct 2016 B2
9844414 Fischer et al. Dec 2017 B2
9922410 Shimada Mar 2018 B2
10058393 Bonutti et al. Aug 2018 B2
10194801 Elhawary et al. Feb 2019 B2
10350390 Moll et al. Jul 2019 B2
10368951 Moll et al. Aug 2019 B2
10433763 Piron et al. Oct 2019 B2
10499996 de Almeida Barreto Dec 2019 B2
10594931 Piponi Mar 2020 B2
10672134 Suzuki Jun 2020 B2
10687784 Shoham Jun 2020 B2
10765484 Bonutti et al. Sep 2020 B2
10828786 Shoham Nov 2020 B2
10974069 Maguire et al. Apr 2021 B2
11197651 Cohen et al. Dec 2021 B2
11229362 Ben-Haim Jan 2022 B2
11317974 Bonutti et al. May 2022 B2
20010002440 Bonutti May 2001 A1
20010005975 Golightly Jul 2001 A1
20010009250 Herman Jul 2001 A1
20010041916 Bonutti Nov 2001 A1
20010049497 Kalloo Dec 2001 A1
20020016593 Hearn Feb 2002 A1
20020016633 Lin Feb 2002 A1
20020019649 Sikora Feb 2002 A1
20020026244 Trieu Feb 2002 A1
20020029083 Zucherman Mar 2002 A1
20020029084 Paul Mar 2002 A1
20020038118 Shoham Mar 2002 A1
20020045888 Ramans et al. Apr 2002 A1
20020045902 Bonutti Apr 2002 A1
20020049449 Bhatnagar Apr 2002 A1
20020062136 Hillstead May 2002 A1
20020062153 Paul May 2002 A1
20020082612 Moll et al. Jun 2002 A1
20020087048 Brock Jul 2002 A1
20020087049 Brock Jul 2002 A1
20020087148 Brock Jul 2002 A1
20020087166 Brock Jul 2002 A1
20020087169 Brock Jul 2002 A1
20020095175 Brock Jul 2002 A1
20020103495 Cole Aug 2002 A1
20020115934 Tuke Aug 2002 A1
20020120252 Brock Aug 2002 A1
20020123750 Eisermann Sep 2002 A1
20020128633 Brock Sep 2002 A1
20020128661 Brock Sep 2002 A1
20020128662 Brock Sep 2002 A1
20020133173 Brock Sep 2002 A1
20020133174 Charles et al. Sep 2002 A1
20020133175 Carson Sep 2002 A1
20020138082 Brock Sep 2002 A1
20020138109 Keogh Sep 2002 A1
20020143319 Brock Oct 2002 A1
20020183762 Anderson Dec 2002 A1
20020183851 Spiegelberg Dec 2002 A1
20020188301 Dallara Dec 2002 A1
20030014064 Blatter Jan 2003 A1
20030028196 Bonutti Feb 2003 A1
20030039196 Nakamura Feb 2003 A1
20030040758 Wang Feb 2003 A1
20030045900 Hahnen Mar 2003 A1
20030055409 Brock Mar 2003 A1
20030060927 Gerbi et al. Mar 2003 A1
20030065361 Dreyfuss Apr 2003 A1
20030069591 Carson et al. Apr 2003 A1
20030105474 Bonutti Jun 2003 A1
20030118518 Hahn Jun 2003 A1
20030125808 Hunter Jul 2003 A1
20030135204 Lee Jul 2003 A1
20030153978 Whiteside Aug 2003 A1
20030158582 Bonutti Aug 2003 A1
20030167072 Oberlander Sep 2003 A1
20030176783 Hu Sep 2003 A1
20030181800 Bonutti Sep 2003 A1
20030195530 Thill Oct 2003 A1
20030195565 Bonutti Oct 2003 A1
20030204204 Bonutti Oct 2003 A1
20030212403 Swanson Nov 2003 A1
20030216669 Lang Nov 2003 A1
20030216742 Wetzler Nov 2003 A1
20030225438 Bonutti Dec 2003 A1
20030229361 Jackson Dec 2003 A1
20040010287 Bonutti Jan 2004 A1
20040030341 Aeschlimann Feb 2004 A1
20040034282 Quaid Feb 2004 A1
20040034357 Beane Feb 2004 A1
20040097939 Bonutti May 2004 A1
20040097948 Heldreth May 2004 A1
20040098050 Foerster May 2004 A1
20040102804 Chin May 2004 A1
20040106916 Quaid et al. Jun 2004 A1
20040138703 Alleyne Jul 2004 A1
20040143334 Ferree Jul 2004 A1
20040152970 Hunter et al. Aug 2004 A1
20040157188 Luth et al. Aug 2004 A1
20040167548 Bonutti Aug 2004 A1
20040199072 Sprouse et al. Oct 2004 A1
20040220616 Bonutti Nov 2004 A1
20040225325 Bonutit Nov 2004 A1
20040230223 Bonutti Nov 2004 A1
20040236374 Bonutti Nov 2004 A1
20040236424 Berez Nov 2004 A1
20040243109 Tovey Dec 2004 A1
20040267242 Grimm Dec 2004 A1
20050033366 Cole Feb 2005 A1
20050038514 Helm Feb 2005 A1
20050043796 Grant Feb 2005 A1
20050071012 Serhan Mar 2005 A1
20050090827 Gedebou Apr 2005 A1
20050090840 Gerbino Apr 2005 A1
20050096699 Wixey et al. May 2005 A1
20050113846 Carson May 2005 A1
20050113928 Cragg et al. May 2005 A1
20050126680 Aeschlimann et al. Jun 2005 A1
20050143826 Zucherman et al. Jun 2005 A1
20050149024 Ferrante et al. Jul 2005 A1
20050149029 Bonutti Jul 2005 A1
20050177169 Fisher et al. Aug 2005 A1
20050192673 Saltzman et al. Sep 2005 A1
20050203521 Bonutti Sep 2005 A1
20050216059 Bonutti Sep 2005 A1
20050216087 Zucherman Sep 2005 A1
20050222620 Bonutti Oct 2005 A1
20050234332 Murphy Oct 2005 A1
20050234461 Burdulis Oct 2005 A1
20050234465 McCombs et al. Oct 2005 A1
20050240190 Gall Oct 2005 A1
20050240227 Bonutti Oct 2005 A1
20050246021 Ringelsen Nov 2005 A1
20050261684 Shaolian Nov 2005 A1
20050267481 Carl Dec 2005 A1
20050267534 Bonutti Dec 2005 A1
20060009855 Goble Jan 2006 A1
20060015101 Warburton Jan 2006 A1
20060015108 Bonutti Jan 2006 A1
20060024357 Carpenter Feb 2006 A1
20060026244 Watson Feb 2006 A1
20060064095 Senn Mar 2006 A1
20060089646 Bonutti Apr 2006 A1
20060122600 Cole Jun 2006 A1
20060122704 Vresilovic Jun 2006 A1
20060142657 Quaid et al. Jun 2006 A1
20060142799 Bonutti Jun 2006 A1
20060161051 Terrill-Grisoni Jul 2006 A1
20060161136 Anderson Jul 2006 A1
20060167495 Bonutti Jul 2006 A1
20060200199 Bonutti Sep 2006 A1
20060207978 Rizun et al. Sep 2006 A1
20060212073 Bonutti Sep 2006 A1
20060217765 Bonutti Sep 2006 A1
20060229623 Bonutti Oct 2006 A1
20060235470 Bonutti Oct 2006 A1
20060241695 Bonutti Oct 2006 A1
20060258938 Hoffman et al. Nov 2006 A1
20060265009 Bonutti Nov 2006 A1
20060265011 Bonutti Nov 2006 A1
20060271056 Terrill-Grisoni Nov 2006 A1
20070032825 Bonutti Feb 2007 A1
20070088340 Brock Apr 2007 A1
20070088362 Bonutti Apr 2007 A1
20070100258 Shoham May 2007 A1
20070118055 McCombs May 2007 A1
20070118129 Fraser May 2007 A1
20070173946 Bonutti Jul 2007 A1
20070185498 Lavallee Aug 2007 A2
20070198555 Friedman Aug 2007 A1
20070219561 Lavallee Sep 2007 A1
20070239153 Hodorek et al. Oct 2007 A1
20070265561 Yeung Nov 2007 A1
20070270833 Bonutti et al. Nov 2007 A1
20070287889 Mohr Dec 2007 A1
20080021474 Bonutti et al. Jan 2008 A1
20080039845 Bonutti et al. Feb 2008 A1
20080039873 Bonutti et al. Feb 2008 A1
20080046090 Paul et al. Feb 2008 A1
20080097448 Binder Apr 2008 A1
20080108897 Bonutti May 2008 A1
20080108916 Bonutti May 2008 A1
20080114399 Bonutti May 2008 A1
20080132950 Lange Jun 2008 A1
20080140088 Orban Jun 2008 A1
20080140116 Bonutti Jun 2008 A1
20080140117 Bonutti Jun 2008 A1
20080195145 Bonutti Aug 2008 A1
20080243127 Lang Oct 2008 A1
20080249394 Giori Oct 2008 A1
20080262812 Arata Oct 2008 A1
20080269753 Cannestra Oct 2008 A1
20080269808 Gall Oct 2008 A1
20090024161 Bonutti Jan 2009 A1
20090088634 Zhao et al. Apr 2009 A1
20090093684 Schorer Apr 2009 A1
20090131941 Park et al. May 2009 A1
20090138014 Bonutti May 2009 A1
20090194969 Bearey Aug 2009 A1
20090197217 Butscher Aug 2009 A1
20090287222 Lee et al. Nov 2009 A1
20100211120 Bonutti Aug 2010 A1
20100217400 Nortman et al. Aug 2010 A1
20100256504 Moreau-Gaudry Oct 2010 A1
20110029093 Bojarski Feb 2011 A1
20110060375 Bonutti Mar 2011 A1
20110082462 Suarez et al. Apr 2011 A1
20110087332 Bojarski Apr 2011 A1
20110130761 Plaskos et al. Jun 2011 A1
20110144661 Houser Jun 2011 A1
20110295253 Bonutti Dec 2011 A1
20120053591 Haines Mar 2012 A1
20120123937 Spodak May 2012 A1
20120165841 Bonutti Jun 2012 A1
20120184961 Johannaber Jul 2012 A1
20120191140 Bonutti Jul 2012 A1
20120215233 Bonutti Aug 2012 A1
20120323244 Cheal et al. Dec 2012 A1
20120330429 Axelson, Jr. et al. Dec 2012 A1
20130006267 Odermatt et al. Jan 2013 A1
20130035696 Qutub Feb 2013 A1
20130072821 Odermatt Mar 2013 A1
20130211531 Steines et al. Aug 2013 A1
20130331730 Fenech et al. Dec 2013 A1
20140257293 Axelson, Jr. et al. Sep 2014 A1
20140343573 Bonutti Nov 2014 A1
20150106024 Lightcap Apr 2015 A1
20150141759 Charles et al. May 2015 A1
20150145953 Fujie et al. May 2015 A1
20150157416 Andersson Jun 2015 A1
20150257768 Bonutti Sep 2015 A1
20150320514 Ahn et al. Nov 2015 A1
20160012306 Huang Jan 2016 A1
20160030115 Shen et al. Feb 2016 A1
20160081758 Bonutti Mar 2016 A1
20160199548 Cheng Jul 2016 A1
20160202053 Walker et al. Jul 2016 A1
20160206375 Abbasi et al. Jul 2016 A1
20170020615 Koenig Jan 2017 A1
20170148213 Thomas et al. May 2017 A1
20170252114 Crawford et al. Sep 2017 A1
20180296283 Crawford et al. Oct 2018 A1
20190000049 Bonutti et al. Jan 2019 A1
20190220976 Holsing et al. Jul 2019 A1
20190274765 Crawford et al. Sep 2019 A1
20190388160 Wood et al. Dec 2019 A1
20210106389 Bonutti et al. Apr 2021 A1
20220015727 Averbuch Jan 2022 A1
20220296311 Bonutti et al. Sep 2022 A1
Foreign Referenced Citations (41)
Number Date Country
2641580 Aug 2007 CA
2660827 Feb 2008 CA
2698057 Mar 2009 CA
1903016 Oct 1964 DE
1903316 Jul 1970 DE
3517204 Nov 1986 DE
3722538 Jan 1989 DE
9002844 Dec 1990 DE
0773004 May 1997 EP
0784454 Jul 1997 EP
1614525 Jan 2006 EP
1988837 Nov 2008 EP
2134294 Dec 2009 EP
2696338 Apr 1994 FR
2717368 Sep 1995 FR
2728779 Jul 1996 FR
2736257 Jan 1997 FR
2750031 Dec 1997 FR
2771621 Jun 1999 FR
2785171 May 2000 FR
2093701 Sep 1982 GB
2306110 Apr 1997 GB
H08140982 Jun 1996 JP
184396 Oct 2018 RU
1991012779 Sep 1991 WO
1993023094 Nov 1993 WO
1994008642 Apr 1994 WO
1995016398 Jun 1995 WO
1995031941 Nov 1995 WO
1996014802 May 1996 WO
1997012779 Apr 1997 WO
1997049347 Dec 1997 WO
1998011838 Mar 1998 WO
1998026720 Jun 1998 WO
2002053011 Jul 2002 WO
2007092869 Aug 2007 WO
2008116203 Sep 2008 WO
2009029908 Mar 2009 WO
2010099222 Sep 2010 WO
2014163164 Oct 2014 WO
2015020093 Feb 2015 WO
Non-Patent Literature Citations (70)
Entry
510K, Arthrex Pushlock, Jun. 29, 2005, K051219.
510K, Mitek Micro anchor, Nov. 6, 1996, K962511.
510K, Modified Mitek 3.5mm Absorbable Suture Anchor System, Jun. 9, 1997, K970896.
510K, Multitak Suture System, Jan. 10, 1997, K964324.
510K, Summary for Arthrex Inc.'s Bio-Interference Screw, Jul. 9, 1997, K971358.
510K, Surgicraft Bone Tie, Sep. 25, 1998, K982719.
510K, Linvatec Biomaterials Modification of Duet and Impact Suture Anchor, Nov. 19, 2004, K042966.
510K, TranSet Fracture Fixation System, Feb. 24, 2004, K033711.
Arthrex, Protect your graft, Am J Sports Med, vol. 22, No. 4, Jul.-Aug. 1994.
Ask Oxford, Compact Oxford English Dictionary: projection, Mar. 30, 2009.
Ask Oxford, Compact Oxford English Dictionary: slit, Mar. 30, 2009.
Barrett et al., T-Fix endoscopic meniscal repair: technique and approach to different types of tears, Apr. 1995, Arthroscopy vol. 11 No. 2 p. 245-51.
Biomet, Stanmore Modular Hip, J. Bone Joint Surg., vol. 76-B : No. Two, Mar. 1994.
Branson, Polymers: Characteristics and Compatibility for Ultrasonic Assembly, Applied Technologies Group, Publication unknown.
Cobb et al, Late Correction of Malunited Intercondylar Humeral Fractures: Intra-Articular Osteotomy and Tricortical Bone Grafting, J Bone Joint Surg [Br] 1994; 76-B:622-6.
Cope, Suture Anchor for Visceral Drainage, AJR, vol. 148 p. 160-162, Jan. 1986.
Corrected Petition for Inter Partes Review of U.S. Pat. No. 5,921,986, IPR 2013-00631, filed Sep. 27, 2013.
Corrected Petition for Inter Partes Review of U.S. Pat. No. 8,147,514, IPR 2013-00632, filed Sep. 27, 2013.
Corrected Petition for Inter Partes Review of U.S. Pat. No. 8,147,514, IPR 2013-00633, filed Sep. 27, 2013.
Declaration of David Kaplan, Ph.D. Regarding U.S. Pat. No. 5,980,559, IPR 2013-00603, Sep. 24, 2013.
Declaration of Dr. Philip Hardy in Support of Petition for Inter Partes Review of U.S. Pat. No. 5,527,343, IPR 2013-00628, Sep. 25, 2013.
Declaration of Dr. Philip Hardy in Support of Petition for Inter Partes Review of U.S. Pat. No. 6,500,195, IPR 2013-00624, Sep. 25, 2013.
Declaration of Dr. Steve E. Jordan for U.S. Pat. No. 8,147,514, from IPR 2013-00631, dated Sep. 23, 2013.
Declaration of Steve Jordan for U.S. Pat. No. 5,921,986, from IPR 2013-00632, dated Sep. 24, 2013 (exhibit 1010).
Declaration of Steve Jordan for U.S. Pat. No. 5,921,986, from IPR 2013-00633, dated Sep. 24, 2013 (exhibit 1007).
Declaration of Steve Jordan for U.S. Pat. No. 8,147,514, from IPR 2013-00632, dated Sep. 23, 2013 (exhibit 1009).
Declaration of Steve Jordan for U.S. Pat. No. 8,147,514, from IPR 2013-00633, dated Sep. 23, 2013 (exhibit 1006).
Declaration of Wayne J. Sebastianelli, MD Regarding U.S. Pat. No. 7,087,073, Sep. 24, 2013, IPR 2013-00604.
Enabling Local Drug Delivery—Implant Device Combination Therapies, Surmodics, Inc., (c) 2003.
English language abstract for WO 2014/163164 A1 extracted from espacenet.com database on Apr. 6, 2022, 2 pages.
Expert Declaration of Steve E. Jordan, MD, for Inter Partes Review of U.S. Pat. No. 5,921,986, IPR 2013-00631, Sep. 24, 2013.
Extended Search Report for 17183788.3, dated Oct. 5, 2017, 8 pages.
Fellinger et al, Radial avulsion of the triangular fibrocartilage complex in acute wrist trauma: a new technique for arthroscopic repair, Jun. 1997, Arthroscopy, vol. 13, No. 3, p. 370-4.
Femoral Bone Plug Recession in Endoscopic Anterior Cruciate Ligament Reconstruction, David E. Taylor, Arthroscopy: The Journal of Arthroscopic and Related Surgery, Aug. 1996.
Flory, Principles of Polymer Chemistry, 1953, selected pages (cited in IPR 2013-00603, exhibit 1012).
Gabriel, Arthroscopic Fixation Devices, Wiley Enc. of Biomed Eng., 2006.
Gao et al, Swelling of Hydroxypropyl Methylcellulose Matrix Tablets . . . , J. of Pharmaceutical Sciences, vol. 85, No. 7, Jul. 1996, p. 732-740 (cited in IPR 2013-00603, exhibit 1014).
Gopferich, Mechanisms of polymer degradation and erosion, Biomaterials, 1996, vol. 17, No. 2, p. 103-114 (cited in IPR 2013-00603, exhibit 1013).
Grizzi, Hydrolytic degradation of devices based on poly(DL-lactic acid) size-dependence, Biomaterials, 1995, vol. 16, No. 4, p. 305-11 (cited in IPR 2013-00603, exhibit 1006).
Guide to Ultrasound Plastic Assembly, Ultrasonic Division Publication, (c) 1995.
Gupta, “Dynamic illumination based system to remove the glare and improve the quality of medical images,” Engineering in Medicine and Biology Society (EMBC), 2013 35th Annual International Conference of the IEEE, pp. 3020-3023, Jul. 3, 2013.
Hecker et al, Pull-out strength of suture anchors for rotator cuff and Bankart lesion repairs, Nov.-Dec. 1993, The American Journal of Sports Medicine, vol. 21, No. 6, p. 874-9.
Hernigou et al, Proximal Tibial Osteotomy for Osteoarthritis with Varus Deformity: A Ten to Thirteen-Year Follow-Up Study, J Bone Joint Surg, vol. 69-A, No. 3, Mar. 1987, p. 332-354.
Ibarra et al, Glenoid Replacement in Total Shoulder Arthroplasty, The Orthopedic Clinics of North America: Total Shoulder Arthroplasty, vol. 29, No. 3, Jul. 1998, p. 403-413.
Innovasive, We've got you covered, Am J Sports Med, vol. 26, No. 1, Jan.-Feb. 1998.
Karlsson et al, Repair of Bankart lesions with a suture anchor in recurrent dislocation of the shoulder, Scand. J. of Med & Science in Sports, 1995, 5:170-174.
Linvatec, Impact Suture Anchor brochure, 2004 (cited in IPR 2013-00628, exhibit 1010).
Madjar et al, Minimally Invasive Pervaginam Procedures for the Treatment of Female Stress Incontinence . . . , Artificial Organs, 22 (10) 879-885, 1998.
Meniscus Replacement with Bone Anchors: A Surgical Technique, Arthroscopy: The Journal of Arthroscopic and Related Surgery, 1994.
Mosca et al, Calcaneal Lengthening for Valgus Deformity of the Hindfoot: Results in Children Who Had Severe, Symptomatic Flatfoot and Skewfoot, J Bone Joint Surg, 1995, p. 499-512.
Murphy et al, Radial Opening Wedge Osteotomy in Madelung's Deformity, J. Hand Surg, vol. 21A, No. 6, Nov. 1996, p. 1035-44.
Nowak et al, Comparative Study of Fixation Techniques in the Open Bankart Operation Using Either a Cannulated Screw or Suture-Anchors, Acta Orthopaedica Belgica, vol. 64, Feb. 1998.
Packer et al, Repair of Acute Scapho-Lunate Dissociation Facilitated by the “TAG” Suture Anchor, Journal of Hand Surgery (British and European Volume, 1994) 19B: 5: 563-564.
Petition for Inter Partes Review of U.S. Pat. No. 5,527,343, IPR 2013-00628, filed Sep. 25-26, 2013.
Petition for Inter Partes Review of U.S. Pat. No. 5,980,559, IPR 2013-00603, filed Sep. 24, 2013.
Petition for Inter Partes Review of U.S. Pat. No. 6,500,195, IPR 2013-00624, filed Oct. 2, 2013.
Petition for Inter Partes Review of U.S. Pat. No. 7,087,073, IPR 2013-00604, filed Sep. 4, 2013.
Problem Solving Report Question No. 1014984.066, Ultrasonic Welding, (c) 1999.
Richmond, Modification of the Bankart reconstruction with a suture anchor, Am J Sports Med, vol. 19, No. 4, p. 343-346, 1991.
Seitz et al, Repair of the Tibiofibular Syndesmosis with a Flexible Implant, J. of Orthopaedic Trauma, vol. 5, No. 1, p. 78-82, 1991 (cited in IPR 2013-00631, exhibit 1007) (cited in 2013-00632).
Shea et al, Technical Note: Arthroscopic Rotator Cuff Repair Using a Transhumeral Approach to Fixation, Arthroscopy: The Journal of Arthroscopic and Related Surgery, vol. 14, No. 1, Jan.-Feb. 1998: pp. 118-122.
Stent Based Delivery of Sirolimus Reduces Neointimal Formation in a Porcine Coronary Model, Takeshi Suzuki, American Heart Association, Inc. (c) 2001.
Textured Surface Technology, Branson Technology, Branson Ultrasonics Corp., (c) 1992.
T-Fix, Acufex just tied the knot . . . , Am. J. Sports Med., vol. 22, No. 3, May-Jun. 1994.
The Search for the Holy Grail: A Century of Anterior Cruciate Ligament Reconstruction, R. John Naranja, American Journal of Orthopedics, Nov. 1997.
Translation of DE9002844.9 with translator's certificate dated Sep. 26, 2013 (cited in IPR 2013-00631, 2013-00632).
Translation of FR2696338 with translator's certificate dated Sep. 17, 2013 (cited in IPR 2013-00631, 2013-00632).
Why Tie a Knot When You Can Use Y-Knot?, Innovasive Devices Inc., (c) 1998.
WO2015020093 English Translation Sep. 4, 2017.
Wong et al, Case Report: Proper Insertion Angle Is Essential to Prevent Intra-Articular Protrusion of a Knotless Suture Anchor in Shoulder Rotator Cuff Repair, Arthroscopy: The Journal of Arthroscopic and Related Surgery, vol. 26, No. 2, Feb. 2010: pp. 286-290.
Related Publications (1)
Number Date Country
20230372026 A1 Nov 2023 US
Provisional Applications (2)
Number Date Country
62369821 Aug 2016 US
62244460 Oct 2015 US
Continuations (4)
Number Date Country
Parent 17714191 Apr 2022 US
Child 18221592 US
Parent 16986467 Aug 2020 US
Child 17714191 US
Parent 16113666 Aug 2018 US
Child 16986467 US
Parent 15299981 Oct 2016 US
Child 16113666 US