Surgical visualization systems and displays

Information

  • Patent Grant
  • Patent Number
    11,147,443
  • Date Filed
    Monday, February 24, 2020
  • Date Issued
    Tuesday, October 19, 2021
  • Inventors
  • Original Assignees
  • Examiners
    • Li; Tracy Y.
  • Agents
    • Knobbe, Martens, Olson & Bear, LLP
Abstract
A medical apparatus is described for providing visualization of a surgical site. The medical apparatus includes an electronic display disposed within a display housing, the electronic display configured to produce a two-dimensional image. The medical apparatus includes a display optical system disposed within the display housing, the display optical system comprising a plurality of lens elements disposed along an optical path. The display optical system is configured to receive the two-dimensional image from the electronic display, produce a beam with a cross-section that remains substantially constant along the optical path, and produce a collimated beam exiting an opening in the display housing. The medical apparatus can also include an auxiliary video camera configured to provide an oblique view of a patient on the electronic display without requiring a surgeon to adjust their viewing angle through oculars viewing the electronic display.
Description
BACKGROUND
Field

Embodiments of the present disclosure relate to visualization systems and displays for use during surgery.


Description of Related Art

Some surgical operations involve the use of large incisions. These open surgical procedures provide ready access for surgical instruments and the hand or hands of the surgeon, allowing the user to visually observe and work in the surgical site, either directly or through an operating microscope or with the aid of loupes. Open surgery is associated with significant drawbacks, however, as the relatively large incisions result in pain, scarring, and the risk of infection as well as extended recovery time. To reduce these deleterious effects, techniques have been developed to provide for minimally invasive surgery. Minimally invasive surgical techniques, such as endoscopy, laparoscopy, arthroscopy, pharyngo-laryngoscopy, as well as small-incision procedures utilizing an operating microscope for visualization, utilize a significantly smaller incision than typical open surgical procedures. Specialized tools may then be used to access the surgical site through the small incision. However, because of the small access opening, the surgeon's view and workspace of the surgical site are limited. In some cases, visualization devices such as endoscopes, laparoscopes, and the like can be inserted percutaneously through the incision to allow the user to view the surgical site.


The visual information available to a user, whether unaided or through laparoscopic or endoscopic systems, involves trade-offs in approach. Accordingly, there is a need for improved visualization systems for use in open and/or minimally invasive surgery.


SUMMARY

The systems, methods and devices of the disclosure each have innovative aspects, no single one of which is solely responsible for the desirable attributes disclosed herein.


In a first aspect, a medical apparatus is provided that includes a display housing and an opening in the display housing. The medical apparatus also includes an electronic display disposed within the display housing, the electronic display comprising a plurality of pixels configured to produce a two-dimensional image. The medical apparatus also includes a display optical system disposed within the display housing, the display optical system comprising a plurality of lens elements disposed along an optical path. The display optical system is configured to receive the two-dimensional image from the electronic display, produce a beam with a cross-section that remains substantially constant along the optical path, and produce a collimated beam exiting the opening in the display housing.


In some embodiments of the first aspect, the display optical system further comprises an optical redirection element configured to fold the optical path. In a further embodiment of the first aspect the optical redirection element comprises a mirror or a prism. In another embodiment of the first aspect, the display optical system is configured to direct light received from the electronic display to the opening in the display housing while reducing stray light.


In some embodiments of the first aspect, the display optical system further comprises a baffle configured to reduce stray light. In a further embodiment, the display optical system comprises less than or equal to four baffles. In a further embodiment, the display optical system comprises less than or equal to four mirrors. In a further embodiment, a first baffle is positioned between the electronic display and a first mirror along the optical path, the first mirror positioned prior to the plurality of lens elements along the optical path from the display to the opening. In another further embodiment, at least three baffles are positioned prior to the plurality of lens elements along the optical path from the display to the opening. In another further embodiment, at least two mirrors are positioned prior to the plurality of lens elements along the optical path from the display to the opening.


In some embodiments of the first aspect, the display optical system has an exit pupil and the electronic display is not parallel to the exit pupil. In some embodiments of the first aspect, the opening in the display housing comprises a mounting interface configured to mate with a binocular assembly for a surgical microscope. In a further embodiment, an exit pupil of the display optical system is of a same size or smaller than an entrance pupil of oculars in the binocular assembly.


In some embodiments of the first aspect, the medical apparatus further comprises a second electronic display and a second display optical system configured to provide a stereo view. In some embodiments of the first aspect, the medical apparatus further comprises processing electronics configured to communicate with the electronic display to provide images for the electronic display. In a further embodiment, the processing electronics are configured to receive images from one or more cameras on a surgical device. In a further embodiment, the processing electronics are configured to receive images from one or more cameras that provide a surgical microscope view.


In some embodiments of the first aspect, the optical path is less than or equal to 16.2 inches and a light-emitting portion of the electronic display has a diagonal measurement that is greater than or equal to 5 inches. In some embodiments of the first aspect, the optical path is less than or equal to 18.7 inches and a light-emitting portion of the electronic display has a diagonal measurement that is greater than or equal to 8 inches. In some embodiments of the first aspect, the display optical system further comprises a converging mirror. In some embodiments of the first aspect, the medical apparatus further comprises a viewing assembly comprising an objective lens, beam positioning optics, and an ocular, the viewing assembly configured to receive the collimated beam exiting the opening in the display housing. In some embodiments of the first aspect, the electronic display has a diagonal light-emitting portion between 4 inches and 9 inches. In some embodiments of the first aspect, an optical path length from the electronic display to a last element of the display optical system is at least 9 inches. In a further embodiment, the optical path length from the electronic display to the last element of the display optical system is less than 20 inches.


In a second aspect, a medical apparatus is provided that includes a viewing assembly comprising a housing and an ocular, the ocular configured to provide a view of an electronic display disposed in the housing. The medical apparatus includes an optical assembly disposed on the viewing assembly, the optical assembly configured to provide a surgical microscope view of a surgical site. The optical assembly includes an auxiliary video camera and a gimbal configured to couple the auxiliary video camera to the viewing assembly and configured to change an orientation of the auxiliary video camera relative to the viewing assembly. The medical apparatus includes an image processing system in communication with the optical assembly and the electronic display, the image processing system comprising at least one physical processor. The image processing system is configured to receive video images acquired by the auxiliary video camera, provide output video images based on the received video images, and present the output video images on the electronic display so that the output video images are viewable through the ocular. The gimbal is configured to adjust a pitch of the auxiliary video camera between a first position and a second position, wherein the auxiliary video camera has a first viewing angle perpendicular to a floor in the first position and a second viewing angle that is within about 10 degrees of parallel to the floor in the second position.


In some embodiments of the second aspect, the gimbal comprises two pivots. In a further embodiment, a first pivot is configured to adjust a pitch of the auxiliary video camera and a second pivot is configured to rotate the auxiliary video camera around an axis perpendicular to the floor.


In some embodiments of the second aspect, the gimbal is configured to adjust a pitch of the auxiliary video camera between the first position and a third position, wherein the auxiliary video camera has a third viewing angle in the third position that is less than or equal to 180 degrees from the first viewing angle. In some embodiments of the second aspect, the gimbal is electronically controlled. In some embodiments of the second aspect, the optical assembly is configured to provide an oblique view of a portion of a patient. In a further embodiment, an orientation of the ocular of the viewing assembly is configured to remain stationary when an orientation of the auxiliary video camera changes to provide the oblique view of the portion of the patient.


In some embodiments of the second aspect, the gimbal is configured to smoothly adjust the viewing angle of the auxiliary video camera between the first position and the second position. In some embodiments of the second aspect, the auxiliary video camera comprises a stereo video camera and the ocular comprises a pair of oculars. In some embodiments of the second aspect, the medical apparatus further comprises a camera arm attached to the viewing assembly.


In a third aspect, a medical apparatus is provided that includes a display housing. The medical apparatus includes a plurality of electronic displays disposed within the display housing, each of the plurality of electronic displays comprising a plurality of pixels configured to produce a two-dimensional image. The plurality of electronic displays are configured to present superimposed images in a field of view of a person's eye.


In some embodiments of the third aspect, the medical apparatus further comprises a binocular viewing assembly coupled to the display housing. In some embodiments of the third aspect, at least one of the plurality of electronic displays comprises a transmissive display panel. In some embodiments of the third aspect, the superimposed images comprise a video of a first portion of a surgery site that is superimposed on a video of a second portion of the surgery site, the first portion contained within the second portion. In a further embodiment, the video of the first portion is magnified relative to the video of the second portion.


In some embodiments, a medical apparatus can include a camera having a field of view that can be designed to include a surgical site, wherein the camera is designed to provide a surgical microscope view of the surgical site. In some embodiments, the medical apparatus can include a binocular viewing assembly having a housing and a plurality of oculars, the plurality of oculars designed to provide views of at least one display disposed in the housing. In some embodiments, the medical apparatus can include an image processing system designed to receive images acquired by the camera, provide output video images based on the received images, and present the output video images on the at least one display. In some embodiments, the medical apparatus can include a movement control system designed to move the camera relative to the binocular viewing assembly, the movement control system having a control member operatively coupled to the movement control system to translate the camera relative to the binocular viewing assembly along at least a first axis and a second axis and to rotate the camera relative to the binocular viewing assembly.


In a fourth aspect, a medical apparatus is provided wherein a movement control system can include a translation system having a moveable platform to which the camera is attached, the moveable platform being positioned between the binocular viewing assembly and the camera and being moveable relative to the binocular viewing assembly along at least a first axis and a second axis. In some embodiments, the translation system can include an electromechanical device operatively coupled to the moveable platform.


In some embodiments of the fourth aspect, the movement control system can include a pitch-yaw adjustment system having an electromechanical device to which the camera can be attached, the pitch-yaw adjustment system designed to rotate the camera relative to the binocular viewing assembly around an axis parallel to the first axis and rotate the camera around an axis parallel to the second axis. In some embodiments, the control member is operatively coupled to the movement control system via sensors designed to detect movement of the control member, the sensors in communication with components of the movement control system. In some embodiments, the control member can be operatively coupled to the movement control system via a gimbal having one or more sensors designed to detect movement of the control member, the sensors in communication with one or more components of the movement control system.


In some embodiments of the fourth aspect, the movement control system can be attached to the binocular viewing assembly. In some embodiments, the movement control system can be attached to an articulated arm. In some embodiments, the camera can be attached to the movement control system via an arm. In some embodiments, the medical apparatus can include a control system for controlling one or more electromechanical devices operatively coupled to the movement control system. In some embodiments, the control system can include one or more pre-set positions for the movement control system.


In a fifth aspect, a medical apparatus is provided that includes a display, a plurality of cameras, and a processor, at least one of said cameras providing a surgical microscope view. Said plurality of cameras comprises a first camera configured to image fluorescence in a surgical field and a second camera configured to produce a non-fluorescence image of said surgical field. The processor is configured to receive video from said plurality of cameras and to display on said display a first fluorescence video from said first camera and a second non-fluorescence video from said second camera.


In some embodiments of the fifth aspect, said first and second cameras have different spectral responses. In certain embodiments of the fifth aspect, one of said first and second cameras is sensitive to infrared and the other is not.





BRIEF DESCRIPTION OF THE DRAWINGS

Throughout the drawings, reference numbers can be reused to indicate general correspondence between reference elements. The drawings are provided to illustrate example embodiments described herein and are not intended to limit the scope of the disclosure.



FIG. 1 illustrates an embodiment of the surgical visualization system having an imaging system that can be configured to provide imagery similar to a direct-view surgery microscope.



FIG. 2 illustrates an example surgical viewing system attached to an articulating arm, the system including one or more cameras mounted on a binocular viewing platform.



FIGS. 3A and 3B illustrate an example surgical viewing system that includes an isocenter positioning system attached to the binocular viewing platform.



FIGS. 4A and 4B illustrate an embodiment of a surgical visualization system having an optical imaging system mounted under the binocular viewing platform.



FIGS. 5A-5E illustrate embodiments of optical imaging systems for use in a stereoscopic surgical viewing system, such as those illustrated in FIGS. 4A and 4B.



FIG. 6A is a front view of an embodiment of a surgical visualization system, a movement control system, and an imager.



FIG. 6B is a front view of the embodiment of FIG. 6A with the movement control system and imager shifted.



FIG. 6C is a partial section view of the embodiment of a movement control system of FIG. 6A.



FIG. 7 is a side view of an embodiment of a surgical visualization system, a movement control system, and an imager.



FIG. 8 is a rear view of an embodiment of a movement control system.



FIGS. 9A-9D illustrate example display optical systems configured to provide a view of a display or a pair of displays through oculars.



FIGS. 10A-10G illustrate example display optical systems configured to deliver to oculars images of a display wherein light paths that would intersect a viewing assembly are reduced or eliminated through baffles.





DETAILED DESCRIPTION

The following description is directed to certain embodiments for the purposes of describing the innovative aspects of this disclosure. However, a person having ordinary skill in the art will readily recognize that the teachings herein can be applied in a multitude of different ways. The described embodiments may be implemented in any device or system that can be configured to provide visualization of a surgical site. Thus, the teachings are not intended to be limited to the embodiments depicted solely in the figures and described herein, but instead have wide applicability as will be readily apparent to one having ordinary skill in the art.


Surgical Visualization System


To provide improved visualization of a surgical site, a surgical device can be provided with multiple integrated cameras. Each of the cameras may capture a distinct view of the surgical site. In some embodiments, imagery from the plurality of cameras may be displayed to facilitate operation in a surgical site. Tiled, individual, and/or stitched imagery from the multiple cameras can provide the user with a view of the surgical site. The user can select the imagery to be displayed and the manner in which it is displayed for enhanced utility during surgery. As used herein, the terms imagery and images include video and/or images captured from one or more video cameras. Images from video are often referred to as video images or simply images. The terms may also refer to still images or snapshots. Video feed or video stream may also be used to describe the video images, such as video images from a camera.
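The tiled-composite idea above can be sketched in a few lines. This is an illustrative, pure-Python model only (the patent does not specify an implementation); frames are represented as 2-D lists of pixel values, and the grid-layout logic stands in for what a real system would do on GPU-backed buffers.

```python
# Hypothetical sketch of composing a tiled view from several camera frames.
# Frames are modeled as 2-D lists of pixel values; a real system would use
# hardware video buffers, but the layout logic is the same.

def tile_frames(frames, columns):
    """Arrange equally sized frames into a grid, left to right, top to bottom."""
    rows = []
    for start in range(0, len(frames), columns):
        group = frames[start:start + columns]
        height = len(group[0])
        for y in range(height):
            # Concatenate the y-th scanline of each frame in this grid row.
            rows.append([px for frame in group for px in frame[y]])
    return rows

# Two 2x2 frames tiled side by side produce a 2x4 composite.
a = [[1, 1], [1, 1]]
b = [[2, 2], [2, 2]]
composite = tile_frames([a, b], columns=2)
```

Stitching (blending overlapping fields of view) would add registration and seam handling on top of this layout step.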


The video cameras may comprise, for example, CCD or CMOS sensor arrays or other types of detector arrays. A frame grabber may be configured to capture data from the cameras. For example, the frame grabber may be a Matrox Solios eA/XA 4-input analog frame grabber board. Image processing of the captured video may be undertaken. Such image processing can be performed by, for example, the Matrox Supersight E2 with Matrox Supersight SHB-5520 with two Intel Six Core Xeon E5645 2.4 GHz processors with DDR3-1333 SDRAM. This system can be designed to support eight or more camera inputs using two Matrox Solios eA/XA 4-input analog frame grabber boards. More or fewer cameras may be employed. In some implementations, a field programmable gate array (“FPGA”) can be used to capture and/or process video received from the cameras. For example, the image processing can be performed by Xilinx series 7 FPGA boards. Other hardware devices can be used as well, including ASICs, DSPs, computer processors, graphics boards, and the like. The hardware devices can be standalone devices or they can be expansion cards integrated into a computing system through a local computer bus, e.g., a PCI card or PCIe card.
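The capture-then-process flow described above can be modeled as a small software sketch. The `Camera` class, the buffer, and the tagging step are hypothetical stand-ins for the hardware frame grabbers and processing boards named in the text, not an actual driver API.

```python
# Illustrative capture/processing pipeline; Camera and the "processing" step
# are hypothetical stand-ins for frame-grabber boards and their image
# processors. A deque models the grabber's frame buffer.
from collections import deque

class Camera:
    """Stand-in for a hardware camera: yields numbered frames on demand."""
    def __init__(self, camera_id):
        self.camera_id = camera_id
        self._counter = 0

    def grab(self):
        self._counter += 1
        return {"camera": self.camera_id, "frame": self._counter}

def capture_round(cameras, buffer):
    """Grab one frame from each camera, as a frame-grabber board would."""
    for cam in cameras:
        buffer.append(cam.grab())

def process(buffer):
    """Drain the buffer, tagging each frame as processed."""
    out = []
    while buffer:
        frame = buffer.popleft()
        frame["processed"] = True
        out.append(frame)
    return out

cameras = [Camera(i) for i in range(4)]   # e.g., four camera inputs
buffer = deque()
capture_round(cameras, buffer)
frames = process(buffer)
```

In a real system the capture and processing stages would run concurrently, with the FPGA or frame grabber filling the buffer while a downstream processor drains it.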



FIG. 1 shows an example embodiment of a surgical visualization system 1. As illustrated, the system 1 includes a console and electronics 3 from which three arms 5, 7 and 7b extend. The first arm 5 has mounted to its distal end a viewing platform 9. The viewing platform may include two oculars 11 and be configured similarly to a standard surgical microscope viewing platform. In some embodiments, however, unlike a conventional surgical microscope or a head-mounted display, the viewing platform 9 is not a direct-view device where the surgeon or other user sees directly through the platform, e.g., through an aperture in the platform. In some embodiments, regardless of whether the user can view directly through the viewing platform, the surgical visualization system 1 can be configured to display video in a manner that is decoupled from movement of the surgical microscope cameras, such that a user can adjust the position and/or orientation of the surgical microscope cameras without moving the oculars 11 or adjusting the user's position. As discussed in more detail below, the viewing platform 9 may include displays that receive signals from cameras that the surgeon or user employs to view the surgical site.


In some embodiments, cameras can be mounted to the viewing platform 9 and the cameras can be configured to provide imagery of the surgical site. Accordingly, the cameras can be used to provide imagery similar to a conventional surgical microscope. For example, the cameras on the viewing platform 9 can be configured to provide a working distance, or a distance from the viewing platform 9 to the patient, that can vary using zooming. The virtual working distance can vary, where the working distance can be at least about 150 mm and/or less than or equal to about 450 mm, at least about 200 mm and/or less than or equal to about 400 mm, or at least about 250 mm and/or less than or equal to about 350 mm. The working distance can be selected and/or changed by the surgeon. In some embodiments, changing the working distance does not affect the position and/or orientation of the oculars 11 with respect to the user or surgeon. In some embodiments, the cameras mounted on the viewing platform 9 can be used to provide gesture recognition to allow a surgeon to virtually interact with imagery provided by the display using the surgeon's hands, a surgical tool, or both, as described in greater detail herein.
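The relationship between working distance and zoom described above can be made concrete with a simple geometric sketch. The linear small-angle model below is an illustrative assumption, not taken from the patent: for a fixed camera field of view, the width of the imaged patch grows roughly linearly with working distance, so holding the displayed field constant requires a compensating zoom factor.

```python
# Hedged sketch: approximate geometry relating working distance, camera
# field of view, and the zoom factor needed to keep the displayed field
# constant. The linear pinhole-camera model is an illustrative assumption.
import math

def field_width(working_distance_mm, fov_degrees):
    """Approximate width of the imaged area at a given working distance."""
    return 2 * working_distance_mm * math.tan(math.radians(fov_degrees / 2))

def zoom_to_hold_field(old_distance_mm, new_distance_mm):
    """Zoom factor that keeps the displayed field width unchanged."""
    return new_distance_mm / old_distance_mm

# Moving from a 150 mm to a 300 mm working distance doubles the imaged
# width, so a 2x zoom restores the original displayed field.
factor = zoom_to_hold_field(150, 300)
```

Because the zoom compensation happens electronically, the oculars 11 need not move when the working distance changes, consistent with the decoupling described above.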


The second arm 7 has mounted to its distal end an input and display device 13. In some embodiments, the input and display device 13 comprises a touchscreen display having various menu and control options available to a user. In some embodiments, the touchscreen can be configured to receive multi-touch input from ten fingers simultaneously, allowing a user to interact with virtual objects on the display. For example, an operator may use the input device 13 to adjust various aspects of the displayed image. In various embodiments, the surgeon display incorporating a video camera providing a surgical microscope view may be mounted on a free-standing arm, from the ceiling, on a post, or the like. The flat-panel touchscreen display 13 may be positioned on a tilt/rotate device on top of the electronics console.


A surgical tool 17 can be connected to the console 3 by an electrical cable 19. The surgical tool 17 can include, for example, a cutting tool, a cleaning tool, or other such devices. In other embodiments, the surgical tool 17 may be in wireless communication with the console 3, for example via WiFi (e.g., IEEE 802.11a/b/g/n), Bluetooth, NFC, WiGig (e.g., IEEE 802.11ad), etc. The surgical tool 17 may include one or more cameras configured to provide imagery, e.g., image and/or video data. In various embodiments, video data can be transmitted to a video switcher, camera control unit (CCU), video processor, or image processing module positioned, for example, within the console 3. The video switching module may then output a display video to the viewing platform 9. The operator may then view the displayed video through the oculars 11 of the viewing platform 9. In some embodiments, the binoculars permit 3D viewing of the displayed video. As discussed in more detail below, the displayed video viewed through the viewing platform 9 may comprise a composite video formed (e.g., stitched or tiled) from two or more of the cameras on the surgical tool 17.
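The video-switching stage mentioned above can be sketched as a simple routing function. The feed names and the routing policy below are illustrative assumptions, not details from the patent: the switcher maps an operator's selection of named sources to the frames forwarded to the viewing platform.

```python
# Minimal sketch of a video-switching stage: named camera feeds are routed
# to the viewing-platform output. Feed names are hypothetical placeholders.

def switch(feeds, selection):
    """Return the frames routed to the display for the selected sources."""
    missing = [name for name in selection if name not in feeds]
    if missing:
        raise KeyError(f"unknown feeds: {missing}")
    return [feeds[name] for name in selection]

feeds = {
    "microscope": "frame-from-platform-camera",
    "tool-cam-1": "frame-from-surgical-tool",
}
routed = switch(feeds, ["microscope", "tool-cam-1"])
```

Routing two or more sources at once is the point at which tiling or stitching (discussed above) would combine them into a single composite for the oculars.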


In use, an operator may use the surgical tool 17 to perform open and/or minimally invasive surgery. The operator may view the surgical site by virtue of the displayed video in the viewing platform 9. Accordingly, the viewing platform (surgeon display system) 9 may be used in a manner similar to a standard surgical microscope, although, as discussed above, the viewing platform 9 need not be a direct-view device wherein the user sees directly through the platform 9 to the surgical site via an optical path from the ocular through an aperture at the bottom of the viewing platform 9. Rather, in various embodiments, the viewing platform 9 includes a plurality of displays, such as liquid crystal or light emitting diode displays (e.g., LCD, AMLCD, LED, OLED, etc.), that form an image visible to the user by peering into the ocular. One difference, then, is that the viewing platform 9 itself need not necessarily include a microscope objective or a detector or other image-capturing mechanisms. Rather, the image data can be acquired via the cameras of the surgical tool 17. The image data can then be processed by a camera control unit, video processor, video switcher, or image processor within the console 3, and the displayed imagery may then be viewable by the operator at the viewing platform 9 via the display devices, e.g., liquid crystal or LED displays, contained therein. In some embodiments, the viewing platform 9 can provide a view similar to a standard surgical microscope using cameras and displays and can be used in addition to or in conjunction with a standard surgical microscope optical pathway in the viewing platform. In certain embodiments, the viewing platform 9 can provide a surgical microscope view wherein changes in the viewing angle, viewing distance, work distance, zoom setting, focal setting, or the like are decoupled from movement of the viewing platform 9.
In certain embodiments, changes in the position, pitch, yaw, and/or roll of the imaging system 18 are decoupled from the viewing platform 9 such that the imaging system 18 can move and/or re-orient while the surgeon can remain stationary while viewing video through the oculars 11.


The third arm 7b can include an imaging system 18 that can be configured to provide video similar to a direct-view surgical microscope. The imaging system 18 can thus be configured to provide an electronic microscope-like view comprising video of the work site or operational site from a position above the site (e.g., about 15-45 cm above the surgical site) or from another desired angle. By decoupling the imagers 18 from the display, the surgeon can manipulate the surgical imaging system to provide a desired or selected viewpoint without having to adjust the viewing oculars. This can advantageously provide an increased level of comfort, capability, and consistency to the surgeon compared to traditional direct-view operating microscope systems. In some embodiments, as described herein, the imagers 18 can be located on the viewing platform 9, on a dedicated arm 7b, on a display arm 5, on a separate post, a separate stand, supported from an overhead structure, supported from the ceiling or wall, or detached from other systems. The imagers 18 can comprise a camera configured to be adjustable to provide varying levels of magnification, viewing angles, monocular or stereo imagery, convergence angles, working distance, or any combination of these.


The viewing platform 9 can be equipped with wide field-of-view oculars 11 that are adjustable for refractive error and presbyopia. In some embodiments, the oculars 11, or eyepieces, may additionally include polarizers in order to provide for stereoscopic vision. The viewing platform 9 can be supported by the arm 7 or 7b, such that it may be positioned for the user to comfortably view the display 13 through the oculars 11 while in position to perform surgery. For example, the user can pivot and move the arm 7 or 7b to re-orient and/or re-position the viewing platform 9.


In some embodiments, the image processing system and the display system are configured to display imagery placed roughly at infinity to reduce or eliminate accommodation and/or convergence when viewing the display. A display optical system can include one or more lenses and one or more redirection elements (e.g., mirrors, prisms) and can be configured to provide light from the display that can be imaged by a binocular viewing assembly comprising a pair of oculars, objectives, and/or turning prisms or mirrors. The display devices such as liquid crystal displays can be imaged with the objective and the pair of oculars and display optical system within the viewing platform 9. The binocular assembly and display optical system can be configured to produce an image of the displays at infinity. Such arrangements may potentially reduce the amount of accommodation by the surgeon. The oculars can also have adjustments (e.g., of focus or power) to address myopia or hyperopia of the surgeon. Accordingly, the surgeon or other users may view the displays through the oculars without wearing glasses even if ordinarily prescription glasses were worn for other activities.
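The "image at infinity" arrangement above follows directly from the thin-lens equation: when the display sits at the focal plane of the collimating optics, the image distance diverges, the exiting beam is collimated, and the eye need not accommodate. The numbers below are illustrative, not taken from the patent.

```python
# Illustrative thin-lens sketch of why a display at the focal plane appears
# at infinity: 1/f = 1/d_o + 1/d_i, so d_i diverges as the object (display)
# distance d_o approaches the focal length f. Distances in mm, sign
# convention simplified for a real object and real/virtual image.
import math

def image_distance(focal_length, object_distance):
    """Thin-lens image distance; infinite when the object sits at the focus."""
    if math.isclose(object_distance, focal_length):
        return math.inf
    return 1.0 / (1.0 / focal_length - 1.0 / object_distance)

collimated = image_distance(50.0, 50.0)    # display at the focal plane
real_image = image_distance(50.0, 100.0)   # display beyond the focus
```

A collimated (infinite-conjugate) output is also what lets the binocular assembly's oculars, adjusted for the surgeon's refractive error, image the display comfortably.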


In some embodiments, the viewing platform 9 can include one or more imagers configured to provide electronic microscope-like imaging capabilities. FIG. 2 illustrates an example surgical imaging system 51 attached to an arm 7, the system 51 including one or more cameras 18 mounted on a viewing platform 9. The cameras 18 can be configured to provide imagery of a worksite. The image data can be presented on a display that the user can view using oculars 11 mounted on the viewing platform 9. This design can be used to mimic other direct-view microscopes, but it can also be configured to provide additional capabilities. For example, the surgical imaging system 51 can be configured to have a variable working distance without adjusting the viewing platform 9 or the articulating arm 7. The surgical imaging system 51 can be configured to provide image processing capabilities such as electronic zooming and/or magnification, image rotation, image enhancement, stereoscopic imagery, and the like. Furthermore, the imagery from the cameras 18 can be combined with imagery from cameras on the surgical device 17. In some embodiments, the surgical imaging system 51 can provide fluorescence images.


Although the discussion considers images from surgical tools, numerous embodiments may involve at least one auxiliary video camera 18 and one or more other cameras that are not disposed on surgical tools but are disposed on other medical devices. These medical devices may include devices introduced into the body such as endoscopes, laparoscopes, arthroscopes, etc.


Accordingly, one or more displays, such as the at least one display 13 included in the viewing platform 9, may be used to provide a surgical microscope view using one or more cameras, such as the auxiliary video camera(s) 18, as well as to display views from one or more cameras located on such medical devices other than surgical tools. In some embodiments, cameras from a variety of sources, e.g., surgical tools and other medical devices, in any combination, may be viewed on the display(s) on the surgical platform together with the surgical microscope view from the auxiliary video cameras 18. As described herein, the displays may provide 3D; thus, any of the images and graphics may be provided in 3D.


In various embodiments, a virtual touchscreen may be provided by the auxiliary video cameras 18 or other virtual touchscreen cameras mounted to the viewing platform 9. Accordingly, in some embodiments a user may provide a gesture in the field of view of the auxiliary video cameras and/or virtual touchscreen cameras, and the processing module can be configured to recognize the gesture as an input. Although the virtual touchscreen has been described in the context of the auxiliary video cameras 18, other cameras, e.g., virtual reality input cameras, possibly in addition to the auxiliary video cameras 18, may be used. These cameras may be disposed on the viewing platform 9 or elsewhere, such as the third arm 7b. As described herein, the displays may provide 3D; thus, the virtual reality interface may appear in 3D. This may increase the immersive quality of the viewing experience, enhancing the detail and/or realistic presentation of video information on the display.


In some embodiments, as illustrated in FIG. 3A, the surgical imaging system 51 includes an isocenter positioning system 52 attached to the viewing platform 9. The isocenter positioning system 52 can include a single track or guide configured to move and orient the cameras 18 such that they are substantially pointed at a single point 53, the isocenter. In some embodiments, a second track or guide can be attached to the first guide in an orthogonal manner to provide movement along two dimensions while substantially maintaining the pointing angle towards the isocenter 53. Other configurations can be used to provide isocenter pointing capabilities, such as articulating arms, electro-mechanical elements, curved friction plates, etc. In some embodiments, as illustrated in FIG. 3B, the imaging system is configured to move in an isocenter manner. This can enhance the dexterity of the user of the system because hand-eye coordination is increased or maximized. Such enhanced dexterity can be vital for prolonged and/or difficult surgery. In the illustrated embodiment, the horizons of the acquisition systems are configured to be horizontal to match the horizon of the display system and the user. As shown in FIG. 3B, in various embodiments, a stereo imaging system may be maintained in a horizontal configuration as it is moved across a range of locations to avoid confusion for the user viewing the video from the stereo camera. By maintaining a common relative horizon between the display and the acquisition system, the user can relatively easily translate hand motion into manipulation of objects in the display, which may not be the case where translation of the acquisition system is accompanied by a relative rotation between the display and the acquisition system.


In the embodiments illustrated in FIGS. 3A and 3B, the isocenter assemblies can be a part of the display system or a separate, independent system. For example, the viewing platform 9 can be mounted on a separate arm from the cameras 18. Thus, the display and the image acquisition of the surgical imaging system can be decoupled, similar to the embodiment illustrated in FIG. 1. Decoupling the isocenter cameras 18 from the display provides ergonomic benefits; for example, the surgeon need not look through binoculars for an extended period of time or at an uncomfortable position or angle. In various embodiments, a common relative horizon for both the display and the acquisition system may also be employed.


In some embodiments, the distance between the surgical site of interest and the imagers, e.g., the working distance, can be at least about 20 cm and/or less than or equal to about 450 cm, at least about 10 cm and/or less than or equal to about 50 cm, or at least about 5 cm and/or less than or equal to about 1 m, although values outside this range are possible.


The user can interact with the surgical imaging system 51 to select a working distance, which can be fixed throughout the procedure or which can be adjusted at any point in time. Changing the working distance can be accomplished using elements on a user interface, such as a graphical user interface, or using physical elements such as rotatable rings, knobs, pedals, levers, buttons, etc. In some embodiments, the working distance is selected by the system based at least in part on the cables and/or tubing being used in the surgical visualization system. For example, the cables and/or tubing can include an RFID chip or an EEPROM or other memory storage that is configured to communicate information to the surgical imaging system 51 about the kind of procedure to be performed. For an ENT/Head/Neck procedure, the typical working distance can be set to about 40 cm. In some embodiments, the user's past preferences are remembered and used, at least in part, to select a working distance.
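The working-distance selection described above can be sketched as a simple lookup. This is a hypothetical illustration, not the system's actual logic: the procedure identifiers and all distances other than the 40 cm ENT/Head/Neck example are assumed values, as is the preference-override behavior.

```python
# Illustrative sketch of working-distance selection. Only the 40 cm
# ENT/Head/Neck value comes from the text; the rest are assumptions.

DEFAULT_WORKING_DISTANCE_CM = 30.0  # assumed fallback value

# Defaults keyed by a procedure type, as might be read from an RFID chip
# or EEPROM on the attached cables/tubing.
PROCEDURE_DEFAULTS_CM = {
    "ENT_HEAD_NECK": 40.0,  # per the example in the text
    "SPINE": 35.0,          # assumed value
    "NEURO": 25.0,          # assumed value
}


def select_working_distance(procedure_id, user_preference_cm=None):
    """Return a working distance in cm, preferring the user's saved choice."""
    if user_preference_cm is not None:
        return user_preference_cm
    return PROCEDURE_DEFAULTS_CM.get(procedure_id, DEFAULT_WORKING_DISTANCE_CM)
```

A remembered user preference, when present, overrides the procedure default, mirroring the text's note that past preferences can be used in the selection.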


In some embodiments, gross focus adjustment can be accomplished manually by positioning the cameras 18 and arm 7. The fine focus adjustment can be done using other physical elements, such as a fine focusing ring, or it can be accomplished electronically.


In some embodiments, the magnification of the surgical imaging system 51 can be selected by the user using physical or virtual user interface elements. The magnification can change and can range between about 1× and about 6×, between about 1× and about 4×, or between about 1× and about 2.5×. Embodiments may be able to change between any of these ranges, such as between 2.5× and 6×. Values outside these ranges are also possible. For example, the system 51 can be configured to provide magnification and demagnification and image inversion, with a range from about −2× to about 10×, from about −2× to about 8×, from about −2× to about 4×, from about −0.5× to about 4×, or from about −0.5× to about 10×. The surgical imaging system 51 can be configured to decouple zoom features and focus adjustments, to overcome problems with traditional operating room microscopes. In some embodiments, the surgical visualization system 51 can be used to provide surgical microscope views. In some embodiments, the surgical imaging system 51 can reduce or eliminate instrument myopia by providing an electronic display instead of a direct view of a scene. The electronic displays can be configured to be focused at varying levels of magnification, allowing the user to view the displays without adjusting the oculars between magnification adjustments. Moreover, in various embodiments, the oculars can be configured to provide continuous views at infinity. In some embodiments, however, the principal user of the surgical imaging system may select an accommodation level for the oculars, rather than using a relaxed view provided by the electronic displays. The electronic displays, in various embodiments, can nonetheless remain in focus, and the ocular adjustments do not affect the focus of the various video acquisition systems.
Thus, adjustments by the principal user do not affect the views of the other users of the system viewing, for example, other displays showing the video, as the cameras/acquisition systems can remain focused. In some embodiments, the surgical imaging system 51 can be focused at a relatively close working distance (e.g., a distance with a relatively narrow depth of field) such that the image remains focused when moving to larger working distances (e.g., distances with broader depth of field). Thus, the surgical imaging system 51 can be focused over an entire working range, reducing or eliminating the need to refocus the system after magnification or zoom adjustments are made.



FIGS. 4A and 4B illustrate an embodiment of the surgical imaging system 51 having an optical system 53 mounted under the viewing platform 9. As illustrated, the optical components are shown as free-standing to show the structure of the components, but in practice the optical components 53 will be mounted within or on a structure attached to the viewing platform. In some embodiments, the optical system 53 and/or the cameras 18 (discussed above) can be modular and can be selected and swapped for use with the surgical imaging system 51. Paragraph [0489] from each of U.S. Prov. App. No. 61/880,808, U.S. Prov. App. No. 61/920,451, U.S. Prov. App. No. 61/921,051, U.S. Prov. App. No. 61/921,389, U.S. Prov. App. No. 61/922,068, and U.S. Prov. App. No. 61/923,188 is incorporated by reference herein.


The optical system 53 is configured to provide stereo image data to the imaging system 51. The optical system 53 includes a turning prism 54 to fold the optical path underneath the viewing platform 9 to decrease the physical extent (e.g., length) of the imaging system under the viewing platform 9.


In some embodiments, the optical system 53 comprises a Greenough-style system wherein the optical paths for each eye have separate optical components. In some embodiments, the optical system 53 comprises a Galilean-style system wherein the optical paths for each eye pass through a common objective. The Greenough-style system may be preferable where imaging sensors are being used to capture and convey the image data as compared to the Galilean-style system. The Galilean system can introduce aberrations into the imagery by virtue of the rays for each eye's optical path passing through a periphery of the objective lens. This does not happen in the Greenough-style system as each optical path has its own optics. In addition, the Galilean system can be more expensive as the objective used can be relatively expensive based at least in part on the desired optical quality of the lens and its size.


The optical system 53 can include two right-angle prisms 54, two zoom systems 55, and two image sensors 56. This folded configuration differs from a traditional operating room microscope in that the optical path leads to image sensors rather than to a direct-view optical system.


In some embodiments, the optical system 53 can have a relatively constant F-number. This can be accomplished, for example, by varying the focal length and/or aperture of the system based on working distance and/or magnification. In one embodiment, as the focal length changes, the eye paths can move laterally apart (or together), the prisms 54 can rotate to provide an appropriate convergence angle, and the apertures can change their diameters to maintain the ratio of the focal length to the diameter at a relatively constant value. This can produce a relatively constant brightness at the image sensor 56, which can result in a relatively constant brightness being displayed to the user. This can be advantageous in systems, such as the surgical visualization systems described herein, where multiple cameras are being used and changing an illumination to compensate for changes in focal length, magnification, working distance, and/or aperture can adversely affect imagery acquired with other cameras in the system. In some embodiments, the illumination can change to compensate for changes in the focal length and/or the aperture so as to provide a relatively constant brightness at the image sensors 56.
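The constant-F-number behavior described above follows from the standard definition N = f/D: as the zoom varies the focal length f, scaling the aperture diameter D proportionally holds N, and hence the image-plane brightness, roughly constant. The sketch below illustrates that relationship only; it is not the system's actual control law.

```python
def aperture_for_constant_f_number(focal_length_mm, f_number):
    """Aperture diameter D = f / N that holds the F-number N constant
    as the focal length changes with zoom."""
    return focal_length_mm / f_number


def relative_illuminance(f_number):
    """Image-plane illuminance scales roughly as 1 / N**2 (ignoring
    transmission losses), so a constant F-number yields a roughly
    constant brightness at the sensor."""
    return 1.0 / f_number ** 2
```

For example, doubling the focal length from 50 mm to 100 mm at N = 4 requires the aperture diameter to grow from 12.5 mm to 25 mm to keep the brightness unchanged.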


The optical assembly 53 can include a zoom system 55 configured to provide a variable focal distance and/or zoom capabilities. A Galilean-style stereoscopic system generally includes a common objective for the two eye paths. When this optical system is imaged with image sensors 56, it can create aberrations, wedge effects, etc. that can be difficult to compensate for. In some embodiments, the surgical imaging system 51 can include a Galilean-style optical system configured to re-center at least one of the stereo paths to a central location through the objective lens, which can be advantageous in some applications.


In some embodiments, the real-time visualization system utilizes a Greenough-style system. This can have separate optical components for each stereo path. The optical assembly 53 can be configured to provide variable magnification and/or a focal zoom and can be configured to operate in a magnification range from about 1× to about 6×, or from about 1× to about 4×, or from about 1× to about 2.5×.


The distal-most portion of the Greenough assembly 53 can be similar in functionality to an objective lens of a typical, direct-view operating room microscope, with the working distance set approximately to that of the focal length. The working distance, and in some implementations the focal length, can be between about 20 cm and about 40 cm, for example. In some embodiments, the working distance may be adjustable from 15 cm to 40 cm or to 45 cm. Other values outside these ranges are also possible. In some embodiments, the surgical imaging system 51 includes an opto-mechanical focus element configured to vary the focal length of a part of the optical assembly 53 or the whole optical assembly 53.



FIGS. 5A-5E illustrate embodiments of optical assemblies 53 for use in a stereoscopic surgical imaging system, such as those described herein with reference to FIGS. 4A-4B. FIG. 5A illustrates a side view of an example optical assembly 53 configured to use a turning prism 54 to fold an optical path from a tissue 57 to a sensor 56 along a lens train 55 that is situated near or adjacent to a viewing platform 9. This can advantageously provide a relatively long optical path in a relatively compact distance.



FIG. 5B illustrates a front view of an embodiment of an optical assembly configured to change a convergence angle in a stereoscopic imaging system. The prisms 54 can be the turning prism 54 illustrated in FIG. 5A. The prisms 54 can be configured to rotate to change a convergence angle, and as a result, a convergence point and/or a working distance. The working distance, which can be a distance from the prisms 54 to the target 57 (e.g., tissue), can be user-selectable or adjustable. In various embodiments, with increased working distance to the target 57, the convergence angle can decrease. Conversely, when the working distance gets smaller, the convergence angle can increase (e.g., θ1>θ2). This can be advantageous where the lens path 55 is fixed and the working distance is adjustable. The stereo imagery can then be viewed on the display 59 by a user.
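The inverse relationship between working distance and convergence angle (θ1 &gt; θ2 as the distance increases) follows from simple symmetric geometry. The sketch below assumes a fixed lateral baseline between the two stereo paths and ignores prism refraction; the 24 mm baseline in the example is an assumed value.

```python
import math


def convergence_angle_deg(baseline_mm, working_distance_mm):
    """Full convergence angle (degrees) for two stereo optical paths
    separated laterally by `baseline_mm`, converging symmetrically on a
    target at `working_distance_mm`."""
    half_angle = math.atan((baseline_mm / 2.0) / working_distance_mm)
    return 2.0 * math.degrees(half_angle)
```

With a 24 mm baseline, the convergence angle falls from about 6.9 degrees at a 200 mm working distance to about 3.4 degrees at 400 mm, matching the behavior described above.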



FIG. 5C illustrates a front view of an embodiment of an optical assembly 53 that is configured to maintain a substantially constant convergence angle. The optical assembly 53 can include two prisms 54a and 54b for each optical path, wherein the prisms 54a, 54b can move and/or rotate. For example, when the working distance decreases, the first set of prisms 54a can rotate towards one another to decrease an effective distance between the second set of prisms 54b. The second set of prisms 54b can, in turn, rotate to compensate for the changed angle so as to converge on the common target. The second set of prisms 54b can direct the light to the first set of prisms 54a, which can then direct the light down the fixed lens paths 55 (e.g., fixed in their position relative to the viewfinder). By providing a relatively fixed convergence angle, a change in working distance may not require refocusing for the user. Maintaining a constant convergence angle, especially a comfortable angle, may reduce the strain on a user such as a surgeon performing a prolonged, arduous procedure.



FIG. 5D illustrates a front view of an embodiment of an optical assembly 53 configured to provide a substantially narrow convergence angle to be able to view stereoscopic imagery through a narrow insertion tube 60 (e.g., a tube partially inserted into a body during a procedure). A similar assembly 53 can be used as described with reference to FIG. 5C, and the convergence angle can be maintained substantially constant or at least sufficiently narrow to view through the insertion tube 60.



FIG. 5E illustrates a front view of an embodiment of an optical assembly 53 configured to provide a substantially constant convergence angle by moving the lens paths 55 laterally, e.g., toward or away from one another. The prisms 54 can be made to have a substantially constant orientation (e.g., no rotation for changing working distances) and compensation for changing working distance can be accomplished by translating the optical paths laterally to separate or join the optical paths. The translation of the optical paths can be accomplished using any suitable means including, for example, electro-mechanical actuators, slides, articulating arms, etc. This can simplify the optical assembly compared to the embodiments having two sets of prisms as only one set of prisms may be used when the lens paths are configured to move.


The embodiments of the optical assembly 53 which are configured to maintain a sufficiently narrow convergence angle can be advantageous as they allow stereo access to narrow surgical entries by allowing the angle to decrease and avoid clipping one of the stereo paths. For example, the left and right lens paths can move closer to one another and the prisms can adjust to the proper convergence angle for that distance. As another example, the left and right lens paths can remain fixed and there can be sets of prisms for each path configured to direct the light along the lens paths while maintaining a substantially constant convergence angle. In some embodiments, maintaining a constant convergence angle can be visually helpful to the user when zoom changes, e.g., because the changing depth cues do not confuse the user's eye and/or brain. In addition, constant convergence may induce less stress on the user.


Movement Control System



FIGS. 6A-C illustrate embodiments of components of a movement control system 10100 that can be configured to allow an operator of the surgical visualization system 1, such as a medical professional or assistant, to control the movement of one or more imagers 18. Such imagers may comprise cameras that provide a surgical microscope view through the oculars 11 or eyepieces of the binocular display unit 9. In various embodiments, the movement control system can enable the imagers 18 to be moved without changing the positioning of the oculars 11, and thus an operator can remain in an ergonomic position while changing the view provided by the imager 18. The imager 18 can be on the binocular display unit 9 or located elsewhere, such as on a separate platform or articulated arm. Additionally, unlike conventional articulated optical systems, which are generally unwieldy, complex, and have the potential for introducing optical aberrations, use of the movement control system 10100 with the surgical visualization system 1 can result in a simplified system with greater optical clarity and range of movement. It should be appreciated by one of skill in the art that, while the movement control system 10100 is described herein in the context of medical procedures, the movement control system 10100 can be used for other types of visualization and imaging systems. Movement of the imagers 18 can be performed prior to and/or during the activity, such as surgical procedures, dental procedures, and the like. Movement of the imagers 18 can advantageously allow a medical professional or other operator to alter the view through the oculars 11, for example, to provide different surgical microscope-like electronic visualizations which might be beneficial during the course of a medical procedure or for different surgical procedures.


In some embodiments, control of the movement of the imager 18 can be achieved using a single control member such as 10110. This provides the advantage of allowing single-handed operation of the movement control system 10100, which can, for example, allow a medical professional to move one or more imagers 18 using only one hand while using a second hand for other tasks such as performing surgical techniques.


Operation


As illustrated in FIGS. 6A-C, in some embodiments, the control member, such as a joystick, 10110 can be used to translate the imager 18, adjust the pitch, yaw, and/or roll of the imager 18, and/or adjust the working distance of the imager 18. In some embodiments, the oculars 11 can remain immobile when translating the imager 18, adjusting the pitch, yaw, and/or roll of the imager 18, and/or adjusting the working distance of the imager 18. The ability of a single control member 10110 to control translation, adjustments to pitch and/or yaw, and/or adjustments to the working distance can beneficially simplify operation of the device, as an operator need not release the control member 10110 to control multiple aspects of its operation. For example, an operator can translate the imager 18 and subsequently adjust the pitch and/or yaw without having to release the control member 10110, thereby increasing the ease-of-use and efficiency of the system.


As shown in FIG. 6C, one or more control members of the movement control system 10100, such as control member 10110, and/or one or more imager arms (see FIG. 7) can be attached to a component of the movement control system 10100 using various types of joints and/or can be remote from the movement control system 10100 such as a remote joystick or toggle. In some embodiments, the control member 10110 can include a joint for attachment to the movement control system 10100. For example, as shown in the illustrated embodiment, control member 10110 can include joint 10111. In some embodiments, one or more of the joints can include components for detecting movement of the control member and/or an imager arm. For example, one or more of the joints can include one or more sensors for detecting rotation and/or translation of the control member and/or the imager arm about the joint. The signals from these sensors can be used to control other components of the movement control system, such as one or more electromechanical components.


For purposes of this disclosure, rotation about joints, such as joint 10111, around the x-axis is hereinafter termed “pitch” or “tilt” and rotation about joints, such as joint 10111, around the y-axis is hereinafter termed “yaw” or “pan.”


As shown in the illustrated embodiment, the joint 10111 can be a spherical joint received in a socket formed in the member 10220, thereby forming a ball-and-socket attachment. As should be apparent to one of ordinary skill in the art, other types of mounting mechanisms may be used for attaching the control member 10110, as well as an imager arm, to components of the movement control system 10100. For example, joints such as gimbals can be used which limit the rotational degrees of freedom about the gimbal. Other types of joints can be used depending on the types of movement the movement control system is designed to allow. For example, if only pitch is needed without yaw, a joint having a single rotational degree of freedom can be used. In some embodiments, the control member 10110 can be positioned remotely from the movement control system 10100.


General Embodiment

With continued reference to FIGS. 6A and 6B, in some embodiments, the movement control system 10100 can be attached to an attachment structure, such as binocular display unit 9, and support one or more imagers 18. As shown in the illustrated embodiment, the movement control system 10100 can be oriented generally underneath the binocular display unit 9 and in some embodiments can be sized such that the movement control system 10100 does not extend significantly beyond the outer housing of the binocular display unit 9. This can advantageously provide a smaller form factor thereby reducing the likelihood that the movement control system 10100 will interfere with the medical professionals and assistants during a medical procedure. In other embodiments, the attachment structure can be other components of the surgical visualization system 1 such as, but not limited to, a dedicated articulating arm or a display arm. In some embodiments, the movement control system 10100 can extend significantly beyond the outer housing of the binocular display unit 9 or any other platform to which it is attached. This can be advantageous in situations where a greater degree of movement of the imagers 18 is desired or in embodiments where the control member 10110 is located above the attachment point between the movement control system 10100 and binocular display unit 9.


With continued reference to FIGS. 6A and 6B, as discussed in part above, the movement control system 10100 can be configured to allow translation of one or more attached imagers 18 along a plane relative to the binocular display unit 9. In some embodiments, the binocular display unit 9 can be immobile while the one or more imagers 18 are translated. For example, when attached to the binocular display unit 9 with the movement control mechanism 10100 parallel to an operating table 10101, the one or more imagers 18 can be translated along a plane parallel to the operating table 10101. As shown in the illustrated embodiment, the movement control system 10100 can be translated along both the x-axis and the y-axis (which projects perpendicularly through the sheet). This can advantageously allow the medical professional to position the view of oculars 11 for comfortable viewing by the surgeon thereby reducing physical strain on the surgeon during long procedures.


In some embodiments, defining an imager 18 centered on the movement control system 10100 (as shown in FIG. 6A) as having an x-axis, y-axis, and z-axis coordinate of zero, the movement control system 10100 can have a range of translation relative to the binocular display unit 9, of approximately ±500 mm along the x-axis and y-axis at full extension, approximately ±400 mm along the x-axis and y-axis at full extension, approximately ±300 mm along the x-axis and y-axis at full extension, approximately ±200 mm along the x-axis and y-axis at full extension, or approximately ±100 mm along the x-axis and y-axis at full extension. In some embodiments, full extension along one axis can be greater than full extension along the other axis. For example, in some embodiments, full extension along the x-axis may be approximately ±175 mm whereas the y-axis extension can be three-quarters full extension of the x-axis, one-half full extension of the x-axis, one-quarter full extension of the x-axis, or any other ratio between unity and zero. In some embodiments, the range of translation relative to the binocular display unit 9 along the y-axis can be approximately ±87.5 mm. This can be advantageous in cases where allowing the y-axis to have a full range of motion may interfere with the medical professional and/or assistants.


These ratios can be reversed such that the range of translation of the x-axis can be three-quarters full extension of the y-axis, one-half full extension of the y-axis, one-quarter full extension of the y-axis, or any ratio between unity and zero. Additionally, in some embodiments, the imager 18 can translate further in the “positive” direction than the “negative” direction. For example, along the x-axis, the imager 18 may move from −100 mm to 500 mm. Ranges of motion outside these ranges are also possible. As should be apparent to one of ordinary skill in the art, the maximum translation relative to the binocular display unit 9 along the x-axis and y-axis can be chosen to provide a balance between greater maneuverability, the yaw and/or pitch angles, working distances, size constraints, and other such factors.


As described in part above and as will be discussed in greater detail below, in some embodiments, translation of the imagers 18 can be performed by translating one or more control members, such as control member 10110, in the desired direction. In some embodiments, the control member 10110 can be electrically coupled to the movement control system 10100 to provide translation via an electromechanical system utilizing stepper motors, linear motors, or the like. For example, a joint of the control member 10110 can include components for detecting translation of the control member 10110. The signals from these sensors can be used to control other components of the movement control system, such as one or more electromechanical components such as stepper motors, linear motors, or the like to translate the imager 18. The electromechanical components can be coupled to a moveable platform to which the imager 18 can be attached. In some embodiments, the control member 10110 can be physically connected to the movement control system 10100 without any electromechanical assistance.
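One common way to turn joystick-deflection sensor readings into stepper-motor commands is a dead-zone-plus-scaling map. The function below is a hypothetical sketch of that pattern; the dead-zone width and step scaling are assumed values, not parameters of the system described above.

```python
def joystick_to_steps(deflection, dead_zone=0.05, max_steps_per_update=200):
    """Map a normalized joystick deflection in [-1, 1] to a signed step
    count for a stepper motor driving one translation axis.

    Both `dead_zone` and `max_steps_per_update` are illustrative
    assumptions, not values from the system described in the text."""
    if abs(deflection) < dead_zone:
        return 0  # ignore small deflections so the imager holds position
    # Rescale outside the dead zone so motion ramps smoothly from zero.
    span = (abs(deflection) - dead_zone) / (1.0 - dead_zone)
    steps = int(round(span * max_steps_per_update))
    return steps if deflection > 0 else -steps
```

Called once per control-loop update, such a map lets small hand motions make fine adjustments while full deflection translates the imager at maximum speed.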


As should be appreciated by one of ordinary skill in the art, the movement control system 10100 need not translate solely along a plane parallel to the operating table 10101 or the x-y plane as set forth in the illustrated embodiment. In some embodiments, the plane of translation can be defined by the orientation of the mount to which the movement control system 10100 is connected. In some embodiments, the movement control system 10100 can be configured for non-planar translation and/or translation along more than one plane. In some embodiments, for example, a tip and tilt stage provides angular motion. A rotary stage can also be used to provide rotary motion.


With continued reference to FIGS. 6A and 6B, as described in part above, the movement control system 10100 can be configured to allow rotation of the one or more attached imagers 18 about a joint which can be attached to components of the movement control system 10100 and/or remotely from the movement control system 10100. In some embodiments, the movement control system 10100 can be designed to allow the control member, such as control member 10110, as well as the imager 18 and/or imager arm to “pitch” or “tilt” and “yaw” or “pan” relative to the binocular display unit 9. In some embodiments, the binocular display unit 9 can be immobile while the “tilt” and “yaw” or “pan” of the one or more imagers 18 are adjusted. Pitch or yaw can allow the imager 18 to have a line of sight that is centered (e.g., focused) on the surgical site after the imager 18 is translated. This can advantageously allow the medical professional or assistant to adjust the viewing angle during a medical procedure. This can be beneficial in circumstances where a medical professional is unable to adequately view an object due to another element obstructing the view. Under such circumstances, a medical professional can translate the imager 18 and adjust the viewing angle of the imager 18 such that the same general area is viewed from a different angle.


In some embodiments, defining an imager 18 in a perpendicular orientation to the movement control system 10100 (as shown in FIG. 6A) as having a pitch and yaw of zero, the movement control system 10100 can allow both pitch and yaw adjustments relative to the binocular display unit 9 within the range of approximately ±60 degrees each, approximately ±50 degrees each, approximately ±40 degrees each, approximately ±30 degrees each, approximately ±20 degrees each, or approximately ±10 degrees each. In some embodiments, the pitch and yaw can have different adjustment ranges. For example, in some embodiments, the yaw can have an adjustment range of approximately ±40 degrees whereas the pitch can have an adjustment range of approximately three-quarters that of the yaw, one-half that of the yaw, one-quarter that of the yaw, or any other ratio between unity and zero. In some embodiments, the pitch can have an adjustment range of approximately ±20 degrees.


The adjustment ranges of yaw and pitch can correspond to the distance at full extension along the x-axis and the y-axis. For example, in some embodiments, the pitch and yaw can be chosen such that the imager 18 can remain centered on the surgical site when the movement control system 10100 is fully extended in any direction. In some embodiments, the working distance between the imager 18 and the surgical site can be approximately 200 mm, with a range of translation along the x-axis of approximately ±175 mm and a range of translation along the y-axis of approximately ±87.5 mm. In order to remain centered on the surgical site, the pitch adjustment range can be ±20 degrees and the yaw adjustment range can be ±40 degrees. Because the full extension need not be the same in both directions, the pitch and yaw adjustment ranges can also differ to match the differences in extension. In other embodiments, such as those in which the working distance can be adjusted, the pitch and yaw adjustment ranges can be chosen such that the imager 18 can remain centered on the surgical site when the movement control system 10100 is fully extended in any direction at at least one working distance. For example, in embodiments where the working distance can be adjusted between approximately 200 mm and 400 mm, the yaw and pitch adjustment ranges can be approximately ±20 degrees and approximately ±10 degrees, respectively, to allow centering at a working distance of 400 mm.
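The geometric relationship in the example above can be checked numerically: the tilt needed to keep the imager's line of sight centered after a lateral translation is simply the arctangent of the translation over the working distance. The following is an illustrative calculation only; the function name is an assumption, not part of the patent.

```python
import math

def required_tilt_deg(extension_mm: float, working_distance_mm: float) -> float:
    """Tilt angle (degrees) needed to keep the imager's line of sight
    centered on the surgical site after translating `extension_mm`
    sideways at the given working distance: atan(extension / distance)."""
    return math.degrees(math.atan2(extension_mm, working_distance_mm))

# Values from the example above: 200 mm working distance,
# x-translation of +/-175 mm (yaw) and y-translation of +/-87.5 mm (pitch).
yaw_needed = required_tilt_deg(175.0, 200.0)    # ~41 degrees (approx. +/-40)
pitch_needed = required_tilt_deg(87.5, 200.0)   # ~24 degrees (approx. +/-20)

# At a 400 mm working distance the same translations need smaller tilts,
# consistent with the approximately +/-20 and +/-10 degree ranges above.
yaw_400 = required_tilt_deg(175.0, 400.0)       # ~24 degrees
pitch_400 = required_tilt_deg(87.5, 400.0)      # ~12 degrees
```

The computed values are slightly larger than the quoted "approximately" ranges, which is consistent with the patent's use of approximate figures.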


Additionally, in some embodiments, the imager 18 can adjust further in a “positive” angle than a “negative” angle. For example, the yaw may range from −5 degrees to 15 degrees.


As described in part above and as will be discussed in greater detail below, in some embodiments, increasing or decreasing the pitch and/or yaw of the imagers 18 relative to the binocular display unit 9 can be achieved by increasing or decreasing the pitch and/or yaw of the one or more control members, such as control member 10110. In some embodiments, the control member 10110 can be electrically coupled to the movement control system 10100 to provide pitch and yaw via an electromechanical system utilizing stepper motors, linear motors, or the like. For example, a joint of the control member 10110 can include components for detecting pitch and/or yaw of the control member 10110. In some embodiments, the joint of the control member 10110 can include a gimbal which can detect pitch and/or yaw of the control member 10110. The signals from these sensors can be used to control other components of the movement control system, such as one or more electromechanical components such as stepper motors, linear motors, or the like, to adjust the pitch and/or yaw of the imager 18. As should be appreciated by one of ordinary skill in the art, in some embodiments, the movement control system 10100 can be configured to allow rotation about other axes, such as the z-axis. In some embodiments, the control member 10110 can be physically connected to the movement control system 10100 without any electromechanical assistance.
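One way to realize the sensor-to-motor coupling described above is to map angle changes detected at the control member's joint into signed stepper-motor step counts. The following is a minimal sketch under assumed constants (scale factor, deadband); the patent does not specify any software interface, so all names here are hypothetical.

```python
def control_to_steps(sensor_delta_deg: float,
                     steps_per_deg: float = 10.0,
                     deadband_deg: float = 0.1) -> int:
    """Convert a pitch or yaw change detected at the control member's
    gimbal into a signed stepper-motor step count. A small deadband
    ignores sensor noise so the imager does not drift."""
    if abs(sensor_delta_deg) < deadband_deg:
        return 0
    return round(sensor_delta_deg * steps_per_deg)

# Example: the operator tilts the control member by 2.5 degrees of pitch;
# the pitch motor would be commanded 25 steps in the same direction.
steps = control_to_steps(2.5)
```

A deadband of this kind is a common design choice in teleoperated positioning systems to prevent the imager from creeping due to sensor jitter.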


Additionally, in some embodiments, the movement control system 10100 can be configured to adjust the working distance between the imagers 18 and the surgical site. In some embodiments, the binocular display unit 9 can remain immobile while the working distance of the imagers 18 is adjusted. In some embodiments, the working distance can range from approximately 1 m to approximately 10 mm, from approximately 800 mm to approximately 50 mm, from approximately 600 mm to approximately 100 mm, or from approximately 400 mm to approximately 200 mm. In some embodiments, the control member 10110 can be electrically coupled to the movement control system 10100 to provide working distance adjustment via an electromechanical system utilizing stepper motors, linear motors, or the like. For example, a joint of the control member 10110 can include components for detecting rotation of the control member 10110 about the longitudinal axis. The signals from these sensors can be used to control other components of the movement control system, such as one or more electromechanical components such as stepper motors, linear motors, or the like, to adjust the working distance of the imager 18. In some embodiments, the control member 10110 can be physically connected to the movement control system 10100 without any electromechanical assistance.


In some embodiments, the movement control system 10100 can include a translation system for translating an imager 18 and/or an imager arm, a pitch-yaw adjustment system for adjusting the pitch and/or yaw of the imager 18 and/or an imager arm, a control member, such as control member 10110, and one or more imager arms to which the imager 18 can be attached. In some embodiments, a working distance adjustment system can be included which can allow adjustments in working distance of the imager 18 and/or an imager arm. It should be appreciated by one of ordinary skill in the art that the translation system, the pitch-yaw adjustment system, and/or the working distance adjustment system can be used separately or in any combination.


Operation of the translation, pitch-yaw adjustment, and/or working distance adjustment systems can be performed using a control member, such as control member 10110. In some embodiments, control member 10110 can be operatively coupled to the translation, pitch-yaw adjustment, and/or working distance adjustment systems. For example, as described above, in some embodiments, the control member can be coupled to an electromechanical system for controlling the translation, pitch-yaw adjustment, and/or working distance adjustment systems. The control member can be directly attached to a component of the movement control system 10100 or can be remotely positioned (e.g., a toggle or joystick on a separate module). In some embodiments, the control member can be coupled directly to the translation, pitch-yaw adjustment, and/or working distance adjustment systems such that no electromechanical devices are used. In some embodiments, the operator can be given the option of controlling the translation, pitch-yaw adjustment, and/or working distance adjustment systems with or without electromechanical devices. For example, the operator can control the translation, pitch-yaw adjustment, and/or working distance adjustment systems without electromechanical devices for certain portions of a procedure and use such electromechanical devices for controlling the translation, pitch-yaw adjustment, and/or working distance adjustment systems during other portions of a procedure. As another example, in some embodiments, coarse control of the movement control system 10100 can be achieved without use of electromechanical devices whereas fine control of the movement control system 10100 can be achieved with use of electromechanical devices, vice versa, or a combination of the two.


In some embodiments, the movement control system 10100 can include a control system which controls functions of the electromechanical devices. In some embodiments, the electromechanical components can be programmed such that the electromechanical components can orient the translation, pitch-yaw adjustment, and/or working distance adjustment systems in certain positions based on the operator's input. For example, the electromechanical components can be programmed such that they revert to a pre-set or previous position upon receiving a command from the operator. As another example, the electromechanical components can be programmed such that an operator can specify a desired position for the imager 18 and the control system can control the electromechanical devices coupled to the translation, pitch-yaw adjustment, and/or working distance adjustment systems to orient the imager 18 in the desired position.
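The pre-set and revert behavior described above can be sketched as a small controller that stores named poses and a movement history. All class, method, and field names here are illustrative assumptions; the patent does not define a software interface.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ImagerPose:
    x_mm: float
    y_mm: float
    pitch_deg: float
    yaw_deg: float
    working_distance_mm: float

class PresetController:
    """Stores named imager poses and a movement history so the
    electromechanical devices can be commanded back to a pre-set
    or previous position on operator request."""

    def __init__(self) -> None:
        self._presets: dict[str, ImagerPose] = {}
        self._history: list[ImagerPose] = []

    def save_preset(self, name: str, pose: ImagerPose) -> None:
        self._presets[name] = pose

    def record_move(self, pose: ImagerPose) -> None:
        self._history.append(pose)

    def recall_preset(self, name: str) -> ImagerPose:
        # Pose to command the motors toward.
        return self._presets[name]

    def revert(self) -> ImagerPose:
        # Drop the current pose and return the previous one.
        self._history.pop()
        return self._history[-1]
```

In practice, the returned pose would be handed to the motion layer (e.g., the stepper or linear motor drivers) rather than applied directly.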


With reference to FIG. 7, in some embodiments, the imager arm 10120 and the imager 18 can be attached such that the imager 18 can be directed towards the side of the head of a patient. For example, in some embodiments, the imager 18 can be attached to the imager arm 10120 using a yoke 10125 which can be designed to allow for coarse and/or fine control of pitch, yaw, and/or roll of the imager 18. In some embodiments, the yoke 10125 can have one or more pivots which can be configured to allow the imager 18 to have a viewing angle parallel to the operating room floor such that an operator can view the side of the head. In some embodiments, the yoke 10125 can be configured to allow the imager 18 to rotate such that the imager can be directed to a portion of the back of the head.


In some embodiments, the imager 18 can be positioned on a movement control system 10130 providing at least two rotational degrees of freedom and/or at least one translational degree of freedom. In some embodiments, movement control system 10130 can provide two rotational degrees of freedom and at least two translational degrees of freedom. For example, as shown in FIG. 8, the movement control system 10130 can allow for rotation about axis 10135 of the movement control system 10130 and/or about axis 10140 (which can be parallel with the z-axis). Moreover, as shown in the illustrated embodiment, the movement control system can allow translation along both the x-axis and the y-axis. In some embodiments, the movement control system 10130 can provide at least one translational degree of freedom.


As shown in the illustrated embodiment, the movement control system 10130 can include one or more control members, such as control member 10145. Control member 10145 can be positioned such that the longitudinal axis of the control member 10145 is parallel with and/or collinear with axis 10135. This can advantageously allow the imager 18 to be rotated about axis 10135 by rotating the control member 10145. In some embodiments, the control member 10145 can be mechanically coupled to the imager 18. In some embodiments, the control member 10145 can be coupled to the imager 18 via an electromechanical system. For example, the control member 10145 can include sensors for detecting rotation of the control member 10145, and data received from the sensors can be used to rotate the imager 18 via electromechanical components such as stepper motors, linear motors, or the like.


As shown in the illustrated embodiment, the movement control system 10130 can include a first plate element 10150 and a second plate element 10155 which can be rotatably coupled. The second plate element 10155 can include first and second supports 10160, 10165 to which the imager 18 can be attached. In some embodiments, the first and second plate elements 10150, 10155 can be rotatably coupled such that the axis of rotation of the two plate elements 10150, 10155 is parallel and/or collinear with axis 10140.


In some embodiments, the control member 10145 can include one or more switches and/or actuators 10170 for controlling movement of the device. For example, the actuator 10170 can be coupled to mechanisms which can unlock the apparatus 10130 such that the movement control system 10130 can be manipulated to rotate and/or translate the imager 18. In some embodiments, the switches and/or actuators can be coupled to an electromechanical system to rotate and/or translate the movement control system 10130.


Optical Systems for Displays



FIGS. 9A-9D illustrate example display optical systems 11005 configured to provide a view of displays 11010 through oculars (not shown) that receive light from the last lens 11015 in the display optical system 11005. The display optical system 11005 forms an exit pupil at or near the entrance pupil of the surgeon binoculars. These pupils are closely matched, for example, in size and shape. In some embodiments, the exit pupil of the display optical system 11005 can be the same size as or smaller than the entrance pupil of oculars used to view the display. The oculars form an exit pupil that is matched (e.g., in size and shape) to the entrance pupil of the surgeon's eye(s). In some embodiments, the display optical system 11005 is configured to produce a beam that has a relatively constant, relatively small cross-section between the first lens element 11012 and the last lens element 11015. Advantageously, this allows the display optical system 11005 to be included in a relatively small or compact package and to use relatively small optical elements. In some embodiments, the last lens 11015 collimates the beam leaving the display optical system 11005. The termination of the rays shown in FIG. 9A to the left of lens 11015 is the exit pupil of the display optical system 11005. In some embodiments, the exit pupil of the display optical system 11005 is configured to be the same size as or smaller than, and positioned at the same location as, an entrance pupil of a binocular viewing assembly configured to allow a user to view the display 11010.


The lenses in the display optical system 11005 form a highly color-corrected view of the display by forming the exit pupil in a position favorably disposed for the user and the binoculars. A combination of singlets and bonded lenses provide such correction. The display optical system 11005 may be designed to provide such correction while keeping a small beam column or ray bundle, which permits adding mirrors and obtaining a compact package. In various embodiments, producing an undistorted image can be difficult without such a group of lenses designed properly to provide such correction. This correction includes both color correction as well as distortion correction.


The display optical system 11005 advantageously allows a relatively small, compact lens assembly to provide a view of a relatively large display 11010. The display optical system 11005 can be configured to work with displays 11010 of varying sizes, including, without limitation, displays with a diagonal that is less than or equal to about 0.86 in. (22 mm), at least about 0.86 in. (22 mm) and/or less than or equal to about 10 in., at least about 1 in. and/or less than or equal to about 9 in., at least about 2 in. and/or less than or equal to about 8 in., or at least about 4 in. and/or less than or equal to about 6 in. The display may, for example, have a diagonal of about 5 inches or about 8 inches in some embodiments. The total optical path length of the display optical system 11005 can be less than or equal to about 9 in., at least about 9 in. and/or less than or equal to about 20 in., at least about 10 in. and/or less than or equal to about 19 in., at least about 14 in. and/or less than or equal to about 18 in. The display optical system 11005 can include lenses, mirrors, prisms, and other optical elements configured to direct and manipulate light along an optical path. The display optical system 11005 can be used in conjunction with a primary display, a surgeon display, an assistant display, possibly other displays, or any combination of these.


The example display optical system 11005 illustrated in FIG. 9A has a total optical path length of about 16.2 in. (412 mm). It is configured to provide an image of a 5 in. display 11010. The display optical system 11005 can include a lens 11012 configured to direct the light from the display 11010 along a path with a relatively narrow cross-section. In various embodiments, the light received from the display is initially substantially reduced in beam size, for example, by the lens 11012 or lenses closest to the display, and a narrower beam is produced. In certain embodiments, for example, the lens 11012 or lenses closest to the display collect light at an angle (half angle) in excess of 20°, 25°, or 30° and reduce the beam size of the light. This design is advantageous because it allows the elements in the display optical system 11005 to be relatively small and compact. In some embodiments, the cross-section of the optical beam after the lens 11012 in the display optical system 11005 can be configured to be relatively constant. This configuration allows folding or redirecting mirrors present in the optical path to remain small.
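The light-collection half angles mentioned above correspond to numerical apertures via the standard relation NA = n·sin(θ). A quick illustrative check (in air, n = 1; not part of the patent's disclosure):

```python
import math

def numerical_aperture(half_angle_deg: float, n: float = 1.0) -> float:
    """NA = n * sin(theta) for a lens collecting light over the given
    half angle; n = 1 for air."""
    return n * math.sin(math.radians(half_angle_deg))

# Half angles mentioned above:
na_20 = numerical_aperture(20.0)  # ~0.34
na_25 = numerical_aperture(25.0)  # ~0.42
na_30 = numerical_aperture(30.0)  # ~0.50
```

Collecting over a half angle in excess of 20° thus corresponds to a numerical aperture above roughly 0.34 at the display side before the beam is narrowed.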



FIG. 9B illustrates a binocular display optical system 11005 configured to provide a view of stereo displays 11010a, 11010b through a pair of oculars. The binocular display optical system 11005 can be based on the optical design illustrated in FIG. 9A, and can include one or more elements 11014 in the optical path before the lens 11012 to reduce the physical size of the optical system while maintaining the length of the optical path. These elements can include mirrors, prisms, and/or other optical elements configured to redirect the light from the displays 11010a, 11010b to the lens 11012. In some embodiments, the elements 11014 include curved mirrors which redirect the optical path and converge the rays from the displays 11010a, 11010b. In some embodiments, the elements 11014 include mirrors or prisms (which may, for example, have a planar reflecting surface) that do not substantially affect the convergence of the light rays, but redirect the optical path. In some embodiments, because of the shape of the beam incident on the reflective surface, for example, a mirror, the reflective surface or cross-section of the mirror is non-circular, and is, for example, elliptical. Accordingly, in various embodiments the cross-section of the mirror or other reflective surface may be longer in one direction than in another, for example, orthogonal direction. These elements may fold the optical path to provide for a more compact system. Such a system may therefore have an optical path length from display to ocular that is longer than the length and/or width of the viewing platform or the combination thereof.


In some embodiments, the display optical system 11005 can include at least four mirrors, or less than or equal to four mirrors. In certain implementations, two mirrors can be used to fold the optical path from the display 11010 to the exit pupil, the two mirrors positioned between the first lens 11012 and the display 11010. In some embodiments, the display optical system 11005 includes at least four lenses or less than or equal to four lenses.


The example display optical system 11005 illustrated in FIG. 9C has a total optical path length of about 18.7 in. (475 mm). It is configured to provide an image of an 8 in. display 11010. The display optical system 11005 can include a lens 11012 configured to direct the light from the display 11010 along a path with a relatively narrow cross-section, allowing the display optical system 11005 to be relatively small and compact. In some embodiments, the cross-section of the optical beam after the lens 11012 in the display optical system 11005 (e.g., to the exit pupil prior to the entrance to a binocular viewing assembly) can be configured to be relatively constant. This configuration allows folding or redirecting mirrors present in the optical path to remain small. The display optical system 11005 can be configured to be used in conjunction with a display 11010 with a relatively high resolution.


The example display optical system 11005 illustrated in FIG. 9D has a total optical path length of about 9.3 in. (237 mm). It is configured to provide an image of a smaller display, in this case a 0.9 in. (22 mm) display 11010. Because the display is much smaller than the displays in the embodiments described in connection with FIGS. 9A-9C, the optical path can be much shorter and may fit into a smaller space. The display optical system 11005 can include a lens 11012 configured to direct the light from the display 11010 along a path with a relatively narrow cross-section, allowing the display optical system 11005 to be relatively small and compact. In some embodiments, the cross-section of the optical path after the lens 11012 in the display optical system 11005 can be configured to be relatively constant. This configuration allows folding or redirecting mirrors present in the optical path to remain small. Based at least in part on the relatively short optical path length, the display optical system 11005 can be configured to be used in conjunction with a secondary display or an assistant display.



FIGS. 10A-10G illustrate example display optical systems 11300 configured to provide a view of a display 11310, the display optical system 11300 having an exit pupil 11305 wherein light paths that would intersect a viewing assembly housing 11315 are reduced or eliminated through baffles or apertures 11325, where a baffle includes a panel with an aperture. FIG. 10A illustrates an example embodiment of a display optical system 11300 comprising a display 11310, with other optical components configured to direct the light from the display 11310 to the exit pupil 11305. The light paths are traced with black lines to show the periphery of the bundle of light paths from the display 11310 to the exit pupil 11305. FIG. 10B shows this same display optical system 11300 as situated in an example viewing assembly housing 11315. When the display optical system 11300 is configured in this way, portions of the light 11320 from the display 11310 are outside of the housing 11315, which leads to light being reflected and/or scattered off the sidewalls of the housing along the path to the exit pupil 11305. This can lead to undesirable results, such as degradation in the quality of the image of the display 11310 viewed with an ocular, for example, by reducing contrast. The display optical systems 11300 can be configured to provide a collimated beam at the exit pupil 11305 such that a binocular viewing assembly comprising an objective and oculars can mate to the viewing assembly housing 11315 and view the display 11310.


In some embodiments, one or more baffles or apertures 11325 can be incorporated into the display optical system 11300 to reduce or eliminate the amount of light that intersects with the housing 11315. The apertures may be disposed to reduce the view of the sidewalls by the ocular, thereby reducing the light collected that is reflected off the sidewalls. FIG. 10C illustrates an example embodiment of the display optical system 11300 without any apertures. The display optical system includes mirrors M1, M2, M3, and M4 to redirect the light path within the viewing assembly. The mirrors M1, M2, M3, and M4 fold the optical path such that the display optical system 11300 can be contained in a more compact housing having a smaller footprint. Additionally, in various embodiments, the mirrors M1, M2, M3, and M4 fold the optical path such that it wraps around a supporting column configured to support the housing on an arm. In various embodiments the column is a conduit for electrical signals, power, and illumination fibers. Electronics boards, for example, with FPGAs, etc., can be disposed on the top of the display. Such a configuration may be useful because signal integrity (e.g., of a MIPI2 signal) can be preserved with short cable routing. An opening 11307 for the support column about which the optical path is wrapped is visible in FIG. 10B. The display optical system includes lenses in lens tube L1 to shape (e.g., collimate) the light path along the path from the display 11310 to the exit pupil 11305. The lens tube L1 can be used to maintain a relatively narrow optical passageway that contains substantially all of the light travelling from the display 11310 to the exit pupil 11305. FIG. 10D illustrates the example embodiment of the display optical system 11300 from FIG. 10C with an aperture 11325 added between the mirror M4 and the display 11310. FIG. 10E illustrates the example embodiment of the display optical system 11300 from FIG. 10C with an aperture 11325 added between the mirror M3 and the final mirror M4. FIG. 10F illustrates the example embodiment of the display optical system 11300 from FIG. 10C with an aperture 11325 added between the lens tube L1 and the mirror M3.



FIG. 10G illustrates the example embodiment of the display optical system 11300 from FIG. 10C, with apertures 11325 added at all the locations illustrated in FIGS. 10D to 10F: between the lens tube L1 and the mirror M3, between the mirror M3 and the mirror M4, and between the mirror M4 and the display 11310. Simulations of the performance of this configuration have shown that the radiant intensity of unwanted light, e.g., light that arrives after being reflected or scattered from the inner walls of the housing 11315, has been reduced by about 3.6×, while the radiant intensity at the exit pupil 11305 from the display 11310 has been substantially held constant, meaning there is less than a 10% change in the radiant intensity.


In some embodiments, the display optical system 11300 can include at least four baffles or less than or equal to four baffles. In certain implementations, four baffles can be included in the optical path between the first lens and the display 11310. In some implementations, two mirrors can be included in the optical path between the first lens and the display 11310. In some embodiments, the optical path can include, in order from the display 11310, a first baffle, a first mirror, a second baffle, a second mirror, and a third baffle prior to the first lens.


In some embodiments, the display can be a curved surface, for example, either a projection display or a recent generation of flexible LCD or OLED displays having high resolution (e.g., in excess of 300 ppi). A curved display may provide two advantages: the imaging optics for the display can be less complex than for flat panels, and the cone or numerical aperture of each picture element in the display can be directed towards the viewing optics, even in the periphery of the display, thereby providing a brighter image less subject to vignetting.


In some embodiments, the display can be a volumetric display comprising two or more transmissive display panels having a single backlight wherein the transmissive display panels are stacked to provide different planes of focus for a surgeon. The transmissive displays can be active matrix liquid crystal displays (“AMLCD”) or other types of transmissive displays. The backlight can be a fluorescent lamp, LEDs, or other suitable light source. By having displays positioned in different focal planes, image data from different focal planes may be presented to the surgeon with relatively less image processing and/or compression compared to a system which combines data from multiple focal planes into a single image. In some embodiments, a number of cameras can be positioned at varying depths or having varying focal distances such that the displays at different focal planes are configured to display image data from cameras positioned or focused at different depths to create a display that assists the surgeon in identifying positions of features within displayed images.


The display can show, as an overlay, pre-operative CT, MR, or other 3D image datasets from, for example, conventional surgical navigation systems (e.g., the Medtronic StealthStation or Treon, Stryker Surgical Navigation System, or Brainlab, among others). In various embodiments, in addition to images, the display can additionally provide numerical data and/or text. For example, in various embodiments, the display can overlay information such as distance or tool measurements, transparent tool renderings, camera identification information (e.g., the portion of the composite image attributable to a specific optical sensor may generate an identifying border around that portion), up/down orientation, elapsed time, and/or one or more still images captured from one or more optical sensors from a previous time in the operation. The tracking system can provide 5-DOF (degrees of freedom) or 6-DOF position and orientation information to conventional surgical navigation systems. Other information, graphic, alphanumeric, or otherwise, can be provided.


The tool image can be magnified with respect to the wide-field view image, and a change in image scaling will occur as the tool is moved in and out. In some embodiments, a visual metaphor for embodiments of the display is that of a hand-held magnifying glass for inspecting and doing work on a smaller region of a larger workpiece, while seeing the larger workpiece with lower magnification (if any) in more peripheral regions of the visual field to provide situational awareness. Tool images, for example, can be superimposed on the background image, thereby blocking that portion of the background image. In various embodiments, the tool images may be stereo.


In some embodiments, fluorescence information can be displayed. Cameras that image in different wavelengths, such as infrared, could image the surgical site or objects contained therein. In some embodiments, features could be made to fluoresce, for example, by injecting a fluorescent chemical and illuminating the area with light that will induce fluorescence. Such a technique may be useful to identify and/or highlight the location and/or boundaries of specific features of interest such as tumors, etc. The fluorescence or other wavelength of interest may be detected by the one or more cameras imaging the surgical field, such as one or more cameras providing a surgical microscope view. In some embodiments, images produced by fluorescence or other wavelengths of interest are superimposed on one or more images from other camera(s). Filtering could be provided to remove unwanted wavelengths and possibly increase contrast. The filter can remove excitation illumination. In some embodiments, emission image content (e.g., fluorescing tissue) can be parsed and superimposed on image content that is not emitting (e.g., tissue that is not fluorescing), or vice versa. In various embodiments, such as where the fluorescing wavelength is not visible (e.g., for fluorescence in the infrared), an artificial color rendition of the fluorescing content can be used in place of the actual fluorescing color so as to enable the fluorescing tissue to be visible.


Various modifications to the implementations described in this disclosure may be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of this disclosure. Thus, the claims are not intended to be limited to the implementations shown herein, but are to be accorded the widest scope consistent with this disclosure, the principles and the novel features disclosed herein.


Certain features that are described in this specification in the context of separate embodiments also can be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment also can be implemented in multiple embodiments separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination.

Claims
  • 1. A medical apparatus comprising: a display housing;an opening in the display housing;an electronic display disposed within the display housing, the electronic display comprising a plurality of pixels configured to produce a two-dimensional image; anda display optical system disposed within the display housing, the display optical system comprising a plurality of lens elements disposed along an optical path,wherein the display optical system is configured to: receive the two-dimensional image from the electronic display;produce a beam with a cross-section that remains substantially constant along the optical path; andproduce a collimated beam exiting the opening in the display housing.
  • 2. The medical apparatus of claim 1, wherein the display optical system further comprises an optical redirection element configured to fold the optical path.
  • 3. The medical apparatus of claim 2, wherein the optical redirection element comprises a mirror.
  • 4. The medical apparatus of claim 2, wherein the optical redirection element comprises a prism.
  • 5. The medical apparatus of claim 2, wherein the display optical system is configured to direct light received from the electronic display to the opening in the display housing while reducing stray light.
  • 6. The medical apparatus of claim 1, wherein the display optical system further comprises a baffle configured to reduce stray light.
  • 7. The medical apparatus of claim 6, wherein the display optical system comprises less than or equal to four baffles.
  • 8. The medical apparatus of claim 7, wherein the display optical system comprises less than or equal to four mirrors.
  • 9. The medical apparatus of claim 8, wherein a first baffle is positioned between the electronic display and a first mirror along the optical path, the first mirror positioned prior to the plurality of lens elements along the optical path from the display to the opening.
  • 10. The medical apparatus of claim 8, wherein at least three baffles are positioned prior to the plurality of lens elements along the optical path from the display to the opening.
  • 11. The medical apparatus of claim 8, wherein at least two mirrors are positioned prior to the plurality of lens elements along the optical path from the display to the opening.
  • 12. The medical apparatus of claim 1, wherein the display optical system has an exit pupil and the electronic display is not parallel to the exit pupil.
  • 13. The medical apparatus of claim 1, wherein the opening in the display housing comprises a mounting interface configured to mate with a binocular assembly for a surgical microscope.
  • 14. The medical apparatus of claim 13, wherein an exit pupil of the display optical system is of a same size or smaller than an entrance pupil of oculars in the binocular assembly.
  • 15. The medical apparatus of claim 1, further comprising a second electronic display and a second display optical system configured to provide a stereo view.
  • 16. The medical apparatus of claim 1, further comprising processing electronics configured to communicate with the electronic display to provide images for the electronic display.
  • 17. The medical apparatus of claim 16, wherein the processing electronics are configured to receive images from one or more cameras on a surgical device.
  • 18. The medical apparatus of claim 16, wherein the processing electronics are configured to receive images from one or more cameras that provide a surgical microscope view.
  • 19. The medical apparatus of claim 1, wherein the optical path is less than or equal to 16.2 inches and a light-emitting portion of the electronic display has a diagonal measurement that is greater than or equal to 5 inches.
  • 20. The medical apparatus of claim 1, wherein the optical path is less than or equal to 18.7 inches and a light-emitting portion of the electronic display has a diagonal measurement that is greater than or equal to 8 inches.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 16/042,318, filed Jul. 23, 2018, which is incorporated herein by reference in its entirety and which is a continuation of U.S. patent application Ser. No. 14/491,827, filed Sep. 19, 2014, which is incorporated herein by reference in its entirety and which claims the benefit of priority to U.S. Prov. App. No. 61/880,808, entitled “SURGICAL VISUALIZATION SYSTEMS”, filed Sep. 20, 2013; to U.S. Prov. App. No. 61/920,451, entitled “SURGICAL VISUALIZATION SYSTEMS”, filed Dec. 23, 2013; to U.S. Prov. App. No. 61/921,051, entitled “SURGICAL VISUALIZATION SYSTEMS”, filed Dec. 26, 2013; to U.S. Prov. App. No. 61/921,389, entitled “SURGICAL VISUALIZATION SYSTEMS”, filed Dec. 27, 2013; to U.S. Prov. App. No. 61/922,068, entitled “SURGICAL VISUALIZATION SYSTEMS”, filed Dec. 30, 2013; and to U.S. Prov. App. No. 61/923,188, entitled “SURGICAL VISUALIZATION SYSTEMS”, filed Jan. 2, 2014.

Non-Patent Literature Citations (197)
Entry
Burle Industries, Technical Memorandum 100—Fiber Optics: Theory and Applications, archived Feb. 21, 2007, in 20 pages https://web.archive.org/web/20070221125354/http://www.burle.com/cgi-bin/byteserver.pl/pdf/100r.pdf.
Office Action in U.S. Appl. No. 16/357,081, dated Jul. 8, 2020.
Decision to Refuse received in European Application No. 13808996.6, dated Oct. 19, 2020.
Official Communication in Japanese Application No. 2018-218745, dated Feb. 25, 2020.
Official Communication in Japanese Application No. 2018-218745, dated Dec. 15, 2020.
Amendment in U.S. Appl. No. 15/483,995, dated Dec. 12, 2019.
Final Office Action in U.S. Appl. No. 15/483,995, dated Feb. 20, 2020.
Amendment in U.S. Appl. No. 15/483,995, dated Aug. 19, 2020.
Office Action in U.S. Appl. No. 15/483,995, dated Sep. 4, 2020.
Office Action in U.S. Appl. No. 15/645,589, dated Dec. 26, 2019.
Amendment in U.S. Appl. No. 15/645,589, dated Jun. 26, 2020.
Notice of Allowance in U.S. Appl. No. 15/645,589, dated Jul. 14, 2020.
Notice of Allowance in U.S. Appl. No. 15/645,589, dated Nov. 18, 2020.
Preliminary Amendment filed in U.S. Appl. No. 16/036,665, dated Sep. 5, 2019.
Office Action in U.S. Appl. No. 16/036,665, dated Sep. 26, 2019.
Amendment filed in U.S. Appl. No. 16/036,665, dated Mar. 26, 2020.
Office Action in U.S. Appl. No. 16/036,665, dated Jul. 13, 2020.
Amendment in U.S. Appl. No. 15/626,516, dated Jan. 24, 2020.
Notice of Allowance in U.S. Appl. No. 15/626,516, dated Mar. 9, 2020.
Notice of Allowance in U.S. Appl. No. 15/626,516, dated Jun. 29, 2020.
Notice of Allowance in U.S. Appl. No. 15/626,516, dated Oct. 13, 2020.
Response to Restriction Requirement in U.S. Appl. No. 15/495,484, dated Nov. 13, 2019.
Office Action in U.S. Appl. No. 15/495,484, dated Nov. 27, 2019.
Amendment in U.S. Appl. No. 15/495,484, dated May 27, 2020.
Notice of Allowance in U.S. Appl. No. 15/495,484, dated Jun. 16, 2020.
Notice of Allowance in U.S. Appl. No. 15/495,484, dated Sep. 29, 2020.
Supplemental Notice of Allowance in U.S. Appl. No. 15/495,484, dated Oct. 13, 2020.
Amendment after Allowance in U.S. Appl. No. 15/495,484, dated Dec. 28, 2020.
Corrected Notice of Allowability in U.S. Appl. No. 15/495,484, dated Jan. 7, 2021.
Amendment in U.S. Appl. No. 16/786,054, dated Sep. 25, 2020.
Restriction Requirement in U.S. Appl. No. 15/948,842, dated Jan. 22, 2020.
Response to Restriction Requirement in U.S. Appl. No. 15/948,842, dated Jul. 21, 2020.
Office Action in U.S. Appl. No. 15/948,842, dated Aug. 24, 2020.
Preliminary Amendment filed in U.S. Appl. No. 15/724,100, dated Jun. 5, 2018.
Office Action in U.S. Appl. No. 15/724,100, dated Oct. 9, 2019.
Amendment filed in U.S. Appl. No. 15/724,100, dated Apr. 9, 2020.
Office Action in U.S. Appl. No. 15/724,100, dated Apr. 22, 2020.
Amendment in U.S. Appl. No. 15/724,100, dated Jun. 22, 2020.
Notice of Allowance in U.S. Appl. No. 15/724,100, dated Jul. 6, 2020.
Notice of Allowance in U.S. Appl. No. 15/724,100, dated Oct. 15, 2020.
Amendment after Allowance in U.S. Appl. No. 15/724,100, dated Jan. 14, 2021.
Corrected Notice of Allowability in U.S. Appl. No. 15/724,100, dated Jan. 28, 2021.
Notice of Allowance in U.S. Appl. No. 16/799,168, dated Nov. 4, 2020.
Amendment in U.S. Appl. No. 14/581,779, dated Feb. 4, 2020.
Final Office Action in U.S. Appl. No. 14/581,779, dated Apr. 29, 2020.
Amendment in U.S. Appl. No. 14/581,779, dated Oct. 29, 2020.
Office Action in U.S. Appl. No. 14/581,779, dated Nov. 23, 2020.
Official Communication in Japanese Application No. 2019-165048, dated Oct. 27, 2020.
Amendment in U.S. Appl. No. 15/081,653, dated Jan. 10, 2020.
Final Office Action in U.S. Appl. No. 15/081,653, dated Jan. 31, 2020.
Amendment in U.S. Appl. No. 15/081,653, dated Jul. 30, 2020.
Office Action in U.S. Appl. No. 15/081,653, dated Sep. 17, 2020.
Amendment in U.S. Appl. No. 15/360,565, dated Nov. 21, 2019.
Office Action in U.S. Appl. No. 15/360,565, dated Jan. 30, 2020.
Amendment in U.S. Appl. No. 15/360,565, dated Jul. 29, 2020.
Notice of Allowance in U.S. Appl. No. 15/360,565, dated Aug. 13, 2020.
Notice of Allowance in U.S. Appl. No. 15/360,565, dated Nov. 20, 2020.
Official Communication in European Application No. 16869253.1, dated Jan. 21, 2021.
Amendment in U.S. Appl. No. 15/973,433, dated Sep. 30, 2019.
Notice of Allowance in U.S. Appl. No. 15/973,433, dated Jan. 28, 2020.
Notice of Allowance in U.S. Appl. No. 15/973,433, dated Jun. 25, 2020.
Notice of Allowance in U.S. Appl. No. 15/973,433, dated Oct. 15, 2020.
International Preliminary Report on Patentability and Written Opinion in PCT Application No. PCT/US2018/031442, dated Nov. 21, 2019.
International Preliminary Report on Patentability and Written Opinion in PCT/US2018/034227, dated Dec. 5, 2019.
Amendment in U.S. Appl. No. 16/677,441, dated Jun. 26, 2020.
Restriction Requirement in U.S. Appl. No. 16/677,441, dated Dec. 8, 2020.
Aesculap Inc.; Aesculap Neurosurgery Pneumatic Kerrison; http://www.aesculapusa.com/assets/base/doc/doc763-pneumatic_kerrison_brochure.pdf; 2008; in 12 pages.
Aliaga, Daniel G.; “Image Morphing and Warping”; Department of Computer Science; Purdue University; Spring 2010; in 61 pages.
“ARRI Medical Shows SeeFront 3D Display with Hd 3D Surgical Microscope”; dated Jun. 9; 2013; downloaded from http://www.seefront.com/news-events/article/arri-medical-shows-seefront-3d-display-with-hd-3d-surgical-microscope/ in 2 pages.
“Arriscope: A New Era in Surgical Microscopy”; Arriscope Brochure published May 20, 2014 in 4 pages.
AustriaMicroSystems; “AS5050: Smallest Magnetic Rotary Encoder for PA Low Power Applications”; www.austriamicrosystems.com/AS5050 printed Nov. 2012 in 2 pages.
Bayonet Lock Video; 00:16 in length; Date Unknown; Received Oct. 15, 2014 [Screenshots captured at 00:00, 00:02, 00:05, 00:08, and 00:16].
BellowsTech; “Actuators”; www.bellowstech.com/metal-bellows/actuators/ printed Jul. 17, 2012 in 4 pages.
“Carl Zeiss Unveils $99 VR One Virtual Reality Headset”; www.electronista.com/articles/14/10/10/zeiss.vr.one.able.to.accept, variety.of.smartphones.using.custo m.trays printed Oct. 13, 2014 in 2 pages.
Designboom; “Bright LED”; http://www.designboom.com/project/fiber-optics-light-glove/; Sep. 28, 2007.
Fei-Fei, Li; Lecture 10: Multi-View Geometry; Stanford Vision Lab; Oct. 24, 2011; in 89 pages.
“Fuse™. Full Spectrum Endoscopy™”; http://www.endochoice.com/Fuse printed Oct. 7, 2013 in 3 pages.
Hardesty, Larry; “3-D Cameras for Cellphones: Clever math could enable a high-quality 3-D camera so simple, cheap and power-efficient that it could be incorporated into handheld devices”; MIT News Office; http://web.mit.edu/newsoffice/2011/lidar-3d-camera-cellphones-0105.html; Jan. 5, 2012; in 4 pages.
Hartley et al.; “Multiple View Geometry in Computer Vision: Chapter 9 - Epipolar Geometry and the Fundamental Matrix”; http://www.robots.ox.ac.uk/˜vgg/hzbook/hzbook2/HZepipolar.pdf; Mar. 2004; 2nd Edition; Ch. 9; pp. 239-261.
Heidelberg Engineering; “MultiColor: Scanning Laser Imaging”; http://www.heidelbergengineering.com/US/products/spectralis-models/imaging-modes/multicolor/; Copyright © 2013; printed Apr. 5, 2013.
Kramer, Jennifer; “The Right Filter Set Gets the Most out of a Microscope”; Biophotonics International; January/Feb. 1999; vol. 6; pp. 54-58.
Krishna, Golden; “Watch: What Good is a Screen?”; http://www.cooper.com/author/golden_krishna as printed Jul. 9, 2014 in 62 pages.
Lang et al.; “ZEISS Microscopes for Microsurgery”; Springer-Verlag; Berlin, Heidelberg; 1981.
Leica Microsystems; “Images Truevision Integrated 3D”; http://www.leica-microsystems.com/products/surgical-microscopes/neurosurgery-spine/details/product/truevision-integrated-3d/gallery/; Nov. 26, 2014; in 3 pages.
Leica Microsystems; “Leica Microsystems' Ophthalmic Surgical Microscopes with Truevision 3D Technology Available Globally”; http://www.leica-microsystems.com/products/surgical-microscopes/neurosurgery-spine/details/product/truevision-integrated-3d/news/; Sep. 18, 2014; in 5 pages.
Lutze et al.; “Microsystems Technology for Use in a Minimally Invasive Endoscope Assisted Neurosurgical Operating System - Minop Ii”; 2005; http://web.archive.org/web/20151120215151/http://www.meditec.hia.rwth-aachen.de/fileadmin/content/meditec/bilder/forschung/aktuelle_projekte/robotische/Exoscope_Aesculap .pdf; Nov. 20, 2015 in 4 pages.
Male Bayonet Video; 00:04 in length; Date Unknown; Received Oct. 10, 2014 [Screenshots captured at 00:00, 00:01,00:02, 00:03, and 00:04].
MediTec; “Minop II—Robotical Microscope Platform”; http://web.archive.org/web/20151120213932/http://www.meditec.hia.rwth-aachen.de/en/research/former-projects/minop-ii/; Nov. 20, 2015 in 3 pages.
Melexis; “MLX75031 Optical Gesture and Proximity Sensing IC”; http://melexis.com/optical-sensors/optical-sensing.mlx75031-815.aspx?sta printed Mar. 15, 2013 in 1 page.
MMR Technologies; “Micro Miniature Refrigerators”; http://www.mmr-tech.com/mmr_overview.php; Copyright ©2011; printed Feb. 11, 2013.
Moog; “Surgical Handpieces: Therapeutic Ultrasonic Devices”; http://www.moog.com/products/surgical-hpieces/ printed Sep. 25, 2013 in 1 page.
Morita; “TwinPower Turbine® High Speed Handpieces Standard, 45″, and Ultra Series Head Designs”; J. Morita Mfg. Corp., http://www.morita.com/USA/root/img/pool/pdf/product_brochures/twinpower_brochure_I-264 0512 web.pdf; May 2012; in 20 pages.
“Narrow Band Imaging”; http://web.archive.Org/web/20150701233623/https://en.wikipedia.org/wiki/Narrow_band_imaging printed Jul. 1, 2015 in 1 page.
Olympus; “Olympus Introduces the World's First and Only Monopolar, Disposable Tonsil Adenoid Debrider (DTAD)”; http://www.olympusamerica.com/corporate/corp_presscenter_headline.asp?pressNo=926; Sep. 11, 2012; in 2 pages.
OmniVision; “OV2722 full HD (1080p) product brief: 1/6-Inch Native 1080p HD CameraChip Sensor for Ultra-Compact Applications”; http://web.archive.org/web/20120730043057/http://www.ovt.com/download_document.php?type=sensor&sensorid=119; May 2012 in 2 pages.
Orthofix; “ProView MAP System Retractors”; www.US.orthofix.com/products/proviewretractors.asp?cid=39; Copyright ©2010; printed Apr. 1, 2013.
OrtusTech; “Sample Shipment Start: World's Smallest Size Full-HD Color TFT LCD”; http://ortustech.co.jp/english/notice/20120427.html; printed May 22, 2012 in 2 pages.
“Portion”; Definition; American Heritage® Dictionary of the English Language; Fifth Edition; 2016; Retrieved Apr. 12, 2018 from https://www.thefreedictionary.com/portion in 1 page.
Purcher, Jack; “Apple Wins a Patent for an Oculus Rift-Like Display System”; http://www.patentlyapple.com/patently-apple/2014/09/apple-wins-a-patent-for-an-oculus-rift-like-display-system.html; Sep. 9, 2014.
Rustum, Dr. Abu; “ICG Mapping Endometrial Cancer”; Pinpoint Endometrium Ca Lenfedenektomi MSKCC May 2013; Memorial Sloan Kettering Cancer Center; May 2013; Published to YouTube.com Sep. 1, 2013; in 2 pages; http://web.archive.org/web/20150402210857/https://www.youtube.com/watch?v=DhChvaUCe4l.
Saab, Mark; “Applications of High-Pressure Balloons in the Medical Device Industry”; http://www.ventionmedical.com/documents/medicalballoonpaper.pdf; Copyright © 1999; in 19 pages.
Savage, Lynn; “Sound and Light, Signifying Improved Imaging”; www.photonics.com/Article.aspx?AID=45039; Nov. 1, 2010; in 6 pages.
Sun et al.; “Neurotoxin-Directed Synthesis and in Vitro Evaluation of Au Nanoclusters”; RSC Advances, 2015; vol. 5, No. 38; p. 29647-29652.
Timm, Karl Walter; “Real-Time View Morphing of Video Streams”; University of Illinois; Chicago, Illinois; 2003; in 168 pages.
Truevision Microscopes; http://truevisionmicroscopes.com/images/productsnew/081a-f.jpg; printed Nov. 26, 2014 in 1 page.
Truevision; “About Truevision”; http://web.archive.org/web/20071208125103/http://www.truevisionsys.com/about.html; as viewed Dec. 8, 2007 in 2 pages.
Truevision; “Leica Microsystems and Truevision® 3D Surgical create the first 3D digital hybrid microscope”; Press Release; October 2012; in 2 pages.
Truevision; “Truevision Technology”; http://web.archive.org/web/20071208125125/http://www.truevisionsys.com/technology.html; as viewed Dec. 8, 2007 in 2 pages.
Whitney et al.; “Pop-up book MEMS”; Journal of Micromechanics and Microengineering; Oct. 14, 2011; vol. 21; No. 115021; in 7 pages.
Wikipedia; “Zoom Lens”; http://en.wikipedia.org/wiki/Optical_Zoom; printed Oct. 7, 2014 in 3 pages.
Zeiss; “Informed for Medical Professionals, Focus: Fluorescence”; Carl Zeiss; 2nd Issue; Oct. 2006; 30-801-LBW-GFH-X-2006; Printed in Germany; in 32 pages.
Zeiss; “Ophthalmic Surgery in Its Highest Form, OPMI® Visu 210”; Carl Zeiss; 2005; 30-097/III-e/USA Printed in Germany AW-TS-V/2005 Uoo; in 19 pages.
Zeiss; “SteREO Discovery. V12, Expanding the Boundaries”; Carl Zeiss, Sep. 2004; 46-0008 e 09.2004, in 6 pages.
Zeiss; “Stereomicroscopes: Stemi SV 6, SV 11, SV 11 Apo”; The Profile; 1999; in 30 pages.
Zeiss; “Time for a Change: OPMI® pico for ENT”; Carl Zeiss; 2005; 30-451/III-e Printed in Germany LBW-TS-V/2005 Uoo; in 8 pages.
Zhang, Michael; “Lifx: A WiFi-Enabled LED Bulb that May Revolutionize Photographic Lighting”; http://www.petapixel.com/2012/09/22/lifx-a-wifi-enabled-led-bulb-that-may-revolutionize-photographic-lighting/; printed Sep. 28, 2012 in 9 pages.
Zhang, Sarah; “The Obscure Neuroscience Problem That's Plaguing VR”; http://web.archive.org/web/20150812172934/http://www.wired.com/2015/08/obscure-neuroscience-problem-thats-plaguing-vr/; Aug. 11, 2015 in 5 pages.
Preliminary Amendment in U.S. Appl. No. 16/357,081, dated Sep. 4, 2019.
Official Communication in European Application No. 13808996.6, dated Jan. 4, 2016.
Official Communication in European Application No. 13808996.6, dated Apr. 14, 2016.
Official Communication in European Application No. 13808996.6, dated Feb. 21, 2017.
Official Communication in European Application No. 13808996.6, dated Jun. 6, 2017.
Official Communication in European Application No. 13808996.6, dated Jun. 15, 2018.
Official Communication in European Application No. 13808996.6, dated May 13, 2019.
Official Communication in Japanese Application No. 2015-520471, dated May 9, 2017.
Official Communication in Japanese Application No. 2015-520471, dated Nov. 21, 2017.
Notice of Decision or Rejection in Japanese Application No. 2015-520471, dated Jul. 24, 2018.
International Search Report and Written Opinion in PCT Application No. PCT/US2013/047972, dated Jan. 3, 2014.
International Preliminary Report on Patentability in PCT Application No. PCT/US2013/047972, dated Jan. 8, 2015.
Preliminary Amendment in U.S. Appl. No. 15/483,995, dated Nov. 21, 2017.
Office Action in U.S. Appl. No. 15/483,995, dated Mar. 9, 2018.
Amendment in U.S. Appl. No. 15/483,995, dated Sep. 7, 2018.
Final Office Action in U.S. Appl. No. 15/483,995, dated Nov. 29, 2018.
Amendment in U.S. Appl. No. 15/483,995, dated May 28, 2019.
Office Action in U.S. Appl. No. 15/483,995, dated Jun. 13, 2019.
Office Action in U.S. Appl. No. 15/645,589, dated Feb. 9, 2018.
Amendment in U.S. Appl. No. 15/645,589, dated Aug. 7, 2018.
Final Office Action in U.S. Appl. No. 15/645,589, dated Nov. 28, 2018.
Amendment in U.S. Appl. No. 15/645,589, dated May 28, 2019.
Office Action in U.S. Appl. No. 15/645,589, dated Jun. 13, 2019.
Preliminary Amendment filed in U.S. Appl. No. 16/036,665, dated Nov. 1, 2018.
Office Action in U.S. Appl. No. 15/626,516, dated Mar. 14, 2018.
Amendment in U.S. Appl. No. 15/626,516, dated Sep. 13, 2018.
Final Office Action in U.S. Appl. No. 15/626,516, dated Jan. 15, 2019.
Response in U.S. Appl. No. 15/626,516, dated Jul. 15, 2019.
Restriction Requirement in U.S. Appl. No. 15/495,484, dated May 14, 2019.
Official Communication in European Application No. 14800423.7, dated Feb. 8, 2017.
International Search Report and Written Opinion in PCT Application No. PCT/US2014/038839, dated Oct. 17, 2014.
International Preliminary Report on Patentability in PCT Application No. PCT/US2014/038839, dated Dec. 5, 2015.
Partial Supplementary European Search Report in European Application No. 14845427.5, dated May 4, 2017.
Extended European Search Report in European Application No. 14845427.5, dated Aug. 8, 2017.
Extended European Search Report in European Application No. 14846410.0, dated Jun. 23, 2017.
Official Communication in European Application No. 14846410.0, dated Jul. 18, 2018.
Official Communication in European Application No. 14846410.0, dated Mar. 20, 2019.
Official Communication in Japanese Application No. 2016-544032, dated Jun. 26, 2018.
International Search Report and Written Opinion in PCT Application No. PCT/US2014/056643, dated Dec. 11, 2014.
International Preliminary Report on Patentability and Written Opinion in PCT Application No. PCT/US2014/056643, dated Mar. 31, 2016.
Invitation to Pay Additional Fees in PCT Application No. PCT/US2014/056681, dated Jan. 14, 2015.
International Search Report and Written Opinion in PCT Application No. PCT/US2014/056681, dated Mar. 20, 2015.
International Preliminary Report on Patentability and Written Opinion in PCT Application No. PCT/US2014/056681, dated Mar. 31, 2016.
Preliminary Amendment in U.S. Appl. No. 14/581,779, dated Jul. 6, 2015.
Restriction Requirement in U.S. Appl. No. 14/581,779, dated Oct. 31, 2017.
Restriction Requirement and Election of Species Response in U.S. Appl. No. 14/581,779, dated Jan. 2, 2018.
Office Action in U.S. Appl. No. 14/581,779, dated Apr. 24, 2018.
Amendment in U.S. Appl. No. 14/581,779, dated Sep. 24, 2018.
Final Office Action in U.S. Appl. No. 14/581,779, dated Jan. 4, 2019.
Amendment in U.S. Appl. No. 14/581,779, dated Jul. 2, 2019.
Office Action in U.S. Appl. No. 14/581,779, dated Aug. 5, 2019.
Extended European Search Report in European Application No. 14873324.9, dated Aug. 25, 2017.
Official Communication in Japanese Application No. 2016-542194, dated Nov. 6, 2018.
Decision of Rejection in Japanese Application No. 2016-542194, dated May 14, 2019.
Invitation to Pay Additional Fees in PCT Application No. PCT/US2014/072121, dated Mar. 2, 2015.
International Search Report and Written Opinion in PCT Application No. PCT/US2014/072121, dated May 1, 2015.
International Preliminary Report on Patentability and Written Opinion in PCT Application No. PCT/US2014/072121, dated Jul. 7, 2016.
Extended European Search Report in European Application No. 15865454.1, dated Jun. 27, 2018.
International Search Report and Written Opinion in PCT Application No. PCT/US2015/064133, dated Feb. 9, 2016.
International Preliminary Report on Patentability and Written Opinion in PCT Application No. PCT/US2015/064133, dated Jun. 15, 2017.
Preliminary Amendment in U.S. Appl. No. 15/081,653, dated Oct. 11, 2016.
Office Action in U.S. Appl. No. 15/081,653, dated Mar. 28, 2018.
Amendment in U.S. Appl. No. 15/081,653, dated Sep. 27, 2018.
Final Office Action in U.S. Appl. No. 15/081,653, dated Nov. 16, 2018.
Final Amendment in U.S. Appl. No. 15/081,653, dated May 15, 2019.
Office Action in U.S. Appl. No. 15/081,653, dated Jul. 12, 2019.
Extended European Search Report in European Application No. 16769809.1, dated Nov. 23, 2018.
International Search Report and Written Opinion in PCT Application No. PCT/US2016/024330, dated Jul. 1, 2016.
International Preliminary Report on Patentability and Written Opinion in PCT Application No. PCT/US2016/024330, dated Oct. 5, 2017.
Preliminary Amendment in U.S. Appl. No. 15/360,565, dated Feb. 6, 2017.
Office Action in U.S. Appl. No. 15/360,565, dated Aug. 10, 2018.
Amendment in U.S. Appl. No. 15/360,565, dated Feb. 8, 2019.
Office Action in U.S. Appl. No. 15/360,565, dated May 22, 2019.
Extended European Search Report in European Application No. 16869253.1, dated May 29, 2019.
Invitation to Pay Additional Fees in PCT Application No. PCT/US2016/063549, dated Feb. 2, 2017.
International Search Report and Written Opinion in PCT Application No. PCT/US2016/063549, dated Apr. 14, 2017.
International Preliminary Report on Patentability and Written Opinion in PCT Application No. PCT/US2016/063549, dated Jun. 7, 2018.
Office Action in U.S. Appl. No. 15/973,433, dated Jun. 28, 2019.
International Search Report and Written Opinion in PCT Application No. PCT/US2018/031442, dated Sep. 14, 2018.
International Search Report and Written Opinion in PCT Application No. PCT/US2018/034227, dated Jul. 30, 2018.
Related Publications (1)
Number Date Country
20210015355 A1 Jan 2021 US
Provisional Applications (6)
Number Date Country
61880808 Sep 2013 US
61920451 Dec 2013 US
61921051 Dec 2013 US
61921389 Dec 2013 US
61922068 Dec 2013 US
61923188 Jan 2014 US
Continuations (2)
Number Date Country
Parent 16042318 Jul 2018 US
Child 16799168 US
Parent 14491827 Sep 2014 US
Child 16042318 US