Various example embodiments relate to visual displays and, more specifically but not exclusively, to augmented reality displays for imaging applications.
Augmented reality (AR) refers to real-time integration of digital information into a user's environment. In some examples, AR technology is used to overlay additional content, such as graphics, images, and/or text, onto the real-world view observed by the user, thereby enriching the user's perception of reality rather than replacing it. With AR, users still see and can interact with the corresponding physical environments while experiencing supplementary information overlaid onto their field of vision. Various AR technologies are beneficially used, e.g., in retail shopping, education, manufacturing, entertainment, healthcare, and navigation.
Various examples provide methods and apparatus for generating intraocular overlay patterns in an optical microscope. In one example, an intraocular overlay pattern is generated using a digital micromirror device (DMD) located in the intermediate focal plane of a telecentric optical relay coupled between the objective lens and an eyepiece of the optical microscope. In some examples, the overlay pattern displays real-time visualization of intraoperative optical coherence tomography (iOCT) data and surgical field overlays. Such overlay patterns can beneficially be used, e.g., to provide real-time intraoperative feedback during ophthalmic surgery substantially without any interference with the surgical workflow.
In one example, an optical microscope comprises: a first optical relay coupled between an objective lens and a first eyepiece of the optical microscope, the first optical relay having an intermediate focal plane between first and second relay portions thereof; a first two-dimensional (2D) mirror array having at least a first portion thereof in the intermediate focal plane of the first optical relay; a light source configured to illuminate the first 2D mirror array with overlay light; and a driver circuit configured to controllably switch each mirror of the first 2D mirror array between a respective first orientation and a respective second orientation. In the respective first orientation, a mirror of the first 2D mirror array is configured to: direct object light from the objective lens toward the first eyepiece; and direct the overlay light from the light source toward a light trap. In the respective second orientation, the mirror of the first 2D mirror array is configured to: direct the object light from the objective lens toward the light trap; and direct the overlay light from the light source toward the first eyepiece.
In another example, a method of generating an intraocular overlay pattern in an optical microscope comprises illuminating a 2D mirror array with overlay light, the 2D mirror array having at least a portion thereof in an intermediate focal plane of an optical relay coupled between an objective lens and an eyepiece of the optical microscope; and controllably rotating each mirror of the 2D mirror array into a respective first orientation or a respective second orientation. In the respective first orientation, a mirror of the 2D mirror array is configured to: direct object light from the objective lens toward the eyepiece; and direct the overlay light toward a light trap. In the respective second orientation, the mirror of the 2D mirror array is configured to: direct the object light from the objective lens toward the light trap; and direct the overlay light toward the eyepiece.
Other aspects, features, and benefits of various disclosed embodiments will become more fully apparent, by way of example, from the following detailed description and the accompanying drawings, in which:
In the following description, numerous details are set forth, such as optical device/system configurations, timings, operations, and the like, in order to provide an understanding of one or more aspects of the present disclosure. It will be readily apparent to persons of ordinary skill in the pertinent art that these specific details are mere examples and are not intended to limit the scope of this application.
An operating or surgical microscope is an optical microscope specifically designed for use in a surgical setting, usually to assist with microsurgery. Typical magnification provided by a surgical microscope is in the approximate range from 4× to 40×. Certain components of the surgical microscope may be specifically designed for relatively easy sterilization or disinfection to ensure good cross-infection control. In some examples, a surgical microscope may incorporate a prism that allows splitting of the pertinent light beam, e.g., to enable the surgeon's assistant to also visualize the procedure or to allow photography or videography of the surgical field to be performed substantially without any interference with the surgical procedure. In some examples, a surgical microscope may incorporate optics that enables intraoperative optical coherence tomography (iOCT) to be performed during the procedure. Fields of medicine that make significant use of surgical microscopes include plastic surgery, dentistry (e.g., endodontics), otolaryngology (or ENT) surgery, ophthalmic surgery, and neurosurgery.
Microscope-integrated iOCT allows for depth-resolved volumetric imaging during surgery. In some examples, real-time visualization of iOCT data may be displayed on an external monitor (e.g., a computer or True Vision display) or a heads-up display (HUD) coupled to the microscope. In such examples, stereoscopic surgical views are either completely lost or necessitate the use of polarization glasses to be observable on the external monitor. In some other examples, an intraocular HUD may couple Light-Emitting-Diode (LED) or Organic Light-Emitting-Diode (OLED) displays across a beamsplitter cube to overlay the iOCT data onto the surgical views. However, in such examples, the display contrast may be limited by a relatively low panel brightness and/or tradeoffs between the display and surgical field brightness. In addition, to be sufficiently perceptible, the overlays may need to be generally constrained to dark or unused regions of the surgical field of view (FOV).
At least some of the above-indicated problems in the state of the art can beneficially be addressed using various embodiments disclosed herein. In one example, a surgical microscope includes an optical relay coupled between an objective lens and an eyepiece of the microscope, a two-dimensional (2D) micromirror array in the intermediate focal plane of the optical relay, a light source configured to illuminate the 2D micromirror array with overlay light, and an electronic controller configured to controllably switch each micromirror in the 2D micromirror array between a respective first orientation and a respective second orientation. In the respective first orientation, a micromirror of the 2D micromirror array is configured to direct object light from the objective lens toward the eyepiece and direct the overlay light from the light source toward a light trap. In the respective second orientation, the micromirror is configured to direct the object light from the objective lens toward the light trap and direct the overlay light from the light source toward the eyepiece. The electronic controller operates to control orientations of individual micromirrors in the 2D micromirror array to cause a desired overlay pattern (e.g., iOCT graphics, surgical-field markers, text, etc.) to be projected by the 2D micromirror array toward the eyepiece of the microscope together with the complementary portion of the real-time view of the surgical FOV.
In some examples, the 2D micromirror array can be implemented using a commercially available digital micromirror device (DMD), such as the Model DLP471TP DMD from Texas Instruments. In some examples, the surgical microscope additionally includes one or more of the following features: binocular support using separate dedicated DMDs for each ocular or a shared DMD for both oculars; stereoscopic overlay using differential ray casting displays for each ocular; more precise color control and fidelity by modulating the corresponding RGB LED source and DMD on/off synchronously or asynchronously; polarization switching to multiplex information, thereby effectively doubling the information throughput; and use of the DMD(s) in conjunction with pixel shifters to increase pixel density (e.g., at the expense of pixel rates).
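The synchronous modulation of the RGB LED source and the DMD mentioned above can be illustrated with a short field-sequential-color sketch. All names below are illustrative only and are not part of any actual DMD driver API: each 60 Hz frame is presented as three color subframes, with the matching LED gated on while the corresponding binary mirror plane is displayed.

```python
def field_sequential_schedule(overlay_rgb):
    """Return the (LED color, binary mirror plane) sequence for one frame
    of field-sequential color.  overlay_rgb maps 'R'/'G'/'B' to a 2D list
    of booleans giving the mirror states for that color subframe.
    Illustrative model only; a real driver circuit sequences this in
    hardware."""
    return [(color, overlay_rgb[color]) for color in ("R", "G", "B")]

# One frame shown as three color subframes (R, then G, then B):
plane = [[True, False], [False, True]]
sched = field_sequential_schedule({"R": plane, "G": plane, "B": plane})
```

In this model, color fidelity can be tuned by adjusting the LED drive level or on-time within each subframe rather than by changing the mirror plane itself.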
In some examples, a similar optical design can be used in optical devices other than surgical microscopes, such as in various augmented/mixed reality viewing devices for medical and educational applications.
As used herein, the term “real time” refers to a computer-based process that controls or monitors a corresponding environment by receiving data, processing the received data, and generating a response sufficiently quickly to affect or characterize the environment without significant delay. In the context of control or processing software, real-time responses are often understood to be on the order of milliseconds, or sometimes microseconds. In the context of a surgical procedure, “real-time” updates mean that the experimental data and measurement results derived therefrom sufficiently accurately represent the state of the surgical FOV at any point in time. In this case, data-acquisition and/or processing delays of several seconds may still be considered to be within “real time” or “near real time” for at least some surgical procedures.
In a representative example, the optical microscope 100 includes first and second eyepieces, only one of which (labeled 140) is explicitly shown in
In the example shown, a user eye is modeled with a camera 150, which is optically coupled to the eyepiece 140 in a substantially similar manner. The camera 150 includes a lens 152 and a pixelated photodetector (e.g., a CCD) 154, which respectively model a typical crystalline lens and a typical retina of a human eye. As such, images captured by the pixelated photodetector 154 (e.g., see
Different configurations of the optical microscope 100 may employ different embodiments of the eyepiece 140 characterized by different respective magnifications, such as 10×, 12.5×, 16×, 20×, etc. The choice of magnification typically depends on the needed size of the field of view and the desired overall magnification of the optical microscope 100. In some examples, the eyepiece 140 has a focal length of 125 mm.
The magnification changer 120 is designed to change the degree of magnification of the optical microscope 100 without any change in the working distance (i.e., the distance between the objective lens 116 and the patient eye 102). In the example shown, the magnification changer 120 includes a system of lenses, the relative position(s) of which can be controllably changed to provide a continuous change in the magnification. In one example, the changeable magnification provided by the magnification changer 120 can be in the range from 0.5× to 2.5×.
The objective lens 116 operates to direct the illumination light toward the patient eye 102 and to collect a portion of the illumination light reflected from the retina 104 (which may be referred to as object light). A reduction lens 110 and an ophthalmic lens 108 operate to reduce the beam diameter and collimate the illumination light entering the patient eye 102 through a crystalline lens 106 thereof and further operate to properly couple the reflected light exiting the patient eye 102 through the crystalline lens 106 into the objective lens 116. The lenses 108, 110 further operate to increase the FOV of the optical microscope 100 (and of the volumetric imaging module 300, see
The illumination light is typically generated by an external illuminator (not explicitly shown in
In various examples, the optical microscope 100 may use one of the following mechanical support systems: (i) on casters; (ii) wall mounted; (iii) tabletop; and (iv) ceiling mounted. In some cases, an on-caster stand is the preferred mechanical support structure owing to its better mobility. In some other cases, a ceiling or wall mount may be preferred because it helps with space management. An example mechanical support system for the optical microscope 100 may include precision motorized mechanics so that the microscope can be adjusted flexibly to the right position as needed. In some examples, the mechanical support system incorporates a foot pedal that can be used to control the illumination, focus, zoom, and X-Y position of the optics over the surgical field.
To enable overlays on the view observable through the eyepiece 140, the optical microscope 100 includes an optical relay 130, a 2D mirror array 160, a light engine 170, a driver circuit 180, and an electronic controller 190. The optical relay 130 may be optically coupled between the objective lens 116 and the eyepiece 140, e.g., as indicated in
The 2D mirror array 160 is positioned such that the effective reflecting surface thereof is substantially located in the intermediate focal plane 162 of the optical relay 130. The optics of the optical relay 130 includes optical prisms 164, 166 optically coupled between the relay portions 132, 134 and the 2D mirror array 160 to achieve substantially (e.g., within ±5 degrees) normal incidence of the object light onto the 2D mirror array 160. Overlay light generated by the light engine 170 is collimated by a collimation lens 168 and is directed through the optical prisms 164, 166 to the 2D mirror array 160, e.g., as indicated in
In one example, mirrors of the 2D mirror array 160 are arranged in mutually orthogonal rows and columns and have a pitch of 5.4 μm/pixel, with an overall size of the corresponding rectangular array being 1920×1080 pixels. In other examples, other pitches and overall sizes can also be used. The driver circuit 180 is configured to control the 2D mirror array 160 and the light engine 170 via control signals 176 and 178, respectively. In some examples, the control signals 176 and 178 can be synchronized to achieve a 60 Hz refresh rate (240 Hz per RGB color channel). In one example, the light engine 170 is implemented using the light engine Model DLPDLCR471TPEVM commercially available from Texas Instruments. In other examples, other suitable light engines or light sources can also be used as substitutes. For example, a fixed-spectrum LED source can be used when color control or color variability of the overlay light is not needed.
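The active-area dimensions of the example array follow directly from the stated pitch and resolution, as the short sketch below shows (the derived values are illustrative arithmetic, not specifications quoted from a datasheet):

```python
import math

PITCH_UM = 5.4           # mirror pitch stated above, micrometers per pixel
COLS, ROWS = 1920, 1080  # array resolution stated above

width_mm = COLS * PITCH_UM / 1000.0   # 10.368 mm
height_mm = ROWS * PITCH_UM / 1000.0  # 5.832 mm
diagonal_mm = math.hypot(width_mm, height_mm)
diagonal_in = diagonal_mm / 25.4      # roughly 0.47 inch diagonal

print(f"{width_mm:.3f} x {height_mm:.3f} mm, diagonal {diagonal_in:.2f} in")
```

The resulting roughly 0.47-inch diagonal is consistent with the array fitting at the intermediate focal plane of a compact optical relay.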
The electronic controller 190 is configured to provide a video signal 182 based on which the driver circuit 180 generates the control signals 176 and 178 for the 2D mirror array 160 and the light engine 170, respectively. In general, the video signal 182 can specify any pattern (e.g., including graphics, images, and/or text) to be overlayed onto the intraocular view observed through the eyepiece 140. In one example, the pattern specified by the video signal 182 includes one or more depth profiles (Z-coordinate information) of the patient eye 102 in the FOV of the optical microscope 100. In some examples, such depth profiles are computed by the electronic controller 190 or other suitable computing device based on the iOCT data received via a communication signal 192 from a volumetric imaging module 300 coupled to the optical microscope 100, e.g., as illustrated in
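One way the electronic controller 190 might rasterize a depth profile into an overlay frame for the video signal is sketched below. This is a hypothetical helper, not the actual controller firmware: the function name, the frame geometry, and the depth range are all assumptions made for illustration.

```python
import numpy as np

def render_depth_overlay(depth_mm, frame_shape=(1080, 1920),
                         z_range=(0.0, 2.0), color=(0, 255, 0)):
    """Rasterize a 1D depth profile (one depth value per column) into an
    RGB overlay frame.  Nonzero pixels correspond to mirrors that would be
    switched to the second (overlay-projecting) orientation."""
    rows, cols = frame_shape
    frame = np.zeros((rows, cols, 3), dtype=np.uint8)
    z0, z1 = z_range
    # Map each depth sample to a row index; clip to stay inside the frame.
    r = np.clip(((depth_mm - z0) / (z1 - z0) * (rows - 1)).astype(int),
                0, rows - 1)
    c = np.linspace(0, cols - 1, len(depth_mm)).astype(int)
    frame[r, c] = color   # trace the profile as a one-pixel-wide curve
    return frame

# Example: a flat surface at 1.0 mm depth maps to the middle of the frame.
profile = np.full(1920, 1.0)
frame = render_depth_overlay(profile)
```

In practice, such a frame would be streamed as the video signal 182 to the driver circuit 180, which converts it into the mirror and LED control signals.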
In some examples, the optical microscope 100 may further include a second set of elements analogous to the set including the optical relay 130, the 2D mirror array 160, the light engine 170, and the driver circuit 180. In such examples, such second set of elements is similarly installed in and coupled to the optical path between the second portion 1222 of the magnification changer 120 and the above-described second eyepiece of the binocular head of the optical microscope 100. The electronic controller 190 can similarly be used to control the driver circuit for the second set of elements, e.g., to beneficially enable stereoscopic overlays. In some examples, a shared 2D mirror array may be coupled to the two eyepieces such that a first portion of the shared 2D mirror array is configured to handle the intraocular overlay patterns for the first eyepiece, and a nonoverlapping second portion of the shared 2D mirror array is configured to handle the intraocular overlay patterns for the second eyepiece.
Herein, a “main plane” of an object, such as a die, a substrate, an IC, or a MEMS device, is a plane parallel to a substantially planar surface thereof that has about the largest area among the exterior surfaces of the object. This substantially planar surface may be referred to as a main surface. The exterior surfaces of the object that have one relatively large size, e.g., length, but are of much smaller area, e.g., less than one half of the main-surface area, are typically referred to as the edges of the object. A surface is considered to be substantially planar when the feature height variation along the surface is much smaller than the length of at least one of its edges.
Referring to
Referring to
When all of the individual mirrors 202 in the 2D mirror array 160 are in the respective first orientations, the eyepiece 140 displays an intraocular view of the FOV that is free of overlays. When mirrors in complementary first and second subsets of the mirrors 202 in the 2D mirror array 160 are in the first and second orientations, respectively, the eyepiece 140 displays an intraocular view of the FOV in which an overlay is present. The geometric shape of the overlay is determined by the geometric shape of the second subset of mirrors. The color pattern and brightness of the overlay are determined by the mixture, intensity, and gating of the primary (e.g., RGB) colors emitted by the light engine 170. Dynamic (i.e., time-dependent) overlay patterns are generated by changing the geometric shape of the second subset of mirrors in the 2D mirror array 160 and/or the mixture, intensity, and gating of the primary colors emitted by the light engine 170 by appropriately driving the 2D mirror array 160 and the light engine 170 using the driver circuit 180.
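The complementary-subset partition described above can be modeled as a binary overlay mask, with each mask entry selecting one of the two mirror orientations. The sketch below is an illustrative software model only; real micromirrors are switched electrostatically by the driver circuit.

```python
def mirror_orientations(overlay_mask):
    """Partition mirrors into the complementary subsets described above.

    overlay_mask: 2D list of booleans, True where the overlay pattern is
    to appear.  Returns a same-shaped grid of orientation labels:
      'first'  -> object light to eyepiece, overlay light to light trap
      'second' -> object light to light trap, overlay light to eyepiece
    """
    return [['second' if on else 'first' for on in row]
            for row in overlay_mask]

# A 3x3 field with a single overlay pixel at its center:
mask = [[False, False, False],
        [False, True,  False],
        [False, False, False]]
states = mirror_orientations(mask)
```

A dynamic overlay then amounts to updating the mask frame by frame and re-deriving the orientation grid.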
The volumetric imaging module 300 is designed and configured to perform spectrally encoded coherence tomography and reflectometry (SECTR), which combines cross-sectional swept-source optical-coherence-tomography (OCT) imaging with en face SER. This multimodality of the module 300 beneficially enables concurrent high-speed acquisition of en face reflectance images of region-of-interest (ROI) motion, with inherently spatiotemporally co-registered volumetric OCT data. The utility of the SECTR methodology was previously demonstrated, e.g., for SER-based retinal tracking and OCT motion correction, multi-volumetric OCT mosaicking to extend the imaging FOV, and multi-volumetric averaging to improve the OCT signal-to-noise ratio (SNR) and OCT angiography connectivity. Integrating the module 300 with the optical microscope 100 further beneficially enables the integrated instrument to generate real-time overlays, e.g., displaying co-registered cross sections of the corresponding en face FOV that can be observed by the user through at least the first eyepiece 140 or, in some embodiments, through both the first and second eyepieces of the binocular head of the optical microscope 100.
Referring to
The volumetric imaging module 300 is configured to use NIR light generated by an external optical engine (not explicitly shown in
Referring to both
Referring to
The returned SER signal is detected in the SER input/output block 312 using an avalanche photodiode (APD, not explicitly shown). The electrical signal generated by the APD is converted into digital form using an analog-to-digital converter (ADC, not explicitly shown), and a resulting digital signal 308 (see
The computing device 700 of
The computing device 700 includes a processing device 702 (e.g., one or more processing devices). As used herein, the terms “electronic processor device” and “processing device” interchangeably refer to any device or portion of a device that processes electronic data from registers and/or memory to transform that electronic data into other electronic data that may be stored in registers and/or memory. In various embodiments, the processing device 702 may include one or more digital signal processors (DSPs), application-specific integrated circuits (ASICs), central processing units (CPUs), graphics processing units (GPUs), server processors, field-programmable gate arrays (FPGAs), or any other suitable processing devices.
The computing device 700 also includes a storage device 704 (e.g., one or more storage devices). In various embodiments, the storage device 704 may include one or more memory devices, such as random-access memory (RAM) devices (e.g., static RAM (SRAM) devices, magnetic RAM (MRAM) devices, dynamic RAM (DRAM) devices, resistive RAM (RRAM) devices, or conductive-bridging RAM (CBRAM) devices), hard drive-based memory devices, solid-state memory devices, networked drives, cloud drives, or any combination of memory devices. In some embodiments, the storage device 704 may include memory that shares a die with the processing device 702. In such an embodiment, the memory may be used as cache memory and include embedded dynamic random-access memory (eDRAM) or spin transfer torque magnetic random-access memory (STT-MRAM), for example. In some embodiments, the storage device 704 may include non-transitory computer readable media having instructions thereon that, when executed by one or more processing devices (e.g., the processing device 702), cause the computing device 700 to perform any appropriate ones of the methods disclosed herein below or portions of such methods.
The computing device 700 further includes an interface device 706 (e.g., one or more interface devices 706). In various embodiments, the interface device 706 may include one or more communication chips, connectors, and/or other hardware and software to govern communications between the computing device 700 and other computing devices. For example, the interface device 706 may include circuitry for managing wireless communications for the transfer of data to and from the computing device 700. The term “wireless” and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may communicate data via modulated electromagnetic radiation through a nonsolid medium. The term does not imply that the associated devices do not contain any wires, although in some embodiments they might not. Circuitry included in the interface device 706 for managing wireless communications may implement any of a number of wireless standards or protocols, including but not limited to Institute of Electrical and Electronics Engineers (IEEE) standards including Wi-Fi (IEEE 802.11 family), IEEE 802.16 standards, Long-Term Evolution (LTE) project along with any amendments, updates, and/or revisions (e.g., advanced LTE project, ultramobile broadband (UMB) project (also referred to as “3GPP2”), etc.). In some embodiments, circuitry included in the interface device 706 for managing wireless communications may operate in accordance with a Global System for Mobile Communication (GSM), General Packet Radio Service (GPRS), Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Evolved HSPA (E-HSPA), or LTE network. In some embodiments, circuitry included in the interface device 706 for managing wireless communications may operate in accordance with Enhanced Data for GSM Evolution (EDGE), GSM EDGE Radio Access Network (GERAN), Universal Terrestrial Radio Access Network (UTRAN), or Evolved UTRAN (E-UTRAN).
In some embodiments, circuitry included in the interface device 706 for managing wireless communications may operate in accordance with Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Digital Enhanced Cordless Telecommunications (DECT), Evolution-Data Optimized (EV-DO), and derivatives thereof, as well as any other wireless protocols that are designated as 3G, 4G, 5G, and beyond. In some embodiments, the interface device 706 may include one or more antennas (e.g., one or more antenna arrays) configured to receive and/or transmit wireless signals.
In some embodiments, the interface device 706 may include circuitry for managing wired communications, such as electrical, optical, or any other suitable communication protocols. For example, the interface device 706 may include circuitry to support communications in accordance with Ethernet technologies. In some embodiments, the interface device 706 may support both wireless and wired communication, and/or may support multiple wired communication protocols and/or multiple wireless communication protocols. For example, a first set of circuitry of the interface device 706 may be dedicated to shorter-range wireless communications such as Wi-Fi or Bluetooth, and a second set of circuitry of the interface device 706 may be dedicated to longer-range wireless communications such as global positioning system (GPS), EDGE, GPRS, CDMA, WiMAX, LTE, EV-DO, or others. In some other embodiments, a first set of circuitry of the interface device 706 may be dedicated to wireless communications, and a second set of circuitry of the interface device 706 may be dedicated to wired communications.
The computing device 700 also includes battery/power circuitry 708. In various embodiments, the battery/power circuitry 708 may include one or more energy storage devices (e.g., batteries or capacitors) and/or circuitry for coupling components of the computing device 700 to an energy source separate from the computing device 700 (e.g., to AC line power).
The computing device 700 also includes a display device 710 (e.g., one or multiple individual display devices). In various embodiments, the display device 710 may include any visual indicators, such as a heads-up display, a computer monitor, a projector, a touchscreen display, a liquid crystal display (LCD), a light-emitting diode display, or a flat panel display.
The computing device 700 also includes additional input/output (I/O) devices 712. In various embodiments, the I/O devices 712 may include one or more data/signal transfer interfaces, audio I/O devices (e.g., microphones or microphone arrays, speakers, headsets, earbuds, alarms, etc.), audio codecs, video codecs, printers, sensors (e.g., thermocouples or other temperature sensors, humidity sensors, pressure sensors, vibration sensors, etc.), image capture devices (e.g., one or more cameras), human interface devices (e.g., keyboards, cursor control devices, such as a mouse, a stylus, a trackball, or a touchpad), etc.
Depending on the specific embodiment of the optical microscope 100, various components of the interface devices 706 and/or I/O devices 712 can be configured to send and receive suitable control messages, suitable control/telemetry signals, and streams of data. In some examples, the interface devices 706 and/or I/O devices 712 include one or more analog-to-digital converters (ADCs) for transforming received analog signals into a digital form suitable for operations performed by the processing device 702 and/or the storage device 704. In some additional examples, the interface devices 706 and/or I/O devices 712 include one or more digital-to-analog converters (DACs) for transforming digital signals provided by the processing device 702 and/or the storage device 704 into an analog form suitable for being communicated to the corresponding components of the optical microscope 100.
According to an example embodiment disclosed above, e.g., in the summary section and/or in reference to any one or any combination of some or all of
In some embodiments of the above apparatuses, the intermediate focal plane is conjugate to an object plane of the optical microscope.
In some embodiments of any of the above apparatuses, the first optical relay is a 4F telecentric relay having unity magnification. In some other embodiments, other suitable types of optical relays can also be used, including optical relays configured to provide magnification or demagnification factors different from unity, i.e., greater or smaller than one.
In some embodiments of any of the above apparatuses, the first relay portion of the first optical relay comprises a 2F optical relay; and the second relay portion of the first optical relay comprises another 2F optical relay.
In some embodiments of any of the above apparatuses, optical axes of the first and second relay portions are substantially orthogonal to one another.
In some embodiments of any of the above apparatuses, the apparatus further comprises an optical prism coupled between the first and second relay portions of the first optical relay to achieve substantially (e.g., within 10 degrees) normal incidence of the object light onto a main plane of the first 2D mirror array. In some examples, the optical prism is designed and configured to have optical properties that enable the optical prism substantially not to add chromatic aberration to either the optical microscope or the projected light paths. This particular optical property can be conferred, e.g., by specific tuning of the angles of the prism components and their indices of refraction. In a preferred configuration, the optical prism provides total internal reflection (TIR) for the object light at the first glass-air interface and does not cause TIR at the same interface after the DMD.
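The total-internal-reflection (TIR) condition described above can be checked against the critical angle at the glass-air interface, given by Snell's law as sin θc = n_air/n_glass. The sketch below uses an assumed refractive index of 1.5 (typical of common crown glasses) purely for illustration; the actual prism materials and angles are a design choice.

```python
import math

def critical_angle_deg(n_glass, n_air=1.0):
    """Critical angle for total internal reflection at a glass-air
    interface, from Snell's law: sin(theta_c) = n_air / n_glass."""
    return math.degrees(math.asin(n_air / n_glass))

def undergoes_tir(incidence_deg, n_glass):
    """True if a ray inside the glass striking the interface at the given
    angle (measured from the surface normal) is totally reflected."""
    return incidence_deg > critical_angle_deg(n_glass)

# Illustrative: for glass with n ~ 1.5 the critical angle is ~41.8 degrees,
# so a 45-degree internal ray is totally reflected while a 30-degree ray
# is transmitted through the interface instead.
theta_c = critical_angle_deg(1.5)
```

This is why the prism geometry can be tuned so that the object light strikes the first glass-air interface beyond the critical angle (and is reflected), while light returning after the DMD strikes it below the critical angle (and passes through).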
In some embodiments of any of the above apparatuses, the apparatus further comprises a camera optically coupled to the first optical relay to capture at least a portion of the object light and at least a portion of the overlay light directed by the first 2D mirror array toward the first eyepiece.
In some embodiments of any of the above apparatuses, the apparatus further comprises an ophthalmic lens configured to direct the object light toward the first eyepiece through the objective lens and the first optical relay.
In some embodiments of any of the above apparatuses, the overlay light generated by the light source has a fixed, time-independent optical spectrum.
In some embodiments of any of the above apparatuses, the light source comprises a color light engine.
In some embodiments of any of the above apparatuses, the driver circuit is configured to drive the first 2D mirror array and the color light engine in response to a received input signal specifying an overlay pattern to be projected toward the first eyepiece.
In some embodiments of any of the above apparatuses, the received input signal is a video signal.
In some embodiments of any of the above apparatuses, the apparatus further comprises a dichroic mirror optically coupled between the objective lens and the first optical relay and configured to optically couple a volumetric imaging module to imaging optics of the optical microscope.
In some embodiments of any of the above apparatuses, the volumetric imaging module is configured to perform OCT imaging.
In some embodiments of any of the above apparatuses, the volumetric imaging module is further configured to perform SER imaging.
In some embodiments of any of the above apparatuses, the driver circuit is configured to drive the first 2D mirror array to cause an overlay pattern projected toward the first eyepiece to include an OCT image acquired using the volumetric imaging module.
In some embodiments of any of the above apparatuses, the apparatus further comprises a second optical relay coupled between the objective lens and a second eyepiece of the optical microscope, the second optical relay having a respective intermediate focal plane between a respective first relay portion and a respective second relay portion thereof, wherein the first 2D mirror array has at least a second portion thereof in the respective intermediate focal plane of the second optical relay. In some examples, the first and second portions of the 2D mirror array are separately configured to allow display of different respective images in the first and second eyepieces. In some examples, the different respective images are such that a stereoscopic overlay is created for the viewer.
In some embodiments of any of the above apparatuses, the apparatus further comprises: a second optical relay coupled between the objective lens and a second eyepiece of the optical microscope, the second optical relay having a respective intermediate focal plane between a respective first relay portion and a respective second relay portion thereof; and a second 2D mirror array in the respective intermediate focal plane of the second optical relay. In some examples, the first and second 2D mirror arrays are separately configured to allow display of different respective images in the first and second eyepieces. In some examples, the different respective images are such that a stereoscopic overlay is created for the viewer.
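One way to derive the different respective left- and right-eyepiece images that produce a stereoscopic overlay is to apply a horizontal disparity shift to a single source overlay. The following sketch is purely illustrative; the frame size, disparity value, and use of a simple horizontal shift (rather than a full perspective rendering) are assumptions, not part of the disclosed apparatus.

```python
import numpy as np

def stereo_pair(overlay: np.ndarray, disparity_px: int) -> tuple[np.ndarray, np.ndarray]:
    """Produce left/right eyepiece frames from one overlay image.

    Shifting the overlay horizontally by plus/minus half the disparity
    causes the binocularly fused image to appear at a depth offset from
    the focal plane, yielding a stereoscopic overlay for the viewer.
    """
    half = disparity_px // 2
    left = np.roll(overlay, -half, axis=1)   # shift left-eye view leftward
    right = np.roll(overlay, +half, axis=1)  # shift right-eye view rightward
    return left, right

# Example: a single vertical marker line rendered with 4-pixel disparity.
overlay = np.zeros((8, 16), dtype=np.uint8)
overlay[:, 8] = 255                          # marker at column 8
left, right = stereo_pair(overlay, disparity_px=4)
```

Each frame of the resulting pair would then drive the corresponding 2D mirror array (or the corresponding portion of a shared array), so that each eyepiece displays its own shifted copy of the overlay.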
In some embodiments of any of the above apparatuses, the apparatus further comprises a second light source configured to illuminate the second 2D mirror array with second overlay light, wherein the driver circuit is further configured to controllably switch each mirror of the second 2D mirror array between a corresponding first orientation and a corresponding second orientation; wherein, in the corresponding first orientation, a mirror of the second 2D mirror array is configured to: direct the object light from the objective lens toward the second eyepiece; and direct the second overlay light from the second light source toward the light trap; and wherein, in the corresponding second orientation, the mirror of the second 2D mirror array is configured to: direct the object light from the objective lens toward the light trap; and direct the second overlay light from the second light source toward the second eyepiece.
With regard to the processes, systems, methods, heuristics, etc. described herein, it should be understood that, although the steps of such processes, etc. have been described as occurring according to a certain ordered sequence, such processes could be practiced with the described steps performed in an order other than the order described herein. It further should be understood that certain steps could be performed simultaneously, that other steps could be added, or that certain steps described herein could be omitted. In other words, the descriptions of processes herein are provided for the purpose of illustrating certain embodiments and should in no way be construed so as to limit the claims.
Accordingly, it is to be understood that the above description is intended to be illustrative and not restrictive. Many embodiments and applications other than the examples provided would be apparent upon reading the above description. The scope should be determined, not with reference to the above description, but should instead be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. It is anticipated and intended that future developments will occur in the technologies discussed herein, and that the disclosed systems and methods will be incorporated into such future embodiments. In sum, it should be understood that the application is capable of modification and variation.
All terms used in the claims are intended to be given their broadest reasonable constructions and their ordinary meanings as understood by those knowledgeable in the technologies described herein unless an explicit indication to the contrary is made herein. In particular, use of the singular articles such as “a,” “the,” “said,” etc. should be read to recite one or more of the indicated elements unless a claim recites an explicit limitation to the contrary.
The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments incorporate more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in fewer than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.
While this disclosure includes references to illustrative embodiments, this specification is not intended to be construed in a limiting sense. Various modifications of the described embodiments, as well as other embodiments within the scope of the disclosure, which are apparent to persons skilled in the art to which the disclosure pertains are deemed to lie within the principle and scope of the disclosure, e.g., as expressed in the following claims.
Unless explicitly stated otherwise, each numerical value and range should be interpreted as being approximate as if the word “about” or “approximately” preceded the value or range.
The use of figure numbers and/or figure reference labels in the claims is intended to identify one or more possible embodiments of the claimed subject matter in order to facilitate the interpretation of the claims. Such use is not to be construed as necessarily limiting the scope of those claims to the embodiments shown in the corresponding figures.
Although the elements in the following method claims, if any, are recited in a particular sequence with corresponding labeling, unless the claim recitations otherwise imply a particular sequence for implementing some or all of those elements, those elements are not necessarily intended to be limited to being implemented in that particular sequence.
Reference herein to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the disclosure. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments necessarily mutually exclusive of other embodiments. The same applies to the term “implementation.”
Unless otherwise specified herein, the use of the ordinal adjectives “first,” “second,” “third,” etc., to refer to an object of a plurality of like objects merely indicates that different instances of such like objects are being referred to, and is not intended to imply that the like objects so referred-to have to be in a corresponding order or sequence, either temporally, spatially, in ranking, or in any other manner.
Unless otherwise specified herein, in addition to its plain meaning, the conjunction “if” may also or alternatively be construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” which construal may depend on the corresponding specific context. For example, the phrase “if it is determined” or “if [a stated condition] is detected” may be construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event].”
Also, for purposes of this description, the terms “couple,” “coupling,” “coupled,” “connect,” “connecting,” or “connected” refer to any manner known in the art or later developed in which energy is allowed to be transferred between two or more elements, and the interposition of one or more additional elements is contemplated, although not required. Conversely, the terms “directly coupled,” “directly connected,” etc., imply the absence of such additional elements.
As used herein in reference to an element and a standard, the term “compatible” means that the element communicates with other elements in a manner wholly or partially specified by the standard and would be recognized by other elements as sufficiently capable of communicating with the other elements in the manner specified by the standard. The compatible element does not need to operate internally in a manner specified by the standard.
The functions of the various elements shown in the figures, including any functional blocks labeled as “processors” and/or “controllers,” may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, network processor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), read only memory (ROM) for storing software, random access memory (RAM), and nonvolatile storage. Other hardware, conventional and/or custom, may also be included. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
As used in this application, the terms “circuit” and “circuitry” may refer to one or more or all of the following: (a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry); (b) combinations of hardware circuits and software, such as (as applicable): (i) a combination of analog and/or digital hardware circuit(s) with software/firmware and (ii) any portions of hardware processor(s) with software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions; and (c) hardware circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation. This definition of circuitry applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors) or portion of a hardware circuit or processor and its (or their) accompanying software and/or firmware. The term circuitry also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device or a similar integrated circuit in a server, a cellular network device, or other computing or network device.
It should be appreciated by those of ordinary skill in the art that any block diagrams herein represent conceptual views of illustrative circuitry embodying the principles of the disclosure. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
The modifier “about” or “approximately” used in connection with a quantity is inclusive of the stated value and has the meaning dictated by the context (for example, it includes at least the degree of error associated with the measurement of the particular quantity). The modifier “about” or “approximately” should also be considered as disclosing the range defined by the absolute values of the two endpoints. For example, the expression “from about 2 to about 4” also discloses the range “from 2 to 4.” The term “about” may refer to plus or minus 10% of the indicated number. For example, “about 10%” may indicate a range of 9% to 11%, and “about 1” may mean from 0.9 to 1.1. Other meanings of “about” may be apparent from the context, such as rounding off, so that, for example, “about 1” may also mean from 0.5 to 1.4.
“SUMMARY” in this specification is intended to introduce some example embodiments, with additional embodiments being described in “DETAILED DESCRIPTION” and/or in reference to one or more drawings. “SUMMARY” is not intended to identify essential elements or features of the claimed subject matter, nor is it intended to limit the scope of the claimed subject matter.
This application claims the benefit of U.S. Provisional Patent Application No. 63/591,878 filed Oct. 20, 2023, and entitled “SYSTEMS AND METHODS FOR AUGMENTED/MIXED REALITY DISPLAY USING DIGITAL MICROMIRROR DEVICE,” the contents of which are incorporated herein by reference.
This invention was made with government support under EY030490, EY031769, and EY033969 awarded by the National Institutes of Health. The government has certain rights in the invention.
Number | Date | Country
---|---|---
63591878 | Oct 2023 | US