The present disclosure generally relates to display and imaging systems. Particular embodiments of the present disclosure relate to systems and methods for cascaded wavefront programming for controlling light properties for light field displays or imaging systems.
There has been increasing traction toward more immersive light field and/or autostereoscopic three-dimensional (“3D”) displays due to advancements in optics, electronics, and nano/micro fabrication. Unlike stereoscopic 3D, light field displays manipulate optical wavefronts to create depth perception at the monocular level, which can eliminate the accommodation-vergence mismatch and reduce stress on the user's eyes.
There are four methods available for realizing more realistic light field experiences: super multi-view, computational, multi-focal, and holographic. Each method has unique advantages and weaknesses: the super multi-view method provides a light field in a compact form factor but is limited to a reduced viewing zone and low resolution; the computational method increases resolution but produces haze and temporal flickering artifacts; the holographic method may struggle with color nonuniformity and fringing or specular artifacts; and the multi-focal method can produce clean images, but devices employing it are typically bulky.
The following issues are typical in all current light field display methods: large bandwidth requirements; a reliance on expensive and/or advanced components that are not easily mass-produced (e.g., tunable lenses); poor color uniformity; a small field of view or viewing zone; low brightness; low resolution, haze, and diffraction artifacts; limited depth range; lack of compatibility with existing display drivers; and the occasional necessity to wear specialized glasses.
Implementations of the disclosure relate to a display or imaging system that utilizes cascaded metasurfaces to dynamically program the wavefront of light.
In one embodiment, a display system comprises: a display configured to emit light corresponding to an image; and one or more optical control components configured to receive the light emitted by the display and modify one or more properties associated with the light as it passes through the one or more optical control components, wherein: each of the one or more optical control components comprises a polarization-dependent metasurface; the one or more properties associated with the light comprise: a direction the light travels, a position of the light, an angular distribution of the light, a perceived depth of the image corresponding to the light, or a wavelength of the light that is filtered; and each of the one or more optical control components is configured to dynamically switch between a first state where the optical control component modifies at least one property of the one or more properties associated with the light, and a second state where the optical control component does not modify the at least one property.
In some implementations, each of the one or more optical control components comprises the polarization-dependent metasurface between a first tunable waveplate and a second tunable waveplate. In some implementations, the first tunable waveplate is a first switchable halfwave plate (HWP), and the second tunable waveplate is a second switchable HWP. In some implementations, each of the first switchable HWP and the second switchable HWP comprises a liquid crystal.
In some implementations, the display system further includes: a controller configured to apply, for each of the one or more optical control components, a control signal to the first tunable waveplate that switches the optical control component between the first state and the second state, wherein in one of the first state and the second state the first tunable waveplate affects a polarization of light passing through it, and wherein in the other of the first state and the second state the first tunable waveplate does not affect the polarization of light passing through it.
In some implementations, a first optical control component of the one or more optical control components is configured to modify the direction the light travels, the metasurface of the first optical control component comprising a first metagrating configured to diffract the light at a first angle as it passes through the metagrating.
In some implementations, the one or more optical control components comprise a plurality of optical control components configured to modify the direction the light travels, the plurality of optical control components including the first optical control component and a second optical control component cascaded with the first optical control component, the metasurface of the second optical control component comprising a second metagrating configured to diffract the light at the first angle as it passes through the second metagrating, wherein when the first and second optical control components are in the first state, the light is diffracted at two times the first angle after it passes through the first and second optical control components.
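The additive behavior of cascaded metagrating stages described above can be sketched as follows. This is an illustrative small-angle approximation, not part of the disclosure: each stage either diffracts the light by a fixed angle (first state, ON) or passes it unchanged (second state, OFF), and the deflections of the ON stages sum.

```python
# Hedged sketch (small-angle approximation, hypothetical angles): the net
# deflection of light after a cascade of switchable metagrating stages.
# Each stage contributes its design angle only when switched ON.

def net_deflection(stage_angles_deg, states):
    """Sum the deflection contributed by every stage that is ON."""
    return sum(angle for angle, on in zip(stage_angles_deg, states) if on)

# Two cascaded stages, each diffracting by the same first angle (5 degrees):
print(net_deflection([5.0, 5.0], [True, True]))    # both ON  -> 10.0 (2x angle)
print(net_deflection([5.0, 5.0], [True, False]))   # one ON   -> 5.0
print(net_deflection([5.0, 5.0], [False, False]))  # both OFF -> 0.0
```

With N binary stages of suitably chosen angles, such a cascade can address 2^N discrete output directions.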
In some implementations, the one or more optical control components comprise a first optical control component adjacent a second optical control component; the first and second optical control components are configured to modify the position of the light; the metasurface of the first optical control component comprises a first metagrating configured to diffract the light at a first angle as it passes through the first metagrating; and the metasurface of the second optical control component comprises a second metagrating configured to diffract the light at a second angle opposite the first angle as it passes through the second metagrating.
In some implementations, a first optical control component of the one or more optical control components is configured to modify the angular distribution of the light, the metasurface of the first optical control component comprising a meta-lens array configured to converge or diverge the light at a given polarization as it passes through the meta-lens array.
In some implementations, a first optical control component of the one or more optical control components is configured to modify the perceived depth of the image, the metasurface of the first optical control component comprising a meta-lens array configured to reimage one or more pixels associated with the image.
In some implementations, the one or more optical control components comprise a first optical control component including: a substrate having a first side and a second side opposite the first side, a first metagrating on the first side of the substrate, and a second metagrating on the second side of the substrate; wherein the first metagrating is configured to diffract a first wavelength of the light at a first angle as it passes through the first metagrating, and not diffract a second wavelength of the light as it passes through the first metagrating; and the second metagrating is configured to diffract the first wavelength of the light at a second angle, opposite the first angle, as it passes through the second metagrating, and not diffract the second wavelength of the light as it passes through the second metagrating.
In some implementations, each of the one or more optical control components is capable of switching between the first state and the second state at a frequency greater than a framerate of the display. In some implementations, the one or more optical control components of the display system are configured to shift the image by less than a length of a pixel of the image to create a higher resolution image at a lower frame rate. In some implementations, the display comprises a plurality of display pixels; and the one or more optical control components comprise multiple optical control components, each positioned over a respective one of the display pixels to shift the position of light emitted by that pixel when the optical control component is in the first state.
In some implementations, each of the one or more optical control components is capable of switching between the first state and the second state at a frequency at least two times greater than a framerate of the display, and the one or more optical control components of the display system are configured to create an image having a framerate at least two times greater than that of the display.
In some implementations, each of the one or more optical control components comprises the polarization-dependent metasurface between a first tunable waveplate and a cascaded set of tunable waveplates, such that at each polarization angle or state the overall cascade performs a desired set of optical functionalities.
In some implementations, the display system is a tessellated display configured to expand a viewed size of the image; the display system further comprises at least two mirrors placed normal to a surface of the display; and the first optical control component is configured to cause the light to travel in the direction of one of the two mirrors.
In some implementations, the one or more optical control components comprise a first optical control component and a second optical control component; the first optical control component is configured to modify the direction the light travels to control a destination that the light travels to; and the second optical control component is configured to modify the angular distribution of the light to control a size of a viewable zone of the image.
In one embodiment, an image capture system comprises an aperture configured to receive light; a first optical component configured to collect the light received at the aperture; one or more optical control components configured to receive the light passed by the first optical component and modify one or more properties associated with the light as it passes through the one or more optical control components, wherein: each of the one or more optical control components comprises a metasurface; the one or more properties associated with the light comprise: a direction the light travels, a position of the light, an angular distribution of the light, a perceived depth of an image corresponding to the light, or a wavelength of the light that is filtered; and each of the one or more optical control components is configured to switch between a first state where the optical control component modifies at least one property of the one or more properties associated with the light, and a second state where the optical control component does not modify the at least one property; and an image sensor configured to receive the light after it passes through the one or more optical control components.
In some implementations of the image capture system, the one or more optical control components comprise: a depth control module positioned over the image sensor, the metasurface of the depth control module comprising a meta-lens array configured to reimage one or more pixels associated with the image; or an angular distribution control module positioned over the image sensor, the metasurface of the angular distribution control module comprising a meta-lens array configured to converge the light at a given polarization as it passes through the meta-lens array, thereby converging the light on an active region area of the image sensor.
Other features and aspects of the disclosure will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, which illustrate, by way of example, the features in accordance with various embodiments. The summary is not intended to limit the scope of the invention, which is defined solely by the claims attached hereto.
The technology disclosed herein, in accordance with one or more embodiments, is described in detail with reference to the following figures. The drawings are provided for purposes of illustration only and merely depict typical or example embodiments of the disclosed technology. These drawings are provided to facilitate the reader's understanding of the disclosed technology and shall not be considered limiting of the breadth, scope, or applicability thereof. It should be noted that for clarity and ease of illustration these drawings are not necessarily made to scale.
FIG. 2D shows an example of a passive shutter for controlling the direction of light, in accordance with some implementations of the disclosure.
The figures are not intended to be exhaustive or to limit the invention to the precise form disclosed. It should be understood that the invention can be practiced with modification and alteration, and that the disclosed technology is intended to be limited only by the claims and the equivalents thereof.
As used herein, the term “optically coupled” is intended to refer to one element being adapted to impart, transfer, feed or direct light to another element directly or indirectly.
Throughout this disclosure, the term “arbitrarily engineered” is used to mean “of any shape, size, material, feature, type or kind, orientation, location, quantity, or arrangement of components, with a single component or an array of components, that allows the methods, systems, apparatuses, and devices described in the present disclosure, or a specific component thereof, to fulfill the objectives and intents of the present disclosure or of that specific component.” In this disclosure, the light field at a plane refers to a vector field that describes the amount of light flowing in every direction, or in several selected directions, through every point in that plane; that is, the light field is a description of the angle and intensity of the light rays traveling through that plane.
In the present disclosure, display refers to an emissive display, which may be based on any technology, including but not limited to Liquid Crystal Display (“LCD”), Thin-Film Transistor (“TFT”), Light Emitting Diode (“LED”), Organic Light Emitting Diode (“OLED”) array, Active Matrix Organic Light Emitting Diode (“AMOLED”), projection or angular projection arrays on a flat screen or an angle-dependent diffusive screen, or any other display technology, and/or mirrors, half-mirrors, switchable mirrors, or liquid crystal sheets arranged and assembled so as to emit bundles of light with divergence apexes at different depths, or at one depth, from the core plane, as well as waveguide-based displays. The display might be a near-eye display for a headset, a near-head display, or a far-standing display. The application of the display does not impact the principles of this invention; all of these are what is referred to by an emissive display in this disclosure.
In the present disclosure, a metasurface is an arbitrary array of subwavelength nanostructures that collectively control the basic properties of light, such as amplitude, phase, polarization, and direction, sometimes in combination at the same time. Examples of metasurfaces are described in P. Genevet, F. Capasso, F. Aieta, M. Khorasaninejad, and R. C. Devlin, “Recent advances in planar optics: from plasmonics to dielectric metasurfaces,” Optica, 4 (2017). A metasurface can be a meta-lens/metalens (a metasurface-based lens that converges or diverges the light based on a focal distance), a meta-grating/metagrating (a grating based on a metasurface design), or a meta-hologram/metahologram (a hologram based on a metasurface). Examples of metalenses are described in M. Khorasaninejad and F. Capasso, “Metalenses: versatile multifunctional photonic components,” Science, 358, eaam8100 (2017). Examples of meta-gratings are described in M. Khorasaninejad and F. Capasso, “Broadband multifunctional efficient meta-gratings based on dielectric waveguide phase shifters,” Nano Letters, 15 (2015). Examples of meta-holograms are described in M. Khorasaninejad, A. Ambrosio, P. Kanhaiya, and F. Capasso, “Broadband and chiral binary dielectric meta-holograms,” Science Advances, 5 (2016). A metasurface's building blocks can be made of a semiconductor (e.g., amorphous silicon, polycrystalline silicon, gallium phosphide, gallium nitride, or silicon carbide), a crystal (e.g., silicon or lithium niobate), a dielectric (e.g., silicon dioxide, silicon nitride, hafnium oxide, or titanium dioxide), a polymer (e.g., photoresist or PMMA), a metal (e.g., gold, silver, or aluminum), a phase change material (e.g., vanadium dioxide or a chalcogenide), or a combination thereof.
These structures are typically made by processes such as optical lithography, electron beam lithography, nanoimprinting, reactive ion etching, electron beam deposition, sputtering, plasma-enhanced deposition, atomic layer deposition, and any combination of the aforementioned processes in arbitrary order. The process of manufacturing the layer is outside the focus of this disclosure and does not impact the proposed systems and methods.
In the present disclosure, the polarization state of light may be linear polarization, circular polarization, elliptical polarization, or any combination thereof. The polarization of the light is defined as a temporal and spatial status of the electric field with regard to the propagation direction of the light.
Throughout this disclosure, the angular profiling may be achieved by holographic optical elements (“HOEs”), diffractive optical elements (“DOEs”), lenses, concave or convex mirrors, lens arrays, microlens arrays, aperture arrays, optical phase or intensity masks, digital mirror devices (“DMDs”), spatial light modulators (“SLMs”), metasurfaces, diffraction gratings, interferometric films, privacy films, thin-film stacks, or other methods. Intensity profiling may be achieved by absorptive or reflective polarizers, absorptive or reflective coatings, gradient coatings, or other methods. The color or wavelength profiling may be achieved by color filters, absorptive or reflective notch filters, interference thin films, or other methods. The polarization profiling might be done by metasurfaces with metallic or dielectric micro- or nanostructures, wire grids, absorptive or reflective polarizers, wave plates such as quarter-waveplates, half-waveplates, and 1/x waveplates, or other nonlinear crystals or polymers with anisotropy.
All such components may be arbitrarily engineered to deliver the desired profile. As used herein, “arbitrary optical parameter variation” refers to variation, change, modulation, programming, and/or control of parameters, which may be one or a collection of the following: optical zoom change; aperture size and aperture brightness variation; focus variation; aberration variation; focal length variation; time-of-flight or phase variation in the case of an imaging system with a time-sensitive or phase-sensitive imaging sensor; color or spectral variation in the case of a spectrum-sensitive sensor; angular variation of a captured image; variation in depth of field; variation of depth of focus; variation of coma; variation of the stereopsis baseline in the case of stereoscopic acquisition; and variation of the field of view of the lens.
Throughout the present disclosure, the imaging sensor might use “arbitrary image sensing technologies” to capture light or a certain parameter of the light that is exposed to it. Examples of such “arbitrary image sensing technologies” include complementary metal-oxide-semiconductor (“CMOS”), scientific CMOS (“sCMOS”), Single Photon Avalanche Diode (“SPAD”) arrays, Charge-Coupled Device (“CCD”), Intensified Charge-Coupled Device (“ICCD”), ultra-fast streak sensors, Time-of-Flight (“ToF”) sensors, Schottky diodes, or any other light or electromagnetic sensing mechanism for shorter or longer wavelengths.
Throughout the present disclosure, dynamic design or dynamic components, or generally the adjective dynamic, refers to a design or component that has variable optical properties that can be changed with an optical or electrical signal. Electro-optical materials such as liquid crystals, piezoelectric materials, and nonlinear crystals are a few examples of such materials. A passive design or component refers to a design that has no dynamic component other than the display.
Throughout the present disclosure, the pass angle of a polarizer is the angle at which incident light at normal incidence (that is, perpendicular to the surface of the polarizer) passes through the polarizer with maximum intensity. The “pass axis” is the axis or vector at the pass angle, such that light polarized along that vector passes through a linear polarizer.
When two items are cross-polarized, their polarization states or orientations are orthogonal to one another. For example, when two linear polarizers are cross-polarized, their pass angles have a 90-degree difference.
Throughout the present disclosure, a reflective polarizer is a polarizer that transmits light whose polarization is aligned with its pass angle and reflects light that is cross-polarized with its pass axis. A wire grid polarizer (a reflective polarizer made with nanowires aligned in parallel) is a non-limiting example of such a polarizer.
Throughout the present disclosure, an absorptive polarizer is a polarizer that allows light with polarization aligned with the pass angle of the polarizer to pass through, and it absorbs cross-polarized light.
Throughout the present disclosure, imaging system refers to any apparatus that acquires an image, that is, a matrix of information about light intensity and/or its other properties (temporal, spectral, polarization, entanglement, or otherwise), used in any application or framework such as cellphone cameras, industrial cameras, photography or videography cameras, microscopes, telescopes, spectrometers, time-of-flight cameras, ultrafast cameras, thermal cameras, or any other type of camera.
Throughout the present disclosure, aperture refers to a structure having a single hole/opening or an array of holes/openings through which light can pass. These openings or holes are surrounded by an area that blocks the light. The blocking mechanism can be based on different approaches, including, but not limited to, absorption (e.g., metal, black material) or reflection (e.g., metal, thin-film dielectric). Openings and holes can be filled with other materials/media/films to give extra functionalities to the aperture, including, but not limited to, making the aperture color selective, angle selective, polarization selective, or amplitude selective.
Throughout the present disclosure, a color filter refers to a filter that only allows specific wavelengths of light (e.g., a wavelength range) or colors of light to pass through. The mechanism behind color filtering can be based on absorption (e.g., using a dye, pigment, metallic nanostructure, etc.), reflection (e.g., thin film, metallic nanostructure), or diffraction (e.g., reflective or transmission grating) of a specific color or colors.
As discussed above, there are a number of challenges that have significantly limited the use of or production of light field displays in commercial and/or industrial settings. For example, the success of cellphone cameras has increased the need for higher lens brightness to improve performance in dark environments and provide more flexible optical parameters at the hardware level without the need for computational restoration of the image. Some proposed techniques for generally modifying the wavefront of the light field while reducing the form factor include utilizing lenslet arrays, diffractive optics, and apertures. However, these techniques suffer from some associated shortcomings or challenges such as color and diffractive artifacts, limited field of view, and low image resolution. Accordingly, although there have been ongoing efforts to control the wavefront of light through passive diffractive elements, holographic layers, and/or lenticular microlens structures, these three methods induce significant haze, speckle artifact, and/or chromatic artifacts.
Therefore, there is a need for improved methods and systems for effectively controlling the wavefront of light with minimal artifacts that may overcome one or more of the above-mentioned problems and/or limitations.
To this end, the present disclosure describes systems and methods for dynamically controlling light propagation direction, angular distribution, and polarization in a display or imaging system, and, particularly, a light field display or imaging system. In accordance with implementations, a display or imaging system utilizes cascaded metasurfaces to dynamically program the wavefront of light. The system may include a stack of metasurfaces and liquid-crystal layers that may be controlled to programmatically control light propagation direction, angular distribution, and polarization as light moves through the different layers in the display or imaging system. For example, in a display system, these layers may be used to provide different images at different viewing angles from a display. These layers may also or alternatively be used to control image resolution, frame rate, brightness, color, and/or other properties of the display. One or more control modules of the system may provide dynamic control of the system by applying one or more electric signals that change optical properties of optical components described herein. Multiple control modules may be stacked together to increase the overall function.
As such, implementations of the present disclosure describe an approach based on metasurfaces and other flat optical technology such as liquid crystals for efficient and dynamic control of the wavefront of light or directionality of the light. The dynamic nature of these approaches along with the higher efficiency of metasurfaces significantly enhances the performance and flexibility of optical systems, especially for displays and imaging applications. For example, by virtue of using the cascaded metasurfaces, further described herein, it may be possible to dynamically program light propagation direction, angular distribution, and polarization for different pixels of an image. As further discussed below, this may be particularly advantageous in the context of light field and directional displays as well as imaging systems.
As further described below, by virtue of implementing the foregoing design including cascaded metasurfaces, it is possible to program the wavefront of light in a binary fashion. The foregoing technology may be used to optically adjust the frame rate and/or resolution of a display. For example, consider a panel having a maximum resolution of 8K that runs at 60 Hz frame rate. By having cascading metasurfaces in front of the display, it may be possible to shift the display resolution to 4K and frame rate to 240 Hz.
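The 8K/60 Hz to 4K/240 Hz example above amounts to redistributing a fixed pixel throughput between spatial and temporal resolution. A brief sketch of that arithmetic (using the common 7680x4320 and 3840x2160 definitions of 8K and 4K, which the disclosure does not spell out) follows:

```python
# Hedged sketch of the resolution/frame-rate trade described above. The
# specific 8K and 4K pixel dimensions are our assumption; the disclosure
# only names the resolution classes.

def traded_frame_rate(native_res, native_hz, target_res):
    """Frame rate achievable at target_res if total pixel throughput is fixed."""
    native_px = native_res[0] * native_res[1]
    target_px = target_res[0] * target_res[1]
    return native_hz * native_px / target_px

# 8K (7680x4320) at 60 Hz redistributed to 4K (3840x2160):
print(traded_frame_rate((7680, 4320), 60, (3840, 2160)))  # -> 240.0
```

Since 8K carries four times the pixels of 4K, the cascade can in principle trade that factor of four into a fourfold frame-rate increase.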
Further, some implementations of the present disclosure describe providing the light field to multiple users. Further, some implementations of the present disclosure relate generally to tiling the light field for increasing resolution and/or field of view of displays and/or imaging apparatuses via time multiplexing or spatial multiplexing. Further, the present disclosure describes optical and computational methods that may use a set of algorithms and reflectors, thin films, metasurfaces, polarization films, diffractive elements, and/or refractive elements to control properties of light for display or imaging purposes.
In this example, display system 100 also includes a head and/or eye tracking sensor 8. Sensor 8 may be a simultaneous localization and mapping (“SLAM”) sensor, an image sensor, or a camera. Sensor 8 is configured to collect and feed back head/eye tracking data of the user 7 to the source that generates the content displayed by display 1, thereby controlling how images are perceived by the user 7. For example, based on the location, gesture, and/or eye gaze of the user 7, sensor 8 may provide feedback or tracking data to a processor 9 of the system. Using the received tracking data, the processor 9 may send one or more signals to one or more control modules 2-6 to modify the wavefront of light before it reaches the user 7. This feedback system and processing of the tracking data by processor 9 may improve the quality of the image perceived by one or more users 7, eliminate color artifacts, eliminate diffractive artifacts, and/or eliminate other possible artifacts. In some implementations, the presented content is not adaptively changed, and sensor 8 and/or processor 9 are not included in system 100.
As described herein, a direction control module (DCM) 2 may be used to control the direction of light.
Although using apertures may be straightforward and effective, doing so comes at the cost of losing significant light intensity because part of the light is absorbed. For example, this has been a limiting factor for many angular displays that use parallax barriers. The other disadvantage of apertures is that one cannot control the direction of light propagation without affecting other characteristics of the light, such as its angular distribution and diffraction from edges, especially if sub-apertures are smaller than 20 microns by 20 microns in dimension.
As shown, the DCM 2 of
In some cases, the polarizer 15 may be omitted. Otherwise, it polarizes the light before the HWP 14. In this example, meta-grating 13 is designed to diffract light for a first polarization of light and allow a second polarization of light, orthogonal to the first polarization, to pass unperturbed. As meta-grating 13 may convert the polarization when diffracting incoming light, another HWP 14 (e.g., another LC) is placed right after the meta-grating to control the output light polarization at will. Multiple DCMs 2 having the components depicted in
As described herein, an angular distribution control module (ADCM) 3 may be used to control the angular distribution of light.
To address the foregoing shortcomings of apertures for controlling the angular distribution of light,
As described herein, a position control module (PCM) 4 may be used to control the position of light.
In some implementations, this position change may be used to trade display frame rate for resolution using super-resolution techniques. The entire matrix of the pixels may be shifted by a size smaller than the length of a pixel (depending on the filling factor of the pixel active region), thereby creating a higher resolution image at a lower frame rate. For example, as depicted by
Here, (x_t, y_t) are the coordinates of the pixel at each sub-frame time t, and (x_o, y_o) are the coordinates of the pixel when the PCMs are OFF. Depending on how the PCMs are activated, at each sub-frame the pixel can stay where it is, be shifted upward, be shifted toward the right, or be shifted upward and toward the left. By dividing one frame into four sub-frames, the pixel density may be multiplied by a factor of four (two pixel positions added horizontally and two added vertically). With this approach, the combined image may have a 2000×2000 resolution with a 60 Hz frame rate.
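The four-sub-frame scheme above can be sketched in code. This is an illustrative interleaving model under our own assumptions (each sub-frame shifted by exactly half a pixel, ideal fill factor), not the disclosure's exact reconstruction:

```python
import numpy as np

# Hedged sketch: four low-resolution sub-frames, each shifted by half a
# pixel by the PCMs, interleaved into one frame with 2x the pixel density
# in each dimension. Sub-frame naming (fxy = shifted by x,y half-pixels)
# is our convention for illustration.

def combine_subframes(f00, f10, f01, f11):
    """Interleave four HxW sub-frames into one 2Hx2W frame."""
    h, w = f00.shape
    out = np.empty((2 * h, 2 * w), dtype=f00.dtype)
    out[0::2, 0::2] = f00  # unshifted sub-frame
    out[0::2, 1::2] = f10  # shifted horizontally
    out[1::2, 0::2] = f01  # shifted vertically
    out[1::2, 1::2] = f11  # shifted diagonally
    return out

# Four 1000x1000 sub-frames (shown at 240 Hz) yield one 2000x2000 image
# (perceived at 60 Hz):
subs = [np.zeros((1000, 1000)) for _ in range(4)]
print(combine_subframes(*subs).shape)  # -> (2000, 2000)
```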
As depicted by
In the case of [a b c d]=[0 0 0 0], the user cannot see any of the four pixels, since none of them are shifted by the PCM (i.e., the PCMs are OFF) and there is an aperture blocking the user from seeing them directly. The aperture is shown by the semi-transparent layer 10 in
As described herein, a depth control module 5 may be used to shift the monocular depth at which users perceive images.
As depicted by
In this example time multiplexing is used, but the same concept can be achieved via pixel multiplexing, where, for example, half of the display pixel array shows one content and the other half shows different content; using the control modules, the direction and angular distribution of ray segments may be changed to generate a 3D effect that does not require a high frame rate display. As evident from
The same embodiment can be used for an imaging system where the display is replaced with an imaging sensor 36 as shown in
As the foregoing examples illustrate, because the illustrated design places a metasurface between a first switchable HWP, which may change the light polarization by 90 degrees, and a second switchable HWP, positioned after the metasurface, which may revert the light to its original polarization, it is possible to independently control the various cascaded optical control modules/components in a binary fashion. For example, a given optical control module may be switched on by turning "on" the first and second switchable HWPs such that, after passing through the first HWP, the light has a suitable polarization for the metasurface to affect one or more of its properties, and the light returns to its original polarization (i.e., the polarization it had before passing through the first HWP) after passing through the second HWP. Continuing the same example, the optical control module may be turned off such that it appears transparent to the light passing through it (i.e., the light's polarization and other properties are not affected as it passes through the first HWP, the metasurface, and the second HWP).
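The binary switching logic can be sketched as follows. This is a simplified model under assumed conventions: the metasurface acts only on one linear polarization (here labeled 'x'), an ON HWP swaps the two polarizations, and incoming light arrives in the polarization the metasurface does not act on.

```python
def module_output(pol_in, hwp_on):
    """Model one optical control module: switchable HWP -> polarization-
    dependent metasurface -> switchable HWP. Returns the exit polarization
    and whether the metasurface modified the light."""
    swap = {'x': 'y', 'y': 'x'}
    pol = swap[pol_in] if hwp_on else pol_in   # first switchable HWP
    modified = (pol == 'x')                    # metasurface acts on 'x' only
    pol = swap[pol] if hwp_on else pol         # second HWP restores polarization
    return pol, modified

# Light enters y-polarized: an ON module modifies it and restores the
# polarization; an OFF module is effectively transparent.
print(module_output('y', hwp_on=True))   # ('y', True)
print(module_output('y', hwp_on=False))  # ('y', False)
```

Because each module restores the exit polarization, several such modules can be cascaded and toggled independently, which is the property the passage above relies on.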
Although primarily described in the context of using HWPs that may change polarization by 90 degrees to enable binary switching of an optical control module's state, the optical control modules described herein may be implemented with other tunable waveplates, such as quarter waveplates. In such cases, it may be possible to realize more than two states in an optical control module. For example, the optical control module may include multiple cascaded tunable waveplates positioned after and/or before the metasurface, enabling the optical control module to function in three or more states.
One or more controllers (e.g., processor 9) may be utilized to deliver a control signal (e.g., voltage) to each of the waveplates to switch states of each of the optical control components. The one or more controllers may be electrically coupled (e.g., via suitable circuitry) to each of the optical control modules to enable independent control of each of the modules.
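One way such a controller could be organized is sketched below. The class, the `set_voltage` callback, and the 5 V drive level are hypothetical; the disclosure specifies only that a control signal (e.g., voltage) is delivered to each waveplate.

```python
class CascadeController:
    """Illustrative controller that sets each optical control module's
    binary state by driving both of its switchable waveplates together."""

    def __init__(self, num_modules, set_voltage):
        self.num_modules = num_modules
        self.set_voltage = set_voltage  # callback: (waveplate_id, volts)

    def set_states(self, states):
        """states: one bool per cascaded module (True = module ON).
        Each module i owns waveplates 2*i and 2*i + 1."""
        assert len(states) == self.num_modules
        for i, on in enumerate(states):
            volts = 5.0 if on else 0.0  # assumed drive voltage
            self.set_voltage(2 * i, volts)      # first HWP of module i
            self.set_voltage(2 * i + 1, volts)  # second HWP of module i

# Record the voltages a three-module cascade would receive.
log = {}
ctl = CascadeController(3, lambda wp, v: log.__setitem__(wp, v))
ctl.set_states([True, False, True])
print(log)  # {0: 5.0, 1: 5.0, 2: 0.0, 3: 0.0, 4: 5.0, 5: 5.0}
```

Driving both waveplates of a module together enforces the ON/OFF pairing described above, while keeping each module independently addressable.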
In one embodiment, chip set 2200 includes a communication mechanism such as a bus 2202 for passing information among the components of the chip set 2200. A processor 2204 has connectivity to bus 2202 to execute instructions and process information stored in a memory 2206. Processor 2204 includes one or more processing cores, with each core configured to perform independently. A multi-core processor enables multiprocessing within a single physical package; examples include processors with two, four, eight, or more processing cores. Alternatively or in addition, processor 2204 includes one or more microprocessors configured in tandem via bus 2202 to enable independent execution of instructions, pipelining, and multithreading. Processor 2204 may also be accompanied by one or more specialized components to perform certain processing functions and tasks, such as one or more digital signal processors (DSP) 2208 and/or one or more application-specific integrated circuits (ASIC) 2210. DSP 2208 can typically be configured to process real-world signals (e.g., sound) in real time independently of processor 2204. Similarly, ASIC 2210 can be configured to perform specialized functions not easily performed by a general-purpose processor. Other specialized components to aid in performing the inventive functions described herein include one or more field programmable gate arrays (FPGA) (not shown), one or more controllers (not shown), or one or more other special-purpose computer chips.
Processor 2204 and accompanying components have connectivity to the memory 2206 via bus 2202. Memory 2206 includes both dynamic memory (e.g., RAM) and static memory (e.g., ROM) for storing executable instructions that, when executed by processor 2204, DSP 2208, and/or ASIC 2210, perform the process of example embodiments as described herein. Memory 2206 also stores the data associated with or generated by the execution of the process.
In this document, the terms “machine readable medium,” “computer readable medium,” and similar terms are used to generally refer to non-transitory mediums, volatile or non-volatile, that store data and/or instructions that cause a machine to operate in a specific fashion. Common forms of machine readable media include, for example, a hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, an optical disc or any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge, and networked versions of the same.
These and other various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to a processing device for execution. Such instructions embodied on the medium are generally referred to as "instructions" or "code." Instructions may be grouped in the form of computer programs or other groupings. When executed, such instructions may enable a processing device to perform features or functions of the present application as discussed herein.
In this document, a “processing device” may be implemented as a single processor that performs processing operations or a combination of specialized and/or general-purpose processors that perform processing operations. A processing device may include a CPU, GPU, APU, DSP, FPGA, ASIC, SOC, and/or other processing circuitry.
The various embodiments set forth herein are described in terms of exemplary block diagrams, flow charts and other illustrations. As will become apparent to one of ordinary skill in the art after reading this document, the illustrated embodiments and their various alternatives can be implemented without confinement to the illustrated examples. For example, block diagrams and their accompanying description should not be construed as mandating a particular architecture or configuration.
Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code components executed by one or more computer systems or computer processors comprising computer hardware. The processes and algorithms may be implemented partially or wholly in application-specific circuitry. The various features and processes described above may be used independently of one another, or may be combined in various ways. Different combinations and sub-combinations are intended to fall within the scope of this disclosure, and certain method or process blocks may be omitted in some implementations. Additionally, unless the context dictates otherwise, the methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate, or may be performed in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed example embodiments. The performance of certain of the operations or processes may be distributed among computer systems or computer processors, not only residing within a single machine, but deployed across a number of machines.
As used herein, the term “or” may be construed in either an inclusive or exclusive sense. Moreover, the description of resources, operations, or structures in the singular shall not be read to exclude the plural. Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps.
Terms and phrases used in this document, and variations thereof, unless otherwise expressly stated, should be construed as open ended as opposed to limiting. Adjectives such as “conventional,” “traditional,” “normal,” “standard,” “known,” and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time, but instead should be read to encompass conventional, traditional, normal, or standard technologies that may be available or known now or at any time in the future. The presence of broadening words and phrases such as “one or more,” “at least,” “but not limited to” or other like phrases in some instances shall not be read to mean that the narrower case is intended or required in instances where such broadening phrases may be absent.
This application claims priority to U.S. Provisional Application No. 63/087,777 filed Oct. 5, 2020 and titled “METHODS AND SYSTEMS FOR CASCADED WAVEFRONT PROGRAMMING FOR DISPLAYS AND IMAGE SENSORS,” which is incorporated herein by reference in its entirety.