Light-emitting diodes (LEDs) provide an efficient and relatively smaller source of light compared to conventional light sources. The use of LEDs has evolved from systems that provide purely lighting to more complicated systems that use light in various ways other than merely to provide illumination of an area. Consequently, there is ongoing effort to improve technology that uses LED arrays, as well as find additional uses for LED arrays.
Corresponding reference characters indicate corresponding parts throughout the several views. Elements in the drawings are not necessarily drawn to scale. The configurations shown in the drawings are merely examples and should not be construed as limiting in any manner.
The use of LEDs in electronic devices has increased rapidly as the number and types of devices have expanded in various ways. Beyond mere displays, for example, compact light sources have recently been incorporated in augmented reality (AR) and virtual reality (VR) devices, among others. Such devices may be enabled by microLED arrays.
A microLED array may contain thousands to millions of microscopic microLEDs that may be individually controlled or controlled in groups of pixels (e.g., 5×5 groups of pixels). MicroLEDs are relatively small (e.g., <0.07 mm on a side) and may provide monochromic or multi-chromic light, typically red, green, blue, or yellow using inorganic semiconductor material. Other LEDs may have a size, for example, of about 4 mm2, 250 micron×250 micron, or larger. Note that while microLEDs are referred to herein, in some aspects, the polychromic matrix may use other size LEDs (e.g., miniLEDs that are larger than the microLEDs or LEDs larger than miniLEDs).
Active layers of microLEDs in general may be formed from one or more inorganic materials (e.g., binary compounds such as gallium arsenide (GaAs) or gallium nitride (GaN), ternary compounds such as aluminum gallium arsenide (AlGaAs), quaternary compounds such as indium gallium arsenide phosphide (InGaAsP), or other suitable materials), usually either III-V materials (defined by columns of the Periodic Table) or II-VI materials.
The microLEDs in the different arrays may emit light in the visible spectrum (about 400 nm to about 800 nm) and/or may emit light in the infrared spectrum (above about 800 nm). MicroLEDs may be formed by epitaxially growing active, n- and p-type semiconductor layers on a rigid substrate (which may be textured). The substrate may include, for example, sapphire (aluminum oxide (Al2O3)) or silicon carbide (SiC), among others. In particular, various layers are deposited and processed on the substrate during fabrication of the microLEDs to form a microLED array. The surface of the substrate may be pretreated (e.g., annealed, etched, or polished) prior to deposition of the various layers. The original substrate may be removed and replaced by a thin transparent rigid substrate, such as glass, or a flexible substrate, such as plastic. In general, the various active layers may be fabricated using epitaxial semiconductor deposition to deposit one or more semiconductor layers, metal deposition (e.g., by sputtering), oxide growth, as well as etching, liftoff, and cleaning, among other operations.
In some aspects, the growth substrate may be removed from the microLED structure after fabrication and after connection to contacts on a backplane via metal bonding such as wire or ball bonding. The backplane may be a printed circuit board or wafer containing integrated circuits (ICs), such as a complementary metal oxide semiconductor (CMOS) IC wafer. The semiconductor deposition operations may be used to create a microLED with an active region in which electron-hole recombination occurs and the light from the microLED is generated. The active region may be, for example, one or more quantum wells. Metal contacts may be used to provide drive current to the n- and p-type semiconductors from the ICs of the backplane on which the microLED array is disposed. Methods of depositing materials, layers, and thin films may include, for example: sputter deposition, atomic layer deposition (ALD), chemical vapor deposition (CVD), physical vapor deposition (PVD), plasma enhanced atomic layer deposition (PEALD), plasma enhanced chemical vapor deposition (PECVD), and combinations thereof, among others.
In some aspects, one or more other layers, such as a phosphor-converting layer that contains phosphor particles, may be disposed on some or all of the microLEDs or microLED arrays to convert at least a portion of the light from the microLEDs to light of a different wavelength. For example, blue light may be converted into near infrared light or white light by the phosphor-converting layer.
Recently, multi-junction polychromic microLED devices have been developed using InGaN as the active semiconductor. A polychromic InGaN device is a vertically-stacked multi-color (typically RGB) device in which three pn-junctions (for three colors) are connected via tunnel junctions. The term vertical as used herein indicates a direction of growth or deposition on a substrate (an array of such devices thus has devices arranged in horizontal [or lateral] directions). Specifics of fabrication of the polychromic microLED devices may be found at U.S. Pat. Nos. 10,236,409B2, 10,749,070B2, 10,541,352B2, 10,804,429B2, 10,622,206B2, 11,069,836B2, 11,069,524B2, 11,069,525B2, 11,404,599B2, 11,081,622B2, 11,594,572B2, 6,822,991B2, 6,847,057B1, which are herein incorporated by reference in their entirety.
Although a three-junction device is described herein, as it may be the most fundamental as a color-display engine, the number of junctions is not limited in this regard. Other devices may be fabricated using a two-junction or a four- (or more) junction device as the junctions are vertically stacked. This enables separate access to each junction by fabricating appropriate contacts via lithography, etching, and deposition techniques.
To construct a display device, a polychromic device wafer may be prepared via photolithographic techniques. As the cell 202 has four terminals, and to operate the junctions individually, the horizontal bus lines 204 shown in
Simultaneous driving of the three-junction polychromic device may be a technique to achieve a mixed color, such as white light. The resulting color may be tuned by adjusting ratios of the currents passing through each of the three junctions. To adjust the ratios, current may flow into and out of the different intermediate contact terminals. However, independent control of the different currents flowing through the different junctions at any given moment is a source of significant complications to circuit design, and, in some cases, may prevent the desired color from being achieved. Alternatively, the junctions may be driven sequentially such that each junction is driven during a different phase of the driving cycle; i.e., up to only one junction of the vertically stacked polychromic junctions is driven by a forward current at any instant in time. At some time instances, no junctions may be driven—i.e., the polychromic device is turned off. As above, the three-junction RGB polychromic device is merely an example of vertically-stacked junction LED devices, whose number of individual microLEDs, and thus colors, may be different from that described herein.
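The purely sequential alternative may be sketched as a time-multiplexed schedule; the phase ordering, the inclusion of an off phase, and the cycle structure below are illustrative assumptions rather than the specific circuit of this disclosure:

```python
# Sketch of purely sequential driving of a three-junction (R/G/B)
# polychromic device: at most one junction is forward-driven in any
# phase, and some phases may drive no junction at all (device off).
# Phase ordering and the off phase (None) are illustrative assumptions.

def sequential_schedule(cycles=2, phases=("R", "G", "B", None)):
    """Return a flat list of active junctions, one entry per phase."""
    schedule = []
    for _ in range(cycles):
        schedule.extend(phases)
    return schedule

sched = sequential_schedule()
# Invariant of sequential driving: no phase has more than one junction on.
assert all(p in ("R", "G", "B", None) for p in sched)
```

Because each phase drives at most one junction, color mixing in this scheme relies entirely on the eye integrating the rapidly alternating primaries over time.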
Another driving scheme that mitigates at least some of the issues of simultaneous driving and image display, while improving on the purely sequential driving scheme, may combine simultaneous and sequential driving to provide a driving scheme that is partially simultaneous and partially sequential. Such a driving scheme may relax speed and resolution constraints when compared to a fully sequential driving scheme, while consuming more logic, power, and area resources. Specifically, a two-stage addressing strategy, referred to herein as semi-simultaneous driving, may be used to drive the three-junction polychromic device.
Because of the persistence of human vision, color mixing may be achieved by switching primary colors at a relatively fast rate (e.g., faster than the video frame rate of 24 or 30 frames per second) in the time domain. A frame is a single image provided by a display; a series of frames forms a video. One benefit of the driving circuitry used in the semi-simultaneous driving, while achieving satisfactory color mixing to the human eye, is that the complexity of wiring for the array of cells and drivers is dramatically reduced by sharing common traces across junctions in the same column or row of the array. Additionally, the total panel load remains reduced since activation of at least some of the junctions occurs at different times. Color tuning may be achieved by adjusting each current for one cycle (analog dimming) or by adjusting each on-time duration per cycle (digital dimming).
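The distinction between analog and digital dimming can be sketched numerically; the full-scale current and the 8-bit duty-cycle resolution below are illustrative assumptions, not values taken from the disclosure:

```python
# Sketch of analog vs. digital dimming for one junction.
# Analog dimming scales the drive current for the whole cycle;
# digital dimming keeps the current fixed and scales the on-time
# (duty cycle) per cycle. All values are illustrative assumptions.

FULL_SCALE_MA = 10.0   # assumed full-scale drive current, mA
PWM_STEPS = 256        # assumed 8-bit duty-cycle resolution

def analog_dimming(brightness):
    """Return the (constant) current in mA for a 0..1 brightness target."""
    return FULL_SCALE_MA * brightness

def digital_dimming(brightness):
    """Return (current_mA, on_steps_per_cycle) for a 0..1 brightness target."""
    return FULL_SCALE_MA, round(brightness * (PWM_STEPS - 1))

# 50% brightness: half the current, or full current for half the cycle.
print(analog_dimming(0.5))    # 5.0
print(digital_dimming(0.5))   # (10.0, 128)
```

The two approaches may also be combined, with analog dimming setting a per-color bias and digital dimming handling frame-to-frame brightness levels.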
The semi-simultaneous driving may incorporate simultaneous driving of at least one junction of a multi-junction polychromic device during a first stage of the driving cycle followed sequentially by driving of at least one other junction of the polychromic device during a second stage of the driving cycle, with more than one junction being driven during at least one of the first stage or second stage. In the aspect shown in
As shown in
Conversely, in the aspect shown in
The semi-simultaneous approach 700 may then follow one of three sequences in driving the polychromic matrix 704: a) driving interleaved rows and columns of polychromic devices 706aa of the polychromic matrix 704, b) driving rows first and then columns of polychromic devices 706aa of the polychromic matrix 704, c) driving columns first and then rows of polychromic devices 706aa of the polychromic matrix 704. The interleaved approach is shown in
As shown, in the first stage 702a, the first set 706a of polychromic devices 706aa is limited to the first row of polychromic devices 706aa. In some aspects, such as the polychromic devices 706aa described herein, the red microLEDs and blue microLEDs of the polychromic devices 706aa in the first row are driven simultaneously.
In the second stage 702b, the second set 706b of polychromic devices 706aa is limited to the first column of polychromic devices 706aa. The first set 706a of polychromic devices 706aa and the second set 706b of polychromic devices 706aa have a single polychromic device 706aa in common. As above, in the second stage 702b, the green microLEDs of the polychromic devices 706aa in the first column are driven simultaneously.
In the third stage 702c, the third set 706c of polychromic devices 706aa is limited to the second row of polychromic devices 706aa. Thus, the red microLEDs and blue microLEDs of the polychromic devices 706aa in the second row are driven simultaneously. The third set 706c of polychromic devices 706aa and the second set 706b of polychromic devices 706aa have a single polychromic device 706aa in common, but the third set 706c of polychromic devices 706aa and the first set 706a of polychromic devices 706aa have no polychromic devices 706aa in common as the first set 706a of polychromic devices 706aa and the third set 706c of polychromic devices 706aa form different rows of the polychromic matrix 704.
In the fourth stage 702d, the fourth set 706d of polychromic devices 706aa is limited to the second column of polychromic devices 706aa. As in the second stage 702b, the green microLEDs of the polychromic devices 706aa in the second column are driven simultaneously. The fourth set 706d of polychromic devices 706aa and the third set 706c of polychromic devices 706aa have a single polychromic device 706aa in common, but the fourth set 706d of polychromic devices 706aa and the second set 706b of polychromic devices 706aa have no polychromic devices 706aa in common as the second set 706b of polychromic devices 706aa and the fourth set 706d of polychromic devices 706aa form different columns of the polychromic matrix 704.
Driving the sets 706a, 706b, 706c, 706d of polychromic devices 706aa may continue until each microLED in the entire polychromic matrix 704 is driven. While the rows and columns driven in successive alternate stages (e.g., 1 and 3, or 2 and 4) of the interleaved driving scheme shown in
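The interleaved row/column sequence walked through above can be sketched as a stage generator. The assignment of red and blue junctions to row stages and green junctions to column stages follows the description; the function itself and any particular matrix size are illustrative assumptions:

```python
# Sketch of the interleaved semi-simultaneous scan: stage 1 drives the
# red and blue junctions of row 0, stage 2 the green junctions of
# column 0, stage 3 row 1 (red/blue), stage 4 column 1 (green), and so
# on, until every junction of the M x N polychromic matrix is driven.

def interleaved_stages(m_rows, n_cols):
    """Yield (kind, index, junctions) for each stage of one panel scan."""
    for i in range(max(m_rows, n_cols)):
        if i < m_rows:
            yield ("row", i, ("R", "B"))   # rows drive red + blue together
        if i < n_cols:
            yield ("col", i, ("G",))       # columns drive green
```

As in the stages described above, consecutive row stages (or consecutive column stages) share no devices, while a row stage and the adjacent column stage share exactly one device at their intersection.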
The semi-simultaneous approach 700 is able to complete a full panel scan (a scan of the entire polychromic matrix 704) in M+N stages, while a completely sequential approach (i.e., individual driving of each microLED in the polychromic matrix) takes 3·M·N stages for a three-junction polychromic device. The reduction in stages of the semi-simultaneous approach 700 in comparison to the sequential approach allows a decrease in the frequency of a pulse width modulation (PWM) clock to obtain the same rate of frames per second for the entire panel. Alternatively, or in addition, the semi-simultaneous approach 700 may be able to increase the resolution of the panel, allowing a richer color gamut to be obtained when displaying images on the panel.
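The stage-count comparison above can be checked with simple arithmetic; the panel dimensions used below are illustrative assumptions:

```python
# Stage counts for one full panel scan of an M x N three-junction panel:
# semi-simultaneous scanning needs M + N stages, while fully sequential
# scanning needs 3 * M * N. The ratio bounds how much the PWM clock may
# be slowed while keeping the same frame rate. Panel size is an
# illustrative assumption.

def scan_stages(m, n, junctions=3):
    """Return (semi_simultaneous, fully_sequential) stage counts."""
    semi = m + n
    sequential = junctions * m * n
    return semi, sequential

M, N = 64, 64  # assumed panel dimensions
semi, seq = scan_stages(M, N)
print(semi, seq, seq / semi)  # 128 12288 96.0
```

For the assumed 64×64 panel, the PWM clock could thus run roughly 96 times slower at the same frame rate, or, equivalently, the panel resolution could be increased at the same clock frequency.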
In other aspects, a combination of interleaving and contiguous driving may be used. For example, a set of rows (e.g., two) may be driven and then a set of columns may be driven before the next set of rows is driven. The number of rows/columns in a set may differ and may vary during the driving cycle (e.g., two rows, then one column, then one row, then three columns). Although a frame may be produced by driving rows/columns in order, moving from one row/column to the adjacent row/column (rastering) in adjacent stages, other driving schemes in which non-adjacent rows/columns are driven in adjacent stages may be used in other aspects. In addition, the junctions may be driven in different orders between stages (e.g., red and blue driven first and green driven next in one stage, green driven first and then red and blue in another stage).
The panel 810 is coupled to a variety of circuitry that is configured to control light emission from polychromic devices in the panel 810. A control system 820 receives the frames to display on the panel 810 dependent on the panel size. A processor 822 of the control system 820 is configured to produce digital signals to control current generation circuits 802a, 802b, 802c, duty cycle switching circuits 804a, 804b, 804c, and a ground switching circuit 806. Other circuitry to support image display by the panel 810 may be present, but is not shown for convenience. In some aspects, each of the current generation circuits 802a, 802b, 802c, duty cycle switching circuits 804a, 804b, 804c, and/or ground switching circuit 806 may be provided as a single circuit element (or integrated circuit) or multiple substantially identical circuit elements may be used to provide the particular functionality (e.g., two current generation circuit elements may form the current generation circuit 802a).
Each current generation circuit 802a, 802b, 802c may include individual current generators 802aa, 802ba, 802ca. Each current generation circuit 802a, 802b, 802c is configured to supply the rows or columns of the panel 810, with each current generator 802aa, 802ba, 802ca supplying a different bus line and thus a unique (different) junction of the polychromic devices disposed along the corresponding row or column. Accordingly, the number of current generators 802aa, 802ba, 802ca in each current generation circuit 802a, 802b, 802c may depend on the number of rows and columns (as well as bus lines) in the panel 810 as well as whether the current is being supplied to a row or to a column of the panel 810.
The current generators 802aa, 802ba, 802ca may be configured to set a desired bias current per color. The bias currents may be set by the processor 822 a single time during testing of the system 800 or may be able to be changed based on later testing of the panel 810 if issues later arise in the panel 810 and/or circuitry of the control system 820. For example, depending on the technology used, the bias currents may dynamically change over time when analog dimming is used by the control system 820. The control system 820 may accordingly be built with multi-channel current drivers (i.e., current drivers with multiple channels) that provide different amounts of current or with a single current driver (such as in the sequential driving approach) along with current mirrors to independently supply current to each connected bus line in the panel 810. Multi-channel current drivers may be used as each current generation circuit 802a, 802b, 802c to provide the current.
Similarly, the duty cycle switching circuits 804a, 804b, 804c may contain individual switches 804aa, 804ba, 804ca. Each switch 804aa, 804ba, 804ca is associated with a different current generator 802aa, 802ba, 802ca. Each duty cycle switching circuit 804a, 804b, 804c is configured to provide the current from the associated current generator 802aa, 802ba, 802ca to the rows or columns of the panel 810 or to ground the rows or columns of the panel 810. Different duty cycle switching circuits 804a, 804c that supply current to the rows of the panel 810 as shown in
As each bus line has a different current generator 802aa, 802ba, 802ca associated with the bus line, a dedicated switch 804aa, 804ba, 804ca handles a desired duty cycle and grounds the bus line when the connected junctions are to act as a sink. The switches 804aa, 804ba, 804ca may be an array of analog switches that allow current flow in both directions to achieve this function. The duty cycle switching circuits 804a, 804b, 804c may be continuously configured to set the digital brightness levels according to the frames being processed and displayed by the panel 810.
The ground switching circuit 806 similarly contains multiple individual switches 806a that are each activated to ground one of the columns and thus the B-junctions associated with the column. The switches 806a of the ground switching circuit 806 ground different bus lines than the columns supplied by the current generators 802ca of the current generation circuit 802c via the switches 804ca of the duty cycle switching circuit 804c.
In some aspects, the currents supplied by the current generator 802ba, 802aa, 802ca may be used to respectively drive the R+ terminal, the R−/G+ terminal, and the B+/G− terminal of the pixels in the panel 810. The ground switching circuit 806 may ground the B− terminal of the pixels in the panel 810. Unlike the duty cycle switching circuit 804a, 804b, 804c, the ground switching circuit 806 is not coupled to a current source driver line; consequently, while a duty cycle signal may be supplied to each switch 804aa, 804ba, 804ca in the duty cycle switching circuits 804a, 804b, 804c, no duty cycle signal is supplied to the switches 806a in the ground switching circuit 806.
The processor 822 may be programmed to handle settings for each current generation circuit 802a, 802b, 802c and to apply the PWM duty cycle to each analog switch 804aa, 804ba, 804ca, 806a across the M·N panel 810 according to the specified frame rate and bits per pixel. Since addressing the analog switches 804aa, 804ba, 804ca, 806a may be overwhelming in terms of pin count, a serializer/deserializer (SerDes) may be used to reduce the output signals of the control system 820. The serializer function in the SerDes may be used to transform parallel data to a serial stream of data, while the deserializer function in the SerDes may be used to transform the serial stream of data to the original parallel data. Thus, rather than using a single pin for each switch 804aa, 804ba, 804ca, 806a, the SerDes may be used to allow each pin to address one or multiple switches 804aa, 804ba, 804ca, 806a.
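The serializer/deserializer role described above can be sketched as packing the per-switch control bits into a serial stream and unpacking them back to parallel form at the switch array; the simple MSB-first framing here is an illustrative assumption, not the framing of any particular SerDes device:

```python
# Sketch of the SerDes idea: rather than dedicating one pin per analog
# switch, the per-switch on/off states are serialized into a bit stream
# over a few pins and deserialized back to parallel form at the switch
# array. The MSB-first framing is an illustrative assumption.

def serialize(switch_states):
    """Pack a list of booleans (one per switch) into a serial word."""
    word = 0
    for state in switch_states:
        word = (word << 1) | int(bool(state))
    return word

def deserialize(word, n_switches):
    """Recover the parallel switch states from the serial word."""
    return [bool((word >> (n_switches - 1 - i)) & 1)
            for i in range(n_switches)]

# Round trip: serializing then deserializing recovers the switch states.
states = [True, False, True, True]
assert deserialize(serialize(states), len(states)) == states
```

The pin-count saving grows with the panel: a single serial link can carry the states of all M+N bus-line switches per stage, rather than one pin per switch.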
A controller 930 may include a processor 932 (or equivalently, processing circuitry), which may be used to control various functions of the system 900. As also shown, the controller 930 may contain further components, such as circuitry 934 configured to drive, among others, the photodiode array 916 as controlled by the processor 932. In some embodiments, the circuitry 934 may also be configured to provide non-local driving of the microLED array 912 of the light source 910 and may include other circuits, e.g., the non-FPGA circuitry shown in
The light source 910 may include at least one lens and/or other optical elements such as reflectors. In different embodiments, a single lens may be disposed over the microLED array 912 or multiple lenses may be disposed over the microLED array 912. The lens and/or other optical elements may direct the light emitted by the microLED array 912 toward a target.
The processor 932 may also control one or more sensors 920 that include a multi-pixel detector 922. The sensor 920 may sense light at the wavelength or wavelengths emitted by the microLED array 912 and reflected by a target, radiation that is emitted by the target, and/or other wavelengths. The sensor 920 may, for example, be a radar or lidar sensor, or the processor 932 may be used to determine the presence of specific objects (e.g., other vehicles, people, road signs) nearby. The sensor 920 may include optical elements (e.g., at least one sensor lens) to capture the radiation. The multi-pixel detector 922 may include, for example, photodiodes or one or more other detectors capable of detecting light in the wavelength range(s) of interest. The multi-pixel detector 922 may include multiple different arrays to sense visible and/or infrared light. The multi-pixel detector 922 may have one or more segments (that are able to sense the same wavelength/range of wavelengths or different wavelengths/ranges of wavelengths), similar to the photodiode array 916.
In some embodiments, instead of, or in addition to, being provided in the sensor 920, a multi-pixel detector may be provided in the light detector 918. In some embodiments, the light detector 918 and the sensor 920 may be integrated in a single module, while in other embodiments, the light detector 918 and the sensor 920 may be separate modules that are disposed on a printed circuit board (PCB) or other mount. In other embodiments, the light detector 918 and the sensor 920 may be attached to different PCBs or mounts. Similarly, the light source 910 may be integrated in a single module with the light detector 918 or may be separate from the light detector 918.
The microLEDs in the microLED array 912 may be driven as described herein. The components of the system 900 shown in
Modules and components are tangible entities (e.g., hardware) capable of performing specified operations and may be configured or arranged in a certain manner. In an example, circuits may be arranged (e.g., internally or with respect to external entities such as other circuits) in a specified manner as a module. In an example, the whole or part of one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware processors may be configured by firmware or software (e.g., instructions, an application portion, or an application) as a module that operates to perform specified operations. In an example, the software may reside on a machine readable medium. In an example, the software, when executed by the underlying hardware of the module, causes the hardware to perform the specified operations.
Accordingly, the term “module” (and “component”) is understood to encompass a tangible entity, be that an entity that is physically constructed, specifically configured (e.g., hardwired), or temporarily (e.g., transitorily) configured (e.g., programmed) to operate in a specified manner or to perform part or all of any operation described herein. Considering examples in which modules are temporarily configured, each of the modules need not be instantiated at any one moment in time. For example, where the modules comprise a general-purpose hardware processor configured using software, the general-purpose hardware processor may be configured as respective different modules at different times. Software may accordingly configure a hardware processor, for example, to constitute a particular module at one instance of time and to constitute a different module at a different instance of time.
The electronic device 1000 may include a hardware processor (or equivalently processing circuitry) 1002 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a memory 1004 (which may include main and static memory), some or all of which may communicate with each other via an interlink (e.g., bus) 1008. The memory 1004 may contain any or all of removable storage and non-removable storage, volatile memory or non-volatile memory. The electronic device 1000 may further include a light source 1010 such as the microLEDs described above, or a video display, an alphanumeric input device 1012 (e.g., a keyboard), and a user interface (UI) navigation device 1014 (e.g., a mouse). In an example, the light source 1010, input device 1012 and UI navigation device 1014 may be a touch screen display. The electronic device 1000 may additionally include a storage device (e.g., drive unit) 1016, a signal generation device 1018 (e.g., a speaker), a network interface device 1020, one or more cameras 1028, and one or more sensors 1030, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor such as those described herein. The electronic device 1000 may further include an output controller, such as a serial (e.g., universal serial bus (USB)), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate with or control one or more peripheral devices (e.g., a printer, card reader, etc.). Some of the elements, such as one or more of the sparse arrays that provide the light source 1010, may be remote from other elements and may be controlled by the hardware processor 1002.
The storage device 1016 may include a non-transitory machine readable medium 1022 (hereinafter simply referred to as machine readable medium) on which is stored one or more sets of data structures or instructions 1024 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein. A storage device 1016 that includes the non-transitory machine readable medium should not be construed as implying that either the storage device or the machine-readable medium is incapable of physical movement. The instructions 1024 may also reside, completely or at least partially, within the memory 1004 and/or within the hardware processor 1002 during execution thereof by the electronic device 1000. While the machine readable medium 1022 is illustrated as a single medium, the term “machine readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) configured to store the one or more instructions 1024.
The term “machine readable medium” may include any medium that is capable of storing, encoding, or carrying instructions for execution by the electronic device 1000 and that cause the electronic device 1000 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions. Non-limiting machine-readable medium examples may include solid-state memories, and optical and magnetic media. Specific examples of machine-readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; Random Access Memory (RAM); and CD-ROM and DVD-ROM disks.
The instructions 1024 may further be transmitted or received over a communications network using a transmission medium 1026 via the network interface device 1020 utilizing any one of a number of wireless local area network (WLAN) transfer protocols or a SPI or CAN bus. Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks. Communications over the networks may include one or more different protocols, such as the Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi, the IEEE 802.16 family of standards known as WiMax, the IEEE 802.15.4 family of standards, a Long Term Evolution (LTE) family of standards, a Universal Mobile Telecommunications System (UMTS) family of standards, peer-to-peer (P2P) networks, and next generation (NG)/6th generation (6G) standards, among others. In an example, the network interface device 1020 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the transmission medium 1026.
Note that the term “circuitry” as used herein refers to, is part of, or includes hardware components such as an electronic circuit, a logic circuit, a processor (shared, dedicated, or group) and/or memory (shared, dedicated, or group), an Application Specific Integrated Circuit (ASIC), a field-programmable device (FPD) (e.g., a field-programmable gate array (FPGA), a programmable logic device (PLD), a complex PLD (CPLD), a high-capacity PLD (HCPLD), a structured ASIC, or a programmable SoC), digital signal processors (DSPs), etc., that are configured to provide the described functionality. In some embodiments, the circuitry may execute one or more software or firmware programs to provide at least some of the described functionality. The term “circuitry” may also refer to a combination of one or more hardware elements (or a combination of circuits used in an electrical or electronic system) with the program code used to carry out the functionality of that program code. In these embodiments, the combination of hardware elements and program code may be referred to as a particular type of circuitry.
The term “processor circuitry” or “processor” as used herein thus refers to, is part of, or includes circuitry capable of sequentially and automatically carrying out a sequence of arithmetic or logical operations, or recording, storing, and/or transferring digital data. The term “processor circuitry” or “processor” may refer to one or more application processors, one or more baseband processors, a physical CPU, a single- or multi-core processor, and/or any other device capable of executing or otherwise operating computer-executable instructions, such as program code, software modules, and/or functional processes.
The camera 1028 may sense light at least at the wavelength or wavelengths emitted by the microLEDs. The camera 1028 may include optical elements (e.g., at least one camera lens) that are able to collect light that is reflected from and/or emitted by an illuminated region. The camera lens may direct the collected light onto a multi-pixel sensor (also referred to as a light sensor) to form an image on the multi-pixel sensor.
The processor 1002 may control and drive the LEDs via one or more drivers. For example, the processor 1002 may optionally control one or more microLEDs in microLED arrays independent of another one or more microLEDs in the microLED arrays, so as to illuminate an area in a specified manner.
In addition, the sensors 1030 may be incorporated in the camera 1028 and/or the light source 1010. The sensors 1030 may sense visible and/or infrared light and may further sense the ambient light and/or variations/flicker in the ambient light in addition to reception of the reflected light from the LEDs. The sensors may have one or more segments (that are able to sense the same wavelength/range of wavelengths or different wavelength/range of wavelengths), similar to the LED arrays.
The visualization system 1110 can include one or more sensors 1118, such as optical sensors, audio sensors, tactile sensors, thermal sensors, gyroscopic sensors, time-of-flight sensors, triangulation-based sensors, and others. In some examples, one or more of the sensors can sense a location, a position, and/or an orientation of a user. In some examples, one or more of the sensors 1118 can produce a sensor signal in response to the sensed location, position, and/or orientation. The sensor signal can include sensor data that corresponds to a sensed location, position, and/or orientation. For example, the sensor data can include a depth map of the surroundings. In some examples, such as for an augmented reality system, one or more of the sensors 1118 can capture a real-time video image of the surroundings proximate a user.
The visualization system 1110 can include one or more video generation processors 1120. The one or more video generation processors 1120 may receive scene data from a server and/or a storage medium. The scene data may represent a three-dimensional scene, such as a set of position coordinates for objects in the scene or a depth map of the scene. The one or more video generation processors 1120 can receive one or more sensor signals from the one or more sensors 1118. In response to the scene data, which represents the surroundings, and at least one sensor signal, which represents the location and/or orientation of the user with respect to the surroundings, the one or more video generation processors 1120 can generate at least one video signal that corresponds to a view of the scene. In some examples, the one or more video generation processors 1120 can generate two video signals, one for each eye of the user, that represent a view of the scene from a point of view of the left eye and the right eye of the user, respectively. In some examples, the one or more video generation processors 1120 can generate more than two video signals and combine the video signals to provide one video signal for both eyes, two video signals for the two eyes, or other combinations.
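As a simplified numeric illustration of generating two per-eye viewpoints from one tracked head position (the function name, coordinate convention, and the 63 mm interpupillary distance are assumptions, not taken from this disclosure), the two view origins might be derived by offsetting along the head's interocular axis:

```python
# Hedged sketch: deriving left- and right-eye view origins from a sensed
# head position, as a video generation processor might before rendering
# one view of the scene per eye.

IPD_MM = 63.0  # assumed typical interpupillary distance, in millimeters


def eye_positions(head_pos, right_axis, ipd_mm=IPD_MM):
    """Return (left_eye, right_eye) positions, offset half the IPD along
    the head's unit right axis, in the same units as head_pos (mm)."""
    half = ipd_mm / 2.0
    left = tuple(h - half * a for h, a in zip(head_pos, right_axis))
    right = tuple(h + half * a for h, a in zip(head_pos, right_axis))
    return left, right


left_eye, right_eye = eye_positions((0.0, 1700.0, 0.0), (1.0, 0.0, 0.0))
# left_eye  = (-31.5, 1700.0, 0.0)
# right_eye = ( 31.5, 1700.0, 0.0)
```

Each resulting position would then seed one of the two video signals described above, giving the left-eye and right-eye views of the scene.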
The visualization system 1110 can include one or more light sources 1122 such as those described herein that can provide light for a display of the visualization system 1110. Suitable light sources 1122 can include microLEDs as indicated above in addition to or instead of monolithic LEDs, one or more microLED arrays disposed on a common substrate, segmented microLEDs disposed on a single substrate whose microLEDs are individually addressable and controllable (and/or controllable in groups and/or subsets), and others. In some examples, one or more of the light sources 1122 can include microLEDs disposed on a transparent flexible substrate, and a rigid substrate adhered to the transparent flexible substrate with an adhesive layer such that the microLEDs are located between the rigid substrate and the transparent flexible substrate.
The one or more light sources 1122 can include light-producing elements having different colors or wavelengths. For example, a light source can include red microLEDs that can emit red light, green microLEDs that can emit green light, and blue microLEDs that can emit blue light. The red, green, and blue light can be combined in specified ratios to produce any suitable color that is visually perceptible in the visible portion of the electromagnetic spectrum.
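A minimal sketch of the "specified ratios" idea (assumed linear mapping; a real display would also account for emitter spectra and gamma correction) simply maps an 8-bit target color to per-emitter drive ratios:

```python
# Hedged sketch: mapping an 8-bit (R, G, B) target color onto drive ratios
# for red, green, and blue emitters. Linear mixing is a simplification.

def drive_ratios(target_rgb):
    """Return per-emitter drive ratios in 0.0..1.0 for an 8-bit color."""
    return tuple(channel / 255.0 for channel in target_rgb)


ratios = drive_ratios((255, 128, 0))  # mostly red, some green, no blue
```

Driving the red emitters fully, the green emitters at roughly half ratio, and the blue emitters not at all yields an orange mix in this simplified model.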
The visualization system 1110 can include one or more modulators 1124. The modulators 1124 can be implemented in one of at least two configurations.
In a first configuration, the modulators 1124 can include circuitry that can modulate the light sources 1122 directly. For example, the light sources 1122 can include an array of light-emitting diodes, and the modulators 1124 can directly modulate the electrical power, electrical voltage, and/or electrical current directed to each light-emitting diode in the array to form modulated light. The modulation can be performed in an analog manner and/or a digital manner. In some examples, the light sources 1122 can include an array of red microLEDs, an array of green microLEDs, and an array of blue microLEDs, and the modulators 1124 can directly modulate the red microLEDs, the green microLEDs, and the blue microLEDs to form the modulated light to produce a specified image.
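One common form of digital modulation of an LED's drive is pulse-width modulation (PWM), in which the duty cycle encodes the pixel value; the following is an illustrative sketch only (the 256-tick period and direct value-to-ticks mapping are assumptions), not a description of the modulators 1124 themselves:

```python
# Hedged sketch of digital (PWM) modulation: the average optical power of a
# microLED tracks the fraction of the period during which it is driven on.

def pwm_waveform(value_8bit, period_ticks=256):
    """Return the 0/1 drive states for one PWM period.

    The duty cycle is value_8bit / period_ticks, so an 8-bit value of 64
    keeps the LED on for a quarter of the period."""
    on_ticks = max(0, min(255, value_8bit))
    return [1] * on_ticks + [0] * (period_ticks - on_ticks)


wave = pwm_waveform(64)
duty = sum(wave) / len(wave)  # 64 / 256 = 0.25 average drive
```

Analog modulation, by contrast, would vary the drive current amplitude continuously rather than switching between on and off states.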
In a second configuration, the modulators 1124 can include a modulation panel, such as a liquid crystal panel. The light sources 1122 can produce uniform illumination, or nearly uniform illumination, to illuminate the modulation panel. The modulation panel can include pixels. Each pixel can selectively attenuate a respective portion of the modulation panel area in response to an electrical modulation signal to form the modulated light. In some examples, the modulators 1124 can include multiple modulation panels that can modulate different colors of light. For example, the modulators 1124 can include a red modulation panel that can attenuate red light from a red light source such as a red microLED, a green modulation panel that can attenuate green light from a green light source such as a green microLED, and a blue modulation panel that can attenuate blue light from a blue light source such as a blue microLED.
In some examples of the second configuration, the modulators 1124 can receive uniform white light or nearly uniform white light from a white light source, such as a white-light microLED. The modulation panel can include wavelength-selective filters on each pixel of the modulation panel. The panel pixels can be arranged in groups (such as groups of three or four), where each group can form a pixel of a color image. For example, each group can include a panel pixel with a red color filter, a panel pixel with a green color filter, and a panel pixel with a blue color filter. Other suitable configurations can also be used.
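Under the assumed white-backlight configuration above, each group of filtered panel pixels reproduces one color-image pixel by attenuation; as a hedged sketch (the function and the linear transmittance model are illustrative assumptions):

```python
# Hedged sketch: one image pixel maps to a group of three panel pixels with
# red, green, and blue filters. With a uniform white backlight, each panel
# pixel attenuates its channel; transmittance here is modeled linearly.

def attenuations_for_pixel(rgb, white_level=255):
    """Return per-subpixel transmittances (0.0..1.0) for the red-, green-,
    and blue-filtered panel pixels in one group."""
    return tuple(channel / white_level for channel in rgb)


transmittances = attenuations_for_pixel((128, 128, 128))  # mid-gray pixel
```

A mid-gray image pixel thus opens all three filtered subpixels to roughly half transmittance in this simplified model.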
The visualization system 1110 can include one or more modulation processors 1126, which can receive a video signal, such as from the one or more video generation processors 1120, and, in response, can produce an electrical modulation signal. For configurations in which the modulators 1124 directly modulate the light sources 1122, the electrical modulation signal can drive the light sources 1122. For configurations in which the modulators 1124 include a modulation panel, the electrical modulation signal can drive the modulation panel.
The visualization system 1110 can include one or more beam combiners 1128 (also referred to as beam splitters), which can combine light beams of different colors to form a single multi-color beam. For configurations in which the light sources 1122 can include multiple microLEDs of different colors, the visualization system 1110 can include one or more wavelength-sensitive (e.g., dichroic) beam combiners 1128 that can combine the light of different colors to form a single multi-color beam.
The visualization system 1110 can direct the modulated light toward the eyes of the viewer in one of at least two configurations. In a first configuration, the visualization system 1110 can function as a projector, and can include suitable projection optics 1130 that can project the modulated light onto one or more screens 1132. The screens 1132 can be located a suitable distance from an eye of the user. The visualization system 1110 can optionally include one or more lenses 1134 that can locate a virtual image of a screen 1132 at a suitable distance from the eye, such as a close-focus distance (e.g., 500 mm, 950 mm, or another suitable distance). In some examples, the visualization system 1110 can include a single screen 1132, such that the modulated light can be directed toward both eyes of the user. In some examples, the visualization system 1110 can include two screens 1132, such that the modulated light from each screen 1132 can be directed toward a respective eye of the user. In some examples, the visualization system 1110 can include more than two screens 1132. In a second configuration, the visualization system 1110 can direct the modulated light directly into one or both eyes of a viewer. For example, the projection optics 1130 can form an image on a retina of an eye of the user, or an image on each retina of the two eyes of the user.
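The placement of a virtual image of the screen can be estimated, as a hypothetical numeric illustration only, from the ideal thin-lens relation 1/f = 1/d_o + 1/d_i (the 50 mm focal length and 45 mm screen distance below are assumed example values, not parameters of the lenses 1134):

```python
# Hedged sketch (ideal thin lens): a screen placed just inside the focal
# length of a lens produces a virtual image at a comfortable viewing
# distance. A negative image distance denotes a virtual image on the same
# side of the lens as the screen.

def virtual_image_distance(focal_mm, object_mm):
    """Solve 1/f = 1/d_o + 1/d_i for the image distance d_i, in mm."""
    return 1.0 / (1.0 / focal_mm - 1.0 / object_mm)


d_i = virtual_image_distance(focal_mm=50.0, object_mm=45.0)
# d_i = -450.0 mm: a virtual image about 450 mm from the lens
```

With these assumed values, the virtual image lands near the close-focus distances mentioned above (e.g., 500 mm), illustrating why such a lens can relax the eye's accommodation for a near-eye screen.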
For some configurations of AR systems, the visualization system 1110 can include an at least partially transparent display, such that a user can view the user's surroundings through the display. For such configurations, the augmented reality system can produce modulated light that corresponds to the augmentation of the surroundings, rather than the surroundings themselves. For example, in the example of a retailer showing a chair, the augmented reality system can direct modulated light, corresponding to the chair but not the rest of the room, toward a screen or toward an eye of a user.
In some embodiments, other components may be present, while in other embodiments not all of the components may be present. As indicated herein, although the term “a” is used herein, one or more of the associated elements may be used in different embodiments. For example, the term “a processor” configured to carry out specific operations includes both a single processor configured to carry out all of the operations as well as multiple processors individually configured to carry out some or all of the operations (which may overlap) such that the combination of processors carries out all of the operations; thus, the term “processor” is synonymous with “processing circuitry”. Further, the term “includes” may be interpreted as “includes at least” the elements that follow.
While only certain features of the system and method have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes.