The present disclosure relates to an illumination system that contains a light-emitting diode (LED) array and sensor on a printed circuit board (PCB).
There is ongoing effort to improve illumination systems. In particular, it is desired to provide tunable lighting in commercial and home lighting environments.
Corresponding reference characters indicate corresponding parts throughout the several views. Elements in the drawings are not necessarily drawn to scale. The configurations shown in the drawings are merely examples and should not be construed as limiting in any manner.
Real estate (e.g., a physical area or volume that may be available for a given device) is at a premium in many electronic devices. It is, however, difficult to reduce the size of mobile devices in particular due to the increasing complexity of such devices. One of the components that attracts a large amount of attention in mobile devices is the camera, which may have one or more different types of sensors and be used in conjunction with a flash module. Integration of the flash module and camera to reduce the overall size may be complicated, in particular in terms of semiconductor fabrication as well as circuit and board layout.
The mobile device 100 can include one or more light-emitting diode (LED) arrays 112. The one or more LED arrays 112 can include a plurality of LEDs 114 that can produce light during at least a portion of the exposure duration of the camera 120. Each of the one or more LED arrays 112 may contain segmented ones of the LEDs 114. Each of the LEDs 114 may be formed from one or more inorganic materials (e.g., binary compounds such as gallium arsenide (GaAs), ternary compounds such as aluminum gallium arsenide (AlGaAs), quaternary compounds such as indium gallium arsenide phosphide (InGaAsP), or other suitable materials). Each of the LEDs 114 emits light in the visible spectrum (about 400 nm to about 800 nm) and may also emit light in the infrared spectrum (above about 800 nm). Alternatively, a separate one or more LED arrays may be used to emit light in the infrared spectrum, with each of the LEDs 114 being individually controllable by the processor 130.
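As a rough illustration of the individually controllable, segmented array described above, the sketch below models per-LED drive currents that a processor could set one at a time or in rectangular groups. All names (`LedArray`, `set_current_ma`, `set_region`) and the array size and current values are hypothetical and are not part of the disclosure.

```python
# Hypothetical sketch of per-LED control for a segmented LED array.
class LedArray:
    """Models a segmented LED array whose LEDs are individually addressable."""

    def __init__(self, rows: int, cols: int):
        self.rows = rows
        self.cols = cols
        # Drive current in milliamps for each LED, initially off.
        self.currents_ma = [[0.0] * cols for _ in range(rows)]

    def set_current_ma(self, row: int, col: int, current_ma: float) -> None:
        """Set the drive current of a single LED (processor-controlled)."""
        self.currents_ma[row][col] = current_ma

    def set_region(self, row_range, col_range, current_ma: float) -> None:
        """Drive a rectangular group of LEDs together (e.g., a block of pixels)."""
        for r in range(*row_range):
            for c in range(*col_range):
                self.currents_ma[r][c] = current_ma

array = LedArray(rows=8, cols=8)
array.set_current_ma(0, 0, 25.0)        # brighten one segment
array.set_region((2, 7), (2, 7), 10.0)  # dimmer fill over a block
```

A real driver would map such a current table onto DC or PWM drive signals; the list-of-lists representation here is only for illustration.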
The one or more LED arrays 112 can include one or more non-emitting areas (e.g., non-emitting areas 204) located between adjacent ones of the LEDs 114, as shown in
The flash module 110 can include at least one lens 116 or other optical elements. The lens 116 can direct the light emitted by the one or more LED arrays 112 toward the scene 104 as illumination 102. In some embodiments, the flash module 110 can include an actuator (such as a voice-coil motor) to translate (e.g., physically move) the one or more LED arrays 112 and/or the lens 116 relative to each other during the exposure duration of the camera 120, under control of the processor 130, so as to blur the dark bands in the illumination 102 in the image of the scene 104. In other embodiments, the actuator may not be present; such a system may instead have a fixed lens, and thus a fixed aperture. In systems that contain the actuator, the actuator can translate the one or more LED arrays 112 and/or the lens 116 by a distance greater than or equal to a width of a non-emitting area of the one or more non-emitting areas of the one or more LED arrays 112, in an actuation direction that is generally orthogonal to a longitudinal axis extending from the one or more LED arrays 112, through a center of the lens 116, to the scene 104. That is, the actuator can translate either or both of the one or more LED arrays 112 and the lens 116 in one or more of the lateral (x-z) directions, where the y direction is shown in
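The stroke constraint above (a lateral translation at least as wide as a non-emitting gap) can be sketched as a simple check. The function names, the safety margin, and the 20 um example gap are illustrative assumptions, not values from the disclosure.

```python
# Illustrative check that an actuator stroke is large enough to blur the
# dark bands: the lateral translation should be at least the width of a
# non-emitting gap between adjacent LEDs.

def min_actuation_stroke_um(gap_width_um: float, margin: float = 1.0) -> float:
    """Return the minimum lateral stroke, with an optional safety margin."""
    return gap_width_um * margin

def stroke_is_sufficient(stroke_um: float, gap_width_um: float) -> bool:
    """True when the stroke meets or exceeds the non-emitting gap width."""
    return stroke_um >= min_actuation_stroke_um(gap_width_um)

# Example: 20 um gaps between emitters require at least a 20 um stroke.
sufficient = stroke_is_sufficient(stroke_um=25.0, gap_width_um=20.0)
```

In practice the required stroke would also depend on the optical magnification between the array plane and the scene, which this sketch does not model.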
In addition to, or instead of, translation, multiple LED arrays containing segmented LEDs may be used to illuminate the scene. In this case, the non-emitting areas between the LEDs, which form dark bands in the illumination, can be offset between the different LED arrays so that the emitting areas of one LED array at least partially overlap the dark bands of another. The offset can help reduce or eliminate dark bands in the total illumination at the scene, which could be present if only a single LED array and lens were used.
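A minimal one-dimensional model illustrates how offsetting a second array can fill the dark bands of the first. The repeating emitter/gap pattern, the half-pitch offset, and the 100 um pitch / 80 um emitter values are assumptions for illustration only.

```python
# Hypothetical 1-D model of dark-band filling with two offset LED arrays.
# Each array emits over repeating (emitter, gap) periods; offsetting a
# second array by half a period places its emitters over the first
# array's gaps.

def illuminated(x_um: float, pitch_um: float, emit_um: float,
                offset_um: float) -> bool:
    """True if position x is lit by an array with the given pitch/offset."""
    return ((x_um - offset_um) % pitch_um) < emit_um

def lit_by_either(x_um: float, pitch_um: float = 100.0,
                  emit_um: float = 80.0) -> bool:
    """Coverage from two arrays, the second offset by half a pitch."""
    return (illuminated(x_um, pitch_um, emit_um, 0.0)
            or illuminated(x_um, pitch_um, emit_um, pitch_um / 2))

# Position 90 um falls in the first array's gap (80..100 um) but is
# covered by the offset array, so the combined illumination has no band.
```

With these example values the two arrays together cover every position in a period, which is the "reduce or eliminate dark bands" effect described above.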
The camera 120 may sense light at least at the wavelength or wavelengths emitted by the one or more LED arrays 112. The camera 120 can include optics (e.g., at least one camera lens 122) that can collect reflected light 106 that is reflected from and/or emitted by the scene 104. The camera lens 122 can direct the reflected light 106 onto a multi-pixel sensor 124 to form an image of the scene 104. The processor 130 can receive a data signal that represents the image of the scene 104. The processor 130 can additionally drive the LEDs 114 in the one or more LED arrays 112. For example, the processor 130 can optionally control one or more of the LEDs 114 independent of another one or more of the LEDs 114 in the one or more LED arrays 112, so as to illuminate the scene in a specified manner.
In addition, one or more detectors 126 may be incorporated in the camera 120. In other embodiments, instead of being incorporated in the camera 120, the one or more detectors 126 may be incorporated in one or more different areas, such as the flash module 110 or elsewhere close to the camera 120 and strobe. The one or more detectors 126 may include multiple different sensors to sense visible and/or infrared light (and perhaps ultraviolet light), so as to sense the ambient light and/or variations/flicker in the ambient light in addition to receiving the reflected light from the LEDs 114. The multi-pixel sensor 124 of the camera 120 may be of higher resolution than the sensors of the one or more detectors 126 to obtain an image of the scene with a desired resolution. The sensors of the one or more detectors 126 may have one or more segments (that are able to sense the same wavelength/range of wavelengths or different wavelengths/ranges of wavelengths). In some embodiments, if multiple detectors are used, one or more of the detectors may detect visible wavelengths and one or more of the detectors may detect infrared wavelengths; the detectors may be individually controllable by the processor 130.
In some embodiments, instead of or in addition to being provided in the camera 120, one or more of the sensors of the one or more detectors 126 may be provided in the flash module 110. In some embodiments, the flash module 110 and the camera 120 may be integrated in a single module, while in other embodiments, the flash module 110 and the camera 120 may be separate modules that are disposed on a PCB. In still other embodiments, the flash module 110 and the camera 120 may be attached to different PCBs; for example, the camera 120 may be thicker than the flash module 110, which may result in design issues if the flash module 110 and the camera 120 are attached to the same PCB. In the latter embodiments, multiple openings may be present in the housing, at least one of which may be eliminated with the use of an integrated version of the flash module 110 and the camera 120.
The LEDs 114 can be driven using, for example, a direct current (DC) driver or pulse width modulation (PWM). DC driving may produce color differences if the segments of the one or more LED arrays 112 are driven at different current densities, while PWM driving can generate artifacts due to ambient lighting conditions. The flicker sensor may sense the variation of artificial lighting at the wall current frequency or electronic ballast frequencies (e.g., 50 Hz or 60 Hz, or an integral multiple thereof), in addition to the phase of the flicker. The camera sensor may then be tuned to an integration time that is an integral multiple of the flicker period (1/f), or triggered at the phase where the illumination changes most slowly (minimum or maximum intensity, with the maximum intensity preferred for signal-to-noise ratio considerations). The LEDs 114 may be driven using a PWM whose phase shift varies between LEDs 114 to reduce potential current surge issues. As shown, one or more drivers 132 may be used to drive the LEDs 114 in the one or more LED arrays 112, as well as other components, such as the actuators.
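As a hedged sketch of the flicker-aware timing described above, the first function below rounds a requested exposure to a whole number of flicker periods so that the ambient modulation averages out, and the second staggers PWM start times evenly across LEDs to spread current draw. The function names and numeric examples are illustrative assumptions, not values from the disclosure.

```python
# Illustrative flicker-synchronized exposure and phase-staggered PWM timing.

def integration_time_s(flicker_hz: float, desired_s: float) -> float:
    """Round the desired exposure to the nearest whole number of flicker
    periods (at least one period), so ambient flicker integrates out."""
    period = 1.0 / flicker_hz
    n_periods = max(1, round(desired_s / period))
    return n_periods * period

def pwm_phase_offsets(n_leds: int, period_s: float) -> list:
    """Stagger PWM start times evenly across LEDs to reduce current surges
    from all LEDs switching on simultaneously."""
    return [i * period_s / n_leds for i in range(n_leds)]

# Example: 100 Hz ripple (50 Hz mains, full-wave rectified) has a 10 ms
# period; a requested 33 ms exposure becomes 30 ms (3 whole periods).
exposure = integration_time_s(flicker_hz=100.0, desired_s=0.033)
```

A real implementation would also use the sensed flicker phase to trigger the exposure near a flicker intensity extremum, as the text notes; that step is omitted here.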
The mobile device 100 can also include an input device, for example, a user-activated input device such as a button that is depressed to take a picture. The flash module 110 and camera 120 can be disposed in a single housing.
In other embodiments, the LED array 200 is a micro-LED array that includes, for example, thousands to millions of microscopic LED pixels that can emit light and that can be individually controlled or controlled in groups of pixels (e.g., 5×5 groups of pixels). The micro-LEDs are small (e.g., <0.01 mm on a side) and may provide monochromatic or multi-chromatic light, typically red, green, and blue, using inorganic semiconductor material.
As above, the flash module 110 of
The LED matrix 302 and the sensors 304 may be integrated on a semiconductor structure such as a CMOS chip 310. The space occupied by the LED matrix 302 and the sensors 304 correspondingly increases the amount of semiconductor used and enlarges the CMOS chip 310. In addition to increasing the CMOS area, however, the integration of the LED matrix 302 and the sensors 304 may reduce the number of power and communication contacts. The increase in the CMOS area may also improve the thermal design of the CMOS chip 310 by providing additional area to help dissipate heat. In some embodiments, a heat sink may also be disposed on extended areas of the CMOS chip 310.
The sensors 304 may be of the same type and may be arranged to be symmetric to the LED matrix 302. This arrangement may permit detection that is spatially symmetrical. In particular, in such an embodiment, the CMOS chip 310 that contains the LED matrix 302 and the sensors 304 may be positioned in the flash module 110 of
In other embodiments, sensors other than flicker sensors may be additionally disposed around the LED matrix 302 or integrated into the structure 300. These sensors may include other optical sensors (such as a red, green, blue (RGB) ambient light sensor) or non-optical sensors, such as accelerometers, gyroscopes, or near field communication (NFC) sensors. As above, the LED matrix 302 may be segmented into one or more LED arrays, and either the LEDs within each LED array may be driven individually or the LED arrays themselves may be driven individually.
Once the driver has been tested and the response is found to be acceptable according to a set of predetermined criteria, at operation 404 contacts may be fabricated on the semiconductor structure. The contacts may include PCB contacts to be used to connect to the PCB and LED contacts to be used to connect to the LED array structure. The PCB contacts may include one or more of input/output, test, power, and ground contacts, for example. The PCB contacts may be formed, for example, as wirebonds, copper (Cu) pillars, or solder bumping that uses through substrate vias (TSV); the LED contacts may be formed using, for example, copper (Cu) pillars.
At operation 406, a protective layer may be deposited on a semiconductor structure 502. Upon reading and understanding the disclosed subject matter, a person of ordinary skill in the art will recognize that non-semiconductor structures (e.g., a dielectric structure with one or more semiconductor films formed thereon) may be used as well. The overall structure is shown in
At operation 408, a CMOS chip containing the LED array and sensors may be attached to the semiconductor structure 502. As above, surface mount technology may be used to mount the CMOS chip on the solder bumps (LED contacts 504b). Alternatively, Cu pillars may be used to wafer bond the LED die to the CMOS chip.
At operation 410, and as shown in
At operation 412, and as shown in
At operation 414, the integrated LED array and sensor structure 512 remaining after liftoff may be cleaned using solvents and deionized water, for example. The integrated LED array and sensor structure 512 may then be, for example, visually or optically inspected for defects and impurities prior to undertaking further fabrication processes.
After cleaning and inspection, at operation 416, and shown in
After deposition of the phosphor layer 516, at operation 418 the protective layer 506 (e.g., resist, see
After removal of the protective layer 506, at operation 420 and as shown in
Other layers and structures may be added to the above. For example, multiple phosphor layers or reflective layers to better direct the light may be added to the structure shown in
Accordingly, the term “module” (and “component”) is understood to encompass a tangible entity, be that an entity that is physically constructed, specifically configured (e.g., hardwired), or temporarily (e.g., transitorily) configured (e.g., programmed) to operate in a specified manner or to perform part or all of any operation described herein. Considering examples in which modules are temporarily configured, each of the modules need not be instantiated at any one moment in time. For example, where the modules comprise a general-purpose hardware processor configured using software, the general-purpose hardware processor may be configured as respective different modules at different times. Software may accordingly configure a hardware processor, for example, to constitute a particular module at one instance of time and to constitute a different module at a different instance of time.
The mobile device 600 may include a hardware processor (or equivalently processing circuitry) 602 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 604, and a static memory 606, some or all of which may communicate with each other via an interlink (e.g., bus) 608. The main memory 604 may contain any or all of removable storage and non-removable storage, volatile memory, or non-volatile memory. The mobile device 600 may further include a display 610 such as a video display, an alphanumeric input device 612 (e.g., a keyboard), and a user interface (UI) navigation device 614 (e.g., a mouse). In an example, the display 610, input device 612, and UI navigation device 614 may be a touch screen display. The mobile device 600 may additionally include a storage device (e.g., drive unit) 616, a signal generation device 618 (e.g., a speaker), a network interface device 620, one or more cameras 628, and one or more sensors 630, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor such as those described herein. The mobile device 600 may further include an output controller, such as a serial (e.g., universal serial bus (USB)), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate with or control one or more peripheral devices (e.g., a printer, card reader, etc.).
The storage device 616 may include a non-transitory machine readable medium 622 (hereinafter simply referred to as machine readable medium) on which is stored one or more sets of data structures or instructions 624 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein. The instructions 624 may also reside, completely or at least partially, within the main memory 604, within static memory 606, and/or within the hardware processor 602 during execution thereof by the mobile device 600. While the machine readable medium 622 is illustrated as a single medium, the term “machine readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) configured to store the one or more instructions 624.
The term “machine readable medium” may include any medium that is capable of storing, encoding, or carrying instructions for execution by the mobile device 600 and that cause the mobile device 600 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions. Non-limiting machine-readable medium examples may include solid-state memories, and optical and magnetic media. Specific examples of machine-readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; Random Access Memory (RAM); and CD-ROM and DVD-ROM disks.
The instructions 624 may further be transmitted or received over a communications network using a transmission medium 626 via the network interface device 620 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.). Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone Service (POTS) networks, and wireless data networks. Communications over the networks may use one or more different protocols, such as the Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi, the IEEE 802.16 family of standards known as WiMax, the IEEE 802.15.4 family of standards, a Long Term Evolution (LTE) family of standards, a Universal Mobile Telecommunications System (UMTS) family of standards, peer-to-peer (P2P) networks, and next generation (NG)/5th generation (5G) standards, among others. In an example, the network interface device 620 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the transmission medium 626.
Note that the term “circuitry” as used herein refers to, is part of, or includes hardware components such as an electronic circuit, a logic circuit, a processor (shared, dedicated, or group) and/or memory (shared, dedicated, or group), an Application Specific Integrated Circuit (ASIC), a field-programmable device (FPD) (e.g., a field-programmable gate array (FPGA), a programmable logic device (PLD), a complex PLD (CPLD), a high-capacity PLD (HCPLD), a structured ASIC, or a programmable SoC), digital signal processors (DSPs), etc., that are configured to provide the described functionality. In some embodiments, the circuitry may execute one or more software or firmware programs to provide at least some of the described functionality. The term “circuitry” may also refer to a combination of one or more hardware elements (or a combination of circuits used in an electrical or electronic system) with the program code used to carry out the functionality of that program code. In these embodiments, the combination of hardware elements and program code may be referred to as a particular type of circuitry.
The term “processor circuitry” or “processor” as used herein thus refers to, is part of, or includes circuitry capable of sequentially and automatically carrying out a sequence of arithmetic or logical operations, or recording, storing, and/or transferring digital data. The term “processor circuitry” or “processor” may refer to one or more application processors, one or more baseband processors, a physical central processing unit (CPU), a single- or multi-core processor, and/or any other device capable of executing or otherwise operating computer-executable instructions, such as program code, software modules, and/or functional processes.
In addition to reducing the number of contacts, improving the thermal issues of the CMOS device, and reducing the number of openings in the housing of the mobile device, integration of the LED matrix and the flicker sensor in the CMOS device may enable use of the flicker sensor to control a minimum current through each of the LEDs to produce a white appearance from the overall structure. In particular, variations are present in the various layers (semiconductor and otherwise) due to a variety of issues, including, among others, variations in gas flows and etch rates over the wafer used to fabricate the integrated structure. Such variations may create large operational variations in the emission of the LEDs at low applied currents. This may cause issues because, at relatively low or no currents, the LED array may appear yellowish/orange due to the presence of the phosphor; to make the LED array appear white (typically desirable), the LEDs may be activated with at least a minimum amount of current, which may vary from LED to LED due to the abovementioned manufacturing variability. This minimum amount of current may cause the LEDs to emit light at an intensity greater than that of the ambient light. However, due to the LED variation at low current, the ability to provide white light (in either intensity or color) across the LED array may not be uniform. Accordingly, it may be desirable to individually control the current to the LEDs to account for variations in the LED manufacturing and to adjust for the amount of ambient light. The flicker sensor, which is formed on the same structure as the LEDs, may be used by the processor on the PCB to help tune the current to the individual LEDs to compensate for such variations. The amount of current applied to each LED to ensure that the LED appears white may be calibrated during manufacturing.
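The per-LED calibration described above can be sketched as a simple search for the smallest drive current whose sensed output exceeds the ambient level. The function name, the candidate currents, and the toy linear LED response standing in for real hardware are all illustrative assumptions, not part of the disclosure.

```python
# Illustrative per-LED minimum-current calibration: find, for each LED,
# the smallest drive current at which the sensed output exceeds the
# ambient light level (so the LED dominates the phosphor tint).

def calibrate_min_current_ma(measure_lumens, ambient_lumens: float,
                             candidates_ma=(1.0, 2.0, 5.0, 10.0, 20.0)):
    """Return the first candidate current whose measured output exceeds
    ambient, or None if no candidate suffices. `measure_lumens` stands in
    for a per-LED sensor reading taken at a given drive current."""
    for current_ma in candidates_ma:
        if measure_lumens(current_ma) > ambient_lumens:
            return current_ma
    return None

# Example with a toy linear LED response in place of real hardware; the
# slope would vary from LED to LED due to manufacturing variability.
toy_led = lambda i_ma: 0.8 * i_ma   # lumens per mA (hypothetical)
min_ma = calibrate_min_current_ma(toy_led, ambient_lumens=3.0)
```

Because the calibration is per LED, repeating this search across the array yields a table of minimum currents that compensates for the manufacturing variations described above.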
In addition, although multiple lenses are shown in
While only certain features of the system and method have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes. Method operations can be performed substantially simultaneously or in a different order.
Although an embodiment has been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader scope of the present disclosure. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. The accompanying drawings that form a part hereof show, by way of illustration, and not of limitation, specific embodiments in which the subject matter may be practiced. The embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
The subject matter may be referred to herein, individually and/or collectively, by the term “embodiment” merely for convenience and without intending to voluntarily limit the scope of this application to any single inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.
In this document, the terms “a” or “an” are used, as is common in patent documents, to include one or more than one, independent of any other instances or usages of “at least one” or “one or more.” In this document, the term “or” is used to refer to a nonexclusive or, such that “A or B” includes “A but not B,” “B but not A,” and “A and B,” unless otherwise indicated. In this document, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Also, in the following claims, the terms “including” and “comprising” are open-ended, that is, a system, UE, article, composition, formulation, or process that includes elements in addition to those listed after such a term in a claim are still deemed to fall within the scope of that claim. Moreover, in the following claims, the terms “first,” “second,” and “third,” etc. are used merely as labels, and are not intended to impose numerical requirements on their objects.
The Abstract of the Disclosure is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.
This application claims the benefit of priority to U.S. Provisional Patent Application Ser. No. 63/284,953, filed Dec. 1, 2021, which is incorporated herein by reference in its entirety.