The present disclosure relates generally to a system providing dynamic lighting control to LED arrays. In certain embodiments, the system can include a connected and individually addressable LED pixel array able to provide intensity- and spatially-modulated light projection suitable for adaptive lighting systems operating at video refresh rates or greater.
While pixel arrays of LEDs with supporting CMOS circuitry have been used, practical implementations suitable for commercial use can face severe manufacturing, power, and data management problems. The individual light intensity of thousands of emitting pixels may need to be controlled at refresh rates of 30-60 Hz. Power fluctuations need to be controlled, and room must be found for the large number of thick power traces extending through a hybrid silicon CMOS/GaN assembly. Manufacturable systems able to reliably handle such power at high data refresh rates are needed.
In one embodiment, an LED controller includes a power distribution module and an interface to an external data bus. An image frame buffer is connected to the interface to receive image data. A separate logic module is connected to the interface and configured to modify image frame buffer output signals sent to an LED pixel array connected to the image frame buffer. The LED pixel array can project light according to a pattern and intensity defined at least in part by the image held in the image frame buffer.
In another embodiment, a standby image buffer is connected to the image frame buffer to hold a default image. In another embodiment a pulse width modulator is connected between the image frame buffer and the LED pixel array.
In some embodiments, the image frame buffer can refresh held images at 60 Hz or greater. Image refresh data can be provided externally over a serial interface.
Various applications can benefit from an LED controller system able to support high data rates, default image presentation, and larger LED pixel arrays of hundreds to thousands of independently addressable pixels. These applications can include, but are not limited to, architectural lighting, projected light displays, street lighting, and vehicle headlamps.
In another embodiment, an LED controller system includes an LED controller including an image frame buffer able to receive image data. A sensor processing module is used to receive and process sensor data and a decision module is used to determine actions taken in response to processed sensor data. An image creation module is used to create images to be sent to the image frame buffer of the LED controller.
In another embodiment, a method of interacting with an LED controller system includes the steps of receiving image data using an LED controller including an image frame buffer. Sensor data is received and processed, with the output being used to determine actions using a decision module. Images are created with an image creation module and sent to the image frame buffer of the LED controller.
Light emitting pixel arrays may support applications that benefit from fine-grained intensity, spatial, and temporal control of light distribution. This may include, but is not limited to, precise spatial patterning of emitted light from pixel blocks or individual pixels. Depending on the application, emitted light may be spectrally distinct, adaptive over time, and/or environmentally responsive. The light emitting pixel arrays may provide pre-programmed light distribution in various intensity, spatial, or temporal patterns. The emitted light may be based at least in part on received sensor data and may be used for optical wireless communications. Associated optics may be distinct at a pixel, pixel block, or device level. An example light emitting pixel array may include a device having a commonly controlled central block of high intensity pixels with an associated common optic, whereas edge pixels may have individual optics. Common applications supported by light emitting pixel arrays include video lighting, automotive headlights, architectural and area illumination, street lighting, and informational displays.
Light emitting pixel arrays may be used to selectively and adaptively illuminate buildings or areas for improved visual display or to reduce lighting costs. In addition, light emitting pixel arrays may be used to project media facades for decorative motion or video effects. In conjunction with tracking sensors and/or cameras, selective illumination of areas around pedestrians may be possible. Spectrally distinct pixels may be used to adjust the color temperature of lighting, as well as support wavelength specific horticultural illumination.
Street lighting is an important application that may greatly benefit from use of light emitting pixel arrays. A single type of light emitting array may be used to mimic various street light types, allowing, for example, switching between a Type I linear street light and a Type IV semicircular street light by appropriate activation or deactivation of selected pixels. In addition, street lighting costs may be lowered by adjusting light beam intensity or distribution according to environmental conditions or time of use. For example, light intensity and area of distribution may be reduced when pedestrians are not present. If pixels of the light emitting pixel array are spectrally distinct, the color temperature of the light may be adjusted according to respective daylight, twilight, or night conditions.
Light emitting arrays are also well suited for supporting applications requiring direct or projected displays. For example, warning, emergency, or informational signs may all be displayed or projected using light emitting arrays. This allows, for example, color changing or flashing exit signs to be projected. If a light emitting array is composed of a large number of pixels, textual or numerical information may be presented. Directional arrows or similar indicators may also be provided.
Vehicle headlamps are a light emitting array application that requires large pixel numbers and a high data refresh rate. Automotive headlights that actively illuminate only selected sections of a roadway can be used to reduce problems associated with glare or dazzling of oncoming drivers. Using infrared cameras as sensors, light emitting pixel arrays activate only those pixels needed to illuminate the roadway, while deactivating pixels that may dazzle pedestrians or drivers of oncoming vehicles. In addition, off-road pedestrians, animals, or signs may be selectively illuminated to improve driver environmental awareness. If pixels of the light emitting pixel array are spectrally distinct, the color temperature of the light may be adjusted according to respective daylight, twilight, or night conditions. Some pixels may be used for optical wireless vehicle to vehicle communication.
One high value application for light emitting arrays is illustrated with respect to
Positioned adjacent to LED light module 22 is an active LED array 230. The LED array includes a CMOS die 202, with a pixel area 204 and alternatively selectable LED areas 206 and 208. The pixel area 204 can have 104 rows and 304 columns, for a total of 31,616 pixels distributed over an area of 12.2 by 4.16 millimeters. The selectable LED areas 206 and 208 allow for differing aspect ratios suitable for different vehicle headlamps or applications to be selected. For example, in one embodiment selectable LED area 206 can have a 1:3 aspect ratio with 82 rows and 246 columns, for a total of 20,172 pixels distributed over an area of 10.6 by 4 millimeters. Alternatively, selectable LED area 208 can have a 1:4 aspect ratio with 71 rows and 284 columns, for a total of 20,164 pixels distributed over an area of 12.1 by 3.2 millimeters. In one embodiment, pixels can be actively managed to have a 10-bit intensity range and a refresh rate of between 30 and 100 Hz, with a typical operational refresh rate of 60 Hz or greater.
The vehicle headlamp system 300 can include a power input filter and control module 310. The module 310 can support various filters to reduce conducted emissions and provide power immunity. Electrostatic discharge (ESD) protection, load-dump protection, alternator field decay protection, and reverse polarity protection can also be provided by module 310.
Filtered power can be provided to an LED DC/DC module 312. Module 312 can be used only for powering LEDs, and typically has an input voltage of between 7 and 18 volts, with a nominal 13.2 volts. Output voltage can be set slightly higher (e.g. 0.3 volts higher) than the LED array maximum voltage, as determined by factory or local calibration and operating condition adjustments due to load, temperature, or other factors.
Filtered power is also provided to a logic LDO module 314 that can be used to power microcontroller 322 or CMOS logic in the active headlamp 330.
The vehicle headlamp system 300 can also include a bus transceiver 320 (e.g. with a UART or SPI interface) connected to microcontroller 322. The microcontroller 322 can translate vehicle input based on or including data from the sensor module 306. The translated vehicle input can include a video signal that is transferable to an image buffer in the active headlamp module 324. In addition, the microcontroller 322 can load default image frames and test for open/short pixels during startup. In one embodiment, an SPI interface loads an image buffer in CMOS. Image frames can be full frame, differential, or partial. Other microcontroller 322 features can include monitoring of CMOS status over the control interface, including die temperature, as well as monitoring of the logic LDO output. In some embodiments, LED DC/DC output can be dynamically controlled to minimize headroom. In addition to providing image frame data, other headlamp functions such as complementary use in conjunction with side marker or turn signal lights, and/or activation of daytime running lights can also be controlled.
Based on the results of the decision algorithm module 344, image creation module 346 provides an image pattern that will ultimately provide an active illumination pattern to the vehicle headlamp that is dynamically adjustable and suitable for conditions. This created image pattern can be encoded for serial or other transmission scheme by image coding module 348 and sent over a high speed bus 350 to an image decoding module 354. Once decoded, the image pattern is provided to the uLED module 380 to drive activation and intensity of illumination pixels.
In some operational modes, the system 330 can be driven with default or simplified image patterns using instructions provided to a headlamp control module 370 via connection of the decision algorithm module 344 through a CAN bus 352. For example, an initial pattern on vehicle start may be a uniform, low light intensity pattern. In some embodiments, the headlamp control module can be used to drive other functions, including sensor activation or control.
In other possible operational modes, the system 330 can be driven with image patterns derived from local sensors or commands not requiring input via the CAN bus 352 or high speed bus 350. For example, local sensors 360 and electronic processing modules capable of sensor processing 362 can be used. Processed sensor data can be input to various decision algorithms in a decision algorithm module 364 that result in command instructions or pattern creation based at least in part on various sensor input conditions, for example, such as ambient light levels, time of day, vehicle location, location of other vehicles, road conditions, or weather conditions. As will be appreciated, like vehicle supported remote sensors 340, useful information for the decision algorithm module 364 can be provided from other sources as well, including connections to user smartphones, vehicle to vehicle wireless connections, or connection to remote data or information resources.
Based on the results of the decision algorithm module 364, image creation module 366 provides an image pattern that will ultimately provide an active illumination pattern to the vehicle headlamp that is dynamically adjustable and suitable for conditions. In some embodiments, this created image pattern does not require additional image coding/decoding steps but can be directly sent to the uLED module 380 to drive illumination of selected pixels.
Image or other data from the vehicle can arrive via an SPI interface 412. Successive images or video data can be stored in an image frame buffer 414. If no image data is available, one or more standby images held in a standby image buffer can be directed to the image frame buffer 414. Such standby images can include, for example, an intensity and spatial pattern consistent with legally allowed low beam headlamp radiation patterns of a vehicle.
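The standby fallback described above can be sketched in a few lines; Python, the class name, and the string placeholders for images are illustrative assumptions, not the disclosed implementation:

```python
class FrameBuffer:
    """Minimal sketch: serve the most recently loaded image, falling back
    to a standby image (e.g. a legal low-beam pattern) when no image
    data has arrived over the serial interface."""

    def __init__(self, standby_image):
        self.standby = standby_image   # default pattern held in standby buffer
        self.current = None            # most recent image, if any

    def load(self, image):
        # New frame arriving, e.g. via the SPI interface
        self.current = image

    def output(self):
        # Drive the pixel array from the live image when present,
        # otherwise from the standby image
        return self.current if self.current is not None else self.standby
```

With no image loaded, `output()` returns the standby pattern; once `load()` has been called, the live image takes over.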
In operation, pixels in the images are used to define response of corresponding LED pixels in the pixel module 430, with intensity and spatial modulation of LED pixels being based on the image(s). To reduce data rate issues, groups of pixels (e.g. 5×5 blocks) can be controlled as single blocks in some embodiments. High speed and high data rate operation is supported, with pixel values from successive images able to be loaded as successive frames in an image sequence at a rate between 30 Hz and 100 Hz, with 60 Hz being typical. In conjunction with a pulse width modulation module 418, each pixel in the pixel module can be operated to emit light in a pattern and with an intensity at least partially dependent on the image held in the image frame buffer 414.
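The block grouping mentioned above (e.g. 5×5 blocks driven as single units) can be sketched as simple average pooling over the image; the integer averaging, the handling of partial edge blocks, and the Python form are illustrative assumptions:

```python
def block_reduce(image, block=5):
    """Average each block x block group of pixel intensities so a group
    of LED pixels can be driven as a single block, cutting per-field
    data volume by roughly a factor of block**2."""
    rows, cols = len(image), len(image[0])
    out = []
    for r in range(0, rows, block):
        row_out = []
        for c in range(0, cols, block):
            # Partial blocks at the right/bottom edges are averaged
            # over however many pixels they actually contain
            cells = [image[rr][cc]
                     for rr in range(r, min(r + block, rows))
                     for cc in range(c, min(c + block, cols))]
            row_out.append(sum(cells) // len(cells))
        out.append(row_out)
    return out
```

Each output value would then be sent once per block instead of once per pixel, directly reducing the serial data rate.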
In one embodiment, intensity can be separately controlled and adjusted by setting appropriate ramp times and pulse width for each LED pixel using logic and control module 420 and the pulse width modulation module 418. This allows staging of LED pixel activation to reduce power fluctuations, and to provide various pixel diagnostic functionality.
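One staging policy consistent with the above is to spread PWM start times across a fixed number of phase groups, so pixels do not all switch on at the same tick; the group count, tick units, and the 10-bit duty mapping below are illustrative assumptions:

```python
def stagger_offsets(n_pixels: int, period_ticks: int, groups: int = 8):
    """Assign each pixel a PWM start offset, cycling through `groups`
    evenly spaced phases within the PWM period to smooth supply current."""
    step = period_ticks // groups
    return [(i % groups) * step for i in range(n_pixels)]

def duty_ticks(intensity_10bit: int, period_ticks: int) -> int:
    """Map a 10-bit intensity value (0-1023) to a PWM on-time in ticks."""
    return intensity_10bit * period_ticks // 1023
```

A pixel would then conduct from its assigned offset for `duty_ticks` ticks, so at any instant only a fraction of pixels are switching edges simultaneously.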
In one embodiment, the SPI frame includes a start field (111b), a target field (000b), 10 data bits (MSB first), 3 CRC bits (generator polynomial x^3+x+1), and 2 stop bits (both "0"). Timing can be set per SafeSPI "in-frame" standards.
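The 3-bit checksum over the data bits is ordinary polynomial division over GF(2) by x^3+x+1. The sketch below, and in particular the bit packing of the 18-bit frame, is an illustrative assumption; the on-wire bit ordering is not specified here:

```python
def crc3(data: int, nbits: int = 10) -> int:
    """3-bit CRC over `nbits` data bits (MSB first), generator
    x^3 + x + 1 (0b1011): shift the data left by 3 (appending zero CRC
    positions) and reduce modulo the generator by long division."""
    reg = data << 3
    divisor = 0b1011 << (nbits - 1)        # generator aligned to the top bit
    for i in range(nbits):
        if reg & (1 << (nbits + 2 - i)):   # leading bit set -> subtract (XOR)
            reg ^= divisor >> i
    return reg & 0b111

def spi_frame(data: int) -> int:
    """Pack one 18-bit frame: start 111b, 10 data bits, 3 CRC bits,
    then 2 stop bits of 0 (packing order is an assumption)."""
    assert 0 <= data < 1 << 10
    return (0b111 << 15) | (data << 5) | (crc3(data) << 2)
```

For example, a data word of 1 leaves a remainder of 0b011, while a data word equal to the generator's coefficients (0b1011) divides evenly and yields a CRC of 0.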
MOSI Field data can be as follows:
Frame 0: Header
Frame 1/2: Start Column Address [SCOL]
Frame 3/4: Start Row Address [SROW]
Frame 5/6: Number of Columns [NCOL]
Frame 7/8: Number of Rows [NROW]
Frame 9: Intensity pixel [SCOL, SROW]
Frame 10: Intensity pixel [SCOL+1, SROW]
Frame 9+NCOL: Intensity pixel [SCOL+NCOL, SROW]
Frame 9+NCOL+1: Intensity pixel [SCOL, SROW+1]
Frame 9+NCOL+NROW: Intensity pixel [SCOL+NCOL, SROW+NROW]
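The MOSI layout above can be sketched as a frame sequencer. Two assumptions are made for illustration: each address/size field is split high-bits-first across its two 10-bit frames, and NCOL/NROW are treated as pixel counts with a column-fastest scan (the listed end indices could also be read as inclusive offsets, a minor variant):

```python
def mosi_sequence(scol, srow, ncol, nrow, intensities):
    """Flatten a pixel window update into MOSI frame order: a header,
    four two-frame address/size fields, then one intensity frame per
    pixel, scanned column-fastest within each row."""
    assert len(intensities) == nrow and all(len(r) == ncol for r in intensities)
    frames = [("HEADER", 0)]
    for name, val in (("SCOL", scol), ("SROW", srow),
                      ("NCOL", ncol), ("NROW", nrow)):
        hi, lo = (val >> 10) & 0x3FF, val & 0x3FF  # assumed two-frame split
        frames += [(name, hi), (name, lo)]
    for r in range(nrow):
        for c in range(ncol):
            frames.append((f"PIX[{scol + c},{srow + r}]", intensities[r][c]))
    return frames
```

A window update of NCOL×NROW pixels thus costs 9 + NCOL×NROW frames, which is what makes partial-window updates attractive at high refresh rates.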
MISO Field data can include loopback of frame memory.
A field refresh rate of 60 Hz (60 full frames per second) is supported, as is a bit rate of at least 10 Mbps, typically between 15 and 20 Mbps.
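A rough bandwidth check, assuming the 18-bit frame layout above and one intensity frame per pixel, shows why differential or partial frames and block grouping matter; the figures below are worst-case estimates, not disclosed specifications:

```python
BITS_PER_FRAME = 3 + 10 + 3 + 2   # start, data, CRC, stop = 18 bits
OVERHEAD_FRAMES = 9               # header plus address/size fields

def link_rate_mbps(pixels_per_field: int, refresh_hz: int = 60) -> float:
    """Serial bit rate for full-frame updates: one 18-bit frame per
    pixel plus the fixed overhead frames, sent refresh_hz times/s."""
    return (pixels_per_field + OVERHEAD_FRAMES) * BITS_PER_FRAME * refresh_hz / 1e6

# Full-frame updates of a ~20,172-pixel array at 60 Hz:
print(round(link_rate_mbps(20_172), 1))        # ~21.8 Mbps worst case

# Driving 5x5 blocks as single values cuts this by ~25x:
print(round(link_rate_mbps(20_172 // 25), 2))  # ~0.88 Mbps
```

The full-frame worst case slightly exceeds the typical 15-20 Mbps range, which is consistent with the use of differential or partial frames and pixel-block control to stay within it.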
The SPI interface connects to an address generator, a frame buffer, and a standby frame buffer. Pixels can have parameters set and signals or power modified by a command and control module (e.g. by power gating before input to the frame buffer, or after output from the frame buffer via pulse width modulation or power gating). The SPI interface can be connected to an address generation module that provides row and column address information to the active matrix and, in turn, provides the frame buffer address to the frame buffer.
The command and control module can be externally controlled via an Inter-Integrated Circuit (I2C) serial bus. A clock (SCL) pin and data (SDA) pin with 7-bit addressing is supported.
The command and control module includes a digital to analog converter (DAC) and two analog to digital converters (ADCs). These are respectively used to set Vbias for a connected active matrix, help determine maximum Vf, and determine system temperature. Also connected is an oscillator (OSC) to set the pulse width modulation oscillator (PWMOSC) frequency for the active matrix. A bypass line is also present to allow addressing of individual pixels or pixel blocks in the active matrix for diagnostic, calibration, or testing purposes.
In one embodiment, the command and control module can provide the following inputs and outputs:
Input to CMOS chip:
VBIAS: Sets voltage bias for LDOs.
GET_WORD[ . . . ]: Requests Output from CMOS.
TEST_M1: Run Pixel Test: LDO in bypass mode, sequentially addresses columns, then rows, outputs Vf, using internal 1 μA source.
Vf values output via SPI.
TEST_M2: Run Pixel Test: LDO in bypass mode, sequentially addresses columns, then rows, outputs Vf, using external current source.
Vf values output via SPI.
TEST_M3: LDO in bypass mode, addressing through I2C, using internal 1 μA source, Vf output via I2C.
TEST_M4: LDO in bypass mode, addressing through I2C, using external current source, Vf output via I2C.
BUFFER_SWAP: Swap to/from standby buffer.
COLUMN_NUM: Addresses a specific column.
ROW_NUM: Addresses a specific row.
Output from CMOS chip:
CW_PHIV_MIN, CW_PHIV_AVG, CW_PHIV_MAX: factory measured EOL global luminous flux data.
CW_VLED_MIN, CW_VLED_AVG, CW_VLED_MAX: factory measured EOL global forward voltage data.
CW_SERIALNO: die/CMOS combo serial number for traceability purposes.
TEMP_DIE: Value of Die Temperature.
VF: Value of Vf bus when being addressed with COLUMN_NUM and ROW_NUM.
BUFFER_STATUS: Indicates which buffer is selected.
Various calibration and testing methods for microcontroller assembly 500 are supported. During factory calibration, the Vf of all pixels can be measured. Maximum, minimum, and average Vf of the active area can be "burned" in as a calibration frame. Maximum Vf and dVf/dT calibration frames can be used together with measured die temperature to dynamically determine the actual VLED. Typically, a VLED of between 3.0V-4.5V is supported, with the actual value being determined by a feedback loop to an external DC/DC converter such as described with respect to
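The temperature-compensated VLED determination above reduces to a short calculation; the linear Vf model, the 25 °C calibration temperature, and the clamp to the supported window are illustrative assumptions:

```python
def vled_target(vf_max_cal: float, dvf_dt: float, die_temp_c: float,
                cal_temp_c: float = 25.0) -> float:
    """Estimate the required LED supply voltage from the factory-burned
    maximum Vf calibration frame and its temperature coefficient
    (dVf/dT, in V/degC), using the measured die temperature, and clamp
    the result to the supported 3.0-4.5 V window."""
    vled = vf_max_cal + dvf_dt * (die_temp_c - cal_temp_c)
    return min(max(vled, 3.0), 4.5)
```

For a calibrated maximum Vf of 4.0 V with a typical negative coefficient of -4 mV/°C, a die at 75 °C would call for about 3.8 V; the feedback loop would then add a small headroom (e.g. 0.3 V) when setting the external DC/DC converter.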
Many modifications and other embodiments of the invention will come to the mind of one skilled in the art having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is understood that the invention is not to be limited to the specific embodiments disclosed, and that modifications and embodiments are intended to be included within the scope of the appended claims. It is also understood that other embodiments of this invention may be practiced in the absence of an element/step not specifically disclosed herein.
Number | Date | Country | Kind |
---|---|---|---|
18202319 | Oct 2018 | EP | regional |
This application claims benefit of priority to European Patent Application No. 18202319.2 filed Oct. 24, 2018 and to U.S. Provisional Patent Application No. 62/729,284 filed Sep. 10, 2018 each of which is incorporated herein by reference in its entirety. Further, this application is related to co-pending U.S. Non-provisional patent application Ser. No. 16/456,849 filed Jun. 28, 2019.
Number | Date | Country | |
---|---|---|---|
20200084848 A1 | Mar 2020 | US |
Number | Date | Country | |
---|---|---|---|
62729284 | Sep 2018 | US |