System and method for processing streamed video images to correct for flicker of amplitude-modulated lights

Information

  • Patent Grant
  • Patent Number
    11,178,353
  • Date Filed
    Wednesday, June 22, 2016
  • Date Issued
    Tuesday, November 16, 2021
Abstract
A display system is provided for a vehicle equipped with a camera for supplying streamed video images of a scene rearward of the vehicle. The display system includes an image processing unit for receiving the streamed video images and processing the streamed video images, and a display for displaying the processed streamed video images. To perform processing of the streamed video images, the image processing unit is configured to: detect amplitude-modulated light sources in the streamed video images, classify the detected amplitude-modulated light sources into one of several possible classifications, select the streamed video images in which an amplitude-modulated light source is detected that flickers based upon the classification of the amplitude-modulated light source, and modify the selected streamed video images to correct for flicker of any amplitude-modulated light sources in the selected streamed video images.
Description
BACKGROUND OF THE INVENTION

The present invention generally relates to the processing of video images streamed to a display, and more specifically to the processing of streamed video images of scenes exterior to a vehicle. In some embodiments, the present invention pertains even more specifically to the processing of video images obtained from a rearward-facing camera in a vehicle and streamed to a display serving as a replacement for a rearview mirror.


SUMMARY OF THE INVENTION

According to one aspect of the invention, a display system is provided for a vehicle equipped with a camera for supplying streamed video images of a scene rearward of the vehicle. The display system comprises: an image processing unit for receiving the streamed video images and processing the streamed video images; and a display for displaying the processed streamed video images. To perform processing of the streamed video images, the image processing unit is configured to: detect amplitude-modulated light sources in the streamed video images, classify the detected amplitude-modulated light sources into one of several possible classifications, select the streamed video images in which an amplitude-modulated light source is detected that flickers based upon the classification of the amplitude-modulated light source, and modify the selected streamed video images to correct for flicker of any amplitude-modulated light sources in the selected streamed video images.


According to one aspect of the invention, a display system is provided that comprises: an image processing unit for receiving streamed video images and processing the streamed video images; and a display for displaying the processed streamed video images. To perform processing of the streamed video images, the image processing unit is configured to: detect amplitude-modulated light sources in the streamed video images, classify the detected amplitude-modulated light sources into at least two classes where a first class of detected amplitude-modulated light sources have a flicker not perceivable by a human when viewed directly by the human, and a second class of detected amplitude-modulated light sources have a flicker that is perceivable by a human when viewed directly by the human, track the detected amplitude-modulated light sources through image frames of the streamed video images, and modify the streamed video images in which an amplitude-modulated light source is detected that is classified in the first class by substituting pixels representing each of the detected amplitude-modulated light sources that is classified in the first class such that the pixels representing the detected amplitude-modulated light source are always at a state so that when the processed streamed video images are displayed, each of the detected amplitude-modulated light sources that is classified in the first class appears to have no perceivable flicker.


According to one aspect of the invention, a method of processing streamed video images is provided that comprises: detecting amplitude-modulated light sources in the streamed video images; classifying the detected amplitude-modulated light sources into at least two classes where a first class of detected amplitude-modulated light sources have a flicker not perceivable by a human when viewed directly by the human, and a second class of detected amplitude-modulated light sources have a flicker that is perceivable by a human when viewed directly by the human; tracking the detected amplitude-modulated light sources through image frames of the streamed video images; and modifying the streamed video images in which an amplitude-modulated light source is detected that is classified in the first class by substituting pixels representing each of the detected amplitude-modulated light sources that is classified in the first class such that the pixels representing the detected amplitude-modulated light source are always at a state so that when the processed streamed video images are displayed, each of the detected amplitude-modulated light sources that is classified in the first class appears to have no perceivable flicker.


These and other features, advantages, and objects of the present invention will be further understood and appreciated by those skilled in the art by reference to the following specification, claims, and appended drawings.





BRIEF DESCRIPTION OF THE DRAWINGS

In the drawings:



FIG. 1 is a block diagram of an imaging system according to an embodiment of the invention;



FIG. 2 is a flow chart showing the method steps performed by the image processing unit shown in FIG. 1;



FIG. 3 is a cut-away plan view of a vehicle in which the imaging system may be implemented;



FIG. 4A is a front and side perspective view of a vehicle rearview assembly in which various components of the imaging system may be implemented; and



FIG. 4B is a front elevational view of the vehicle rearview assembly shown in FIG. 4A.





DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS

Reference will now be made in detail to the present preferred embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numerals will be used throughout the drawings to refer to the same or like parts. In the drawings, the depicted structural elements are not to scale and certain components are enlarged relative to the other components for purposes of emphasis and understanding.


A common problem in rendering streamed video data captured from an imager occurs when the object being imaged is an amplitude-modulated (AM) light source. A very common example of this type of light source is one that pulses on and off at some periodic rate, such as a vehicle lamp assembly constructed with light emitting diodes (LEDs) in which the LEDs are pulse-width-modulated (PWM), pulse-width modulation being a subset of possible amplitude modulation methods. The PWM period and duty cycle result in the LEDs being turned on and off at some periodic rate, so a camera taking streamed images of such a lamp assembly will capture successive images in which an LED may be ‘on’ in one or more consecutive images and then ‘off’ in one or more subsequent images. Other examples of AM light sources include the flashers on an emergency vehicle (which may also be composed of PWM LEDs), a turn signal on a vehicle, and a fluorescent light source in a tunnel or parking garage.


For many of the exemplary AM light sources listed above, a human observer does not perceive any flicker in the ‘on/off’ pattern, since the frequency of the pattern is higher than the human vision system can perceive (PWM LED headlamp/tail lamp assemblies being a prime example). But when imaging the AM light element with an electronic camera system, the exposure time, frame rate, and shutter scheme (rolling or global) used when capturing the light element at a particular pixel in the imager array may result in some images showing that pixel imaging an ‘on’ state of the light element, and successive images showing the same pixel capturing the ‘off’ state of the light element. In rendering these images to a display at some display frame rate, the display system may end up presenting the human observer with an ‘on/off’ pattern that is discernible as a ‘flickering’ light.
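To make this sampling interaction concrete, the short Python sketch below (purely illustrative, not from the patent) simulates a 241 Hz, 50% duty-cycle PWM lamp sampled by a 60 fps camera, treating each exposure as an instantaneous sample; the slow phase drift between the lamp waveform and the frame grid yields long runs of ‘on’ captures followed by long runs of ‘off’ captures, i.e. a slow flicker at the display. The specific frequencies are assumptions chosen for illustration.

    # Illustrative only: a PWM lamp that shows no flicker to the eye can
    # alias to a slow on/off pattern once sampled at a camera frame rate.
    PWM_HZ, DUTY, FPS = 241.0, 0.5, 60.0   # assumed values

    def lamp_is_on(t_seconds):
        """True while the lamp's PWM waveform is in its 'on' phase."""
        return (t_seconds * PWM_HZ) % 1.0 < DUTY

    # State captured by each frame (exposure treated as an instant sample):
    captured = [lamp_is_on(n / FPS) for n in range(60)]
    # The waveform gains ~1/60 of a cycle per frame relative to the 60 fps
    # grid, so `captured` holds long runs of True then False: a roughly
    # 1 Hz flicker that the display would present to the observer.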



FIG. 1 shows an imaging system 10 according to a first embodiment. As shown, imaging system 10 includes a camera 26 that captures images of a scene and outputs streamed video images of the scene, and a display system 12. Display system 12 in turn includes an image processing unit 30, which receives the streamed video images, processes them (as discussed in detail below), and outputs the processed streamed video images, and a display 32 that displays the processed streamed video images.


The methods and processing sequences described herein are intended to mitigate the ‘flickering’ phenomenon seen in rendered AM headlamps and tail lamps (especially targeted to PWM LED assemblies, but not limited to lighting of that technology). As described below, the platform on which these methods may be implemented is part of an automotive mirror replacement system, where a vehicle mirror is replaced by camera (lens plus digital imager) 26, image processing unit (serial processor and/or ASIC/FPGA) 30, and electronic display (LCD/LED panel) 32. The methods described herein may be incorporated in the image processing unit 30 of the above system 10. As shown in FIG. 2, the method steps may be performed in the following sequence (as would occur on the image processing unit 30): 1) receiving the streamed video images (step 100); 2) detection of the PWM LED (or AM) lights in a succession of the streamed video images (step 102); 3) differentiation/classification of the PWM LED (or AM) elements (which are part of a headlamp or tail lamp assembly) from other illuminating objects in the scene that have time-varying brightness levels (e.g., emergency vehicle lights) (step 104); 4) tracking of the pulsed lights over time (step 106); 5) correction of the flicker artifact associated with these rendered lights in a way that is appropriate to the specific type of light source (step 108); and 6) supplying the processed streamed video images to display 32 (step 110). Possible techniques for each of these steps are detailed below, and a structural sketch of the sequence follows.
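As a structural illustration only, the following Python skeleton mirrors the FIG. 2 sequence. The callables detect, classify, track, and correct are hypothetical placeholders for the techniques detailed in the remainder of this description, passed in as parameters so the skeleton stays self-contained.

    def process_stream(frames, display, detect, classify, track, correct,
                       window=8):
        """Skeleton of the FIG. 2 sequence; the four callables are
        hypothetical stand-ins for the techniques described below."""
        history, tracks = [], {}
        for frame in frames:                   # step 100: receive images
            history = (history + [frame])[-window:]
            lights = detect(history)           # step 102: detect AM lights
            labeled = classify(lights, frame)  # step 104: classify sources
            tracks = track(tracks, labeled)    # step 106: track over time
            out = correct(frame, tracks)       # step 108: correct flicker
            display(out)                       # step 110: supply to display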


Multiple methods exist for performing step 102, the detection of time-varying lights in a sequence of captured images. In the problem area of a rearview mirror replacement system (based on an electronic camera 26, an image processing unit 30, and a display 32), the PWM LED lights that may need to be detected are those originating from vehicle headlamp and tail lamp systems. These lights belong to vehicles on the same roadway as the vehicle outfitted with the mirror replacement system. The search space for the PWM LED lights of interest thus can be narrowed by roadway detection, where an auto-aim or lane detection system can limit the light search space to the vertical region above the detected road boundaries (from a lane detection system) or around the focus of expansion (from an auto-aim system), and the lights of interest can be discriminated from stationary non-vehicle light sources. In this reduced search space, methods exist in high beam control systems to detect PWM LED lights, as disclosed in commonly-owned U.S. Pat. Nos. 6,587,573; 6,593,698; 6,611,610; 6,631,316; 6,653,614; 6,728,393; 6,774,988; 6,861,809; 6,906,467; 6,947,577; 7,321,112; 7,417,221; 7,565,006; 7,567,291; 7,653,215; 7,683,326; 7,881,839; 8,045,760; 8,120,652; and 8,543,254, the entire disclosures of which are incorporated herein by reference.
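As a minimal illustration of this search-space narrowing, the helper below keeps only the image region above a supplied road-boundary row; the horizon_row input is a hypothetical value assumed to come from a lane-detection or auto-aim module.

    import numpy as np

    def light_search_window(frame, horizon_row, margin=10):
        """Crop to the region above the detected road boundary (row 0 is
        the top of the image), plus a small margin below it. Both
        horizon_row and margin are hypothetical/illustrative inputs."""
        bottom = min(frame.shape[0], horizon_row + margin)
        return frame[:bottom, :]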


Additionally, detection methods such as frame subtraction may be used for detecting time-varying light sources, where successive images are subtracted from one another to produce temporal difference maps. The resultant maps are then processed by routines (implemented in software or in ASIC/FPGA fabric) that perform some combination of thresholding and/or filtering to identify spatial areas in the map where there were significant changes in pixel brightness between the two source images. The absolute value of the difference data indicates the magnitude of the change in pixel intensity between frames, and the sign of the difference data indicates whether the change in a pixel value between frames is associated with a light source brightening or darkening. The frame data used to generate these temporal difference maps may be raw data from a Bayer-patterned image, luminance data extracted from the image, or some other image form extracted from the image processing path. On a typical roadway scene, the most significant deltas in pixel values between a pair of frames (referenced to a single pixel location) tend to be related to these PWM LED (AM) lights, which are going from extremely bright to fully off. Motion artifacts can also contribute to temporal changes in image values at the pixel locations, but in the search space of the roadway imaged by the vehicle this motion is quite small, as the image capture rate is rapid compared to vehicle dynamics, and the brightness changes related to objects that do not produce their own illumination are also quite small (imaging a vehicle body at a pixel in one frame and a part of the vehicle bumper in the next does not produce as significant a luminance change as the PWM LED exhibits in its on/off sequencing).
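The following numpy sketch shows the frame-subtraction detection just described, assuming aligned 8-bit luminance frames; the 40-count threshold is an illustrative value, not one given in the patent.

    import numpy as np

    def temporal_difference_map(prev, curr, threshold=40):
        """Signed per-pixel difference between two 8-bit luminance frames.
        The magnitude of `signed` gives the size of the brightness change,
        its sign tells whether the pixel brightened (+) or darkened (-),
        and `mask` flags changes exceeding the (illustrative) threshold."""
        signed = curr.astype(np.int16) - prev.astype(np.int16)
        mask = np.abs(signed) > threshold
        return mask, signed

    # Example: a small light region switching 'on' between two frames.
    prev = np.zeros((4, 4), dtype=np.uint8)
    curr = prev.copy()
    curr[1:3, 1:3] = 200
    mask, signed = temporal_difference_map(prev, curr)  # mask is True there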


Other methods of detecting the presence of AM lights may be leveraged from the imager implementation, where some imagers may supply information (to the pixel level) on whether the scene brightness changed state during the pixel exposure time (especially for an imager such as an HDR CMOS imager).


As described below, the methods for correctly rendering pulsed lights tend to fall into the categories of adding image content to ‘brighten’ the pulsed light location for durations when the light is captured as ‘off’, and of addressing incorrect color measurements induced by the time-varying nature of the lights. The classification operation (step 104) is applied to discriminate between the types of time-varying light sources that introduce the brightness and/or color errors. To ensure that only the desired pulsed lights are corrected (and not, for example, motion artifacts), light source classification may be performed to influence the correction step 108. Methods of classifying PWM LED lights are known in high beam control systems such as those disclosed in commonly-owned U.S. Pat. Nos. 6,587,573; 6,593,698; 6,611,610; 6,631,316; 6,653,614; 6,728,393; 6,774,988; 6,861,809; 6,906,467; 6,947,577; 7,321,112; 7,417,221; 7,565,006; 7,567,291; 7,653,215; 7,683,326; 7,881,839; 8,045,760; 8,120,652; and 8,543,254, the entire disclosures of which are incorporated herein by reference. However, other options exist, including the use of temporal changes, color, brightness, and location. The options for using brightness and color for classification are greatly enhanced by the use of a Bayer patterned, High Dynamic Range (HDR) imager in the camera system, since bright objects are not saturated with an HDR imager, and the Bayer pattern contributions can be demosaiced to determine the color of very bright lights. Object detection systems that classify vehicles can also be used to influence the classification of PWM LED headlamps/tail lamps by limiting search windows to areas associated with the identified vehicles.


Basically, the classification can be used to distinguish flickering lights whose flicker is humanly perceivable when the lights are viewed directly from those whose flicker is not humanly perceivable when the lights are viewed directly. This way, the images of the light sources may be selectively modified based upon such classification so that the light sources will appear in the displayed scenes as they would otherwise appear to a human viewing the lights directly.
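A hedged sketch of this two-class split is shown below; it assumes the modulation frequency of each tracked light has already been estimated, and the 90 Hz flicker-fusion boundary is an assumption for illustration, since the patent does not commit to a specific threshold frequency.

    # Class 1: flicker too fast to see directly (correct it in the video).
    # Class 2: flicker slow enough to be meaningful (leave it alone).
    FLICKER_FUSION_HZ = 90.0   # assumed perceivability boundary

    def flicker_class(estimated_hz):
        """Return 1 (imperceptible when viewed directly) or 2
        (perceptible); the threshold value is illustrative."""
        return 1 if estimated_hz >= FLICKER_FUSION_HZ else 2

    assert flicker_class(250.0) == 1   # e.g., a PWM LED tail lamp
    assert flicker_class(1.5) == 2     # e.g., a turn signal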


The step of temporal tracking of pulsed lights (step 106) can be performed using the techniques for tracking vehicle lights as described in known high beam control systems such as those disclosed in commonly-owned U.S. Pat. Nos. 6,587,573; 6,593,698; 6,611,610; 6,631,316; 6,653,614; 6,728,393; 6,774,988; 6,861,809; 6,906,467; 6,947,577; 7,321,112; 7,417,221; 7,565,006; 7,567,291; 7,653,215; 7,683,326; 7,881,839; 8,045,760; 8,120,652; 8,543,254; and 9,185,363, the entire disclosures of which are incorporated herein by reference. This temporal and spatial tracking is useful when selectively modifying the images in order to brighten pixels corresponding to the expected location of the flickering light source in those modified images.
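The cited patents describe the tracking in detail; purely as an illustrative stand-in (not the cited method), the sketch below associates each existing track with its nearest detection by centroid distance.

    import numpy as np

    def associate(track_pts, det_pts, max_dist=12.0):
        """Greedy nearest-centroid association of detections to tracks.
        `track_pts` and `det_pts` are float arrays of (x, y) centroids;
        returns (track_index, detection_index) pairs. An illustrative
        stand-in only; the distance gate is an assumed value."""
        pairs, used = [], set()
        if len(det_pts) == 0:
            return pairs
        for ti, t in enumerate(track_pts):
            d2 = np.sum((det_pts - t) ** 2, axis=1)
            di = int(np.argmin(d2))
            if di not in used and d2[di] <= max_dist ** 2:
                pairs.append((ti, di))
                used.add(di)
        return pairs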


Step 108 involves resolving light flicker for rendering on display 32. With the AM (or pulsed) lights that need to be addressed for display flicker reduction identified, flicker reduction can be performed by substituting low pixel values (from ‘off’ situations) with values that correspond to levels associated with ‘on’ situations. The pixel value replacements can be performed at the raw level (a Bayer pattern color associated with the replaced pixel), or at a later processing step in the processing subsystem. There are advantages to performing pixel replacement at the post-demosaic step, where color can be preserved for the PWM light by creating the correct balance of red, green, and blue contributions. To maintain the displayed boundaries of AM light objects when pixel substitution is being performed, some image processing steps may be used to predict the object outline in an upcoming frame by using the tracking information of step 106 and an object shape detection routine.


Alternatively, the temporal difference maps from the detection step can be used to define the region of pixels to be substituted (since they represent the pixels that have changed state between frames), with better results possible from using maps that incorporate more than just a two-frame difference. One possible implementation would involve creating difference maps of pixel values (by location) across sequences of frames, then replacing pixel values that have been determined to be producing images of pulsed PWM LED lights with an average of the highest M values in an N-frame sequence (M less than N), if that average exceeds some threshold; a minimal sketch of this computation follows. If analysis of overall image luminance and object color is used, this replacement method may also stand in for the separate PWM LED detection, classification, tracking, and replacement steps.
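The sketch referenced above assumes a stack of N aligned 8-bit luminance frames; M and the ‘on’ threshold are illustrative values, and a real implementation would confine the substitution to the pixel regions flagged by the difference maps rather than applying the threshold globally.

    import numpy as np

    def top_m_replacement(stack, m=3, on_threshold=120):
        """Fill flickering pixels in the newest frame of an (N, H, W)
        stack: per pixel, average the M brightest samples across the N
        frames and, where that average exceeds the (illustrative)
        threshold, substitute it so the light renders in its 'on' state."""
        top_m = np.sort(stack, axis=0)[-m:]    # M brightest samples/pixel
        on_level = top_m.mean(axis=0)
        out = stack[-1].astype(np.float32).copy()
        fill = on_level > on_threshold
        out[fill] = on_level[fill]
        return out.astype(stack.dtype)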


A forward-facing turn signal is one example of a time-varying light source that could be detected, classified, and corrected using the ideas disclosed here. Unlike a PWM light, whose row-to-row values on the imager may vary greatly due to beat frequencies, a turn signal blinks at a frequency significantly lower than a camera's frame rate (1-2 Hz as opposed to 15-120 Hz). This results in areas of the light turning on and off at the turn signal's frequency. This spatial consistency within the boundaries of the light, coupled with a detected frequency indicative of a turn signal and a yellowish hue, could allow classification of a light as a turn signal as it is tracked. Once the system knows what kind of light it is, it can correct the light's rendering by increasing its yellow saturation. This creates more visual appeal but leaves the on/off behavior of the light alone.
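The patent gives no formula for this saturation boost, so the helper below is an assumption-laden sketch: it raises a turn-signal pixel's saturation in HSV space while preserving hue and brightness, and deliberately does nothing about the blinking itself. The 1.3 gain is illustrative.

    import colorsys

    def boost_yellow_saturation(rgb, gain=1.3):
        """Raise the saturation of an 8-bit RGB pixel classified as part
        of a turn signal; hue and value are preserved. The gain is an
        illustrative assumption."""
        r, g, b = (c / 255.0 for c in rgb)
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        return tuple(round(c * 255)
                     for c in colorsys.hsv_to_rgb(h, min(1.0, s * gain), v))

    boost_yellow_saturation((220, 190, 90))  # a muted yellow becomes richer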


A PWM LED tail lamp is a difficult object to image and visualize correctly because it is typically not much brighter than the background. In addition, with rolling shutter cameras, each row of pixels may have a sharply different level of brightness, and this can be exacerbated by the spatial effects of the Bayer filter, leading to many artifacts in both chrominance and luminance. However, characteristics such as row-to-row variation, wildly different local colors, and colors and intensities that change drastically from frame to frame, in addition to cues such as location in the image, a predominance of brighter red pixels, motion toward the focus of expansion, and frequency estimation of the light modulation, could allow classification of these lights with high accuracy. Correcting PWM LED tail lamps could be performed by rendering their colors as a uniformly saturated red while choosing a luminance from the detected range, which would be visually appealing and would remove harsh artifacts.
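A hedged sketch of that correction follows: pixels inside the lamp's tracked mask are repainted as a uniform saturated red at a luminance drawn from the detected range. The weighting toward the bright end of the range is an assumption for illustration.

    import numpy as np

    def paint_tail_lamp(frame, lamp_mask, lum_range):
        """Render a classified PWM LED tail lamp as uniform saturated red.
        `frame` is an (H, W, 3) RGB image, `lamp_mask` a boolean (H, W)
        mask from detection/tracking, and `lum_range` the (min, max)
        luminance observed for the lamp; the 25/75 weighting toward the
        bright end is illustrative."""
        lo, hi = lum_range
        level = int(0.25 * lo + 0.75 * hi)
        out = frame.copy()
        out[lamp_mask] = (level, 0, 0)   # saturated red, uniform hue
        return out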


Referring now to FIG. 3, a schematic diagram of a vehicular implementation of the above embodiment is shown. A vehicle 20 is shown that is driven by operator 22. One or more cameras 26 are operative to view a scene 24. In the example shown, scene 24 is generally behind vehicle 20. However, camera 26 may be oriented in a variety of ways to view scenes at other locations about vehicle 20 including, but not limited to, the sides, back, front, bottom, top, and inside. In the example shown, signals representative of the scene are sent via channel 28 to an image processing unit 30. Image processing unit 30 produces an enhanced image of scene 24 on one or more displays 32. Input from an optional ambient light sensor 34 and one or more direct glare sensors 36 is also available to image processing unit 30.


In a particularly useful embodiment, a rearview assembly 50 (FIGS. 4A and 4B) is augmented or replaced by imaging system 10 having cameras 26 that cover a wide field of view to the back and sides, so that pedestrians or other objects directly behind vehicle 20 may be seen and so that oncoming traffic from the sides may be seen. The system is designed so that, when backing out of a parking spot, oncoming vehicles may be seen before backing into the lane of travel. This requires a camera system 26 with a near-180° field of view or several camera systems 26 mounted near the rear of the vehicle. An analogous system with a camera or cameras 26 mounted near the front of the vehicle 20 is adapted to view cross traffic at a “blind” intersection before entering the lane of travel of the cross traffic. These are desirable applications of the present invention that supplement the viewing function of conventional rearview mirrors.



FIGS. 4A and 4B show an example of a rearview assembly 50 having a housing 54 with a display 32 and an optional mirror element 52 positioned in front of the display 32. A user switch 56 may optionally be provided for tilting of the mirror element 52 and/or display 32 to reduce glare on the display 32 when activated. Examples of such a rearview assembly 50 are known and are disclosed in commonly-owned U.S. Patent Application Publication Nos. 2015/0219967 A1, 2015/0266427 A1, and 2015/0277203 A1, the entire disclosures of which are incorporated herein by reference. The optional ambient light sensor 34 and a direct glare sensor 36 may be incorporated in rearview assembly 50 as is known in the art. Further, image processing unit 30 may be disposed in the rearview assembly 50. Rearview assembly 50 may be an interior rearview assembly as shown in FIGS. 4A and 4B, or may be an exterior rearview assembly.


The above description is considered that of the preferred embodiments only. Modifications of the invention will occur to those skilled in the art and to those who make or use the invention. Therefore, it is understood that the embodiments shown in the drawings and described above are merely for illustrative purposes and not intended to limit the scope of the invention, which is defined by the claims as interpreted according to the principles of patent law, including the doctrine of equivalents.

Claims
  • 1. A display system for a vehicle equipped with a camera for supplying streamed video images of a scene rearward of the vehicle, the display system comprising: an image processing unit for receiving the streamed video images and processing the streamed video images; and a display for displaying the processed streamed video images, wherein to perform processing of the streamed video images, the image processing unit is configured to: detect amplitude-modulated light sources in the streamed video images, classify the detected amplitude-modulated light sources into one of several possible classifications, and process the detected amplitude-modulated light sources differently based upon the classification of the amplitude-modulated light source including: select the streamed video images in which an amplitude-modulated light source is detected that flickers based upon the classification of the amplitude-modulated light source, and modify the selected streamed video images to correct for flicker of any amplitude-modulated light sources in the selected streamed video images.
  • 2. The display system of claim 1, wherein the image processing unit modifies the selected streamed video images such that the pixels representing each of the detected amplitude-modulated light sources are maintained at a state so that when the processed streamed video images are displayed, each of the detected amplitude-modulated light sources that is represented by the pixels appears to have no perceivable flicker.
  • 3. The display system of claim 2, wherein each of the detected amplitude-modulated light sources is maintained by substituting low pixel values from off periods with higher pixel values from on periods.
  • 4. The display system of claim 1, wherein the image processing unit is further configured to track the detected amplitude-modulated light sources through image frames of the streamed video images.
  • 5. The display system of claim 4, wherein the image processing unit modifies the selected streamed video images such that the pixels representing each of the detected amplitude-modulated light sources are maintained at a state so that when the processed streamed video images are displayed, each of the detected amplitude-modulated light sources that is represented by the pixels appears to have no perceivable flicker and appears at the expected locations in the images based upon the tracking of each of the detected amplitude-modulated light sources.
  • 6. The display system of claim 1, wherein the image processing unit does not modify the streamed video images to correct for flicker from light sources classified as a turn signal or emergency vehicle light.
  • 7. The display system of claim 1, wherein the image processing unit classifies the detected amplitude-modulated light sources into at least two classes where a first class of detected amplitude-modulated light sources has a flicker not perceivable by a human when viewed directly by the human, and a second class of detected amplitude-modulated light sources has a flicker that is perceivable by a human when viewed directly by the human.
  • 8. The display system of claim 7, wherein the streamed video images in which an amplitude-modulated light source is detected that is classified in the first class is modified by substituting pixels representing each of the detected amplitude-modulated light sources that is classified in the first class such that the pixels representing each of the detected amplitude-modulated light sources are always at a state so that when the processed streamed video images are displayed, the detected amplitude-modulated light source that is classified in the first class appears to have no perceivable flicker.
  • 9. The display system of claim 7, wherein the image processing unit classifies the detected amplitude-modulated light sources into the first class when a frequency of the flicker in the light sources is above a threshold frequency and classifies the detected amplitude-modulated light sources into the second class when a frequency of the flicker in the light sources is below the threshold frequency.
  • 10. A rearview assembly for mounting to the vehicle, the rearview assembly comprising the display system of claim 1.
  • 11. A display system comprising: an image processing unit for receiving streamed video images and processing the streamed video images; and a display for displaying the processed streamed video images, wherein to perform processing of the streamed video images, the image processing unit is configured to: detect amplitude-modulated light sources in the streamed video images, classify the detected amplitude-modulated light sources into at least two classes where a first class of detected amplitude-modulated light sources have a flicker not perceivable by a human when viewed directly by the human, and a second class of detected amplitude-modulated light sources have a flicker that is perceivable by a human when viewed directly by the human, and process the detected amplitude-modulated light sources differently based upon the classification of the amplitude-modulated light source including: track the detected amplitude-modulated light sources through image frames of the streamed video images, modify the streamed video images in which an amplitude-modulated light source is detected that is classified in the first class by substituting pixels representing each of the detected amplitude-modulated light sources that is classified in the first class such that the pixels representing the detected amplitude-modulated light source are always at a state so that when the processed streamed video images are displayed, each of the detected amplitude-modulated light sources that is classified in the first class appears to have no perceivable flicker, and not modify the streamed video images to correct for flicker from light sources classified in the second class.
  • 12. The display system of claim 11, wherein the image processing unit classifies the detected amplitude-modulated light sources into the first class when a frequency of the flicker in the light sources is above a threshold frequency and classifies the detected amplitude-modulated light sources into the second class when a frequency of the flicker in the light sources is below the threshold frequency.
  • 13. The display system of claim 12, wherein each of the detected amplitude-modulated light sources is maintained by substituting low pixel values from off periods with higher pixel values from on periods.
  • 14. The display system of claim 11, wherein light sources classified in the second class include turn signals and emergency vehicle lights.
  • 15. A rearview assembly for mounting to the vehicle, the rearview assembly comprising the display system of claim 11.
  • 16. A method of processing streamed video images comprising: detecting amplitude-modulated light sources in the streamed video images; classifying the detected amplitude-modulated light sources into at least two classes where a first class of detected amplitude-modulated light sources have a flicker not perceivable by a human when viewed directly by the human, and a second class of detected amplitude-modulated light sources have a flicker that is perceivable by a human when viewed directly by the human; processing the detected amplitude-modulated light sources differently based upon the classification of the amplitude-modulated light source including: tracking the detected amplitude-modulated light sources through image frames of the streamed video images; modifying the streamed video images in which an amplitude-modulated light source is detected that is classified in the first class by substituting pixels representing each of the detected amplitude-modulated light sources that is classified in the first class such that the pixels representing the detected amplitude-modulated light source are always at a state so that when the processed streamed video images are displayed, each of the detected amplitude-modulated light sources that is classified in the first class appears to have no perceivable flicker; and not correcting the light sources classified in the second class for flicker.
  • 17. The method of claim 16, wherein the detected amplitude-modulated light sources are classified into the first class when a frequency of the flicker in the light sources is above a threshold frequency and are classified into the second class when a frequency of the flicker in the light sources is below the threshold frequency.
  • 18. The method of claim 16, wherein light sources classified in the second class include turn signals and emergency vehicle lights.
CROSS-REFERENCE TO RELATED APPLICATION

This application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/182,863, entitled “SYSTEM AND METHOD FOR PROCESSING STREAMED VIDEO IMAGES TO CORRECT FOR FLICKER OF AMPLITUDE-MODULATED LIGHTS,” filed on Jun. 22, 2015, by Gregory S. Bush et al., the entire disclosure of which is incorporated herein by reference.

US Referenced Citations (465)
Number Name Date Kind
2131888 Harris Oct 1938 A
2632040 Rabinow Mar 1953 A
2827594 Rabinow Mar 1958 A
3179845 Kulwiec Apr 1965 A
3581276 Newman May 1971 A
3663819 Hicks et al. May 1972 A
4109235 Bouthors Aug 1978 A
4139801 Linares Feb 1979 A
4151526 Hinachi et al. Apr 1979 A
4214266 Myers Jul 1980 A
4236099 Rosenblum Nov 1980 A
4257703 Goodrich Mar 1981 A
4258979 Mahin Mar 1981 A
4277804 Robison Jul 1981 A
4286308 Wolff Aug 1981 A
4310851 Pierrat Jan 1982 A
4357558 Massoni et al. Nov 1982 A
4376909 Tagami et al. Mar 1983 A
4479173 Rumpakis Oct 1984 A
4499451 Suzuki et al. Feb 1985 A
4599544 Martin Jul 1986 A
4638287 Umebayashi et al. Jan 1987 A
4645975 Meitzler et al. Feb 1987 A
4665321 Chang et al. May 1987 A
4665430 Hiroyasu May 1987 A
4692798 Seko et al. Sep 1987 A
4716298 Etoh Dec 1987 A
4727290 Smith et al. Feb 1988 A
4740838 Mase et al. Apr 1988 A
4768135 Kretschmer et al. Aug 1988 A
4862037 Farber et al. Aug 1989 A
4891559 Matsumoto et al. Jan 1990 A
4910591 Petrossian et al. Mar 1990 A
4930742 Schofield et al. Jun 1990 A
4934273 Endriz Jun 1990 A
4967319 Seko Oct 1990 A
5005213 Hanson et al. Apr 1991 A
5008946 Ando Apr 1991 A
5027200 Petrossian et al. Jun 1991 A
5036437 Macks Jul 1991 A
5072154 Chen Dec 1991 A
5086253 Lawler Feb 1992 A
5096287 Kakinami et al. Mar 1992 A
5121200 Choi et al. Jun 1992 A
5124549 Michaels et al. Jun 1992 A
5166681 Bottesch et al. Nov 1992 A
5182502 Slotkowski et al. Jan 1993 A
5187383 Taccetta et al. Feb 1993 A
5197562 Kakinami et al. Mar 1993 A
5230400 Kakinami et al. Jul 1993 A
5235178 Hegyi Aug 1993 A
5243417 Pollard Sep 1993 A
5289321 Secor Feb 1994 A
5296924 Blancard et al. Mar 1994 A
5304980 Maekawa Apr 1994 A
5329206 Slotkowski et al. Jul 1994 A
5347261 Adell Sep 1994 A
5347459 Greenspan et al. Sep 1994 A
5355146 Chiu et al. Oct 1994 A
5379104 Takao Jan 1995 A
5386285 Asayama Jan 1995 A
5396054 Krichever et al. Mar 1995 A
5402170 Parulski et al. Mar 1995 A
5408357 Beukema Apr 1995 A
5414461 Kishi et al. May 1995 A
5416318 Hegyi May 1995 A
5418610 Fischer May 1995 A
5424952 Asayama Jun 1995 A
5426294 Kobayashi et al. Jun 1995 A
5428464 Silverbrook Jun 1995 A
5430450 Holmes Jul 1995 A
5434407 Bauer et al. Jul 1995 A
5451822 Bechtel et al. Sep 1995 A
5452004 Roberts Sep 1995 A
5469298 Suman et al. Nov 1995 A
5471515 Fossum et al. Nov 1995 A
5475441 Parulski et al. Dec 1995 A
5475494 Nishida et al. Dec 1995 A
5481268 Higgins Jan 1996 A
5483346 Butzer Jan 1996 A
5483453 Uemura et al. Jan 1996 A
5485155 Hibino Jan 1996 A
5485378 Franke et al. Jan 1996 A
5488496 Pine Jan 1996 A
5508592 Lapatovich et al. Apr 1996 A
5515448 Nishitani May 1996 A
5523811 Wada et al. Jun 1996 A
5530421 Marshall et al. Jun 1996 A
5535144 Kise Jul 1996 A
5537003 Bechtel et al. Jul 1996 A
5541590 Nishio Jul 1996 A
5541724 Hoashi Jul 1996 A
5550677 Schofield et al. Aug 1996 A
5554912 Thayer et al. Sep 1996 A
5574443 Hsieh Nov 1996 A
5574463 Shirai et al. Nov 1996 A
5576975 Sasaki et al. Nov 1996 A
5587929 League et al. Dec 1996 A
5592146 Kover, Jr. et al. Jan 1997 A
5602542 Widmann et al. Feb 1997 A
5614788 Mullins et al. Mar 1997 A
5615023 Yang Mar 1997 A
5617085 Tsutsumi et al. Apr 1997 A
5621460 Hatlestad et al. Apr 1997 A
5634709 Iwama Jun 1997 A
5642238 Sala Jun 1997 A
5646614 Abersfelder et al. Jul 1997 A
5650765 Park Jul 1997 A
5660454 Mori et al. Aug 1997 A
5666028 Bechtel et al. Sep 1997 A
5670935 Schofield et al. Sep 1997 A
5680123 Lee Oct 1997 A
5684473 Hibino et al. Nov 1997 A
5707129 Kobayashi Jan 1998 A
5708410 Blank et al. Jan 1998 A
5708857 Ishibashi Jan 1998 A
5710565 Shirai et al. Jan 1998 A
5714751 Chen Feb 1998 A
5715093 Schierbeek et al. Feb 1998 A
5729194 Spears et al. Mar 1998 A
5736816 Strenke et al. Apr 1998 A
5745050 Nakagawa Apr 1998 A
5751211 Shirai et al. May 1998 A
5751832 Panter et al. May 1998 A
5754099 Nishimura et al. May 1998 A
5760828 Cortes Jun 1998 A
5764139 Nojima et al. Jun 1998 A
5767793 Agravante et al. Jun 1998 A
5781105 Bitar et al. Jul 1998 A
5786787 Eriksson et al. Jul 1998 A
5793308 Rosinski et al. Aug 1998 A
5793420 Schmidt Aug 1998 A
5796094 Schofield et al. Aug 1998 A
5798727 Shirai et al. Aug 1998 A
5811888 Hsieh Sep 1998 A
5812321 Schierbeek et al. Sep 1998 A
5837994 Stam et al. Nov 1998 A
5841126 Fossum et al. Nov 1998 A
5844505 Van Ryzin Dec 1998 A
5845000 Breed et al. Dec 1998 A
5850176 Kinoshita et al. Dec 1998 A
5867214 Anderson et al. Feb 1999 A
5877897 Schofield et al. Mar 1999 A
5883739 Ashihara et al. Mar 1999 A
5904729 Ruzicka May 1999 A
5905457 Rashid May 1999 A
5912534 Benedict Jun 1999 A
5923027 Stam et al. Jul 1999 A
5935613 Benham et al. Aug 1999 A
5940011 Agravante et al. Aug 1999 A
5942853 Piscart Aug 1999 A
5949331 Schofield et al. Sep 1999 A
5956079 Ridgley Sep 1999 A
5956181 Lin Sep 1999 A
5959555 Furuta Sep 1999 A
5990469 Bechtel et al. Nov 1999 A
6008486 Stam et al. Dec 1999 A
6009359 El-Hakim et al. Dec 1999 A
6018308 Shirai Jan 2000 A
6025872 Ozaki et al. Feb 2000 A
6046766 Sakata Apr 2000 A
6049171 Stam et al. Apr 2000 A
6060989 Gehlot May 2000 A
6061002 Weber et al. May 2000 A
6067111 Hahn et al. May 2000 A
6072391 Suzuki et al. Jun 2000 A
6078355 Zengel Jun 2000 A
6097023 Schofield et al. Aug 2000 A
6102546 Carter Aug 2000 A
6106121 Buckley et al. Aug 2000 A
6111498 Jobes et al. Aug 2000 A
6115651 Cruz Sep 2000 A
6122597 Saneyoshi et al. Sep 2000 A
6128576 Nishimoto et al. Oct 2000 A
6130421 Bechtel et al. Oct 2000 A
6130448 Bauer et al. Oct 2000 A
6140933 Bugno et al. Oct 2000 A
6144158 Beam Nov 2000 A
6151065 Steed et al. Nov 2000 A
6151539 Bergholz et al. Nov 2000 A
6154149 Tychkowski et al. Nov 2000 A
6157294 Urai et al. Dec 2000 A
6166629 Andreas Dec 2000 A
6166698 Turnbull et al. Dec 2000 A
6167755 Damson et al. Jan 2001 B1
6172600 Kakinami et al. Jan 2001 B1
6172601 Wada et al. Jan 2001 B1
6175300 Kendrick Jan 2001 B1
6184781 Ramakesavan Feb 2001 B1
6185492 Kagawa et al. Feb 2001 B1
6191704 Takenaga et al. Feb 2001 B1
6200010 Anders Mar 2001 B1
6218934 Regan Apr 2001 B1
6222447 Schofield et al. Apr 2001 B1
6249214 Kashiwazaki Jun 2001 B1
6250766 Strumolo et al. Jun 2001 B1
6255639 Stam et al. Jul 2001 B1
6259475 Ramachandran et al. Jul 2001 B1
6265968 Betzitza et al. Jul 2001 B1
6268803 Gunderson et al. Jul 2001 B1
6269308 Kodaka et al. Jul 2001 B1
6281632 Stam et al. Aug 2001 B1
6281804 Haller et al. Aug 2001 B1
6289332 Menig et al. Sep 2001 B2
6300879 Regan et al. Oct 2001 B1
6304173 Pala et al. Oct 2001 B2
6317057 Lee Nov 2001 B1
6320612 Young Nov 2001 B1
6324295 Avonique et al. Nov 2001 B1
6329925 Skiver et al. Dec 2001 B1
6330511 Ogura et al. Dec 2001 B2
6335680 Matsuoka Jan 2002 B1
6344805 Yasui et al. Feb 2002 B1
6348858 Weis et al. Feb 2002 B2
6349782 Sekiya et al. Feb 2002 B1
6356206 Takenaga et al. Mar 2002 B1
6356376 Tonar et al. Mar 2002 B1
6357883 Strumolo et al. Mar 2002 B1
6363326 Scully Mar 2002 B1
6369701 Yoshida et al. Apr 2002 B1
6379013 Bechtel et al. Apr 2002 B1
6396040 Hill May 2002 B1
6396397 Bos et al. May 2002 B1
6403942 Stam Jun 2002 B1
6408247 Ichikawa et al. Jun 2002 B1
6412959 Tseng Jul 2002 B1
6415230 Maruko et al. Jul 2002 B1
6421081 Markus Jul 2002 B1
6424272 Gutta et al. Jul 2002 B1
6424273 Gutta et al. Jul 2002 B1
6424892 Matsuoka Jul 2002 B1
6428172 Hutzel et al. Aug 2002 B1
6433680 Ho Aug 2002 B1
6437688 Kobayashi Aug 2002 B1
6438491 Farmer Aug 2002 B1
6441872 Ho Aug 2002 B1
6442465 Breed et al. Aug 2002 B2
6443602 Tanabe et al. Sep 2002 B1
6447128 Lang et al. Sep 2002 B1
6452533 Yamabuchi et al. Sep 2002 B1
6463369 Sadano et al. Oct 2002 B2
6465962 Fu et al. Oct 2002 B1
6466701 Ejiri et al. Oct 2002 B1
6469739 Bechtel et al. Oct 2002 B1
6472977 Pochmuller Oct 2002 B1
6473001 Blum Oct 2002 B1
6476731 Miki et al. Nov 2002 B1
6476855 Yamamoto Nov 2002 B1
6483429 Yasui et al. Nov 2002 B1
6483438 DeLine et al. Nov 2002 B2
6487500 Lemelson et al. Nov 2002 B2
6491416 Strazzanti Dec 2002 B1
6498620 Schofield et al. Dec 2002 B2
6501387 Skiver et al. Dec 2002 B2
6507779 Breed et al. Jan 2003 B2
6515581 Ho Feb 2003 B1
6515597 Wada et al. Feb 2003 B1
6520667 Mousseau Feb 2003 B1
6522969 Kannonji Feb 2003 B2
6542085 Yang Apr 2003 B1
6542182 Chutorash Apr 2003 B1
6545598 De Villeroche Apr 2003 B1
6550943 Strazzanti Apr 2003 B2
6553130 Lemelson et al. Apr 2003 B1
6558026 Strazzanti May 2003 B2
6559761 Miller et al. May 2003 B1
6572233 Northman et al. Jun 2003 B1
6580373 Ohashi Jun 2003 B1
6581007 Hasegawa et al. Jun 2003 B2
6583730 Lang et al. Jun 2003 B2
6575643 Takahashi Jul 2003 B2
6587573 Stam et al. Jul 2003 B1
6591192 Okamura et al. Jul 2003 B2
6594583 Ogura et al. Jul 2003 B2
6594614 Studt et al. Jul 2003 B2
6611202 Schofield et al. Aug 2003 B2
6611227 Nebiyeloul-Kifle Aug 2003 B1
6611610 Stam et al. Aug 2003 B1
6611759 Brosche Aug 2003 B2
6614387 Deadman Sep 2003 B1
6616764 Kramer et al. Sep 2003 B2
6617564 Ockerse et al. Sep 2003 B2
6618672 Sasaki et al. Sep 2003 B2
6630888 Lang et al. Oct 2003 B2
6631316 Stam et al. Oct 2003 B2
6636258 Strumolo Oct 2003 B2
6642840 Lang et al. Nov 2003 B2
6642851 Deline et al. Nov 2003 B2
6648477 Hutzel et al. Nov 2003 B2
6665592 Kodama Dec 2003 B2
6670207 Roberts Dec 2003 B1
6670910 Delcheccolo et al. Dec 2003 B2
6674370 Rodewald et al. Jan 2004 B2
6675075 Engelsberg et al. Jan 2004 B1
6677986 Pöchmüller Jan 2004 B1
6683539 Trajkovic et al. Jan 2004 B2
6683969 Nishigaki et al. Jan 2004 B1
6690268 Schofield et al. Feb 2004 B2
6690413 Moore Feb 2004 B1
6693517 Mccarty et al. Feb 2004 B2
6693518 Kumata Feb 2004 B2
6693519 Keirstead Feb 2004 B2
6693524 Payne Feb 2004 B1
6717610 Bos et al. Apr 2004 B1
6727808 Uselmann et al. Apr 2004 B1
6727844 Zimmermann et al. Apr 2004 B1
6731332 Yasui et al. May 2004 B1
6734807 King May 2004 B2
6737964 Samman et al. May 2004 B2
6738088 Uskolovsky et al. May 2004 B1
6744353 Sjonell Jun 2004 B2
6772057 Breed et al. Aug 2004 B2
6774988 Stam et al. Aug 2004 B2
6846098 Bourdelais et al. Jan 2005 B2
6847487 Burgner Jan 2005 B2
6861809 Stam Mar 2005 B2
6902307 Strazzanti Jun 2005 B2
6912001 Okamoto et al. Jun 2005 B2
6913375 Strazzanti Jul 2005 B2
6923080 Dobler et al. Aug 2005 B1
6930737 Weindorf et al. Aug 2005 B2
6946978 Schofield Sep 2005 B2
7012543 Deline et al. Mar 2006 B2
7038577 Pawlicki et al. May 2006 B2
7046448 Burgner May 2006 B2
7175291 Li Feb 2007 B1
7255465 Deline et al. Aug 2007 B2
7262406 Heslin et al. Aug 2007 B2
7265342 Heslin et al. Sep 2007 B2
7292208 Park et al. Nov 2007 B1
7311428 Deline et al. Dec 2007 B2
7321112 Stam et al. Jan 2008 B2
7360932 Uken et al. Apr 2008 B2
7417221 Creswick et al. Aug 2008 B2
7446650 Scholfield et al. Nov 2008 B2
7467883 Deline et al. Dec 2008 B2
7468651 Deline et al. Dec 2008 B2
7505047 Yoshimura Mar 2009 B2
7533998 Schofield et al. May 2009 B2
7548291 Lee et al. Jun 2009 B2
7565006 Stam et al. Jul 2009 B2
7567291 Bechtel et al. Jul 2009 B2
7579940 Schofield et al. Aug 2009 B2
7653215 Stam Jan 2010 B2
7658521 Deline et al. Feb 2010 B2
7711479 Taylor et al. May 2010 B2
7719408 Deward et al. May 2010 B2
7720580 Higgins-Luthman May 2010 B2
7815326 Blank et al. Oct 2010 B2
7877175 Higgins-Luthman Jan 2011 B2
7881839 Stam et al. Feb 2011 B2
7888629 Heslin et al. Feb 2011 B2
7914188 Deline et al. Mar 2011 B2
7972045 Schofield Jul 2011 B2
7994471 Heslin et al. Aug 2011 B2
8031225 Watanabe et al. Oct 2011 B2
8045760 Stam et al. Oct 2011 B2
8059235 Utsumi et al. Nov 2011 B2
8063753 Deline et al. Nov 2011 B2
8090153 Schofield et al. Jan 2012 B2
8100568 Deline et al. Jan 2012 B2
8116929 Higgins-Luthman Feb 2012 B2
8120652 Bechtel et al. Feb 2012 B2
8142059 Higgins-Luthman et al. Mar 2012 B2
8162518 Schofield Apr 2012 B2
8201800 Filipiak Jun 2012 B2
8203433 Deuber et al. Jun 2012 B2
8217830 Lynam Jul 2012 B2
8222588 Schofield et al. Jul 2012 B2
8237909 Ostreko et al. Aug 2012 B2
8258433 Byers et al. Sep 2012 B2
8325028 Schofield et al. Dec 2012 B2
8482683 Hwang et al. Jul 2013 B2
20010019356 Takeda et al. Sep 2001 A1
20010022616 Rademacher et al. Sep 2001 A1
20010026316 Senatore Oct 2001 A1
20010045981 Gloger et al. Nov 2001 A1
20020040962 Schofield et al. Apr 2002 A1
20020044065 Quist et al. Apr 2002 A1
20020105423 Rast Aug 2002 A1
20020191127 Roberts et al. Dec 2002 A1
20030002165 Mathias et al. Jan 2003 A1
20030007261 Hutzel et al. Jan 2003 A1
20030016125 Lang et al. Jan 2003 A1
20030016287 Nakayama et al. Jan 2003 A1
20030025596 Lang et al. Feb 2003 A1
20030025597 Schofield Feb 2003 A1
20030030546 Tseng Feb 2003 A1
20030030551 Ho Feb 2003 A1
20030030724 Okamoto Feb 2003 A1
20030035050 Mizusawa Feb 2003 A1
20030043269 Park Mar 2003 A1
20030052969 Satoh et al. Mar 2003 A1
20030058338 Kawauchi et al. Mar 2003 A1
20030067383 Yang Apr 2003 A1
20030076415 Strumolo Apr 2003 A1
20030080877 Takagi et al. May 2003 A1
20030085806 Samman et al. May 2003 A1
20030088361 Sekiguchi May 2003 A1
20030090568 Pico May 2003 A1
20030090569 Poechmueller May 2003 A1
20030090570 Takagi et al. May 2003 A1
20030098908 Misaiji et al. May 2003 A1
20030103141 Bechtel et al. Jun 2003 A1
20030103142 Hitomi et al. Jun 2003 A1
20030117522 Okada Jun 2003 A1
20030122929 Minaudo et al. Jul 2003 A1
20030122930 Schofield et al. Jul 2003 A1
20030133014 Mendoza Jul 2003 A1
20030137586 Lewellen Jul 2003 A1
20030141965 Gunderson et al. Jul 2003 A1
20030146831 Berberich et al. Aug 2003 A1
20030169158 Paul, Jr. Sep 2003 A1
20030179293 Oizumi Sep 2003 A1
20030202096 Kim Oct 2003 A1
20030202357 Strazzanti Oct 2003 A1
20030214576 Koga Nov 2003 A1
20030214584 Ross, Jr. Nov 2003 A1
20030214733 Fujikawa et al. Nov 2003 A1
20030222793 Tanaka et al. Dec 2003 A1
20030222983 Nobori et al. Dec 2003 A1
20030227546 Hilborn et al. Dec 2003 A1
20040004541 Hong Jan 2004 A1
20040027695 Lin Jan 2004 A1
20040032321 McMahon et al. Feb 2004 A1
20040036768 Green Feb 2004 A1
20040051634 Schofield et al. Mar 2004 A1
20040056955 Berberich et al. Mar 2004 A1
20040057131 Hutzel et al. Mar 2004 A1
20040064241 Sekiguchi Apr 2004 A1
20040066285 Sekiguchi Apr 2004 A1
20040075603 Kodama Apr 2004 A1
20040080404 White Apr 2004 A1
20040080431 White Apr 2004 A1
20040085196 Miller et al. May 2004 A1
20040090314 Iwamoto May 2004 A1
20040090317 Rothkop May 2004 A1
20040096082 Nakai et al. May 2004 A1
20040098196 Sekiguchi May 2004 A1
20040107030 Nishira et al. Jun 2004 A1
20040107617 Shoen et al. Jun 2004 A1
20040109060 Ishii Jun 2004 A1
20040114039 Ishikura Jun 2004 A1
20040119668 Homma et al. Jun 2004 A1
20040125905 Vlasenko et al. Jul 2004 A1
20040202001 Roberts et al. Oct 2004 A1
20050140855 Utsumi Jun 2005 A1
20050237440 Sugimura et al. Oct 2005 A1
20060007550 Tonar et al. Jan 2006 A1
20060054783 Voronov Mar 2006 A1
20060115759 Kim et al. Jun 2006 A1
20060139953 Chou et al. Jun 2006 A1
20060158899 Ayabe et al. Jul 2006 A1
20070171037 Schofield et al. Jul 2007 A1
20070221822 Stein Sep 2007 A1
20080068520 Minikey, Jr. et al. Mar 2008 A1
20080192132 Bechtel et al. Aug 2008 A1
20080247192 Hoshi et al. Oct 2008 A1
20080294315 Breed Nov 2008 A1
20090015736 Weller et al. Jan 2009 A1
20090141516 Wu et al. Jun 2009 A1
20100201896 Ostreko et al. Aug 2010 A1
20130028473 Hilldore et al. Jan 2013 A1
20130229521 Siecke et al. Sep 2013 A1
20140347488 Tazaki et al. Nov 2014 A1
Foreign Referenced Citations (15)
Number Date Country
0513476 Nov 1992 EP
0899157 Oct 2004 EP
2391117 Nov 2011 EP
2338363 Dec 1999 GB
1178693 Mar 1999 JP
2005148119 Jun 2005 JP
2005327600 Nov 2005 JP
2008139819 Jun 2008 JP
2008211442 Sep 2008 JP
20093866 Feb 2009 JP
2009214795 Sep 2009 JP
2013216286 Oct 2013 JP
9621581 Jul 1996 WO
2007103573 Sep 2007 WO
2010090964 Aug 2010 WO
Non-Patent Literature Citations (16)
Entry
Palalau et al., “FPD Evaluation for Automotive Application,” Proceedings of the Vehicle Display Symposium, Nov. 2, 1995, pp. 97-103, Society for Information Display, Detroit Chapter, Santa Ana, CA.
Adler, “A New Automotive AMLCD Module,” Proceedings of the Vehicle Display Symposium, Nov. 2, 1995, pp. 67-71, Society for Information Display, Detroit Chapter, Santa Ana, CA.
Sayer, et al., “In-Vehicle Displays for Crash Avoidance and Navigation Systems,” Proceedings of the Vehicle Display Symposium, Sep. 18, 1996, pp. 39-42, Society for Information Display, Detroit Chapter, Santa Ana, CA.
Knoll, et al., “Application of Graphic Displays in Automobiles,” SID 87 Digest, 1987, pp. 41-44, 5A.2.
Terada, et al., “Development of Central Information Display of Automotive Application,” SID 89 Digest, 1989, pp. 192-195, Society for Information Display, Detroit Center, Santa Ana, CA.
Thomsen, et al., “AMLCD Design Considerations for Avionics and Vetronics Applications,” Proceedings of the 5th Annual Flat Panel Display Strategic and Technical Symposium, Sep. 9-10, 1998, pp. 139-145, Society for Information Display, Metropolitan Detroit Chapter, CA.
Knoll, et al., “Conception of an Integrated Driver Information System,” SID International Symposium Digest of Technical Papers, 1990, pp. 126-129, Society for Information Display, Detroit Center, Santa Ana, CA.
Vincen, “An Analysis of Direct-View FPDs for Automotive Multi-Media Applications,” Proceedings of the 6th Annual Strategic and Technical Symposium “Vehicular Applications of Displays and Microsensors,” Sep. 22-23, 1999, pp. 39-46, Society for Information Display, Metropolitan Detroit Chapter, San Jose, CA.
Zuk, et al., “Flat Panel Display Applications in Agriculture Equipment,” Proceedings of the 5th Annual Flat Panel Display Strategic and Technical Symposium, Sep. 9-10, 1998, pp. 125-130, Society for Information Display, Metropolitan Detroit Chapter, CA.
Vijan, et al., “A 1.7-Mpixel Full-Color Diode Driven AM-LCD,” SID International Symposium, 1990, pp. 530-533, Society for Information Display, Playa del Rey, CA.
Vincen, “The Automotive Challenge to Active Matrix LCD Technology,” Proceedings of the Vehicle Display Symposium, 1996, pp. 17-21, Society for Information Display, Detroit Center, Santa Ana, CA.
Corsi, et al., “Reconfigurable Displays Used as Primary Automotive Instrumentation,” SAE Technical Paper Series, 1989, pp. 13-18, Society of Automotive Engineers, Inc., Warrendale, PA.
Schumacher, “Automotive Display Trends,” SID 96 Digest, 1997, pp. 1-6, Delco Electronics Corp., Kokomo, IN.
Knoll, “The Use of Displays in Automotive Applications,” Journal of the SID 5/3 1997, pp. 165-172, 315-316, Stuttgart, Germany.
Donofrio, “Looking Beyond the Dashboard,” SID 2002, pp. 30-34, Ann Arbor, MI.
Stone, “Automotive Display Specification,” Proceedings of the Vehicle Display Symposium, 1995, pp. 93-96, Society for Information Display, Detroit Center, Santa Ana, CA.
Related Publications (1)
Number Date Country
20160373684 A1 Dec 2016 US
Provisional Applications (1)
Number Date Country
62182863 Jun 2015 US