1. Field of the Invention
The field of the invention relates to microelectromechanical systems (MEMS).
2. Description of the Related Art
Microelectromechanical systems (MEMS) include micromechanical elements, actuators, and electronics. Micromechanical elements may be created using deposition, etching, and/or other micromachining processes that etch away parts of substrates and/or deposited material layers or that add layers to form electrical and electromechanical devices. One type of MEMS device is called an interferometric modulator. As used herein, the term interferometric modulator or interferometric light modulator refers to a device that selectively absorbs and/or reflects light using the principles of optical interference. In certain embodiments, an interferometric modulator may comprise a pair of conductive plates, one or both of which may be transparent and/or reflective in whole or part and capable of relative motion upon application of an appropriate electrical signal, e.g., a voltage. In a particular embodiment, one plate may comprise a stationary layer deposited on a substrate and the other plate may comprise a metallic membrane separated from the stationary layer by an air gap. As described herein in more detail, the position of one plate in relation to another can change the optical interference of light incident on the interferometric modulator. Such devices have a wide range of applications, and it would be beneficial in the art to utilize and/or modify the characteristics of these types of devices so that their features can be exploited in improving existing products and creating new products that have not yet been developed.
An embodiment provides for a method for processing image data to be displayed on a display device where the display device requires more power to be driven to display image data comprising particular spatial frequencies in a first dimension than to be driven to display image data comprising the particular spatial frequencies in a second dimension. The method includes receiving image data, and filtering the received image data such that the image data at particular spatial frequencies in the first dimension are attenuated more than the image data at particular spatial frequencies in the second dimension.
Another embodiment provides for an apparatus for displaying image data that includes a display device, where the display device requires more power to be driven to display image data comprising particular spatial frequencies in a first dimension than to be driven to display image data comprising the particular spatial frequencies in a second dimension. The apparatus further includes a processor configured to receive image data and to filter the image data, the filtering being such that the image data at particular spatial frequencies in the first dimension are attenuated more than the image data at particular spatial frequencies in the second dimension. The apparatus further includes at least one driver circuit configured to communicate with the processor and to drive the display device, the driver circuit further configured to provide the filtered image data to the display device.
Another embodiment provides for an apparatus for displaying video data that includes at least one driver circuit, and a display device configured to be driven by the driver circuit, where the display device requires more power to be driven to display video data comprising particular spatial frequencies in a first dimension, than to be driven to display video data comprising the particular spatial frequencies in a second dimension. The apparatus further includes a processor configured to communicate with the driver circuit, the processor further configured to receive partially decoded video data, wherein the partially decoded video data comprises coefficients in a transformed domain, the processor further configured to filter the partially decoded video data, wherein the filtering comprises reducing a magnitude of at least one of the transformed domain coefficients containing spatial frequencies within the particular spatial frequencies in the first dimension. The processor is further configured to inverse transform the filtered partially decoded video data, thereby resulting in filtered spatial domain video data, and to finish decoding the filtered spatial domain video data. The driver circuit is configured to provide the decoded spatial domain video data to the display device.
a is a general 3×3 spatial filter mask.
b is a 3×3 spatial filter mask providing a symmetrical averaging (smoothing).
c is a 3×3 spatial filter mask providing a symmetrical weighted averaging (smoothing).
d is a 3×3 spatial filter mask providing averaging (smoothing) in the vertical dimension only.
e is a 3×3 spatial filter mask providing averaging (smoothing) in the horizontal dimension only.
f is a 3×3 spatial filter mask providing averaging (smoothing) in one diagonal dimension only.
g is a 5×5 spatial filter mask providing averaging (smoothing) in both vertical and horizontal dimensions, but with more smoothing in the vertical dimension than in the horizontal dimension.
a illustrates basis images of an exemplary 4×4 image transform.
b shows transform coefficients used as multipliers of the basis images shown in
The following detailed description is directed to certain specific embodiments of the invention. However, the invention can be embodied in a multitude of different ways. In this description, reference is made to the drawings wherein like parts are designated with like numerals throughout. As will be apparent from the following description, the embodiments may be implemented in any device that is configured to display an image, whether in motion (e.g., video) or stationary (e.g., still image), and whether textual or pictorial. More particularly, it is contemplated that the embodiments may be implemented in or associated with a variety of electronic devices such as, but not limited to, mobile telephones, wireless devices, personal data assistants (PDAs), hand-held or portable computers, GPS receivers/navigators, cameras, MP3 players, camcorders, game consoles, wrist watches, clocks, calculators, television monitors, flat panel displays, computer monitors, auto displays (e.g., odometer display, etc.), cockpit controls and/or displays, display of camera views (e.g., display of a rear view camera in a vehicle), electronic photographs, electronic billboards or signs, projectors, architectural structures, packaging, and aesthetic structures (e.g., display of images on a piece of jewelry). MEMS devices of similar structure to those described herein can also be used in non-display applications such as in electronic switching devices.
Bistable displays, such as an array of interferometric modulators, may be configured to be driven to display images utilizing several different types of driving protocols. These driving protocols may be designed to take advantage of the bistable nature of the display to conserve battery power. The driving protocols, in many instances, may update the display in a structured manner, such as row-by-row, column-by-column or in other fashions. These driving protocols, in many instances, require switching of voltages in the rows or columns many times a second in order to update the display. Since the power to update a display is dependent on the frequency of the charging and discharging of the column or row capacitance, the power usage is highly dependent on the image content. Images characterized by high spatial frequencies typically require more power to display. This dependence on spatial frequencies, in many instances, is not equal in all dimensions. A method and apparatus for performing spatial frequency filtering at particular frequencies, and in a selected dimension(s) more than another dimension(s), so as to reduce the power required to display an image, is discussed below.
One interferometric modulator display embodiment comprising an interferometric MEMS display element is illustrated in
The depicted portion of the pixel array in
The optical stacks 16a and 16b (collectively referred to as optical stack 16), as referenced herein, typically comprise several fused layers, which can include an electrode layer, such as indium tin oxide (ITO), a partially reflective layer, such as chromium, and a transparent dielectric. The optical stack 16 is thus electrically conductive, partially transparent and partially reflective, and may be fabricated, for example, by depositing one or more of the above layers onto a transparent substrate 20. The partially reflective layer can be formed from a variety of materials that are partially reflective such as various metals, semiconductors, and dielectrics. The partially reflective layer can be formed of one or more layers of materials, and each of the layers can be formed of a single material or a combination of materials.
In some embodiments, the layers of the optical stack are patterned into parallel strips, and may form row electrodes in a display device as described further below. The movable reflective layers 14a, 14b may be formed as a series of parallel strips of a deposited metal layer or layers (orthogonal to the row electrodes of 16a, 16b) deposited on top of posts 18 and an intervening sacrificial material deposited between the posts 18. When the sacrificial material is etched away, the movable reflective layers 14a, 14b are separated from the optical stacks 16a, 16b by a defined gap 19. A highly conductive and reflective material such as aluminum may be used for the reflective layers 14, and these strips may form column electrodes in a display device.
With no applied voltage, the cavity 19 remains between the movable reflective layer 14a and optical stack 16a, with the movable reflective layer 14a in a mechanically relaxed state, as illustrated by the pixel 12a in
In one embodiment, the processor 21 is also configured to communicate with an array driver 22. In one embodiment, the array driver 22 includes a row driver circuit 24 and a column driver circuit 26 that provide signals to a display array or panel 30. The cross section of the array illustrated in
In typical applications, a display frame may be created by asserting the set of column electrodes in accordance with the desired set of actuated pixels in the first row. A row pulse is then applied to the row 1 electrode, actuating the pixels corresponding to the asserted column lines. The asserted set of column electrodes is then changed to correspond to the desired set of actuated pixels in the second row. A pulse is then applied to the row 2 electrode, actuating the appropriate pixels in row 2 in accordance with the asserted column electrodes. The row 1 pixels are unaffected by the row 2 pulse, and remain in the state they were set to during the row 1 pulse. This may be repeated for the entire series of rows in a sequential fashion to produce the frame. Generally, the frames are refreshed and/or updated with new display data by continually repeating this process at some desired number of frames per second. A wide variety of protocols for driving row and column electrodes of pixel arrays to produce display frames are also well known and may be used in conjunction with the present invention.
In the
The display device 40 includes a housing 41, a display 30, an antenna 43, a speaker 44, an input device 48, and a microphone 46. The housing 41 is generally formed from any of a variety of manufacturing processes as are well known to those of skill in the art, including injection molding, and vacuum forming. In addition, the housing 41 may be made from any of a variety of materials, including but not limited to plastic, metal, glass, rubber, and ceramic, or a combination thereof. In one embodiment the housing 41 includes removable portions (not shown) that may be interchanged with other removable portions of different color, or containing different logos, pictures, or symbols.
The display 30 of exemplary display device 40 may be any of a variety of displays, including a bi-stable display, as described herein. In other embodiments, the display 30 includes a flat-panel display, such as plasma, EL, OLED, STN LCD, or TFT LCD as described above, or a non-flat-panel display, such as a CRT or other tube device, as is well known to those of skill in the art. However, for purposes of describing the present embodiment, the display 30 includes an interferometric modulator display, as described herein.
The components of one embodiment of exemplary display device 40 are schematically illustrated in
The network interface 27 includes the antenna 43 and the transceiver 47 so that the exemplary display device 40 can communicate with one or more devices over a network. In one embodiment the network interface 27 may also have some processing capabilities to relieve requirements of the processor 21. The antenna 43 is any antenna known to those of skill in the art for transmitting and receiving signals. In one embodiment, the antenna transmits and receives RF signals according to the IEEE 802.11 standard, including IEEE 802.11(a), (b), or (g). In another embodiment, the antenna transmits and receives RF signals according to the BLUETOOTH standard. In the case of a cellular telephone, the antenna is designed to receive CDMA, GSM, AMPS or other known signals that are used to communicate within a wireless cell phone network. The transceiver 47 pre-processes the signals received from the antenna 43 so that they may be received by and further manipulated by the processor 21. The transceiver 47 also processes signals received from the processor 21 so that they may be transmitted from the exemplary display device 40 via the antenna 43.
In an alternative embodiment, the transceiver 47 can be replaced by a receiver. In yet another alternative embodiment, network interface 27 can be replaced by an image source, which can store or generate image data to be sent to the processor 21. For example, the image source can be a digital video disc (DVD) or a hard-disc drive that contains image data, or a software module that generates image data.
Processor 21 generally controls the overall operation of the exemplary display device 40. The processor 21 receives data, such as compressed image data from the network interface 27 or an image source, and processes the data into raw image data or into a format that is readily processed into raw image data. The processor 21 then sends the processed data to the driver controller 29 or to frame buffer 28 for storage. Raw data typically refers to the information that identifies the image characteristics at each location within an image. For example, such image characteristics can include color, saturation, and gray-scale level.
In one embodiment, the processor 21 includes a microcontroller, CPU, or logic unit to control operation of the exemplary display device 40. Conditioning hardware 52 generally includes amplifiers and filters for transmitting signals to the speaker 45, and for receiving signals from the microphone 46. Conditioning hardware 52 may be discrete components within the exemplary display device 40, or may be incorporated within the processor 21 or other components.
The driver controller 29 takes the raw image data generated by the processor 21 either directly from the processor 21 or from the frame buffer 28 and reformats the raw image data appropriately for high speed transmission to the array driver 22. Specifically, the driver controller 29 reformats the raw image data into a data flow having a raster-like format, such that it has a time order suitable for scanning across the display array 30. Then the driver controller 29 sends the formatted information to the array driver 22. Although a driver controller 29, such as an LCD controller, is often associated with the system processor 21 as a stand-alone Integrated Circuit (IC), such controllers may be implemented in many ways. They may be embedded in the processor 21 as hardware, embedded in the processor 21 as software, or fully integrated in hardware with the array driver 22.
Typically, the array driver 22 receives the formatted information from the driver controller 29 and reformats the video data into a parallel set of waveforms that are applied many times per second to the hundreds and sometimes thousands of leads coming from the display's x-y matrix of pixels.
In one embodiment, the driver controller 29, array driver 22, and display array 30 are appropriate for any of the types of displays described herein. For example, in one embodiment, driver controller 29 is a conventional display controller or a bi-stable display controller (e.g., an interferometric modulator controller). In another embodiment, array driver 22 is a conventional driver or a bi-stable display driver (e.g., an interferometric modulator display). In one embodiment, a driver controller 29 is integrated with the array driver 22. Such an embodiment is common in highly integrated systems such as cellular phones, watches, and other small area displays. In yet another embodiment, display array 30 is a typical display array or a bi-stable display array (e.g., a display including an array of interferometric modulators).
The input device 48 allows a user to control the operation of the exemplary display device 40. In one embodiment, input device 48 includes a keypad, such as a QWERTY keyboard or a telephone keypad, a button, a switch, a touch-sensitive screen, or a pressure- or heat-sensitive membrane. In one embodiment, the microphone 46 is an input device for the exemplary display device 40. When the microphone 46 is used to input data to the device, voice commands may be provided by a user for controlling operations of the exemplary display device 40.
Power supply 50 can include a variety of energy storage devices as are well known in the art. For example, in one embodiment, power supply 50 is a rechargeable battery, such as a nickel-cadmium battery or a lithium ion battery. In another embodiment, power supply 50 is a renewable energy source, a capacitor, or a solar cell, including a plastic solar cell, and solar-cell paint. In another embodiment, power supply 50 is configured to receive power from a wall outlet.
In some implementations control programmability resides, as described above, in a driver controller which can be located in several places in the electronic display system. In some cases control programmability resides in the array driver 22. Those of skill in the art will recognize that the above-described optimization may be implemented in any number of hardware and/or software components and in various configurations.
The details of the structure of interferometric modulators that operate in accordance with the principles set forth above may vary widely. For example,
In embodiments such as those shown in
The actuation protocol shown in
(Energy/col) = ½ × count × C_line × V_s²   (1)
The power consumed in driving an entire array is simply the energy required for writing every column in a frame divided by the frame time, or:
Power = (Energy/col) × ncols × f   (2)
where count is the number of pixel-state transitions (charges and discharges) along a column during a frame write, C_line is the column line capacitance, V_s is the switching voltage, ncols is the number of columns in the array, and f is the frame update frequency.
For a given frame update frequency (f) and frame size (number of columns), the power required to write to the display is linearly dependent on the frequency of the data being written. Of particular interest is the “count” variable in (1), which depends on the frequency of changes in pixel states (actuated or relaxed) in a given column. For this reason, images that contain high spatial frequencies in the vertical direction (parallel to the columns) are particularly demanding in terms of power consumption. High horizontal spatial frequencies do not drive up the power consumption since the row lines are not switched as quickly, thus the row capacitance is not charged and discharged as often. For example, with reference to
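The dependence expressed in equations (1) and (2) can be sketched in code; the function and variable names below are illustrative assumptions for this sketch, not part of any claimed implementation:

```python
# Sketch of equations (1) and (2); all names here are illustrative assumptions.

def column_energy(count, c_line, v_s):
    """(Energy/col) = 1/2 * count * C_line * V_s^2, per equation (1)."""
    return 0.5 * count * c_line * v_s ** 2

def array_power(count, c_line, v_s, ncols, f):
    """Power = (Energy/col) * ncols * f, per equation (2)."""
    return column_energy(count, c_line, v_s) * ncols * f

def count_transitions(column):
    """Number of pixel-state changes down one column; this drives 'count'."""
    return sum(1 for a, b in zip(column, column[1:]) if a != b)

# High vertical spatial frequency (alternating rows) versus a solid column:
stripes = [0, 1, 0, 1, 0, 1, 0, 1]
solid = [1, 1, 1, 1, 1, 1, 1, 1]
print(count_transitions(stripes), count_transitions(solid))  # 7 0
```

A column of alternating pixel states switches on nearly every row write, while a solid column never switches, which is why high vertical spatial frequencies are the costly ones under a row-by-row protocol.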
This high sensitivity to vertical frequencies, particularly in the higher frequency ranges, and low sensitivity to horizontal frequencies in the same particular high range, is due to the actuation protocol updating in a row-by-row fashion. In another embodiment, where a display is updated column-by-column, the power consumption will be oppositely affected. Since the row lines will be switched frequently due to high spatial frequencies in the horizontal dimension, the power use will be highly sensitive to these horizontal frequencies and will be relatively insensitive to the spatial frequencies in the vertical dimension. One of skill in the art can easily imagine other embodiments of actuation protocols (such as updating diagonal lines of pixels) and/or display circuitry where the power consumption of a display is more sensitive (in terms of power needed to drive a display) to particular spatial frequencies in one dimension than in another dimension.
The unsymmetrical power sensitivity described above allows for unconventional filtering of image data that takes advantage of the power requirements exhibited by a display device such as an array of interferometric modulators. Since power use is more sensitive in one dimension (vertical in the embodiment discussed above) than another dimension (horizontal in the embodiment discussed above), image data may be filtered in the dimension that is most power sensitive and the other dimension may remain substantially unfiltered, thereby retaining more image fidelity in the other dimension. Thus, power use will be reduced due to the less frequent switching required to display the filtered dimension that is most power sensitive. The nature of the filtering, in one embodiment, is that of smoothing, low-pass filtering, and/or averaging (referred to herein simply as low-pass filtering) in one dimension more than another dimension. This type of filtering, in general, allows low frequencies to remain and attenuates image data at higher frequencies. This will result in pixels in close spatial proximity to each other in the filtered dimension having a higher likelihood of being in identical states, thus requiring less power to display.
Pixel values may be in several models including gray level (or intensity) varying from black to grey to white (this may be all that is needed to represent monochrome or achromatic light), and radiance and brightness for chromatic light. Other color models that may be used include the RGB (Red, Green, Blue) or primary colors model, the CMY (Cyan, Magenta, Yellow) or secondary colors model, the HSI (Hue, Saturation, Intensity) model, and the Luminance/Chrominance model (Y/Cr/Cb: Luminance, red chrominance, blue chrominance). Any of these models can be used to represent the spatial pixels to be filtered. In addition to the spatial pixels, image data may be in a transformed domain where the pixel values have been transformed. Transforms that may be used for images include the DCT (Discrete Cosine Transform), the DFT (Discrete Fourier Transform), the Hadamard (or Walsh-Hadamard) transform, discrete wavelet transforms, the DST (discrete sine transform), the Haar transform, the slant transform, the KL (Karhunen-Loeve) transform and integer transforms such as that used in H.264 video compression. Filtering may take place in either the spatial domain or one of the transformed domains. Spatial domain filtering will now be discussed.
Spatial domain filtering utilizes pixel values of neighboring image pixels to calculate the filtered value of each pixel in the image space.
R = w(−1,−1)f(x−1,y−1) + w(−1,0)f(x−1,y) + … + w(0,0)f(x,y) + … + w(1,0)f(x+1,y) + w(1,1)f(x+1,y+1)   (3)
Equation 3 is the sum of the products of the mask coefficients and the corresponding pixel values underlying the mask of
d shows a 3×3 mask that low-pass filters in the vertical dimension only. This mask, of course, could be reduced to a single column vector, but is shown as a 3×3 mask for illustrative purposes only. The filtered response in this case will be the average of the pixel value being filtered, f(x,y), and the pixel values immediately above, f(x−1,y), and below, f(x+1,y). This will result in low-pass filtering, or smoothing, of vertical spatial frequencies only. By only filtering the vertical frequencies, the power required to display the filtered image data may be lower in this embodiment. By not filtering the other dimensions, image details such as vertical edges and/or lines may be retained.
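A minimal sketch of equation (3) applied with a vertical-only averaging mask might look like the following; the helper names are assumptions for illustration, and border pixels are simply left unfiltered for brevity:

```python
# Illustrative pure-Python sketch of equation (3) with a vertical-only
# averaging mask; all names are assumptions, not from the source.

VERTICAL_MASK = [[0, 1/3, 0],
                 [0, 1/3, 0],
                 [0, 1/3, 0]]  # averages a pixel with its neighbors above and below

def filter_pixel(img, x, y, mask):
    """R = sum of mask coefficients times the underlying pixel values (eq. 3)."""
    return sum(mask[i + 1][j + 1] * img[x + i][y + j]
               for i in (-1, 0, 1) for j in (-1, 0, 1))

def filter_image(img, mask):
    """Filter interior pixels; border pixels are left unfiltered here."""
    rows, cols = len(img), len(img[0])
    out = [row[:] for row in img]
    for x in range(1, rows - 1):
        for y in range(1, cols - 1):
            out[x][y] = filter_pixel(img, x, y, mask)
    return out

# Horizontal stripes (high vertical frequency) are smoothed toward gray,
# while values along each row are untouched:
stripes = [[0, 0, 0], [3, 3, 3], [0, 0, 0], [3, 3, 3]]
print(filter_image(stripes, VERTICAL_MASK))
# → [[0, 0, 0], [3, 1.0, 3], [0, 2.0, 0], [3, 3, 3]]
```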
The filter masks shown in
In one embodiment, a spatial filter may filter in more than one dimension and still reduce the power required to display an image.
The pixel values being filtered (either spatially as discussed above or in a transform domain as discussed below) may include any one of several variables including, but not limited to, intensity or gray level, radiance, brightness, RGB or primary color coefficients, CMY or secondary color coefficients, HSI coefficients, and the Luminance/Chrominance coefficients (i.e., Y/Cr/Cb: Luminance, red chrominance, and blue chrominance, respectively). Some color variables may be better candidates for filtering than others. For example, the human eye is typically less sensitive to chrominance color data comprised mainly of reds and blues than it is to luminance data comprised of green-yellow colors. For this reason, the red and blue or chrominance values may be more heavily filtered than the green-yellow or luminance values without affecting the human visual perception as greatly.
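The heavier filtering of chrominance relative to luminance could be sketched as follows, assuming Y/Cb/Cr planes stored as 2-D lists; the blend weights are illustrative assumptions, not values from the source:

```python
# Hedged sketch: vertically smooth the chroma planes more than the luma plane.
# The blend weights (0.25 and 1.0) are illustrative assumptions.

def smooth_vertical(plane, weight):
    """Blend each interior pixel toward the average of itself and its
    vertical neighbors. weight=0 leaves the plane untouched; weight=1
    replaces interior pixels with the 3-tap vertical average."""
    rows, cols = len(plane), len(plane[0])
    out = [row[:] for row in plane]
    for x in range(1, rows - 1):
        for y in range(cols):
            avg = (plane[x - 1][y] + plane[x][y] + plane[x + 1][y]) / 3
            out[x][y] = (1 - weight) * plane[x][y] + weight * avg
    return out

def filter_ycbcr(y, cb, cr):
    # Luma lightly filtered; chroma filtered harder, since the eye is
    # less sensitive to red/blue chrominance detail.
    return smooth_vertical(y, 0.25), smooth_vertical(cb, 1.0), smooth_vertical(cr, 1.0)
```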
Filtering on the borders of images, where the filter mask coefficients do not lie over pixels, may require special treatment. Well known methods such as padding with zeros, padding with ones, or padding with some other pixel value may be used when filtering along image borders. Alternatively, mask coefficients that lie outside the image may be ignored and not included in the filtering, or the filtered image may be reduced in size by filtering only those pixels that have enough neighboring pixels to completely fill the mask.
In addition to the spatial domain filtering, another general form of filtering is done in one of several transform domains. One of the most common and well known transform domains is the frequency domain which results from performing transforms such as the Fourier Transform, the DFT, the DCT or the DST. Other transforms, such as the Hadamard (or Walsh-Hadamard) transform, the Haar transform, the slant transform, the KL transform and integer transforms such as that used in H.264 video compression, while not truly frequency domain transforms, do contain frequency characteristics within the transform basis images. The act of transforming pixel data from the spatial domain to a transform domain replaces the spatial pixel values with transform coefficients that are multipliers of basis images.
The example basis images in
Knowing the spatial frequency characteristics of the individual basis images, one may filter the transformed coefficients and target those coefficients that are the most demanding, in terms of power requirements, to display. For example, in reference to
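Attenuating transform coefficients by their vertical-frequency content can be sketched as a per-row gain applied to the coefficient block, since in many 2-D transforms (e.g., the DCT) the row index of a coefficient tracks vertical frequency; the gain values and coefficient data below are illustrative assumptions:

```python
# Illustrative sketch (names and scale factors are assumptions): attenuate
# transform coefficients whose basis images contain high vertical frequencies.
# In many 2-D transforms the row index i of the coefficient block indexes
# vertical frequency, so the highest rows are scaled down hardest.

def attenuate_vertical(coeffs, gains):
    """Scale each row of an NxN coefficient block by gains[i]."""
    return [[g * c for c in row] for g, row in zip(gains, coeffs)]

# 4x4 example: keep DC and low vertical frequencies, suppress the highest.
gains = [1.0, 1.0, 0.5, 0.0]
coeffs = [[8, 2, 1, 1],
          [4, 1, 0, 0],
          [2, 1, 0, 0],
          [2, 0, 0, 0]]
filtered = attenuate_vertical(coeffs, gains)
print(filtered[2], filtered[3])  # [1.0, 0.5, 0.0, 0.0] [0.0, 0.0, 0.0, 0.0]
```

Horizontal frequencies (the column index) are untouched by this per-row gain, so horizontal detail is preserved while the power-hungry vertical frequencies are suppressed.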
The size of the image region affected by transform domain filtering is dependent on the size of the image block that was transformed. For example, if the transformed coefficients resulted from transforming pixel values that correspond to an image space covering a 16×16 pixel block, then the filtering will affect only that 16×16 pixel image block. Transforming a larger image block will result in more basis images, and therefore more spatial frequencies that may be filtered. However, an 8×8 block may be sufficient to target the high frequencies that may be advantageously attenuated for conserving power on certain displays, e.g., a display of interferometric modulators.
Regardless which domain the filtering is done in, one objective is to selectively filter spatial frequencies that require the most power to be displayed. For this reason, the filtering will be referred to herein as spatial frequency filtering. Similarly, the module performing the filtering, whether implemented as software, firmware or microchip circuitry, depending on the embodiment, will be referred to as a spatial frequency filter. More details of certain embodiments of spatial domain and transform domain methods for performing spatial frequency filtering will be discussed below.
After receiving the image data, the data may need to be transformed to another domain at step 210, if the spatial frequency filter domain is different from the domain of the received data. Processor 21 may perform the optional transformation acts of step 210. Step 210 may be omitted if the received image data is already in the domain in which filtering occurs. After the image data is in the filtering domain, the spatial frequency filtering occurs at step 215 (steps 230, 235 and 240 will be discussed below in reference to another embodiment). Spatial frequency filtering may be in the spatial domain or in the transformed domain. In the spatial domain, the linear and nonlinear filtering methods discussed above in reference to
After filtering in step 215, it may be necessary to inverse transform the filtered data at step 220. If step 215 was performed in the spatial domain then the image data may be ready to provide to the display device at step 225. If the filtering was performed in a transform domain, the processor 21 will inverse transform the filtered data into the spatial domain. At step 225, the filtered image data is provided to the display device. The filtered image data input to step 225 is typically raw image data. Raw image data typically refers to the information that identifies the image characteristics at each location within an image. For example, such image characteristics can include color, saturation, and gray-scale level. In one embodiment, actions taken in step 225 comprise the driver controller 29 taking the filtered image data generated by the processor 21 either directly from the processor 21 or from the frame buffer 28 and reformatting the filtered image data appropriately for high speed transmission to the array driver 22. Specifically, the driver controller 29 reformats the raw image data into a data flow having a raster-like format, such that it has a time order suitable for scanning across the display array 30. Then the driver controller 29 sends the formatted information to the array driver 22 to drive the display array 30 to display the filtered image data.
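The overall flow of steps 210 through 225 might be sketched as a simple pipeline; the step functions here are placeholder assumptions to be supplied by a particular embodiment, not the source's implementation:

```python
# Hedged sketch of process steps 210-225; transform, filter, and inverse are
# placeholder assumptions supplied by a particular embodiment.

def process_image(data, transform=None, inverse=None, spatial_filter=None):
    if transform is not None:        # step 210: move to the filtering domain
        data = transform(data)
    if spatial_filter is not None:   # step 215: attenuate costly frequencies
        data = spatial_filter(data)
    if inverse is not None:          # step 220: return to the spatial domain
        data = inverse(data)
    return data                      # step 225: ready for the driver controller

# With no transform or filter supplied, the data passes through unchanged:
print(process_image([1, 2, 3]))  # [1, 2, 3]
```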
In one embodiment, image data is provided to the display array 30 by the array driver 22 in a row-by-row fashion. In this embodiment, the display array 30 is driven by column signals and row pulses as discussed above in reference to and illustrated in
In another embodiment, image data is provided to the display array 30 by the array driver 22 in a column-by-column fashion. In this embodiment, the display array 30 is driven by row signals and column pulses, with the roles essentially swapped (i.e., high frequency row switching and low frequency column pulses) relative to the protocol discussed above in reference to and illustrated in
In one embodiment, the filtering of step 215 is dependent on an estimated remaining lifetime of a battery such as power supply 50. An estimation of remaining battery lifetime is made in step 230. The estimation may be made in the driver controller 29 based on measured voltages from power supply 50. Methods of estimating the remaining lifetime of a power supply are known to those of skill in the art and will not be discussed in detail. Decision block 235 checks to see if the remaining battery lifetime is below a threshold value. If it is below the threshold then the process flow continues on to filtering spatial frequencies at step 215 in order to preserve the remaining battery life. If decision block 235 does not find the estimated battery lifetime to be below the threshold, then the filtering step 215 is bypassed. In this way, higher quality images can be viewed until battery power is low.
In another embodiment, decision block 235 checks if the estimated battery life is below multiple thresholds, and filter parameters may be set at step 240 depending on which threshold the estimate falls below. For example, if the estimated battery life is below a first threshold then step 215 filters spatial frequencies using a first parameter set. If the estimated battery life is below a second threshold then step 215 filters spatial frequencies using a second parameter set. In one aspect of this embodiment, the first threshold is higher (higher meaning there is more battery lifetime remaining) than the second threshold and the first parameter set results in less attenuation or smoothing of the particular frequencies than the second parameter set. In this way, more drastic filtering may result in more power savings as the estimated battery lifetime decreases. Battery life may be measured from a battery controller IC (integrated circuit).
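The multi-threshold logic of decision block 235 and step 240 could be sketched as follows; the threshold values and parameter sets are illustrative assumptions:

```python
# Hypothetical sketch of decision block 235 / step 240 with two thresholds.
# Threshold values and parameter sets are illustrative assumptions.

MILD_FILTER = {"vertical_gain": 0.75}    # first parameter set: light smoothing
STRONG_FILTER = {"vertical_gain": 0.25}  # second parameter set: heavy smoothing

def choose_filter(battery_life, first_threshold=0.5, second_threshold=0.2):
    """Return None (bypass filtering) or a parameter set, given the estimated
    fraction of battery life remaining. first_threshold > second_threshold:
    more remaining life means milder (or no) filtering."""
    if battery_life >= first_threshold:
        return None               # plenty of battery: show unfiltered images
    if battery_life >= second_threshold:
        return MILD_FILTER        # below the first threshold: light attenuation
    return STRONG_FILTER          # below the second threshold: drastic filtering

print(choose_filter(0.9), choose_filter(0.4), choose_filter(0.1))
# → None {'vertical_gain': 0.75} {'vertical_gain': 0.25}
```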
In another embodiment, step 230 is replaced by an estimate of the power required to drive the display array 30 to display a specific image. The estimate may be made in the driver controller 29, for example by using equations such as equations (2) and (3) above that depend on the driver protocol. In this embodiment, decision block 235 may be replaced by a decision block that compares the estimated power to display the image to a threshold. If the estimated power exceeds the threshold, then filtering is performed at step 215 to reduce the power required to display the image. If the estimated power is below the threshold, then the filtering step 215 is omitted. Multiple thresholds may be utilized in other embodiments, similar to the multiple battery lifetime thresholds discussed above, and multiple filtering parameter sets may be set at step 240 depending on which estimated power threshold is exceeded. Depending on the embodiment, selected steps of process 200 illustrated in
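Since equations (2) and (3) are not reproduced in this excerpt, the sketch below substitutes a generic proxy for the protocol-dependent power estimate: counting state transitions along the scan dimension of a binary image. The proxy and all names are assumptions.

```python
def estimated_drive_power(image, per_transition_cost=1.0):
    """Illustrative stand-in for a protocol-dependent power estimate:
    charge-based drivers pay roughly per state change along the scan
    dimension, so count vertical transitions in a binary image."""
    transitions = 0
    for r in range(1, len(image)):
        transitions += sum(a != b for a, b in zip(image[r - 1], image[r]))
    return per_transition_cost * transitions


def should_filter(image, power_budget):
    """Replacement decision block: filter at step 215 only when the
    per-image estimate exceeds the budget."""
    return estimated_drive_power(image) > power_budget
```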
In another embodiment, the spatial frequency filtering process 200 may be performed at multiple points in a decoding process for decompressing compressed image and/or video data. Such compressed image and/or video data may be compressed using JPEG, JPEG-2000, MPEG-2, MPEG-4, or H.264 encoders, as well as other image and video compression algorithms.
In addition to the compressed image decoder blocks 105, 110, 115 and 120, the display device 40 shown in
Performing the spatial frequency filtering in different areas of the decoding process may provide advantages depending on the embodiment of the display array 30. For example, filters 125a and 125b may operate on a relatively small portion of image data, thereby limiting the choice of basis images and/or spatial frequencies represented in the sub-image space. In contrast, filters 125c and 125d may have a complete image to work with, thereby having many more spatial frequencies and/or basis images to choose from to selectively filter. Any of the filters 125 may be switched to filtering in another domain by performing a transform, filtering in the new domain, and then inverse transforming back to the original domain. In this way, spatial and/or transform-domain filtering may be performed at any point in the decoding process.
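The transform, filter, inverse-transform pattern can be sketched for a single row or column of samples; a naive O(n²) DFT stands in for whatever transform (e.g., a DCT) a given filter 125 would actually use, and the quarter-band cutoff is arbitrary:

```python
import cmath


def dft(x):
    """Naive forward DFT; O(n^2), fine for a short illustration."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) for k in range(n)]


def idft(X):
    """Inverse DFT, keeping only the real part of each sample."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]


def filter_in_frequency_domain(signal, attenuation=0.0):
    """Switch domains to filter: transform, scale down the bins in
    the upper half of the spectrum, and transform back."""
    X = dft(signal)
    n = len(X)
    for k in range(n):
        if min(k, n - k) > n // 4:  # bins nearest the Nyquist rate
            X[k] *= attenuation
    return idft(X)
```

A constant signal passes through untouched, while a signal alternating at the Nyquist rate is fully suppressed when the attenuation is zero.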
Having several candidate places to perform spatial frequency filtering and multiple domains in which to filter gives a designer a great deal of flexibility in optimizing the filtering to best attenuate the particular frequencies in the selected dimensions and thereby save power in driving the display array 30. In one embodiment, a system controller 130 controls the nature of the filtering performed by spatial frequency filters 125a through 125d (e.g., which domain the filtering is performed in, at which position in the decoding process the filtering is performed, and what level of filtering is provided). In one aspect of this embodiment, system controller 130 receives the estimated battery lifetime remaining for power supply 50 that is calculated in step 230 of process 200. In this aspect, the estimated battery lifetime is calculated in another module such as the driver controller 29. In another aspect of this embodiment, system controller 130 itself estimates the remaining battery lifetime. The estimated battery lifetime may be utilized by system controller 130 to determine the filtering parameter sets based on estimated battery lifetime thresholds as discussed above (see the discussion of decision block 235 and step 240). These filtering parameter sets may be transmitted to one or more of the spatial frequency filters 125a through 125d. In another aspect of this embodiment, system controller 130 receives an estimate of the power required to drive the display array 30 to display a specific image (this power estimate may replace the battery lifetime estimate at step 230). The estimate may be made in the driver controller 29. If the estimated power exceeds a threshold, then decision block 235 directs the process flow such that filtering is performed at step 215 to reduce the power required to display the image. If the estimated power is below the threshold, then the filtering step 215 is omitted.
Multiple thresholds may be utilized in other embodiments similar to the multiple battery lifetime thresholds discussed above. Multiple filtering parameter sets may be set at step 240 depending on which estimated power threshold is exceeded. System controller 130 may be software, firmware and/or hardware implemented in, e.g., the processor 21 and/or the driver controller 29.
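Putting the controller's choices together — where in the pipeline to filter, in which domain, and how strongly — a hypothetical dispatch for system controller 130 might read as follows; the stage names, domains, and numeric values are all invented for illustration:

```python
def plan_filtering(battery_minutes, thresholds=(60.0, 20.0)):
    """Sketch of system controller 130: pick the decode-pipeline
    stage (e.g., sub-image filters 125a/b vs. full-image filters
    125c/d), the filtering domain, and the attenuation level from
    the battery estimate.  None means bypass filtering entirely."""
    if battery_minutes >= thresholds[0]:
        return None
    strong = battery_minutes < thresholds[1]
    return {
        "stage": "full_image" if strong else "sub_image",
        "domain": "frequency" if strong else "spatial",
        "attenuation": 0.75 if strong else 0.25,
    }
```

The same structure could instead be keyed to a per-image power estimate, matching the replacement decision block described above.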
An embodiment of an apparatus for processing image data includes means for displaying image data, the displaying means requiring more power to display image data comprising particular spatial frequencies in a first dimension than to display image data comprising the particular spatial frequencies in a second dimension. The apparatus further includes means for receiving image data; means for filtering the received image data such that the image data at particular spatial frequencies in a first dimension are attenuated more than image data at the particular spatial frequencies in a second dimension are attenuated, so as to reduce power consumed by the displaying means; and driving means for providing the filtered image data to the displaying means. With reference to
While the above detailed description has shown, described, and pointed out novel features of the invention as applied to various embodiments, it will be understood that various omissions, substitutions, and changes in the form and details of the device or process illustrated may be made by those skilled in the art without departing from the spirit of the invention. As will be recognized, the present invention may be embodied within a form that does not provide all of the features and benefits set forth herein, as some features may be used or practiced separately from others.
Number | Date | Country | |
---|---|---|---|
20070147688 A1 | Jun 2007 | US |