This disclosure relates to imaging systems, such as lensless image capture systems, including lensless image capture systems integrated with electromechanical systems and devices.
Electromechanical systems (EMS) include devices having electrical and mechanical elements, actuators, transducers, sensors, optical components such as mirrors and optical films, and electronics. EMS devices or elements can be manufactured at a variety of scales including, but not limited to, microscales and nanoscales. For example, microelectromechanical systems (MEMS) devices can include structures having sizes ranging from about a micron to hundreds of microns or more. Nanoelectromechanical systems (NEMS) devices can include structures having sizes smaller than a micron including, for example, sizes smaller than several hundred nanometers. Electromechanical elements may be created using deposition, etching, lithography, and/or other micromachining processes that etch away parts of substrates and/or deposited material layers, or that add layers to form electrical and electromechanical devices.
One type of EMS device is called an interferometric modulator (IMOD). The term IMOD or interferometric light modulator refers to a device that selectively absorbs and/or reflects light using the principles of optical interference. In some implementations, an IMOD display element may include a pair of conductive plates, one or both of which may be transparent and/or reflective, wholly or in part, and capable of relative motion upon application of an appropriate electrical signal. For example, one plate may include a stationary layer deposited over, on or supported by a substrate and the other plate may include a reflective membrane separated from the stationary layer by an air gap. The position of one plate in relation to another can change the optical interference of light incident on the IMOD display element. IMOD-based display devices have a wide range of applications, and are anticipated to be used in improving existing products and creating new products, especially those with display capabilities.
Many devices include displays (such as IMOD-based displays) and also imaging systems, such as cameras. Often, the camera includes a relatively small aperture with a lens that focuses ambient light from a scene to be captured onto a relatively small area of a sensor. By focusing the incident light from the scene onto the sensor, a real spatial image of the scene can be formed on the sensor. To meet market demands and design criteria for devices incorporating imaging systems, new imaging systems are continually being developed.
The systems, methods and devices of this disclosure each have several innovative aspects, no single one of which is solely responsible for the desirable attributes disclosed herein.
One innovative aspect of the subject matter described in this disclosure can be implemented in an imaging system. The imaging system can include a light sensor, a light guide, an optical pattern generator, and a processor. The light guide can include a plurality of light turning features. At least some of the light turning features can be configured to receive ambient light and to direct the ambient light out through an output surface of the light guide to the light sensor. The optical pattern generator can be disposed between the output surface of the light guide and the light sensor. The optical pattern generator can be configured to generate a light intensity pattern upon the passage of the ambient light through the optical pattern generator, and to project the light intensity pattern onto the light sensor. The processor can be in communication with the light sensor and can be configured to construct an image based on the light intensity pattern.
In some implementations, the ambient light incident on different portions of a major surface of the light guide can cause different light intensity patterns. The processor can be configured to access a database that includes reference characterizations of light intensity patterns. Each reference characterization can be associated with light incident on a different portion of the major surface. The processor can also be configured to determine which portions of the major surface received ambient light based upon the light intensity patterns and the reference characterizations.
In some implementations, the light turning features are configured to receive the ambient light from substantially the same range of angular directions. Furthermore, each light turning feature can be configured to turn the ambient light received from a range of angular directions. The ranges for at least some of the light turning features can at least partially overlap. At least some of the light turning features can be configured to turn the ambient light received from a cone having an acceptance angle range of about 60 degrees to about 90 degrees, relative to a central axis of the cone. In some implementations, the light turning features include light turning facets. For example, each of the light turning facets can include sides of a truncated cone.
In some implementations of the imaging system, the light sensor is disposed facing an edge of the light guide. One or more additional light sensors can be disposed facing one or more other edges of the light guide. The optical pattern generator can be configured to project the light intensity pattern onto the one or more additional light sensors. The optical pattern generator can include an array of apertures or an array of lenses. The array of lenses can include an array of curved surfaces facing the light sensor or an array of curved surfaces facing the light-output surface of the light guide.
In some implementations, the imaging system further includes a display device underlying the light guide. For example, the display device can be a reflective display. The reflective display can include a plurality of interferometric modulator display elements. In some implementations, the processor can be configured to communicate with the display and can be configured to process image data.
Another innovative aspect of the subject matter described in the disclosure can be implemented in an imaging system. The imaging system can include a light sensor, a light guide, an optical pattern generating means, and means for processing. The light guide can include a plurality of light turning means. At least some of the light turning means can be configured to receive ambient light and to direct the ambient light out through an output surface of the light guide to the light sensor. The optical pattern generating means can be disposed between the output surface of the light guide and the light sensor. The optical pattern generating means can be configured to generate a light intensity pattern upon the passage of the ambient light through the optical pattern generating means, and to project the light intensity pattern onto the light sensor. The means for processing can be in communication with the light sensor. The processing means can be configured to construct an image based on the light intensity pattern.
In some implementations, the light turning means can include light turning facets. The optical pattern generating means can include at least one of an array of apertures and an array of lenses. For example, the array of apertures can include an opaque edge mask with apertures. As another example, the array of lenses can include at least one of an array of curved surfaces facing the light sensor and an array of curved surfaces facing the light-output surface of the light guide. In certain implementations, the processing means can include a processor.
In various implementations, ambient light incident on different portions of a major surface of the light guide can cause different light intensity patterns. The processing means can be configured to access a database that includes reference characterizations of light intensity patterns. Each reference characterization can be associated with light incident on a different portion of the major surface. The processing means can also determine which portions of the major surface received ambient light based upon the light intensity patterns and the reference characterizations. The light turning means can be configured to receive the ambient light from substantially the same range of angular directions.
Another innovative aspect of the subject matter described in this disclosure can be implemented in a non-transitory tangible computer storage medium. The computer storage medium can have instructions stored to direct a processor to construct an image. The instructions can direct the processor to construct an image by receiving signals indicative of a light intensity pattern from a light sensor. The light intensity pattern can correspond to light distributions caused by the ambient light incident on different portions of a major surface of a light guide. The instructions can also direct the processor to construct the image by accessing a database that includes reference characterizations of different light intensity patterns. Each reference characterization can be associated with light incident on a different portion of the major surface. The instructions can also include determining which portions of the major surface received ambient light based upon the light intensity pattern and the reference characterizations; and constructing the image based on the determined portions. In some implementations, determining which portions of the major surface received ambient light can include using a pseudo-inverse matrix relating to the reference characterizations.
Details of one or more implementations of the subject matter described in this disclosure are set forth in the accompanying drawings and the description below. Although some of the examples provided in this disclosure are described in terms of EMS and MEMS-based displays, the concepts provided herein may apply to non-display devices and to other types of displays such as liquid crystal displays, organic light-emitting diode (“OLED”) displays, and field emission displays. Other features, aspects, and advantages will become apparent from the description, the drawings and the claims. Note that the relative dimensions of the following figures may not be drawn to scale.
FIGS. 12C-12D illustrate images constructed with the experimental imaging system shown in
Like reference numbers and designations in the various drawings indicate like elements.
The following description is directed to certain implementations for the purposes of describing the innovative aspects of this disclosure. However, a person having ordinary skill in the art will readily recognize that the teachings herein can be applied in a multitude of different ways. The described implementations may be implemented in any device, apparatus, or system that can be configured to display an image, whether in motion (such as video) or stationary (such as still images), and whether textual, graphical or pictorial. More particularly, it is contemplated that the described implementations may be included in or associated with a variety of electronic devices such as, but not limited to: mobile telephones, multimedia Internet enabled cellular telephones, mobile television receivers, wireless devices, smartphones, Bluetooth® devices, personal data assistants (PDAs), wireless electronic mail receivers, hand-held or portable computers, netbooks, notebooks, smartbooks, tablets, printers, copiers, scanners, facsimile devices, global positioning system (GPS) receivers/navigators, cameras, digital media players (such as MP3 players), camcorders, game consoles, wrist watches, clocks, calculators, television monitors, flat panel displays, electronic reading devices (for example, e-readers), computer monitors, auto displays (including odometer and speedometer displays, etc.), cockpit controls and/or displays, camera view displays (such as the display of a rear view camera in a vehicle), electronic photographs, electronic billboards or signs, projectors, architectural structures, microwaves, refrigerators, stereo systems, cassette recorders or players, DVD players, CD players, VCRs, radios, portable memory chips, washers, dryers, washer/dryers, parking meters, packaging (such as in electromechanical systems (EMS) applications including microelectromechanical systems (MEMS) applications, as well as non-EMS applications), aesthetic structures (such as display of images on a piece of jewelry or clothing) and a variety of EMS devices. The teachings herein also can be used in non-display applications such as, but not limited to, electronic switching devices, radio frequency filters, sensors, accelerometers, gyroscopes, motion-sensing devices, magnetometers, inertial components for consumer electronics, parts of consumer electronics products, varactors, liquid crystal devices, electrophoretic devices, drive schemes, manufacturing processes and electronic test equipment. Thus, the teachings are not intended to be limited to the implementations depicted solely in the Figures, but instead have wide applicability as will be readily apparent to one having ordinary skill in the art.
Cameras are used in many applications to capture images of a scene by using one or more lenses that focus light onto a sensor. The lens(es) may be configured to accept ambient light that is incident upon the lens(es) from a range of angles and to focus the light on the sensor to form a real spatial image on the sensor. Cameras can be difficult to integrate into devices where a small or thin form factor is desired, since the cameras can be relatively deep and bulky structures, due to, for example, the need to accommodate optical elements and to provide a length for light to properly focus on a sensor. Simply reducing the sizes of the cameras, however, can reduce the apertures of the cameras, which can decrease light collection efficiency and degrade image quality.
In some implementations, rather than a lens, a light guide may be used to collect light and direct the light to a light sensor. The light guide can provide a relatively large surface area for receiving light and, in some implementations, a substantially flat surface that can be integrated into devices to provide other functionality. In addition, certain implementations described herein include an imaging method and system, which can allow an image of a scene to be captured without directing light to form the image on the light sensor. Rather, light received at different locations on the light guide's surface is made to form different patterns on the light sensor. The different patterns may be substantially unique for each location on the light guide surface. Detection of different patterns by the light sensor allows an image to be constructed by correlating those patterns with the particular locations on the light guide surface which have received light. Thus, the locations on the light guide surface receiving light can be mapped and an image constructed based on this mapping.
For example, in some implementations, an imaging system can include a light sensor, a light guide, an optical pattern generator, and a processor. The light guide can include light turning features configured to receive and direct ambient light out through an output surface of the light guide to overlapping portions of the light sensor. The optical pattern generator can generate a light intensity pattern upon the passage of the ambient light through the optical pattern generator. The optical pattern generator can also project the light intensity pattern onto the light sensor. The processor, in communication with the light sensor, can construct an image based on the light intensity pattern. Some implementations can include light turning features that can receive light from substantially all and any angular directions, for example, receive light without angle discrimination.
Particular implementations of the subject matter described in this disclosure can be implemented to realize one or more of the following potential advantages. Various implementations enable the design of thin imaging systems, exceptionally thin imaging systems in some cases. For example, various implementations can construct an image without needing the use of lens(es) to form a real spatial image on a light sensor, and thus without needing to consider or accommodate the focal length(s) of the lens(es). This may reduce the costs and bulk of devices by eliminating the lens(es), their associated costs, and their associated need for adequate focal lengths. In some implementations, the number of components of an electrical device, such as a smartphone or mobile computing device, may be reduced where surfaces can be used for more than one purpose, for example, for both displaying and capturing images. In some implementations, a display for a computer, cell phone, smartphone, personal digital assistant, or other electronic device, including mobile devices, may be able to display images to a viewer while also collecting ambient light for imaging objects that are in front of the display. In this way, the display may serve both the purpose of displaying and capturing images. Because the area of the display can be used to collect light, the amount of light flux collected for imaging can increase relative to the amount collected for a conventional camera typically used with displays, thereby increasing the sensitivity of the imaging system and, ultimately, the image quality. In some implementations, for example, in a two-way video communication system, two or more video conference participants may watch live video images of each other. The display screen used by a participant may itself include an imaging system to take a live, moving image of the participant to send for the other participants to view. In some implementations, the light guide may be any transparent structure, for example, an architectural structure such as part of a wall or window, thereby allowing integration of the imaging system in many common structures. In addition, manufacturing yields may be increased and costs decreased, since the use of patterns to construct images can increase the margins of error for various components of the imaging system. For instance, the relevant reference characterizations of patterns for correlating light received by the light guide may be calibrated for each system, thereby allowing a high level of tolerance for variations between individual systems.
In general, in various implementations, to capture an image of a scene, light from the scene impinges on the light guide 120, which has light receiving portions p1, p2, p3, . . . pm having light turning features 130 that direct the light through the light guide 120, through the optical pattern generator 140, and to overlapping portions s1, s2, s3, . . . sn of the light sensor 110 facing the light output edge 123 of the light guide 120. Ambient light incident on different portions p1, p2, p3, . . . pm of a light receiving surface of the light guide 120 can cause different light intensity patterns. In some implementations where two or more light turning features 130 from different portions (for example, p1 and p2) of the light guide 120 receive and direct light to the same portion (for example, s2) of the light sensor 110, the light intensity pattern received at the portion (for example, s2) of the light sensor 110 can be a superposition (or the superimposed light intensity pattern) of each individual light intensity pattern for each of the different portions (for example, p1 and p2) of the light guide 120. Because each light receiving portion p1, p2, p3, . . . or pm (which may be considered a virtual pixel, photosite, or sensel) produces a unique reference characterization of a light intensity pattern, it is possible to determine which of the light receiving portions p1, p2, p3, . . . or pm have received light based on the light intensity pattern produced by the received light. Using this information, an image of the scene can be constructed, for example, by effectively “filling in” a grid (implemented in software) at locations corresponding to the light receiving portions p1, p2, p3, . . . or pm that were found, by correlating the observed light intensity patterns with the reference characterizations, to have received light.
In certain implementations, the light guide 120 can be substantially planar and transparent. In some implementations, the light guide 120 can be formed of one or more layers of optically transmissive material. Examples of materials can include the following: acrylics, acrylate copolymers, UV-curable resins, polycarbonates, cycloolefin polymers, polymers, organic materials, inorganic materials, silicates, alumina, sapphire, polyethylene terephthalate (PET), polyethylene terephthalate glycol (PET-G), silicon oxynitride, and/or combinations thereof.
In some implementations, the light guide 120 can be a slab of glass or plastic that can overlay a display. Examples of suitable display types can include IMOD-based displays or liquid crystal-based displays. As shown in
The light guide 120 also can include a back surface 122. The back surface 122 can refer to a surface of the light guide 120 opposite the front surface 121. Additionally, the light guide 120 can include edges 123, 124, 125, and 126. While the light output surface 123 is illustrated as one of the edges of the light guide 120, in various implementations, it is possible for the light output surface to be one or more of the front surface 121, the back surface 122, and the edges of the light guide 120 disposed about the front 121 and back 122 surfaces (e.g., edges 123, 124, 125, and 126).
The front surface 121, the back surface 122, and the edges 123-126 can be rectangular in shape as shown in
In certain implementations, ambient light can be incident on different portions of a major surface, for example, the front surface 121 of the light guide 120. As shown in
The light guide 120 can include light turning features 130. Some of the light turning features 130 can be configured to receive the ambient light and to direct at least a portion of the light out through an output surface 123 of the light guide 120 to overlapping portions of the light sensor 110.
Some of the light turning features 130 can be formed onto one or more of the major and minor surfaces. As shown in
The light turning features 130 can be configured to turn ambient light received within particular ranges of angular direction(s). In some implementations, the light turning features 130 can be configured to turn ambient light received from the same or different angular direction and/or ranges of angular directions, with the range of angular directions for a particular light turning feature 130 also referred to as the angular directions encompassed within the cone of acceptance angles for that light turning feature 130.
In some implementations, different light turning features can have cones of acceptance angles having central axes extending in different angular directions and/or different sized cones of acceptance angles. In some implementations, at least some of the light turning features 130 can have cones of acceptance angles that are centered in the same direction. Also, in some implementations, at least some of the cones can have ranges of acceptance angles that at least partially overlap. For example, for two light turning features each having a cone of acceptance angles with a central axis parallel to a direction normal to the surface receiving the light, one light turning feature may turn ambient light received within a cone having a θacc of about 60 degrees to about 90 degrees, while the other light turning feature may turn ambient light received within a cone having a θacc of about 70 degrees to about 90 degrees.
In some other implementations, each of the light turning features 130 in the light guide 120 can be configured to turn ambient light received from substantially the same range of angular directions. In some implementations, each of the light turning features 130 can be configured to isotropically turn ambient light. The term “isotropic,” as used herein, can refer to the property of turning light received from a wide range of directions, for example, up to and including cones of acceptance angles that include θacc's of about 25 degrees, 45 degrees, 65 degrees, or 90 degrees. Thus, in certain implementations, the light turning features 130 do not necessarily receive light in an angle discriminatory manner, but can receive and turn light from substantially all or any angular direction. For example, in some implementations, all light striking the light guide 120 (and not reflected off the light guide 120) can have an equal probability of being turned by a light turning feature 130. In some implementations, the ranges of acceptance angles may be narrow, which can improve resolution. For example, the cone of acceptance angles can have a θacc of, e.g., below about 10 degrees, below about 7 degrees, below about 5 degrees, below about 3 degrees, or below about 2 degrees.
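For illustration only, the following is a minimal Python sketch of an acceptance-cone test, treating θacc as the half-angle of the cone about its central axis; the function name and the example directions are assumptions made for the sketch, not elements of the disclosed light turning features.

```python
import numpy as np

def within_acceptance_cone(ray_dir, cone_axis, theta_acc_deg):
    """Return True if an incident ray direction lies inside a light turning
    feature's cone of acceptance angles (half-angle theta_acc_deg about cone_axis)."""
    ray = np.asarray(ray_dir, dtype=float)
    axis = np.asarray(cone_axis, dtype=float)
    ray = ray / np.linalg.norm(ray)
    axis = axis / np.linalg.norm(axis)
    angle = np.degrees(np.arccos(np.clip(np.dot(ray, axis), -1.0, 1.0)))
    return angle <= theta_acc_deg

# A wide (nearly isotropic) cone accepts steeply inclined light; a narrow cone
# accepts only near-normal light, which can improve resolution.
normal = (0.0, 0.0, 1.0)  # cone axis normal to the light receiving surface
steep_ray = (np.sin(np.radians(60.0)), 0.0, np.cos(np.radians(60.0)))
print(within_acceptance_cone(steep_ray, normal, 90.0))  # True
print(within_acceptance_cone(steep_ray, normal, 10.0))  # False
```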
With reference again to
In certain implementations, the optical pattern generator 140 can create a projection of the light profile with modulation onto the light sensor 110, from which the light sensor 110 can resolve finer details of the light distribution from not only the intensity but also the phase information in the light passing through the optical pattern generator 140. An optical pattern generator 140 as used herein can refer to a device that is configured to modulate a light profile (for example, a light intensity distribution) to form a light intensity pattern. For example, the light intensity pattern can be a pattern with discrete bands of light, which allows quantization of the light intensity distribution (for example, constraining gradual, continuous changes in the light intensity distribution into a smaller set of discrete values, the differences between which are more easily detected). For example, an optical pattern generator 140 can refer to an optical component that can discriminate intensity, spatial, and/or angular information of light and facilitate quantization of this information, which may otherwise change so gradually between light received at different points on the front surface 121 that detection of the change is difficult. Therefore, in certain implementations, an optical pattern generator 140 can project a light intensity pattern derived by modulating light, from the light output surface 123 of the light guide 120, so that the light sensor 110 can detect a signal with increased discrimination compared to a relatively broad signal without modulation. For example, an optical pattern generator 140 can change a relatively broad signal to multiple peaks with variable spacing. The intensity and spacing of the discrete peaks can be easier to discriminate and measure. In some implementations, the optical pattern generator 140 can include an array of apertures or an array of lenses. The light profile can be modulated with a frequency determined by the spacing of the array of apertures or by the spacing of the array of lenses through which light outputted from the light guide 120 travels to the light sensor 110.
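For illustration only, a minimal sketch of the quantization idea described above: a broad, slowly varying light profile multiplied by an idealized aperture-array transmission yields discrete bands whose heights and spacing are easier to discriminate than the gradual original. The profile shape, pitch, and opening width are assumed example values, not parameters from this disclosure.

```python
import numpy as np

# Position along the light sensor 110 (arbitrary units; example values only).
x = np.linspace(0.0, 10.0, 1000)

# A broad, slowly varying light profile arriving from the light output surface 123.
broad_profile = np.exp(-((x - 5.0) ** 2) / 8.0)

# Idealized transmission of an aperture array: one inside each opening, zero in
# the opaque regions between openings (pitch and opening width are assumed).
pitch, opening = 1.0, 0.2
aperture_mask = ((x % pitch) < opening).astype(float)

# Projected light intensity pattern: discrete bands whose heights sample the
# broad profile, which are easier to discriminate than the gradual original.
pattern = broad_profile * aperture_mask
print("discrete bands:", 1 + int(np.count_nonzero(np.diff(aperture_mask) > 0)))
```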
In some arrangements, as shown and discussed with respect to
As shown in
As discussed above, the light turning features 130 can direct at least a portion of the ambient light out through an output surface 123 of the light guide 120 and to overlapping portions of the light sensor 110. For example, one light turning feature may direct light to portions s1, s2, s3, . . . s10 of the light sensor 110 (see
Although
The imaging system 100 can include a processor 150 as shown in
For example, when light strikes the light guide 120 and is turned by a light turning feature 130, the light turning feature 130 can effectively be considered a discrete light source. Thus, the plurality of light turning features 130 can effectively be considered an extended source of light on the light guide surface, which can be described by the spatial distribution U(x,y). Dividing the light receiving surface into a rectangular grid of discretized light sources (for example, corresponding to the light receiving portions p1, p2, p3, . . . pm) can give the distribution Ui(x,y), where i can represent the ith portion pi. The number i can vary from 1 to the total number of light receiving portions m. If the light receiving surface is broken into an M by N rectangular grid of portions, then the total number of portions m can be equal to the product MN. The portions can be described in terms of an (m,1) column vector U, where each vector element i can be proportional to the light flux received and turned in the light guide 120 at the ith position (for example, portion pi). The light flux (which can be proportional to Ui) can propagate in the light guide 120 out through its periphery, pass through the optical pattern generator 140, and be detected by the light sensor 110. For example, the light flux can be collected by the kth sensing portion sk, where k can be a number from 1 to the total number of sensing portions n. The sensor signal can be represented as an (n,1) column vector S. The efficiency by which light from the ith portion pi propagates to the kth sensing portion sk can be given by Gk,i. The efficiencies can populate an (n, m) matrix G. In some implementations, the matrix G may refer to a propagation matrix which can map the propagation of light from each portion pi on the light receiving surface to each light sensing portion sk. The formation of a signal S from the light receiving surface U may be described by the following matrix equation:
S=GU (1)
where S can relate to the light intensity pattern received by the light sensor 110, G can relate to the reference characterizations, and U can relate to the pixelated image to be constructed.
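By way of illustration only, the following Python sketch shows Equation (1) numerically: an M by N grid of light receiving portions flattened into the column vector U, an (n, m) propagation matrix G, and the resulting sensor signal S. The grid size, the number of sensing portions, and the distance-based fall-off used to populate G are assumptions made for the example, not values from this disclosure; in practice G would be obtained by calibration as described below.

```python
import numpy as np

M, N = 4, 4                 # assumed grid of light receiving portions (m = M*N)
m = M * N
n = 32                      # assumed number of light sensing portions s1..sn

# Assumed propagation matrix G: entry G[k, i] is the efficiency with which light
# turned at portion p_i reaches sensing portion s_k. It is populated here with a
# simple distance-based fall-off purely for illustration; in practice it would be
# measured during calibration.
portion_xy = np.stack(np.meshgrid(np.arange(N), np.arange(M)), -1).reshape(m, 2)
sensor_x = np.linspace(0, N - 1, n)
dist = np.abs(sensor_x[:, None] - portion_xy[None, :, 0]) + portion_xy[None, :, 1] + 1.0
G = 1.0 / dist              # (n, m) propagation matrix

# U: light flux received at each portion; a single bright portion for clarity.
U = np.zeros(m)
U[5] = 1.0

# Equation (1): the light intensity pattern seen by the sensor is S = G U.
S = G @ U
print(S.shape)              # (32,)
```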
To determine the reference characterization, when each light receiving portion p1, p2, p3, and p4 of the light receiving surface is individually illuminated, for example, with a light source, each light sensing portion s1, s2, s3, . . . s32 of the light sensor 110 can receive light according to its position (for example, distance) relative to the illuminated portions p1, p2, p3, and p4 of the light receiving surface. For example, when only light receiving portion p1 is illuminated, light sensing portions s1, s2, s3, s4, s29, s30, s31, and s32 of the light sensor 110 closest to the light receiving portion p1 can receive the greatest amounts of light compared to the other light sensing portions. Likewise, light sensing portions s16 and s17 farthest from light receiving portion p1 can receive the lowest amounts of light compared to the other light sensing portions.
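A minimal sketch of that calibration procedure, under the assumption of hypothetical hooks illuminate_portion and measure_sensor_response into the test light source and the light sensor 110: each light receiving portion is illuminated one at a time, and the recorded light intensity pattern becomes one column of the matrix of reference characterizations.

```python
import numpy as np

def calibrate_reference_characterizations(num_portions, num_sensing_portions,
                                          illuminate_portion, measure_sensor_response):
    """Build the (n, m) matrix of reference characterizations column by column.

    illuminate_portion(i) and measure_sensor_response() are hypothetical hooks
    into the test light source and the light sensor 110, respectively.
    """
    G = np.zeros((num_sensing_portions, num_portions))
    for i in range(num_portions):
        illuminate_portion(i)                 # shine light only on portion p_i
        G[:, i] = measure_sensor_response()   # record the light intensity pattern
    return G

# Example usage with simulated hooks (purely illustrative).
rng = np.random.default_rng(0)
true_G = rng.random((32, 4))                  # 4 portions, 32 sensing portions
state = {"i": 0}
G = calibrate_reference_characterizations(
    4, 32,
    illuminate_portion=lambda i: state.update(i=i),
    measure_sensor_response=lambda: true_G[:, state["i"]],
)
assert np.allclose(G, true_G)
```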
Certain implementations can construct an image electronically, such as by software, using the light intensity pattern collected by the light sensor 110 and the reference characterizations discussed above, without first forming or projecting the image on the light sensor 110 (that is, without, for example, focusing an image onto the light sensor 110 using lenses). As an example, an algorithm to determine the pixelated image from the light intensity pattern and the reference characterizations can include solving for U with the following relationship:
U=G⁻¹S (2)
where G⁻¹ can be a pseudo-inverse matrix relating to the reference characterizations. The pseudo-inverse can provide one method to achieve image construction. It can be equivalent to a least-squares optimization with a non-negativity constraint. Other solutions and algorithms are possible.
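A minimal sketch of Equation (2), using the Moore-Penrose pseudo-inverse available as numpy.linalg.pinv, with a non-negative least-squares fit (scipy.optimize.nnls) shown as the alternative mentioned above. The matrix sizes and the randomly generated G are assumptions made only so the example runs on its own; in practice G would hold the measured reference characterizations.

```python
import numpy as np
from scipy.optimize import nnls

M, N, n = 4, 4, 32                                # assumed grid and sensor sizes
rng = np.random.default_rng(1)
G = rng.random((n, M * N))                        # reference characterizations (assumed)
U_true = (rng.random(M * N) > 0.7).astype(float)  # portions that received light
S = G @ U_true                                    # observed pattern, Equation (1)

# Equation (2): pseudo-inverse reconstruction.
U_pinv = np.linalg.pinv(G) @ S

# Least-squares with a non-negativity constraint, as mentioned above.
U_nnls, _ = nnls(G, S)

image = U_nnls.reshape(M, N)                      # "fill in" the M-by-N grid
print(np.allclose(U_pinv, U_true, atol=1e-6), np.allclose(U_nnls, U_true, atol=1e-6))
```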
In some implementations, directing the ambient light can include turning, by the light turning features, the ambient light incident upon the major surface of the light guide from substantially all angular directions. Furthermore, in some other implementations, directing the ambient light can include turning, by the light turning features 130, the ambient light received within a particular range of angular directions (such as within a particular cone of acceptance angles). In some implementations, some of the light turning features 130 can turn the ambient light received from substantially the same range of angular directions. In other implementations, some of the light turning features 130 can turn the ambient light received from different ranges of angular directions. At least some of the ranges can partially overlap. Furthermore, directing the ambient light can include, in some implementations, directing the light received by the light turning features 130 from the ranges of angular directions out through the output surface 123 of the light guide 120 and through an optical pattern generator 140. In some implementations, the optical pattern generator 140 can include an array of apertures and/or an array of lenses.
In some implementations of the method 500, constructing the image as shown in block 550 can include receiving the light intensity pattern corresponding to light distributions caused by the ambient light incident on different portions of a major surface of the light guide 120. Constructing the image can also include accessing a database that includes reference characterizations of light intensity patterns. Each reference characterization can be associated with light incident on a different portion of the major surface. Constructing the image can further include determining which portions of the major surface received ambient light based upon the light intensity patterns and the reference characterizations.
Some implementations of the imaging system 100 described herein (for example, as shown in
As described herein, certain implementations of imaging systems 100 and methods 500 can be utilized in display devices including interferometric modulator (IMOD) display elements.
The IMOD display device can include an array of IMOD display elements which may be arranged in rows and columns. Each display element in the array can include at least a pair of reflective and semi-reflective layers, such as a movable reflective layer (i.e., a movable layer, also referred to as a mechanical layer) and a fixed partially reflective layer (i.e., a stationary layer), positioned at a variable and controllable distance from each other to form an air gap (also referred to as an optical gap, cavity or optical resonant cavity). The movable reflective layer may be moved between at least two positions. For example, in a first position, i.e., a relaxed position, the movable reflective layer can be positioned at a distance from the fixed partially reflective layer. In a second position, i.e., an actuated position, the movable reflective layer can be positioned more closely to the partially reflective layer. Incident light that reflects from the two layers can interfere constructively and/or destructively depending on the position of the movable reflective layer and the wavelength(s) of the incident light, producing either an overall reflective or non-reflective state for each display element. In some implementations, the display element may be in a reflective state when unactuated, reflecting light within the visible spectrum, and may be in a dark state when actuated, absorbing and/or destructively interfering light within the visible range. In some other implementations, however, an IMOD display element may be in a dark state when unactuated, and in a reflective state when actuated. In some implementations, the introduction of an applied voltage can drive the display elements to change states. In some other implementations, an applied charge can drive the display elements to change states.
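For illustration only, a highly simplified two-beam interference sketch of why the gap set by the movable reflective layer selects a reflective or dark state. It assumes normal incidence, equal amplitudes, and no phase shifts at the interfaces, which real IMOD display elements do not satisfy exactly; it is not the disclosure's optical model.

```python
import numpy as np

def relative_intensity(gap_nm, wavelength_nm):
    """Two-beam interference between light reflected from the fixed partially
    reflective layer and from the movable reflective layer across an air gap.
    Simplified: normal incidence, equal amplitudes, no interface phase shifts,
    so the round-trip path difference is 2 * gap."""
    phase = 2.0 * np.pi * (2.0 * gap_nm) / wavelength_nm
    return 0.5 * (1.0 + np.cos(phase))   # 1 = constructive, 0 = destructive

green = 550.0  # nm
print(relative_intensity(275.0, green))   # ~1: reflective (bright) state
print(relative_intensity(137.5, green))   # ~0: dark state
```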
The depicted portion of the array in
In
The optical stack 16 can include a single layer or several layers. The layer(s) can include one or more of an electrode layer, a partially reflective and partially transmissive layer, and a transparent dielectric layer. In some implementations, the optical stack 16 is electrically conductive, partially transparent and partially reflective, and may be fabricated, for example, by depositing one or more of the above layers onto a transparent substrate 20. The electrode layer can be formed from a variety of materials, such as various metals, for example indium tin oxide (ITO). The partially reflective layer can be formed from a variety of materials that are partially reflective, such as various metals (e.g., chromium and/or molybdenum), semiconductors, and dielectrics. The partially reflective layer can be formed of one or more layers of materials, and each of the layers can be formed of a single material or a combination of materials. In some implementations, certain portions of the optical stack 16 can include a single semi-transparent thickness of metal or semiconductor which serves as both a partial optical absorber and electrical conductor, while different, electrically more conductive layers or portions (for example, of the optical stack 16 or of other structures of the display element) can serve to bus signals between IMOD display elements. The optical stack 16 also can include one or more insulating or dielectric layers covering one or more conductive layers or an electrically conductive/partially absorptive layer.
In some implementations, at least some of the layer(s) of the optical stack 16 can be patterned into parallel strips, and may form row electrodes in a display device as described further below. As will be understood by one having ordinary skill in the art, the term “patterned” is used herein to refer to masking as well as etching processes. In some implementations, a highly conductive and reflective material, such as aluminum (Al), may be used for the movable reflective layer 14, and these strips may form column electrodes in a display device. The movable reflective layer 14 may be formed as a series of parallel strips of a deposited metal layer or layers (orthogonal to the row electrodes of the optical stack 16) to form columns deposited on top of supports, such as the illustrated posts 18, and an intervening sacrificial material located between the posts 18. When the sacrificial material is etched away, a defined gap 19, or optical cavity, can be formed between the movable reflective layer 14 and the optical stack 16. In some implementations, the spacing between posts 18 may be approximately 1-1000 μm, while the gap 19 may be approximately less than 10,000 Angstroms (Å).
In some implementations, each IMOD display element, whether in the actuated or relaxed state, can be considered as a capacitor formed by the fixed and moving reflective layers. When no voltage is applied, the movable reflective layer 14 remains in a mechanically relaxed state, as illustrated by the display element 12 on the left in
The processor 21 can be configured to communicate with an array driver 22. The array driver 22 can include a row driver circuit 24 and a column driver circuit 26 that provide signals to, for example a display array or panel 30. The cross section of the IMOD display device illustrated in
The display device 40 includes a housing 41, a display 30, an antenna 43, a speaker 45, an input device 48 and a microphone 46. The housing 41 can be formed from any of a variety of manufacturing processes, including injection molding, and vacuum forming. In addition, the housing 41 may be made from any of a variety of materials, including, but not limited to: plastic, metal, glass, rubber and ceramic, or a combination thereof. The housing 41 can include removable portions (not shown) that may be interchanged with other removable portions of different color, or containing different logos, pictures, or symbols.
The display 30 may be any of a variety of displays, including a bi-stable or analog display, as described herein. The display 30 also can be configured to include a flat-panel display, such as plasma, EL, OLED, STN LCD, or TFT LCD, or a non-flat-panel display, such as a CRT or other tube device. In addition, the display 30 can include an IMOD-based display, as described herein.
The components of the display device 40 are schematically illustrated in
The network interface 27 includes the antenna 43 and the transceiver 47 so that the display device 40 can communicate with one or more devices over a network. The network interface 27 also may have some processing capabilities to relieve, for example, data processing requirements of the processor 21. The antenna 43 can transmit and receive signals. In some implementations, the antenna 43 transmits and receives RF signals according to the IEEE 16.11 standard, including IEEE 16.11(a), (b), or (g), or the IEEE 802.11 standard, including IEEE 802.11a, b, g, n, and further implementations thereof. In some other implementations, the antenna 43 transmits and receives RF signals according to the Bluetooth® standard. In the case of a cellular telephone, the antenna 43 can be designed to receive code division multiple access (CDMA), frequency division multiple access (FDMA), time division multiple access (TDMA), Global System for Mobile communications (GSM), GSM/General Packet Radio Service (GPRS), Enhanced Data GSM Environment (EDGE), Terrestrial Trunked Radio (TETRA), Wideband-CDMA (W-CDMA), Evolution Data Optimized (EV-DO), 1xEV-DO, EV-DO Rev A, EV-DO Rev B, High Speed Packet Access (HSPA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Evolved High Speed Packet Access (HSPA+), Long Term Evolution (LTE), AMPS, or other known signals that are used to communicate within a wireless network, such as a system utilizing 3G, 4G or 5G technology. The transceiver 47 can pre-process the signals received from the antenna 43 so that they may be received by and further manipulated by the processor 21. The transceiver 47 also can process signals received from the processor 21 so that they may be transmitted from the display device 40 via the antenna 43.
In some implementations, the transceiver 47 can be replaced by a receiver. In addition, in some implementations, the network interface 27 can be replaced by an image source, which can store or generate image data to be sent to the processor 21. The processor 21 can control the overall operation of the display device 40. The processor 21 receives data, such as compressed image data from the network interface 27 or an image source, and processes the data into raw image data or into a format that can be readily processed into raw image data. The processor 21 can send the processed data to the driver controller 29 or to the frame buffer 28 for storage. Raw data typically refers to the information that identifies the image characteristics at each location within an image. For example, such image characteristics can include color, saturation and gray-scale level.
The processor 21 can include a microcontroller, CPU, or logic unit to control operation of the display device 40. The conditioning hardware 52 may include amplifiers and filters for transmitting signals to the speaker 45, and for receiving signals from the microphone 46. The conditioning hardware 52 may be discrete components within the display device 40, or may be incorporated within the processor 21 or other components.
The driver controller 29 can take the raw image data generated by the processor 21 either directly from the processor 21 or from the frame buffer 28 and can re-format the raw image data appropriately for high speed transmission to the array driver 22. In some implementations, the driver controller 29 can re-format the raw image data into a data flow having a raster-like format, such that it has a time order suitable for scanning across the display array 30. Then the driver controller 29 sends the formatted information to the array driver 22. Although a driver controller 29, such as an LCD controller, is often associated with the system processor 21 as a stand-alone Integrated Circuit (IC), such controllers may be implemented in many ways. For example, controllers may be embedded in the processor 21 as hardware, embedded in the processor 21 as software, or fully integrated in hardware with the array driver 22.
The array driver 22 can receive the formatted information from the driver controller 29 and can re-format the video data into a parallel set of waveforms that are applied many times per second to the hundreds, and sometimes thousands (or more), of leads coming from the display's x-y matrix of display elements.
In some implementations, the driver controller 29, the array driver 22, and the display array 30 are appropriate for any of the types of displays described herein. For example, the driver controller 29 can be a conventional display controller or a bi-stable display controller (such as an IMOD display element controller). Additionally, the array driver 22 can be a conventional driver or a bi-stable display driver (such as an IMOD display element driver). Moreover, the display array 30 can be a conventional display array or a bi-stable display array (such as a display including an array of IMOD display elements). In some implementations, the driver controller 29 can be integrated with the array driver 22. Such an implementation can be useful in highly integrated systems, for example, mobile phones, portable-electronic devices, watches or small-area displays.
In some implementations, the input device 48 can be configured to allow, for example, a user to control the operation of the display device 40. The input device 48 can include a keypad, such as a QWERTY keyboard or a telephone keypad, a button, a switch, a rocker, a touch-sensitive screen, a touch-sensitive screen integrated with the display array 30, or a pressure- or heat-sensitive membrane. The microphone 46 can be configured as an input device for the display device 40. In some implementations, voice commands through the microphone 46 can be used for controlling operations of the display device 40. Although not explicitly illustrated, one or more light sensors 113, 114, 115, and 116 (see
The power supply 50 can include a variety of energy storage devices. For example, the power supply 50 can be a rechargeable battery, such as a nickel-cadmium battery or a lithium-ion battery. In implementations using a rechargeable battery, the rechargeable battery may be chargeable using power coming from, for example, a wall socket or a photovoltaic device or array. Alternatively, the rechargeable battery can be wirelessly chargeable. The power supply 50 also can be a renewable energy source, a capacitor, or a solar cell, including a plastic solar cell or solar-cell paint. The power supply 50 also can be configured to receive power from a wall outlet.
In some implementations, control programmability resides in the driver controller 29 which can be located in several places in the electronic display system. In some other implementations, control programmability resides in the array driver 22. The above-described optimization may be implemented in any number of hardware and/or software components and in various configurations.
As used herein, a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. As an example, “at least one of: a, b, or c” is intended to cover: a, b, c, a-b, a-c, b-c, and a-b-c.
The various illustrative logics, logical blocks, modules, circuits and algorithm steps described in connection with the implementations disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. The interchangeability of hardware and software has been described generally, in terms of functionality, and illustrated in the various illustrative components, blocks, modules, circuits and steps described above. Whether such functionality is implemented in hardware or software depends upon the particular application and design constraints imposed on the overall system.
The hardware and data processing apparatus used to implement the various illustrative logics, logical blocks, modules and circuits described in connection with the aspects disclosed herein may be implemented or performed with a general purpose single- or multi-chip processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, or any conventional processor, controller, microcontroller, or state machine. A processor also may be implemented as a combination of computing devices, such as a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. In some implementations, particular steps and methods may be performed by circuitry that is specific to a given function.
In one or more aspects, the functions described may be implemented in hardware, digital electronic circuitry, computer software, firmware, including the structures disclosed in this specification and their structural equivalents, or in any combination thereof. Implementations of the subject matter described in this specification also can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on computer storage media for execution by, or to control the operation of, data processing apparatus.
If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. The steps of a method or algorithm disclosed herein may be implemented in a processor-executable software module which may reside on a computer-readable medium. For example, as shown in
Various modifications to the implementations described in this disclosure may be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of this disclosure. Thus, the claims are not intended to be limited to the implementations shown herein, but are to be accorded the widest scope consistent with this disclosure, the principles and the novel features disclosed herein. Additionally, a person having ordinary skill in the art will readily appreciate that the terms “upper” and “lower” are sometimes used for ease of describing the figures, and indicate relative positions corresponding to the orientation of the figure on a properly oriented page, and may not reflect the proper orientation of, for example, an IMOD display element as implemented.
Certain features that are described in this specification in the context of separate implementations also can be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation also can be implemented in multiple implementations separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, a person having ordinary skill in the art will readily recognize that such operations need not be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Further, the drawings may schematically depict one or more example processes in the form of a flow diagram. However, other operations that are not depicted can be incorporated in the example processes that are schematically illustrated. For example, one or more additional operations can be performed before, after, simultaneously, or between any of the illustrated operations. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products. Additionally, other implementations are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results.