This application relates generally to cameras and more specifically to camera apertures and camera shutters.
Miniature digital cameras have become very common features of personal computing devices such as mobile phones. These cameras typically have fixed apertures, because mechanical aperture plates are too large, too thick and/or too expensive for inclusion in small cameras of this type. These fixed apertures are generally small, because small apertures are suitable for taking photos in conditions of bright ambient light, e.g., outdoors. While a large aperture would be suitable for taking pictures in dim light, a fixed large aperture would not be appropriate for bright light conditions. Therefore, camera manufacturers implement small fixed apertures rather than large fixed apertures in miniature digital cameras, making the cameras unsatisfactory indoors or under other low-light conditions.
Such miniature cameras also lack mechanical shutters due to the same form-factor and cost limitations. As a result, these cameras generally use electronic switching, such as complementary metal-oxide-semiconductor (“CMOS”) switching, to control exposure time. This does not work very well for high-megapixel cameras, in part because the large amounts of data involved make it difficult to transfer the information collected by the sensor to memory quickly enough.
Some embodiments comprise at least one array that includes microelectromechanical systems (“MEMS”)-based light-modulating devices. Elements of the array(s) may be configured to absorb and/or reflect light when in a first position and to transmit light when in a second position. Such MEMS devices may have a fixed optical stack on a substantially transparent substrate and a movable mechanical stack or “plate” disposed at a predetermined air gap from the fixed stack. The optical stacks may be chosen such that when the movable stack is “up,” or separated from the fixed stack, most light entering the substrate passes through the two stacks and the air gap. When the movable stack is “down,” or close to the fixed stack, the combined stack may allow only a negligible amount of light to pass through.
Such an array may be controlled to function as a camera aperture and/or as a camera shutter. For example, a controller may cause the array to function as a shutter by causing the MEMS devices to open for a predetermined period of time. The predetermined period of time may be based, at least in part, on the intensity of ambient light, the intensity of a flash, the size of the camera aperture, etc. Some embodiments provide a variable aperture device that does not add significant thickness or cost to a camera module. Such embodiments may enable a camera to function well in both bright and dim light, to control depth of field, etc.
According to some such embodiments, the MEMS devices in a group may be gang-driven instead of being individually controlled. In such embodiments, the camera system may comprise a simple and relatively inexpensive controller for this purpose, as compared with a controller that is configured to individually control each MEMS device in the array.
In some embodiments, the array(s) may be controlled to allow partial transmission and partial reflection and/or absorption of light. For example, in some such embodiments, the array(s) may include a separate layer of material that can be made relatively more transmissive or relatively more absorptive. Accordingly, such embodiments may allow areas of an array that includes MEMS-based light-modulating devices to be only partially transmissive instead of substantially transmissive or substantially non-transmissive.
Some embodiments described herein provide a camera that includes a lens system, a first light detector, a first array and a controller. The first light detector may be configured to receive incoming light from the lens system. The first array may be configured to reflect or absorb incident light. The first array may comprise a first plurality of MEMS devices configured to reflect or absorb incident light when in a first position and to transmit incident light when in a second position. The controller may be configured to control the incoming light received by the light detector by controlling the first array.
The controller may be further configured to drive at least some of the MEMS devices to the second position for a predetermined period of time. The camera may also include a second light detector configured to detect an ambient light intensity and to provide ambient light intensity data to the controller. The controller may be further configured to determine the predetermined period of time based, at least in part, on the ambient light intensity data.
The controller may be further configured to control the first array to function as a camera shutter and/or as a variable camera aperture. The camera may also include a second array, which may comprise a second plurality of MEMS devices. The controller may be further configured to control the second array to function as a variable camera aperture or as a camera shutter. The controller may be configured to control the first array or the second array to transmit varying amounts of light.
In some embodiments, the camera may be part of a mobile device. For example, the camera may be part of a mobile device that is configured for data and/or voice communication. Although mobile devices are described in detail herein, the cameras described herein may be made part of many other types of devices.
Some methods are also described herein. Some such methods include processes of controlling light received by a light detector via a lens system and of capturing images via the light received by the light detector. The controlling process may involve controlling a first array comprising a first plurality of MEMS devices that are configured to reflect or absorb incident light when in a first position and to transmit incident light when in a second position.
The controlling process may also involve driving at least some of the MEMS devices to the second position, e.g., for a predetermined period of time. The controlling process may involve controlling the first array to transmit varying amounts of light.
The method may also involve detecting an ambient light intensity and calculating the predetermined period of time based, at least in part, on the ambient light intensity. The method may comprise controlling the first array to function as a camera shutter and/or as a variable camera aperture. The method may also involve controlling a second array to function as a variable camera aperture or as a camera shutter. The second array may comprise a second plurality of MEMS devices.
Alternative camera embodiments are described herein. Some such cameras include a lens system, an image capturing system and a light controlling system. The image capturing system may be configured to receive incoming light from the lens system. The light controlling system may be configured to reflect or absorb incident light when in a first position and to transmit incident light when in a second position.
The light controlling system may comprise a first array configured to function as a camera shutter. The first array may comprise a first plurality of MEMS devices. Alternatively, or additionally, the first array may be configured to function as a variable camera aperture. The light controlling system may also include a second array comprising a second plurality of MEMS devices. The second array may be configured to function as a variable camera aperture or as a camera shutter.
The functionality of the second array may depend on that of the first array. For example, if the first array is configured to function as a camera shutter, the second array may be configured to function as a camera aperture and vice versa.
These and other methods of the invention may be implemented by various types of devices, systems, components, software, firmware, etc. For example, some features of the invention may be implemented, at least in part, by computer programs embodied in machine-readable media. Some such computer programs may, for example, include instructions for determining which areas of the array(s) will be substantially transmissive, which areas will be substantially non-transmissive and/or which areas will be configured for partial transmission. Such computer programs may include instructions for controlling elements of a camera as described herein, including but not limited to instructions for controlling camera elements that include MEMS arrays.
While the present invention will be described with reference to a few specific embodiments, the description and specific embodiments are merely illustrative of the invention and are not to be construed as limiting. Various modifications can be made to the described embodiments. For example, the steps of methods shown and described herein are not necessarily performed in the order indicated. It should also be understood that the methods shown and described herein may include more or fewer steps than are indicated. In some implementations, steps described herein as separate steps may be combined. Conversely, what may be described herein as a single step may be implemented as multiple steps.
Similarly, device functionality may be apportioned by grouping or dividing tasks in any convenient fashion. For example, when steps are described herein as being performed by a single device (e.g., by a single logic device), the steps may alternatively be performed by multiple devices and vice versa.
MEMS interferometric modulator devices may include a pair of reflective layers positioned at a variable and controllable distance from each other to form a resonant optical gap with at least one variable dimension. This gap may sometimes be referred to herein as an “air gap,” although gases or liquids other than air may occupy the gap in some embodiments. Some embodiments comprise an array that includes MEMS-based light-modulating devices. The array may be configured to absorb and/or reflect light when in a first configuration and to transmit light when in a second configuration.
According to some embodiments described herein, a camera may include an array of MEMS devices that are configured to function as a camera shutter, as a camera aperture, or both. A controller may control the array to transmit light through, or substantially prevent the transmission of light through, predetermined areas of the array. When the array is controlled to function as a camera aperture, the size of the transmissive portion of the array may be controlled in response to input from a user, in response to detected ambient light conditions, etc. When the array is controlled to function as a camera shutter, the time interval during which at least a portion of the array is made transmissive may be controlled in response to input from a user, in response to detected ambient light conditions, in response to the aperture size, etc.
A simplified example of a MEMS-based light-modulating device that may form part of such an array is depicted in
In some embodiments, movable reflective layer 14 may be moved between two positions. In the first position, which may be referred to herein as a relaxed position, the movable reflective layer 14 is positioned at a relatively large distance from a fixed partially reflective layer. The relaxed position is depicted in
The optical stacks may be chosen such that when the movable stack 14 is “up” or separated from the fixed stack 16, most visible light 120a that is incident upon substantially transparent substrate 20 passes through the two stacks and air gap. Such transmitted light 120b is depicted in
Depending on the embodiment, the light reflectance properties of the “up” and “down” states may be reversed. MEMS pixels and/or subpixels can be configured to reflect predominantly at selected colors, in addition to black and white. Moreover, in some embodiments, at least some visible light 120a that is incident upon substantially transparent substrate 20 may be absorbed. In some such embodiments, MEMS device 100 may be configured to absorb most visible light 120a that is incident upon substantially transparent substrate 20 and/or configured to partially absorb and partially transmit such light. Some such embodiments are discussed below.
The depicted portion of the subpixel array in
In some embodiments, the optical stacks 16a and 16b (collectively referred to as optical stack 16) may comprise several fused layers, which can include an electrode layer, such as indium tin oxide (ITO), a partially reflective layer, such as chromium, and a transparent dielectric. The optical stack 16 is thus electrically conductive, partially transparent, and partially reflective. The optical stack 16 may be fabricated, for example, by depositing one or more of the above layers onto a transparent substrate 20. The partially reflective layer can be formed from a variety of materials that are partially reflective such as various metals, semiconductors, and dielectrics. The partially reflective layer can be formed of one or more layers of materials, and each of the layers can be formed of a single material or a combination of materials.
In some embodiments, the layers of the optical stack 16 are patterned into parallel strips, and may form row or column electrodes. For example, the movable reflective layers 14a, 14b may be formed as a series of parallel strips of a deposited metal layer or layers (which may be substantially orthogonal to the row electrodes of 16a, 16b) deposited on top of posts 18 and an intervening sacrificial material deposited between the posts 18. When the sacrificial material is etched away, the movable reflective layers 14a, 14b are separated from the optical stacks 16a, 16b by a defined gap 19. A highly conductive and reflective material such as aluminum may be used for the reflective layers 14, and these strips may form column electrodes in a MEMS array.
With no applied voltage, the gap 19 remains between the movable reflective layer 14a and optical stack 16a, with the movable reflective layer 14a in a mechanically relaxed state, as illustrated by the subpixel 12a in
In one embodiment, the controller 21 is also configured to communicate with an array driver 22. In one embodiment, the array driver 22 includes a row driver circuit 24 and a column driver circuit 26 that provide signals to an array or panel 30, which is a MEMS array in this example. The cross section of the MEMS array illustrated in
The row/column actuation protocol may take advantage of a hysteresis property of MEMS interferometric modulators that is illustrated in
For a MEMS array having the hysteresis characteristics of
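The hysteresis behavior described above can be sketched as a toy state model: a device actuates above one threshold, releases below another, and holds its current state anywhere inside the window between them. The threshold voltages and function names below are illustrative assumptions, not values from any described embodiment.

```python
# Toy model of interferometric-modulator hysteresis: actuate above
# V_ACT, release below V_REL, hold state inside the window.
V_ACT, V_REL = 10.0, 2.0   # assumed actuation/release thresholds (volts)

def next_state(applied_volts, currently_actuated):
    """Return the subpixel's new actuation state (True = actuated)."""
    v = abs(applied_volts)
    if v >= V_ACT:
        return True            # driven past the actuation threshold
    if v <= V_REL:
        return False           # dropped below the release threshold
    return currently_actuated  # inside the hysteresis window: hold

assert next_state(12.0, False) is True    # strobe actuates the device
assert next_state(5.0, True) is True      # bias holds the actuated state
assert next_state(5.0, False) is False    # bias also holds the released state
assert next_state(1.0, True) is False     # low voltage releases the device
```

This window is what lets a steady bias voltage hold every subpixel in whatever state the last row pulse left it in, with essentially no power dissipation in the steady state.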
This feature makes the subpixel design illustrated in
Desired areas of a MEMS array may be controlled by asserting the set of column electrodes in accordance with the desired set of actuated subpixels in the first row. A row pulse may then be applied to the row 1 electrode, actuating the subpixels corresponding to the asserted column lines. The asserted set of column electrodes is then changed to correspond to the desired set of actuated subpixels in the second row. A pulse is then applied to the row 2 electrode, actuating the appropriate subpixels in row 2 in accordance with the asserted column electrodes. The row 1 subpixels are unaffected by the row 2 pulse, and remain in the state they were set to during the row 1 pulse. This may be repeated for the entire series of rows in a sequential fashion to produce the desired configuration.
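The row-by-row sequence described above can be sketched as follows. The list-of-booleans representation and the idea of returning a frame of subpixel states are illustrative assumptions; a real driver emits analog row and column voltages.

```python
# Sketch of the row-strobe protocol: assert the column data for one
# row, pulse that row to latch it, then move to the next row.  Rows
# written earlier hold their state (hysteresis) during later pulses.
def drive_array(desired, state=None):
    """Drive the array to `desired`, a list of rows of booleans
    (True = actuated).  Returns the resulting frame of states."""
    n_rows, n_cols = len(desired), len(desired[0])
    if state is None:
        state = [[False] * n_cols for _ in range(n_rows)]
    for r in range(n_rows):
        asserted = desired[r]          # 1. assert column electrodes
        for c in range(n_cols):        # 2. pulse row r: only row r
            state[r][c] = asserted[c]  #    latches the column data
    return state

pattern = [[True, False], [False, True]]
assert drive_array(pattern) == pattern  # each row latches its own data
```

The key property mirrored here is that writing row 2 leaves row 1 untouched, so an arbitrary frame can be built up one row at a time.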
A wide variety of protocols for driving row and column electrodes of subpixel arrays may be used to control a MEMS array.
In the embodiment depicted in
In the configuration depicted in
It will be appreciated that a similar procedure can be employed for arrays of dozens or hundreds of rows and columns. It will also be appreciated that the timing, sequence, and levels of voltages used to perform row and column actuation can be varied widely within the general principles outlined above. Moreover, it will be appreciated that the specific values and processes noted above are merely examples and that any suitable actuation voltage method can be used with the systems and methods described herein.
For example, in some camera-related embodiments described herein, groups of MEMS devices in predetermined areas of a MEMS array may be gang-driven instead of being individually controlled. These predetermined areas may, for example, comprise two or more groups of contiguous MEMS devices. A controller, such as a controller of a camera, a controller of a device that includes a camera, etc., may control the movable stack of each MEMS device in the group to be in substantially the same position (e.g., in the “up” or “down” position).
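Gang-driving as described above can be sketched as a controller that tracks one state per named group of devices rather than one per device. The group names, the address tuples, and the dictionary representation are illustrative assumptions.

```python
# Minimal gang-driving sketch: one drive command moves every MEMS
# device in a predetermined group to the same position.
class GangDriver:
    def __init__(self, groups):
        # groups: {name: iterable of (row, col) device addresses}
        self.groups = {name: set(cells) for name, cells in groups.items()}
        self.state = {name: "down" for name in groups}  # start opaque

    def drive(self, name, position):
        if position not in ("up", "down"):
            raise ValueError(position)
        self.state[name] = position  # one command for the whole group

driver = GangDriver({"center": {(0, 0), (0, 1)},
                     "ring":   {(1, 0), (1, 1)}})
driver.drive("center", "up")   # open only the central area
assert driver.state == {"center": "up", "ring": "down"}
```

Because the controller stores one bit per group instead of one per device, its drive electronics and state bookkeeping can be far simpler than those of a per-device display driver.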
In some such embodiments, a camera system may comprise a simple and relatively inexpensive controller for this purpose, as compared with a controller that is configured to individually control each MEMS device in a MEMS array. In some embodiments, the controller may control the MEMS array in response to input from a user and/or in response to detected ambient light conditions. A shutter speed may be controlled, at least in part, according to aperture size and vice versa.
In some embodiments, a modulator device may include actuation elements integrated into the thin-film stack which permit displacement of portions of layers relative to one another so as to alter the spacing therebetween.
In some embodiments, the conductive layers 138a and 138b may comprise a transparent or light-transmissive material, such as indium tin oxide (ITO), for example, although other suitable materials may be used. The optical layers 132a and 132b may comprise a material having a high index of refraction. In some particular embodiments, the optical layers 132a and 132b may comprise titanium dioxide, although other materials may be used as well, such as lead oxide, zinc oxide, and zirconium dioxide, for example. The substrates may comprise glass, for example, and at least one of the substrates may be sufficiently thin to permit deformation of one of the layers towards the other.
In one embodiment, the conductive layers 138a and 138b comprise ITO and are 80 nm in thickness, the optical layers 132a and 132b comprise titanium dioxide and are 40 nm in thickness, and the air gap is initially 170 nm in height.
It can be seen from these plots that the modulator device 130 is highly transmissive across visible wavelengths when in an actuated state with a small air gap (15 nm), particularly for those wavelengths of less than about 800 nm. When in an unactuated state with a larger air gap (170 nm), the device becomes roughly 70% reflective to those same wavelengths. In contrast, the reflectivity and transmission of the higher wavelengths, such as infrared wavelengths, do not significantly change with actuation of the device. Thus, the modulator device 130 can be used to selectively alter the transmission/reflection of a wide range of visible wavelengths, without significantly altering the infrared transmission/reflection (if so desired).
The second device 240 may in certain embodiments comprise a device which transmits a certain amount of incident light. In certain embodiments, the device 240 may comprise a device which absorbs a certain amount of incident light. In particular embodiments, the device 240 may be switchable between a first state which is substantially transmissive to incident light, and a second state in which the absorption of at least certain wavelengths is increased. In still other embodiments, the device 240 may comprise a fixed thin film stack having desired transmissive, reflective, or absorptive properties.
In certain embodiments, suspended particle devices (“SPDs”) may be used to change between a transmissive state and an absorptive state. These devices comprise suspended particles which in the absence of an applied electrical field are randomly positioned, so as to absorb and/or diffuse light and appear “hazy.” Upon application of an electrical field, these suspended particles may be aligned in a configuration which permits light to pass through.
Other devices 240 may have similar functionality. For example, in alternative embodiments, device 240 may comprise another type of “smart glass” device, such as an electrochromic device, micro-blinds or a liquid crystal device (“LCD”). Electrochromic devices change light transmission properties in response to changes in applied voltage. Some such devices may include reflective hydrides, which change from transparent to reflective when voltage is applied. Other electrochromic devices may comprise porous nano-crystalline films. In another embodiment, device 240 may comprise an interferometric modulator device having similar functionality.
Thus, when the device 240 comprises an SPD or a device having similar functionality, the apparatus 220 can be switched between three distinct states: a transmissive state, when both devices 230 and 240 are in a transmissive state; a reflective state, when device 230 is in a reflective state; and an absorptive state, when device 240 is in an absorptive state. Depending on the orientation of the apparatus 220 relative to the incident light, the device 230 may be in a transmissive state when the apparatus 220 is in an absorptive state, and similarly, the device 240 may be in a transmissive state when the apparatus 220 is in a reflective state.
An array of MEMS devices that may be used for some embodiments described herein is depicted in
Referring first to
Referring now to
Further simplifications may be introduced in other embodiments, for example, by controlling an entire row, column or other aggregation of cells 705 as a group. In some such embodiments, all of the cells 705 within area 710a may be controlled as a group. In some such embodiments, the devices within area 710a and/or other portions of array 700a may be organized into separately controllable cells 705, but alternative embodiments may not comprise separately controllable cells 705. In some embodiments, columns and/or rows of devices and/or cells 705 may be controlled as a group.
Some such arrays may be controlled to function as a variable camera aperture. In some such embodiments, each area of a plurality of areas of the array may be controlled as a group. Such embodiments may include a controller that is configured to drive such predetermined areas of the array to obtain predetermined f-stop settings for a camera aperture.
One example is provided in
Data corresponding with areas 710c through 710j may, for example, be stored in a memory accessible by a camera controller and retrieved as needed to drive array 700b. Such aperture control enables satisfactory photographs to be taken in a variety of lighting conditions. Although the MEMS devices may be separately driven in alternative embodiments, simple and low-cost controllers may be used for gang-driving predetermined groups of MEMS devices corresponding to the predetermined areas.
In some embodiments, array 700b (or a similar array) may be controlled to achieve additional f-numbers. For example, if a camera including such an array had a user interface for controlling aperture size, additional cells of array 700b may be made transmissive, reflective or absorptive to achieve a desired f-number. If a user were able to select certain f-numbers, such as f/2, a controller could cause area 710d of array 700b to be transmissive. However, if a user were able to select, e.g., f/3, a modified version of area 710e could be driven to more nearly match this f-number. For example, additional cells of area 710e could be made non-transmissive, such that the transmissive portion of area 710e would more closely correspond with an f-number of f/3. Alternative aperture array embodiments may have additional areas 710, to allow closer matching of additional f-numbers.
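The stored-template approach above can be sketched as a lookup from a requested f-number to the array area driven transmissive. The area labels follow the 710c-710j naming used above; the pairing of each label with a specific f-number on the standard f-stop scale is an illustrative assumption.

```python
# Assumed mapping from f-numbers to stored aperture areas (710c-710j).
F_STOP_AREAS = {
    1.4: "710c", 2.0: "710d", 2.8: "710e", 4.0: "710f",
    5.6: "710g", 8.0: "710h", 11.0: "710i", 16.0: "710j",
}

def area_for(f_number):
    """Return the stored area whose f-number is nearest the request."""
    nearest = min(F_STOP_AREAS, key=lambda f: abs(f - f_number))
    return F_STOP_AREAS[nearest]

assert area_for(2.0) == "710d"
assert area_for(3.0) == "710e"  # f/3 snaps to the nearest area, f/2.8
```

A controller could retrieve the selected area's cell list from memory and gang-drive those cells transmissive, optionally masking off extra cells (as described for f/3 above) to match intermediate f-numbers more closely.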
Camera lens assembly 810 may include one or more lenses, filters, spacers or other such components. Depending on the implementation, camera lens assembly 810 may be made integral with another device, such as a mobile device. Alternatively, camera lens assembly 810 may be configured to be easily removed and replaced by a user. For example, a user may desire to have several camera lens assemblies 810 with different focal lengths or ranges of focal lengths.
At the moment depicted in
In some embodiments, the duration of time that the camera controller causes the cells of shutter array 700c to be in a transmissive condition may depend, at least in part, on the f-number of aperture 815. For example, in some embodiments the camera controller may be configured to receive user input regarding the f-number of aperture 815. The camera controller may use this input to determine, at least in part, the duration of time that the cells of shutter array 700c are in a transmissive condition.
In other embodiments, the camera controller may be configured to receive user input regarding the shutter speed of shutter array 700c. In some such embodiments, the camera controller may be configured to control aperture 815 according to user input regarding the shutter speed of shutter array 700c.
In alternative embodiments, camera aperture 815 may be fixed. The camera controller may use the f-number and/or other information regarding the fixed aperture to determine, at least in part, the duration of time that the cells of shutter array 700c will be in a transmissive condition.
Some embodiments may also include an ambient light sensor. The camera controller may use ambient light data from the ambient light sensor as well as camera aperture data to determine the duration of time that the cells of shutter array 700c are in a transmissive condition.
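The determination described above can be sketched as a controller that combines ambient-light data and aperture data to compute how long the shutter cells stay transmissive, then holds them open for that interval. The reciprocity-style formula, its constant, and the callback-style cell-drive interface are illustrative assumptions.

```python
import time

def shutter_open_time(ambient_lux, f_number, base=0.4):
    """Assumed model: dimmer scenes and narrower apertures (larger
    f-numbers) call for longer exposures."""
    return base * (f_number ** 2) / max(ambient_lux, 1.0)

def fire_shutter(drive_cells, ambient_lux, f_number):
    """Hold the MEMS shutter cells transmissive for the computed time."""
    duration = shutter_open_time(ambient_lux, f_number)
    drive_cells("transmissive")   # gang-drive the shutter cells open
    time.sleep(duration)          # hold the open-shutter condition
    drive_cells("opaque")         # close the shutter again
    return duration

# A dim indoor scene needs a longer exposure than a sunny one:
assert shutter_open_time(100, 2.8) > shutter_open_time(10000, 2.8)
```

With a fixed aperture, the f-number term becomes a constant and the same routine reduces to exposure control from ambient light alone.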
Although shutter array 700c is positioned near image sensor 820 in this example, other configurations may be used. For example, in some embodiments shutter array 700c may be positioned within lens assembly 810. In some embodiments shutter array 700c may be positioned in or near a focal plane of a camera assembly. In alternative embodiments, shutter array 700c may be positioned in front of lens assembly 810.
An aperture controller (which may or may not be the same controller that controls array 700c, according to the particular implementation) has temporarily controlled area 710k of aperture array 700d to be in a substantially non-transmissive state. For example, the aperture controller may have controlled one or more “smart glass” elements in area 710k to be in an absorptive state. Alternatively, or additionally, the aperture controller may have controlled cells in area 710k to be in a reflective condition with respect to visible light. Accordingly, light ray 825d and other light rays that are incident upon area 710k do not enter lens assembly 810.
However, the aperture controller has temporarily driven the cells within area 710l of aperture array 700d to be in a transmissive state. The cells of shutter array 700c are also driven by a controller to be temporarily in a transmissive “open shutter” condition. The shutter controller may, for example, have performed this action in response to receiving user input from a shutter control or other user input device. Accordingly, light ray 825b, light ray 825c and light rays at intermediate angles can pass through area 710l, lens assembly 810 and shutter array 700c to reach image sensor 820. (The refractive effects of lens assembly 810 on light rays are not indicated in the simplified examples described herein.) If the device that includes the camera has a flash assembly, the shutter controller (or another such controller) may synchronize the open shutter condition of shutter array 700c with the activation of a light source in a camera flash assembly.
In some embodiments, the aperture controller may be configured to receive user input regarding a desired f-number of array 700d. Based on a user's selection of f-number, an aperture controller may determine a corresponding manner of controlling array 700d. For example, the aperture controller may select a corresponding array control template from a plurality of predetermined array control templates stored in a memory. Each of the array control templates may indicate groups of array cells and how each of the groups is controlled to yield a predetermined result, such as a desired f-number.
In some embodiments, the duration of time that a camera controller causes the cells of shutter array 700c to be in a transmissive condition may depend, at least in part, on the f-number of array 700d. The camera controller may also use ambient light data from an ambient light sensor as well as camera aperture data to determine the duration of time that the cells of shutter array 700c are in a transmissive condition.
A camera controller may also be configured to receive user input regarding a desired shutter speed and may control array 700c according to this input. In some such embodiments, an aperture controller may control the f-number of array 700d according to a selected shutter speed. The controller may also use ambient light data from an ambient light sensor to determine an appropriate f-number for array 700d.
Array 700e of
Camera controller 960 may control at least some components of camera 900 according to input from user interface system 965. In some embodiments, user interface system 965 may include a shutter control such as a button or a similar device. User interface system 965 may include a display device configured to display images, graphical user interfaces, etc. In some such embodiments, user interface system 965 may include a touch screen.
User interface system 965 may have varying complexity, according to the specific embodiment. For example, in some embodiments, user interface system 965 may include an aperture control that allows a user to provide input regarding a desired aperture size. Camera controller 960 may control shutter array 700c according to aperture size input received from user interface system 965. Similarly, user interface system 965 may include a shutter control that allows a user to indicate a desired shutter speed. Camera controller 960 may control aperture array 700d according to shutter speed input received from user interface system 965. Camera controller 960 may control shutter array 700c and/or aperture array 700d according to ambient light data received from light sensor 975.
Camera flash assembly 800 includes light source 805 and flash array 700f. In this embodiment, camera flash assembly 800 does not have a separate controller. Instead, camera controller 960 controls camera flash assembly 800 of camera 900. Camera interface system 955 provides I/O functionality and transfers information between camera controller 960, camera flash assembly 800 and other components of camera 900. In alternative embodiments, camera flash assembly 800 also includes a flash assembly controller configured for controlling light source 805 and array 700f. Various MEMS-based embodiments of camera flash assembly 800 are described in U.S. application Ser. No. 12/836,872 (see, e.g.,
In some embodiments, camera controller 960 may be configured to send control signals to camera flash assembly 800 regarding the appropriate configuration of flash array 700f and/or the appropriate illumination provided by light source 805. Moreover, camera controller 960 may be configured to synchronize the operation of camera flash assembly 800 with the operation of shutter array 700c.
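The synchronization described above can be sketched as a capture sequence in which the flash fires only while the shutter array is in its open condition. The callback-style controller interface is an illustrative assumption; real hardware would use timer interrupts or dedicated control lines.

```python
import time

def capture_with_flash(open_shutter, close_shutter, fire_flash,
                       exposure_s=0.01):
    """Fire the flash inside the shutter-open window."""
    open_shutter()          # gang-drive shutter cells transmissive
    fire_flash()            # flash only while the shutter is open
    time.sleep(exposure_s)  # hold the open-shutter condition
    close_shutter()         # return the cells to the opaque state

events = []
capture_with_flash(lambda: events.append("open"),
                   lambda: events.append("close"),
                   lambda: events.append("flash"),
                   exposure_s=0.0)
assert events == ["open", "flash", "close"]
```

The ordering check confirms the essential constraint: the flash event falls strictly between the shutter-open and shutter-close events.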
Images from lens system 810 may be captured on image sensor 820. Camera controller 960 may control a display, such as that depicted in
Several components of camera 900 that are shown in
Referring now to
The display 30 in this example of the display device 40 may be any of a variety of displays. Moreover, although only one display 30 is illustrated in
Components of one embodiment of display device 40 are schematically illustrated in
The network interface 27 includes the antenna 43 and the transceiver 47 so that the display device 40 can communicate with one or more devices over a network. In some embodiments, the network interface 27 may also have some processing capabilities to relieve requirements of the processor 21. The antenna 43 may be any antenna known to those of skill in the art for transmitting and receiving signals. In one embodiment, the antenna is configured to transmit and receive RF signals according to an Institute of Electrical and Electronics Engineers (IEEE) 802.11 standard, e.g., IEEE 802.11(a), (b), or (g). In another embodiment, the antenna is configured to transmit and receive RF signals according to the BLUETOOTH standard. In the case of a cellular telephone, the antenna may be designed to receive Code Division Multiple Access (“CDMA”), Global System for Mobile communications (“GSM”), Advanced Mobile Phone System (“AMPS”) or other known signals that are used to communicate within a wireless cell phone network. The transceiver 47 may pre-process the signals received from the antenna 43 so that the signals may be received by, and further manipulated by, the processor 21. The transceiver 47 may also process signals received from the processor 21 so that the signals may be transmitted from the display device 40 via the antenna 43.
In an alternative embodiment, the transceiver 47 may be replaced by a receiver and/or a transmitter. In yet another alternative embodiment, network interface 27 may be replaced by an image source, which may store and/or generate image data to be sent to the processor 21. For example, the image source may be a digital video disk (DVD) or a hard disk drive that contains image data, or a software module that generates image data. Such an image source, transceiver 47, a transmitter and/or a receiver may be referred to as an “image source module” or the like.
Processor 21 may be configured to control the operation of the display device 40. The processor 21 may receive data, such as compressed image data from the network interface 27, from camera 900 or from another image source, and process the data into raw image data or into a format that is readily processed into raw image data. The processor 21 may then send the processed data to the driver controller 29 or to frame buffer 28 (or another memory device) for storage.
Processor 21 may control camera 900 according to input received from input device 48. When camera 900 is operational, images received and/or captured by lens system 810 may be displayed on display 30. Processor 21 may also display stored images on display 30. In some embodiments, camera 900 may include a separate controller for camera-related functions.
In one embodiment, the processor 21 may include a microcontroller, central processing unit (“CPU”), or logic unit to control operation of the display device 40. Conditioning hardware 52 may include amplifiers and filters for transmitting signals to the speaker 45, and for receiving signals from the microphone 46. Conditioning hardware 52 may be discrete components within the display device 40, or may be incorporated within the processor 21 or other components. Processor 21, driver controller 29, conditioning hardware 52 and other components that may be involved with data processing may sometimes be referred to herein as parts of a “logic system,” a “control system” or the like.
The driver controller 29 may be configured to take the raw image data generated by the processor 21 directly from the processor 21 and/or from the frame buffer 28 and reformat the raw image data appropriately for high speed transmission to the array driver 22. Specifically, the driver controller 29 may be configured to reformat the raw image data into a data flow having a raster-like format, such that it has a time order suitable for scanning across the display array 30. Then the driver controller 29 may send the formatted information to the array driver 22. Although a driver controller 29, such as an LCD controller, is often associated with the system processor 21 as a stand-alone integrated circuit (“IC”), such controllers may be implemented in many ways. For example, they may be embedded in the processor 21 as hardware, embedded in the processor 21 as software, or fully integrated in hardware with the array driver 22. An array driver 22 that is implemented in some type of circuit may be referred to herein as a “driver circuit” or the like.
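The “raster-like” reformatting described above can be illustrated with a minimal sketch. The function name and the list-of-rows frame representation are assumptions for illustration; the point is only the time ordering, in which pixel data are emitted one display row after another, in the order in which rows are scanned across the display array.

```python
def to_raster_stream(frame):
    """Flatten a frame (a list of pixel rows) into scan order:
    row 0 left to right, then row 1, and so on -- the time order
    in which data would be clocked out to the array driver."""
    stream = []
    for row in frame:
        stream.extend(row)
    return stream
```

For example, a two-by-two frame `[[1, 2], [3, 4]]` becomes the stream `[1, 2, 3, 4]`, with the first row transmitted before the second.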
The array driver 22 may be configured to receive the formatted information from the driver controller 29 and reformat the video data into a parallel set of waveforms that are applied many times per second to the plurality of leads coming from the display's x-y matrix of pixels. These leads may number in the hundreds, the thousands or more, according to the embodiment.
In some embodiments, the driver controller 29, array driver 22, and display array 30 may be appropriate for any of the types of displays described herein. For example, in one embodiment, driver controller 29 may be a transmissive display controller, such as an LCD display controller. Alternatively, driver controller 29 may be a bi-stable display controller (e.g., an interferometric modulator controller). In another embodiment, array driver 22 may be a transmissive display driver or a bi-stable display driver (e.g., an interferometric modulator display driver). In some embodiments, a driver controller 29 may be integrated with the array driver 22. Such embodiments may be appropriate for highly integrated systems such as cellular phones, watches, and other devices having small area displays. In yet another embodiment, display array 30 may comprise a display array such as a bi-stable display array (e.g., a display including an array of interferometric modulators).
The input system 48 allows a user to control the operation of the display device 40. In some embodiments, input system 48 includes a keypad, such as a QWERTY keyboard or a telephone keypad, a button, a switch, a touch-sensitive screen, or a pressure- or heat-sensitive membrane. In one embodiment, the microphone 46 may comprise at least part of an input system for the display device 40. When the microphone 46 is used to input data to the device, voice commands may be provided by a user for controlling operations of the display device 40.
Power supply 50 can include a variety of energy storage devices. For example, in some embodiments, power supply 50 may comprise a rechargeable battery, such as a nickel-cadmium battery or a lithium ion battery. In another embodiment, power supply 50 may comprise a renewable energy source, a capacitor, or a solar cell such as a plastic solar cell or solar-cell paint. In some embodiments, power supply 50 may be configured to receive power from a wall outlet.
In some embodiments, control programmability resides, as described above, in a driver controller which can be located in several places in the electronic display system. In some embodiments, control programmability resides in the array driver 22.
In step 1105, an indication is received by camera controller 960 from a user input device that a user wants to take a picture. For example, an indication may be received by camera controller 960 from shutter control 1005 of
In this example, user interface system 965 of
In step 1125, camera controller 960 determines whether a flash would be appropriate. For example, if the shutter speed determined in step 1120 exceeds a predetermined threshold (such as ½ second, 1 second, etc.), camera controller 960 may determine that a flash would be appropriate. If so, step 1125 may also involve determining a revised shutter speed appropriate for the additional light contributed by the camera flash, given the aperture data.
In some embodiments, a user may be able to manually override use of the flash. For example, a user may intend to use a tripod or some other means of supporting the camera when a photograph is taken. If so, the user may not want the flash to operate when the picture is taken, even if the shutter will need to be open for a relatively long period of time.
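The flash decision of step 1125, together with the manual override just described, can be sketched as follows. The threshold value, the revised shutter speed of 1/60 second, and the function name are illustrative assumptions; the application specifies only that the determined shutter speed is compared against a predetermined threshold (such as 1/2 second or 1 second) and that a revised, shorter shutter speed may be determined when the flash will contribute light.

```python
FLASH_THRESHOLD_S = 0.5  # e.g., 1/2 second, one of the example thresholds


def flash_decision(shutter_speed_s, flash_override=False):
    """Return (use_flash, effective_shutter_speed_s).

    flash_override models a user manually disabling the flash,
    e.g., when the camera is supported by a tripod.
    """
    if flash_override:
        # User has disabled the flash; keep the long exposure.
        return False, shutter_speed_s
    if shutter_speed_s > FLASH_THRESHOLD_S:
        # The flash contributes light, so a much shorter exposure
        # suffices; 1/60 s is purely illustrative.
        return True, min(shutter_speed_s, 1.0 / 60.0)
    return False, shutter_speed_s
```

Under this sketch, a 1-second exposure triggers the flash and a revised shutter speed, while the same exposure with the override set keeps the flash off and the shutter open for the full second.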
If camera controller 960 determines in step 1125 that a flash should be used, camera controller 960 determines appropriate instructions for flash assembly 800 (such as the appropriate timing, intensity and duration of the flash(es) from light source 805) and coordinates the timing of the flash(es) with the operation of shutter array 700c. (Step 1130.) However, if camera controller 960 determines in step 1125 that a flash will not be used, camera controller 960 controls shutter array 700c (step 1135). An image is captured on image sensor 820 in step 1140.
In this example, the image captured in step 1140 is displayed on a display device in step 1145. The image may be deleted, edited, stored or otherwise processed according to input received from user interface system 965. In step 1150, it is determined whether the process will continue. For example, it may be determined whether input has been received from the user within a predetermined time, whether the user is powering off the camera, etc. In step 1155, the process ends.
In this example, user interface system 965 of
Here, camera controller 960 determines an appropriate aperture configuration according to the shutter speed data and the ambient light data (step 1220). For example, camera controller 960 may determine an appropriate aperture f-number according to the shutter speed data and the ambient light data. Camera controller 960 may query a memory structure that includes a plurality of predetermined aperture array control templates and corresponding f-numbers. Camera controller 960 may select an aperture array control template from the plurality of predetermined aperture array control templates that most closely matches the appropriate aperture f-number.
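The template lookup of step 1220 can be sketched as a nearest-match search. The table contents and names below are illustrative assumptions; the application says only that a memory structure holds predetermined aperture array control templates with corresponding f-numbers, and that the template most closely matching the appropriate f-number is selected.

```python
# Hypothetical memory structure: f-number -> aperture array control template.
APERTURE_TEMPLATES = {
    2.0: "template_f2_0",
    2.8: "template_f2_8",
    4.0: "template_f4_0",
    5.6: "template_f5_6",
    8.0: "template_f8_0",
}


def select_template(target_f_number):
    """Return the control template whose stored f-number most
    closely matches the target f-number."""
    best = min(APERTURE_TEMPLATES, key=lambda f: abs(f - target_f_number))
    return APERTURE_TEMPLATES[best]
```

For instance, a computed target of f/3.0 would select the stored f/2.8 template, the closest available configuration.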
In step 1225, camera controller 960 determines whether a flash would be appropriate. If camera controller 960 determines in step 1225 that a flash will be used, camera controller 960 may determine whether the aperture array configuration determined in step 1220 would still be appropriate. If not, a new aperture array configuration may be determined. In alternative implementations, step 1225 may be performed prior to step 1220, so that only one process of determining aperture array configuration is performed for each iteration of method 1200.
If camera controller 960 has determined in step 1225 that a flash will be used, camera controller 960 determines appropriate instructions for flash assembly 800 and coordinates the timing of the flash(es) with the operation of the camera shutter. (Step 1230.) If camera controller 960 determines in step 1225 that a flash will not be used, camera controller 960 nonetheless controls the shutter in step 1235 according to the shutter speed data received in step 1215. An image is captured on image sensor 820. (Step 1240.)
In this example, the image captured in step 1240 is displayed on a display device in step 1245. In step 1250, it is determined whether the process will continue. In step 1255, the process ends.
Although illustrative embodiments and applications are shown and described herein, many variations and modifications are possible which remain within the concept, scope, and spirit of the subject matter provided herein, and these variations should become clear after perusal of this application. For example, alternative MEMS devices and/or fabrication methods such as those described in U.S. application Ser. No. 12/255,423, entitled “Adjustably Transmissive MEMS-Based Devices” and filed on Oct. 21, 2008 (which is hereby incorporated by reference) may be used. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.