Adjusting Image Capture Device Settings

Abstract
Disclosed are various embodiments of generating user interface elements associated with an image capture device. Image settings applicable to an image can be identified. A user interface element providing simultaneous adjustability of image settings can be generated. Multiple image settings can be adjusted when a user modifies an adjustability element rendered in the user interface.
Description
BACKGROUND

Image capture devices (e.g., still cameras, video cameras, etc.) can incorporate various user-adjustable image capture settings that can be adjusted by a user via a user interface. These user-adjustable settings can include color levels, contrast, sharpness, exposure settings, aperture settings, flash intensity, and other settings applicable to image capture. Users may avoid adjusting these various settings due to the complexity and/or time required to navigate a user interface as well as make adjustments to the various settings individually.





BRIEF DESCRIPTION OF THE DRAWINGS

Many aspects of the invention can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present invention. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.



FIGS. 1A-1B are drawings of a mobile device incorporating an image capture device according to various embodiments of the disclosure.



FIG. 2 is a drawing of an image capture device executing a user interface application according to various embodiments of the disclosure.



FIGS. 3-5 are drawings of example user interfaces that can be generated by the user interface application and/or image capture device according to various embodiments of the disclosure.



FIG. 6 is a flowchart depicting one example execution of a user interface application executed in an image capture device according to various embodiments of the disclosure.





DETAILED DESCRIPTION

Embodiments of the present disclosure relate to systems and methods that can be executed in an image capture device. More specifically, embodiments of the disclosure relate to systems and methods of presenting image setting adjustment user interface elements in a user interface associated with the image capture device. In the context of this disclosure, an image capture device can include a camera, a video camera, a mobile device with an integrated image capture device, or other devices suitable for capturing imagery and/or video as can be appreciated. In some embodiments, an image capture device according to an embodiment of the disclosure can include a device such as a smartphone, tablet computing system, laptop computer, desktop computer, or any other computing device that has the capability to receive and/or capture imagery via image capture hardware.


Accordingly, image capture device hardware can include components such as lenses, image sensors (e.g., charge coupled devices, CMOS image sensor, etc.), processor(s), image signal processor(s), a main processor, memory, mass storage, or any other hardware or software components that can facilitate capture of imagery and/or video. In some embodiments, an image signal processor can be incorporated as a part of a main processor in an image capture device module that is in turn incorporated into a device having its own processor, memory and other components.


An image capture device according to an embodiment of the disclosure can provide a user interface via a display that is integrated into the image capture device. The display can be integrated with a mobile device, such as a smartphone and/or tablet computing device, and can include a touchscreen input device (e.g., a capacitive touchscreen, etc.) with which a user may interact with the user interface that is presented thereon. The image capture device hardware can also include one or more buttons, dials, toggles, switches, or other input devices with which the user can interact with software executed in the image capture device.


Referring now to the drawings, FIGS. 1A-1B show a mobile device 102 that can comprise and/or incorporate an image capture device according to various embodiments of the disclosure. The mobile device 102 may comprise, for example, a processor-based system, such as a computer system. Such a computer system may be embodied in the form of a desktop computer, a laptop computer, a personal digital assistant, a mobile device (e.g., cellular telephone, smart phone, etc.), a tablet computing system, a set-top box, a music player, or other devices with like capability. The mobile device 102 can include, for example, an image capture device 104, which can further include a lens system as well as other hardware components that can be integrated with the device to facilitate image capture. The mobile device 102 can also include a display device 141 upon which various content and other user interfaces may be rendered. The mobile device 102 can also include one or more input devices with which a user can interact with a user interface rendered on the display device 141. For example, the mobile device 102 can include or be in communication with a mouse, touch input device (e.g., capacitive and/or resistive touchscreen incorporated with the display device 141), keyboard, or other input devices.


The mobile device 102 may be configured to execute various applications, such as a camera application that can interact with an image capture module that includes various hardware and/or software components that facilitate capture and/or storage of images and/or video. In one embodiment, the camera application can interact with application programming interfaces (API's) and/or other software libraries and/or drivers that are provided for the purpose of interacting with image capture hardware, such as the lens system and other image capture hardware. The camera application can be a special purpose application, a plug-in or executable library, one or more API's, image control algorithms, image capture device firmware, or other software that can facilitate communication with image capture hardware in communication with the mobile device 102.



FIG. 2 illustrates an embodiment of the various image capture components, or one example of an image capture device 104, that can be incorporated in the mobile device 102 illustrated in FIGS. 1A-1B. Although one implementation is shown in FIG. 2 and described herein, an image capture device according to an embodiment of the disclosure more generally comprises any device that can provide images in digital form.


The image capture device 104 includes a lens system 200 that conveys images of viewed scenes to an image sensor 202. By way of example, the image sensor 202 comprises a charge-coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor that is driven by one or more sensor drivers 204. The analog image signals captured by the sensor 202 are provided to an analog-to-digital (A/D) converter 206 for conversion into binary code that can be processed by a processor 208. The processor 208 can also execute a user interface application 151 that can facilitate generating the user interface elements discussed herein. In some embodiments, the user interface application 151 can take the form of API's, firmware, or other software accessible to the image capture device 104 and/or a mobile device 102 or other system in which the image capture device 104 is integrated.


Operation of the sensor driver(s) 204 is controlled through a camera controller 210 that is in bi-directional communication with the processor 208. In some embodiments, the controller 210 can control one or more motors 212 that are used to drive the lens system 200 (e.g., to adjust focus, zoom, and/or aperture settings). The controller 210 can also communicate with a flash system, user input devices (e.g., buttons, dials, toggles, etc.), or other components associated with the image capture device 104. Operation of the camera controller 210 may be adjusted through manipulation of a user interface. A user interface comprises the various components used to enter selections and commands into the image capture device 104 and therefore can include various buttons as well as a menu system that is displayed to the user in, for example, a camera application executed on a mobile device 102 and/or on a back panel associated with a standalone digital camera.


The digital image signals are processed in accordance with instructions from an image signal processor 218 that are stored in permanent (non-volatile) device memory. Processed (e.g., compressed) images may then be stored in storage memory, such as that contained within a removable solid-state memory card (e.g., Flash memory card). The embodiment shown in FIG. 2 further includes a device interface 224 through which the image capture device 104 can communicate with a mobile device or other computing system in which it may be integrated. For example, the device interface 224 can allow the image capture device to communicate with a main processor associated with a mobile device as well as memory, mass storage, or other resources associated with the mobile device. The device interface 224 can communicate with a mobile device via various communications protocols, and this communication can be facilitated, at a software level, by various device drivers, libraries, API's, or other software associated with the image capture device 104 that is executed in the mobile device.


Accordingly, as will be shown in the following drawings, a user interface application 151 executed by the image capture device and/or software that interacts with the image capture device can facilitate generation of user interface elements that allow simultaneous adjustability of multiple image settings associated with the image capture device. The user interface application 151 can generate adjustment elements in various forms that allow simultaneous adjustment of image settings associated with the image capture device, such as, but not limited to, color settings, contrast, brightness, flash settings, exposure settings, shutter settings, aperture settings, or any other settings associated with capture of imagery and/or video by the image capture device.


Therefore, reference is now made to FIGS. 3A-3B, which illustrate one example of a user interface that can be generated by the image capture device 104 or software executed by the mobile device 102 that interacts with the image capture device 104. The depicted user interface 301 provides simultaneous adjustability of multiple image settings associated with the image capture device 104. Image settings can include, but are not limited to, a brightness, a color intensity, a hue, a contrast, a sharpness, a shutter speed, a flash intensity, a flash duration, a film speed setting, an exposure value setting, a white balance setting, a noise filtering setting, a focal distance setting, an exposure index setting, and an aperture size setting, or any other setting that can have an effect on the appearance of an image and/or a setting associated with the image capture device 104.


In the depicted example, the user interface 301 can link various image settings together with an adjustability user interface element 303. In the example of FIG. 3A, the adjustability user interface element 303 comprises a one dimensional element in that it allows adjustment of a slidable element 305 along a single axis. When a user adjusts the slidable element 305 along the axis, the image capture device can adjust the image settings associated with the adjustability user interface element 303 simultaneously. In the depicted example, the adjustability user interface element 303 can link brightness, color, and contrast image settings and adjust these image settings simultaneously as a user interacts with the element via an input device. The image capture device 104 can also determine a respective scale that is associated with each of the image settings linked by the adjustability user interface element 303. In other words, for each image setting associated with the element 303, the image capture device 104 can identify a minimum setting that can be associated with a first extreme of the element 303, a maximum setting that can be associated with a second extreme of the element 303, and an amount by which an image setting is incremented and/or decremented as a user adjusts the slidable element 305 along the axis of the adjustability element 303. Additionally, the minimum setting, maximum setting, and amount by which the image setting is changed can vary for each of the image settings linked by the adjustability element 303. The adjustability element 303 can also provide a visual cue that indicates to the user how an image setting will be changed as the user moves the slidable element 305 along the axis. FIG. 3B illustrates an alternative example in which the adjustability element 303 can be rendered with numeral indicators that indicate a level associated with each image setting.
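By way of illustration only, the per-setting scale described above (a minimum at one extreme of the element, a maximum at the other extreme, and a per-setting increment) might be sketched as follows. The function name, setting names, and value ranges are assumptions made for the sketch and are not part of the disclosure:

```python
# Sketch: map a single slider position onto several linked image settings,
# each with its own minimum, maximum, and increment.

def linked_settings(position, scales):
    """position: slider value in [0.0, 1.0].
    scales: dict of setting name -> (minimum, maximum, step)."""
    result = {}
    for name, (lo, hi, step) in scales.items():
        raw = lo + position * (hi - lo)          # linear interpolation over the scale
        result[name] = round(raw / step) * step  # snap to the setting's increment
    return result

# Illustrative scales: each linked setting has a different range and step.
scales = {
    "brightness": (0.0, 100.0, 1.0),
    "contrast":   (-50.0, 50.0, 5.0),
    "color":      (0.0, 1.0, 0.05),
}
print(linked_settings(0.5, scales))
```

A single slider movement thus updates every linked setting at once, while each setting still honors its own scale.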


The image capture device 104 can generate various combinations of image settings that can be linked with the adjustability element 303. In some embodiments, the image capture device 104 can identify image properties associated with a captured image and/or a device mode in which the image capture device is placed. Accordingly, the image capture device 104 can then identify image settings that are applicable to the device mode and/or the image the user desires to adjust. For example, if an image capture device 104 is placed in a video mode and/or the user invokes the adjustability element 303 in relation to a video captured by the image capture device 104, the adjustability element 303 generated by the image capture device can include settings that are applicable to one or both. As another example, if an image and/or device mode is black and white, the image capture device 104 can suppress color settings from appearing in the adjustability element.
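As an illustrative sketch of this mode-based filtering (the setting and mode names are assumptions, not taken from the disclosure):

```python
# Sketch: choose which image settings to expose in the adjustability element
# based on the current device mode.

SETTING_MODES = {
    "brightness": {"photo", "video", "monochrome"},
    "contrast":   {"photo", "video", "monochrome"},
    "color":      {"photo", "video"},     # suppressed in a black-and-white mode
    "frame_rate": {"video"},              # only meaningful for video capture
}

def applicable_settings(mode):
    """Return the settings the adjustability element should expose for a mode."""
    return sorted(name for name, modes in SETTING_MODES.items() if mode in modes)

print(applicable_settings("monochrome"))   # the color setting does not appear
```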


Reference is now made to FIG. 4, which illustrates an example of a multi-dimensional adjustability element 403 that allows the user to make adjustments along at least two axes. In the depicted example, the adjustability element 403 can allow adjustment about a circular axis as well as along a linear axis. In this way, the user can adjust multiple image settings via a single user interface element.
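A two-axis element of this kind might be sketched as below; the choice of which setting each axis drives (hue for the circular axis, exposure for the linear axis) and the ranges are illustrative assumptions only:

```python
# Sketch: a two-axis dial control. Rotation about a circular axis adjusts one
# setting; a slide along a linear axis adjusts another.

def dial_to_settings(angle_deg, slide):
    """angle_deg: rotation in degrees; slide: linear position in [0, 1]."""
    hue = (angle_deg % 360) / 360.0        # circular axis wraps around
    exposure_ev = -2.0 + slide * 4.0       # linear axis spans -2 EV .. +2 EV
    return hue, exposure_ev
```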


Reference is now made to FIG. 5, which illustrates an alternative example of an adjustability element 503 according to various embodiments of the disclosure. In the depicted example, the image capture device can generate an adjustability element that allows adjustment of multiple device and/or image settings via a single element. In the example of FIG. 5, the image capture device can alter a mix of image and/or device settings based at least upon a position of the slidable element 505. For example, the image capture device 104 can link certain image and/or device settings that are related to one another.


As shown in FIG. 5, the adjustability element 503 can provide the ability to cause changes to a shutter speed as well as an aperture setting associated with the image capture device. In this sense, the image capture device 104 can be placed into a shutter priority mode, at one extreme, and an aperture priority mode, at a second extreme. In the shutter priority mode, the image capture device can slow the shutter speed and narrow an aperture associated with the image capture device. In the aperture priority mode, the image capture device can widen an aperture and shorten a shutter speed associated with the image capture device. Accordingly, as the user adjusts the slidable element 505 between the depicted extremes, the image capture device 104 can create a mix of shutter speed and/or aperture size according to the distance of the slidable element 505 from each extreme. As an alternative example of such a scenario, the adjustability element 503 can also allow a user to select a mixture of flash intensity and/or duration at one extreme and a shutter speed or other exposure setting at the other extreme.
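The blending between the two extremes might be sketched as a simple interpolation; the numeric shutter and aperture ranges below are illustrative assumptions, not values from the disclosure:

```python
# Sketch: blend shutter speed and aperture between a "shutter priority"
# extreme and an "aperture priority" extreme, per the slidable element 505.

def exposure_mix(position):
    """position 0.0 = shutter priority extreme; 1.0 = aperture priority extreme."""
    slow, fast = 1 / 30, 1 / 500                    # shutter speed range, seconds
    narrow, wide = 16.0, 2.8                        # aperture range, f-numbers
    shutter_s = slow + position * (fast - slow)     # shutter shortens toward 1.0
    f_number = narrow + position * (wide - narrow)  # aperture widens toward 1.0
    return shutter_s, f_number
```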


Referring next to FIG. 6, shown is a flowchart that provides one example of the operation of a portion of a user interface application 151 executed by an image capture device 104, a mobile device 102 or any other device in which an image capture device 104 is integrated according to various embodiments of the disclosure. It is understood that the flowchart of FIG. 6 provides merely an example of the many different types of functional arrangements that may be employed to implement the operation of the portion of logic employed by the image capture device as described herein. As an alternative, the flowchart of FIG. 6 may be viewed as depicting an example of steps of a method implemented in a computing device, processor, or other circuits according to one or more embodiments.


First, in box 601, the user interface application 151 can initiate capture of one or more images and/or video by the image capture device 104. In one embodiment, the user interface application 151 can facilitate generation of a user interface through which a user can initiate image capture via the image capture device. In box 603, the user interface application 151 can identify image settings that are applicable to the image. In box 605, the user interface application 151 can generate a user interface element that, as described above, provides simultaneous adjustment of multiple image settings. In box 607, the user interface application 151 can determine whether the user interface element is modified by the user. In box 609, the user interface application 151 can cause the image capture device 104 to adjust the image settings linked by the user interface element when the element is modified by the user.
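The flow of boxes 601-609 can be sketched as a minimal loop body; the class and method names below are illustrative placeholders, not part of the disclosure:

```python
# Sketch of the flow of boxes 601-609 in FIG. 6.

class StubDevice:
    """Stand-in for the image capture device 104."""
    def __init__(self):
        self.applied = None
    def capture(self):
        return "image"
    def applicable_settings(self, image):
        return ["brightness", "contrast"]
    def apply(self, settings, position):
        self.applied = (tuple(settings), position)

class UserInterfaceApp:
    """Stand-in for the user interface application 151."""
    def __init__(self, device):
        self.device = device
    def run_once(self, user_input=None):
        image = self.device.capture()                      # box 601: initiate capture
        settings = self.device.applicable_settings(image)  # box 603: identify settings
        element = {"settings": settings, "position": 0.5}  # box 605: generate element
        if user_input is not None:                         # box 607: element modified?
            element["position"] = user_input
            self.device.apply(settings, element["position"])  # box 609: adjust settings
        return element

app = UserInterfaceApp(StubDevice())
element = app.run_once(user_input=0.8)
```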


It should be appreciated that in some embodiments, image settings may be adjusted without initiating image capture as described in box 601, and that the example illustrated in the flowchart of FIG. 6 is but one non-limiting example. For example, a mobile device 102 and/or image capture device 104 can generate a user interface element providing adjustability of multiple image settings in conjunction with a gallery application that allows for viewing and/or browsing of imagery and/or video stored in a mass storage device. Additionally, the user interface application 151 can be invoked to provide such an adjustability user interface element in conjunction with software that allows modification of various camera settings in a context that is unrelated to image capture. Other variations should be appreciated by a person of ordinary skill in the art.


Embodiments of the present disclosure can be implemented in various devices, for example, having a processor, memory as well as image capture hardware that can be coupled to a local interface. The logic described herein can be executable by one or more processors integrated with a device. In one embodiment, an application executed in a computing device, such as a mobile device, can invoke API's that provide the logic described herein as well as facilitate interaction with image capture hardware. Where any component discussed herein is implemented in the form of software, any one of a number of programming languages may be employed such as, for example, processor specific assembler languages, C, C++, C#, Objective C, Java, Javascript, Perl, PHP, Visual Basic, Python, Ruby, Delphi, Flash, or other programming languages.


As such, these software components can be executable by one or more processors in various devices. In this respect, the term “executable” means a program file that is in a form that can ultimately be run by a processor. Examples of executable programs may be, for example, a compiled program that can be translated into machine code in a format that can be loaded into a random access portion of memory and run by a processor, source code that may be expressed in proper format such as object code that is capable of being loaded into a random access portion of the memory and executed by the processor, or source code that may be interpreted by another executable program to generate instructions in a random access portion of the memory to be executed by the processor, etc. An executable program may be stored in any portion or component of the memory including, for example, random access memory (RAM), read-only memory (ROM), hard drive, solid-state drive, USB flash drive, memory card, optical disc such as compact disc (CD) or digital versatile disc (DVD), floppy disk, magnetic tape, or other memory components.


Although various logic described herein may be embodied in software or code executed by general purpose hardware as discussed above, as an alternative the same may also be embodied in dedicated hardware or a combination of software/general purpose hardware and dedicated hardware. If embodied in dedicated hardware, each can be implemented as a circuit or state machine that employs any one of or a combination of a number of technologies. These technologies may include, but are not limited to, discrete logic circuits having logic gates for implementing various logic functions upon an application of one or more data signals, application specific integrated circuits having appropriate logic gates, or other components, etc. Such technologies are generally well known by those skilled in the art and, consequently, are not described in detail herein.


The flowchart of FIG. 6 shows the functionality and operation of an implementation of portions of an image capture device according to embodiments of the disclosure. If embodied in software, each block may represent a module, segment, or portion of code that comprises program instructions to implement the specified logical function(s). The program instructions may be embodied in the form of source code that comprises human-readable statements written in a programming language or machine code that comprises numerical instructions recognizable by a suitable execution system such as a processor in a computer system or other system. The machine code may be converted from the source code, etc. If embodied in hardware, each block may represent a circuit or a number of interconnected circuits to implement the specified logical function(s).


Although the flowchart of FIG. 6 shows a specific order of execution, it is understood that the order of execution may differ from that which is depicted. For example, the order of execution of two or more blocks may be scrambled relative to the order shown. Also, two or more blocks shown in succession in FIG. 6 may be executed concurrently or with partial concurrence. Further, in some embodiments, one or more of the blocks shown in FIG. 6 may be skipped or omitted. In addition, any number of counters, state variables, warning semaphores, or messages might be added to the logical flow described herein, for purposes of enhanced utility, accounting, performance measurement, or providing troubleshooting aids, etc. It is understood that all such variations are within the scope of the present disclosure.


Also, any logic or application described herein that comprises software or code can be embodied in any non-transitory computer-readable medium for use by or in connection with an instruction execution system such as, for example, a processor in a computer device or other system. In this sense, the logic may comprise, for example, statements including instructions and declarations that can be fetched from the computer-readable medium and executed by the instruction execution system. In the context of the present disclosure, a “computer-readable medium” can be any medium that can contain, store, or maintain the logic or application described herein for use by or in connection with the instruction execution system. The computer-readable medium can comprise any one of many physical media such as, for example, magnetic, optical, or semiconductor media. More specific examples of a suitable computer-readable medium would include, but are not limited to, magnetic tapes, magnetic floppy diskettes, magnetic hard drives, memory cards, solid-state drives, USB flash drives, or optical discs. Also, the computer-readable medium may be a random access memory (RAM) including, for example, static random access memory (SRAM) and dynamic random access memory (DRAM), or magnetic random access memory (MRAM). In addition, the computer-readable medium may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or other type of memory device.


It should be emphasized that the above-described embodiments of the present disclosure are merely possible examples of implementations set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.

Claims
  • 1. An image capture device, comprising: at least one image sensor; and a user interface application executed in the image capture device, comprising: logic that initiates capture of an image via an image sensor associated with the image capture device; logic that identifies a plurality of image settings applicable to the image; and logic that generates a user interface element providing adjustability of the image settings, the user interface element further comprising an adjustability element that simultaneously modifies the plurality of image settings when the adjustability element is modified by a user.
  • 2. The image capture device of claim 1, wherein the logic that identifies the plurality of image settings applicable to the image further comprises: logic that receives a request to modify the image; and logic that identifies a combination of device settings that can be simultaneously modified.
  • 3. The image capture device of claim 1, wherein the image settings comprise at least a subset of: a brightness, a color intensity, a contrast, a sharpness, a shutter speed, a flash intensity, a flash duration, and an aperture size setting.
  • 4. The image capture device of claim 1, wherein the adjustability element further comprises a slidable user interface element, the slidable user interface element causing the image settings to be modified when the slidable user interface element is modified by the user.
  • 5. The image capture device of claim 1, wherein the adjustability element further comprises a two dimensional adjustability element, the two dimensional adjustability element providing a first adjustability element movable in a first axis configured to modify a first image setting and a second adjustability element movable in a second axis configured to modify a second image setting.
  • 6. The image capture device of claim 5, wherein the two dimensional adjustability element further comprises a dial user interface element, wherein the first axis comprises a circular axis and the second axis comprises a linear axis.
  • 7. The image capture device of claim 1, wherein the image settings further comprise a shutter speed setting and an aperture size setting, and the adjustability element further comprises a one dimensional adjustability element.
  • 8. The image capture device of claim 1, wherein the image settings further comprise a flash duration setting and an exposure setting, and the adjustability element further comprises a one dimensional adjustability element.
  • 9. The image capture device of claim 1, wherein the adjustability element further comprises a one dimensional adjustability element adjustable in a horizontal direction, the one dimensional adjustment further including a respective visual indicator indicating a numeral value for each of the image settings.
  • 10. The image capture device of claim 9, wherein the image settings further comprise a color setting, a sharpness setting, and a contrast setting.
  • 11. A method executed in an image capture device comprising the steps of: capturing an image via an image sensor associated with the image capture device; identifying a plurality of image settings applicable to the image; generating a user interface element comprising an adjustability element providing adjustability of the image settings; and simultaneously modifying the plurality of image settings when the adjustability element is modified by the user.
  • 12. The method of claim 11, wherein the step of identifying the plurality of image settings applicable to the image further comprises the steps of: receiving a request to modify the image; and identifying a combination of device settings that can be simultaneously modified.
  • 13. The method of claim 11, wherein the image settings comprise at least a subset of: a brightness, a color intensity, a hue, a contrast, a sharpness, a shutter speed, a flash intensity, a flash duration, a film speed setting, an exposure value setting, a white balance setting, a noise filtering setting, a focal distance setting, an exposure index setting and an aperture size setting.
  • 14. The method of claim 11, wherein the adjustability element further comprises a slidable user interface element, the slidable user interface element causing the image settings to be modified when the slidable user interface element is modified by the user.
  • 15. The method of claim 11, wherein the adjustability element further comprises a two dimensional adjustability element, the two dimensional adjustability element providing a first adjustability element movable in a first axis configured to modify a first image setting and a second adjustability element movable in a second axis configured to modify a second image setting.
  • 16. The method of claim 15, wherein the two dimensional adjustability element further comprises a dial user interface element, wherein the first axis comprises a circular axis and the second axis comprises a linear axis.
  • 17. The method of claim 11, wherein the image settings further comprise a shutter speed setting and an aperture size setting, and the adjustability element further comprises a one dimensional adjustability element.
  • 18. The method of claim 11, wherein the image settings further comprise a flash duration setting and an exposure setting, and the adjustability element further comprises a one dimensional adjustability element.
  • 19. The method of claim 11, wherein the adjustability element further comprises a one dimensional adjustability element adjustable in a horizontal direction, the one dimensional adjustment further including a respective visual indicator indicating a numeral value for each of the image settings.
  • 20. A system, comprising: means for initiating capture of an image via an image sensor means; means for identifying a plurality of image settings applicable to the image; means for generating a user interface element providing adjustability of the image settings; wherein the user interface element further comprises an adjustment means that simultaneously modifies the plurality of image settings when the adjustment means is modified by a user.
CROSS-REFERENCE TO RELATED APPLICATION

This application claims priority to co-pending U.S. provisional application entitled, “Image Capture Device Systems and Methods,” having Ser. No. 61/509,747, filed Jul. 20, 2011, which is entirely incorporated herein by reference.

Provisional Applications (1)
Number Date Country
61509747 Jul 2011 US