This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2014-135780, filed Jul. 1, 2014, the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to an electronic apparatus, a processing method and a storage medium.
Portable, battery-powered electronic apparatuses, such as tablet computers and smartphones, are now popular. Many such apparatuses incorporate a camera known as, for example, a web camera. In recent years, cameras capable of advanced photography, such as high-dynamic-range photography or burst photography (continuous shooting), have become increasingly common.
With a normal depth of field, only part of the photographed image is in focus, and the remaining parts are out of focus. This tendency is particularly conspicuous in macro photography, which generally yields an image whose central portion is in focus and whose peripheral portions are blurred.
An example of macro photography using a tablet computer or a smartphone is document capture photography. In this case, an image in which the entire image area is in focus (an omnifocal image) is required. However, as mentioned above, an omnifocal image is hard to obtain by macro photography.
One method of obtaining an omnifocal image by macro photography is to acquire a series of images while sweeping the focal point, and then to synthesize the images into an omnifocal image. It should be noted that such swept-focus photography requires a longer shooting time than normal burst photography (continuous shooting), because of the camera control involved.
In view of the above, it is necessary to prevent camera shake as far as possible during swept-focus photography, and to devise a user interface that, for example, clearly indicates that a shot is in progress so as to prevent unintentional interruption of the shot.
A general architecture that implements the various features of the embodiments will now be described with reference to the drawings. The drawings and the associated descriptions are provided to illustrate the embodiments and not to limit the scope of the invention.
Various embodiments will be described hereinafter with reference to the accompanying drawings.
In general, according to one embodiment, an electronic apparatus comprises a camera, a display, processing circuitry and display circuitry. The processing circuitry produces, by using first images of a first range photographed by the camera, a second image of the first range, the second image having a second quality higher than the first qualities of the first images. The display circuitry simultaneously displays both a view image of the camera, on a first area of a screen of the display, and a transition image being produced by the processing circuitry while the second image is produced, a quality of the transition image changing between the first qualities and the second quality.
Firstly, a first embodiment will be described.
The tablet 1 comprises a main body 11 and a touch screen display 12. The main body 11 has a thin box-shaped housing. The touch screen display 12 incorporates a flat panel display, and a sensor which detects the touch position of, for example, a finger on the screen of the flat panel display. The flat panel display is, for example, an LCD 12A. The sensor is, for example, a touch panel 12B. The touch panel 12B is provided to cover the screen of the LCD 12A.
As shown in the figure, the tablet 1 comprises a CPU 101, a system controller 102, a main memory 103, a graphics controller 104, a BIOS-ROM 105, a nonvolatile memory 106, a wireless communication device 107, an embedded controller (EC) 108, etc.
The CPU 101 is a processor which controls the operations of various components in the tablet 1. The CPU 101 executes various types of software loaded from the nonvolatile memory 106 onto the main memory 103. The software includes an operating system (OS) 200 and various application programs. The application programs include a camera application program 220 for photographing an image using the camera 13. The camera application program 220 will be described in detail later.
The CPU 101 also executes a BIOS stored in the BIOS-ROM 105. The BIOS is a program for hardware control.
The system controller 102 is a device which connects the local bus of the CPU 101 to various components. The system controller 102 contains various controllers for controlling various components, which include a memory controller for performing access control of the main memory 103.
The graphics controller 104 is a display controller which controls the LCD 12A used as the display monitor of the tablet 1. The LCD 12A displays screen images based on display signals generated by the graphics controller 104.
The wireless communication device 107 is a device which executes wireless communication, such as wireless LAN or 3G mobile communication. The EC 108 is a single-chip microcomputer including an embedded controller for power management. The EC 108 comprises a function of turning on and off the tablet 1 in accordance with a user's operation of a power button.
The camera application program 220, which operates on the tablet 1 having the above-described system configuration, will now be described in detail.
Referring first to the drawings, a consideration will now be given to capture photography of two facing pages of a document. As described above, such macro photography generally yields an image having its central portion in focus and its peripheral portions blurred, making an omnifocal image hard to obtain.
In view of this, the camera application program 220 comprises a function of synthesizing a series of images acquired by sweeping the focal point, thereby producing an omnifocal image. The method of “synthesizing a series of images acquired by sweeping the focal point to thereby produce an omnifocal image” will hereinafter be referred to as a “focal sweep”.
In the focal sweep, a series of images photographed while the focal point of the camera 13 is moved is synthesized into a single omnifocal image in which the entire image area is in focus.
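The embodiments leave the synthesis algorithm itself unspecified. As an illustration only, the following minimal sketch shows one common way to realize such a synthesis (per-pixel selection of the sharpest source image, a simple form of focus stacking); OpenCV and NumPy are assumed.

```python
# Minimal focus-stacking sketch of a "focal sweep" (illustrative only; the
# embodiments do not prescribe an algorithm). For each pixel, the source
# image with the highest local sharpness is selected.
import cv2
import numpy as np

def focal_sweep_synthesize(images):
    """Synthesize an omnifocal image from images taken at different focal points."""
    sharpness = []
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY).astype(np.float64)
        lap = np.abs(cv2.Laplacian(gray, cv2.CV_64F))       # local sharpness
        sharpness.append(cv2.GaussianBlur(lap, (9, 9), 0))  # smooth the map

    best = np.argmax(np.stack(sharpness), axis=0)  # sharpest image per pixel
    result = np.zeros_like(images[0])
    for i, img in enumerate(images):
        result[best == i] = img[best == i]
    return result
```

A production implementation would also register (align) the images before selection, since camera shake during swept-focus photography is explicitly called out above as a concern.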
The resolution (the number of pixels) of each image acquired by photography may be equal to or different from that of the omnifocal image as the final product; in other words, the resolution (the number of pixels) may change before and after the focal sweep processing. Further, any processing may be used in the focal sweep to produce a high-quality image: it is sufficient if an image that is in sharper focus, in at least a part thereof, than the original images can be acquired.
Assume, as an example, that 15 images acquired at different focal points are synthesized to produce one omnifocal image.
As shown in the figure, the user interface screen image produced by the camera application program 220 includes a camera preview display area a1, a synthesis result preview display area a2, a status icon display area a3 and a camera button display area a4.
The camera preview display area a1 is configured to display the images currently being picked up by the camera 13. The synthesis result preview display area a2 is configured to display the image currently being synthesized to produce an omnifocal image.
Namely, during photography, the camera application program 220 firstly provides the user, through the user interface screen image, with both the images being acquired by photography and the image being synthesized into an omnifocal image. By displaying the omnifocal image production process, the user's dissatisfaction at a seemingly unresponsive apparatus can be lessened.
The camera application program 220 can adopt various methods for displaying, in the synthesis result preview display area a2, the image being synthesized into an omnifocal image.
For example, a display target area b2 may be set within the image area of the camera 13, and the partial image in the display target area b2 displayed in the synthesis result preview display area a2.
For instance, assume that characters “A” and “B”, included in a character string “ABCDE” that exists in the image area of the camera 13, are displayed in the synthesis result preview display area a2, and that characters “C”, “D” and “E” are displayed in the camera preview display area a1, as shown in the figure.
When the partial image in the display target area b2 is displayed in the synthesis result preview display area a2, it may be enlarged therein, so that the user can check in detail the quality of the image being synthesized.
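How the partial image is enlarged into the area a2 is an implementation detail. The following is a minimal sketch in which `b2_rect` and `a2_size` are hypothetical parameters (the embodiments define neither).

```python
# Illustrative sketch: crop display target area b2 from the image being
# synthesized and enlarge it to fill the synthesis result preview area a2.
import cv2

def render_synthesis_preview(synthesized, b2_rect, a2_size):
    x, y, w, h = b2_rect                     # hypothetical (x, y, w, h)
    partial = synthesized[y:y + h, x:x + w]  # partial image in area b2
    # Nearest-neighbor enlargement avoids adding interpolation blur, so the
    # user sees the actual sharpness of the image being synthesized.
    return cv2.resize(partial, a2_size, interpolation=cv2.INTER_NEAREST)
```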
The status icon display area a3 is an area to display objects (icons) indicative of the progress of image pickup and the progress of image synthesis.
Namely, the camera application program 220 secondly provides the user, through the user interface screen image, with the progress of image pickup and the progress of image synthesis. Displaying these progress indicators lets the user know that the camera is currently photographing, thereby encouraging the user to hold the camera steady.
The camera button display area a4 is an area to display a camera button that enables the user to instruct the start of photography and, during photography, a camera button that enables the user to intentionally instruct interruption of the photography. After the photography finishes or is interrupted, the camera button for instructing the start of photography is displayed again.
As shown in the figure, the camera application program 220 comprises a controller 221, an image input module 222, a camera driving module 223, an operation input module 224, a synthesis processing module 225 and a user interface screen producing module 226.
The controller 221 is a module to control the operations of various modules in the camera application program 220. The image input module 222 is a module to input images picked up by the camera 13. The camera driving module 223 is a module to control the operations of the camera 13, including moving the focal point for a focal sweep. The operation input module 224 is a module to input a user operation via the touch panel 12B. The synthesis processing module 225 is a module to execute synthesis processing for producing an omnifocal image from a plurality of images input through the image input module 222. The synthesis processing module 225 also executes blur recovery processing after completing the synthesis processing, and stores the omnifocal image as the final product in the nonvolatile memory 106.
The user interface screen producing module 226 is a module to produce a user interface screen image of such a layout as described above, and to display it on the touch screen display 12 (LCD 12A).
Referring then to the flowchart in the drawings, the procedure of the photography and synthesis processing executed by the tablet 1 will be described.
In an initial state, no image pickup (recording) is performed, and photography is started when the camera button is pressed by a touch input operation on the touch screen display 12. When the start of photography has been instructed, the tablet 1 starts swept-focus burst photography (continuous shooting) (block A1). In general, the focal point cannot be changed during burst photography. Therefore, an operation procedure of interrupting burst photography, then moving the focal point, and then performing burst photography again is employed, as shown in the figure.
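A minimal sketch of this interrupt-move-resume procedure is given below; `camera` is a hypothetical stand-in for the camera driving module 223, and its methods are illustrative, not an actual camera API.

```python
# Sketch of swept-focus burst photography: the focal point cannot be moved
# during a burst, so the burst is interrupted, the focus moved, and the
# burst restarted. All camera methods here are hypothetical.
def swept_focus_burst(camera, focal_points, shots_per_point=1):
    images = []
    for focus in focal_points:
        camera.stop_burst()        # interrupt burst photography
        camera.set_focus(focus)    # move the focal point
        camera.start_burst()       # perform burst photography again
        for _ in range(shots_per_point):
            images.append(camera.capture_frame())
    camera.stop_burst()
    return images
```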
Further, after burst photography is started, the tablet 1 starts image synthesis processing for producing an omnifocal image (block A2). Although in this operation procedure the image synthesis processing is started while burst photography is still in progress, it may instead be started after all the images have been acquired.
During image synthesis processing, the tablet 1 displays the images being processed on the touch screen display 12, along with the images picked up by photography. After completing the image synthesis processing, the tablet 1 performs blur recovery processing (block A3), displays the final high-quality image on the touch screen display 12, and outputs it to the nonvolatile memory 106.
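A minimal sketch of running photography and synthesis in parallel while updating both preview areas follows; `capture`, `merge_into`, `show_live` and `show_transition` are hypothetical callables standing in for the modules described above, and `merge_into(None, frame)` is assumed to return the first frame unchanged.

```python
# Producer-consumer sketch: one thread acquires frames and feeds the live
# view (area a1); the main thread folds each frame into the growing
# synthesis result and shows it as the transition image (area a2).
import queue
import threading

def photograph_and_synthesize(capture, merge_into, show_live, show_transition,
                              n_shots):
    frames = queue.Queue()

    def producer():
        for _ in range(n_shots):
            frame = capture()     # image input module 222
            show_live(frame)      # camera preview display area a1
            frames.put(frame)
        frames.put(None)          # sentinel: photography finished

    threading.Thread(target=producer, daemon=True).start()

    result = None
    while (frame := frames.get()) is not None:
        result = merge_into(result, frame)  # synthesis processing module 225
        show_transition(result)             # synthesis result preview area a2
    return result
```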
The transition of the user interface screen image will now be described. In an initial state, an object falling within the image area of the camera 13 is displayed in the entire camera preview display area a1 (including the synthesis result preview display area a2). In this state, no icons are displayed in the status icon display area a3. Further, the camera button display area a4 displays a camera button d1 for instructing the start of photography.
When the camera button d1 is depressed, photography and synthesis processing are initiated, whereby the images obtained by photography are sequentially displayed in the camera preview display area a1, and the images produced by synthesizing them into an omnifocal image are sequentially displayed in the synthesis result preview display area a2. At this time, the status icon display area a3 displays an icon indicative of the progress (c1) of photography and the progress (c2) of synthesis. Further, the camera button display area a4 displays a camera button d2 for instructing interruption of photography. If the camera button d2 is depressed during photography, photography and synthesis are interrupted, and the apparatus returns to the initial state. The same applies when the return button, displayed as one of the basic buttons by the OS 200, is depressed.
After photography and synthesis are completed, the omnifocal image as the final product is displayed in the synthesis result preview display area a2. At this time, by a touch input operation on the synthesis result preview display area a2, the user can check the produced omnifocal image while, for example, moving the display range, and can then instruct saving or discarding of the image. As the button for instructing the saving or discarding, one of the basic buttons displayed by the OS 200, or a dedicated button prepared by the camera application program 220, may be used. When saving of the omnifocal image has been instructed, an icon e1, which visually indicates, in the form of, for example, a rotating circle, that processing is in progress, is displayed during saving to inform the user that the omnifocal image is being saved. Further, the display of the camera button display area a4 is returned to the camera button d1 for instructing the start of photography. When the return button is depressed, the omnifocal image disappears from the synthesis result preview display area a2, and the apparatus returns to the initial state.
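The screen transitions described above can be summarized as a small state machine. The sketch below is illustrative; the state names are inventions of this description, since the embodiments define the states only through the displayed buttons (d1, d2) and icons (c1, c2, e1).

```python
# Illustrative state machine for the user interface screen transitions.
from enum import Enum, auto

class UiState(Enum):
    INITIAL = auto()    # camera button d1 shown; no status icons
    SHOOTING = auto()   # camera button d2 shown; progress icons c1 and c2
    REVIEW = auto()     # omnifocal image shown in area a2
    SAVING = auto()     # rotating-circle icon e1 shown

TRANSITIONS = {
    (UiState.INITIAL, "press_d1"): UiState.SHOOTING,
    (UiState.SHOOTING, "press_d2"): UiState.INITIAL,      # interruption
    (UiState.SHOOTING, "press_return"): UiState.INITIAL,  # interruption
    (UiState.SHOOTING, "synthesis_done"): UiState.REVIEW,
    (UiState.REVIEW, "save"): UiState.SAVING,
    (UiState.REVIEW, "discard"): UiState.INITIAL,
    (UiState.REVIEW, "press_return"): UiState.INITIAL,
    (UiState.SAVING, "save_done"): UiState.INITIAL,
}

def on_event(state, event):
    return TRANSITIONS.get((state, event), state)  # ignore undefined events
```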
As described above, the tablet 1 realizes an effective interface for providing the user with images currently being synthesized into an omnifocal image, along with images obtained by photography.
A second embodiment will be described. Also in the second embodiment, it is assumed that the electronic apparatus is realized as a tablet 1 as in the first embodiment. Further, in the second embodiment, elements similar to those of the first embodiment are denoted by corresponding reference numbers, and no detailed description will be given thereof.
As shown in the figure, in the second embodiment, the camera preview display area a1 and the synthesis result preview display area a2 are arranged side by side on the user interface screen image.
At the start of photography or during photography, the camera preview display area a1 displays an object falling within the image area of the camera 13, or an image currently being picked up by the camera 13. Further, after the photography, the area a1 displays one (e.g., the first one) of the images already obtained by photography. In contrast, the synthesis result preview display area a2 displays an image currently being produced by synthesis for generating an omnifocal image, or the omnifocal image as a final product.
Thus, in the second embodiment, after the image pickup and synthesis processing, the camera preview display area a1 displays one of the images already picked up by the camera 13, and the synthesis result preview display area a2 displays the omnifocal image as the final product. At this time, if a touch input operation instructing movement of the display range is performed on either the camera preview display area a1 or the synthesis result preview display area a2, the camera application program 220 scrolls not only the area on which the operation was performed but also the other area, in synchronism. That is, the user can perform a so-called synchronous scroll.
Thus, the user can not only observe the omnifocal image generation process, but also browse both images while comparing corresponding portions in quality. As a result, the user can easily see to what degree the omnifocal image generated by the focal sweep is improved in quality compared with the original images obtained by photography.
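A minimal sketch of such a synchronous scroll, assuming a hypothetical view object exposing a `scroll_by` method for each preview area:

```python
# Illustrative synchronous scroll: a drag on either preview area scrolls
# both areas by the same offset, keeping corresponding portions of the
# photographed image and the omnifocal image aligned for comparison.
class SyncScroller:
    def __init__(self, view_a1, view_a2):
        self.views = (view_a1, view_a2)

    def on_drag(self, dx, dy):
        for view in self.views:   # move both areas in synchronism
            view.scroll_by(dx, dy)
```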
A third embodiment will be described. Also in the third embodiment, it is assumed that the electronic apparatus is realized as a tablet 1 as in the first embodiment. Further, in the third embodiment, elements similar to those of the first embodiment are denoted by corresponding reference numbers, and no detailed description will be given thereof.
In the third embodiment, the camera application program 220 provides a user interface that enables the user to designate, during a focal sweep, a range by which the focus of the camera is moved.
Assume here that capture photography of two facing pages of a document, as described above, is to be performed.
During a focal sweep, the tablet 1 of the third embodiment accepts designation of a focus start position (block B1), and designation of a focus end position (block B2). Subsequently, the tablet 1 calculates focal points based on the designated focus start and end positions (block B3). After completing the focal point calculation, the tablet 1 starts swept-focus burst photography (continuous shooting) (block B4), and also starts image synthesis processing for generating an omnifocal image (block B5), as in the first embodiment.
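The embodiments do not state how the intermediate focal points are calculated in block B3. One plausible approach, sketched below, interpolates uniformly in diopters (the reciprocal of the focus distance), since depth of field is roughly constant per diopter step; the function and its parameters are illustrative.

```python
# Illustrative calculation of focal points between the designated focus
# start and end positions (block B3), stepping uniformly in diopters.
def calc_focal_points(start_m, end_m, n_points):
    """Return n_points focus distances (meters) from start_m to end_m."""
    d_start, d_end = 1.0 / start_m, 1.0 / end_m  # distances -> diopters
    step = (d_end - d_start) / (n_points - 1)    # assumes n_points >= 2
    return [1.0 / (d_start + i * step) for i in range(n_points)]
```

For instance, calc_focal_points(0.10, 0.30, 15) yields 15 focus distances between 10 cm and 30 cm, matching the 15-image example given earlier.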
As in the first embodiment, the tablet 1 displays the images obtained by photography and the images being processed on the touch screen display 12 during image synthesis processing, performs blur recovery processing after the image synthesis processing (block B6), displays the final high-quality image on the touch screen display 12, and outputs it to the nonvolatile memory 106.
As described above, since the user is enabled to explicitly designate the range of focus movement, more accurate focus range control is possible.
A fourth embodiment will be described. Also in the fourth embodiment, it is assumed that the electronic apparatus is realized as a tablet 1 as in the first embodiment. Further, in the fourth embodiment, elements similar to those of the first embodiment are denoted by corresponding reference numbers, and no detailed description will be given thereof.
In the first to third embodiments, burst photography is performed while the focus of the camera is moved, and the plurality of images obtained by the burst photography are synthesized into a single high-quality omnifocal image. In contrast, in the fourth embodiment, the camera application program 220 includes a video mode in which the camera 13 acquires video images (moving images), and provides a function of synthesizing the images obtained by video-mode photography and displaying the resulting high-definition video substantially in real time (with a slight delay).
For instance, when the scene is so dark that details cannot be seen clearly, this function can be used to synthesize a plurality of noisy video images into a single low-noise video image, and the low-noise video can be displayed during photography at a frame rate corresponding to the number of synthesized images.
Assume here that the camera 13 can acquire video data at 30 fps, and that the camera application program 220 executes synthesis processing using 15 video images at a time. In this case, since one high-quality image is obtained per 15 photographed video images, video data can be produced and displayed at 2 fps.
This enables the user to view high-quality video substantially in real time, and to utilize the camera application program 220 as software that complements the hardware performance of the camera 13.
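A minimal sketch of such video-mode synthesis, assuming plain frame averaging as the noise-reduction method (the embodiments do not specify one); averaging N frames with uncorrelated noise reduces the noise level by roughly a factor of sqrt(N).

```python
# Illustrative video-mode synthesis: average every 15 frames of a 30 fps
# stream into one low-noise frame, yielding output at 30 / 15 = 2 fps.
import numpy as np

def denoise_stream(frames, group_size=15):
    """frames: iterable of HxWx3 uint8 arrays. Yields one frame per group."""
    batch = []
    for frame in frames:
        batch.append(frame.astype(np.float32))
        if len(batch) == group_size:
            yield np.mean(batch, axis=0).round().astype(np.uint8)
            batch.clear()
```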
Since the processing of each embodiment can be realized by software (a program), the same advantages as those of the embodiments can easily be achieved by installing the software in a computer through a computer-readable storage medium storing the software.
The various modules of the systems described herein can be implemented as software applications, hardware and/or software modules, or components on one or more computers, such as servers. While the various modules are illustrated separately, they may share some or all of the same underlying logic or code.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
Number | Date | Country | Kind |
---|---|---|---
2014-135780 | Jul. 1, 2014 | JP | national