The present disclosure relates to an image processing apparatus, an image processing method, and a program.
For example, an image such as a picture captured at a tourist destination may often contain subjects other than a particular subject. Such an image may not be what the user (for example, the image-capturing person or a particular subject) intended.
Meanwhile, techniques for providing an image in which a moving subject has been removed from the background are being developed. An example of such techniques is the technique disclosed in Japanese Patent Application Laid-Open Publication No. 2006-186637.
For example, Japanese Patent Application Laid-Open Publication No. 2006-186637 discloses a technique for combining a background image of a particular subject, generated based on a plurality of images, with a subject region image corresponding to a particular subject region extracted from a subject image including the particular subject. Thus, with the technique disclosed in Japanese Patent Application Laid-Open Publication No. 2006-186637, it may be possible to obtain an image in which a moving subject has been removed from the background.
However, an image from which a moving object is removed may not necessarily be obtained, because the technique disclosed in Japanese Patent Application Laid-Open Publication No. 2006-186637 uses a simple arithmetic mean to obtain the image in which the moving subject is removed from the background. More specifically, when the technique disclosed in Japanese Patent Application Laid-Open Publication No. 2006-186637 is used, if there is an object that rarely moves, or if no particular subject region containing the entire particular subject can be obtained because of a moving object, it is very difficult to obtain an image from which the moving object is removed.
In accordance with an embodiment of the present disclosure, there is provided a novel and improved image processing apparatus, image processing method, and program, capable of obtaining an image from which a moving object is removed, based on a plurality of still images.
According to an embodiment of the present disclosure, there is provided an image processing apparatus which includes an extraction unit for extracting a first still portion in an image based on a plurality of still images and for extracting at least a part of a second still portion corresponding to a portion that is not extracted as the first still portion in an image based on the plurality of still images; and a combining unit for combining the first still portion and the second still portion to generate a combined image.
According to another embodiment of the present disclosure, there is provided an image processing method which includes extracting a first still portion in an image based on a plurality of still images; extracting at least a part of a second still portion corresponding to a portion that is not extracted as the first still portion in an image based on the plurality of still images; and combining the first still portion and the second still portion to generate a combined image.
According to another embodiment of the present disclosure, there is provided a program for causing a computer to execute a process which includes extracting a first still portion in an image based on a plurality of still images; extracting at least a part of a second still portion corresponding to a portion that is not extracted as the first still portion in an image based on the plurality of still images; and combining the first still portion and the second still portion to generate a combined image.
According to embodiments of the present disclosure, it is possible to obtain an image from which a moving object is removed, based on a plurality of still images.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
The description will be given in the following order.
1. Image Processing Method according to an Embodiment of the Present Disclosure
2. Image Processing Apparatus according to an Embodiment of the Present Disclosure
3. Program according to an Embodiment of the Present Disclosure
An image processing method according to an embodiment of the present disclosure will be described first, and then a configuration of an image processing apparatus according to an embodiment of the present disclosure will be described. The following explanation assumes that the image processing apparatus according to the present embodiment performs the process according to the image processing method of the present embodiment.
A process according to the image processing method of the present embodiment will be described below, by taking as an example the image obtained by capturing, in the case as shown in
In this example, the recording medium according to the present embodiment stores an image. Examples of the recording medium include a storage unit (to be described later) or a RAM (Random Access Memory; not shown) provided in the image processing apparatus according to the present embodiment, a removable external recording medium detachably connected to the image processing apparatus according to the present embodiment, and a recording medium provided in an external device. The external device is connected to the image processing apparatus according to the present embodiment via a network (or directly, not through a network) on a wired or wireless connection. When an image stored in a recording medium provided in an external device is processed, the image processing apparatus according to the present embodiment obtains the image from the external device by transmitting an image transmission request to the external device. The image transmission request is a signal for causing the external device to transmit an image.
Further, the images to be processed by the process according to the image processing method of the present embodiment may include both captured images and images stored in a recording medium. In other words, an image to be processed by the process according to the image processing method of the present embodiment may be, for example, either a captured image or an image stored in a recording medium, or both.
Moreover, an image to be processed in the process according to the image processing method of the present embodiment by the image processing apparatus according to the present embodiment may be a still image. In this case, a still image according to the present embodiment may be a frame image constituting a moving image as well as a still image. A frame image according to the present embodiment may be, for example, an image corresponding to a single frame of a moving image (which corresponds to a single field of a moving image, if the moving image is an interlaced image). The following explanation will be made with respect to a case where an image to be processed in the process according to the image processing method of the present embodiment is a still image according to the present embodiment. In addition, the still image may be simply referred to as “image” hereinafter.
[1] Problems of the Related Art
Prior to the explanation of the image processing method according to the present embodiment, an example of problems of the related art such as a technique disclosed in, for example, Japanese Patent Application Laid-Open Publication No. 2006-186637 will now be described.
An image processing apparatus which employs the related art (hereinafter, may be referred to as “image processing apparatus of related art”) specifies a region which contains a particular subject in an original image from a subject image which contains the particular subject in the original image (AR shown in B of
For example, as shown in
The image processing apparatus of related art specifies a region that contains a particular subject from a subject image that contains the particular subject in an original image (AR shown in B of
Further, the image processing apparatus of related art generates a background image (B1 shown in B of
In this case, a moving object (or a portion of the moving object) remains in the background image, and another subject is contained in a subject region image. Thus, in a case where a particular subject is hidden by any other subjects and the number of moving objects is larger than that shown in
The image processing apparatus of related art generates a background image using a simple arithmetic mean, and combines the generated background image and a subject region image. In this case, a moving object (or a portion of the moving object; for example, B1 shown in B of
[2] Overview of Image Processing Method according to the Embodiment
For example, as shown in
In contrast, the image processing apparatus according to the present embodiment can obtain an image from which a moving object is removed based on a plurality of still images, for example, by performing the following processes: (1) a first still portion extraction process, (2) a second still portion extraction process, and (3) a combined image generation process.
(1) First Still Portion Extraction Process
The image processing apparatus according to the present embodiment extracts a portion in which a moving object is not contained in an image (hereinafter, referred to as “first still portion”) based on a plurality of still images. More specifically, for example, the image processing apparatus according to the present embodiment calculates a motion vector based on a plurality of still images and extracts a first still portion based on the calculated motion vector.
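As an illustration only (not the disclosed implementation), the following Python sketch shows one way such a motion-vector-based extraction of a first still portion could look; the use of OpenCV's Farneback dense optical flow and the `motion_threshold` value are assumptions made for this sketch.

```python
import cv2
import numpy as np

def extract_first_still_portion(frames, motion_threshold=1.0):
    """Treat pixels whose motion vector stays near zero across all
    consecutive frame pairs as the first still portion."""
    assert len(frames) >= 2
    gray = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in frames]
    still_mask = np.ones(gray[0].shape, dtype=bool)

    for prev, curr in zip(gray[:-1], gray[1:]):
        # Dense motion vectors: one 2D vector per pixel.
        flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        magnitude = np.linalg.norm(flow, axis=2)
        # Keep only pixels whose vector is zero, or small enough to be
        # regarded as zero by the threshold determination.
        still_mask &= magnitude < motion_threshold

    still_image = np.zeros_like(frames[0])
    still_image[still_mask] = frames[0][still_mask]
    return still_mask, still_image
```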
In this example, the image processing apparatus according to the present embodiment may extract a still portion that contains a subject to be extracted as the first still portion. When a particular person is the subject to be extracted, the image processing apparatus according to the present embodiment specifies the subject to be extracted, for example, by using face recognition technology. Such face recognition technology detects feature points such as the eyes, nose, mouth, and facial skeleton, or detects a region similar to the brightness distribution and structural pattern of a face. In addition, when a particular object is the subject to be extracted, the image processing apparatus according to the present embodiment specifies the subject to be extracted by using any object recognition technology. Additionally, the subject to be extracted may be specified based on an operation in which a user specifies a subject (an example of a user operation). The image processing apparatus according to the present embodiment may repeat the first still portion extraction process, increasing the number of still images to be processed, until a first still portion that contains the entire specified subject to be extracted is extracted. In this case, the image processing apparatus according to the present embodiment may determine whether the entire subject to be extracted is contained in the first still portion by detecting the contour of the subject to be extracted using an edge detection method. In addition, in the image processing apparatus according to the present embodiment, the determination of whether the entire subject to be extracted is contained in the first still portion is not limited to the above example.
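As a purely illustrative sketch of how a subject to be extracted might be specified and checked, the following assumes OpenCV's bundled Haar face cascade and Canny edge detection; the function name `subject_fully_contained`, the cascade choice, and the thresholds are hypothetical and are not taken from the disclosure.

```python
import cv2
import numpy as np

# Hypothetical helper (not part of the disclosure): locate the subject to be
# extracted with a face detector, then judge whether its contour lies
# entirely inside the currently extracted first still portion.
_face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def subject_fully_contained(frame, still_mask):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = _face_detector.detectMultiScale(gray, scaleFactor=1.1,
                                            minNeighbors=5)
    if len(faces) == 0:
        return False  # no subject to be extracted could be specified

    x, y, w, h = faces[0]             # treat the first face as the subject
    edges = cv2.Canny(gray[y:y + h, x:x + w], 100, 200)  # rough contour
    rows, cols = np.nonzero(edges)
    if rows.size == 0:
        return False

    # The entire subject is regarded as contained when every contour pixel
    # falls inside the extracted still portion.
    return bool(np.all(still_mask[rows + y, cols + x]))
```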
Further, the first still portion extraction process according to the present embodiment is not limited to the case where a still portion containing the entire subject to be extracted is extracted as the first still portion. For example, the image processing apparatus according to the present embodiment may extract a region that contains only a portion of the specified subject to be extracted as the first still portion. Even when a region that contains only a portion of the specified subject to be extracted is extracted as the first still portion, the process according to the image processing method of the present embodiment, as described later, allows an image from which a moving object is removed to be obtained.
(2) Second Still Portion Extraction Process
The image processing apparatus according to the present embodiment extracts a portion (hereinafter, referred to as “second still portion”) not extracted as the first still portion in an image, based on a plurality of still images. More specifically, the image processing apparatus according to the present embodiment, for example, calculates a motion vector based on a plurality of still images and extracts a second still portion based on the calculated motion vector. In this case, the plurality of still images to be processed in the second still portion extraction process may or may not include some or all of the plurality of still images processed in the first still portion extraction process.
In this example, the image processing apparatus according to the present embodiment may extract a new second still portion, for example, each time the number of still images to be processed increases. In addition, the second still portion extraction process performed in the image processing apparatus according to the present embodiment is not limited to the above example. For example, when the number of still images to be processed is predetermined, the image processing apparatus according to the present embodiment extracts the second still portion by processing the still images to be processed sequentially.
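A minimal sketch of the second still portion extraction, assuming the same dense-optical-flow criterion as in the earlier sketch; `extracted_mask` is a hypothetical Boolean mask holding everything already extracted (the first still portion plus any earlier second still portions).

```python
import cv2
import numpy as np

def extract_new_second_still_portion(prev_frame, new_frame, extracted_mask,
                                     motion_threshold=1.0):
    """Return the mask of newly found still pixels that have not yet been
    extracted (neither as the first still portion nor as an earlier
    second still portion)."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    new_gray = cv2.cvtColor(new_frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, new_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    still_now = np.linalg.norm(flow, axis=2) < motion_threshold
    # Only the portion not extracted so far becomes a new second still portion.
    return still_now & ~extracted_mask
```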
(3) Combined Image Generation Process
The image processing apparatus according to the present embodiment combines the first still portion and the second still portion to generate a combined image.
In this example, when the first still portion is extracted and the second still portion is not yet extracted, the image processing apparatus according to the present embodiment regards only an image representing the extracted first still portion as a combined image. Then, when the second still portion is extracted, the image processing apparatus according to the present embodiment combines the first still portion and the second still portion to generate a combined image.
Furthermore, the image processing apparatus according to the present embodiment may generate a new combined image each time a new second still portion is extracted in the second still portion extraction process. More specifically, the image processing apparatus according to the present embodiment, each time a new second still portion is extracted, may combine the first still portion or the previously generated combined image with the newly extracted second still portion to generate a new combined image.
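The following sketch illustrates one way this incremental combining could be maintained, under the assumption that still portions are represented as Boolean pixel masks; the helper name `update_combined_image` and the mask representation are illustrative, not the disclosed implementation.

```python
import numpy as np

def update_combined_image(combined_image, combined_mask,
                          new_still_mask, source_frame):
    """Each time a new second still portion is extracted, paste its pixels
    (taken from the frame in which they were judged to be still) into the
    previously generated combined image."""
    paste = new_still_mask & ~combined_mask
    combined_image[paste] = source_frame[paste]
    combined_mask |= paste
    return combined_image, combined_mask

# Usage (assumed variables): start from the first still portion only, then
# merge each newly extracted second still portion as it becomes available.
#   combined_image = np.zeros_like(first_frame)
#   combined_image[first_still_mask] = first_frame[first_still_mask]
#   combined_mask = first_still_mask.copy()
#   combined_image, combined_mask = update_combined_image(
#       combined_image, combined_mask, new_second_mask, current_frame)
```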
Moreover, for example, the image processing apparatus according to the present embodiment may cause the generated combined image to be recorded on a recording medium or may transmit the generated combined image to an external device.
In this example, the image processing apparatus according to the present embodiment causes a final combined image generated in the process (combined image generation process) of the above item (3) to be recorded on a recording medium and/or to be transmitted to an external device. However, the process in the image processing apparatus according to the present embodiment is not limited to the above example. For example, even before the final combined image is obtained, i.e. at an intermediate stage of the process (combined image generation process) of the above item (3), the image processing apparatus according to the present embodiment may cause the generated combined image to be recorded on a recording medium or to be transmitted to an external device.
As in the above example, a combined image obtained at an intermediate stage of the process (combined image generation process) of the above item (3) can be preserved by recording it on a recording medium and/or by transmitting it to an external device. Thus, it is possible to avoid a situation where no combined image is obtained at all, for example, even when an accident occurs for some reason (e.g., power is shut off, or the image pickup device is moved) at an intermediate stage of the process (combined image generation process) of the above item (3).
In this example, examples of the timing at which a combined image is recorded or transmitted at an intermediate stage of the process (combined image generation process) of the above item (3) include a timing set according to the expected time taken to obtain a final combined image and a timing set according to the progress of the process (or the percentage of completion of a combined image). An example of the timing set according to the expected time taken to obtain a final combined image is a timing at which the recording or transmission is performed at intervals of one second when it is expected to take three seconds until the final combined image is obtained. In addition, an example of the timing set according to the progress of the process (or the percentage of completion of a combined image) is a timing at which the recording or transmission is performed each time the progress of the process (or the percentage of completion of a combined image) reaches a predetermined percentage.
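A small sketch of such a timing decision, assuming hypothetical parameters (`interval_seconds`, `progress_step`); the concrete values are only examples corresponding to the one-second interval and predetermined-percentage cases mentioned above, and the timestamps would come from, e.g., `time.time()` on the caller's side.

```python
def should_save_intermediate(now, last_saved_time, completion_ratio,
                             last_saved_ratio, interval_seconds=1.0,
                             progress_step=0.25):
    """Record/transmit the intermediate combined image either at fixed
    intervals (e.g., every second when three seconds are expected until the
    final combined image is obtained) or whenever the completion ratio has
    advanced by a predetermined percentage since the last save."""
    if now - last_saved_time >= interval_seconds:
        return True
    if completion_ratio - last_saved_ratio >= progress_step:
        return True
    return False
```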
For example, the image processing apparatus according to the present embodiment performs the process (first still portion extraction process) of the above item (1), the process (second still portion extraction process) of the above item (2), and the process (combined image generation process) of the above item (3), as a process according to the image processing method of the present embodiment.
As shown in
In this example, as shown in A of
However, when a still portion that contains the entire subject to be extracted is extracted as the first still portion, the entire subject O to be extracted is contained in the extracted first still portion at the point in time when the first still portion is extracted. In other words, even if the subject O moves after the completion of the process (first still portion extraction process) of the above item (1), the subject O is already contained in the first still portion. Thus, a combined image generated in the process (combined image generation process) of the above item (3) will contain the entire subject O.
Therefore, for example, when a captured image obtained by capturing a subject to be captured (A shown in
Therefore, because the image processing apparatus according to the present embodiment performs the process according to the image processing method of the present embodiment, it is possible to shorten the time during which the subject to be captured has to remain stationary and to reduce the burden on the subject to be captured.
Further, the process in the image processing apparatus according to the present embodiment has been described above by taking an example in which a subject to be captured is extracted, another portion is extracted, and then they are combined. However, the process in the image processing apparatus according to the present embodiment is not limited to the above example. For example, after the entire image has been extracted and combined in advance, the image processing apparatus according to the present embodiment can also extract the subject to be captured, for example by using a process of an object recognition technology, based on a captured image obtained by capturing the subject to be captured after it has moved to a position corresponding to a desired position in the entire image. Even in this case, the image processing apparatus according to the present embodiment can generate an image that contains the subject to be captured, for example, by combining the combined image obtained from the previously obtained extraction result with the portion of the subject to be captured extracted based on that captured image.
Furthermore, when the extraction of the first still portion is completed (when the process (first still portion extraction process) of the above item (1) is completed), the image processing apparatus according to the present embodiment may notify the user that the extraction of the first still portion is completed. In this case, examples of the user receiving the notification that the extraction of the first still portion is completed include a holder of the image processing apparatus according to the present embodiment, a person who is the subject to be captured, and an image-capturing person who captures the subject to be captured using an image pickup device.
By notifying a user that the extraction of the first still portion is completed, the image processing apparatus according to the present embodiment can inform the subject to be captured, directly or indirectly, whether or not the subject is allowed to move. Thus, it is possible to improve the convenience of the user by allowing the image processing apparatus according to the present embodiment to notify the user that the extraction of the first still portion is completed.
When the process (first still portion extraction process) of the above item (1) is completed, the image processing apparatus according to the present embodiment transmits a notification command to the image pickup device 10. The image pickup device 10 is connected to the image processing apparatus via a network (or directly) on a wired or wireless connection. When the image pickup device 10 receives the notification command from the image processing apparatus according to the present embodiment, the image pickup device 10 performs a visual notification by turning on a lamp based on the received notification command (B shown in
Further, the notification control in the image processing apparatus according to the present embodiment is not limited to the visual notification shown in
Furthermore, in the above example, there has been described an example in which the image processing apparatus according to the present embodiment transmits a notification command to another device that performs a notification, and thus causes the device to notify a user that the extraction of the first still portion is completed. However, the notification control of the image processing apparatus according to the present embodiment is not limited to the above example. For example, when the image processing apparatus according to the present embodiment itself performs a notification, such as when the image processing apparatus according to the present embodiment is the image pickup device 10 shown in
In this example, in order to make a comparison with the related art, such as the technique disclosed in Japanese Patent Application Laid-Open Publication No. 2006-186637, an example of results obtained by the process according to the image processing method of the present embodiment will be described using the examples used to explain the problems that may occur in the above-mentioned related art.
The image processing apparatus according to the present embodiment extracts a first still portion based on a plurality of original images (B1 shown in
Thus, as shown in A of
As shown in
Therefore, the image processing apparatus according to the present embodiment can obtain an image from which the moving object is removed, based on a plurality of still images, for example, by performing the process (first still portion extraction process) of the above item (1) to the process (combined image generation process) of the above item (3) as the process according to the image processing method of the present embodiment.
Further, the process according to the image processing method of the present embodiment is not limited to the process (first still portion extraction process) of the above item (1) to the process (combined image generation process) of the above item (3). For example, the image processing apparatus according to the present embodiment may perform a process (capturing process) for capturing an image as part of the process according to the image processing method of the present embodiment. In the capturing process, the image processing apparatus according to the present embodiment, for example, causes an image pickup unit (to be described later) to perform the capturing, or controls an external image pickup device to capture an image. When the capturing process is performed, the image processing apparatus according to the present embodiment can regard a captured image obtained by the capturing process as an image to be processed in the process (first still portion extraction process) of the above item (1) and/or the process (second still portion extraction process) of the above item (2).
Moreover, the image processing apparatus according to the present embodiment may cause a combined image generated in the process (combined image generation process) of the above item (3) to be displayed on a display screen (display control process), by using the image processing method according to the present embodiment. An example of the display screen on which a combined image according to the present embodiment is displayed may include a display screen of a display unit (to be described later), and a display screen of an external device connected via a network (or directly, not through a network) on a wired or wireless connection.
In this example, an example of the display control process according to the present embodiment is a process for causing a final combined image generated in the process (combined image generation process) of the above item (3) to be displayed on a display screen. However, the display control process according to the present embodiment is not limited to the above example. For example, the image processing apparatus according to the present embodiment may cause the progress made in the process according to the image processing method of the present embodiment, up until a final combined image is obtained, to be displayed on the display screen.
The image processing apparatus according to the present embodiment extracts a first still portion based on a plurality of still images, and causes an image (combined image) indicating the extracted first still portion to be displayed on a display screen (A shown in
Further, for example, the image processing apparatus according to the present embodiment may apply a color to a portion not extracted as the first still portion in an image (an example of a portion that is not included in either of the first still portion or second still portion), and cause the colored combined image to be displayed on a display screen. In this example, the image processing apparatus according to the present embodiment may apply a monochromatic color such as gray, for example, as shown in A of
When the first still portion is extracted, the image processing apparatus according to the present embodiment extracts a second still portion based on a plurality of still images. The image processing apparatus according to the present embodiment, for example, each time a new second still portion is extracted, may combine the first still portion or the previously generated combined image with the newly extracted second still portion to generate a new combined image. Then, the image processing apparatus according to the present embodiment, each time a combined image is generated, causes the generated combined image to be displayed on a display screen (B to G of
Furthermore, the image processing apparatus according to the present embodiment, for example, applies a color to a portion that is not included in either of the first still portion or second still portion in an image, and causes the colored combined image to be displayed on a display screen. In this example, although the image processing apparatus according to the present embodiment applies a monochromatic color such as gray, for example, as shown in B to G of
For example, in the images displayed on the display screen by performing the process as the above example in the image processing apparatus according to the present embodiment, the area of a portion (a portion to which a gray color shown in A to F of
The progress display that indicates the progress of the process according to the image processing method of the present embodiment is not limited to the example in
For example, as shown in
Furthermore, for example, the image processing apparatus according to the present embodiment may represent results obtained by the process according to the image processing method of the present embodiment by applying color to the results.
The image processing apparatus according to the present embodiment applies the color “Red” to a first still portion extracted in the process (first still portion extraction process) of the above item (1) (A shown in
For example, even when the progress of the process according to the image processing method of the present embodiment is represented by colors as shown in
For example, the image processing apparatus of the present embodiment can implement the progress of the process as shown in
In this example, the image processing apparatus according to the present embodiment causes results obtained by the process according to the image processing method of the present embodiment to be displayed on a display screen, where the results are obtained by performing each of the process (first still portion extraction process) of the above item (1) and the process (second still portion extraction process) of the above item (2) on the still images to be processed. However, the display control process according to the present embodiment is not limited to the above example. For example, the image processing apparatus of the present embodiment may cause results obtained by the process according to the image processing method of the present embodiment to be displayed on a display screen, where the results are obtained by performing the process on a reduced image into which a still image to be processed is reduced.
More specifically, for example, in the process (first still portion extraction process) of the above item (1), the image processing apparatus according to the present embodiment extracts a first still portion corresponding to a reduced image into which a still image to be processed is reduced, based on the reduced image. In addition, for example, each time the number of still images to be processed increases in the process (second still portion extraction process) of the above item (2), the image processing apparatus according to the present embodiment extracts a new second still portion corresponding to a reduced image into which the still image to be processed is reduced, based on the reduced image. In addition, in the process (combined image generation process) of the above item (3), for example, the image processing apparatus of the present embodiment combines the first still portion corresponding to the reduced image, or the previously generated combined image corresponding to the reduced image, with a newly extracted second still portion corresponding to the reduced image. In this example, the image processing apparatus of the present embodiment, for example, performs the combination each time a second still portion corresponding to the reduced image is newly extracted in the process (combined image generation process) of the above item (3), thereby generating a new combined image corresponding to the reduced image. Then, in the above-mentioned display control process, the image processing apparatus of the present embodiment, for example, causes the generated combined image to be displayed on a display screen each time a combined image corresponding to the reduced image is generated.
Further, for example, as described above, when results obtained by performing the process according to the image processing method of the present embodiment on the reduced image are displayed on a display screen, the image processing apparatus of the present embodiment performs the process on the still image to be processed after a final combined image corresponding to the reduced image is obtained. In this example, the process for the progress display on the reduced image has a significantly reduced load compared with the process for the progress display on the still image to be processed. That is, the time necessary to perform the process for the progress display on the reduced image is shorter than the time necessary to perform the process for the progress display on the still image to be processed.
For example, as described above, the image processing apparatus of the present embodiment allows the results obtained by the process according to the image processing method of the present embodiment performed with respect to the reduced image to be displayed on a display screen, thereby reducing the time taken from the start to the completion of the progress display. Therefore, when the progress display that indicates a progress of the process according to the image processing method of the present embodiment is performed, the user latency can be reduced.
[3] Specific Example in Process according to Image Processing Method of the Present Embodiment
Next, the process according to the image processing method of the present embodiment will be described in more detail.
(i) Process for obtaining a Desired Combined Image
As shown in A and B of
Even when the unnecessary subject A1 as shown in A of
(ii) Process for Extracting Still Portion
The image processing apparatus according to the present embodiment divides a still image into blocks, as A1 shown in A of
When calculating a motion vector, the image processing apparatus of the present embodiment, for example, determines that a portion whose vector is 0 (zero), or a portion whose vector has a value regarded as 0 (zero) by threshold determination using a threshold pertaining to the absolute value of the vector, is a still portion. Then, the image processing apparatus of the present embodiment extracts the image portion corresponding to a block determined to be a still portion as a still region (D shown in
The image processing apparatus according to the present embodiment extracts still portions (the first still portion and second still portion) by, for example, calculating the motion vector described above, in the process (first still portion extraction process) of the above item (1) and the process (second still portion extraction process) of the above item (2), respectively. In addition, the process for extracting still portions according to the present embodiment is not limited to the process described with reference to
Although
For example, the image processing apparatus of the present embodiment can prevent a region that is moved even slightly from being extracted as a still portion by performing the process for the third example shown in
As described above, the image processing apparatus according to the present embodiment can specify a particular subject to be extracted by performing a face detection process or a contour extraction process, or by performing a process of any object recognition technology. Thus, for example, even when a plurality of subjects are contained in a portion extracted as a still portion, the image processing apparatus according to the present embodiment can extract a first still portion that contains only the person who is the particular subject, by performing a face detection process or a contour extraction process on the portion extracted as a still portion (e.g., A shown in
The image processing apparatus according to the present embodiment extracts still portions (the first still portion and second still portion), for example, by performing the process for each of the first to third examples or the process for the fourth example, in the process (first still portion extraction process) of the above item (1) and the process (second still portion extraction process) of the above item (2). In addition, the process for extraction of a still portion in the image processing apparatus according to the present embodiment is not limited to the processes according to the first to fourth examples.
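For illustration, the block division and threshold determination described above could be sketched as follows; the SAD block-matching search, the block size, and the threshold `eps` are assumptions made for this sketch rather than the disclosed method.

```python
import numpy as np

def block_still_map(prev_gray, curr_gray, block=16, search=4, eps=1.0):
    """Divide the image into blocks, estimate one motion vector per block by
    a simple SAD block-matching search, and mark a block as a still portion
    when its vector is zero or small enough to be regarded as zero."""
    h, w = prev_gray.shape
    rows, cols = h // block, w // block
    still = np.zeros((rows, cols), dtype=bool)
    prev = prev_gray.astype(np.int32)
    curr = curr_gray.astype(np.int32)

    for by in range(rows):
        for bx in range(cols):
            y, x = by * block, bx * block
            ref = curr[y:y + block, x:x + block]
            best_sad, best_vec = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if yy < 0 or xx < 0 or yy + block > h or xx + block > w:
                        continue
                    sad = np.abs(prev[yy:yy + block, xx:xx + block] - ref).sum()
                    if best_sad is None or sad < best_sad:
                        best_sad, best_vec = sad, (dx, dy)
            # Threshold determination on the absolute value of the vector.
            still[by, bx] = np.hypot(*best_vec) < eps
    return still
```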
(iii) Deviation Correction Process
As described in the above item (ii), the image processing apparatus according to the present embodiment extracts a still portion, for example, based on the calculated motion vector, in each of the process (first still portion extraction process) of the above item (1) and the process (second still portion extraction process) of the above item (2). In this example, for example, when there are various deviations such as a vertical direction deviation, a horizontal direction deviation, or a rotation direction deviation in the still image to be processed, there is a possibility that the extraction accuracy of a still portion is deteriorated due to these various deviations.
As shown in B of
Thus, the image processing apparatus according to the present embodiment corrects one or more deviations of the vertical direction, horizontal direction, and rotation direction in a still image to be processed (e.g., C shown in
The image processing apparatus according to the present embodiment calculates a motion vector (C shown in
In this example, for example, when the still image to be processed is an image captured by an image pickup device, the motion vector due to various deviations such as a rotation direction deviation depends on a motion that has occurred in the image pickup device. Thus, the image processing apparatus according to the present embodiment calculates the rotation angle, center, and shift amount from a plurality of vector values indicating one motion (the rotation angle is calculated in the example of
The image processing apparatus according to the present embodiment calculates the motion vector (C shown in
In this example, for example, when the still image to be processed is an image captured by an image pickup device, the motion vector due to various deviations such as a horizontal direction deviation depends on a motion that has occurred in the image pickup device. Thus, the image processing apparatus according to the present embodiment calculates the rotation angle, center, and shift amount from a plurality of vector values indicating one motion (the shift amount is calculated in the example of
The image processing apparatus according to the present embodiment, for example, corrects one or more deviations of the vertical direction, horizontal direction, and rotation direction in the still image to be processed, as shown in
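The deviation correction could, for example, be sketched as below, assuming OpenCV's sparse optical flow and a partial affine (rotation plus translation) estimate; the disclosure only states that the rotation angle, center, and shift amount are calculated from motion vectors, so the concrete estimation calls here are assumptions.

```python
import cv2
import numpy as np

def correct_deviation(reference_gray, target_frame):
    """Estimate rotation/translation between the reference image and the
    target image from sparse motion vectors, then warp the target so that
    vertical, horizontal, and rotation direction deviations are cancelled."""
    target_gray = cv2.cvtColor(target_frame, cv2.COLOR_BGR2GRAY)
    pts = cv2.goodFeaturesToTrack(reference_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=10)
    if pts is None:
        return target_frame  # nothing to track; leave the image as-is
    moved, status, _ = cv2.calcOpticalFlowPyrLK(reference_gray, target_gray,
                                                pts, None)
    ok = status.ravel() == 1
    if ok.sum() < 3:
        return target_frame
    # The rotation angle, center, and shift amount are captured here by a
    # similarity (partial affine) transform estimated from the matched vectors.
    matrix, _ = cv2.estimateAffinePartial2D(moved[ok], pts[ok])
    if matrix is None:
        return target_frame
    h, w = reference_gray.shape
    return cv2.warpAffine(target_frame, matrix, (w, h))
```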
(iv) Noise Reduction Process
In the process (first still portion extraction process) of the above item (1) and the process (second still portion extraction process) of the above item (2) according to the image processing method of the present embodiment, as described above, the still portion is extracted, for example, by using the motion vector. Thus, for example, a portion that can be determined, from the motion vector, to have a high degree of coincidence is likely to be extracted from among a plurality of still images. In other words, for example, when frame images constituting a moving image are the still images to be processed, a plurality of the still images to be processed may contain the extracted still portions (first still portion and second still portion).
In addition, when there is a plurality of still images that contain the extracted still portions (first still portion and second still portion) (e.g., B1 and B2 shown in
In this example, for example, in the process (first still portion extraction process) of the above item (1), when a plurality of still images containing the extracted first still portion is included in the plurality of still images to be processed, the image processing apparatus according to the present embodiment performs an addition of the plurality of still images. More specifically, in the above case, the image processing apparatus according to the present embodiment, for example, regards, as the first still portion, an image representing the signal obtained by averaging the signals of the region corresponding to the first still portion in the plurality of still images.
For example, as described above, the image processing apparatus according to the present embodiment can reduce noise, for example, as shown in B2 of
Further, the image processing apparatus according to the present embodiment can perform a process which is similar to the noise reduction process performed in the process (first still portion extraction process) of the above item (1), for example, in the process (second still portion extraction process) of the above item (2).
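A minimal sketch of the averaging described above, assuming the still portion is represented as a Boolean mask over frames that all contain it:

```python
import numpy as np

def averaged_still_portion(frames, still_mask):
    """When a plurality of still images contain the extracted still portion,
    average the signal of the corresponding region over those images to
    reduce noise, and use the averaged image as the still portion."""
    stack = np.stack([f.astype(np.float32) for f in frames], axis=0)
    averaged = stack.mean(axis=0)
    out = np.zeros_like(frames[0])
    out[still_mask] = averaged[still_mask].astype(frames[0].dtype)
    return out
```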
(v) Processing Time Control Process in First Still Portion Extraction Process
For example, in a case where the image processing apparatus according to the present embodiment has an image pickup function and obtains a plurality of still images to be processed in the process (first still portion extraction process) of the above item (1) by image capturing, the image processing apparatus according to the present embodiment may perform a processing time control process for controlling the processing time of the process (first still portion extraction process) of the above item (1) by controlling the image capturing.
Moreover, the processing time control process according to the present embodiment is not limited to, for example, the examples performed based on the setting value or user operation as shown in
Furthermore, for example, there may be a case where the image processing apparatus according to the present embodiment terminates the process (first still portion extraction process) of the above item (1) when it is detected that a user performed a particular operation as shown in B of
The image processing apparatus according to the present embodiment performs, for example, the processes described in the above items (i) to (v) in the above-mentioned process according to the image processing method of the present embodiment. In addition, the processes performed in the process according to the image processing method of the present embodiment are not limited to those described in the above items (i) to (v).
[4] Process According to Image Processing Method of Present Embodiment
Next, an example of the process according to the image processing method of the present embodiment will be described in more detail.
The image processing apparatus according to the present embodiment performs a progress display process (S100).
The image processing apparatus according to the present embodiment obtains a first frame of the image to be processed and generates a reduced image of the image (S200). In this example, the image processing apparatus according to the present embodiment generates the reduced image, for example, by a resize process for changing the size of the image to a set image size (size of the image to be processed > set image size). In addition, the image processing apparatus according to the present embodiment can use any resize process for changing the size of the image to be processed to the set image size.
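As noted for step S200, any resize process may be used; the following one-liner is merely an illustrative choice (the set image size and interpolation method are assumptions).

```python
import cv2

def make_reduced_image(frame, set_width=320, set_height=180):
    """Resize the image to be processed down to the set image size
    (size of the image to be processed > set image size)."""
    return cv2.resize(frame, (set_width, set_height),
                      interpolation=cv2.INTER_AREA)
```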
The image processing apparatus according to the present embodiment obtains a second frame of image to be processed, and generates a reduced image of the image in a similar way to step S200 (S202).
When steps S200 and S202 are completed, the image processing apparatus according to the present embodiment calculates a motion vector based on the generated reduced image (S204). In this example, the image processing apparatus according to the present embodiment, in step S204, calculates the motion vector by dividing the reduced image into blocks, for example, as described with reference to
The image processing apparatus according to the present embodiment extracts a still portion based on the motion vector calculated in step S204, and if there is a previously extracted portion, adds the extracted still portion to it (S206). Additionally, the image processing apparatus according to the present embodiment deletes a portion determined to have moved based on the motion vector from among the portions extracted in step S206 (S208). In this example, the process in step S208 corresponds to, for example, the process described with reference to
The image processing apparatus according to the present embodiment extracts a particular subject to be extracted, for example, by performing a face detection process, contour extraction process, and so on (S210).
The image processing apparatus according to the present embodiment changes the display of the portion of the image that has not been extracted as a still portion (first still portion), and causes the image indicating the extracted still portion (first still portion) to be displayed on a display screen (S212). In this case, examples of the method of changing the display in step S212 include, for example, the methods illustrated in
When the process in step S212 is performed, the image processing apparatus according to the present embodiment determines whether the entire particular subject to be extracted is extracted (S214). In this example, when it is determined that the entire contour of a particular subject is extracted, for example, based on a result obtained from the contour extraction process or the like, the image processing apparatus according to the present embodiment determines that the entire particular subject is extracted.
When it is not determined that the entire particular subject to be extracted is extracted in step S214, the image processing apparatus according to the present embodiment repeats the process from step S202.
Furthermore, when it is determined in step S214 that the entire particular subject to be extracted is extracted, the image processing apparatus according to the present embodiment obtains a single frame of the image to be processed, and generates a reduced image of the image, in a similar way to step S200 (S216).
When the process in step S216 is completed, the image processing apparatus according to the present embodiment calculates a motion vector based on the generated reduced image, in a similar way to step S204 (S218).
The image processing apparatus according to the present embodiment extracts a still portion based on the motion vector calculated in step S218, and adds a newly extracted still portion to the previously extracted still portion (S220). In addition, the image processing apparatus according to the present embodiment deletes a portion determined to have moved on the basis of the motion vector from among the portions extracted in step S220, in a similar way to step S208 (S222).
The image processing apparatus according to the present embodiment changes the display of the portion of the image not extracted as a still portion (the first still portion and second still portion), and causes the image indicating the extracted still portions (the first still portion and second still portion) to be displayed on a display screen (S224). In this example, examples of the method of changing the display in step S224 include, for example, the methods illustrated in
When the process in step S224 is performed, the image processing apparatus according to the present embodiment determines whether the entire image is extracted (S226). In this example, when there is no portion of the image that is not contained in either the first still portion or the second still portion, the image processing apparatus according to the present embodiment determines that the entire image is extracted.
When it is not determined in step S226 that the entire image is extracted, the image processing apparatus according to the present embodiment repeats the process from step S216. In addition, when it is determined in step S226 that the entire image is extracted, the image processing apparatus according to the present embodiment terminates the progress display process according to the present embodiment.
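Purely as an illustration of steps S200 to S226, the following sketch strings the above ideas together on reduced images; the thresholds, the reduced size, and the `show` callback are hypothetical, and the deletion step (S208/S222) is approximated with a second, larger threshold.

```python
import cv2
import numpy as np

def progress_display_process(frames, show, reduced_size=(320, 180),
                             still_eps=1.0, moved_eps=3.0):
    """Sketch of the progress display loop (steps S200 to S226): reduce each
    frame, accumulate still pixels, delete pixels later judged to have moved,
    gray out everything not yet extracted, and show each intermediate result.
    `frames` is an iterable of BGR images; `show` is a caller-supplied
    callable (e.g., a thin wrapper around cv2.imshow)."""
    it = iter(frames)
    prev_gray = cv2.cvtColor(cv2.resize(next(it), reduced_size),
                             cv2.COLOR_BGR2GRAY)                       # S200
    extracted = np.zeros(prev_gray.shape, dtype=bool)
    combined = np.zeros(prev_gray.shape + (3,), dtype=np.uint8)

    for frame in it:                                                   # S202/S216
        curr = cv2.resize(frame, reduced_size)
        curr_gray = cv2.cvtColor(curr, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)  # S204/S218
        mag = np.linalg.norm(flow, axis=2)

        new_still = (mag < still_eps) & ~extracted                     # S206/S220
        combined[new_still] = curr[new_still]
        extracted |= new_still

        moved = mag > moved_eps                                        # S208/S222
        extracted &= ~moved
        combined[moved] = 0

        progress = combined.copy()                                     # S212/S224
        progress[~extracted] = 128        # gray out the unextracted portion
        show(progress)

        if extracted.all():                                            # S226
            break
        prev_gray = curr_gray
    return combined
```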
The image processing apparatus according to the present embodiment performs, for example, the process illustrated in
Referring again to
The image processing apparatus according to the present embodiment obtains a first frame of image to be processed (S300). In this case, the image obtained in step S300 is an image that corresponds to the reduced image generated in step S200 of
The image processing apparatus according to the present embodiment obtains a single image from among the second and subsequent frames of the image to be processed (S302). In this case, the image obtained in step S302 is an image that corresponds to the reduced image generated in step S202 of
When the processes in steps S300 and S302 are completed, the image processing apparatus according to the present embodiment calculates a motion vector based on the obtained images, in a similar way to step S204 of
The image processing apparatus according to the present embodiment extracts a still portion based on the motion vector calculated in step S304, and if there is a previously extracted portion, adds the extracted still portion to it (S306). In addition, the image processing apparatus according to the present embodiment deletes a portion determined to have moved based on the motion vector from among the portions extracted in step S306, in a similar way to step S208 of
The image processing apparatus according to the present embodiment extracts a particular subject to be extracted, for example, by performing the face detection process, contour extraction process, or the like (S310).
When the process in step S310 is performed, the image processing apparatus according to the present embodiment determines whether the entire particular subject to be extracted is extracted in a similar way to step S214 of
Furthermore, when it is determined in step S312 that the entire particular subject to be extracted is extracted, the image processing apparatus according to the present embodiment obtains a single frame of the image to be processed (S314). In this case, the image obtained in step S314 is an image that corresponds to the reduced image generated in step S216 of
When the process in step S314 is completed, the image processing apparatus according to the present embodiment calculates a motion vector based on the obtained image in a similar way to step S304 (S316).
The image processing apparatus according to the present embodiment extracts a still portion based on the motion vector calculated in step S316, and adds a newly extracted still portion to the previously extracted still portion (S318). In addition, the image processing apparatus according to the present embodiment deletes a portion determined to have moved based on the motion vector from among the portions extracted in step S318, in a similar way to step S308 (S320).
When the process in step S320 is performed, the image processing apparatus according to the present embodiment determines whether the entire image is extracted (S322). In this case, when there is no portion of the image that is not contained in either the first still portion or the second still portion, the image processing apparatus according to the present embodiment determines that the entire image is extracted.
When it is not determined in step S322 that the entire image is extracted, the image processing apparatus according to the present embodiment repeats the process from step S314. In addition, when it is determined in step S322 that the entire image is extracted, the image processing apparatus according to the present embodiment terminates the image processing according to the present embodiment.
The image processing apparatus according to the present embodiment performs, for example, the process illustrated in
The image processing apparatus according to the present embodiment performs, for example, the process illustrated in
(Image Processing Apparatus According to Present Embodiment)
Next, an example of the configuration of the image processing apparatus according to the present embodiment capable of performing the process according to the image processing method of the present embodiment described above will be described.
Further, the image processing apparatus 100 may include, for example, a ROM (Read Only Memory, not shown), a RAM (not shown), a storage unit (not shown), an operation unit operable by a user (not shown), a display unit for displaying various screens on a display screen (not shown), and so on. The image processing apparatus 100 connects the above-mentioned components to one another, for example, via a bus that functions as a data transmission path.
In this example, the ROM (not shown) stores control data such as a program or operation parameter used by the controller 104. The RAM (not shown) temporarily stores a program or the like executed by the controller 104.
The storage unit (not shown) is a storage device provided in the image processing apparatus 100, and stores a variety of data such as image data or applications. In this example, an example of the storage unit (not shown) may include, for example, a magnetic recording medium such as a hard disk, and a nonvolatile memory such as an EEPROM (Electrically Erasable and Programmable Read Only Memory) and a flash memory. In addition, the storage unit (not shown) may be removable from the image processing apparatus 100.
[Exemplary Hardware Configuration of Image Processing Apparatus 100]
The MPU 150 is configured to include, for example, an MPU (Micro Processing Unit), various processing circuits, and the like, and functions as the controller 104 for controlling the entire image processing apparatus 100. Additionally, in the image processing apparatus 100, the MPU 150 functions as an extraction unit 110, a combining unit 112, a display control unit 114, and a recording processing unit 116, which are described later.
The ROM 152 stores control data such as a program or operation parameter used by the MPU 150. The RAM 154 temporarily stores, for example, a program or the like executed by the MPU 150.
The recording medium 156 functions as a storage unit (not shown), and stores, for example, various data such as applications and data constituting an image to be operated. In this example, an example of the recording medium 156 may include, for example, a magnetic recording medium such as a hard disk and a nonvolatile memory such as a flash memory. In addition, the recording medium 156 may be removable from the image processing apparatus 100.
The input/output interface 158 is connected to, for example, the operation input device 160 and the display device 162. The operation input device 160 functions as an operation unit (not shown). The display device 162 functions as a display unit (not shown). In this example, an example of the input/output interface 158 may include, for example, a USB (Universal Serial Bus) terminal, a DVI (Digital Visual Interface) terminal, an HDMI (High-Definition Multimedia Interface) terminal, various processing circuits, and so on. In addition, the operation input device 160 is provided on the image processing apparatus 100, and is connected to the input/output interface 158 within the image processing apparatus 100. An example of the operation input device 160 may include, for example, a button, a direction key, a rotational selector such as a jog dial, or a combination thereof. In addition, for example, the display device 162 is provided on the image processing apparatus 100, and is connected to the input/output interface 158 within the image processing apparatus 100. An example of the display device 162 may include, for example, a liquid crystal display (LCD), an organic EL display (organic Electro-Luminescence display; often referred to as an OLED display (Organic Light Emitting Diode display)), and so on.
Further, the input/output interface 158 may be connected to an external device such as an operation input device (e.g., a keyboard or mouse) or a display device that serves as external equipment of the image processing apparatus 100. In addition, the display device 162 may be, for example, a device that can display information and be operated by a user, such as a touch screen.
The communication interface 164 is a communication device provided in the image processing apparatus 100, and functions as the communication unit 102 for performing communication with an external device such as an image pickup device or a display device via a network (or directly) on a wired or wireless connection. In this example, an example of the communication interface 164 may include a communication antenna and RF (radio frequency) circuit (wireless communication), an IEEE 802.15.1 port and transmission/reception circuit (wireless communication), an IEEE 802.11b port and transmission/reception circuit (wireless communication), a LAN (Local Area Network) terminal and transmission/reception circuit (wired communication), and so on. In addition, an example of the network according to the present embodiment may include a wired network such as a LAN or WAN (Wide Area Network), a wireless network such as a wireless LAN (WLAN: Wireless Local Area Network) or a wireless WAN (WWAN: Wireless Wide Area Network) via a base station, and the Internet using a communication protocol such as TCP/IP (Transmission Control Protocol/Internet Protocol).
The image processing apparatus 100 performs, for example, the process according to the image processing method of the present embodiment by means of the configuration shown in
In this example, an example of the image pickup device may include a lens/imaging element and a signal processing circuit. The lens/imaging element is configured to include an optical lens and an image sensor that uses a plurality of imaging elements such as a CMOS (Complementary Metal Oxide Semiconductor). In addition, the signal processing circuit may include an AGC (Automatic Gain Control) circuit or an ADC (Analog to Digital Converter). The signal processing circuit converts analog signals generated by the imaging elements into digital signals (image data), and performs various types of signal processing. An example of the signal processing performed by the signal processing circuit may include a white balance correction process, a color tone correction process, a gamma correction process, a YCbCr conversion process, an edge enhancement process, and so on.
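The present embodiment does not tie the signal processing circuit to any particular implementation; purely for illustration, the following Python sketch shows what a gain, gamma correction, and RGB-to-YCbCr conversion step might look like for normalized RGB data. The function name, the fixed gain standing in for AGC, and the use of the common BT.601 full-range coefficients are assumptions made for this example only.

    import numpy as np

    def simple_signal_processing(raw_rgb, gain=1.0, gamma=2.2):
        """Illustrative digital post-processing: gain, gamma correction, RGB->YCbCr.

        `raw_rgb` is assumed to be a float array with shape (H, W, 3) and values
        in [0, 1]; the fixed `gain` is a stand-in for an AGC stage.
        """
        img = np.clip(raw_rgb * gain, 0.0, 1.0)      # gain (AGC stand-in)
        img = img ** (1.0 / gamma)                   # gamma correction
        r, g, b = img[..., 0], img[..., 1], img[..., 2]
        # RGB -> YCbCr using the common BT.601 full-range coefficients.
        y = 0.299 * r + 0.587 * g + 0.114 * b
        cb = 0.5 - 0.168736 * r - 0.331264 * g + 0.5 * b
        cr = 0.5 + 0.5 * r - 0.418688 * g - 0.081312 * b
        return np.stack([y, cb, cr], axis=-1)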
Furthermore, for example, when the image processing apparatus 100 is configured to perform the process on a stand-alone basis, the image processing apparatus 100 may not include the communication interface 164. In addition, the image processing apparatus 100 may be configured without the operation input device 160 or the display device 162.
Referring again to
The presence of the communication unit 102 in the image processing apparatus 100 makes it possible for the image processing apparatus 100 to receive data indicating an image captured by an image pickup device serving as an external device, or data indicating an image stored in an external device (e.g., a server, a user terminal such as a mobile phone or smart phone, and so on), and to process an image indicated by the received data. In addition, the image processing apparatus 100 may transmit, for example, the processed image (combined image) to an external device. Thus, the presence of the communication unit 102 in the image processing apparatus 100 makes it possible to realize an image processing system including the image processing apparatus 100 and an external device (an image pickup device, a server, a user terminal such as a mobile phone or smart phone). In addition, the realization of the image processing system makes it possible to reduce the processing load on the external device, because, for example, the external device such as a user terminal need not perform the process according to the image processing method of the present embodiment.
In this example, an example of the communication unit 102 may include a communication antenna and RF circuit, a LAN terminal and transmission/reception circuit, and so on, but the configuration of the communication unit 102 is not limited thereto. For example, the communication unit 102 may have a configuration corresponding to any standard capable of performing communication, such as a USB port and transmission/reception circuit, or a configuration capable of communicating with an external device via a network.
The controller 104 is configured to include, for example, an MPU or various processing circuits, and controls the entire image processing apparatus 100. In addition, the controller 104 may include an extraction unit 110, a combining unit 112, a display control unit 114, and a recording processing unit 116. The controller 104 plays a leading role in controlling the process according to the image processing method of the present embodiment.
The extraction unit 110 plays a leading role in performing the process (first still portion extraction process) of the above item (1) and the process (second still portion extraction process) of the above item (2). The extraction unit 110 extracts a first still portion based on a plurality of still images, and extracts at least a part of a second still portion based on the plurality of still images. In addition, the extraction unit 110 includes, for example, a first still portion extraction unit 118 and a second still portion extraction unit 120.
The first still portion extraction unit 118 plays a leading role in performing the process (first still portion extraction process) of the above item (1), and extracts the first still portion in an image based on the plurality of still images. In this example, the still image processed by the first still portion extraction unit 118 may be one or more of an image captured by an image pickup device, a still image stored in a recording medium, and a frame image partially constituting a moving image stored in a recording medium. In addition, when the image processing apparatus 100 includes an image pickup device, the first still portion extraction unit 118 may process an image captured by the image pickup device. In other words, in this case, the image processing apparatus 100 can process an image captured by the image processing apparatus itself functioning as the image pickup device.
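The concrete algorithm of the first still portion extraction process is set forth elsewhere in this disclosure; purely as an illustrative stand-in, the following Python sketch treats pixels whose values stay nearly constant across a stack of aligned still images as still. The function name, the peak-to-peak test, and the threshold are assumptions for this example, not the disclosed process.

    import numpy as np

    def extract_first_still_portion(stills, diff_threshold=10.0):
        """Hypothetical stand-in for the first still portion extraction process.

        `stills` is assumed to be a list of aligned grayscale frames (H, W)
        with values in [0, 255]. Pixels whose values vary little across the
        stack are treated as still; everything else is left unextracted.
        """
        stack = np.stack([s.astype(np.float32) for s in stills], axis=0)
        variation = stack.max(axis=0) - stack.min(axis=0)   # per-pixel peak-to-peak
        still_mask = variation < diff_threshold
        reference = stack[0]
        # Unextracted pixels are marked with NaN as a placeholder.
        first_still_portion = np.where(still_mask, reference, np.nan)
        return first_still_portion, still_mask

The boolean mask returned here is what the later sketches assume is handed from the first still portion extraction unit 118 to the second still portion extraction unit 120 and the combining unit 112; this representation is an assumption of the examples.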
Further, the first still portion extraction unit 118 transmits, for example, data indicating the extracted first still portion to the second still portion extraction unit 120 and the combining unit 112.
Moreover, when the image processing apparatus 100 causes a progress display based on a reduced image to be displayed on a display screen as the display control process according to the present embodiment, the first still portion extraction unit 118 may, for example, generate a reduced image of the still image to be processed, and extract a first still portion corresponding to the reduced image. In this case, the first still portion extraction unit 118 transmits, for example, data indicating the first still portion corresponding to the extracted reduced image to the second still portion extraction unit 120 and the combining unit 112.
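As a minimal sketch of generating such a reduced image (assuming simple integer subsampling rather than any particular resizing filter; the function name and reduction factor are hypothetical):

    def make_reduced_image(image, factor=4):
        """Reduce an image by integer subsampling for the progress display.

        `image` is assumed to be a NumPy array; a practical implementation
        would more likely use a proper low-pass resize filter.
        """
        return image[::factor, ::factor]

The first still portion extraction would then be run on the reduced image instead of the full-size still image to be processed.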
The second still portion extraction unit 120 plays a leading role in performing the process (second still portion extraction process) of the above item (2), and extracts a second still portion in an image based on a plurality of still images. In this example, an example of the still image processed by the second still portion extraction unit 120 may include one or more of an image captured by an image pickup device, a still image stored in a recording medium, and a frame image partially constituting a moving image stored in a recording medium. In addition, when the image processing apparatus 100 includes an image pickup device, the second still portion extraction unit 120 may process an image captured by the image pickup device, in a similar manner to the first still portion extraction unit 118.
Further, the second still portion extraction unit 120 transmits, for example, data indicating the extracted second still portion to the combining unit 112.
Furthermore, when a progress display based on a reduced image is displayed on a display screen by performing the display control process according to the present embodiment in the image processing apparatus 100, the second still portion extraction unit 120 may, for example, generate a reduced image of the still image to be processed, and extract a second still portion corresponding to the reduced image. In this case, each time the number of the still images to be processed increases, the second still portion extraction unit 120 extracts, based on a reduced image of the still image to be processed, a new second still portion corresponding to the reduced image. In addition, the second still portion extraction unit 120 may transmit, for example, data indicating the second still portion corresponding to the extracted reduced image to the combining unit 112.
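Purely for illustration, and reusing the stillness test assumed in the earlier sketch, an incremental extraction that is rerun each time the number of still images to be processed increases might look like the following; the class name and threshold are hypothetical, not part of the disclosed process.

    import numpy as np

    class SecondStillPortionTracker:
        """Illustrative incremental extraction of the second still portion.

        Assumes the first still portion is given as a boolean mask and that
        all frames are already aligned to one another.
        """

        def __init__(self, first_still_mask, diff_threshold=10.0):
            self.first_still_mask = first_still_mask
            self.diff_threshold = diff_threshold
            self.frames = []

        def add_frame(self, frame):
            """Add a new still image; return the newly estimated second still mask."""
            self.frames.append(frame.astype(np.float32))
            stack = np.stack(self.frames, axis=0)
            variation = stack.max(axis=0) - stack.min(axis=0)
            # (With a single frame every pixel trivially appears still; a real
            # implementation would wait for more frames before deciding.)
            still = variation < self.diff_threshold
            # Keep only the part that was not extracted as the first still portion.
            return np.logical_and(still, np.logical_not(self.first_still_mask))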
The extraction unit 110 includes, for example, the first still portion extraction unit 118 and the second still portion extraction unit 120, and thus plays a leading role in performing the process (first still portion extraction process) of the above item (1) and the process (second still portion extraction process) of the above item (2). In addition,
The combining unit 112 plays a leading role in performing the process (combined image generation process) of the above item (3). In addition, the combining unit 112 combines the first still portion indicated by data transmitted from the first still portion extraction unit 118 and the second still portion indicated by data transmitted from the second still portion extraction unit 120 to generate a combined image.
Further, when the image processing apparatus 100 performs the display control process according to the present embodiment so that a progress display based on a reduced image is displayed on a display screen, the combining unit 112 may combine the first still portion corresponding to the reduced image and the second still portion corresponding to the reduced image, and then generate a combined image corresponding to the reduced image. In this case, for example, each time a new second still portion corresponding to the reduced image is extracted in the second still portion extraction unit 120, the combining unit 112 may combine the first still portion corresponding to the reduced image or the previously generated combined image corresponding to the reduced image and a newly extracted second still portion corresponding to the reduced image, and then generate a new combined image corresponding to the reduced image.
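As a minimal sketch of this compositing step (assuming, as in the earlier sketches, that the portions are exchanged as images together with boolean masks; this representation is an assumption of the examples, not something specified by the present embodiment):

    import numpy as np

    def combine(previous, second_portion, second_mask):
        """Overwrite the previous result with newly extracted second-still pixels.

        `previous` starts out as the first still portion (or an earlier combined
        image); `second_mask` marks the pixels supplied by the newly extracted
        second still portion.
        """
        combined = previous.copy()
        combined[second_mask] = second_portion[second_mask]
        return combined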
Furthermore, the combining unit 112 transmits, for example, data indicating the combined image to the display control unit 114.
Moreover, the combining unit 112 transmits, for example, the generated combined image to the recording processing unit 116.
The display control unit 114 plays a leading role in performing the display control process according to the image processing method of the present embodiment, and allows the combined image indicated by the data transmitted from the combining unit 112 to be displayed on a display screen. For example, each time the combining unit 112 generates a combined image (every time data indicating a combined image is transmitted), the display control unit 114 causes the generated combined image to be displayed on a display screen. In this case, an example of the display screen on which the combined image is displayed by the display control unit 114 may include a display screen of a display unit (not shown) provided in the image processing apparatus 100, and a display screen of an external device connected via a network (or directly) on a wired or wireless connection.
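Tying the sketches above together, one hypothetical driving loop that hands every newly generated combined image to a display callback could look like the following; it reuses extract_first_still_portion, SecondStillPortionTracker, and combine from the earlier sketches, and display_fn stands in for whatever actually draws on the display screen.

    def run_with_progress_display(stills, display_fn):
        """Generate combined images incrementally and display each one.

        `stills` is assumed to be a list of at least two aligned grayscale frames.
        """
        first_portion, first_mask = extract_first_still_portion(stills[:2])
        tracker = SecondStillPortionTracker(first_mask)
        combined = first_portion
        for frame in stills:
            second_mask = tracker.add_frame(frame)
            combined = combine(combined, frame, second_mask)
            display_fn(combined)   # displayed each time a combined image is generated
        return combined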
The recording processing unit 116 records the combined image transmitted from the combining unit 112 (that is, the combined image generated in the combining unit 112) on a recording medium at a predetermined timing.
In this example, an example of the recording medium on which the combined image is recorded by the recording processing unit 116 may include a storage unit (not shown), a removable external recording medium connected to the image processing apparatus 100, and a recording medium provided in an external device connected via a network (or directly) on a wired or wireless connection. When the combined image is recorded on a recording medium provided in an external device, the recording processing unit 116 causes the combined image to be recorded on the recording medium provided in the external device, for example, by transmitting data indicating the combined image and a recording instruction for recording the combined image to the external device.
Moreover, an example of the timing at which the combined image is recorded by the recording processing unit 116 may include a timing at which a final combined image is obtained, and a timing before a final combined image is obtained, that is, a predetermined timing at an intermediate stage of the process (combined image generation process) of the above item (3). For example, an example of the predetermined timing at an intermediate stage of the process (combined image generation process) of the above item (3) may include a timing set according to the expected time at which a final combined image is obtained, as described above, and a timing set according to the status of progress in the process (or the percentage of completion of a combined image).
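As a hedged sketch of one such timing policy (the completion-ratio and elapsed-time steps are example values, not values taken from the present embodiment, and the class name is hypothetical):

    class RecordingScheduler:
        """Hypothetical policy for when an intermediate combined image is recorded."""

        def __init__(self, ratio_step=0.25, time_step_s=5.0):
            self.ratio_step = ratio_step    # record every 25% of completion (assumed)
            self.time_step_s = time_step_s  # or every 5 seconds of elapsed time (assumed)
            self._last_ratio = 0.0
            self._last_time = 0.0

        def should_record(self, completion_ratio, elapsed_s):
            """Return True when the combined image should be written to the medium."""
            if completion_ratio - self._last_ratio >= self.ratio_step:
                self._last_ratio = completion_ratio
                return True
            if elapsed_s - self._last_time >= self.time_step_s:
                self._last_time = elapsed_s
                return True
            return False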
The controller 104 includes, for example, the extraction unit 110, the combining unit 112, the display control unit 114, and the recording processing unit 116, and thus plays a leading role in performing the process according to the image processing method of the present embodiment.
Furthermore, the configuration of the controller according to the present embodiment is not limited to the above example. For example, the controller according to the present embodiment may not include the display control unit 114 and/or the recording processing unit 116. Even without the display control unit 114 and/or the recording processing unit 116, the controller according to the present embodiment can play a leading role in performing the process (first still portion extraction process) of the above item (1) to the process (combined image generation process) of the above item (3).
The image processing apparatus 100 performs the process according to the image processing method of the present embodiment (e.g., the process (first still portion extraction process) of the above item (1) to the process (combined image generation process) of the above item (3)), for example, by the configuration shown in
Furthermore, the configuration of the image processing apparatus according to the present embodiment is not limited to the configuration shown in
Moreover, the image processing apparatus according to the present embodiment may be configured without an image pickup device (not shown). The image processing apparatus according to the present embodiment, when it includes an image pickup device (not shown), can perform the process according to the image processing method of the present embodiment based on the captured image generated by performing the image capturing in the image pickup device (not shown).
Furthermore, the image processing apparatus according to the present embodiment, when it is configured to perform the process on a stand-alone basis, may not include the communication unit 102.
As described above, the image processing apparatus according to the present embodiment performs, for example, the process (first still portion extraction process) of the above item (1) to the process (combined image generation process) of the above item (3) as the process according to the image processing method of the present embodiment. In this example, the image processing apparatus according to the present embodiment combines the first still portion extracted in the process (first still portion extraction process) of the above item (1) and the second still portion extracted in the process (second still portion extraction process) of the above item (2). In this case, unlike the related art, which obtains a combined image by using a simple arithmetic mean, the image processing apparatus according to the present embodiment extracts a still portion based on a motion vector. This motion vector is calculated based on a plurality of still images in each of the process (first still portion extraction process) of the above item (1) and the process (second still portion extraction process) of the above item (2). Thus, as the example shown in
Therefore, the image processing apparatus according to the present embodiment can obtain an image from which a moving object is removed based on a plurality of still images.
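Purely as an illustrative contrast, the following Python sketch uses OpenCV's dense optical flow as one possible source of motion vectors and classifies pixels as still when the motion vector is small; the threshold and flow parameters are example values, not values from the present embodiment, and the related art's simple arithmetic mean is shown in a comment for comparison.

    import cv2
    import numpy as np

    def still_mask_from_motion_vectors(prev_gray, next_gray, magnitude_threshold=1.0):
        """Classify pixels as still when their estimated motion vector is small.

        `prev_gray` and `next_gray` are assumed to be aligned single-channel
        8-bit images of the same size.
        """
        flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        magnitude = np.linalg.norm(flow, axis=2)   # per-pixel motion vector length
        return magnitude < magnitude_threshold

    # The related art's simple arithmetic mean, by contrast, blends every frame:
    #   mean_image = np.mean(np.stack(frames, axis=0), axis=0)
    # which leaves ghosting wherever an object rarely moves.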
Further, an example of the still image processed by the image processing apparatus according to the present embodiment may include one or more of an image captured by an image pickup device (the image processing apparatus itself, or an external device), a still image stored in a recording medium, and a frame image partially constituting a moving image stored in a recording medium. The still images stored in a recording medium may include a combined image according to the present embodiment. Thus, even when a combined image that contains the unnecessary subject A1 as shown in A of
As described above, although the present embodiment has been described with reference to an exemplary image processing apparatus, the present embodiment is not limited to the illustrative embodiments set forth herein. The present embodiment is applicable to, for example, various kinds of equipment capable of processing an image. Examples of such equipment include a communication device such as a mobile phone or smart phone, a video/music player (or video/music recording and reproducing apparatus), a game machine, a computer such as a PC (Personal Computer) or a server, a display device such as a television receiver, and an image pickup device such as a digital camera. In addition, the present embodiment is applicable to, for example, a processing IC (Integrated Circuit) capable of being incorporated into such equipment.
Further, the process according to the image processing method of the present embodiment may be realized, for example, by an image processing system including a plurality of devices based on the connection to a network (or communication between devices), such as cloud computing.
(Program According to Present Embodiment)
It is possible to obtain an image from which a moving object is removed based on a plurality of still images, by executing a program for causing a computer to function as the image processing apparatus according to the present embodiment in the computer. An example of the program may include a program capable of executing the process of the image processing apparatus according to the present embodiment, such as the process (first still portion extraction process) of the above item (1) to the process (combined image generation process) of the above item (3).
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
For example, although the above description illustrates providing a program for causing a computer to function as the image processing apparatus according to the present embodiment, embodiments of the present disclosure may further provide a storage medium in which the above-described program is stored.
The above-described configurations are illustrative examples of embodiments of the present disclosure and, as a matter of course, belong to the technical scope of the present disclosure.
Additionally, the present technology may also be configured as below.
(1) An image processing apparatus including:
an extraction unit for extracting a first still portion in an image based on a plurality of still images, and for extracting at least a part of a second still portion corresponding to a portion that is not extracted as the first still portion in an image based on the plurality of still images; and
a combining unit for combining the first still portion and the second still portion to generate a combined image.
(2) The image processing apparatus according to (1), wherein the extraction unit extracts a still portion that contains a subject to be extracted as the first still portion.
(3) The image processing apparatus according to (1) or (2), further including:
a display control unit for causing the combined image to be displayed on a display screen.
(4) The image processing apparatus according to (3), wherein the extraction unit extracts a new second still portion each time a number of the still images to be processed increases,
wherein the combining unit, each time the second still portion is newly extracted, combines the first still portion or a previously generated combined image with the newly extracted second still portion to generate a new combined image, and
wherein the display control unit, each time the combined image is generated, causes the generated combined image to be displayed on a display screen.
(5) The image processing apparatus according to (4), further including:
a recording processing unit for recording the combined image generated in the combining unit at a predetermined timing.
(6) The image processing apparatus according to (3), wherein the extraction unit extracts, based on a reduced image obtained by reducing a size of a still image to be processed, the first still portion corresponding to the reduced image, and extracts, based on the reduced image, a new second still portion corresponding to the reduced image each time a number of the still images to be processed increases,
wherein the combining unit, each time the second still portion corresponding to the reduced image is newly extracted, combines the first still portion corresponding to the reduced image or a previously generated combined image corresponding to the reduced image with the newly extracted second still portion corresponding to the reduced image to generate a new combined image corresponding to the reduced image, and
wherein the display control unit, each time the new combined image corresponding to the reduced image is generated, causes the generated combined image to be displayed on a display screen.
(7) The image processing apparatus according to (4) or (6), wherein the display control unit applies a color to a portion that is included in neither the first still portion nor the second still portion in the combined image, and causes a combined image to which the color is applied to be displayed.
(8) The image processing apparatus according to (1) or (2), wherein the extraction unit extracts a still portion based on motion vectors of a plurality of still images.
(9) The image processing apparatus according to any one of (1) to (8), wherein the extraction unit corrects one or more deviations of a vertical direction, a horizontal direction, and a rotation direction in a still image to be processed, and extracts at least one of the first still portion or the second still portion.
(10) The image processing apparatus according to any one of (1) to (9), wherein the still image to be processed by the extraction unit is one or more of an image captured by an image pickup device, a still image stored in a recording medium, and a frame image partially constituting a moving image stored in a recording medium.
(11) The image processing apparatus according to any one of (1) to (10), wherein the extraction unit, when a plurality of still images to be processed include a plurality of still images having the extracted first still portion, regards an image indicated by a signal obtained by averaging a signal that corresponds to a region corresponding to the first still portion in the plurality of still images as the first still portion.
(12) The image processing apparatus according to any one of (1) to (11), wherein the extraction unit, when the extraction of the first still portion is completed, notifies a user that the extraction of the first still portion is completed.
(13) The image processing apparatus according to any one of (1) to (12), further including:
an image pickup unit for capturing an image,
wherein the still image to be processed by the extraction unit includes an image captured by the image pickup unit.
(14) An image processing method including:
extracting a first still portion in an image based on a plurality of still images;
extracting at least a part of a second still portion corresponding to a portion that is not extracted as the first still portion in an image based on the plurality of still images; and
combining the first still portion and the second still portion to generate a combined image.
(15) The image processing method according to (14), wherein the step of extracting the first still portion includes extracting a still portion that contains a subject to be extracted as the first still portion.
(16) The image processing method according to (14) or (15), wherein the step of extracting the first still portion and the step of extracting the second still portion include calculating a motion vector based on a plurality of still images and extracting a still portion based on the calculated motion vector.
(17) The image processing method according to any one of (14) to (16), further including:
causing the combined image to be displayed on a display screen.
(18) The image processing method according to any one of (14) to (17), wherein the step of extracting the first still portion includes, when the extraction of the first still portion is completed, notifying a user that the extraction of the first still portion is completed.
(19) The image processing method according to any one of (14) to (18), further including:
capturing an image,
wherein a still image to be processed in the step of extracting the first still portion and the step of extracting the second still portion includes an image captured in the capturing step.
(20) A program for causing a computer to execute:
extracting a first still portion in an image based on a plurality of still images;
extracting at least a part of a second still portion corresponding to a portion that is not extracted as the first still portion in an image based on the plurality of still images; and
combining the first still portion and the second still portion to generate a combined image.
The present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2012-021894 filed in the Japan Patent Office on Feb. 3, 2012, the entire content of which is hereby incorporated by reference.