This application is based upon and claims the benefit of priority from the prior Japanese Patent Application No. 2013-061071, filed Mar. 22, 2013, the entire contents of which are incorporated herein by reference.
1. Field of the Invention
The present invention relates to an image acquisition apparatus, an image acquisition method, and a recording medium, each of which stops acquisition of images.
2. Related Art
As conventional technology, Japanese Unexamined Patent Application, Publication No. 2010-141517 discloses technology for automatically terminating acquisition of a moving image when a predetermined period of time has elapsed after initiating the acquisition of the moving image.
An aspect of an image acquisition apparatus according to the present invention is an image acquisition apparatus, including:
an acquisition unit that sequentially acquires images consecutively captured by an image capture unit while the image capture unit is moved;
a detection unit that detects a predetermined subject in the images acquired by the acquisition unit; and
a stop unit that stops acquisition of the images by the acquisition unit when a predetermined subject is detected by the detection unit.
Furthermore, an aspect of a method of acquiring images according to the present invention is a method of acquiring images, including:
sequentially acquiring images that are consecutively captured by an image capture unit while the image capture unit is moved;
detecting a predetermined subject in the images acquired in the step of acquiring images; and
stopping the acquiring of the images when a predetermined subject is detected in the step of detecting a subject.
Moreover, an aspect of a non-transitory storage medium according to the present invention is a non-transitory storage medium encoded with a computer-readable program that enables a computer to execute functions as:
an acquisition unit that sequentially acquires images consecutively captured by an image capture unit while the image capture unit is moved;
a detection unit that detects a predetermined subject in the images acquired by the acquisition unit; and
a stop unit that stops acquisition of the images by the acquisition unit when a predetermined subject is detected by the detection unit.
Embodiments of the present invention are hereinafter described with reference to the drawings.
An image acquisition apparatus 1 is configured as a digital camera, for example.
The image acquisition apparatus 1 includes a CPU (Central Processing Unit) 11, ROM (Read Only Memory) 12, RAM (Random Access Memory) 13, a bus 14, an input/output interface 15, an image capture unit 16, an input unit 17, an output unit 18, a storage unit 19, a communication unit 20, and a drive 21.
The CPU 11 executes various processing according to programs that are recorded in the ROM 12, or programs that are loaded from the storage unit 19 to the RAM 13.
The RAM 13 also stores data and the like necessary for the CPU 11 to execute the various processing, as appropriate.
The CPU 11, the ROM 12 and the RAM 13 are connected to one another via the bus 14. The input/output interface 15 is also connected to the bus 14. The image capture unit 16, the input unit 17, the output unit 18, the storage unit 19, the communication unit 20 and the drive 21 are connected to the input/output interface 15.
The image capture unit 16 includes an optical lens unit and an image sensor (not illustrated).
In order to photograph a subject, the optical lens unit is configured by lenses that condense light, such as a focus lens and a zoom lens.
The focus lens is a lens for forming an image of a subject on the light receiving surface of the image sensor. The zoom lens is a lens that freely changes the focal length within a certain range.
The optical lens unit also includes peripheral circuits to adjust setting parameters such as focus, exposure, white balance, and the like, as necessary.
The image sensor is configured by an optoelectronic conversion device, an AFE (Analog Front End), and the like.
The optoelectronic conversion device is configured by a CMOS (Complementary Metal Oxide Semiconductor) type of optoelectronic conversion device and the like, for example. Light incident through the optical lens unit forms an image of a subject in the optoelectronic conversion device. The optoelectronic conversion device optoelectronically converts (i.e. captures) the image of the subject, accumulates the resultant image signal for a predetermined time interval, and sequentially supplies the image signal as an analog signal to the AFE.
The AFE executes a variety of signal processing such as A/D (Analog/Digital) conversion processing on the analog image signal. The variety of signal processing generates a digital signal that is output as an output signal from the image capture unit 16.
Such an output signal of the image capture unit 16 is hereinafter referred to as “image data”. The image data is supplied to the CPU 11, the RAM 13 and the like, as appropriate.
The input unit 17 is configured by various buttons and the like, and inputs various information in accordance with instruction operations of the user.
The output unit 18 is configured by a display, speaker, etc., and outputs images and sound.
The storage unit 19 is configured by a hard disk or DRAM (Dynamic Random Access Memory) or the like, and stores data of various images.
The communication unit 20 controls communication with other devices (not shown) via networks including the Internet.
A removable medium 31 composed of a magnetic disk, an optical disk, a magneto-optical disk, semiconductor memory or the like is installed in the drive 21, as appropriate. Programs that are read via the drive 21 from the removable medium 31 are installed in the storage unit 19, as necessary. Similarly to the storage unit 19, the removable medium 31 can also store a variety of data such as the image data stored in the storage unit 19.
The image acquisition apparatus 1 sequentially acquires image data that is consecutively captured, and determines the presence or absence of a predetermined subject each time the image data is acquired. In a case of determining that the predetermined subject is present, the image acquisition apparatus 1 executes processing for stopping the acquisition of image data.
This sequence of processing is hereinafter referred to as the image acquisition stop processing.
Examples assumed herein as the consecutively captured images include: continuously captured images in wide-shot photographing; continuously captured images in panoramic photographing; and a plurality of images composing a moving image. In the wide-shot photographing, a wide-angle image is composited from the consecutively captured images by moving a field angle (image capture range) of a digital camera in vertical and horizontal directions. In the panoramic photographing, a wide-angle image is composited from the consecutively captured images by moving the digital camera in a single predetermined direction.
In the present embodiment, images targeted for composition by an acquired-image composition unit 47 (to be described later) in the wide-shot photographing are referred to as "target images for composition". A wide-angle image composited from the target images for composition is referred to as a "wide-shot image". Furthermore, a portion of the wide-shot image remaining after the trimming processing by a trimming unit 48 (to be described later) is referred to as a "trimmed image".
The wide-shot photographing is described in the present embodiment; however, the present embodiment is not limited thereto, and is also applicable to the panoramic photographing and the moving-image photographing.
As shown in
However,
When the image acquisition stop processing is executed, an acquired-image storage unit 61 and a face image storage unit 62 are used, which are provided as an area of the storage unit 19.
Image data acquired by the image acquisition unit 43 (to be described later) is stored in the acquired-image storage unit 61.
A face image table shown in the figure is stored in the face image storage unit 62. In the face image table, each registered face image is associated with a record designation and a priority.
When a corresponding face image is detected, the record designation serves as a flag for determining whether the image including that face image is used as a target image for wide-shot composition. The record designation flag is either ON or OFF. When at least one face image having the record designation ON is detected in an image, the image including that face image is used as a target image for wide-shot composition. Conversely, when only face images having the record designation OFF are detected in an image, the image is not used as a target image for wide-shot composition. In the present embodiment, after an instruction is provided to start the wide-shot photographing, live-view images are sequentially captured, and storage of target images for wide-shot composition starts at the timing when at least one face image having the record designation ON is detected in a live-view image.
The priority is a value representing the priority order of face images to be included in the trimmed image in a case in which at least two face images are present in a wide-shot image. Priority 1 is the highest, the priority decreases in the order of 2, 3, 4 and 5, and priority 5 is the lowest. For example, in a case in which a plurality of face images having the record designation ON are detected when capturing target images for composition, processing such as focusing on the face image with the highest priority as a main subject is executed.
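By way of illustration, and purely as an assumption since the description above does not specify any particular data structure or programming language, the face image table could be modeled in Python roughly as follows. The class name, field names, and example values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class FaceRecord:
    """One row of the face image table held in the face image storage unit 62."""
    face_id: str        # identifier of the registered face image
    face_image: bytes   # registered face image data (the storage format is an assumption)
    record_on: bool     # record designation flag: True = ON, False = OFF
    priority: int       # 1 (highest) to 5 (lowest)

# hypothetical example table: only faces with record_on=True can trigger
# storage of target images for wide-shot composition
face_table = [
    FaceRecord("person_A", b"...", record_on=True,  priority=1),
    FaceRecord("person_B", b"...", record_on=False, priority=3),
]

def image_is_composition_target(detected_ids, table):
    """Return True if at least one detected face has the record designation ON."""
    on_ids = {r.face_id for r in table if r.record_on}
    return any(fid in on_ids for fid in detected_ids)
```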
The main control unit 41 controls the entirety of the image acquisition apparatus 1.
The image capture control unit 42 controls an image capture operation by the image capture unit 16.
More specifically, the user depresses the shutter switch of the input unit 17 while holding the image acquisition apparatus 1 as a digital camera. The image capture control unit 42 then causes the image capture unit 16 to start consecutive image capture operations and to capture the subject at each predetermined time interval while the image acquisition apparatus 1 is moved in a predetermined direction.
In particular, as shown in the figure, the field angle of the image capture unit 16 moves over image capture ranges 71 to 74 as the image acquisition apparatus 1 is moved in the predetermined direction.
The image capture control unit 42 causes the image capture unit 16 to repeatedly capture the image capture ranges 71 to 74 in this order until a predetermined condition is satisfied.
Each time the image capture unit 16 captures an image, the image capture control unit 42 causes the image capture unit 16 to output image data to the image acquisition unit 43.
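As a rough sketch of this consecutive capture behavior, and not as the actual implementation, the capture loop could be expressed as a generator that hands each captured frame to the acquisition side until a stop condition holds. The `camera.read()` call, the interval value, and the stop callback are assumptions.

```python
import time

def capture_frames(camera, interval_s=0.1, stop_requested=lambda: False):
    """Consecutively capture images at each predetermined time interval while the
    apparatus is moved, handing every captured frame to the acquisition side."""
    while not stop_requested():     # e.g. becomes True once acquisition is stopped
        frame = camera.read()       # hypothetical camera API returning one image
        yield frame                 # corresponds to outputting image data to the image acquisition unit 43
        time.sleep(interval_s)      # "each predetermined time interval"
```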
The image acquisition unit 43 sequentially acquires the image data thus output from the image capture unit 16.
The face detection unit 44 detects an image of a facial portion of a person (face image) in the image of the image data acquired by the image acquisition unit 43. A face detection device (not shown) is provided, and a face image is detected based on an output from the face detection device.
The determination unit 45 determines whether the face image detected by the face detection unit 44 is registered with the face image table, and whether the face image has the record designation ON. In other words, the determination unit 45 determines whether the image including the detected face image is a target image for wide-shot composition.
When the determination unit 45 determines that the image is a target image for wide-shot composition, the determination unit 45 stores the image data acquired by the image acquisition unit 43 into the acquired-image storage unit 61 as the first target image for wide-shot composition. The processing by the image acquisition unit 43 and the face detection unit 44 is then repeated until the face image is detected again. During this period, the determination unit 45 stores the image data acquired by the image acquisition unit 43 into the acquired-image storage unit 61.
The determination unit 45 determines whether the detected face image is detected again. In other words, after the face detection unit 44 detects the face image in the image data, the determination unit 45 determines whether the detected face image is detected again in the image data acquired by the image acquisition unit 43.
In a case in which the determination unit 45 determines that the detected face image is detected again, the determination unit 45 stores the image data acquired by the image acquisition unit 43 as the last target image for wide-shot composition into the acquired-image storage unit 61.
For example, in a case in which the image data of the image capture range 73 including the face image in a frame 81 as shown in the figure is acquired after that face image has already been detected once, the determination unit 45 stores this image data into the acquired-image storage unit 61 as the last target image for wide-shot composition.
When the determination unit 45 stores the image data acquired by the image acquisition unit 43 as the last target image for wide-shot composition into the acquired-image storage unit 61, the image acquisition stop unit 46 stops the acquisition of the image data by the image acquisition unit 43.
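A minimal sketch of this stop condition, assuming that face detection yields comparable identifiers for registered faces (an assumption, since the description does not specify how face matching is implemented), might look as follows.

```python
def detected_again(detected_ids, seen, designated_ids):
    """Return True at the moment a designated face image (record designation ON)
    is detected for the second time; `seen` accumulates first detections."""
    for fid in detected_ids:
        if fid not in designated_ids:
            continue          # faces with the record designation OFF are ignored
        if fid in seen:
            return True       # second detection: acquisition can be stopped
        seen.add(fid)         # first detection: storage of target images begins
    return False

# hypothetical usage inside the acquisition loop:
# if detected_again(face_ids_in_frame, seen_faces, designated_ids):
#     stop_acquisition()
```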
The acquired-image composition unit 47 generates a wide-shot image by compositing the first to last target image data for wide-shot composition, which are stored by the determination unit 45 into the acquired-image storage unit 61. More specifically, as shown in the figure, the acquired-image composition unit 47 joins the sequentially stored target images for composition into a single wide-angle image.
The trimming unit 48 generates a trimmed image by trimming the wide-shot image generated by the acquired-image composition unit 47. More specifically, the trimming unit 48 recognizes the face images included in the wide-shot image, determines the frames 81 and 82 representative of the image capture regions of those face images as shown in the figure, and determines a range 85 to be trimmed with reference to the frames 81 and 82.
When the range 85 is determined, the trimming unit 48 displays an image of the range 85 as the trimmed image by way of the output unit 18.
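The description does not tie the compositing or the trimming to any particular library. As one possible sketch, OpenCV's stitching module could stand in for the acquired-image composition unit 47, and the trimming range (the analogue of the range 85) could be taken as a margin around the bounding box of the detected face frames (the analogues of the frames 81 and 82). The margin value and the coordinate conventions are assumptions.

```python
import cv2

def composite_and_trim(target_images, face_boxes, margin=40):
    """Composite the stored target images into a wide-shot image, then trim a
    range that covers the detected face regions with a small margin."""
    stitcher = cv2.Stitcher_create()
    status, wide_shot = stitcher.stitch(target_images)
    if status != cv2.Stitcher_OK:
        raise RuntimeError("stitching failed with status %d" % status)

    # face_boxes: list of (x, y, w, h) rectangles in wide-shot coordinates,
    # corresponding to frames such as 81 and 82 in the embodiment
    xs = [x for x, y, w, h in face_boxes] + [x + w for x, y, w, h in face_boxes]
    ys = [y for x, y, w, h in face_boxes] + [y + h for x, y, w, h in face_boxes]
    img_h, img_w = wide_shot.shape[:2]
    x0, x1 = max(min(xs) - margin, 0), min(max(xs) + margin, img_w)
    y0, y1 = max(min(ys) - margin, 0), min(max(ys) + margin, img_h)
    return wide_shot[y0:y1, x0:x1]   # the trimmed image (analogue of range 85)
```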
Next, descriptions are provided for the image acquisition stop processing executed by the image acquisition apparatus 1.
The flow of the image acquisition stop processing is shown in the flowchart of the figure.
In Step S1, the image capture control unit 42 controls the image capture operations by the image capture unit 16. More specifically, the user depresses the shutter switch of the input unit 17 while holding the image acquisition apparatus 1 as a digital camera. The image capture control unit 42 then causes the image capture unit 16 to start consecutive image capture operations, and capture an image at each predetermined time interval, while the image acquisition apparatus 1 is moved in a predetermined direction. Furthermore, each time the image capture unit 16 captures an image, the image capture control unit 42 causes the image capture unit 16 to output image data to the image acquisition unit 43.
In Step S2, the image acquisition unit 43 acquires image data that was output from the image capture unit 16 in Step S1, and stores the image data into the RAM 13.
In Step S3, the main control unit 41 determines whether a termination region is designated in Step S9 (to be described later). In a case in which a termination region is not designated, the main control unit 41 advances the processing to Step S4.
In Step S4, the face detection unit 44 detects a face image in the image based on the image data acquired by the image acquisition unit 43.
In Step S5, the determination unit 45 determines whether the face image detected by the face detection unit 44 is registered with the face image table (see the figure) and has the record designation ON; such a face image is hereinafter referred to as a "designated face image".
In Step S6, the determination unit 45 determines whether the designated face image was detected for the first time. In a case of determining that the designated face image was detected for the first time, the determination unit 45 designates an initiation region in Step S7.
In the image data stored in the acquired-image storage unit 61, the initiation region refers to a region where an image for initiating composition of target images for wide-shot composition is stored.
After the processing in Step S7, the determination unit 45 advances the processing to Step S10, and stores the image data acquired in Step S2 into the acquired-image storage unit 61. In Step S10 following Step S7, the stored image data serves as image data for initiating composition of target images for wide-shot composition.
In a case in which the determination unit 45 determines in Step S6 that the detection of the designated face image was not for the first time, the determination unit 45 determines in Step S8 whether the designated face image was detected for the second time. In a case of determining that the designated face image was detected for the second time, the determination unit 45 designates a termination region in Step S9.
In the image data stored in the acquired-image storage unit 61, the termination region refers to a region where an image for terminating the composition of target images for wide-shot composition is stored.
After the processing in Step S9, the determination unit 45 advances the processing to Step S10, and stores the image data acquired in Step S2 into the acquired-image storage unit 61. In Step S10 following Step S9, the stored image data serves as image data for terminating the composition of target images for wide-shot composition.
In a case in which the determination unit 45 determines in Step S8 that the detection of the designated face image was not for the second time, the determination unit 45 advances the processing to Step S10, and stores the image data acquired in Step S2 into the acquired-image storage unit 61.
When the processing of storing the image data in Step S10 is terminated, the determination unit 45 returns the processing to Step S1.
The processing through Steps S1 to S10 is repeated until determining in Step S3 that a termination region is designated. As a result, all the image data from initiation until termination of the composition of target images is stored into the acquired-image storage unit 61.
In a case of determining in Step S3 that a termination region is designated, the main control unit 41 advances the processing to Step S11.
In Step S11, the acquired-image composition unit 47 generates a wide-shot image by compositing all the image data from the initiation region to the termination region stored in the acquired-image storage unit 61 in Step S10.
In Step S12, the trimming unit 48 generates a trimmed image by trimming the wide-shot image generated by the acquired-image composition unit 47.
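Gathering Steps S1 to S12 into one place, a minimal sketch of the loop might look as follows. The frame source and the helpers `detect_faces` and `is_designated` (which would consult the face image table) are assumptions, and the compositing and trimming of Steps S11 and S12 are only indicated by a comment.

```python
def image_acquisition_stop_processing(frames, detect_faces, is_designated):
    """Sketch of Steps S1-S12: store image data from the first detection of a
    designated face image (initiation region) until that face image is detected
    a second time (termination region), then hand the stored data on for
    wide-shot composition and trimming."""
    stored = []          # plays the role of the acquired-image storage unit 61
    first_seen = None    # designated faces seen when the initiation region was designated
    for img in frames:                                          # S1-S2: capture and acquire
        designated = [f for f in detect_faces(img) if is_designated(f)]   # S4-S5
        if first_seen is None:
            if designated:                                      # S6-S7: first detection
                first_seen = set(designated)
                stored.append(img)                              # S10
            continue          # frames before the first detection are not stored
        stored.append(img)                                      # S10
        if any(f in first_seen for f in designated):            # S8-S9: second detection
            break             # termination region designated, so S3 ends the loop
    # S11-S12: composite `stored` into a wide-shot image and trim it (not shown here)
    return stored
```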
The embodiment of the present invention has been described above.
The image acquisition apparatus 1 for executing the image acquisition stop processing as described above includes the image capture unit 16, the image acquisition unit 43, the face detection unit 44, and the image acquisition stop unit 46.
The image capture unit 16 consecutively captures each image capture range as a target image for wide-shot composition.
The image acquisition unit 43 sequentially acquires the captured target images for composition.
Each time a target image for composition is acquired, the face detection unit 44 detects a predetermined subject in the acquired image.
When the predetermined subject is detected, the image acquisition stop unit 46 stops the acquisition of target images for composition.
Therefore, even if the user does not provide any instruction to stop the operation of acquiring target images for composition, the operation of acquiring images can be terminated at a desired point in time.
The image acquisition apparatus 1 further includes the determination unit 45 that determines whether an identical face image was detected at least twice by the face detection unit 44.
In a case in which the determination unit 45 determines that an identical face image was detected at least twice, the image acquisition stop unit 46 stops the acquisition of target images for wide-shot composition.
Therefore, even if the user does not provide any instruction to stop the operation of acquiring target images for wide-shot composition, the operation of acquiring images can be terminated at the timing when an identical face image is detected twice.
The image acquisition apparatus 1 further includes the acquired-image composition unit 47 that generates a wide-shot image from target images for composition sequentially acquired by the image acquisition unit 43.
Therefore, even if the user does not provide any instruction to stop the operation of acquiring target images for composition, the operation of acquiring images can be terminated at a desired point in time, and a wide-shot image can be efficiently generated.
The image acquisition apparatus 1 further includes the determination unit 45 that stores images sequentially acquired by the image acquisition unit 43 as a moving image into the acquired-image storage unit 61.
Therefore, even if the user does not provide any instruction to stop the operation of acquiring images, the operation of acquiring images can be terminated at a desired point in time, and a moving image can be efficiently generated.
The image acquisition apparatus 1 further includes the trimming unit 48 that trims the wide-shot image, i.e., the wide-angle image generated by the acquired-image composition unit 47, with a predetermined subject as a reference.
Therefore, a trimmed image can be generated as a wide-angle image that is suitably composed for a particular subject.
The image capture unit 16 captures a moving image and/or continuously captures images.
Therefore, not only in the wide-shot photographing but also in the moving image photographing and the panoramic photographing, even if the user does not provide any instruction to stop the operation of acquiring images, the operation of acquiring images is terminated at a desired point in time.
The present invention is not limited to the aforementioned embodiment; and modifications, improvements, etc. within a scope that can achieve the object of the present invention are included in the present invention.
A first modification example of the present invention is described.
In the first modification example, functional blocks other than the determination unit 45 are identical to those in the above embodiment shown in the figure.
The determination unit 45 determines whether the face image detected by the face detection unit 44 is registered with the face image table, and whether the face image has the record designation ON. In other words, the determination unit 45 determines whether the image including the detected face image is a target image for wide-shot composition.
When the determination unit 45 determines that the image is a target image for wide-shot composition, the determination unit 45 stores the image data acquired by the image acquisition unit 43 into the acquired-image storage unit 61 as the first target image for wide-shot composition. The processing by the image acquisition unit 43 and the face detection unit 44 is then repeated until a particular face image (hereinafter referred to as a "particular designated face image") which differs from the already-detected face image is detected. During this period, the determination unit 45 stores the image data acquired by the image acquisition unit 43 into the acquired-image storage unit 61.
The determination unit 45 determines whether a particular designated face image is detected. In other words, the determination unit 45 determines whether the face image detected by the face detection unit 44 coincides with a particular face image stored in the face image table, the particular face image having the corresponding record designation ON.
In a case in which the determination unit 45 determines that the face image detected by the face detection unit 44 coincides with a particular face image, the determination unit 45 stores the image data acquired by the image acquisition unit 43 as the last target image for wide-shot composition into the acquired-image storage unit 61.
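Only the termination check changes in the first modification example. As a sketch under the same assumptions as before, with `particular_id` identifying the particular designated face image, the loop could be varied as follows.

```python
def stop_on_particular_face(frames, detect_faces, is_designated, particular_id):
    """First modification: storage starts with the first designated face image
    (Steps S26-S27) and stops when the particular designated face image is
    detected (Steps S28-S29), instead of when the first face is seen twice."""
    stored, started = [], False
    for img in frames:
        faces = detect_faces(img)
        if not started:
            if any(is_designated(f) for f in faces):    # S26-S27: initiation region
                started = True
                stored.append(img)                      # S30
            continue
        stored.append(img)                              # S30
        if particular_id in faces:                      # S28-S29: termination region
            break
    return stored
```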
Image Acquisition Stop Processing according to First Modification Example
Next, descriptions are provided for the image acquisition stop processing executed by the image acquisition apparatus 1 according to the first modification example.
The processing in Steps S21 to S27, S31 and S32 herein is similar to the processing in Steps S1 to S7, S11 and S12 shown in the flowchart of the above embodiment; therefore, descriptions thereof are omitted.
In a case in which the determination unit 45 determines in Step S26 that the detection of the designated face image was not for the first time, the determination unit 45 determines in Step S28 whether the face image detected by the face detection unit 44 coincides with the particular face image. In a case of determining that the detected face image coincides with the particular face image, the determination unit 45 designates a termination region in Step S29.
After the processing in Step S29, the determination unit 45 advances the processing to Step S30, and stores the image data acquired in Step S22 into the acquired-image storage unit 61. In Step S30 following Step S29, the stored image data serves as image data for terminating the composition of target images for wide-shot composition.
In a case in which the determination unit 45 determines in Step S28 that the face image detected by the face detection unit 44 does not coincide with a particular face image, the determination unit 45 advances the processing to Step S30, and stores the image data acquired in Step S22 into the acquired-image storage unit 61.
When the processing of storing the image data in Step S30 is terminated, the determination unit 45 returns the processing to Step S21.
The processing through Steps S21 to S30 is repeated until determining in Step S23 that a termination region is designated. As a result, all the image data from initiation until termination of the composition of target images for wide-shot composition is stored into the acquired-image storage unit 61.
The first modification example of the present invention has been described above.
The image acquisition apparatus 1 for executing the image acquisition stop processing according to the first modification example includes the image capture unit 16, the image acquisition unit 43, the face detection unit 44, and the image acquisition stop unit 46.
The image capture unit 16 consecutively captures each image capture range as a target image for wide-shot composition.
The image acquisition unit 43 sequentially acquires the captured images.
Each time a target image for composition is acquired, the face detection unit 44 detects a predetermined subject in the acquired image.
In a case in which the predetermined subject is detected, the image acquisition stop unit 46 stops the acquisition of target images for wide-shot composition.
Therefore, even if the user does not provide any instruction to stop the operation of acquiring target images for wide-shot composition, the operation of acquiring images is terminated at a desired point in time.
Target images for composition serve as the basis for composing a wide-shot image.
In a case in which a particular face image is detected, the image acquisition stop unit 46 stops the acquisition of target images for composition.
Therefore, even if the user does not provide any instruction to stop the operation of acquiring target images for composition, the operation of acquiring images is terminated at the timing when a particular face image is detected.
The image capture unit 16 captures a moving image and/or continuously captures images.
Therefore, not only in the wide-shot photographing but also in the moving image photographing and the panoramic photographing, even if the user does not provide any instruction to stop the operation of acquiring images, the operation of acquiring images is terminated at the timing when a particular face image is detected.
A second modification example of the present invention is described.
In the second modification example, functional blocks other than the determination unit 45 are identical to those in the above embodiment shown in the figure.
The determination unit 45 determines whether the face image detected by the face detection unit 44 is registered with the face image table, and whether the face image has the record designation ON. In other words, the determination unit 45 determines whether the image including the detected face image is a target image for wide-shot composition.
When the determination unit 45 determines that the image is a target image for wide-shot composition, the determination unit 45 stores the image data acquired by the image acquisition unit 43 into the acquired-image storage unit 61 as the first target image for wide-shot composition. The processing by the image acquisition unit 43 and the face detection unit 44 is then repeated until face images (designated face images) that are registered with the face image table and have the record designation ON are detected for N persons (N is an integer of at least two; for example, three different persons). During this period, the determination unit 45 stores the image data acquired by the image acquisition unit 43 into the acquired-image storage unit 61.
The determination unit 45 determines whether designated face images for the N persons are detected by the face detection unit 44.
In a case in which the determination unit 45 determines that the designated face images for the N persons are detected, the determination unit 45 stores the image data acquired by the image acquisition unit 43 as the last target image for wide-shot composition into the acquired-image storage unit 61.
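The second modification example again changes only the termination check. The sketch below counts distinct designated face images cumulatively over the acquired images; whether the N face images must appear in a single image or may be accumulated over the sequence is not stated explicitly, so the cumulative reading is an assumption, and the helper names remain hypothetical.

```python
def stop_on_n_designated_faces(frames, detect_faces, is_designated, n=3):
    """Second modification: acquisition stops once designated face images of
    N different persons (N >= 2) have been detected (Steps S48-S49)."""
    stored, seen = [], set()
    for img in frames:
        designated = {f for f in detect_faces(img) if is_designated(f)}
        if not seen and not designated:
            continue               # nothing is stored before the first detection
        seen |= designated
        stored.append(img)         # S50
        if len(seen) >= n:         # S48-S49: N persons detected -> termination region
            break
    return stored
```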
Image Acquisition Stop Processing according to Second Modification Example
Next, descriptions are provided for the image acquisition stop processing executed by the image acquisition apparatus 1 according to the second modification example.
The processing in Steps S41 to S47, S51 and S52 herein is similar to the processing in Steps S1 to S7, S11 and S12 shown in the flowchart of the above embodiment; therefore, descriptions thereof are omitted.
In a case in which the determination unit 45 determines in Step S46 that the face image registered with the face image table and having the record designation ON was not detected by the face detection unit 44 for the first time, the determination unit 45 determines in Step S48 whether designated face images for N persons are detected by the face detection unit 44. In a case of determining that the designated face images for the N persons are detected, the determination unit 45 designates a termination region in Step S49.
After the processing in Step S49, the determination unit 45 advances the processing to Step S50, and stores the image data acquired in Step S42 into the acquired-image storage unit 61. In Step S50 following Step S49, the stored image data serves as image data for terminating the composition of target images for composition.
In a case in which the determination unit 45 determines in Step S48 that designated face images for N persons are not detected by the face detection unit 44, the determination unit 45 advances the processing to Step S50, and stores the image data acquired in Step S42 into the acquired-image storage unit 61.
When the processing of storing the image data in Step S50 is terminated, the determination unit 45 returns the processing to Step S41.
The processing through Steps S41 to S50 is repeated until determining in Step S43 that a termination region is designated. As a result, all the image data from initiation until termination of the composition of target images for composition is stored into the acquired-image storage unit 61.
The second modification example of the present invention has been described above.
The image acquisition apparatus 1 for executing the image acquisition stop processing according to the second modification example includes the image capture unit 16, the image acquisition unit 43, the face detection unit 44, and the image acquisition stop unit 46.
The image capture unit 16 consecutively captures each image capture range as a target image for wide-shot composition.
The image acquisition unit 43 sequentially acquires the captured images.
Each time a target image for composition is acquired, the face detection unit 44 detects a predetermined subject in the acquired image.
When the predetermined subject is detected, the image acquisition stop unit 46 stops the acquisition of target images for composition.
Therefore, even if the user does not provide any instruction to stop the operation of acquiring target images for composition, the operation of acquiring images is terminated at a desired point in time.
The image acquisition apparatus 1 further includes the determination unit 45 that determines whether the number of face images detected by the face detection unit 44 is at least a predetermined value.
In a case in which the number of face images determined by the determination unit 45 is at least a predetermined value, the image acquisition stop unit 46 stops the acquisition of target images for composition.
Therefore, even if the user does not provide any instruction to stop the operation of acquiring target images for composition, the operation of acquiring images is terminated at the timing when a predetermined number of face images are detected.
The image capture unit 16 captures a moving image and/or continuously captures images.
Therefore, not only in the wide-shot photographing but also in the moving image photographing and the panoramic photographing, even if the user does not provide any instruction to stop the operation of acquiring images, the operation of acquiring images is terminated at the timing when a predetermined number of face images are detected.
In the aforementioned embodiment and the first and second modification examples, a digital camera has been described as an example of the image acquisition apparatus 1 to which the present invention is applied; however, the present invention is not particularly limited thereto.
For example, the present invention can be applied to any electronic device in general having an image processing function. More specifically, for example, the present invention can be applied to a laptop personal computer, a printer, a television, a video camera, a portable navigation device, a cellular phone, a portable gaming device, and the like.
The processing sequence described above can be executed by hardware, and can also be executed by software.
In other words, the hardware configuration shown in the figure is merely an illustrative example, and the present invention is not particularly limited thereto.
A single functional block may be configured by a single piece of hardware, a single installation of software, or any combination thereof.
In a case in which the processing sequence is executed by software, a program configuring the software is installed from a network or a storage medium into a computer or the like.
The computer may be a computer embedded in dedicated hardware. Alternatively, the computer may be a computer capable of executing various functions by installing various programs, e.g., a general-purpose personal computer.
The storage medium containing such a program can not only be constituted by the removable medium 31 shown in the figure, which is distributed separately from the apparatus main body in order to supply the program to the user, but can also be constituted by a storage medium supplied to the user in a state incorporated in the apparatus main body in advance, such as the ROM 12 in which the program is recorded or a hard disk included in the storage unit 19.
In the present specification, the steps describing the program recorded in the storage medium include not only the processing executed in a time series following this order, but also processing executed in parallel or individually, which is not necessarily executed in a time series.
Although some embodiments of the present invention have been described above, the embodiments are merely illustrative examples, and do not limit the technical scope of the present invention. Various other embodiments can be employed for the present invention, and various modifications such as omission and replacement are possible without departing from the spirit of the present invention. Such embodiments and modifications are included in the scope of the invention and the summary described in the present specification, and are included in the invention recited in the claims as well as the equivalent scope thereof.