1. Field of the Invention
The present invention relates to an image processing apparatus and a print control apparatus, and a control method and a storage medium thereof, and more particularly, to a print control apparatus for printing a moving image on a medium such as recording paper as a plurality of still images (for example, an album).
2. Description of the Related Art
In recent years, there have been image pickup apparatuses, such as digital cameras, which record the motion of a subject as a moving image, from before a photographer presses a shutter button provided at the image pickup apparatus until immediately after the shutter button is pressed, separately from the still image shot when the shutter button is pressed. Through reproduction of the recorded moving image, the photographer, or the like, can know the atmosphere and situation at the time of shooting, which cannot be known from the still image alone.
By this means, even a person who was not present at the shooting location can understand the atmosphere and situation at the time of shooting by viewing the moving image even if time has elapsed since the shooting.
Further, there is a case where a moving image is stored as a work, or a case where a still image and a moving image are edited to generate a new moving image as a digest version to be shown to another person. In most such cases, a computer such as a PC on which dedicated software for creating a moving image is installed is typically required.
Meanwhile, in recent years, an image pickup apparatus is known which can create a new moving image by editing a still image and a moving image. With such an image pickup apparatus, a user can create a digest version without using a computer while away from home, for example, while traveling. By this means, the user can, in the course of the travel, look back on the travel so far with companions such as family or friends.
Further, by viewing the above-described digest-version moving image, the user can immediately know the atmosphere and situation at the time of shooting.
However, a dedicated device is required to reproduce a moving image. If the user does not have a reproduction device for reproducing a moving image, the moving image cannot be reproduced, and, further, it is difficult for a person who is not used to manipulating such a device to reproduce a moving image.
To solve such a problem, there is, for example, equipment which creates a moving image from still images and accepts a printing instruction for a particular effect in the moving image while reproducing the moving image (Japanese Laid-Open Patent Publication (Kokai) No. 2004-64285).
The equipment disclosed in Japanese Laid-Open Patent Publication (Kokai) No. 2004-64285, when receiving a printing instruction for a particular effect, can print a still image having the effect. Further, because this equipment can print an image having a particular effect which a person desires to show to another person while viewing the moving image, it is possible to convey content corresponding to a digest using printed matter of a still image instead of showing the moving image.
Further, there is an album creating method for determining layout of an album such as a photo book using characteristics of an image and image information (see Japanese Laid-Open Patent Publication (Kokai) No. 2006-261863).
However, while the equipment disclosed in Japanese Laid-Open Patent Publication (Kokai) No. 2004-64285 can extract a still image having a particular effect designated in the moving image, the user cannot immediately understand what meaning the extracted still image has in the moving image.
Therefore, when a still image extracted from the moving image is printed, it is printed without the user knowing what meaning the targeted effect has in the moving image. As a result, even if an album is edited by collecting printed matter created from still images extracted from the moving image, the album may express content different from that of the original moving image.
The present invention provides an image processing apparatus and a print control apparatus which allows a user to easily understand content expressed in a moving image of a print source when a still image extracted from the moving image is printed, and a control method and a control program thereof.
According to a first aspect of the present invention, an image processing apparatus comprises a designating unit configured to designate a moving image as a print target, an image specifying unit configured to specify a material image to be printed based on composition information associated with the moving image designated as the print target by the designating unit, the composition information including information for specifying a plurality of original images used for generating the moving image, and a transmitting unit configured to transmit the material image specified by the image specifying unit and the composition information to a print control apparatus to instruct the print control apparatus to perform printing.
According to a second aspect of the present invention, a print control apparatus comprises a storage unit configured to store composition information including information for specifying a type of effect applied when a moving image is generated and a material image to be printed, a layout determining unit configured to determine layout of a page of a printed matter according to the information for specifying the type of effect included in the composition information, and a print data generating unit configured to generate print data by arranging the material image based on the layout of the page.
According to a third aspect of the present invention, a control method of an image processing apparatus comprises a designating step of designating a moving image as a print target, an image specifying step of specifying a material image to be printed based on composition information associated with the moving image designated as the print target in the designating step, the composition information including information for specifying a plurality of original images used for generating the moving image, and a transmitting step of transmitting the material image specified in the image specifying step and the composition information to a print control apparatus to instruct the print control apparatus to perform printing.
According to a fourth aspect of the present invention, a control method of a print control apparatus comprises a storage step of storing composition information including information for specifying a type of effect applied when a moving image is generated and a material image to be printed in a memory, a layout determining step of determining layout of a page of a printed matter according to the information for specifying the type of effect included in the composition information, and a print data generating step of generating print data by arranging the material image based on the layout of the page.
According to a fifth aspect of the present invention, a computer-readable non-transitory storage medium is for storing a program for causing a computer to execute a control method of an image processing apparatus. The control method comprises a designating step of designating a moving image as a print target, an image specifying step of specifying a material image to be printed based on composition information associated with the moving image designated as the print target in the designating step, the composition information including information for specifying a plurality of original images used for generating the moving image, and a transmitting step of transmitting the material image specified in the image specifying step and the composition information to a print control apparatus to instruct the print control apparatus to perform printing.
According to a sixth aspect of the present invention, a computer-readable non-transitory storage medium is for storing a program for causing a computer to execute a control method of a print control apparatus. The control method comprises a storage step of storing composition information including information for specifying a type of effect applied when a moving image is generated and a material image to be printed in a memory, a layout determining step of determining layout of a page of a printed matter according to the information for specifying the type of effect included in the composition information, and a print data generating step of generating print data by arranging the material image based on the layout of the page.
Further features of the present invention will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).
According to the present invention, images used in a moving image to be printed are specified as material images according to composition information, and the material images and the composition information are transmitted to a print control apparatus. By this means, it is possible to easily recognize the content expressed in the moving image of the print source from the images obtained through printing even when the moving image is printed.
The present invention will now be described in detail below with reference to the accompanying drawings showing embodiments thereof.
The moving image printing system shown in the figure has at least an image pickup apparatus 100 and a server PC 110 which is one of print control apparatuses. The image pickup apparatus 100 is, for example, a digital camera (hereinafter, simply referred to as a camera), and the moving image printing system may include a terminal apparatus such as a smartphone 120.
The camera 100 has a CPU 105 which controls the camera 100. In an SD card 101 which is a recording medium, an image (shot image) obtained as a result of shooting by an image pickup unit (not shown) and an image (camera created image) generated through editing, or the like, at the camera 100 are recorded. In a memory 102, a program to be executed by the CPU 105 is recorded.
Further, the camera 100 includes a WiFi connection module 103 used as a communication unit, and a display 104. At the display 104, the shot image and the camera created image are displayed, and a user interface screen (UI screen) is displayed. A user manipulates the UI screen using a button 106.
The CPU 105 executes the program stored in the memory 102 to control the camera 100. In the memory 102, a program required to operate the CPU 105 is recorded. Further, the memory 102 is used as a work memory of the CPU 105. The CPU 105 controls the WiFi connection module 103 to perform communication with the server PC 110 via a WiFi base unit 130.
The server PC 110 includes a CPU 114, a network board 113, a memory 112, an HDD 111 and a USB terminal 115. The CPU 114 executes a program recorded in the HDD 111 to control the server PC 110. Further, the HDD 111 is used as a memory unit in which a layout definition file which will be described later is stored. The memory 112 is used as a work memory required to operate the CPU 114. The CPU 114 controls the network board 113 to perform communication with the camera 100 via a network (for example, Internet).
As shown in the figure, a printer 150 is connected to the server PC 110 via the USB terminal 115. As will be described later, the server PC 110 creates print data and causes the printer 150 to perform printing according to the print data.
In an example of the figure, the smartphone 120 can be a substitute for the camera 100 or a substitute for the server PC 110. The smartphone 120 includes a CPU 122, a flash memory 121, a memory 125, a touch panel 124 and a WiFi connection module 123.
The CPU 122 executes a program recorded in the flash memory 121 to control the smartphone 120. The memory 125 is used as a work memory required to operate the CPU 122. The CPU 122 controls the WiFi connection module 123 to perform communication with the server PC 110 through a WiFi base unit 140.
Further, the CPU 122 can wirelessly connect to the printer 150 via the WiFi connection module 123. The CPU 122 transmits print data to the printer 150 through the WiFi connection module 123 to perform printing according to the print data.
As illustrated in the figure, a display unit (not shown) including the touch panel 124 is connected to the CPU 122, and a result according to the processing by the CPU 122 is displayed at the display unit. The user transmits various instructions to the CPU 122 through touch operation using the touch panel 124.
Because the moving image printing system has only to include at least the camera 100 and the server PC 110, a case where the moving image printing system includes the camera 100 and the server PC 110 will be described first.
In the SD card 101, still images 201 to 204, 206 and 208, a moving image 207, and a still image group 205 obtained through continuous shooting are recorded as image data. New moving image data (hereinafter referred to as an edited moving image) is created using the image data recorded in the SD card 101.
In the image data, for example, a file name, date and time of shooting, a format, a subject name (in the case of a person) and a rating are recorded.
When editing of a moving image (that is, creation of a moving image to be printed) is started, the CPU 105 first defines specification of a moving image to be edited (step S301).
When definition of the specification of the edited moving image is started, the CPU 105 displays a creation condition designation screen shown in
In the example shown in
An option button 402 is a button for individually designating image data to be used for the edited moving image. Further, an option button 403 is a button for designating a subject (here, a person) and creating an edited moving image which focuses on the subject.
In the following description, processing when the option button 401 is manipulated by the user will be described.
When the option button 401 is pressed, the CPU 105 displays a period designation screen 404 shown in
It is assumed here that the user designates “from Dec. 1, 2012 to Mar. 4, 2014” as the shooting period. When the user presses an enter button 405, the CPU 105 settles the shooting period as the specification of a moving image to be created.
Referring to
The CPU 105 generates a list of images indicating attribute information (also referred to as image information) of the images recorded in the SD card 101 according to the shooting period. The list of images includes five items: a file name 501, date and time of shooting 502, a format 503, a subject (subject name) 504 and a rating 505. While the list of images includes five items here, the number of items may be increased or decreased according to the edited moving image.
In the file name 501, a file name of the image data in the SD card 101 is described. In the date and time of shooting 502, date and time at which the image data is shot is described. In the format 503, a format name of the image data is described. In the subject 504, a subject name (person name) present in the image data is described. When there are a plurality of subjects in the image data, all subject names relating to the plurality of subjects are described. In the rating 505, a rating indicating how much the user likes the image data is described.
It should be noted that, concerning the subject name, subject information (subject name) is recorded in advance in association with image data, and the CPU 105 loads the subject information when generating a list of images. It is also possible to allow the user to input a subject name while viewing an image. Further, the rating 505 is evaluated, for example, from “1” to “5”, and the user sets the rating in advance for each piece of image data.
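The list of images described above can be modeled as a simple record per image. The following Python sketch shows one hypothetical representation of the five items and the shooting-period filtering of step S302; the class and function names are illustrative, not part of the disclosed apparatus.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ImageInfo:
    """One entry in the list of images; field names are hypothetical."""
    file_name: str       # file name (501)
    shot_at: datetime    # date and time of shooting (502)
    fmt: str             # format (503), e.g. "JPG" or "MOV"
    subjects: list       # subject names (504); all subjects in the image
    rating: int          # rating (505), set by the user from 1 to 5

def filter_by_period(images, start, end):
    """Collect the images whose date and time of shooting falls within
    the designated shooting period."""
    return [img for img in images if start <= img.shot_at <= end]
```

A list of such records is then the input to the subsequent composition-information processing.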
As described above, because from Dec. 1, 2012 to Mar. 4, 2014 is designated as the shooting period, the CPU 105 searches for image data whose date and time of shooting falls within the shooting period to generate a list of images. After collecting the images, the CPU 105 creates composition information indicating the composition of the edited moving image (step S303). This composition information includes information for specifying the original images.
When generation of the composition information is started, the CPU 105 confirms the image information regarding the image data (that is, material images) relating to the edited moving image using the list of images (step S601). Then, the CPU 105 determines a material image to be used as a title effect of the edited moving image. Here, the CPU 105 determines a material image with the highest rating as the material image to be used as the title effect (step S602).
In the example shown in
Subsequently, the CPU 105 determines a display order of material images in the edited moving image. Here, for example, the CPU 105 determines to display the material images according to the date and time of shooting of the list of images (step S603).
Then, the CPU 105 performs processing of putting continuous shot images generated through one shooting together as one group (step S604). Here, the CPU 105 performs processing of putting the continuous shot images together as one group with reference to the date and time of shooting 502 of the list of images while assuming that the continuous shot images are, for example, still images reproduced at intervals of one second. In the example shown in
It should be noted that if continuous shot image information indicating that images are continuous shot images is added to the image data upon continuous shooting, the CPU 105 may determine whether or not the image data is continuous shot images according to the continuous shot image information.
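The timestamp-based grouping of step S604 can be sketched as follows, under the one-second-interval assumption stated above; the function signature and data layout are hypothetical.

```python
from datetime import datetime, timedelta

def group_continuous_shots(images, max_gap=timedelta(seconds=1)):
    """Put images shot within `max_gap` of the previous image into one
    group, mirroring the one-second assumption of step S604.

    `images` is a list of (file_name, shot_at) pairs; the result is a
    list of groups, each group a list of pairs in shooting order.
    """
    groups = []
    for name, shot_at in sorted(images, key=lambda item: item[1]):
        if groups and shot_at - groups[-1][-1][1] <= max_gap:
            groups[-1].append((name, shot_at))   # continues the last group
        else:
            groups.append([(name, shot_at)])     # starts a new group
    return groups
```

When explicit continuous shot image information is recorded with the image data, that information can of course be used instead of the timestamp heuristic.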
Subsequently, as will be described later, the CPU 105 determines material images to be used for the flash effect (highlight) from still images (step S605). Here, the flash effect is an effect used at the opening of the moving image, which allows the user to recognize the content of the edited moving image at the opening by switching, at short intervals, the display of still images used in the edited moving image.
When the highlight determination processing is started, the CPU 105 arranges still images in order of date and time of shooting to determine material images for highlight (for flash effect) (step S701). Then, the CPU 105 excludes still images other than the top image in each series of continuous shot images (step S702). The CPU 105 then creates a list of images for flash effect (for highlight) from the remaining still images (step S703) and finishes the highlight determination processing.
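Steps S701 to S703 can be expressed compactly as follows; this is a sketch, and the data layout (name/timestamp pairs, group lists) is an assumption for illustration.

```python
def build_flash_list(stills, continuous_groups):
    """Build the list of images for flash effect (steps S701-S703 as a
    sketch): arrange stills by shooting time and keep only the top image
    of each series of continuous shot images.

    `stills` is a list of (file_name, shot_at) pairs; `continuous_groups`
    is a list of file-name lists, each ordered so that its first entry is
    the top image of the series.
    """
    # S702: every continuous shot image except the top one is excluded
    excluded = {name for group in continuous_groups for name in group[1:]}
    ordered = sorted(stills, key=lambda s: s[1])      # S701: shooting order
    return [name for name, _ in ordered if name not in excluded]   # S703
```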
In
The edited moving image starts from the material image used as the title effect, and, after the material images used as the flash effect are displayed, the body is displayed. Therefore, the CPU 105 generates a composition information table indicating what kind of effect is used to display each material image in the edited moving image.
In the composition information table (also referred to as a composition information file) shown in the figure, material images used as part of the edited moving image are listed. Effect, a display period and character information relating to a displayed character string in the edited moving image are described for each material image.
For example, a file name, a display period, a displayed character string (hereinafter, simply referred to as a character string), location where a character string is to be input, a type of effect and scroll are described for each material image. In the file name, a file name of a material image in the SD card 101 is described. In the display period, a reproduction period in the edited moving image according to the format of the material image and the effect is described.
The CPU 105 determines a reproduction period for each material image according to, for example, a format with reference to the reproduction period determination table. In the reproduction period determination table shown in
Referring to
It should be noted that, even when the same effect is applied in the edited moving image, if there is a large difference in shooting period or shooting target between material images, the effect information is described using an expression indicating that the images belong to different groups even though the effect is the same.
Referring to
The CPU 105 settles the display period during which a material image used for effect in the edited moving image is displayed, according to the reproduction period table for each type of image (step S607). The CPU 105 then assigns the reproduction period for each type of image, that is, the display period, to the material images in the composition information table shown in
Subsequently, the CPU 105 performs processing of surrounding a main subject present in the material image with a rectangle (step S608). Here, when surrounding the main subject with the rectangle, the CPU 105 sets an object (for example, a person) having the maximum size present in the material image as the main subject. It should be noted that the user may designate the main subject with reference to the material image displayed at the display 104.
The CPU 105 then checks whether or not the size of the rectangle is equal to or smaller than half of the size of the material image (that is, the whole image) (step S609). The CPU 105 then determines whether or not the size of the rectangle is equal to or smaller than the half of the size of the material image (step S610).
If the size of the rectangle is equal to or smaller than the half of the size of the material image (step S610: Yes), the CPU 105 sets only a region indicated with part of the rectangle in the material image as a display region to be displayed in the edited moving image (step S611). Meanwhile, if the size of the rectangle exceeds the half of the size of the material image (step S610: No), the CPU 105 sets the whole material image as a display region in the edited moving image (step S612).
After the processing of step S611 or S612, the CPU 105 checks whether the short side of the display region extends in the longitudinal direction or in the lateral direction (step S613). The CPU 105 then determines whether or not the short side of the display region extends in the longitudinal direction, that is, whether or not the display region is horizontally long (step S614).
If the display region is vertically long (step S614: No), the CPU 105 increases or decreases the size of the display region so that the width of the edited moving image is the same as the width of the display region (step S615). Meanwhile, if the display region is horizontally long (step S614: Yes), the CPU 105 increases or decreases the size of the display region so that the height of the edited moving image is the same as the height of the display region (step S616).
After the processing of step S615 or step S616, the CPU 105 checks whether or not the length of the long side of the display region is the same as the length of the corresponding side in the edited moving image when increasing or decreasing the size of the display region (step S617). That is, the CPU 105 checks whether or not the ratio of the height to the width of the display region is the same as the ratio of the height to the width of the edited moving image. The CPU 105 then determines whether or not the ratio of the height to the width of the display region is the same as that of the edited moving image (step S618).
If the ratio of the height to the width of the display region is the same as that of the edited moving image (step S618: Yes), the CPU 105 determines not to scroll display the display region in the edited moving image (step S619). Meanwhile, if the ratios are not the same (step S618: No), the CPU 105 determines to scroll display the display region from one end of its long side to the other so that the whole display region can be displayed in the edited moving image (step S620).
It should be noted that, if only a specific region of the material image is displayed regardless of whether or not to perform scroll display, the CPU 105 records an upper left coordinate and a lower right coordinate of the rectangle defining the display region in the composition information table as a scroll position 901.
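The decision flow of steps S608 to S620 can be summarized as one function. The sketch below assumes a hypothetical signature in which the subject rectangle, the whole material image and the edited moving image frame are each given as width/height pairs; it is illustrative, not the disclosed implementation.

```python
def plan_display(rect_w, rect_h, image_w, image_h, movie_w, movie_h):
    """Sketch of steps S608-S620: `rect_*` is the rectangle surrounding
    the main subject, `image_*` the whole material image, and `movie_*`
    the frame of the edited moving image.

    Returns (use_rectangle_only, scale_factor, scroll_needed).
    """
    # S610-S612: use only the rectangle if it covers at most half the image
    use_rect = rect_w * rect_h <= (image_w * image_h) / 2
    w, h = (rect_w, rect_h) if use_rect else (image_w, image_h)
    # S614-S616: a horizontally long region is fitted to the movie height,
    # a vertically long region to the movie width
    scale = movie_h / h if w >= h else movie_w / w
    # S618-S620: scroll display only when the aspect ratios differ
    scroll = abs(w / h - movie_w / movie_h) > 1e-9
    return use_rect, scale, scroll
```

For example, a small square subject rectangle inside a 16:9 moving image yields a display region whose aspect ratio differs from the frame, so scroll display is chosen.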
Here, visual confirmation of the edited moving image when the material image is scroll displayed at the camera 100 shown in
Now, if scroll display is performed at the camera 100, images, for example, as shown in
The display region 1205 shown in
The display region 1205 is vertically long compared with the longitudinal and lateral size of the edited moving image. Therefore, when the display region 1205 is displayed in the edited moving image, the display range is scrolled in the longitudinal direction.
First, a display region 1202a (
Given the reproduction period indicated in the reproduction period table shown in
That is, as shown in
Referring to
The shooting period of the material image can be confirmed by the date and time of shooting 502 shown in
Subsequently, the CPU 105 records the composition information table in the SD card 101 as a composition information file (step S622). The CPU 105 then finishes creation of the composition information and proceeds to processing of step S304 shown in
In the SD card 101, the composition information file 1402 is stored with a file name of NEW_0100.XML in a folder formed in a layer which is the same as the layer of the material image.
When the composition information file is recorded in the SD card 101, the CPU 105 finishes creation of the composition information and proceeds to processing of step S304 shown in
Upon generating the edited moving image, the CPU 105 confirms a size table (specification of the edited moving image) shown in
As shown in
When the user presses the print button 1501, the CPU 105 transmits the composition information file 1402 and the material images used for generating the edited moving image to the server PC 110 using the WiFi connection module 103. It should be noted that before the edited moving image is displayed at the display 104, the CPU 105 confirms whether or not the server PC 110 which coordinates with the camera 100 supports printing using the composition information file. That is, the CPU 105 confirms a function of the server PC 110.
When the confirmation processing is started, the CPU 105 confirms whether the server PC 110 has a function (printing function) of printing the material images using the composition information file (step S1601).
The CPU 105 performs confirmation of the printing function (hereinafter also referred to as printing capacity) with the server PC 110 using the WiFi connection module 103 (step S1701). In response to the confirmation, the server PC 110 responds to the camera 100 with its printing capacity (step S1702).
When receiving the response from the server PC 110, the CPU 105 determines that the moving image can be printed at the server PC 110 (step S1703). Meanwhile, if the CPU 105 confirms the printing capacity while the server PC 110 is stopped (step S1704), because the server PC 110 does not respond, the CPU 105 determines that printing is impossible at the server PC 110 (step S1705).
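The request/response exchange of steps S1701 to S1705 reduces to "a response means printing is possible; no response means it is not." The sketch below abstracts the WiFi transport behind a hypothetical callable so the decision logic can be shown on its own.

```python
def confirm_printing_capacity(query_server, timeout_s=5.0):
    """Sketch of steps S1701-S1705. `query_server` is a hypothetical
    callable that returns the server PC's printing capacity, or raises
    TimeoutError when the server PC is stopped and does not respond."""
    try:
        capacity = query_server(timeout_s)   # S1701/S1702: query and response
    except TimeoutError:
        return None                          # S1705: printing is impossible
    return capacity                          # S1703: printing is possible
```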
Referring to
When the print button 1501 is hidden, as shown in
When the printing capacity of the server PC 110 can be confirmed (step S1602: Yes), the CPU 105 confirms a minimum image size required for the server PC 110 to print a still image (step S1604).
When the confirmation processing of the minimum image size is started, the CPU 105 inquires of the server PC 110, via the WiFi connection module 103, about the minimum image size required to perform printing (step S1801). The server PC 110 transmits a response regarding the minimum image size to the camera 100 in response to the inquiry (step S1802).
The CPU 105 decreases the size of an image (material image) to be uploaded to the server PC 110 to the minimum image size (printable image size) according to the minimum image size in the response (step S1803) and uploads the image to the server PC 110 (step S1804). The CPU 105 then finishes the confirmation processing of the function of the server PC 110.
Now, when the print button 1501 shown in
When the user inputs the email address in an email address input field 1503 at the email address screen and presses an enter button 1504, input of the email address is completed. When input of the email address is completed, the CPU 105 specifies a composition information file corresponding to the edited moving image which is determined to be printed (step S1902).
Here, because the moving image file 1401 is printed, the CPU 105 specifies the composition information file 1402 corresponding to the moving image file 1401 in the SD card 101. That is, the CPU 105 specifies a file which is located in the same layer as the layer of the moving image file 1401 and which has the same file name other than the extension, as the composition information file 1402.
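The lookup described here (same layer, same file name, different extension) is a simple path transformation. The sketch below assumes an .XML extension for the composition information file, as in the NEW_0100.XML example that appears later in the text.

```python
from pathlib import Path

def composition_file_for(movie_path):
    """Locate the composition information file for a moving image file:
    same folder, same file name, different extension (.XML is assumed
    from the NEW_0100.XML example in the text)."""
    return Path(movie_path).with_suffix(".XML")
```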
Subsequently, the CPU 105 specifies material images used in the moving image file 1401 with reference to the composition information file 1402 (step S1903). When specifying of the material images is completed, the CPU 105 generates images for printing from the material images (step S1904).
Here, if the material images include a moving image, the CPU 105 generates three still images for printing from the moving image. The CPU 105 then stores the still images in the layer where the original moving image is located. Upon storage of the still images, the CPU 105 uses file names in which an underscore and a segment number are appended to the file name of the original moving image, followed by a JPG extension.
For example, in the example shown in
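The naming rule for the stills cut from a moving image can be sketched as follows; numbering the segments from 1 is an assumption for illustration.

```python
from pathlib import Path

def print_frame_names(movie_file_name, segments=3):
    """File names for the still images generated from a moving image: an
    underscore and a segment number are appended to the movie's file
    name, with a JPG extension. Numbering from 1 is an assumption."""
    stem = Path(movie_file_name).stem
    return [f"{stem}_{n}.JPG" for n in range(1, segments + 1)]
```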
It should be noted that when the minimum image size is obtained from the server PC 110 and the size of the material image is larger than the minimum image size, the CPU 105 decreases the size of the material image to the minimum image size.
Subsequently, the CPU 105 modifies the composition information file 1402 according to the images for printing to generate a transmission composition information file 1403. This transmission composition information file 1403 is stored in the same layer as the layer of the composition information file 1402. At this time, the CPU 105 sets the file name of the transmission composition information file 1403 by changing NEW, which is the prefix of the file name of the composition information file 1402, to PRT.
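The prefix change from NEW to PRT described above amounts to the following small transformation; the function name is hypothetical.

```python
def transmission_file_name(composition_file_name, old="NEW", new="PRT"):
    """Derive the transmission composition information file name by
    replacing the NEW prefix of the composition information file name
    with PRT, as described in the text."""
    if composition_file_name.startswith(old):
        return new + composition_file_name[len(old):]
    return composition_file_name   # no NEW prefix: name left unchanged
```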
The CPU 105 then transmits the transmission composition information file 1403 corresponding to the moving image file 1401 determined to be printed, the images for printing corresponding to the transmission composition information file 1403, and the email address to the server PC 110 (step S1906). The CPU 105 then finishes the printing processing procedure.
At the server PC 110, the CPU 114 receives the transmission composition information file 1403, the images for printing 201, 202, 203, 204, 205, 206, 208, 1404, 1405 and 1406 and the email address (step S2001).
In
The CPU 114 stores the transmission composition information file, the images for printing and the email address in the HDD 111. Here, the transmission composition information file and the images for printing are stored in the HDD 111 respectively as the transmission composition information file 1403a and the images for printing 201a to 204a, 208a, 206a and 205a.
At this time, the email address is described in an address file 2202 and stored in the HDD 111 as a file. Further, if the images for printing include a moving image, the CPU 114 divides the moving image into three parts having the same reproduction period. The CPU 114 then cuts out the first frame of each part as a still image and modifies the part of the transmission composition information file relating to the moving image. Further, the CPU 114 stores the three still images in place of the moving image file.
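Dividing the moving image into three parts of equal reproduction period and cutting out the first frame of each part means sampling the frames at the start times computed below; this is a sketch of the timing only, not of the frame extraction itself.

```python
def segment_start_times(duration_s, parts=3):
    """Start times (in seconds) of the equal reproduction periods into
    which the moving image is divided; the first frame of each period is
    the one cut out as a still image."""
    segment = duration_s / parts
    return [round(i * segment, 3) for i in range(parts)]
```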
Referring to
When the determination processing of the print layout is started, the CPU 114 reads out effect information used for the material images based on the transmission composition information file. The CPU 114 then confirms whether or not there is page layout (hereinafter, also simply referred to as layout) corresponding to the effect information (step S2301).
In the HDD 111, as shown in
In a layout name 2401, a name of the print layout is described, and the details are defined for each layout name. In the number of pages 2402, the number of pages used in each layout name is defined. In corresponding effect 2403, a name for each type of effect in the moving image which is designated to be printed is defined. The print layout for an effect is specified by comparing the effect name defined in the transmission composition information file 1403a with the effect name of the corresponding effect 2403 and finding a match.
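The matching of an effect name against the corresponding-effect column can be sketched as a table lookup (the effect names, layout names, and page counts below are illustrative values taken from this description, not a literal dump of the print layout definition file 2201):

```python
# Simplified stand-in for the print layout definition file 2201:
# corresponding effect 2403 -> (layout name 2401, number of pages 2402)
LAYOUT_DEFINITIONS = {
    "title":               ("front page layout", 1),
    "flash":               ("flash layout", 2),
    "continuous shooting": ("continuous shooting layout", 2),
    "still image":         ("still image layout", 1),
}

def layout_for_effect(effect_name: str):
    """Return the print layout whose corresponding effect matches the
    effect name defined in the transmission composition information
    file, or None when there is no corresponding layout."""
    return LAYOUT_DEFINITIONS.get(effect_name)
```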
In a displayed character string 2404, a character string to be displayed in each layout is defined. In layout specification 2405, how the images and the character string are arranged in each layout is defined.
In the example shown in the figure, when the image used for the title effect in the transmission composition information file 1403a is printed, the front page layout is selected from the print layout definition file 2201. Upon printing, one page is used, and the image and a character string for displaying the title effect are used. On the paper sheet, the image is arranged at the left side, and the character string is arranged at the right side of the image. By this means, the front page layout becomes, for example, the layout shown in
Further, when the image used for flash effect is printed, the flash layout is selected from the print layout definition file 2201. Upon printing, two pages are used, and a plurality of pages of images are printed. At this time, a character string is not printed. Images are randomly arranged on the two pages of paper sheets. By this means, the flash layout becomes, for example, the layout shown in
When the image used for the continuous shooting effect (display switching effect) is printed, the continuous shooting layout is selected from the print layout definition file 2201. Upon printing, two pages are used, and a plurality of pages of images are printed. At this time, a character string is not printed. The images are arranged in order of shooting on the two pages of paper sheets. By this means, the continuous shooting effect layout becomes, for example, the layout shown in
In the example shown in
When the image used for still image effect is printed, the still image layout is selected from the print layout definition file 2201. Upon printing, one page is used, and one image is printed. At this time, a character string is not printed. The image is arranged in the center of one page of a paper sheet. By this means, the still image effect layout becomes, for example, the layout shown in an image frame 2502 in
When the image used for highlight display effect is printed, the highlight display layout is selected from the print layout definition file 2201. Upon printing, two facing pages are used, and one image is printed. At this time, a character string is not printed. By this means, the highlight display effect layout becomes, for example, the layout shown in an image frame 2506 in
When the image used for cameraman introduction effect (photographer introduction effect) is printed, the cameraman introduction layout is selected from the print layout definition file 2201. Upon printing, one page is used, and one image and a character string are printed. By this means, the cameraman introduction effect layout becomes, for example, the layout shown in an image frame 2504 and a character string 2505 in
The effect of “combination of still image and moving image” shown in a bottom row of
The combination of still image and moving image corresponds to a combination of the usual still image and still images extracted from the moving image which is shot before or after automatically shooting the usual still image.
In a displaying effect, the still images extracted from the moving image which is shot before or after automatically shooting the usual still image are continuously displayed for a short time. After that, the usual still image is displayed large for a long time. A detailed description of laying out such a moving image will be made with reference to
When the image used for moving image effect is printed, the moving image layout is selected from the print layout definition file 2201. Upon printing, one page is used, and a plurality of still images constituting the moving image are printed. At this time, a character string is not printed. The plurality of images are arranged in the longitudinal direction in order of shooting on one page of a paper sheet. By this means, the moving image effect layout becomes, for example, the layout shown in an image frame 2501 in
Referring to
If there is corresponding layout (step S2302: Yes), the CPU 114 confirms whether the number of images for printing falls within the number of images defined in the layout (step S2304). It should be noted that the upper limit number of images is defined in an image upper limit 2406 in the print layout definition file 2201.
The CPU 114 then determines whether or not the number of images for printing falls within the number of images (upper limit number of images) defined in the corresponding layout (step S2305). If the number of images for printing does not fall within the number of images defined in the corresponding layout (step S2305: No), the CPU 114 multiplies the number of pages required for printing in the corresponding layout by an integer so that all the images for printing can be arranged (step S2306). That is, the CPU 114 increases the number of pages in the layout so that the number of material images per page is equal to or less than the upper limit number of images.
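The page multiplication of step S2306 amounts to a ceiling division, which can be sketched as follows (the function and parameter names are hypothetical; the image upper limit corresponds to the image upper limit 2406):

```python
import math

def pages_needed(num_images: int, upper_limit: int, base_pages: int) -> int:
    """Multiply the base page count of the layout by the smallest
    integer that brings the number of images per layout instance at
    or under the upper limit number of images (image upper limit 2406)."""
    multiplier = max(math.ceil(num_images / upper_limit), 1)
    return base_pages * multiplier
```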
Meanwhile, if the number of images for printing falls within the number of images defined in the corresponding layout (step S2305: Yes), the CPU 114 secures required pages for printing in the corresponding layout (step S2307).
After the processing of step S2303, S2306 or S2307, the CPU 114 determines whether or not layout is determined for all effects (step S2308). If layout is determined for all effects (step S2308: Yes), the CPU 114 finishes the print layout determination processing.
Meanwhile, if layout is not determined for all effects (step S2308: No), the CPU 114 targets an effect for which layout has not yet been determined (step S2309). The CPU 114 then returns to the processing of step S2301. It should be noted that even if the type of effect is the same, if the images largely differ in the shooting period or the shooting target, and the images are defined as different groups in the transmission composition information file, layout is determined for each group.
Referring to
Subsequently, the CPU 114 determines whether or not a material image is scroll displayed in the moving image with reference to the transmission composition information file 1403a (step S2005). If the material image is not scroll displayed (step S2005: No), the CPU 114 sets the whole material image as a print range (step S2006).
Meanwhile, if the material image is scroll displayed (step S2005: Yes), the CPU 114 sets only part of the material image displayed in the moving image as the print range (step S2007).
As shown in
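The print-range selection of steps S2006 and S2007 can be sketched as follows (a minimal sketch; regions are represented as hypothetical (left, top, width, height) tuples in pixels):

```python
def print_range(image_width: int, image_height: int, displayed_region=None):
    """Step S2006/S2007: if the material image was scroll displayed,
    only the part displayed in the moving image becomes the print
    range; otherwise the whole material image is printed."""
    if displayed_region is None:          # not scroll displayed
        return (0, 0, image_width, image_height)
    return displayed_region               # scroll displayed: partial range
```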
After the processing of step S2006 or S2007, the CPU 114 acquires a character string for layout required to be printed from the transmission composition information file 1403a (step S2008). Here, because it can be known that the transmission composition information file 1403a includes title effect, title layout which is layout for title effect is used.
Because a character string is printed along with the image in the title layout, the CPU 114 acquires a character string to be used in the title effect from the transmission composition information file 1403a. Here, a character string 902 shown in
Subsequently, the CPU 114 notifies the camera 100 of completion of printing preparation (step S2009). This notification is performed toward the email address received in the above-described step S2001.
After issuing a notification of completion of printing preparation, the CPU 114 prepares a print preview (step S2010). This print preview is print data for performing printing, and is used by the user to confirm the print result before printing. After receiving the notification of completion of printing preparation at the camera 100, the user accesses the server PC 110 through a Web browser using the camera 100. By this means, the user can open a print preview confirmation screen.
In the print preview confirmation screen shown in
Further, in the print preview confirmation screen, an order button 2604 is displayed, and, if the user confirms the print preview 2603 and is satisfied with it, the user can order the printed matter as a book by manipulating the order button 2604.
Meanwhile, if the user confirms the print preview 2603 and desires to change content to be printed, the user can perform editing by manipulating an editing button 2605.
When the editing button 2605 is pressed, the editing screen shown in
Further, on the editing screen, a stock image field 2612 is displayed, and if the user uses an image other than the images used in the print preview 2603, the user can select an image from the stock image field 2612 and drop the image to the print preview 2603.
It should be noted that, the images arranged in the stock image field 2612 are, for example, images uploaded to the server PC 110 when the moving image was previously printed at the camera 100. Further, the images arranged in the stock image field 2612 may be images acquired from social networking service other than the server PC 110.
Further, on the editing screen, a layout change button 2611 for changing layout of the displayed page is displayed. When the layout change button 2611 is pressed, a screen shown in
On the screen shown in
If the number of images required for the layout after switching is larger than the number of images required for the layout prior to switching, the CPU 114 prompts the user to select the missing images from the stock image field 2612. Meanwhile, if the number of images required for the layout after switching is smaller than the number of images required for the layout prior to switching, the CPU 114 determines the images to be used according to the date and time of shooting (for example, in order of date and time of shooting).
Referring to
When the image arrangement processing is started, the CPU 114 acquires a combination of images to be laid out on a page of an album as will be described later based on the moving image and still images stored in the HDD 111 (step S2701).
It should be noted that the combination of images of the group 2804 may be one other than that as described above. For example, the combination may be a combination of still images selected from still images obtained through continuous shooting or a combination of still images selected or extracted from image information under predetermined conditions. Further, the still image 2802 which will be described below is any one or a plurality of the still images 2802a to 2802c extracted from the moving image 2801.
The above-described group 2804 is transmitted from the camera 100 to the server PC 110. At the server PC 110, the CPU 114 stores the group 2804 in the HDD 111 as the moving image and the still images.
The CPU 114 acquires an amount of difference between the images in the group 2804 (step S2702). For example, the CPU 114 calculates respective amounts of difference between the still images 2802a to 2802c extracted from the moving image 2801 and the still image 2803 and sets an average value of the amounts as the amount of difference of the group 2804. For example, the CPU 114 creates thumbnail images of the same size for the still images 2802a to 2802c and 2803 and calculates an amount of difference between the thumbnail images. As the amount of difference, for example, a change amount of the thumbnail images, a change amount in predetermined regions of the thumbnail images, sizes of the thumbnail images, a difference in characteristic amounts of the thumbnail images or a difference in color tone (color temperature, color shade, intensity) of the thumbnail images is used.
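Using the change amount of the thumbnail images as the amount of difference, step S2702 can be sketched as follows (a minimal sketch assuming equal-sized grayscale thumbnails represented as flat lists of pixel values; the function names are hypothetical):

```python
def pixel_difference(thumb_a, thumb_b):
    """Mean absolute pixel difference between two equal-sized
    grayscale thumbnails (one possible change amount)."""
    return sum(abs(a - b) for a, b in zip(thumb_a, thumb_b)) / len(thumb_a)

def group_difference(extracted_thumbs, picked_up_thumb):
    """Step S2702: average the difference of each still image
    extracted from the moving image against the picked-up still
    image to obtain the amount of difference of the group."""
    diffs = [pixel_difference(t, picked_up_thumb) for t in extracted_thumbs]
    return sum(diffs) / len(diffs)
```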
Subsequently, the CPU 114 acquires object content of the still images 2802a to 2802c and 2803 (step S2703). For example, the CPU 114 performs object recognition processing on the still images 2802a to 2802c and 2803 and acquires the content of the objects, that is, the type of the subject. The CPU 114 identifies, as the type of the subject, whether or not there is a person among the objects included in the still images 2802a to 2802c and 2803 using a person detection method used in known image processing. The person detection method includes, for example, a method using a brightness gradient direction histogram feature, which is obtained by representing the brightness gradient direction in each region of the objects included in the still images 2802a to 2802c and 2803 as a histogram.
As will be described later, the CPU 114 determines layout of the group 2804 (step S2704). The CPU 114 then creates a page by laying out the still images 2802a to 2802c and 2803 according to the layout determined in the processing of step S2704 (step S2705). Subsequently, the CPU 114 finishes the image arrangement processing.
When the layout determination processing is started, the CPU 114 determines whether or not the amount of difference of the group 2804 acquired in the processing of step S2702 is equal to or greater than a predetermined threshold (step S2901). As this threshold, for example, an amount of difference at which the still images 2802a to 2802c are not similar to the still image 2803 may be set. When the amount of difference is equal to or greater than the predetermined threshold (step S2901: Yes), the CPU 114 determines that the still images 2802a to 2802c are not similar to the still image 2803. The CPU 114 then determines whether or not the object content acquired in the processing of step S2703 satisfies predetermined determination conditions (step S2902). In the processing of step S2902, the CPU 114 uses whether or not there is a person in the image of the group 2804 as the determination conditions.
When the object content satisfies the determination conditions (step S2902: Yes), the CPU 114 determines that there is a person in the image of the group 2804 and selects layout A which will be described later (step S2903). The CPU 114 then finishes the layout determination processing. Meanwhile, if the object content does not satisfy the determination conditions (step S2902: No), the CPU 114 determines that there is no person in the image of the group 2804 and selects layout B which will be described later (step S2904). The CPU 114 then finishes the layout determination processing.
When the amount of difference is less than the predetermined threshold (step S2901: No), the CPU 114 determines that the still images 2802a to 2802c are similar to the still image 2803. The CPU 114 then determines whether or not the object content acquired in step S2703 satisfies the determination conditions (step S2905). If the object content satisfies the determination conditions (step S2905: Yes), the CPU 114 determines that there is a person in the image of the group 2804 and selects layout C which will be described later (step S2906). The CPU 114 then finishes the layout determination processing. Meanwhile, if the object content does not satisfy the determination conditions (step S2905: No), the CPU 114 determines that there is no person in the image of the group 2804 and selects layout D which will be described later (step S2907). The CPU 114 then finishes the layout determination processing.
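The four branches of steps S2901 to S2907 form a small decision tree, which can be sketched as follows (the threshold value is an illustrative assumption; the function name is hypothetical):

```python
def select_layout(group_difference: float, contains_person: bool,
                  threshold: float = 50.0) -> str:
    """Steps S2901-S2907: layout A/B when the images are not similar
    (difference at or above the threshold), layout C/D when they are
    similar; A and C when a person is present, B and D otherwise."""
    if group_difference >= threshold:
        return "A" if contains_person else "B"
    return "C" if contains_person else "D"
```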
It should be noted that, while, in the group 3201, all the still images 3201a to 3201d include persons, for example, when there is no person in any of the still images 3201a to 3201d, a still image which includes a person may be laid out at a position of the still image 3201d of the page 3202. Then, the remaining still images may be respectively laid out at positions of 3201a to 3201c.
In the example shown in
In the example shown in
As described above, because a plurality of images are combined, and layout is determined based on the image information, it is possible to set layout appropriate for the combination of the still images. Further, if similar still images are combined, it is possible to avoid similar images from being arranged by excluding similar images from the layout.
After the processing of step S2702, the CPU 114 determines layout of the still images included in the group 2804 as will be described later (step S3301). The CPU 114 then generates a page by laying out the still images 2802a to 2802c and 2803 based on the layout determined in step S3301.
When the layout determination processing is started, the CPU 114 determines whether or not an amount of difference of one of the still images 2802a to 2802c acquired in step S2702 is equal to or greater than the predetermined threshold (step S3401). When the amount of difference is equal to or greater than the predetermined threshold (step S3401: Yes), the CPU 114 determines that the images are not similar images. The CPU 114 then counts the number of still images whose amount of difference is equal to or greater than the threshold (step S3402). It should be noted that, in step S3402, an initial value of the number of images is “0”.
Subsequently, the CPU 114 determines whether or not the amount of difference has been compared with the predetermined threshold for all the still images 2802a to 2802c (step S3403). If the amount of difference has not been compared with the predetermined threshold for all the still images 2802a to 2802c (step S3403: No), the CPU 114 returns to the processing of step S3401. It should be noted that if the amount of difference is less than the predetermined threshold (step S3401: No), the CPU 114 determines that the images are similar images, and proceeds to the processing in step S3403.
When the amount of difference is compared with the predetermined threshold for all the still images 2802a to 2802c (step S3403: Yes), the CPU 114 determines that the number of still images whose amounts of difference are equal to or greater than the predetermined threshold is acquired. The CPU 114 then determines whether or not the number of images counted in the processing of step S3402 is equal to or greater than two (step S3404). If the number of images is equal to or greater than two (step S3404: Yes), the CPU 114 determines that the amounts of difference in the still images whose amounts of difference are equal to or greater than the predetermined threshold can be calculated. The CPU 114 then calculates amounts of difference for the still images whose amounts of difference are determined to be equal to or greater than the predetermined threshold in the processing of step S3401 (step S3405).
Subsequently, the CPU 114 determines whether or not the amounts of difference calculated in step S3405 are equal to or greater than the predetermined threshold (step S3406). If the amounts of difference are equal to or greater than the predetermined threshold (step S3406: Yes), the CPU 114 determines that the still images whose amounts of difference are determined to be equal to or greater than the predetermined threshold in step S3401 are not similar images. The CPU 114 then counts the number of images whose amounts of difference exceed the threshold (step S3407). In step S3407, an initial value of the number of images is “1”.
Subsequently, the CPU 114 determines whether or not the amount of difference is compared with the predetermined threshold for all the still images for which the amounts of difference are determined to be equal to or greater than the threshold in step S3401 (step S3408). It should be noted that if the amount of difference is less than the threshold (step S3406: No), the CPU 114 determines that the still images for which the amounts of difference are determined to be equal to or greater than the predetermined threshold in step S3401 are similar images, and proceeds to the processing of step S3408.
When the amount of difference is not compared with the predetermined threshold for all the still images whose amounts of difference are equal to or greater than the predetermined threshold (step S3408: No), the CPU 114 determines that the number of still images whose amounts of difference are equal to or greater than the threshold has not been acquired, and returns to the processing of step S3405. Meanwhile, if the amount of difference is compared with the predetermined threshold for all the still images whose amounts of difference are equal to or greater than the threshold (step S3408: Yes), the CPU 114 determines that counting of the number of images whose amounts of difference are equal to or greater than the predetermined threshold is finished. The CPU 114 then selects layout appropriate for the number of images acquired in step S3402 and step S3407 (step S3409). The CPU 114 then finishes the layout determination processing. It should be noted that if the number of images is less than two (step S3404: No), the CPU 114 proceeds to the processing of step S3409.
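One reading of the counting in steps S3401 to S3408 can be sketched as follows (this is an interpretation of the loop described above, not a literal transcription of the flowchart; images are represented by hypothetical numeric values and `diff` is a caller-supplied difference function):

```python
def count_distinct(extracted, picked_up, diff, threshold):
    """Count the still images that materially differ, per one reading
    of steps S3401-S3408."""
    # Steps S3401-S3403: stills whose amount of difference from the
    # picked-up still image is equal to or greater than the threshold.
    dissimilar = [s for s in extracted if diff(s, picked_up) >= threshold]
    if len(dissimilar) < 2:        # step S3404: No
        return len(dissimilar)
    # Steps S3405-S3408: among those stills, count the ones that also
    # differ from one another (initial count is 1, per the description).
    count = 1
    for prev, cur in zip(dissimilar, dissimilar[1:]):
        if diff(prev, cur) >= threshold:
            count += 1
    return count
```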
In the still images 2802a to 2802c and the still image 2803, if the number of images whose amounts of difference are equal to or greater than the predetermined threshold is “0”, the CPU 114 selects layout E in which the still image 2803 is arranged as shown in
When the number of images whose amounts of difference with the still image 2803 are equal to or greater than the predetermined threshold is “2”, the CPU 114 acquires the amounts of difference of these two still images 2802. If the amounts of difference of the two still images 2802 are less than the predetermined threshold, the CPU 114 selects layout F in which the still image 2803 and one still image 2802 whose amount of difference with the still image 2803 is equal to or greater than the predetermined threshold are arranged. Because the amounts of difference of these two still images 2802 are less than the predetermined threshold, the CPU 114 determines that there is no great change in the two still images 2802. Therefore, as described above, the CPU 114 selects one still image 2802 out of the two still images 2802 whose amounts of difference with the still image 2803 are equal to or greater than the predetermined threshold. Further, any image may be used if the image has an amount of difference with the still image 2803 equal to or greater than the predetermined threshold.
If the amounts of difference of the two still images 2802 whose amounts of difference with the still image 2803 are equal to or greater than the predetermined threshold are equal to or greater than the predetermined threshold, the CPU 114 selects layout G in which the still image 2803 and the two still images 2802 whose amounts of difference with the still image 2803 are equal to or greater than the predetermined threshold are arranged (see
If the number of images whose amounts of difference with the still image 2803 are equal to or greater than the predetermined threshold is “3”, the CPU 114 acquires the amounts of difference of these three still images 2802a to 2802c and acquires the number of images whose amounts of difference are equal to or greater than the predetermined threshold in these three still images 2802a to 2802c. When the number of images whose amounts of difference are equal to or greater than the predetermined threshold is “1”, the CPU 114 selects layout F in which one of the still images 2802a to 2802c and the still image 2803 are arranged. When the number of images whose amounts of difference are equal to or greater than the predetermined threshold is “2”, the CPU 114 selects layout G in which two still images whose amounts of difference are equal to or greater than the predetermined threshold and the still image 2803 are arranged. When the number of images whose amounts of difference are equal to or greater than the predetermined threshold is “3”, the CPU 114 selects layout H in which the still images 2802a to 2802c whose amounts of difference are equal to or greater than the predetermined threshold and the still image 2803 are arranged (see
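The mapping from the acquired count to layouts E through H can be sketched as a lookup (the function name is hypothetical):

```python
def layout_for_count(num_dissimilar: int) -> str:
    """Layout E: only the picked-up still image 2803 is arranged.
    Layouts F/G/H: 2803 plus one, two, or three dissimilar still
    images extracted from the moving image."""
    return {0: "E", 1: "F", 2: "G", 3: "H"}[min(num_dissimilar, 3)]
```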
In the example shown in
As described above, because the amounts of difference of the still images are used, it is possible to determine layout appropriate for combination of the images.
After the processing of step S2701, the CPU 114 performs the processing of step S2703 described using
When the layout determination processing is started, the CPU 114 compares the object content of the still images 2802a to 2802c acquired in step S2701 with the object content of the still image 2803 to determine whether or not the object content is different (step S3901). Here, the number of persons included in the still images 2802a, 2802b and 2802c and the still image 2803 is regarded as the object content. When the number of persons included in the still image 2803 is different from the number of persons included in the still images 2802a, 2802b and 2802c (step S3901: Yes), the CPU 114 determines that these still images are not similar images. The CPU 114 then counts the number of still images 2802 whose number of persons is different from that of the still image 2803 (step S3902). It should be noted that, in step S3902, an initial value of the number of images is “0”.
Subsequently, the CPU 114 determines whether or not the object content of all the still images 2802a to 2802c is compared with the object content of the still image 2803 (step S3903). If the object content of all the still images 2802a to 2802c is not compared with the object content of the still image 2803 (step S3903: No), the CPU 114 returns to the processing of step S3901. It should be noted that, if the number of persons is the same (step S3901: No), the CPU 114 determines that the images are similar images, and proceeds to the processing of step S3903.
If the object content of all the still images 2802a to 2802c is compared with the object content of the still image 2803 (step S3903: Yes), the CPU 114 determines that the number of still images whose number of persons is different is acquired. The CPU 114 then determines whether or not the number of images counted in the processing of step S3902 is two or more (step S3904). If the number of images is two or more (step S3904: Yes), the CPU 114 determines that the object content can be compared among the still images 2802 whose object content is different. The CPU 114 then determines whether or not the object content is different among the still images 2802 whose object content is determined to be different in the processing of step S3901 (step S3905).
If the object content is different among the still images 2802 (step S3905: Yes), the CPU 114 counts the number of images whose object content is different (step S3906). It should be noted that an initial value of the number of images in step S3906 is 1. The CPU 114 then determines whether or not the object content of all the still images 2802 whose object content is determined to be different in step S3901 is compared (step S3907). If the object content of all the still images 2802 is compared (step S3907: Yes), the CPU 114 selects layout appropriate for the number of images obtained in step S3902 and step S3906 (step S3908). Subsequently, the CPU 114 finishes the layout determination processing.
If the object content is the same among the still images 2802 (step S3905: No), the CPU 114 proceeds to the processing of step S3907. If the object content of all the still images 2802 has not been compared (step S3907: No), the CPU 114 determines that the comparison is not finished, and returns to the processing of step S3905.
In step S3904, if the number of images is less than “2” (step S3904: No), the CPU 114 determines that the object content cannot be compared among the still images 2802 whose object content is different, and proceeds to the processing of step S3908.
The layouts E to H indicated in Table 3 are the same as the layouts indicated in Table 2. Here, description will be provided using a group 2804 comprised of three still images 2802a to 2802c extracted from the moving image 2801 and one picked up still image 2803 as an example. In the still images 2802a to 2802c and the still image 2803, if the number of images whose number of persons is different from that of the still image 2803 is "0", the CPU 114 selects the layout E in which the still image 2803 is arranged. If the number of images whose number of persons is different from that of the still image 2803 is "1", the CPU 114 selects the layout F in which the still image 2803 and the still image 2802 whose number of persons is different from that of the still image 2803 are arranged.
If the number of images whose number of persons is different from that of the still image 2803 is “2”, the CPU 114 compares the number of persons included in these two still images 2802. If the number of persons included in the two still images 2802 is the same, the CPU 114 selects the layout F in which the still image 2803 and one of the two still images 2802 are arranged. Meanwhile, if the number of persons included in the two still images 2802 is different from each other, the CPU 114 selects the layout G in which the still image 2803 and the two still images 2802 are arranged.
If the number of images whose number of persons is different from that of the still image 2803 is “3”, the CPU 114 compares the number of persons among these three still images 2802a to 2802c. If the number of images whose number of persons is different among the three still images 2802a to 2802c is “1”, the CPU 114 selects the layout F in which one of the still images 2802a to 2802c and the still image 2803 are arranged. It should be noted that if the number of images whose number of persons is different is “0”, the number of persons in the still images 2802a to 2802c is the same.
If the number of images whose number of persons is different among the still images 2802a to 2802c is “2”, the CPU 114 selects the layout G in which the two still images 2802 whose number of persons is different and the still image 2803 are arranged. If the number of images whose number of persons is different among the still images 2802a to 2802c is “3”, the CPU 114 selects the layout H in which the still images 2802a to 2802c and the still image 2803 are arranged.
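The person-count-based selection described above can be sketched as follows (a minimal sketch in which each image is reduced to its person count; treating stills with equal person counts as similar images, per the description above; the function name is hypothetical):

```python
def layout_by_person_count(extracted_counts, picked_up_count):
    """Select layout E-H from the person counts of the extracted
    still images and of the picked-up still image 2803."""
    # Stills whose person count differs from that of the picked-up image.
    differing = [c for c in extracted_counts if c != picked_up_count]
    if len(differing) >= 2:
        # Among those stills, keep only distinct person counts:
        # stills with the same count are treated as similar images.
        differing = list(dict.fromkeys(differing))
    return {0: "E", 1: "F", 2: "G", 3: "H"}[min(len(differing), 3)]
```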
In the example shown in
As described above, because the layout is determined based on the object content identified in the plurality of still images, it is possible to determine layout appropriate for the combination of the images.
It should be noted that, while, in the above-described example, the object content is determined according to whether or not there is a person in the still image, the object content is not limited to a person. For example, the object content may be determined according to whether or not there is an authenticated person or whether or not there is a person who is focused on. Further, the object content may be determined according to whether or not the still image is a favorite image or whether or not there is a specific object other than a person. In this case, when a person is authenticated, the person is authenticated using a known face authentication method. Further, concerning a person who is focused on, determination is performed according to whether or not an image is picked up while the person is focused on upon image pickup. Still further, concerning whether or not the still image is a favorite image, determination is performed using a known rating.
In the known face authentication method, for example, the position of the face is detected from the image, and each characteristic part of the face is detected. A person is then authenticated by performing normalization processing, such as adjusting the angle based on the position of each part, and checking the face against face templates which have already been registered. Further, the rating is, for example, a numerical value set for the image using a function of an image processing application.
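The detect–normalize–match pipeline described above can be illustrated with a toy example. The 2-D feature points, the convention that the first two points are the eye positions, and the distance threshold are assumptions for the sketch; a real face authentication system uses far richer features than a handful of coordinates.

```python
import math

def normalize(points):
    """Angle adjustment: rotate and translate the feature points so
    the two eye points (assumed to be points[0] and points[1]) lie on
    a horizontal line centered on the origin."""
    (lx, ly), (rx, ry) = points[0], points[1]
    cx, cy = (lx + rx) / 2, (ly + ry) / 2      # midpoint between the eyes
    angle = math.atan2(ry - ly, rx - lx)       # tilt of the eye line
    cos_a, sin_a = math.cos(-angle), math.sin(-angle)
    out = []
    for x, y in points:
        x, y = x - cx, y - cy
        out.append((x * cos_a - y * sin_a, x * sin_a + y * cos_a))
    return out

def authenticate(points, templates, threshold=5.0):
    """Check normalized points against registered templates; return the
    name of the closest template within the threshold, else None."""
    p = normalize(points)
    best_name, best_dist = None, float("inf")
    for name, tmpl in templates.items():
        t = normalize(tmpl)
        d = sum(math.dist(a, b) for a, b in zip(p, t))
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None
```

Because both the input and the registered template are normalized before matching, a tilted shot of the same face still matches its registered template.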
Further, while, in the above-described example, still images are extracted from the moving image which is automatically picked up before the still images are picked up, still images may instead be extracted from a moving image which is picked up after the still images are picked up. Further, while, in the above-described example, a layout held by the server PC is selected, the CPU 114 may generate a layout which satisfies the determination conditions according to programmed processing.
Further, while, in the example shown in
Further, the smartphone 120 shown in
Further, the smartphone 120 may be used as the print control apparatus, so that an image for printing, or the like, is transmitted from the camera 100 to the smartphone 120. In this case, the smartphone 120 prints a moving image using the printer 150 in a similar manner to the server PC 110.
As described above, according to the embodiment of the present invention, even when still images are printed using a moving image, the user can easily recognize content expressed with the moving image which is a printing source, from the still images obtained through printing.
As is clear from the above description, in the example shown in
Further, the server PC 110 is a print control apparatus, in which the CPU 114 and the HDD 111 function as storage units, and the CPU 114 functions as a layout determining unit and a print data generating unit. Further, the CPU 114 and the HDD 111 function as memory units, and the CPU 114 and the network board 113 function as a receiving unit, a notifying unit, a preview transmitting unit, an editing unit and a print control unit. It should be noted that the print control apparatus may transmit print data to the remotely located printer 150 via a wired or wireless network and give an instruction of printing, or may incorporate the printer 150.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Applications No. 2014-210650, filed Oct. 15, 2014, and No. 2015-124698, filed Jun. 22, 2015, which are hereby incorporated by reference herein in their entirety.
Number | Date | Country | Kind
---|---|---|---
2014-210650 | Oct 2014 | JP | national
2015-124698 | Jun 2015 | JP | national