The present invention relates to a technique for simply combining a plurality of images.
For acquiring an image with proper gloss and shade, a technique has hitherto been known that combines images shot with lighting applied from various angles into a single image.
Patent Document 1 describes shooting a subject under lighting from a plurality of angles and combining the images based on the quantity of light.
In Patent Document 1, the quantity of light illuminating the subject must be specified for the combining. Though this simplifies the acquisition of a combined image, the quantity of light still needs to be controlled depending on the subject.
An object of the present invention is to acquire a combined image simply, through automatic determination of the combining proportion, without relying on manual control of the quantity of light as in Patent Document 1.
In order to solve the above problem, the present invention includes: an input unit that receives a first image obtained by shooting a subject with light applied thereto from a first lighting device in a diagonal direction, receives a second image obtained by shooting the subject with light applied thereto from a second lighting device in an opposite diagonal direction, and accepts information on the subject and a shooting device; a blend pattern generating unit that calculates, based on the accepted information, a ratio Z determining, for each pixel, the proportion in which the first image and the second image are combined; an image blending unit that obtains a third image by adding, for each pixel, the first image multiplied by the ratio Z and the second image multiplied by 1−Z; and an output unit that outputs the third image.
According to the present invention, images can be simply combined.
Embodiments will now be described with reference to the drawings.
An L-lighting 102 and an R-lighting 103 each have a function, implemented by a light source control unit 105, to turn on a light source 104 in response to a turn-on command received from a computer 134 via a communication unit 106 and to turn off the light source 104 in response to a turn-off command. The L-lighting 102 and the R-lighting 103 are disposed diagonally and oppositely diagonally, respectively, with respect to a subject.
A camera 111 includes a shooting unit 112, a shooting control unit 113, and a communication unit 114. The shooting control unit 113 has a function, in response to a shooting command received from the computer 134 via the communication unit 114, to form an image of the subject on an image pickup device of the shooting unit 112 through a lens of the shooting unit 112, to convert the image into digital data with the image pickup device, and to store the data in a temporary memory of the shooting unit 112. The shooting control unit 113 further has a function, in response to a transfer command received from the computer 134 via the communication unit 114, to convert the image data stored in the temporary memory of the shooting unit 112 into a common image file format (JPEG format, TIFF format, etc.) and to transfer the image data to the computer 134.
The computer 134 includes a storage unit 121, a control unit 129 (CPU), a memory 130, a communication unit 131, an input unit 132, and an output unit 133, which are coupled together via a bus 128.
The computer 134 reads a program 122 stored in the storage unit 121, such as a hard disk drive, into the memory 130 and executes the program with the control unit 129. The computer 134 is provided with the input unit 132, such as a keyboard and a mouse, and the output unit 133, such as a display, commonly included in a computer apparatus. The input unit may also take in external data: it may read data from a storage medium or may directly read data transmitted via a network. Similarly, the output unit is not limited to the display and may be anything capable of outputting the processed image data; it may write to the storage medium or may provide the data as its output.
The computer 134 has the communication unit 131 transmitting data to and receiving data from other devices and is coupled to the camera 111, the L-lighting 102, and the R-lighting 103 to perform data communications therewith.
As shown in Table 1, the shooting database 135 stores the storage position of shot image data, the storage position of a shooting conditions data file, and the like.
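Although Table 1 itself is not reproduced here, the rows referenced throughout this description (ID=1 through ID=4) can be sketched as follows; the dictionary layout and the file names are illustrative assumptions only.

```python
# Hypothetical in-memory stand-in for the shooting database 135 of Table 1.
# Row IDs follow the text: ID=1 shooting conditions data file, ID=2 L-image,
# ID=3 R-image, ID=4 completed image. All paths are placeholders.
shooting_database = {
    1: {"kind": "shooting conditions data", "path": "conditions.json"},
    2: {"kind": "L-image data",             "path": "L_image.tif"},
    3: {"kind": "R-image data",             "path": "R_image.tif"},
    4: {"kind": "completed image data",     "path": "completed.tif"},
}
```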
The positions in the diagram of a subject 101 that is an object to be shot, such as a painting, the L-lighting 102, the R-lighting 103, and the camera 111 represent their positional relationship viewed from above at a shooting location. The L-lighting 102, the R-lighting 103, and the camera 111 are each coupled to the computer 134 so that they can perform data communications with the computer 134 by means of the communication units 106 and the communication unit 114. Various networks such as a LAN or WAN, or transmission lines such as USB, are available as communication means.
Functions of the program 122 of the computer 134 will then be described. A shooting conditions generating unit 123 has a function to store, as shooting conditions data into the storage unit 121, the distance between the camera 111 and the subject 101, the shooting range of the camera 111, the position of the L-lighting, and the position of the R-lighting.
A shooting control unit 124 has a function to first store into the storage unit 121 L-image data, that is, camera shooting data with the L-lighting 102 turned on and the R-lighting 103 turned off, and to then store into the storage unit 121 R-image data, that is, camera shooting data with the R-lighting 103 turned on and the L-lighting 102 turned off.
A blend pattern generating unit 125 has a function to create a blend pattern depending on the contents of the shooting conditions data, and to store the blend pattern into the storage unit 121.
An image blending unit 126 combines the L-image data and the R-image data depending on the contents of the blend pattern, to generate completed image data for the storage into the storage unit 121.
A general control unit 127 operates the shooting conditions generating unit 123 in response to a user command, then operates the shooting control unit 124, then the blend pattern generating unit 125, and lastly the image blending unit 126. Although in this embodiment the blend patterns are automatically generated from the shooting conditions data, this is not limitative; the user may directly determine and input the blend patterns, or predetermined blend patterns may be stored in advance.
An execution procedure of a shooting method according to this embodiment will be described with reference to the drawings. Various performances corresponding to an image shooting method set forth hereinbelow are implemented by the program 122 that is read into the memory 130 of the computer 134 for execution. This program 122 is composed of codes for performing various actions which will hereinafter be described.
Performance of the shooting conditions generating unit 123 will first be described.
The shooting conditions generating unit 123 first accepts from the input unit a subject size P representative of a horizontal width of the subject, a focal length f of a lens mounted on the camera 111, and a width s of an image pickup device 401 within the camera 111.
A shooting range W is then found using Equation 1, and a shooting distance G, representing the length of the line joining the disposition position of the camera 111 and the center of the subject 101, is found using Equation 2.
W=1.2·P (Eq.1)
G=W·f/s+f (Eq.2)
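As a minimal sketch of Equations 1 and 2 (the function and variable names are illustrative; the margin coefficient 1.2 is exposed as a parameter in view of the following paragraph):

```python
def shooting_geometry(P, f, s, margin=1.2):
    """Return (W, G) from subject width P, focal length f, and image
    pickup device width s; all lengths must share one unit (e.g., mm)."""
    W = margin * P      # Eq. 1: shooting range with a 20% margin
    G = W * f / s + f   # Eq. 2: shooting distance from the lens formula
    return W, G

# Example: a 1000 mm wide painting, 50 mm lens, 36 mm wide pickup device.
W, G = shooting_geometry(P=1000, f=50, s=36)   # W = 1200 mm, G ≈ 1716.7 mm
```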
The shooting distance G is then notified via the output unit 133 to the user. The user sets the camera 111 at a camera position 404 where the line joining the center of the subject 101 and the disposition position of the camera 111 is orthogonal to the subject 101 and where the length of the line joining the camera 111 and the center of the subject 101 is equal to the shooting distance G. Although in this embodiment the shooting range W is set using Equation 1, other equations may be used as long as W≧P is achieved.
The user may be able to change the coefficient 1.2 by which P is multiplied. A larger coefficient yields a wider shooting range W, which is advantageous in that the allowable error in the camera position increases, but disadvantageous in that the resolution falls because the subject size P occupies a smaller proportion of the shooting range W.
The user then points the shooting direction of the camera 111 toward the center of the subject 101, places the L-lighting 102 on the left side of a line joining the camera 111 and the center of the subject 101, and places the R-lighting 103 on the right side of the line joining the camera 111 and the center of the subject 101.
The shooting conditions generating unit 123 then accepts from the user, via the input unit, an L-lighting position Lx and an L-lighting position Ly that indicate the position of the L-lighting 102, and an R-lighting position Rx and an R-lighting position Ry that indicate the position of the R-lighting 103. These values represent the lengths shown in the arrangement diagram.
Shooting conditions data is stored as a file in the storage unit 121, the shooting conditions data including the subject size P, the shooting distance G, the L-lighting position Lx, the L-lighting position Ly, the R-lighting position Rx, the R-lighting position Ry, and the shooting range W. The storage position of the shooting conditions data file is recorded in a row ID=1 of the shooting database 135.
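The shooting conditions data file might look as follows; JSON is an assumed serialization (the text specifies only that the data is stored as a file), and the values continue the example above.

```python
import json

# Sample shooting conditions; Lx/Ly/Rx/Ry are illustrative lighting offsets.
conditions = {"P": 1000, "G": 1716.7, "Lx": 400, "Ly": 200,
              "Rx": 400, "Ry": 200, "W": 1200}
with open("conditions.json", "w") as fp:
    json.dump(conditions, fp, indent=2)
# The storage position ("conditions.json") would then be recorded in the
# row ID=1 of the shooting database 135.
```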
Although this embodiment shows a method in which the focal length f and the width s of the image pickup device are input by the user, they may instead be stored in advance in the storage unit 121 of the computer 134 as values specific to the camera and read out for use. Alternatively, values written to a predetermined portion of an L-image data file or an R-image data file may be used.
Although this embodiment shows a method in which the shooting distance G is found by calculation, another method may be employed in which the user adjusts the position of the camera while looking through a viewfinder fitted to the camera 111 until the subject falls within the shooting range, after which the user inputs the shooting distance G and the shooting range W at that time. In this case, the user may acquire the shooting range by reading a value of a scale, etc., from an image shot with the scale placed at the same position as the subject 101, or may calculate the shooting range from the focal length and the width of the image pickup device using Equation 2.
Although Equation 2 is obtained from a general lens formula, the calculation is not limited to this equation; others are available, for example an equation based on a pinhole camera model or an equation created from actual measurements of the shooting distance and the shooting range.
Performance of the shooting control unit 124 will then be described in sequential order.
The shooting control unit 124 first transmits a turn-on command to the L-lighting 102 and transmits a turn-off command to the R-lighting 103. These commands allow only the left-hand lighting to turn on. The shooting control unit 124 then transmits a shooting command to the camera 111. The camera 111 writes image data shot in response to the shooting command into the temporary memory of the shooting unit 112.
The shooting control unit 124 transmits a transfer command to the camera 111. In response to the transfer command, the camera 111 converts the contents of the temporary memory into a common image file format for the transfer to the computer 134. The shooting control unit 124 stores shot image data transferred from the camera 111 as L-image data into the storage unit 121 and records the storage position of the L-image data in a row ID=2 of the shooting database 135.
The shooting control unit 124 transmits a turn-on command to the R-lighting 103 and transmits a turn-off command to the L-lighting 102. These commands allow only the right-hand lighting to turn on. The shooting control unit 124 then transmits a shooting command to the camera 111. In response to the shooting command, the camera 111 writes shot image data into the temporary memory of the shooting unit.
The shooting control unit 124 transmits a transfer command to the camera 111. In response to the transfer command, the camera 111 converts the contents of the temporary memory into a common image file format for the transfer to the computer 134. The shooting control unit 124 stores shot image data transferred from the camera 111 as R-image data into the storage unit 121 and records the storage position of the R-image data in a row ID=3 of the shooting database 135.
In this manner, the actions of the shooting control unit 124 allow the storage into the storage unit 121 of the L-image data that is an image obtained when the left-hand lighting turns on and of the R-image data that is an image obtained when the right-hand lighting turns on.
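The command sequence of the shooting control unit 124 can be summarized as below. The device objects and their methods (turn_on, turn_off, shoot, transfer) are hypothetical stand-ins for the turn-on, turn-off, shooting, and transfer commands exchanged over the communication units; the text does not specify a concrete API.

```python
def capture_pair(l_lighting, r_lighting, camera, storage):
    """Capture the L-image (left lighting only), then the R-image
    (right lighting only), storing them under database row IDs 2 and 3."""
    # L-image: only the left-hand lighting is on.
    l_lighting.turn_on()
    r_lighting.turn_off()
    camera.shoot()                  # written to the camera's temporary memory
    storage[2] = camera.transfer()  # row ID=2: L-image data
    # R-image: only the right-hand lighting is on.
    r_lighting.turn_on()
    l_lighting.turn_off()
    camera.shoot()
    storage[3] = camera.transfer()  # row ID=3: R-image data
```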
Performance of the blend pattern generating unit 125 will then be described.
A horizontal pixel count H of image data is first obtained. Due to the use of a common image file format such as JPEG or TIFF in this embodiment, H is obtained by reading a numerical value indicative of the horizontal width written to the file at a predetermined position.
Coordinate values b and c are then obtained using the following method. The coordinate values b and c represent the points where the angle of incidence of light from each lighting equals the angle of reflection toward the shooting device.
The abscissa value x is a numerical value from 0 to H−1, and the ordinate value y is a numerical value from 0 to V−1, where V is the vertical pixel count. As with H, the value of V is obtained by reading a numerical value indicative of the vertical width written to the file at a predetermined position. Although in a common color image a plurality of values such as R, G, and B are present for a single coordinate, only one representative channel will be described in this embodiment for simplicity.
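A minimal way to obtain H and V from a JPEG or TIFF file, here using the Pillow library as one concrete means of reading the width and height recorded in the file header:

```python
from PIL import Image

def image_size(path):
    """Return (H, V): horizontal and vertical pixel counts of an image file."""
    with Image.open(path) as img:   # JPEG, TIFF, etc.
        H, V = img.size             # Pillow reports (width, height)
    return H, V
```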
A blend pattern 501 is reserved in the memory 130 as an area for storing a blend pattern. The blend pattern 501 consists of H numerical values, each an 8-bit unsigned integer associated with an abscissa value x of the shot image data. An L-lighting position 402 represents the position at which the L-lighting 102 is placed, and an R-lighting position 403 represents the position at which the R-lighting 103 is placed. A center 405 represents the center position of the subject 101.
A method of generating the blend pattern 501 will be described. A position is first found at which light issued from the L-lighting 102 undergoes regular reflection. The regular reflection occurs at the position on the subject surface where the angle of incidence from the light source coincides with the angle of reflection toward the camera 111. Accordingly, a distance Ls between the regular reflection position of the L-lighting 102 and the center is expressed by the following equations.
Ls=G·tan(θ) (where 0≦θ≦90) (Eq.3)
θ=tan⁻¹(Lx/(G+Ly)) (Eq.4)
The x-coordinate value b of the blend pattern corresponding to this regular reflection position can be obtained by associating the shooting range W with the horizontal pixel count H from the following equation.
b=(W/2−Ls)·(H/W) (where the first decimal place is rounded up) (Eq.5)
In the same manner, a distance Rs between the regular reflection position of the R-lighting 103 and the center can be obtained from the following equations.
Rs=G·tan(β) (Eq.6)
β=tan⁻¹(Rx/(G+Ry)) (Eq.7)
An x-coordinate value c of the blend pattern corresponding to the regular reflection position of the R-lighting 103 can be obtained by associating the shooting range W with the horizontal pixel count H from the following equation.
c=(W/2+Rs)·(H/W) (where the first decimal place is rounded down) (Eq.8)
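Equations 3 through 8 can be sketched as follows; the function name is illustrative, and the sample values continue the earlier example.

```python
import math

def reflection_coordinates(G, W, H, Lx, Ly, Rx, Ry):
    """Return (b, c): x-coordinates of the regular-reflection positions of
    the L-lighting and the R-lighting in the shot image (Eqs. 3 to 8)."""
    theta = math.atan(Lx / (G + Ly))       # Eq. 4
    Ls = G * math.tan(theta)               # Eq. 3 (equals G*Lx/(G+Ly))
    b = math.ceil((W / 2 - Ls) * H / W)    # Eq. 5, first decimal rounded up
    beta = math.atan(Rx / (G + Ry))        # Eq. 7
    Rs = G * math.tan(beta)                # Eq. 6
    c = math.floor((W / 2 + Rs) * H / W)   # Eq. 8, first decimal rounded down
    return b, c

# With the sample conditions above and a 6000-pixel-wide image:
b, c = reflection_coordinates(G=1716.7, W=1200, H=6000,
                              Lx=400, Ly=200, Rx=400, Ry=200)  # b=1209, c=4791
```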
The blend pattern generating unit 125 generates a blend pattern by use of the x-coordinate values b and c as follows.
For 0≦x≦b, all the values are set to 0. For b<x<c, the value increases gradually from 0 to 255, so the value z given by the following equation is set; 0 is set if z<0, and 255 is set if z>255. For c≦x≦H−1, all the values are set to 255.
z=255·(x−b)/(c−b) (Eq.9)
By setting the values in this manner, the blend pattern takes the values shown in the corresponding graph.
When generating the blend pattern, the value in b<x<c may be set by any method that increases as x varies from b to c; the method is not limited to the one shown in this embodiment. For example, the following equation may be employed. This equation is not limitative either; any equation is available as long as it makes the proportion of one image over the other vary gradually.
z=255·(1−cos(π(x−b)/(c−b)))/2 (Eq.10)
Use of Equation 10 brings about an effect that the blend boundary (near b and c) has a smoother variation, as compared with the use of Equation 9.
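A sketch of the blend pattern generation, covering both the linear ramp of Equation 9 and the raised-cosine variant of Equation 10 (NumPy is assumed; the names are illustrative):

```python
import numpy as np

def blend_pattern(H, b, c, profile="linear"):
    """Build the H-entry blend pattern: 0 for x <= b, 255 for x >= c, and a
    transition in between per Eq. 9 ("linear") or Eq. 10 ("cosine")."""
    x = np.arange(H, dtype=np.float64)
    if profile == "linear":
        z = 255.0 * (x - b) / (c - b)                                # Eq. 9
    else:
        z = 255.0 * (1.0 - np.cos(np.pi * (x - b) / (c - b))) / 2.0  # Eq. 10
    z = np.clip(z, 0.0, 255.0)   # 0 is set if z < 0, 255 if z > 255
    z[x <= b] = 0.0              # flat left segment
    z[x >= c] = 255.0            # flat right segment
    return z.round().astype(np.uint8)   # 8-bit values, as in the text
```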
The generated blend pattern is stored in the storage unit. Performance of the image blending unit 126 will then be described.
In the memory 130 are reserved an L-image data area 1901 storing L-image data 901, an R-image data area 1902 storing R-image data 911, and a completed image data area 1904 storing completed image data 903.
The image blending unit refers to the storage position of the L-image data 901 written in the row ID=2 of the shooting database, reads out the L-image data 901, and writes it to the L-image data area 1901 of the memory 130; it likewise refers to the storage position of the R-image data 911 written in the row ID=3 of the shooting database, reads out the R-image data 911, and writes it to the R-image data area 1902 of the memory 130.
For all of the pixels of the L-image data area 1901 and the R-image data area 1902, the image blending unit 126 performs an operation expressed by the following equation to generate completed data.
Q(x,y)=(P(x)·L(x,y)+(255−P(x))·R(x,y))/255 (Eq.11)
where x is a horizontal coordinate value of image data, y is a vertical coordinate value of the image data, L(x,y) is a value of the L-image data area 1901 at the coordinate values x and y, R(x,y) is a value of the R-image data area 1902 at the coordinate values x and y, P(x) is a value of the blend pattern at a horizontal coordinate value x, and Q(x,y) is a value of the completed image data area 1904 at the coordinate values x and y. As used herein, the value of an image data area is assumed to be a value such as a luminance value representing the luminance or a lightness value representing the lightness. Although Equation 11 uses the value 255 on the assumption that each pixel has 8 bits, as already described for Equation 9, this is not limitative; the value 1 may be used if the gradation scale is not taken into consideration, in which case Equation 11 becomes Q(x,y)=(P(x)·L(x,y)+(1−P(x))·R(x,y)).
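Equation 11 can be applied to all pixels at once; the following sketch assumes single-channel 8-bit images held as NumPy arrays (apply it per channel for RGB).

```python
import numpy as np

def blend_images(L, R, pattern):
    """Eq. 11: Q(x,y) = (P(x)*L(x,y) + (255 - P(x))*R(x,y)) / 255.

    L, R: uint8 arrays of shape (V, H); pattern: uint8 array of shape (H,),
    broadcast across rows so each column x is weighted by P(x).
    """
    P = pattern.astype(np.float64)
    Q = (P * L + (255.0 - P) * R) / 255.0
    return Q.round().astype(np.uint8)

# completed = blend_images(L_image, R_image, blend_pattern(H, b, c))
```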
The contents of processing will be specifically described referring to Table 2. First, for the coordinate at the left bottom corner, Q(0,0) is obtained from the following equation.
Q(0,0)=(P(0)·L(0,0)+(255−P(0))·R(0,0))/255 (Eq.21)
Then, for the next coordinate on the right side, Q(1,0) is obtained from the following equation.
Q(1,0)=(P(1)·L(1,0)+(255−P(1))·R(1,0))/255 (Eq.22)
Similar processing is repeated while shifting the coordinate rightward. When the rightmost coordinate is reached, Q(0,1) is obtained for the leftmost coordinate one row above, from the following equation.
Q(0,1)=(P(0)·L(0,1)+(255−P(0))·R(0,1))/255 (Eq.23)
Similar processing is repeated to find Q(x,y) for all the coordinate values, whereby the completed image data 903 is generated in the completed image data area 1904.
The thus generated completed image data 903 is stored in the storage unit 121 and the storage position of the completed image data is recorded in a row ID=4 of the shooting database 135.
Performance of the general control unit 127 will then be described.
The shooting conditions generating unit 123 is first operated in response to a shooting start trigger from the user received by the input unit. The shooting control unit 124 is operated after the completion of the actions of the shooting conditions generating unit 123.
The blend pattern generating unit 125 is operated after the completion of the actions of the shooting control unit 124. The image blending unit is activated after the completion of the actions of the blend pattern generating unit 125.
After the termination of the operations of the image blending unit, a shooting completion message is displayed for the user via the output unit 133, to inform the user of the completion to bring the actions of the general control unit to an end.
Description will now be made of the regular reflection suppressing effect achieved by the use of this shooting system, based on the shot image data obtained under the disposition conditions described above.
An image of an area corresponding to the shooting range W is shot as the L-image data 901, in which, from the positional relationship described above, a reflection of the L-lighting appears in the vicinity of the coordinate value b.
Similarly, an image of an area corresponding to the shooting range W is shot as the R-image data 911, in which a reflection of the R-lighting appears in the vicinity of the coordinate value c.
In this state, the general control unit 127 operates the image blending unit to generate the completed image data 903. In the actual shooting, the light source has a certain width due to the presence of a diffusion plate, etc., and the subject also has a minute unevenness. Accordingly, the lighting reflection area appears on the screen with a certain width around each of the coordinate values b and c.
When the x-coordinate is from 0 to b, the blend pattern is 0, so the completed image data 903 is equal to the R-image data 911. When the x-coordinate is from b to c, the blend pattern gradually varies up to 255, so the image gradually changes from the R-image data 911 to the L-image data 901. When the x-coordinate is from c to H−1, the blend pattern is 255, so the image is equal to the L-image data 901.
Therefore, since the value of the blend pattern is near 0 in the vicinity of the coordinate value b, the R-image data 911, which has no reflection there, is combined in a larger proportion than the L-image data 901; since the value of the blend pattern is near 255 in the vicinity of the coordinate value c, the L-image data 901, which has no reflection there, is combined in a larger proportion than the R-image data 911.
By virtue of the blend pattern and the actions of the image blending unit in this manner, image data without reflection is predominantly blended around the areas where reflection occurs, thus advantageously enabling the reflections to be suppressed.
According to this embodiment as set forth hereinabove, a shot image in which the lighting reflections are easily suppressed can be obtained without any need to adjust the position and direction of the lighting, even when the subject is glossy.
In this embodiment, an example of a shooting system is described that can not only obtain shot images in which reflections caused by regular reflection are suppressed, but also generate images that render the beauty of the subject more faithfully.
In this second embodiment, the blend pattern generating unit 125 generates three different blend patterns depending on the contents of shooting conditions data and stores them in the storage unit 121.
The image blending unit 126 uses the three different blend patterns to combine the L-image data 901 and the R-image data 911, to consequently generate three different completed image data for the storage in the storage unit 121.
An execution procedure of a shooting method according to this embodiment will be described with reference to the drawings.
First, similar to the case of the first embodiment, the blend pattern generating unit 125 obtains a horizontal pixel count H of image data. The blend pattern generating unit 125 then finds a coordinate value b corresponding to a position where the L-lighting undergoes regular reflection and a coordinate value c corresponding to a position where the R-lighting undergoes regular reflection.
The blend pattern generating unit 125 generates a blend A-pattern and stores it in the storage unit. As used herein, the blend A-pattern is the blend pattern of the first embodiment and will not be described again in detail.
The blend pattern generating unit then generates a blend B-pattern using the x-coordinate values b and c as follows. With j=b+(c−b)/4 and k=c−(c−b)/4, all the values are set to 0 for 0≦x≦j. For j<x<k, the value increases gradually from 0 to 255, so the value z given by Equation 12 is set; 0 is set if z<0, and 255 is set if z>255. All the values are set to 255 for k≦x≦H−1.
z=255·(x−j)/(k−j) (Eq.12)
By setting the values in this manner, the blend B-pattern takes the values shown in the corresponding graphs. As with the blend A-pattern, the following equation may be employed instead of Equation 12.
z=255·(1−cos(π(x−j)/(k−j)))/2 (Eq.13)
Use of Equation 13 leads to an effect that the changes in the blend boundaries (near j and k) become smoother than the case of using Equation 12.
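With the generator sketched in the first embodiment, the blend B-pattern differs only in its transition interval; b, c, and H are assumed to be as computed there.

```python
# Inner quarter points of the b..c interval, per the text.
j = b + (c - b) / 4
k = c - (c - b) / 4
pattern_B = blend_pattern(H, j, k)                    # Eq. 12
pattern_B_smooth = blend_pattern(H, j, k, "cosine")   # Eq. 13
```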
The blend pattern generating unit 125 then stores the generated blend B-pattern in the storage unit.
The blend pattern generating unit 125 then finds an x-coordinate value a of the blend pattern corresponding to the left end of the subject and an x-coordinate value d of the blend pattern corresponding to the right end of the subject, from Equations 14 and 15.
a=(W−P)/2·(H/W) (where the first decimal place is rounded up) (Eq.14)
d=(W+P)/2·(H/W) (where the first decimal place is rounded down) (Eq.15)
The blend pattern generating unit generates a blend C-pattern as follows by making use of the x-coordinate values a and d. All the values are set to 0 for 0≦x≦a. For a<x<d, the value increases gradually from 0 to 255, so the value z given by Equation 16 is set; 0 is set if z<0, and 255 is set if z>255.
z=255·(x−a)/(d−a) (Eq.16)
All the values are set to 255 for d≦x≦H−1. By setting the values in this manner, the blend C-pattern takes the values shown in the corresponding graphs. As before, the following equation may be employed instead of Equation 16.
z=255·(1−cos(π(x−a)/(d−a)))/2 (Eq.17)
Use of Equation 17 leads to an effect that the changes in the blend boundaries (near a and d) become smoother than the case of using Equation 16.
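Equations 14 through 17 can likewise reuse the earlier generator; W, P, and H are assumed to be as defined above.

```python
import math

a = math.ceil((W - P) / 2 * H / W)    # Eq. 14: left end of the subject
d = math.floor((W + P) / 2 * H / W)   # Eq. 15: right end of the subject
pattern_C = blend_pattern(H, a, d)                    # Eq. 16
pattern_C_smooth = blend_pattern(H, a, d, "cosine")   # Eq. 17
```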
The blend pattern generating unit 125 then stores the generated blend C-pattern in the storage unit. Performance of the image blending unit 126 will then be described.
The image blending unit 126 reserves in the memory 130 the L-image data area 1901 storing the L-image data 901, the R-image data area 1902 storing the R-image data 911, a completed A-image data area 2001 storing completed A-image data 1701, a completed B-image data area 2002 storing completed B-image data 1702, and a completed C-image data area 2003 storing completed C-image data 1703.
The image blending unit refers to the storage position of the L-image data 901 written in the row ID=2 of the shooting database, reads out the L-image data 901, and writes it to the L-image data area 1901 of the memory 130; it likewise refers to the storage position of the R-image data 911 written in the row ID=3 of the shooting database, reads out the R-image data 911, and writes it to the R-image data area 1902 of the memory 130.
For all of the pixels of the L-image data area 1901 and the R-image data area 1902, the image blending unit 126 performs the operations expressed by Equations 18, 19, and 20 to generate completed A-image data, completed B-image data, and completed C-image data.
Qa(x,y)=(Pa(x)·L(x,y)+(255−Pa(x))·R(x,y))/255 (Eq.18)
Qb(x,y)=(Pb(x)·L(x,y)+(255−Pb(x))·R(x,y))/255 (Eq.19)
Qc(x,y)=(Pc(x)·L(x,y)+(255−Pc(x))·R(x,y))/255 (Eq.20)
where x is a horizontal coordinate value of image data, y is a vertical coordinate value of the image data, L(x,y) is a value of the L-image data area 1901 at the coordinate values x and y, R(x,y) is a value of the R-image data area 1902 at the coordinate values x and y, Pa(x) is a value of the blend A-pattern at the horizontal coordinate value x, Pb(x) is a value of the blend B-pattern at the horizontal coordinate value x, Pc(x) is a value of the blend C-pattern at the horizontal coordinate value x, Qa(x,y) is a value of the completed A-image data area 2001 at the coordinate values x and y, Qb(x,y) is a value of the completed B-image data area 2002 at the coordinate values x and y, and Qc(x,y) is a value of the completed C-image data area 2003 at the coordinate values x and y.
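Equations 18 through 20 simply apply the blending sketch of the first embodiment once per pattern; the image and pattern names below are assumed from the earlier sketches.

```python
pattern_A = blend_pattern(H, b, c)   # the blend A-pattern of the first embodiment
completed_A = blend_images(L_image, R_image, pattern_A)  # Eq. 18
completed_B = blend_images(L_image, R_image, pattern_B)  # Eq. 19
completed_C = blend_images(L_image, R_image, pattern_C)  # Eq. 20
```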
The contents of processing will be specifically described referring to Tables 2 and 3. First, for the coordinate at the left bottom corner, Qa(0,0), Qb(0,0), and Qc(0,0) are obtained from the following equations.
Qa(0,0)=(Pa(0)·L(0,0)+(255−Pa(0))·R(0,0))/255 (Eq.24)
Qb(0,0)=(Pb(0)·L(0,0)+(255−Pb(0))·R(0,0))/255 (Eq.25)
Qc(0,0)=(Pc(0)·L(0,0)+(255−Pc(0))·R(0,0))/255 (Eq.26)
Then, for the next coordinate on the right side, Qa(1,0), Qb(1,0), and Qc(1,0) are obtained from the following equations.
Qa(1,0)=(Pa(1)·L(1,0)+(255−Pa(1))·R(1,0))/255 (Eq.27)
Qb(1,0)=(Pb(1)·L(1,0)+(255−Pb(1))·R(1,0))/255 (Eq.28)
Qc(1,0)=(Pc(1)·L(1,0)+(255−Pc(1))·R(1,0))/255 (Eq.29)
Similar processing is repeated while shifting the coordinate rightward. When the rightmost coordinate is reached, Qa(0,1), Qb(0,1), and Qc(0,1) are obtained for the leftmost coordinate one row above, from the following equations.
Qa(0,1)=(Pa(0)·L(0,1)+(255−Pa(0))·R(0,1))/255 (Eq.30)
Qb(0,1)=(Pb(0)·L(0,1)+(255−Pb(0))·R(0,1))/255 (Eq.31)
Qc(0,1)=(Pc(0)·L(0,1)+(255−Pc(0))·R(0,1))/255 (Eq.32)
Similar processing is repeated to find Qa(x,y), Qb(x,y), and Qc(x,y) for all the coordinate values, whereby the completed A-image data 1701, the completed B-image data 1702, and the completed C-image data 1703 are generated in the completed A-image data area 2001, the completed B-image data area 2002, and the completed C-image data area 2003, respectively.
The actions of the image blending unit 126 thus generate three different completed image data by combining the L-image data 901 and the R-image data 911 pixel by pixel, weighted by the three different blend patterns.
Performance of an image selecting unit will then be described. The image selecting unit is stored in the storage unit 121 as one element of the program 122 implementing the functions of the computer 134.
The image selecting unit reads out the completed A-image data 1701, the completed B-image data 1702, and the completed C-image data 1703 from the completed A-image data area 2001, the completed B-image data area 2002, and the completed C-image data area 2003, respectively, and displays the respective images at predetermined positions on a selection screen 1501.
The image selecting unit displays on the selection screen a selection request message 1505 to select one from among three images and accepts a select code from a selection input area 1506 via the input unit.
If the select code is “A”, the image selecting unit stores the completed A-image data 1701 in the completed A-image data area 2001 into the storage unit 121 and records the completed A-image data storage position in the row ID=4 of the shooting database 135.
If the select code is “B”, the image selecting unit stores the completed B-image data 1702 in the completed B-image data area 2002 into the storage unit 121 and records the completed B-image data storage position in the row ID=4 of the shooting database 135.
If the select code is “C”, the image selecting unit stores the completed C-image data 1703 in the completed C-image data area 2003 into the storage unit 121 and records the completed C-image data storage position in the row ID=4 of the shooting database 135.
If the select code is other than "A", "B", or "C", the image selecting unit ignores the input and accepts a select code again.
As a result of the above actions, a completed image selected by the user is stored in the storage unit 121.
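The selection logic can be sketched as follows, with a console prompt standing in for the selection screen 1501 and the selection input area 1506; the function name is illustrative.

```python
def select_completed_image(images):
    """Prompt for a select code until one of "A", "B", "C" is entered and
    return the chosen completed image; other input is ignored, as in the text."""
    while True:
        code = input("Select image (A/B/C): ").strip().upper()
        if code in images:          # e.g., {"A": completed_A, "B": ..., "C": ...}
            return images[code]
        # any other select code is ignored and the prompt is repeated
```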
Performance of the general control unit 127 will then be described.
The shooting conditions generating unit 123 is first operated in response to a shooting start trigger from the user received by the input unit 132.
The shooting control unit 124 is operated after the completion of the actions of the shooting conditions generating unit 123.
The blend pattern generating unit 125 is operated after the completion of the actions of the shooting control unit 124.
The image blending unit 126 is activated after the completion of the actions of the blend pattern generating unit 125.
The image selecting unit is activated after the completion of the actions of the image blending unit 126.
After the completion of the actions of the image selecting unit, a shooting completion message is displayed for the user via the output unit 133, to inform the user of the completion to bring the actions of the general control unit 127 to an end.
Effects achieved by the use of the shooting system of this embodiment will be described, based on shot image data obtained by shooting under the disposition conditions described in the first embodiment.
In this state, the general control unit operates the image blending unit to generate the completed A-image data 1701, the completed B-image data 1702, and the completed C-image data 1703.
In the actual shooting, the light source has a certain width due to the presence of the diffusion plate, etc., and the subject also has a minute unevenness. Accordingly, the lighting reflection area appears on the screen with a certain width around each of the coordinate values b and c.
When the x-coordinate is from 0 to b, the blend pattern of the completed A-image data 1701 is 0, so the image is equal to the R-image data 911. When the x-coordinate is from b to c, the blend pattern gradually varies up to 255, so the image gradually changes from the R-image data 911 to the L-image data 901. When the x-coordinate is from c to H−1, the blend pattern is 255, so the image is equal to the L-image data 901.
When the x-coordinate is from 0 to j, the blend pattern of the completed B-image data 1702 is 0, so the image is equal to the R-image data 911. When the x-coordinate is from j to k, the blend pattern gradually varies up to 255, so the image gradually changes from the R-image data 911 to the L-image data 901. When the x-coordinate is from k to H−1, the blend pattern is 255, so the image is equal to the L-image data 901.
When the x-coordinate is from 0 to a, the blend pattern of the completed C-image data 1703 is 0, so the image is equal to the R-image data 911. When the x-coordinate is from a to d, the blend pattern gradually varies up to 255, so the image gradually changes from the R-image data 911 to the L-image data 901. When the x-coordinate is from d to H−1, the blend pattern is 255, so the image is equal to the L-image data 901.
In the vicinity of the coordinate value b, the L-image data 901 has a reflection but the R-image data 911 has no reflection. Conversely, in the vicinity of the coordinate value c, the R-image data 911 has a reflection but the L-image data 901 has no reflection. Therefore, the reflection near the coordinate value b is suppressed to a greater extent as the blend pattern near the coordinate value b approaches 0, whereas the reflection near the coordinate value c is suppressed to a greater extent as the blend pattern near the coordinate value c approaches 255.
Since the three blend patterns vary over different intervals, the degree to which the reflections near the coordinate values b and c are suppressed, and hence how the gloss of the subject appears, differs among the completed A-image data 1701, the completed B-image data 1702, and the completed C-image data 1703.
The completed A-image data 1701, the completed B-image data 1702, and the completed C-image data 1703 are displayed in a column by the action of the image selecting unit. In the case of a painting using a glossy material such as gold, the appearance of the portions where the lighting is reflected is important for expressing the texture and beauty of the subject. From such a viewpoint, the user selects the image that is most suitable as a shot image. Since the image selecting unit records the storage position of the image selected by the user in the row ID=4 of the shooting database 135, the user can obtain the most suitable shot image by acquiring the image at the storage position stored in that row.
According to this embodiment as set forth hereinabove, not only can a shot image be obtained in which the lighting reflections are easily suppressed without any need to adjust the position and direction of the lighting even when the subject is glossy, but an image having a higher regular reflection suppressing effect than in the first embodiment can also be generated, as can an image rendering the beauty of the subject more faithfully.
Although three different blend patterns are provided in this embodiment, a similar shooting system may be configured using two, or four or more, different blend patterns.
101: subject, 102: L-lighting, 103: R-lighting, 104: light source, 105: light source control unit, 106: communication unit, 111: camera, 112: shooting unit, 113: shooting control unit, 114: communication unit, 121: storage unit, 122: program, 123: shooting conditions generating unit, 124: shooting control unit, 125: blend pattern generating unit, 126: image blending unit, 127: general control unit, 128: bus, 129: control unit, 130: memory, 131: communication unit, 132: input unit, 133: output unit, 401: image pickup device, 402: L-lighting position, 403: R-lighting position, 404: camera position, 405: center of subject, 501: blend pattern, 502: shot image data, 901: L-image data, 903: completed image data, 911: R-image data, 1501: selection screen, 1505: selection request message, 1506: selection input area, 1701: completed A-image data, 1702: completed B-image data, 1703: completed C-image data, 1901: L-image data area, 1902: R-image data area, 1904: completed image data area, 2001: completed A-image data area, 2002: completed B-image data area, 2003: completed C-image data area