This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2022-143496 filed Sep. 9, 2022.
The present disclosure relates to an information processing apparatus, an information processing method, and a non-transitory computer readable medium.
There is a difference in illumination between an environment in which a designer etc. prepares an image and an environment in which printed matter obtained by printing the prepared image is observed. This difference may make the printed matter look different in color or gloss from what the designer etc. intended. Similar mismatches also arise with industrially manufactured products.
A related technique is disclosed in Japanese Unexamined Patent Application Publication No. 2021-149679.
When an omnidirectional image (hereinafter referred to as an “environment light map”) that includes illumination information at an observation location has been prepared in advance, it is possible to simulate how an article would look in terms of color or gloss at the observation location.
However, preparing an environment light map takes considerable effort and time. It is therefore not practical to prepare an environment light map for every desired observation location.
Aspects of non-limiting embodiments of the present disclosure relate to easily reproducing how an article would look at an observation location for which no environment light map is available, without requiring an environment light map for the observation location to be prepared in advance.
Aspects of certain non-limiting embodiments of the present disclosure overcome the above disadvantages and/or other disadvantages not described above. However, aspects of the non-limiting embodiments are not required to overcome the disadvantages described above, and aspects of the non-limiting embodiments of the present disclosure may not overcome any of the disadvantages described above.
According to an aspect of the present disclosure, there is provided an information processing apparatus including a processor configured to: acquire a feature amount related to brightness distribution from a first image captured at an observation location; select an environment light map that is similar to the feature amount, from among a plurality of environment light maps prepared in advance; and control expression of a second image corresponding to an article observed at the observation location using the selected environment light map.
Exemplary embodiments of the present disclosure will be described in detail based on the following figures, wherein:
Exemplary embodiments of the present disclosure will be described below with reference to the drawings.
<System Configuration>
The print system 1 illustrated in
Each of the client terminal 10, the image forming apparatus 20, and the print server 30 is an example of an information processing apparatus.
The client terminal 10 and the print server 30 are each basically configured as a computer. The image forming apparatus 20 and the print server 30 may be connected to each other through a dedicated line.
The image forming apparatus 20 is a device that forms an image on a recording medium such as paper. A recording material such as a toner or an ink is used to form an image. The colors of the recording material include yellow (Y), magenta (M), cyan (C), and black (K) which are called basic colors, and metallic colors and fluorescent colors which are called special colors.
The client terminal 10 may be a desktop computer, a laptop computer, a tablet computer, a smartphone, or a wearable computer, for example. In the present exemplary embodiment, the client terminal 10 is exclusively used as an input/output device.
The image forming apparatus 20 according to the present exemplary embodiment may be a production printer, a printer for office use, or a printer for home use, for example. The image forming apparatus 20 may be provided with not only a print function but also a scanner function. The print function may use a print method corresponding to electrophotography or a print method corresponding to an inkjet system.
The print server 30 according to the present exemplary embodiment is provided with a function of receiving a print job from the client terminal 10 and outputting the print job to the image forming apparatus 20, and a function of reproducing how an article would look at an observation location.
The phrase “how an article would look” refers to an impression (so-called “texture”) that the color or the gloss of the article would give to people. The color and the gloss are affected by irregularities on a surface, the direction of a normal to the surface and the direction of incident illumination light, the intensity of the illumination light, the color of the illumination light, etc.
The print server 30 according to the present exemplary embodiment receives, from the client terminal 10, an image (hereinafter referred to as an “environment image”) obtained by capturing the observation location and information on an article whose appearance is to be reproduced, and reproduces, through computer technology, how the article would look in a posture specified by a user. Examples of the information on an article include a three-dimensional shape and a fine structure, a pattern, and a color of a surface.
The environment image is uploaded from the client terminal 10 to the print server 30, for example. Alternatively, the print server 30 may download an environment image specified by the client terminal 10 from the Internet etc., or may read the environment image from a data storage.
In
Examples of the environment image according to the present exemplary embodiment include an omnidirectional image, an upper hemisphere image, and a planar image.
The upper hemisphere image refers to the upper half of the omnidirectional image above the equator. The upper hemisphere image need not strictly cover the range from the equator to the zenith; it may be an image obtained by capturing a range from a certain latitude to the zenith.
The planar image refers to a two-dimensional image for a specific angle of view captured by a camera of a smartphone etc.
The observation location is a location at which an article is expected to be observed, and is assumed to be a specific booth at an exhibition site, an exhibition room, a conference room, etc., for example. The booth is a space defined by partitions etc. The observation location is not limited to an indoor environment, and may be an outdoor environment.
Even for the same article, different textures may be observed when the intensity or the color of illumination light is different. Even when the intensity or the color of illumination light is the same, different textures may be observed when the direction of incident illumination light and the direction of a normal to the surface of the article are different.
The network N in
While one client terminal 10, one image forming apparatus 20, and one print server 30 are connected to the network N of the print system 1 illustrated in
<Terminal Configuration>
<Hardware Configuration of Print Server>
The print server 30 illustrated in
The devices are connected to each other through a signal line 36 such as a bus.
The processor 31, the ROM 32, and the RAM 33 function as a so-called computer.
The processor 31 implements various functions through execution of a program. For example, the processor 31 acquires information (hereinafter also referred to as “illumination information”) about illumination from an environment image, and generates an image that reproduces how an article would look at an observation location. In the present exemplary embodiment, generating an image that reproduces how an article would look is referred to as “controlling expression of an image”.
The auxiliary storage device 34 is constituted of a hard disk device or a semiconductor storage, for example. The auxiliary storage device 34 stores a program and various data. The term “program” is used herein as a generic name for an operating system (OS) and application programs. The application programs include a program that simulates the texture of an article.
While the auxiliary storage device 34 is built in the print server 30 in
The communication module 35 is an interface that implements communication with the client terminal 10 (see
<Hardware Configuration of Client Terminal>
The client terminal 10 illustrated in
The processor 11, the ROM 12, and the RAM 13 function as a so-called computer.
The processor 11 implements various functions through execution of a program. For example, the processor 11 executes uploading of an environment image, uploading of information on an article to be observed at an observation location, and display of an image that reproduces how the article would look.
The auxiliary storage device 14 may be a hard disk device or a semiconductor storage, for example. The auxiliary storage device 14 stores not only a program such as an OS but also an environment image, an image of an article to be processed, etc.
The display 15 may be a liquid crystal display or an organic electro-luminescence (EL) display, for example. An image that reproduces how an article would look at an observation location is displayed on the display 15.
The I/O interface 16 is a device that receives an input from the user made using a keyboard or a mouse, for example. Specifically, the I/O interface 16 receives an input such as positioning or movement of a mouse cursor, clicking, etc. The I/O interface 16 is also a device that outputs data to an external terminal. The external terminal includes not only the image forming apparatus 20 etc. connected through the network N but also a terminal connected by way of the Internet.
The communication module 17 is a device that enables communication with the print server 30 etc. connected to the network N. A module that conforms to any communication standard such as Ethernet (registered trademark) or Wi-Fi (registered trademark) may be used as the communication module 17.
<Overview of Texture Reproduction Process>
A texture reproduction process executed by the print server 30 (see
The texture reproduction process according to the present exemplary embodiment is started when information on an article and an environment image are given from the client terminal 10 (see
Through execution of a program, the processor 31 functions as an environment image acquisition section 311, a glossiness effect degree calculation section 312, an environment light map selection section 313, and a glossiness reproduction section 314.
The environment image acquisition section 311 is a functional section that acquires an environment image. The environment image acquisition section 311 acquires an environment image uploaded from the client terminal 10, for example. The environment image acquisition section 311 may acquire an environment image from the auxiliary storage device 34 (see
The glossiness effect degree calculation section 312 is a functional section that calculates a glossiness effect degree from an environment image. The glossiness effect degree is an index that indicates the effect of illumination on glossiness; a larger value indicates that glossiness is perceived more strongly. The glossiness effect degree is an example of illumination information at an observation location.
In the present exemplary embodiment, the glossiness effect degree is calculated from the standard deviation of the pixel brightness within the environment image.
When the standard deviation is small, for example, variations in the brightness within the environment image are small. In this case, illumination light at the observation location illuminates the surface of the article uniformly from various directions, so that little glossiness is felt on the surface of the article. Examples of this type of illumination include illumination through a diffusion plate.
When the standard deviation is medium, variations in the brightness within the environment image are medium. In this case, illumination light at the observation location illuminates the surface of the article as a surface light source, so that a medium glossiness is felt on the surface of the article. Examples of this type of illumination include organic electro-luminescence (EL) illumination.
When the standard deviation is large, variations in the brightness within the environment image are large. In this case, illumination light at the observation location illuminates the surface of the article from a specific direction as a point light source, so that a strong glossiness is felt on the surface of the article. Examples of this type of illumination include light emitting diode (LED) illumination.
The standard deviation is an example of a “feature amount related to brightness distribution”.
The environment light map selection section 313 is a functional section that selects an environment light map with a glossiness effect degree that is similar to that of the environment image, from among environment light maps A, B, C, . . . stored in an environment light map storage section 341.
In the present exemplary embodiment, the environment light map storage section 341 stores one environment light map for each glossiness effect degree. The environment light maps A, B, C, . . . as used herein are an example of a “plurality of environment light maps prepared in advance”.
While the environment light map A with a glossiness effect degree of 1.4, the environment light map B with a glossiness effect degree of 1.0, and the environment light map C with a glossiness effect degree of 0.7 are illustrated in
For example, the values of the glossiness effect degree may be 1.3, 1.2, 1.1, 1.0, 0.9, 0.8, etc. with the interval between such values being 0.1, or the interval may be 0.2 or 0.5. Different intervals may be used in a mixed manner.
The glossiness effect degree may have values more than 1.4 such as 1.5 and 1.6, or may have values less than 0.7 such as 0.6 and 0.5, for example.
In the present exemplary embodiment, omnidirectional images are assumed as the environment light maps, for example. The environment light maps may be upper hemisphere images.
The glossiness reproduction section 314 is a functional section that generates an image (hereinafter referred to as a “glossiness reproduction image”) that reproduces how an article would look in terms of glossiness etc. at an observation location using an environment light map that is close to illumination information at the observation location.
Information on the article whose appearance is to be reproduced is uploaded from the client terminal 10 (see
The glossiness reproduction section 314 according to the present exemplary embodiment generates a glossiness reproduction image using image-based lighting. A glossiness reproduction image that reflects how an article would look when observed at an observation location from various viewing directions is generated through the image-based lighting. The glossiness reproduction image is an example of a “second image corresponding to an article observed at an observation location”.
<Details of Texture Reproduction Process>
Process operation executed by the various functional sections will be described in detail below.
<Environment Image Acquisition Section>
The environment image acquired by the environment image acquisition section 311 may be a single image captured at an observation location. That is, it is not necessary that a plurality of environment images should be provided for each observation location. Since image capture is performed once, a single color temperature, a single exposure condition, etc. are used. That is, any camera may be used to capture an environment image. For example, a camera of a smartphone or a camera capable of capturing an omnidirectional image may be used.
In the present exemplary embodiment, a High Dynamic Range (HDR) format or an OpenEXR format, for example, is assumed as an image format of the environment image. The HDR format and the OpenEXR format are known as file formats with a high dynamic range.
The OpenEXR format supports a higher tone resolution than the HDR format. That is, the OpenEXR format enables finer tone expression than the HDR format.
In the HDR format, RGB values and an exponent are each expressed in 8 bits (i.e. a total of 32 bits) per pixel.
In the OpenEXR format, each of the RGB values is expressed per pixel as a 16-bit half-precision floating-point number consisting of a 1-bit sign, a 5-bit exponent, and a 10-bit mantissa. In other versions, RGB values are each expressed in 32 bits or in 24 bits.
<Glossiness Effect Degree Calculation Section>
First, the glossiness effect degree calculation section 312 calculates a brightness of an environment image using a calculation formula, and calculates a standard deviation of the brightness (step 1). The calculation formula may be 0.299×R+0.587×G+0.114×B, for example.
The brightness is calculated for each pixel, and the standard deviation is calculated for the entire environment image. In the present exemplary embodiment, a calculated value of the standard deviation is rounded off to the second decimal place.
Next, the glossiness effect degree calculation section 312 gives the standard deviation of the brightness to a calculation model 1 to calculate a glossiness effect degree (step 2). The calculation model 1 may be (coefficient 1)×(standard deviation of brightness), for example. The coefficient 1 is a coefficient that is used to calculate a glossiness effect degree using the standard deviation of the brightness. In the present exemplary embodiment, a calculated value of the glossiness effect degree is rounded off to the first decimal place. The glossiness effect degree is an example of an “index that represents a gloss degree”.
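As a hedged illustration, steps 1 and 2 can be sketched in Python. The disclosure does not give the value of coefficient 1, so `COEFFICIENT_1` below is a hypothetical placeholder, and the representation of the environment image as a list of RGB tuples is likewise an assumption.

```python
# Sketch of steps 1 and 2: brightness standard deviation -> glossiness effect degree.
COEFFICIENT_1 = 0.05  # hypothetical placeholder; the disclosure does not state coefficient 1

def brightness(r, g, b):
    # Brightness formula given in the disclosure (ITU-R BT.601 luma weights).
    return 0.299 * r + 0.587 * g + 0.114 * b

def glossiness_effect_degree(pixels):
    """pixels: list of (R, G, B) tuples covering the entire environment image."""
    values = [brightness(r, g, b) for r, g, b in pixels]
    mean = sum(values) / len(values)
    # Step 1: standard deviation over all pixels, rounded off to the second decimal place.
    std = round((sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5, 2)
    # Step 2: calculation model 1 = coefficient 1 x standard deviation,
    # rounded off to the first decimal place.
    return round(COEFFICIENT_1 * std, 1)
```

A uniformly bright image yields a standard deviation of zero and hence a glossiness effect degree of 0.0, consistent with the diffuse-illumination case described above.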
The glossiness effect degree may be calculated using a distortion degree. The distortion degree is an example of a “feature amount related to brightness distribution”.
First, the glossiness effect degree calculation section 312 calculates a brightness of an environment image using a calculation formula, and calculates a distortion degree of the brightness (step 1A). The calculation formula may be 0.299×R+0.587×G+0.114×B, for example.
The brightness is calculated for each pixel, and the distortion degree of the brightness is calculated for the entire environment image. The distortion degree indicates how much the distribution of the brightness calculated for the entire environment image is distorted with respect to a normal distribution. In other words, the distortion degree of the brightness is an index that indicates the asymmetry (skewness) of the distribution.
Next, the glossiness effect degree calculation section 312 gives the distortion degree of the brightness to a calculation model 2 to calculate a glossiness effect degree (step 2A). The calculation model 2 may be (coefficient 2)×(distortion degree of brightness), for example. The coefficient 2 is a coefficient that is used to calculate a glossiness effect degree using the distortion degree of the brightness. Also in this case, a calculated value of the glossiness effect degree is rounded off to the first decimal place.
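Steps 1A and 2A can be sketched similarly. The distortion degree is implemented here as the sample skewness of the brightness distribution, which matches the description of distortion with respect to a normal distribution; `COEFFICIENT_2` is again a hypothetical placeholder.

```python
# Sketch of steps 1A and 2A: brightness distortion degree (skewness) -> glossiness effect degree.
COEFFICIENT_2 = 0.8  # hypothetical placeholder; the disclosure does not state coefficient 2

def brightness(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

def distortion_degree(values):
    # Skewness of the brightness distribution: zero for a symmetric distribution
    # (such as a normal distribution), nonzero when the distribution is lopsided.
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return sum(((v - mean) / std) ** 3 for v in values) / n

def glossiness_effect_degree(pixels):
    values = [brightness(r, g, b) for r, g, b in pixels]
    # Step 2A: calculation model 2 = coefficient 2 x distortion degree,
    # rounded off to the first decimal place.
    return round(COEFFICIENT_2 * distortion_degree(values), 1)
```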
<Environment Light Map Selection Section>
The environment light map selection section 313 is a functional section that selects an environment light map with a value that is close to the glossiness effect degree calculated by the glossiness effect degree calculation section 312.
The vertical axis in
In
The environment light map A is an omnidirectional image that includes a light source with high directivity such as a point light source. Examples of the point light source include LED illumination, for example. The glossiness effect degree of the environment light map A illustrated in
The environment light map B is an omnidirectional image that includes a light source with high diffusion, such as a surface light source, compared to the point light source. Examples of the surface light source include organic EL illumination. In the present exemplary embodiment, the glossiness effect degree of the environment light map B is 1.0. The value here is also exemplary, and it is not intended that the glossiness effect degree of the environment light map B is limited to 1.0.
The environment light map C is an omnidirectional image that includes a light source with high diffusion, such as a uniform diffusion light source, compared to the surface light source. Examples of the uniform diffusion light source include illumination with a diffusion plate. In the present exemplary embodiment, the glossiness effect degree of the environment light map C is 0.7. The value here is also exemplary, and it is not intended that the glossiness effect degree of the environment light map C is limited to 0.7.
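The selection rule can be sketched as follows. The absolute-difference criterion is an assumption, since the disclosure states only that a map with a "close" value is selected; the stored degrees follow the example values above.

```python
# Minimal sketch of environment light map selection by glossiness effect degree.
ENVIRONMENT_LIGHT_MAPS = {"A": 1.4, "B": 1.0, "C": 0.7}

def select_environment_light_map(image_degree):
    # Pick the stored map whose glossiness effect degree is closest to the
    # degree calculated for the environment image.
    return min(ENVIRONMENT_LIGHT_MAPS,
               key=lambda name: abs(ENVIRONMENT_LIGHT_MAPS[name] - image_degree))
```

For an environment image with a glossiness effect degree of 1.3, this selects the environment light map A, matching the example in which a glossiness effect degree of 1.3 leads to selection of the environment light map A with a glossiness effect degree of 1.4.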
<Glossiness Reproduction Section>
First, the glossiness reproduction section 314 sets the selected environment light map to a glossiness reproduction program (step 11). In the present exemplary embodiment, the glossiness effect degree of the environment image is 1.3. Therefore, the environment light map A with a glossiness effect degree of 1.4 is set to the glossiness reproduction program (see
Next, the glossiness reproduction section 314 generates a rendered image of an article through image-based lighting (step 12).
Image-based lighting is a rendering method that reproduces how an article given from the user would look in terms of color and gloss, using the set environment light map as illumination information and the camera position as the point of view.
After that, the glossiness reproduction section 314 outputs the generated rendered image (i.e. glossiness reproduction image) (step 13). Natural light and shade close to those that would be obtained if the article were observed at the observation location are expressed in the rendered image.
The vertical axis in
The article illustrated in
It is seen from
The generated glossiness reproduction image is displayed on the display 15 (see
In
In
The display screen illustrated in
In addition, when displaying a glossiness reproduction image, the glossiness reproduction section 314 may display, on the display 15 of the client terminal 10, the environment light map that was used to generate the glossiness reproduction image.
In
In
This information display enables the user to confirm not only the generated glossiness reproduction image 151 but also the environment light map that was used to generate the glossiness reproduction image 151. As a result, the user is enabled to verify the choice of the environment light map.
In
The display illustrated in
In
Not only an image of the environment light map that was used to generate the glossiness reproduction image 151 but also an image of the different candidate 154 for an environment light map is displayed, which enables the user to provide an instruction to regenerate a glossiness reproduction image 151 using the different candidate 154.
Providing the function of displaying the different candidate 154 not only enables the user to verify the environment light map that was used to generate the glossiness reproduction image 151, but also enables the user to confirm, on the display 15, the glossiness reproduction image 151 generated using the different candidate 154.
The screen examples illustrated in
A method of enhancing the degree of reproducibility of the glossiness of an article will be described in accordance with the present exemplary embodiment.
Also in the present exemplary embodiment, the print system 1 illustrated in
<Overview of Texture Reproduction Process>
One of the features peculiar to the print server 30 illustrated in
In this manner, a glossiness effect degree and an average brightness calculated for an environment image captured at an observation location are given to the environment light map selection section 313A that is used in the present exemplary embodiment. That is, the environment light map selection section 313A selects an environment light map that is close to the environment at the observation location from an environment light map storage section 341A using the glossiness effect degree and the average brightness.
In the present exemplary embodiment, the environment light map storage section 341A is required to store a plurality of environment light maps with different average brightnesses for each glossiness effect degree.
In
For example, an environment light map collection AA is a collection of environment light maps A1, A2, A3, . . . all with a glossiness effect degree of 1.4. The environment light maps A1, A2, A3, . . . have different average brightnesses.
An environment light map collection BB is a collection of environment light maps B1, B2, B3, . . . all with a glossiness effect degree of 1.0. The environment light maps B1, B2, B3, . . . have different average brightnesses.
Similarly, an environment light map collection CC is a collection of environment light maps C1, C2, C3, . . . all with a glossiness effect degree of 0.7. The environment light maps C1, C2, C3, . . . have different average brightnesses.
The environment light map selection section 313A discussed earlier selects an environment light map with a glossiness effect degree and an average brightness close to those of the environment image, and outputs the selected environment light map to the glossiness reproduction section 314.
<Details of Texture Reproduction Process>
Differences from the first exemplary embodiment will be described below.
<Average Brightness Calculation Section>
The average brightness calculation section 315 calculates a brightness of an environment image using a calculation formula, and calculates an average brightness of the environment image (step 21). The calculation formula may be 0.299×R+0.587×G+0.114×B, for example.
The glossiness effect degree calculation section 312 also requires the brightness of each pixel. Thus, the glossiness effect degree calculation section 312 and the average brightness calculation section 315 may share the brightness calculated for each pixel of the environment image.
The average brightness is calculated for the entire environment image. In the present exemplary embodiment, a calculated value of the average brightness is rounded off to the second decimal place.
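Step 21 can be sketched as follows, again assuming the environment image is given as a list of RGB tuples.

```python
def average_brightness(pixels):
    # Step 21: brightness per pixel (formula from the disclosure), averaged over
    # the entire environment image and rounded off to the second decimal place.
    values = [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels]
    return round(sum(values) / len(values), 2)
```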
<Environment Light Map Selection Section>
The environment light map selection section 313A selects an environment light map with values close to the glossiness effect degree calculated by the glossiness effect degree calculation section 312 and the average brightness calculated by the average brightness calculation section 315.
The vertical axis in
Also in
Due to space limitations, only three environment light maps that belong to each environment light map collection are illustrated.
The environment light map collection AA includes omnidirectional images that include a point light source such as LED illumination. The environment light map A1 has a glossiness effect degree of 1.4 and an average brightness of 78. The environment light map A2 has a glossiness effect degree of 1.4 and an average brightness of 80. The environment light map A3 has a glossiness effect degree of 1.4 and an average brightness of 72.
The environment light map collection BB includes omnidirectional images that include a surface light source such as organic EL illumination. The environment light map B1 has a glossiness effect degree of 1.0 and an average brightness of 85. The environment light map B2 has a glossiness effect degree of 1.0 and an average brightness of 86. The environment light map B3 has a glossiness effect degree of 1.0 and an average brightness of 71.
The environment light map collection CC includes omnidirectional images that include a uniform diffusion light source such as illumination with a diffusion plate. The environment light map C1 has a glossiness effect degree of 0.7 and an average brightness of 75. The environment light map C2 has a glossiness effect degree of 0.7 and an average brightness of 84. The environment light map C3 has a glossiness effect degree of 0.7 and an average brightness of 80.
The environment light map selection section 313A according to the present exemplary embodiment selects an environment light map with not only a glossiness effect degree but also an average brightness that is close to that of the environment image, and outputs the selected environment light map to the glossiness reproduction section 314. In the example in
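This two-stage selection can be sketched as follows. The disclosure does not specify the exact closeness criterion, so nearest glossiness effect degree followed by nearest average brightness is assumed here; the stored values are those listed above.

```python
# Hedged sketch of selection by glossiness effect degree and average brightness.
# Collections keyed by glossiness effect degree; each maps name -> average brightness.
COLLECTIONS = {
    1.4: {"A1": 78, "A2": 80, "A3": 72},
    1.0: {"B1": 85, "B2": 86, "B3": 71},
    0.7: {"C1": 75, "C2": 84, "C3": 80},
}

def select_map(image_degree, image_average_brightness):
    # Stage 1: the nearest glossiness effect degree selects a collection.
    degree = min(COLLECTIONS, key=lambda d: abs(d - image_degree))
    maps = COLLECTIONS[degree]
    # Stage 2: the nearest average brightness selects a map within that collection.
    return min(maps, key=lambda name: abs(maps[name] - image_average_brightness))
```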
Another method of enhancing the degree of reproducibility of the glossiness of an article will be described in accordance with the present exemplary embodiment.
Also in the present exemplary embodiment, the print system 1 illustrated in
<Overview of Texture Reproduction Process>
One of the features peculiar to the print server 30 illustrated in
In this manner, a glossiness effect degree and a chromaticity calculated for an environment image captured at an observation location are given to the environment light map selection section 313B that is used in the present exemplary embodiment. That is, the environment light map selection section 313B selects an environment light map that is close to the environment at the observation location from an environment light map storage section 341B using the glossiness effect degree and the chromaticity. The chromaticity is given by hue and saturation.
In the present exemplary embodiment, the environment light map storage section 341B is required to store a plurality of environment light maps with different chromaticities for each glossiness effect degree.
Also in
For example, an environment light map collection AA is a collection of environment light maps A1, A2, A3, . . . all with a glossiness effect degree of 1.4. The environment light maps A1, A2, A3, . . . have different chromaticities.
An environment light map collection BB is a collection of environment light maps B1, B2, B3, . . . all with a glossiness effect degree of 1.0. The environment light maps B1, B2, B3, . . . have different chromaticities.
Similarly, an environment light map collection CC is a collection of environment light maps C1, C2, C3, . . . all with a glossiness effect degree of 0.7. The environment light maps C1, C2, C3, . . . have different chromaticities.
The environment light map selection section 313B discussed earlier selects an environment light map with a glossiness effect degree and a chromaticity close to those of the environment image, and outputs the selected environment light map to the glossiness reproduction section 314.
<Details of Texture Reproduction Process>
Differences from the first exemplary embodiment will be described below.
<Chromaticity Calculation Section>
The chromaticity calculation section 316 converts an environment image into HSV values (step 31).
As discussed earlier, the environment image is given in RGB values. The equations for conversion from RGB values into HSV values are known.
The chromaticity is calculated for the entire environment image. In the present exemplary embodiment, a calculated value of the chromaticity is rounded off to the second decimal place.
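Step 31 can be sketched with the standard `colorsys` module. The disclosure does not state how per-pixel hue and saturation are aggregated into a single chromaticity for the image, so simple averaging is assumed here; note that naive averaging of hue angles ignores wraparound at 360°.

```python
import colorsys

def image_hue_saturation(pixels):
    # Convert each RGB pixel (0-255 per channel) to HSV and average hue (degrees)
    # and saturation (percent) over the entire environment image, rounding off to
    # the second decimal place as in the disclosure.
    hues, sats = [], []
    for r, g, b in pixels:
        h, s, _v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        hues.append(h * 360.0)
        sats.append(s * 100.0)
    return (round(sum(hues) / len(hues), 2), round(sum(sats) / len(sats), 2))
```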
<Environment Light Map Selection Section>
The environment light map selection section 313B selects an environment light map with values close to the glossiness effect degree calculated by the glossiness effect degree calculation section 312 and the chromaticity calculated by the chromaticity calculation section 316.
First, the environment light map selection section 313B selects an environment light map collection with a glossiness effect degree that is close to the calculated glossiness effect degree from the environment light map storage section 341B (step 41). In the present exemplary embodiment, the glossiness effect degree of the environment image is 1.3. Therefore, the environment light map selection section 313B selects the environment light map collection AA with a glossiness effect degree of 1.4.
Next, the environment light map selection section 313B converts environment light maps of the selected environment light map collection (environment light map collection AA) into HSV values (step 42).
Subsequently, the environment light map selection section 313B calculates a hue difference and a saturation difference between the environment image and the environment light maps (step 43).
The vertical axis in
Also in
Due to space limitations, only three environment light maps that belong to each environment light map collection are illustrated.
The environment light map collection AA includes omnidirectional images that include a point light source such as LED illumination. The environment light map A1 has a glossiness effect degree of 1.4, a hue of 221°, and a saturation of 70%. The environment light map A2 has a glossiness effect degree of 1.4, a hue of 222°, and a saturation of 72%. The environment light map A3 has a glossiness effect degree of 1.4, a hue of 218°, and a saturation of 74%.
In step 43, a hue difference and a saturation difference from the environment image are calculated for the environment light maps A1, A2, A3, . . . .
For example, the hue difference and the saturation difference between the environment image and the environment light map A1 are calculated as −11° (=210°−221°) and 14% (=84%−70%), respectively.
Similarly, the hue difference and the saturation difference between the environment image and the environment light map A2 are calculated as −12° (=210°−222°) and 12% (=84%−72%), respectively, and the hue difference and the saturation difference between the environment image and the environment light map A3 are calculated as −8° (=210°−218°) and 10% (=84%−74%), respectively.
For reference, the environment light map collection BB includes omnidirectional images that include a surface light source such as organic EL illumination. The environment light map B1 has a glossiness effect degree of 1.0, a hue of 201°, and a saturation of 78%. The environment light map B2 has a glossiness effect degree of 1.0, a hue of 203°, and a saturation of 75%. The environment light map B3 has a glossiness effect degree of 1.0, a hue of 210°, and a saturation of 74%.
The environment light map collection CC includes omnidirectional images that include a uniform diffusion light source such as illumination with a diffusion plate. The environment light map C1 has a glossiness effect degree of 0.7, a hue of 221°, and a saturation of 69%. The environment light map C2 has a glossiness effect degree of 0.7, a hue of 223°, and a saturation of 72%. The environment light map C3 has a glossiness effect degree of 0.7, a hue of 218°, and a saturation of 78%.
After that, the environment light map selection section 313B selects an environment light map with the smallest hue difference and saturation difference (step 44). For example, when a plurality of environment light maps with the smallest hue difference are found, the environment light map with the smallest saturation difference is selected from among them. When a plurality of environment light maps with the smallest saturation difference are found, any one of those environment light maps is selected.
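The selection of step 44, including the tie-breaking on the saturation difference, may be sketched as follows. The candidate values below mirror the environment light map collection AA example in the text (environment image: hue 210°, saturation 84%); the function name is illustrative:

```python
def select_environment_light_map(env_hue, env_sat, candidates):
    """Pick the candidate with the smallest absolute hue difference
    from the environment image; ties are broken by the smaller
    absolute saturation difference (step 44).

    candidates: dict mapping a map name to a (hue_deg, sat_pct) pair.
    """
    return min(
        candidates.items(),
        key=lambda item: (abs(env_hue - item[1][0]),
                          abs(env_sat - item[1][1])),
    )[0]

maps = {"A1": (221, 70), "A2": (222, 72), "A3": (218, 74)}
print(select_environment_light_map(210, 84, maps))  # -> A3
```

With these values, A3 has both the smallest hue difference (−8°) and the smallest saturation difference (10%), so it is selected.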
The environment light map selection section 313B according to the present exemplary embodiment selects an environment light map with not only a glossiness effect degree but also a chromaticity that is close to that of the environment image, and outputs the selected environment light map to the glossiness reproduction section 314. In the example in
Another method of enhancing the degree of reproducibility of the glossiness of an article will be described in accordance with the present exemplary embodiment.
Also in the present exemplary embodiment, the print system 1 illustrated in
<Overview of Texture Reproduction Process>
The exemplary embodiment in
The exemplary embodiment is also the same as the first exemplary embodiment in that the environment light map selection section 313 selects an environment light map with a glossiness effect degree that is close to that of the environment image.
The difference is that the environment light map selected by the environment light map selection section 313 is corrected to be closer to the illumination environment of the environment image.
To this end, an environment light map correction section 317 is additionally provided in
The environment light map correction section 317 receives an input of the environment image acquired by the environment image acquisition section 311 and the environment light map selected by the environment light map selection section 313.
In the present exemplary embodiment, the environment light map correction section 317 is composed of a brightness correction section 317A that corrects the average brightness of the environment light map to be closer to the illumination environment of the environment image, and a chromaticity correction section 317B that corrects the chromaticity of the environment light map to be closer to the illumination environment of the environment image.
The environment light map with the average brightness corrected by the brightness correction section 317A and the environment light map with the chromaticity corrected by the chromaticity correction section 317B are each output to the glossiness reproduction section 314.
Thus, the glossiness reproduction section 314 according to the present exemplary embodiment reproduces the glossiness of an article using the corrected environment light maps.
Specifically, in generating a rendered image, the environment light map with the corrected brightness is used for the average brightness of the illumination environment at the observation location, and the environment light map with the corrected chromaticity is used for the chromaticity of the illumination environment at the observation location.
<Details of Texture Reproduction Process>
Process operation executed by the various functional sections will be described in detail below.
<Environment Light Map Correction Section>
First, the brightness correction section 317A calculates an average brightness of an environment image and an average brightness of a selected environment light map (step 51). The brightness of each pixel is calculated as 0.299×R+0.587×G+0.114×B, for example.
Next, the brightness correction section 317A converts the brightness of the environment light map through exponentiation with an exponent a (step 52). The exponent a is a real number.
When the brightness before the conversion is defined as brightnessIN and the brightness after the conversion is defined as brightnessOUT, the conversion is performed by the following equation.
brightnessOUT = brightnessIN^a
Subsequently, the brightness correction section 317A calculates an average brightness of the environment light map after the conversion (step 53).
After the calculation, the brightness correction section 317A determines whether or not the average brightness of the environment light map is equal to the average brightness of the environment image (step 54).
It may be determined in step 54 whether or not the difference between the average brightness of the environment light map and the average brightness of the environment image is less than a threshold. The threshold is given in advance.
When the average brightness of the environment light map and the average brightness of the environment image are different from each other, a negative result is obtained in step 54. In this case, the brightness correction section 317A changes the exponent a (step 55), and the process returns to step 52.
The exponent a may be increased or decreased by a fixed value, or the amount of increase or the amount of decrease may be determined in accordance with the difference between the average brightness of the environment light map and the average brightness of the environment image.
The direction in which the exponent a is increased or decreased is reversed when the magnitude relationship between the average brightness of the environment light map and the average brightness of the environment image is reversed. For example, when increasing the exponent a causes the average brightness of the environment light map, which had been more than the average brightness of the environment image, to become less than it, the exponent a is thereafter decreased.
Alternatively, the exponent a may be obtained as a result of inputting an environment light map and an environment image to a learning model that has learned the relationship between an environment light map and an environment image as inputs and the exponent a as an output through machine learning.
When the average brightness of the environment light map and the average brightness of the environment image are equal to each other, a positive result is obtained in step 54. In this case, the brightness correction section 317A outputs the environment light map after the brightness correction (step 56).
<Chromaticity Correction Section>
The chromaticity correction section 317B converts the environment image and the selected environment light map into HSV values (step 61). The equations for conversion indicated in
Next, the chromaticity correction section 317B adjusts the hue of the environment light map through exponentiation with a correction coefficient h, and adjusts the saturation of the environment light map through exponentiation with a correction coefficient s (step 62). The coefficients h and s are real numbers.
When the hue before the adjustment is defined as hueIN and the hue after the adjustment is defined as hueOUT, the adjustment is performed using the following equation.
hueOUT = hueIN^h
Similarly, when the saturation before the adjustment is defined as saturationIN and the saturation after the adjustment is defined as saturationOUT, the adjustment is performed using the following equation.
saturationOUT = saturationIN^s
Subsequently, the chromaticity correction section 317B determines whether or not the hue of the environment image is equal to the hueOUT of the environment light map and the saturation of the environment image is equal to the saturationOUT of the environment light map (step 63).
In step 63, it may be determined whether or not the difference between the hue of the environment light map and the hue of the environment image is less than a threshold, and it may be determined whether or not the difference between the saturation of the environment light map and the saturation of the environment image is less than a threshold. The threshold is given in advance.
When at least one of the hue and the saturation of the environment light map is different from the corresponding value of the environment image, a negative result is obtained in step 63. In this case, the chromaticity correction section 317B changes one or both of the correction coefficients h and s (step 64), and the process returns to step 62.
The correction coefficients h and s may be changed in the same manner as the exponent a is changed in step 55 (see
When the hue and the saturation of the environment light map are equal to the corresponding values of the environment image, a positive result is obtained in step 63. In this case, the chromaticity correction section 317B outputs the environment light map after the chromaticity correction (step 65).
<Overview of Process>
When an environment image is given, first, an environment light map with a glossiness effect degree that is close to that of the environment image is selected by the environment light map selection section 313 (see
In the present exemplary embodiment, an environment light map with a glossiness effect degree of 1.4 is selected for an environment image with a glossiness effect degree of 1.3.
The selected environment light map and the environment image have different hues and saturations. For example, the environment light map has a hue of 221° while the environment image has a hue of 219°. The environment light map has a saturation of 70% while the environment image has a saturation of 84%.
In this manner, the chromaticity of the selected environment light map is different from the chromaticity at the observation location.
Thus, in the present exemplary embodiment, the selected environment light map is corrected such that the hue and the saturation of the corrected environment light map coincide with those of the environment image.
As a result, it is possible to improve the reproduction degree of the chromaticity compared to before the correction.
Although not illustrated in
In the present exemplary embodiment, a glossiness effect degree is calculated on the basis of the standard deviation and the distortion degree of the brightness, and thus the glossiness effect degree of the environment light map may be varied by causing the average brightness of the environment light map to coincide with the average brightness of the environment image.
However, the glossiness effect degree of the environment light map before the correction is originally close to the glossiness effect degree of the environment image, and thus it is expected that the illumination environment of the environment light map is brought closer to the illumination environment at the observation location by causing the average brightness of the environment light map to coincide with the average brightness of the environment image.
While the present exemplary embodiment is based on the method according to the first exemplary embodiment, the present exemplary embodiment may be combined with the method according to the second exemplary embodiment, or may be combined with the method according to the third exemplary embodiment.
While the brightness of the environment light map is corrected such that the average brightness of the environment light map becomes equal to the average brightness of the environment image in the present exemplary embodiment, the brightness of the environment light map may be corrected such that the glossiness effect degree of the environment light map coincides with that of the environment image.
A modification of the fourth exemplary embodiment will be described as the present exemplary embodiment.
While both the average brightness and the chromaticity of the environment light map are corrected in the fourth exemplary embodiment, only the chromaticity of the environment light map is corrected in the present exemplary embodiment.
The environment light map correction section 317 in
Another method of enhancing the degree of reproducibility of the glossiness of an article will be described in accordance with the present exemplary embodiment.
Also in the present exemplary embodiment, the print system 1 illustrated in
<Overview of Texture Reproduction Process>
One of the features peculiar to the print server 30 illustrated in
Instead, the print server 30 illustrated in
In the present exemplary embodiment, a brightness standard deviation of the illuminated portion calculated in advance is linked to the environment light map stored in the environment light map storage section 341C. For example, “37.5” is linked to the environment light map A, “60.0” is linked to the environment light map B, and “90.0” is linked to the environment light map C.
The environment light map selection section 313C in the print server 30 illustrated in
In this manner, in the present exemplary embodiment, the environment light map selection section 313C selects an environment light map with a feature amount (i.e. the brightness standard deviation of the illuminated portion) that is highly similar to that at the observation location. In other words, an environment light map is selected with focus on the similarity of a feature amount of a principal illuminated portion, rather than the similarity for the entire screen.
<Details of Texture Reproduction Process>
Differences from the first exemplary embodiment will be described below.
<Environment Image Acquisition Section>
The range to be acquired as an environment image may be specified by the user.
Alternatively, an environment image captured at an observation location may be input to a machine learning model that outputs a principal illuminated portion of an input image. Alternatively, a region that includes a lighting fixture may be extracted as an environment image using an image recognition technology.
<Feature Amount Calculation Section>
The feature amount calculation section 318 calculates a brightness of an environment image using a calculation formula, and calculates a standard deviation of the brightness (i.e. a brightness standard deviation) (step 71). The calculation formula may be 0.299×R+0.587×G+0.114×B, for example. The brightness standard deviation is an example of a “feature amount related to brightness distribution”.
The brightness is calculated for each pixel, and the brightness standard deviation is calculated for the entire environment image (i.e. principal illuminated portion).
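The feature amount calculation of step 71 may be sketched as follows. The rounding to the first decimal place is taken from the linking process described later in this embodiment; the function name and pixel-list representation are illustrative:

```python
import math

def brightness_std(pixels):
    """Compute the brightness of each (R, G, B) pixel with the
    weighted formula 0.299*R + 0.587*G + 0.114*B, then return the
    standard deviation of the brightness over the entire environment
    image (i.e. the principal illuminated portion)."""
    lum = [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels]
    mean = sum(lum) / len(lum)
    var = sum((v - mean) ** 2 for v in lum) / len(lum)
    return round(math.sqrt(var), 1)
```

For a two-pixel image of pure black and pure white, the brightnesses are 0 and 255, giving a brightness standard deviation of 127.5.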
<Process of Linking Standard Deviation to Environment Light Map>
In the present exemplary embodiment, the processor 31 (see
First, the processor 31 renders an environment light map stored in the environment light map storage section 341C (step 81). The “environment light map” is an omnidirectional image.
Next, the processor 31 extracts an illumination image from the rendered image (step 82). The “illumination image” refers to a partial image that includes principal illumination. The term “principal illumination” refers to a region with a high brightness compared to the other regions and with a larger light source area compared to the other regions.
Subsequently, the processor 31 calculates a brightness of the illumination image using a calculation formula, and calculates a standard deviation (i.e. brightness standard deviation) of the brightness (step 83). The calculation formula may be 0.299×R+0.587×G+0.114×B, for example. The brightness standard deviation is obtained by rounding off a calculated value to the first decimal place, for example.
After that, the processor 31 links the calculated brightness standard deviation to the environment light map (step 84).
The brightness standard deviation of the illuminated portion surrounded by the broken line is linked to the environment light map A. In this example, the brightness standard deviation is 37.5. For reference, the environment light map A has a glossiness effect degree of 1.4.
Similarly, the brightness standard deviation of the illuminated portion surrounded by the broken line is linked to the environment light map B. In this example, the brightness standard deviation is 60.0. For reference, the environment light map B has a glossiness effect degree of 1.0.
The brightness standard deviation of the illuminated portion surrounded by the broken line is linked to the environment light map C. In this example, the brightness standard deviation is 90.0. For reference, the environment light map C has a glossiness effect degree of 0.7.
While an illumination image is extracted from the rendered image in the example in
<Feature Amount Difference Calculation Section>
First, the feature amount difference calculation section 319 acquires a feature amount of the environment image (step 91). The feature amount of the environment image is the brightness standard deviation as discussed earlier, and is given from the feature amount calculation section 318.
Next, the feature amount difference calculation section 319 calculates a feature amount difference for each environment light map (step 92).
The feature amount difference is calculated by the following equation, for example.
Feature amount difference=(feature amount of environment image)−(feature amount of environment light map)
In the present exemplary embodiment, the feature amount difference is obtained by rounding off a calculated value to the second decimal place.
In
In this case, the feature amount difference for the environment light map A is 7.5 (=45.0−37.5). The feature amount difference for the environment light map B is −15.0 (=45.0−60.0). The feature amount difference for the environment light map C is −45.0 (=45.0−90.0).
<Environment Light Map Selection Section>
The environment light map selection section 313C according to the present exemplary embodiment selects the environment light map for which the absolute value of the calculated feature amount difference is the smallest, and outputs the selected environment light map to the glossiness reproduction section 314. In the example in
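Steps 91 and 92 and the subsequent selection may be sketched as follows, using the linked brightness standard deviations 37.5, 60.0, and 90.0 and the environment image feature amount 45.0 from the text; the function name is illustrative:

```python
def select_by_feature_amount(env_std, linked_maps):
    """Compute the feature amount difference
    (environment image) - (environment light map) for each stored
    map, rounded off to the second decimal place, then select the
    map with the smallest absolute difference."""
    diffs = {name: round(env_std - std, 2)
             for name, std in linked_maps.items()}
    best = min(diffs, key=lambda name: abs(diffs[name]))
    return best, diffs

best, diffs = select_by_feature_amount(
    45.0, {"A": 37.5, "B": 60.0, "C": 90.0})
print(best, diffs)  # -> A {'A': 7.5, 'B': -15.0, 'C': -45.0}
```

The differences 7.5, −15.0, and −45.0 match those calculated in the text, and the environment light map A, with the smallest absolute difference, is selected.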
While the brightness standard deviation of the illuminated portion is used as the feature amount in the present exemplary embodiment, the distortion degree of the illuminated portion may also be used.
The screen display illustrated in
In the present exemplary embodiment, the principal illuminated portion of the environment light map that is used to calculate a feature amount may be presented. This presentation enables the user to verify setting of the principal illuminated portion.
In the present exemplary embodiment, a case where the precision in selecting an environment light map is enhanced using information on the average brightness of an illuminated portion will be described on the basis of the print system 1 described in relation to the sixth exemplary embodiment discussed earlier.
In the present exemplary embodiment, as in the second exemplary embodiment discussed earlier, an average brightness of an environment image acquired by the environment image acquisition section 311 is calculated, and given to the environment light map selection section 313C (see
The environment light map storage section 341C (see
In this exemplary embodiment, the environment light map selection section 313C first specifies a plurality of environment light maps with a small feature amount difference as candidates to be selected.
Next, the environment light map selection section 313C selects an environment light map with an average brightness that is close to (or that is not significantly different from) the average brightness of the environment image, and outputs the selected environment light map to the glossiness reproduction section 314 (see
In the present exemplary embodiment, a case where the precision in selecting an environment light map is enhanced using information on the chromaticity of an illuminated portion will be described on the basis of the print system 1 described in relation to the sixth exemplary embodiment discussed earlier.
In the present exemplary embodiment, as in the third exemplary embodiment discussed earlier, the chromaticity of an environment image acquired by the environment image acquisition section 311 is calculated, and given to the environment light map selection section 313C (see
The environment light map storage section 341C (see
Also in this exemplary embodiment, the environment light map selection section 313C first specifies a plurality of environment light maps with a small feature amount difference as candidates to be selected.
Next, the environment light map selection section 313C selects an environment light map with a chromaticity that is close to (or that is not significantly different from) the chromaticity of the environment image, and outputs the selected environment light map to the glossiness reproduction section 314 (see
In the present exemplary embodiment, a case where the selected environment light map is corrected will be described on the basis of the print system 1 described in relation to the sixth exemplary embodiment discussed earlier.
While, in the sixth exemplary embodiment, an environment light map with an illuminated portion with a brightness standard deviation that is close to that of the illuminated portion of the environment image is selected from among the environment light maps stored in the environment light map storage section 341C, there remain differences in the average brightness and the chromaticity.
Thus, also in the present exemplary embodiment, as in the fourth exemplary embodiment discussed earlier, the environment light map selected by the environment light map selection section 313C (see
As described in relation to the fifth exemplary embodiment, the environment light map may be corrected for only the chromaticity.
For example, the function may be executed by the client terminal 10 or the image forming apparatus 20 (see
The information processing system 1A illustrated in
The cloud server 40 is also an example of an information processing apparatus. The hardware configuration of the cloud server 40 may be the same as the hardware configuration illustrated in
The information processing system 1A illustrated in
In the present exemplary embodiment, the processing functions according to the first to ninth exemplary embodiments discussed earlier are implemented through execution of a program on the cloud server 40.
While the cloud server 40 is prepared in
A mobile communication system such as 4G or 5G may be used in place of the cloud network CN.
In the embodiments above, the term “processor” is broad enough to encompass one processor or plural processors in collaboration which are located physically apart from each other but may work cooperatively. The order of operations of the processor is not limited to one described in the embodiments above, and may be changed.
The foregoing description of the exemplary embodiments of the present disclosure has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the disclosure and its practical applications, thereby enabling others skilled in the art to understand the disclosure for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the disclosure be defined by the following claims and their equivalents.
(((1)))
An information processing apparatus comprising a processor configured to: acquire a feature amount related to brightness distribution from a first image captured at an observation location; select an environment light map that is similar to the feature amount, from among a plurality of environment light maps prepared in advance; and control expression of a second image corresponding to an article observed at the observation location using the selected environment light map.
(((2)))
The information processing apparatus according to (((1))), wherein the processor is configured to select a first environment light map that is similar to an average brightness of the first image, from among a plurality of environment light maps that are similar to the feature amount, and control expression of the second image using the first environment light map.
(((3)))
The information processing apparatus according to (((2))), wherein the processor is configured to generate a second environment light map that is closer to the average brightness of the first image by correcting the first environment light map, and control expression of the second image using the second environment light map.
(((4)))
The information processing apparatus according to (((1))), wherein the processor is configured to select a first environment light map that is similar to a chromaticity of the first image, from among a plurality of environment light maps that are similar to the feature amount, and controls expression of the second image using the first environment light map.
(((5)))
The information processing apparatus according to (((4))), wherein the processor is configured to generate a second environment light map that is closer to the chromaticity of the first image by correcting the first environment light map, and control expression of the second image using the second environment light map.
(((6)))
The information processing apparatus according to any one of (((1))) to (((5))), wherein the processor is configured to display an index that represents a gloss degree of the environment light map that is used to generate the second image, the index being displayed in association with the second image.
(((7)))
The information processing apparatus according to any one of (((1))) to (((5))), wherein the processor is configured to display the environment light map that is used to generate the second image, the environment light map being displayed in association with the second image.
(((8)))
The information processing apparatus according to any one of (((1))) to (((5))), wherein the processor is configured to display one or more environment light maps that are similar to the feature amount of the first image, the one or more environment light maps being displayed in association with the second image.
(((9)))
The information processing apparatus according to (((1))), wherein the processor is configured to acquire the feature amount from an illuminated portion of the first image.
(((10)))
A program causing a computer to execute a process comprising: acquiring a feature amount related to brightness distribution from a first image captured at an observation location; selecting an environment light map that is similar to the feature amount, from among a plurality of environment light maps prepared in advance; and controlling expression of a second image corresponding to an article observed at the observation location using the selected environment light map.
Number | Date | Country | Kind |
---|---|---|---|
2022-143496 | Sep 2022 | JP | national |