An aspect of the embodiments relates to a technique for displaying data representing the material appearance of an object.
In recent years, various techniques for searching for data including material appearance information have been proposed. For example, Japanese Patent Application Laid-Open No. 2017-37455 discusses a technique for searching for material information corresponding to verbal information input by a user, and displaying an image of a product made of the material indicated by the material information.
In a case where the search target is an object whose appearance changes depending on the observation angle, simply displaying one still image as the search result, as in Japanese Patent Application Laid-Open No. 2017-37455, may not make it clear whether the object has the material appearance intended by the user.
According to an aspect of the embodiments, an apparatus includes a setting unit configured to set an element to be given priority in search of data representing material appearance of an object, a search unit configured to search for the data based on the set element, and a control unit configured to display, on a display unit, an image that corresponds to the searched data and corresponds to appearance of the object in a case where the object is observed under a plurality of conditions.
Further features of the disclosure will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Some exemplary embodiments are described below with reference to the accompanying drawings. The following exemplary embodiments are not intended to limit the disclosure. Not all combinations of the characteristics described in the exemplary embodiments are necessarily essential to the solving means of the disclosure.
A first exemplary embodiment of the disclosure will be described below. Material appearance data representing the texture of an object is used to enable a user to recognize, via a display apparatus, changes in color appearance and glossiness depending on conditions such as the position and brightness of illumination. In the present exemplary embodiment, a search corresponding to an input by the user is performed by using a database storing the material appearance data, and the search results are displayed in the form of moving images on the display apparatus. The targets to be searched for in the present exemplary embodiment are images of curtains, but may be images of other objects.
A procedure for searching for images of curtains according to the present exemplary embodiment will now be described.
When the “enlarge fabric” button is selected, the fabric of the selected curtain is enlarged and displayed so that the user can check its surface shape.
Search results are displayed along two axes: the data groups generated based on the first priority element are arranged from left to right, and the pieces of data in each group, ranked based on the second priority element, are arranged from top to bottom.
In the present exemplary embodiment, the material appearance data corresponding to a plurality of curtains and the thumbnail moving images with which the user can easily tell the material appearance of the respective curtains are stored in advance in the HDD 113.
The information processing apparatus 1 includes a setting unit 1001, an acquisition unit 1002, a determination unit 1003, a display control unit 1004, and a search unit 1005. The setting unit 1001 sets elements to be given priority in search of the material appearance data. The acquisition unit 1002 acquires instruction information indicating a user instruction. The determination unit 1003 determines whether the instruction from the user indicates similarity search, based on the acquired instruction information. The display control unit 1004 controls display of the display apparatus 115. The search unit 1005 searches for the material appearance data.
Processing performed by the information processing apparatus 1 according to the present exemplary embodiment will now be described with reference to a flowchart.
In step S301, the setting unit 1001 initializes two elements to be given priority in the search of the material appearance data. The setting unit 1001 according to the present exemplary embodiment sets a color as a first priority element, and sets a surface shape as a second priority element. In step S302, the acquisition unit 1002 acquires the instruction information indicating the user instruction. The instruction is input by the user via the input apparatus 110. In step S303, the determination unit 1003 determines whether the instruction information acquired in step S302 indicates execution of the similarity search. In a case where the instruction information acquired in step S302 indicates execution of the similarity search (YES in step S303), the processing proceeds to step S305. Otherwise (NO in step S303), the processing proceeds to step S304. In step S305, the search unit 1005 searches for the material appearance data, and the display control unit 1004 displays the search results on the display apparatus 115. Details of the search processing in step S305 will be described below. In step S304, the display control unit 1004 changes the display of the display apparatus 115 in accordance with the instruction indicated by the instruction information acquired in step S302, and the setting unit 1001 updates the contents of the priority elements. Details of the processing of changing the display and the processing of updating the priority elements in step S304 will be described below. In step S306, the determination unit 1003 determines whether to end the series of processing, based on whether an end instruction has been received from the user. If the series of processing is to be ended (YES in step S306), the processing ends. If the series of processing is not to be ended (NO in step S306), the processing returns to step S302, and the acquisition unit 1002 newly acquires the instruction information.
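The flow of steps S301 to S306 can be summarized by the following minimal sketch in Python. The helper objects standing in for the units 1001 to 1005 and their method names are illustrative assumptions, not part of the disclosure.

```python
# A minimal sketch of the main flow in steps S301 to S306. The helper
# objects and their method names are assumptions for illustration.

def main_loop(setting, acquisition, determination, display_control, search):
    # Step S301: initialize the two priority elements.
    setting.first_priority = "color"
    setting.second_priority = "surface_shape"

    while True:
        # Step S302: acquire instruction information via the input apparatus.
        instruction = acquisition.get_instruction()

        # Step S303: determine whether a similarity search is requested.
        if determination.is_similarity_search(instruction):
            # Step S305: search for the material appearance data and
            # display the search results.
            results = search.run(setting.first_priority,
                                 setting.second_priority)
            display_control.show_results(results)
        else:
            # Step S304: change the display and update the priority elements.
            display_control.apply(instruction)
            setting.update_priority_elements(instruction)

        # Step S306: end the series of processing when an end instruction
        # has been received from the user.
        if determination.is_end(instruction):
            break
```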
The processing of changing the display and updating the priority elements in step S304 will now be described with reference to a flowchart. In step S401, the setting unit 1001 determines whether the instruction information acquired in step S302 indicates an instruction to enlarge and display the fabric of the selected curtain. In a case where the instruction information indicates the instruction to enlarge and display the fabric (YES in step S401), the processing proceeds to step S405. In step S405, the setting unit 1001 sets the element of interest to the surface shape, and the display control unit 1004 enlarges and displays the fabric of the curtain. Otherwise (NO in step S401), the processing proceeds to step S402.
In a case where the instruction information acquired in step S302 indicates “gloss check” (YES in step S402), the processing proceeds to step S406. In step S406, the setting unit 1001 sets the element of interest to the gloss, and the display control unit 1004 displays a screen on which the user can adjust the movement of the light source, on the display apparatus 115. In a case where the instruction information acquired in step S302 does not indicate “gloss check” (NO in step S402), the processing proceeds to step S403. In step S403, the setting unit 1001 determines whether the instruction information acquired in step S302 indicates an instruction to enlarge and display the entire image of the selected curtain. In a case where the instruction information acquired in step S302 indicates the instruction to enlarge and display the entire image of the selected curtain (YES in step S403), the processing proceeds to step S407. In step S407, the setting unit 1001 sets the element of interest to the color, and the display control unit 1004 enlarges and displays the entire image of the curtain. In a case where the instruction information acquired in step S302 does not indicate the instruction to enlarge and display the entire image of the selected curtain (NO in step S403), the processing proceeds to step S404.
In step S404, the display control unit 1004 changes the display to the display corresponding to the instruction indicated by the instruction information. For example, in a case where the “display list” button is selected, the screen returns to the screen displaying the list in the processing in step S404. In a case where any of the other buttons is selected, the screen transitions to the screen corresponding to the selected button. In a case where the setting unit 1001 updates the element of interest and the display control unit 1004 updates the display screen in step S405, S406, or S407, the processing proceeds to step S408. In step S408, the setting unit 1001 determines whether the first priority element coincides with the element of interest updated in step S405, S406, or S407. If the first priority element coincides with the element of interest (YES in step S408), the processing in step S304 ends. If the first priority element does not coincide with the element of interest (NO in step S408), the processing proceeds to step S409. In step S409, the setting unit 1001 sets the current first priority element as the second priority element, and sets the element of interest set in step S405, S406, or S407 as the first priority element. The first priority element and the second priority element are parameters used in the search processing in step S305.
In the above-described processing, in a case where the last user instruction is the same as the immediately preceding one, the first priority element and the second priority element are not updated. In a case where the last user instruction relates to updating of the priority elements and differs from the preceding such instruction, both the first priority element and the second priority element are updated. During the update, the first priority element is demoted to the second priority element, and the new first priority element is determined based on the last user instruction.
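The update rule in steps S408 and S409 can be expressed as the following minimal sketch, assuming a simple state object holding the two priority elements; the names are illustrative, not from the disclosure.

```python
# A minimal sketch of the priority-element update in steps S408 and S409.
# The "state" object and attribute names are assumptions for illustration.

def update_priority_elements(state, element_of_interest):
    # Step S408: if the element of interest already has first priority
    # (e.g., the same kind of instruction was received twice in a row),
    # the priority elements are left unchanged.
    if state.first_priority == element_of_interest:
        return
    # Step S409: demote the current first priority element to second
    # priority, and promote the element of interest to first priority.
    state.second_priority = state.first_priority
    state.first_priority = element_of_interest
```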
The search processing in step S305 will now be described with reference to a flowchart. In step S501, the search unit 1005 acquires the material appearance data corresponding to the image selected by the user. In step S502, the search unit 1005 acquires a feature amount F1 of the first priority element.
In step S503, the search unit 1005 divides all of the material appearance data into a predetermined number N of data groups Ga(0) to Ga(N−1) in order of proximity to the selected image by using the feature amount F1. In the present exemplary embodiment, N is set to four. The proximity of the data is defined by a distance (difference) of the feature amount F1, and the data groups can be generated by dividing the range of distances into four stages. In step S504, the search unit 1005 acquires a feature amount F2 of the second priority element. In step S505, the search unit 1005 initializes a counter K by substituting zero into the counter K. In step S506, the search unit 1005 ranks the pieces of data in the data group Ga(K) by proximity to the material appearance data acquired in step S501, by using the feature amount F2. The proximity is defined by a distance of the feature amount F2. In step S507, the search unit 1005 acquires the thumbnail moving images corresponding to the respective pieces of data included in the data group Ga(K) in accordance with the ranking set in step S506. The display control unit 1004 displays the acquired thumbnail moving images on the display apparatus 115. The thumbnail moving images are generated in advance as moving images that facilitate checking how the appearance changes when the object is observed under different geometric conditions, for example, by changing the light source position, inclining the object, or changing the viewpoint of the user for each piece of data. The generated thumbnail moving images are stored as part of the material appearance data. In step S508, the search unit 1005 increments the counter K by one. In step S509, the search unit 1005 compares the counter K with the number of divisions made in step S503. In a case where the counter K is equal to the number of divisions (YES in step S509), it is determined that the distance calculation and the ranking based on the feature amount F2 have been performed on all of the data groups, and the processing ends. In a case where the counter K is not equal to the number of divisions (NO in step S509), the processing returns to step S506 in order to perform the distance calculation based on the feature amount F2 and the display for the next data group.
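Under the assumption that the feature amounts F1 and F2 are scalars and that proximity is their absolute difference, steps S503 to S509 can be sketched as follows. The function and attribute names are assumptions for illustration, not from the disclosure.

```python
# A minimal sketch of steps S503 to S509, assuming the feature amounts F1
# and F2 are scalars, proximity is their absolute difference, and each
# piece of data carries its thumbnail moving image.

def similarity_search(selected, dataset, f1, f2, n_groups=4):
    # Step S503: divide all of the material appearance data into N data
    # groups Ga(0) to Ga(N-1) by splitting the range of F1 distances to
    # the selected data into N stages (N = 4 in this embodiment).
    dists = [abs(f1(d) - f1(selected)) for d in dataset]
    max_dist = max(dists) or 1.0  # guard against an all-zero range
    groups = [[] for _ in range(n_groups)]
    for d, dist in zip(dataset, dists):
        stage = min(int(dist / max_dist * n_groups), n_groups - 1)
        groups[stage].append(d)

    # Steps S505 to S509: for each group Ga(K), rank its data by the F2
    # distance (step S506) and collect the corresponding thumbnail moving
    # images in ranked order (step S507).
    results = []
    for group in groups:
        ranked = sorted(group, key=lambda d: abs(f2(d) - f2(selected)))
        results.append([d.thumbnail_moving_image for d in ranked])
    return results
```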
In the present exemplary embodiment, the data groups Ga(0), Ga(1), and Ga(2) are displayed as the search results in order from left to right. The pieces of data in each group are displayed in ascending order of the distance of the feature amount F2, from top to bottom.
According to the present exemplary embodiment, the user can check the search results for data similar to the selected data by using moving images with which the user can easily tell the material appearance, thus enabling the user to easily narrow down the search results. In the present exemplary embodiment, a display that enables the user to easily recognize the elements used for the similarity determination is provided. Further, the elements used for the determination are determined by the user instruction. Therefore, elements in which the user is strongly interested, or important elements, are automatically reflected in the search results.
In the present exemplary embodiment, the case where the texture data, the normal map data, and the specular map data are stored as the data corresponding to the three elements (color, surface shape, and gloss) constituting the material appearance data has been described. However, the material appearance data is not limited to the above-described example. For example, the data corresponding to the three elements may be the texture data, an onomatopoeia about the hand feeling, and an onomatopoeia about the gloss. In this modification, the onomatopoeia about the hand feeling is composed of a combination of a word representing the surface shape, such as “smooth”, “superfine”, “dry”, “fluffy”, “coarse”, and “rough”, and the presence/absence of a word representing a degree, such as “very”, “somewhat”, and “slightly”. The onomatopoeia about the gloss is composed of a combination of a word representing the light reflection degree, the presence/absence of a word representing a degree, and the presence/absence of a word representing a coverage. Examples of the word representing the light reflection degree include “mat”, “glossy”, and “shiny”. Examples of the word representing the degree include “very”, “somewhat”, and “slightly”. Examples of the word representing the coverage include “totally”, “partially”, “in places”, and “in spots”. Values obtained by one-dimensionally quantifying these onomatopoeias are used as the feature amount of the surface shape and the feature amount of the gloss.
A method of quantifying the onomatopoeia about the surface shape will be described. Numerical values 10, 20, . . . , and 60 are respectively assigned to the words representing the surface shape, “smooth”, “superfine”, “dry”, “fluffy”, “coarse”, and “rough”. Further, numerical values −7, −5, and −3 are respectively assigned to the words representing the degree, “very”, “somewhat”, and “slightly”. In a case where the surface shape element of certain data combines a word representing the surface shape with a word representing the degree, the sum of the two numerical values is used as the feature amount corresponding to the onomatopoeia about the surface shape. As a result, the distance between feature amounts can be derived.
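A minimal sketch of this quantification, using the numerical assignments given above, is shown below; the function name and dictionary layout are assumptions for illustration.

```python
# A minimal sketch of the surface-shape quantification described above.
# The numerical assignments follow the text; the names are assumptions.

SHAPE_WORDS = {"smooth": 10, "superfine": 20, "dry": 30,
               "fluffy": 40, "coarse": 50, "rough": 60}
DEGREE_WORDS = {"very": -7, "somewhat": -5, "slightly": -3}

def shape_feature(shape_word, degree_word=None):
    # Sum the value of the surface-shape word and, when present, the
    # value of the degree word to obtain the feature amount.
    value = SHAPE_WORDS[shape_word]
    if degree_word is not None:
        value += DEGREE_WORDS[degree_word]
    return value

# Example: "somewhat fluffy" -> 40 + (-5) = 35. The distance between two
# onomatopoeias is the absolute difference of their feature amounts:
# abs(shape_feature("fluffy", "somewhat") - shape_feature("rough")) == 25
```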
A method of quantifying the onomatopoeia about the gloss will be described. The material appearance data always includes an onomatopoeia about the gloss combined with the word “totally” representing the coverage, and may further include an onomatopoeia combined with a word representing another coverage. In other words, there are data including only the onomatopoeia “totally” + “very” + “mat”, and data including two onomatopoeias, “totally” + “very” + “mat” and “shiny” + “in places”. Numerical values 10, 30, and 50 are respectively assigned to the words representing the light reflection degree, “mat”, “glossy”, and “shiny”. Numerical values −5, 5, and 10 are respectively assigned to the words representing the degree, “very”, “somewhat”, and “slightly”. Numerical values 1.0, 0.5, 0.3, and 0.1 are respectively assigned to the words representing the coverage, “totally”, “partially”, “in places”, and “in spots”. The numerical value of each onomatopoeia is calculated as (numerical value of light reflection degree) + (numerical value of degree) × (numerical value of coverage). The distance between feature amounts can be derived by using this numerical value as the feature amount of the gloss.
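A corresponding sketch of the gloss quantification, applying the formula above, follows; the function name and the default value of 0 for an absent degree word are assumptions for illustration.

```python
# A minimal sketch of the gloss quantification, applying the assignments
# and the formula (reflection degree) + (degree) x (coverage) above.

REFLECTION_WORDS = {"mat": 10, "glossy": 30, "shiny": 50}
GLOSS_DEGREE_WORDS = {"very": -5, "somewhat": 5, "slightly": 10}
COVERAGE_WORDS = {"totally": 1.0, "partially": 0.5,
                  "in places": 0.3, "in spots": 0.1}

def gloss_feature(reflection_word, degree_word=None, coverage_word="totally"):
    # An absent degree word is treated as contributing 0 (an assumption).
    degree = GLOSS_DEGREE_WORDS.get(degree_word, 0)
    coverage = COVERAGE_WORDS[coverage_word]
    return REFLECTION_WORDS[reflection_word] + degree * coverage

# Example: "totally" + "very" + "mat" -> 10 + (-5) * 1.0 = 5.0.
# Example: "shiny" + "in places" -> 50 + 0 * 0.3 = 50.0.
```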
A second exemplary embodiment of the disclosure will be described below. In the first exemplary embodiment, the search results are presented to the user by using the proximity of the feature amount of each element. In the present exemplary embodiment, the search results are further sorted by using direction determination items of the priority elements.
Presenting the search results as in the present exemplary embodiment makes it possible to set an additional display axis for the plurality of pieces of data constituting each element, and to arrange the search results along the added axis with respect to a reference.
The present exemplary embodiment differs from the first exemplary embodiment in the search processing in step S305. A hardware configuration and a functional configuration of an information processing apparatus according to the present exemplary embodiment are equivalent to those of the information processing apparatus according to the first exemplary embodiment, and descriptions thereof are thus omitted. In the following, differences between the present exemplary embodiment and the first exemplary embodiment are mainly described. Components similar to those in the first exemplary embodiment are denoted by the same reference numerals.
The search processing in step S305 will now be described with reference to a flowchart. In step S801, the search unit 1005 acquires a direction determination item D1 of the first priority element.
In step S802, the search unit 1005 sorts the data groups Ga(0) to Ga(N−1) by using the direction determination item D1 acquired in step S801. In this sorting, a value located at the center of the area of each data group is used.
In step S803, the search unit 1005 acquires a direction determination item D2 of the second priority element.
In step S804, as in the first exemplary embodiment, the search unit 1005 ranks the pieces of data in the data group Ga(K) by using the feature amount F2. The search unit 1005 further sorts the ranked pieces of data based on the direction determination item D2. The pieces of data are sorted in a manner similar to the method in step S802.
However, in this step, the value of each piece of data may be used instead of the value located at the center of the area. In step S805, the search unit 1005 acquires the moving images for checking the material appearance in the order of the pieces of data sorted in step S804, and the display control unit 1004 displays the acquired moving images on the display apparatus 115.
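Because the disclosure leaves the exact sorting rule open, the following sketch shows one plausible reading of step S804: rank by the F2 distance, then split the ranking by the side on which each piece of data falls relative to the selected data along D2. The helper names and this interpretation are assumptions, not taken from the disclosure.

```python
# A sketch of one possible implementation of step S804, assuming the
# direction determination item D2 is a signed scalar read from each piece
# of data (e.g., lighter/darker along a color axis).

def rank_and_sort_group(selected, group, f2, d2):
    # Rank by proximity of the feature amount F2, as in the first
    # exemplary embodiment.
    ranked = sorted(group, key=lambda d: abs(f2(d) - f2(selected)))
    # Python's sort is stable, so the F2 ranking is preserved within each
    # side: data whose D2 value is not below that of the selected data is
    # placed first, followed by data on the other side.
    return sorted(ranked, key=lambda d: d2(d) < d2(selected))
```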
In the present exemplary embodiment, as in the modification of the first exemplary embodiment, the surface shape and the gloss may be held as onomatopoeias. In this case, as the direction determination item, a word representing a degree may be used for the surface shape and the gloss, and a word representing a coverage may be used for the gloss.
In the above-described exemplary embodiments, the normal map data is used as the data representing the surface shape; however, bump map data or height map data may be used. The specular map data is used as the data representing the gloss; however, diffuse map data may be used.
In the above-described exemplary embodiments, the texture data, the normal map data, and the specular map data are used as the material appearance data; however, the material appearance data is not limited to the above-described example. For example, elements such as transparency and diffuse reflection may be added, and four or more types of material appearance data may be used. In this case, two elements can be selected based on the user's operation history, and similarity can be defined and displayed based on the selected elements.
The processing according to the above-described exemplary embodiments may be realized by a stand-alone search apparatus or may be realized by a search system including a server and a client via a network. In a case where the information processing apparatus 1 is the stand-alone search apparatus, the user instruction is received via the input apparatus 110, and the display of the search results and the like are performed by using the display apparatus 115. In contrast, in a case where the search system including the server and the client is used, the information processing apparatus 1 serving as the server receives the user instruction through communication via the NIC 107. Further, the information processing apparatus 1 provides data to be used for display of the search results and the like, to the client apparatus through communication via the NIC 107.
The information processing apparatus 1 may be a mobile device including an acceleration sensor and an image sensor, such as a tablet personal computer (PC). In this case, the light source position may be changed based on the sensed inclination of the device main body, or a surrounding light source (e.g., fluorescent lamp, outdoor lamp, or tungsten lamp) may be determined by using the image sensor, and the display image may be changed based on a result of the determination.
In the above-described modifications, the onomatopoeia about the surface shape and the onomatopoeia about the gloss are each described as a combination of a limited set of words, but are not limited to the above-described examples. The words may be changed to other words, for example, “scratchy”, “rough”, “lumpy”, “crunchy”, and “brightly”, or other words may be added. The words representing the degree and the words representing the coverage are used above, but words representing an impression (e.g., “silky”, “mannish”, “quiet”, and “tender”) may be used instead.
In the above-described exemplary embodiments, the moving images, each having a thumbnail size, with which the user can easily understand the material appearance information are used as the search results; however, any other method with which the user can check the material appearance may be used. For example, data enabling display of the material appearance may be rendered and displayed, or still images different in geometric condition may be continuously displayed. In these cases, it is unnecessary to store the thumbnail moving images in advance, which reduces the necessary data capacity.
In the above-described exemplary embodiments, the example in which the position and the brightness of the light source are changed on the UI screen for checking the gloss has been described.
According to the exemplary embodiments of the disclosure, in a case where data representing the material appearance of an object is searched for, the user can easily check whether the intended search results are obtained.
Embodiment(s) of the disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the disclosure has been described with reference to exemplary embodiments, it is to be understood that the disclosure is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2021-164320, filed Oct. 5, 2021, which is hereby incorporated by reference herein in its entirety.