The present invention relates to an information processing apparatus, an information processing method, and an information processing program.
There are known technologies related to augmented reality (AR), which augments the real world by adding digital information to a real space visible through a camera. For example, there is a proposed technology in which a real object existing in a user's real environment is imaged, and the color gamut of a virtual object displayed so as to be superimposed on the real object by the augmented reality technology is corrected based on color information regarding the imaged real object.
However, with the above known technology, it is not always possible to improve convenience regarding color correction of a virtual object. The known technology merely images a real object existing in the user's real environment and corrects, based on color information regarding the imaged real object, the color gamut of a virtual object displayed so as to be superimposed on the real object by the augmented reality technology; this alone does not always ensure improved convenience regarding color correction of the virtual object.
In view of this, the present disclosure proposes an information processing apparatus, an information processing method, and an information processing program capable of improving convenience regarding color correction of a virtual object.
To solve the above problem, an information processing apparatus according to the present disclosure includes: a proposal unit that proposes, to a user, imaging of another real object that complements color information used for correcting a color gamut of a virtual object displayed so as to be superimposed on a predetermined real object; and a display control unit that corrects the color gamut of the virtual object based on color information acquired by the user's imaging of the another real object.
Embodiments of the present disclosure will be described below in detail with reference to the drawings. In each of the following embodiments, the same parts are denoted by the same reference numerals, and a repetitive description thereof will be omitted.
The present disclosure will be described in the following order.
There is a known technology of displaying a virtual object image to be superimposed on an image obtained by capturing an object existing in a real space (real object) by using an augmented reality technology. For example, there is known a technology of displaying an image of a product (virtual object) a user of electronic commerce desires to purchase to be superimposed on a captured image of an object existing in a real environment (real object) of the user. In addition, there is also a known technology of adjusting a color tone of a virtual object according to a real environment when an image of the virtual object is displayed so as to be superimposed on the captured image of the real object in this manner.
For example, the color tone of the virtual object is adjusted according to the real environment by acquiring camera setting information and using a known calibration color chart (also referred to as a color sample). However, the camera setting information cannot be acquired from many consumer cameras. In addition, in order to correct for the color of the light source, it is necessary to image a special physical color chart.
To handle this, Non Patent Literature 1 described above proposes a technology of imaging a real object existing in a user's real environment without using a special color chart, and correcting the color gamut of the virtual object displayed so as to be superimposed on the real object based on color information regarding the imaged real object. The technology described in Non Patent Literature 1 enables color correction under augmented reality without acquiring the camera setting information or depending on a calibration object such as a color chart. However, in the technology described in Non Patent Literature 1, the accuracy of color correction depends on the color information regarding the imaged real object. This has caused a problem in that the color correction error is likely to increase when the correction attempts to reproduce a color that does not exist in the captured real object, or a completely different color.
In view of this, a terminal device 100 according to an embodiment of the present disclosure proposes, to the user, imaging of another real object that complements the color information used for correcting the color gamut of a virtual object displayed so as to be superimposed on a predetermined real object. Furthermore, the terminal device 100 corrects the color gamut of the virtual object based on color information acquired by the user's imaging of the another real object. In this manner, in a case where the color information necessary for expressing the color of the virtual object is insufficient, the terminal device 100 proposes, to the user, observation of another real object that complements the insufficient color information, and corrects the color gamut of the virtual object based on the color information regarding the another real object. With this correction, the terminal device 100 can more easily correct the color gamut of the virtual object by using an object near the user as the color sample. This makes it possible for the terminal device 100 to improve convenience regarding color correction of the virtual object.
Hereinafter, an overview of information processing according to the embodiment of the present disclosure will be described with reference to
First, the user U1 selects the virtual object (product G1) to be displayed by augmented reality. The terminal device 100 receives selection of the virtual object (product G1) from the user U1 (step S1). In addition, the user U1 images the real object (product G2) used as a marker when superimposing a virtual object by a camera C1 of the terminal device 100.
The terminal device 100 acquires an image of the real object (product G2) captured by the camera C1. Subsequently, based on the acquired image of the real object (product G2), the terminal device 100 recognizes the real object (product G2) and estimates the position and posture of the real object (product G2). After estimating the position and posture of the real object (product G2), the terminal device 100 displays an image G1-1 in which the virtual object (product G1) is superimposed on the real object (product G2) by augmented reality (step S2).
Furthermore, the terminal device 100 acquires a sample of the color under the real environment from the real object (product G2) designated as a marker when superimposing the virtual object. Specifically, the terminal device 100 acquires a sample of the color of the real object (product G2) under the real environment based on the acquired image of the real object G2.
Subsequently, based on the sample of the color of the real object (product G2) under the real environment, the terminal device 100 estimates a parameter for color gamut correction of the camera C1 (hereinafter, also simply referred to as a parameter) and then corrects the color gamut of the virtual object (product G1). At this time, the terminal device 100 determines whether there is sufficient color information necessary for expressing the color of the virtual object (product G1). Here, the terminal device 100 compares the color information regarding the virtual object (product G1) with the color information regarding the real object (product G2), and determines that the color information expressing the color of the virtual object (product G1) is insufficient since the colors of the virtual object (product G1) and the real object (product G2) are significantly different from each other. Note that details of the determination processing regarding the sufficiency of color information will be described below.
When having determined that the color information expressing the color of the virtual object (product G1) is insufficient, the terminal device 100 requests the server device 200 to determine whether there is information regarding another real object (different from the product G2) that can complement the insufficient color information. The server device 200 receives the determination request from the terminal device 100. Upon receiving the determination request, the server device 200 acquires information regarding products purchased in the past by the user U1 based on the product purchase history of the user U1. Subsequently, the server device 200 selects a product G3 as another real object that can complement the insufficient color information from among the products purchased by the user U1 in the past. Specifically, the product G3 is a red smartphone. After having selected the another real object (product G3) that can complement the insufficient color information, the server device 200 transmits information regarding the selected another real object (product G3) to the terminal device 100. The terminal device 100 acquires the information related to the another real object (product G3) from the server device 200. When having acquired this information, the terminal device 100 proposes, to the user U1, additionally imaging the another real object (product G3) (step S3).
In response to the proposal from the terminal device 100, the user U1 images the another real object (product G3) in addition to the real object (product G2) by using the camera C1 of the terminal device 100. The terminal device 100 acquires an image of the another real object (product G3) captured by the camera C1. Subsequently, based on the acquired image of the another real object (product G3), the terminal device 100 recognizes the another real object (product G3) and estimates the position and posture of the another real object (product G3). Furthermore, based on the acquired image of the another real object (product G3), the terminal device 100 acquires a sample of the color of the another real object (product G3) under the real environment.
Subsequently, the terminal device 100 corrects the color gamut of the virtual object (product G1) based on the sample of the color of the another real object (product G3) under the real environment. Details of the processing of correcting the color gamut of the virtual object will be described below.
Subsequently, based on the color gamut of the virtual object (product G1) after the correction, the terminal device 100 corrects the color of the virtual object (product G1). Subsequently, the terminal device 100 displays an image G1-2 in which the virtual object (product G1) after the color correction is superimposed on the real object (product G2) being a marker (step S4).
As described above, the terminal device 100 proposes, to the user, imaging of another real object that complements the color information used for correcting the color gamut of a virtual object displayed so as to be superimposed on a predetermined real object. Furthermore, the terminal device 100 corrects the color gamut of the virtual object based on color information acquired by the user's imaging of the another real object. In this manner, in a case where the color information for expressing the color of the virtual object is insufficient, the terminal device 100 proposes, to the user, imaging of another real object that can complement the insufficient color information, and corrects the color gamut of the virtual object based on the color information regarding the another real object. With this correction, the terminal device 100 can more easily correct the color gamut of the virtual object by using a real object near the user as a color sample. This makes it possible for the terminal device 100 to improve convenience regarding color correction of the virtual object.
Next, a relationship between estimation of parameters for color gamut correction and an observation error will be described with reference to
Before describing the relationship between estimation of parameters for color gamut correction and an observation error, a color value, a color space, and a color gamut will be described. The color value is a numerical value representing a color. Among several indexes representing color values, the embodiment of the present disclosure uses CIE Lab, which is a commonly used index. CIE Lab is also written as L*a*b* in the sense of the Lab defined by the Commission Internationale de l'Eclairage (CIE), that is, the International Commission on Illumination. “L*” in L*a*b* represents lightness (light or dark). In addition, “a*” in L*a*b* represents the chromaticity, or the intensity of color tone, in the green-red components. In addition, “b*” in L*a*b* represents the chromaticity, or the intensity of color tone, in the blue-yellow components. These three values are mutually independent indexes, and thus can be expressed in a three-dimensional orthogonal coordinate system using three coordinate axes orthogonal to each other. This is referred to as an L*a*b* color space (hereinafter also referred to simply as a color space). A color value is indicated by the coordinates of a point in the color space. A color gamut refers to a portion of a color space; in other words, the color gamut indicates a range (region) of color values in the color space.
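As context for the L*a*b* values used throughout this description, the following is a minimal sketch of the standard CIE conversion from XYZ tristimulus values to L*a*b*, assuming a D65 reference white. This is illustrative only; the embodiment does not mandate any particular conversion or white point.

```python
# Sketch: converting CIE XYZ tristimulus values to CIE L*a*b*
# using the standard CIE formulas and a D65 reference white (assumption).

D65_WHITE = (95.047, 100.0, 108.883)  # reference white (Xn, Yn, Zn)

def _f(t: float) -> float:
    # Piecewise function from the CIE L*a*b* definition.
    delta = 6.0 / 29.0
    if t > delta ** 3:
        return t ** (1.0 / 3.0)
    return t / (3.0 * delta ** 2) + 4.0 / 29.0

def xyz_to_lab(x: float, y: float, z: float, white=D65_WHITE):
    xn, yn, zn = white
    fx, fy, fz = _f(x / xn), _f(y / yn), _f(z / zn)
    l_star = 116.0 * fy - 16.0   # lightness
    a_star = 500.0 * (fx - fy)   # green-red chromaticity
    b_star = 200.0 * (fy - fz)   # blue-yellow chromaticity
    return l_star, a_star, b_star
```

For the reference white itself, the conversion yields L* = 100 and a* = b* = 0, reflecting that the three axes are independent as described above.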
Next, estimation of a parameter for color gamut correction will be described. For example, there is a method of estimating a parameter for color gamut correction using the following formula.
In the above formula, v represents the correction amount for an input color c, and φ is a function that controls the correction amount for each observation sample color i. The final correction amount is calculated by performing weighted averaging, with weights w, over these samples. However, as can be seen from this formula, the correction amount depends heavily on the samples, so an error in these observation samples would propagate into the final color correction estimate. In addition, the farther the color to be corrected is from the observation samples, the larger the error will be. This point will be described in detail below with reference to
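The literal Mathematical Expression 1 is not reproduced in this text; the following is a hedged sketch of one plausible concrete form of such a sample-weighted correction, under assumed notation (Gaussian weights and per-sample difference vectors are illustrative choices, not taken from the source):

```python
import math

def correction(c, samples, sigma=10.0):
    """Sketch of a sample-weighted color-gamut correction.

    c       -- input color value as a 3-tuple in L*a*b* space
    samples -- list of (observed_color, true_color) pairs
    sigma   -- bandwidth of the hypothetical Gaussian weight (assumption)

    Each sample contributes a difference vector (true minus observed);
    the final correction is the weighted average of these vectors,
    with weights that fall off with distance from the sample.
    """
    num = [0.0, 0.0, 0.0]
    den = 0.0
    for observed, true in samples:
        dist2 = sum((ci - oi) ** 2 for ci, oi in zip(c, observed))
        w = math.exp(-dist2 / (2.0 * sigma ** 2))  # hypothetical weight function
        phi = [ti - oi for ti, oi in zip(true, observed)]
        num = [n + w * p for n, p in zip(num, phi)]
        den += w
    return [n / den for n in num]
```

Note how this sketch exhibits the two failure modes the text describes: an error in any observed sample flows directly into its difference vector and thus into the estimate, and an input color far from all samples receives only near-zero weights, so its correction is poorly constrained.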
In addition, there is a method of estimating a parameter for color gamut correction using the following formula.
In the above equation, the color correction is expressed by a correction matrix. The correction matrix is found so as to minimize the difference between the sampled colors and their known values. When such a matrix is found, a non-sampled region of the color space is likely to be shifted greatly owing to bias toward the sample colors.
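The literal Mathematical Expression 2 is likewise not reproduced here; the following is a hedged sketch of a matrix-based correction fitted by least squares, which is one common concrete realization of the description above (the 3x3 linear form is an assumption):

```python
import numpy as np

def fit_correction_matrix(observed, known):
    """Sketch: fit a 3x3 linear correction matrix M that minimizes
    sum_i || M @ observed_i - known_i ||^2 (ordinary least squares).

    observed, known -- arrays of shape (n, 3), one color per row.
    """
    observed = np.asarray(observed, dtype=float)
    known = np.asarray(known, dtype=float)
    # Solve observed @ M.T ~= known for M.T in the least-squares sense.
    m_t, *_ = np.linalg.lstsq(observed, known, rcond=None)
    return m_t.T
```

If all samples cluster in one region of the color space, the fit is unconstrained elsewhere, so corrected colors in non-sampled regions can shift greatly, which is the bias the text describes.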
As described above, in a case where the color to be expressed does not match the observed color samples, the color may even fail to be expressed at all. For this reason, in calibration using an existing color chart, the color samples for correction cover as wide a variety of colors as possible so that expression does not fail, and a special chart is therefore required.
Returning to the description of
Next, the relationship between estimation of parameters for color gamut correction and the observation error when no observation error is included will be described with reference to
A filled circle (filled circle mark) illustrated on the left side of
Next, a relationship between estimation of parameters for color gamut correction and an observation error in a case where an observation error is included will be described with reference to
A filled circle (filled circle mark) illustrated on the left side of
Here, as compared with the example illustrated in
Next, a relationship between estimation of parameters for color gamut correction and the observation error when the observation error is included will be described with reference to
A filled circle (filled circle mark) illustrated on the left side of
Here, as compared with the example illustrated in
As illustrated in
Next, a configuration of the information processing system according to the embodiment will be described with reference to
The terminal device 100 is an information processing apparatus used by a user. The terminal device 100 is implemented by devices such as a desktop personal computer (PC), a laptop PC, a smartphone, a tablet terminal, a mobile phone, or a personal digital assistant (PDA). In the example illustrated in
The server device 200 is an information processing apparatus that provides an electronic commerce service. The server device 200 stores information related to a product. Specifically, the server device 200 stores identification information that identifies a product in association with the shape, feature point, and color information regarding the product. The server device 200 provides product information in response to a request from the terminal device 100. Furthermore, the server device 200 stores information related to a product purchase history of the user.
Next, a configuration of the terminal device 100 according to the embodiment will be described with reference to
The communication unit 110 is implemented by a network interface card (NIC), for example. The communication unit 110 is connected to the network via a wired or wireless channel, and transmits and receives information to and from the server device 200, for example.
The imaging unit 120 has a function of capturing various types of information regarding the user or the surrounding environment. In the example illustrated in
The input unit 130 is an input device that receives various operations from the user. For example, the input unit 130 is implemented by a keyboard, a mouse, an operation key, and the like.
The output unit 140 includes a display unit 141 and a sound output unit 142. The display unit 141 is a display device for displaying various types of information. For example, the display unit 141 is implemented by a liquid crystal display or the like. When a touch panel is adopted for the terminal device 100, the input unit 130 and the display unit 141 are integrated.
The sound output unit 142 reproduces a sound signal under the control of the control unit 160.
The storage unit 150 stores various data and programs. Specifically, the storage unit 150 stores programs and parameters for the control unit 160 to execute each function. For example, the storage unit 150 stores information regarding an AR marker used by the recognition unit 162 to recognize the AR marker, and information regarding a virtual object to be displayed in the AR by the display control unit 166. The storage unit 150 is implemented by semiconductor memory elements such as random access memory (RAM) or flash memory, or storage devices such as a hard disk or an optical disk.
The control unit 160 is a controller, and is implemented, for example, by a central processing unit (CPU), a micro processing unit (MPU), or the like executing various programs (corresponding to an example of the information processing program) stored in a storage device inside the terminal device 100, using the RAM as a work area. Alternatively, the control unit 160 may be implemented by an integrated circuit such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA).
As illustrated in
The reception unit 161 receives selection of a virtual object from the user. Specifically, the reception unit 161 receives a virtual object selection operation from the user via the input unit 130. When having received the selection operation of the virtual object from the user, the reception unit 161 requests the server device 200 for information related to the selected virtual object. Subsequently, the reception unit 161 acquires information related to the virtual object from the server device 200. For example, the reception unit 161 acquires information related to the shape of the virtual object and color information regarding the virtual object. For example, the reception unit 161 acquires a sample of the color of the virtual object.
In the example illustrated in
Furthermore, the reception unit 161 receives designation of a real object to be used as a marker when superimposing a virtual object from the user. Specifically, the reception unit 161 acquires an image of a real object (marker) captured by the imaging unit 120. Subsequently, the reception unit 161 requests the server device 200 for information necessary for recognizing the real object (marker). The reception unit 161 then acquires information necessary for recognizing the real object (marker) from the server device 200. For example, the reception unit 161 acquires feature points of a real object (marker) and color information regarding the real object (marker). For example, the reception unit 161 acquires a sample of the color of a real object (marker).
In the example illustrated in
The recognition unit 162 acquires an image of a real object (marker) from the reception unit 161. Subsequently, based on the acquired image of the real object, the recognition unit 162 recognizes the real object (marker) and estimates the position and posture of the real object. For example, the recognition unit 162 compares the feature points of the real object (marker) or the color information regarding the real object (marker) acquired from the server device 200 with the image of the real object (marker), recognizes the real object (marker), and estimates the position and posture of the real object (marker).
In the example illustrated in
Furthermore, the recognition unit 162 acquires, from the real object (marker), a sample of the color of the real object (marker) under the real environment. Specifically, the recognition unit 162 acquires a sample of the color of the real object (marker) under the real environment based on the acquired image of the real object (marker).
In the example illustrated in
The recognition unit 162 acquires an image of another real object captured by the imaging unit 120. Subsequently, the recognition unit 162 recognizes the another real object based on the acquired image of the another real object, and estimates the position and posture of the another real object. Furthermore, the recognition unit 162 acquires a sample of the color of the another real object under the real environment based on the acquired image of the another real object.
In the example illustrated in
The estimation unit 163 estimates a parameter for color gamut correction of a virtual object. The estimation unit 163 estimates a parameter for color gamut correction in the imaging unit 120 based on the sample of the color of the real object under the real environment acquired by the recognition unit 162. Specifically, the estimation unit 163 calculates a difference vector in the color space between the color value of the sample of the color of the real object (marker) acquired by the reception unit 161 and the color value of the sample of the color of the real object (marker) under the real environment acquired by the recognition unit 162. Subsequently, based on the calculated difference vector, the estimation unit 163 estimates a parameter for color gamut correction of the virtual object. For example, the estimation unit 163 estimates the parameter for color gamut correction of the virtual object based on the above Mathematical Expression 1 or 2.
In the example illustrated in
After the estimation unit 163 has estimated the parameter, the determination unit 164 determines whether the color information necessary to express the color of the virtual object is sufficient. Specifically, the determination unit 164 determines whether the color information necessary for expressing the color of the virtual object is sufficient based on a comparison between the color information regarding the real object and the color information regarding the virtual object. For example, when the distance between the color value of the real object and the color value of the virtual object in the color space exceeds a predetermined threshold, the determination unit 164 determines that the color information necessary for expressing the color of the virtual object is not sufficient. Alternatively, in a case where the area in the color space of the region where the color gamut of the real object and the color gamut of the virtual object overlap is less than a predetermined threshold, the determination unit 164 determines that the color information necessary for expressing the color of the virtual object is not sufficient.
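The first criterion described above can be sketched as follows. This is a minimal illustration assuming representative L*a*b* color values and a hypothetical threshold value; the embodiment does not fix a specific threshold, and the overlap-area criterion is omitted here.

```python
import math

def color_info_sufficient(real_lab, virtual_lab, threshold=30.0):
    """Sketch of the determination unit's distance criterion: the color
    information is deemed insufficient when the real object's color is
    too far from the virtual object's color in the color space.

    real_lab, virtual_lab -- representative color values (L*, a*, b*)
    threshold -- hypothetical distance threshold (assumption)
    """
    distance = math.dist(real_lab, virtual_lab)  # Euclidean distance in L*a*b*
    return distance <= threshold
```

For example, a marker whose color nearly matches the virtual object passes the check, while a marker of a strongly opposing hue fails it, triggering the proposal to image another real object.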
Having determined that the color information necessary for expressing the color of the virtual object is not sufficient, the determination unit 164 determines that it is necessary to perform imaging of another real object (also referred to as an additional object) that complements the color information expressing the color of the virtual object. Having determined that it is necessary to perform imaging of another real object that complements the color information expressing the color of the virtual object, the determination unit 164 requests the server device 200 to determine the presence or absence of information regarding the another real object that complements the color information expressing the color of the virtual object.
In the example illustrated in
Here, the determination processing regarding the necessity of complementation of color information expressing the color of a virtual object will be described with reference to
Next, a description will be given with reference to
The proposal unit 165 proposes, to the user, imaging of another real object that complements color information used for correcting a color gamut of a virtual object displayed to be superimposed on a predetermined real object. Specifically, the proposal unit 165 acquires, from the server device 200, information regarding the another real object that complements color information expressing the color of the virtual object. Having acquired information regarding the another real object that complements the color information expressing the color of the virtual object, the proposal unit 165 displays an illustration (“?” mark, etc.) or a message prompting additional imaging of another real object on the display unit 141 together with an image of the another real object. For example, the proposal unit 165 proposes, to the user, imaging of another real object selected based on the product purchase history of the user. The proposal unit 165 proposes, to the user, imaging of the another real object selected from the products purchased by the user.
Furthermore, the proposal unit 165 proposes imaging of another real object to the user based on a comparison between color information regarding a predetermined real object and color information regarding a virtual object. Specifically, in a case where the distance between the color value of the predetermined real object and the color value of the virtual object in the color space exceeds a predetermined threshold, the proposal unit 165 proposes imaging of another real object to the user. Specifically, when the determination unit 164 has determined that the distance between the color value of the predetermined real object and the color value of the virtual object in the color space exceeds a predetermined threshold, the proposal unit 165 proposes, to the user, imaging of the another real object acquired from the server device 200.
Alternatively, in a case where the area in the color space of the region where the color gamut of the predetermined real object and the color gamut of the virtual object overlap is less than a predetermined threshold, the proposal unit 165 proposes, to the user, imaging of the another real object. Specifically, when the determination unit 164 has determined that the area in the color space of the region in which the color gamut of the predetermined real object and the color gamut of the virtual object overlap is less than the predetermined threshold, the proposal unit 165 proposes, to the user, imaging of another real object acquired from the server device 200.
In the example illustrated in
The display control unit 166 corrects the color gamut of the virtual object based on color information acquired by the user's imaging of the another real object. The display control unit 166 corrects the color gamut of the virtual object as described above with reference to
Next, the server device 200 according to the embodiment will be described with reference to
The communication unit 210 is implemented by a NIC, for example. The communication unit 210 is connected to the network via a wired or wireless channel, and transmits and receives information to and from the terminal device 100, for example.
The storage unit 220 stores various data and programs. The storage unit 220 is implemented by semiconductor memory elements such as flash memory, or storage devices such as a hard disk or an optical disk. As illustrated in
The product information storage unit 221 stores various types of information regarding products.
The “product ID” indicates identification information that identifies a product. The “color information” indicates information related to the color of the product.
The user information storage unit 222 stores various types of information regarding the user.
The “user ID” indicates identification information that identifies a user. The “purchase history” indicates information related to the product purchase history of the user.
Returning to the description of
As illustrated in
The receiving unit 231 receives a request for information related to a virtual object from the terminal device 100. Furthermore, the receiving unit 231 receives a request for information necessary for recognizing a real object (marker) from the terminal device 100.
When the receiving unit 231 has received the request for the information regarding a virtual object, the transmitting unit 232 transmits the information regarding the virtual object to the terminal device 100. Specifically, the transmitting unit 232 refers to the product information storage unit 221 to acquire information related to the shape of the virtual object and color information regarding the virtual object.
Subsequently, the transmitting unit 232 transmits information related to the shape of the virtual object and color information regarding the virtual object.
When the receiving unit 231 has received a request for information necessary for recognizing a real object (marker), the transmitting unit 232 transmits information necessary for recognizing the real object (marker) to the terminal device 100. Specifically, the transmitting unit 232 refers to the product information storage unit 221 to acquire the feature point of the real object (marker) and the color information regarding the real object (marker). Subsequently, the transmitting unit 232 transmits the feature point of the real object (marker) and the color information regarding the real object (marker).
The determination unit 233 receives, from the terminal device 100, a determination request regarding the presence or absence of information related to another real object (hereinafter, also referred to as an additional object) capable of complementing color information expressing the color of the virtual object. In response to a request from the terminal device 100, the determination unit 233 determines the presence or absence of information related to the additional object. Specifically, having received the request for determining the presence or absence of the information regarding the additional object, the determination unit 233 refers to the user information storage unit 222 to determine whether the user of the terminal device 100 as the transmission source has a purchase history of the product. When determining that the user has a purchase history of the product, the determination unit 233 specifies the product purchased by the user. Having specified the product purchased by the user, the determination unit 233 refers to the product information storage unit 221 to acquire color information regarding the product purchased by the user. Having acquired the color information regarding the product purchased by the user, the determination unit 233 selects a product that can complement the color information expressing the color of the virtual object from among the products purchased by the user. Subsequently, the determination unit 233 transmits information related to the selected product to the terminal device 100.
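The server-side selection described above can be sketched as follows. This is an illustration under assumptions: product colors are represented by single L*a*b* values, "can complement" is modeled as nearest-color-within-a-threshold, and the threshold value is hypothetical.

```python
import math

def select_complementary_product(virtual_lab, purchased, threshold=30.0):
    """Sketch of the determination unit 233's selection: among the user's
    purchased products, pick the one whose registered color is closest to
    the virtual object's color, if it is close enough to complement it.

    virtual_lab -- color value (L*, a*, b*) of the virtual object
    purchased   -- list of (product_id, lab_color) from the purchase history
    threshold   -- hypothetical acceptance threshold (assumption)
    Returns the product_id, or None if no purchased product qualifies.
    """
    best_id, best_dist = None, float("inf")
    for product_id, lab in purchased:
        d = math.dist(virtual_lab, lab)
        if d < best_dist:
            best_id, best_dist = product_id, d
    return best_id if best_dist <= threshold else None
```

Returning None corresponds to the "no purchase history" or "no suitable product" branch, in which the server instead requests the current camera image from the terminal device 100.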
In contrast, when having determined that the user has no purchase history of the product, the determination unit 233 requests the terminal device 100 to transmit a current image captured by the camera of the terminal device 100.
In the example illustrated in
The recognition unit 234 acquires the current image captured by the camera of the terminal device 100 from the terminal device 100. When having acquired the image, the recognition unit 234 determines whether the acquired image includes an image of a real object capable of complementing color information expressing the color of the virtual object. When having determined that the acquired image includes an image of a real object capable of complementing color information expressing the color of the virtual object, the recognition unit 234 transmits, to the terminal device 100, information related to the real object capable of complementing the color information expressing the color of the virtual object. In contrast, when having determined that the acquired image does not include an image of a real object capable of complementing color information expressing the color of the virtual object, the recognition unit 234 refers to the product information storage unit 221 to select a real object capable of complementing the color information expressing the color of the virtual object, and transmits information related to the selected real object to the terminal device 100.
Next, a procedure of information processing performed by the information processing system according to the embodiment will be described with reference to
Subsequently, the information processing system 1 acquires a sample of the color of the object (step S12). Subsequently, the information processing system 1 estimates a parameter for color gamut correction based on the acquired sample of the color of the object (step S13).
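The parameter estimation of step S13 can be illustrated with a minimal sketch. Assuming, purely for illustration, a linear color-correction model, the parameter is a 3x3 matrix estimated by least squares from pairs of observed and reference color samples; the function name and the linear model are assumptions, not part of the embodiment.

```python
import numpy as np

def estimate_correction_matrix(observed, reference):
    """Estimate a 3x3 linear correction matrix M such that
    observed @ M approximates reference, via least squares.
    observed, reference: (N, 3) arrays of RGB samples in [0, 1]."""
    observed = np.asarray(observed, dtype=float)
    reference = np.asarray(reference, dtype=float)
    M, *_ = np.linalg.lstsq(observed, reference, rcond=None)
    return M

# Example: the camera observes the marker's colors darker than their
# known reference values, so M compensates for the darkening.
observed = np.array([[0.4, 0.1, 0.1], [0.1, 0.4, 0.1], [0.1, 0.1, 0.4]])
reference = np.array([[0.8, 0.2, 0.2], [0.2, 0.8, 0.2], [0.2, 0.2, 0.8]])
M = estimate_correction_matrix(observed, reference)
corrected = observed @ M
```

With three linearly independent samples the system is exactly determined; with more samples the least-squares fit averages out observation noise.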
Subsequently, the information processing system 1 compares a color palette of the superimposition object (virtual object) with the acquired sample of the color of the object (step S14).
Subsequently, the information processing system 1 determines whether the acquired sample of the color is sufficient to express the color of the superimposition object (virtual object) (step S15). When having determined that the acquired sample of the color is sufficient to express the color of the superimposition object (virtual object) (Yes in step S15), the information processing system 1 does not propose observation (imaging) of an additional object to the user (step S16).
In contrast, when having determined that the acquired sample of the color is not sufficient to express the color of the superimposition object (virtual object) (No in step S15), the information processing system 1 estimates an object having a sample of an insufficient color from the database (for example, the product information storage unit 221) and proposes observation (imaging) of an additional object to the user (step S17).
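The sufficiency check of steps S14 and S15 can be sketched as a coverage test: a palette color of the superimposition object is considered covered when some acquired sample lies within a threshold distance of it in color space. The threshold value and the Euclidean metric are assumptions for illustration.

```python
import math

def insufficient_colors(palette, samples, threshold=0.2):
    """Return palette colors (RGB tuples in [0, 1]) that have no acquired
    sample within `threshold` Euclidean distance in color space.
    An empty result corresponds to Yes in step S15."""
    return [c for c in palette
            if all(math.dist(c, s) >= threshold for s in samples)]

palette = [(1.0, 0.0, 0.0), (0.0, 0.0, 1.0)]   # virtual object uses red and blue
samples = [(0.95, 0.05, 0.05)]                 # only a red-ish sample observed
missing = insufficient_colors(palette, samples)
# blue has no nearby sample, so observation (imaging) of an additional
# object with a blue region would be proposed (step S17)
```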
Next, a procedure of information processing by each device according to the embodiment will be described with reference to
The server device 200 receives a request for information from the terminal device 100. Having received the request for information, the server device 200 transmits information regarding the shape of the virtual object and color information regarding the virtual object to the terminal device 100 as the information regarding the virtual object.
Furthermore, the server device 200 transmits the feature points of the marker and the color information regarding the marker to the terminal device 100 as information necessary for recognizing the marker (step S102).
The terminal device 100 receives the information related to the virtual object and the information related to the marker from the server device 200. Having received the information regarding the marker, the terminal device 100 recognizes the marker (step S103). Subsequently, after recognizing the marker, the terminal device 100 estimates a parameter for color gamut correction of the virtual object (step S104). Subsequently, after estimating the parameter, the terminal device 100 determines the necessity of imaging of the additional object (step S105).
After determining that imaging of the additional object is necessary, the terminal device 100 requests the server device 200 to determine the presence or absence of information related to the additional object (step S106). The server device 200 determines the presence or absence of information related to the additional object. When determining that the information related to the additional object exists, the server device 200 transmits the information related to the additional object to the terminal device 100 (step S108).
The terminal device 100 receives the information related to the additional object from the server device 200. Having received the information regarding the additional object, the terminal device 100 captures an image of the additional object (step S109). Subsequently, after imaging the additional object, the terminal device 100 corrects the color gamut of the virtual object based on the sample of the color acquired from the imaged additional object (step S110). Subsequently, after correcting the color gamut of the virtual object, the terminal device 100 superimposes the virtual object after the color gamut correction on the marker so as to be displayed by using AR (step S111).
Next, a modification of the embodiment will be described with reference to
The proposal unit 165 may propose, to the user, imaging of another real object selected from a packaging container of the product purchased by the user. For example, the proposal unit 165 proposes, to the user, imaging of another real object selected based on color information regarding a box of a product purchased by the user.
Furthermore, the proposal unit 165 proposes, to the user, imaging of another real object selected based on the image captured by the terminal device of the user. The proposal unit 165 proposes, to the user, imaging of another real object selected from the objects included in the image captured by the terminal device of the user.
In addition, the proposal unit 165 proposes, to the user, imaging of another real object whose distance in the color space between the color value of the another real object and the color value of the virtual object is less than a predetermined threshold with higher priority over imaging of the another real object whose distance is the predetermined threshold or more.
In addition, the proposal unit 165 proposes, to the user, imaging of another real object in which the color information regarding the another real object does not exist in the color information regarding the predetermined real object with higher priority over imaging of the another real object in which the color information regarding the another real object exists in the color information regarding the predetermined real object.
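The two priority rules above can be combined into a simple ranking, sketched here under the assumption that each candidate object carries a single representative RGB value; the function name, the data shapes, and the threshold are illustrative only.

```python
import math

def rank_candidates(candidates, virtual_color, marker_colors, threshold=0.3):
    """Order candidate additional objects so that colors absent from the
    marker's color information come first, and among those, colors whose
    color-space distance to the virtual object's color is below the
    threshold come first. Each candidate is (name, (r, g, b))."""
    def key(item):
        _, color = item
        novel = color not in marker_colors               # color not already covered
        close = math.dist(color, virtual_color) < threshold  # near virtual color
        return (not novel, not close)                    # False sorts before True
    return sorted(candidates, key=key)

virtual_color = (0.1, 0.2, 0.85)                # bluish virtual object
marker_colors = {(0.9, 0.1, 0.1)}               # marker already supplies red
candidates = [("red mug", (0.9, 0.1, 0.1)),
              ("blue box", (0.1, 0.2, 0.9))]
ranked = rank_candidates(candidates, virtual_color, marker_colors)
```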
Although the above-described embodiment is an example in which the terminal device 100 and the server device 200 are separate devices, the terminal device 100 and the server device 200 may be an integrated device.
As described above, the information processing apparatus (terminal device 100 in the embodiment) according to the present disclosure includes the proposal unit (proposal unit 165 in the embodiment) and the display control unit (display control unit 166 in the embodiment). The proposal unit 165 proposes, to the user, imaging of another real object that complements color information used for correcting a color gamut of a virtual object displayed to be superimposed on a predetermined real object. The display control unit 166 corrects the color gamut of the virtual object based on color information acquired by imaging of another real object presented by the user.
With this configuration, in a case where the color information necessary for expressing the color of the virtual object is insufficient, the information processing apparatus proposes, to the user, observation of another real object that complements the insufficient color information, and corrects the color gamut of the virtual object based on the color information of the another real object. With this correction, the information processing apparatus can more easily correct the color gamut of the virtual object by using an object near the user as the color sample. This makes it possible for the information processing apparatus to improve convenience regarding color correction of the virtual object.
Furthermore, the proposal unit 165 proposes, to the user, imaging of another real object selected from known objects of which color information is known. Furthermore, the proposal unit 165 proposes, to the user, imaging of another real object selected based on a product purchase history of the user. Furthermore, the proposal unit 165 proposes, to the user, imaging of another real object selected from products purchased by the user.
With this configuration, the information processing apparatus can more easily correct the color gamut of the virtual object by using a product near the user as the color sample.
In addition, the proposal unit 165 proposes, to the user, imaging of another real object selected from a packaging container of the product purchased by the user.
With this configuration, the information processing apparatus can more easily correct the color gamut of the virtual object by using a box of the product near the user as the color sample.
Furthermore, the proposal unit 165 proposes, to the user, imaging of another real object selected based on the image captured by the terminal device of the user. In addition, the proposal unit 165 proposes, to the user, imaging of another real object selected from the objects included in the image captured by the terminal device of the user.
With this configuration, in a case where there is no product purchased by the user, the information processing apparatus proposes, to the user, the use of an object included in an image captured by the terminal device of the user as a color sample, thereby enabling color gamut correction of the virtual object more easily.
Furthermore, the proposal unit 165 proposes imaging of another real object to the user based on a comparison between color information regarding a predetermined real object and color information regarding a virtual object. In addition, in a case where the distance between the color value of the predetermined real object and the color value of the virtual object in the color space exceeds a predetermined threshold, the proposal unit 165 proposes imaging of another real object to the user. Moreover, in a case where the area in the color space of the region where the color gamut of the predetermined real object and the color gamut of the virtual object overlap is less than a predetermined threshold, the proposal unit 165 proposes imaging of another real object to the user.
In general, when there is a long distance in the color space between each acquired color sample of the real object and the pre-correction color of the virtual object, the color of the virtual object after correction is likely to contain a large error due to the influence of the observation error. Therefore, in a case where the error in the color of the virtual object after the correction is likely to increase, the information processing apparatus proposes imaging of another real object to the user, thereby enabling better color correction.
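The triggering condition described in configuration (9) reduces to a single distance test; the following sketch assumes Euclidean distance over RGB values and an illustrative threshold, neither of which is fixed by the embodiment.

```python
import math

def should_propose_additional_object(marker_color, virtual_color, threshold=0.3):
    """Propose imaging of another real object when the color-space distance
    between the marker's color value and the virtual object's color value
    exceeds the threshold, since a distant sample amplifies the influence
    of observation error on the corrected color."""
    return math.dist(marker_color, virtual_color) > threshold

# A red marker is a poor color sample for a bluish virtual object,
# so imaging of an additional object is proposed.
propose = should_propose_additional_object((0.9, 0.1, 0.1), (0.1, 0.2, 0.9))
```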
In addition, the proposal unit 165 proposes, to the user, imaging of another real object whose distance in the color space between the color value of the another real object and the color value of the virtual object is less than a predetermined threshold with higher priority over the another real object whose distance is the predetermined threshold or more.
With this configuration, the information processing apparatus preferentially proposes an object having a higher effect of color correction improvement by complementing the color information, enabling better color correction.
In addition, the proposal unit 165 proposes, to the user, imaging of another real object in which the color information regarding the another real object does not exist in the color information regarding the predetermined real object with higher priority over imaging of the another real object in which the color information regarding the another real object exists in the color information regarding the predetermined real object.
With this configuration, the information processing apparatus preferentially proposes an object having a higher effect of color correction improvement by complementing the color information, enabling better color correction.
The information devices such as the terminal device 100 and the server device 200 according to the above-described embodiment and modification are implemented by a computer 1000 having a configuration as illustrated in
The CPU 1100 operates based on a program stored in the ROM 1300 or the HDD 1400 so as to control each of components. For example, the CPU 1100 develops the program stored in the ROM 1300 or the HDD 1400 into the RAM 1200 and executes processing corresponding to various programs.
The ROM 1300 stores a boot program such as a basic input output system (BIOS) executed by the CPU 1100 when the computer 1000 starts up, a program dependent on hardware of the computer 1000, or the like.
The HDD 1400 is a non-transitory computer-readable recording medium that records a program executed by the CPU 1100, data used by the program, and the like. Specifically, the HDD 1400 is a recording medium that records an information processing program according to the present disclosure, which is an example of program data 1450.
The communication interface 1500 is an interface for connecting the computer 1000 to an external network 1550 (for example, the Internet). For example, the CPU 1100 receives data from other devices or transmits data generated by the CPU 1100 to other devices via the communication interface 1500.
The input/output interface 1600 is an interface for connecting an input/output device 1650 to the computer 1000. For example, the CPU 1100 receives data from an input device such as a keyboard or a mouse via the input/output interface 1600. In addition, the CPU 1100 transmits data to an output device such as a display, a speaker, or a printer via the input/output interface 1600. Furthermore, the input/output interface 1600 may function as a media interface for reading a program or the like recorded on a predetermined recording medium (medium). Examples of the medium include an optical recording medium such as a digital versatile disc (DVD) or a phase change rewritable disk (PD), a magneto-optical recording medium such as a magneto-optical disk (MO), a tape medium, a magnetic recording medium, and a semiconductor memory.
For example, when the computer 1000 functions as the terminal device 100 according to the embodiment, the CPU 1100 of the computer 1000 executes the information processing program loaded on the RAM 1200 so as to implement the functions of the control unit 160 or the like. Furthermore, the HDD 1400 stores the information processing program according to the present disclosure or data in the storage unit 150. In this example, the CPU 1100 executes the program data 1450 read from the HDD 1400; as another example, however, the CPU 1100 may acquire these programs from another device via the external network 1550.
Note that the present technology can also have the following configurations.
(1)
An information processing apparatus comprising:
a proposal unit that proposes, to a user, imaging of another real object that complements color information used for correcting a color gamut of a virtual object displayed to be superimposed on a predetermined real object; and
a display control unit that corrects the color gamut of the virtual object based on color information acquired by imaging of the another real object presented by the user.
(2)
The information processing apparatus according to (1),
wherein the proposal unit
proposes, to the user, imaging of the another real object selected from known objects of which color information is known.
(3)
The information processing apparatus according to (1) or (2),
wherein the proposal unit
proposes, to the user, imaging of the another real object selected based on a product purchase history of the user.
(4)
The information processing apparatus according to (3),
wherein the proposal unit
proposes, to the user, imaging of the another real object selected from products purchased by the user.
(5)
The information processing apparatus according to (3) or (4),
wherein the proposal unit
proposes, to the user, imaging of the another real object selected from a packaging container of a product purchased by the user.
(6)
The information processing apparatus according to (1) or (2), wherein the proposal unit proposes, to the user, imaging of the another real object selected based on an image captured by a terminal device of the user.
(7)
The information processing apparatus according to (6), wherein the proposal unit proposes, to the user, imaging of the another real object selected from objects included in the image captured by the terminal device of the user.
(8)
The information processing apparatus according to any of (1) to (7),
wherein the proposal unit
proposes, to the user, imaging of the another real object based on a comparison between color information regarding the predetermined real object and color information regarding the virtual object.
(9)
The information processing apparatus according to any of (1) to (8), wherein, in a case where a distance in a color space between a color value of the predetermined real object and a color value of the virtual object exceeds a predetermined threshold, the proposal unit proposes, to the user, imaging of the another real object.
(10)
The information processing apparatus according to any of (1) to (9),
wherein, in a case where an area, in a color space, of a region in which a color gamut of the predetermined real object and the color gamut of the virtual object overlap with each other is less than a predetermined threshold, the proposal unit
proposes, to the user, imaging of the another real object.
(11)
The information processing apparatus according to any of (1) to (10),
wherein the proposal unit
proposes, to the user, imaging of the another real object in which a distance in a color space between a color value of the another real object and a color value of the virtual object is smaller than a predetermined threshold with higher priority over the another real object in which the distance between the color value of the another real object and the color value of the virtual object is the predetermined threshold or more.
(12)
The information processing apparatus according to any of (1) to (11),
wherein the proposal unit
proposes, to the user, imaging of the another real object in which color information regarding the another real object does not exist in color information regarding the predetermined real object with higher priority over the another real object in which the color information regarding the another real object exists in the color information regarding the predetermined real object.
(13)
An information processing method in which a computer executes processing of:
proposing, to a user, imaging of another real object that complements color information used for correcting a color gamut of a virtual object displayed to be superimposed on a predetermined real object; and
correcting the color gamut of the virtual object based on color information acquired by imaging of the another real object presented by the user.
(14)
An information processing program for causing a computer to execute:
a proposal procedure of proposing, to a user, imaging of another real object that complements color information used for correcting a color gamut of a virtual object displayed to be superimposed on a predetermined real object; and
a display control procedure of correcting the color gamut of the virtual object based on color information acquired by imaging of the another real object presented by the user.
Number | Date | Country | Kind
---|---|---|---
2019-235264 | Dec 2019 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2020/046342 | 12/11/2020 | WO |