The described embodiments relate generally to visualization systems. More particularly, the present embodiments relate to design material visualization systems.
Manufacturers and suppliers of design materials (such as fabrics, floorings, paints, wood, paneling, stone, brick, carpet, laminates, countertops, cabinets, wallpaper, molding, tiles, housewares, and so on) often present images of their materials for advertising purposes. For example, the design materials may be advertised to potential clients such as designers (e.g., interior designers, exterior designers, and so on) as well as end users (e.g., homeowners, businesses, and so on). Images of the materials may be displayed in print, such as industry magazines, showroom booklets, conference materials, and so on. Images of the materials may also be displayed on a website, such as a manufacturer website, a supplier website, and so on. In some instances, videos including the design materials may be presented. For example, a manufacturer or supplier of a design material may present a promotional video of their design materials to potential clients in a showroom.
Potential clients may use images of design materials as part of a design process related to the design materials, as part of making purchasing and/or other acquisition decisions related to the design materials, and so on. The images may be the most complete information that the potential clients have for making design process decisions related to the design materials, making purchasing and/or other acquisition decisions related to the design materials, and so on. Without sufficient information, it may be challenging for potential clients to make design process decisions related to the design materials, make purchasing and/or other acquisition decisions related to the design materials, and so on.
As an alternative, potential clients may visit locations (such as showrooms, conferences, and so on) where physical samples of design materials are available. For example, potential clients may obtain physical samples of different design materials and place them next to each other to evaluate potential designs that may include the different design materials. Similar to the images discussed above, potential clients may use such physical samples of design materials as part of a design process related to the design materials, as part of making purchasing and/or other acquisition decisions related to the design materials, and so on.
The present disclosure relates to a visualization system that uses digital materials digitized using “physically based rendering” (or “PBR”) by scanning physical samples of design materials. The digital materials have texture stacks that include texture maps representing characteristics of the physical samples and/or metadata specifying such characteristics. The visualization system, in response to user input, dynamically generates a model of a demonstration area from a viewpoint, applies one or more digital materials to one or more model surfaces of the model that correspond to one or more demonstration surfaces of the demonstration area, adjusts the model for a demonstration area condition, splits the model into split images for projector zones of the demonstration area, and provides the split images to projectors that correspond to the projector zones. The result is a projection into the demonstration area that is true to life of the physical samples of the design materials.
In various embodiments, a system includes a scanner computing device that creates digital materials by scanning physical samples of design materials, the digital materials including images of the physical samples and texture stacks that include texture maps representing characteristics of the physical samples; projectors configured to project onto projector zones of a demonstration area; and a rendering computing device. The rendering computing device receives user input indicating a selection of a digital material of the digital materials for a demonstration surface of the demonstration area and, in response to the user input, dynamically generates a model of the demonstration area from a viewpoint, applies the digital material to a model surface of the model that corresponds to the demonstration surface, adjusts the model for a demonstration area condition, splits the model into split images for the projector zones, and provides the split images to the projectors that correspond to the projector zones.
In some examples, the scanner computing device recognizes that a physical sample of the physical samples includes a repeating pattern and replicates the repeating pattern instead of scanning all of the physical sample. In various examples, the demonstration area condition includes a lighting condition. In some implementations of such examples, the rendering computing device adjusts for the lighting condition by compensating for the demonstration area having a different lighting than the model.
In a number of examples, the rendering computing device performs ray tracing to add lighting to the model. In some examples, the texture maps include an alpha texture map, a displacement texture map, a roughness texture map, a metallic texture map, a normal texture map, and a base color texture map. In various examples, the scanner computing device is operable to perform a metallic/roughness workflow and a specular/glossy workflow as part of scanning the physical samples.
In some embodiments, a system includes projectors configured to project onto projector zones of a demonstration area and a rendering computing device. The rendering computing device receives user input indicating a selection of a digital material of digital materials for a demonstration surface of the demonstration area, the digital materials including images of physical samples and texture stacks that include texture maps representing characteristics of the physical samples and, in response to the user input, dynamically generates a model of the demonstration area from a viewpoint; applies the digital material to a model surface of the model that corresponds to the demonstration surface; splits the model into split images for the projector zones; and provides the split images to the projectors that correspond to the projector zones.
In various examples, the rendering computing device uses a UV map included in the digital material to maintain scale of the digital material when applying the digital material to the model surface. In some examples, the rendering computing device uses a gaming engine to generate the model of the demonstration area and apply the digital material to the model surface.
In a number of examples, the projector zones overlap. In various implementations of such examples, the rendering computing device reduces intensity of pixels in overlapping areas where the projector zones overlap.
In some examples, the rendering computing device provides projector alignment grids to the projectors for mapping pixel density to the demonstration area. In various implementations of such examples, the projector alignment grids include a blend zone.
In a number of embodiments, a system includes a demonstration area, a non-transitory storage medium that stores instructions, and a processor. The processor executes the instructions to receive first user input indicating a first selection of a first digital material from multiple digital materials for a first demonstration surface of the demonstration area, the digital materials including images of physical samples and texture stacks that include texture maps representing characteristics of the physical samples; receive second user input indicating a second selection of a second digital material from the multiple digital materials for a second demonstration surface of the demonstration area; assign projectors to projector zones that each correspond to a portion of the demonstration area; and, in response to the second user input, dynamically apply the first digital material to a first model surface of a model of the demonstration area, the first model surface corresponding to the first demonstration surface; apply the second digital material to a second model surface of the model of the demonstration area, the second model surface corresponding to the second demonstration surface; split the model into split images for the projector zones; and provide the split images to the projectors that correspond to the projector zones.
In some examples, the system further includes a surface treatment applied to the first demonstration surface that prevents or reduces light bounce from one or more of the projectors. In a number of implementations of such examples, the surface treatment includes crushed carbon nanotubes applied in a polymer.
In various examples, the processor is operable to receive third user input indicating to order a product associated with a respective physical sample associated with the first digital material. In some implementations of such examples, the first digital material includes metadata that specifies a stock keeping unit for the product.
In a number of examples, the physical samples include at least one of fabrics, floorings, paints, wood, paneling, stone, brick, carpet, laminates, countertops, cabinets, wallpaper, molding, tiles, and housewares.
The disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements. The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
Reference will now be made in detail to representative embodiments illustrated in the accompanying drawings. It should be understood that the following descriptions are not intended to limit the embodiments to one preferred embodiment. To the contrary, it is intended to cover alternatives, modifications, and equivalents as can be included within the spirit and scope of the described embodiments as defined by the appended claims.
The description that follows includes sample systems, methods, apparatuses, and computer program products that embody various elements of the present disclosure. However, it should be understood that the described disclosure may be practiced in a variety of forms in addition to those described herein.
Using images or physical samples of design materials as part of a design process related to the design materials, as part of making purchasing and/or other acquisition decisions related to the design materials, and so on may present challenges, as images and physical samples may not convey how the actual design materials will appear, behave, and/or integrate with other actual design materials in a design. Without such information, final designs may drift away from the concept that was originally planned.
For example, people picking countertops, cabinets, flooring, and other design materials for a new house may visit a design center where images of design materials and/or physical samples of design materials may be available. The people may place the images and/or physical samples next to each other to see how the design materials selected for one aspect of the new house integrate with design materials selected for other aspects of the new house, particularly those that will be located in the same design area or design space. However, this may require the people to use their imaginations to visualize the images and/or physical samples in their final form (such as a sample rectangle of granite countertop versus an actual countertop made of that granite), and does not take into account any impact that lighting, time of day, and/or other conditions of the design area into which the design materials will be placed might have on how design materials will appear, behave, and/or integrate with other design materials.
By way of an alternative, an image of a design area may be generated and image editing software may be used to apply images of design materials to part and/or all of one or more of the surfaces of one of the walls, floor, and/or other objects in the design area (such as one or more counters, cabinets, tables and/or other furniture items, and so on). However, this may also not take into account any impact that lighting, time of day, and/or other conditions of the design area into which the design materials will be placed might have on how design materials will appear, behave, and/or integrate with other design materials. Further, the images of the design materials may not be “true to life” regarding the color, substance, scale, and/or other features of the respective design material. Even if the image of the design material is of a particular physical example of the design material, the image of the design material may not sufficiently correspond to a version of the design material that can be purchased and/or otherwise obtained.
In another alternative, a physical model of a design area may be constructed and one or more projectors may be used to project one or more images of design materials onto part and/or all of one or more of the surfaces of one of the walls, floor, and/or other objects in the design area (such as one or more counters, cabinets, tables and/or other furniture items, and so on). However, like the example above, this may also not take into account any impact that lighting, time of day, and/or other conditions of the design area into which the design materials will be placed might have on how design materials will appear, behave, and/or integrate with other design materials. Further, like the above, the images of the design materials may not be “true to life” regarding the color, substance, scale, and/or other features of the respective design material. Additionally, like the above, even if the image of the design material is of a particular physical example of the design material, the image of the design material may not sufficiently correspond to a version of the design material that can be purchased and/or otherwise obtained.
The following disclosure relates to a visualization system that uses digital materials digitized using “physically based rendering” (or “PBR”) by scanning physical samples of design materials. The digital materials have texture stacks that include texture maps representing characteristics of the physical samples and/or metadata specifying such characteristics. The visualization system, in response to user input, dynamically generates a model of a demonstration area from a viewpoint, applies one or more digital materials to one or more model surfaces of the model that correspond to one or more demonstration surfaces of the demonstration area, adjusts the model for a demonstration area condition, splits the model into split images for projector zones of the demonstration area, and provides the split images to projectors that correspond to the projector zones. The result is a projection into the demonstration area that is true to life of the physical samples of the design materials.
In this way, the visualization system provides improved user interfaces and/or projection output over previous visualization systems that could not project true to life images and used more software and/or hardware resources attempting to do so. As a result, the visualization system is able to perform functions that existing systems were not previously able to perform, lacking the technology disclosed herein. This may enable the visualization system to operate more efficiently than previous visualization systems while consuming fewer hardware and/or software resources as more resource consuming techniques may be omitted.
These and other embodiments are discussed below with reference to
Digital materials may be materials that may be accessed and/or visualized by computing devices. Digital materials may include materials that were not initially created as digital, but were converted to digital form through the process of digitization.
Digital materials may be digital twins of the physical materials from which they were generated. A digital twin may be a digital representation of a real-world entity or system. The implementation of a digital twin may be an encapsulated software object or model that mirrors a unique physical object, process, organization, person, abstraction, and so on. Data from multiple digital twins may be aggregated for a composite view across a number of real-world entities, such as a power plant, a city, and so on and/or their related processes.
A texture map may be a two-dimensional image of a surface that may be used to cover three-dimensional objects and describe characteristics of a given physical material. Texture maps may be a way of applying properties to a three-dimensional model so as to alter the appearance of the three-dimensional model using the aforementioned two-dimensional images. This appearance may include the three-dimensional model's color, fine detail, how shiny or metallic the three-dimensional model appears, whether or not the three-dimensional model is transparent and/or glows, and so on.
Materials digitized by PBR may be digital materials. PBR may be a methodology of shading and rendering three-dimensional models of physical materials that provides a more accurate representation of how light interacts with surfaces from a physically accurate standpoint. PBR may generate a texture stack for a digital material, which may be a series of texture maps. These texture maps may make up the physical characteristics of a digital twin of the physical material from which the digital material is generated.
The digital material may include one or more images of the respective physical sample, one or more texture stack maps generated from the respective physical sample, color information generated from the respective physical sample, scale information for the respective physical sample (which may be expressed in texel density, or pixels per unit of measure), use information (such as that the design material associated with the respective physical sample is typically used as a flooring, that the design material is typically interspersed with a grout as well as the dimensions and possible colors for such grout, and so on), light behavior (such as how reflective and/or non-reflective the respective physical sample is, how diffuse the physical sample is, and so on), metadata (such as a stock keeping unit or “SKU” associated with the physical sample, a manufacturer or supplier associated with the respective physical sample, price related to purchasable materials associated with the physical sample, availability of materials associated with the physical sample, time of availability of materials associated with the physical sample, durability of materials associated with the physical sample, one or more wear ratings of materials associated with the physical sample, and/or other information), and so on.
The metadata may provide real-world information for the digital material related to the respective physical sample (and/or product) from which the digital material was generated. The metadata may enable application of the digital material to surfaces of models within three-dimensional environments in a way that is accurate to the respective physical sample (and/or product) from which the digital material was generated. The metadata may include real-world width, real-world length, real-world height (thickness), pixel density, texture map width (in pixels), texture map height (in pixels), associated information (such as real-world grout width (where applicable), default grout selection (where applicable), or the like), and so on. Accurate scaling of materials may be a function of the texture map size in pixels divided by the pixel density (which, for this case, may be expressed in pixels per inch).
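By way of a non-limiting illustration only, the scaling relationship described above may be sketched in code. The following Python sketch uses hypothetical names (`MaterialMetadata`, `real_world_size`) that are not part of any particular embodiment:

```python
from dataclasses import dataclass

@dataclass
class MaterialMetadata:
    """Hypothetical container for the metadata fields described above."""
    map_width_px: int     # texture map width, in pixels
    map_height_px: int    # texture map height, in pixels
    pixel_density: float  # texel density, in pixels per inch

def real_world_size(meta: MaterialMetadata) -> tuple[float, float]:
    """Real-world width and height (inches) = map size in pixels / pixel density."""
    return (meta.map_width_px / meta.pixel_density,
            meta.map_height_px / meta.pixel_density)

# Example: a 4096 x 2048 texture scanned at 128 pixels per inch
# covers a 32 in x 16 in physical sample.
print(real_world_size(MaterialMetadata(4096, 2048, 128.0)))  # (32.0, 16.0)
```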
The metadata may also specify constraints associated with the respective physical sample, such as scale constraints, typical uses, and so on. By way of illustration, the metadata may indicate that a tile is generally used on walls. This metadata may be used to automatically place the tile on walls when selected, to provide a warning that the tile is typically used on walls when a user attempts to place the tile on a floor, and so on. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
The system 100A may also include one or more repository computing devices 104. The repository computing device 104 may store one or more digital materials generated by the scanner computing device 102. The repository computing device 104 may make stored digital materials available to one or more other devices, such as one or more rendering computing devices 105 that may use the digital materials for one or more different kinds of material visualization.
Additionally, the system 100A may further include one or more rendering computing devices 105. The rendering computing device 105 may use one or more stored digital materials and/or one or more digital materials obtained from the repository computing device 104 for one or more different kinds of material visualization. In various implementations, the rendering computing device 105 may use one or more game engines (such as the Unreal Engine) as part of ingesting one or more digital materials, generating models, adjusting models for various conditions, rendering one or more images, and so on. As illustrated, the system 100A may include one or more projector systems 110, one or more printed sheets systems 170, one or more web systems 180, and/or other systems. In some examples, the rendering computing device 105 may use one or more stored digital materials and/or one or more digital materials obtained from the repository computing device 104 for one or more different kinds of material visualization using one or more of the projector system 110, the printed sheet system 170, the web system 180, and/or one or more other systems.
By way of illustration, the rendering computing device 105 may use one or more stored digital materials and/or one or more digital materials obtained from the repository computing device 104 for material visualization using the projector system 110. The rendering computing device 105 may do so by dynamically generating a model, dynamically applying one or more selected digital materials to one or more surfaces of the model as per the specifications included in the digital materials, dynamically adjusting the model for demonstration area conditions, dynamically splitting the applied and adjusted model into projector zone images, and dynamically providing respective projector zone images to the respective projector of the projector system 110 for that zone. As the model may be adjusted for demonstration area conditions, the impact that lighting, time of day, and/or other conditions of the design area into which the design materials will be placed might have on how design materials will appear, behave, and/or integrate with other design materials may be taken into account. Further, the images supplied to the projectors of the projector system 110 may be true to life regarding the color, substance, scale, and/or other features of the respective design material because of the specifications included in the digital material. Additionally, the rendering performed by the rendering computing device 105 may be PBR because the digital materials are rendered according to the specifications included in the digital materials that were generated from scanning the physical samples 101 of the design materials. Due to the information included in the digital material related to SKU, manufacturer or supplier, and so on, the images provided to the projectors of the projector system 110 may directly correspond to a version of the design material that can be purchased and/or otherwise obtained.
By way of another illustration, the rendering computing device 105 may use one or more stored digital materials and/or one or more digital materials obtained from the repository computing device 104 for material visualization using the printed sheet system 170. The rendering computing device 105 may do so by processing an image from a digital material (such as by cropping the image if it is larger than a selected print area, or by generating additional portions of the image based on the image if it is smaller than a selected print area), printing the image on an adhesive-backed material sheet, and laminating and/or otherwise applying one or more glossy coatings, matte coatings, and/or other coatings to cause the printed adhesive-backed material sheet to interact with light like (i.e., look and feel like) the physical sample from which the digital material was generated.
In a third illustration, the rendering computing device 105 may use one or more stored digital materials and/or one or more digital materials obtained from the repository computing device 104 for material visualization using the web system 180. The rendering computing device 105 may do so by generating a model, applying a digital material to the model, adjusting the model for web output conditions, using a virtual camera to capture the adjusted model for web output, and providing the virtual camera captured adjusted model to one or more web servers, such as one or more web servers that serve product detail pages (or “PDP”).
As discussed above, the scanner computing device 102 may be communicably coupled to one or more scanners. The scanner may be any kind of scanner that is operable to scan one or more physical samples 101 as part of generating one or more digital materials. Examples of such scanners include TAC7™ scanners, Vizoo™ scanners, METIS™ scanners, and so on.
Although the above illustrates and describes three material visualization uses for digital materials, it is understood that these are examples and that digital materials may be otherwise used and/or other material visualizations may be performed. Various configurations are possible and contemplated without departing from the scope of the present disclosure. Material visualization using the projector system 110, the printed sheet system 170, and the web system 180 are discussed in more detail below.
At operation 210A, an electronic device (such as the scanner computing device 102 of
At operation 220A, the electronic device may normalize the influence of light color on the scanning. Scanning generally involves use of one or more light sources. Such light sources may have a color temperature and/or light color that may cause a resulting image to appear different than the physical sample scanned under reference lighting conditions. Normalizing the influence of light color on the scanning may correct for this, causing the resulting image to appear the same (i.e., “true to life”) and/or substantially similar to the physical sample scanned under the reference lighting conditions. As a result, the system may be linearized to be color accurate.
For example, a number of example physical scan items (which may correspond to samples of white, shades of gray, and/or one or more colors) may be scanned that correspond to example scan item images that appear the same as the example physical scan items. These example physical scan items and the corresponding example scan item images may be used to calibrate the scanning process in order to normalize the influence of light color on the scanning. This may be performed by scanning one or more of the example physical scan items and comparing the resulting scan images to the respective example scan item images, determining one or more light color corrections to be performed to get the resulting scan images to match or substantially match the respective example scan item images, and then making the same light color corrections to later obtained scan images of other physical samples. This may linearize the scanning system (i.e., create a scale over all of the different shades such that deviation from actual color is less than approximately 3-6%).
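For illustration, one way such a normalization might be derived is a per-channel linear fit between scanned values and known reference values. The following is a minimal Python sketch (using NumPy); the reference chart values and the form of the correction are illustrative assumptions, not a description of any particular scanner:

```python
import numpy as np

# Known reference values for the example physical scan items (e.g., white and
# shades of gray), and the values actually measured when scanning those items.
reference = np.array([[250, 250, 250], [128, 128, 128], [30, 30, 30]], dtype=float)
measured  = np.array([[255, 244, 238], [134, 126, 120], [33, 29, 26]], dtype=float)

# Fit a gain and offset per color channel (least squares over the chart).
corrections = [np.polyfit(measured[:, c], reference[:, c], deg=1) for c in range(3)]

def normalize(image: np.ndarray) -> np.ndarray:
    """Apply the per-channel light color correction to a scanned image."""
    out = np.empty_like(image, dtype=float)
    for c, (gain, offset) in enumerate(corrections):
        out[..., c] = image[..., c] * gain + offset
    return np.clip(out, 0, 255)
```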
At operation 230A, the electronic device may generate a texture stack. The texture stack 200B may include an alpha (opacity) texture map, a displacement (bump, height) texture map, a roughness (opposite of glossiness) texture map, a metallic (metalness) texture map, a normal texture map, a base color (albedo) texture map, and so on.
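For illustration only, such a texture stack might be represented as a simple container holding one map per characteristic. The names in the following Python sketch are hypothetical:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class TextureStack:
    """Hypothetical texture stack: one two-dimensional map per characteristic."""
    alpha: np.ndarray         # opacity
    displacement: np.ndarray  # bump/height
    roughness: np.ndarray     # opposite of glossiness
    metallic: np.ndarray      # metalness (dielectric vs. metal)
    normal: np.ndarray        # surface normal directions
    base_color: np.ndarray    # albedo
```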
At operation 240A, the electronic device may record metadata. The metadata may provide real-world information for the digital material related to the respective physical sample (and/or product) from which the digital material was generated. The metadata may enable application of the digital material to surfaces of models within three-dimensional environments in a way that is accurate to the respective physical sample (and/or product) from which the digital material was generated. The metadata may include real-world width, real-world length, real-world height (thickness), pixel density, texture map width (in pixels), texture map height (in pixels), associated information (such as real-world grout width (where applicable), default grout selection (where applicable), or the like), and so on.
At operation 250A, the electronic device may post-process one or more images. Such post-processing may create a seamlessly tileable digital material; a modular, object-based digital material; and so on.
In various examples, this example method 200A may be implemented as a group of interrelated software modules or components that perform various functions discussed herein. These software modules or components may be executed within a cloud network and/or edge network and/or by one or more computing devices, such as the scanner computing device 102 of
Although the example method 200A is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
For example, in some implementations, the method 200A may include one or more operations for reducing scan time. By way of illustration, some physical samples (such as carpets with geometric designs) may include one or more areas that are virtually identical and/or otherwise repetitive of one or more other areas. Rather than scan all such virtually identical and/or otherwise repetitive areas, the method 200A may include determining that such virtually identical and/or otherwise repetitive areas are present, omitting scanning one or more of the virtually identical and/or otherwise repetitive areas, and using scan data from scanning one or more of the other virtually identical and/or otherwise repetitive areas instead of scanning the one or more virtually identical and/or otherwise repetitive areas. In this way, the efficiency of the scanner computing device 102 may be improved.
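One possible way to detect such repetition is to compare a scanned strip against shifted copies of itself and look for an offset at which the content nearly matches. The following Python sketch (NumPy; the function name, threshold, and approach are illustrative assumptions) outlines the idea:

```python
import numpy as np

def find_repeat_period(strip: np.ndarray, min_period: int = 64) -> int | None:
    """Estimate a horizontal repeat period by comparing the scanned strip with
    shifted copies of itself; returns None if no convincing repeat exists."""
    s = strip.astype(float)
    width = s.shape[1]
    best_period, best_score = None, np.inf
    for period in range(min_period, width // 2):
        # Mean absolute difference between the strip and its shifted copy.
        diff = np.mean(np.abs(s[:, period:] - s[:, :-period]))
        if diff < best_score:
            best_period, best_score = period, diff
    # Require near-identical repetition before skipping further scanning.
    return best_period if best_score < 2.0 else None
```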
Scanning equipment may be capable of generating the raw maps from physical samples scanned from a physical inventory at a hub. These raw maps may be post-processed. Such post-processing may create a seamlessly tileable digital material; a modular, object-based digital material; and so on.
For example, a tile may have edges that are not visible when the tile is used to form a mosaic with other tiles. As such, an image of the tile may be post-processed to create a modular, object-based digital material by cropping in on edges of and/or squaring off a scanned image of the tile. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
By way of another example, a wooden board may have a grain. Tiling multiple copies of an image of a single board into a compound image, instead of using images of multiple distinct boards, may be visibly apparent due to the grain (creating the effect of squares of cloth combined into a quilt rather than a single larger piece of the cloth). As such, post-processing of an image of the wooden board to create a seamlessly tileable digital material may involve combining images of the wooden board into a compound image while masking, aligning, ensuring sufficient grain is present to enable pattern repeat (and/or estimating how the pattern would extend and extending such according to the estimation), and/or offsetting edges of the individual images of the wooden board. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
In still another example, bricks may have slight variations from each other. Post-processing may involve scanning multiple bricks to determine variations and then altering one or more individual brick images to account for the variations. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
In yet another example, a carpet may have a repeating pattern. Rather than scanning the same repeating pattern over and over, a determination may be made that the pattern repeats and the repeating pattern may be created based on the scanning that has already occurred. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
In still another example, correction for influence of light color during scanning was discussed. However, the way that a physical material behaves in response to light may still need to be determined and stored in the digital material and/or related metadata in order to be able to accurately render the digital material and/or the response of the digital material to light. As such, the way that a physical material behaves in response to light (i.e., how much the material reflects light, diffuses light, and so on) may be determined and stored in the digital material and/or related metadata as part of post-processing. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
Additionally and/or alternatively, post-processing may involve one or more other processes, such as one or more quality assurance processes. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
In various examples, post-processing may be performed using various software tools at the direction of one or more graphical artists. In other examples, such a process may be automated, controlled by artificial intelligence and/or machine learning informed by previous post-processing performed using various software tools at the direction of one or more graphical artists, and so on.
The texture stack 200B may be raw (i.e., not yet post-processed). As shown, the texture stack 200B may include an alpha (opacity) texture map, a displacement (bump, height) texture map, a roughness (opposite of glossiness) texture map, a metallic (metalness) texture map, a normal texture map, a base color (albedo) texture map, and so on.
There may be at least two workflows for rendering PBR materials. These include a metallic/roughness workflow, a specular/glossy workflow, and so on. In some examples, digital materials may be set up to primarily support the metallic/roughness workflow, as this may be the most common in real-time rendering environments. Converting PBR texture stacks between workflows may be a relatively simple operation.
The biggest difference between a metallic/roughness workflow and a specular/glossy workflow may be in how the diffuse and reflectivity content is defined in the texture maps. In the specular/glossy workflow, these values may be set explicitly within two unique texture maps. The metallic/roughness workflow, on the other hand, may use an albedo texture map to define both the diffuse and reflectivity content and a metallic texture map to define whether the material is dielectric or a metal.
The metallic/roughness workflow may use a base color texture map, a metallic texture map, and a roughness texture map to generate a combined result. The specular/glossy workflow may use a diffuse texture map, a specular texture map, and a glossiness texture map to generate a combined result.
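For illustration, converting texel values from the metallic/roughness workflow to the specular/glossy workflow may be sketched as follows. The sketch uses the widely published convention that dielectric reflectance (F0) is approximately 0.04; the function name is hypothetical:

```python
def metallic_to_specular(base_color, metallic, roughness):
    """Convert metallic/roughness texel values to specular/glossy values.

    Uses the common convention that dielectric reflectance (F0) is ~0.04;
    metals take their reflectance color from the base color.
    """
    dielectric_f0 = 0.04
    diffuse = tuple(c * (1.0 - metallic) for c in base_color)
    specular = tuple(dielectric_f0 * (1.0 - metallic) + c * metallic
                     for c in base_color)
    glossiness = 1.0 - roughness
    return diffuse, specular, glossiness

# Example: a pure metal (metallic=1.0) has no diffuse term; its specular
# color equals its base color.
print(metallic_to_specular((0.9, 0.6, 0.2), metallic=1.0, roughness=0.3))
```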
Certain embodiments may calibrate a system output to provide an accurate, true-to-original-sample color. As different projectors, displays, and other output devices may vary in their color gamut and color accuracy, this calibration is designed to accurately measure and portray an output against colors of an original sample scanned into the system. Further, calibration routines may account for color shifts of an output due to a color or tint of a projection surface, room lighting, and the like. That is, the calibration routine may measure an output color as seen by a user under existing conditions and change that output to accurately recreate a sample's color(s) while taking into account environmental factors.
Next, in operation 220C, the sample is scanned, digitized, recorded, or otherwise analyzed and entered into the system. Certain embodiments may use a large flatbed scanner (such as, for example, a Vizoo A2 scanner) while others utilize a moving overhead scanner (one example of which is a Metis scanner). Typically, although not necessarily, scanners are calibrated weekly by scanning a color chart having known values and comparing those values to the xyY values obtained from the scanner (or other color space values). Differences between actual and scanned values are linearly adjusted in order to ensure the scanner is accurately calibrated prior to scanning the sample.
In operation 230C, color values for the sample digital data are corrected by mapping them to target values in a CIE color space. The scanner is typically calibrated to a color space, such as the CIE color space, and the sample is scanned to produce a set of sample digital data. The sample digital data includes color values for each color of the sample. This sample digital data is normalized about the CIE color space scale, thereby normalizing the color values of the sample with respect to the scanner. “Linearization” may determine a best-fit line across an entire set of colors or color gamut for the sample digital data, and that best-fit line is used to correct the digital data for the sample. For example, color values deviating from the best-fit line by less than a threshold amount may be matched to, or moved to, the line. As an alternative example, each value within a given color may be linearly adjusted by a set or proportionate offset amount. Thus, linearization may be either applied across a color space (e.g., uniformly across all instances of sample digital data) or individually to each single color (e.g., fitting each instance of sample digital data to a best-fit line, curve, or the like, and so adjusting each instance of sample digital data by an individual amount).
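The two linearization alternatives described above may be sketched as follows. This minimal NumPy example (with illustrative names and threshold) fits a best-fit line across the sampled values and either applies the proportional adjustment everywhere or snaps only near-line values to the line:

```python
import numpy as np

def linearize(measured: np.ndarray, target: np.ndarray,
              snap_threshold: float | None = None) -> np.ndarray:
    """Fit target = gain * measured + offset across the sampled gamut,
    then correct the measured values toward that best-fit line."""
    gain, offset = np.polyfit(measured, target, deg=1)
    fitted = gain * measured + offset
    if snap_threshold is None:
        return fitted  # proportional adjustment applied everywhere
    # Otherwise, only snap values already close to the best-fit line.
    close = np.abs(measured - fitted) < snap_threshold
    return np.where(close, fitted, measured)
```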
Following or concurrently with the color correction described above, and as a further part of operation 230C, the mapped color values for the sample digital data are adjusted to account for the system output's color gamut. In this operation the scanned colors of the material are plotted in a color space (and a color gamut within that space) against the “true” value of each color, as assigned in operation 220C. Typically, an embodiment utilizes xyY values for this where x and y are coordinates within a color space and Y is luminance, as previously mentioned.
It should be noted that sample colors (or, more particularly, instances of sample digital data) can vary in Y/luminance only. For example, different shades of gray are generally distinguished only by differing Y values. The mapping described above yields an idealized set of color values for a material. An “idealized” color space is a pure space, unadjusted for any projector characteristics or errors.
As part of operation 230C, the idealized color space (and/or the idealized color gamut) is mapped to the actual color space (and/or gamut) of the real-world projector. This is done by measuring the projector outputs and comparing them back to the idealized color space/gamut. This comparison yields a set of xyY values for each color of interest (which can be the material colors, a set of representative colors, a set of color gamut edge points/colors, or any other set of colors) that can be used to adjust the idealized color gamut. Depending on the variance between the idealized color space or gamut and the actual color space or gamut of the projector, the entirety of the idealized color space/gamut may be adjusted in a single operation or different points in the idealized color space/gamut may be individually adjusted.
Finally, in operation 240C the system output projects, displays, or otherwise outputs an image of the sample from the projector.
An embodiment may measure this output to calibrate specifically for the color values of a given image, thereby providing real-time feedback and adjustment. Alternatively, the output may be measured weekly or at any given interval to detect and account for any drift in the output device.
The rendering computing device 105 may use one or more stored digital materials and/or one or more digital materials obtained from the repository computing device 104 for material visualization using the projectors 311A-311F. The rendering computing device 105 may do so by dynamically generating a model; dynamically applying one or more selected digital materials (which may be dynamically selected via user input received via a user interface) to one or more surfaces of the model as per the specifications included in the digital materials, such as by using one or more UV maps included in the digital materials that specify true to life scaling of the digital material across the pixels of the surfaces (which may involve unwrapping one or more images included in the digital materials and wrapping them around the surfaces); dynamically adjusting the model (which may involve light simulation) for demonstration area conditions, which may be detected using the sensor 312 (such as where light and/or other conditions of the example demonstration area 315 differ from the model, where light from one or more of the projectors 311A-311F bounces to interfere with the projection of one or more of the other projectors 311A-311F, and so on); dynamically splitting the applied and adjusted model into projector zone images; and dynamically providing respective projector zone images to the respective one of the projectors 311A-311F for that zone. As the model may be adjusted for demonstration area conditions, the impact that lighting, time of day, and/or other conditions of the design area into which the design materials will be placed might have on how design materials will appear, behave, and/or integrate with other design materials may be taken into account. The lighting adjustment may be performed using ray tracing using light responsiveness information included in the digital material, lighting specifications for the model, and ways that the example demonstration area 315 differs from the model that may need to be corrected for so that the model appears as intended when projected into the demonstration area 315. Alternatively, though the previous describes adjusting the model for lighting conditions of the example demonstration area 315 that differ from the model, ray tracing and/or similar techniques may be used to apply lighting to the model separate from any lighting conditions of the example demonstration area 315 and/or prior to adjusting for lighting conditions of the demonstration area 315. Further, the images supplied to the projectors 311A-311F may be true to life regarding the color, substance, scale, and/or other features of the respective design material because of the specifications included in the digital material. Additionally, the rendering performed by the rendering computing device 105 may be PBR because the digital materials are rendered according to the specifications included in the digital materials that were generated from scanning the physical samples of the design materials. Due to the information included in the digital material related to SKU, manufacturer or supplier, and so on, the images provided to the projectors 311A-311F may directly correspond to a version of the design material that can be purchased and/or otherwise obtained.
As discussed above, the model may be generated from a viewpoint. In some cases, that may be a static viewpoint, such as from an assumption that a viewer will be standing before the example demonstration area 315 at the very middle. The viewpoint from which the model is generated may be significant. For example, some sidewalk art appears to be three-dimensional when viewed from the viewpoint from which the perspective is generated and two-dimensional from other viewpoints.
In some examples, the rendering computing device 105 may perform one or more of the above operations using a gaming engine, such as the Unreal Engine. The rendering computing device 105 may use one or more plugins and/or other software and/or hardware components to supplement and/or leverage the rendering capabilities of the gaming engine to import, generate, and/or render and/or adjust and/or project one or more digital materials and/or models. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
The projectors 311A-311F may be configured to project onto different zones of the demonstration area 315. The rendering computing device 105 may assign each of the projectors 311A-311F to their respective zones and/or otherwise determine which of the projectors 311A-311F to use for which zone as part of a projector calibration process. The zones may be configured to overlap. This may result in a more true to life material visualization as there may not be gaps between zones and/or projected images. As multiple of the projectors 311A-311F may be projecting into overlapping areas between zones, the projectors 311A-311F may be configured (and/or instructed by the rendering computing device 105) to reduce the intensity of pixels in the overlapping areas as compared to pixels in non-overlapping areas to avoid and/or reduce intensity distortion in the overlapping areas.
For example, two of the projectors 311A-311F projecting into an overlapping area may each reduce the intensity of pixels in the overlapping area by 50% as compared to pixels in non-overlapping areas to avoid and/or reduce intensity distortion in the overlapping area. In another example, one of two projectors 311A-311F projecting into an overlapping area may reduce the intensity of pixels in the overlapping area by 25% as compared to pixels in non-overlapping areas, whereas the other of the two projectors 311A-311F may reduce the intensity of pixels in the overlapping area by 75%. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
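For illustration, such intensity reduction may be implemented as complementary blend ramps whose weights sum to one across the blend zone. The following NumPy sketch is illustrative only; real systems may use nonlinear, gamma-aware ramps:

```python
import numpy as np

def blend_weights(zone_width: int, overlap: int) -> tuple[np.ndarray, np.ndarray]:
    """Per-column intensity weights for the left and right projectors sharing
    an overlap region; outside the overlap each projector is at full intensity."""
    left = np.ones(zone_width)    # weights across the left projector's zone
    right = np.ones(zone_width)   # weights across the right projector's zone
    ramp = np.linspace(1.0, 0.0, overlap)  # left projector fades out...
    left[-overlap:] = ramp
    right[:overlap] = 1.0 - ramp           # ...as the right projector fades in
    return left, right

# At the midpoint of the overlap both projectors contribute 50%; asymmetric
# splits (e.g., 25%/75%) correspond to shifting the crossover point.
```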
The projectors 311A-311F may be automatically and/or otherwise calibrated. To adjust for thermal drift, the projectors 311A-311F may not be calibrated until the projectors 311A-311F are brought up to their running temperature. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
In some examples, the projectors 311A-311F may all be mounted to the same mounting apparatus, such as a metal rack mounted to a ceiling. This may prevent deviations from occurring after calibration of the projectors 311A-311F that could be caused by relative movement of the structures (such as may be caused by temperature drift) to which one or more of the projectors 311A-311F were mounted had they not been mounted to the same mounting apparatus. The projectors 311A-311F may also be calibrated and/or aligned after being brought up to operating temperature to prevent subsequent temperature drift from causing one or more of the projectors 311A-311F to become misaligned.
Control of the projectors 311A-311F in concert by the rendering computing device 105 may be significant for ensuring that the resulting dynamic projection is true to life of the physical materials and the model as this ensures that all portions of the dynamic projection maintain pixel density, accurate color, alignment, and so on. By way of contrast, individual control of the projectors 311A-311F may result in a projection that has portions with differing pixel density, color, misalignment, and/or other inaccuracies that are not true to life of the physical materials and the model. The rendering computing device 105 may include one or more graphics cards (such as one or more NVIDIA 9600 RTX graphics cards) that enable control of the projectors 311A-311F in concert by the rendering computing device 105.
The projectors 311A-311F may be any kind of projectors, such as a digital light processing projector, a liquid crystal display projector, and/or any other image projecting device. One or more of the projectors 311A-311F may have a resolution of at least 4K (such as 3840×2160 pixels, 4096×2160 pixels, and so on). The higher the resolution of the projectors 311A-311F, the more true to life the projections may appear, though this may also be limited by the quality of the images used in the digital materials. Although six projectors 311A-311F are shown and described, it is understood that this is an example. In other implementations, other numbers of projectors 311A-311F may be used (such as two, five, ten, and so on) without departing from the scope of the present disclosure.
Further, although the projectors 311A-311F are illustrated and described as forward projection devices, it is understood that this is an example. In some implementations, one or more of the projectors 311A-311F may be replaced with one or more rear projection devices and the walls 313A-313C, floor 314, and/or other element may be configured as a translucent and/or transparent screen onto which the rear projection devices may project. In some cases, rear projection implementations may not generate material visualization projections that are as true to life as front projection implementations. However, rear projection implementations may still have the advantage that an observer could walk through the example demonstration area 315 without interrupting any of the projection. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
In various examples, one or more surfaces of the example demonstration area 315 may include one or more surface treatments to prevent and/or reduce light bounce from one or more of the projectors 311A-311F interfering with the projection by one or more other of the projectors 311A-311F. For example, such a surface treatment may include carbon nanotubes (such as crushed carbon nanotubes) applied in one or more polymers. Alternatively and/or additionally, the surface treatments may include one or more treatments that allow reflection of light projected straight at the surface but inhibit reflection of light traveling at angles to the surface, such as from light bounce. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
As described above, modeling may be performed and the model may be adjusted for perspective and/or lighting conditions in the environment in which the demonstration area 315 is located. In some examples, a user may use an app and/or other application (such as executing on a smart phone or other electronic device) to take a light meter measurement of an intended environment. This light meter measurement may be used to adjust the model for lighting conditions, which may include determining the associated color spectrum. Alternatively, the user may provide information regarding the lighting that will be present in the intended environment, such as information regarding lighting that will be used. In another example, the app may be used to take a picture of a white piece of paper and/or other reference in the intended environment, and data may be derived from the picture and used to adjust the model for lighting conditions. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
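By way of illustration, the white-reference picture described above might be used to estimate per-channel gains that neutralize the color of the ambient lighting. The following Python sketch (Pillow and NumPy) is a hypothetical realization, not a description of any particular app:

```python
import numpy as np
from PIL import Image

def lighting_gains(photo_path: str) -> np.ndarray:
    """Estimate per-channel gains from a photo of a white reference; the
    gains may then be applied when adjusting the model for the environment."""
    rgb = np.asarray(Image.open(photo_path).convert("RGB"), dtype=float)
    mean = rgb.reshape(-1, 3).mean(axis=0)  # average observed "white"
    return mean.max() / mean                # boost channels the lighting dims
```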
At operation 410, an electronic device (such as the rendering computing device 105 of
At operation 420, the electronic device may apply one or more selected digital materials to at least part of one or more surfaces of the model. Each digital material may include one or more images of a respective physical sample, one or more texture stack maps generated from the respective physical sample, color information generated from the respective physical sample, scale information for the respective physical sample (which may be expressed in texel density, or pixels per unit of measure), use information (such as that the design material associated with the respective physical sample is typically used as a flooring, that the design material is typically interspersed with a grout as well as the dimensions and possible colors for such grout, and so on), light behavior (such as how reflective and/or non-reflective the respective physical sample is, how diffuse the physical sample is, and so on), metadata (such as a stock keeping unit or “SKU” associated with the physical sample, a manufacturer or supplier associated with the respective physical sample, and/or other information), and so on.
At operation 430, the electronic device may adjust the applied model for one or more demonstration area conditions. Such demonstration area conditions may include the impact that lighting (which may be performed using ray tracing using light responsiveness information included in the digital material, lighting specifications for the model, ways that the demonstration area differs from the model that may need to be corrected for so that the model appears as intended when projected into the demonstration area), time of day, and/or other conditions of the design area into which the design materials will be placed might have on how design materials will appear, behave, and/or integrate with other design materials.
At operation 440, the electronic device may split the adjusted applied model into images for one or more projector zones. The electronic device may assign projectors to respective zones and/or otherwise determine which of the projectors to use for which zone as part of a projector calibration process.
At operation 450, the electronic device may provide the respective images to the respective projectors. This may cause the projectors to project the images into the demonstration area, generating a true to life visualization of the digital materials applied to the model.
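For illustration, operations 440 and 450 may be sketched as slicing a rendered frame into per-zone images, including any overlap between zones. The zone geometry in the following NumPy sketch is hypothetical:

```python
import numpy as np

def split_into_zones(frame: np.ndarray, zones: list[dict]) -> list[np.ndarray]:
    """Cut the rendered frame into one image per projector zone.

    Each zone is described by its pixel rectangle within the frame; zones
    may overlap, in which case the shared pixels appear in both images.
    """
    return [frame[z["top"]:z["bottom"], z["left"]:z["right"]].copy()
            for z in zones]

# Example: two side-by-side 4K zones with a 200-pixel overlap.
frame = np.zeros((2160, 7480, 3), dtype=np.uint8)
zones = [{"top": 0, "bottom": 2160, "left": 0, "right": 3840},
         {"top": 0, "bottom": 2160, "left": 3640, "right": 7480}]
images = split_into_zones(frame, zones)
```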
In various implementations, the method 400 and/or similar and/or related methods may involve one or more feedback loops. In such implementations, generation and/or projection of images onto all surfaces may operate in unison to render a complete image of the space/materials/conditions, and so on. As such, any variable that impacts one of the surfaces may impact all of the surfaces and accordingly may be updated in real time. These updates may be performed using one or more run-time commands, which may be user prompted and/or automated, that may continuously modify the rendering of the images of the space/materials/conditions, and so on. This feedback loop may support decision making by the user.
In various examples, this example method 400 may be implemented as a group of interrelated software modules or components that perform various functions discussed herein. These software modules or components may be executed within a cloud network and/or edge network and/or by one or more computing devices, such as the rendering computing device 105 of
Although the example method 400 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
For example, the above illustrates and describes that the electronic device may split the adjusted applied model into images for one or more projector zones. However, it is understood that this is an example. In various implementations, the electronic device may split the adjusted applied model into images for one or more projectors that then project into respective projector zones. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
At operation 610, an electronic device (such as the rendering computing device 105) may display a projector calibration image.
At operation 620, the electronic device may adjust one or more of the projectors based on the displayed projector calibration image. Adjustment of the one or more of the projectors may include controlling the projectors, signaling the projectors to adjust, providing instructions to adjust the projectors, and so on. Adjustment of the projectors may include controlling a zoom and/or a direction of projection and/or other property.
At operation 630, the electronic device may determine whether or not calibration is complete. If not, the flow may return to operation 610 where the electronic device continues displaying the projector calibration image. Otherwise, the flow may proceed to operation 640 where the electronic device may complete calibration.
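A much-simplified version of this loop appears below; display_grid, measure_error, and nudge are hypothetical hooks standing in for operations 610 through 640, not an actual projector API.

```python
def calibrate_projectors(projectors, display_grid, measure_error, tolerance=1.0):
    """Iterate until every projector's calibration image lands within tolerance.

    display_grid(p)  -- project the calibration image from projector p
    measure_error(p) -- alignment error in pixels, e.g. from a camera observing the area
    """
    done = False
    while not done:                      # operation 630: is calibration complete?
        done = True
        for p in projectors:
            display_grid(p)              # operation 610: display the calibration image
            error = measure_error(p)
            if error > tolerance:
                p.nudge(error)           # operation 620: adjust zoom, direction, etc.
                done = False
    return projectors                    # operation 640: calibration complete
```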
In various examples, this example method 600 may be implemented as a group of interrelated software modules or components that perform various functions discussed herein. These software modules or components may be executed within a cloud network and/or edge network and/or by one or more computing devices, such as the rendering computing device 105.
Although the example method 600 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
For example, the above illustrates and describes the electronic device adjusting one or more of the projectors based on the displayed projector calibration image. However, it is understood that this is an example. In some implementations, the electronic device may provide instructions to adjust the projectors (such as by displaying the projector calibration image) instead of directly controlling the projectors. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
Projector system calibration map 700A may show projection alignment grids in use on a four-surface volume. Projector system calibration map 700B may show projection alignment grids in use on a four-surface volume arranged for half of the ten projection channels (indicated as letters A-E) in use in the depicted examples.
In the context of the projection alignment grids of the system calibration maps 700A-700D: letters may indicate the projection channel (indicated as letters A-K), colored borders may indicate projector frustums, blend zones may be indicated by frustum overlap, key reference points may be marked, each large square may correspond to a number of pixels (such as 200 pixels), each small square may correspond to a smaller number of pixels (such as 25 pixels), and each circle may be a certain number of pixels in diameter (such as 80 pixels). Each grid may be custom for each projection surface. The projection alignment grids may function to provide alignment references, ensure even pixel distribution, ensure warp linearity, enable accurate calculation of exact scale, provide a blend zone/blanking reference, and so on.
Projection alignment grids and the reference points within the projection alignment grids may be used for multi-channel projection mapping. More significantly, the projection alignment grids may clearly define and distribute pixel density, which validates real-world scale when mapping digital materials to projected surfaces.
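For illustration, a grid with the example spacing above (25-pixel small squares, 200-pixel large squares) could be generated as follows; the reference circles and channel letters are omitted for brevity, and the grayscale encoding is an assumption.

```python
import numpy as np

def alignment_grid(height, width, small=25, large=200):
    """Render a projection alignment grid as a grayscale image in [0, 1]:
    faint lines bound each small square, bright lines each large square."""
    grid = np.zeros((height, width), dtype=np.float32)
    grid[::small, :] = 0.5   # small-square boundaries (e.g., every 25 pixels)
    grid[:, ::small] = 0.5
    grid[::large, :] = 1.0   # large-square boundaries (e.g., every 200 pixels)
    grid[:, ::large] = 1.0
    return grid

# One custom grid per projection surface, e.g. for a 1080p projection channel:
grid_a = alignment_grid(1080, 1920)
```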
At operation 801, an electronic device (such as the rendering computing device 105) may receive a surface selection. The surface selection may be received from a user via one or more user interfaces.
At operation 802, the electronic device may present digital material options for the surface. In some implementations, the digital material options that are presented may be those that are available for the selected surface. For example, flooring options may be presented when a floor is selected.
At operation 803, the electronic device may receive a digital material selection. The digital material selection may be received from a user via one or more user interfaces. The digital material selection may be one of the digital material options presented in operation 802.
At operation 804, the electronic device may apply the digital material to a model for a demonstration area. At operation 805, the electronic device may split the applied model into images for a number of projectors. At operation 806, the electronic device may cause the projectors to display their respective split images.
At operation 807, the electronic device may determine whether or not an order is received. The order may be received from a user via one or more user interfaces. If so, the flow may proceed to operation 808 where the electronic device may order a physical design material corresponding to the digital material. Otherwise, the flow may proceed to operation 809.
At operation 809, the electronic device may determine whether or not to modify the applied model. If so, the flow may proceed to operation 801 where the electronic device may receive a surface selection. Otherwise, the flow may proceed to operation 810 and end.
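Taken together, operations 801 through 810 amount to an interactive selection loop; the sketch below shows one hypothetical shape for it, with ui, catalog, renderer, and ordering standing in for interfaces the present disclosure does not prescribe.

```python
def run_selection_loop(ui, catalog, renderer, ordering):
    """Interactive visualize-and-order loop mirroring operations 801-810."""
    while True:
        surface = ui.pick_surface()                   # operation 801
        options = catalog.options_for(surface)        # operation 802: e.g., floorings for a floor
        material = ui.pick_material(options)          # operation 803
        model = renderer.apply(material, surface)     # operation 804
        renderer.project(model)                       # operations 805-806: split and display
        if ui.order_requested():                      # operation 807
            ordering.place(material.metadata["sku"])  # operation 808: order the physical material
        if not ui.modify_requested():                 # operation 809
            break                                     # operation 810: end
```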
In various examples, this example method 800 may be implemented as a group of interrelated software modules or components that perform various functions discussed herein. These software modules or components may be executed within a cloud network and/or edge network and/or by one or more computing devices, such as the rendering computing device 105.
Although the example method 800 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
For example, the above illustrates and describes receiving a surface selection, presenting digital material options for the selected surface, receiving a digital material selection, and then generating a visualization accordingly. However, it is understood that this is an example. In other implementations, digital material options may be presented, a digital material selection may be received, and then surface options to apply the digital material to may be presented. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
The system 900 may create one or more material sheets using one or more digital materials. Such a material sheet may be applied to one or more surfaces to visualize how a physical sample corresponding to the digital material would appear in a demonstration space.
For example, the rendering computing device 105 may use the printer 920 to print an image included in the digital material on an adhesive-backed sheet and then use the laminator 921 to apply one or more glossy and/or matte coatings (such as one or more polymers) to make the resulting laminated adhesive-backed sheet true to life of the glossiness properties, matte properties, and/or tactile properties (such as smoothness, stickiness, and so on) indicated in the digital material.
In some cases, the size specified for the material sheet may differ from that of the image included in the digital material. In such a case, the rendering computing device 105 may crop the image if the image is larger or perform a process to generate a larger image from the image included in the digital material if the image is smaller. The latter process is discussed in more detail below.
At operation 1010, an electronic device (such as the rendering computing device 105) may receive a selection of a digital material to use for a material sheet. The selection may be received from a user via one or more user interfaces.
At operation 1020, the electronic device may receive dimensions for the material sheet. The dimensions may be received from a user via one or more user interfaces.
At operation 1030, the electronic device may determine whether an image in the digital material is smaller than, larger than, or the same size as the received dimensions. If the image in the digital material is the same size as the received dimensions, the flow may proceed to operation 1050 where the electronic device may determine to use the digital material without adjustment before the flow proceeds to operation 1070 where the electronic device may use a printer to print the material sheet using the digital material and the specifications included therein. However, if the image in the digital material is larger than the received dimensions, the flow may proceed to operation 1040 where the electronic device may crop the image before the flow proceeds to operation 1070 where the electronic device may use a printer to print the material sheet using the cropped image and the specifications included in the digital material. Otherwise, if the image in the digital material is smaller than the received dimensions, the flow may proceed to operation 1060 where the electronic device may generate an image from the image included in the digital material before the flow proceeds to operation 1070 where the electronic device may use a printer to print the material sheet using the generated image and the specifications included in the digital material.
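The three-way size decision of operations 1030 through 1060 reduces to a comparison of pixel dimensions once the received dimensions are expressed at the digital material's texel density. A sketch follows; extend_image (operation 1060) is sketched after the marble example below.

```python
def prepare_print_image(image, target_size):
    """Choose use-as-is, crop, or generate, per operations 1030-1060.

    image       -- array of shape (H, W, 3) from the digital material
    target_size -- (height, width) in pixels at the material's texel density
    """
    h, w = image.shape[:2]
    th, tw = target_size
    if (h, w) == (th, tw):
        return image                         # operation 1050: use without adjustment
    if h >= th and w >= tw:
        return image[:th, :tw]               # operation 1040: crop the larger image
    return extend_image(image, target_size)  # operation 1060: generate a larger image
```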
The material sheet may be an adhesive-backed material sheet. For example, the adhesive may be a pressure-sensitive adhesive mounted on the back of a substrate (such as acrylic, paper, cardboard, plastic, and so on) on which an image is printed (the top of which may be laminated with one or more coatings). In some examples, the adhesive may be covered by a backing that may be removed for use. In other examples, the adhesive-backed material sheet may be configured in a roll such that the adhesive from one layer of the roll contacts the printed image on the substrate of a lower layer of the roll.
The electronic device may generate the image from the image included in the digital material without rescaling that image by analyzing the appearance of the image included in the digital material; estimating, based on the analysis, what the appearance of the image would be if the image extended beyond its dimensions to the received dimensions; generating one or more additional portions of the image based on the estimation; and combining the generated one or more additional portions with the image included in the digital material. This combined image may be the image that the electronic device uses when the image in the digital material is smaller than the received dimensions.
For example, the digital material may be a digital material generated from a physical sample of a marble tile. The marble tile may be 3″×5″ and have a vein pattern. However, the received dimensions may be 3″×9″. In this situation, the electronic device may analyze the vein pattern, estimate what the vein pattern would look like in the portion of 3″×9″ that extends beyond the 3″×5″ of the image included in the digital material generated from the physical sample of a marble tile, generate an additional marble tile image portion based on the estimation of what that vein pattern would look like, and combine the image included in the digital material generated from the physical sample of a marble tile with the additional marble tile image portion.
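In practice the estimation step would use texture synthesis or a learned generative model; the sketch below substitutes simple tiling continuation (numpy's wrap padding) purely to illustrate the extend-without-rescaling idea, and is not the estimation process itself.

```python
import numpy as np

def extend_image(image, target_size):
    """Crude stand-in for the estimation described above: continue the image
    past its borders without stretching or rescaling any source pixel. Wrap
    padding repeats the sample as if it tiled; a production system would
    instead estimate the continued pattern (e.g., the marble veins)."""
    h, w = image.shape[:2]
    th, tw = target_size
    pad_h, pad_w = max(0, th - h), max(0, tw - w)
    extended = np.pad(image, ((0, pad_h), (0, pad_w), (0, 0)), mode="wrap")
    return extended[:th, :tw]
```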
The combined image generated by this process may not correspond completely (i.e., be true to life) to the original piece of marble from which the physical sample used to generate the digital material was cut. However, the combined image may still be more true to life than if the image included in the digital material were distorted by stretching, scaling, and/or other image manipulations to make it fit the received dimensions. In this way, material sheets may be printed on demand at any dimensions from any digital material without the distortion that results from stretching, scaling, and/or other image manipulations.
After the electronic device uses the printer to print the material sheet, the flow may proceed to operation 1080 where the electronic device may use a laminator to apply one or more coatings to the material sheet according to specifications included in the digital material. For example, the coatings may be one or more glossy coatings or one or more matte coatings. Printing the image from the digital material or the adjusted image on an adhesive-backed material sheet and laminating and/or otherwise applying one or more glossy coatings, matte coatings, and/or other coatings may cause the printed adhesive-backed material sheet to interact with light like (i.e., look and feel like) the physical sample from which the digital material was generated.
In various examples, this example method 1000 may be implemented as a group of interrelated software modules or components that perform various functions discussed herein. These software modules or components may be executed within a cloud network and/or edge network and/or by one or more computing devices, such as the rendering computing device 105.
Although the example method 1000 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
For example, the above illustrates and describes one or more coatings being applied to the material sheet via lamination. However, it is understood that this is an example. In other implementations, other processes may be used to apply the coatings. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
The rendering computing device 105 may use one or more stored digital materials and/or one or more digital materials (which may be obtained from a repository computing device) to generate one or more visualizations for one or more websites. The rendering computing device 105 may do so by generating a model, applying a digital material to the model, adjusting the model for web output conditions, using a virtual camera to capture the adjusted model for web output, and providing the virtual camera captured adjusted model to the web server 1212, which may serve the virtual camera captured adjusted model via one or more PDPs.
At operation 1310, an electronic device (such as the rendering computing device 105) may generate a model. At operation 1320, the electronic device may apply a digital material to the model.
At operation 1330, the electronic device may adjust the model for web output conditions. For example, the electronic device may adjust the model for the color space of the average monitor used to view web output. At operation 1340, the electronic device may use a virtual camera to capture the adjusted model for web output.
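As one concrete instance of such an adjustment, a linear-light render might be encoded into the sRGB color space common to consumer monitors; the transfer function below is standard, though whether a given implementation applies it at this stage is an assumption.

```python
import numpy as np

def linear_to_srgb(linear):
    """Encode a linear-light render into sRGB, the color space of a typical
    monitor used to view web output. Expects values in [0, 1]."""
    linear = np.clip(linear, 0.0, 1.0)
    return np.where(linear <= 0.0031308,
                    12.92 * linear,
                    1.055 * np.power(linear, 1.0 / 2.4) - 0.055)
```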
At operation 1350, the electronic device may provide the virtual camera captured adjusted model to one or more web servers. For example, the web server may serve one or more PDPs that correspond to the digital material and the electronic device may provide the virtual camera captured adjusted model to a web server to use for the PDP that corresponds to the digital material.
In various examples, this example method 1300 may be implemented as a group of interrelated software modules or components that perform various functions discussed herein. These software modules or components may be executed within a cloud network and/or edge network and/or by one or more computing devices, such as the rendering computing device 105.
Although the example method 1300 is illustrated and described as including particular operations performed in a particular order, it is understood that this is an example. In various implementations, various orders of the same, similar, and/or different operations may be performed without departing from the scope of the present disclosure.
For example, the above describes the electronic device as generating the model and supplying the virtual camera captured adjusted model to one or more web servers. However, it is understood that this is an example. In some implementations, the electronic device may receive a request from a web server for a model to generate and may then generate the model and supply such to the requesting web server. Various configurations are possible and contemplated without departing from the scope of the present disclosure.
The scanner computing device 102 may be any kind of electronic device. Examples of such devices include, but are not limited to, one or more desktop computing devices, laptop computing devices, server computing devices, mobile computing devices, tablet computing devices, set top boxes, digital video recorders, televisions, displays, wearable devices, smart phones, digital media players, and so on. The scanner computing device 102 may include one or more processors 1451 and/or other processing units and/or controllers, one or more non-transitory storage media 1452 (which may take the form of, but is not limited to, a magnetic storage medium; optical storage medium; magneto-optical storage medium; read only memory; random access memory; erasable programmable memory; flash memory; and so on), one or more communication units 1454, input and/or output components 1453 (such as one or more keyboards, mice, track pads, touch screens, displays, printers, microphones, graphics cards, and so on), and/or other components. The processor 1451 may execute instructions stored in the non-transitory storage medium 1452 to perform various functions. Such functions may include scanning physical materials, creating digital materials, communicating with the rendering computing device 105 and/or the repository computing device 104 via one or more networks 350, controlling and/or interfacing with one or more scanners 1463, and so on. Alternatively and/or additionally, the scanner computing device 102 may involve one or more memory allocations configured to store at least one executable asset and one or more processor allocations configured to access the one or more memory allocations and execute the at least one executable asset to instantiate one or more processes and/or services, such as one or more rendering services, web services, printing services, ordering services, and so on.
Similarly, rendering computing device 105 may be any kind of electronic device. Examples of such devices include, but are not limited to, one or more desktop computing devices, laptop computing devices, server computing devices, mobile computing devices, tablet computing devices, set top boxes, digital video recorders, televisions, displays, wearable devices, smart phones, digital media players, and so on. The rendering computing device 105 may include one or more processors 1459 and/or other processing units and/or controllers, one or more non-transitory storage media 1462 (which may take the form of, but is not limited to, a magnetic storage medium; optical storage medium; magneto-optical storage medium; read only memory; random access memory; erasable programmable memory; flash memory; and so on), one or more communication units 1460, input and/or output components 1461 (such as one or more keyboards, mice, track pads, touch screens, displays, printers, microphones, graphics cards, and so on), and/or other components. The processor 1459 may execute instructions stored in the non-transitory storage medium 1462 to perform various functions. Such functions may include generating one or more models, applying images from digital materials to one or more surfaces of the models, communicating with the scanner computing device 102 and/or the repository computing device 104 via one or more networks 350, and so on. Alternatively and/or additionally, the rendering computing device 105 may involve one or more memory allocations configured to store at least one executable asset and one or more processor allocations configured to access the one or more memory allocations and execute the at least one executable asset to instantiate one or more processes and/or services, such as one or more services, and so on.
Likewise, the repository computing device 104 may be any kind of electronic device. Examples of such devices include, but are not limited to, one or more desktop computing devices, laptop computing devices, server computing devices, mobile computing devices, tablet computing devices, set top boxes, digital video recorders, televisions, displays, wearable devices, smart phones, digital media players, and so on. The repository computing device 104 may include one or more processors 1455 and/or other processing units and/or controllers, one or more non-transitory storage media 1456 (which may take the form of, but is not limited to, a magnetic storage medium; optical storage medium; magneto-optical storage medium; read only memory; random access memory; erasable programmable memory; flash memory; and so on), one or more communication units 1458, input and/or output components 1457 (such as one or more keyboards, mice, track pads, touch screens, displays, printers, microphones, graphics cards, and so on), and/or other components. The processor 1455 may execute instructions stored in the non-transitory storage medium 1456 to perform various functions. Such functions may include storing digital materials, communicating with the scanner computing device 102 and/or the rendering computing device 105 via one or more networks 350, and so on. Alternatively and/or additionally, the repository computing device 104 may involve one or more memory allocations configured to store at least one executable asset and one or more processor allocations configured to access the one or more memory allocations and execute the at least one executable asset to instantiate one or more processes and/or services, such as one or more services, and so on.
As used herein, the term “computing resource” (along with other similar terms and phrases, including, but not limited to, “computing device” and “computing network”) refers to any physical and/or virtual electronic device or machine component, or set or group of interconnected and/or communicably coupled physical and/or virtual electronic devices or machine components, suitable to execute or cause to be executed one or more arithmetic or logical operations on digital data.
Example computing resources contemplated herein include, but are not limited to: single or multi-core processors; single or multi-thread processors; purpose-configured co-processors (e.g., graphics processing units, motion processing units, sensor processing units, and the like); volatile or non-volatile memory; application-specific integrated circuits; field-programmable gate arrays; input/output devices and systems and components thereof (e.g., keyboards, mice, track pads, generic human interface devices, video cameras, microphones, speakers, and the like); networking appliances and systems and components thereof (e.g., routers, switches, firewalls, packet shapers, content filters, network interface controllers or cards, access points, modems, and the like); embedded devices and systems and components thereof (e.g., system(s)-on-chip, Internet-of-Things devices, and the like); industrial control or automation devices and systems and components thereof (e.g., programmable logic controllers, programmable relays, supervisory control and data acquisition controllers, discrete controllers, and the like); vehicle or aeronautical control devices systems and components thereof (e.g., navigation devices, safety devices or controllers, security devices, and the like); corporate or business infrastructure devices or appliances (e.g., private branch exchange devices, voice-over internet protocol hosts and controllers, end-user terminals, and the like); personal electronic devices and systems and components thereof (e.g., cellular phones, tablet computers, desktop computers, laptop computers, wearable devices); personal electronic devices and accessories thereof (e.g., peripheral input devices, wearable devices, implantable devices, medical devices and so on); and so on. It may be appreciated that the foregoing examples are not exhaustive.
Example information can include, but may not be limited to: personal identification information (e.g., names, social security numbers, telephone numbers, email addresses, physical addresses, driver's license information, passport numbers, and so on); identity documents (e.g., driver's licenses, passports, government identification cards or credentials, and so on); protected health information (e.g., medical records, dental records, and so on); financial, banking, credit, or debt information; third-party service account information (e.g., usernames, passwords, social media handles, and so on); encrypted or unencrypted files; database files; network connection logs; shell history; filesystem files; libraries, frameworks, and binaries; registry entries; settings files; executing processes; hardware vendors, versions, and/or information associated with a computing resource; installed applications or services; password hashes; idle time, uptime, and/or last login time; document files; product renderings; presentation files; image files; customer information; configuration files; passwords; and so on. It may be appreciated that the foregoing examples are not exhaustive.
The foregoing examples and description of instances of purpose-configured software, whether accessible via API as a request-response service or an event-driven service, or configured as a self-contained data processing service, are understood as not exhaustive. In other words, a person of skill in the art may appreciate that the various functions and operations of a system such as described herein can be implemented in a number of suitable ways, developed leveraging any number of suitable libraries, frameworks, first or third-party APIs, local or remote databases (whether relational, NoSQL, or other architectures, or a combination thereof), programming languages, software design techniques (e.g., procedural, asynchronous, event-driven, and so on, or any combination thereof), and so on. The various functions described herein can be implemented in the same manner (as one example, leveraging a common language and/or design), or in different ways. In many embodiments, functions of a system described herein are implemented as discrete microservices, which may be containerized or executed/instantiated leveraging a discrete virtual machine, that are only responsive to authenticated API requests from other microservices of the same system. Similarly, each microservice may be configured to provide data output and receive data input across an encrypted data channel. In some cases, each microservice may be configured to store its own data in a dedicated encrypted database; in others, microservices can store encrypted data in a common database; whether such data is stored in tables shared by multiple microservices or whether microservices may leverage independent and separate tables/schemas can vary from embodiment to embodiment. As a result of these described and other equivalent architectures, it may be appreciated that a system such as described herein can be implemented in a number of suitable ways. For simplicity of description, many embodiments herein are described in reference to an implementation in which discrete functions of the system are implemented as discrete microservices. It is appreciated that this is merely one possible implementation.
As described herein, the term “processor” refers to any software and/or hardware-implemented data processing device or circuit physically and/or structurally configured to instantiate one or more classes or objects that are purpose-configured to perform specific transformations of data including operations represented as code and/or instructions included in a program that can be stored within, and accessed from, a memory. This term is meant to encompass a single processor or processing unit, multiple processors, multiple processing units, analog or digital circuits, or other suitably configured computing element or combination of elements.
Although the figures of the present application depict projection onto two-dimensional surfaces, it is understood that these figures are examples. In various implementations, one or more images may be projected onto various three-dimensional surfaces without departing from the scope of the present disclosure. In some such implementations, image blending and/or other processes may be performed on the one or more images to prevent and/or reduce distortion related to projecting onto the various three-dimensional surfaces. Image blending in the context of three-dimensional mapping may involve treating multiple projectors as a single output source for the represented image. Calculations for how the represented image conforms to the various surfaces, lighting conditions, and so on when treating the multiple projectors as a single output source may be performed by one or more processors at runtime. This may enable delivery of a unified experience.
In various implementations, a system may include a scanner computing device that creates digital materials by scanning physical samples of design materials, the digital materials including images of the physical samples and texture stacks that include texture maps representing characteristics of the physical samples; projectors configured to project onto projector zones of a demonstration area; and a rendering computing device. The rendering computing device may receive user input indicating a selection of a digital material of the digital materials for a demonstration surface of the demonstration area and, in response to the user input, dynamically generate a model of the demonstration area from a viewpoint, apply the digital material to a model surface of the model that corresponds to the demonstration surface, adjust the model for a demonstration area condition, split the model into split images for the projector zones, and provide the split images to the projectors that correspond to the projector zones.
In some examples, the scanner computing device may recognize that a physical sample of the physical samples includes a repeating pattern and replicate the pattern instead of scanning the entire physical sample. In various examples, the demonstration area condition may include a lighting condition. In some such examples, the rendering computing device may adjust for the lighting condition by compensating for the demonstration area having a different lighting than the model.
In a number of examples, the rendering computing device may perform ray tracing to add lighting to the model. In some examples, the texture maps may include an alpha texture map, a displacement texture map, a roughness texture map, a metallic texture map, a normal texture map, and a base color texture map. In various examples, the scanner computing device may be operable to perform a metallic/roughness workflow and a specular/glossy workflow as part of scanning the physical samples.
In some embodiments, a system may include projectors configured to project onto projector zones of a demonstration area and a rendering computing device. The rendering computing device may receive user input indicating a selection of a digital material of multiple digital materials for a demonstration surface of the demonstration area, the digital materials including images of physical samples and texture stacks that include texture maps representing characteristics of the physical samples, and, in response to the user input, dynamically generate a model of the demonstration area from a viewpoint; apply the digital material to a model surface of the model that corresponds to the demonstration surface; split the model into split images for the projector zones; and provide the split images to the projectors that correspond to the projector zones.
In various examples, the rendering computing device may use a UV map included in the digital material to maintain scale of the digital material when applying the digital material to the model surface. In some examples, the rendering computing device may use a gaming engine to generate the model of the demonstration area and apply the digital material to the model surface.
In a number of examples, the projector zones may overlap. In various such examples, the rendering computing device may reduce intensity of pixels in overlapping areas where the projector zones overlap.
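Such intensity reduction is commonly done with complementary blend ramps; a minimal one-dimensional sketch, assuming a horizontal overlap of known pixel width, follows.

```python
import numpy as np

def blend_gain(width, overlap, side="right"):
    """Per-column gain for one projector: full intensity in its exclusive
    region, ramping linearly across the overlap. A projector's ramp and its
    neighbor's mirrored ramp sum to 1.0 everywhere in the overlap."""
    gain = np.ones(width, dtype=np.float32)
    ramp = np.linspace(1.0, 0.0, overlap, dtype=np.float32)
    if side == "right":        # overlap on this projector's right edge
        gain[-overlap:] = ramp
    else:                      # overlap on this projector's left edge
        gain[:overlap] = ramp[::-1]
    return gain

# Applied per channel to a split image of width w (names hypothetical):
# blended = split_image * blend_gain(w, 200)[None, :, None]
```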
In some examples, the rendering computing device may provide projector alignment grids to the projectors for mapping pixel density to the demonstration area. In various such examples, the projector alignment grids may include a blend zone.
In a number of embodiments, a system may include a demonstration area, a non-transitory storage medium that stores instructions, and a processor. The processor may execute the instructions to receive first user input indicating a first selection of a first digital material from multiple digital materials for a first demonstration surface of the demonstration area, the digital materials including images of physical samples and texture stacks that include texture maps representing characteristics of the physical samples; receive second user input indicating a second selection of a second digital material from the multiple digital materials for a second demonstration surface of the demonstration area; assign projectors to projector zones that each correspond to a portion of the demonstration area; and, in response to the first and second user inputs, dynamically apply the first digital material to a first model surface of a model of the demonstration area, the first model surface corresponding to the first demonstration surface; apply the second digital material to a second model surface of the model of the demonstration area, the second model surface corresponding to the second demonstration surface; split the model into split images for the projector zones; and provide the split images to the projectors that correspond to the projector zones.
In some examples, the system may further include a surface treatment applied to the first demonstration surface that prevents or reduces light bounce from one or more of the projectors. In a number of such examples, the surface treatment may include crushed carbon nanotubes applied in a polymer.
In various examples, the processor may be operable to receive third user input indicating to order a product associated with a respective physical sample associated with the first digital material. In some such examples, the first digital material may include metadata that specifies a stock keeping unit for the product.
In a number of examples, the physical samples may include at least one of fabrics, floorings, paints, wood, paneling, stone, brick, carpet, laminates, countertops, cabinets, wallpaper, molding, tiles, and housewares.
Although the above illustrates and describes a number of embodiments, it is understood that these are examples. In various implementations, various techniques of individual embodiments may be combined without departing from the scope of the present disclosure.
As described above and illustrated in the accompanying figures, the present disclosure relates to a visualization system that uses digital materials digitized using PBR by scanning physical samples of design materials. The digital materials have texture stacks that include texture maps representing characteristics of the physical samples and/or metadata specifying such characteristics. The visualization system, in response to user input, dynamically generates a model of a demonstration area from a viewpoint, applies one or more digital materials to one or more model surfaces of the model that correspond to one or more demonstration surfaces of the demonstration area, adjusts the model for a demonstration area condition, splits the model into split images for projector zones, and provides the split images to projectors that correspond to the projector zones. The result is a projection into the demonstration area that is true to life of the physical samples of the design materials.
In the present disclosure, the methods disclosed may be implemented as sets of instructions or software readable by a device. Further, it is understood that the specific order or hierarchy of steps in the methods disclosed is an example of a sample approach. In other embodiments, the specific order or hierarchy of steps in the method can be rearranged while remaining within the disclosed subject matter. The accompanying method claims present elements of the various steps in a sample order, and are not necessarily meant to be limited to the specific order or hierarchy presented.
The described disclosure may be provided as a computer program product, or software, that may include a non-transitory machine-readable medium having stored thereon instructions, which may be used to program a computer system (or other electronic devices) to perform a process according to the present disclosure. A non-transitory machine-readable medium includes any mechanism for storing information in a form (e.g., software, processing application) readable by a machine (e.g., a computer). The non-transitory machine-readable medium may take the form of, but is not limited to, a magnetic storage medium (e.g., floppy diskette, video cassette, and so on); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; and so on.
The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the described embodiments. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the described embodiments. Thus, the foregoing descriptions of the specific embodiments described herein are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. It will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.