Systems and methods to display rendered images

Information

  • Patent Grant
  • Patent Number
    9,483,853
  • Date Filed
    Friday, October 26, 2012
  • Date Issued
    Tuesday, November 1, 2016
Abstract
A computer-implemented method to display a rendered image is described. A base image is obtained. A rendered image is obtained. The rendered image is matched to a location on the base image. The rendered image is overlaid onto the base image at the location to generate a set of layered images. The set of layered images is displayed.
Description
BACKGROUND

The use of computer systems and computer-related technologies continues to increase at a rapid pace. This increased use of computer systems has influenced the advances made to computer-related technologies. Indeed, computer systems have increasingly become an integral part of the business world and the activities of individual consumers. For example, computers have opened up an entire industry of internet shopping. In many ways, online shopping has changed the way consumers purchase products. However, in some cases, consumers may avoid shopping online. For example, it may be difficult for a consumer to know if they will look good in and/or with a product without seeing themselves in and/or with the product. In many cases, this challenge may deter a consumer from purchasing a product online. Therefore, improving the online shopping experience may be desirable. In various situations, it may be desirable to render an image.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings illustrate a number of exemplary embodiments and are a part of the specification. Together with the following description, these drawings demonstrate and explain various principles of the instant disclosure.



FIG. 1 is a block diagram illustrating one embodiment of an environment in which the present systems and methods may be implemented;



FIG. 2 is a block diagram illustrating one example of the rendered image display module;



FIG. 3 is a block diagram illustrating one example of how a rendered image and a base image may be layered;



FIG. 4 is a block diagram illustrating one example of a rendered object movie based on multiple sets of layered images;



FIG. 5 is a block diagram illustrating one example of an image file that includes more than one rendered image;



FIG. 6 is a block diagram illustrating one example of how an image file may be used to generate a rendered object movie;



FIG. 7 is a block diagram illustrating one example of how multiple image files may be used to generate a rendered object movie;



FIG. 8 is a block diagram illustrating one example of how a single image file with multiple sets of rendered images may be used to generate a rendered object movie;



FIG. 9 is a block diagram illustrating one example of how a single image file with multiple sets of rendered images may be used to generate disparate rendered object movies;



FIG. 10 is a flow diagram illustrating one embodiment of a method to display rendered images;



FIG. 11 is a flow diagram illustrating another embodiment of a method to display rendered images; and



FIG. 12 depicts a block diagram of a computer system suitable for implementing the present systems and methods.





While the embodiments described herein are susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. However, the exemplary embodiments described herein are not intended to be limited to the particular forms disclosed. Rather, the instant disclosure covers all modifications, equivalents, and alternatives falling within the scope of the appended claims.


SUMMARY

A computer-implemented method to display a rendered image is described. A base image is obtained. A rendered image is obtained. The rendered image is matched to a location on the base image. The rendered image is overlaid onto the base image at the location to generate a set of layered images. The set of layered images is displayed.


The rendered image may include a rendered version of at least a portion of the base image. The location on the base image may correspond to the at least a portion of the base image.


The set of layered images may include the base image and the rendered image. In some cases, the rendered image may cover the at least a portion of the base image.


In some configurations, an image file having a plurality of rendered images may be obtained. In one example, obtaining a rendered image may include determining a rendered image from the plurality of rendered images based on the base image. In another example, obtaining a rendered image may include selecting a pixel area of the image file that corresponds to the determined rendered image. In some cases, each of the plurality of rendered images may correspond to the same rendering scheme.


A computing device configured to display a rendered image is also described. The computing device includes instructions stored in memory that is in electronic communication with a processor. The instructions being executable by the processor to obtain a base image. The instructions also being executable by the processor to obtain a rendered image. The instructions additionally being executable by the processor to match the rendered image to a location on the base image. The instructions also being executable by the processor to overlay the rendered image onto the base image at the location to generate a set of layered images. The instructions further being executable by the processor to display the set of layered images.


A computer-program product to display a rendered image is additionally described. The computer-program product includes a non-transitory computer-readable storage medium that stores computer executable instructions. The instructions being executable by a processor to obtain a base image. The instructions also being executable by the processor to obtain a rendered image. The instructions additionally being executable by the processor to match the rendered image to a location on the base image. The instructions also being executable by the processor to overlay the rendered image onto the base image at the location to generate a set of layered images. The instructions further being executable by the processor to display the set of layered images.


DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS

In various situations, it may be desirable to render an image. For example, it may be desirable for a user to virtually try-on a product so that the user may see what they would look like in and/or with the product. In this example, a base image of the user may be rendered to include the product. Typically, only a portion of the base image may be rendered, leaving the remainder of the image un-rendered. In these situations, it may be desirable to store only the rendered portion of the image as a rendered image. Alternatively, it may be desirable to store the region that includes the rendered portion of the image as the rendered image. This may allow the full rendered image to be created by overlaying the rendered image onto the base image at the location where the rendered image was rendered from.


In some configurations, this may allow for efficient storing, transferring, and/or processing of rendered images. This may be particularly beneficial when the same base image is used for displaying multiple renderings (as in the case of virtually trying-on products, for example). For instance, the same base image may be quickly modified to show different renderings by overlaying the appropriate rendered image. In the case that rendered images are obtained from a server, the reduced size of each rendered image and the reuse of the same base image may allow the user to virtually try-on and shop simultaneously (e.g., the products are presented to the user in the context of a rendered virtual try-on).


In one example, a user may desire to virtually try-on a pair of glasses (so that the user may see how the glasses look on their face/head, for example). The user may provide one or more images of the user's face/head (e.g., the base images). In some configurations, each base image may be rendered with a virtual pair of glasses. In one example, a region that includes the rendered pair of glasses (a strip of a user's head that includes the region around the eyes and ears of the user, for example) may be stored as the rendered image. In some configurations, multiple rendered images may be formed. For example, a rendered image may be generated for each pair of glasses. As another example, a rendered image may be generated for each position on the face that a particular pair of glasses may be positioned at. This may allow a user to change glasses and/or change the position of a pair of glasses on their face simply by overlaying the proper rendered image on the base image.
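For illustration only, the pre-rendered strips in such an arrangement might be keyed by head orientation, glasses style, and glasses position for quick lookup. The following sketch uses hypothetical identifiers and file paths that are not part of the disclosure:

```python
# Hypothetical lookup table of pre-rendered strips for a virtual try-on.
# The keys, labels, and file paths below are illustrative assumptions only.
rendered_strips = {
    # (head orientation index, glasses style, glasses position) -> strip file
    (0, "style_a", "position_1"): "strips/head00_style_a_pos1.png",
    (0, "style_a", "position_2"): "strips/head00_style_a_pos2.png",
    (0, "style_b", "position_1"): "strips/head00_style_b_pos1.png",
}

def strip_for(orientation, style, position):
    """Return the pre-rendered strip to overlay for the requested combination."""
    return rendered_strips[(orientation, style, position)]
```

Changing glasses or glasses position then reduces to looking up a different strip and overlaying it on the unchanged base image.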


Referring now to the figures, FIG. 1 is a block diagram illustrating one embodiment of an environment 100 in which the present systems and methods may be implemented. In one embodiment, a device 102 may be coupled to a memory 108 through an interface 106. In some configurations, the device 102 may include a rendered image display module 104 that implements the systems and methods described herein.


In some configurations, the rendered image display module 104 may access data stored in the memory 108. For example, the rendered image display module 104 may obtain one or more base images 110 and one or more rendered images 112 from the memory 108. In some configurations, the rendered image display module 104 may match a rendered image 112 with a particular portion of a base image 110. In some configurations, the rendered image display module 104 may overlay the rendered image 112 onto the base image 110 to cover the particular portion of the base image 110. In some configurations, the rendered image display module 104 may display the layered images (e.g., the rendered image 112 and the base image 110). In some cases, displaying the set of layered images creates the illusion that the layered images are a single rendered image. The rendered image display module 104 is discussed in greater detail below.


In one embodiment, the memory 108 may be local to the device 102. For example, the memory 108 may be within the device 102 and/or directly attached to the device 102. In another embodiment, the memory 108 may be remote from the device 102. For example, the memory 108 may be hosted by a server. For instance, the device 102 may access the one or more base images 110 and the one or more rendered images 112 through a network (via the server, for example). In yet another embodiment, a first memory 108 may be local to the device 102 and a second memory 108 may be remote from the device 102. In this case, the rendered image display module 104 may obtain a base image 110 and/or a rendered image 112 from either memory 108. Examples of interface 106 include system buses, serial AT attachment (SATA) interfaces, universal serial bus (USB) interfaces, wired networks, wireless networks, cellular networks, satellite networks, etc. In some cases, the interface 106 may be the internet.



FIG. 2 is a block diagram illustrating one example of the rendered image display module 104-a. The rendered image display module 104-a may be one example of the rendered image display module 104 illustrated in FIG. 1. In some configurations, the rendered image display module 104-a may include an obtaining module 202, a matching module 204, a layering module 206, and a displaying module 208.


In one embodiment, the obtaining module 202 may obtain one or more base images 110 and one or more rendered images 112. In one example, the obtaining module 202 may obtain a base image 110. In some cases, a base image 110 may be associated with a perspective (e.g., an x, y, z orientation). In one example, the obtaining module 202 may obtain a rendered image 112 based on the perspective of the base image 110. For example, the obtaining module 202 may obtain a rendered image 112 that corresponds (has the same perspective, for example) to a base image 110.


In one example, the obtaining module 202 may obtain a base image 110 and/or a rendered image 112 by initiating a read access to the memory 108. In another example, the obtaining module 202 may obtain a base image 110 and/or a rendered image 112 by receiving the base image 110 and/or the rendered image 112 from a server. In some cases, the obtaining module 202 may request the base image 110 and/or the rendered image 112 from the server and may receive the base image 110 and/or the rendered image 112 from the server in response to the request.


In one embodiment, the obtaining module 202 may obtain one or more image files (image files 502 as illustrated in FIG. 5, 6, 7, 8, or 9, for example). For example, the obtaining module 202 may obtain the one or more image files from a server as described previously. In some cases, an image file may include multiple rendered images 112. In these cases, the obtaining module 202 may determine one or more rendered images 112 from the multiple rendered images 112. For example, the obtaining module 202 may determine a rendered image 112 that corresponds to a base image 110, that corresponds to a particular rendering scheme (associated with a particular set of rendered images, for example), and/or that meets a predetermined criterion. In some cases, the obtaining module 202 may obtain a rendered image 112 by selecting the determined rendered image 112 from the image file 502. In one example, the obtaining module 202 may extract a pixel area from the image file 502 to obtain the rendered image 112.
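As a rough sketch of how such an obtaining step might look, the snippet below fetches an image file from a server and selects one rendered image from it by pixel area using the Pillow library. The URL, the full-width strip layout, and the 80-pixel strip height are assumptions made for illustration, not requirements of the disclosure:

```python
import io
import urllib.request

from PIL import Image

# Hypothetical URL and strip layout; neither is fixed by the disclosure.
IMAGE_FILE_URL = "https://example.com/renderings/style_a_position_1.png"

def obtain_rendered_image(frame_index, strip_height=80):
    """Fetch an image file holding a plurality of rendered images and select the
    pixel area corresponding to the rendered image for the given base image."""
    with urllib.request.urlopen(IMAGE_FILE_URL) as response:
        image_file = Image.open(io.BytesIO(response.read()))
    top = frame_index * strip_height
    # Each rendered image is assumed to occupy one full-width horizontal strip.
    return image_file.crop((0, top, image_file.width, top + strip_height))
```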


In one embodiment, the matching module 204 may match a rendered image 112 to a location in its corresponding base image 110. For example, the matching module 204 may match the determined rendered image 112 to the location in the base image 110 that the rendered image 112 was rendered from. In some cases, the rendered image 112 may be a rendered version of a portion of the base image. In these cases, the rendered image 112 may be matched with the corresponding un-rendered portion of the base image 110. In some cases, the matching module 204 may match the rendered image 112 with the base image 110 so that a layering of the images may create the illusion of a single rendered image.


In one embodiment, the layering module 206 may overlay a rendered image 112 onto a base image 110. In some configurations, the rendered image 112 may cover a portion of the base image 110. In one example, the rendered image 112 may be a rendered version of the portion of the base image 110 that it is covering. For instance, the layering module 206 may overlay the rendered image 112 onto the base image 110 based on the determined matching from the matching module 204. In some cases, the layering module 206 may overlay multiple rendered images 112 onto a single base image 110.
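One possible realization of this layering step, using the Pillow library, is sketched below. The file paths and the matched pixel location are illustrative assumptions:

```python
from PIL import Image

def overlay_rendered_image(base_path, rendered_path, match_xy):
    """Overlay a rendered image onto the base image at the matched pixel location.
    Paths and the matched location are assumptions for illustration."""
    base = Image.open(base_path).convert("RGBA")
    rendered = Image.open(rendered_path).convert("RGBA")
    layered = base.copy()                        # leave the base image intact for reuse
    layered.paste(rendered, match_xy, rendered)  # the rendered image's alpha acts as the mask
    return layered
```

Keeping the base image unchanged allows the same base to be reused with other rendered images, which is the efficiency benefit described above.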


In one embodiment, the displaying module 208 may display the overlaying rendered image 112 and the underlying base image 110. In some cases, the layered images create the illusion that the base image 110 has been rendered.



FIG. 3 is a block diagram illustrating one example of how a rendered image 112 and a base image 110 may be layered. In some configurations, the rendered image 112 may cover a portion of the base image 306. In some cases, the rendered image 112 may be a rendered version of a portion of the base image 306. In some configurations, the rendered image 112 may be matched with the base image 110 so that the rendered image 112 covers the portion of the base image 306 that it was rendered from. In one example, the pixel location 302 of the rendered image 112 may correspond to a rendered version of the pixel location 304 of the base image 110. In this example, the rendered image 112 may be matched so that the rendered image 112 may cover the portion of the base image 306. For example, pixel location 302 may be matched to pixel location 304 so that the rendered image 112 may cover the portion of the base image 306.


In some configurations, the rendered image 112 may be layered over the base image 110 based on the determined matching. For example, the rendered image 112 may be overlaid onto the base image 110 so that the rendered image 112 covers the portion of the base image 306 that the rendered image 112 was rendered from. Although FIG. 3 illustrates the case of layering one rendered image 112 onto a base image 110, it is noted that more than one rendered image 112 may be layered onto a single base image 110. In some configurations, one or more rendered images 112 that are layered with a base image 110 may be referred to as a set of layered images 308. In some cases, the set of layered images 308 may create the illusion of a single rendered image. For example, the set of layered images 308 may provide the same effect as if the entire base image 110 were rendered as a single image.



FIG. 4 is a block diagram illustrating one example of a rendered object movie 402 based on multiple sets of layered images 308. In some configurations, each set of layered images 308 may be an example of the set of layered images 308 illustrated in FIG. 3.


In some configurations, different base images 110 may depict an object in different orientations. For example, a first base image 110-a-1 may depict the object in a first orientation, the second base image 110-a-2 may depict the object in a second orientation, and the nth base image 110-a-n may depict the object in an nth orientation. In one example, these base images 110-a-n may be ordered to create an object movie of the object. For example, the base images 110-a-n may be ordered so that the orientation of the object appears to rotate as two or more consecutive base images 110 are cycled.


In some configurations, each rendered image 112 may correspond to a particular base image 110. For example, a first rendered image 112-a-1 may correspond to the first base image 110-a-1, the second rendered image 112-a-2 may correspond to the second base image 110-a-2, and the nth rendered image 112-a-n may correspond to the nth base image 110-a-n. For instance, each rendered image 112 may be a rendered version of a portion of its corresponding base image 110. In some configurations, a first set of layered images 308-a-1 may include the first rendered image 112-a-1 and the first base image 110-a-1, a second set of layered images 308-a-2 may include the second rendered image 112-a-2 and the second base image 110-a-2, and an nth set of layered images 308-a-n may include the nth rendered image 112-a-n and the nth base image 110-a-n. In some cases, the first set of layered images 308-a-1, the second set of layered images 308-a-2, and the nth set of layered images 308-a-n may be ordered to form a rendered object movie 402. In some configurations, the rendered object movie 402 may create the illusion of an object movie of rendered base images.


In one example, the systems and methods described herein may be used for virtually trying-on glasses. In this example, the base images 110 may be images of a user's face/head in various orientations. For instance, a first base image 110-a-1 may be an image of a user's face/head in a left facing orientation, a second base image 110-a-2 may be an image of a user's face/head in a center facing orientation, and a third base image 110-a-3 may be an image of a user's face/head in a right facing orientation. In this example, each rendered image 112 may be a rendered version of a portion of a base image 110. For example, the first rendered image 112-a-1 may include a portion of the user's face/head with a rendered set of glasses (rendered based on the left facing orientation of the first base image 110-a-1, for example), the second rendered image 112-a-2 may include a portion of the user's face/head with the rendered set of glasses (rendered based on the center facing orientation of the second base image 110-a-2), and the third rendered image 112-a-3 may include a portion of the user's face/head with the rendered set of glasses (rendered based on the right facing orientation of the third base image 110-a-3). In one example, the portion of the user's face/head may be a strip that includes the eye and ear regions of the user's face/head. In some configurations, each rendered image 112 may be layered with its corresponding base image 110 to create sets of layered images 308. In one example, these sets of layered images 308 may be ordered to create a rendered object movie 402 that allows the rendered glasses to be viewed on the user's face/head from the various orientations of the underlying base images 110.


In a similar example, seventeen base images 110 may be used as the basis for a rendered object movie 402 (of a virtual try-on, for example). In one embodiment, base images 110-a-1 through 110-a-8 may be images of the user's face/head in progressively less left facing orientations, the ninth base image 110-a-9 may be an image of the user's face/head in a center facing orientation, and base images 110-a-10 through 110-a-17 may be images of the user's face/head in progressively more right facing orientations. As described above, rendered images 112 based on these base images 110 and layered on their corresponding base images 110 may enable the efficient creation of a rendered object movie 402 (e.g., virtual try-on). In some cases, more (or fewer) than seventeen base images 110 may be used.
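As an illustration of how such an ordered sequence of layered frames might be navigated, the sketch below maps a horizontal drag position to a frame index so that consecutive frames appear to rotate the head. The drag-to-frame mapping is an assumption, not something the disclosure specifies:

```python
def frame_for_drag(drag_fraction, frame_count):
    """Map a horizontal drag position (0.0 = far left, 1.0 = far right) onto the
    index of the set of layered images to display. This mapping is illustrative."""
    index = int(drag_fraction * (frame_count - 1))
    return max(0, min(frame_count - 1, index))

# With seventeen layered frames, a drag halfway across selects the center-facing frame.
assert frame_for_drag(0.5, 17) == 8
```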



FIG. 5 is a block diagram illustrating one example of an image file 502 that includes more than one rendered image 112. In some configurations, the image file 502 may include at least one rendered image 112 for each base image 110. In one example, the first rendered image 112-a-1 may be a rendered version of a portion of the first base image 110-a-1, the second rendered image 112-a-2 may be a rendered version of a portion of the second base image 110-a-2, and the nth rendered image 112-a-n may be a rendered version of a portion of the nth base image 110-a-n. In this example, the first rendered image 112-a-1 may be a rendered image 112 for the first base image 110-a-1, the second rendered image 112-a-2 may be a rendered image 112 for the second base image 110-a-2, and the nth rendered image 112-a-n may be a rendered image 112 for the nth base image 110-a-n.


In some cases, disparate pixel areas of the image file 502 may correspond to disparate rendered images 112. For example, a first pixel area 504-a-1 of the image file 502 may correspond to a first rendered image 112-a-1, a second pixel area 504-a-2 of the image file 502 may correspond to a second rendered image 112-a-2, and an nth pixel area 504-a-n of the image file 502 may correspond to an nth rendered image 112-a-n.


In one example, each pixel area 504 may be a strip that is 80 pixels high and as wide as the image file 502. In this example, the first rendered image 112-a-1 may be obtained by selecting pixel rows 0-80 from the image file 502, the second rendered image 112-a-2 may be obtained by selecting pixel rows 80-160 from the image file 502, and so forth. In other examples, the dimensions and/or location of a pixel area 504 in the image file 502 may differ. In some configurations, one or more configuration settings may define the dimensions and/or location of a pixel area 504 that should be selected to obtain a particular rendered image 112. In one example, the location of a particular rendered image 112 may be consistent across multiple image files 502.
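Under the example layout above (full-width strips, 80 pixels high, stacked top to bottom), the rendered images might be recovered from an image file 502 roughly as follows. This is a sketch using the Pillow library, and the layout parameters are configuration assumptions:

```python
from PIL import Image

def split_image_file(image_file_path, strip_height=80):
    """Split an image file into its rendered images, assuming one full-width strip
    of strip_height pixel rows per rendered image, stacked top to bottom."""
    image_file = Image.open(image_file_path)
    return [image_file.crop((0, top, image_file.width, top + strip_height))
            for top in range(0, image_file.height, strip_height)]
```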


In some configurations, an image file 502 may include one or more sets of rendered images. In one example, a set of rendered images may allow an entire object movie to be rendered with a particular rendering scheme (e.g., a first style of glasses in a first position, a second style of glasses in a first position, a first style of glasses in a second position, etc.). For example, each rendered image 112 in the set of rendered images may apply the particular rendering scheme to its corresponding base image 110. In some cases, a rendered object movie 402 may be rendered based on a single image file 502.



FIG. 6 is a block diagram illustrating one example of how an image file 502 may be used to generate a rendered object movie 402. In one embodiment, each of the rendered images 112 in the image file 502 is part of a set of rendered images 112-a.


In some cases, a set of rendered images 112-a may include a rendered image 112 for each base image 110 that applies a particular rendering scheme to each base image 110. For example, the first rendered image 112-a-1 may apply a first rendering scheme to the first base image 110-a-1, the second rendered image 112-a-2 may apply the first rendering scheme to the second base image 110-a-2, and the nth rendered image 112-a-n may apply the first rendering scheme to the nth base image 110-a-n. In one example, the first rendering scheme may be to render a particular pair of glasses in a particular position. In some cases this may allow for the creation of a rendered object movie 402 with a particular rendering scheme using a single image file 502.


In some configurations, a first set of layered images 308-a-1 may be obtained by covering a portion of the first base image 110-a-1 with the first rendered image 112-a-1, a second set of layered images 308-a-2 may be obtained by covering a portion of the second base image 110-a-2 with the second rendered image 112-a-2, and an nth set of layered images 308-a-n may be obtained by covering a portion of the nth base image 110-a-n with the nth rendered image 112-a-n. In some configurations, the sets of layered images may be ordered to create a rendered object movie 402. In the case that the rendered images 112 are from the same set of rendered images 112-a, the rendered object movie 402 may be an object movie with a single rendering scheme.
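A minimal sketch of generating such a rendered object movie from a single image file is shown below, combining the strip selection and overlay steps described above. The file layout, strip height, and match points are illustrative assumptions:

```python
from PIL import Image

def movie_from_image_file(base_paths, image_file_path, match_points, strip_height=80):
    """Build a rendered object movie from a single image file: the i-th strip of the
    file is overlaid onto the i-th base image at its matched location."""
    image_file = Image.open(image_file_path).convert("RGBA")
    frames = []
    for i, (base_path, (x, y)) in enumerate(zip(base_paths, match_points)):
        strip = image_file.crop((0, i * strip_height, image_file.width,
                                 (i + 1) * strip_height))
        frame = Image.open(base_path).convert("RGBA")
        frame.paste(strip, (x, y), strip)  # the strip's alpha channel acts as the mask
        frames.append(frame)
    return frames
```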


In one example, the rendered object movie 402 may be an object movie of a virtual try-on of a pair of glasses. In this scenario, the first rendered image 112-a-1 may include a particular pair of glasses rendered in a first orientation at a particular position on the user's face/head, the second rendered image 112-a-2 may include the particular pair of glasses rendered in a second orientation at the particular position on the user's face/head, and the nth rendered image 112-a-n may include the particular pair of glasses rendered in an nth orientation at the particular position on the user's face/head. In some configurations, the rendered images 112 for a rendered object movie 402 of a particular product at a particular position may be stored in and/or extracted from a single image file 502.



FIG. 7 is a block diagram illustrating one example of how multiple image files 502 may be used to generate a rendered object movie 402. In some cases, a first image file 502-a-1 may include a first set of rendered images 112-a (e.g., a first pair of glasses rendered in a first position) and a second image file 502-a-2 may include a second set of rendered images 112-b (e.g., the first pair of glasses rendered in a second position, a second pair of glasses rendered in a first position, the second pair of glasses rendered in a second position, etc.).


In one example, a first rendering scheme may be selected (e.g., a first pair of glasses rendered in a first position). For example, a first set of rendered images 112-a may be used to generate each set of layered images 308. For instance, a first set of layered images 308-a-1 may be obtained by overlaying the first rendered image 112-a-1 from the first image file 502-a-1 onto the first base image 110-a-1. In some cases, the rendering scheme may be changed. For example, a second rendering scheme may subsequently be selected. In that case, a second set of rendered images 112-b may be used to generate each set of layered images 308. For instance, a second set of layered images 308-b-2 may be obtained by overlaying the second rendered image 112-b-2 from the second image file 502-a-2 onto the second base image 110-a-2, and an nth set of layered images 308-b-n may be obtained by overlaying the nth rendered image from the second image file 502-a-2 onto the nth base image 110-a-n. As illustrated in FIG. 7, the first set of layered images 308-a-1 is based on the first set of rendered images 112-a and the second and nth sets of layered images 308-b-2, 308-b-n are based on the second set of rendered images 112-b. However, before the change in rendering schemes, each of the sets of layered images 308 may be based on the first set of rendered images 112-a and after the change in rendering schemes, each of the sets of layered images 308 may be based on the second set of rendered images 112-b.
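As a simple illustration of this mid-movie change of rendering schemes, the helper below chooses which image file supplies the rendered strip for a given frame. The parameter names and the single switch point are assumptions made for illustration:

```python
def image_file_for_frame(frame_index, switch_at_frame, first_image_file, second_image_file):
    """Choose which image file supplies the rendered strip for a frame after a
    mid-movie change of rendering scheme. All parameters are illustrative."""
    return first_image_file if frame_index < switch_at_frame else second_image_file

# Frames shown before the change use the first file; frames shown after use the second.
```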


For example, a user may change glasses positions during the display of the rendered object movie 402. In this scenario, previous to the change in position, the sets of layered images 308 may be based on the first glasses position and after the change in position, the sets of layered images 308 may be based on the second glasses position.



FIG. 8 is a block diagram illustrating one example of how a single image file 502-b with multiple sets of rendered images may be used to generate a rendered object movie 402. In some configurations, an image file 502-b may include a first set of rendered images 112-a (e.g., a first pair of glasses rendered in a first position) and a second set of rendered images 112-b (e.g., the first pair of glasses rendered in a second position, a second pair of glasses rendered in a first position, the second pair of glasses rendered in a second position, etc.).


In one example, a first rendering scheme may be selected (e.g., a first pair of glasses rendered in a first position). For example, a first set of rendered images 112-a may be used to generate each set of layered images 308. For instance, a first set of layered images 308-a-1 may be obtained by overlaying the first rendered image 112-a-1 onto the first base image 110-a-1. In some cases, the rendering scheme may be changed. For example, a second rendering scheme may subsequently be selected. In that case, a second set of rendered images 112-b may be used to generate each set of layered images 308. For instance, a second set of layered images 308-b-2 may be obtained by overlaying the second rendered image 112-b-2 onto the second base image 110-a-2, and an nth set of layered images 308-b-n may be obtained by overlaying the nth rendered image onto the nth base image 110-a-n. As illustrated in FIG. 8, the first set of layered images 308-a-1 is based on the first set of rendered images 112-a and the second and nth sets of layered images 308-b-2, 308-b-n are based on the second set of rendered images 112-b. However, before the change in rendering schemes, each of the sets of layered images 308 may be based on the first set of rendered images 112-a and after the change in rendering schemes, each of the sets of layered images 308 may be based on the second set of rendered images 112-b. In this example, multiple sets of rendered images may be included in the same image file 502-b.



FIG. 9 is a block diagram illustrating one example of how a single image file 502 with multiple sets of rendered images may be used to generate disparate rendered object movies 402. In some configurations, the image file 502-b may include a first set of rendered images 112-a (e.g., a first pair of glasses rendered in a first position) and a second set of rendered images 112-b (e.g., the first pair of glasses rendered in a second position, a second pair of glasses rendered in a first position, the second pair of glasses rendered in a second position, etc.).


In some configurations, a first set of layered images 308-a-1 may be obtained by overlaying the first rendered image 112-a-1 from the first set of rendered images 112-a onto the first base image 110-a-1, a second set of layered images 308-a-2 may be obtained by overlaying the second rendered image 112-a-2 from the first set of rendered images 112-a onto the second base image 110-a-2, and an nth set of layered images 308-a-n may be obtained by overlaying the nth rendered image from the first set of rendered images 112-a onto the nth base image 110-a-n. Similarly, a first set of layered images 308-b-1 may be obtained by overlaying the first rendered image 112-b-1 from the second set of rendered images 112-b onto the first base image 110-a-1, a second set of layered images 308-b-2 may be obtained by overlaying the second rendered image 112-b-2 from the second set of rendered images 112-b onto the second base image 110-a-2, and an nth set of layered images 308-b-n may be obtained by overlaying the nth rendered image from the second set of rendered images 112-b onto the nth base image 110-a-n. As illustrated in FIG. 9, the same base images 110 may be used for disparate rendered object movies 402 because the particular rendered images 112 are overlaid onto the base images 110.


In one example, the first set of rendered images 112-a may have a first rendering scheme (e.g., a first pair of glasses in a first position) and the second set of rendered images 112-b may have a second rendering scheme (e.g., a first pair of glasses in a second position). In this example, the first rendered object movie 402-a-1 may be a virtual try-on for the first pair of glasses in the first position and the second rendered object movie 402-a-2 may be a virtual try-on for the first pair of glasses in the second position.



FIG. 10 is a flow diagram illustrating one embodiment of a method 1000 to display rendered images. In some configurations, the method 1000 may be implemented by the rendered image display module 104 illustrated in FIG. 1 or 2.


At step 1002, a base image may be obtained. At step 1004, a rendered image may be obtained. At step 1006, the rendered image may be matched to a location on the base image. For example, the rendered image may be matched to a pixel location on the base image so that the rendered image covers the portion of the base image that the rendered image was rendered from. At step 1008, the rendered image may be overlaid onto the base image at the location to generate a set of layered images. At step 1010, the set of layered images may be displayed.
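For illustration, a compact function mirroring steps 1002-1010 might look as follows, echoing the overlay sketch above. The file paths and the match location are assumptions, and Pillow is used for the image handling:

```python
from PIL import Image

def display_layered_images(base_path, rendered_path, match_xy):
    """Sketch of method 1000: obtain a base image (1002) and a rendered image (1004),
    match the rendered image to a location on the base image (1006), overlay it there
    (1008), and display the set of layered images (1010)."""
    base = Image.open(base_path).convert("RGBA")          # step 1002
    rendered = Image.open(rendered_path).convert("RGBA")  # step 1004
    layered = base.copy()                                 # steps 1006 and 1008
    layered.paste(rendered, match_xy, rendered)
    layered.show()                                        # step 1010
    return layered
```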



FIG. 11 is a flow diagram illustrating another embodiment of a method 1100 to display rendered images. In some configurations, the method 1100 may be implemented by the rendered image display module 104 illustrated in FIG. 1 or 2.


At step 1102, a base image having a first perspective may be obtained. At step 1104, an image file having a plurality of rendered images may be obtained. At step 1106, a rendered image may be selected from the plurality of rendered images based at least upon the first perspective. For example, the rendered image may be rendered based on the first perspective. At step 1108, the rendered image may be matched to a location on the base image. At step 1110, the rendered image may be overlaid onto the base image at the location to generate a set of layered images. At step 1112, the set of layered images may be displayed.
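A small sketch of the selection in step 1106 is shown below: the rendered image whose perspective matches the base image's perspective is chosen from the plurality in the image file. The perspective labels and their ordering are hypothetical:

```python
def select_rendered_image_index(base_perspective, image_file_perspectives):
    """Sketch of step 1106: pick which rendered image in the image file matches the
    base image's perspective. The perspective labels are illustrative assumptions."""
    return image_file_perspectives.index(base_perspective)

# For example, with rendered images ordered left, center, right in the image file:
assert select_rendered_image_index("center", ["left", "center", "right"]) == 1
```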



FIG. 12 depicts a block diagram of a computer system 1210 suitable for implementing the present systems and methods. Computer system 1210 includes a bus 1212 which interconnects major subsystems of computer system 1210, such as a central processor 1214, a system memory 1217 (typically RAM, but which may also include ROM, flash RAM, or the like), an input/output controller 1218, an external audio device, such as a speaker system 1220 via an audio output interface 1222, an external device, such as a display screen 1224 via display adapter 1226, a keyboard 1232 (interfaced with a keyboard controller 1233) (or other input device), multiple USB devices 1292 (interfaced with a USB controller 1291), and a storage interface 1234. Also included are a mouse 1246 (or other point-and-click device) and a network interface 1248 (coupled directly to bus 1212).


Bus 1212 allows data communication between central processor 1214 and system memory 1217, which may include read-only memory (ROM) or flash memory (neither shown), and random access memory (RAM) (not shown), as previously noted. The RAM is generally the main memory into which the operating system and application programs are loaded. The ROM or flash memory can contain, among other code, the Basic Input-Output system (BIOS) which controls basic hardware operation such as the interaction with peripheral components or devices. For example, the rendered image display module 104 to implement the present systems and methods may be stored within the system memory 1217. Applications resident with computer system 1210 are generally stored on and accessed via a non-transitory computer readable medium, such as a hard disk drive (e.g., fixed disk 1244) or other storage medium. Additionally, applications can be in the form of electronic signals modulated in accordance with the application and data communication technology when accessed via interface 1248.


Storage interface 1234, as with the other storage interfaces of computer system 1210, can connect to a standard computer readable medium for storage and/or retrieval of information, such as a fixed disk drive 1244. Fixed disk drive 1244 may be a part of computer system 1210 or may be separate and accessed through other interface systems. Network interface 1248 may provide a direct connection to a remote server via a direct network link to the Internet via a POP (point of presence). Network interface 1248 may provide such connection using wireless techniques, including digital cellular telephone connection, Cellular Digital Packet Data (CDPD) connection, digital satellite data connection, or the like.


Many other devices or subsystems (not shown) may be connected in a similar manner (e.g., document scanners, digital cameras, and so on). Conversely, all of the devices shown in FIG. 12 need not be present to practice the present systems and methods. The devices and subsystems can be interconnected in different ways from that shown in FIG. 12. The operation of a computer system such as that shown in FIG. 12 is readily known in the art and is not discussed in detail in this application. Code to implement the present disclosure can be stored in a non-transitory computer-readable medium such as one or more of system memory 1217 or fixed disk 1244. The operating system provided on computer system 1210 may be iOS®, MS-DOS®, MS-WINDOWS®, OS/2®, UNIX®, Linux®, or another known operating system.


While the foregoing disclosure sets forth various embodiments using specific block diagrams, flowcharts, and examples, each block diagram component, flowchart step, operation, and/or component described and/or illustrated herein may be implemented, individually and/or collectively, using a wide range of hardware, software, or firmware (or any combination thereof) configurations. In addition, any disclosure of components contained within other components should be considered exemplary in nature since many other architectures can be implemented to achieve the same functionality.


The process parameters and sequence of steps described and/or illustrated herein are given by way of example only and can be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed. The various exemplary methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps in addition to those disclosed.


Furthermore, while various embodiments have been described and/or illustrated herein in the context of fully functional computing systems, one or more of these exemplary embodiments may be distributed as a program product in a variety of forms, regardless of the particular type of computer-readable media used to actually carry out the distribution. The embodiments disclosed herein may also be implemented using software modules that perform certain tasks. These software modules may include script, batch, or other executable files that may be stored on a computer-readable storage medium or in a computing system. In some embodiments, these software modules may configure a computing system to perform one or more of the exemplary embodiments disclosed herein.


The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the present systems and methods and their practical applications, to thereby enable others skilled in the art to best utilize the present systems and methods and various embodiments with various modifications as may be suited to the particular use contemplated.


Unless otherwise noted, the terms “a” or “an,” as used in the specification and claims, are to be construed as meaning “at least one of.” In addition, for ease of use, the words “including” and “having,” as used in the specification and claims, are interchangeable with and have the same meaning as the word “comprising.” In addition, the term “based on” as used in the specification and the claims is to be construed as meaning “based at least upon.”

Claims
  • 1. A computer-implemented method to display a rendered image, comprising: obtaining, via a processor, a first base image and a second base image; obtaining, via the processor, an object image; overlaying, via the processor, the object image over the first base image; obtaining, via the processor, a rendered image, wherein obtaining the rendered image comprises extracting a portion of the first base image overlaid with the object image; identifying, via the processor, a point of interest in the portion of the first base image in the rendered image; identifying, via the processor, a matching point of interest in the second base image; matching, via the processor, the identified point of interest in the portion of the first base image of the rendered image to the identified point of interest in the second base image; overlaying, via the processor, the rendered image onto the second base image at the matching points of interest to generate a first set of layered images; and displaying, via the processor, the first set of layered images.
  • 2. The method of claim 1, wherein the first set of layered images comprises the rendered image covering at least a portion of the second base image.
  • 3. The method of claim 1, further comprising, obtaining an image file having a plurality of rendered images.
  • 4. The method of claim 3, wherein obtaining a rendered image comprises: determining a rendered image from the plurality of rendered images based on the second base image; and selecting a pixel area of the image file that corresponds to the determined rendered image.
  • 5. The method of claim 3, wherein each of the plurality of rendered images corresponds to a rendering scheme.
  • 6. The method of claim 1, wherein the first set of layered images creates an illusion that the overlaying rendered image and the underlying base image are a single image.
  • 7. The method of claim 1, wherein the first set of layered images is displayed as part of an object movie.
  • 8. The method of claim 7, further comprising: obtaining, via a processor, a third base image, wherein the first base image, the second base image, and the third base image are images of a user, wherein the third base image displays the user in an orientation different than an orientation of the user in the second base image; overlaying, via the processor, a second rendered image onto the third base image to generate a second set of layered images; and selectively displaying, via the processor, the first and second set of layered images to create the object movie.
  • 9. A computing device configured to display a rendered image, comprising: a processor; memory in electronic communication with the processor; instructions stored in the memory, the instructions being executable by the processor to: obtain a first base image and a second base image; obtain an object image; overlay the object image over the first base image; obtain a rendered image, obtaining the rendered image comprising extracting a portion of the first base image overlaid with the object image; identify a point of interest in the portion of the first base image in the rendered image; identify a matching point of interest in the second base image; match the identified point of interest in the portion of the first base image of the rendered image to the identified point of interest in the second base image; overlay the rendered image onto the second base image at the matching points of interest to generate a first set of layered images; and display the first set of layered images.
  • 10. The computing device of claim 9, wherein the first set of layered images comprises the rendered image covering at least a portion of the second base image.
  • 11. The computing device of claim 9, wherein the instructions are further executable by the processor to obtain an image file having a plurality of rendered images.
  • 12. The computing device of claim 11, wherein the instructions executable by the processor to obtain a rendered image comprise instructions executable by the processor to: determine a rendered image from the plurality of rendered images based on the base image; and select a pixel area of the image file that corresponds to the determined rendered image.
  • 13. The computing device of claim 11, wherein each of the plurality of rendered images corresponds to a rendering scheme.
  • 14. The computing device of claim 9, wherein the first set of layered images creates an illusion that the overlaying rendered image and the underlying base image are a single image.
  • 15. The computing device of claim 9, wherein the first set of layered images is displayed as part of an object movie.
  • 16. The computing device of claim 9, wherein the instructions are further executable by the processor to: obtain a third base image, wherein the first base image, the second base image, and the third base image are images of a user, wherein the third base image displays the user in an orientation different than an orientation of the user in the second base image; overlay a second rendered image onto the third base image to generate a second set of layered images; and selectively display the first and second set of layered images to form an object movie.
  • 17. A computer-program product to display a rendered image, the computer-program product comprising a non-transitory computer-readable storage medium that stores computer executable instructions, the instructions being executable by a processor to: obtain a first base image and a second base image; obtain an object image; overlay the object image over the first base image; obtain a rendered image, obtaining the rendered image comprising extracting a portion of the first base image overlaid with the object image; identify a point of interest in the portion of the first base image in the rendered image; identify a matching point of interest in the second base image; match the identified point of interest in the portion of the first base image of the rendered image to the identified point of interest in the second base image; overlay the rendered image onto the second base image at the matching points of interest to generate a first set of layered images; and display the first set of layered images.
  • 18. A computer-implemented method to display a rendered image, comprising: obtaining, via a processor, a first base image and a second base image; obtaining, via the processor, an object image; overlaying, via the processor, the object image over the first base image; obtaining, via the processor, a rendered image, wherein obtaining the rendered image comprises extracting data representing a portion of the first base image overlaid with the object image; matching, via the processor, an identified point of interest in the portion of the first base image of the rendered image to a corresponding identified point of interest in the second base image; overlaying, via the processor, the rendered image, using the extracted data, onto the second base image at the matching points of interest to generate a set of layered images; and displaying, via the processor, the set of layered images.
  • 19. The computer-implemented method of claim 18, wherein the extracting data representing a portion of the first base image overlaid with the object image comprises extracting, via the processor, a portion of the rendered image of the first base image overlaid with the object image.
  • 20. The method of claim 18, further comprising: obtaining, via a processor, a third base image, wherein the first base image, the second base image, and the third base image are images of a user, wherein the third base image displays the user in an orientation different than an orientation of the user in the second base image; overlaying, via the processor, a second rendered image onto the third base image to generate a second set of layered images; and selectively displaying, via the processor, the first and second set of layered images to create the object movie.
  • 21. A computer-implemented method to display a rendered image, comprising: obtaining, via a processor, a first base image and a second base image; obtaining, via the processor, an object image; overlaying, via the processor, the object image over the first base image; obtaining, via the processor, a rendered image, wherein obtaining the rendered image comprises extracting data representing a portion of the first base image overlaid with the object image; overlaying, via the processor, the rendered image, onto the second base image to generate a first set of layered images; obtaining, via a processor, a third base image, wherein the first base image, the second base image, and the third base image are images of a user, wherein the third base image displays the user in an orientation different than an orientation of the user in the second base image; overlaying, via the processor, a second rendered image onto the third base image to generate a second set of layered images; and selectively displaying, via the processor, the first and second set of layered images to create the object movie.
RELATED APPLICATIONS

This application claims priority to U.S. Provisional Application No. 61/650,983, entitled SYSTEMS AND METHODS TO VIRTUALLY TRY-ON PRODUCTS, and filed on May 23, 2012, which is incorporated herein in its entirety by this reference.

8286083 Barrus et al. Oct 2012 B2
8289317 Harvill Oct 2012 B2
8290769 Taub et al. Oct 2012 B2
8295589 Ofek et al. Oct 2012 B2
8300900 Lai et al. Oct 2012 B2
8303113 Esser et al. Nov 2012 B2
8307560 Tulin Nov 2012 B2
8330801 Wang et al. Dec 2012 B2
8346020 Guntur Jan 2013 B2
8351649 Medioni et al. Jan 2013 B1
8355079 Zhang et al. Jan 2013 B2
8372319 Liguori et al. Feb 2013 B2
8374422 Roussel Feb 2013 B2
8385646 Lang et al. Feb 2013 B2
8391547 Huang et al. Mar 2013 B2
8411092 Sheblak et al. Apr 2013 B2
8433157 Nijim et al. Apr 2013 B2
8447099 Wang et al. May 2013 B2
8459792 Wilson Jun 2013 B2
8605942 Takeuchi Dec 2013 B2
8605989 Rudin et al. Dec 2013 B2
8743051 Moy et al. Jun 2014 B1
8813378 Grove Aug 2014 B2
20010023413 Fukuma et al. Sep 2001 A1
20010026272 Feld et al. Oct 2001 A1
20010051517 Strietzel Dec 2001 A1
20020010655 Kjallstrom Jan 2002 A1
20020105530 Waupotitsch et al. Aug 2002 A1
20020149585 Kacyra et al. Oct 2002 A1
20030001835 Dimsdale et al. Jan 2003 A1
20030030904 Huang Feb 2003 A1
20030071810 Shoov et al. Apr 2003 A1
20030110099 Trajkovic et al. Jun 2003 A1
20030112240 Cerny Jun 2003 A1
20040004633 Perry et al. Jan 2004 A1
20040090438 Alliez et al. May 2004 A1
20040217956 Besl et al. Nov 2004 A1
20040223631 Waupotitsch et al. Nov 2004 A1
20040257364 Basler Dec 2004 A1
20050053275 Stokes Mar 2005 A1
20050063582 Park et al. Mar 2005 A1
20050111705 Waupotitsch et al. May 2005 A1
20050128211 Berger et al. Jun 2005 A1
20050162419 Kim Jul 2005 A1
20050190264 Neal Sep 2005 A1
20050208457 Fink et al. Sep 2005 A1
20050226509 Maurer et al. Oct 2005 A1
20060012748 Periasamy et al. Jan 2006 A1
20060017887 Jacobson et al. Jan 2006 A1
20060067573 Parr et al. Mar 2006 A1
20060127852 Wen Jun 2006 A1
20060161474 Diamond et al. Jul 2006 A1
20060212150 Sims Sep 2006 A1
20060216680 Buckwalter et al. Sep 2006 A1
20070013873 Jacobson et al. Jan 2007 A9
20070104360 Huang et al. May 2007 A1
20070127848 Kim et al. Jun 2007 A1
20070160306 Ahn et al. Jul 2007 A1
20070183679 Moroto et al. Aug 2007 A1
20070233311 Okada et al. Oct 2007 A1
20070262988 Christensen Nov 2007 A1
20080084414 Rosel et al. Apr 2008 A1
20080112610 Israelsen et al. May 2008 A1
20080136814 Chu et al. Jun 2008 A1
20080152200 Medioni et al. Jun 2008 A1
20080162695 Muhn et al. Jul 2008 A1
20080163344 Yang Jul 2008 A1
20080170077 Sullivan et al. Jul 2008 A1
20080201641 Xie Aug 2008 A1
20080219589 Jung et al. Sep 2008 A1
20080240588 Tsoupko-Sitnikov et al. Oct 2008 A1
20080246759 Summers Oct 2008 A1
20080271078 Gossweiler et al. Oct 2008 A1
20080278437 Barrus et al. Nov 2008 A1
20080278633 Tsoupko-Sitnikov et al. Nov 2008 A1
20080279478 Tsoupko-Sitnikov et al. Nov 2008 A1
20080280247 Sachdeva et al. Nov 2008 A1
20080294393 Laake et al. Nov 2008 A1
20080297503 Dickinson et al. Dec 2008 A1
20080310757 Wolberg et al. Dec 2008 A1
20090010507 Geng Jan 2009 A1
20090040216 Ishiyama Feb 2009 A1
20090123037 Ishida May 2009 A1
20090129402 Moller et al. May 2009 A1
20090132371 Strietzel et al. May 2009 A1
20090135176 Snoddy et al. May 2009 A1
20090135177 Strietzel et al. May 2009 A1
20090144173 Mo et al. Jun 2009 A1
20090153552 Fidaleo et al. Jun 2009 A1
20090153553 Kim et al. Jun 2009 A1
20090153569 Park et al. Jun 2009 A1
20090154794 Kim et al. Jun 2009 A1
20090184960 Carr et al. Jul 2009 A1
20090185763 Park et al. Jul 2009 A1
20090219281 Maillot Sep 2009 A1
20090279784 Arcas et al. Nov 2009 A1
20090304270 Bhagavathy et al. Dec 2009 A1
20090310861 Lang et al. Dec 2009 A1
20090316945 Akansu Dec 2009 A1
20090316966 Marshall et al. Dec 2009 A1
20090324030 Frinking et al. Dec 2009 A1
20090324121 Bhagavathy et al. Dec 2009 A1
20100030578 Siddique et al. Feb 2010 A1
20100134487 Lai et al. Jun 2010 A1
20100138025 Morton et al. Jun 2010 A1
20100141893 Altheimer et al. Jun 2010 A1
20100145489 Esser et al. Jun 2010 A1
20100166978 Nieminen Jul 2010 A1
20100179789 Sachdeva et al. Jul 2010 A1
20100191504 Esser et al. Jul 2010 A1
20100198817 Esser et al. Aug 2010 A1
20100209005 Rudin et al. Aug 2010 A1
20100277476 Johanson et al. Nov 2010 A1
20100293192 Suy et al. Nov 2010 A1
20100293251 Suy et al. Nov 2010 A1
20100302275 Saldanha et al. Dec 2010 A1
20100329568 Gamliel et al. Dec 2010 A1
20110001791 Kirshenboim et al. Jan 2011 A1
20110025827 Shpunt et al. Feb 2011 A1
20110026606 Bhagavathy et al. Feb 2011 A1
20110026607 Bhagavathy et al. Feb 2011 A1
20110029561 Slaney et al. Feb 2011 A1
20110040539 Szymczyk et al. Feb 2011 A1
20110043540 Fancher et al. Feb 2011 A1
20110043610 Ren et al. Feb 2011 A1
20110071804 Xie Mar 2011 A1
20110075916 Knothe et al. Mar 2011 A1
20110096832 Zhang et al. Apr 2011 A1
20110102553 Corcoran et al. May 2011 A1
20110115786 Mochizuki May 2011 A1
20110148858 Ni et al. Jun 2011 A1
20110157229 Ni et al. Jun 2011 A1
20110158394 Strietzel Jun 2011 A1
20110166834 Clara Jul 2011 A1
20110188780 Wang et al. Aug 2011 A1
20110208493 Altheimer et al. Aug 2011 A1
20110211816 Goedeken et al. Sep 2011 A1
20110227923 Mariani et al. Sep 2011 A1
20110227934 Sharp Sep 2011 A1
20110229659 Reynolds Sep 2011 A1
20110229660 Reynolds Sep 2011 A1
20110234581 Eikelis et al. Sep 2011 A1
20110234591 Mishra et al. Sep 2011 A1
20110249136 Levy Oct 2011 A1
20110262717 Broen et al. Oct 2011 A1
20110267578 Wilson Nov 2011 A1
20110279634 Periyannan et al. Nov 2011 A1
20110292034 Corazza et al. Dec 2011 A1
20110293247 Bhagavathy et al. Dec 2011 A1
20110304912 Broen et al. Dec 2011 A1
20120002161 Altheimer et al. Jan 2012 A1
20120008090 Altheimer et al. Jan 2012 A1
20120013608 Ahn et al. Jan 2012 A1
20120016645 Altheimer et al. Jan 2012 A1
20120021835 Keller et al. Jan 2012 A1
20120038665 Strietzel Feb 2012 A1
20120075296 Wegbreit et al. Mar 2012 A1
20120079377 Goosens Mar 2012 A1
20120082432 Ackley et al. Apr 2012 A1
20120114184 Barcons-Palau et al. May 2012 A1
20120114251 Solem et al. May 2012 A1
20120121174 Bhagavathy et al. May 2012 A1
20120130524 Clara et al. May 2012 A1
20120133640 Chin et al. May 2012 A1
20120133850 Broen et al. May 2012 A1
20120147324 Marin et al. Jun 2012 A1
20120158369 Bachrach et al. Jun 2012 A1
20120162218 Kim et al. Jun 2012 A1
20120166431 Brewington et al. Jun 2012 A1
20120170821 Zug et al. Jul 2012 A1
20120176380 Wang et al. Jul 2012 A1
20120183202 Wei et al. Jul 2012 A1
20120183204 Aarts et al. Jul 2012 A1
20120183238 Savvides et al. Jul 2012 A1
20120192401 Pavlovskaia et al. Aug 2012 A1
20120206610 Wang et al. Aug 2012 A1
20120219195 Wu et al. Aug 2012 A1
20120224629 Bhagavathy et al. Sep 2012 A1
20120229758 Marin et al. Sep 2012 A1
20120256906 Ross et al. Oct 2012 A1
20120263437 Barcons-Palau et al. Oct 2012 A1
20120288015 Zhang et al. Nov 2012 A1
20120294369 Bhagavathy et al. Nov 2012 A1
20120294530 Bhaskaranand Nov 2012 A1
20120299914 Kilpatrick et al. Nov 2012 A1
20120306874 Nguyen et al. Dec 2012 A1
20120307074 Bhagavathy et al. Dec 2012 A1
20120314023 Barcons-Palau et al. Dec 2012 A1
20120320153 Barcons-Palau et al. Dec 2012 A1
20120323581 Strietzel et al. Dec 2012 A1
20130027657 Esser et al. Jan 2013 A1
20130070973 Saito et al. Mar 2013 A1
20130088490 Rasmussen et al. Apr 2013 A1
20130187915 Lee et al. Jul 2013 A1
20130201187 Tong et al. Aug 2013 A1
20130271451 Tong et al. Oct 2013 A1
Foreign Referenced Citations (68)
Number Date Country
10007705 Sep 2001 DE
0092364 Oct 1983 EP
0359596 Mar 1990 EP
0994336 Apr 2000 EP
1011006 Jun 2000 EP
1136869 Sep 2001 EP
1138253 Oct 2001 EP
0444902 Jun 2002 EP
1450201 Aug 2004 EP
1728467 Dec 2006 EP
1154302 Aug 2009 EP
2966038 Apr 2012 FR
2449855 Dec 2008 GB
2003345857 Dec 2003 JP
2004272530 Sep 2004 JP
2005269022 Sep 2005 JP
20000028583 May 2000 KR
200000051217 Aug 2000 KR
20040097200 Nov 2004 KR
20080086945 Sep 2008 KR
20100050052 May 2010 KR
WO 9300641 Jan 1993 WO
WO 9604596 Feb 1996 WO
WO 9740342 Oct 1997 WO
WO 9740960 Nov 1997 WO
WO 9813721 Apr 1998 WO
WO 9827861 Jul 1998 WO
WO 9827902 Jul 1998 WO
WO 9835263 Aug 1998 WO
WO 9852189 Nov 1998 WO
WO 9857270 Dec 1998 WO
WO 9956942 Nov 1999 WO
WO 9964918 Dec 1999 WO
WO 0000863 Jan 2000 WO
WO 0016683 Mar 2000 WO
WO 0045348 Aug 2000 WO
WO 0049919 Aug 2000 WO
WO 0062148 Oct 2000 WO
WO 0064168 Oct 2000 WO
WO 0123908 Apr 2001 WO
WO 0132074 May 2001 WO
WO 0135338 May 2001 WO
WO 0161447 Aug 2001 WO
WO 0167325 Sep 2001 WO
WO 0174553 Oct 2001 WO
WO 0178630 Oct 2001 WO
WO 0188654 Nov 2001 WO
WO 0207845 Jan 2002 WO
WO 0241127 May 2002 WO
WO 03079097 Sep 2003 WO
WO 03084448 Oct 2003 WO
WO 2007012261 Feb 2007 WO
WO 2007017751 Feb 2007 WO
WO 2007018017 Feb 2007 WO
WO 2008009355 Jan 2008 WO
WO 2008009423 Jan 2008 WO
WO 2008135178 Nov 2008 WO
WO 2009023012 Feb 2009 WO
WO 2009043941 Apr 2009 WO
WO 2010039976 Apr 2010 WO
WO 2010042990 Apr 2010 WO
WO 2011012743 Feb 2011 WO
WO 2011095917 Aug 2011 WO
WO 2011134611 Nov 2011 WO
WO 2011147649 Dec 2011 WO
WO 2012051654 Apr 2012 WO
WO 2012054972 May 2012 WO
WO 2012054983 May 2012 WO
Non-Patent Literature Citations (30)
PCT International Search Report for PCT International Patent Application No. PCT/US2013/042504, mailed Aug. 19, 2013.
PCT International Search Report for PCT International Patent Application No. PCT/US2013/042509, mailed Sep. 2, 2013.
PCT International Search Report for PCT International Patent Application No. PCT/US2013/042514, mailed Aug. 30, 2013.
PCT International Search Report for PCT International Patent Application No. PCT/US2013/042517, mailed Aug. 29, 2013.
PCT International Search Report for PCT International Patent Application No. PCT/US2013/042512, mailed Sep. 6, 2013.
PCT International Search Report for PCT International Patent Application No. PCT/US2013/042529, mailed Sep. 17, 2013.
PCT International Search Report for PCT International Patent Application No. PCT/US2013/042525, mailed Sep. 17, 2013.
PCT International Search Report for PCT International Patent Application No. PCT/US2013/042520, mailed Sep. 27, 2013.
Tracker, Tracker Help, Nov. 2009.
3D Morphable Model Face Animation, http://www.youtube.com/watch?v=nice6NYb_WA, Apr. 20, 2006.
Visionix 3D iView, Human Body Measurement Newsletter, vol. 1., No. 2, Sep. 2005, pp. 2 and 3.
Blaise Aguera y Arcas demos Photosynth, May 2007. Ted.com, http://www.ted.com/talks/blaise_aguera_y_arcas_demos_photosynth.html.
ERC Technology Leads to Eyeglass “Virtual Try-on” System, Apr. 20, 2012, http://showcase.erc-assoc.org/accomplishments/microelectronic/imsc6-eyeglass.htm.
PCT International Search Report for PCT International Patent Application No. PCT/US2012/068174, mailed Mar. 7, 2013.
Information about Related Patents and Patent Applications, see the section below having the same title.
U.S. Appl. No. 13/837,039, filed Mar. 15, 2013, Systems and Methods for Generating a 3-D Model of a Virtual Try-On Product.
U.S. Appl. No. 13/775,785, filed Feb. 25, 2013, Systems and Methods for Adjusting a Virtual Try-On.
U.S. Appl. No. 13/775,764, filed Feb. 25, 2013, Systems and Methods for Feature Tracking.
U.S. Appl. No. 13/774,995, filed Feb. 22, 2013, Systems and Methods for Scaling a Three-Dimensional Model.
U.S. Appl. No. 13/774,985, filed Feb. 22, 2013, Systems and Methods for Generating a 3-D Model of a Virtual Try-On Product.
U.S. Appl. No. 13/774,983, filed Feb. 22, 2013, Systems and Methods for Generating a 3-D Model of a User for a Virtual Try-On Product.
U.S. Appl. No. 13/774,978, filed Feb. 22, 2013, Systems and Methods for Efficiently Processing Virtual 3-D Data.
U.S. Appl. No. 13/774,958, filed Feb. 22, 2013, Systems and Methods for Rendering Virtual Try-On Products.
U.S. Appl. No. 13/706,909, filed Dec. 6, 2012, Systems and Methods for Obtaining a Pupillary Distance Measurement Using a Mobile Computing Device.
U.S. Appl. No. 13/662,118, filed Oct. 23, 2012, Systems and Methods to Display Rendered Images.
Sinha et al., GPU-based Video Feature Tracking and Matching, http://frahm.web.unc.edu/files/2014/01/GPU-based-Video-Feature-Tracking-And-Matching.pdf, May 2006.
Dror et al., Recognition of Surface Reflectance Properties from a Single Image under Unknown Real-World Illumination, IEEE, Proceedings of the IEEE Workshop on Identifying Objects Across Variations in Lighting: Psychophysics & Computation, Dec. 2011.
Simonite, 3-D Models Created by a Cell Phone, Mar. 23, 2011, url: http://www.technologyreview.com/news/423386/3-d-models-created-by-a-cell-phone/.
Fidaleo, Model-Assisted 3D Face Reconstruction from Video, AMFG'07 Analysis and Modeling of Faces and Gestures Lecture Notes in Computer Science vol. 4778, 2007, pp. 124-138.
Garcia-Mateos, Estimating 3D facial pose in video with just three points, CVPRW '08 Computer Vision and Pattern Recognition Workshops, 2008.
Related Publications (1)
Number Date Country
20130342575 A1 Dec 2013 US
Provisional Applications (1)
Number Date Country
61650983 May 2012 US