The present invention relates to a 3D garment fitting method, in particular, to a 3D garment fitting method applying 3D deformation.
With the popularity of e-commerce, online shopping has been favored for its convenience and its savings in time and money. It has become a habit for consumers and a part of people's daily lives.
However, when purchasing clothes online, consumers cannot try the clothes on and therefore cannot know how the clothes will look when worn. Although some garment websites offer a “virtual fitting room” feature, it is based on simple 2D texture stitching of the user's avatar and the website's clothes; it cannot show how the clothes actually fit on the user's body. As a result, users often find that clothes purchased online do not fit when they finally try them on, which leads to a very high rate of returned and exchanged goods.
Thus, how to make the effect presented in the virtual fitting room close to the actual situation in which the user is wearing the clothes is worthy of consideration by those skilled in the art.
To address the above problems, the present invention provides a 3D garment fitting method that makes the effect presented in the virtual fitting room close to the actual appearance of the user wearing the clothes.
According to an exemplary embodiment, a 3D garment fitting method is provided. The method includes the following steps: A: 3D-scanning a mannequin to form a first 3D body shape figure file; B: putting a garment on the mannequin to form a garment mannequin; C: 3D-scanning the garment mannequin to form a body shape and garment figure file; D: removing, from the body shape and garment figure file, the part corresponding to the first 3D body shape figure file to form a first 3D garment figure file; E: 3D-scanning an actual body shape of a user to form a second 3D body shape figure file; F: stretching or cutting the first 3D body shape figure file into the second 3D body shape figure file, and defining the scale of the stretching or cutting as a 3D deformation; and G: adjusting the first 3D garment figure file according to the 3D deformation to form a second 3D garment figure file.
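As an illustration only, and not as part of the claimed method, the core of steps F and G can be sketched in Python with NumPy. The sketch assumes the two body shape figure files are meshes whose vertices correspond one-to-one, and the nearest-vertex transfer strategy and function names are hypothetical simplifications of a real 3D deformation:

```python
import numpy as np

def compute_deformation(body1, body2):
    """Step F (simplified): per-vertex displacement taking the
    mannequin body shape (body1) to the user's body shape (body2).
    Both inputs are (N, 3) arrays with corresponding vertices."""
    return body2 - body1

def deform_garment(garment, body1, deformation):
    """Step G (simplified): move each garment vertex by the
    deformation of its nearest mannequin body vertex."""
    deformed = np.empty_like(garment)
    for i, v in enumerate(garment):
        nearest = np.argmin(np.linalg.norm(body1 - v, axis=1))
        deformed[i] = v + deformation[nearest]
    return deformed
```

A garment vertex near a mannequin vertex that stretched outward is carried along by the same displacement, so the adjusted garment follows the user's body shape.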
In one embodiment, the method further includes a step H of combining the second 3D garment figure file with the second 3D body shape figure file to form a user body shape and garment figure file.
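For step H, one simple way to combine the two figure files is to concatenate their vertex and face arrays, offsetting the second mesh's face indices. This is an assumed sketch for triangle meshes stored as NumPy arrays, not a prescribed file format:

```python
import numpy as np

def combine_meshes(verts_a, faces_a, verts_b, faces_b):
    """Merge two triangle meshes into one figure file.
    Face indices of mesh B are shifted past mesh A's vertices."""
    verts = np.vstack([verts_a, verts_b])
    faces = np.vstack([faces_a, faces_b + len(verts_a)])
    return verts, faces
```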
In one embodiment, the mannequin further includes a plurality of sensing markers, and the plurality of sensing markers are evenly distributed on the surface of the mannequin.
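The patent does not prescribe how the evenly distributed markers are placed; as one hypothetical illustration, farthest-point sampling over the scanned vertices gives an approximately even spread:

```python
import numpy as np

def farthest_point_sample(points, k):
    """Pick k approximately evenly spread points from an (N, 3) array:
    start from point 0, then repeatedly take the point farthest
    from every point chosen so far."""
    chosen = [0]
    dist = np.linalg.norm(points - points[0], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(dist))
        chosen.append(nxt)
        dist = np.minimum(dist, np.linalg.norm(points - points[nxt], axis=1))
    return chosen
```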
In one embodiment, a 3D scanner is used to scan the mannequin.
In one embodiment, the 3D scanner is used to scan the garment mannequin.
In one embodiment, the 3D scanner is used to scan the actual body shape of the user.
Aspects of the present invention are best understood from the following detailed description when read with the accompanying figures. It is noted that, in accordance with the standard practice in the industry, various features are not drawn to scale. In fact, the dimensions of the various features may be arbitrarily increased or reduced for clarity of discussion.
The following invention provides different embodiments, or examples, for implementing different features of the provided subject matter. Specific examples of components and arrangements are described below to simplify the present invention. These are, of course, merely examples and are not intended to be limiting. For example, the formation of a first feature over or on a second feature in the description that follows may include embodiments in which the first and second features are formed in direct contact, and may also include embodiments in which additional features may be formed between the first and second features, such that the first and second features may not be in direct contact. In addition, the present invention may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
Further, spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. The spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. The apparatus may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein may likewise be interpreted accordingly.
With reference to
First, with reference to the step S1,
Then, with reference to the step S2 and
Thereafter, with reference to the step S3,
After that, with reference to the step S4 and
Then, with reference to the step S5,
And then, with reference to the step S6 and
Subsequently, with reference to the step S7,
Then, with reference to the step S8 and
In summary, the 3D garment fitting method of the present embodiment can virtually display how the garment actually appears when worn by the user himself.
The foregoing outlines features of several embodiments so that those skilled in the art may better understand the aspects of the present invention. Those skilled in the art should appreciate that they may readily use the present invention as a basis for designing or modifying other processes and structures for carrying out the same purposes and/or achieving the same advantages of the embodiments introduced herein. Those skilled in the art should also realize that such equivalent constructions do not depart from the spirit and scope of the present invention, and that they may make various changes, substitutions, and alterations herein without departing from the spirit and scope of the present invention.