A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
It is common for clothing manufacturers to construct garments with visible patterns. These patterns form lines or details that fall on the wearer's body.
It is with respect to these and other general considerations that aspects disclosed herein have been made. In addition, although relatively specific problems may be discussed, it should be understood that the aspects should not be limited to solving the specific problems identified in the background or elsewhere in this disclosure.
This disclosure generally relates to systems and methods for anatomy patterning. More specifically, anatomy patterning is any deliberate manipulation of a pattern applied to a garment in order to change the perceived shape of a wearer of the garment toward a desired appearance. Additionally, the disclosure generally relates to the garments that result from use of these systems and methods for anatomy patterning.
In one aspect, the disclosure is directed to a body-enhancing garment. The body-enhancing garment includes a front side, a rear side, and an adjusted pattern. The rear side is opposite the front side. The adjusted pattern is displayed on at least one of the front side and the rear side. Further, the adjusted pattern is manipulated around a first feature of a wearer to change a perceived shape of the first feature of the wearer toward a desired first feature shape.
In another aspect, the disclosure is directed to a method for designing a body-enhancing garment. The method includes:
In yet another aspect, the disclosure is directed to a method for designing a body-enhancing garment. The method includes:
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended as an aid in determining the scope of the claimed subject matter.
These and other features and advantages will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that both the foregoing general description and the following detailed description are illustrative only and are not restrictive of the claims.
Non-limiting and non-exhaustive examples or aspects are described with reference to the following Figures.
In the following detailed description, references are made to the accompanying drawings that form a part hereof, and in which are shown by way of illustrations specific embodiments or examples. These embodiments or examples may be combined, other embodiments or examples may be utilized, and structural changes may be made without departing from the spirit or scope of the present disclosure. The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present disclosure is defined by the appended claims and their equivalents.
Each time humans open their eyes, their brains perform trillions of computations in order to see a three-dimensional (3-D) world. These computations operate according to a set of rules. One of these rules is the geodesic assumption: curved lines on a surface reveal the 3-D shape of that surface. This is why a person looking at
Further rules utilized by the visual system include foreshortening cues and scaling gradients. Scaling gradients refer to the local size of the pattern elements: larger elements are generally perceived as being on a surface that is closer to the observer, while smaller elements are generally perceived as being on a surface that is further from the observer. Foreshortening cues can provide additional information about the slant, tilt, and curvature of a surface. For example, if a flat pattern consisting of circular elements is distorted in depth, then areas that are slanted or tilted away from the observer will feature ellipses rather than circles. The visual system can use the width of the ellipses as an additional cue when constructing a 3-D shape of the surface. For instance, foreshortening cues are visible in the bust of
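The foreshortening cue above can be made concrete with a small, non-limiting calculation. As an illustrative sketch (the function name and example values are not from the disclosure), a circular pattern element on a surface slanted away from the observer projects to an ellipse whose minor axis shrinks with the cosine of the slant angle:

```python
import math

def foreshortened_minor_axis(diameter: float, slant_deg: float) -> float:
    """Projected minor-axis width of a circular pattern element on a surface
    slanted away from the observer by slant_deg degrees (0 = facing the
    observer head-on, so no foreshortening)."""
    return diameter * math.cos(math.radians(slant_deg))

# A circle on a surface slanted 60 degrees appears half as wide in depth.
narrow = foreshortened_minor_axis(10.0, 60.0)
```

The visual system effectively inverts this relationship, inferring slant from the observed ellipse widths.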
Another rule utilized by the visual system is linear perspective. An example of linear perspective is shown in
Another rule utilized by the visual system relates to how the brain uses brightness gradients to construct and perceive 3-D shapes. This rule is known as shape from shading and an example of this rule is illustrated in
Many garments are constructed with visible patterns on the fabric. These patterns typically utilize symmetrical, straight, and/or repeating details or pattern elements and have no intentional brightness gradients when the garments are laid flat. Additionally, patterns may include illusory details or lines created within the negative space between the pattern elements, which serve as an informative element of the pattern itself. These patterns become curved and shaded when worn on the body. The visual system assumes that the curvature and/or brightness gradients of those patterns are attributable entirely to the body shape (i.e., that curved lines of the pattern on the garment would be straight lines if the garment were laid flat). Thus, using the rules of perception, the visual system constructs a three-dimensional body shape based in part on the curvature, size, and shading of the pattern.
It is known from the field of evolutionary psychology that each time an individual encounters a person, the individual's brain automatically evaluates a multitude of sensory cues relating to the health and reproductive fitness of the person within a fraction of a second. The individual's initial judgment of attractiveness is a summary of that evaluation, with individuals who appear healthier and more reproductively fit being perceived as more attractive. Therefore, the three-dimensional shape of a person's body is a critical sensory cue that is used to assess the attractiveness of the person.
When a person wears patterned clothing, the brain interprets the lines, spacing, sizing, and other elements of the pattern using the rules discussed above and other rules known within the field of vision science. Current clothing designs do not take into account that the brain uses these patterns on garments to construct a 3-D shape of the wearer. As such, a problem with existing garment construction or design is that it can create garments that make an individual's form less attractive to others, a result that is typically not desired by the individual wearing the garment. While the rules of perception have been heavily studied, these rules have not been applied to clothing. Further, the rules of perception have not been utilized on a garment to change the perception of a human feature to fall within, or move toward, known attractive size and shape ranges and/or desired size and shape ranges when worn.
As such, there is typically no system or method that utilizes the rules of perception and desired feature ranges to design or manufacture clothing. Therefore, disclosed herein are systems and/or methods for systematically applying patterns to garments, using the rules of perception, to change the perceived shape of the wearer. The changes to the patterns are based on the anatomy of any wearer and are referred to herein as anatomy patterning. In some embodiments, anatomy patterning is used to increase the attractiveness of the wearer. For instance, an attractive body will bend and/or shade a pattern differently than an unattractive body. Thus, the systems and methods as disclosed herein may adjust a pattern on a flat garment based on the curves and shading created by an attractive body to change the perception of the 3-D shape of the wearer in such a way that the wearer is perceived as more attractive. However, in other embodiments, anatomy patterning is used to change the appearance of the wearer toward any desired feature shape.
The feature of the body may cover any human body part or area, such as the buttocks, legs, chest, waist, feet, hips, etc. This list is exemplary only and is not meant to be limiting. Garments include any clothing item that can be worn by a human, such as pants, shirts, skirts, jackets, shorts, dresses, leggings, capris, bras, underwear, swim wear, shoes, skorts, outerwear, etc. This list is exemplary only and is not meant to be limiting.
Knowing that the brain automatically constructs a 3-D shape from the pattern and shading on a wearer, the shape, size, shading and/or positioning of the pattern can be adjusted to change the perceived shape of the wearer. The field of plastic surgery has identified several properties of the shape of the female buttocks and other human features that are considered attractive. As such, the patterning could, for example, be adjusted to change the perceived shape of the wearer of the garment to appear more attractive or to appear closer to these known plastic surgery properties.
Referring now to the drawings, in which like numerals represent like elements through the several figures, various aspects of the present disclosure will be described.
Several different processes or methods may be utilized to anatomy-pattern garments. In some embodiments, anatomy patterning may be performed by adjusting a pattern on clothing based on the rules of perception (such as the principles of the geodesic assumption) after visual inspection of live models. In other embodiments, anatomy patterning is based on a difference in curves found between an actual body shape of a selected feature and a desired body shape for that selected feature. In other embodiments, a method 500 for anatomy patterning may be utilized as illustrated in
The routine or method 500 begins at operation 502, where a desired 3-D body shape is identified. The desired 3-D body shape may include one or more features of the body. A feature may be any body part or area of the body that is covered by a selected garment. For example, the feature may be the buttocks and/or the legs. In some embodiments, the desired 3-D body shape is generated by one or more computing devices. In some embodiments, the desired 3-D body shape is an attractive body shape based on known attractive size and shape ranges. In other embodiments, the desired 3-D body shape accentuates or minimizes the appearance of a specific feature of the body. For example, the desired body shape may be any desired range of sizes and/or shapes for one or more features.
After the 3-D body shape is identified during operation 502, method 500 moves to operation 504. At operation 504, a pattern or flat pattern for the selected garment is identified. For example,
Next, operation 506 is performed. At operation 506, the pattern is adjusted based on the desired 3-D body shape and based on one or more selected shaping effects to create an adjusted pattern. The one or more selected shaping effects may be warping, shading, halftoning, and/or stippling the pattern.
Warping the pattern involves adjusting the provided pattern based on the curves of a desired body shape as illustrated by the warped pattern 406 in
At operation 512, the desired 3-D body shape is converted into a 2D depth map. In some embodiments, the 2D depth map of the desired 3-D body shape is generated by one or more computing devices. For example,
Next, operation 514 is performed. At operation 514, the identified or selected pattern is positioned on the 2D depth map 606 of the desired 3-D body shape. In some embodiments, the size of the pattern is also determined at operation 508. The positioning at operation 514 ensures that the pattern falls over or near a selected feature of a body appropriately when worn. In some embodiments, operation 514 is performed by one or more computing devices.
After the performance of operation 514, operation 516 is performed. At operation 516 the pattern is displaced based on the flat pattern's position on the 2D depth map to create the adjusted pattern. Accordingly, in these embodiments, the pattern is displaced according to the 2D depth map at operation 514 to show the curves that would be created on the flat pattern if it were being worn by a body with the desired 3-D body shape.
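The displacement idea of operations 512-516 can be sketched as follows. This is an illustrative, non-limiting example only: the linear displacement rule, the gain value, and the toy depth map are assumptions, not the disclosed procedure. A straight horizontal pattern line is displaced vertically in proportion to the local value of the 2D depth map, so the line bows around the protruding feature:

```python
def warp_lines(depth_map, line_rows, gain=3.0):
    """Displace each horizontal pattern line vertically in proportion to the
    local depth value (0 = far, 1 = close), so a straight line bows around a
    protruding feature. The linear rule and gain are illustrative."""
    warped = []
    for row in line_rows:
        warped.append([(col, row - gain * depth)
                       for col, depth in enumerate(depth_map[row])])
    return warped

# Toy 2D depth map: a single vertical ridge in the middle column.
depth = [
    [0.0, 0.2, 1.0, 0.2, 0.0],
    [0.0, 0.2, 1.0, 0.2, 0.0],
    [0.0, 0.2, 1.0, 0.2, 0.0],
]
line = warp_lines(depth, [1])[0]
# The line stays flat at the edges and bows over the ridge.
```

Applied to every line of a flat pattern, this yields the curves that the pattern would show if worn by a body with the desired 3-D body shape.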
Shading the pattern involves adjusting the local brightness of the provided pattern based on the 3-D brightness gradients of a desired body shape as illustrated by the warped and shaded pattern 806 in
At operation 517, light is applied to the desired 3-D body shape to determine a 3-D brightness gradient (or shadowing) created by the desired 3-D body shape. Next, at operation 518, a 2D image of the 3-D brightness gradient is created. After operation 518, operation 519 is performed. At operation 519, a brightness gradient based on the 2D image of the brightness gradient is applied to the flat pattern to form the adjusted pattern. Accordingly, during operations 517-519, the pattern is shaded to show the brightness gradient that would be created on the flat pattern if it were being worn by a body with the desired 3-D body shape. Anatomy shading as directly applied to a garment, rather than to a pattern, is discussed in more detail in U.S. patent application Ser. No. 14/517,339, filed Oct. 17, 2014, which claims priority to U.S. Provisional Application Ser. No. 61/892,749, filed Oct. 18, 2013, both of which are hereby incorporated by reference herein in their entirety. The principles discussed therein for creating shading may inform how the shading is applied to a pattern or an adjusted pattern herein.
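Operations 517-519 can be sketched as follows, assuming the brightness gradient of operation 518 has been rendered to a 2D array of values in 0..1 (1 = fully lit). The flat pattern's local brightness is scaled by the gradient; the floor parameter, an assumption of this sketch, keeps shadowed areas within a printable contrast range:

```python
def apply_shading(pattern, brightness, floor=0.2):
    """Scale the flat pattern's local brightness by the rendered gradient
    (1.0 = fully lit). The floor value is an illustrative assumption that
    prevents shadows from dropping to pure black."""
    return [[p * (floor + (1.0 - floor) * b) for p, b in zip(prow, brow)]
            for prow, brow in zip(pattern, brightness)]

flat_pattern = [[255.0, 255.0],
                [255.0, 255.0]]          # uniformly bright flat pattern
gradient = [[1.0, 0.5],
            [0.5, 0.0]]                  # brightness rendered from the shape
shaded = apply_shading(flat_pattern, gradient)
```

The fully lit corner keeps its original brightness while the fully shadowed corner darkens to the floor level.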
Halftoning the pattern involves adjusting the size of the pattern elements based on the desired 3-D body shape. For example, when one or more selected shaping effects includes halftoning, operations 520-524 are performed at operation 506 as illustrated by
At operation 520, the desired 3-D body shape is converted into a modified 2D depth map. The modified 2D depth map is a depth map that has been inverted and contrast-adjusted as illustrated by the modified depth map 1600 in
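The inversion and contrast adjustment can be sketched as follows. The linear contrast curve around mid-grey and the depth convention (0 = far, 1 = close) are assumptions of this sketch; the disclosure does not specify the exact adjustment:

```python
def modify_depth_map(depth, contrast=2.0):
    """Invert a normalized depth map (0 = far, 1 = close) and stretch its
    contrast linearly around mid-grey, clamping the result to 0..1. The
    linear curve and contrast value are illustrative assumptions."""
    inverted = [[1.0 - v for v in row] for row in depth]
    return [[min(1.0, max(0.0, 0.5 + contrast * (v - 0.5))) for v in row]
            for row in inverted]

modified = modify_depth_map([[0.0, 0.5, 1.0]])
# Under this convention, far regions become bright, close regions dark,
# and mid-depth stays mid-grey.
```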
Next, operation 522 is performed. At operation 522, the identified or selected pattern is positioned on the modified 2D depth map of the desired 3-D body shape. The positioning at operation 522 ensures that the pattern falls over or near a selected feature of a body appropriately when worn.
After the performance of operation 522, operation 524 is performed. At operation 524, elements of the pattern are resized based on the flat pattern's position on the modified 2D depth map and/or based on the shading on the modified 2D depth map to create the adjusted pattern. Accordingly, in these embodiments, the elements of the pattern are resized according to the modified 2D depth map to show larger pattern elements at brighter spots and smaller pattern elements at darker spots, as would be created on the flat pattern if it were being worn by a body with the desired 3-D body shape. For example,
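The resizing rule of operation 524 can be sketched as a linear mapping from the modified 2D depth map (assumed normalized to 0..1) to an element radius; the minimum and maximum radii are assumed design bounds, not values from the disclosure:

```python
def halftone_radii(modified_depth, r_min=0.5, r_max=3.0):
    """Map each cell of the modified depth map (0..1) to a pattern-element
    radius: brighter cells get larger elements, darker cells smaller ones.
    r_min and r_max are illustrative design bounds."""
    return [[r_min + (r_max - r_min) * v for v in row]
            for row in modified_depth]

radii = halftone_radii([[0.0, 0.5, 1.0]])
# Elements grow from r_min to r_max as the map brightens left to right.
```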
Stippling the pattern involves adjusting the frequency of the pattern elements based on the desired 3-D body shape. For example, when one or more selected shaping effects includes stippling, operations 526-530 are performed at operation 506 as illustrated by
Similar to the halftoning operation 520, at operation 526 the 3-D body shape is converted into a modified 2D depth map. The modified 2D depth map is a depth map that has been inverted and contrast-adjusted as illustrated by the modified depth map 1600 in
Next, operation 528 is performed. At operation 528, the identified or selected pattern is positioned on the modified 2D depth map of the desired 3-D body shape. The positioning at operation 528 ensures that the pattern falls over or near a selected feature of a body appropriately when worn.
After the performance of operation 528, operation 530 is performed. At operation 530, the frequency of the elements of the pattern is changed based on the flat pattern's position on the modified 2D depth map and/or based on the shading on the modified 2D depth map to create the adjusted pattern. Accordingly, in these embodiments, the frequency of the pattern elements (e.g., the number of pattern elements per unit of area) is changed according to the modified 2D depth map to show more pattern elements at brighter spots and fewer pattern elements at darker spots, as would be created on the flat pattern if it were being worn by a body with the desired 3-D body shape. For example,
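The frequency rule of operation 530 can be sketched by assigning each cell of the modified 2D depth map a dot count proportional to its brightness and scattering the dots within the cell. The density cap and uniform scattering are illustrative assumptions of this sketch:

```python
import random

def stipple_counts(modified_depth, max_dots=20):
    """Dots per cell: more where the modified depth map is bright, fewer
    where it is dark. max_dots is an assumed density cap."""
    return [[round(max_dots * v) for v in row] for row in modified_depth]

def stipple_points(modified_depth, max_dots=20, seed=0):
    """Scatter each cell's dots uniformly inside that unit cell."""
    rng = random.Random(seed)
    points = []
    for r, row in enumerate(stipple_counts(modified_depth, max_dots)):
        for c, n in enumerate(row):
            points += [(c + rng.random(), r + rng.random()) for _ in range(n)]
    return points

counts = stipple_counts([[0.1, 0.9]])
# The bright cell receives roughly nine times the dots of the dark cell.
```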
While the different shaping effects were discussed individually above, one or more of the shaping effects may be utilized in combination. In some embodiments, the halftoning operations 520-524 and/or the stippling operations 526-530 are performed before the warping operations 512-516. In other embodiments, the shading operations 517-519 may be performed before or after any of the other shaping effect operations. For example,
In some embodiments, a consumer may further adjust a pattern formed during operation 506. This input may come from an adjustment task where the consumer can adjust the pattern on a simulated garment to provide different warping, shading, stippling, and/or halftoning. For example, the consumer may move a slider left or right, where left simulates less warping, shading, stippling, and/or halftoning and right simulates more warping, shading, stippling, and/or halftoning of the pattern. Consumer preferences are then accumulated to inform the preferred amount of adjustment to apply to the pattern during operation 506.
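The slider described above can be modeled as a linear interpolation between the flat pattern and the fully adjusted pattern. This is a sketch only; the actual adjustment task may use a different parameterization:

```python
def blend_adjustment(flat, adjusted, slider):
    """Linearly interpolate between the flat pattern (slider = 0.0, fully
    left, no effect) and the fully adjusted pattern (slider = 1.0, fully
    right, full effect)."""
    return [[(1.0 - slider) * f + slider * a for f, a in zip(frow, arow)]
            for frow, arow in zip(flat, adjusted)]

flat = [[1.0, 1.0]]
adjusted = [[0.4, 0.8]]
halfway = blend_adjustment(flat, adjusted, 0.5)   # midway between the two
```

Accumulating the preferred slider positions across consumers then gives the amount of adjustment to bake into the pattern at operation 506.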
After operation 506, operation 508 is performed. At operation 508, a 2D image of the adjusted pattern is created. In some embodiments, operation 508 is performed by one or more computing devices. In some embodiments, where at least one of the shaping effects is warping, the 2D image is created utilizing perspective projection. The formed 2D image provides a template for adding and/or applying the adjusted pattern to a garment that changes the perception of the identified feature toward the appearance of the desired 3-D body shape. For example,
At operation 510, the 2D image of the adjusted pattern is applied to a garment or utilized as a template for applying the adjusted pattern to a garment to form a body-enhancing garment. In some embodiments, the adjusted pattern is applied to the garment with a machine, such as a laser or printer, and/or in an automated assembly process. In other embodiments, the adjusted pattern is manually added to the garment. In alternative embodiments, the adjusted pattern is formed both manually and via a machine.
For example,
In other embodiments, operation 510 includes modifying the adjusted pattern before application to the garment to ensure that the applied adjusted pattern emulates the brightness gradients, curves, and/or shading that would be created by the flat pattern on a garment when worn by the desired 3-D body shape. For example, the adjusted pattern may be modified so that the pattern adjustments are applied to the garment in the correct position, size, and intensity. In some embodiments, as discussed above, the brightness gradient, stippling, warping, and/or halftoning may be modified based on the size of the garment. For example, smaller sizes may receive brighter brightness gradients, more stippling, more warping, and/or more halftoning than larger sizes. In other embodiments, the brightness gradient, stippling, and/or halftoning may be adjusted or calibrated based on the visible contrast range of a garment or pattern. In still further embodiments, the adjusted pattern may be modified after visual inspection of the garment with an applied adjusted pattern while being worn by a model or mannequin. These visual inspections ensure that the adjusted pattern, when applied to the garment and worn, emulates the desired 3-D body shape's curves and shading.
In some embodiments, a method 1100 for designing an anatomy-patterned garment is disclosed as illustrated in
In some embodiments, the amount of warping, shading, halftoning, and/or stippling of the pattern is determined or adjusted based on consumer feedback during the determining of the adjusted pattern. For example, the amount of warping, shading, and/or halftoning of the adjusted pattern may be determined by utilizing an adjustment task where consumers may adjust the amount of patterning on a simulated garment. For example, the consumer may move a slider left or right, where left simulates less warping, shading, and/or halftoning and right simulates more warping, shading, and/or halftoning of the pattern. Consumer preferences are then accumulated to inform the preferred amount of adjustments to apply to the pattern.
In further embodiments, a method 1200 for designing an anatomy-patterned garment is provided as illustrated in
In further embodiments, an adaptive genetic algorithm may be utilized to determine the amount of warping, shading, and/or halftoning for a selected pattern to adjust the pattern. The adaptive genetic algorithm utilizes data from various test subjects to find the most desired pattern adjustment on a garment for a specific feature of the wearer. In this process, subjects are given a random set of different garments illustrating a specific feature (i.e., buttocks, chest, legs, waist, etc.) of the wearer with various different pattern adjustments that change the appearance of these features of the wearer. The subjects are then asked to select a garment or garments from the group that is most attractive or best demonstrates the desired feature. The algorithm then modifies the garments based on the previous selections containing different pattern adjustments to change the appearance of the wearer and asks the same subjects to again select the garment or garments from the group that is most attractive or best demonstrates the desired feature. Each pattern adjustment is specifically created to alter the appearance of the wearer based on the rules of perception. This process is performed repeatedly. In some embodiments, the algorithm converges on the most attractive or most desired amount of warping, shading, and/or halftoning of the pattern for a garment located over or near a particular feature after about 20 generations or trials. However, any suitable system or method may be utilized to adjust the amount of warping, shading, and/or halftoning of the pattern based on the rules of perception for anatomy patterning.
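The adaptive procedure can be sketched as a toy genetic algorithm in which a scoring function stands in for the subjects' selections. The population size, mutation scale, and selection scheme below are all illustrative assumptions, not the patented procedure:

```python
import random

def evolve_adjustment(preference_score, generations=20, pop_size=8, seed=0):
    """Toy adaptive search over adjustment amounts in 0..1. preference_score
    stands in for the subjects' choices (higher = preferred)."""
    rng = random.Random(seed)
    population = [rng.random() for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the half of the candidates the "subjects" prefer ...
        population.sort(key=preference_score, reverse=True)
        parents = population[: pop_size // 2]
        # ... and refill the pool with mutated copies of those selections.
        children = [min(1.0, max(0.0, rng.choice(parents) + rng.gauss(0.0, 0.1)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=preference_score)

# Simulated subjects prefer a warp amount near 0.6 (a hypothetical target).
best = evolve_adjustment(lambda amount: -abs(amount - 0.6))
```

After about 20 generations the surviving candidates cluster around the preferred amount, mirroring the convergence behavior described above.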
Surprisingly, similar amounts of warping, shading, halftoning, and/or stippling are found to increase the attractiveness of the wearer when applied across a variety of garment sizes and styles. Additionally, similar amounts of warping, shading, halftoning, and/or stippling are found to increase the attractiveness of the wearer when applied across different ethnicities and geographic regions, such as China and the United States, with only small differences.
The adjusted pattern on a garment, as discussed above, utilizes the rules of perception to change the appearance of a feature of the wearer. While the above examples adjust the curves, angles, widths, heights, shading, sizing, etc., of a pattern to change the perception of body features, these adjustments should be subtle enough that the brain interprets them as being created by the shape of the wearer instead of attributing them to the garment itself. For example, changes to a flat pattern that are too large or too extreme are interpreted by the brain as being attributed to the garment itself instead of the wearer. These types of pattern changes that are attributed to the garment itself are design choices and may fall outside the definition of anatomy patterning.
While the pattern adjustment discussed above has been illustrated on pants, shirts, and dresses, anatomy patterning can be applied to various different garments, such as skirts, shorts, capris, overalls, and skorts. While the anatomy patterning discussed above has focused on increased attractiveness, any desired feature ranges/dimensions may be utilized by anatomy patterning to change the perception of any feature toward a desired body shape utilizing the rules of perception. While the above anatomy patterning focused on the legs, buttocks, chest, and waist of the wearer, anatomy patterning can also be applied to change the perception of other features of a wearer, such as the shoulders and/or feet.
Additionally, while anatomy patterning has been described in detail for specific features of female garments, the principles discussed above for anatomy patterning can be applied to various other female garments and various male garments. Additionally, while the disclosed anatomy patterning was discussed on specific garments and in specific combinations above, any of the disclosed anatomy patterning principles may be utilized alone and/or in any combination on any desired garment. Further, as understood by a person of skill in the art, additional anatomy patterning other than that discussed above may be utilized to change the appearance of a feature discussed above. Additionally, as understood by a person of skill in the art, additional anatomy patterning may be utilized to change the appearance of additional features that have not been discussed above.
Body-enhancing garments 908 and 1004 are illustrated in
Aspects of the present disclosure, for example, are described above with reference to block diagrams and/or operational illustrations of methods, systems, and computer program products according to aspects of the disclosure. The functions/acts noted in the blocks may occur out of the order as shown in any flowchart. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
This disclosure described some aspects of the present technology with reference to the accompanying drawings, in which only some of the possible aspects were described. Other aspects can, however, be embodied in many different forms and the specific aspects disclosed herein should not be construed as limited to the various aspects of the disclosure set forth herein. Rather, these exemplary aspects were provided so that this disclosure was thorough and complete and fully conveyed the scope of the other possible aspects to those skilled in the art. For example, the various aspects disclosed herein may be modified and/or combined without departing from the scope of this disclosure.
Although specific aspects were described herein, the scope of the technology is not limited to those specific aspects. One skilled in the art will recognize other aspects or improvements that are within the scope and spirit of the present technology. Therefore, the specific structure, acts, or media are disclosed only as illustrative aspects. The scope of the technology is defined by the following claims and any equivalents therein.
The description and illustration of one or more aspects provided in this application are not intended to limit or restrict the scope of the disclosure as claimed in any way. The embodiments, examples, and details provided in this application are considered sufficient to convey possession and enable others to make and use the best mode of the claimed disclosure. The claims should not be construed as being limited to any embodiment, example, or detail provided in this application. Regardless of whether shown and described in combination or separately, the various features (both structural and methodological) are intended to be selectively included or omitted to produce an embodiment with a particular set of features. Having been provided with the description and illustration of the present application, one skilled in the art may envision variations, modifications, and alternate embodiments falling within the spirit of the broader aspects of the general inventive concept embodied in this application that do not depart from the broader scope of the claims.
Number | Name | Date | Kind |
---|---|---|---|
1654064 | Zaki | Dec 1927 | A |
1697255 | Wolff | Jan 1929 | A |
1817053 | Zerk | Aug 1931 | A |
1852138 | Zerk | Apr 1932 | A |
2381580 | Einbinder | Aug 1945 | A |
2718640 | Suckle | Sep 1955 | A |
2840824 | Horner | Jul 1958 | A |
3683419 | Lewis | Aug 1972 | A |
3691971 | Clarke | Sep 1972 | A |
3696845 | Acker et al. | Oct 1972 | A |
D259820 | Heinfling | Jul 1981 | S |
4282609 | Freedman et al. | Aug 1981 | A |
4888713 | Falk | Dec 1989 | A |
5065458 | Johnson | Nov 1991 | A |
5535451 | Tassone | Jul 1996 | A |
5990444 | Costin | Nov 1999 | A |
6175966 | Wiesenthal | Jan 2001 | B1 |
7058987 | Salazar | Jun 2006 | B2 |
7107621 | Meekins | Sep 2006 | B2 |
D558954 | Lew | Jan 2008 | S |
D586084 | Mendoza | Feb 2009 | S |
7814576 | Nakazawa | Oct 2010 | B2 |
D650154 | Saavedra | Dec 2011 | S |
8869313 | Maramotti | Oct 2014 | B2 |
8959665 | Garner | Feb 2015 | B1 |
8984668 | Tulin | Mar 2015 | B2 |
9326552 | Hays | May 2016 | B2 |
D768359 | Lago | Oct 2016 | S |
9468239 | Battah | Oct 2016 | B2 |
D793033 | Neary | Aug 2017 | S |
9801420 | Hays | Oct 2017 | B2 |
D802256 | Peshek | Nov 2017 | S |
20040205879 | Leba | Oct 2004 | A1 |
20060235656 | Mochimaru et al. | Oct 2006 | A1 |
20070266478 | Girod | Nov 2007 | A1 |
20080271226 | Erana | Nov 2008 | A1 |
20090157021 | Sullivan et al. | Jun 2009 | A1 |
20100136882 | Malish | Jun 2010 | A1 |
20110010818 | Hood | Jan 2011 | A1 |
20110185477 | Olenicoff | Aug 2011 | A1 |
20110214216 | Zarabi | Sep 2011 | A1 |
20120023644 | Maramotti | Feb 2012 | A1 |
20140165265 | Tulin | Jun 2014 | A1 |
20140182044 | Cole | Jul 2014 | A1 |
20140273742 | Hays | Sep 2014 | A1 |
20150082516 | Doan | Mar 2015 | A1 |
20150106993 | Hoffman | Apr 2015 | A1 |
20150196068 | Tulin | Jul 2015 | A1 |
20150213646 | Ma | Jul 2015 | A1 |
20150339853 | Wolper | Nov 2015 | A1 |
20160189431 | Ueda | Jun 2016 | A1 |
20160227854 | Ellis | Aug 2016 | A1 |
20160324234 | Hoffman et al. | Nov 2016 | A1 |
20180014583 | Peshek | Jan 2018 | A1 |
20180014590 | Peshek | Jan 2018 | A1 |
20180110273 | Peshek | Apr 2018 | A1 |
20180168256 | Peshek | Jun 2018 | A1 |
Number | Date | Country |
---|---|---|
3546110 | Jul 2006 | CN |
301869010 | Mar 2010 | CN |
301600315 | Jul 2011 | CN |
302245662 | Dec 2012 | CN |
302245667 | Dec 2012 | CN |
302554428 | Sep 2013 | CN |
2876880 | Apr 2006 | FR |
2004156153 | Jun 2004 | JP |
2006032096 | Mar 2006 | WO |
2007055072 | May 2007 | WO |
2007112494 | Oct 2007 | WO |
2012004365 | Jan 2012 | WO |
20130154445 | Oct 2013 | WO |
2018017732 | Jan 2018 | WO |
Entry |
---|
Spoon Graphics “8 Free Stipple Shading Brushes for Adobe Illustrator”, May 25, 2015 , https://blog.spoongraphics.co.uk/freebies/8-free-stipple-shading-brushes-for-adobe-illustrator. |
PCT International Search Report in PCT/US2017/042885, dated Sep. 11, 2017, 14 pages. |
PCT International Search Report in PCT/US2017/042888, dated Sep. 11, 2017, 15 pages. |
PCT Invitation to Pay Additional Fees in PCT/US2017/0422000, dated Sep. 20, 2017, 15 pages. |
“Truly WOW Slim Leg Jeans Long Simply Be”, Aug. 1, 2012, 3 pages, http://www.simplybe.co/uk/shop/truly-wow-slimleg-jeans-long.co.uk/shop/truly-wow-slim-leg-jeans-long/qy094/product/details/show.action?pdBoUiD=5205#colour;size. |
Anonymous, “Kilowog muscle suit”, Aug. 3, 2010, XP055371226, http://theleagueofheroes.yuku.com. Retrieved from the Internet: http://theleagueofheroes.yuku.com/topic9266/Kilowog-muscle-suit#.WRL4-HpMek5, 7 pages. |
European Extended Search Report in Application 14854060.2, dated May 18, 2017, 9 pages. |
PCT International Search Report in PCT/US2017/042200, dated Nov. 28, 2017, 22 pages. |
Cuenca-Guerra, et al., “What Makes Buttocks Beautiful? A Review and Classification of the Determinants of Gluteal Beauty and the Surgical Techniques to Achieve Them”, Aesthetic Plastic Surgery, Springer Science+Business Media, Inc., 2004. |
Hoffman, “Visual Intelligence: How We Create What We See”, Published by W.W. Norton, 2000. |
PCT International Search Report in PCT/US2014/061277, dated Jan. 27, 2015, 10 pages. |
Lee Jeans, The Geek Lounge, Jun. 4, 2014, https://geek.lounge.wordpress.com/tag/lee-jeans. |
Men's Jeans and women's jeans and denim jacket measurements to check or make your own pattern, Oct. 11, 2007, https://web.archive.org/web/20071011004827/http://www.jeansinfo.org/measurements.html. |
PCT International Search Report in PCT/US2018/017967, dated Apr. 26, 2018, 14 pages. |
Number | Date | Country | |
---|---|---|---|
20180020752 A1 | Jan 2018 | US |