The present disclosure generally relates to systems and methods for constructing a three-dimensional watch object from a watch image.
In accordance with one embodiment, a computing device obtains an image depicting a watch and performs segmentation on the watch in the image to generate a segmented watch region comprising a watch case region, a first watch strap region, and a second watch strap region. The computing device generates a three-dimensional (3D) mesh according to the watch case region, the first watch strap region, the second watch strap region, and a pre-defined skeleton containing at least one straight part and at least two curved parts. The computing device generates texture attributes according to the segmented watch and applies the texture attributes to the 3D mesh to generate a textured 3D watch object. The computing device renders the textured 3D watch object in an augmented reality display.
Another embodiment is a system that comprises a memory storing instructions and a processor coupled to the memory. The processor is configured to obtain an image depicting a watch and perform segmentation on the watch in the image to generate a segmented watch region comprising a watch case region, a first watch strap region, and a second watch strap region. The processor is further configured to generate a three-dimensional (3D) mesh according to the watch case region, the first watch strap region, the second watch strap region, and a pre-defined skeleton containing at least one straight part and at least two curved parts. The processor is further configured to generate texture attributes according to the segmented watch and apply the texture attributes to the 3D mesh to generate a textured 3D watch object. The processor is further configured to render the textured 3D watch object in an augmented reality display.
Another embodiment is a non-transitory computer-readable storage medium storing instructions to be implemented by a computing device. The computing device comprises a processor, wherein the instructions, when executed by the processor, cause the computing device to obtain an image depicting a watch and perform segmentation on the watch in the image to generate a segmented watch region comprising a watch case region, a first watch strap region, and a second watch strap region. The processor is further configured by the instructions to generate a three-dimensional (3D) mesh according to the watch case region, the first watch strap region, the second watch strap region, and a pre-defined skeleton containing at least one straight part and at least two curved parts. The processor is further configured by the instructions to generate texture attributes according to the segmented watch and apply the texture attributes to the 3D mesh to generate a textured 3D watch object. The processor is further configured by the instructions to render the textured 3D watch object in an augmented reality display.
Other systems, methods, features, and advantages of the present disclosure will be apparent to one skilled in the art upon examining the following drawings and detailed description. It is intended that all such additional systems, methods, features, and advantages be included within this description, be within the scope of the present disclosure, and be protected by the accompanying claims.
Various aspects of the disclosure are better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, with emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
The subject disclosure is now described with reference to the drawings, where like reference numerals are used to refer to like elements throughout the following description. Other aspects, advantages, and novel features of the disclosed subject matter will become apparent from the following detailed description and corresponding drawings.
The jewelry industry has seen significant advancements in recent years with the incorporation of technologies such as computer-aided design (CAD) and three-dimensional (3D) printing to streamline the design and manufacturing processes. Despite these advancements, however, creating custom watches traditionally requires specialized skills and extensive manual labor. To address these challenges, embodiments are disclosed for constructing a 3D watch object from a two-dimensional (2D) watch image, thereby providing designers with an efficient tool for evaluating different watch designs. Notably, individuals need only provide designers with 2D images of watch designs of interest. The 3D watch objects are rendered in an augmented reality display where individuals are shown wearing the 3D watch objects, thereby reducing the time and resources required for designing and producing watch designs.
A system for constructing a 3D watch object from a watch image is described, followed by a discussion of the operation of the components within the system.
A watch design application 104 executes on a processor of the computing device 102 and includes an image processor 106, a segmentation mask module 108, a 3D mesh generator 110, a texture processor 112, and an image editor 114. The image processor 106 is configured to obtain digital images of a 2D watch design of interest using, for example, a camera of the computing device 102. The computing device 102 may also be equipped with the capability to connect to the Internet, and the image processor 106 may be configured to obtain an image or video of a watch design from another device or server.
The images obtained by the image processor 106 may be encoded in any of a number of formats including, but not limited to, JPEG (Joint Photographic Experts Group) files, TIFF (Tagged Image File Format) files, PNG (Portable Network Graphics) files, GIF (Graphics Interchange Format) files, BMP (bitmap) files or any number of other digital formats. The video may be encoded in formats including, but not limited to, Motion Picture Experts Group (MPEG)-1, MPEG-2, MPEG-4, H.264, Third Generation Partnership Project (3GPP), 3GPP-2, Standard-Definition Video (SD-Video), High-Definition Video (HD-Video), Digital Versatile Disc (DVD) multimedia, Video Compact Disc (VCD) multimedia, High-Definition Digital Versatile Disc (HD-DVD) multimedia, Digital Television Video/High-definition Digital Television (DTV/HDTV) multimedia, Audio Video Interleave (AVI), Digital Video (DV), QuickTime (QT) file, Windows Media Video (WMV), Advanced System Format (ASF), Real Media (RM), Flash Media (FLV), an MPEG Audio Layer III (MP3), an MPEG Audio Layer II (MP2), Waveform Audio Format (WAV), Windows Media Audio (WMA), 360 degree video, 3D scan model, or any number of other digital formats.
Referring back to the system block diagram in
Reference is made to
For some embodiments, the depth values of the 2D mesh are adjusted to generate the first intermediate 3D mesh by transforming the 2D mesh onto a plane perpendicular to a z-axis in a 3D coordinate system to generate the first intermediate 3D mesh with all vertices initially having the same z-value along the z-axis. In particular, each vertex (a, b) of the 2D mesh is mapped to a corresponding vertex (a, b, 0) in the 3D coordinate system. The z-coordinate of each vertex of the first intermediate 3D mesh is then adjusted according to the distance between that vertex and the closest edge. In particular, the z-coordinate is adjusted according to an inverse relationship with respect to the distance between each vertex and the closest edge such that the z-coordinate is adjusted to a greater extent as the distance decreases. For some embodiments, a threshold is utilized to determine which vertices of the first intermediate 3D mesh should be adjusted: for each vertex, if the distance between the vertex and the edge is less than the threshold, the z-coordinate of that vertex is adjusted, and the smaller that distance, the more the z-coordinate is adjusted. Viewed from the side, the adjusted vertices appear as a curve, such that the adjusted mesh comprises a curved 3D mesh.
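For illustration only, this depth-adjustment step can be sketched as follows, assuming the mesh vertices and sampled edge points are available as NumPy arrays and that a simple linear falloff inside the threshold band stands in for the inverse relationship; the function name, threshold, and offset magnitude are illustrative assumptions rather than part of the disclosure.

```python
import numpy as np

def lift_to_curved_mesh(vertices_2d, edge_points_2d, threshold=20.0, max_offset=8.0):
    """Map 2D mesh vertices (a, b) to 3D vertices (a, b, z), where the z offset
    grows as a vertex approaches the segmented region's edge.  Vertices farther
    from the edge than `threshold` keep z = 0, so only the rim of the mesh curves."""
    # Place every vertex on the z = 0 plane first: (a, b) -> (a, b, 0).
    vertices_3d = np.hstack([vertices_2d, np.zeros((len(vertices_2d), 1))])

    # Distance from each vertex to its closest sampled edge point.
    diffs = vertices_2d[:, None, :] - edge_points_2d[None, :, :]
    dist_to_edge = np.sqrt((diffs ** 2).sum(axis=2)).min(axis=1)

    # Inverse relationship: the smaller the distance, the larger the z offset.
    # A linear falloff inside the threshold band is used here for simplicity.
    near_edge = dist_to_edge < threshold
    offset = np.zeros(len(vertices_2d))
    offset[near_edge] = max_offset * (1.0 - dist_to_edge[near_edge] / threshold)
    vertices_3d[:, 2] = offset
    return vertices_3d
```

Viewed along the y-axis, the offset vertices near the rim make the otherwise flat mesh bow into the curved intermediate 3D mesh described above.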
Referring now to
For some embodiments, the 3D mesh generator 110 is configured to adjust the depth values of the 2D mesh to generate the first intermediate 3D mesh by generating a 2D edge point set of the segmented watch region. The 3D mesh generator 110 transforms the 2D mesh and the 2D edge point set onto a plane perpendicular to a z-axis in a 3D coordinate system to generate the first intermediate 3D mesh and a 3D edge point set, wherein all vertices of the first intermediate 3D mesh and all points of the 3D edge point set initially have the same z-value along the z-axis. The 3D mesh generator 110 then adjusts the z-value of each vertex of the first intermediate 3D mesh according to a distance between that vertex and the closest point in the 3D edge point set. In particular, the z-value is adjusted according to an inverse relationship with respect to the distance between each vertex and the closest point in the 3D edge point set such that the z-value is adjusted to a greater extent as the distance decreases. The pre-defined skeleton is defined on an x-z plane.
The 3D mesh generator 110 applies the warping operation to the third intermediate 3D mesh according to the pre-defined skeleton to generate the 3D mesh by adjusting the watch case sub-mesh of the third intermediate 3D mesh to be a watch case sub-mesh of a fourth intermediate 3D mesh so that a projection of the watch case sub-mesh of the fourth intermediate 3D mesh onto the x-z plane approximates the one straight part of the pre-defined skeleton. The 3D mesh generator 110 adjusts the first watch strap sub-mesh of the third intermediate 3D mesh to be a first watch strap sub-mesh of the fourth intermediate 3D mesh so that the projection of the first watch strap sub-mesh of the fourth intermediate 3D mesh onto the x-z plane approximates one of the curved parts of the pre-defined skeleton. The 3D mesh generator 110 adjusts the second watch strap sub-mesh of the third intermediate 3D mesh to be a second watch strap sub-mesh of the fourth intermediate 3D mesh so that the projection of the second watch strap sub-mesh of the fourth intermediate 3D mesh onto the x-z plane approximates one of the curved parts of the pre-defined skeleton. The 3D mesh generator 110 then defines the fourth intermediate 3D mesh as the 3D mesh.
For some embodiments, the 3D mesh generator 110 generates the 3D mesh according to the watch case region, the first watch strap region, the second watch strap region, and the pre-defined skeleton containing the at least one straight part and the at least two curved parts by defining a watch case sub-mesh of the 3D mesh, a first watch strap sub-mesh of the 3D mesh, and a second watch strap sub-mesh of the 3D mesh according to the watch case region, the first watch strap region, and the second watch strap region. The 3D mesh generator 110 adjusts the thickness values of the watch case sub-mesh of the 3D mesh to be greater than thickness values of the first watch strap sub-mesh of the 3D mesh and the second watch strap sub-mesh of the 3D mesh. The 3D mesh generator 110 adjusts the thickness values of the 3D mesh to incorporate a thickness transition around a first joint of the watch case sub-mesh of the 3D mesh and the first watch strap sub-mesh of the 3D mesh. The 3D mesh generator 110 also adjusts the thickness values of the 3D mesh to incorporate a thickness transition around a second joint of the watch case sub-mesh of the 3D mesh and the second watch strap sub-mesh of the 3D mesh.
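One way to realize this thickness adjustment is sketched below, assuming each vertex carries a flag indicating whether it belongs to the watch case sub-mesh and a precomputed distance to the nearest case/strap joint; the blend width and thickness values are illustrative assumptions, not values taken from the disclosure.

```python
import numpy as np

def assign_thickness(is_case, dist_to_joint, case_thickness=6.0,
                     strap_thickness=2.0, blend_width=15.0):
    """Per-vertex thickness: watch-case vertices are thicker than strap
    vertices, and strap vertices near either joint are blended toward the
    case thickness so neither joint shows an abrupt step."""
    # Base thickness for every vertex.
    thickness = np.where(is_case, case_thickness, strap_thickness)

    # Thickness transition: within `blend_width` of a joint, interpolate a
    # strap vertex's thickness up toward the case thickness.
    t = np.clip(1.0 - dist_to_joint / blend_width, 0.0, 1.0)
    blended = strap_thickness + t * (case_thickness - strap_thickness)
    return np.where(is_case, thickness, blended)
```

The linear blend is only one possible transition profile; a smoothstep or cosine falloff could be substituted without changing the overall approach.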
For some embodiments, the 3D mesh generator 110 generates the 3D mesh according to the watch case region, the first watch strap region, the second watch strap region, and the pre-defined skeleton containing the at least one straight part and the at least two curved parts by generating a first 2D mesh comprising a watch case sub-mesh of the first 2D mesh, a first watch strap sub-mesh of the first 2D mesh, and a second watch strap sub-mesh of the first 2D mesh according to the watch case region, the first watch strap region and the second watch strap region. The 3D mesh generator 110 duplicates the first 2D mesh to generate a second 2D mesh comprising a watch case sub-mesh of the second 2D mesh, a first watch strap sub-mesh of the second 2D mesh, and a second watch strap sub-mesh of the second 2D mesh. The 3D mesh generator 110 adjusts the depth values of the first 2D mesh to generate a first intermediate 3D mesh comprising a watch case sub-mesh of the first intermediate 3D mesh, a first watch strap sub-mesh of the first intermediate 3D mesh, and a second watch strap sub-mesh of the first intermediate 3D mesh. The 3D mesh generator 110 adjusts depth values of the second 2D mesh to generate a second intermediate 3D mesh comprising a watch case sub-mesh of the second intermediate 3D mesh, a first watch strap sub-mesh of the second intermediate 3D mesh, and a second watch strap sub-mesh of the second intermediate 3D mesh. The 3D mesh generator 110 then merges the first intermediate 3D mesh with the second intermediate 3D mesh to generate a third intermediate 3D mesh comprising a watch case sub-mesh of the third intermediate 3D mesh, a first watch strap sub-mesh of the third intermediate 3D mesh, and a second watch strap sub-mesh of the third intermediate 3D mesh. The 3D mesh generator 110 then applies a warping operation to the third intermediate 3D mesh according to the pre-defined skeleton to generate the 3D mesh.
For some embodiments, the 3D mesh generator 110 applies a warping operation to the third intermediate 3D mesh according to the pre-defined skeleton to generate the 3D mesh by adjusting the watch case sub-mesh of the third intermediate 3D mesh to be a watch case sub-mesh of a fourth intermediate 3D mesh so that a projection of the watch case sub-mesh of the fourth intermediate 3D mesh onto the x-z plane approximates the one straight part of the pre-defined skeleton.
The 3D mesh generator 110 adjusts the first watch strap sub-mesh of the third intermediate 3D mesh to be a first watch strap sub-mesh of the fourth intermediate 3D mesh so that the projection of the first watch strap sub-mesh of the fourth intermediate 3D mesh onto the x-z plane approximates one of the curved parts of the pre-defined skeleton. The 3D mesh generator 110 adjusts the second watch strap sub-mesh of the third intermediate 3D mesh to be a second watch strap sub-mesh of the fourth intermediate 3D mesh so that the projection of the second watch strap sub-mesh of the fourth intermediate 3D mesh onto the x-z plane approximates one of the curved parts of the pre-defined skeleton. The 3D mesh generator 110 then defines the fourth intermediate 3D mesh as the 3D mesh.
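A simplified sketch of this warping step is given below, assuming the mesh is oriented so that the straps extend along the x-axis, the case is centered at x = 0, and the skeleton can be evaluated as z = f(x) on the x-z plane (a flat segment for the straight part and quadratic arcs for the curved parts); these orientation and parameter choices are illustrative assumptions.

```python
import numpy as np

def warp_to_skeleton(vertices, case_half_width=40.0, curvature=0.002):
    """Offset each vertex's z so that the mesh's projection onto the x-z plane
    follows the skeleton: a flat (straight) part over the watch case and two
    quadratic (curved) parts along the straps."""
    warped = vertices.copy()
    x = warped[:, 0]

    # Straight part: over the case (|x| <= case_half_width) the skeleton is z = 0.
    # Curved parts: beyond the case, the skeleton bends away quadratically,
    # modeling how the straps wrap around a wrist.
    overshoot = np.maximum(np.abs(x) - case_half_width, 0.0)
    skeleton_z = -curvature * overshoot ** 2

    # Add the skeleton offset on top of each vertex's existing z (its thickness).
    warped[:, 2] += skeleton_z
    return warped
```

Projecting the warped sub-meshes onto the x-z plane then approximates the straight and curved parts of the pre-defined skeleton as described above.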
Referring back to the system block diagram in
The image editor 114 in
The image editor 114 then renders the textured 3D watch object on the wrist depicted in the second image according to the pose of the wrist and the at least one control point of the textured 3D watch object. For some embodiments, the image editor 114 is configured to render the textured 3D watch object while capturing live video of the user. For example, the user may utilize a camera of the computing device 102 to capture a video of the user's wrist. The image editor 114 then renders the textured 3D watch object on the user's wrist depicted in the video.
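The placement of the textured watch on the wrist can be sketched as a rigid transform, assuming the wrist pose is available as a rotation matrix plus a wrist anchor position and that a single control point on the watch (for example, the center of the case back) is to be moved onto that anchor; these names and the single-control-point simplification are assumptions for illustration only.

```python
import numpy as np

def place_watch_on_wrist(watch_vertices, control_point, wrist_rotation, wrist_anchor):
    """Rigidly transform the textured watch mesh so that its control point lands
    on the wrist anchor and the mesh is oriented by the estimated wrist pose."""
    # Rotate the mesh about its control point, then translate the control
    # point onto the wrist anchor position.
    centered = watch_vertices - control_point
    rotated = centered @ wrist_rotation.T
    return rotated + wrist_anchor
```

In a live-video setting, the same transform would simply be recomputed for each frame from the newly estimated wrist pose.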
The processing device 202 may include a custom-made processor, a central processing unit (CPU), or an auxiliary processor among several processors associated with the computing device 102, a semiconductor-based microprocessor (in the form of a microchip), a macroprocessor, one or more application-specific integrated circuits (ASICs), a plurality of suitably configured digital logic gates, and so forth.
The memory 214 may include one or a combination of volatile memory elements (e.g., random-access memory (RAM) such as DRAM and SRAM) and nonvolatile memory elements (e.g., ROM, hard drive, tape, CDROM). The memory 214 typically comprises a native operating system 216, one or more native applications, emulation systems, or emulated applications for any of a variety of operating systems and/or emulated hardware platforms, emulated operating systems, etc. For example, the applications may include application-specific software that may comprise some or all of the components of the computing device 102 displayed in
In accordance with such embodiments, the components are stored in memory 214 and executed by the processing device 202, thereby causing the processing device 202 to perform the operations/functions disclosed herein. For some embodiments, the components in the computing device 102 may be implemented by hardware and/or software.
Input/output interfaces 204 provide interfaces for the input and output of data. For example, where the computing device 102 comprises a personal computer, these components may interface with one or more input/output interfaces 204, which may comprise a keyboard or a mouse, as shown in
In the context of this disclosure, a non-transitory computer-readable medium stores programs for use by or in connection with an instruction execution system, apparatus, or device. More specific examples of a computer-readable medium may include by way of example and without limitation: a portable computer diskette, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM, EEPROM, or Flash memory), and a portable compact disc read-only memory (CDROM) (optical).
Reference is made to
Although the flowchart 300 of
At block 310, the computing device 102 obtains an image depicting a watch. At block 320, the computing device 102 performs segmentation on the watch in the image to generate a segmented watch region comprising a watch case region, a first watch strap region, and a second watch strap region. For some embodiments, the computing device 102 performs segmentation using a machine learning method, an image processing method, or a user-specified method.
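The disclosure leaves the segmentation method open; purely for illustration, the sketch below assumes a binary watch mask has already been produced by some upstream segmenter and splits it into the three regions with a simple geometric heuristic (the radius factor and the assumption of a roughly vertical watch are illustrative only).

```python
import numpy as np

def split_watch_mask(mask, case_radius_factor=0.22):
    """Split a binary watch mask (H x W, nonzero = watch pixels) into a watch
    case region and two strap regions, assuming the watch is roughly vertical
    in the image and the case sits near the mask's centroid."""
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    height = ys.max() - ys.min() + 1

    # Case: pixels within a fixed fraction of the watch height from the centroid.
    case_radius = case_radius_factor * height
    dist = np.sqrt((ys - cy) ** 2 + (xs - cx) ** 2)
    case = dist <= case_radius

    # Straps: remaining pixels above and below the case, respectively.
    case_mask = np.zeros_like(mask, dtype=bool)
    strap1_mask = np.zeros_like(mask, dtype=bool)
    strap2_mask = np.zeros_like(mask, dtype=bool)
    case_mask[ys[case], xs[case]] = True
    above = (~case) & (ys < cy)
    below = (~case) & (ys >= cy)
    strap1_mask[ys[above], xs[above]] = True
    strap2_mask[ys[below], xs[below]] = True
    return case_mask, strap1_mask, strap2_mask
```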
At block 330, the computing device 102 generates a 3D mesh according to the watch case region, the first watch strap region, the second watch strap region, and a pre-defined skeleton containing at least one straight part and at least two curved parts. For some embodiments, the computing device 102 generates the 3D mesh by defining a watch case sub-mesh of the 3D mesh, a first watch strap sub-mesh of the 3D mesh, and a second watch strap sub-mesh of the 3D mesh according to the watch case region, the first watch strap region, and the second watch strap region. The computing device 102 adjusts thickness values of the watch case sub-mesh of the 3D mesh to be greater than thickness values of the first watch strap sub-mesh of the 3D mesh and the second watch strap sub-mesh of the 3D mesh. The computing device 102 also adjusts thickness values of the 3D mesh to incorporate a thickness transition around a first joint of the watch case sub-mesh of the 3D mesh and the first watch strap sub-mesh of the 3D mesh. The computing device 102 further adjusts thickness values of the 3D mesh to incorporate a thickness transition around a second joint of the watch case sub-mesh of the 3D mesh and the second watch strap sub-mesh of the 3D mesh.
For other embodiments, the computing device 102 generates the 3D mesh by generating a first 2D mesh comprising a watch case sub-mesh of the first 2D mesh, a first watch strap sub-mesh of the first 2D mesh, and a second watch strap sub-mesh of the first 2D mesh according to the watch case region, the first watch strap region and the second watch strap region. The computing device 102 duplicates the first 2D mesh to generate a second 2D mesh comprising a watch case sub-mesh of the second 2D mesh, a first watch strap sub-mesh of the second 2D mesh, and a second watch strap sub-mesh of the second 2D mesh. The computing device 102 adjusts depth values of the first 2D mesh to generate a first intermediate 3D mesh comprising a watch case sub-mesh of the first intermediate 3D mesh, a first watch strap sub-mesh of the first intermediate 3D mesh, and a second watch strap sub-mesh of the first intermediate 3D mesh. The computing device 102 also adjusts depth values of the second 2D mesh to generate a second intermediate 3D mesh comprising a watch case sub-mesh of the second intermediate 3D mesh, a first watch strap sub-mesh of the second intermediate 3D mesh, and a second watch strap sub-mesh of the second intermediate 3D mesh. The computing device 102 then merges the first intermediate 3D mesh with the second intermediate 3D mesh to generate a third intermediate 3D mesh comprising a watch case sub-mesh of the third intermediate 3D mesh, a first watch strap sub-mesh of the third intermediate 3D mesh, and a second watch strap sub-mesh of the third intermediate 3D mesh. The computing device 102 then applies a warping operation to the third intermediate 3D mesh according to the pre-defined skeleton to generate the 3D mesh.
For some embodiments, the computing device 102 generates a 3D mesh according to the watch, the plurality of anchor points, and a pre-defined skeleton. The pre-defined skeleton may be defined according to a quadratic curve. As discussed earlier, the pre-defined skeleton models the curvature of the user's wrist for purposes of later rendering the 3D watch object onto the user's wrist. For some embodiments, the computing device 102 generates the 3D mesh by generating a 2D mesh according to the watch and adjusting the depth values of the 2D mesh to generate a first intermediate 3D mesh. The computing device 102 generates a second intermediate 3D mesh that comprises an inverted duplicate of the first intermediate 3D mesh. The first intermediate 3D mesh is merged with the second intermediate 3D mesh to generate a third intermediate 3D mesh.
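A pre-defined skeleton of this form can be sampled as sketched below, assuming it lies on the x-z plane with one straight part for the case and two quadratic curved parts for the straps; the half-width, strap length, and curvature values are illustrative assumptions.

```python
import numpy as np

def sample_predefined_skeleton(case_half_width=40.0, strap_length=60.0,
                               curvature=0.002, samples_per_part=50):
    """Sample a pre-defined skeleton on the x-z plane: one straight part for
    the watch case and two quadratic curved parts for the straps."""
    # Straight part: the case lies flat along z = 0.
    xs_case = np.linspace(-case_half_width, case_half_width, samples_per_part)
    straight = np.column_stack([xs_case, np.zeros_like(xs_case)])

    # Curved parts: quadratic arcs continuing from each end of the case,
    # bending toward -z to model the curvature of a wrist.
    xs_strap = np.linspace(0.0, strap_length, samples_per_part)
    right = np.column_stack([case_half_width + xs_strap, -curvature * xs_strap ** 2])
    left = np.column_stack([-case_half_width - xs_strap, -curvature * xs_strap ** 2])
    return straight, left, right
```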
The computing device 102 adjusts the thickness values of the third intermediate 3D mesh according to the anchor points to generate a fourth intermediate 3D mesh. The thickness values associated with the watch case are generally adjusted to be greater than thickness values associated with the watch strap. A warping operation is performed on the fourth intermediate 3D mesh according to the plurality of anchor points and the pre-defined skeleton to generate the 3D mesh.
For other embodiments, the computing device 102 generates the 3D mesh by generating a first 2D mesh according to the segmented watch region and generates a second 2D mesh comprising a duplicate of the first 2D mesh. The computing device 102 adjusts depth values of the first 2D mesh to generate a first intermediate 3D mesh and adjusts depth values of the second 2D mesh to generate a second intermediate 3D mesh. The first intermediate 3D mesh is then merged with the second intermediate 3D mesh to generate a third intermediate 3D mesh.
As an alternative to generating a second 2D mesh comprising a duplicate of the first 2D mesh, the computing device 102 duplicates and inverts the first intermediate 3D mesh to generate a second intermediate 3D mesh comprising a watch case sub-mesh of the second intermediate 3D mesh, a first watch strap sub-mesh of the second intermediate 3D mesh, and a second watch strap sub-mesh of the second intermediate 3D mesh. The first intermediate 3D mesh is merged with the second intermediate 3D mesh to generate the third intermediate 3D mesh.
The computing device 102 adjusts thickness values of the third intermediate 3D mesh according to the various anchor points to generate a fourth intermediate 3D mesh. A warping operation is applied to the fourth intermediate 3D mesh according to the anchor points and the pre-defined skeleton to generate the 3D mesh.
The depth values of the first 2D mesh are adjusted to generate the first intermediate 3D mesh by transforming the first 2D mesh onto a plane perpendicular to a z-axis in a 3D coordinate system to generate the first intermediate 3D mesh with all vertices initially having the same z-value along the z-axis. In particular, each vertex (a, b) of the first 2D mesh is mapped to a corresponding vertex (a, b, 0) in the 3D coordinate system. The z-coordinate of each vertex of the first intermediate 3D mesh is then adjusted according to the distance between that vertex and the closest edge. In particular, the z-coordinate is adjusted according to an inverse relationship with respect to the distance between each vertex and the closest edge such that the z-coordinate is adjusted to a greater extent as the distance decreases. For some embodiments, a threshold is utilized to determine which vertices of the first intermediate 3D mesh should be adjusted: for each vertex, if the distance between the vertex and the edge is less than the threshold, the z-coordinate of that vertex is adjusted. Viewed from the side, the adjusted vertices appear as a curve, such that the adjusted first intermediate 3D mesh comprises a curved 3D mesh.
The depth values of the second 2D mesh are adjusted to generate the second intermediate 3D mesh using the same steps as those used for generating the first intermediate 3D mesh except that the z-coordinate adjustment for each vertex is in the opposite direction to the adjustment of the first intermediate 3D mesh. The first and second intermediate 3D meshes are merged to generate the third intermediate 3D mesh by applying a constant translation to either one of the intermediate 3D meshes or to both of the intermediate 3D meshes to align the vertices at the edges of the first and the second intermediate 3D meshes.
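The merge of the two halves can be sketched as follows, assuming the two intermediate meshes share the same vertex ordering and the same set of boundary-vertex indices, so that a single constant translation along z closes the gap at the shared rim; the names below are illustrative assumptions.

```python
import numpy as np

def merge_intermediate_meshes(upper_vertices, lower_vertices, boundary_indices):
    """Translate the lower (oppositely adjusted) half so its boundary vertices
    coincide with the upper half's boundary, then stack the two vertex sets
    into the merged 'third intermediate' mesh."""
    # Constant translation along z that aligns the edge vertices of both halves.
    gap = upper_vertices[boundary_indices, 2] - lower_vertices[boundary_indices, 2]
    offset = np.array([0.0, 0.0, gap.mean()])

    return np.vstack([upper_vertices, lower_vertices + offset])
```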
At block 340, the computing device 102 generates texture attributes according to the segmented watch. The texture attributes may comprise, for example, an albedo texture or a normal texture. The texture attributes may be generated according to the watch case region, the first watch strap region, or the second watch strap region. At block 350, the computing device 102 applies the texture attributes to the 3D mesh to generate a textured 3D watch object.
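A minimal sketch of deriving an albedo texture and per-vertex UV coordinates from the segmented watch pixels is shown below, assuming the source image and segmentation mask are NumPy arrays; normal-map generation and any filtering are omitted, and the crop-based normalization is an assumption for illustration.

```python
import numpy as np

def build_albedo_and_uvs(image, watch_mask, vertices_2d):
    """Crop the segmented watch pixels into an albedo texture and map each
    mesh vertex's original image position to UV coordinates in that texture."""
    ys, xs = np.nonzero(watch_mask)
    y0, y1, x0, x1 = ys.min(), ys.max() + 1, xs.min(), xs.max() + 1

    # Albedo texture: the image crop, with non-watch pixels zeroed out.
    albedo = image[y0:y1, x0:x1].copy()
    albedo[~watch_mask[y0:y1, x0:x1]] = 0

    # UVs: normalize each vertex's (x, y) image position into the crop.
    u = (vertices_2d[:, 0] - x0) / (x1 - x0)
    v = (vertices_2d[:, 1] - y0) / (y1 - y0)
    return albedo, np.column_stack([u, v])
```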
At block 360, the computing device 102 renders the textured 3D watch object in an augmented reality display. For some embodiments, the computing device 102 renders the textured 3D watch object in the augmented reality display by obtaining a second image depicting a wrist and determining a pose of the wrist in the second image. The computing device 102 obtains at least one control point of the textured 3D watch object and renders the textured 3D watch object on the wrist in the second image according to the pose of the wrist and the at least one control point of the textured 3D watch object. Thereafter, the process in
It should be emphasized that the above-described embodiments of the present disclosure are merely possible examples of implementations set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are included herein within the scope of this disclosure and protected by the following claims.
This application claims priority to, and the benefit of, U.S. Provisional Patent application entitled, “Method for 3D Watch Object Reconstruction from a Watch Image,” having Ser. No. 63/515,882, filed on Jul. 27, 2023, which is incorporated by reference in its entirety.