1. Field of the Invention
The invention relates to a method of applying virtual makeup, a virtual makeup electronic system and an electronic device having the virtual makeup electronic system and, more particularly, to a method of applying virtual makeup in real time, a real-time virtual makeup electronic system and an electronic device having the real-time virtual makeup electronic system.
2. Description of the Related Art
Conventionally, a method of applying virtual makeup to a face image searches for feature points (such as the eyes and the lips) in a two-dimensional (2D) image of a face, and then the virtual makeup (such as virtual eye shadow or virtual lipstick) is applied at a corresponding position of the 2D image.
However, when a user turns or moves his or her head, or when certain facial features are covered, the feature points cannot easily be found correctly. The virtual makeup then fails to be applied or is applied to inappropriate positions. Moreover, since a conventional virtual makeup method uses a fixed shape and size for a face image, the virtual makeup effect is proper only when the user faces the camera directly in a frontal view. If the user does not face the camera directly, the virtual makeup process is not performed properly because the fixed shape and size cannot be changed accordingly.
A method of applying virtual makeup is provided, in which a shape, a size and a position of the virtual makeup are adjusted in real time as the face moves or turns.
A virtual makeup electronic system is provided, which provides suitable virtual makeup along with the moving or turning of a face in real time.
An electronic device having a virtual makeup electronic system is provided, and suitable virtual makeup is provided and displayed along with the moving or turning of a face in real time.
A method of applying virtual makeup applied to an electronic device is provided. The method of applying virtual makeup includes the following steps: obtaining a plurality of facial images of different angles of a face to construct a three-dimensional (3D) facial model corresponding to the face; recording a real-time facial image of the face, wherein the 3D facial model varies with the face in real time according to a position and an angle of the real-time facial image; providing 3D virtual makeup to the 3D facial model; converting the 3D virtual makeup to two-dimensional (2D) virtual makeup according to the position and the angle of the real-time facial image; combining the real-time facial image and the 2D virtual makeup to generate an output image; and displaying the output image.
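For illustration only, and not as the claimed implementation, the steps above may be sketched as a per-frame loop in which a 3D facial model is built once and then posed, textured and projected for every real-time facial image. Every helper in the following sketch is a hypothetical placeholder with a trivial body so the skeleton runs end to end.

```python
# A minimal, illustrative sketch of the per-frame flow (not the disclosed
# implementation).  All helpers are hypothetical placeholders.
import numpy as np

def construct_3d_model(calibration_frames):
    # Placeholder: a real system would reconstruct a facial model from the
    # multi-angle facial images; here we simply return fixed 3D landmarks.
    return np.array([[0.0, 0.0, 0.0], [30.0, 0.0, 0.0], [15.0, 40.0, -5.0]])

def estimate_pose(frame, model_3d):
    # Placeholder: a real system would track feature points and solve for
    # the head position and angle (see the pose-estimation sketch below).
    return np.zeros(3), np.eye(3)

def project_makeup(model_3d, rotation, translation, frame_shape):
    # Placeholder: render the 3D makeup into a 2D RGBA layer of frame size.
    return np.zeros((*frame_shape[:2], 4), dtype=np.float32)

def blend(frame, makeup_rgba):
    alpha = makeup_rgba[..., 3:4]                 # per-pixel opacity
    return (1.0 - alpha) * frame + alpha * makeup_rgba[..., :3]

# --- pipeline ---
calibration_frames = [np.zeros((480, 640, 3), np.float32) for _ in range(5)]
frame_stream = [np.zeros((480, 640, 3), np.float32) for _ in range(3)]

face_model_3d = construct_3d_model(calibration_frames)      # build 3D model once
for frame in frame_stream:                                   # real-time images
    translation, rotation = estimate_pose(frame, face_model_3d)
    makeup_2d = project_makeup(face_model_3d, rotation, translation, frame.shape)
    output = blend(frame, makeup_2d)                          # combined output image
    # display(output) would show the result on the screen
```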
A virtual makeup electronic system applied to an electronic device is provided. The virtual makeup electronic system includes an image receiving unit, a 3D facial model constructing unit, a 3D facial model moving unit, a makeup information receiving unit, a virtual makeup unit and a data processing unit. The image receiving unit receives a plurality of facial images of different angles of a face and a real-time facial image of the face. The 3D facial model constructing unit is coupled to the image receiving unit, and the 3D facial model constructing unit constructs a 3D facial model via the facial images. The 3D facial model moving unit is coupled to the image receiving unit and the 3D facial model constructing unit, and the 3D facial model moving unit changes a position and an angle of the 3D facial model according to a position and an angle of the real-time facial image. The makeup information receiving unit receives makeup information. The virtual makeup unit is coupled to the 3D facial model constructing unit, the 3D facial model moving unit and the makeup information receiving unit, and the virtual makeup unit provides 3D virtual makeup to the 3D facial model according to the makeup information. The data processing unit is coupled to the image receiving unit and the virtual makeup unit; the data processing unit converts the 3D virtual makeup to 2D virtual makeup according to the position and the angle of the real-time facial image, and the real-time facial image and the 2D virtual makeup are combined to generate an output image.
An electronic device having a virtual makeup electronic system is provided. The electronic device includes an image capture module, a processing module and a display screen. The image capture module captures a real-time facial image of a face. The processing module is electrically connected to the image capture module, and the processing module constructs a 3D facial model via a plurality of facial images of different angles of the face. The processing module makes the 3D facial model vary with the face in real time according to the real-time facial image. The processing module provides 3D virtual makeup to the 3D facial model according to a position and an angle of the real-time facial image, and the processing module converts the 3D virtual makeup to 2D virtual makeup and combines the real-time facial image and the 2D virtual makeup to generate an output image. The display screen is electrically connected to the processing module, and the display screen displays the output image.
In sum, the method of applying virtual makeup and the electronic device having the virtual makeup electronic system construct the 3D facial model via the facial images of different angles, and the 3D facial model varies with the face in real time. The processing module provides the 3D virtual makeup to the 3D facial model, and the processing module converts the 3D virtual makeup to the 2D virtual makeup according to the position and the angle of the real-time facial image (such as the angle between the image capture module and the face). The position, the shape, the size and the angle of the 2D virtual makeup change according to the real-time facial image as the face moves or turns. The real-time facial image and the 2D virtual makeup are combined to generate the output image. The display screen displays the output image. Consequently, even if the face turns to an angle at which the feature points of the real-time facial image are different from those of the frontal view of the face (for example, when the face turns 60°, the size of and the distance between the eyes are changed) or part of the feature points are covered, the output image still has 2D virtual makeup that is similar to the actual makeup, and the output image looks more natural.
These and other features, aspects and advantages of the invention will become better understood with regard to the following embodiments and accompanying drawings.
In the embodiment, the electronic device 200 having the virtual makeup electronic system includes an image capture module 210, a processing module 220, a storage module 230 and a display screen 250. In an embodiment, the image capture module 210 is a camera of a notebook computer, the processing module 220 is a central processing unit (CPU), and the storage module 230 is a hard disk, a compact disc, or a flash drive, which is not limited herein. The image capture module 210, the storage module 230 and the display screen 250 are electrically connected to the processing module 220, respectively.
In the embodiment, the method of applying virtual makeup 100 includes the following steps. A plurality of facial images of different angles of a face are obtained to construct a three-dimensional (3D) facial model (step 110). In the embodiment, the user turns his or her face to different angles in front of the image capture module 210 to take multiple facial images of different angles. The image capture module 210 is a 2D image capture module or a 3D image capture module, which is used to capture a 2D image or a 3D image. In an embodiment, the facial images are captured by one or a plurality of lenses and input to the electronic device 200 having the virtual makeup electronic system.
In the embodiment, the processing module 220 constructs a 3D facial model 30 via the facial images of different angles.
In an embodiment, a method of quickly constructing the 3D facial model 30 is provided. In step 114, the storage module 230 stores a 3D facial model database, and the 3D facial model database collects numerous (for example, hundreds of) 3D facial model samples. When the processing module 220 analyzes the real-time facial images 10, the processing module 220 selects the most similar model sample in the 3D facial model database to apply directly according to the information of the main feature points 12 (which are set manually by the user) of the real-time facial images 10, and the 3D facial model 30 is thus constructed (step 114). In the embodiment, step 114 simplifies the processing program of the processing module 220, the same 3D facial model 30 can be used by different users who have similar facial features, and the efficiency is thus improved.
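A minimal sketch of one way such a "most similar model sample" lookup could be performed is given below. It assumes each database entry stores a vector of main feature-point coordinates and uses a simple normalized Euclidean distance; both are illustrative assumptions rather than the disclosed design.

```python
# Illustrative only: pick the stored 3D facial model whose normalized main
# feature points are closest to the feature points measured from the image.
import numpy as np

def normalize_points(points):
    # Remove translation and scale so faces of different sizes and positions
    # can be compared (a simple, assumed normalization).
    centered = points - points.mean(axis=0)
    return centered / np.linalg.norm(centered)

def select_model(database_points, database_models, measured_points):
    query = normalize_points(measured_points)
    distances = [np.linalg.norm(normalize_points(p) - query) for p in database_points]
    return database_models[int(np.argmin(distances))]

# Toy database: main feature points (e.g., eye corners, nose tip, mouth
# corners) for a number of stored samples, plus the models (labels here).
rng = np.random.default_rng(0)
database_points = [rng.normal(size=(7, 2)) for _ in range(100)]
database_models = [f"model_{i}" for i in range(100)]

# Feature points measured from a user who resembles sample 42.
measured_points = database_points[42] + rng.normal(scale=0.01, size=(7, 2))
print(select_model(database_points, database_models, measured_points))
```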
Further, a real-time facial image 10 of the face is recorded, and the 3D facial model 30 varies with the change of the face in real time according to a position and an angle of the real-time facial image 10 (step 120). In the embodiment, the image capture module 210 captures images of the user in front of it in real time to obtain the real-time facial images 10. The processing module 220 continuously analyzes the feature points of each of the real-time facial images 10 to adjust the position and the angle of the 3D facial model 30 in real time (step 122). As a result, the 3D facial model 30 varies with the movement of the face in real time.
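One common way to recover a position and an angle from tracked feature points (an assumption here, not necessarily the disclosed technique) is a perspective-n-point solve, for example with OpenCV. The feature-point coordinates and camera intrinsics below are synthetic examples.

```python
# Illustrative pose estimation: given 3D feature points on the facial model
# and their tracked 2D locations in the current frame, recover the head's
# rotation and translation with OpenCV's solvePnP.
import numpy as np
import cv2

# 3D coordinates of a few model feature points, in the facial model's
# coordinate system (millimetres, assumed).
model_points_3d = np.array([
    [-30.0,  35.0, -20.0],   # outer corner of the left eye
    [ 30.0,  35.0, -20.0],   # outer corner of the right eye
    [  0.0,   0.0,   0.0],   # nose tip
    [-25.0, -35.0, -15.0],   # left mouth corner
    [ 25.0, -35.0, -15.0],   # right mouth corner
    [  0.0, -65.0, -10.0],   # chin
], dtype=np.float64)

# Where those feature points were detected in the real-time facial image (pixels).
image_points_2d = np.array([
    [285.0, 205.0], [355.0, 205.0], [320.0, 250.0],
    [292.0, 295.0], [348.0, 295.0], [320.0, 330.0],
], dtype=np.float64)

# A simple pinhole camera model (assumed intrinsics for a 640x480 image).
focal, center = 640.0, (320.0, 240.0)
camera_matrix = np.array([[focal, 0, center[0]],
                          [0, focal, center[1]],
                          [0, 0, 1]], dtype=np.float64)
dist_coeffs = np.zeros(4)  # assume no lens distortion

ok, rvec, tvec = cv2.solvePnP(model_points_3d, image_points_2d,
                              camera_matrix, dist_coeffs)
rotation_matrix, _ = cv2.Rodrigues(rvec)   # angle of the face
print("rotation:\n", rotation_matrix, "\ntranslation:\n", tvec)  # position/angle
```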
In the embodiment, when the user in front of the image capture module 210 turns the face, the image capture module 210 captures the real-time facial images 10 of different time sequences. Since the real-time facial images 10 of different time sequences are different (for example, the user turns the face or leans the head), the position and the angle of the 3D facial model 30, and thus the 2D virtual makeup 20a, are adjusted continuously to match each real-time facial image 10.
Moreover, since the surrounding light affects how the 2D virtual makeup 20a appears, in the embodiment the electronic device 200 further includes a sensor module 240 which detects lighting information around the face.
In the embodiment, the sensor module 240 detects the lighting information around the face when the image capture module 210 captures the image of the face, and the processing module 220 adjusts the luminosity and the shadow of the 3D virtual makeup 20 according to the received lighting information, so as to obtain a natural effect. In an embodiment, when the ambient light around the user is strong, the hue of the 3D virtual makeup 20 is made lighter accordingly. In an embodiment, when the light illuminates the user from the left side, the left side of the face looks brighter than the right side, and the brightness and the hue of the 3D virtual makeup 20 at different parts of the 3D facial model 30 are adjusted accordingly.
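As a minimal sketch of the kind of adjustment described, assuming the lighting information is reduced to a per-region brightness estimate of the captured frame, the brightness of the makeup layer may be scaled region by region. The half-face split and the reference brightness are illustrative choices, not the disclosed algorithm.

```python
# Illustrative only: scale the brightness of the makeup layer region by
# region so that it follows the measured lighting of the real-time image.
import numpy as np
import cv2

def adjust_makeup_to_lighting(makeup_bgr, frame_bgr, reference_brightness=128.0):
    """Brighten or darken each half of the makeup layer to match the frame."""
    adjusted = makeup_bgr.astype(np.float32)
    height, width = frame_bgr.shape[:2]
    for x0, x1 in [(0, width // 2), (width // 2, width)]:   # left / right halves
        gray = cv2.cvtColor(frame_bgr[:, x0:x1], cv2.COLOR_BGR2GRAY)
        gain = float(gray.mean()) / reference_brightness     # measured lighting
        adjusted[:, x0:x1] *= gain                            # lighter or darker
    return np.clip(adjusted, 0, 255).astype(np.uint8)

# Synthetic example: a flat red "makeup" layer and a frame lit from the left.
makeup = np.full((480, 640, 3), (40, 40, 200), np.uint8)   # BGR reddish layer
frame = np.zeros((480, 640, 3), np.uint8)
frame[:, :320] = 200   # bright left side
frame[:, 320:] = 60    # darker right side
result = adjust_makeup_to_lighting(makeup, frame)
```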
In the embodiment, the processing module 220 provides the 3D virtual makeup 20 to the 3D facial model 30, and then the processing module 220 converts the 3D virtual makeup 20 to the 2D virtual makeup 20a according to the position and the angle of the real-time facial image 10 (such as the angle between the image capture module 210 and the face). Thus, even if the face turns to an angle at which the feature points 12 of the real-time facial image 10 are different from those of the frontal face, or part of the feature points 12 are covered, the output image 40 still has the suitable 2D virtual makeup 20a. In other words, the position, the shape, the size, the angle, the luminosity and the shadow of the 2D virtual makeup 20a vary with the moving or turning of the face, and the 2D virtual makeup 20a of the output image 40 displayed on the display screen 250 looks more natural.
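To illustrate the 3D-to-2D conversion and the combining step (using an assumed pinhole projection and simple alpha blending, not necessarily the disclosed method), colored points of 3D virtual makeup can be projected with the estimated rotation and translation and then composited over the real-time frame.

```python
# Illustrative only: project colored 3D "makeup" points into the 2D image
# using the estimated pose, then alpha-blend them over the real-time frame.
import numpy as np

def project_points(points_3d, rotation, translation, focal, center):
    # Simple pinhole projection: X_cam = R * X + t, then perspective divide.
    cam = points_3d @ rotation.T + translation
    u = focal * cam[:, 0] / cam[:, 2] + center[0]
    v = focal * cam[:, 1] / cam[:, 2] + center[1]
    return np.stack([u, v], axis=1)

def blend_makeup(frame, points_2d, color, alpha=0.6, radius=4):
    out = frame.astype(np.float32)
    h, w = frame.shape[:2]
    for u, v in points_2d.astype(int):
        y0, y1 = max(v - radius, 0), min(v + radius, h)
        x0, x1 = max(u - radius, 0), min(u + radius, w)
        out[y0:y1, x0:x1] = (1 - alpha) * out[y0:y1, x0:x1] + alpha * np.array(color)
    return out.astype(np.uint8)

# Synthetic makeup points along the lips of the model, posed in front of an
# assumed 640x480 camera (rotation/translation would come from pose tracking).
lip_points_3d = np.array([[x, -35.0, 400.0] for x in np.linspace(-25, 25, 11)])
rotation, translation = np.eye(3), np.array([0.0, 0.0, 0.0])
points_2d = project_points(lip_points_3d, rotation, translation,
                           focal=640.0, center=(320.0, 240.0))

frame = np.full((480, 640, 3), 180, np.uint8)                 # stand-in face image
output = blend_makeup(frame, points_2d, color=(40, 40, 200))  # reddish makeup layer
```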
A virtual makeup electronic system 300 is provided, which is executed by an electronic device (such as the electronic device 200 described above). The virtual makeup electronic system 300 includes an image receiving unit 310, a 3D facial model constructing unit 320, a 3D facial model moving unit 330, a makeup information receiving unit 340, a virtual makeup unit 350 and a data processing unit 360.
In the embodiment, the image receiving unit 310 receives a plurality of facial images of different angles and a real-time facial image. The makeup information receiving unit 340 receives makeup information. In the embodiment, the image receiving unit 310 and the makeup information receiving unit 340 are separate components.
The 3D facial model constructing unit 320 is coupled to the image receiving unit 310, and the 3D facial model constructing unit 320 utilizes the facial images to construct a 3D facial model. In the embodiment, the virtual makeup electronic system 300 further includes a 3D facial model database 370 which is coupled to the 3D facial model constructing unit 320. The 3D facial model constructing unit 320 selects the most similar model sample in the 3D facial model database 370 to apply directly according to a plurality of feature points of the facial images, and the suitable 3D facial model is thus constructed. In an embodiment, the 3D facial model database 370 is omitted from the virtual makeup electronic system 300, and the 3D facial model constructing unit 320 constructs the 3D facial model directly according to the feature points of the facial images.
The 3D facial model moving unit 330 is coupled to the image receiving unit 310 and the 3D facial model constructing unit 320. The 3D facial model moving unit 330 changes the position and the angle of the 3D facial model according to the position and the angle of the real-time facial image. That is, the 3D facial model varies with the change of the face in real time.
The virtual makeup unit 350 is coupled to the 3D facial model constructing unit 320, the 3D facial model moving unit 330 and the makeup information receiving unit 340, and the virtual makeup unit 350 provides 3D virtual makeup to the 3D facial model according to the makeup information. In more detail, the term "makeup information" herein refers to the face area to which the makeup is applied, the skin color, the shape of the face, and the whole makeup scope of the face while the 3D facial model is turning or moving. In the embodiment, the virtual makeup electronic system 300 further includes a lighting information receiving unit 380 which is coupled to the virtual makeup unit 350. The lighting information receiving unit 380 receives lighting information, and the virtual makeup unit 350 adjusts the luminosity and the hue of the 3D virtual makeup according to the lighting information.
The data processing unit 360 is coupled to the image receiving unit 310 and the virtual makeup unit 350. The data processing unit 360 converts the 3D virtual makeup to the 2D virtual makeup according to the position and the angle of the real-time facial image, and the data processing unit 360 combines the real-time facial image and the 2D virtual makeup to generate an output image. In the embodiment, the 2D virtual makeup of the output image matches the angle and the position of the real-time facial image, which makes the 2D virtual makeup look realistic.
In sum, the method of applying virtual makeup and the electronic device having the virtual makeup electronic system construct the 3D facial model via the facial images of different angles, and the 3D facial model varies with the face in real time. The processing module provides the 3D virtual makeup to the 3D facial model, and the processing module converts the 3D virtual makeup to the 2D virtual makeup according to the position and the angle of the real-time facial image (such as the angle between the image capture module and the face). The position, the shape, the size and the angle of the 2D virtual makeup change as the real-time facial image varies (the face moves or turns). The real-time facial image and the 2D virtual makeup are combined to generate the output image. The display screen displays the output image. Consequently, even if the face turns to an angle at which the feature points of the real-time facial image are different from those of the frontal face (for example, when the face turns 60°, the size of and the distance between the eyes are different from those of the frontal face) or part of the feature points are covered, the output image still has 2D virtual makeup that is similar to realistic makeup, and the output image looks more natural.
Although the invention has been disclosed with reference to certain preferred embodiments thereof, the disclosure is not intended to limit the scope of the invention. Persons having ordinary skill in the art may make various modifications and changes without departing from the spirit and the scope of the invention. Therefore, the scope of the appended claims should not be limited to the description of the preferred embodiments described above.
This application claims the priority benefits of U.S. provisional application Ser. No. 62/034,800, filed on Aug. 8, 2014, and TW application serial No. 104119554, filed on Jun. 17, 2015. The entirety of each of the above-mentioned patent applications is hereby incorporated by reference herein and made a part of this specification.