The present disclosure relates to a processing device and method. More particularly, the present disclosure relates to an image processing device and method.
Traditionally, texture editing for a three-dimensional model is performed by UV mapping the three-dimensional model into two-dimensional images. The user must then adjust the contents of the two-dimensional images with a mouse cursor in order to edit the three-dimensional model.
However, the image processing described above is not intuitive and depends strongly on the user's editing experience. In addition, virtual reality products are targeted at average users, and the foregoing approach does not allow average users to perform such editing. Hence, how to provide an intuitive way to edit a three-dimensional model is a problem that must be solved.
One aspect of the present disclosure is related to an image processing device. In accordance with some embodiments of the present disclosure, the image processing device includes a display device and a processor. The display device is configured to display a virtual reality object; the processor, coupled to the display device, is configured to: receive a control command, and obtain an indicated region on the virtual reality object according to the control command; read a control block of a 2D image associated with the indicated region; read a 3D normal vector mapping data corresponding to the indicated region; calibrate a range of the control block according to a position data of the control command and the 3D normal vector mapping data; and edit, in the calibrated range, a content of the control block according to the received control command.
Another aspect of the present disclosure is related to an image processing method. In accordance with some embodiments of the present disclosure, the image processing method includes: receiving a control command, and obtaining an indicated region on a virtual reality object according to the control command; reading a control block of a 2D image associated with the indicated region; reading a 3D normal vector mapping data corresponding to the indicated region; calibrating a range of the control block according to a position data of the control command and the 3D normal vector mapping data; and editing, in the calibrated range, a content of the control block according to the received control command.
Another aspect of the present disclosure is related to a non-transitory computer readable storage medium. In accordance with some embodiments of the present disclosure, the non-transitory computer readable storage medium stores one or more programs including instructions which, when executed, cause a processing circuit to perform operations including: receiving a control command, and obtaining an indicated region on a virtual reality object according to the control command; reading a control block of a 2D image associated with the indicated region; reading a 3D normal vector mapping data corresponding to the indicated region; calibrating a range of the control block according to a position data of the control command and the 3D normal vector mapping data; and editing, in the calibrated range, a content of the control block according to the received control command.
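For orientation only, the following is a minimal Python sketch of the operations summarized above, assuming the position mapping data and the normal vector mapping data are simple per-pixel lookup tables; the names process_edit, position_mapping, and normal_mapping, and the dictionary form of the control command, are hypothetical placeholders rather than an implementation disclosed herein. The per-pixel direction vector is assumed to point from the mapped 3D surface point toward the position carried by the control command.

    import numpy as np

    def process_edit(control_command, texture, position_mapping, normal_mapping,
                     threshold_deg=90.0):
        # control_command:  dict with a 3D "position" and a paint "color" (assumed form)
        # texture:          H x W x 3 array holding the 2D texture image
        # position_mapping: {(u, v): 3D point}  per-pixel position mapping data
        # normal_mapping:   {(u, v): 3D normal} per-pixel normal vector mapping data
        origin = np.asarray(control_command["position"], dtype=float)
        color = np.asarray(control_command["color"], dtype=float)

        # Read the control block: here, simply every texture pixel that has mapping data.
        control_block = list(position_mapping.keys())

        # Calibrate the range of the control block: keep a pixel only when the angle
        # between its direction vector and its 3D normal vector stays within the threshold.
        calibrated_range = []
        for uv in control_block:
            direction = origin - np.asarray(position_mapping[uv], dtype=float)
            normal = np.asarray(normal_mapping[uv], dtype=float)
            cos_a = direction @ normal / (np.linalg.norm(direction)
                                          * np.linalg.norm(normal) + 1e-9)
            if np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))) <= threshold_deg:
                calibrated_range.append(uv)

        # Edit, within the calibrated range, the content of the control block.
        for (u, v) in calibrated_range:
            texture[v, u] = color
        return texture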
It is to be understood that both the foregoing general description and the following detailed description are by way of example, and are intended to provide further explanation of the disclosure as claimed.
The disclosure can be more fully understood by reading the following detailed description of the embodiments, with reference made to the accompanying drawings as follows:
Reference will now be made in detail to the present embodiments of the disclosure, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the description to refer to the same or like parts.
It will be understood that, in the description herein and throughout the claims that follow, when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected” or “directly coupled” to another element, there are no intervening elements present. Moreover, “electrically connect” or “connect” can further refer to the interoperation or interaction between two or more elements.
It will be understood that, in the description herein and throughout the claims that follow, although the terms “first,” “second,” etc. may be used to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the embodiments.
It will be understood that, in the description herein and throughout the claims that follow, the terms “comprise” or “comprising,” “include” or “including,” “have” or “having,” “contain” or “containing” and the like used herein are to be understood to be open-ended, i.e., to mean including but not limited to.
It will be understood that, in the description herein and throughout the claims that follow, the phrase “and/or” includes any and all combinations of one or more of the associated listed items.
It will be understood that, in the description herein and throughout the claims that follow, words indicating direction used in the description of the following embodiments, such as “above,” “below,” “left,” “right,” “front” and “back,” are directions as they relate to the accompanying drawings. Therefore, such words indicating direction are used for illustration and do not limit the present disclosure.
It will be understood that, in the description herein and throughout the claims that follow, unless otherwise defined, all terms (including technical and scientific terms) have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Any element in a claim that does not explicitly state “means for” performing a specified function, or “step for” performing a specific function, is not to be interpreted as a “means” or “step” clause as specified in 35 U.S.C. § 112(f). In particular, the use of “step of” in the claims herein is not intended to invoke the provisions of 35 U.S.C. § 112(f).
Reference is made to
The virtual reality device 100 includes a display device 110, a processor 120 and a storage medium 130. The processor 120 is coupled to the display device 110 and the storage medium 130. The storage medium 130 is configured to store a plurality of texture images, a position mapping data and a normal vector mapping data. The display device 110 is configured to display a virtual reality object. The processor 120 is configured to execute an image editing procedure on the virtual reality object in a virtual reality environment. In some embodiments, a user can hold the controller 700 in his/her hand, and watch screens and objects of the virtual reality environment on the display device 110.
Reference is made to
The virtual spray-painting can 711 comprises a virtual spray nozzle 715. The user presses a control button (not shown) of the controller 700; that is, the virtual hand 713 pushes the virtual spray nozzle 715 and sprays on the virtual reality object 210 (such as the cow shown in
Reference is made to
Reference is made to
In some circumstances, while the virtual reality object 210 is edited, some ranges (such as the cow's chin area) indicated by the virtual spray nozzle 715 are also edited; that is, the control block 223 of the texture image 225 will be edited. The edited result is shown on the cow's chin (i.e., the indicated region 211 shown in
Reference is made to
Reference is made to
The position mapping data 410 records mapping relations between each of the texture images and each object block of the virtual reality object 210. For example, the texture image 225 is a two-dimensional image storing two-dimensional coordinates. An object block 251 is a three-dimensional image, and each of its pixels has three-dimensional coordinates. The position mapping data 410 records mapping relations between the two-dimensional coordinates of pixels of the texture image 225 and the three-dimensional coordinates of pixels of the object block 251 corresponding to the texture image 225. By querying the mapping relations of the position mapping data 410, the part of the virtual reality object 210 to which the texture image 225 is mapped can be found, and vice versa. In some embodiments, the processor 120 queries the position mapping data 410 according to the two-dimensional coordinates of pixels of the control block 412, to obtain the three-dimensional coordinates of pixels of the indicated region 211.
The normal vector mapping data 420 records mapping relations between the two-dimensional coordinates of pixels of the texture image 225 and the three-dimensional normal vectors of pixels in object blocks of the virtual reality object 210. For example, the texture image 225 is the two-dimensional image, and each pixel of the texture image 225 correspondingly records a two-dimensional coordinate. The object block 251 is the three-dimensional image, and each pixel of the object block 251 correspondingly records a three-dimensional normal vector. The normal vector mapping data 420 records mapping relations between the two-dimensional coordinates of pixels of the texture image 225 and the three-dimensional normal vectors of pixels of the object block 251 corresponding to the texture image 225. In some embodiments, the processor 120 queries the normal vector mapping data 420 by using the two-dimensional coordinates of pixels of the control block 422, to obtain the three-dimensional normal vectors of pixels of the object block 251.
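As a concrete illustration of these two tables, a small hypothetical example in Python follows; the pixel coordinates, 3D points, and normal vectors are invented for illustration and do not come from the figures.

    # Hypothetical per-pixel mapping tables for a tiny control block of two texture pixels.
    position_mapping_410 = {
        (0, 0): (0.10, 1.25, 0.40),   # 2D texture coordinate -> 3D coordinate on the object block
        (1, 0): (0.12, 1.20, 0.43),
    }
    normal_vector_mapping_420 = {
        (0, 0): (0.0, 1.0, 0.0),      # 2D texture coordinate -> 3D normal vector of the object block
        (1, 0): (0.0, -0.9, 0.44),
    }

    # Querying with the two-dimensional coordinates of a control-block pixel yields the
    # corresponding three-dimensional coordinates and three-dimensional normal vector.
    uv = (0, 0)
    point_3d = position_mapping_410[uv]
    normal_3d = normal_vector_mapping_420[uv]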
The following illustrates how, in response to reading the control block 223, the virtual reality device 100 calibrates the range of the control block 223 such that the spraying result conforms to the stereoscopic effect in accordance with the present disclosure.
Reference is made to
Please refer to
The processor 120 queries the normal vector mapping data 420 to obtain the three-dimensional normal vector n1 of the pixel P1 of the indicated region 211. The processor 120 calculates the angle θ1 between the direction vector V1 and the three-dimensional normal vector n1. When the processor 120 determines that the angle is not larger than a threshold, the pixel is considered to be part of the range to be edited. For example, if the angle θ1 is not more than 90°, then it is not necessary to exclude the pixel P1 from the control block 223 (as shown in
Similarly, there is a pixel P2 with a direction vector V2 in the indicated region 211. The processor 120 queries the normal vector mapping data 420 to obtain the three-dimensional normal vector n2 of the pixel P2 in the indicated region 211. The processor 120 calculates an angle θ2 between the direction vector V2 and the three-dimensional normal vector n2. Because the angle θ2 is larger than the threshold angle 90°, the processor 120 excludes the pixel P2 from the range to be edited in the control block 223 (as shown in
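A short worked example of this exclusion test, written in Python with invented vectors for P1 and P2, may make the threshold comparison concrete; the helper name included_in_edit_range and the sample numbers are assumptions for illustration only.

    import numpy as np

    def included_in_edit_range(direction, normal, threshold_deg=90.0):
        # A pixel stays in the range to be edited only when the angle between its
        # direction vector and its three-dimensional normal vector does not exceed the threshold.
        cos_a = np.dot(direction, normal) / (np.linalg.norm(direction) * np.linalg.norm(normal))
        return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))) <= threshold_deg

    # Pixel P1: V1 is roughly aligned with n1, so the angle is below 90 degrees
    # and P1 is kept in the range to be edited.
    print(included_in_edit_range(np.array([0.0, 1.0, 0.2]), np.array([0.0, 1.0, 0.0])))   # True

    # Pixel P2: V2 opposes n2 (for example, the chin area while spraying from above),
    # so the angle exceeds 90 degrees and P2 is excluded.
    print(included_in_edit_range(np.array([0.0, 1.0, 0.2]), np.array([0.0, -1.0, 0.3])))  # False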
Reference is made to
Reference is made to
Therefore, the present disclosure allows users to spray or edit the virtual reality object in the virtual reality environment with a reasonable stereoscopic effect. Within the spraying range, the pixels which, if edited, would not fit the stereoscopic effect are found and then excluded from the edited range according to the three-dimensional space logic. Hence, the spraying result is more realistic in the virtual reality environment. For example, the user takes the virtual spray-painting can to spray the head part object block of the virtual reality object 210 in
Reference is made to
It is noted that the above embodiments are simplified for better understanding of the present disclosure. In some embodiments, the image processing method may be implemented as a computer program. When the computer program is executed by a computer, an electronic device, or the processor 120 in
As mentioned above, the image processing device and image processing method in the present disclosure can exclude pixels that do not fit the stereoscopic effect when editing objects in the virtual reality environment (such as spraying the surface of the virtual reality object), to achieve a more realistic effect when editing three-dimensional objects.
Although the present disclosure has been described in considerable detail with reference to certain embodiments thereof, other embodiments are possible. Therefore, the scope of the appended claims should not be limited to the description of the embodiments contained herein.
This application claims priority to U.S. Provisional Application Ser. No. 62/592,419, filed on Nov. 30, 2017, which is herein incorporated by reference.