This application claims priority under 35 U.S.C. §119(a) to an Indian Provisional Patent Application filed in the Indian Patent Office on Jan. 4, 2011 and assigned Serial No. 17/CHE/2011, the contents of which are incorporated herein by reference.
1. Field of the Invention
The present invention relates generally to image processing, and more particularly, to a method for sketching an image in real time and/or automatically and painting the sketched image on a mobile terminal.
2. Description of the Related Art
A sketch refers to a freehand drawing that is not usually intended as a finished work. A sketch may serve a number of purposes: it might record something that the artist sees, it might record or develop an idea for later use, or it might be used as a quick way of graphically demonstrating an image, idea, or principle.
Sketches are routinely drawn by a human artist to represent an object or a scene on paper. For example, the human artist may paint or draw a picture based on an original scene in a manner that is based on the creative and abstract judgment of the artist. Generally, the artist develops a sketch/painting in a sequential process. The process involves an inherent knowledge of what to sketch/paint first, what to sketch/paint next, and so on.
Currently, various image processing applications are known for generating a sketch effect from an image. One existing application enables a user to create a pencil/pen sketch from an image, while another existing application uses a number of images taken at different angles to create three-dimensional images or videos. However, current image processing applications fail to mimic the sequential sketching/painting process of a human artist when producing an artistic sketch of an image, as these applications have no intelligence for deciding the priority order of steps in the sequential sketching/painting process.
An aspect of an embodiment of the present invention is to provide a method for allowing a user to create a sketched image from a single input image, depict a sketching process, and save the sketching process as a video file.
In accordance with an aspect of the present invention, there is provided a method for sketching and painting in a mobile terminal, including detecting, when an image for sketching is selected, edges of the selected image, determining a type of a sketch mode and depicting a sketching process with the detected edges to correspond to the determined type of the sketch mode, and determining, upon a painting request for an image sketched through the sketching process, a painting mode and painting the sketched image to correspond to the determined painting mode.
The above and other aspects, features and advantages of embodiments of the present invention will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
The drawings described herein are for illustration purposes only and are not intended to limit the scope of the present disclosure in any way.
The present invention provides a method and apparatus for creating a live artistic sketch of an image. In the following detailed description of the embodiments of the invention, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.
The term ‘artistic sketch’ refers to a freehand drawing created by an image processing device from an image.
The image source 102 inputs an image for creating an artistic sketch of the input image. In one embodiment, the image source 102 inputs an image file or an image captured by a camera. The edge detector 112 detects a plurality of edges in the input image. In one embodiment, the edge detector 112 is based on a minimum filtered negative edge detection algorithm, which replaces every pixel at the center of the filter span with the minimum-valued negative of all pixels in the given filter span. This procedure is repeated for all pixels in the input image to create a minimum filtered negative image. The edge detector 112 detects edges in the input image by blending each pixel in the input image with the corresponding pixel in the minimum filtered negative image using a dodge blending technique.
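The minimum-filtered-negative detection described above can be sketched as follows. This is a minimal illustrative implementation, not the actual device code; the filter span size and the integer dodge-blend formula are assumptions.

```python
import numpy as np

def pencil_sketch(gray, span=5):
    """Illustrative minimum-filtered-negative sketch effect.
    gray: 2-D uint8 grayscale array; span: assumed filter size."""
    neg = 255 - gray.astype(np.int32)            # negative image
    pad = span // 2
    padded = np.pad(neg, pad, mode="edge")
    h, w = neg.shape
    # replace each pixel by the minimum negative value in its filter span
    min_neg = padded[0:h, 0:w].copy()
    for dy in range(span):
        for dx in range(span):
            np.minimum(min_neg, padded[dy:dy + h, dx:dx + w], out=min_neg)
    # dodge-blend each input pixel with the filtered negative:
    # flat regions dodge to white, while edge regions stay dark
    denom = np.maximum(255 - min_neg, 1)
    out = np.minimum(255, gray.astype(np.int32) * 255 // denom)
    return out.astype(np.uint8)
```

A uniform region dodges to pure white, so only contours survive in the output, which is the pencil-sketch look the edge detector 112 aims for.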
The sketching module 114 connects each of the edges with one or more of the remaining edges in a pre-defined neighborhood of said each of the edges. Then, the sketching module 114 determines one or more features associated with said each of the connected edges in the input image. These features include a spatial location, neighborhood zone, category, and/or pixel value. Thus, the sketching module 114 generates a sequence database 118 containing pixel information and feature information associated with each of the connected edges, and stores the sequence database 118 in the storage unit 110. Accordingly, the sketching module 114 creates an artistic sketch of the input image in a pre-defined sequence using the feature information and the pixel information in the sequence database 118. In the case of a manual sketching process, the sketching module 114 creates a preliminary outline sketch by drawing prominent edges in the input image. In this process, the user is allowed to reveal/erase finer edges by touching the graphical user interface 106.
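One way to picture an entry of the sequence database 118 is as a record holding the features listed above. The field names below are assumptions for illustration only; the specification does not fix a schema.

```python
from dataclasses import dataclass

@dataclass
class EdgePixelRecord:
    """Hypothetical sequence-database entry for one connected-edge pixel."""
    x: int            # spatial location (column)
    y: int            # spatial location (row)
    zone: int         # neighborhood zone identifier
    category: str     # edge category (e.g., "prominent" or "finer")
    value: int        # pixel value, 0-255
```

A manual sketching pass could then draw only records whose category is "prominent" first, leaving "finer" records for the user to reveal by touch.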
The graphical user interface 106 renders the artistic sketch of the input image substantially simultaneously with creation of the artistic sketch. Additionally, the sketching module 114 enables the user to store the artistic sketch of the input image in an image file format. Also, the sketching module 114 outputs a video file containing frames associated with creation of the artistic sketch in multiple video file formats such as those shown in
The painting module 116 enables the user to paint the artistic sketch using an automatic paint option or a manual paint option. If the user selects the automatic paint option, the painting module 116 fills the artistic sketch with non-photo realistic colors to produce the effect of a painting. Accordingly, the graphical user interface 106 renders the act of filling the artistic sketch with the non-photo realistic colors while the painting module 116 fills the artistic sketch with the non-photo realistic colors. If the user selects the manual paint option, the painting module 116 fills user-specified regions in the artistic sketch with non-photo realistic colors. Alternatively, the painting module 116 fills user-specified regions in the artistic sketch using one or more colors selected by the user from a color palette.
Additionally, the painting module 116 enables the user to store the artistic sketch filled with the non-photo realistic colors in an image file format. Also, the painting module 116 outputs a video file containing frames associated with filling the artistic sketch with the non-photo realistic colors in multiple video file formats such as those shown in
In accordance with the foregoing description, the edge detector 112, the sketching module 114, and the painting module 116 may be stored in the memory 108 in the form of machine-readable instructions which, when executed by the processor 104, cause the processor 104 to perform the functionality of the edge detector 112, the sketching module 114, and the painting module 116.
Further, if the user wishes to fill non-photo realistic colors in the artistic sketch, then at step 208, the artistic sketch of the input image is painted with non-photo realistic colors. The detailed process steps of painting the artistic sketch will be described in
At step 306, a different level of priority is assigned to the one or more pixels in the pre-defined neighborhood of the first pixel. It can be noted that each of the one or more pixels has a different level of priority and hence the edge to which each pixel belongs also has a different level of priority. At step 308, information associated with the first pixel and the one or more pixels in the neighborhood of the first pixel is stored in the sequence database 118. At step 310, the pixel belonging to the edge with the highest priority is selected from the one or more pixels in the pre-defined neighborhood of the first pixel.
At step 312, it is checked whether any edges are detected in a pre-defined neighborhood of the selected pixel with the highest priority. If one or more edges are detected in the pre-defined neighborhood, then steps 304 through 312 are repeated. If there are no edges found in the pre-defined neighborhood of the selected pixel, then at step 314, it is determined whether all of the one or more pixels are covered. If there are pixels remaining for processing, then at step 316, a next pixel containing an edge in the decreasing order of priority is selected from the remaining pixels and steps 304 through 312 are repeated for the selected pixel.
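The loop of steps 304 through 316 amounts to a priority-ordered traversal of the edge pixels. The sketch below is an assumed reading of that loop, not the claimed method: pixel coordinates, a priority mapping, and the neighborhood radius are all illustrative inputs.

```python
def trace_edges(edge_pixels, priority, neighborhood=1):
    """Priority-ordered edge tracing (illustrative).
    edge_pixels: iterable of (x, y) edge coordinates;
    priority: dict mapping pixel -> priority level (higher drawn first)."""
    remaining = set(edge_pixels)
    sequence = []                      # drawing order, as in the sequence database
    while remaining:
        # pick the highest-priority pixel not yet drawn (step 316)
        current = max(remaining, key=lambda p: priority.get(p, 0))
        while current is not None:
            sequence.append(current)
            remaining.discard(current)
            # look for edges in the pre-defined neighborhood (step 312)
            x, y = current
            nbrs = [(x + dx, y + dy)
                    for dx in range(-neighborhood, neighborhood + 1)
                    for dy in range(-neighborhood, neighborhood + 1)
                    if (dx, dy) != (0, 0) and (x + dx, y + dy) in remaining]
            # continue along the highest-priority neighbor (step 310)
            current = max(nbrs, key=lambda p: priority.get(p, 0)) if nbrs else None
    return sequence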
If no pixels are left for processing, then step 318 is performed. Thus, at the end of step 316, pixel information indicating connection between one or more edges in the neighborhood of each of the detected edges (hereinafter referred to as connected edges) is obtained. The pixel information associated with the connected edges was stored in the sequence database 118 at step 308. Further, at step 318, features associated with each pixel of said each of the connected edges are computed. The features associated with each connected edge include spatial location, neighborhood zone, category, and pixel value. At step 320, the feature information associated with each connected edge is stored in the sequence database 118. At step 322, the pixel information and the feature information associated with each of the connected edges is extracted from the sequence database 118.
At step 324, each of the connected edges is divided into a pre-determined number of timelines using the pixel information and the feature information. In one embodiment, the pre-determined number of timelines is determined based on a frame rate at which creation of the artistic sketch of the input image is to be rendered on the graphical user interface 106. At step 326, the time information associated with said each of the connected edges is stored in the sequence database 118. At step 328, an artistic sketch of the input image is created by rendering each of the connected edges in a pre-defined sequence according to the stored timeline information associated with said each of the connected edges. In this manner, the image processing device 100 creates a sketch of an image in a same manner as a human artist would create a sketch. Once the artistic sketch is created, a user is enabled to preview the artistic sketch and to play a video of creation of the artistic sketch in a sequential manner.
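The timeline division of step 324 can be illustrated with a small helper. The frame rate and an overall sketch duration are assumed inputs here; the specification only states that the number of timelines depends on the rendering frame rate.

```python
def divide_into_timelines(edge_pixels, fps, duration_s):
    """Split one connected edge's ordered pixels into per-frame timelines
    (illustrative reading of step 324; fps/duration are assumptions)."""
    n_frames = max(1, int(fps * duration_s))
    per_frame = max(1, -(-len(edge_pixels) // n_frames))  # ceiling division
    return [edge_pixels[i:i + per_frame]
            for i in range(0, len(edge_pixels), per_frame)]
```

Rendering each chunk on its own frame then plays the edge back stroke by stroke, which is what makes the sketch appear to be drawn live on the graphical user interface 106.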
If the determination at step 404 is false, then at step 408 it is determined whether the user prefers a reveal option or a color fill option. If the user prefers a reveal option, then at step 410, information associated with one or more regions in the artistic sketch to be filled with non-photo realistic colors is received from the user. The user can reveal the one or more regions by scratching over the graphical user interface (e.g., touch screen display). At step 412, the one or more regions revealed by the user are filled with non-photo realistic colors. The detailed process of filling the non-photo realistic colors in the one or more regions revealed by the user will be described in greater detail in
If the user selects a color option, then at step 414, non-photo realistic colors selected by the user are filled in at tapped points in the artistic sketch. The detailed process of filling user-selected non-photo realistic colors at tapped points of the artistic sketch will be described in greater detail in
At step 508, non-photo realistic colors in the non-photo realistic image are grouped into a number of color groups. At step 510, a first pixel belonging to each of the color groups is identified by scanning the non-photo realistic image. At step 512, one or more pixels belonging to same color group in a pre-defined neighborhood of the first pixel are identified by scanning the non-photo realistic image. At step 514, information associated with the first pixel and the one or more pixels in the neighborhood of the first pixel and belonging to the respective color groups is stored in the sequence database 118. At step 516, it is determined whether any pixel is left for processing in the non-photo realistic image.
If any pixels are left for processing, then steps 508 through 516 are repeated. If no pixels are left for processing, then at step 518, an image file containing the artistic sketch of the input image is obtained. At step 520, the total amount of color to be filled in the artistic sketch is divided into a pre-determined number of timelines for each of the color groups based on the pixels belonging to the respective color groups. In one embodiment, the pre-determined number of timelines for filling the color is determined based on a frame rate at which the frames associated with filling the color are to be displayed. At step 522, the timeline information associated with each of the color groups is stored in the sequence database 118. At step 524, the artistic sketch is filled with the non-photo realistic colors according to the pre-determined number of timelines associated with each of the color groups.
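The color grouping of steps 508 through 514 could be realized by quantizing each channel into a few levels and bucketing pixels by their quantized color. This is one assumed grouping strategy; the specification does not prescribe how groups are formed.

```python
import numpy as np

def group_colors(img, levels=8):
    """Group pixels of a non-photo-realistic image into color groups by
    uniform quantization (illustrative; 'levels' is an assumption).
    img: H x W x 3 uint8 array. Returns {quantized_color: [(x, y), ...]}."""
    step = 256 // levels
    quant = (img // step) * step + step // 2   # snap to bucket centers
    groups = {}
    h, w = img.shape[:2]
    for y in range(h):
        for x in range(w):
            groups.setdefault(tuple(int(c) for c in quant[y, x]), []).append((x, y))
    return groups
```

Each group's pixel list can then be split into timelines with the same frame-rate-based division used for the edges, so the painting also plays back progressively.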
At step 608, an image file containing the artistic sketch of the input image is obtained. At step 610, one or more regions of the artistic sketch revealed by the user are obtained from the user. At step 612, the color/grey scale values from the non-photo realistic image are applied to the corresponding regions revealed by the user in the artistic sketch. It can be noted that the amount of color to be filled in the one or more regions is divided into timelines based on a frame rate, and the timeline information is stored in the sequence database 118. At step 614, it is determined whether the user wishes to paint any other regions in the artistic sketch. If the determination is true, steps 610 through 614 are performed; else, the process 600 is terminated.
At step 704, an image file containing the artistic sketch of the input image is obtained. At step 706, one or more colors are selected from a color palette. At step 708, one or more regions of the artistic sketch are obtained from the user. At step 710, the colors selected by the user are applied to the corresponding regions in the artistic sketch. For example, one or more colors are applied to one or more regions upon tapping a point in the artistic sketch on the graphical user interface 106. It can be noted that the amount of color to be filled in the one or more regions is divided into timelines based on a frame rate, and the timeline information is stored in the sequence database 118. At step 712, it is determined whether the user wishes to paint any other regions in the artistic sketch. If the determination is true, steps 706 through 712 are performed; else, the process 700 is terminated.
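Applying a palette color to the region around a tapped point can be pictured as a flood fill bounded by the sketch edges. This is a minimal assumed sketch of step 710; the actual region-detection method is not specified.

```python
from collections import deque

def tap_fill(width, height, edges, seed, color):
    """Flood-fill from a tapped point, stopping at sketch edges
    (illustrative). edges: set of (x, y) edge pixels; seed: tapped point.
    Returns {(x, y): color} for the filled region."""
    filled = {}
    if seed in edges:
        return filled                      # tapping an edge fills nothing
    queue, seen = deque([seed]), {seed}
    while queue:
        x, y = queue.popleft()
        filled[(x, y)] = color
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < width and 0 <= ny < height
                    and (nx, ny) not in seen and (nx, ny) not in edges):
                seen.add((nx, ny))
                queue.append((nx, ny))
    return filled
```

The filled pixels could then be split into frame-rate-based timelines, as the paragraph above notes, so the color appears to spread gradually from the tapped point.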
Although the present embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the various embodiments. Furthermore, the various devices, modules, detectors, and the like described herein may be enabled and operated using hardware circuitry, for example, complementary metal oxide semiconductor based logic circuitry, firmware, software, and/or any combination of hardware, firmware, and/or software embodied in a machine-readable medium. For example, the various electrical structures and methods may be embodied using transistors, logic gates, and electrical circuits, such as an application-specific integrated circuit.
While the invention has been shown and described with reference to embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims and their equivalents.
| Number | Date | Country | Kind |
|---|---|---|---|
| 17/CHE/2011 | Jan 2011 | IN | national |