1. Field of the Invention
The present invention relates to an image processing apparatus, an image processing system, and an image processing method that change a tone of an image.
2. Description of the Related Art
An image processing method is known that, from an original image in a non-painting tone such as a snapshot, easily creates an artwork image artificially reproducing features observed in paintings produced by painters.
According to this image processing method, a painting image drawn by an actual painter is input along with an original image to be processed and color information and information about a touch of the brush are analyzed from the painting image. Then, based on the analyzed information, an artwork image is generated by imitating colors of the original image and arranging the touch of the brush (see, for example, Jpn. Pat. Appln. KOKAI Publication No. 2004-213598).
Thus, by using a snapshot taken by a digital camera as the original image, the snapshot can be converted into an artwork image imitating a painting drawn by a specific painter.
However, according to the conventional technology, an apparatus automatically completes, based on the analyzed information, an artwork by imitating colors of the original image and arranging the touch of the brush. Thus, a user cannot join in the creation of an artwork image and can only view the completed artwork image.
Therefore, the user's interest in image processing cannot be increased, nor can the user's desire to draw a painting be satisfied.
It is an object of the invention to provide an image processing apparatus, an image processing system, an image processing method, and a storage medium capable of increasing a user's interest in processing an image or satisfying a user's desire to draw an artwork by changing the tone of an original image with the user's involvement.
According to an embodiment of the present invention, an image processing apparatus comprises:
a first display controller configured to display a first image;
a touch area detector configured to detect a touched area of the first image displayed by the first display controller;
a first processor configured to change a tone of the touched area of the first image;
a storage configured to store the touched area detected by the touch area detector;
a second display controller configured to display a second image instead of the first image; and
a second processor configured to change a tone of the touched area of the second image which is stored in the storage.
According to another embodiment of the present invention, an image processing system comprises an image processing apparatus and an imaging apparatus connected to the image processing apparatus via a network, wherein the imaging apparatus comprises:
a transmitter configured to transmit images, the images comprising a first image and a second image, and wherein the image processing apparatus comprises:
a receiver configured to receive the images transmitted from the transmitter;
a first display controller configured to display the first image;
a touch area detector configured to detect a touched area of the first image displayed by the first display controller;
a first processor configured to change a tone of the touched area of the first image;
a storage configured to store the touched area detected by the touch area detector;
a second display controller configured to display the second image instead of the first image; and
a second processor configured to change the tone of the touched area of the second image which is stored in the storage.
According to another embodiment of the present invention, an image processing method comprises:
displaying a first image;
detecting a touched area of the displayed first image;
changing a tone of the touched area of the first image;
storing the detected touched area;
displaying a second image instead of the first image; and
changing a tone of the touched area of the second image which is stored.
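The claimed method — change the tone of touched areas on a first image, store those areas, then reapply them to a second image — can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: images are modeled as dictionaries mapping pixel coordinates to color codes, and the `ToneChanger` class, its method names, and the darkening "tone change" are all assumed placeholders.

```python
class ToneChanger:
    """Sketch of the claimed method: touch areas whose tone is changed
    on a first image are stored and replayed on a second image."""

    def __init__(self):
        self.stored_areas = []  # touch areas recorded on the first image

    def touch_first_image(self, image, area):
        """Change the tone of a touched area and remember the area."""
        self.stored_areas.append(area)
        for x, y in area:
            image[(x, y)] = self._change_tone(image[(x, y)])

    def apply_to_second_image(self, image):
        """Replay every stored touch area on a newly displayed image."""
        for area in self.stored_areas:
            for x, y in area:
                image[(x, y)] = self._change_tone(image[(x, y)])

    @staticmethod
    def _change_tone(color):
        # Placeholder tone change: darken the color code slightly.
        # The patent's actual conversion averages neighboring pixels.
        return max(0, color - 16)
```

The essential point the sketch captures is that the second image needs no new touches: the stored areas alone drive its conversion.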
The objects and advantages of the present invention may be realized and obtained by means of the instrumentalities and combinations particularly pointed out hereinafter.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention, and together with the general description given above and the detailed description of the embodiments given below, serve to explain the principles of the invention.
An embodiment of an image processing apparatus, an image processing system, an image processing method, and a storage medium according to the present invention will now be described with reference to the accompanying drawings.
The CPU 11 includes a snapshot-to-painting conversion engine 200 that converts a non-artwork image such as a snapshot into an artwork image. Snapshot-to-painting conversion processing changes the tone of an original image (a captured image stored in the RAM 13 and to be processed) so that it is converted into an artwork image retaining the features of the original image, that is, an artwork image in which a specific effect is produced, and the artwork image is displayed on a liquid crystal display panel 3. The non-artwork image to be converted is not limited to snapshots and may be an image created by CG or an image obtained by scanning a hand-drawn picture.
For conversion into an artwork image, the type of a target painting, that is, the features (painting tone) of the converted artwork image can be selected. In the present embodiment, selectable painting tones include 12 styles of artwork drawn or painted by real artists: oil painting, thick oil painting, gothic oil painting, fauvist oil painting, water color painting, gouache painting, pastel painting, color pencil sketch, pointillism, silkscreen, drawing, and air brush. However, painting tones are not limited to the above examples, and conversions that add the features of particular painters, such as a Van Gogh tone, Monet tone, or Picasso tone, may be made selectable. Alternatively, algorithms for other painting tones may be provided via a memory card 60 described later. It is assumed in the description of the present embodiment below that the oil painting tone is pre-selected.
The internal memory 14 is a large-capacity nonvolatile memory, such as a hard disk or flash memory, in which folders 141, 142, . . . are formed by processing described later, so that artwork images, which are painting-tone-converted images, can be saved in each of the folders 141, 142, . . . .
A display controller 16 causes the liquid crystal display panel 3 to display an image or various menus by driving the liquid crystal display panel 3 based on display image data supplied from the CPU 11.
A key input controller 17 inputs an operation signal of a touch panel 5 or an operation signal of a key input device 21 under the control of the CPU 11. In the present embodiment, the key input device 21 includes at least a capture switch 22 and a complete switch 23, as well as a power switch and a mode changeover switch (neither shown), and the like. The capture switch 22 and the complete switch 23 are normally open switches that remain in the off state when released and are turned on only while pressed by the user.
A memory card interface 18 is an input/output interface that controls input/output of data between the CPU 11 and any of various memory cards 60 detachably inserted into a memory card slot. A GPS controller 20 acquires position information based on information received by a GPS antenna 7. In this manner, the current position of the image processing apparatus 1 can be known.
A human sensing sensor 19 is connected to the CPU 11 and is used to detect whether any human being is in the vicinity thereof. Thus, if a state in which no human being is in the vicinity thereof lasts for a predetermined time or longer, power is automatically turned off to save energy (auto power-off).
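The auto power-off behavior described above can be sketched as a simple absence counter. This is an illustrative sketch only: the patent states "a predetermined time or longer" without specifying a mechanism, so the sample-based polling, the `auto_power_off` name, and the threshold parameter are assumptions.

```python
def auto_power_off(presence_samples, threshold):
    """Return True (power should turn off) once `threshold` consecutive
    sensor samples report that no human being is in the vicinity.
    Polling at a fixed rate is an assumption; the patent only requires
    that the absent state last a predetermined time or longer."""
    absent = 0
    for present in presence_samples:
        absent = 0 if present else absent + 1   # reset on any detection
        if absent >= threshold:
            return True
    return False
```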
A communication controller 30 exercises communication control including transmission and reception of images or mail via a telephone line 31 or a wireless LAN 32. An address book 33 is used for mail transmission/reception and is actually provided inside the internal memory 14.
A backup server 40 is connected via a network 90 and backs up data stored in the internal memory 14 automatically or based on manual instructions. A content server 50 has a large number of pieces of content or images and can deliver data to the image processing apparatus 1 via the network 90.
An imaging apparatus 70 is a so-called digital camera and includes an image sensor, an imaging controller to control the image sensor, and an image transmission unit. The imaging controller drives the image sensor and captures a color image of a subject at a predetermined frame rate. The transmission unit transmits a live view image including the captured image to the outside. The imaging apparatus 70 is connected to the communication controller 30 of the image processing apparatus 1 through the telephone line 31 or the wireless LAN 32 via the network 90. Thus, the CPU 11 of the image processing apparatus 1 can sequentially capture the live view image picked up by the imaging apparatus 70 and transmitted by the transmission unit.
At this point, since the imaging apparatus 70 is arranged at a remote location that is different from the location of the image processing apparatus 1 owned by the user, the user can view scenes of the remote location through the liquid crystal display panel 3 of the image processing apparatus 1 or select scenes of the remote location as images to be converted.
A power supply controller 80 receives an AC power supply via a power supply plug 31 and converts AC into DC before supplying power to each unit. The power supply controller 80 also controls the auto power-off.
Live view images transmitted, as described above, at a predetermined frame rate from the imaging apparatus 70 are sequentially stored in the captured image storage area 131 while being updated. Then, under the control of the CPU 11, the display controller 16 drives the liquid crystal display panel 3 based on the image data stored in the captured image storage area 131 until the capture switch 22 is operated. Accordingly, the live view image being picked up by the imaging apparatus 70 is displayed on the liquid crystal display panel 3.
An image displayed on the liquid crystal display panel 3 when the capture switch 22 is operated is stored in the processing image storage area 132 as a processing image (capture image). At this point, the display controller 16 switches the read source of images from the captured image storage area 131 to the processing image storage area 132. Thus, after the capture switch 22 is operated, the processing image (capture image) continues to be displayed on the liquid crystal display panel 3.
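The read-source switch performed by the display controller 16 can be sketched as follows. The class and method names are illustrative assumptions; the two attributes stand in for the captured image storage area 131 and the processing image storage area 132.

```python
class DisplayController:
    """Sketch of the display read-source switch: before capture, frames
    are read from the captured image storage area (live view); after the
    capture switch is operated, reads come from the frozen processing
    image instead."""

    def __init__(self):
        self.captured = None       # stands in for storage area 131
        self.processing = None     # stands in for storage area 132
        self.source = 'captured'

    def store_live_frame(self, frame):
        """A new live view frame arrives and overwrites the old one."""
        self.captured = frame

    def capture(self):
        """Capture switch operated: freeze the currently displayed frame
        as the processing image and switch the read source."""
        self.processing = self.captured
        self.source = 'processing'

    def read_frame(self):
        """Frame handed to the panel at the display frame rate."""
        return self.captured if self.source == 'captured' else self.processing
```

After `capture()`, newly arriving live frames no longer affect what is displayed, matching the behavior described above.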
The image stored in the processing image storage area 132 is converted into an oil painting image by conversion processing described later, and the display controller 16 reads the image in the processing image storage area 132 at predetermined timing (at a predetermined frame rate) to display it on the liquid crystal display panel 3. Thus, after the capture switch 22 is operated, instead of the live view image, a converted image being gradually turned into an oil painting image is displayed.
The touch area data storage area 133 stores data (“touch area data TA0”, “touch area data TA1”, “touch area data TA2”, . . . , “touch area data TAN”) indicating touch areas, each of which is the area from the position where a touch is first detected by the touch panel 5 to the position where the touch is no longer detected. That is, in the present embodiment, such an area is defined as one unit of touch area, and data indicating each touch area in this unit is stored.
Content of data “touch area data TA0”, “touch area data TA1”, “touch area data TA2”, . . . , “touch area data TAN” showing each touch area includes, as shown on the right end portion of
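A touch area record of this kind can be sketched as a small data structure. The field names (`index`, `pixels`) and the use of a Python dataclass are assumptions for illustration; the patent only specifies that each TAi holds the pixels of one touch, from touch-down to touch-up.

```python
from dataclasses import dataclass, field

@dataclass
class TouchArea:
    """One touch unit: pixel coordinates collected between the position
    where a touch is first detected and the position where it is no
    longer detected (the i in 'touch area data TAi')."""
    index: int
    pixels: list = field(default_factory=list)

    def add(self, x, y):
        """Record one more pixel swept by the continuing touch."""
        self.pixels.append((x, y))

# Plays the role of the touch area data storage area 133.
storage = []
ta0 = TouchArea(index=0)
ta0.add(10, 20)
ta0.add(11, 20)       # finger drags to a neighboring pixel
storage.append(ta0)   # stored once the touch ends
```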
Next, operations of the present embodiment according to the above configuration will be described.
(Live View Image Display)
When the power supply switch is turned on, the CPU 11 starts control and processing of each unit according to a program stored in the ROM 12.
Then, the CPU 11 captures a live view image transmitted via the network 90 and the telephone line 31 or the wireless LAN 32 from the imaging apparatus 70 (step SB2) and stores the live view image in the captured image storage area 131 (step SB3). Further, the CPU 11 controls the display controller 16 to cause the liquid crystal display panel 3 to display content of the live view image stored in the captured image storage area 131 (step SB4).
Thus, live view images picked up by the imaging apparatus 70 and transmitted at a predetermined frame rate are displayed on the liquid crystal display panel 3 after the power supply switch is turned on until the capture switch 22 is operated. Therefore, the user can enjoy viewing live view images displayed on the liquid crystal display panel 3. Processing in steps SB5 to SB7 performed when CAPF=1 will be described later.
(Decision of the Image to be Processed)
Thus, the user viewing the live view images in the liquid crystal display panel 3 presses the capture switch 22 when the image whose painting tone should be converted is displayed on the liquid crystal display panel 3. Accordingly, the processing target image whose tone should be changed is decided, the image is stored in the processing image storage area 132, and the liquid crystal display panel 3 is maintained in a state in which the image is displayed.
If, for example, as shown in
Therefore, while viewing the liquid crystal display panel 3 in which live view images are displayed, the user can select a desired image as an original image, that is, a material for an image to be imitatively drawn by operating the capture switch 22 at any time.
The complete switch processing (step SC2) in the flowchart in
(Image Conversion)
If, however, CAPF=1, as described in the flowchart in
The touch flag TF is set (=1) in step SE6 described later on condition that a touch by the user's finger is detected by the touch panel 5 while the processing image in the processing image storage area 132 is displayed on the liquid crystal display panel 3. The touch flag TF is reset (=0) in step SE9 described later on condition that the touch is no longer detected.
Thus, TF=0 while the user is not touching the processing image LP1 on the screen displayed on the liquid crystal display panel 3. If TF=0 and the user is not touching the processing image LP1 on the screen, the CPU 11 proceeds from step SE2 to step SE3 to determine whether the user has touched the processing image LP1. If a touch is determined to have occurred, the CPU 11 secures touch area data TAi, the i-th touch area beginning with the initial value of “0”, in the touch area data storage area 133 shown in
Therefore, taking the interval from the start of a touch to its end as one touch, the setting of the touch flag TF indicates that one touch has started.
If TF changes to 1, the determination in step SE2 becomes NO when the processing according to the flowchart is performed again. Thus, the CPU 11 proceeds from step SE2 to step SE7 to determine whether the processing image LP1 is still being touched, that is, whether the touch still continues. If the touch continues, the CPU 11 stores, in the touch area data TAi secured in step SE4, the coordinates of pixels contained in the touch area that are new since those stored in step SE5 (step SE8).
If the user moves the touched finger away from the processing image LP1 on the screen, the determination in step SE7 becomes NO when the processing according to the flow is performed again and the CPU 11 proceeds from step SE7 to step SE9. Therefore, the data “touch area data TA0” indicating one touch area that is an area from the start of a touch detected by the touch panel 5 to the end of the touch is stored in the touch area data storage area 133 shown in
Then, in step SE9 subsequent to step SE7, the CPU 11 resets (=0) the touch flag TF because the one touch has ended. Thereafter, the CPU 11 performs conversion processing described later (step SE10). Thus, the conversion processing is performed each time one touch, taken as the interval from the start of a touch to its end, ends.
Therefore, the painting tone of the touched area of the processing image LP1 is changed by the conversion processing each time one touch ends, so that the user can appreciate the sense of painting on canvas. Moreover, the user paints using the processing image LP1 as a rough sketch, so that even a user who is not good at painting can feel able to paint well.
Because a detected touch area spans from the start of a touch to its end, taken as one touch, the area closely resembles a stroke of the brush, so features of the user's brushwork can be reflected in the touch data.
Subsequently, the CPU 11 sets a conversion flag HF indicating that conversion processing is being performed (step SE11) and increments the value of i (step SE12) before returning.
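One pass of this polling loop can be sketched as follows. The flag names and step labels follow the description above, but the dictionary-based state, the `touch_step` name, and the counting of conversions are illustrative assumptions.

```python
def touch_step(state, touching, coords=None):
    """One pass of the polling in steps SE2-SE12 (sketch).

    state: dict with 'TF' (touch flag), 'i' (touch area index),
           'areas' (touch area data TA0..TAN), 'conversions'
    touching: whether the touch panel currently reports a touch
    coords: (x, y) reported while touching
    """
    if state['TF'] == 0:
        if touching:                          # SE3-SE6: a touch begins
            state['areas'].append([coords])   # secure touch area TAi
            state['TF'] = 1
    else:
        if touching:                          # SE7-SE8: touch continues
            state['areas'][state['i']].append(coords)
        else:                                 # SE9-SE12: touch ended
            state['TF'] = 0
            state['conversions'] += 1         # conversion runs per touch
            state['i'] += 1
    return state
```

A touch-down, a drag, and a release thus yield one stored area and one conversion, matching the one-touch unit described above.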
Therefore, after CAPF changes to 1 and the processing image LP1 is decided, the touch processing shown in the flowchart of
The CPU 11 also computes an average value of the color codes of the one pixel specified in step SF1 and the plurality of pixels specified in step SF2 (step SF3). Next, the CPU 11 changes the color code of the one pixel specified first (the one pixel specified in step SF1) to the average value computed in step SF3 (step SF4). Further, the CPU 11 determines whether the processing to change the color code has been performed on all pixels belonging to the touch area TAi (step SF5). The CPU 11 repeats the processing starting with step SF1 until the processing on all pixels belonging to the touch area TAi is completed.
Therefore, by the time the determination in step SF5 becomes YES, the color codes of all pixels belonging to the touch area TAi have been changed to the average of each pixel and the plurality of pixels before and after it. Consequently, after each one touch of the processing image LP1 on the screen by the user, the color of the area of that touch is changed to a color different from the original color of the processing image LP1. Accordingly, conversion to an artwork image is accompanied by the user's involvement, namely repeated single touches of the processing image LP1 on the screen. As a result, the user's interest in painting tone conversion can be increased, or the user's desire to paint can be satisfied.
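The averaging in steps SF1 to SF5 can be sketched on a one-dimensional run of color codes along the touch path. This simplification is an assumption: the patent's touch area is a two-dimensional pixel set, the `window` parameter stands in for the unspecified "plurality of pixels before and after", and the in-place integer update is also assumed.

```python
def convert_touch_area(colors, window=1):
    """Sketch of steps SF1-SF5: each pixel's color code in the touch
    area is replaced by the average of itself and its neighbors within
    `window` positions before and after it along the touch path."""
    n = len(colors)
    for i in range(n):                    # SF1: specify one pixel
        lo, hi = max(0, i - window), min(n, i + window + 1)
        neighborhood = colors[lo:hi]      # SF2: pixels before and after
        # SF3-SF4: compute the average and replace the pixel's color.
        colors[i] = sum(neighborhood) // len(neighborhood)
    return colors                         # SF5: all pixels processed
```

Because the update is in place, later averages see already-smoothed values, which is what produces the gradually blended, paint-like effect described above.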
Moreover, if one touch as a touch operation of the brush is continued by using the processing image LP1 as a rough sketch, the processing image LP1 shown in
The conversion processing shown in the flowchart of
(Completion of the Artwork Image)
Therefore, the user can freely decide the completion of artwork image PP1 by operating the complete switch 23 at any time point.
The user can also view artwork image PP1 stored in the folder 141 of the internal memory 14 at any time by causing the CPU 11 to read artwork image PP1 from the folder 141 and causing the liquid crystal display panel 3 to display artwork image PP1 at a later date. Then, the CPU 11 resets the capture flag CAPF (=0) (step SG4) before returning.
(Total Conversion of Live View Images)
After the capture flag CAPF is set to 0 in step SG4 as described above, the determination in step SB1 in the flowchart of
For example, the scene of Mt. Fuji shown in
Then, the determination in step SD1 in the flowchart of
On the other hand, if CAPF is set to 1 in this manner, the determination in step SB1 in the flowchart of
Therefore, the CPU 11 proceeds from step SB1 to step SB5 to determine whether the conversion flag HF is 1. The conversion flag HF is set in step SE11 in the flowchart of
The conversion processing is performed according to the processing procedure shown in the flowchart of
The CPU 11 also computes an average value of the color codes of the one pixel specified in step SF1 and the plurality of pixels specified in step SF2 (step SF3). Next, the CPU 11 changes the color code of the one pixel specified first (the one pixel specified in step SF1) to the average value computed in step SF3 (step SF4). Further, the CPU 11 determines whether the processing to change the color code has been performed on all pixels belonging to the touch area TAi (step SF5). The CPU 11 repeats the processing starting with step SF1 until the processing on all pixels belonging to the touch area TAi is completed.
Therefore, by the time the determination in step SF5 becomes YES, the color codes of all pixels belonging to the touch area TAi specified by the value of i have been changed to the average of each pixel and the plurality of pixels before and after it. Consequently, the color of the processing image LP2 is changed from its original color using the touch data recorded when the processing image LP1 was created, without the user performing one touch, the imitative painting operation, on the processing image LP2 on the screen. Thus, in this case, conversion to an artwork image can be made using the last touch data, with no need to repeat one-touch operations on the processing image LP2 on the screen.
Then, after the conversion processing in step SH2 is performed, the CPU 11 increments the value of i (step SH3) and determines whether i>N (step SH4). The CPU 11 repeats the processing of steps SH2 to SH4 until the relation i>N holds. Therefore, a painting tone conversion can be made using the touch data stored in each of the touch areas TA0 to TAN, which were used for the last artwork image PP1 and stored in the touch area data storage area 133.
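The replay loop of steps SH2 to SH4 can be sketched as follows. The function and parameter names are assumptions; `change_tone` stands in for the conversion processing, and the image is again modeled as a coordinate-to-color dictionary.

```python
def replay_touch_areas(image, areas, change_tone):
    """Sketch of steps SH2-SH4: the stored touch areas TA0..TAN,
    recorded while the first artwork was painted, are applied in order
    to a new processing image, so the user's brushwork is reused
    without any new touches."""
    i = 0
    N = len(areas) - 1
    while i <= N:                         # repeat until i > N (SH4)
        for x, y in areas[i]:             # conversion processing (SH2)
            image[(x, y)] = change_tone(image[(x, y)])
        i += 1                            # increment i (SH3)
    return image
```

Pixels outside every stored touch area are left unchanged, which is why the second artwork inherits the first artwork's brushwork rather than a uniform filter.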
In the above modification, the colors of pixels on the periphery of the touch area are detected, and the closer a pixel is to the periphery, the closer its color is made to the peripheral color rather than to the color of the initially specified pixel. In this modification, when the peripheral color changes, the color within the touch area changes accordingly.
Then, when the relation i>N holds and the painting tone conversion is completed by using all touch data stored in the touch area TA0 to touch area TAN stored in the touch area data storage area 133, the processing image LP2 shown in
Incidentally, while a professional painter creates a large number of paintings, the style of the painter and common features based on the style generally appear in every painting. For a nonprofessional, on the other hand, the style has not yet been established and features of every painting vary.
Although the image serving as a base is different for artwork image PP2 newly saved in the new folder 142 (the processing image LP1 and the processing image LP2), artwork image PP2 is an image in which the touch when the artwork image PP1 is created by the user is reflected.
Thus, artwork image PP1 saved in the last folder 141 and artwork image PP2 saved in the current folder 142 have in common that the touch made when the user created artwork image PP1 is reflected in both images. Therefore, even a nonprofessional can express, like a professional painter, a style and features based on that style common to artwork images PP1 and PP2 as works.
In the present embodiment, a live view image transmitted from the imaging apparatus 70 is acquired and set as the processing image, that is, the image whose painting tone should be converted. However, the processing image is not limited to this example and may be any image, such as an image stored in the internal memory 14 in advance or an image downloaded from the content server 50. It should be noted that the touch operation may be performed with anything, such as a finger, a pen, or a mouse.
The captured subject image is displayed, like in the first embodiment, in the liquid crystal display panel 3 by the display controller 16. The CPU 11 performs the processing shown in the flowcharts in
While the description above refers to particular embodiments of the present invention, it will be understood that many modifications may be made without departing from the spirit thereof. The accompanying claims are intended to cover such modifications as would fall within the true scope and spirit of the present invention. The presently disclosed embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims, rather than the foregoing description, and all changes that come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. For example, the present invention can be practiced as a computer readable recording medium in which a program for allowing the computer to function as predetermined means, allowing the computer to realize a predetermined function, or allowing the computer to conduct predetermined means.
Number | Date | Country | Kind
---|---|---|---
2010-172202 | Jul 2010 | JP | national
This is a Divisional of U.S. application Ser. No. 13/192,984, filed Jul. 28, 2011, which is based upon and claims the benefit of priority from prior Japanese Patent Application No. 2010-172202, filed Jul. 30, 2010, the entire contents of both of which are incorporated herein by reference.
 | Number | Date | Country
---|---|---|---
Parent | 13192984 | Jul 2011 | US
Child | 14081701 | | US