1. Field of the Invention
The present invention relates to an image processing apparatus and an image processing method, and for example, to an image processing apparatus and an image processing method for integrating a document image obtained by a first image obtaining apparatus, such as a scanner, and a document image obtained by a second image obtaining apparatus, such as a camera, into the same file.
2. Description of the Related Art
Transmitting data generated by one apparatus to another apparatus is commonly performed. For example, document data generated by text creation software on a personal computer (PC) is transmitted or transferred to another PC. Further, a photo image or a document image printed on a paper medium is scanned by an MFP having a scanner device or by a document scanner, converted into the JPEG, TIFF, or PDF format, and transmitted to a PC by the electronic mail function, etc. In recent years, it has also become common to convert an image captured by a mobile terminal, such as a smart phone or a tablet, into, for example, the JPEG format and to transmit the converted image from the mobile terminal to a PC by the electronic mail function, etc. The performance of the camera function of a mobile terminal has improved, and it is now possible to obtain not only a natural image but also an image equivalent to one obtained by a scanner, by capturing an image of a text document and performing geometric correction or color correction.
Japanese Patent Laid-Open No. 2013-29934 has disclosed a technique to combine two or more kinds of digital data, obtained by bitmapping the above-described document data or image data, to integrate the pieces of data into one piece of data.
Further, Japanese Patent Laid-Open No. 2001-292300 has disclosed a technique to replace part of an image read by a scan with a specific image. For example, by making use of the technique disclosed in Japanese Patent Laid-Open No. 2001-292300, it is also conceivable to create one document by replacing part of an application form scanned in advance with an image captured in advance.
However, in a case where an image and another image are integrated into one file, if images obtained by different devices are directly combined or inserted, there is a problem in that the brightness, dynamic range, and the like differ between the images.
An image processing apparatus according to the present invention has an obtaining unit configured to obtain first image information indicative of characteristics of a first image obtained from a first device and second image information indicative of characteristics of a second image obtained from a second device that is a device different from the first device, a correction unit configured to correct the first image based on the first image information and the second image information, and a combination unit configured to combine the first image corrected by the correction unit and the second image.
According to the present invention, it is made possible to match the dynamic range, color tone, or base color in a case where images obtained by different devices are combined or inserted.
Further features of the present invention will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).
Hereinafter, embodiments according to the present invention are explained in detail by using the drawings. However, components described in the embodiments are only illustrative and are not intended to limit the scope of the present invention to those embodiments.
In a first embodiment, a paper document is scanned by a scan terminal and a scanned image is generated. Further, a captured image is generated by using the camera function of a mobile terminal. Then, image processing to integrate the scanned image and the captured image within a server is explained.
The cloud service server 103 is connected to a LAN 105, such as Ethernet (registered trademark) or a wireless LAN, and through the LAN 105 to the Internet 102. The mobile terminal 106 is connected to the Internet 102 through a public wireless communication network 104 etc. The image forming apparatus 100, the PC 101, the cloud service server 103, and the mobile terminal 106 are connected to the Internet 102 through the LAN 105 or the public wireless communication network 104 and are able to communicate with one another.
The image forming apparatus 100 is a multifunction peripheral (MFP) having an operation unit, a scanner unit, and a printer unit. In the system of the present embodiment, the image forming apparatus 100 is utilized as a scan terminal of a paper document. It is possible to refer to the image forming apparatus 100 as a first image obtaining apparatus. Further, it is also possible to refer to a scanned image obtained by the image forming apparatus 100 as a first image.
The mobile terminal 106 is a smart phone or a tablet terminal having an operation unit, a camera unit, and a wireless communication unit. In the system of the present embodiment, the mobile terminal 106 is utilized for checking the image data of a scanned paper document. Further, the mobile terminal 106 is also utilized as an image capturing terminal for generating a captured image by capturing an image of a paper document or a natural image. It is also possible to refer to the mobile terminal 106 as a second image obtaining apparatus. Further, it is also possible to refer to a captured image obtained by the mobile terminal 106 as a second image.
An HDD 204 stores image data and various kinds of programs. An operation unit I/F unit 205 connects an operation unit 206 and the control unit 200. The operation unit 206 includes a liquid crystal display unit having a touch panel function, a keyboard, etc. A printer I/F unit 207 connects a printer unit 208 and the control unit 200. Image data to be printed is transferred from the control unit 200 to the printer unit 208 via the printer I/F unit 207, and the printer unit 208 prints an image on a recording medium.
A scanner I/F unit 209 connects a scanner unit 210 and the control unit 200. The scanner unit 210 reads an image on a document, generates image data, and inputs the image data to the control unit 200 via the scanner I/F unit 209.
A network I/F unit 211 connects the control unit 200 (image forming apparatus 100) to the LAN 105. The network I/F unit 211 performs transmission of image data and information to an external apparatus (e.g., cloud service server 103) on the LAN 105, reception of various kinds of information from an external apparatus on the LAN 105, and so on.
A network I/F unit 305 connects the control unit 300 (cloud service server 103) to the LAN 105. The network I/F unit 305 transmits and receives various kinds of information to and from another apparatus on the LAN 105.
<Hardware configuration of mobile terminal 106>
A control unit 400 including a CPU 401 controls the operation of the entire mobile terminal 106.
The CPU 401 is a central processing unit and comprehensively performs control based on programs etc. stored in the ROM 402. Further, the CPU 401 performs communication to control a touch panel unit 406 and a camera unit 408 connected via the operation unit I/F unit 405 and the camera I/F unit 407. The ROM 402 is a nonvolatile flash memory and stores various kinds of programs and data. Further, the ROM 402 is utilized as a storage area of an electronic file. The RAM 403 is utilized as a work area at the time of the execution of a program.
The operation unit I/F unit 405 is for connecting the control unit 400 and the touch panel unit 406. The touch panel unit 406 is capable of processing information on a plurality of simultaneously touched points, including data related to the pressure, size, and position of each touched point. Further, the touch panel unit 406 is an input apparatus and also functions as a display apparatus that produces a display to a user.
The camera unit 408 is hardware including a camera lens and a sensor for capturing an image. The wireless communication I/F unit 409 is for connecting the control unit 400 and a wireless communication unit 410. The wireless communication unit 410 is hardware for performing wireless communication. The wireless communication unit 410 connects with the public wireless communication network 104. The wireless communication unit 410 transmits and receives various kinds of information to and from another apparatus on the public wireless communication network 104. Further, the wireless communication unit 410 performs transmission, reception, and so on, of images etc. captured by the camera unit 408 with an external apparatus (e.g., cloud service server 103) on the public wireless communication network 104 or the LAN 105.
A scan application 500 is a software module for transmitting a scanned image to the cloud service server 103. The scan application 500 includes a screen display unit 501, a scan processing unit 502, an image processing unit 503, a scan data management unit 504, and a communication unit 505.
The screen display unit 501 is a software module for producing a display for performing scan processing on the operation unit 206 via the operation unit I/F unit 205.
The scan processing unit 502 is a software module for performing processing to read a paper document by driving the scanner unit 210 via the scanner I/F unit 209. The scan processing unit 502 receives image data from the scanner unit 210 and saves the image data in the HDD 204.
The image processing unit 503 is a software module for converting image data saved in the HDD 204 into an image format, such as JPEG. Further, it is also possible for the image processing unit 503 to perform image processing, such as edge enhancement processing and color adjustment processing, on image data obtained by the scanner unit 210.
The scan data management unit 504 is a software module for saving image data converted into an image format, such as JPEG, by the image processing unit 503 in the HDD 204 and for managing the image data as a scanned image. In the present embodiment, image data obtained and managed by the scan application 500 is referred to as a scanned image.
The communication unit 505 is a software module for registering a scanned image saved in the HDD 204 to the cloud service server 103 via the network I/F unit 211.
In the present embodiment, the above-described software configuration of the image forming apparatus 100 is introduced, but the configuration is not limited to that described above as long as the configuration is a system in which a scanned image can be obtained and transferred.
An image capturing application 600 is a software module for transmitting an image captured by a camera to the cloud service server 103. The image capturing application 600 includes a screen display unit 601, a camera processing unit 602, an image processing unit 603, a captured image data management unit 604, a communication unit 605, and a Web browser 606.
The screen display unit 601 is a software module for producing, on the touch panel unit 406 via the operation unit I/F unit 405, a display for performing image capturing processing and for checking a captured image.
The camera processing unit 602 is a software module for performing processing to capture an image of a paper document, a natural image, an image of a landscape, etc., to obtain a captured image by driving the camera unit 408 via the camera I/F unit 407. The camera processing unit 602 receives image data from the camera unit 408 and saves the image data in the HDD 404.
The image processing unit 603 is a software module for converting image data saved in the HDD 404 into an image format, such as JPEG. Further, it is also possible for the image processing unit 603 to perform image processing, such as edge enhancement processing and color adjustment processing, on an image captured by a camera.
The captured image data management unit 604 is a software module for saving image data converted into an image format, such as JPEG, by the image processing unit 603 in the HDD 404 and for managing the image data as a captured image. In the present embodiment, image data captured and managed by the image capturing application 600 is referred to as a captured image.
The communication unit 605 is a software module for transmitting a captured image saved in the HDD 404 to the cloud service server 103 via the wireless communication I/F unit 409.
The Web browser 606 is a software module for performing communication by the HTTP protocol with the cloud service server 103, for displaying received HTML data, and for receiving an input from a user. By using the Web browser 606, it is made possible, for example, to activate the camera unit 408, and so on.
In the present embodiment, the above-described software configuration of the mobile terminal 106 is introduced, but the configuration is not limited to that described above as long as the configuration is a system in which a captured image can be obtained by a camera etc. and can be transferred.
Although detailed explanation will be given in the following, the contents of the comprehensive processing within the cloud service server 103 include image processing, such as processing to match the dynamic ranges of a scanned image obtained by the image forming apparatus 100 and a captured image obtained by the mobile terminal 106. In a case where the image processing is performed, the cloud service server 103 obtains image information, such as a histogram and a count of the number of chromatic color pixels, from each image. Then, the cloud service server 103 performs correction of at least one of the scanned image and the captured image based on the obtained image information and combines the corrected image with the other image to form the same file. Combining images to form the same file means to form an image file (image data) indicating, for example, a combined image obtained by combining a scanned image and a captured image. Further, a PDF file into which a page of a scanned image and a page of a captured image are integrated may be formed.
The image processing application 700 is an application for causing the image processing unit 705 to perform image processing on an image sent via the Internet 102 on the cloud service server 103.
The communication unit 701 is a software module for receiving a scanned image or a captured image transmitted from another apparatus and saving the image in the HDD 304, and for performing transmission, reception, etc., of an integrated file saved in the HDD 304 with another apparatus via the network I/F unit 305.
The image combining position determination unit 702 is a software module for determining, within a scanned image saved in the HDD 304, a position where a captured image is inserted, and for determining, within a captured image, a position where a scanned image is inserted.
The scanned image data management unit 703 is a software module for managing an intermediate product obtained by performing image processing on a scanned image and data of the image processing result.
The captured image data management unit 704 is a software module for managing an intermediate product obtained by performing image processing on a captured image and data of the image processing result.
The image processing unit 705 is a software module for performing image processing on a scanned image and a captured image in accordance with instructions of the image processing application 700.
The scanned image information obtaining unit 706 is a software module for obtaining image information, such as a histogram, from a scanned image.
The captured image information obtaining unit 707 is a software module for obtaining image information, such as a histogram, from a captured image.
The image combining unit 708 is a software module for combining a scanned image and a captured image processed in the image processing unit 705 and saved in the HDD 304 to form the same file.
In the present embodiment, in order to make the explanation more specific, an example of a case where a user performs the operations below is introduced. A user handles a variety of paper documents in business operations. A case is thought of where a user pastes a scanned image of, for example, a receipt having an irregular size, onto an electronic application form obtained by scanning an application form printed on a regular size sheet, and then presents the electronic application form. In a case where, for example, a scanned image obtained by scanning an irregular size sheet is inserted into a scanned image obtained by scanning a regular size sheet in this manner, usually it is necessary to scan each sheet by a scanner and to perform the insertion work after performing work such as trimming and size change on a PC etc. In the following, taking as an example a use case where a paper document having an irregular size is inserted into a paper document having a regular size, a method that differs from the above-described method and that can be performed more easily is introduced.
First, the processing to obtain a scanned image in the present embodiment is explained. In the following, a flow of processing is explained, from transmitting a scanned image obtained by the image forming apparatus 100 performing a scan to the cloud service server 103 until the cloud service server 103 performs image processing.
As an example of an image to be obtained, a scanned image obtained by scanning a regular size sheet is assumed.
After the activation of the scan application 500, the screen display unit 501 of the scan application 500 produces, on the operation unit 206, a display to prompt a user to start a scan.
At step S801, upon receipt of instructions to start a scan from a user via the screen display unit 501, the scan application 500 gives instructions to perform a scan to the scan processing unit 502. The scan processing unit 502 obtains image data by driving the scanner unit 210 and generates a scanned image by using the image processing unit 503. The scan processing unit 502 stores the generated scanned image in the scan data management unit 504.
At step S802, the image processing unit 503 of the scan application 500 performs image processing, such as filtering processing and color correction processing, on the scanned image generated at step S801. The scanned image after the image processing is stored in the scan data management unit 504.
Next, at step S803, the communication unit 505 transmits the stored scanned image to the cloud service server 103.
Next, processing is performed in the cloud service server 103. At step S804, the image processing application 700 of the cloud service server 103 receives the image data (scanned image) transmitted from the image forming apparatus 100. The scanned image data management unit 703 of the cloud service server 103 stores the received scanned image.
At step S805, the image combining position determination unit 702 of the cloud service server 103 searches the scanned image for a position at which another image is to be combined. In order to make the explanation specific, an example is given below.
In a case where a two-dimensional code 900 indicating a combining position is detected within the scanned image, the image combining position determination unit 702 determines the position at which the two-dimensional code 900 is printed to be the position where the captured image is to be combined.
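As a rough, non-authoritative illustration of this position determination, the following Python sketch locates a two-dimensional code in a scanned image and reports its bounding rectangle as the candidate combining position. The pyzbar library, the function name, and the convention of using the code's own rectangle as the insertion area are assumptions made for illustration, not the implementation of the embodiment.

```python
# Hypothetical sketch: detect a 2D (QR) code and use its rectangle as the
# combining position. pyzbar and this convention are assumed here.
from PIL import Image
from pyzbar.pyzbar import decode

def find_combining_position(scanned_path):
    """Return (left, top, width, height) of the first detected code,
    or None in a case where no code is found."""
    image = Image.open(scanned_path)
    codes = decode(image)  # pyzbar returns detected codes, each with a .rect
    if not codes:
        return None  # e.g., fall back to a user-specified position
    rect = codes[0].rect  # Rect(left, top, width, height)
    return (rect.left, rect.top, rect.width, rect.height)
```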
At step S806, the scanned image information obtaining unit 706 of the cloud service server 103 obtains scanned image information by using the scanned image. For example, the scanned image information obtaining unit 706 obtains scanned image information, such as a histogram of the image, a result of chromatic/achromatic color pixel determination, an achromatic color pixel histogram, and MTF characteristics. Based on the scanned image information obtained at step S806, the image processing application 700 of the cloud service server 103 makes adjustment at step S814, to be described later, so that the dynamic ranges of the scanned image and the captured image match. The image processing application 700 of the cloud service server 103 stores the scanned image information in the HDD 304.
To make the explanation of the processing to obtain scanned image information more specific, the units involved in the processing are explained below.
The histogram obtaining unit 1000 obtains a histogram of an image. The histogram obtaining unit 1000 obtains a histogram for each channel in a case of a color image, or a histogram for one channel in a case of a monochrome image. In order to speed up the processing, it is also possible to use a signal of only one channel for a color image, or to handle a plurality of channels as one channel by averaging the channels.
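A minimal Python sketch of this histogram obtaining, assuming NumPy and Pillow as tooling, might look as follows; the merge_channels option corresponds to the channel-averaging shortcut mentioned above.

```python
import numpy as np
from PIL import Image

def obtain_histograms(image_path, merge_channels=False):
    """Obtain 256-bin histograms: one per channel for a color image, or a
    single histogram for a monochrome image or when channels are merged."""
    pixels = np.asarray(Image.open(image_path))
    if pixels.ndim == 2:  # monochrome: one channel, one histogram
        return [np.bincount(pixels.ravel(), minlength=256)]
    if merge_channels:  # average RGB into one channel to speed up processing
        gray = pixels[..., :3].mean(axis=2).astype(np.uint8)
        return [np.bincount(gray.ravel(), minlength=256)]
    return [np.bincount(pixels[..., c].ravel(), minlength=256)
            for c in range(3)]
```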
The base color determination unit 1001 determines the base color and the background color based on the histogram obtained by the histogram obtaining unit 1000. For example, in a case of color data in the RGB color space, the chromatic/achromatic color pixel determination unit 1002 converts the color space into the L*a*b* color space for each pixel and determines whether the color is a chromatic color or an achromatic color from the value of a color difference. The achromatic color part histogram obtaining unit 1003 obtains a histogram of the achromatic color pixel of the pixels determined by the chromatic/achromatic color pixel determination unit 1002. The reference black determination unit 1004 obtains the blackest signal value from the histogram of the achromatic color pixel obtained by the achromatic color part histogram obtaining unit 1003 and determines reference black. The resolving characteristics obtaining unit 1005 measures the MTF (Modulation Transfer Function) etc. in the image and obtains the resolving power within the image. The image blurring determination unit 1006 determines the degree of blurring of the image based on information on the resolving power of the image obtained by the resolving characteristics obtaining unit 1005.
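The chromatic/achromatic determination and the derivation of reference black could be sketched as below. To keep the example short, the RGB channel spread stands in for the L*a*b* color difference described above, and the threshold and noise-floor values are illustrative assumptions rather than values from the embodiment.

```python
import numpy as np
from PIL import Image

CHROMA_THRESHOLD = 16  # assumed stand-in for a L*a*b* color difference test

def achromatic_histogram_and_reference_black(image_path):
    """Classify pixels as chromatic/achromatic, build the achromatic-pixel
    histogram, and take the darkest populated value as reference black."""
    rgb = np.asarray(Image.open(image_path).convert("RGB")).astype(np.int16)
    spread = rgb.max(axis=2) - rgb.min(axis=2)  # crude chroma measure
    achromatic = spread < CHROMA_THRESHOLD      # near-gray pixels
    gray = rgb.mean(axis=2).astype(np.uint8)
    hist = np.bincount(gray[achromatic].ravel(), minlength=256)
    if not achromatic.any():
        return hist, 0
    # Reference black: darkest value among achromatic pixels, ignoring
    # sparse noise below an assumed 0.01% of the achromatic pixel count.
    cumulative = np.cumsum(hist)
    noise_floor = max(1, int(achromatic.sum() * 0.0001))
    reference_black = int(np.searchsorted(cumulative, noise_floor))
    return hist, reference_black
```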
In the manner described above, the cloud service server 103 obtains the scanned image information, and the processing proceeds to step S807.
Next, the processing to obtain a captured image in the present embodiment is explained. In the following, a flow of processing is explained, from transmitting a captured image obtained by performing image capturing at the mobile terminal 106 to the cloud service server 103 until the cloud service server 103 performs image processing.
As an example of an image to be obtained, it is assumed that a captured image obtained by capturing an image of an irregular size sheet is used.
A flow of the processing to obtain a captured image is explained by using a sequence chart.
After the activation of the image capturing application 600, the screen display unit 601 of the image capturing application 600 produces a display to prompt the start of image capturing on the touch panel unit 406.
At step S809, upon receipt of instructions to start image capturing from a user via the screen display unit 601, the image capturing application 600 gives instructions to perform image capturing to the camera processing unit 602. The camera processing unit 602 obtains image data by driving the camera unit 408. The image processing unit 603 generates a captured image by using the image data obtained by the camera processing unit 602. The captured image data management unit 604 stores the generated captured image.
At step S810, the image processing unit 603 performs image processing, such as filtering processing, color correction processing, and dynamic range adjustment, on the captured image generated at step S809. At step S810, it may also be possible for the image processing unit 603 to perform processing, such as trimming processing to remove unnecessary portions of the captured image, projection conversion, and trapezoid correction. The processing such as this may instead be performed within the cloud service server 103 after the communication unit 605 transmits the captured image to the cloud service server 103.
Next, at step S811, the communication unit 605 transmits the stored captured image to the cloud service server 103 by using the wireless communication unit 410.
Next, the processing moves to the processing by the cloud service server 103. At step S812, the image processing application 700 of the cloud service server 103 receives image data (captured image) transmitted from the mobile terminal 106 and stores the captured image by using the captured image data management unit 704.
At step S813, the captured image information obtaining unit 707 of the image processing application 700 obtains captured image information, such as a histogram of the captured image, a result of chromatic/achromatic color pixel determination, an achromatic color pixel histogram, and MTF characteristics. Based on the captured image information obtained at step S813, adjustment is made at step S814, to be described later, so that the dynamic range will be the same as that of the scanned image. The captured image information obtaining unit 707 stores the obtained captured image information in the HDD 304. The method for obtaining captured image information may be the same as the method for obtaining scanned image information described above.
The explanation of the sequence of the processing is continued in the following.
At step S814, the image processing unit 705 of the cloud service server 103 performs processing to match the features of the scanned image and the captured image based on the scanned image information and the captured image information. In other words, the image processing unit 705 performs correction to match the features of the scanned image and those of the captured image based on the histogram, the chromatic/achromatic color pixel determination results, the achromatic color pixel histogram, the MTF characteristics, etc., included in the scanned image information and the captured image information. As a specific example of the correction to match the features of the images, mention is made of the processing to match the dynamic range of the captured image with the dynamic range of the scanned image. Further, mention is made of the processing to match the reference black of the achromatic color, typified by the character part of the captured image, with the reference black of the scanned image, because it is difficult to determine reference black in a captured image obtained via a camera. Furthermore, mention is made of the filtering processing in which the MTF characteristics of the scanned image and the captured image, respectively, are obtained and, in a case where the resolving power is low, for example, an edge enhancement filter is used. Still furthermore, mention is made of the shadow removal processing in which a threshold value for removing the influence of shadow etc. captured in the captured image at the time of image capturing by a camera is determined based on the background color of the scanned image.
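As one possible reading of the shadow removal mentioned above, the sketch below lifts pixels of the captured image that lie slightly below the scanned image's background level back up to that level; the margin parameter and the threshold derivation are assumed tuning choices, not values given in the embodiment.

```python
import numpy as np

def remove_shadow(captured_gray, scanned_background, margin=24):
    """Treat pixels within `margin` below the scanned background level as
    shadow on paper white and lift them to the background level."""
    threshold = scanned_background - margin  # assumed threshold derivation
    out = captured_gray.copy()
    shadow = (out >= threshold) & (out < scanned_background)
    out[shadow] = scanned_background
    return out
```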
To explain the processing by the image processing unit 705 at step S814 more specifically, explanation is given below along a flow of the processing.
At step S1101, the image processing unit 705 obtains scanned image information and captured image information. At step S1102, the image processing unit 705 obtains an image to be corrected. As explained previously, in the present embodiment, the processing to correct a captured image with a scanned image being taken as a reference is explained, and therefore, at step S1102, a captured image to be corrected is obtained.
At step S1103, the image processing unit 705 determines whether the captured image obtained at step S1102 is a character- or text-based document image, or a photo or natural image. For example, the image processing unit 705 determines whether or not the captured image is a character-based image by referring to the edge feature amount of each block in a predetermined range in the captured image. In a case where it is determined that the captured image is not a document image, the image processing unit 705 exits the processing. The reason for this is that there is a possibility that making histogram adjustment in accordance with the scanned image will break the tone of a photo, a natural image, etc. In a case where it is determined that the captured image is a document image, the image processing unit 705 causes the processing to proceed to step S1104.
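The block-wise, edge-feature-based determination could be approximated as in the following sketch; the block size, gradient threshold, and edge-ratio constants are placeholders rather than values from the embodiment.

```python
import numpy as np
from PIL import Image

EDGE_THRESHOLD = 24    # assumed gradient magnitude counting as an edge
DOCUMENT_RATIO = 0.05  # assumed fraction of edge pixels typical of text

def is_document_image(image, block=64):
    """Rough document/photo classifier: text pages tend to show many
    strong gradients, so count high-gradient pixels per block and vote."""
    gray = np.asarray(image.convert("L")).astype(np.float32)
    gx = np.abs(np.diff(gray, axis=1))  # horizontal gradient
    gy = np.abs(np.diff(gray, axis=0))  # vertical gradient
    edges = (gx[:-1, :] + gy[:, :-1]) > EDGE_THRESHOLD
    h, w = edges.shape
    votes = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            votes.append(edges[y:y + block, x:x + block].mean() > DOCUMENT_RATIO)
    return bool(votes) and sum(votes) > len(votes) / 2
```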
At step S1104, the image processing unit 705 adjusts the dynamic range of the captured image obtained at step S1102. Specifically, the image processing unit 705 corrects the reference black of the captured image by using information on the reference black included in the scanned image information. Further, the image processing unit 705 makes histogram adjustment of the captured image by using information on the histogram included in the scanned image information and information on the histogram included in the captured image information.
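A simplified, assumed form of this adjustment is a linear levels remap that aligns the captured image's reference black and paper white with those of the scanned image; a fuller implementation might use histogram matching instead of the linear mapping used here.

```python
import numpy as np

def adjust_dynamic_range(captured_gray, captured_black, captured_white,
                         scanned_black, scanned_white):
    """Linearly remap the captured image's levels so that its reference
    black and paper white line up with those of the scanned image."""
    captured = captured_gray.astype(np.float32)
    scale = (scanned_white - scanned_black) / max(
        captured_white - captured_black, 1)
    remapped = (captured - captured_black) * scale + scanned_black
    return np.clip(remapped, 0, 255).astype(np.uint8)
```

The black and white endpoints here could be supplied by, for example, the reference black derivation and histogram peaks sketched earlier.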
Next, at step S1105, the image processing unit 705 performs processing to adjust the base color. The image processing unit 705 determines the base color of the captured image from base color information included in the scanned image information and base color information included in the captured image information, and removes the base color by performing tone curve correction.
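One hedged interpretation of this base color adjustment is a per-channel tone curve that maps the captured image's estimated paper color to the scanned image's base color, leaving darker (ink) pixels nearly unchanged in relative terms:

```python
import numpy as np

def remove_base_color(captured_rgb, captured_base, scanned_base=255):
    """Map the captured paper color to the scanned base color per channel.
    captured_base is, e.g., the per-channel histogram peak of the paper;
    both it and the default target value are illustrative assumptions."""
    out = captured_rgb.astype(np.float32)
    for c in range(3):
        base = max(float(captured_base[c]), 1.0)  # avoid division by zero
        out[..., c] *= scanned_base / base
    return np.clip(out, 0, 255).astype(np.uint8)
```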
Next, at step S1106, the image processing unit 705 performs filtering processing on the captured image based on image blurring information included in the scanned image information and image blurring information included in the captured image information. The image processing unit 705 performs filtering by using an edge enhancement filter in a case where the captured image is more blurred than the scanned image or, conversely, by using a smoothing filter in a case where the captured image is too sharp.
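The blur comparison could be approximated with a sharpness proxy such as Laplacian variance in place of a full MTF measurement; the 3x3 kernels and the comparison margin below are illustrative choices, not values from the embodiment.

```python
import numpy as np

def laplacian_variance(gray):
    """Sharpness proxy: variance of a 4-neighbor Laplacian response."""
    g = gray.astype(np.float32)
    lap = (4 * g[1:-1, 1:-1] - g[:-2, 1:-1] - g[2:, 1:-1]
           - g[1:-1, :-2] - g[1:-1, 2:])
    return lap.var()

def select_filtered_image(captured_gray, scanned_sharpness, margin=1.2):
    """Apply edge enhancement when the captured image is blurrier than the
    scanned image, smoothing when it is noticeably sharper."""
    sharpness = laplacian_variance(captured_gray)
    if sharpness * margin < scanned_sharpness:    # captured is blurrier
        kernel = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], np.float32)
    elif sharpness > scanned_sharpness * margin:  # captured is too sharp
        kernel = np.full((3, 3), 1.0 / 9.0, np.float32)
    else:
        return captured_gray                      # close enough: no filtering
    g = np.pad(captured_gray.astype(np.float32), 1, mode="edge")
    h, w = captured_gray.shape
    out = sum(kernel[i, j] * g[i:i + h, j:j + w]
              for i in range(3) for j in range(3))
    return np.clip(out, 0, 255).astype(np.uint8)
```

Here, scanned_sharpness would be obtained by applying laplacian_variance to the scanned image in the same way.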
Next, at step S1107, the image processing unit 705 outputs the corrected image in a case where correction is performed at step S1104 to step S1106 or outputs the image obtained at step S1102 in a case where correction is not performed.
The above is the explanation of the image processing at step S814. Next, the explanation returns to the sequence of the processing.
At step S815, the image processing application 700 of the cloud service server 103 stores the corrected image in the HDD 304.
At step S816, the image combining unit 708 of the cloud service server 103 performs processing to combine the scanned image and the corrected captured image. Specifically, the corrected captured image is inserted at the combining position within the scanned image determined at step S805. By the processing at step S816, a document in which the captured image is integrated into the scanned image is created.
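The combining itself can be pictured as a paste into the determined position, as in the minimal sketch below; resizing the corrected image to fit the reserved area is an assumption for illustration.

```python
from PIL import Image

def combine_images(scanned_path, corrected_path, position, output_path):
    """Insert the corrected captured image into the scanned image at the
    combining position (left, top, width, height), e.g., the rectangle of
    the two-dimensional code determined at step S805, and save one file."""
    scanned = Image.open(scanned_path).convert("RGB")
    corrected = Image.open(corrected_path).convert("RGB")
    left, top, width, height = position
    corrected = corrected.resize((width, height))  # fit the reserved area
    scanned.paste(corrected, (left, top))
    scanned.save(output_path)  # the combined result forms one file
```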
At step S817, the image processing application 700 returns the image upload results to the mobile terminal 106 as a response to step S811.
Finally, browsing of the image as the result of combining the scanned image and the captured image in the present embodiment is explained. Here, an example in which browsing is performed by using the mobile terminal 106 is introduced, but the example is not limited to this, and it is also possible to perform browsing by, for example, the PC 101 connected to the LAN 105. Besides this, in a case of an MFP, such as the image forming apparatus 100, it is also possible to produce a printout of the combined image.
At step S818, the Web browser 606 of the mobile terminal 106 accesses a specific URL (Uniform Resource Locator) on the cloud service server 103. The URL indicates a Web page on which images stored in the cloud service server 103 can be browsed and, for example, the URL may be notified to the mobile terminal 106 in the response to the image upload at step S817.
At step S819, information on the URL at which images can be browsed is returned from the cloud service server 103 to the Web browser 606 as a response to the request to check images.
At step S820, the Web browser 606 browses the integrated image by accessing the URL returned at step S819.
By performing the above processing, it is made possible to match the dynamic ranges, color tones, and base colors at the time of combining images obtained by different devices or inserting one image into another. In the present embodiment, explanation is given by using the example in which the captured image is corrected in a case where the scanned image and the captured image obtained by a camera are integrated, but the example is not limited to this. For example, it is also possible to perform the same processing on document data generated by a PC and a captured image, on document data generated by a PC and a scanned image, or on document data generated by a PC, a scanned image, and a captured image.
Further, in the present embodiment, the example is described in which an image obtained from another device is combined within one image, but the example is not limited to this, and it is also possible to perform the processing in a case where an image is inserted between pages of a document, for example.
In the first embodiment, the method for matching the dynamic ranges, color tones, and base colors by obtaining histograms etc. from whole images in a case where images obtained by different devices are combined or one image is inserted into another is explained. However, in a case where correction is performed by simply determining dynamic ranges etc. from whole images, if a plurality of attributes exists within an image, adjustment may be made in accordance with the features of an unintended attribute. For example, a case is supposed where an image of a text document captured by a camera (captured image) partially includes a photo attribute region. In a case where the captured image is corrected so as to match the scanned image in accordance with the method in the first embodiment, the dynamic range and the histogram are determined from the image of the document obtained by a scanner (scanned image). Because of this, the character attribute portion of the captured image is corrected as desired, but there is a possibility that the photo attribute portion will lose its tone properties.
Consequently, in the second embodiment, an example is explained in which processing to determine an attribute, such as a character attribute and a photo attribute, is incorporated in the processing of obtaining information, such as a histogram, and in the processing of performing image correction in accordance with the information, and histogram adjustment etc. is made for each attribute.
The flow of the processing in the present embodiment is explained below.
At step S1603, the image processing unit 705 determines whether or not a captured image is a document image, and in a case where it is determined that the captured image is a document image, the processing is caused to proceed to step S1610. At step S1610, the image processing unit 705 determines attributes, such as the character attribute and the photo attribute, of objects within the captured image.
The processing at step S1603 and that at step S1610 may be put together into one piece of processing as common processing to determine whether or not the attribute is a character. For example, the image processing unit 705 determines all the pixels based on the edge feature amount in a predetermined range of the captured image. In a case where it is determined that no character region is included in the captured image as the result of the determination, the processing proceeds as in the case of NO at step S1603. In a case where a character region is included, even partially, the image processing unit 705 performs the processing at step S1604 to step S1606 on the character region of the captured image by using information on the character region in the captured image and information on the character region in the scanned image information.
Next, at step S1604 to step S1606, the same processing as that explained in the first embodiment is performed on the region determined to have the character attribute.
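A hedged sketch of such attribute-wise correction follows: a correction function is applied only where a boolean character-attribute mask is set, so that photo-attribute regions keep their tone. The mask and the correction function are assumed inputs produced by the attribute determination and the first-embodiment corrections, respectively.

```python
import numpy as np

def correct_by_attribute(captured_gray, character_mask, correct_fn):
    """Apply `correct_fn` (e.g., the dynamic range adjustment sketched
    earlier) only to pixels whose attribute is 'character'; pixels of
    other attributes, such as photo, are left untouched."""
    out = captured_gray.copy()
    corrected = correct_fn(captured_gray)
    out[character_mask] = corrected[character_mask]
    return out
```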
By performing the above processing, it is made possible to adjust differences in color tone, dynamic range, and color between devices without destroying the tone of the portions other than characters, even in a case where an attribute other than the character attribute exists within an image.
In the first embodiment and the second embodiment, the method for correcting an image based on obtained image information by obtaining image information, such as a histogram, from the image itself, such as a scanned image and a captured image, is explained. However, there is a limit to the image information that can be obtained from the images themselves. In order to bring the image quality of an image obtained by a first device close to that of an image obtained by a second device, the presence of information on the devices used to obtain the images will make the method more effective.
Consequently, in a third embodiment, a configuration is explained that makes it possible to perform correction achieving equivalent image quality by storing, within the cloud service server, device information on the devices used to obtain images and by further using the stored device information at the time of correction.
As explained above, not only by obtaining information, such as histograms, from images obtained from devices but also by using various kinds of information on the devices and the device characteristics, it is made possible to improve accuracy of adjustment of color tones and image qualities between images.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2013-242152, filed Nov. 22, 2013, which is hereby incorporated by reference herein in its entirety.