Update control of image processing control data

Information

  • Patent Grant
    8279481
  • Patent Number
    8,279,481
  • Date Filed
    Monday, April 11, 2011
  • Date Issued
    Tuesday, October 2, 2012
Abstract
This invention relates to the updating of image processing control data that is related to image data and controls image processing of the image data. An image related data generator generates an image file that includes image data and image processing control data pre-stored therein. The image processing control data can be updated according to the following process. The image related data generator sends specification data, which specifies the image processing control data to be updated, to an update data server. The image related data generator then receives the update data from the update data server, and updates the image processing control data stored therein with the update data.
Description
BACKGROUND OF THE INVENTION

1. Field of the Invention


This invention relates to a technique for image processing of image data, based on image processing control data that is related to the image data and is used to control the image processing of the image data.


2. Description of the Related Art


Various types of image generators, such as digital still cameras, digital video cameras, and scanners, are widely used. Various types of image output devices, such as computer displays and printers, are also widely used. Image data to be output can be retouched with photo-retouch software.


This retouching, however, requires the user to be highly skilled, and it is therefore difficult to generate retouched image data that sufficiently reflects the characteristics of the image generator and the intention of the user. It is also labor intensive to carry out such retouching whenever the images are output, which reduces the convenience of the retouch software and of the image output devices.


One available technique to reduce the amount of labor required and make the image processing easier is to attach image processing control data to image data when the image data is generated, thereby controlling image processing based on the image processing control data. In this technique, the image processing control data is preferably updated afterwards.


The update of the image processing control data is also labor intensive, because the user has to check whether the image processing control data should be updated or not, and then acquire the correct update data for the image processing control data to be updated.


SUMMARY OF THE INVENTION

This invention solves at least part of the above-mentioned problem, and reduces the amount of labor required to update the image processing control data, thereby making image processing more easily reflect the user's intentions.


This invention provides a number of embodiments. One embodiment is an image related data generator that is connected to a server. The image related data generator comprises an image data generator, a control data storage module, a relating module, and a storage manager. The image data generator is configured to generate image data. The control data storage module pre-stores image processing control data to control image processing of the image data. The relating module is configured to relate the image data to the image processing control data. The storage manager is configured to update at least a part of the image processing control data stored in the control data storage, by sending specification data, which specifies the image processing control data to be updated, to the server and receiving update data from the server.


The image related data generator can easily update the image processing control data stored therein.


In this invention, the image related data generator may be implemented as, e.g., a digital still camera or a scanner. In this case, at least part of the control data storage module and the storage manager can be provided by another device, such as a computer. The image related data is typically constructed as an image file that includes the image data and the image processing control data therein.


The server performs the function of transmitting the update data corresponding to the specification data sent from the image related data generator. The specification data specifies the image processing control data to be updated and the image processing control data itself can be used as the specification data.


The update data corresponding to the specification data can be a new version of the image processing control data corresponding to the specification data.


The transmission between the server and the image related data generator can be implemented in various manners, such as via a communication line, e.g., a network such as the Internet or a LAN, and via a recording medium, e.g., memory cards.


Various triggers are available for the transmission of the update data, such as an instruction by the user of the image related data generator and a predetermined time period. It is also possible to check whether or not there is update data in the server and then acquire any such update data, according to the user's instruction. The information as to whether or not there is update data in the server can be periodically transmitted to the image related data generator.


In one preferred embodiment of the present invention, the storage manager may add the update data into the control data storage, or may overwrite the image processing control data corresponding to the specification data with the update data. In the former case, the old version of the image processing control data remains available alongside the new version. In the latter case, the old version of the image processing control data is deleted, which makes effective use of the hardware resources of the control data storage module and simplifies management of the image processing control data when a large amount of it is stored in the control data storage.


The overwritten area of the control data storage does not have to correspond to the area where the old version of the image processing control data is stored. By way of example, it is also possible to store the update data into a new area of the control data storage and make the old version invalid.


In one preferred embodiment of the present invention, the specification data can be at least a part of the image processing control data. In this embodiment, the data structure of the control data storage is simplified, because there is no need to store additional data as the specification data.


In one preferred embodiment, the image related data generator may include an identification data storage module configured to store identification data mapped to the image processing control data, the identification data identifying each piece of the image processing control data. In this embodiment, the specification data can be the identification data. This makes the data size of the specification data relatively small, thereby reducing the amount of hardware resources required to store the specification data.


By way of example, the identification data may include the name or the version number of the image processing. The data size of this type of identification data is generally smaller than that of the image processing control data. Accordingly, using this small specification data can reduce the time required for transmitting the update data from the server. It also reduces the use of the hardware resources of the server where the update data is stored and the time required for retrieving the update data, thereby enhancing the speed of the server's update data transmission process.


The identification data can include attribute data of the image related data generator, such as the name of the device and the properties of the generation of the image data, as well as data related to the image output devices, such as the name of the device and the properties of the processing by the image output devices. In this embodiment, the image related data generator can acquire the update data corresponding to the image related data generator and the image output device specified by the identification data.
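
As a rough illustration of the kind of identification data described above, the sketch below models it as a small record combining the processing name, version number, and device attributes. The field names and the Python representation are assumptions for illustration only; the patent does not define a concrete format.

```python
from dataclasses import dataclass

# Illustrative only: the field names below are assumptions, not a format defined
# by the patent; they mirror the kinds of identification data mentioned above.
@dataclass
class IdentificationData:
    mode_name: str      # name of the image processing ("Portrait", "Standard", ...)
    version: str        # version number of the image processing control data
    camera_model: str   # attribute data of the image related data generator
    printer_model: str  # data related to the intended image output device

# Such a record is far smaller than the control data itself, so using it as the
# specification data keeps the request sent to the server compact.
spec = IdentificationData("Portrait", "1.0", "DSC-10", "ColorPrinter-11")
```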


Another embodiment of the present invention is an image related data editor, which is connected to a server. The image related data editor comprises an input module, an update data acquiring module, and a relating module. The input module is configured to input image data and attribute data related to the image data, with the attribute data including at least one of image processing control data to control image processing of the image data and identification data to identify the image processing control data. The update data acquiring module is configured to acquire update data of the image processing control data based on the attribute data from the server. The relating module is configured to relate the image data to the update data.


The editing of the attribute data may be conducted in either of two ways. The first way is to input image related data that includes the image processing control data therein, and to modify the image processing control data. The second way is to input image related data that includes the identification data therein, and to replace the identification data with the image processing control data corresponding to the identification data.


The image related data editor of the present invention can easily edit the attribute data included in the image related data so that the update data acquired from the server is included therein.


The server can be placed outside of the image related data editor or included in the image related data editor. Further, the server can be constructed with storage resources both outside and inside of the image related data editor, and can switch over between such storage resources according to a predetermined condition.


The image related data editor can be used in combination with the image related data generator and other image output devices. Even when the image related data generator and the image output devices cannot use the update data by themselves, this combination can be used to construct an image related data generating system or an image processing system that uses the update data.


In the case where the image related data editor can perform the function of transmitting image data via a network, it can be used as an image related data server that transmits image related data via the network. In this embodiment, the image related data editor can input image related data including the attribute data, generate modified image related data that reflects the update data, and transmit the modified image related data to other devices connected to the network. The destination may be the same device from which the original image related data is input, or may be a different device. The user who intends to transmit the image related data can easily apply the update data to the image related data in this manner.


Another embodiment of the present invention is an update data transmit apparatus, which is connected to a predetermined outer device. The update data transmit apparatus comprises an update data storage module, an input module, and an update data transmission module. The update data storage module pre-stores update data for image processing control data to control image processing of image data, the update data being mapped to at least one of the image processing control data and the identification data to identify the image processing control data. The input module is configured to input predetermined data from the outer device, the predetermined data including the image processing control data to be updated or the identification data. The update data transmission module is configured to transmit the update data corresponding to the predetermined data to the outer device. The update data transmit apparatus is typically constructed with a server that performs the functions of storing and transmitting the update data.


In this embodiment, various devices including the image related data generator and the image related data editor can, even when they do not store the update data therein, acquire and use the update data through communication with the update data transmit apparatus.


The update data transmit apparatus can integrally manage the mapping of the image processing control data and the identification data to the update data. This mapping is available for responding to requests for the update data sent from various devices, including the image related data generator and the image related data editor.


The image processing control data of the present invention can be set according to the image output devices that use the data during image processing. By way of example, the image processing control data can be set according to the type of the image output device. By way of example, the type of the image output device may be categorized into one of the following groups: personal computers that execute image processing application programs, printers, and projectors. The type also may be categorized according to the maker of the printer.


The update data transmit apparatus transmits the update data corresponding to the type of the image output device, thereby implementing appropriate image processing according to such type. By way of example, a user of a new type of printer can easily acquire the image processing control data corresponding to such new type. The maker of the printer also can easily provide the image processing control data for such new type.


In addition to the image related data generator, the image related data editor, and the update data transmit apparatus, the present invention also provides a method for generating or editing the image related data, and a method for transmitting the update data. A computer program that causes a computer to execute each function of the above-mentioned devices or apparatuses, and a computer readable recording medium in which the computer program is stored, are also provided.


Typical examples of the recording medium include flexible disks, CD-ROMs, magneto-optic discs, IC cards, ROM cartridges, punched cards, prints with barcodes or other codes printed thereon, internal storage devices (memories such as a RAM and a ROM) and external storage devices of the computer, and a variety of other computer readable media.


The above and other objects, features, aspects, and advantages of the present invention will become more apparent from the following detailed description of the preferred embodiments when read in conjunction with the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a schematic that shows the general construction of the image output system.



FIG. 2 is a schematic that shows the general construction of the digital still camera.



FIG. 3 is a schematic that shows the function blocks of the digital still camera.



FIG. 4 is a schematic that shows the function blocks of the update data server.



FIG. 5A and FIG. 5B are schematics that show the data structure of the image file.



FIG. 6A and FIG. 6B are schematics that show an example of the interface window to set the image processing mode.



FIG. 7 shows an example of the parameters of the image processing control data.



FIG. 8 is a flowchart of the image processing.



FIG. 9A and FIG. 9B are schematics that show an example of the interface window to update the image processing control data.



FIG. 10A and FIG. 10B are schematics that show an example of the interface window to apply the update data.



FIG. 11 is a flowchart of the update data transmission process.



FIG. 12 is a schematic that shows a first example of the processing of the update data server.



FIG. 13 is a schematic that shows a second example of the processing of the update data server.



FIG. 14 is a schematic that shows the general construction of the image output system.



FIG. 15 is a schematic that shows the function blocks of the image related data editor.



FIG. 16 is a flowchart of the image file edit process.



FIG. 17 is a schematic that shows the general construction of the image output system.



FIG. 18 is a schematic that shows the function blocks of the image file transmission server.



FIG. 19 is a flowchart of the image file transmission process.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

Exemplary embodiments of the present invention are discussed below in the following sequence:

    • A. Image output system;
    • A1. Construction of the System;
    • A2. Function Blocks of System;
    • A3. Data Structure of Image Files;
    • A4. Processing Mode and Image Processing Control Data;
    • A5. Image Processing Apparatus;
    • A6. Update of Image Processing Control Data;
    • A7. Advantages of the System;
    • B. Second Embodiment; and
    • C. Third Embodiment.


A. Image Output System

A1. Construction of the System



FIG. 1 is a schematic that shows the general construction of the image output system. The image output system of the first embodiment includes a digital still camera (DSC) 10 as an image related data generator, a color printer 11 as an image processing apparatus and an image output apparatus, and an update data storage server (UDS server) 12 as an update data transmit apparatus. Data transmission is achieved through a memory card MC between the DSC 10 and printer 11, and via a network INT between the DSC 10 and the UDS server 12. For this purpose, the DSC 10 is equipped with a network connection terminal PLG and a memory card slot SLT1, and the printer 11 is equipped with a memory card slot SLT2.


The DSC 10 generates an image file and transmits the image file to the printer 11 through the memory card MC. Two kinds of data are related and stored in the image file: image data and image processing control data (hereinafter referred to as “PIM parameters” or “PIM”). This data is used to control image processing of the image data that the printer 11 executes. The PIMs are pre-stored in the DSC 10, and attached in the image file according to the user's instructions.


In this embodiment, the image files adopt a specific file structure in conformity with the Exif (Exchangeable Image File) format standard for digital still cameras. The Exif format defines an image data recording area to record image data therein, and an attribute information recording area to record various attribute information related to the image data.


Though the DSC 10 as the image related data generator is discussed in this embodiment, various devices, such as scanners and digital video cameras, may be used as the image related data generator. In the case where the image related data generator can record movies, e.g., digital video cameras, the MPEG format may be used for the image data included in the image file. The image file including a movie, as image data, and the PIM can be used to control the image processing for the movie when it is output.


The printer 11 receives the image file from the DSC 10, analyzes it, and extracts the image data and the PIM from the image file. The printer 11, then, executes the image processing of the image data according to the PIM and prints out the image.


For data transmission between the printer 11 and the DSC 10, various techniques are applicable besides recording media such as memory cards; such techniques include networks, e.g., a LAN, and communication through infrared links and cables. Various devices, such as CRTs and projectors, are applicable as the image processing apparatus and the image output apparatus.


The UDS server 12 sends update data to the DSC 10. The update data to be sent is determined based on the specification data sent from the DSC 10. The specification data includes predetermined information that specifies the update data to be sent. As discussed later, the specification data may include a part of the PIM, or identification data to identify the PIM. The UDS server 12 retrieves the update data according to the received specification data and determines the update data to be sent. The determined update data is transmitted to the DSC 10.


The client to which the UDS server 12 transmits the update data can be selected from among various devices that can use the update data for the PIM.


Various types of networks are applicable for the network INT including wide area networks, e.g., the Internet, and relatively limited networks, e.g., a LAN.



FIG. 2 is a schematic that shows the general construction of the digital still camera (DSC). The DSC 10 generates image data by collecting and focusing light rays onto a digital device, such as a CCD or a photoelectric scanner. In the DSC 10, the image data is generated by an optical circuit, which includes a CCD that collects information regarding the light rays, in combination with circuitry that converts the voltage signals of the CCD into image data.


The DSC 10 includes an optical circuit 20 to collect information regarding the light rays, an image generation circuit 21 to control the optical circuit 20 and to acquire images, and a control circuit 23. The control circuit 23 includes a CPU, ROM, and RAM.


The DSC 10 is also equipped with a select button 24 to input various settings and an LCD 25 to display pictures taken by the DSC 10, as well as various settings and information.


The DSC 10 generates image files including the image data and the PIM related to the image data. The user can set the PIM through operations of the select button 24. The image data is generated by the optical circuit 20 as described previously.


The PIM includes various information, such as the following: gamma values of the DSC 10, setting parameters for contrast and color balance, and color space parameters that identify the color space in which the image data is defined. Besides these parameters, the PIM also includes various parameters related to the shooting conditions: exposure, white balance, aperture, shutter speed, and zoom settings.


The meaning of the phrase “color space parameters” is described below. In the DSC 10, the image data is ordinarily generated in the RGB color space from the voltage signals of the CCD. The type of camera determines which of the following two kinds of color spaces is used: the sRGB color space or the NTSC color space. Both color spaces are defined in the RGB coordinate system, and the range of color reproduction of NTSC is broader than that of sRGB. The sRGB color space is generally in the range of 8 bits (0 to 255); a color space that extends beyond this range to negative values or to values of 256 and over (hereinafter referred to as the “extended sRGB space”) may be used instead of the general sRGB color space. Information on the color space used for shooting is included as the color space parameter described previously and attached to the image data as information representing the color reproduction characteristics of the DSC 10.


A2. Function Blocks of System



FIG. 3 is a schematic that shows the function blocks of the DSC. These function blocks are implemented by software in the control circuit 23 where a CPU, ROM and RAM are installed therein (see FIG. 2).


An image data generator 31 controls the optical circuit 20, and generates image data from the information regarding the light rays collected by the image generation circuit 21 (see FIG. 2). A control data storage module 33 stores the PIMs. At least a part of the PIMs is stored in the RAM in a rewritable manner. An image file generator 36 generates an image file that records image data generated by the image data generator 31 and the PIM stored in the control data storage module 33 and related to the image data. The image file is an output file of the DSC 10 and is used in the outer devices, such as the printer 11.


A storage manager 35 manages the PIMs stored in the control data storage module 33 by storing update data, if the update data corresponding to the PIMs stored in the control data storage module 33 can be acquired. The storage manager 35 communicates with the UDS server 12 via the network INT to acquire the update data. The UDS server 12 is connected to the DSC 10 through the network INT, and functions to provide the update data. The storage manager 35 sends specification data to the UDS server 12. The UDS server 12 defines the update data to be transmitted based on the specification data, and transmits the update data.


The storage manager 35 receives the update data, which is the PIM in this embodiment, from the UDS server 12 and stores the PIM into the control data storage module 33. There are two options for storing the PIM into the control data storage module 33: 1) the acquired PIM may be added to the old data; and 2) the acquired PIM may overwrite the old data. In this embodiment, as discussed later, the user can select either one of these options.


An identification data storage module 34 stores identification data that identifies the PIM. The identification data is related to the PIM stored in the control data storage module 33. The identification data is used as the specification data that is sent to the UDS server 12 by the storage manager 35 to acquire the update data.


By way of example, the name, the property, or the version number of the image processing or the PIM is applicable as the identification data.



FIG. 4 is a schematic that shows the function blocks of the update data server. The UDS server 12 receives the specification data from the DSC 10, and transmits the update data corresponding to the specification data.


The UDS server 12 includes the communication module 40 that controls the communication via the network INT. A receiver 41 receives the specification data via the network and inputs the specification data to an update data transmit module 42. The update data transmit module 42 determines the update data to be transmitted according to the specification data, and transmits the update data. The data stored in an update data storage module 49 is used to determine the update data to be transmitted.


The update data storage module 49 stores the update data mapped to the specification data. In this embodiment, the PIM and the identification data are used as the specification data.


Accordingly, the update data transmit module 42 identifies at least one of the PIM and the identification data included in the specification data, and compares it to each piece of PIM and identification data stored in the update data storage module 49, thereby determining the update data to be transmitted. The way of determining the update data to be transmitted is not restricted to the above-described method.


A3. Data Structure of Image Files



FIG. 5A and FIG. 5B are schematics that show the data structure of the image file. As previously described, the image file can take the Exif format that is one of the standards of image files for DSCs.


The Exif format was established by JEIDA (Japan Electronic Industry Development Association). As shown in FIG. 5A, an image file in the Exif format has various recording areas: an image data recording area 59 to store image data; and an attribute information recording area 51 to store attribute information of the image data stored therein. The image data is stored in the JPEG format in the image data recording area 59.


Although the Exif standard applies the JPEG format to record the image data, the image files are not restricted to this format. In addition to the JPEG format, other formats that may be used to format the image data generated by the DSC include the PNG format, the TIFF format, the GIF format, and the BMP format. It will be apparent to those skilled in the art that any format that can include the image data and the attribute data may be used.


The attribute information recording area 51 stores the attribute information. The attribute information recording area 51 has a MakerNote recording area 52, which is a non-defined area and is released to the makers of the DSC. As is well known to the person having ordinary skill in the art, the Exif format uses tags to define each piece of data, and the tag that defines the data in the MakerNote recording area 52 is titled “MakerNote,” which is called a MakerNote tag.


The MakerNote recording area 52 of the image file is divided into several areas, and a tag specifies the data stored in each of them. The recording area where the data related to the image processing is stored is the PrintMatching recording area 53. This area is specified with a “PrintMatching” tag, and is divided into two areas: 1) a recording area of the data structure 54; and 2) a recording area of the image processing 55. The recording area of the data structure 54 records the data structure related to the recording area of the image processing 55.



FIG. 5B schematically shows the data structure of the recording area of the image processing 55. Parameters to control the image processing are recorded as combinations of the name of the tag and the parameter value.
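
Purely as an illustration of such name/value pairs, the recording area of the image processing 55 can be pictured as in the sketch below; the tag names and values are placeholders, not the actual PrintMatching tag set.

```python
# Hypothetical contents of the recording area of the image processing 55,
# written as tag-name/parameter-value pairs. Names and values are placeholders.
image_processing_area = {
    "Version": "1.0",
    "GammaCorrection": 2.2,
    "ColorSpace": "sRGB",
    "Contrast": 0,
    "ColorBalance": (0, 0, 0),
}
```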


A4. Processing Mode and PIM



FIG. 6A and FIG. 6B are schematics that show an example of the interface window to set the image processing mode. This interface window is displayed on the LCD DISP of the DSC. The user can set the processing mode by selecting the “OK” button in the interface shown in FIG. 6A. In this embodiment, several processing modes are available: “1. Standard”; “2. Portrait”; “3. Landscape”; and “4. Sunset”, as shown in FIG. 6B. The selection of the processing mode defines the PIM to be used when the image file is generated.



FIG. 7 shows an example of the parameters of the PIM. There are four processing modes that are suitable for the four kinds of situations that may occur. The parameters of the PIM are pre-determined for each of these processing modes. In this embodiment, the PIM includes the four kinds of parameters shown in FIG. 7: “Gamma correction”; “color space”; “contrast”; and “color balance.”
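
A minimal sketch of such mode-dependent presets is given below. Only the four parameter names come from FIG. 7; the concrete values are invented for illustration.

```python
# Hypothetical PIM presets for the four processing modes of FIG. 6B / FIG. 7.
# The parameter names follow the text; the values are placeholders.
PIM_PRESETS = {
    "Standard":  {"gamma": 2.2, "color_space": "sRGB", "contrast":  0, "color_balance": ( 0, 0,  0)},
    "Portrait":  {"gamma": 2.2, "color_space": "sRGB", "contrast": -5, "color_balance": (+3, 0, -3)},
    "Landscape": {"gamma": 2.2, "color_space": "sRGB", "contrast": +5, "color_balance": ( 0, 0, +3)},
    "Sunset":    {"gamma": 2.2, "color_space": "sRGB", "contrast": +3, "color_balance": (+5, 0, -5)},
}
```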


A5. Image Processing Apparatus



FIG. 8 is a flowchart of the image processing. The processes according to the PIM are depicted with double lines.


The CPU of the color printer 11 inputs an image file and extracts image data from the image file (Step S81). The printer 11 then converts the image data in the YCbCr color space into data in the RGB color space used for shooting (Step S82). An inversion matrix, which is inverse to a matrix used for conversion from the RGB space to the YCbCr space in the digital still camera 10, is applied for this conversion. This process converts the image data into the data in the color space used for shooting, that is, one of the NTSC color space, the sRGB color space, and the extended sRGB color space. In the case of conversion into the extended sRGB color space, the color reproduction range includes negative values and values of and over 256.
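
As a sketch of Step S82, the conversion can be written as a matrix multiplication with the inverse of the camera's RGB-to-YCbCr matrix. The full-range BT.601 matrix used by JPEG is assumed here; the actual matrix of the DSC 10 is not given in the text.

```python
import numpy as np

# Forward full-range BT.601 RGB -> YCbCr matrix (the one commonly used by JPEG);
# assumed here as a stand-in for the camera's conversion matrix.
RGB_TO_YCBCR = np.array([
    [ 0.299,     0.587,     0.114   ],
    [-0.168736, -0.331264,  0.5     ],
    [ 0.5,      -0.418688, -0.081312],
])
YCBCR_TO_RGB = np.linalg.inv(RGB_TO_YCBCR)   # the "inversion matrix" of Step S82

def ycbcr_to_rgb(ycbcr):
    """ycbcr: (..., 3) array, Y in 0..255, Cb/Cr centred on 128."""
    ycbcr = np.asarray(ycbcr, dtype=float)
    centred = ycbcr - np.array([0.0, 128.0, 128.0])
    # No clipping: extended-sRGB data keeps its negative and over-255 values.
    return centred @ YCBCR_TO_RGB.T
```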


The printer 11 then performs gamma correction of the image data (Step S83). The gamma value used for the gamma correction is included in the control data as information representing the characteristics of the digital still camera 10.
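
A minimal sketch of this step follows, assuming the gamma value from the control data is applied as a simple power function; whether the exponent should be the gamma value or its reciprocal depends on how the camera characteristics are recorded.

```python
import numpy as np

def gamma_correct(rgb, gamma):
    """Step S83 sketch: apply the gamma value carried in the control data.

    The linearizing form (exponent = gamma) is assumed; a real implementation
    follows whatever convention the camera maker uses to record its gamma.
    """
    rgb = np.asarray(rgb, dtype=float)
    sign = np.sign(rgb)                       # keep extended-sRGB negative values intact
    return sign * np.power(np.abs(rgb) / 255.0, gamma) * 255.0
```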


After completion of the gamma correction, the printer 11 converts the color space of the image data into the wRGB color space, which is defined to have a wider color reproduction range than the sRGB color space (Step S84). When image data defined in either the NTSC color space or the extended sRGB color space is processed in the sRGB color space, which has the narrower color reproduction range, the colors of the subject may not be reproduced accurately. From this point of view, image data generated in the sRGB color space may skip a series of processing steps, as discussed below. In this embodiment, however, the color space information included in the control data does not discriminate the sRGB color space from the extended sRGB color space. Accordingly, the procedure of this embodiment carries out the conversion of image data generated in the sRGB color space into data in the wRGB color space. Even under such conditions, since image data in the extended sRGB color space includes negative values or values of 256 and over, the extended sRGB color space may be distinguished from the sRGB color space based on these tone values.


Matrix computation is applied for the conversion into the wRGB color space. As described previously, the printer 11 deals with image data defined in either the sRGB color space or the extended sRGB color space and the image data defined in the NTSC color space. Matrices for directly converting the image data in the respective color spaces into data in the wRGB color space may be specified. The procedure of this embodiment, however, carries out conversion via a standard XYZ color space.


The printer 11 first carries out the conversion from the RGB color space to an XYZ color space (Step S84). The conversion process depends upon the color space in which the image data is defined. Accordingly, the procedure of this embodiment provides two conversion matrices in advance, that is, a conversion matrix TM1 for the sRGB color space or the extended sRGB color space and another conversion matrix TM2 for the NTSC color space, and selectively uses one of the two conversion matrices TM1 and TM2 to implement the conversion according to the color space used for shooting. The conversion changes the individual color spaces, in which the image data is generated, into the standard XYZ color space.


The printer 11 then carries out conversion from the XYZ color space to the wRGB color space (Step S85). Matrix computation is also applied for this conversion. An identical matrix is used for this conversion, irrespective of the color space used for shooting. The matrix used for the computation may be set arbitrarily according to the definition of the wRGB color space.
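
The two-stage conversion of Steps S84 and S85 can be sketched as below. The sRGB-to-XYZ matrix is the standard D65 matrix and the NTSC matrix uses the commonly cited 1953 primaries, standing in for TM1 and TM2; the XYZ-to-wRGB matrix is printer defined and not given in the text, so an identity placeholder is used.

```python
import numpy as np

# Stand-ins for the conversion matrices TM1 (sRGB/extended sRGB) and TM2 (NTSC).
TM1_SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                            [0.2126, 0.7152, 0.0722],
                            [0.0193, 0.1192, 0.9505]])
TM2_NTSC_TO_XYZ = np.array([[0.607, 0.174, 0.200],
                            [0.299, 0.587, 0.114],
                            [0.000, 0.066, 1.116]])
# Placeholder: the real XYZ -> wRGB matrix follows the wRGB definition of the printer.
XYZ_TO_WRGB = np.eye(3)

def to_wrgb(rgb, shooting_space):
    """Steps S84-S85: select TM1 or TM2 by the shooting color space, go via XYZ.

    `rgb` is linear RGB (after the gamma correction of Step S83), shape (..., 3).
    """
    rgb = np.asarray(rgb, dtype=float)
    tm = TM2_NTSC_TO_XYZ if shooting_space == "NTSC" else TM1_SRGB_TO_XYZ
    xyz = rgb @ tm.T
    return xyz @ XYZ_TO_WRGB.T
```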


After completion of the color space conversion process, the printer 11 carries out inverse gamma correction (Step S86). The gamma value used here is set, based on the color reproduction characteristics of the image output device.


The printer 11 subsequently carries out automatic adjustment of the picture quality to reflect the intention of the photographer in shooting (Step S87). In this embodiment, the adjustment parameter with regard to, for example, the contrast is included as the color correction parameter in the control data. The printer 11 implements the automatic adjustment based on this parameter. The method of the adjustment is known in the art and is thus not specifically described here.


This series of processing completes the correction of the image data that reflects the color reproduction characteristics of the DSC 10 and the intention of the photographer in shooting. The printer 11 converts the processed image data into a specific format suitable for printing.


The printer 11 causes the RGB image data to be subjected to a color conversion process corresponding to the type of the printer (Step S88). This process converts the RGB color system into the CMYK color system used in the printer. The conversion is performed by referring to a conversion lookup table (LUT) that maps the colors of one color system to the colors of the other color system. The procedure of this embodiment typically uses a conversion table LUTw for conversion of the wRGB color space into the CMYK color space. The printer 11 also stores another conversion table LUTs for the sRGB color space, in order to allow processing of image data defined in the sRGB color space. The printer 11 selectively uses the appropriate conversion table corresponding to the color space in which the image data is defined. The conversion table LUTs may be used, for example, when the color space conversion process of steps S84 and S85 is skipped for the image data obtained in the sRGB color space and when the input image file is output without any processing for adjusting the picture quality.
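
A rough sketch of the table-based conversion of Step S88 is shown below, with the lookup tables modeled as coarse 3-D grids and a nearest-node lookup standing in for the interpolation a real driver performs; the table contents are placeholders, not actual printer data.

```python
import numpy as np

# Hypothetical coarse 17x17x17 lookup tables mapping RGB grid nodes to CMYK.
# Real tables (LUTw, LUTs) are device specific; zeros are placeholders.
N = 17
LUT_W = np.zeros((N, N, N, 4))   # for image data converted to the wRGB color space
LUT_S = np.zeros((N, N, N, 4))   # for image data left in the sRGB color space

def rgb_to_cmyk(rgb, color_space="wRGB"):
    """Step S88 sketch: nearest-node lookup; a real driver interpolates between nodes."""
    rgb = np.asarray(rgb, dtype=float)
    lut = LUT_W if color_space == "wRGB" else LUT_S
    idx = np.clip(np.rint(rgb / 255.0 * (N - 1)).astype(int), 0, N - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]
```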


The printer 11 then carries out halftoning of the image data converted to the tone values of CMYK (Step S89). The halftoning process enables the tone values of the image data to be expressed by the density of the dots formed by the printer. Any known method such as, for example, the error diffusion method or the systematic dither method, may be applied for the halftoning process.
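
As an example of the error diffusion method mentioned above, the sketch below halftones a single ink channel with the Floyd-Steinberg weights; the printer may use different weights or the systematic dither method instead.

```python
import numpy as np

def error_diffusion_halftone(channel):
    """Step S89 sketch: one ink channel (2-D array of 0-255 tone values) -> dot pattern."""
    img = np.asarray(channel, dtype=float).copy()
    h, w = img.shape
    dots = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = 255.0 if old >= 128.0 else 0.0
            dots[y, x] = 1 if new else 0
            err = old - new
            # Floyd-Steinberg weights: 7/16 right, 3/16 down-left, 5/16 down, 1/16 down-right.
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return dots
```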


In addition to the above-described series of processing steps, the printer 11 may carry out a resolution conversion process, which converts the resolution of picture data into a printing resolution adopted in the printer, and an interlace data generation process, which sets a data arrangement and feeding quantities of sub-scan to enable interlace recording in the printer.


The picture data is converted into a specific format of print data that enables immediate output from the printer 11 through the conversion at steps S88 and S89. The printer 11 implements printing based on the converted data.


A6. Update of PIM



FIG. 9A and FIG. 9B are schematics that show an example of the interface window to update the PIM. This interface is subsequently displayed when the update button 61 in FIG. 6B is selected.



FIG. 9A is a schematic of an interface to give a final instruction for updating the PIM. In this figure, the interface shows that update of the PIM for “Portrait” mode to “Ver.1.2” is available. The interface also displays the size of the update data to be downloaded, the estimated time required for downloading, and information regarding the UDS server 12 from which the update data is downloaded.



FIG. 9B is a schematic of an interface that is displayed after the “Update” button in FIG. 9A is selected and the download is completed. This interface indicates completion of the download and requires selection of the storage mode, that is, the way in which the downloaded update data is to be stored. The storage mode is selected with the add button 72 or the overwrite button 71. In the case where the add button 72 is selected, the update data is added to the PIMs that are already stored in the DSC 10. In the case where the overwrite button 71 is selected, the update data overwrites the old-version PIM corresponding to the update data.



FIG. 10A and FIG. 10B are schematics that show an example of the interface window to apply the update data. These interfaces are modified versions of the interface shown in FIG. 6B. In these interfaces, the menu entry for the “Portrait” mode reflects the downloading and storing of the update data for that mode, carried out through the interfaces of FIGS. 9A and 9B. In the case where the add button 72 is selected in FIG. 9B, the old version of the “Portrait” mode, depicted as “Portrait(Ver1.0)” in FIG. 10A, remains available. In the case where the overwrite button 71 is selected in FIG. 9B, on the other hand, the PIM for the “Portrait(Ver1.0)” mode is deleted, and the old version cannot be selected, as shown in FIG. 10B.



FIG. 11 is a flowchart of the update data transmission process. Processing of the DSC 10 that requires the update data is shown in the left side, and processing of the UDS server 12 that transmits the update data is shown in the right side.


The DSC 10 inputs specification data (Step S11). As previously described, the specification data includes PIM or identification data. The specification data is sent to the UDS server 12 (Step S12).


The UDS server 12 receives the specification data (Step S23). The UDS server 12 compares the specification data to each piece of data stored in the update data storage module 49, thereby retrieving the update data to be transmitted (Step S24). Then the UDS server 12 transmits the update data to the DSC 10 (Step S25).


The DSC 10 receives the update data from the UDS server 12 (Step S17) and determines the storage mode (Step S18). In the case of the addition mode where the add button 72 in FIG. 9B is selected, the DSC 10 defines a memory area to store the update data (Step S18A) without losing the old-version PIM, and stores the update data to the defined memory area (Step S19). In the case of the overwrite mode where the overwrite button 71 in FIG. 9B is selected, on the other hand, the DSC 10 stores the update data in a manner that overwrites the old-version PIM (Step S19).


In this embodiment, the update data is stored in the same memory area where the old-version PIM is stored. But the memory area is not restricted to the same memory area. By way of example, one applicable way to overwrite is to store the update data to a different memory area from the area where the old-version PIM is stored and to invalidate the old-version PIM or the specification data corresponding to the old-version PIM.
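
A minimal sketch of the client-side part of this flow (Steps S11 through S19) is given below, with the control data storage modeled as an in-memory dictionary keyed by mode name and version; the network exchange with the UDS server 12 is left as a hypothetical placeholder.

```python
# In-memory model of the control data storage module 33; keys are (mode name, version).
# Contents are placeholders for illustration.
control_data_storage = {("Portrait", "1.0"): {"contrast": -5}}

def request_update(specification_data):
    """Placeholder for Steps S12 to S25: send specification data, receive update data."""
    raise NotImplementedError("network exchange with the UDS server 12")

def store_update(update_key, update_data, mode="add"):
    """Steps S18/S19: store the received update data in the selected storage mode."""
    if mode == "add":
        # Addition mode: keep the old-version PIM and store the new one alongside it.
        control_data_storage[update_key] = update_data
    else:
        # Overwrite mode: remove old versions of the same mode, then store the update.
        name, _ = update_key
        for key in [k for k in control_data_storage if k[0] == name]:
            del control_data_storage[key]
        control_data_storage[update_key] = update_data
```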



FIG. 12 is a schematic that shows a first example of the processing of the update data server. The specification data, here the old-version PIM 75, is compared to the data stored in the update data storage module 49. In this example, the data corresponding to the PIM of “Standard, Ver.1.1” is equivalent to the specification data 75, and the UDS server 12 finds that there is a new version of the data, depicted as “Ver.1.2” in FIG. 12. Accordingly, the latest version of the PIM, “Standard Ver.1.2”, is defined as the update data to be transmitted. The UDS server 12 extracts the update data from the update data storage module 49 and transmits it.



FIG. 13 is a schematic that shows a second example of the processing of the update data server. In this example, the receiver 41 receives the identification data 76 from the DSC 10. This identification data 76 is compared to the data stored in the update data storage module 49 by the update data transmit module 42. In this example, the name of the processing mode is compared first, and then the version number; through these two steps of comparison, the specification data is identified as being equivalent to “Standard Ver.1.1”. The update data transmit module 42 finds that there is a new version of the PIM, Ver.1.2, and that the Ver.1.1 identified by the identification data 76 is an old version. The update data transmit module accordingly extracts the latest version of the PIM of the “Standard” mode, Ver.1.2, and transmits it.
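
The matching illustrated in FIGS. 12 and 13 can be sketched as below, with the update data storage module modeled as a dictionary keyed by mode name and version; the stored values are placeholders, and version strings are compared numerically.

```python
# Placeholder model of the update data storage module 49.
UPDATE_DATA_STORAGE = {
    ("Standard", "1.1"): {"contrast": 0},
    ("Standard", "1.2"): {"contrast": +2},   # the newer version
}

def _as_tuple(version):
    return tuple(int(part) for part in version.split("."))

def find_update(mode_name, current_version):
    """Compare the mode name first, then the version; return a newer PIM if one exists."""
    candidates = [(version, data) for (name, version), data in UPDATE_DATA_STORAGE.items()
                  if name == mode_name]
    if not candidates:
        return None
    newest_version, newest_data = max(candidates, key=lambda item: _as_tuple(item[0]))
    if _as_tuple(newest_version) > _as_tuple(current_version):
        return newest_version, newest_data   # e.g. ("1.2", {...}) for "Standard", "1.1"
    return None
```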


A7. Advantages of the System


In this embodiment, the DSC 10 can acquire the update data corresponding to the PIM, and thereby easily use such update data. For example, in the case where makers of the DSC 10 provide new-version PIMs for users, the DSC 10 of the present embodiment easily acquires and stores the update data and thereafter uses it.


The user of the DSC 10 easily selects between two storage modes: 1) the addition mode; and 2) the overwrite mode. This makes the management of the PIMs easy, and reduces the hardware resources required to store the PIMs.


In this embodiment, the specification data to be sent to acquire the desired update data can take various forms: 1) the PIM stored in the DSC 10; and 2) identification data that identifies the PIM. This enables various types of image related data generators to update the PIM according to the update data. Even when only one of the PIM and the identification data can be sent as the specification data, the system can still update the PIM.


The UDS server 12 can accept various types of specification data, thereby responding to requests for the update data from various types of clients.


The UDS server 12 of the present invention easily updates the PIMs stored in image related data generators, such as the DSC 10, that are connected to the UDS server 12. It thereby reduces the amount of labor required of the operator of the UDS server 12 to provide the update data, and it integrally manages the update data and the mapping between the update data and the PIMs.


B. Second Embodiment


FIG. 14 is a schematic that shows the general construction of the image output system. In this embodiment, a personal computer 80, acting as an image related data editor, communicates with the UDS server 12 via the network INT. The computer 80 also communicates with the DSC 10, the image related data generator, via the memory card MC. The computer 80 is connected to the printer 11, which serves as the image processing apparatus and the image output apparatus.


The DSC 10 sends image files to the computer 80 via the memory card MC. For this purpose, the DSC 10 and the computer 80 are equipped with memory card slots SLT1 and SLT2, respectively. In this embodiment, an image file including image data and specification data therein is sent to the computer 80. The specification data is similar to that in the first embodiment, and includes at least one of the PIM and the identification data that identifies the PIM. In other words, the specification data includes predetermined data that specifies the update data corresponding to the image data.


The computer 80 functions as the image related data editor, and edits the original image file that is generated by the DSC 10. During the editing process, the computer 80 acquires the update data, by sending the specification data recorded in the image file to the UDS server 12. The computer 80 outputs the modified image file in which the update data and the image data extracted from the original image file are related and recorded. In this embodiment, the update data is the PIM, which is sent to the printer 11 and is used for image processing for the print out. The functions of the UDS server 12 and the printer 11 are the same as those described in connection with the first embodiment.



FIG. 15 is a schematic that shows the function blocks of the image related data editor. The image file that the image file inputter 81 inputs includes image data and attribute data. The specification data is included in the attribute data, and is extracted by the specification data generator 83 in the update data acquiring module 82. The extracted specification data is sent to the UDS server 12, which is connected with the network INT, via the communication module 85. The update data is sent from the UDS server 12 to the update data acquiring module 82 via the communication module 85. The image file generator 84 generates the modified image file in which the update data and the image data extracted by the image file inputter 81 are related and stored.



FIG. 16 is a flowchart of the image file edit process. The computer 80 inputs the image file (Step S51) and extracts the attribute data and the image data from the image file (Step S52). The computer 80 further extracts the specification data from the attribute data (Step S53) and sends the specification data to the UDS server 12 (Step S54). Then, the computer 80 receives the update data from the UDS server 12 (Step S55). Subsequently, the computer 80 generates the modified image file with the update data and the extracted image data (Step S56).
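
A compact sketch of this edit process follows. The three callables passed in are hypothetical stand-ins for the Exif parser, the Exif writer, and the exchange with the UDS server 12; none of them is defined by the patent.

```python
def edit_image_file(read_image_file, write_image_file, request_update, path, out_path):
    """Steps S51 to S56 of FIG. 16, with the file and network handling injected."""
    image_data, attribute_data = read_image_file(path)       # S51-S52: extract data
    specification_data = attribute_data["specification"]     # S53: extract spec data
    update_data = request_update(specification_data)         # S54-S55: acquire update
    attribute_data["PIM"] = update_data                      # relate update to the image
    write_image_file(out_path, image_data, attribute_data)   # S56: modified image file
```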


In this embodiment, the image related data editor that can acquire the update data from the UDS server 12 can easily edit the image file so that the update data is recorded in the attribute data of the image file. The specification data may include at least one of the PIM and the identification data, which makes the image related data editor useful because the editor can handle various types of image files as long as they include at least one of the PIM and the identification data. The image related data editor of this embodiment is especially useful in the case where many image files with various types of PIM need to be edited to include the update data.


C. Third Embodiment

The third embodiment of the present invention, which utilizes an image file transmission server, is discussed below. The image file transmission server in this embodiment edits the original image file sent from a client, and transmits the modified image file to the instructed destination.



FIG. 17 is a schematic that shows the general construction of the image output system. Data transmission via the network INT takes place between the DSC 10 and the image file transmission (IFT) server 90, and also between the IFT server 90 and the printer 11. The DSC 10 is equipped with a network connection terminal PLG to perform this transmission.


The data sent from the DSC 10 to the IFT server 90 includes two kinds of information: 1) the image file, including the specification data and the image data therein; and 2) the destination data that specifies the destination where the modified image file is to be transmitted.


The IFT server 90 edits the image file as an image related data editor. The IFT server 90 generates the PIM as the update data, based on the specification data. This function is similar to that of the UDS server 12 in the previous embodiments, and, as such, the details of such function have already been described herein. The IFT server 90 transmits the modified image file to the instructed destination according to the destination data. In this example, the destination is the printer 11 that is possessed by the user of the DSC 10.



FIG. 18 is a schematic that shows the function blocks of the IFT server 90. The IFT server 90 also functions as an image related data editor. Some of the function blocks illustrated in FIG. 18 are similar to those of the image related data editor shown in FIG. 15. One of the differences is that the IFT server 90 does not acquire the update data from outer devices; the update data acquiring module 92, accordingly, does not have the function of acquiring the update data through the communication module 99.


The update data acquiring module 92 refers to the update data storage module 97, thereby generating the update data corresponding to the specification data, which is extracted by the specification data generator 93 from the image file.


The function of the update data acquiring module 92 generating the update data is similar to that of the update data transmit module 42 of the UDS server 12 (see FIG. 4). The update data acquiring module 92 compares the specification data to data stored in the update data storage module 97, and determines the update data to be used for the update. The update data storage module 97 is similar to the update data storage module 49 in the UDS server 12.


The image file generator 981 generates an image file in which the update data generated by the update data acquiring module 92 and the image data extracted from the original image file are related and stored. The image file is transmitted by the image file transmitter 982 to the destination according to the destination data.



FIG. 19 is a flowchart of the image file transmission process. The IFT server 90 inputs the image file (Step S911) and extracts the image data and the attribute data (Step S912). The IFT server 90 further extracts the specification data from the attribute data (Step S913) and defines the update data to be used (Step S915).


In this embodiment, the IFT server 90 retrieves the update data from the update data storage module 97, without acquiring the update data from outer devices or other servers. The IFT server 90 then generates the modified image file with the update data and the image data (Step S916). Subsequently, the IFT server 90 inputs the destination data (Step S920), and transmits the modified image file to the destination according to the destination data (Step S930).
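
A rough sketch of this server-side flow is shown below. In contrast to the editor of the second embodiment, the update data is looked up in the server's own storage; the storage dictionary and the `send_to` callable are hypothetical stand-ins.

```python
# Placeholder model of the update data storage module 97 of the IFT server 90.
UPDATE_DATA_STORAGE = {"Portrait 1.0": {"contrast": -5}}

def transmit_image_file(image_data, attribute_data, destination, send_to):
    """Steps S911 to S930 of FIG. 19, with file parsing and transmission abstracted away."""
    specification = attribute_data["specification"]            # Step S913
    update_data = UPDATE_DATA_STORAGE.get(specification)       # Step S915: local lookup
    modified_file = {"image": image_data, "PIM": update_data}  # Step S916
    send_to(destination, modified_file)                        # Steps S920-S930
```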


The IFT server 90 of this embodiment can edit and transmit the image file, which allows the user to use the PIM or the update data to edit the image file without complicated instructions.


In this embodiment, the PIM may be set according to the type of the printer 11 being used. In this case, the IFT server 90 stores the update data mapped to the types of the printers 11. The IFT server 90 specifies the type of the printer 11 based on the specification data sent from the DSC 10. This modification allows the user to use the appropriate PIM according to the type of the printer 11 without increasing the complexity of the process. Makers of the printer 11 can also easily provide the PIMs for various types of printers.


The above embodiments and modifications thereof are to be considered in all aspects as illustrative and not restrictive. There may be many modifications, changes, and alterations without departing from the scope or spirit of the main characteristics of the present invention. For example, the diverse control processes discussed above may be attained by hardware structures, instead of software configurations.


The scope and spirit of the present invention are indicated by the appended claims, and should not be restricted unnecessarily by the foregoing description.

Claims
  • 1. A digital camera, comprising: an image data generating module configured to generate image data of an image captured by the digital camera; a display device; a control data storage for storing in advance plural sets of image control data for use in controlling image processing of the image data, each of the plural sets of image control data including control parameters for a plurality of image processing items; an image control data generating module configured to generate image control data that is suitable for controlling image processing of the image data; and a memory management module configured to acquire additional image control data from an external memory device separate from the digital camera, and to store the additional image control data into the control data storage, wherein the image control data generating module displays a user interface screen including a plurality of image control modes on the display device to enable a user to select one of the plurality of image control modes, and generates the image control data for the image data using specific image control data which is associated with the selected image control mode and which is selected from plural sets of the image control data stored in the control data storage, and when the additional image control data has been stored in the control data storage, the image control data generating module displays on the display device a plurality of image control modes which includes an additional image control mode related to the additional image control data.
  • 2. The digital camera according to claim 1, further comprising a memory card slot that receives a memory card storing the additional image control data.
  • 3. The digital camera according to claim 2, wherein the image control data includes a control parameter for contrast and another control parameter for color balance.
Priority Claims (1)
Number Date Country Kind
2002-131207 May 2002 JP national
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 12/231,516, filed on Sep. 2, 2008, now U.S. Pat. No. 7,924,472, which is a continuation of U.S. application Ser. No. 10/429,017, filed on May 2, 2003 (now U.S. Pat. No. 7,428,082 B2). The disclosures of these prior applications from which priority is claimed are incorporated herein by reference for all purposes.

US Referenced Citations (124)
Number Name Date Kind
4216480 Buehner et al. Aug 1980 A
4476497 Oshikoshi et al. Oct 1984 A
4561025 Tsuzuki Dec 1985 A
4792847 Shimazaki et al. Dec 1988 A
5168371 Takayanagi Dec 1992 A
5477264 Sarbadhikari et al. Dec 1995 A
5489998 Yamada et al. Feb 1996 A
5495349 Ikeda Feb 1996 A
5497419 Hill Mar 1996 A
5512977 Imai Apr 1996 A
5528293 Watanabe Jun 1996 A
5646752 Kohler et al. Jul 1997 A
5646994 Hill Jul 1997 A
5940121 Mcintyre et al. Aug 1999 A
5982416 Ishii et al. Nov 1999 A
6011547 Shiota et al. Jan 2000 A
6108008 Ohta Aug 2000 A
6115104 Nakatsuka Sep 2000 A
6192191 Suga et al. Feb 2001 B1
6229625 Nakatsuka May 2001 B1
6229965 Ito et al. May 2001 B1
6256350 Bishay et al. Jul 2001 B1
6263251 Rutschmann Jul 2001 B1
6273535 Inoue et al. Aug 2001 B1
6282362 Murphy et al. Aug 2001 B1
6337922 Kumada Jan 2002 B2
6384928 Nagasawa et al. May 2002 B2
6456400 Ikegami et al. Sep 2002 B1
6459500 Takaoka Oct 2002 B1
6539169 Tsubaki et al. Mar 2003 B1
6560374 Enomoto May 2003 B1
6563603 Yamazaki May 2003 B1
6567119 Parulski et al. May 2003 B1
6583811 Kinjo Jun 2003 B2
6597468 Inuiya Jul 2003 B1
6603506 Ogawa et al. Aug 2003 B2
6609162 Shimizu et al. Aug 2003 B1
6628325 Steinberg et al. Sep 2003 B1
6633410 Narushima Oct 2003 B1
6650365 Sato Nov 2003 B1
6650437 Nakajima Nov 2003 B1
6657658 Takemura Dec 2003 B2
6678402 Jones et al. Jan 2004 B2
6724926 Jones et al. Apr 2004 B2
6728428 Kinjo Apr 2004 B1
6736476 Inoue et al. May 2004 B2
6750902 Steinberg et al. Jun 2004 B1
6771889 Suga et al. Aug 2004 B1
6785814 Usami et al. Aug 2004 B1
6822678 Hatori Nov 2004 B2
6822758 Morino Nov 2004 B1
6829006 Hayashi Dec 2004 B1
6834130 Niikawa et al. Dec 2004 B1
6836565 Nishikawa Dec 2004 B1
6850271 Ichikawa Feb 2005 B1
6869156 Inoue et al. Mar 2005 B2
6888943 Lam et al. May 2005 B1
6889324 Kanai et al. May 2005 B1
6903776 Tsujino et al. Jun 2005 B1
6914625 Anderson et al. Jul 2005 B1
6919924 Terashita Jul 2005 B1
6937997 Parulski Aug 2005 B1
6968058 Kondoh et al. Nov 2005 B1
6992711 Kubo Jan 2006 B2
7085377 Norr Aug 2006 B1
7139087 Hayashi Nov 2006 B2
7151832 Fetkovich et al. Dec 2006 B1
7170632 Kinjo Jan 2007 B1
7177429 Moskowitz et al. Feb 2007 B2
7194130 Nishikawa Mar 2007 B2
7206081 Sada et al. Apr 2007 B2
7221482 Yamazaki et al. May 2007 B2
7271932 Izumi Sep 2007 B2
7327490 Kuwata et al. Feb 2008 B2
7350086 Nakajima Mar 2008 B2
7360852 Inoue et al. Apr 2008 B2
7362891 Jones et al. Apr 2008 B2
7375848 Nakami et al. May 2008 B2
7403696 Suga et al. Jul 2008 B2
7428082 Nakajima Sep 2008 B2
7453598 Takahashi Nov 2008 B2
7457483 Tokiwa Nov 2008 B2
7471414 Xueping Dec 2008 B2
7474424 Hokiyama Jan 2009 B2
7483168 Kuwata et al. Jan 2009 B2
7499201 Kodama et al. Mar 2009 B2
7533949 Inoue et al. May 2009 B2
7701626 Inuiya Apr 2010 B2
7751644 Kuwata Jul 2010 B2
7774605 Kanai et al. Aug 2010 B2
7787041 Kaku Aug 2010 B2
7825962 Nakajima et al. Nov 2010 B2
7852494 Asai Dec 2010 B2
7852520 Iida Dec 2010 B2
7898680 Misawa et al. Mar 2011 B2
7899269 Kuwata Mar 2011 B2
7920291 Fukada et al. Apr 2011 B2
7924472 Nakajima Apr 2011 B2
20010024292 Otake Sep 2001 A1
20010035909 Kubo Nov 2001 A1
20020008782 Kurokawa Jan 2002 A1
20020105582 Ikeda Aug 2002 A1
20020122194 Kuwata et al. Sep 2002 A1
20020135687 Nakajima et al. Sep 2002 A1
20020140952 Fukasawa Oct 2002 A1
20020196346 Nishio et al. Dec 2002 A1
20030025811 Keelan et al. Feb 2003 A1
20030215220 Ohmura et al. Nov 2003 A1
20040032400 Freeman et al. Feb 2004 A1
20040201699 Parulski et al. Oct 2004 A1
20040260935 Usami et al. Dec 2004 A1
20050007617 Tanaka et al. Jan 2005 A1
20050030568 Narushima et al. Feb 2005 A1
20050134900 Kuwata Jun 2005 A1
20050152613 Okutsu et al. Jul 2005 A1
20050166057 Kanai et al. Jul 2005 A1
20050223228 Ogawa et al. Oct 2005 A1
20060072156 Shima Apr 2006 A1
20060232797 Hayaishi Oct 2006 A1
20080002035 Yoshida Jan 2008 A1
20080174677 Nakajima et al. Jul 2008 A1
20080198396 Nakami et al. Aug 2008 A1
20090009813 Nakajima Jan 2009 A1
20100289808 Hitaka Nov 2010 A1
Foreign Referenced Citations (67)
Number Date Country
0653879 May 1995 EP
0706285 Apr 1996 EP
0757473 Feb 1997 EP
0860980 Aug 1998 EP
1056272 Nov 2000 EP
1241870 Sep 2002 EP
3-38986 Feb 1991 JP
3038986 Feb 1991 JP
05-292533 Nov 1993 JP
06-008537 Jan 1994 JP
08-315106 Nov 1996 JP
09-098373 Apr 1997 JP
09-219817 Aug 1997 JP
10-174036 Jun 1998 JP
10-191246 Jul 1998 JP
10-221791 Aug 1998 JP
10-226139 Aug 1998 JP
10-276295 Oct 1998 JP
10-308870 Nov 1998 JP
10-334212 Dec 1998 JP
11-041511 Feb 1999 JP
11-041622 Feb 1999 JP
11-066274 Mar 1999 JP
11-088672 Mar 1999 JP
11-098461 Apr 1999 JP
11-122581 Apr 1999 JP
11-127415 May 1999 JP
11-150656 Jun 1999 JP
11-164160 Jun 1999 JP
11-191871 Jul 1999 JP
11-234605 Aug 1999 JP
11-239269 Aug 1999 JP
11-298848 Oct 1999 JP
11-308564 Nov 1999 JP
11-317863 Nov 1999 JP
11-317921 Nov 1999 JP
11-327605 Nov 1999 JP
11-331596 Nov 1999 JP
2000-013718 Jan 2000 JP
2000-020691 Jan 2000 JP
2000-022994 Jan 2000 JP
2000-050043 Feb 2000 JP
2000-069277 Mar 2000 JP
2000-069419 Mar 2000 JP
2000-101884 Apr 2000 JP
2000-115688 Apr 2000 JP
2000-125186 Apr 2000 JP
2000-132470 May 2000 JP
2000-137806 May 2000 JP
2000-165720 Jun 2000 JP
2000-196946 Jul 2000 JP
2000-215379 Aug 2000 JP
2000-278598 Oct 2000 JP
2000-312296 Nov 2000 JP
2000-324430 Nov 2000 JP
2000-354255 Dec 2000 JP
2001-084354 Mar 2001 JP
2001-084361 Mar 2001 JP
2001-147481 May 2001 JP
2001-243031 Sep 2001 JP
2002-010134 Jan 2002 JP
2002-014834 Jan 2002 JP
2002-063098 Feb 2002 JP
2003-060935 Feb 2003 JP
3725454 Dec 2005 JP
9601467 Jan 1996 WO
0106786 Jan 2001 WO
Related Publications (1)
Number Date Country
20110187897 A1 Aug 2011 US
Continuations (2)
Number Date Country
Parent 12231516 Sep 2008 US
Child 13083914 US
Parent 10429017 May 2003 US
Child 12231516 US