Imaging apparatus

Information

  • Patent Grant
  • 8248506
  • Patent Number
    8,248,506
  • Date Filed
    Thursday, September 30, 2010
  • Date Issued
    Tuesday, August 21, 2012
Abstract
An imaging apparatus for performing efficient signal processing depending on the operational mode. In the finder mode, a CCD interface 21a decimates the horizontal components of image data supplied from an image generating unit 10 to one-third and further processes the decimated image data with data conversion and resolution conversion to produce Y, Cb and Cr image data, which are routed to and written in an image memory 32 via a memory controller 22. In the recording mode, the CCD interface 21a causes the image data from the image generating unit 10 to be written in the image memory 32 via the memory controller 22 after decimation, gamma correction and the like. The camera DSP 21c then reads out the image data from the image memory 32 via the memory controller 22, effects data conversion, and writes the resulting data back in the image memory 32 via the memory controller 22.
Description
BACKGROUND OF THE INVENTION

1. Field of the Invention


This invention relates to an imaging apparatus for performing signal processing depending on the operational modes.


2. Description of the Related Art


A digital still camera retrieves image data obtained by a CCD image sensor into a DRAM or a flash memory and subsequently transfers the image data to a so-called personal computer or the like. The majority of digital still cameras of this type have hitherto been designed to cope with the video graphics array (VGA) system.


Referring for example to FIG. 1, this digital still camera 200 includes a CCD image sensor 201 for generating image signals, an input processing/image processing circuit 202, a memory controller 203 for reading and writing image data, an output processing circuit 204 for conversion to an output image of a pre-set system, a finder 205 for displaying the state of an object at the time of image shooting, a recording unit 207 for recording compressed image data over a CPU bus 206 and a compression/expansion circuit 208 for compressing/expanding image data. The digital still camera 200 also includes a memory 209, formed by, for example, a DRAM, and a CPU 210 for controlling the overall device.


Before starting the image shooting of an object, the user has to confirm an object image displayed on the finder 205. This state is termed a finder mode. At this time, the CCD image sensor 201 sends image signals obtained on photoelectric conversion to the input processing/image processing circuit 202. The input processing/image processing circuit 202 performs correlated double sampling processing on the image signals to digitize the image signals. The input processing/image processing circuit 202 then performs pre-set signal processing, such as gamma correction, knee processing or camera processing, and routes the processed image signals to the memory controller 203. The memory controller 203, under control by the CPU 210, sends the image data from the input processing/image processing circuit 202 to the output processing circuit 204. The output processing circuit 204 encodes the image data in accordance with, for example, the National Television System Committee (NTSC) system, and converts the encoded image data to analog form to route the resulting analog data to the finder 205. This allows the object being shot to be displayed on the finder 205.


On the other hand, if the user pushes a shutter button, not shown, to shift to the recording mode, the memory controller 203 causes the image data furnished from the input processing/image processing circuit 202 to be written in the memory 209. The CPU 210 causes the image data to be read out from the memory 209, has it compressed in the compression/expansion circuit 208 in accordance with, for example, the Joint Photographic Experts Group (JPEG) system, and records the compressed image data in the recording unit 207.


If the user performs pre-set processing to shift to the reproducing mode, the CPU 210 causes image data to be read out from the recording unit 207, has it expanded in accordance with the JPEG system in the compression/expansion circuit 208, and routes the resulting data via the memory controller 203 and the output processing circuit 204 to the finder 205. This causes the as-shot image to be displayed on the finder 205.


In keeping with the recent outstanding technical progress in CCD image sensors, the resolution of image data is now approaching or surpassing 1,000,000 pixels. On the other hand, there is a concern that a digital still camera of the above-described structure cannot sufficiently cope with image data exceeding 1,000,000 pixels.


If, for example, the CCD image sensor 201 outputs image signals of high resolution in the finder mode, the input processing/image processing circuit 202, the memory controller 203 or the output processing circuit 204 cannot process the image data in real time, such that the image of the object is displayed on the finder 205 in a frame-skipping fashion. This makes it inconvenient to shoot the object if the object makes even the slightest movement.


In the recording mode, in which only multi-pixel image data is recorded in the recording unit 207, it is unnecessary to perform the processing in, e.g., the input processing/image processing circuit 202.


That is, in the digital still camera 200, since the pre-set signal processing is performed in, e.g., the input processing/image processing circuit 202 without regard to the operational mode, the signal processing has not necessarily been efficient insofar as the entire apparatus is concerned.


SUMMARY OF THE INVENTION

It is therefore an object of the present invention to provide an imaging apparatus that is able to perform efficient signal processing depending on the operational mode.


In one aspect, the present invention provides a signal processing apparatus including storage means for storing image data, a plurality of signal processing means for processing the image data in a pre-set fashion and for outputting a request signal for demanding furnishment of the image data for signal processing or demanding the outputting of the processed image data, and control means for selecting, on furnishment of the request signal from the plural signal processing means, one or more of the signal processing means which has outputted the request signal, and for furnishing the image data read out from the storage means to the selected signal processing means or writing the image data outputted by the selected signal processing means in the storage means.


In another aspect, the present invention provides a controlling method for a signal processing apparatus adapted for transmitting/receiving image data between a plurality of signal processing means and storage means for storing image data, the signal processing means being adapted for processing the image data in a pre-set fashion and for outputting to the control means a request signal for demanding furnishment of the image data for signal processing or demanding the outputting of the processed image data. The controlling method includes selecting, on furnishment of the request signal from the plural signal processing means, one or more of the signal processing means which has outputted the request signal, and furnishing the image data read out from the storage means to the selected signal processing means or writing the image data outputted by the selected signal processing means in the storage means.


In still another aspect, the present invention provides an imaging apparatus including imaging means, storage means for transiently storing image data from the imaging means, control means for controlling the writing/readout of the image data for the storage means, a plurality of signal processing means for processing the image data in a pre-set fashion and for outputting to the control means a request signal for demanding furnishment of the image data for signal processing or demanding the outputting of the processed image data, and outputting means for outputting image data processed by the signal processing means. The control means manages control on furnishment of the request signal to select one or more of the signal processing means which has outputted the request signal to furnish the image data read out from the storage means to the selected signal processing means or to write the image data outputted by the selected signal processing means in the storage means.


In yet another aspect, the present invention provides a recording/reproducing apparatus including imaging means, input processing means for performing pre-set input processing on image data from the imaging means, display processing means for displaying image data on display means, first storage means for transiently storing the image data from the imaging means, control means for controlling the writing/readout of the image data for the first storage means, resolution converting means for converting the resolution of image data, compression/expansion means for compressing/expanding the image data and recording/reproducing controlling means for causing the compressed image data to be recorded on second storage means and for causing the image data recorded on the second storage means to be reproduced. The control means selects one or more signal processing means from the input processing means, display processing means, resolution converting means and the compression/expansion means. The control means causes the image data read out from the first storage means to be sent to the selected signal processing means or causes the image data outputted by the selected signal processing means to be written in the first storage means.


In the signal processing apparatus and the control method therefor, according to the present invention, if a request signal is sent from each signal processing means, the signal processing means which has outputted the request signal having the utmost priority in the priority order is selected. Control is then performed for supplying the image data read out from the storage means over the image data bus to the selected signal processing means, or for writing the processed image data of the selected signal processing means over the image data bus to the storage means, so that efficient signal processing will be executed in the respective signal processing means.


The present invention also provides an imaging apparatus comprising:


imaging means for generating image data based on the imaging light from an object;


memory means for storing the image data;


a plurality of signal processing means for performing pre-set signal processing on the image data;


display means for displaying an image corresponding to said image data;


a recording medium for recording the image data thereon; and


control means for performing control in a first operational mode for processing the image data from said imaging means in a pre-set fashion by the signal processing means of said plural signal processing means required to perform real-time processing to write the image data in said memory means and for reading out the processed image data from said memory means to supply the read-out image data to said display means, said control means performing control in a second operational mode for writing the image data from said imaging means in said memory means and subsequently reading out the written image data to route the read-out image data to said plural signal processing means to record the image data processed by said plural signal processing means on said recording medium.


In the first mode of the imaging apparatus, the image data from the imaging means are decimated and processed in a pre-set fashion by signal processing means required to perform real-time processing. In the second mode, multi-pixel image data are first written in the memory means. The multi-pixel image data are then read out therefrom and routed to and processed by the plural signal processing means.


More specifically, the signal processing means of the plural signal processing means which is required to perform real-time processing is caused to perform pre-set signal processing on the image data from the imaging means, in the first operational mode of the imaging apparatus, the resulting image data being then written in the memory means and the processed image data being then read out from the memory means and routed to the display means. In the second operational mode, the image data from the imaging means is written in the memory means and read out therefrom so as to be routed to the respective signal processing means for processing, with the processed image data being then recorded on the recording medium. This realizes signal processing most efficient depending on the operational mode.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram for illustrating the structure of a conventional digital still camera.



FIG. 2 is a block diagram showing a schematic structure of a digital still camera embodying the present invention.



FIG. 3 is a block diagram showing the schematic structure of the digital still camera shown in FIG. 2.



FIG. 4 is a block diagram for illustrating flow of image data in a signal processing unit of the digital still camera shown in FIG. 2.



FIG. 5 is a block diagram for illustrating the structure of a simplified resolution conversion circuit in an input processing circuit of the signal processing unit.



FIG. 6 is a block diagram showing the structure of the resolution conversion circuit of the signal processing unit.



FIG. 7 is a block diagram showing a specified structure of a horizontal direction buffer, a horizontal direction conversion processing circuit, a vertical direction buffer and a vertical direction conversion processing circuit of the resolution conversion circuit.



FIG. 8 is a block diagram showing an alternative structure of the resolution conversion circuit.



FIG. 9 is a block diagram showing the structure of the vertical direction buffer of the resolution conversion circuit.



FIG. 10 illustrates a technique for reading out image data from the image memory by the memory controller.



FIG. 11 illustrates the coordinate position of pixels making up an image.



FIG. 12 illustrates another technique for reading out image data from the image memory by the memory controller.



FIG. 13 is a block diagram showing the structure of the horizontal direction buffer of the resolution conversion circuit constituted by a line buffer.



FIG. 14 illustrates the technique when the memory controller reads out image data from the image memory.



FIG. 15 is a block diagram showing the structure of the simplified resolution conversion circuit in the NTSC/PAL encoder of the signal processing unit.



FIGS. 16A to 16F show a timing chart for illustrating the contents of the signal processing in the respective circuits in the finder mode.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

Referring to the drawings, preferred embodiments of the present invention will be explained in detail.


The present invention is applied to digital still camera 1, configured as shown for example in FIG. 2.


The digital still camera 1 includes an image generating unit 10 for generating image signals, an input signal processor 20 for processing image data in a pre-set fashion, an image memory 32, comprised of an SDRAM, and a controller 40 for controlling the input signal processor 20.


The image generating unit 10 includes a solid-state imaging device for generating image signals, such as a CCD image sensor 11, a sample holding-analog/digital circuit (S/H-A/D circuit 12) for sample-holding and digitizing the image signals to output image data, and a timing generator 13 for generating timing signals. This timing generator 13 generates horizontal synchronization signals and vertical synchronization signals for controlling the respective circuits of the image generating unit 10 based on synchronization signals supplied from the signal processor 20.


The CCD image sensor 11 generates image data corresponding to the XGA (extended graphics array: 1024×768) format, made up of approximately 800,000 pixels. The CCD image sensor 11 is driven based on the synchronization signals from the timing generator 13 to output image signals at a rate of 30 frames per second. Meanwhile, the CCD image sensor 11 has the function of thinning out image signals and is able to thin out the vertical components of the image signals to ½, ⅓, ¼, . . . to output the resulting thinned-out signals.
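As a concrete illustration of the vertical thinning, a sketch assuming the frame is a simple row-major list of lines (not the sensor's actual readout logic):

```python
def thin_vertically(frame, factor=3):
    """Keep every 'factor'-th line, e.g. 768 lines -> 256 lines for factor 3."""
    return frame[::factor]

# Example: a dummy 768-line frame becomes a 256-line frame.
frame = [[0] * 1024 for _ in range(768)]
assert len(thin_vertically(frame)) == 256
```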


The S/H-A/D circuit 12 is also adapted to perform sample-holding and A/D conversion at a pre-set sampling interval based on the synchronization signals from the timing generator 13 to send the resulting image data to the signal processor 20.


The signal processor 20 is formed as a sole LSI (large-scale integrated circuit). The signal processor 20 includes an input signal processor 21 for performing input processing and camera processing on the image data from the image generating unit 10, a memory controller 22 for controlling the readout/write of image data for the image memory 32, an NTSC/PAL (phase alternation by line) encoder 23, a D/A converter 24 for analogizing image data and outputting the resulting analog signals to outside, and a sync generator 26 for generating synchronization signals and supplying the resulting synchronization signals to the timing generator 13.


The signal processor 20 also includes a memory interface 27, as an interface for the image memory 32, a resolution conversion circuit 28 for converting the resolution of the image data, a JPEG (Joint Photographic Experts Group) encoder/decoder 29, for compressing/expanding image data, a JPEG interface 30, as an interface of the JPEG encoder/decoder 29, and a host interface 31, as an interface for having data transmission/reception with the CPU of the controller 40.


The input signal processor 21 processes the image data from the S/H-A/D circuit 12 with digital clamp, shading correction, aperture correction, gamma correction or color processing and routes the resulting processed data to the memory controller 22. The input signal processor 21 has the function of processing input data to convert the input data into Y, Cb and Cr. If the resolution of the image data is higher than that of the VGA (Video Graphics Array) format, the input signal processor 21 is able to perform processing for lowering the resolution. The input signal processor 21 also performs auto-focusing and auto-iris detection and routes the detection data to the controller 40 to effect automatic adjustment of the focusing mechanism and the iris mechanism. The input signal processor 21 also detects the signal levels of the three primary colors making up the image data to adjust the automatic white balance.


The memory controller 22 performs control to cause image data supplied from the input signal processor 21 or other circuitry to be written in the image memory 32 via the memory interface 27 and to read out image data from the image memory 32 via the memory interface 27. At this time, the memory controller 22 also detects whether or not there is any defective pixel in the CCD image sensor 11 based on the image data stored in the image memory 32.


The memory controller 22 routes the image data read out from the image memory 32 to, for example, the NTSC/PAL encoder 23. When fed with the image data from the memory controller 22, the NTSC/PAL encoder 23 encodes the image data in accordance with the NTSC system or the PAL system to send the encoded data to the D/A converter 24. The D/A converter 24 analogizes the image data to output the resulting analog signals via the output terminal 25.


The memory controller 22 routes the image data, read out from the image memory 32, to the resolution conversion circuit 28 to cause the image data to be converted in resolution, while causing the image data outputted by the resolution conversion circuit 28 to be written in the image memory 32.


The memory controller 22 routes the image data via the JPEG interface 30 to the JPEG encoder/decoder 29 to effect compression of the still image, while causing the image data expanded by the JPEG encoder/decoder 29 to be written in the image memory 32.


The image memory 32 not only stores the image data as described above but also stores OSD data (on-screen-display data) as so-called character generator data. The OSD data is made up of bit map data. The memory controller 22 controls the readout/write of the OSD data. The image data and the OSD data are synthesized by the NTSC/PAL encoder 23.


The controller 40 includes a CPU (central processing unit) 41 for controlling the respective circuits of the signal processor 20, a DRAM (dynamic random access memory) 42, a ROM (read-only memory) 43 having the control program for the CPU 41 stored therein, a flash memory interface 44, as an interface for exchanging image data with a storage device 51, such as a flash memory, and an IrDA interface 45, as an interface for the communication circuit 52 constituted by, for example, an IrLED.


For example, the CPU 41 causes image data compressed by the JPEG encoder/decoder 29 to be written via the flash memory interface 44 in the storage device 51, made up of a flash memory, while causing image data to be read out from the storage device 51 and routed to the JPEG encoder/decoder 29. The CPU 41 also causes the image data read out from the storage device 51 to be outputted to outside as infrared light via the IrDA interface 45 and the communication circuit 52.


The schematic structure of the digital still camera 1 is shown in FIG. 3.


The input signal processor 21 routes the image data from the CCD image sensor 11 via an image data bus 33 to the image memory 32. The NTSC/PAL encoder 23 encodes the image data from the image memory 32 in a pre-set fashion to send the resulting encoded data to the finder 36. This causes an image of the object to be displayed on the finder 36, which is adapted to display images corresponding to image data up to the VGA format.


The memory controller 22 performs data transfer between the image memory 32 and the signal processing circuits connecting to the image data bus 33. The resolution conversion circuit 28 performs resolution conversion of the image data from the image memory 32 to route the results to the image memory 32. The JPEG encoder/decoder 29 compresses the image data from the image memory 32 in accordance with the JPEG system to route the compressed image data via CPU bus 34 to the CPU 41, which then causes the compressed image data to be written in the storage device 51. The CPU 41 is also able to output the compressed image data via the CPU bus 34 and the communication circuit 52 to outside.


Thus, in FIG. 3, the respective circuits of the signal processor 20 are interconnected over the image data bus 33. The image data bus 33 is a virtual bus and indicates that there is a limit to the transfer band for image data exchanged between the respective circuits.


In the signal processor 20, the respective circuits, such as NTSC/PAL encoder 23 or the resolution conversion circuit 28, send to the memory controller 22 a request signal indicating that image data are demanded. These circuits also transmit a request signal to the memory controller 22 when outputting the image data after the end of the processing of the image data.


On reception of the request signals from the respective circuits, the memory controller 22 selects the circuit having the highest priority in the priority sequence and transmits an acknowledge signal to the selected circuit. The acknowledge signal indicates that image data can be routed to the circuit receiving the signal or that image data outputted by the circuit which has received the acknowledge signal is ready to be received. The memory controller 22 reads out image data from the image memory 32 to route the read-out image data via the image data bus 33 to the circuit corresponding to the destination of the acknowledge signal. Alternatively, the memory controller 22 receives the image data outputted by the circuit to which it has sent the acknowledge signal and writes the image data in the image memory 32.


On reception of request signals from plural circuits, the memory controller 22 is able to select preferentially the circuit which has to perform its processing in real time. For example, if an image of an object is to be displayed on the finder 36, the memory controller 22 preferentially selects the input signal processor 21 and the NTSC/PAL encoder 23. It is also possible for the memory controller 22 to monitor the bus occupation ratio of the image data on the image data bus 33 and to determine the priority sequence of the respective circuits depending on the occupation ratio.


If image data can be routed to the respective circuits within the transfer band limitation of the image data bus 33, it is possible for the memory controller 22 to perform control to send the acknowledge signal to the respective circuits time-divisionally to permit the respective circuits to perform pre-set processing. This enables the memory controller 22 to have access in real-time to data in the respective circuits to cause the image data from the respective circuits to be written in the image memory 32 or to cause the image data in the image memory 32 to be read out and sent to the respective circuits.
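As a rough illustration of this priority-plus-bandwidth arbitration, here is a minimal sketch; the priority numbers and bandwidth costs are invented for the example and do not come from the patent:

```python
def grant_acknowledges(requests, bus_budget=100):
    """requests: list of (priority, circuit_name, bandwidth_cost).
    Grant acknowledge signals in priority order (lower number = higher priority)
    for as long as the transfer budget of the image data bus is not exceeded."""
    granted, used = [], 0
    for priority, name, cost in sorted(requests):
        if used + cost <= bus_budget:
            granted.append(name)
            used += cost
    return granted

# Example: the real-time finder path is served first; the JPEG codec must wait.
print(grant_acknowledges([(2, "jpeg_codec", 30),
                          (0, "input_signal_processor", 40),
                          (1, "ntsc_pal_encoder", 40)]))
# -> ['input_signal_processor', 'ntsc_pal_encoder']
```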


If the memory controller 22 has access to external circuitry, not shown, over the image data bus 33, and the external circuitry can send the above-mentioned request signal or receive the transmitted acknowledge signal, the memory controller 22 can have access, simultaneously and time-divisionally, to that circuitry as well as to the respective circuits within the signal processor 20, within the transfer band limitation of the image data bus 33. That is, within the range of the band of the image data bus 33, the memory controller 22 can have simultaneous, time-divisional access to the circuits in the signal processor 20 and to external circuits outside the signal processor 20, without regard to the number of such circuits.


As mentioned above, the memory controller 22 performs arbitration of the image data bus 33, write/readout control of image data between the image memory 32 and the respective circuits and data transfer to the CPU bus 34.


The specified flow of image data in the signal processor 20 is explained with reference to FIG. 4.


The input signal processor 21 includes a CCD interface 21a for performing pre-set signal processing on the image data from the image generating unit 10, a detection circuit 21b for performing detection on the image data from the CCD interface 21a, and a camera digital signal processor 21c (camera DSP 21c) for performing conversion processing of the image data.


The CCD interface 21a performs processing, such as digital clamp, white balance adjustment or gamma correction, on the image data made up of R, G and B from the S/H-A/D circuit 12 shown in FIG. 2, and, if necessary, decimates the horizontal components of the image data. After such processing, the CCD interface 21a routes the image data to the camera DSP 21c or to the memory controller 22 via the image data bus 33.


The detection circuit 21b performs detection for auto-focusing, auto-iris and white balance adjustment based on the image data from the CCD interface 21a.


The camera DSP 21c converts the image data of R, G and B from the CCD interface 21a into image data made up of the luminance signal Y and the chrominance signals Cb, Cr. The camera DSP 21c also has a simplified resolution conversion circuit 21d which not only performs the above processing but also converts the resolution of the image data in a simplified fashion.


The simplified resolution conversion circuit 21d operates for converting the resolution of the image data to lower values if the resolution of the image data generated by the CCD image sensor 11 is larger than, for example, the VGA format.


Specifically, referring to FIG. 5, the simplified resolution conversion circuit 21d includes a B-Y/R-Y separation circuit 61 for separating the chrominance signals, a horizontal direction linear interpolation circuit 62 for interpolation in the horizontal direction, a B-Y/R-Y synthesis circuit 63 for synthesizing the chrominance signals, a 1H delay circuit 64 for delaying the respective signals by one horizontal scanning period (1H period), and a vertical direction linear interpolation circuit 65 for interpolation in the vertical direction.


The B-Y/R-Y separation circuit 61 separates the chrominance signals B-Y and R-Y, as the chroma signals Cb, Cr, from the image data from the camera DSP 21c, to route the separated chroma signals to the horizontal direction linear interpolation circuit 62. The horizontal direction linear interpolation circuit 62 interpolates the luminance signals Y and the chrominance signals B-Y, R-Y in the horizontal direction to lower the resolution in the horizontal direction, and routes the interpolated luminance signals Y and chrominance signals B-Y, R-Y to the B-Y/R-Y synthesis circuit 63.


The B-Y/R-Y synthesis circuit 63 synthesizes the chrominance signals B-Y, R-Y, and routes the luminance signals Y from the horizontal direction linear interpolation circuit 62 and the synthesized chrominance signals B-Y, R-Y to the 1H delay circuit 64 and to the vertical direction linear interpolation circuit 65. The 1H delay circuit 64 delays the luminance signals Y and the chrominance signals by 1H to route the delayed signals to the vertical direction linear interpolation circuit 65. The vertical direction linear interpolation circuit 65 performs linear interpolation processing in the vertical direction, based on the luminance signals Y and the chrominance signals B-Y, R-Y from the B-Y/R-Y synthesis circuit 63 and the 1H delay circuit 64, to output image data made up of luminance signals Y′ and chrominance signals (B-Y)′, (R-Y)′ lowered in resolution in both the horizontal and vertical directions.
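The simplified conversion just described amounts to separable linear interpolation, first along each line and then across lines (the 1H delay supplying the previous line). A plain-Python sketch of the idea, not of the actual circuit:

```python
def lerp_resample(samples, out_len):
    """Linearly interpolate a 1-D signal to out_len samples."""
    n = len(samples)
    out = []
    for i in range(out_len):
        pos = i * (n - 1) / (out_len - 1) if out_len > 1 else 0
        j = int(pos)
        g = pos - j                                  # fractional position
        nxt = samples[min(j + 1, n - 1)]
        out.append((1 - g) * samples[j] + g * nxt)
    return out

def resize_plane(plane, out_w, out_h):
    """Horizontal interpolation per line, then vertical interpolation per column."""
    rows = [lerp_resample(row, out_w) for row in plane]
    cols = [lerp_resample([row[x] for row in rows], out_h) for x in range(out_w)]
    return [[cols[x][y] for x in range(out_w)] for y in range(out_h)]

print(resize_plane([[0, 3, 6], [6, 9, 12]], 2, 2))   # -> [[0.0, 6.0], [6.0, 12.0]]
```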


The resolution conversion circuit 28 performs resolution conversion processing of converting [p×q] image data into [m×n] image data. The resolution conversion circuit 28 performs processing for suppressing the resolution to a pre-set value if the image data produced in the CCD image sensor 11 are of high resolution. It is however possible to process the image data of low resolution into data of high resolution.


Referring to FIG. 6, the resolution conversion circuit 28 includes an input buffer 71 for storing image data inputted from the image data bus 33, a horizontal direction buffer 72 for buffering the image data from the input buffer 71 in the horizontal direction, a horizontal direction transform processing circuit 73 for converting the resolution of the image data from the horizontal direction buffer 72 in the horizontal direction, a vertical direction buffer 74 for buffering the image data from the horizontal direction transform processing circuit 73 in the vertical direction, a vertical direction transform processing circuit 75 for converting the resolution of the image data in the vertical direction, and an output buffer 76 for buffering at the time of outputting.


When ready for converting the resolution of the image data, the resolution conversion circuit 28 outputs a read request signal requesting the memory controller 22 to read out image data from the image memory 32, while outputting a write request signal requesting the memory controller 22 to write the image data in the image memory 32 after the conversion processing of the image data. The resolution conversion circuit 28 also receives an acknowledge signal indicating that the memory controller 22 has responded to the request signal.


Referring to FIG. 7, the horizontal direction buffer 72 is made up of a first delay circuit 81, a second delay circuit 82 and a third delay circuit 83, each producing a delay of one pixel. Thus, the first delay circuit 81 outputs image data delayed by one pixel, while the second and third delay circuits 82, 83 output image data delayed by two pixels and by three pixels, respectively.


Referring to FIG. 7, the horizontal direction transform processing circuit 73 includes first to fourth multipliers 84, 85, 86, 87 and first to third adders 88, 89, 90. A circuit for normalizing the data may be annexed downstream of the adder 90.


The first multiplier 84 multiplies the image data supplied from the input buffer 71 with a pre-set coefficient to route the resulting data to the adder 88. The second multiplier 85 multiplies the image data supplied from the first delay circuit 81 with a pre-set coefficient to route the resulting data to the adder 88. The third multiplier 86 multiplies the image data supplied from the second delay circuit 82 with a pre-set coefficient to route the resulting data to the adder 89. The fourth multiplier 87 multiplies the image data supplied from the third delay circuit 83 with a pre-set coefficient to route the resulting data to the adder 90. The first adder 88 synthesizes the image data to send the resulting data to the second adder 89. The second adder 89 synthesizes the image data to send the resulting data to the third adder 90. The third adder 90 synthesizes the respective image data to send the resulting data, as image data converted in resolution in the horizontal direction, to the vertical direction buffer 74.


Thus, the horizontal direction transform processing circuit 73 weights plural image data, each delayed by one pixel relative to the next, with pre-set weights and synthesizes the weighted image data to interpolate or decimate the pixels in the horizontal direction, thereby converting the resolution in the horizontal direction.
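In software terms, this multiplier/adder chain behaves like a four-tap filter fed through a one-pixel delay line; a sketch with placeholder coefficients (the real weights depend on the desired conversion ratio):

```python
def four_tap_horizontal(line, coeffs=(0.25, 0.25, 0.25, 0.25)):
    """Weight the current pixel and the three previously delayed pixels,
    then sum: one output per input pixel, mirroring FIG. 7's multiplier/adder chain."""
    d1 = d2 = d3 = 0                      # the three one-pixel delay elements
    out = []
    for pixel in line:
        acc = (coeffs[0] * pixel + coeffs[1] * d1 +
               coeffs[2] * d2 + coeffs[3] * d3)
        out.append(acc)
        d1, d2, d3 = pixel, d1, d2        # shift the delay line by one pixel
    return out

print(four_tap_horizontal([10, 20, 30, 40, 50]))   # -> [2.5, 7.5, 15.0, 25.0, 35.0]
```

The vertical direction transform processing circuit works the same way, with one-line delays in place of one-pixel delays.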


The vertical direction buffer 74 is constituted by a serial connection of first to third buffer memories 91, 92, 93, each adapted to produce a one-line delay. Thus, the first buffer memory 91 outputs image data delayed by one line, while the second and third buffer memories 92, 93 output image data delayed by two lines and by three lines, respectively.


Referring to FIG. 7, the vertical direction transform processing circuit 75 includes fifth to eighth multipliers 94 to 97 and fourth to sixth adders 98 to 100. The vertical direction transform processing circuit 75 occasionally includes a circuit for normalizing data on the downstream side of the adder 100.


The fifth multiplier 94 multiplies the image data supplied from the horizontal direction transform processing circuit 73 with a pre-set coefficient to route the resulting data to the fourth adder 98. The sixth multiplier 95 multiplies the image data supplied from the first buffer memory 91 with a pre-set coefficient to route the resulting data to the fourth adder 98. The seventh multiplier 96 multiplies the image data supplied from the second buffer memory 92 with a pre-set coefficient to route the resulting data to the fifth adder 99. The eighth multiplier 97 multiplies the image data supplied from the third buffer memory 93 with a pre-set coefficient to route the resulting data to the sixth adder 100. The fourth adder 98 synthesizes the image data to send the resulting data to the fifth adder 99. The fifth adder 99 synthesizes the image data to send the resulting data to the sixth adder 100. The sixth adder 100 synthesizes the respective image data to output the resulting data as image data converted in resolution in the vertical direction.


Thus, the vertical direction transform processing circuit 75 weights plural image data, each delayed by one line relative to the next, with pre-set weights and synthesizes the weighted image data to interpolate or decimate the pixels in the vertical direction, thereby converting the resolution in the vertical direction.


In FIG. 7, the resolution conversion circuit 28 first performs resolution conversion in the horizontal direction followed by resolution conversion in the vertical direction. It is however possible for the resolution conversion circuit 28 to perform resolution conversion in the vertical direction followed by conversion in the horizontal direction. That is, the resolution conversion circuit 28 may be configured to supply the image data from the input buffer 71 to the vertical direction buffer 74 and to effect the processing in the vertical direction buffer 74, vertical direction transform processing circuit 75, horizontal direction buffer 72 and in the horizontal direction transform processing circuit 73, in this order.


In the above-described embodiment, the first to third buffer memories 91 to 93 in the vertical direction buffer 74 are configured to store one line (1H) of image data. Alternatively, the first to third buffer memories 91 to 93 may be configured to store image data of less than one line, as shown in FIG. 9. In that case, it is necessary for the memory controller 22 to read out the image data stored in the image memory 32 every N pixels, as shown in FIG. 10.


Specifically, the memory controller 22 reads out pixel data corresponding to a viewing screen stored in the image memory 32 every N pixels on the line basis in the vertical direction. Referring to FIG. 11, each viewing screen is made up of p×q pixels, with the coordinate of the upper left pixel being (1,1), that of the upper right pixel being (p,1), that of the lower left pixel being (1,q) and with the lower right pixel being (p,q).


Referring to FIG. 12, the memory controller 22 causes the image data of N pixels to be read out on the line basis in the horizontal direction in the sequence of the rows 1, 2, . . . , q. This causes the memory controller 22 to read out image data corresponding to N pixels from the left end, or N×q pixels, that is pixel data in an area defined by (1,1), (1,q), (N,q) and (N,1). This image data is referred to below as image data set (1).


The memory controller 22 then reads out image data in a range defined by (N−1, 1), (N−1, q), (2N−2, q) and (2N−2, 1), referred to below as the image data set (2). If the memory controller 22 reads out the image data set (1) and the image data set (2), it is tantamount to reading out the image data of the (N−1)st column and the Nth column twice.


The reason is that, since the vertical direction transform processing circuit 75 performs interpolation using the surrounding pixels, the pixels stored at the leading end and the trailing end of the first to third buffer memories 91 to 93 are not the object of processing. For example, if the image data set (1) is read out, the pixel (N, 1) is not the object of the interpolation processing in the vertical direction. However, this pixel (N, 1) is read out again when the image data set (2) is read out, and then becomes the object of interpolation processing.


In a similar manner, the memory controller 22 reads out image data of N pixels per line in the horizontal direction so that the image data of the last two columns of the directly previous image data set is included, and routes each image data set to the resolution conversion circuit 28.
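The resulting column ranges of the image data sets can be sketched as follows (1-based column indices as in FIG. 11; the two-column overlap matches the example above):

```python
def strip_columns(p, n):
    """Yield (first_col, last_col) for each N-pixel-wide strip; adjacent strips
    share two columns so that edge pixels can still be interpolated."""
    first = 1
    while first <= p:
        last = min(first + n - 1, p)
        yield first, last
        if last == p:
            break
        first = last - 1            # next strip re-reads the previous last two columns

# Example: a 10-pixel-wide screen read in strips of N = 4 columns.
print(list(strip_columns(10, 4)))   # [(1, 4), (3, 6), (5, 8), (7, 10)]
```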


The vertical direction buffer 74 is fed with the image data, in an amount corresponding to the capacity of the first to third buffer memories 91 to 93, on the line basis. Thus, image data offset by one line is stored in each of the first to third buffer memories 91 to 93. The vertical direction transform processing circuit 75 is then able to perform the resolution conversion processing in the vertical direction based on the image data from the first to third buffer memories 91 to 93 of the vertical direction buffer 74.


In this manner, the memory controller 22 can cause the resolution conversion circuit 28 to execute the resolution conversion in the vertical direction by reading out image data in amounts matched to the capacity of the buffer memories, even if the capacity of the buffer memories required for resolution conversion in the vertical direction is less than one line.


Although the read-out overlap between the image data sets is two columns in the above example, the overlap may exceed two columns, or there may be no overlap at all. It is noted that the present invention is applicable to image signal processing in general, such as camera signal processing, without limitation to resolution conversion.


Although the foregoing description is directed to the embodiment in which the buffer memory is being used for interpolation for the vertical direction, the present invention is also applicable to an embodiment in which the buffer memory is being used for interpolation for the horizontal direction.


That is, the resolution conversion circuit 28 may perform resolution conversion in the horizontal direction using a horizontal direction buffer 72a comprised of a buffer memory having a capacity of N pixels, as shown in FIG. 13. The memory controller 22 then reads out image data of N pixels on the column basis, in the vertical direction, in the sequence of the columns 1, 2, . . . , p, as shown in FIG. 14. Meanwhile, it is necessary for the memory controller 22 to read out the image data stored at the leading and trailing ends of the buffer memory twice, as in the above-described vertical interpolation processing, so that these image data will be the object of the horizontal interpolation processing.


Thus, the memory controller 22 is able to read out image data from the image memory 32 so that resolution conversion processing in the horizontal and vertical directions can be effected with the first to third buffer memories 91 to 93 each having a capacity of only N pixels. This enables the circuit scale of the horizontal direction buffer 72 and the vertical direction buffer 74 to be reduced, lowering the production cost.


The NTSC/PAL encoder 23, executing the encoding as described above, also has a simplified resolution conversion circuit 23a for increasing the resolution of the image data, if need be, before proceeding to encoding.


The simplified resolution conversion circuit 23a performs resolution conversion for matching the display standard of the finder 36 if the resolution of the image data in the image memory 32 is lower than the resolution required for display.


Referring to FIG. 15, the simplified resolution conversion circuit 23a includes a line memory 101 for storing image data from the image data bus 33, a vertical direction linear interpolation circuit (V-direction linear interpolation circuit 102) for interpolating image data in the vertical direction, and a horizontal direction interpolation circuit 103.


The line memory 101 stores one line of image data from an input terminal in and sends the image data to the V-direction linear interpolation circuit 102 in the order in which it was stored. The V-direction linear interpolation circuit 102 weights the image data from the input terminal in and the image data from the line memory 101 with a pre-set weighting to perform linear interpolation in the vertical direction. The horizontal direction interpolation circuit 103 interpolates Y with an order-seven filter, while interpolating Cb and Cr with an order-three filter; this is simply an interpolation for increasing the resolution by a factor of two. The horizontal direction interpolation circuit 103 outputs the image data at an output terminal out.


For example, if image data inputted from the input terminal in is denoted a, image data read out from the line memory 101 is b, a coefficient for weighting is g, where 0≦g≦1, and image data outputted by the V-direction linear interpolation circuit 102 is c, the V-direction linear interpolation circuit 102 effectuates the following processing:

c=g*a+(1−g)*b.
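A minimal sketch of this weighting, with the line memory 101 modelled as the previously received line (variable names mirror a, b, g and c above):

```python
def v_interpolate(curr_line, prev_line, g):
    """c = g*a + (1 - g)*b, applied pixel by pixel; prev_line plays the
    role of the one-line memory 101."""
    assert 0.0 <= g <= 1.0
    return [g * a + (1.0 - g) * b for a, b in zip(curr_line, prev_line)]

# Example: halfway (g = 0.5) between two lines.
print(v_interpolate([100, 100, 100], [0, 0, 0], 0.5))   # [50.0, 50.0, 50.0]
```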


The image data outputted by the output terminal out is encoded by the NTSC/PAL encoder 23, as mentioned previously.


In the signal processing system, the digital still camera 1 is made up of so-called two chips, namely the signal processor 20 and the CPU 41. The respective signal processing circuits are therefore integrated on a single chip, so that the substrate surface area and further the power consumption can be made smaller than if the respective signal processing circuits were of separate chip configurations.


Also, since the signal processor 20 is not a chip configuration that includes the CPU, signal processing can be performed adaptively even if the application associated with the CPU 41 is changed. That is, if the signal processor 20 were a chip configuration including the CPU, the chip could not be reconfigured when the application of the CPU is changed. With the present configuration, however, the signal processor 20 can perform the pre-set signal processing using a CPU of a structure that is optimum for each application.


The digital still camera 1 of the above-described structure has a finder mode for confirming the status or the position of an object prior to image shooting, a recording mode for shooting the image of the object as confirmed, and a reproducing mode for confirming the shot state of the object image, and effects the processing depending on the prevailing mode.
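Before each mode is walked through below, the overall control flow can be pictured roughly as follows; the helper and stage names are hypothetical and purely illustrative, not taken from the patent:

```python
def finder_mode(capture, realtime_process, memory, display):
    """Finder mode: only the real-time stage runs before the frame is shown."""
    memory.append(realtime_process(capture(decimate=True)))   # low-resolution frame
    display(memory[-1])                                       # read back, show on finder

def recording_mode(capture, stages, memory, record):
    """Recording mode: store the full frame first, then process stage by stage."""
    memory.append(capture(decimate=False))                    # high-resolution frame
    for stage in stages:                                      # e.g. Y/Cb/Cr conversion, JPEG
        memory.append(stage(memory[-1]))
    record(memory[-1])

def reproducing_mode(read_record, stages, memory, display):
    """Reproducing mode: expand and rescale the recorded data, then display it."""
    memory.append(read_record())
    for stage in stages:                                      # e.g. JPEG expansion, resizing
        memory.append(stage(memory[-1]))
    display(memory[-1])
```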


In the finder mode, the user observes the state of the object indicated on the finder 36 before pressing a shutter button, not shown, to shoot the object. In this finder mode, the memory controller 22 and the other circuits are controlled in the following manner. For illustrating the respective modes, reference is had mainly to FIG. 4 and occasionally to FIG. 16.


In the finder mode, the CCD image sensor 11 generates image signals, thinned out to one-third in the vertical direction, and furnishes the digitized image data via the S/H-A/D circuit 12 to the CCD interface 21a.


The CCD interface 21a performs signal processing in synchronism with the clocks shown in FIG. 16A. Specifically, the CCD interface 21a decimates the horizontal components of the image data supplied by the image generating unit 10 to one-third and corrects the decimated image data for gamma. The CCD interface 21a thus furnishes the image data, converted to 340×256 by the ⅓ decimation, to the camera DSP 21c.


The camera DSP 21c converts the decimated image data into Y, Cr and Cb image data. The camera DSP 21c then lowers the resolution of the image data in the simplified resolution conversion circuit 21d (340×256→320×240) and routes the converted image data via the image data bus 33 to the memory controller 22.


It is noted that the simplified resolution conversion circuit 21d lowers the resolution in a simplified fashion to an extent necessary for the subsequent processing. In this manner, even if the image data generated by the CCD image sensor 11 is of high resolution, the transfer band taken up by the image data can be decreased, avoiding congestion on the image data bus 33 and maintaining the real-time characteristics of the finder mode.
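A sketch of the size bookkeeping along this finder-mode path (the intermediate 340×256 figure is the patent's own; a strict one-third of 1024 would be about 341):

```python
# Hedged sketch of the finder-mode size reduction described above.
xga = (1024, 768)                  # CCD output in the recording mode
after_ccd_thinning = (1024, 256)   # 1/3 vertical thinning inside the CCD
after_h_decimation = (340, 256)    # 1/3 horizontal decimation in the CCD interface
finder = (320, 240)                # simplified resolution conversion in the camera DSP

for label, size in [("CCD thinning", after_ccd_thinning),
                    ("horizontal decimation", after_h_decimation),
                    ("simplified conversion", finder)]:
    print(f"after {label}: {size[0]}x{size[1]}")

print(f"bus load reduced roughly {xga[0] * xga[1] / (finder[0] * finder[1]):.1f}-fold")
```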


The memory controller 22 writes the image data in the image memory 32, while reading out the image data from the image memory 32 as shown in FIG. 16D to send the read-out image data via the image data bus 33 to the NTSC/PAL encoder 23. Simultaneously, the memory controller 22 reads out the OSD data stored in the image memory 32, as shown in FIG. 16E, and sends the OSD data to the NTSC/PAL encoder 23 as well. FIG. 16F shows the state of transfer on the image data bus 33 which enables the above-described real-time processing.


The simplified resolution conversion circuit 23a of the NTSC/PAL encoder 23 performs resolution conversion of 320×240→640×240 or 320×240→640×288 in the case of the NTSC system or the PAL system, respectively, and sends the converted image data to the NTSC/PAL encoder 23. The NTSC/PAL encoder 23 synthesizes the image data with the OSD data and encodes the result in accordance with the NTSC system or the PAL system, routing the encoded data to the finder 36 shown in FIG. 3. This allows the image of the object, the title information and so forth to be displayed in real time on the finder 36.


Meanwhile, the NTSC/PAL encoder 23 converts the resolution so that data of low resolution will be increased in resolution, such that, if 320×240 image data is furnished, it is converted into 640×240 image data for the NTSC system and into 640×288 image data for the PAL system.


In the digital still camera 1, the resolution of the image data generated by the CCD image sensor 11 is lowered in a simplified fashion in the finder mode to reduce the data volume, so that the image data will be within the bandwidth limitation of the image data bus 33 and so that the resolution will be increased at an output stage to the extent that is necessary for display, at a timing shown in FIG. 16F.


Thus, with the digital still camera 1, the image data is held within the bandwidth limitation of the image data bus 33, permitting the image of the object to be displayed on the finder 36 even if the image data is of high resolution, without the necessity of performing time-consuming decimation processing.


If the circuits to be processed preferentially, namely the CCD interface 21a, the camera DSP 21c and the NTSC/PAL encoder 23, are set in advance in the CPU 41, and signal processing is carried out time-divisionally in the other circuits as well as in the above circuits, the processing of the circuits with high priority may be performed preferentially depending on the data volume of the image data.


If the data volume of the image data in the simplified resolution conversion circuit 21d is large, data processing may be performed at a higher processing speed under control by the CPU 41, in order to give priority to real-time processing even though the picture quality is degraded to a certain extent. In this manner, high-speed processing can be effected in the finder mode even when the data volume of the image data generated in the image generating unit 10 is large.


In the case of the digital still camera 1, having an electronic zooming function, the CPU 41 can control the respective circuits in the following manner.


The memory controller 22 causes the image data, supplied via the CCD interface 21a and camera DSP 21c, to be written in the image memory 32, while causing the image data to be read out from the image memory 32 and routed to the resolution conversion circuit 28. The resolution conversion circuit 28 formulates image data enlarged from a portion of the input image, by an electronic zooming function, to output the resulting image data to the image memory 32. This image data is read out from the image memory 32 and outputted to the finder 36 via the NTSC/PAL encoder 23. This generates electronically zoomed image data.
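A rough sketch of that electronic zoom step (the crop ratio and the nearest-neighbour enlargement are illustrative simplifications; the resolution conversion circuit 28 would interpolate instead):

```python
def electronic_zoom(plane, zoom=2):
    """Crop the central 1/zoom portion of the image and scale it back up to
    the original size (pixel repetition used here for brevity)."""
    h, w = len(plane), len(plane[0])
    ch, cw = h // zoom, w // zoom
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = [row[left:left + cw] for row in plane[top:top + ch]]
    return [[crop[y * ch // h][x * cw // w] for x in range(w)] for y in range(h)]

tiny = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16]]
print(electronic_zoom(tiny))
# -> [[6, 6, 7, 7], [6, 6, 7, 7], [10, 10, 11, 11], [10, 10, 11, 11]]
```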


Since the finder mode gives utmost priority to the real-time characteristics, time-consuming processing is not executed by the respective circuits. However, the CPU 41 can be configured to cause the memory controller 22 and other circuits to perform various processing operations within the range allowed by the transfer band of the image data bus 33.


For example, the memory controller 22 may be configured to read out image data from the image memory 32, in which is stored the image data furnished from the CCD interface 21a, and to furnish the read-out image data to the NTSC/PAL encoder 23 over the image data bus 33 and to the JPEG encoder/decoder 29. The finder 36 displays the image of the object in real-time, while the JPEG encoder/decoder 29 compresses the image data in accordance with the JPEG system.


The JPEG encoder/decoder 29 compresses/expands still images, but cannot process high-pixel images in real time. It is thus possible for the JPEG encoder/decoder 29 to decimate the image data supplied from the image data bus 33 to a pre-set number of frames or fields before compression, or to slice out a portion of the image to lower the resolution before compression. This enables shooting of frame-decimated still images continuously or shooting of low-resolution images continuously.
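A sketch of such frame decimation before compression (the keep-every-k policy is an assumption; the patent only says a pre-set number of frames or fields is decimated):

```python
def frames_to_compress(frame_indices, keep_every=3):
    """Hand only every keep_every-th frame to the JPEG codec so that the codec,
    which cannot keep up with the full frame rate, stays in real time."""
    return [i for i in frame_indices if i % keep_every == 0]

print(frames_to_compress(range(10)))   # [0, 3, 6, 9]
```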


The user observes the state of the object displayed on the finder 36 in the above-mentioned finder mode. If the object is decided to be shot, the user pushes a shutter button, not shown.


If the shutter button is pushed, the digital still camera 1 proceeds to the recording mode. In the recording mode, the CPU 41 controls the memory controller 22 or the respective circuits in the following manner to record the image of the as-shot object on a recording device 51.


The CCD image sensor 11 halts the decimation operation in synchronism with the pressing of the shutter button and generates image signals of the XGA format, routing the digitized image data via the S/H-A/D circuit 12 to the CCD interface 21a.


The CCD interface 21a routes the image data furnished from the S/H-A/D circuit 12 not to the camera DSP 21c, but to the memory controller 22 via the image data bus 33. The memory controller 22 first writes the image data in the image memory 32 and subsequently reads out the image data to route the read-out image data via the image data bus 33 to the camera DSP 21c. The camera DSP 21c converts the image data made up of R, G and B into image data made up of Y, Cb and Cr.


The camera DSP 21c is fed with image data once written in the image memory 32. That is, the camera DSP 21c effects data conversion on the image data from the image memory 32 instead of on the image data directly supplied from the CCD interface 21a. Thus, it is unnecessary for the camera DSP 21c to perform high-speed data conversion, but it is only sufficient if the camera DSP 21c executes such processing when the image data bus 33 is not busy. Stated differently, it is unnecessary for the camera DSP 21c to perform the processing in real-time, so that data conversion processing can be executed with priority given to the high picture quality rather than to the high processing speed and the resulting converted image data may be routed to the memory controller 22 via the image data bus 33. The memory controller 22 causes the image data to be written in the image memory 32.


The memory controller 22 causes the image data to be read out from the image memory 32 to route the read-out image data to the JPEG encoder/decoder 29. The JPEG encoder/decoder 29 compresses the image data in accordance with the JPEG system to write the compressed image data in the recording device 51 shown in FIG. 3.


If real-time processing is unnecessary, as during recording, the CPU 41 permits the pre-set processing to be executed after the image data has been written transiently in the image memory 32, thereby exploiting the transfer band of the image data bus 33 to process the high-pixel image data.


The CPU 41 records the image data of the XGA format directly in the recording device 51 in the recording mode. It is however possible for the resolution conversion circuit 28 to convert the resolution of the image data before the image data is recorded on the recording device 51. Specifically, it is possible to cause the resolution conversion circuit 28 to convert the resolution of the image data, read out from the image memory 32 via the memory controller 22, to match the VGA format (1024×768→640×480), and then to cause the JPEG encoder/decoder 29 to compress the image data and record the compressed data in the recording device 51.


If the operator wishes to confirm the as-shot image after image shooting, the operator presses a playback button, not shown, to reproduce the as-shot image.


If the playback button is pressed, the digital still camera 1 moves to the reproducing mode. In the reproducing mode, the CPU 41 controls the respective circuits in the following manner to read out the image data of the object.


That is, on detecting that the playback button has been pressed, the CPU 41 reads out the image data from the recording device 51 and transiently stores the read-out image data in the DRAM 42 before routing the data via the CPU bus 34 to the JPEG encoder/decoder 29. The JPEG encoder/decoder 29 expands the image data read out from the recording device 51 in accordance with the JPEG system to produce image data of the XGA format and routes the resulting image data via the image data bus 33 to the memory controller 22.


The memory controller 22 writes the image data in the image memory 32 and reads out the image data from the image memory 32 to send the read-out image data via the image data bus 33 to the resolution conversion circuit 28.


The resolution conversion circuit 28 effects resolution conversion so that the image data will match the VGA format (1024×768→640×480 for the NTSC system and 1024×768→640×576 for the PAL system) and routes the converted image data over the image data bus 33 to the memory controller 22, which writes it in the image memory 32. The image data is then read out from the image memory 32 and routed via the NTSC/PAL encoder 23 to the finder 36. This displays an image corresponding to the image data recorded in the recording device 51 on the finder 36.


That is, since the image data recorded in the recording device 51 has high resolution, the CPU 41 first lowers the resolution and subsequently routes the image data to the finder 36.


It is also possible for the CPU 41 to set, for each of the finder mode, recording mode and the reproducing mode, the order of priority of the circuits to be processed in preference and to cause the pertinent circuit to execute the processing in accordance with the order of priority on movement to one of the modes. This enables the signal processing of image data to be executed efficiently depending on the processing contents in each mode.


In the above-described embodiment, it is assumed that the data being processed is the image data equivalent to XGA. It is to be noted that the present invention is not limited to this embodiment and can be applied to, for example, the processing of image data comprised of one million or more pixels.

Claims
  • 1. An imaging apparatus comprising:
    an image sensor capable of generating image data from incident light, wherein the image sensor is capable of decimating the image data at least in a vertical direction;
    a signal processor connected to the image sensor, capable of selectively performing a plurality of signal processing on image data, capable of performing a processing of image data comprising a luminance signal and two chrominance signals as a performable signal processing, and capable of performing a processing of image data in a plurality of resolutions including a low resolution associated with a display resolution and a high resolution that is higher than said low resolution as a performable signal processing, wherein the signal processor comprises a data bus for interconnecting a plurality of components of the signal processor for data exchange;
    a display having the display resolution, connected to the signal processor and capable of displaying image data;
    a storage interface connected to the signal processor and capable of connecting to a recording medium for recording image data thereon; and
    a memory connected to the signal processor and capable of storing image data from the image sensor;
    wherein said imaging apparatus is configured to be operated in first, second and third operation modes;
    wherein in the first operation mode:
    image data is generated by the image sensor and decimated into the low resolution,
    the image data is stored in the memory,
    the image data is formatted for display by the signal processor, and
    the formatted image data is displayed on the display;
    in the second operation mode:
    image data is generated by the image sensor in the high resolution,
    the image data is stored in the memory,
    the image data is compressed by the signal processor into compressed image data, and
    the compressed image data is recorded on the recording medium; and
    in the third operation mode:
    compressed image data is read out from the recording medium,
    the compressed image data is decompressed by the signal processor,
    the decompressed image data is stored in the memory,
    the decompressed image data is formatted for display by the signal processor, and
    the formatted image data is displayed on the display.
  • 2. The imaging apparatus of claim 1, wherein the performable signal processing of the signal processor includes resolution conversion of said image data comprising a luminance signal and two chrominance signals.
  • 3. The imaging apparatus of claim 1, wherein the performable signal processing of the signal processor includes electronic zooming.
  • 4. The imaging apparatus of claim 1, wherein the performable signal processing of the signal processor includes enlarging at least a portion of image data.
  • 5. The imaging apparatus of claim 1, wherein the memory stores image data comprising a luminance signal and two chrominance signals.
  • 6. The imaging apparatus of claim 1, wherein said signal processor is capable of processing YCrCb image data where Y is said luminance component, and Cr and Cb are said two chrominance components.
  • 7. The imaging apparatus of claim 1, wherein the performable signal processing of the signal processor includes NTSC encoding.
  • 8. The imaging apparatus of claim 1, wherein the performable signal processing of the signal processor includes PAL encoding.
  • 9. The imaging apparatus of claim 1, wherein the signal processor generates OSD (on-screen-display) data and outputs the OSD data with image data to the display.
  • 10. The imaging apparatus of claim 1, wherein the formatting for display by the signal processor includes resolution conversion.
  • 11. The imaging apparatus of claim 1, wherein the formatting for display by the signal processor is VGA formatting.
  • 12. The imaging apparatus of claim 1, wherein at least part of the data exchange on the signal processor is performed on the basis of request and response.
  • 13. The imaging apparatus of claim 1, further comprising two separate buses for exchanging data between components of the imaging apparatus.
  • 14. The imaging apparatus of claim 1, wherein the performable signal processing of the signal processor includes arbitration of data transfer within the signal processor.
  • 15. The imaging apparatus of claim 1, wherein the performable signal processing of the signal processor includes memory access control for the signal processor.
  • 16. The imaging apparatus of claim 1, wherein the signal processor is capable of detecting defective pixels of the image sensor.
  • 17. The imaging apparatus of claim 1, wherein said storage interface is a flash memory interface.
  • 18. The imaging apparatus of claim 1, further comprising a communication circuit capable of outputting image data to outside of the imaging apparatus.
  • 19. The imaging apparatus of claim 1, wherein the performable signal processing of the signal processor includes color space conversion.
  • 20. The imaging apparatus of claim 1 further comprising a controller configured to control the signal processor.
  • 21. The imaging apparatus of claim 1, wherein the signal processor is capable of focus control.
  • 22. The imaging apparatus of claim 1, wherein the signal processor is capable of white balance adjustment of image data.
  • 23. The imaging apparatus of claim 1, wherein the signal processor is capable of performing color processing of image data.
  • 24. The imaging apparatus of claim 1, wherein the signal processor is capable of performing shading correction of image data.
  • 25. The imaging apparatus of claim 1, wherein the signal processor is capable of performing gamma correction of image data.
  • 26. The imaging apparatus of claim 18, wherein the communication circuit is connected to a data bus.
  • 27. The imaging apparatus of claim 1, wherein the compressed image data is compressed by JPEG image compression.
  • 28. The imaging apparatus of claim 1, wherein the signal processor generates synchronization signals and supplies the synchronization signals to the image sensor.
  • 29. The imaging apparatus of claim 1, wherein the image sensor comprises a solid-state imaging sensor and an analog/digital circuit.
  • 30. The imaging apparatus of claim 29, wherein the image sensor further comprises a timing generator capable of generating timing signals based on synchronization signals supplied from the signal processor.
  • 31. The imaging apparatus of claim 30, wherein the timing signals comprise horizontal and vertical synchronization signals for controlling the image sensor.
  • 32. The imaging apparatus of claim 1 wherein said image sensor is operable to perform frame-decimation.
  • 33. The imaging apparatus of claim 1 operable to shoot still pictures continuously.
  • 34. An image processor comprising:
    a signal processing circuit capable of selectively performing a plurality of signal processing on image data, capable of performing a processing of image data comprising a luminance signal and two chrominance signals as a performable signal processing, and capable of performing a processing of image data in a plurality of resolutions including a low resolution associated with a display resolution and a high resolution that is higher than said low resolution as a performable signal processing;
    said signal processing circuit including a data bus interconnecting components of the signal processor for data exchange; and
    said image processor being operable with an image sensor capable of generating image data from incident light, wherein the image sensor is capable of decimating the image data at least in a vertical direction, a display capable of displaying image data and a storage section capable of using a recording medium for recording image data thereon;
    said image processor being operable in first, second and third operation modes,
    (1) in the first operation mode, said signal processor processes, in real-time, image data generated by the image sensor, the image data in the first operation mode having a lowered resolution and formatted for the display, and
    (2) in the second operation mode, the signal processor processes the image data generated by the image sensor in higher resolution into compressed image data and the compressed image data is stored on the recording medium, and
    (3) in the third operation mode, said signal processor decompresses the image data read out from the recording medium, and formats the decompressed image data for the display.
  • 35. The image processor of claim 34, wherein formatting of the image data is performed by resolution conversion.
  • 36. The image processor of claim 34, wherein the image data is formatted to VGA format.
  • 37. The image processor of claim 34, wherein at least part of the data exchange in the signal processor is performed on the basis of request and response.
  • 38. The image processor of claim 34, further comprising two separate data buses for exchanging data between components of the image processor.
  • 39. The image processor of claim 34, further comprising a memory, and wherein the performable signal processing of the signal processor includes memory access control for the signal processor.
  • 40. The image processor of claim 34, wherein the signal processor is capable of performing gamma correction of the image data.
  • 41. An imaging method for an imaging apparatus comprising (a) an image sensor capable of generating image data from incident light wherein the image sensor is capable of decimating the image data at least in a vertical direction, (b) a signal processor connected to the image sensor, capable of selectively performing a plurality of signal processing on image data, capable of performing a processing of image data comprising a luminance signal and two chrominance signals as a performable signal processing, capable of performing a processing of image data in a plurality of resolutions including a low resolution associated with a display resolution and a high resolution that is higher than said low resolution as a performable signal processing, and comprising a data bus for interconnecting a plurality of components of the signal processor for data exchange, (c) a display having the display resolution, connected to the signal processor and capable of displaying image data, (d) a storage interface connected to the signal processor and capable of connecting to a recording medium for recording image data thereon, and (e) a memory connected to the signal processor and capable of storing image data from the image sensor, the method comprising the steps of:
    in the first operation mode:
    generating image data by the image sensor,
    decimating the generated image data into the low resolution,
    storing the image data in the memory,
    formatting the image data for display by the signal processor, and
    displaying the formatted image data on the display;
    in the second operation mode:
    generating image data in the high resolution by the image sensor,
    storing the image data in the memory,
    compressing the image data by the signal processor into compressed image data, and
    recording the compressed image data on the recording medium; and
    in the third operation mode:
    reading compressed image data out from the recording medium,
    decompressing the compressed image data by the signal processor,
    storing the decompressed image data in the memory,
    formatting the decompressed image data for display by the signal processor, and
    displaying the formatted image data on the display.
  • 42. The method of imaging of claim 41, wherein formatting of the image data is performed by resolution conversion.
  • 43. The method of imaging of claim 41, wherein the image data is formatted to VGA format.
  • 44. The method of imaging of claim 41, wherein at least a portion of the signal processing exchanges data based on request and response.
  • 45. The method of imaging of claim 41, further comprising exchanging data by two separate data buses.
  • 46. The method of imaging of claim 41, wherein the performable signal processing of the signal processor includes gamma correction of the image data.
  • 47. The imaging apparatus of claim 29, wherein the image sensor further comprises a sample-holding circuit.
Priority Claims (2)
Number Date Country Kind
10-204089 Jul 1998 JP national
10-333965 Nov 1998 JP national
Parent Case Info

This is a continuation of application Ser. No. 12/079,129, filed Mar. 25, 2008 now U.S. Pat. No. 7,839,447, which is a continuation of application Ser. No. 10/668,904, filed Sep. 23, 2003 now U.S. Pat. No. 7,358,992, which is a divisional of application Ser. No. 09/354,476, filed Jul. 15, 1999 now U.S. Pat. No. 6,674,464, which is entitled to the priority filing dates of Japanese applications 10-204089 filed on Jul. 17, 1998 and 10-333965 filed on Nov. 25, 1998, the entirety of which are incorporated herein by reference.

Related Publications (1)
Number Date Country
20110019059 A1 Jan 2011 US
Divisions (1)
Number Date Country
Parent 09354476 Jul 1999 US
Child 10668904 US
Continuations (2)
Number Date Country
Parent 12079129 Mar 2008 US
Child 12894622 US
Parent 10668904 Sep 2003 US
Child 12079129 US