1. Field of the Invention
This invention relates to an imaging apparatus for performing signal processing depending on the operational modes.
2. Description of the Related Art
A digital still camera retrieves image data obtained by a CCD image sensor into a DRAM or a flash memory and subsequently transfers the image data to a so-called personal computer or the like. The majority of digital still cameras of this type have hitherto been designed to cope with the video graphics array (VGA) system.
Referring for example to
Before starting the image shooting of an object, the user has to confirm an object image displayed on the finder 205. This state is termed a finder mode. At this time, the CCD image sensor 201 sends image signals obtained on photoelectric conversion to the input processing/image processing circuit 202. The input processing/image processing circuit 202 performs correlated double sampling processing on the image signals to digitize the image signals. The input processing/image processing circuit 202 then performs pre-set signal processing, such as gamma correction, knee processing or camera processing, and routes the processed image signals to the memory controller 203. The memory controller 203 is then responsive to the control by the CPU 210 to send the image data from the input processing/image processing circuit 202 to the output processing circuit 204. The output processing circuit 204 encodes the image data in accordance with, for example, the National Television System Committee (NTSC) system, converts the encoded image data to analog form, and routes the resulting analog data to the finder 205. This allows the object to be shot to be displayed on the finder 205.
On the other hand, if the user pushes a shutter button, not shown, to shift to the recording mode, the memory controller 203 causes the image data furnished from the input processing/image processing circuit 202 to be written in the memory 209. The CPU 210 causes the image data to be read out from the memory 209, compresses the image data in the compression/expansion circuit 208 in accordance with, for example, the Joint Photographic Experts Group (JPEG) system, and records the compressed image data in the recording unit 207.
If the user performs pre-set processing to shift to the reproducing mode, the CPU 210 causes image data to be read out from the recording unit 207, causes the image data to be expanded in accordance with the JPEG system in the compression/expansion circuit 208, and routes the resulting data via the memory controller 203 and the output processing circuit 204 to the finder 205. This causes the as-shot image to be displayed on the finder 205.
In keeping with recent outstanding technical progress in CCD image sensors, the resolution of image data now approaches or exceeds 1,000,000 pixels. On the other hand, it may be feared that the digital still camera of the above-described structure cannot sufficiently cope with image data exceeding 1,000,000 pixels.
If, for example, the CCD image sensor 201 outputs image signals of high resolution in the finder mode, the input processing/image processing circuit 202, the memory controller 203 or the output processing circuit 204 cannot process the image data in real time, such that the image of the object is displayed on the finder 205 in a frame-skipping fashion. This makes it inconvenient to shoot an image of the object even if the object makes only the slightest movement.
In the recording mode, in which only multi-pixel image data is recorded in the recording unit 207, it is unnecessary to perform the processing in, e.g., the input processing/image processing circuit 202 in real time.
That is, in the digital still camera 200, since the pre-set signal processing is performed in, e.g., the input processing/image processing circuit 202 without regard to the operational mode, the signal processing has not necessarily been efficient insofar as the entire apparatus is concerned.
It is therefore an object of the present invention to provide an imaging apparatus that is able to perform efficient signal processing depending on the operational mode.
In another aspect, the present invention provides a controlling method for a signal processing apparatus adapted for transmitting/receiving image data between a plurality of signal processing means and storage means for storing image data, the signal processing means being adapted for processing the image data in a pre-set fashion and for outputting to the control means a request signal for demanding furnishment of the image data for signal processing or demanding the outputting of the processed image data. The controlling method includes selecting, on furnishment of the request signal from the plural signal processing means, one or more of the signal processing means which has outputted the request signal, and furnishing the image data read out from the storage means to the selected signal processing means or writing the image data outputted by the selected signal processing means in the storage means.
In still another aspect, the present invention provides an imaging apparatus including imaging means, storage means for transiently storing image data from the imaging means, control means for controlling the writing/readout of the image data for the storage means, a plurality of signal processing means for processing the image data in a pre-set fashion and for outputting to the control means a request signal for demanding furnishment of the image data for signal processing or demanding the outputting of the processed image data, and outputting means for outputting image data processed by the signal processing means. The control means manages control, on furnishment of the request signal, to select one or more of the signal processing means which has outputted the request signal and to furnish the image data read out from the storage means to the selected signal processing means or to write the image data outputted by the selected signal processing means in the storage means.
In yet another aspect, the present invention provides a recording/reproducing apparatus including imaging means, input processing means for performing pre-set input processing on image data from the imaging means, display processing means for displaying image data on display means, first storage means for transiently storing the image data from the imaging means, control means for controlling the writing/readout of the image data for the first storage means, resolution converting means for converting the resolution of image data, compression/expansion means for compressing/expanding the image data and recording/reproducing controlling means for causing the compressed image data to be recorded on second storage means and for causing the image data recorded on the second storage means to be reproduced. The control means selects one or more signal processing means from the input processing means, display processing means, resolution converting means and the compression/expansion means. The control means causes the image data read out from the first storage means to be sent to the selected signal processing means or causes the image data outputted by the selected signal processing means to be written in the first storage means.
In the signal processing apparatus and the control method therefor, according to the present invention, if a request signal is sent from each signal processing means, the signal processing means which has outputted the request signal having the utmost priority in the priority order is selected. Control is then performed for supplying the image data read out from the storage means over the image data bus to the selected signal processing means, or for writing the processed image data of the selected signal processing means over the image data bus to the storage means, so that efficient signal processing will be executed in the respective signal processing means.
The present invention provides an imaging apparatus comprising:
In the first mode of the imaging apparatus, the image data from the imaging means are decimated and processed in a pre-set fashion by signal processing means required to perform real-time processing. In the second mode, multi-pixel image data are first written in the memory means. The multi-pixel image data are then read out therefrom and routed to and processed by the plural signal processing means.
More specifically, the signal processing means of the plural signal processing means which is required to perform real-time processing is caused to perform pre-set signal processing on the image data from the imaging means in the first operational mode of the imaging apparatus, the resulting image data then being written in the memory means and the processed image data then being read out from the memory means and routed to the display means. In the second operational mode, the image data from the imaging means is written in the memory means and read out therefrom so as to be routed to the respective signal processing means for processing, with the processed image data then being recorded on the recording medium. This realizes the signal processing that is most efficient for each operational mode.
Referring to the drawings, preferred embodiments of the present invention will be explained in detail.
The present invention is applied to a digital still camera 1, configured as shown for example in
The digital still camera 1 includes an image generating unit 10 for generating image signals, a signal processor 20 for processing image data in a pre-set fashion, an image memory 32, comprised of an SDRAM, and a controller 40 for controlling the signal processor 20.
The image generating unit 10 includes a solid-state imaging device for generating image signals, such as a CCD image sensor 11, a sample-holding/analog-to-digital circuit (S/H-A/D circuit 12) for sample-holding and digitizing the image signals to output image data, and a timing generator 13 for generating timing signals. This timing generator 13 generates horizontal synchronization signals and vertical synchronization signals for controlling the respective circuits of the image generating unit 10 based on synchronization signals supplied from the signal processor 20.
The CCD image sensor 11 generates image data corresponding to XGA (extended graphics array: 1024×768) resolution, made up of, for example, 800,000 pixels. The CCD image sensor 11 is driven based on the synchronization signals from the timing generator 13 to output image signals at a rate of 30 frames per second. Meanwhile, the CCD image sensor 11 has the function of thinning out image signals and is able to thin out the vertical components of the image signals to ½, ⅓, ¼, . . . to output the resulting thinned-out signals.
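Purely as an illustration of the effect of this thinning (the thinning itself is performed inside the CCD image sensor 11 during readout, not in software), the following sketch shows how keeping every second, third or fourth line reduces the data volume; the frame dimensions and the thin_vertically helper are assumptions introduced only for this sketch.

```python
# Illustrative sketch only: the real thinning is done by the CCD image
# sensor during readout. This model merely shows the data reduction
# obtained by keeping every 2nd, 3rd or 4th line.

def thin_vertically(frame, step):
    """Keep every `step`-th line of `frame` (a list of rows)."""
    return frame[::step]

if __name__ == "__main__":
    # A hypothetical 768-line, 1024-pixel-per-line frame (dummy values).
    frame = [[0] * 1024 for _ in range(768)]
    for step in (2, 3, 4):
        thinned = thin_vertically(frame, step)
        print(f"1/{step} thinning: {len(frame)} lines -> {len(thinned)} lines")
```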
The S/H-A/D circuit 12 is also adapted to perform sample-holding and A/D conversion at a pre-set sampling interval based on the synchronization signals from the timing generator 13 to send the resulting image data to the signal processor 20.
The signal processor 20 is constituted by a sole LSI (large-scale integrated circuit). The signal processor 20 includes an input signal processor 21 for performing input processing and camera processing on image data from the image generating unit 10, a memory controller 22 for controlling the readout/write of image data for the image memory 32, an NTSC/PAL (phase alternation by line) encoder 23, a D/A converter 24 for converting image data to analog form and outputting the resulting analog signals to outside, and a sync generator 26 for generating synchronization signals and supplying the resulting synchronization signals to the timing generator 13.
The signal processor 20 also includes a memory interface 27, as an interface for the image memory 32, a resolution conversion circuit 28 for converting the resolution of the image data, a JPEG (Joint Photographic Experts Group) encoder/decoder 29, for compressing/expanding image data, a JPEG interface 30, as an interface of the JPEG encoder/decoder 29, and a host interface 31, as an interface for having data transmission/reception with the CPU of the controller 40.
The input signal processor 21 processes the image data from the S/H-A/D circuit 12 with digital clamp, shading correction, aperture correction, gamma correction or color processing and routes the resulting processed signals to the memory controller 22. The input signal processor 21 has the function of processing the input data to convert it into Y, Cb and Cr. If the resolution of the image data is larger than that of the VGA (Video Graphics Array), the input signal processor 21 is able to perform processing for lowering the resolution. The input signal processor 21 also performs auto-focussing and auto-iris detection and routes the detection data to the controller 40 to effect automatic adjustment of the focussing mechanism and the iris mechanism. The input signal processor 21 also detects the signal levels of the three primary colors making up the image data to effect automatic white balance adjustment.
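The conversion of the R, G and B input data into Y, Cb and Cr mentioned above can be illustrated, for example, with the widely used ITU-R BT.601 coefficients; the coefficients and the 8-bit, full-range representation are assumptions of this sketch, since the embodiment does not specify the conversion matrix actually used.

```python
# Illustrative RGB -> Y/Cb/Cr conversion using common ITU-R BT.601
# full-range coefficients. The embodiment does not state which matrix
# the input signal processor 21 (camera DSP 21c) actually uses.

def rgb_to_ycbcr(r, g, b):
    """Convert one 8-bit RGB pixel to (Y, Cb, Cr)."""
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128.0
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128.0
    return y, cb, cr

if __name__ == "__main__":
    print(rgb_to_ycbcr(255, 0, 0))      # pure red
    print(rgb_to_ycbcr(128, 128, 128))  # mid grey -> Cb = Cr = 128
```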
The memory controller 22 also performs control to cause image data supplied from the input signal processor 21 or other circuitry to be written in the image memory 32 via a memory interface 27 and to read out image data of the image memory 32 via the memory interface 27. At this time, the memory controller 22 detects whether or not there is any defective pixel in the CCD image sensor 11 based on the image data stored in the image memory 32.
The memory controller 22 routes the image data read out from the image memory 32 to, for example, the NTSC/PAL encoder 23. When fed with the image data from the memory controller 22, the NTSC/PAL encoder 23 encodes the image data in accordance with the NTSC system or the PAL system to send the encoded data to the D/A converter 24. The D/A converter 24 converts the image data to analog form to output the resulting analog signals via an output terminal 25.
The memory controller 22 routes the image data, read out from the image memory 32, to the resolution conversion circuit 28 to cause the image data to be converted in resolution, while causing the image data outputted by the resolution conversion circuit 28 to be written in the image memory 32.
The memory controller 22 routes the image data via the JPEG interface 30 to the JPEG encoder/decoder 29 to effect compression of the still image, while causing the image data expanded by the JPEG encoder/decoder 29 to be written in the image memory 32.
The image memory 32 not only stores the image data as described above but also stores OSD data (on-screen-display data) as the so-called character generator data. The OSD data is made up of bit map data. The memory controller 22 controls the readout/write of the OSD data. The image data and the OSD data are synthesized by the NTSC/PAL encoder 23.
The controller 40 includes a CPU (central processing unit) 41 for controlling the respective circuits of the signal processor 20, a DRAM (dynamic random access memory) 42, a ROM (read-only memory) 43, having the control program for the CPU 41 stored therein, a flash memory interface 44, as an interface for exchanging image data with a storage device 51, such as a flash memory, and an IrDA interface 45, as an interface of the communication circuit 52 constituted such as by IrLED.
For example, the CPU 41 causes image data compressed by the JPEG encoder/decoder 29 to be written via the flash memory interface 44 in the storage device 51, made up of a flash memory, while causing image data to be read out from the storage device 51 and routed to the JPEG encoder/decoder 29. The CPU 41 also causes the image data read out from the storage device 51 to be outputted to outside as infrared light via the IrDA interface 45 and the communication circuit 52.
The schematic structure of the digital still camera 1 is shown in
The input signal processor 21 routes the image data from the CCD image sensor 11 via an image data bus 33 to the image memory 32. The NTSC/PAL encoder 23 encodes the image data from the image memory 32 in a pre-set fashion to send the resulting encoded data to the finder 36. This causes an image of an object to be displayed on the finder 36 which is adapted to display the image in association with the image data up to the VGA format.
The memory controller 22 performs data transfer between the image memory 32 and the signal processing circuits connected to the image data bus 33. The resolution conversion circuit 28 performs resolution conversion of the image data from the image memory 32 and routes the results to the image memory 32. The JPEG encoder/decoder 29 compresses the image data from the image memory 32 in accordance with the JPEG system and routes the compressed image data via the CPU bus 34 to the CPU 41, which then causes the compressed image data to be written in the storage device 51. The CPU 41 is also able to output the compressed image data via the CPU bus 34 and the communication circuit 52 to outside.
Thus, is
In the signal processor 20, the respective circuits, such as NTSC/PAL encoder 23 or the resolution conversion circuit 28, send to the memory controller 22 a request signal indicating that image data are demanded. These circuits also transmit a request signal to the memory controller 22 when outputting the image data after the end of the processing of the image data.
On reception of the request signals from the respective circuits, the memory controller 22 selects the circuit having the highest position in the priority sequence, and transmits an acknowledge signal to the selected circuit. The acknowledge signal indicates that image data can be routed to the circuit receiving the signal or that the image data outputted by the circuit which has received the acknowledge signal is ready to be received. The memory controller 22 reads out image data from the image memory 32 and routes the read-out image data via the image data bus 33 to the circuit corresponding to the destination of the acknowledge signal, or receives the image data outputted by the circuit to which it has sent the acknowledge signal and writes the image data in the image memory 32.
On reception of request signals from plural circuits, the memory controller 22 is able to select preferentially the circuit which has to perform its processing in real time. For example, if an image of an object is to be displayed on the finder 36, the memory controller 22 preferentially selects the input signal processor 21 and the NTSC/PAL encoder 23. It is also possible for the memory controller 22 to determine the bus occupation ratio of the image data on the image data bus 33 and to decide the priority sequence of the respective circuits depending on the occupation ratio.
If image data can be routed to the respective circuits within the transfer band limitation of the image data bus 33, it is possible for the memory controller 22 to perform control to send the acknowledge signal to the respective circuits time-divisionally to permit the respective circuits to perform pre-set processing. This enables the memory controller 22 to have access in real-time to data in the respective circuits to cause the image data from the respective circuits to be written in the image memory 32 or to cause the image data in the image memory 32 to be read out and sent to the respective circuits.
If, when the memory controller 22 has access to external circuitry, not shown, over the image data bus 33, the external circuitry can send the above-mentioned request signal and receive the transmitted acknowledge signal, the memory controller 22 can have access simultaneously and time-divisionally to the respective circuits within the signal processor 20 within the transfer band limitation range of the image data bus 33. That is, within the range of the band of the image data bus 33, the memory controller 22 can have time-divisional access to the circuits in the signal processor 20 and to the external circuitry without regard to the number of circuits within the signal processor 20 or of external circuits.
As mentioned above, the memory controller 22 performs arbitration of the image data bus 33, write/readout control of image data between the image memory 32 and the respective circuits and data transfer to the CPU bus 34.
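The request/acknowledge arbitration described above may be pictured, purely as a software sketch, as a priority-ordered selection among pending requesters; the circuit names, the numeric priorities and the one-grant-per-cycle model below are assumptions of the sketch, the actual arbitration being performed in hardware by the memory controller 22.

```python
# Sketch of priority-based request/acknowledge arbitration, loosely
# modelled on the memory controller 22. Circuit names, priority values
# and the one-grant-per-cycle model are assumptions of this sketch.

# Smaller number = higher priority (real-time display path first).
PRIORITY = {
    "input_signal_processor": 0,
    "ntsc_pal_encoder": 1,
    "resolution_converter": 2,
    "jpeg_codec": 3,
}

def arbitrate(pending_requests):
    """Return the requesting circuit to acknowledge next, or None."""
    if not pending_requests:
        return None
    return min(pending_requests, key=lambda name: PRIORITY[name])

if __name__ == "__main__":
    requests = {"jpeg_codec", "ntsc_pal_encoder", "resolution_converter"}
    while requests:
        granted = arbitrate(requests)
        print(f"acknowledge -> {granted}")
        requests.discard(granted)  # transfer served; request withdrawn
```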
The specific flow of image data in the signal processor 20 is explained with reference to
The input signal processor 21 includes a CCD interface 21a for performing pre-set signal processing on the image data from the image generating unit 10, a detection circuit 21b for performing detection processing on the image data from the CCD interface 21a, and a camera digital signal processor 21c (camera DSP 21c) for performing conversion processing on the image data.
The CCD interface 21a performs processing, such as digital clamp, white balance adjustment or gamma correction, on the image data made up of R, G and B from the S/H-A/D circuit 12.
From the image data of the CCD interface 21a, the detection circuit 21b performs detection for auto-focussing, auto-iris or white balance adjustment.
The camera DSP 21c converts the image data of R, G and B from the CCD interface 21a into image data made up of a luminance signal Y and chrominance signals Cb, Cr. The camera DSP 21c also has a simplified resolution conversion circuit 21d which converts the resolution of the image data in a simplified fashion.
The simplified resolution conversion circuit 21d operates for converting the resolution of the image data to lower values if the resolution of the image data generated by the CCD image sensor 11 is larger than, for example, the VGA format.
Specifically, the simplified resolution conversion circuit 21d includes a B-Y/R-Y separation circuit 61, for separating chrominance signals, a horizontal direction linear interpolation circuit 62 for interpolation in the horizontal direction, a B-Y/R-Y synthesis circuit 63 for synthesizing the chrominance signals, a 1H delay circuit 64 for delaying the respective signals by a horizontal scanning period (1H period), and a vertical direction linear interpolation circuit 65.
The B-Y/R-Y separation circuit 61 separates the chrominance signals B-Y and R-Y, as the chroma signals Cb, Cr, from the image data from the camera DSP 21c, and routes the separated chroma signals to the horizontal direction linear interpolation circuit 62. The horizontal direction linear interpolation circuit 62 interpolates the luminance signals Y and the chrominance signals B-Y, R-Y in the horizontal direction to lower the resolution in the horizontal direction, and routes the interpolated luminance signals Y and chrominance signals B-Y, R-Y to the B-Y/R-Y synthesis circuit 63.
The B-Y/R-Y synthesis circuit 63 synthesizes the chrominance signals B-Y, R-Y, and routes the luminance signals Y from the horizontal direction linear interpolation circuit 62 and the synthesized chrominance signals B-Y, R-Y to the 1H delay circuit 64 and to the vertical direction linear interpolation circuit 65. The 1H delay circuit 64 delays the luminance signals Y and the chrominance signals by 1H and routes the delayed signals to the vertical direction linear interpolation circuit 65. The vertical direction linear interpolation circuit 65 performs linear interpolation processing in the vertical direction, based on the luminance signals Y and the chrominance signals B-Y, R-Y from the B-Y/R-Y synthesis circuit 63 and the 1H delay circuit 64, to output image data made up of luminance signals Y′ and chrominance signals (B-Y)′, (R-Y)′ lowered in resolution in both the horizontal and vertical directions.
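The horizontal-then-vertical linear interpolation performed by the simplified resolution conversion circuit 21d can be sketched in software as a separable two-pass resampling; the plain-list representation, the resample_1d/resample_2d helpers and the 340×256→320×240 example sizes are assumptions of this sketch rather than the circuit's actual arithmetic.

```python
# Sketch of lowering resolution by linear interpolation, first in the
# horizontal and then in the vertical direction, as in the simplified
# resolution conversion circuit 21d. Plain Python lists stand in for the
# hardware line delays; this is not the circuit's actual arithmetic.

def resample_1d(line, new_len):
    """Linearly interpolate a 1-D sequence to `new_len` samples."""
    old_len = len(line)
    if new_len == 1:
        return [line[0]]
    out = []
    scale = (old_len - 1) / (new_len - 1)
    for i in range(new_len):
        pos = i * scale
        j = min(int(pos), old_len - 2)
        g = pos - j
        out.append((1 - g) * line[j] + g * line[j + 1])
    return out

def resample_2d(image, new_w, new_h):
    """Resize a 2-D list of samples: horizontal pass, then vertical pass."""
    horiz = [resample_1d(row, new_w) for row in image]
    cols = [resample_1d([row[x] for row in horiz], new_h) for x in range(new_w)]
    return [[cols[x][y] for x in range(new_w)] for y in range(new_h)]

if __name__ == "__main__":
    # Hypothetical 340x256 luminance plane reduced to 320x240.
    src = [[(x + y) % 256 for x in range(340)] for y in range(256)]
    dst = resample_2d(src, 320, 240)
    print(len(dst), len(dst[0]))  # 240 lines of 320 samples
```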
The resolution conversion circuit 28 performs resolution conversion processing of converting [p×q] image data into [m×n] image data. The resolution conversion circuit 28 performs processing for suppressing the resolution to a pre-set value if the image data produced in the CCD image sensor 11 are of high resolution. It is however possible to process the image data of low resolution into data of high resolution.
Referring to
When ready for converting the resolution of the image data, the resolution conversion circuit 28 outputs a read request signal requesting the memory controller 22 to read out image data from the image memory 32, while outputting a write request signal requesting the memory controller 22 to write the image data in the image memory 32 after the conversion processing of the image data. The resolution conversion circuit 28 also receives an acknowledge signal indicating that the memory controller 22 has responded to the request signal.
Referring to
Referring to
The first multiplier 84 multiplies the image data supplied from an input buffer 71 by a pre-set coefficient and routes the resulting data to the first adder 88. The second multiplier 85 multiplies the image data supplied from the first delay circuit 81 by a pre-set coefficient and routes the resulting data to the first adder 88. The third multiplier 86 multiplies the image data supplied from the second delay circuit 82 by a pre-set coefficient and routes the resulting data to the second adder 89. The fourth multiplier 87 multiplies the image data supplied from the third delay circuit 83 by a pre-set coefficient and routes the resulting data to the third adder 90. The first adder 88 synthesizes the image data and sends the resulting data to the second adder 89. The second adder 89 synthesizes the image data and sends the resulting data to the third adder 90. The third adder 90 synthesizes the respective image data and sends the resulting data, as image data converted in resolution in the horizontal direction, to the vertical direction buffer 74.
Thus, the horizontal direction transform processing circuit 73 weights plural image data, each delayed by one pixel, with pre-set weights and synthesizes the weighted image data to interpolate or decimate pixels in the horizontal direction, thereby converting the resolution in the horizontal direction.
The vertical direction buffer 74 is constituted by a serial connection of first to third buffers 91, 92, 93, each adapted to produce a one-line delay. Thus, the first buffer memory 91 outputs image data delayed by one line, while the second and third buffer memories 92, 93 output the image data delayed by two and three lines, respectively.
Referring to
The fifth multiplier 94 multiplies the image data supplied from the horizontal direction transform processing circuit 73 by a pre-set coefficient and routes the resulting data to the fourth adder 98. The sixth multiplier 95 multiplies the image data supplied from the first buffer memory 91 by a pre-set coefficient and routes the resulting data to the fourth adder 98. The seventh multiplier 96 multiplies the image data supplied from the second buffer memory 92 by a pre-set coefficient and routes the resulting data to the fifth adder 99. The eighth multiplier 97 multiplies the image data supplied from the third buffer memory 93 by a pre-set coefficient and routes the resulting data to the sixth adder 100. The fourth adder 98 synthesizes the image data and sends the resulting data to the fifth adder 99. The fifth adder 99 synthesizes the image data and sends the resulting data to the sixth adder 100. The sixth adder 100 synthesizes the respective image data to output the resulting data as image data converted in resolution in the vertical direction.
Thus, the vertical direction transform processing circuit 75 weights plural image data, each delayed by one line, with pre-set weights and synthesizes the weighted image data to interpolate or decimate pixels in the vertical direction, thereby converting the resolution in the vertical direction.
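The four-multiplier, three-adder structure of the transform processing circuits amounts to a four-tap weighted sum of neighbouring samples. The sketch below illustrates that structure with a common half-pel interpolation coefficient set and a simple 2× horizontal upsampling; the coefficients and the upsampling factor are assumptions of the sketch, not values given for the resolution conversion circuit 28.

```python
# Sketch of the four-tap structure: four delayed samples are multiplied
# by pre-set coefficients and summed by a chain of adders. The coefficient
# set (a common half-pel interpolation filter) and the 2x upsampling are
# assumptions of this sketch.

COEFFS = (-1 / 16, 9 / 16, 9 / 16, -1 / 16)  # weights sum to 1

def four_tap(samples, coeffs=COEFFS):
    """Weighted sum of four consecutive samples (multipliers + adders)."""
    return sum(s * c for s, c in zip(samples, coeffs))

def upsample_2x(line):
    """Double the sample count: keep originals, interpolate the midpoints."""
    out = []
    for i in range(len(line)):
        out.append(line[i])
        # Four neighbours around the midpoint, clamped at the line edges.
        taps = [line[max(0, min(len(line) - 1, i + k))] for k in (-1, 0, 1, 2)]
        out.append(four_tap(taps))
    return out

if __name__ == "__main__":
    print(upsample_2x([10, 20, 30, 40]))
```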
In
In the above-described embodiment, the first to third buffer memories 91 to 93 in the vertical direction buffer 74 are configured to store one line (1H) of image data. Alternatively, the first to third buffer memories 91 to 93 may be configured to store image data of less than one line, as shown in
Specifically, the memory controller 22 reads out the pixel data of a viewing screen stored in the image memory 32 in units of N pixels per line, line by line in the vertical direction. Referring to
Referring to
The memory controller 22 then reads out image data in a range defined by (N−1, 1), (N−1, q), (2N−2, q), (2N−2, 1), referred to below as the image data set (2). If the memory controller 22 reads out the image data set (1) and the image data set (2), it is tantamount to reading out the image data of the (N−1)st column and the Nth column twice.
The reason is that, since the vertical direction transform processing circuit 75 performs interpolation based on the surrounding pixels, the pixels stored at the leading end and the trailing end of the first to third buffer memories 91 to 93 are not the object of processing. For example, if the image data set (1) is read out, the pixel (N, 1) is not the object of the interpolation processing in the vertical direction. However, this pixel (N, 1) is read out when the image data set (2) is read out, and then becomes the object of interpolation processing.
In similar manner, the memory controller 22 reads out image data of N pixels in the horizontal direction every line so that image data of the last two columns of the directly previous image data set will be included. This routes the image data set to the resolution conversion circuit 28.
The vertical direction buffer 74 is fed with image data, in an amount corresponding to the capacity of the first to third buffer memories 91 to 93, on the line basis. Thus, image data offset by one line is stored in each of the first to third buffer memories 91 to 93. The vertical direction transform processing circuit 75 is thus able to perform the resolution conversion processing in the vertical direction based on the image data from the first to third buffer memories 91 to 93 of the vertical direction buffer 74.
Thus, by reading out image data in amounts matched to the capacity of the buffer memories, the memory controller 22 can cause the resolution conversion circuit 28 to execute the resolution conversion in the vertical direction even if the capacity of the buffer memories required for resolution conversion in the vertical direction is less than one line.
Although the read-out overlap between the image data sets is two columns here, the overlap may exceed two columns, or there may be no overlap. It is noted that the present invention is applicable to image signal processing, such as camera signal processing, without limitation to resolution conversion.
Although the foregoing description is directed to the embodiment in which the buffer memory is being used for interpolation for the vertical direction, the present invention is also applicable to an embodiment in which the buffer memory is being used for interpolation for the horizontal direction.
That is, the resolution conversion circuit 28 may perform resolution conversion in the horizontal direction using a horizontal direction buffer 72 comprised of a buffer memory 72a having a capacity of N pixels, as shown in
Thus, the memory controller 22 is able to read out image data from the image memory 32 so that resolution conversion processing in the horizontal and vertical directions will be effected with buffer memories each having a capacity of only N pixels. This enables the circuit scale of the horizontal direction buffer 72 and the vertical direction buffer 74 to be reduced to lower the production cost.
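The strip-wise readout with a two-column overlap can be sketched as follows; the strip width N, the two-column overlap and the list representation of the stored picture are parameters assumed only for this sketch.

```python
# Sketch of reading a stored picture out of memory in vertical strips of
# N columns, with the last two columns of each strip re-read at the start
# of the next strip, as described for the memory controller 22.

def read_strips(image, n, overlap=2):
    """Yield (start_column, strip) pairs; each strip is up to n columns wide."""
    width = len(image[0])
    start = 0
    while start < width:
        strip = [row[start:start + n] for row in image]
        yield start, strip
        if start + n >= width:
            break
        start += n - overlap  # re-read the last `overlap` columns

if __name__ == "__main__":
    # A tiny 4-line, 10-column picture; each cell holds its column index.
    image = [list(range(10)) for _ in range(4)]
    for start, strip in read_strips(image, n=5):
        print(f"columns {start}..{start + len(strip[0]) - 1}")
```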
The NTSC/PAL encoder 23, executing the encoding as described above, also has a simplified resolution conversion circuit 23a for increasing the resolution of the image data, if need be, before proceeding to encoding.
The simplified resolution conversion circuit 23a performs resolution conversion for matching to the display standard of the finder 36 if the resolution of the image data in the image memory 32 is lower than the resolution required for display.
Referring to
The line memory 101 stores image data from an input terminal In in an amount corresponding to one line, and sends the image data to the V-direction linear interpolation circuit 102 in the order in which it is stored. The V-direction linear interpolation circuit 102 weights the image data from the input terminal In and the image data from the line memory 101 with a pre-set weighting to perform linear interpolation in the vertical direction. The horizontal direction interpolation circuit 103 interpolates Y with an order-seven filter, while interpolating Cb and Cr with an order-three filter. This is simply interpolation for increasing the resolution by a factor of two. The horizontal direction interpolation circuit 103 outputs the image data at an output terminal Out.
For example, if image data inputted from the input terminal In is denoted a, image data read out from the line memory 101 is denoted b, a weighting coefficient is denoted g, where 0≤g≤1, and image data outputted by the V-direction linear interpolation circuit 102 is denoted c, the V-direction linear interpolation circuit 102 effectuates the following processing:
c=g*a+(1−g)*b.
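Purely as a numerical illustration with hypothetical sample values not taken from the embodiment: for a=200 (the newly inputted line), b=100 (the line delayed by the line memory 101) and g=0.75, the interpolated output is c=0.75×200+(1−0.75)×100=175, i.e., a value weighted three-quarters toward the newer line.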
The image data outputted at the output terminal Out is encoded by the NTSC/PAL encoder 23, as mentioned previously.
In the signal processing system, the digital still camera 1 is made up of so-called two chips, namely the signal processor 20 and the CPU 41. The respective signal processing circuits are therefore integrated in a single-chip configuration, so that the substrate surface area, and further the power consumption, can be made smaller than if the respective signal processing circuits were of separate chip configurations.
Also, since the signal processor 20 is not of a chip configuration inclusive of the CPU, signal processing can be effectuated adaptively even if the application handled by the CPU 41 is changed. That is, if the signal processor 20 were of a chip configuration inclusive of the CPU, it would be impossible to reconstruct the chip in case the application of the CPU is changed. In the present configuration, however, the signal processor 20 can perform the pre-set signal processing in combination with a CPU of a structure optimum for each application.
The digital still camera 1 of the above-described structure has a finder mode for confirming the status or the position of an object prior to image shooting, a recording mode for shooting the image of the object as confirmed, and a reproducing mode for confirming the shot state of the object image, and effects the processing depending on the prevailing mode.
In the finder mode, the user has to observe the state of the object indicated on the finder 36 before pushing a shutter button, not shown, to shoot the object. In this finder mode, the memory controller 22 and other circuits are controlled in the following manner. For illustrating the respective modes, reference is had mainly to
In the finder mode, the CCD image sensor 11 generates image signals with the vertical components thinned out to one-third, and furnishes the digitized image data via the S/H-A/D circuit 12 to the CCD interface 21a.
The CCD interface 21a performs signal processing in synchronism with clocks shown in
The camera DSP 21c performs data conversion processing on the decimated image data to convert it into image data made up of Y, Cb and Cr. The camera DSP 21c then lowers the resolution of the image data in the simplified resolution conversion circuit 21d (340×256→320×240) and routes the converted image data via the image data bus 33 to the memory controller 22.
It is noted that the simplified resolution conversion circuit 21d lowers the resolution in a simplified fashion to the extent necessary for subsequent processing. In this manner, if the image data generated by the CCD image sensor 11 is of high resolution, the transfer band taken up by this image data can be decreased to avoid congestion on the image data bus 33 and to maintain the real-time characteristics of the finder mode.
The memory controller 22 writes the image data in the image memory 32, while reading out the image data from the image memory 32 as shown in
The simplified resolution conversion circuit 23a performs resolution conversion of 320×240→640×240 or of 320×240→640×288 in the case of the NTSC system or the PAL system, respectively, and sends the converted image data to the NTSC/PAL encoder 23. The NTSC/PAL encoder 23 encodes the image data in accordance with the NTSC system or the PAL system, synthesizes it with the OSD data, and routes the result to the finder 36 shown in
Meanwhile, the NTSC/PAL encoder 23 converts the resolution so that data with low resolution will be increased in resolution, such that, if 320×200 image data is furnished, it is converted into 640×240 image data and into 640×288 image data for the NTSC system and for the PAL system, respectively.
In the digital still camera 1, the resolution of the image data generated by the CCD image sensor 11 is lowered in a simplified fashion in the finder mode to reduce the data volume, so that the image data will be within the bandwidth limitation of the image data bus 33 and so that the resolution will be increased at an output stage to the extent that is necessary for display, at a timing shown in
Thus, with the digital still camera 1, the image data is held within the bandwidth limitation of the image data bus 33 to permit the image of the object to be displayed on the finder 36, even if the image data is of high resolution, without the necessity of performing the time-consuming decimation processing.
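Purely as an order-of-magnitude illustration of this bandwidth saving (the 30 frames per second figure is taken from the embodiment, but the assumption of 2 bytes per pixel for 4:2:2 Y/Cb/Cr data is introduced here, and only the relative sizes matter):

```python
# Rough comparison of the data rate of full XGA frames with that of the
# decimated 320x240 finder-mode data. The 2-bytes-per-pixel figure
# (4:2:2 Y/Cb/Cr) is an assumption of this sketch.

BYTES_PER_PIXEL = 2
FRAMES_PER_SECOND = 30

def data_rate_mb_per_s(width, height):
    """Approximate data rate in megabytes per second."""
    return width * height * BYTES_PER_PIXEL * FRAMES_PER_SECOND / 1e6

if __name__ == "__main__":
    print(f"XGA (1024x768):   {data_rate_mb_per_s(1024, 768):6.1f} MB/s")
    print(f"Finder (320x240): {data_rate_mb_per_s(320, 240):6.1f} MB/s")
```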
If the circuits for preferential processing, namely the CCD interface 21a, the camera DSP 21c and the NTSC/PAL encoder 23, are previously set in the CPU 41, and signal processing is carried out time-divisionally in the other circuits, the processing of the circuits with high priority may be preferentially performed depending on the data volume of the image data.
If the data volume of the image data in the simplified resolution conversion circuit 21d is large, data processing may be performed at a high processing speed under control by the CPU 41, in order to give priority to real-time processing even though the picture quality is degraded to a certain extent. In this manner, high-speed processing can be effected in the finder mode even in the case of a large data volume of the image data generated in the image generating unit 10.
In the case of the digital still camera 1, having an electronic zooming function, the CPU 41 can control the respective circuits in the following manner.
The memory controller 22 causes the image data, supplied via the CCD interface 21a and the camera DSP 21c, to be written in the image memory 32, while causing the image data to be read out from the image memory 32 and routed to the resolution conversion circuit 28. The resolution conversion circuit 28 generates image data enlarged from a portion of the input image, by the electronic zooming function, and outputs the resulting image data to the image memory 32. This image data is read out from the image memory 32 and outputted to the finder 36 via the NTSC/PAL encoder 23. This generates electronically zoomed image data.
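An electronic zoom of this kind amounts to cropping a portion of the stored picture and enlarging it back to the display size. The sketch below uses nearest-neighbour enlargement, a 2× zoom factor and plain lists, all of which are simplifications assumed here; the resolution conversion circuit 28 would enlarge by interpolation instead.

```python
# Illustrative electronic zoom: crop the centre of a stored picture and
# enlarge it back to the original size. Nearest-neighbour enlargement and
# the 2x zoom factor are simplifications of this sketch.

def electronic_zoom(image, factor=2):
    h, w = len(image), len(image[0])
    ch, cw = h // factor, w // factor           # size of the cropped window
    top, left = (h - ch) // 2, (w - cw) // 2    # centre the window
    crop = [row[left:left + cw] for row in image[top:top + ch]]
    # Enlarge the crop back to w x h by repeating samples.
    return [[crop[y * ch // h][x * cw // w] for x in range(w)] for y in range(h)]

if __name__ == "__main__":
    image = [[x + 10 * y for x in range(8)] for y in range(8)]
    zoomed = electronic_zoom(image)
    print(len(zoomed), len(zoomed[0]))  # same size as the input: 8 8
```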
Since the finder mode gives utmost priority to the real-time characteristics, time-consuming processing is not executed by the respective circuits. However, the CPU 41 can be configured to cause the memory controller 22 and the other circuits to perform various processing operations within the range allowed by the transfer band of the image data bus 33.
For example, the memory controller 22 may be configured to read out image data from the image memory 32, in which is stored the image data furnished from the CCD interface 21a, and to furnish the read-out image data to the NTSC/PAL encoder 23 over the image data bus 33 and to the JPEG encoder/decoder 29. The finder 36 displays the image of the object in real-time, while the JPEG encoder/decoder 29 compresses the image data in accordance with the JPEG system.
The JPEG encoder/decoder 29 compresses/expands still images but cannot process high-pixel images in real time. It is thus possible for the JPEG encoder/decoder 29 to decimate the image data supplied from the image data bus 33 to a pre-set number of frames or fields before compression, or to slice a portion of the image to lower the resolution before compression. This enables frame-decimated still images, or low-resolution images, to be shot continuously.
The user observes the state of the object displayed on the finder 36 in the above-mentioned finder mode. If the object is decided to be shot, the user pushes a shutter button, not shown.
If the shutter button is pushed, the digital still camera 1 proceeds to the recording mode. In the recording mode, the CPU 41 controls the memory controller 22 or the respective circuits in the following manner to record the image of the as-shot object on a recording device 51.
The CCD image sensor 11 halts the decimation operation in synchronism with the pushing of the shutter button to generate image signals of the XGA format, and routes the digitized image data via the S/H-A/D circuit 12 to the CCD interface 21a.
The CCD interface 21a routes the image data furnished from the S/H-A/D circuit 12 not to the camera DSP 21c, but to the memory controller 22 via the image data bus 33. The memory controller 22 first writes the image data in the image memory 32 and subsequently reads out the image data to route the read-out image data via the image data bus 33 to the camera DSP 21c. The camera DSP 21c converts the image data made up of RGB into image data made up of Y, Cb and Cr.
The camera DSP 21c is fed with image data once written in the image memory 32. That is, the camera DSP 21c effects data conversion on the image data from the image memory 32 instead of on the image data directly supplied from the CCD interface 21a. Thus, it is unnecessary for the camera DSP 21c to perform high-speed data conversion; it is sufficient if the camera DSP 21c executes such processing when the image data bus 33 is not busy. Stated differently, it is unnecessary for the camera DSP 21c to perform the processing in real time, so that data conversion processing can be executed with priority given to high picture quality rather than to high processing speed, and the resulting converted image data may be routed to the memory controller 22 via the image data bus 33. The memory controller 22 causes the image data to be written in the image memory 32.
The memory controller 22 causes the image data to be read out from the image memory 32 to route the read-out image data to the JPEG encoder/decoder 29. The JPEG encoder/decoder 29 compresses the image data in accordance with the JPEG system to write the compressed image data in the recording device 51 shown in
If real-time processing is unnecessary, as during recording, the CPU 41 permits the pre-set processing to be executed after the image data has been written transiently in the image memory 32, so as to exploit the transfer band of the image data bus 33 to process the high-pixel image.
The CPU 41 records the image data of the XGA format directly in the recording device 51 in the recording mode. It is, however, also possible for the resolution conversion circuit 28 to convert the resolution of the image data before the image data is recorded on the recording device 51. Specifically, it is possible to cause the resolution conversion circuit 28 to convert the resolution of the image data read out from the image memory 32 via the memory controller 22 to the VGA format (1024×768→640×480), and to permit the JPEG encoder/decoder 29 to compress the image data and record the compressed data in the recording device 51.
If desiring to confirm the as-shot image after image shooting, the operator pushes a playback button, not shown, for reproducing the as-shot image.
If the reproducing button is pushed, the digital still camera 1 moves to the reproducing mode. In the reproducing mode, the CPU 41 controls the respective circuits in the following manner to read out the image data of the object.
That is, on detecting the pushing of the reproducing button, the CPU 41 reads out the image data from the recording device 51 and transiently stores the read-out image data in the DRAM 42 before routing the data via the CPU bus 34 to the JPEG encoder/decoder 29. The JPEG encoder/decoder 29 expands the image data read out from the recording device 51 in accordance with the JPEG system to produce image data of the XGA format, and routes the resulting image data via the image data bus 33 to the memory controller 22.
The memory controller 22 writes the image data in the image memory 32 and reads out the image data from the image memory 32 to send the read-out image data via the image data bus 33 to the resolution conversion circuit 28.
The resolution conversion circuit 28 effects resolution conversion so that the image data will be in keeping with the VGA format (1024×768→640×480 in the NTSC system and 1024×768→640×576 in the PAL system), and routes the converted image data over the image data bus 33 to the memory controller 22, which writes it in the image memory 32. The image data is then read out from the image memory 32 and routed via the NTSC/PAL encoder 23 to the finder 36. This displays an image corresponding to the image data recorded in the recording device 51 on the finder 36.
That is, since the image data recorded in the recording device 51 has high resolution, the CPU 41 first lowers the resolution and subsequently routes the image data to the finder 36.
It is also possible for the CPU 41 to set, for each of the finder mode, the recording mode and the reproducing mode, the order of priority of the circuits to be given preference, and to cause the pertinent circuits to execute the processing in accordance with the order of priority on transition to one of the modes. This enables the signal processing of image data to be executed efficiently depending on the processing contents of each mode.
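Such per-mode priority settings could be pictured as a simple table handed from the CPU 41 to the memory controller 22; the circuit names and the orderings below are assumptions chosen only to mirror the three modes discussed above.

```python
# Illustrative per-mode priority tables: the CPU 41 sets, for each mode,
# which circuits the memory controller 22 serves first. The circuit names
# and orderings are assumptions of this sketch.

MODE_PRIORITY = {
    # Finder mode: the real-time display path comes first.
    "finder":      ["ccd_interface", "camera_dsp", "ntsc_pal_encoder",
                    "resolution_converter", "jpeg_codec"],
    # Recording mode: capture and compression first, display last.
    "recording":   ["ccd_interface", "camera_dsp", "jpeg_codec",
                    "resolution_converter", "ntsc_pal_encoder"],
    # Reproducing mode: expansion and the display path first.
    "reproducing": ["jpeg_codec", "resolution_converter",
                    "ntsc_pal_encoder", "ccd_interface", "camera_dsp"],
}

def pick(mode, pending_requests):
    """Return the pending requester that ranks highest in this mode."""
    return min(pending_requests, key=MODE_PRIORITY[mode].index)

if __name__ == "__main__":
    print(pick("finder", {"jpeg_codec", "ntsc_pal_encoder"}))     # encoder first
    print(pick("recording", {"jpeg_codec", "ntsc_pal_encoder"}))  # jpeg first
```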
In the above-described embodiment, it is assumed that the data being processed is the image data equivalent to XGA. It is to be noted that the present invention is not limited to this embodiment and can be applied to, for example, the processing of image data comprised of one million or more pixels.
This is a continuation of application Ser. No. 12/894,622, filed Sep. 30, 2010, which is a continuation of application Ser. No. 12/079,129, filed Mar. 28, 2008, which is a continuation of application Ser. No. 10/668,904, filed Sep. 23, 2003, which is a divisional of application Ser. No. 09/354,476, now U.S. Pat. No. 6,674,464, which is entitled to the priority filing date of Japanese application(s) 10-204089 and 10-333965, filed Jul. 17, 1998 and Nov. 25, 1998, respectively, the entirety of which is incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
3971065 | Bayer | Jul 1976 | A |
4131919 | Lloyd et al. | Dec 1978 | A |
4323916 | Dischert et al. | Apr 1982 | A |
4338492 | Snopko | Jul 1982 | A |
4356509 | Skerlos et al. | Oct 1982 | A |
4412252 | Moore et al. | Oct 1983 | A |
4456925 | Skerlos et al. | Jun 1984 | A |
4456931 | Toyoda et al. | Jun 1984 | A |
4460928 | Kishimoto | Jul 1984 | A |
4468755 | Iida | Aug 1984 | A |
4475131 | Nishizawa et al. | Oct 1984 | A |
4479143 | Watanabe et al. | Oct 1984 | A |
4489351 | D'Alayer de Costemore d'Arc | Dec 1984 | A |
4502788 | Lowden | Mar 1985 | A |
4533952 | Norman, III | Aug 1985 | A |
4541010 | Alston | Sep 1985 | A |
4546390 | Konishi et al. | Oct 1985 | A |
4605956 | Cok | Aug 1986 | A |
4623922 | Wischermann | Nov 1986 | A |
4642678 | Cok | Feb 1987 | A |
4647975 | Alston et al. | Mar 1987 | A |
4691253 | Silver | Sep 1987 | A |
4714963 | Vogel | Dec 1987 | A |
4730222 | Schauffele | Mar 1988 | A |
4740828 | Kinoshita | Apr 1988 | A |
4743959 | Frederiksen | May 1988 | A |
4746980 | Petersen | May 1988 | A |
4746988 | Nutting et al. | May 1988 | A |
4746993 | Tada | May 1988 | A |
4754333 | Nara | Jun 1988 | A |
4757384 | Nonweiler et al. | Jul 1988 | A |
4758881 | Laspada | Jul 1988 | A |
4758883 | Kawahara et al. | Jul 1988 | A |
4764805 | Rabbani et al. | Aug 1988 | A |
4771279 | Hannah | Sep 1988 | A |
4772956 | Roche et al. | Sep 1988 | A |
4774562 | Chen et al. | Sep 1988 | A |
4774574 | Daly et al. | Sep 1988 | A |
4774581 | Shiratsuchi | Sep 1988 | A |
4774587 | Schmitt | Sep 1988 | A |
4779135 | Judd | Oct 1988 | A |
4780761 | Daly et al. | Oct 1988 | A |
4782399 | Sato | Nov 1988 | A |
4791741 | Kondo | Dec 1988 | A |
4792856 | Shiratsuchi | Dec 1988 | A |
4803554 | Pape | Feb 1989 | A |
4814905 | Hashimoto | Mar 1989 | A |
4819059 | Pape | Apr 1989 | A |
4821121 | Beaulier | Apr 1989 | A |
4825301 | Pape et al. | Apr 1989 | A |
4837614 | Omi | Jun 1989 | A |
4837628 | Sasaki | Jun 1989 | A |
4855724 | Yang | Aug 1989 | A |
4876590 | Parulski | Oct 1989 | A |
4881127 | Isoguchi et al. | Nov 1989 | A |
4905079 | Hayashi | Feb 1990 | A |
4918523 | Simon et al. | Apr 1990 | A |
4928137 | Kinoshita | May 1990 | A |
4931250 | Greszczuk | Jun 1990 | A |
4979028 | Minematsu et al. | Dec 1990 | A |
5012328 | Ishiguro | Apr 1991 | A |
5015854 | Shigyo et al. | May 1991 | A |
5016107 | Sasson et al. | May 1991 | A |
5018017 | Sasaki et al. | May 1991 | A |
5032927 | Watanabe et al. | Jul 1991 | A |
5034804 | Sasaki et al. | Jul 1991 | A |
5038202 | Ooishi et al. | Aug 1991 | A |
5040068 | Parulski et al. | Aug 1991 | A |
5045327 | Tarlow et al. | Sep 1991 | A |
5067019 | Juday et al. | Nov 1991 | A |
5070406 | Kinoshita | Dec 1991 | A |
5073926 | Suzuki et al. | Dec 1991 | A |
5073927 | Grube | Dec 1991 | A |
5097518 | Scott et al. | Mar 1992 | A |
5111283 | Nagasawa et al. | May 1992 | A |
5113455 | Scott | May 1992 | A |
5124692 | Sasson | Jun 1992 | A |
5125045 | Murakami et al. | Jun 1992 | A |
5128776 | Scorse et al. | Jul 1992 | A |
5136628 | Araki et al. | Aug 1992 | A |
5138454 | Parulski | Aug 1992 | A |
5138459 | Roberts et al. | Aug 1992 | A |
5153730 | Nagasaki et al. | Oct 1992 | A |
5164831 | Kuchta et al. | Nov 1992 | A |
5164834 | Fukuda et al. | Nov 1992 | A |
5164980 | Bush et al. | Nov 1992 | A |
5173779 | Lee | Dec 1992 | A |
5177614 | Kawaoka et al. | Jan 1993 | A |
5189511 | Parulski et al. | Feb 1993 | A |
5194955 | Yoneta et al. | Mar 1993 | A |
5218452 | Kondo et al. | Jun 1993 | A |
5218457 | Burkhardt et al. | Jun 1993 | A |
5226114 | Martinez et al. | Jul 1993 | A |
5226145 | Moronaga et al. | Jul 1993 | A |
5237420 | Hayashi | Aug 1993 | A |
5251019 | Moorman et al. | Oct 1993 | A |
5251036 | Kawaoka et al. | Oct 1993 | A |
5262871 | Wilder et al. | Nov 1993 | A |
5264939 | Chang | Nov 1993 | A |
5264944 | Takemura | Nov 1993 | A |
5272526 | Yoneta et al. | Dec 1993 | A |
5280343 | Sullivan | Jan 1994 | A |
5293225 | Nishiyama et al. | Mar 1994 | A |
5293232 | Seki et al. | Mar 1994 | A |
5293236 | Adachi et al. | Mar 1994 | A |
5293252 | Kim et al. | Mar 1994 | A |
5295077 | Fukuoka | Mar 1994 | A |
5309528 | Rosen et al. | May 1994 | A |
RE34654 | Yamawaki | Jul 1994 | E |
5327486 | Wolff et al. | Jul 1994 | A |
5331551 | Tsuruoka et al. | Jul 1994 | A |
5335016 | Nakagawa | Aug 1994 | A |
5341153 | Benzschawel et al. | Aug 1994 | A |
5359698 | Goldberg et al. | Oct 1994 | A |
5363203 | Tahara et al. | Nov 1994 | A |
5367332 | Kerns et al. | Nov 1994 | A |
5379069 | Tani | Jan 1995 | A |
5382976 | Hibbard | Jan 1995 | A |
5396290 | Kannegundla et al. | Mar 1995 | A |
5402518 | Lowery | Mar 1995 | A |
5418565 | Smith | May 1995 | A |
5420636 | Kojima | May 1995 | A |
5420637 | Zeevi et al. | May 1995 | A |
5420641 | Tsuchida | May 1995 | A |
5428389 | Ito et al. | Jun 1995 | A |
5440343 | Parulski et al. | Aug 1995 | A |
5442686 | Wada et al. | Aug 1995 | A |
5444482 | Misawa et al. | Aug 1995 | A |
5452017 | Hickman | Sep 1995 | A |
5473370 | Moronaga et al. | Dec 1995 | A |
5479206 | Ueno et al. | Dec 1995 | A |
5491837 | Haartsen | Feb 1996 | A |
5493335 | Parulski et al. | Feb 1996 | A |
5497193 | Mitsuhashi et al. | Mar 1996 | A |
5499316 | Sudoh et al. | Mar 1996 | A |
5528740 | Hill et al. | Jun 1996 | A |
5533013 | Leppanen | Jul 1996 | A |
5539455 | Makioka | Jul 1996 | A |
5546447 | Skarbo et al. | Aug 1996 | A |
5550646 | Hassan et al. | Aug 1996 | A |
5559868 | Blonder | Sep 1996 | A |
5576757 | Roberts et al. | Nov 1996 | A |
5576758 | Arai et al. | Nov 1996 | A |
5581301 | Ninomiya | Dec 1996 | A |
5612732 | Yuyama et al. | Mar 1997 | A |
5657246 | Hogan et al. | Aug 1997 | A |
5687157 | Imai et al. | Nov 1997 | A |
5696815 | Smyk | Dec 1997 | A |
5717496 | Satoh et al. | Feb 1998 | A |
H1714 | Partridge, III | Mar 1998 | H |
5734427 | Hayashi et al. | Mar 1998 | A |
5739868 | Butler et al. | Apr 1998 | A |
5751350 | Tanaka | May 1998 | A |
5754636 | Bayless et al. | May 1998 | A |
5761279 | Bierman et al. | Jun 1998 | A |
5771451 | Takai et al. | Jun 1998 | A |
5774863 | Okano et al. | Jun 1998 | A |
5787399 | Lee et al. | Jul 1998 | A |
5805677 | Ferry et al. | Sep 1998 | A |
5828406 | Parulski et al. | Oct 1998 | A |
5845212 | Tanaka | Dec 1998 | A |
5848384 | Hollier et al. | Dec 1998 | A |
5903871 | Terui et al. | May 1999 | A |
5907604 | Hsu | May 1999 | A |
5923816 | Ueda | Jul 1999 | A |
5926218 | Smith | Jul 1999 | A |
5940743 | Sunay et al. | Aug 1999 | A |
5960035 | Sridhar et al. | Sep 1999 | A |
5960347 | Ozluturk | Sep 1999 | A |
6021158 | Schurr et al. | Feb 2000 | A |
6055427 | Ojaniemi | Apr 2000 | A |
6055500 | Terui et al. | Apr 2000 | A |
6070075 | Kim | May 2000 | A |
6073025 | Chheda et al. | Jun 2000 | A |
6084633 | Gouhara et al. | Jul 2000 | A |
6097430 | Komiya et al. | Aug 2000 | A |
6144411 | Kobayashi et al. | Nov 2000 | A |
6177956 | Anderson et al. | Jan 2001 | B1 |
6236676 | Shaffer et al. | May 2001 | B1 |
6263205 | Yamaura et al. | Jul 2001 | B1 |
6292218 | Parulski et al. | Sep 2001 | B1 |
6311092 | Yamada | Oct 2001 | B1 |
6339612 | Stewart et al. | Jan 2002 | B1 |
6342921 | Yamaguchi et al. | Jan 2002 | B1 |
6393216 | Ootsuka et al. | May 2002 | B1 |
6487366 | Morimoto et al. | Nov 2002 | B1 |
6496222 | Roberts et al. | Dec 2002 | B1 |
6496224 | Ueno | Dec 2002 | B2 |
6507611 | Imai et al. | Jan 2003 | B1 |
6518999 | Miyamoto | Feb 2003 | B1 |
6529236 | Watanabe | Mar 2003 | B1 |
6559889 | Tanaka | May 2003 | B2 |
6564070 | Nagamine et al. | May 2003 | B1 |
6639626 | Kubo et al. | Oct 2003 | B1 |
6639627 | Takezawa et al. | Oct 2003 | B1 |
6657658 | Takemura | Dec 2003 | B2 |
6674464 | Mizutani et al. | Jan 2004 | B1 |
6674732 | Boehnke et al. | Jan 2004 | B1 |
7589779 | Lyons et al. | Sep 2009 | B2 |
7839447 | Mizutani et al. | Nov 2010 | B2 |
8089527 | Tomita | Jan 2012 | B2 |
20020015447 | Zhou | Feb 2002 | A1 |
20030147634 | Takezawa et al. | Aug 2003 | A1 |
20050041116 | Tsukioka | Feb 2005 | A1 |
20090295945 | Watanabe et al. | Dec 2009 | A1 |
Number | Date | Country |
---|---|---|
1126920 | Jul 1996 | CN |
1151081 | Jun 1997 | CN |
1180273 | Apr 1998 | CN |
0129122 | Dec 1984 | EP |
0202009 | Nov 1986 | EP |
0202009 | Nov 1986 | EP |
212784 | Mar 1987 | EP |
212784 | Mar 1987 | EP |
0323194 | Jul 1989 | EP |
0323194 | Jul 1989 | EP |
0360615 | Mar 1990 | EP |
0400906 | Dec 1990 | EP |
0405491 | Jan 1991 | EP |
0456369 | Nov 1991 | EP |
0533107 | Mar 1993 | EP |
0667726 | Aug 1995 | EP |
0671819 | Sep 1995 | EP |
0726659 | Aug 1996 | EP |
0778566 | Jun 1997 | EP |
204626 | Jan 1923 | GB |
289944 | Feb 1927 | GB |
2089169 | Jun 1982 | GB |
60-136481 | Jul 1985 | JP |
61-253982 | Nov 1986 | JP |
61-264880 | Nov 1986 | JP |
62-108678 | May 1987 | JP |
62-185490 | Aug 1987 | JP |
63-064485 | Mar 1988 | JP |
63-128879 | Jun 1988 | JP |
63-141485 | Jun 1988 | JP |
63-286078 | Nov 1988 | JP |
64-10784 | Jan 1989 | JP |
64-010784 | Jan 1989 | JP |
64-013877 | Jan 1989 | JP |
64-051786 | Mar 1989 | JP |
01-221985 | Sep 1989 | JP |
01-221989 | Sep 1989 | JP |
01-243686 | Sep 1989 | JP |
01-288186 | Nov 1989 | JP |
02-007680 | Jan 1990 | JP |
02-76385 | Mar 1990 | JP |
02-104078 | Apr 1990 | JP |
02-105686 | Apr 1990 | JP |
02-105786 | Apr 1990 | JP |
02-113683 | Apr 1990 | JP |
02-202792 | Aug 1990 | JP |
02-214271 | Aug 1990 | JP |
02-222383 | Sep 1990 | JP |
02-226984 | Sep 1990 | JP |
02-249371 | Oct 1990 | JP |
02-277385 | Nov 1990 | JP |
02-292962 | Dec 1990 | JP |
03-001681 | Jan 1991 | JP |
03-500119 | Jan 1991 | JP |
03-042973 | Feb 1991 | JP |
03-088488 | Apr 1991 | JP |
03-143084 | Jun 1991 | JP |
03-184482 | Aug 1991 | JP |
03-234182 | Oct 1991 | JP |
03-240384 | Oct 1991 | JP |
03-252282 | Nov 1991 | JP |
03-268583 | Nov 1991 | JP |
03-284079 | Dec 1991 | JP |
04-035181 | Feb 1992 | JP |
04-142892 | May 1992 | JP |
04-170176 | Jun 1992 | JP |
04-170879 | Jun 1992 | JP |
04-213970 | Aug 1992 | JP |
04-239279 | Aug 1992 | JP |
04-315356 | Nov 1992 | JP |
04-319893 | Nov 1992 | JP |
04-324778 | Nov 1992 | JP |
04-348685 | Dec 1992 | JP |
05-049000 | Feb 1993 | JP |
05-122574 | May 1993 | JP |
05-167908 | Jul 1993 | JP |
06-022189 | Jan 1994 | JP |
06-110107 | Apr 1994 | JP |
06-189256 | Jul 1994 | JP |
06-237431 | Aug 1994 | JP |
07-264489 | Oct 1995 | JP |
07-312714 | Nov 1995 | JP |
08-125957 | May 1996 | JP |
2526825 | Jun 1996 | JP |
09-098379 | Apr 1997 | JP |
09-127989 | May 1997 | JP |
09-247543 | Sep 1997 | JP |
10-098642 | Apr 1998 | JP |
10-108133 | Apr 1998 | JP |
2000-092349 | Mar 2000 | JP |
2000-092365 | Mar 2000 | JP |
WO 8912939 | Dec 1989 | WO |
WO 9114334 | Sep 1991 | WO |
WO 9213423 | Aug 1992 | WO |
WO 9301685 | Jan 1993 | WO |
WO 9512459 | May 1995 | WO |
Entry |
---|
Rochelle Q&A Caller ID FAQ, Sep. 9, 1996, Rochelle Communications, Inc. Austin, TX. |
“CallAudit Awarded Product of the Year 1995,” Aug. 8, 1996. |
Fuyun Ling, et al., “Behavior and Performance of Power Controlled IS-95 Reverse Link Under Soft Handoff”, May 4, 1997, IEEE, pp. 924-928. |
Apple QuickTake 100: User's Guide for Macintosh (1994), Apple QuickTake Camera, Associated Press NC-2000. |
Basu et al., Variable Resolution Teleconferencing, in Systems, Man, and Cybernetics 170 (1993). |
Casio Press Release, Nov. 14, 1994, Announcement: LCD Digital Camera in a Compact Size (Amended Feb. 1995). |
Daniel & Sally Grotta, Digital Photography, Popular Science at 62 (Sep. 1992). |
DCS200, pp. 7-10 (Dec. 1992). |
Electric Eye, pp. 98-99 (Dec. 1989). |
ES-30TW, Imaging Technology, pp. 115-120 (Mar. 1990). |
F. Izawa et al., Memory Card Camera and Player, vol. 46, No. 2, pp. 113-117 (1991). |
F. Izawa, M. Sasaki, et al., Digital Still Video Camera Using Semiconductor Memory Card, 1990 IEEE Transactions on Consumer Electronics, vol. 36, No. 1 (1990). |
Fujix DS-X Memory Card Camera User's Manual. |
Fujix, Memory Card Camera DS-X Advertisement. |
Fukuoka, Motion Picture Recording Reproducing by an Electronic Still Camera, Electronics, pp. 7-11 (1993). |
Fumiyoshi Itoh et al., Digital Card Camera “VMC-1”, ITEJ Technical Report, vol. 15, No. 7, pp. 13-18 (Jan. 1991). |
Gregory Wallace, Overview of the JPEG Still Image Compression Standard, 1244 SPIE Image Processing Algorithms and Techniques 220, 220-33 (1990). |
Haruhiko Murata et al., The Development of Compact Electronic Still Camera, ITEJ Technical Report, vol. 12, No. 57, pp. 31-36 (Dec. 1988). |
Hiroyoshi Fujimori et al., Digital Card Camera, ITEJ Technical Report, vol. 14, No. 5, pp. 7-12 (1990). |
Hiroyuki Suetaka, LCD Digital Camera QV-10, ITE Technical Report, vol. 18, No. 45, pp. 13-14 (Sep. 1995). |
Hisashi Niwa, Digital Still Camera with Pixel-Adaptive DPCM Data Compression, ITEC, pp. 15-16 (Jan. 1991). |
IC Card Camera System—Toshiba & Fuji Photo Film, Technical Report, DEMPA Daily, Mar. 30, 1989. |
1989 IEEE International Conference on Consumer Electronics, Digest of Technical Papers (Jun. 1989). |
Imaging Promenade, No. 47, pp. 102-105 (Sep. 1993). |
Imaging Technology, pp. 101-106 (Nov. 1987). |
Isaac Shenberg, et al., An Image Compression Chip Set for Digital Still Cameras and Peripherals, Electronic Imaging International '91 Convention pp. 439-447 (1991). |
John J. Larish, Electronic Photography (1990). |
Kazunori Ohnishi et al., Electronic Still-Picture Camera Using Magnetic Bubble Memory, IEEE Transactions on Consumer Electronics, vol. 28, No. 3, pp. 321-324 (1982). |
Kazuo Shiozawa, Current Situation and Outlook on Still Video Camera, pp. 601-604 (1989). |
Kenneth A. Parulski, Color Filters and Processing Alternatives for One-Chip Cameras, IEEE Transactions on Electron Devices, vol. 32, No. 8, pp. 1381-1389 (1985). |
Lionel J. D'Luna & Kenneth Parulski, A Systems Approach to Custom VLSI for a Digital Color Imaging System, IEEE Journal of Solid-State Circuits, vol. 26, No. 5, pp. 727-737 (1991). |
M. Sasaki & S. Yamaguchi, Signal Processing Technologies for Digital Still Camera System, Toshiba Review, vol. 46, No. 2, pp. 125-128 (1991). |
M.C. Malin et al., Design and Development of the Mars Observer Camera, Int'l J. of Imaging Systems and Technology, vol. 3, pp. 76-91 (1991). |
Masaaki Nlayama, Single-Hand Movie. [Brondby] (NV-S1), Japan Society for Fuzzy Theory and Systems, pp. 51-55 (Feb. 1991). |
Masaki Nakagawa et al., DCT-Based Still Image Compression ICS With Bit-Rate Control, IEEE Transactions on Consumer Electronics, vol. 38, No. 3, pp. 711-717 (1992). |
Masami Suzuki et al., Standard Subscriber Line Compatible Color Videophone, 1998 IEEE, p. 759. |
Masani Shimoda, Development and Future of MAVICA, pp. 597-600 (1989). |
Michael Kriss, Kenneth Parulski, & David Lewis, Critical Technologies for Electronic Still Imaging Systems, 1082 SPIE Applications of Electronic Imaging pp. 157-184 (1989). |
Mikio Takemae et al., Nikon Still Video Camera System, ITEJ Technical Report, vol. 12, No. 54, pp. 7-12 (Nov. 1989). |
Mikio Watanabe et al., A Bit Rate Controlled DCT Algorithm for Digital Still Camera, 1244 SPIE Image Processing Algorithms and Techniques 234, 234-39 (1990). |
Minoru Sasaki et al., Toshiba Digital Camera-Picture Coding for Digital Still Camera, J. of Institute of Television Engineers of Japan, vol. 46, No. 3, pp. 300-307 (Mar. 1992). |
Minoru Sasaki, et al., Digital Electronic Still Camera System, ITEJ Technical Report, vol. 13, No. 22, pp. 17-22 (1989). |
N. Kihara, et al., The Electronic Still Camera: A New Concept in Photography, IEEE Transactions on Consumer Electronics, vol. 28, No. 3, pp. 325-331 (1982). |
News Flash, pp. 107-109 (Jul. 1987). |
Olympus VC-1100 Manual, pp. 56-113. |
Paik et al., Combined Digital Zooming and Digital Effects System Utilizing CCD Signal Characteristics, IEEE Transactions on Consumer Electronics, vol. 39, No. 3 (Aug. 1993). |
Pulnix Technical Note No. TH-1060, Jul. 9, 1990. |
Robert Chapman Wood, Photos Go Electronic, High Technology Business, at 15. |
Robert Terry Gray, Multispectral Data Compression Using Staggered Detector Arrays, PhD Dissertation, University of Arizona (1983). |
S. Okada, et al., A Single Chip Motion JPEG Codec LSI, 1997 IEEE Custom Integrated Circuits Conference, pp. 233-236. |
S. Tsuruta et al., Color Pixel Arrangement Evaluation for LC-TV, Proceedings, 1985 International Display Research Conference, pp. 24-26 (1985). |
Shigeharu Ochi et al., Fujix DS-1P Card Camera, ITEJ Technical Report, vol. 13, No. 22, pp. 11-16 (Mar. 1989). |
Shigeharu Ochi, et al., Development of the “DS-1P” Memory Card Camera, Fuji Film Research & Development, No. 35, pp. 52-57 (1990). |
Shin Ohno, Electronic Photography, pp. 45-51 (Aug. 1993). |
Shin Ohno, Still Image Communication: Trend of Electronic Photography, ITEJ Technical Report, vol. 15, No. 71, pp. 19-24 (Nov. 1991). |
Stuart M. Dambrot, Battle for Lead in Electronic Photography Intensifies, Electronics, vol. 65, No. 13 (Oct. 12, 1992). |
Sugaya, Complete Analysis of Sharp's Liquid Crystal Viewcam, Electronic Engineering, vol. 35, No. 1 (1993). |
Sumihasa Hashiguchi, Electronic Still Cameras (1989). |
Takaaki Suyama et al., Memory Card Camera and Peripheral Equipments, ITEJ Technical Report, vol. 15, No. 7, pp. 19-24 (Jan. 1991). |
Takuya Imaide et al., A Multimedia Color Camera Providing Multi-format Digital Images, IEEE Transactions on Consumer Electronics, vol. 39, No. 3, pp. 467-473 (1993). |
Timothy J. Tredwell, Sensors and Signal Processing for Digital Electronic Photography, in Optoelectronics-Devices and Technologies, vol. 6, No. 2, pp. 287-300. |
Toshinori Morikawa et al., Single-Hand Operated Camera Recorder NV-S1, National Technical Report, vol. 37, No. 3 (Jun. 1991). |
VS-101, Imaging Technology, pp. 106-110 (Jun. 1989). |
Wesley R. Iversen, Digital Photography: All-Digital Camera Technology Is Inching Electronic Photography Closer to the Commercial Mainstream, Computer Graphics World, vol. 15, No. 2 (Feb. 1992). |
William B. Pennebaker & Joan L. Mitchell, JPEG: Still Image Data Compression Standard (1993). |
William K. Pratt, Digital Image Processing (1978). |
Yamakawa et al., Development of a Field Sequential Color View Finder for Color Video Cameras, IEEE Transactions on Consumer Electronics, vol. 39, No. 3 (Aug. 1993). |
Yasuo Itoh, et al., Nonvolatile Memories, Digest of Technical Papers, in 1989 IEEE International Solid State Circuit Conference, pp. 134-135, 314. |
Yoshinori Takizawa et al., Low-Cost Digital Electronic Still Cameras for Computer Imaging, IEEE Conference Paper, pp. 156-157 (1994). |
CDCA-2-10CV9956, Complaint for Patent Infringement filed on Dec. 28, 2010 with the U.S. District Court, Central District of California. Sony Corporation, a Japanese corporation (Plaintiff) v. LG Electronics U.S.A., Inc., a Delaware corporation, and LG Electronics Mobilecomm U.S.A., Inc., a California corporation (Defendants). |
Complaint filed in ITC: 337-TA-758: Certain Mobile Telephones. |
Number | Date | Country | |
---|---|---|---|
20120274821 A1 | Nov 2012 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 09354476 | Jul 1999 | US |
Child | 10668904 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 12894622 | Sep 2010 | US |
Child | 13546026 | US | |
Parent | 12079129 | Mar 2008 | US |
Child | 12894622 | US | |
Parent | 10668904 | Sep 2003 | US |
Child | 12079129 | US |