This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2009-113931, filed on May 8, 2009, the entire contents of which are incorporated herein by reference.
Embodiments described herein relate to an image processing system and method that captures images of surrounding parts of an object using a plurality of image capture apparatuses, combines the captured images, and displays a resultant image on a display apparatus.
There has been proposed an apparatus in which a plurality of cameras mounted on a vehicle capture images of the surroundings of the vehicle that correspond to the driver's blind spots, and the captured images are displayed on a display apparatus in the vehicle to assist the driver in driving safely.
However, when the resolution of each camera is increased in order to display a higher-definition image, the amount of data transmitted from the cameras to an image processing apparatus that processes the captured images increases significantly. Likewise, if the number of cameras mounted on the vehicle is increased, the amount of data transmitted from the cameras to the image processing apparatus increases significantly. Accordingly, in some cases, the amount of data that can be transmitted is limited by the bandwidth of the transmission path connecting each camera to the image processing apparatus.
Japanese Laid-open Patent Application Publication Nos. 2000-83193 and 10-136345 discuss techniques for reducing the amount of data transmitted from an image capture device or a camera control device to an image receiving device. In the technique discussed in Japanese Laid-open Patent Application Publication No. 2000-83193, the image receiving device generates layout information regarding an image to be generated and transmits the information to the image capture devices. Each image capture device crops its captured image data in accordance with the received layout information and transmits the resultant image data to the image receiving device. In the technique discussed in Japanese Laid-open Patent Application Publication No. 10-136345, each camera control device detects the image capturing direction and zoom magnification of a camera and transforms an image captured through the camera into an image with a desired resolution at a desired frame rate. The resultant image is transmitted from the camera control device to a terminal. The terminal combines the images transmitted from the camera control devices and displays the resultant image on a monitor.
According to an embodiment of the invention, an image processing system includes a display apparatus that displays image data; a plurality of image capture units that are mounted on a vehicle and capture images of the surroundings of the vehicle; and an image processing apparatus that is connected with the plurality of image capture units via an in-vehicle network and with the display apparatus, and generates combined image data based on a plurality of image data captured by the image capture units. The plurality of image capture units each include a camera capturing image data of one of the surrounding parts of the vehicle, and a storage unit storing segment information for identifying each of the segments divided from the image data captured by the camera, together with importance degrees calculated for the segments based on the resolutions required of the segments in the image data upon generation of the combined image data.
According to an embodiment, each image capture unit further includes an image compressing unit that compresses each of the segments of the image data at a compression ratio depending on the importance degree of the segment to generate compressed image data, and a transmitting unit that transmits the compressed image data to the image processing apparatus through the network. The image processing apparatus includes a network interface unit that inputs the compressed image data transmitted from each of the image capture units, an image generating unit that generates combined image data based on the compressed image data, and a transmitting unit that transmits the combined image data to the display apparatus. The display apparatus includes a receiving unit that receives the combined image data transmitted from the image processing apparatus, and a display unit that displays the combined image data.
Aspects and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention, as claimed.
Additional aspects and/or advantages will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the invention.
Reference will now be made in detail to the embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout. The embodiments are described below to explain the present invention by referring to the figures.
By switching the pattern of the combined image generated by the image processing apparatus to another one in accordance with the driver's driving operation, such as turning right or left, changing lanes, or parking in a garage, the driver's convenience can be increased. The portion to be used in an image captured through each camera, and the resolution of the necessary image, differ depending on the combined image pattern. For example, the road area per pixel of an image is small in a location 1002 close to a camera 1001, as shown in
Accordingly, in order to generate different types of high-definition combined images while reducing the amount of data transmitted from each image capture device, such as a camera, to the image processing apparatus, it is necessary to adjust the resolutions of the segments to be used in the combined images in an image captured through each camera, and to transmit the resultant image from the camera to the image processing apparatus.
An embodiment will be described below with reference to the accompanying drawings.
Referring to
The image capture apparatus 100 will now be described in detail with reference to
The camera 111 captures an image of surroundings of a vehicle and outputs captured image data to the image compressing unit 112.
The image compressing unit 112 functions as an image compressing component that compresses image data captured through the camera 111 at a compression ratio specified through the compression ratio control unit 114. The image compressing unit 112 also functions as a transmitting component that outputs the compressed image data to the network I/F unit 113.
The segment importance degree pattern storage unit 115 is a storage component that stores segment importance degree pattern data indicating importance degrees of a plurality of segments divided from image data.
Referring to
The compression ratio control unit 114 accepts an instruction specifying a generation pattern for combined image data from the control unit 230 of the image processing apparatus 200 through the network I/F unit 113. In response to the instruction from the control unit 230, the compression ratio control unit 114 reads the segment importance degree pattern data relevant to the specified generation pattern from the segment importance degree pattern storage unit 115. The compression ratio control unit 114 controls the compression ratios for the image data captured through the camera 111 in accordance with the read segment importance degree pattern data. Note that combined image data is image data generated by processing the camera image data captured through the cameras 111, 121, 131, and 141, for example, by performing coordinate transformation on the data. The segments to be used in the camera image data or the coordinate transformation may be changed, so that a plurality of types, namely, different patterns, of combined image data are generated. The details of combined image data are described below.
A process by the compression ratio control unit 114 will now be concretely described.
It is assumed that the importance degrees of segments of image data are determined on the basis of segment importance degree pattern data, for example, as shown in part 7001 of
The compression ratio control unit 114 controls an amount of compression codes allocated to an image data segment assigned the high importance degree, i.e., the importance degree 3, as shown in part 7002 of
The process by the compression ratio control unit 114 will now be more concretely described. First, description will be given with respect to a transmission data amount management unit, a compression data amount management unit, and a compression unit which will be used in the following description.
The transmission data amount management unit is a management unit for adjusting the amount of data packets to be transmitted over a transmission path. For example, the transmission data amount management unit is set in units of frames, e.g., one frame or eight frames. In a method for transmission in which the compression ratio per transmission data amount management unit is fixed, the amount of compression codes available per frame, serving as the transmission data amount management unit, is fixed to a constant value.
The compression data amount management unit is a management unit for adjusting the amount of compression codes. For example, when the compression ratio is fixed, the amount of compression codes to be generated is adjusted to a constant value on the basis of the compression data amount management unit. The compression data amount management unit is set in units of, for example, lines (image lines) constituting image data, e.g., eight lines.
The compression unit is a unit for allocating compression codes. On the receiving side, data is decompressed in units of this compression unit.
In the following description, it is assumed that the transmission data amount management unit is set to, for example, one frame. A method for transmission in which the compression ratio per transmission data amount management unit is fixed will now be described.
In the case where the compression ratio per frame is fixed, the compression ratio control unit 114 determines the amount of codes to be allocated in each compression data amount management unit. In the following description, it is assumed that the compression data amount management unit is set to, for example, eight lines.
First, the compression ratio control unit 114 allocates compression codes to the unnecessary pixels, that is, the pixels in one frame corresponding to a segment assigned the importance degree 0, so that the compression ratio is maximized, namely, the amount of compression codes is minimized.
Subsequently, the compression ratio control unit 114 subtracts the amount of compression codes allocated to the unnecessary pixels from the amount of compression codes that can be allocated in the transmission data amount management unit, obtaining the amount of remaining compression codes. After that, the compression ratio control unit 114 allocates the remaining compression codes to pixels in accordance with the proportion of the numbers of pixels classified by segment importance degree in the compression data amount management unit. Specifically, the number of pixels is obtained for each of the importance degrees 1, 2, and 3, and the remaining codes are divided in accordance with both the obtained pixel counts per importance degree and the proportion of the importance degrees (1, 2, and 3), thus determining the amount of compression codes allocated to each pixel.
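The division of the remaining codes described above can be sketched as follows. This is an illustrative example only; the function name, the assumed code budget, and the pixel counts are not part of the embodiment, and the weighting by pixel count and importance degree follows the proportional division described in this section.

```python
# Illustrative sketch: dividing the remaining compression codes among pixels
# in proportion to pixel counts and importance degrees (1 to 3).

def allocate_codes(total_codes, codes_for_degree0, pixel_counts):
    """Return the amount of compression codes allocated per pixel for each
    importance degree.

    total_codes: codes available in one transmission data amount management unit
    codes_for_degree0: codes already reserved for the unnecessary pixels
    pixel_counts: mapping of importance degree (1-3) to number of pixels
    """
    remaining = total_codes - codes_for_degree0
    # Weight each degree by (number of pixels x importance degree) and divide
    # the remaining codes in that proportion.
    weights = {deg: count * deg for deg, count in pixel_counts.items()}
    total_weight = sum(weights.values())
    return {deg: remaining * weights[deg] / total_weight / pixel_counts[deg]
            for deg in pixel_counts}

# Assumed example: 100000 codes per frame, 4000 reserved for degree-0 pixels.
per_pixel = allocate_codes(100000, 4000, {1: 3000, 2: 2000, 3: 1000})
assert per_pixel[3] > per_pixel[2] > per_pixel[1]
```

In this sketch, a pixel assigned the importance degree 3 receives three times as many codes as a pixel assigned the importance degree 1, and the full remaining budget is consumed.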
The amount of codes allocated to each pixel can also be determined on the basis of the degree of complexity of the image content and the importance degree of the corresponding image data segment. For example, the compression ratio control unit 114 first obtains the degree of complexity in each compression data amount management unit. The degree of complexity of image data can be determined, for instance, on the basis of the amount of high-frequency components included in the image data; it is assumed here that the degree of complexity is expressed on a scale of 1 to 5. The compression ratio control unit 114 corrects the determined degree of complexity in accordance with the importance degree of each image data segment to determine the amount of compression codes allocated to each of the corresponding pixels. For instance, even when the degree of complexity of an image data segment is “5”, corresponding to “high complexity”, the evaluation value indicating the degree of complexity is corrected to a low value so long as the importance degree of the segment is low. Conversely, when the degree of complexity of an image data segment is “1”, corresponding to “low complexity”, the evaluation value is corrected to a high value so long as the importance degree is high. The compression ratio control unit 114 determines the amount of codes allocated to each pixel on the basis of the evaluation value representing the corrected degree of complexity.
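The correction of the complexity evaluation value by the importance degree may be sketched as follows; the specific weighting (scaling the score into roughly half to one-and-a-half times its value) is an assumption for illustration and is not prescribed by the embodiment.

```python
def correct_complexity(complexity, importance, max_degree=3):
    """Correct a complexity score (scale of 1 to 5) by a segment's importance
    degree (0 to max_degree): a low importance pulls the evaluation value
    down, a high importance pushes it up. The weighting is an assumed example."""
    weight = importance / max_degree            # 0.0 (unused) .. 1.0 (highest)
    corrected = complexity * (0.5 + weight)     # roughly 0.5x .. 1.5x the score
    return max(1.0, min(5.0, corrected))        # clamp back onto the 1-5 scale

# A complex segment ("5") with a low importance degree is evaluated lower,
assert correct_complexity(5, 1) < 5
# and a simple segment ("1") with a high importance degree is evaluated higher.
assert correct_complexity(1, 3) > 1
```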
Before compression of image data, an image data segment assigned the importance degree 0 may be previously replaced with, for example, a black image segment indicated by a fixed value, as shown in parts 8001 and 8002 in
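Replacing an importance-0 segment with a fixed black value before compression, as described above, may be sketched as follows (a minimal example using nested lists in place of real frame buffers; names are illustrative):

```python
def mask_unused_segments(frame, importance_map):
    """Replace every pixel whose segment importance degree is 0 with a fixed
    black value (0) so that those regions compress to almost nothing.
    frame and importance_map are 2-D lists of the same shape."""
    return [[0 if imp == 0 else px
             for px, imp in zip(row, imp_row)]
            for row, imp_row in zip(frame, importance_map)]

frame = [[120, 130], [140, 150]]
degrees = [[0, 3], [0, 2]]
assert mask_unused_segments(frame, degrees) == [[0, 130], [0, 150]]
```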
The network I/F unit 113 includes a buffer (not shown). The network I/F unit 113 temporarily stores the image data compressed and output by the image compressing unit 112 into the buffer. The network I/F unit 113 divides the stored image data into data packets and transmits the data packets through the network 150 to the image processing apparatus 200 while controlling the transmission rate. The network I/F unit 113 operates as a transmitting unit. Note that the number of packets output from the network I/F unit 113 to the network 150 is fixed at a predetermined value within a fixed period of time. Accordingly, as the size of the image data is reduced by compression through the image compressing unit 112, the size of each data packet is also reduced in accordance with the reduction. The data packets generated by the network I/F unit 113 are transmitted through the network 150 to the image processing apparatus 200.
The image processing apparatus 200 in
The network I/F unit 210 is an input unit that receives the data packets transmitted from the first to fourth image capture units 110, 120, 130 and 140. The network I/F unit 210 also includes a buffer (not illustrated) and temporarily stores the received data packets into the buffer. The network I/F unit 210 decompresses the received packet data into image data, adds blanking data to the image data, and then outputs the resultant data to the image generating unit 220. When receiving the data packets from the first to fourth image capture units 110, 120, 130 and 140, the network I/F unit 210 may output the received data packets as data sequences to the image generating unit 220 without decompressing the packet data into image data. In this case, the image generating unit 220 converts the data sequences into image data.
Each image data segment assigned the importance degree 0 to which compression codes are allocated to provide a high compression ratio in the image capture apparatus 100 is received as specific color data or specific pattern data by the image processing apparatus 200. Since this segment is not used for generation of combined image data, the segment does not affect the combined image data.
The image generating unit 220 coordinate-transforms image data respectively transmitted from the first to fourth image capture units 110, 120, 130 and 140 to generate combined image data. The storage unit 240 stores combined-image conversion pattern data, which is described in detail below. The image generating unit 220 acquires combined-image conversion pattern data for generating combined image data, specified through the control unit 230, from the storage unit 240. The image generating unit 220 coordinate-transforms the image data respectively transmitted from the first to fourth image capture units 110, 120, 130 and 140 in accordance with the acquired combined-image conversion pattern data to generate combined image data.
The transmitting unit 250 transmits the combined image data generated by the image generating unit 220 to the display apparatus 320.
The control unit 230 will now be described.
The ROM 232 stores a program that the CPU 231 uses for control. The CPU 231 reads the program stored in the ROM 232 and performs a process in accordance with the read program. The RAM 233 stores data that the CPU 231 uses for calculation and data indicating results of calculation. The input-output unit 234 accepts an operation input entered through the operation unit 310 by a user and outputs the input to the CPU 231. In addition, the input-output unit 234 outputs an instruction signal output from the CPU 231 to the network I/F unit 210. The network I/F unit 210 transmits the instruction signal output from the input-output unit 234 through the network 150 to the first to fourth image capture units 110, 120, 130 and 140. The RAM 233 is an example of the storage unit 240.
The control unit 230 generates a plurality of segment importance degree pattern data, one set for each pattern of combined image data. The segment importance degree pattern data are generated for each of the first to fourth image capture units 110, 120, 130 and 140. The control unit 230 transmits the generated segment importance degree pattern data through the network 150 to the image capture apparatus 100. The first, second, third, and fourth image capture units 110, 120, 130, and 140 store the relevant segment importance degree pattern data, transmitted from the control unit 230, into the segment importance degree pattern storage units 115, 125, 135, and 145, respectively. As for the segment importance degree pattern data, the data for all patterns of combined image data may be stored in each segment importance degree pattern storage unit. Alternatively, when the pattern of combined image data is switched to another one, only the relevant segment importance degree pattern data may be transmitted and stored into each segment importance degree pattern storage unit.
In addition, the control unit 230 generates combined-image conversion pattern data, which is described in detail below, and stores the generated data into the storage unit 240. A plurality of combined-image conversion pattern data are generated for each pattern of combined image data.
The operation unit 310 accepts an operation input from the user. The display apparatus 320 includes the receiving unit 321 that receives combined image data transmitted from the image processing apparatus 200 and the display unit 322 that displays the received combined image data. The transmitting unit 250 transmits combined image data, generated by the image processing apparatus 200, to the display apparatus 320. The display apparatus 320 receives the combined image data, transmitted from the transmitting unit 250, through the receiving unit 321 and displays the data on the display unit 322. The operator operates the operation unit 310 to switch between different patterns of combined image displayed on the display unit 322. Examples (patterns) of combined image displayed on the display unit 322 are shown in
A process for generating segment importance degree pattern data through the control unit 230 and a process for generating combined-image conversion pattern data through the control unit 230 will now be described with reference to flowcharts of
As preparation, the position coordinates and attachment angles of the cameras 111, 121, 131, and 141 mounted on the vehicle are calculated in advance. One or more techniques may be used for this calculation; for example, a worker calculates the position coordinates and the attachment angles of the cameras using measurement equipment. It is assumed that the center of the vehicle is set to the origin, the widthwise direction of the vehicle is set to the X axis, the lengthwise direction thereof is set to the Y axis, and the vertical direction thereof is set to the Z axis, as illustrated in
Data previously stored in the storage unit 240 includes characteristic data of the cameras 111, 121, 131, and 141 and combined-image layout pattern data in addition to the camera setting condition information. The characteristic data includes the number of pixels in the horizontal direction and that in the vertical direction of each of the cameras 111, 121, 131, and 141, the angle of view thereof, and lens distortion data thereof. The angle of view of each of the cameras 111, 121, 131, and 141 is the viewing angle thereof. The lens distortion data is data about lens distortion aberration. The combined-image layout pattern data includes image projection plane shape data, view point vector data, image display range data, and the like. The image projection plane shape data is data about the form of a projection plane where a plurality of image data are projected, as shown in
The control unit 230, for example, generates combined-image conversion pattern data using the camera setting condition information, the camera characteristic data, and the combined-image layout pattern data stored in the storage unit 240 (operation S1). The combined-image conversion pattern data is coordinate transformation data for converting camera image data captured through the cameras 111, 121, 131, and 141 into combined image data. The combined-image conversion pattern data includes, for example, polygon numbers, vertex numbers each representing the number of a vertex of the polygon indicated by a polygon number, image data coordinates, and combined-image pixel coordinates. The image data coordinates indicate the coordinate values of each vertex, indicated by the corresponding vertex number, before coordinate transformation. The combined-image pixel coordinates indicate the coordinate values of each vertex in the combined image data after coordinate transformation. Note that a polygon is a block which serves as a coordinate transformation unit used when image data captured through the cameras are coordinate-transformed into combined image data, and the polygon numbers are identification numbers identifying the polygons. The combined-image conversion pattern data may have a data format shown in
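One possible in-memory representation of a combined-image conversion pattern record is sketched below; the field names are illustrative assumptions and do not reflect the actual data format referred to above.

```python
from dataclasses import dataclass

@dataclass
class ConversionRecord:
    """One entry of combined-image conversion pattern data (illustrative)."""
    polygon_number: int      # identifies the polygon serving as the transformation block
    vertex_number: int       # identifies a vertex of that polygon
    image_coords: tuple      # (x, y) in the camera image, before transformation
    combined_coords: tuple   # (x, y) in the combined image, after transformation

# Assumed example values.
record = ConversionRecord(polygon_number=1, vertex_number=0,
                          image_coords=(120, 80), combined_coords=(300, 210))
assert record.combined_coords == (300, 210)
```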
Subsequently, the control unit 230 generates a plurality of segment importance degree pattern data based on the combined-image conversion pattern data (operation S3). The process for generating the segment importance degree pattern data on the basis of the combined-image conversion pattern data will be described with reference to the flowchart of
The control unit 230 then transmits the generated segment importance degree pattern data to the first to fourth image capture units 110, 120, 130 and 140 which serve as relevant image capture units (operation S4). The first, second, third, and fourth image capture units 110, 120, 130, and 140 store the segment importance degree pattern data transmitted from the control unit 230 into the segment importance degree pattern storage units 115, 125, 135, and 145, respectively. For example, the first image capture unit 110 stores the segment importance degree pattern data into the segment importance degree pattern storage unit 115.
The process for generating the segment importance degree pattern data on the basis of the combined-image conversion pattern data through the control unit 230 in operation S3 of
First, the control unit 230 calculates the available portion used for the combined image data in each of the image data captured through the cameras 111, 121, 131, and 141 with reference to the combined-image conversion pattern data. In addition, the control unit 230 calculates, for each pixel included in the image data corresponding to each available portion, a scaling rate at which the pixel is enlarged or reduced in accordance with the coordinate transformation (operation S11). Note that the scaling rate includes reduction. In the following description, the scaling rate is termed the “pixel scaling rate”. Furthermore, the control unit 230 calculates a distribution of pixel scaling rates on the basis of the calculated pixel scaling rates of the pixels. As for the calculation of the pixel scaling rates, a rate in the X axis direction and a rate in the Y axis direction may be obtained separately. Alternatively, each pixel scaling rate may be obtained as the ratio of the area of the corresponding pixel after coordinate transformation to that before coordinate transformation.
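Under the area-ratio interpretation, the pixel scaling rate may be sketched as follows; the function name is illustrative, and a rate of 0 is used for pixels that do not appear in the combined image.

```python
def pixel_scaling_rate(area_before, area_after):
    """Scaling rate of one pixel under the coordinate transformation, taken as
    the ratio of its area after transformation to its area before it.
    A rate greater than 1 means enlargement; 0 means the pixel is unused."""
    if area_before == 0:
        return 0.0
    return area_after / area_before

assert pixel_scaling_rate(1.0, 2.0) == 2.0   # enlarged pixel
assert pixel_scaling_rate(1.0, 0.0) == 0.0   # pixel not used in the combined image
```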
Subsequently, the control unit 230 obtains an area in which the pixel scaling rates are to be corrected, on the basis of the distance between the vehicle and the position corresponding to each pixel in the image data, and corrects the pixel scaling rates (operation S12). For example, the control unit 230 determines that a portion of the image data corresponding to a distant place or the sky, which does not affect driving assistance, is not needed in combined image data for driving assistance. The control unit 230 sets the image data corresponding to such a portion to non-target data or to data whose importance degree is to be reduced. Referring to
Then, the control unit 230 performs processing such as clustering or normalization on the pixel scaling rates corrected in operation S12, thus clustering the pixel scaling rates. The control unit 230 obtains the distribution of importance degrees on the basis of the clustered pixel scaling rates (operation S13). For example, the control unit 230 obtains a cluster of pixels having pixel scaling rates of 5 or higher, a cluster of pixels having pixel scaling rates ranging from 2 to less than 5, a cluster of pixels having pixel scaling rates ranging from greater than 0 to less than 2, and a cluster of pixels having a pixel scaling rate of 0. The control unit 230 sets the cluster of pixels having pixel scaling rates of 5 or higher to a pixel group assigned the high importance degree, the cluster with rates from 2 to less than 5 to a pixel group assigned the medium importance degree, and the cluster with rates from greater than 0 to less than 2 to a pixel group assigned the low importance degree. The cluster of pixels having a pixel scaling rate of 0 is set to a pixel group assigned the importance degree 0, namely, a pixel group that is not used in the combined image. Since the display area of a pixel having a high pixel scaling rate is large after conversion into combined image data, the pixel can be determined to be data assigned a high importance degree. Conversely, since the display area of a pixel having a low pixel scaling rate is small after conversion into combined image data, the pixel can be determined to be data assigned a low importance degree.
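The thresholds given above (5 and 2) lead directly to the following mapping from a corrected pixel scaling rate to an importance degree; only the function name is an assumption.

```python
def importance_from_scaling(rate):
    """Map a corrected pixel scaling rate to an importance degree using the
    thresholds described in the text."""
    if rate >= 5:
        return 3   # high importance
    if rate >= 2:
        return 2   # medium importance
    if rate > 0:
        return 1   # low importance
    return 0       # importance degree 0: not used in the combined image

assert [importance_from_scaling(r) for r in (6.0, 3.5, 0.5, 0.0)] == [3, 2, 1, 0]
```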
The control unit 230 performs the above-described processes to generate segment importance degree pattern data and combined-image conversion pattern data. The control unit 230 transmits the segment importance degree pattern data, respectively generated for the camera image data captured though the cameras, to the first to fourth image capture units 110, 120, 130 and 140. The segment importance degree pattern storage units 115, 125, 135, and 145 of the first, second, third, and fourth image capture units 110, 120, 130, and 140 each store an importance degree pattern data for each pattern of combined image data.
An operation procedure of the image processing system 1 will now be described with reference to flowcharts of
First, the control unit 230 determines whether an instruction to change the combined image has been given through the operation unit 310 (operation S21). If the instruction to change the combined image is given (YES in operation S21), the control unit 230 notifies the image capture apparatus 100 of the change instruction. In response to the notification, the first to fourth image capture units 110, 120, 130 and 140 each record the instruction to change the combined image into a memory (operation S22).
Subsequently, when the cameras 111, 121, 131, and 141 capture images and input the image data into the image compressing units 112, 122, 132, and 142 (YES in operation S23), the compression ratio control units 114, 124, 134, and 144 acquire the segment importance degree pattern data for generation of the specified combined image from the control unit 230. The compression ratio control units 114, 124, 134, and 144 control the compression ratios for the camera image data captured through the cameras 111, 121, 131, and 141 with reference to the acquired segment importance degree pattern data (operation S24). At this time, each of the compression ratio control units 114 to 144 changes the compression ratio for each segment in the image data in accordance with the corresponding importance degree included in the segment importance degree pattern data. The image compressing units 112, 122, 132, and 142 compress the image data and output the compressed data to the network I/F units 113, 123, 133, and 143, respectively. The network I/F units 113 to 143 each generate data packets having a size depending on the amount of compressed image data of one frame and transmit the packets to the image processing apparatus 200 (operation S25).
A procedure of the image processing apparatus 200 will now be described with reference to
The image processing apparatus 200 receives the data packets transmitted over the network 150 from the first to fourth image capture units 110, 120, 130 and 140 through the network I/F unit 210. When the network I/F unit 210 receives the data packets (YES in operation S31), the network I/F unit 210 decompresses the received packet data into image data (operation S32). The network I/F unit 210 adds blanking data to the decompressed image data and outputs the resultant data to the image generating unit 220. When receiving the image data from the network I/F unit 210, the image generating unit 220 converts (synthesizes) the image data into combined image data with reference to the combined-image conversion pattern data stored in the storage unit 240 (operation S33).
As described above, in an embodiment, the compression ratio for each segment in image data is changed in accordance with the corresponding importance degree included in segment importance degree pattern data, and the compressed image data is transmitted to the image processing apparatus 200. Accordingly, while the amount of data transmitted from the image capture apparatus can be reduced, a high-quality combined image can be generated.
A method of image processing is provided that includes determining a degree of importance for segments of an image captured using multiple image capturing devices, adjusting the resolutions of the segments based on the corresponding degrees of importance, and combining the segments with the adjusted resolutions to produce a resultant image. The method includes selectively adjusting the resolutions of segments of the divided image based on an assigned degree of importance. Further, while specific examples of image capturing devices of a vehicle are described herein, the present invention is not limited to use in relation to vehicles.
It should be understood that the present invention is not limited to the above-described embodiments and various changes and modifications thereof can be made without departing from the spirit and scope of the present invention.
The embodiments can be implemented in computing hardware (computing apparatus) and/or software, such as (in a non-limiting example) any computer that can store, retrieve, process and/or output data and/or communicate with other computers. The results produced can be displayed on a display of the computing hardware. A program/software implementing the embodiments may be recorded on computer-readable media comprising computer-readable recording media. The program/software implementing the embodiments may also be transmitted over transmission communication media. Examples of the computer-readable recording media include a magnetic recording apparatus, an optical disk, a magneto-optical disk, and/or a semiconductor memory (for example, RAM, ROM, etc.). Examples of the magnetic recording apparatus include a hard disk device (HDD), a flexible disk (FD), and a magnetic tape (MT). Examples of the optical disk include a DVD (Digital Versatile Disc), a DVD-RAM, a CD-ROM (Compact Disc-Read Only Memory), and a CD-R (Recordable)/RW. An example of communication media includes a carrier-wave signal.
Further, according to an aspect of the embodiments, any combinations of the described features, functions and/or operations can be provided.
All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the principles of the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although the embodiments of the present inventions have been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention, the scope of which is defined in the claims and their equivalents.
Number | Date | Country | Kind
---|---|---|---
2009-113931 | May 2009 | JP | national