This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2009-113930, filed on May 8, 2009, the entire contents of which are incorporated herein by reference.
Various embodiments described herein relate to an image processing system that takes images of the surrounding parts of an object using a plurality of image capture apparatuses and synthesizes the taken images for display on a display device.
A device that supports safe driving by photographing parts of a vehicle's surroundings which lie in the driver's blind spots, using a plurality of cameras mounted on the vehicle, and displaying those parts on a display device installed in the cab has been proposed. However, if the resolution of each camera is increased in order to display an image of higher definition, the amount of data transmitted from the cameras to an image processing device that processes the taken images increases greatly. Likewise, if the number of cameras mounted on the vehicle is increased, the amount of data transmitted from the cameras to the image processing device increases greatly. Therefore, the amount of data that can be transmitted is sometimes limited by the bandwidth of the transmission path that connects each camera with the image processing device.
Japanese Laid-open Patent Application Publication No. 2000-83193 and No. 10-136345 disclose techniques for reducing the amount of data transmitted from an image fetching device or a camera control device to an image receiving device. In Japanese Laid-open Patent Application Publication No. 2000-83193, layout information of the images to be generated is prepared on the side of the image receiving device and transmitted to the image fetching device. The image fetching device clips the fetched image data in accordance with the acquired layout information and transmits the clipped images to the image receiving device. In Japanese Laid-open Patent Application Publication No. 10-136345, each camera control device detects the photographing direction and zoom magnification of each camera and converts the image taken using each camera to an image of the desired resolution and frame rate. The converted image is transmitted from the camera control device to a terminal. The terminal then synthesizes the images transmitted from the respective camera control devices and displays the synthesized image on a monitor.
An image processing system includes: a plurality of image capture units mounted on a vehicle; and an image processing apparatus connected with the plurality of image capture units via a network to generate combined image data from a plurality of pieces of image data taken by the plurality of image capture units and to make a display unit display the combined image data. Each of the plurality of image capture units includes: a camera photographing surrounding parts of the vehicle; a storage unit storing use part information indicative of a use part of the image data which is used in the combined image data, the use part information being calculated with reference to coordinate conversion data corresponding to a pattern of the combined image data, and importance degrees calculated per use part based on the resolution necessary for each use part upon generation of the combined image data; an image clipping unit clipping segment image data serving as the use part in the combined image data from the image data taken by the camera with reference to the use part information; a transmit image data generating unit performing image processing according to the importance degree of the use part on the segment image data corresponding to the clipped use part with reference to the importance degree to generate transmit image data; and a transmitting unit transmitting the transmit image data to the image processing apparatus. The image processing apparatus includes: a network interface unit inputting the plurality of pieces of transmit image data transmitted from the plurality of image capture units; an image generating unit generating the combined image data based on the plurality of pieces of transmit image data; and an output unit outputting the combined image data to the display device.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention, as claimed.
Usefulness for a driver may be increased by switching a pattern of a combined image which is generated using an image processing apparatus in accordance with each driving situation of the driver, such as right/left turning, lane change, garaging or the like. A use part of each camera image and a resolution of a required image vary as a pattern of the combined image is switched. For example, as illustrated in
Accordingly, in order to generate a plurality of kinds of combined images with high definition while decreasing the amount of data transmitted from each camera to the image processing apparatus, it may be necessary to process a use part of an image to be used in a combined image so as to satisfy the resolution required for the use part and to transmit the processed part from each camera. It may be difficult to transmit whole images as they have been taken using a plurality of cameras at a processing speed attained by an existing in-vehicle LAN. Therefore, it may be desirable to decrease the amount of data to be transmitted by clipping only necessary parts from within images and transmitting the clipped parts from the cameras to the image processing apparatus.
Next, an embodiment will be described with reference to the accompanying drawings.
As illustrated in
Next, details of the imaging apparatus 100 will be described with reference to
The camera 111 is an example of the image capture unit for taking images of surrounding parts of a vehicle and outputs the taken images to the transmit image converting unit 112.
When a command of combined image data (a command that combined image data be generated) is given from the control unit 230 of the image processing apparatus 200, the transmit image converting unit 112 accepts the command and acquires a transmit image conversion pattern used to generate the commanded combined image data from the transmit image conversion pattern storage unit 115. The transmit image converting unit 112 operates as an image clipping unit for clipping a use part (a part to be used in the combined image data) from the image data taken using the camera 111 in accordance with the acquired transmit image conversion pattern. The transmit image converting unit 112 also operates as a transmit image data generating unit for reducing the size of the clipped image data, thereby performing image converting processing on the image data. Incidentally, the camera image data which has been subjected to image converting processing using the transmit image converting unit 112 will be referred to as segment image data. The transmit image converting unit 112 outputs the segment image data which has been subjected to image converting processing to the transmission speed adjusting section 113.
The transmission speed adjusting section 113 includes a buffer (not illustrated) and temporarily stores the segment image data output from the transmit image converting unit 112 in the buffer. The transmission speed adjusting section 113 divides the segment image data acquired from the transmit image converting unit 112 into pieces of data of predetermined sizes. In addition, the transmission speed adjusting section 113 also operates as a transmitting unit that adds header information or the like addressed to the image processing apparatus 200 to the divided pieces of data and transmits the data to the image processing apparatus 200 via the network I/F unit 114 as packet data. The number of pieces of packet data transmitted from each image capture unit (the first to fourth image capture units 110 to 140) of the imaging apparatus 100 via the network 150 is defined to be constant in a constant time period. Therefore, as the data size of the segment image data acquired from the transmit image converting unit 112 is decreased, the transmission speed adjusting section 113 decreases the data size of each piece of packet data accordingly. The packet data generated using the transmission speed adjusting section 113 is transmitted to the image processing apparatus 200 via the network I/F unit 114.
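The packetization described above can be sketched as follows. This is a minimal illustration, not the actual implementation: the packet count per period, the header fields and the function names are assumptions; the only property taken from the text is that the number of packets is constant while the per-packet payload size shrinks with the segment image data size.

```python
import math

PACKETS_PER_PERIOD = 16  # assumed constant packet count per time period

def make_packets(segment_data: bytes, dest_addr: str) -> list[dict]:
    """Divide segment image data into a fixed number of packets, each
    carrying header information addressed to the image processing apparatus.
    Smaller segment data yields smaller (not fewer) packets."""
    payload_size = math.ceil(len(segment_data) / PACKETS_PER_PERIOD)
    packets = []
    for i in range(PACKETS_PER_PERIOD):
        chunk = segment_data[i * payload_size:(i + 1) * payload_size]
        packets.append({"dest": dest_addr, "seq": i, "payload": chunk})
    return packets

pkts = make_packets(b"\x00" * 1000, "image-processing-apparatus")
```

Note that the packet count stays at 16 regardless of the input size; halving the segment data halves each payload instead.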
The transmit image conversion pattern storage unit 115 is a storage unit for storing transmit image conversion pattern data. A RAM (Random Access Memory) and an HDD (Hard Disk Drive) are examples of the storage unit.
Next, processing executed using the transmit image converting unit 112 will be described in more detail with reference to
In the case that images taken using the plurality of cameras 111, 121, 131 and 141 are transmitted as they are, the data amount is increased when it is intended to transmit images of high definition, so that data transfer takes much time at the data transfer speed attained by the in-vehicle LAN. Accordingly, in this embodiment, each of the transmit image converting units 112, 122, 132 and 142 prepares segment image data obtained by clipping a use part from the camera image data which has been taken, conforming to the resolution with which the segment image data is to be displayed. In addition, each of the transmission speed adjusting sections 113, 123, 133 and 143 allocates a band used for transmission to each piece of segment image data generated using the transmit image converting units 112, 122, 132 and 142 and transmits the data by adjusting the transmission speed to a speed at which data transmission is possible over the in-vehicle LAN. If the speed of transmission over the in-vehicle LAN increases in the future, the system may be configured such that images taken using the cameras 111, 121, 131 and 141 are transmitted to the image processing apparatus 200 as they are: processing sections corresponding to the transmit image conversion pattern storage units 115, 125, 135 and 145 and the transmit image converting units 112, 122, 132 and 142 of the imaging apparatus 100 are installed in the image processing apparatus 200, and the transmission speed adjusting section 113 of the imaging apparatus 100 is eliminated so as to eliminate transmission bandwidth allocation to each image. In the latter case, execution of complicated image processing using a camera is not necessary and hence it may become possible to handle data transmission using low cost cameras.
The transmit image converting unit 112 does not perform size-reduction processing, for example, on image data in an image range which has been set to the high importance degree and transmits the image data in the image range to the image processing apparatus 200 as it is. The transmit image converting unit 112 reduces the data size of the segment image data in an image range which has been set to the moderate importance degree, for example, horizontally and transmits the reduced segment image data to the image processing apparatus 200 via the network I/F unit 114.
In addition, the transmit image converting unit 112 reduces the data size of the segment image data in an image range which has been set to the low importance degree, for example, horizontally and vertically. The transmit image converting unit 112 transmits the reduced segment image data to the image processing apparatus 200 via the network I/F unit 114. The transmit image converting unit 112 reduces the size of the segment image data in accordance with the transmit image conversion pattern data acquired from the transmit image conversion pattern storage unit 115 and outputs the reduced segment image data to the transmission speed adjusting section 113.
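The per-importance size reduction described above can be sketched as follows, with an image region represented as a 2-D list of pixel values. The halving factors are assumptions for illustration; the text only specifies no reduction for high importance, horizontal reduction for moderate importance, and both horizontal and vertical reduction for low importance.

```python
def reduce_segment(pixels: list[list[int]], importance: str) -> list[list[int]]:
    """Reduce a clipped image region according to its importance degree.
    High importance is transmitted as-is; moderate is reduced horizontally;
    low is reduced both horizontally and vertically (assumed factor: 1/2)."""
    if importance == "high":
        return pixels                             # no size reduction
    if importance == "moderate":
        return [row[::2] for row in pixels]       # horizontal only
    return [row[::2] for row in pixels[::2]]      # low: both directions

region = [[1, 2, 3, 4],
          [5, 6, 7, 8],
          [9, 10, 11, 12],
          [13, 14, 15, 16]]
```

Dropping every other sample is the simplest possible reduction; a real implementation would more likely average or filter neighboring pixels before decimating.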
In an example illustrated in
Next, details of the image processing apparatus 200 illustrated in
The network I/F unit 210 receives packet data transmitted from each of the first image capture unit 110 to the fourth image capture unit 140. The network I/F unit 210 converts the received packet data to the segment image data, adds blanking data to the segment image data and outputs the segment image data with the blanking data added to the image generating unit 220. Instead of the above mentioned operations, the network I/F unit 210 may operate to output the packet data which has been received from each of the first image capture unit 110 to the fourth image capture unit 140 as it is in the form of a data sequence to the image generating unit 220 without converting the received packet data to the segment image data. In the latter case, the image generating unit 220 will operate to convert the packet data in the form of the data sequence to the segment image data.
The image generating unit 220 performs coordinate transformation on the respective pieces of segment image data transmitted from the first image capture unit 110 to the fourth image capture unit 140 to generate (synthesize) combined image data in the following manner. Combined image conversion pattern data, which will be described later, is stored in the storage unit 240. The image generating unit 220 acquires, from the storage unit 240, the combined image conversion pattern data used to generate the combined image data whose generation has been commanded by the control unit 230. The image generating unit 220 performs coordinate transformation on the respective pieces of segment image data transmitted from the first image capture unit 110 to the fourth image capture unit 140 in accordance with the acquired combined image conversion pattern data to generate the combined image data.
The display control unit 250 serves as an output unit to control to make the display unit 320 display the combined image data which has been generated by the image generating unit 220.
Next, details of the control unit 230 will be described.
Programs that the CPU 231 uses for controlling operations are recorded in the ROM 232. The CPU 231 reads therein a program recorded in the ROM 232 to execute processing in accordance with the read-in program. Data that the CPU 231 uses for arithmetic operations and data indicative of results of arithmetic operations are stored in the RAM 233. The input/output unit 234 accepts input of an operation that a user has performed using the operation unit 310 and outputs it to the CPU 231. In addition, the input/output unit 234 outputs a command signal which is output from the CPU 231 to the network I/F unit 210. The network I/F unit 210 transmits the command signal output from the input/output unit 234 to the first image capture unit 110 to the fourth image capture unit 140 via the network 150. The RAM 233 is one example of the storage unit 240.
The control unit 230 generates a plurality of pieces of transmit image conversion pattern data, which will be described later, for each pattern of the combined image data. The transmit image conversion pattern data is generated for each of the first image capture unit 110 to the fourth image capture unit 140. The control unit 230 transmits the respective pieces of generated transmit image conversion pattern data to the imaging apparatus 100 via the network 150. Each of the first image capture unit 110 to the fourth image capture unit 140 stores the corresponding piece of transmit image conversion pattern data transmitted from the control unit 230 in its transmit image conversion pattern storage unit 115, 125, 135 or 145. For example, the first image capture unit 110 stores the transmit image conversion pattern data in the transmit image conversion pattern storage unit 115. In addition, the control unit 230 generates the combined image conversion pattern data, which will be described later, and stores the generated combined image conversion pattern data in the storage unit 240. A plurality of pieces of the combined image conversion pattern data are also generated for each pattern of the combined image data, as in the case of the transmit image conversion pattern data.
The operation unit 310 accepts input of an operation from the user. A combined image generated from the respective pieces of image data that the cameras 111 to 141 have taken is displayed on the display unit 320. The operator performs an operation to switch a pattern of the combined image to be displayed on the display unit 320 through the operation unit 310. Examples (patterns) of combined images displayed on the display unit 320 are illustrated in
Next, procedures of generating the transmit image conversion pattern data and the combined image conversion pattern data using the control unit will be described with reference to
First, in preparation for operation, the position coordinates and attaching angles of the respective cameras 111, 121, 131 and 141 mounted on the vehicle concerned are calculated. The position coordinates and attaching angles are calculated per vehicle because, although they are fixed in advance, they may vary slightly when each camera is actually installed; it is therefore desirable to calculate them after the camera concerned has actually been attached to the vehicle. The operation of calculating the position coordinates and the attaching angle of each camera is performed by an operator using equipment. As illustrated in
Characteristic data of the cameras 111, 121, 131 and 141 and combined image layout pattern data are included in the data stored in advance in the storage unit, in addition to the camera setting condition information. The characteristic data includes data on the number of pixels and the angle of view in each of the horizontal and vertical directions and lens distortion data of each of the cameras 111, 121, 131 and 141. The angle of view is the angle of visibility of each of the cameras 111, 121, 131 and 141. The lens distortion data is data on the distortion aberration of the lens of each camera. The combined image layout pattern data includes image projection plane shape data (data on the shape of the projection plane of an image), observing point vector data and data indicative of the display range of the image. As illustrated in
First, the control unit 230 generates combined image direct conversion data by using the camera setting condition information, the camera characteristic data and the combined image layout pattern data stored in the storage unit 240 (step S1). The combined image direct conversion data is coordinate transformation data used to convert respective pieces of image data of the cameras 111, 121, 131 and 141 to combined image data. The combined image direct conversion data includes, for example, a polygon number (a number of each polygon), a vertex number indicative of a number of each vertex of each polygon indicated by a corresponding polygon number, image data coordinates and combined image pixel coordinates as illustrated in
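One record of the combined image direct conversion data described above can be sketched as a simple data structure. The field names are assumptions for illustration; the fields themselves (polygon number, vertex number, image data coordinates and combined image pixel coordinates) are those listed in the text.

```python
from dataclasses import dataclass

@dataclass
class DirectConversionRecord:
    """One entry of combined image direct conversion data: maps a vertex
    of a polygon in a camera image to a pixel in the combined image."""
    polygon_no: int                     # number identifying the polygon
    vertex_no: int                      # number of a vertex of that polygon
    image_xy: tuple[float, float]       # coordinates in the camera image data
    combined_xy: tuple[int, int]        # pixel coordinates in the combined image

rec = DirectConversionRecord(polygon_no=3, vertex_no=0,
                             image_xy=(120.5, 64.0), combined_xy=(300, 200))
```

A full conversion table would hold one such record per polygon vertex for each camera, and the image generating unit would interpolate pixel positions inside each polygon from its vertices.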
Next, the control unit 230 generates transmit image conversion pattern data using the combined image direct conversion data (step S2). The transmit image conversion pattern data includes data indicative of the use range, of the image data taken using each of the cameras 111, 121, 131 and 141, which is used in the combined image data, and data on the reduction rates at which the image data within this use range is reduced. Procedures of generating the transmit image conversion pattern data from the combined image direct conversion data will be described later with reference to a flowchart illustrated in
Next, the control unit 230 transmits the respective pieces of transmit image conversion pattern data so generated to the corresponding image capture units (the first image capture unit 110 to the fourth image capture unit 140) (step S3). The first image capture unit 110 to the fourth image capture unit 140 store the respective pieces of transmit image conversion pattern data transmitted from the control unit 230 in their transmit image conversion pattern storage units 115, 125, 135 and 145. For example, the first image capture unit 110 stores the transmit image conversion pattern data in the transmit image conversion pattern storage unit 115.
Next, the control unit 230 corrects the combined image direct conversion data in accordance with the transmit image conversion pattern data to generate combined image conversion pattern data (step S4). The image data transmitted from each of the first image capture unit 110 to the fourth image capture unit 140 is not the image data just as taken using each of the cameras 111 to 141 but image data including only the use part used in the combined image data. Thus, the control unit 230 corrects the combined image direct conversion data in accordance with the transmit image conversion pattern data in order to generate the combined image data from the segment image data including only the use part. The control unit 230 stores the combined image conversion pattern data so generated in the storage unit 240 (step S5).
Next, procedures of generating the transmit image conversion pattern data from the combined image direct conversion data using the control unit 230 at step S2 in
First, the control unit 230 calculates a use range of an image used in the combined image data from the image data sent from each of the cameras 111, 121, 131 and 141 with reference to the combined image conversion pattern data. The control unit 230 calculates scaling rates obtained by executing coordinate conversion on each pixel of the image data in the use range (step S11). In the example illustrated in the drawing, the scaling rates include reduction rates and enlargement rates; hereinafter, the scaling rates will be referred to as pixel scaling rates. In addition, the control unit 230 calculates a distribution of the pixel scaling rates based on the calculated pixel scaling rate of each pixel. That is, the control unit 230 calculates the pixel scaling rates at which each pixel of the image data in the use range is enlarged or reduced as coordinate conversion is executed. The pixel scaling rates may be either calculated respectively in the X-axis and Y-axis directions or obtained from the ratio of the area of each pixel obtained before coordinate transformation to the area of each pixel obtained after coordinate transformation.
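The area-ratio variant of the pixel scaling rate mentioned above can be sketched as follows. This is an illustration under the assumption that each pixel's footprint before and after coordinate transformation is approximated by a quadrilateral; the function names and the quadrilateral representation are not from the source.

```python
def polygon_area(quad: list[tuple[float, float]]) -> float:
    """Area of a simple polygon given as corner points, via the shoelace formula."""
    s = 0.0
    for (x1, y1), (x2, y2) in zip(quad, quad[1:] + quad[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def pixel_scaling_rate(src_quad, dst_quad) -> float:
    """Ratio of a pixel's area after coordinate transformation to its area before."""
    return polygon_area(dst_quad) / polygon_area(src_quad)

# A unit pixel mapped onto a 2x2 region in the combined image scales by 4.
unit = [(0, 0), (1, 0), (1, 1), (0, 1)]
enlarged = [(0, 0), (2, 0), (2, 2), (0, 2)]
```

A rate above 1 means the pixel is enlarged in the combined image (and thus needs full source resolution), while a rate below 1 means it is reduced.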
Next, the control unit 230 obtains a range in which the pixel scaling rates are corrected based on a distance from the vehicle concerned to correct the pixel scaling rates (step S12). For example, the control unit 230 judges distant and sky parts in the image data which will not be effective to support the driving to be parts unnecessary for the combined image data used to support the driving and sets the parts as out-of-object data or data to be reduced in the importance degree. In an example illustrated in
Next, the control unit 230 performs processing such as clustering or normalization on the pixel scaling rates of the respective pixels which have been corrected at step S12 to classify the pixel scaling rates. The control unit 230 calculates distributions of importance degrees on the basis of the classified pixel scaling rates (step S13). For example, a distribution of pixels of pixel scaling rates of 5 or more, a distribution of pixels of pixel scaling rates of 2 or more and less than 5, a distribution of pixels of pixel scaling rates of more than zero and less than 2 and a distribution of pixels of pixel scaling rates of zero in the image data are respectively obtained. Then, the distribution of the pixels of the pixel scaling rates of 5 or more is defined as a distribution of high-importance-degree pixels. The distribution of the pixels of the pixel scaling rates of 2 or more and less than 5 is defined as a distribution of moderate-importance-degree pixels. The distribution of the pixels of the pixel scaling rates of more than zero and less than 2 is defined as a distribution of low-importance-degree pixels. A pixel of a high pixel scaling rate is displayed over a large area when converted to the combined image data and hence may be judged to be high-importance-degree data. A pixel of a low pixel scaling rate is displayed over a small area when converted to the combined image data and hence may be judged to be low-importance-degree data.
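The classification above can be sketched as a simple threshold function. The thresholds (5, 2, and zero) are the example values given in the text; treating a rate of exactly zero as out-of-object data is an assumption consistent with the four distributions described.

```python
def classify(rate: float) -> str:
    """Map a pixel scaling rate to an importance degree.
    Thresholds follow the example in the text: >=5 high, 2..5 moderate,
    0..2 low, and exactly 0 treated as out-of-object (assumed)."""
    if rate >= 5:
        return "high"
    if rate >= 2:
        return "moderate"
    if rate > 0:
        return "low"
    return "out_of_object"
```

In practice the rates would first be clustered or normalized, as the text notes, so that the thresholds separate natural groups of pixels rather than being fixed constants.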
Next, at step S14, the control unit 230 clips a rectangular region including the use range on the basis of the use ranges and the importance degree distributions calculated at step S13. That is, the image data in the use range is divided into pieces depending on the respective importance degrees in the form of the importance degree distributions. The control unit 230 calculates the rectangular regions including the image data in the use range which is divided into pieces depending on the importance degrees and sets the calculated rectangular regions as a transmit range.
In addition, the control unit 230 determines reduction rates at which the image data in the transmit range is reduced on the basis of the importance degree distributions. For example, the control unit 230 sets so as not to reduce the size of the image data in a high-importance-degree transmit range. The control unit 230 sets predetermined reduction rates for the image data in moderate-importance-degree and low-importance-degree transmit ranges.
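Steps S14 and the reduction-rate assignment above can be sketched together: for each importance class, find the bounding rectangle of its pixels and pair it with a reduction rate. The specific rate values (1.0, 0.5, 0.25) are assumptions consistent with the text, which only says high-importance ranges are not reduced and the others get predetermined reduction rates.

```python
REDUCTION = {"high": 1.0, "moderate": 0.5, "low": 0.25}  # assumed rates

def transmit_ranges(importance_map: dict) -> dict:
    """Given a map of (x, y) pixel -> importance label, return for each label
    the bounding rectangle (min_x, min_y, max_x, max_y) and its reduction rate."""
    ranges: dict[str, list[int]] = {}
    for (x, y), label in importance_map.items():
        if label not in ranges:
            ranges[label] = [x, y, x, y]
        r = ranges[label]
        r[0], r[1] = min(r[0], x), min(r[1], y)
        r[2], r[3] = max(r[2], x), max(r[3], y)
    return {lbl: (tuple(r), REDUCTION[lbl]) for lbl, r in ranges.items()}

out = transmit_ranges({(0, 0): "high", (4, 3): "high", (10, 10): "low"})
```

The resulting rectangles and rates are exactly what the transmit image conversion pattern data records: which region of each camera image to clip and how strongly to reduce it before transmission.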
The control unit 230 executes the above mentioned processing to generate the transmit image conversion pattern data and the combined image conversion pattern data. The control unit 230 transmits each piece of transmit image conversion pattern data generated for each piece of image data of each camera to each of the first image capture unit 110 to the fourth image capture unit 140. The respective pieces of pattern-based transmit image conversion pattern data of the combined image data are stored in the transmit image conversion pattern storage units 115, 125, 135 and 145 of the first image capture unit 110 to the fourth image capture unit 140.
Next, procedures of operations of the image processing system 1 will be described with reference to
First, the control unit 230 judges whether a combined image change command (a command that a combined image be changed) has been input through the operation unit 310 (step S21). In the case that the combined image change command has been input (step S21/YES), the control unit 230 sends a notification that the combined image change command has been given to the imaging apparatus 100. Each of the first image capture unit 110 to the fourth image capture unit 140 which has received the notification stores the combined image change command in its memory (step S22).
Next, when an image is taken using each of the cameras 111 to 141 and image data is input (step S23/YES), each of the transmit image converting units 112 to 142 acquires the transmit image conversion pattern data used to generate the combined image whose generation has been commanded by the control unit 230. Each of the transmit image converting units 112 to 142 clips the image data serving as the use part from the image data taken using each of the cameras 111 to 141 and reduces the size thereof to generate segment image data with reference to the acquired transmit image conversion pattern data. For example, the transmit image converting unit 112 clips the image data serving as the use part from the image data which has been acquired using the camera 111 and reduces the size thereof to generate the segment image data (step S24). The respective pieces of segment image data which have been subjected to image converting processing using the transmit image converting units 112 to 142 are respectively output to the transmission speed adjusting sections 113 to 143. For example, the segment image data generated using the transmit image converting unit 112 is output to the transmission speed adjusting section 113. The transmission speed adjusting sections 113 to 143 generate packet data of sizes corresponding to the data amounts of the respective pieces of size-reduced segment image data and transmit the generated packet data to the image processing apparatus 200 (step S25).
Next, procedures of processing executed using the image processing apparatus 200 will be described with reference to
The image processing apparatus 200 receives the respective pieces of packet data which have been transmitted from the first image capture unit 110 to the fourth image capture unit 140 via the network 150 using the network I/F unit 210. When the packet data is input into the network I/F unit 210 (step S31/YES), the network I/F unit 210 restores the input packet data to the segment image data, adds blanking data to the segment image data and outputs the segment image data with the blanking data added to the image generating unit 220. When the image data is input through the network I/F unit 210, the image generating unit 220 generates (synthesizes) combined image data from the plurality of pieces of image data with reference to the combined image conversion pattern data stored in the storage unit 240 (step S33).
As described above, according to the above mentioned embodiment, the image data is processed to obtain the use ranges and reduction rates recorded as the transmit image conversion pattern data, and the segment image data obtained by processing the image data is transmitted to the image processing apparatus 200. Therefore, a reduction of the amount of data transferred from the imaging apparatus and the generation of a combined image of high quality may be attained simultaneously.
The invention is not limited to the above mentioned embodiment and may be embodied in a variety of ways without departing from the gist of the present invention.
All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the principles of the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although the embodiments of the present inventions have been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.