1. Field of the Invention
The present invention relates to an image processing apparatus, an image processing method, and a computer program product which pick an image of a predetermined field of view and perform image processing on the generated image.
2. Description of the Related Art
A conventional inter-vehicle distance detection apparatus is known which is mounted on a given vehicle such as an automobile, processes an image obtained by picking an image of a preceding vehicle running in front of the given vehicle, and detects a distance from the given vehicle to the preceding vehicle (for example, see Japanese Patent No. 2635246). In order to capture the preceding vehicle on an image, the inter-vehicle distance detection apparatus sets a plurality of distance measurement windows at predetermined positions in the image, processes the image in each of the set distance measurement windows to calculate a distance to an arbitrary object, and recognizes an image pickup position of the preceding vehicle on the basis of the calculation results and position information of the measurement windows.
A technique is also known which, in order to detect a road surface condition of a road in the traveling direction while a given vehicle is traveling, picks an image of an area in front of the vehicle and recognizes a predetermined object in the picked image (for example, see Japanese Patent No. 3290318). In this technique, by using the picked image, a driving lane dividing line such as a white line or a driving lane dividing zone such as a central reservation on the road on which the given vehicle runs is recognized.
An image processing apparatus according to an aspect of the present invention includes an imaging unit which is mounted on a mobile object and picks an image of a predetermined field of view to generate an image; a moving information detection unit which detects moving information including a speed of the mobile object; a processing content setting unit which sets contents of a process to be performed in the image generated by the imaging unit, based on the moving information detected by the moving information detection unit; and a processing calculation unit which performs processing calculation according to the contents of the process set by the processing content setting unit.
An image processing method according to another aspect of the present invention includes picking an image of a predetermined field of view from a mobile object to generate an image; detecting moving information including a speed of the mobile object; setting contents of a process to be performed in the generated image, based on the detected moving information; and performing processing calculation according to the contents of the process set.
A computer program product according to still another aspect of the present invention has a computer readable medium including programmed instructions for performing image processing on an image generated by an imaging unit which is mounted on a mobile object and picks an image of a predetermined field of view to generate the image. The instructions, when executed by a computer, cause the computer to perform detecting moving information including a speed of the mobile object; setting contents of a process to be performed in the generated image, based on the detected moving information; and performing processing calculation according to the contents of the process set.
The above and other objects, features, advantages and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.
Best modes (to be referred to as embodiments hereinafter) for carrying out the present invention will be described below with reference to the accompanying drawings.
The imaging unit 10 is a stereo camera having a right camera 11a and a left camera 11b which are arranged side by side on the right and left. The right camera 11a includes a lens 12a, an image pickup element 13a, an analog/digital (A/D) conversion unit 14a, and a frame memory 15a. The lens 12a converges light from an arbitrary object arranged in a predetermined field of image pickup view onto the image pickup element 13a. The image pickup element 13a is an image pickup element such as a CCD or a CMOS, detects the light from the object converged by the lens 12a as an optical signal, converts the optical signal into an analog electric signal, and outputs the electric signal. The A/D conversion unit 14a converts the analog signal output from the image pickup element 13a into a digital signal and outputs the digital signal. The frame memory 15a stores the digital signal output from the A/D conversion unit 14a and arbitrarily outputs a digital signal group corresponding to one picked-up image as an image signal group corresponding to the field of image pickup view, i.e., image information. On the other hand, the left camera 11b has the same configuration as that of the right camera 11a and includes a lens 12b, an image pickup element 13b, an A/D conversion unit 14b, and a frame memory 15b. The respective constituent components of the left camera 11b have the same functions as the corresponding constituent components of the right camera 11a.
The lenses 12a and 12b, which serve as a pair of image pickup optical systems held by the imaging unit 10, are arranged apart from each other by a distance L such that their optical axes are parallel. The image pickup elements 13a and 13b are located at a distance f from the lenses 12a and 12b on the respective optical axes. The right camera 11a and the left camera 11b pick images of the same object from different positions through different optical paths. The lenses 12a and 12b are generally each constructed by combining a plurality of lenses, and preferably correct aberrations such as lens distortions.
The image analyzing unit 20 includes a distance calculation unit 21 which processes an image signal group acquired from the imaging unit 10 to calculate a distance to the object the image of which is picked. The distance calculation unit 21 detects, from a right image signal group output by the right camera 11a, a right image signal matched with an arbitrary left image signal in a left image signal group output by the left camera 11b, and calculates a distance to an object located in the field of image pickup view on the basis of an amount of movement which is a distance between the detected right image signal and the corresponding left image signal. More specifically, the right image signal group from the right camera 11a and the left image signal group from the left camera 11b are superimposed with the optical axes of the image pickup optical systems as a reference, an arbitrary left image signal in the left image signal group and the right image signal in the right image signal group maximally matched with that left image signal are detected, and an amount of movement I, which is the distance from the corresponding left image signal to the right image signal on the image pickup element, is obtained. By using the following equation (1) based on the principle of triangulation, for example, a distance R from the imaging unit 10 to a vehicle C in
For descriptive convenience, parallel stereo is described above. However, cases where the optical axes cross at an angle, where the focal distances differ, or where the positional relationships between the image pickup elements and the lenses differ can be calibrated and corrected by rectification, so that parallel stereo may be realized by calculation processing.
R=f·L/I (1)
The distance calculation unit 21 calculates a distance to an object corresponding to an arbitrary image signal in a calculation range, i.e., an object corresponding to an arbitrary pixel of the image pickup element. The image analyzing unit 20 forms distance information in which the distance to the object calculated by the distance calculation unit 21 and the position of the object in the image are caused to correspond to each other, and outputs the distance information to the control unit 30. The object the distance of which is calculated here is not limited to a tangible entity but includes an arbitrary object the image of which is to be picked, such as a road surface or a background such as the sky.
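The relationship in equation (1) can be illustrated by the following minimal Python sketch; the function and variable names are illustrative and not part of the apparatus, and it is assumed that the focal distance f and the amount of movement I are expressed in the same units (for example, pixels) so that the distance R takes the unit of the base length L.

def distance_from_disparity(focal_length_px: float,
                            baseline_m: float,
                            disparity_px: float) -> float:
    # Equation (1): R = f * L / I, where I is the amount of movement
    # (disparity) between matched left and right image signals.
    if disparity_px <= 0:
        raise ValueError("the amount of movement must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: f = 800 px, L = 0.12 m, I = 8 px  ->  R = 12 m
print(distance_from_disparity(800.0, 0.12, 8.0))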
The control unit 30 includes a CPU which executes a processing program stored in the storage unit 50 to control various processing operations in the imaging unit 10, the image analyzing unit 20, the output unit 40, the storage unit 50, and the detection unit 60. In particular, the control unit 30 according to the first embodiment includes an arrival point prediction unit 31 and a calculation range setting unit 32. The arrival point prediction unit 31 acquires moving information from the detection unit 60 to predict a point which the given vehicle reaches after a predetermined period of time. The calculation range setting unit 32 selects a window matched with the prediction result of the arrival point prediction unit 31 from pieces of window information stored in the window information 51 to set a calculation range in the distance calculation unit 21. The calculation range setting unit 32 outputs the selected window information to the image analyzing unit 20. The window information is information related to a size, a shape, and the like of a window.
The output unit 40 outputs various pieces of information including the distance information. For example, the output unit 40 includes a display apparatus such as a liquid crystal display or an organic EL (electroluminescence) display to display various displayable pieces of information, such as an image picked by the imaging unit 10, together with the distance information. Furthermore, the output unit 40 may include an audio output apparatus such as a loudspeaker and be constructed to output various pieces of audio information such as the distance information or a warning based on the distance information.
The storage unit 50 includes a ROM in which various pieces of information such as a program for starting a predetermined OS and an image processing program are stored in advance and a RAM in which calculation parameters for processing and various pieces of information input/output to and from the various constituent components are stored. Furthermore, the storage unit 50 stores window information 51 from which window information is selected by the calculation range setting unit 32, image information 52 obtained by an image pickup operation of the imaging unit 10, steering angle information 53 detected by a steering angle sensor 61, speed information 54 detected by a speed sensor 62, arrival point information 55 predicted by the arrival point prediction unit 31, and distance information 56 calculated and formed by the distance calculation unit 21.
The image processing program described above can be recorded on a computer readable recording medium such as a hard disk, a flexible disk, a CD-ROM, a CD-R, a CD-RW, a DVD-ROM, a DVD±R, a DVD±RW, a DVD-RAM, an MO disk, a PC card, an xD picture card, or SmartMedia, and can be widely circulated.
The detection unit 60 detects moving information of the given vehicle. In particular, the detection unit 60 in the first embodiment includes the steering angle sensor 61 which detects a moving direction of the given vehicle and the speed sensor 62 which detects a moving speed. The steering angle sensor 61 is a sensor which detects, as the moving direction of the given vehicle, a steering angle serving as the right and left rotating angle of the front wheels, and which calculates the steering angle on the basis of a rotating angle and a rotating direction of the steering wheel. The speed sensor 62 is a speedometer such as one generally provided in an automobile. The detection unit 60 outputs the steering angle information detected by the steering angle sensor 61 and the speed information detected by the speed sensor 62 to the control unit 30 as the moving information of the given vehicle. The speed sensor 62 may be a sensor which observes a road surface on which the given vehicle runs to calculate a speed, a sensor which detects an acceleration to calculate a speed, or the like. As a means which detects the moving direction of the given vehicle, an angle detection sensor using a gyroscope may be used in place of the steering angle sensor 61.
In the first embodiment, the detection unit 60 constitutes at least a part of a moving information detection unit 600 which detects moving information including a speed of a mobile object. The calculation range setting unit 32 constitutes at least a part of a processing content setting unit 320 which sets contents of processes to be performed in an image based on the moving information detected by the moving information detection unit 600. Furthermore, the distance calculation unit 21 constitutes at least a part of a processing calculation unit 210 which performs processing calculation corresponding to the contents of the processes set by the processing content setting unit 320.
A processing operation executed by the image processing apparatus 1 will be described below with reference to a flow chart in
As shown in
The light converged by the lens 12a and the light converged by the lens 12b are focused on the surfaces of the image pickup elements 13a and 13b, respectively, and converted into electric signals (analog signals). The analog signals output from the image pickup elements 13a and 13b are converted into digital signals by the A/D conversion units 14a and 14b, respectively, and the converted digital signals are temporarily stored in the frame memories 15a and 15b, respectively. The digital signals temporarily stored in the frame memories 15a and 15b are sent to the image analyzing unit 20 after a predetermined period of time.
After the image pickup process in step S101, the detection unit 60 performs a moving information detection process which detects moving information of the given vehicle and outputs the moving information to the control unit 30 (step S103). In this detection of the moving information, the steering angle sensor 61 detects a steering angle as the moving direction, and the speed sensor 62 detects a moving speed. The arrival point prediction unit 31 of the control unit 30 performs an arrival point prediction process which predicts an arrival point of the given vehicle after a predetermined period of time on the basis of the moving information from the detection unit 60 (step S105). The calculation range setting unit 32 performs a calculation range setting process which sets a calculation range to be processed by the distance calculation unit 21 on the basis of the prediction result of the arrival point (step S107). The control unit 30 outputs information of the set calculation range to the image analyzing unit 20.
In the image analyzing unit 20, the distance calculation unit 21 calculates a distance to an arbitrary object the image of which is picked within the set calculation range, forms distance information in which the calculated distance and the position of the object in the image are caused to correspond to each other, and outputs the distance information to the control unit 30 (step S109).
In order to perform the distance calculation in step S109, coordinate values of all or some pixel points in the field of view the image of which is picked are calculated by using the right and left camera coordinate systems. Prior to this calculation, however, calculation of coordinate values in the left and right camera coordinate systems and association (corresponding point searching) between the coordinate values are performed. When a three-dimensional structure is reconstructed by the corresponding point searching, a pixel point located on an arbitrary straight line passing through the image serving as a reference is desirably located on the same straight line in the other image (epipolar constraint). However, the epipolar constraint is not always satisfied. For example, in a stereo image region Iab shown in
When the epipolar constraint is not satisfied, the searching range cannot be narrowed down, and the amount of calculation in searching for a corresponding point becomes vast. In such a case, the image analyzing unit 20 performs a process (rectification process) which normalizes the right and left camera coordinate systems in advance to convert the state into one in which the epipolar constraint is satisfied.
An example of a corresponding point searching method will be described below. In the referenced left image region Ib, a local region is set near a pixel point of interest, and a region of the same size is set on the corresponding epipolar line αE in the right image region Ia. A local region having the highest degree of similarity to the local region in the left image region Ib is searched for while the local region in the right image region Ia is scanned along the epipolar line αE. As a result of the searching, the center point of the local region having the highest degree of similarity is set as the corresponding point of the pixel point in the left image region Ib.
As the degree of similarity used in the corresponding point searching, a sum (SAD: Sum of Absolute Difference) of absolute values of differences between pixel points in the local region, a sum of squares (SSD: Sum of Squared Difference) of differences between pixel points in the local region, or normalized cross correlation (NCC) between pixel points in the local region can be applied. When the SAD or the SSD of these degrees of similarity is applied, a point having a minimum value is set as a point having the highest degree of similarity. When the NCC is applied, a point having a maximum value is set as a point having the highest degree of similarity.
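The following Python sketch, using NumPy, illustrates a SAD-based corresponding point search of the kind described above for a rectified (parallel stereo) image pair; the function name, local region size, and search range are assumptions made only for illustration, and the SSD or the NCC described above could be substituted for the SAD.

import numpy as np

def find_corresponding_point(left: np.ndarray, right: np.ndarray,
                             row: int, col: int,
                             half: int = 3, max_disp: int = 64) -> int:
    # Search along the epipolar line (the same row after rectification) in
    # the right image for the local region most similar, in the SAD sense,
    # to the local region around (row, col) in the reference left image.
    # (row, col) is assumed to lie far enough from the image borders.
    patch_l = left[row - half:row + half + 1,
                   col - half:col + half + 1].astype(np.int32)
    best_col, best_sad = col, float("inf")
    for d in range(max_disp):
        c = col - d                      # the match shifts toward smaller columns
        if c - half < 0:
            break                        # reached the left border of the right image
        patch_r = right[row - half:row + half + 1,
                        c - half:c + half + 1].astype(np.int32)
        sad = int(np.abs(patch_l - patch_r).sum())   # Sum of Absolute Differences
        if sad < best_sad:
            best_sad, best_col = sad, c
    return best_col

The difference between col and the returned column corresponds to the amount of movement I used in equation (1).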
Subsequent to step S109 described above, the control unit 30 outputs the distance information and predetermined processing information based on the distance information to the output unit 40 (step S111), and one series of processing operations is finished. The control unit 30 arbitrarily stores, in the storage unit 50, the image information 52, the steering angle information 53, the speed information 54, the arrival point information 55, and the distance information 56 generated in the various processing steps.
The series of processes is continuously repeated unless a designation of a predetermined end of processing or a designation of interruption is received from a passenger or the like of the given vehicle. The processes described above are described as processes to be sequentially executed. However, in practice, independent processing operations are performed in parallel by a processor to increase the speed of the processing cycle. The image pickup process in step S101 may be performed immediately after any one of the processes in step S103, step S105, and step S107.
An arrival point prediction process in step S105 shown in
In the calculation of the arrival point in step S124, the arrival point prediction unit 31 calculates coordinates of an arrival point after a predetermined period of time on the assumption that the given vehicle linearly moves at a speed detected by the speed sensor 62 in a moving direction detected by the steering angle sensor 61. For example, as shown in
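A minimal Python sketch of this linear prediction is given below; the coordinate convention (origin at the present point, y axis pointing straight ahead, x axis pointing to the right) and the use of the steering angle directly as a heading are assumptions made only for illustration.

import math

def predict_arrival_point(speed_mps: float, steering_angle_rad: float,
                          dt_s: float) -> tuple[float, float]:
    # The vehicle is assumed to keep moving straight in the direction given
    # by the steering angle at the detected speed for the period dt_s.
    distance = speed_mps * dt_s
    x = distance * math.sin(steering_angle_rad)
    y = distance * math.cos(steering_angle_rad)
    return x, y

# Example: 20 m/s, 5 degrees to the right, 2 s ahead
print(predict_arrival_point(20.0, math.radians(5.0), 2.0))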
Subsequently, a calculation range setting process in step S107 shown in
An example of a window selected by the calculation range setting unit 32 will be described below.
On the other hand,
The window information 51 held by the storage unit 50 stores pieces of window information defined by using a direction and a distance to the predicted arrival point as parameters. The calculation range setting unit 32 selects the short-distance window 51a, the long-distance window 51b, or the like as a window which is maximally matched with the predicted arrival point from these pieces of window information. At this time, the calculation range setting unit 32 may, for example, store a corresponding table between the parameters and the window information and refer to the corresponding table in accordance with various combinations of the parameters to select a window.
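Such a corresponding table might be sketched in Python as follows; the bin boundaries, the direction threshold, and the window names other than the short-distance window 51a and the long-distance window 51b are assumptions made only for illustration.

def select_window(direction_deg: float, distance_m: float) -> str:
    # Classify the direction and the distance to the predicted arrival
    # point, then look the combination up in a corresponding table.
    direction = ("left" if direction_deg < -5.0
                 else "right" if direction_deg > 5.0
                 else "straight")
    rng = "short" if distance_m < 30.0 else "long"
    window_table = {
        ("straight", "short"): "short_distance_window_51a",
        ("straight", "long"):  "long_distance_window_51b",
        ("left", "short"):     "short_distance_left_window",
        ("left", "long"):      "long_distance_left_window",
        ("right", "short"):    "short_distance_right_window",
        ("right", "long"):     "long_distance_right_window",
    }
    return window_table[(direction, rng)]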
In the flow chart shown in
The image processing apparatus 1 according to the first embodiment described above predicts an arrival point of the given vehicle after a predetermined period of time on the basis of steering angle information and speed information serving as moving information, and sets and limits a calculation range depending on the predicted arrival point to calculate a distance. For this reason, in comparison with a conventional image processing apparatus which calculates a distance for all image signals of image information, the time required for distance calculation can be shortened. As a result, the image processing apparatus 1 can shorten the processing time required until distance information is output after image information is acquired, and the distance information can be output at a high speed.
A second embodiment of the present invention will be described below. In the first embodiment described above, in the arrival point prediction process in step S105, the arrival point prediction unit 31 calculates an arrival point of the given vehicle after a predetermined period of time on the assumption that the given vehicle moves linearly at the speed detected by the speed sensor 62 in the moving direction detected by the steering angle sensor 61. However, in the second embodiment, with reference to a change rate of the steering angle serving as a change rate of the moving direction, an arrival point is calculated by using a function which is experientially determined.
The image processing apparatus 2 executes the processes from the image pickup process to the process of outputting distance information shown in the flow chart in
An arrival point prediction process executed by the image processing apparatus 2 will be described below.
The prediction function selected by the arrival point prediction unit 33 in step S226 is a function which is experientially determined and which is defined by using the steering angle, the change rate of the steering angle, and the speed. Functionally, the prediction function predicts a track of the given vehicle; a track drawn by the prediction function with time as a variable expresses a prediction track of the given vehicle. For example, as shown in
In this case, a relationship between the prediction function selected by the arrival point prediction unit 33 and the steering angle, the change rate of the steering angle, and the speed serving as parameters for regulating the prediction function will be described below with reference to
As another case, for example, when the steering angle at the present point p0, the change rate of the steering angle, and the speed are given by θ0, Δθ0, and v0, respectively, the prediction track q0 is applied. When the speed is given by v1 (>v0), the prediction track q1 having a curvature radius smaller than that of the prediction track q0 is applied. When the speed is v2 (<v0), the prediction track q2 having a curvature radius larger than that of the prediction track q0 is applied. Furthermore, various prediction functions can be defined for various cases depending on the combination of the steering angle, the change rate of the steering angle, and the speed.
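Since the actual prediction function is experientially determined, the following Python sketch only illustrates the general form of a track defined by the steering angle θ0, its change rate Δθ0, and the speed v0 with time as the variable; the heading model, the gain k, and the integration step are assumptions made only for illustration.

import math

def prediction_track(theta0: float, dtheta0: float, v0: float,
                     t: float) -> tuple[float, float]:
    # Integrate a track whose heading grows with the steering angle and its
    # change rate, scaled by an assumed gain k, over the period t.
    k = 0.5                      # assumed steering-to-heading gain
    dt = 0.05                    # assumed integration step [s]
    x = y = 0.0
    for i in range(int(t / dt)):
        heading = k * (theta0 + dtheta0 * (i * dt))
        x += v0 * math.sin(heading) * dt
        y += v0 * math.cos(heading) * dt
    return x, y                  # predicted arrival point after time t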
A window selected by the calculation range setting unit 34 in the calculation range setting process will be described below.
On the other hand,
In the prediction for the arrival point in the second embodiment, an arrival point a predetermined period of time after is obtained by using a prediction function experientially determined to predict a track and using a steering angle, a change rate of the steering angle, and a speed at a present point as parameters. For this reason, in comparison with prediction for an arrival point based on linear movement in the first embodiment, prediction can be performed with higher reliability. A difference between arrival points predicted by both the methods is shown in
In the second embodiment, a track is predicted by using a prediction function based on a rule of thumb. In particular, in calculation in the short-distance region, since the boundary between the right and left windows is formed by a bent line or a curved line on the basis of the prediction track, the calculation range can be more strictly limited, and an unnecessary calculation region can be omitted. As a result, the processing time taken for calculation can be further shortened.
In the second embodiment, the prediction function is selected by using the steering angle, the change rate of the steering angle, and the speed as parameters. However, the parameters related to the selection are not limited to those mentioned above. For example, an acceleration may be added as a parameter, or an acceleration may be used in place of the change rate of the steering angle to select the prediction function. Furthermore, an orientation of movement may be detected in place of the steering angle and used as a parameter.
In the image processing apparatus 2 according to the second embodiment described above, an arrival point and a prediction track are strictly predicted by using the prediction function, and the calculation range is accordingly set and limited to calculate a distance. For this reason, the time required for distance calculation can be shortened in comparison with a conventional image processing apparatus which performs distance calculation for all image signals of image information. As a result, the image processing apparatus 2 can shorten the processing time required until distance information is output after the image information is acquired, and can output the distance information at a high speed.
A third embodiment of the present invention will be described below. In the first and the second embodiments described above, an arrival point prediction unit calculates an arrival point of the given vehicle after a predetermined period of time on the basis of the steering angle information and the speed information. However, in the third embodiment, position information of the given vehicle is detected, and an arrival point after a predetermined period of time is calculated with reference to map information serving as geographical information near the present position.
As in the procedures of the image processing apparatus 1 described in the first embodiment, the image processing apparatus 3 executes the processes from the image pickup process to the process of outputting distance information shown in the flow chart in
The moving information detection process executed by the image processing apparatus 3 will be described below. In the detection unit 160, as in the detection unit 60 of the image processing apparatus 1, the steering angle sensor 61 detects a steering angle, and the speed sensor 62 detects a speed. Thereafter, the GPS sensor 63 detects the present position of the given vehicle. Each of the sensors outputs its detection result to the control unit 230.
An arrival point prediction process executed by the image processing apparatus 3 will be described below. The arrival point prediction unit 35, like the arrival point prediction unit 31 of the image processing apparatus 1, calculates coordinates of an arrival point after a predetermined period of time on the basis of the steering angle information and the speed information on the assumption that the given vehicle moves linearly. Thereafter, the arrival point prediction unit 35 refers to the map information 59 stored in the storage unit 250 on the basis of the position information detected by the GPS sensor 63 to correct the coordinates of the calculated arrival point. The map information to be referred to is peripheral geographical information including the present position, and includes, for example, information of a width and an inclination of the road on which the vehicle runs, a curvature of a curve, and the like.
As shown in
In the above description, the track is corrected on the basis of information of the inclination of the ascending slope to correct the position of the arrival point. However, the correction may be performed in consideration of a decrease in speed caused by running on the ascending slope. When it is detected that the given vehicle runs on a descending slope, the correction may be performed in consideration of an increase in speed. Furthermore, various corrections can be performed on the basis of various pieces of map information. When such corrections are performed, the image processing apparatus 3 can predict an arrival point with higher reliability without being influenced by the geographical conditions of the driving road. As described in the second embodiment, by using the change rate of the steering angle serving as the change rate of the moving direction of the given vehicle, the arrival point may be predicted on the basis of a curved track.
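A minimal Python sketch of such a speed-related correction is given below; the empirical coefficient and the representation of the inclination as a percentage grade are assumptions, since the embodiment only specifies that the referenced map information is used for the correction.

def correct_travel_distance_for_slope(distance_m: float,
                                      grade_percent: float) -> float:
    # Reduce the predicted travel distance when the road ahead ascends
    # (positive grade) and extend it when the road descends, under an
    # assumed empirical coefficient of 0.01 per percent of grade.
    factor = 1.0 - 0.01 * grade_percent
    return max(0.0, distance_m * factor)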
In the image processing apparatus 3 according to the third embodiment described above, position information is acquired by using a GPS, and a calculation range is set and limited depending on the arrival point corrected on the basis of the map information corresponding to the acquired position information. For this reason, in comparison with a conventional image processing apparatus which performs distance calculation for all image signals of image information, the time required for the distance calculation can be shortened. As a result, the image processing apparatus 3 can shorten the processing time required until the distance information is output after the image information is acquired, and can output the distance information at a high speed.
In the third embodiment, on the basis of the image processing apparatus 1, the GPS is further used to correct the arrival point with reference to the map information corresponding to the position information. However, the calculation method using the prediction function described for the image processing apparatus 2 according to the second embodiment may be further applied to perform the arrival point prediction process. Furthermore, map information may be acquired from the outside of the apparatus by using the GPS, and a prediction track may be corrected on the basis of the acquired map information.
A fourth embodiment of the present invention will be described below. In the first to the third embodiments described above, a distance to a picked object is detected by processing an image signal group from the imaging unit 10. However, in the fourth embodiment, a distance to an object located in a field of image pickup view is detected by a radar.
The radar 70 transmits a predetermined outgoing wave, receives a reflected wave obtained when the outgoing wave is reflected by an object surface, and detects, on the basis of the outgoing state and the receiving state, a distance from the radar 70 to the object which reflects the outgoing wave and a direction in which the object is located. The radar 70 detects the distance to the object which reflects the outgoing wave and the direction of the reflection on the basis of an outgoing angle of the outgoing wave, an incident angle of the reflected wave, a receiving intensity of the reflected wave, the time until the reflected wave is received after the outgoing wave is transmitted, and a change in frequency between the outgoing wave and the reflected wave. The radar 70 outputs the distance to the object located in the field of image pickup view of the imaging unit 10 to the control unit 330 together with the direction in which the object is located. The radar 70 transmits, for example, a laser beam, an infrared ray, a millimeter wave, a microwave, an ultrasonic wave, or the like as a transmitter wave.
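For example, when the time until the reflected wave is received is used, the distance follows from the round-trip propagation time as in the following Python sketch; the function name is illustrative.

def radar_range_from_tof(round_trip_time_s: float,
                         propagation_speed_mps: float = 3.0e8) -> float:
    # The distance to the reflecting object is half the round-trip
    # propagation distance.  For an ultrasonic wave, the propagation speed
    # would be the speed of sound instead of the speed of light.
    return propagation_speed_mps * round_trip_time_s / 2.0

# Example: a reflected laser pulse received 200 ns after transmission -> about 30 m
print(radar_range_from_tof(200e-9))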
In the image processing apparatus 4 according to the fourth embodiment, since a distance is detected by the radar 70 in place of calculating a distance by processing image information from the imaging unit 10, the distance information can be acquired at a higher speed and with higher accuracy.
In the image processing apparatus 4, matching between a positional relationship in the image signal group picked by the imaging unit 10 and a positional relationship in the detection range of the radar 70 is obtained in advance as described below before the processes are performed. For example, the image processing apparatus 4 performs an image pickup process by the imaging unit 10 and a detection process by the radar 70 on an object having a known shape, and obtains the positions of the known object processed by the imaging unit 10 and the radar 70. Thereafter, the image processing apparatus 4 uses the method of least squares or the like to obtain a relationship between the positions of the known object processed by the imaging unit 10 and the radar 70, and matches the positional relationship in the image signal group picked by the imaging unit 10 with the positional relationship in the detection range of the radar 70.
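The least-squares matching may be sketched in Python as follows; the affine mapping model, the array shapes, and the NumPy-based implementation are assumptions made only for illustration.

import numpy as np

def fit_camera_to_radar_mapping(cam_pts: np.ndarray,
                                radar_pts: np.ndarray) -> np.ndarray:
    # cam_pts and radar_pts are N x 2 arrays of corresponding positions of a
    # known-shape object observed by the imaging unit and by the radar.  An
    # affine mapping from camera image coordinates to radar coordinates is
    # fitted by the method of least squares.
    n = cam_pts.shape[0]
    design = np.hstack([cam_pts, np.ones((n, 1))])        # rows of [x, y, 1]
    mapping, *_ = np.linalg.lstsq(design, radar_pts, rcond=None)
    return mapping                                         # 3 x 2 matrix

# Usage: radar_xy is approximated by np.hstack([cam_xy, [1.0]]) @ mapping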
In the image processing apparatus 4, even though the image pickup original point of the imaging unit 10 and the detection original point of the radar 70 are different from each other, if the distances from the image pickup original point and the detection original point to the object are sufficiently long, the image pickup original point and the detection original point can be regarded as almost overlapping. Furthermore, when the matching between the positional relationship in the image signal group picked by the imaging unit 10 and the positional relationship in the detection range of the radar 70 is accurately performed, the difference between the image pickup original point and the detection original point can be corrected by geometric conversion.
In the image processing apparatus 4, the radar detection points of the radar 70 are preferably set to be located at predetermined intervals in the pixel rows in which the image signals of the image signal group picked by the imaging unit 10 are arranged. When the radar detection points are not set in this manner, interpolation points of the radar detection points may be obtained in the same pixel row as that of the image signals by using linear interpolation or the like on the basis of a plurality of radar detection points located near the image signals, and the detection process may be performed by using the interpolation points.
In the image processing apparatuses according to the first to the fourth embodiments, an arrival point after a predetermined period of time is predicted by using the moving information detected at the present point. However, a future arrival point, a future prediction track, or the like may be predicted, or corrections may be performed, on the basis of time-series arrival point information, moving information, and the like stored in the storage unit.
The image analysis unit 520 includes a distance calculation unit 521 which processes the image signal group acquired from the imaging unit 10 to calculate a distance to an object the image of which is picked. The distance calculation unit 521 calculates a distance to the object from an arbitrary image signal in a calculation range, i.e., an arbitrary pixel of an image pickup element. The image analysis unit 520 forms distance information in which the distance to the object calculated by the distance calculation unit 521 is caused to correspond to the position of the object in the image, and outputs the distance information to the control unit 530.
The control unit 530 includes a window switching unit 531. The window switching unit 531 has the function of a switching process unit which acquires moving information from the probing unit 560, selects a window matched with the moving information from pieces of window information stored in the window information 551, and outputs, to the image analysis unit 520 together with the window information, designation information which switches the window set in the distance calculation unit 521 to the selected window. The window information is information related to a size, a shape, and the like of the window.
The storage unit 550 stores the window information 551 in which pieces of window information to be selected by the window switching unit 531 are stored, probed information 552 probed by the probing unit 560, image information 553 picked by the imaging unit 10, and distance information 554 calculated and formed by the distance calculation unit 521.
The probing unit 560 includes a speed sensor 561 which detects a moving speed of the given vehicle, and outputs the speed information detected by the speed sensor 561 to the control unit 530 as moving information of the given vehicle. The speed sensor 561 may be a sensor which observes the road surface on which the given vehicle runs from the given vehicle to calculate a speed, a sensor which detects an acceleration to calculate a speed, or the like.
In the fifth embodiment, the probing unit 560 constitutes at least a part of a moving information detection unit 5600 which detects the moving information including the speed of the mobile object. The window switching unit 531 constitutes at least a part of a processing content setting unit 5310 which sets contents of a process to be performed in an image based on the moving information detected by the moving information detection unit 5600. Furthermore, the distance calculation unit 521 constitutes at least a part of a processing calculation unit 5210 which performs processing calculation corresponding to the contents of the process set by the processing content setting unit 5310.
Processes executed by the image processing apparatus 5 will be described below with reference to the flow chart in
As shown in
The imaging unit 10 performs an image pickup process which picks an image of a predetermined field of view, and outputs the generated image signal group to the image analysis unit 520 as image information (step S505). The distance calculation unit 521 performs a distance calculation process which calculates a distance to an object on the basis of the image signal group corresponding to the window designated by the window switching unit 531, forms distance information in which the calculated distance is caused to correspond to the position of the object on the image, and outputs the distance information to the control unit 530 (step S507). The control unit 530 outputs the distance information and predetermined processing information based on the distance information to an output unit 540 (step S509) to end a series of processes. The control unit 530 stores the probed information 552, the image information 553, and the distance information 554 generated in the processing steps in the storage unit 550 as needed.
The series of processes is continuously repeated unless a predetermined end of processing or a designation of interruption is received from a passenger or the like of the given vehicle. The series of processes is described above as being executed sequentially. However, it is preferable that processes of independent processors be executed in parallel to increase the speed of the processing cycle. Furthermore, a subsequent speed state may be predicted on the basis of time-series speed information stored in the probed information 552, and the speed probing process may be arbitrarily skipped to increase the speed of the processing cycle. The image pickup process in step S505 may be performed immediately before step S501 or step S503.
The window switching process in step S503 shown in
In the flow chart shown in
An example of a window selected by the window switching unit 531 will be described below.
On the other hand,
On the other hand,
An example of distance information calculated and formed by the distance calculation unit 521 will be described below.
On the other hand,
The image processing apparatus 5 according to the fifth embodiment described above selects an image processing range on the basis of the moving speed of the given vehicle and performs distance calculation on the basis of the image signal group corresponding to the selected image processing range. For this reason, in comparison with a conventional image processing apparatus which performs distance calculation for all image signals of an image signal group, the load of the process for distance calculation can be reduced, and the time required for distance calculation can be shortened. As a result, the image processing apparatus 5 can shorten the processing time required until the distance information is output after an image is acquired and can output the distance information at a high speed.
In the fifth embodiment, a moving speed, a moving direction, or position information is detected as the moving information of the given vehicle to select a window. However, a moving acceleration, a moving orientation, and the like may be detected to select a window. A direction of the face or a line of sight of the driver of the given vehicle may also be detected to select a window.
A sixth embodiment of the present invention will be described below. In the fifth embodiment described above, the window switching unit 531 selects a window on the basis of a speed detected by the speed sensor 561. However, in the sixth embodiment, a moving direction of a given vehicle is detected to select a window.
The steering angle sensor 661 is a sensor which detects, as a moving direction, a steering angle serving as the right and left rotating angle of the front wheels, for example, a sensor which detects the steering angle on the basis of a rotating angle and a rotating direction of the steering wheel. The probing unit 660 outputs the steering angle detected by the steering angle sensor 661 to the control unit 630 as moving information of the given vehicle. The steering angle sensor 661 may be a sensor which detects a moving direction by using a gyroscope, a sensor which detects a moving direction from the blinking state of a direction indicator, or the like.
Processes executed by the image processing apparatus 6 will be described below.
As shown in
Thereafter, the image processing apparatus 6, as shown in
In the image processing apparatus 6, as in the image processing apparatus 5, processes of independent processors are preferably executed in parallel to increase the speed of the processing cycle. Furthermore, a subsequent state of the steering angle may be predicted on the basis of time-series steering angle information stored in the probing information 652, and the steering angle probing process may be arbitrarily skipped to increase the speed of the processing cycle. The image pickup process in step S605 may be performed immediately before step S601 or step S603.
The window switching process in step S603 shown in
On the other hand, when the window switching unit 631 determines that the direction is left (step S624: left), the window switching unit 631 selects a left window (step S630), outputs designation information for switching the window to the selected window to the image analysis unit 520 together with the window information (step S632), and returns the operation to step S603. On the other hand, when the window switching unit 631 determines that the direction is straightforward (step S624: straightforward), the window switching unit 631 selects a straightforward window (step S634), outputs designation information for switching the window to the selected window to the image analysis unit 520 together with the window information (step S636), and returns the operation to step S603.
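A minimal Python sketch of this determination is given below; the threshold separating the straightforward direction from the right and left directions is an assumption made only for illustration.

def switch_window_by_steering(steering_angle_deg: float,
                              straight_threshold_deg: float = 10.0) -> str:
    # Classify the probed steering angle into right, left, or
    # straightforward, and return the name of the matching window.
    if steering_angle_deg > straight_threshold_deg:
        return "right_window"
    if steering_angle_deg < -straight_threshold_deg:
        return "left_window"
    return "straightforward_window"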
In the flow chart shown in
An example of a window selected by the window switching unit 631 will be described below.
On the other hand,
On the other hand,
The image processing apparatus 6 according to the sixth embodiment described above can select an image processing range on the basis of the steering angle of the given vehicle and perform distance calculation on the basis of the image signal group corresponding to the selected image processing range. For this reason, in comparison with a conventional image processing apparatus which performs distance calculation for all image signals of an image signal group, the load of the process for distance calculation can be reduced, and the time required for distance calculation can be shortened. As a result, the image processing apparatus 6 can shorten the processing time required until the distance information is output after an image is acquired and can output the distance information at a high speed.
A seventh embodiment of the present invention will be described below. In the fifth and the sixth embodiments described above, the window switching unit selects a window on the basis of the speed or the steering angle detected by the probing unit. However, in the seventh embodiment, the position of the given vehicle is detected, map information is referred to on the basis of the detected position, and a window is selected on the basis of the map information referred to.
The external communication unit 770 is a means for detecting the position of the given vehicle, for example, a communication means using a GPS (Global Positioning System). The external communication unit 770 outputs the detected position to the control unit 730 as moving information of the given vehicle. The external communication unit 770 may detect the position of the given vehicle and acquire map information near the detected position. As the external communication unit 770, a sensor or the like which detects a position by applying a gyroscope or a speedometer may be used in place of the communication means.
Processes executed by the image processing apparatus 7 will be described below.
As shown in
Thereafter, the image processing apparatus 7, as shown in
In the image processing apparatus 7, as in the image processing apparatus 5, processes of independent processors are preferably executed in parallel to increase the speed of the processing cycle. Furthermore, a subsequent position may be predicted on the basis of time-series position information or the like stored in the position information 752, and the position probing process may be arbitrarily skipped to increase the speed of the processing cycle. The image pickup process in step S705 may be performed immediately before step S701 or step S703.
An example of the window switching process in step S703 shown in
When the window switching unit 731 determines that the given vehicle runs on a freeway (step S726: Yes), the window switching unit 731 selects a freeway window from the window information 751 (step S728), outputs designation information which switches the window to the selected window to the image analysis unit 520 together with the window information (step S730), and returns the operation to step S703.
On the other hand, when the window switching unit 731 determines that the given vehicle does not run on a freeway (step S726: No), the window switching unit 731 selects a standard window (step S732), outputs designation information which switches the window to the selected window to the image analysis unit 520 together with the window information (step S734), and returns the operation to step S703.
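A Python sketch of this determination is given below; the map_lookup callable and the road type string are assumptions standing in for the map information referred to by the window switching unit 731.

def switch_window_by_road_type(position: tuple[float, float],
                               map_lookup) -> str:
    # Look up the road type at the position detected via GPS in the map
    # information and select the freeway window or the standard window.
    road_type = map_lookup(position)
    return "freeway_window_751a" if road_type == "freeway" else "standard_window"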
An example of the window selected by the window switching unit 731 will be described below.
On the other hand,
Various patterns of the freeway window can be set. For example, as a slit pattern of the freeway window 751a shown in
Another example of the window switching process in step S703 shown in
When the window switching unit 731 determines that the road surface on which the given vehicle runs is a concave surface (step S746: concave surface), the window switching unit 731 selects a concave surface window from the window information 751 (step S748), outputs designation information which switches the window to the selected window to the image analysis unit 520 together with the window information (step S750), and returns the operation to step S703. On the other hand, when the window switching unit 731 determines that the road surface is a convex surface (step S746: convex surface), the window switching unit 731 selects a convex surface window (step S752), outputs designation information which switches the window to the selected window to the image analysis unit 520 together with the window information (step S754), and returns the operation to step S703.
On the other hand, when the window switching unit 731 determines that the road surface is a flat surface (step S746: flat surface), the window switching unit 731 selects a standard window (step S756), outputs designation information which switches the window to the selected window to the image analysis unit 520 together with the window information (step S758), and returns the operation to step S703.
A concave road surface means a road surface whose gradient tends to increase in the elevation angle direction, for example, a road surface where a flat road changes into an ascending slope or where a descending slope changes into a flat road. On the other hand, a convex road surface means a road surface whose gradient tends to increase in the depression angle direction, for example, a road surface where a flat road changes into a descending slope or where an ascending slope changes into a flat road.
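A minimal Python sketch of such a classification is given below; representing the gradients read out of the map information as percentage grades and the tolerance value are assumptions made only for illustration.

def classify_road_surface(grade_ahead_percent: float,
                          grade_here_percent: float,
                          tolerance: float = 0.5) -> str:
    # An increasing gradient in the elevation angle direction is treated as
    # concave, an increasing gradient in the depression angle direction as
    # convex; otherwise the road surface is treated as flat.
    change = grade_ahead_percent - grade_here_percent
    if change > tolerance:
        return "concave_surface_window"
    if change < -tolerance:
        return "convex_surface_window"
    return "standard_window"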
An example of a window selected by the window switching unit 731 on the basis of an inclination of a road surface gradient will be described below.
On the other hand,
In the flow charts shown in
The image processing apparatus 7 according to the seventh embodiment described above selects an image processing range on the basis of the position information of the given vehicle and the map information which is referred to on the basis of the position information, and performs distance calculation on the image signal group corresponding to the selected image processing range. For this reason, in comparison with a conventional image processing apparatus which performs distance calculation for all image signals of an image signal group, the load of the process for distance calculation can be reduced, and the time required for distance calculation can be shortened. As a result, the image processing apparatus 7 can shorten the processing time required until the distance information is output after an image is acquired and can output the distance information at a high speed.
An eighth embodiment of the present invention will be described below. An image processing apparatus according to the eighth embodiment includes all of the speed sensor 561, the steering angle sensor 661, and the external communication unit 770 which are included in the image processing apparatuses 5, 6, and 7 according to the fifth to the seventh embodiments described above, multilaterally detects moving information of the given vehicle, and selects a window matched with the detected moving information.
In the image processing apparatus 8, the window switching unit 831 independently or totally determines a speed detected by the speed sensor 561, a steering angle detected by the steering angle sensor 661, and map information based on position information received by the external communication unit 770, selects a window matched with the determination result from the window information 851, and outputs designation information which switches a window to the selected window to the image analysis unit 520 together with the window information. In this sense, in the eighth embodiment, the probing unit 860 and the external communication unit 770 constitute at least a part of a moving information detection unit 8700 as a whole.
Thereafter, the distance calculation unit 521, as in the fifth to the seventh embodiments described above, calculates a distance to an object on the basis of the image signal group corresponding to the selected window, forms distance information, and outputs the distance information. The window switching unit 831 may recognize at least one available combination of the various pieces of moving information acquired from the probing unit 860 or the external communication unit 770 as a mode and determine the state of the given vehicle with respect to a mode designated by a passenger or the like of the given vehicle. The window switching unit 831 may also predict a subsequent state of the given vehicle on the basis of time-series moving information stored in the probing information 852 and the position information 752.
The image processing apparatus 8 according to the eighth embodiment described above selects an image processing range on the basis of a result obtained by independently or totally determining the various pieces of moving information of the given vehicle, and performs distance calculation on the basis of the image signal group corresponding to the selected image processing range. For this reason, in comparison with a conventional image processing apparatus which performs distance calculation for all image signals of an image signal group, the load of the process for distance calculation can be reduced, and the time required for distance calculation can be shortened. As a result, the image processing apparatus 8 can shorten the processing time required until the distance information is output after an image is acquired and can output the distance information at a high speed.
A ninth embodiment of the present invention will be described below. In the fifth to the eighth embodiments, a distance to an object the image of which is picked is detected by processing the image signal group from the imaging unit 10. However, in the ninth embodiment, a distance to an object located in the field of image pickup view is detected by a radar.
The radar 980 transmits a predetermined outgoing wave, receives a reflected wave obtained when the outgoing wave is reflected by an object surface, and detects, on the basis of the outgoing state and the receiving state, a distance from the radar 980 to the object which reflects the outgoing wave and a direction in which the object is located. The radar 980 detects the distance and the direction of the object which reflects the outgoing wave on the basis of an outgoing angle of the outgoing wave, an incident angle of the reflected wave, a receiving intensity of the reflected wave, the time from when the outgoing wave is emitted to when the reflected wave is received, and a change in frequency between the outgoing wave and the reflected wave. The radar 980 outputs the distance to the object located in the field of image pickup view of the imaging unit 10 to the control unit 930 together with the direction in which the object is located. The radar 980 transmits, for example, a laser beam, an infrared ray, a millimeter wave, a microwave, an ultrasonic wave, or the like as a transmitter wave.
In the image processing apparatus 9 according to the ninth embodiment, a distance is detected by the radar 980 in place of calculating a distance by processing image information obtained from the imaging unit 10. For this reason, distance information can be acquired at a higher speed and with higher accuracy.
In the image processing apparatus 9, matching between a positional relationship in the image signal group picked by the imaging unit 10 and a positional relationship in the detection range of the radar 980 is obtained as follows before the processes are performed. For example, the image processing apparatus 9 performs an image pickup process by the imaging unit 10 and a detection process by the radar 980 on an object having a known shape, and obtains the positions of the known object processed by the imaging unit 10 and the radar 980. Thereafter, the image processing apparatus 9 obtains a relationship between the positions of the known object processed by the imaging unit 10 and the radar 980 by using the method of least squares, and matches the positional relationship in the image signal group picked by the imaging unit 10 with the positional relationship in the detection range of the radar 980.
In the image processing apparatus 9, even though the image pickup original point of the imaging unit 10 and the detection original point of the radar 980 are different from each other, if the distances from the image pickup original point and the detection original point to the object are sufficiently long, the image pickup original point and the detection original point can be regarded as almost overlapping. Furthermore, when the matching between the positional relationship in the image signal group picked by the imaging unit 10 and the positional relationship in the detection range of the radar 980 is accurately performed, the difference between the image pickup original point and the detection original point can be corrected by geometric conversion.
In the image processing apparatus 9, the radar detection points of the radar 980 are preferably set to be located at predetermined intervals in the pixel rows in which the image signals of the image signal group picked by the imaging unit 10 are arranged. When the radar detection points are not set in this manner, interpolation points of the radar detection points may be obtained in the same pixel row as that of the image signals by using linear interpolation or the like on the basis of a plurality of radar detection points located near the image signals, and the detection process may be performed by using the interpolation points.
The image analysis unit 1020 has a distance information generating unit 1021, which calculates a distance to an object the image of which is to be picked and which is included in the region of the field of view picked by the imaging unit 10, and an image processing unit 1022, which performs image processing corresponding to the distance information and to the speed of the given vehicle detected by the speed detection unit 1060. As is apparent from this explanation, the distance information generating unit 1021 has the function of a distance calculation unit.
The control unit 1030 has a processing selection unit 1031 which selects an image processing method actually performed by the image processing unit 1022 from a plurality of image processing methods.
The storage unit 1050 stores image data 1051 picked by the imaging unit 10, distance information 1052 of constituent points (pixel points) of the image data 1051, image processing methods 1053 to be selected by the processing selection unit 1031, a template 1054 which expresses, in units of pixel points, patterns of various objects (vehicle, person, road surface, white line, indicator, and the like) used in object recognition in an image, and zone dividing information 1055 obtained by dividing a range of available speeds of a running vehicle and a range of distances in which image pickup can be performed by the imaging unit 10 into pluralities of zones, respectively.
In the tenth embodiment, the speed detection unit 1060 constitutes at least a part of a moving information detection unit 1160 which detects moving information including a speed of a mobile object. The processing selection unit 1031 constitutes at least a part of a processing content setting unit 1131 which sets contents of processing to be performed in an image based on moving information detected by the moving information detection unit 1160. Furthermore, the image processing unit 1022 constitutes at least a part of a processing calculation unit 1122 which performs processing calculation corresponding to the contents of the processing set by the processing content setting unit 1131.
An image processing method executed by the image processing apparatus 101 having the above configuration will be described below in detail with reference to a flow chart shown in
Thereafter, the distance information generating unit 1021 calculates, for each pixel point, a distance to an object the image of which is to be picked and which is included in the field of view the image of which is picked (step S1003). The distance information generating unit 1021 calculates coordinates of all pixel points or some pixel points in the picked field of view by using the right and left camera coordinate systems. Subsequently, the distance information generating unit 1021 calculates a distance R from the front surface of the vehicle to the point the image of which is picked, by using the calculated coordinates (x, y, z) of the pixel points. In this case, a position of the front surface of the vehicle in the camera coordinate system must be known in advance. Thereafter, the distance information generating unit 1021 stores the calculated coordinates (x, y, z) of all the pixel points or some pixel points and the distance R in the storage unit 1050.
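A hedged sketch of the per-pixel distance calculation in step S1003 follows. It assumes a standard parallel-stereo triangulation; the focal length, baseline, principal point, and the offset of the vehicle front surface are placeholder values, since the embodiment only states that the coordinates (x, y, z) are obtained from the right and left camera coordinate systems and that R is measured from the front surface.

```python
# Sketch of step S1003: camera coordinates from stereo disparity, then the distance R
# measured from the vehicle front surface.  All constants are illustrative assumptions.
import math

FOCAL_LENGTH_PX = 800.0       # focal length in pixels (assumed)
BASELINE_M = 0.30             # distance between the right and left cameras (assumed)
FRONT_SURFACE_OFFSET_M = 1.5  # camera origin to vehicle front surface along z (assumed)


def pixel_to_camera_coords(u, v, disparity_px, cx=320.0, cy=240.0):
    """Triangulate camera coordinates (x, y, z) of a pixel point from its disparity."""
    z = FOCAL_LENGTH_PX * BASELINE_M / disparity_px
    x = (u - cx) * z / FOCAL_LENGTH_PX
    y = (v - cy) * z / FOCAL_LENGTH_PX
    return x, y, z


def distance_from_front_surface(x, y, z):
    """Distance R from the vehicle front surface to the imaged point."""
    return math.sqrt(x * x + y * y + (z - FRONT_SURFACE_OFFSET_M) ** 2)


x, y, z = pixel_to_camera_coords(u=400.0, v=260.0, disparity_px=12.0)
print(round(distance_from_front_surface(x, y, z), 2))  # about 18.6 m for these values
```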
The distance information including the distance calculated in step S1003 may be superposed on the image generated in step S1001 to generate a distance image.
In parallel to the processes (steps S1001 and S1003) in the imaging unit 10 and the distance information generating unit 1021, the speed detection unit 1060 detects a speed of the given vehicle (step S1005).
The processing selection unit 1031 in the control unit 1030 selects, on the basis of the distance calculation result in step S1003 and the speed detection result in step S1005, an image processing method to be executed by the image processing unit 1022 on the points in the image, depending on the distance calculated in step S1003, from the image processing methods 1053 stored in the storage unit 1050 (step S1007). Thereafter, the image processing unit 1022 performs image processing according to the image processing method selected by the processing selection unit 1031 (step S1009). At this time, the image processing unit 1022 reads the image processing method selected by the processing selection unit 1031 from the storage unit 1050, and performs the image processing based on the read image processing method.
Processing contents shown in
When the processing region belongs to a long-distance zone and the vehicle moves at a speed belonging to a high-speed zone, vehicle detection is performed on the processing region. In the vehicle detection performed here, when the brake must be applied because a preceding vehicle has come close to the given vehicle by a predetermined distance, image processing is performed such that the driver is notified that the brake must be applied. For this reason, when a pattern of the vehicle having the size obtained when the preceding vehicle comes close to the given vehicle by the predetermined distance is stored in the template 1054, the vehicle is not detected while the preceding vehicle runs farther ahead than the predetermined distance.
A case in which the processing region belongs to a short-distance zone and the vehicle moves at a speed belonging to a high-speed zone will be described below. In this case, white line detection is performed on the corresponding processing region. Likewise, when the processing region belongs to a long-distance zone and the vehicle moves at a speed belonging to a low-speed zone, white line detection is performed on the processing region. As concrete image processing for the white line detection, for example, when a white line is detected and the pattern of the detected white line is different from that in a normal running state, the driver may be notified that the pattern is different from that in the normal running state. In this case, a lane dividing belt (yellow line or the like) other than the white line may also be detected.
When the processing region belongs to the short-distance zone and the vehicle moves at a speed belonging to a low-speed zone, detection of a vehicle or a person is performed on the processing region. At the same time, an obstacle may be detected. When an obstacle is detected, a message such as "Please avoid the obstacle" may be output.
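The combinations described above can be read as a lookup from a (distance zone, speed zone) pair to a processing content. A minimal sketch of such a selection is given below; the numerical zone boundaries are assumptions added only to make the example runnable, and the table entries follow the combinations just described.

```python
# Sketch of the selection in step S1007.  The zone boundaries are assumed values;
# the (distance zone, speed zone) -> processing pairs follow the description above.

PROCESS_TABLE = {
    ("long", "high"): "vehicle detection",
    ("short", "high"): "white line detection",
    ("long", "low"): "white line detection",
    ("short", "low"): "vehicle/person detection (optionally obstacle detection)",
}


def distance_zone(distance_m: float) -> str:
    return "long" if distance_m >= 40.0 else "short"   # 40 m boundary is an assumption


def speed_zone(speed_kmh: float) -> str:
    return "high" if speed_kmh >= 60.0 else "low"      # 60 km/h boundary is an assumption


def select_processing(distance_m: float, speed_kmh: float) -> str:
    return PROCESS_TABLE[(distance_zone(distance_m), speed_zone(speed_kmh))]


print(select_processing(distance_m=80.0, speed_kmh=100.0))  # -> vehicle detection
```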
The image processing method described above is only an example. Another image processing method may be selected by the processing selection unit 1031. For example, in
According to the tenth embodiment of the present invention described above, an image processing method is selected depending on a distance, obtained from the picked image, to an object the image of which is to be picked and on a speed of the given vehicle serving as a mobile object, and the selected image processing method is applied, so that various pieces of information included in the picked image can be multilaterally and efficiently processed.
In the tenth embodiment, a combination of a speed zone and a distance region is not limited to the combination described above.
The distance zones may also be divided on the basis of a braking distance at that time, and an image processing method can be determined on the basis of the divided distance zones. In this case, for example, the short-distance zone falls within the braking distance at that time, the middle-distance zone ranges from the braking distance to two times the braking distance, and the long-distance zone is equal to or larger than two times the braking distance at that time. "Obstacle + white line" are detected in the short-distance zone, "person + vehicle + obstacle + white line" are detected in the middle-distance zone, and "white line" is detected in the long-distance zone.
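One reading of this braking-distance-based division is sketched below: the current braking distance defines the zone boundaries, and each zone is mapped to the detection targets listed above. The braking-distance formula v^2 / (2 * mu * g) and the friction coefficient are assumptions added for illustration; the embodiment does not specify how the braking distance is obtained.

```python
# Sketch: distance zones defined relative to the braking distance at the current speed.
# The braking-distance formula and the friction coefficient are illustrative assumptions.

GRAVITY = 9.8          # m/s^2
FRICTION_COEFF = 0.7   # assumed tire-road friction coefficient


def braking_distance_m(speed_kmh: float) -> float:
    v = speed_kmh / 3.6  # km/h to m/s
    return v * v / (2.0 * FRICTION_COEFF * GRAVITY)


def detection_targets(distance_m: float, speed_kmh: float) -> str:
    b = braking_distance_m(speed_kmh)
    if distance_m <= b:
        return "obstacle + white line"                       # short-distance zone
    if distance_m <= 2.0 * b:
        return "person + vehicle + obstacle + white line"    # middle-distance zone
    return "white line"                                      # long-distance zone


print(detection_targets(distance_m=30.0, speed_kmh=60.0))  # falls in the middle-distance zone
```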
Furthermore, in addition to the indexes of distance and speed, personal information of the vehicle driver, for example, driving experience, age, sex, operation experience with the vehicle being operated, and the like may be input in advance. When the divisions of the zones are changed depending on these pieces of personal information, image processing can be performed more exactly depending on the conditions of the vehicle operator.
A selection method changing unit which changes a selection method for image processing methods in the processing selection unit can also be further arranged in the image processing apparatus according to the tenth embodiment. In this manner, an optimum image processing method can be selected and executed depending on various conditions in the traveling state of the vehicle.
Even in the tenth embodiment, in addition to various cameras, a distance measuring apparatus such as a radar can also be used. In this manner, a radar measurement value having an accuracy higher than that of a normal camera can be used, and a resolving power of distance measurement points can be further improved.
As a method of image recognition in the tenth embodiment, in addition to the template matching described above, a popularly used object recognition method such as a region dividing method based on edge extraction or a statistical pattern recognition method based on cluster analysis can be applied.
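For the template matching named above, a brief OpenCV sketch is given below; the file names and the acceptance threshold are placeholders, and this is only one of many ways such matching could be carried out.

```python
# Sketch of template matching for object recognition; file names and threshold are placeholders.
import cv2

image = cv2.imread("road_scene.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("vehicle_template.png", cv2.IMREAD_GRAYSCALE)

# Normalized cross-correlation over the whole image; the best match is the maximum response.
response = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(response)

if max_val > 0.8:  # assumed acceptance threshold
    print("object recognized at", max_loc, "with score", round(max_val, 2))
```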
An eleventh embodiment of the present invention will be described below. In the first to the tenth embodiments described above, a stereo image is picked by two cameras, i.e., the right camera 11a and the left camera 11b. However, in the eleventh embodiment, a stereo image is picked by an image pickup element which has one pair of waveguide optical systems and image pickup regions corresponding to the waveguide optical systems and converts optical signals guided by the waveguide optical systems into electric signals in the image pickup regions, respectively.
The imaging unit 110 includes a camera 111 serving as an image pickup apparatus having the same configuration and the same functions as those of the right camera 11a and the left camera 11b of the imaging unit 10. The camera 111 includes a lens 112, an image pickup element 113, an A/D conversion unit 114, and a frame memory 115. Furthermore, the imaging unit 110 includes a stereo adapter 119 serving as one pair of waveguide optical systems constituted by mirrors 119a to 119d in front of the camera 111. The stereo adapter 119, as shown in
In the imaging unit 110, light from an object located in the field of image pickup view is received by the right and left mirror pairs of the stereo adapter 119, converged by the lens 112 serving as an image pickup optical system, and picked by the image pickup element 113. At this time, as shown in
In the imaging unit 110 according to the eleventh embodiment, since a stereo image is picked by one camera having a stereo adapter, an imaging unit can be simplified and made compact in comparison with a configuration in which a stereo image is picked by two cameras, mechanical strength can be increased, and right and left images can be always picked in a relatively stable state. Furthermore, since right and left images are picked by using the common lens and the common image pickup element, a fluctuation caused by individual differences can be suppressed, and trouble of calibration and cumbersome assembly such as alignment can be reduced.
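Since the stereo adapter forms the right and left views on the single image pickup element while shifting them horizontally, the two views could be recovered from one captured frame roughly as follows. This is a sketch under the assumption that each view occupies one horizontal half of the frame; the actual split depends on the adapter geometry.

```python
# Sketch: recover the right and left views from one frame captured through the stereo
# adapter, assuming the two views occupy the left and right halves of the frame.
import numpy as np


def split_stereo_frame(frame: np.ndarray) -> tuple:
    """Return (left_view, right_view) from a side-by-side stereo-adapter frame."""
    half = frame.shape[1] // 2
    return frame[:, :half], frame[:, half:]


frame = np.zeros((480, 640), dtype=np.uint8)  # placeholder frame from the image pickup element
left, right = split_stereo_frame(frame)
print(left.shape, right.shape)                # (480, 320) (480, 320)
```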
As a configuration of the stereo adapter, in
In the eleventh embodiment, as shown in
Furthermore, in the configuration of the eleventh embodiment, the right and left images are picked while being horizontally shifted from each other. Alternatively, for example, the plane mirrors of the stereo adapter may be combined with each other almost orthogonally so that the right and left images are picked while being vertically shifted.
Up to now, the preferable embodiments of the present invention have been described in detail. However, the present invention is not limited to the first to eleventh embodiments described above. For example, as the stereo camera of the imaging unit, a stereo camera having a larger number of views, for example, a three-view stereo camera or a four-view stereo camera, may be arranged. It is known that a more stable processing result with higher reliability is obtained in a three-dimensional reconfiguration process or the like by using a three-view or four-view stereo camera (see Tomita Fumiaki, "Advanced Three-Dimensional Vision System VVV", Journal of Information Processing Society of Japan "Information Processing", Vol. 42, No. 4, pp. 370-375 (2001), for example). In particular, it is known that, when a plurality of cameras are arranged to have base lengths in two directions, a three-dimensional reconfiguration can be obtained even in a more complex scene. In addition, when a plurality of cameras are arranged in one base length direction, a stereo camera of a so-called multi-baseline type can be realized, and more accurate stereo measurement can be achieved.
Furthermore, as the camera of the imaging unit, a single-eye camera may be used in place of a compound-eye stereo camera. In this case, a three-dimensional reconfiguration technique such as a shape from focus method, a shape from defocus method, a shape from motion method, or a shape from shading method is applied to make it possible to calculate a distance to an object in the field of image pickup view.
The shape from focus method is a method which calculates a distance from the focus position at which the best focus is obtained. The shape from defocus method is a method which obtains a relative amount of blur from a plurality of images having different focal distances and obtains a distance on the basis of a correlation between the amount of blur and the distance. The shape from motion method is a method which obtains a distance to an object on the basis of the moving track of a predetermined characteristic point in a plurality of images which are continuous in terms of time. The shape from shading method is a method which calculates a distance to an object on the basis of the light and shade on an image, a reflecting characteristic of a target object, and light source information.
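As an illustration of the shape from focus method mentioned first, the following sketch scores focus with a smoothed squared Laplacian over a focal stack and, for each pixel, reports the index of the best-focused image; converting that index into a distance would additionally require a calibration of focus position against distance, which is assumed here and not shown.

```python
# Sketch of the shape from focus idea: per pixel, pick the focal-stack image with the
# sharpest response.  The focal stack and the focus-to-distance calibration are assumed.
import numpy as np
import cv2


def focus_measure(img: np.ndarray, ksize: int = 9) -> np.ndarray:
    """Local sharpness: squared Laplacian response, averaged over a small window."""
    lap = cv2.Laplacian(img.astype(np.float64), cv2.CV_64F)
    return cv2.boxFilter(lap * lap, -1, (ksize, ksize))


def best_focus_index(focal_stack: list) -> np.ndarray:
    """Index of the best-focused image for every pixel."""
    scores = np.stack([focus_measure(img) for img in focal_stack], axis=0)
    return np.argmax(scores, axis=0)


# Placeholder focal stack: in practice, frames taken at different focus positions.
stack = [np.random.rand(120, 160).astype(np.float32) for _ in range(5)]
print(best_focus_index(stack).shape)  # (120, 160)
```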
Furthermore, in the explanation of the configuration of the imaging unit 10 in the first to the tenth embodiments or the configuration of the imaging unit 110 in the eleventh embodiment, one pair of cameras or light-receiving units of the stereo adapter are arranged to be horizontally aligned. However, the cameras or the light-receiving units may be vertically aligned or may be aligned in an oblique direction.
In the first to eleventh embodiments, the image processing apparatus mounted on a vehicle such as a four-wheel vehicle is explained. However, the image processing apparatus can also be mounted on another mobile object, for example, an electric wheelchair or the like, and can be mounted not only on a vehicle but also on a mobile object such as a person or a robot. Furthermore, the entire image processing apparatus need not be mounted on the mobile object. For example, the imaging unit and the output unit may be mounted on the mobile object while the other constituent parts are arranged outside the mobile object, and these parts may be connected to each other by wireless communication.
Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---|
2005-137850 | May 2005 | JP | national |
2005-145825 | May 2005 | JP | national |
2005-145826 | May 2005 | JP | national |
This application is a continuation of PCT international application Ser. No. PCT/JP2006/309419, filed May 10, 2006, which designates the United States and claims the benefit of priority from Japanese Patent Applications No. 2005-137850, filed May 10, 2005; No. 2005-145825, filed May 18, 2005; and No. 2005-145826, filed May 18, 2005, all of which are incorporated herein by reference.
Number | Date | Country | |
---|---|---|---|
Parent | PCT/JP2006/309419 | May 2006 | US |
Child | 11936492 | Nov 2007 | US |