The disclosure of Japanese Patent Application No. 2019-116382 filed on Jun. 24, 2019 including the specification, drawings and abstract is incorporated herein by reference in its entirety.
The present disclosure relates to an in-vehicle information recording device.
Japanese Unexamined Patent Application Publication No. 2013-80518 (JP 2013-80518 A) discloses an in-vehicle moving image data recording device that compresses an image captured by a camera unit at high quality and records the high-quality moving image data based on a trigger detection time at which an abnormal situation is detected by a trigger detection unit.
There are vehicles equipped with a plurality of cameras (capturing units) having different angles of view in order to perform driving assistance. The angle of view of a camera used for driving assistance is generally narrower or wider than the angle of view of a camera suitable for recording vehicle information. Therefore, when an information recording device (drive recorder) for recording vehicle information, such as that disclosed in JP 2013-80518 A, is mounted, a camera dedicated to the information recording device must also be mounted, and there is therefore room for improvement.
The present disclosure provides an in-vehicle information recording device that can record an image suitable for vehicle information recording without adding a new capturing unit.
A first aspect of the present disclosure relates to an in-vehicle information recording device including a first capturing unit, a second capturing unit, an information acquisition unit, and a recording unit. The first capturing unit is configured to capture a first image at a first angle of view. The first image includes at least a part of a target vehicle in front of or behind a vehicle and the surrounding area of the target vehicle. The second capturing unit is configured to capture a second image at a second angle of view that is narrower than the first angle of view. The second image includes target information so that the target information on a specific target part of the target vehicle or the surrounding area is identifiable. The information acquisition unit is configured to acquire the target information from the second image. The recording unit is configured to record the target information in association with the first image. The target information is acquired by the information acquisition unit.
In the in-vehicle information recording device according to the first aspect of the present disclosure, the first capturing unit captures the first image at the first angle of view. The second capturing unit captures the second image, which includes the target information, at the second angle of view that is narrower than the first angle of view so that the target information on the specific target part of the target vehicle or the surrounding area can be identified. In addition, the information acquisition unit acquires the target information from the second image. Then, the recording unit records the target information in association with the first image. Thus, even when not identified in the first image, the specific target part can be identified by the target information associated with the first image. This allows an image suitable for the information recording of the vehicle to be recorded without adding a new capturing unit.
In the in-vehicle information recording device according to the first aspect, the recording unit may be configured to record the target information in association with the first image. The target information includes number information on the target vehicle.
In the in-vehicle information recording device according to the first aspect, since the target information includes the number information, one target vehicle can be uniquely identified based on the target information. This allows the target vehicle to be identified without recording a plurality of pieces of target information.
In the in-vehicle information recording device according to the first aspect, the recording unit may be configured to record the target information in association with the first image. The target information includes vehicle traveling rule information on the surrounding area.
In the in-vehicle information recording device according to the first aspect, the target information includes the vehicle traveling rule information on the surrounding area. Therefore, it is possible to determine whether the target vehicle, which has been recorded, is traveling in compliance with the vehicle traveling rule.
In the in-vehicle information recording device according to the first aspect, the horizontal angle of view that is the first angle of view of the first capturing unit may be set to 180° or more.
In the in-vehicle information recording device according to the first aspect, since the horizontal angle of view of the first capturing unit is set to 180° or more, the capturing range is larger than that of a configuration in which the horizontal angle of view is less than 180°. This means that more surrounding information on the vehicle can be acquired.
As described above, the present disclosure provides an excellent effect that an image suitable for vehicle information recording can be recorded without adding a new capturing unit.
Features, advantages, and technical and industrial significance of exemplary embodiments of the disclosure will be described below with reference to the accompanying drawings, in which like signs denote like elements, and wherein:
The arrow FR indicates the forward longitudinal direction of the vehicle, the arrow RR indicates the rearward longitudinal direction of the vehicle, the arrow UP indicates the upward vertical direction of the vehicle, and the arrow OUT indicates the outward width direction of the vehicle. The vehicle longitudinal direction, the vehicle vertical direction, and the vehicle width direction are orthogonal to each other. In the description below, when a direction is mentioned simply as front-rear, up-down, or right-left, it refers to front-rear in the vehicle longitudinal direction, up-down in the vehicle vertical direction, or right-left in the vehicle width direction with respect to the traveling direction.
Next, the information recording device 30 will be described.
As shown in
The ECU 32 has a central processing unit (CPU) 34, a read only memory (ROM) 35, a random access memory (RAM) 36, and a storage 37.
The ROM 35 stores various programs and various data. The RAM 36, used as a work area, temporarily stores a program or data. The storage 37, configured, for example, by a flash read-only memory, stores various programs, including the operating system, and various data. The CPU 34 executes various programs, such as the information recording processing program stored in the ROM 35 or the storage 37, to control the operation of each part of the information recording device 30 and the recording of various types of information.
The camera unit 40 includes, for example, a front wide-angle camera 42, a front narrow-angle camera 43, side cameras 44, a rear wide-angle camera 45, and a rear narrow-angle camera 46. A camera having a relatively large angle of view (viewing angle) is referred to as a "wide-angle camera", and a camera having a smaller (narrower) angle of view than that of a "wide-angle camera" is referred to as a "narrow-angle camera". In the description below, the "angle of view" means the "horizontal angle of view", but is not limited thereto; the "angle of view" may instead be set as the "vertical angle of view" or the "diagonal angle of view". An angle of view may also be converted into another type of angle of view based on the aspect ratio of the image (video), which is set in advance.
Each camera of the camera unit 40 includes a lens and an image sensor such as a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor. Furthermore, the image (video) capturing interval of each camera is set, for example, to about 30 ms (milliseconds).
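For an ordinary rectilinear (non-fisheye) lens, the conversion between diagonal, horizontal, and vertical angles of view based on the aspect ratio can be sketched as below. This is a generic optics relation assumed here only for illustration; it is not specified in the present disclosure, and it does not apply to the 180°-or-more wide-angle cameras described later.

```python
# Hedged sketch: splitting a diagonal angle of view into horizontal and
# vertical angles for a rectilinear lens, given the image aspect ratio.
import math

def split_diagonal_fov(diagonal_deg, width=16, height=9):
    diag = math.hypot(width, height)
    half_tan = math.tan(math.radians(diagonal_deg) / 2.0)
    horizontal = 2.0 * math.degrees(math.atan(half_tan * width / diag))
    vertical = 2.0 * math.degrees(math.atan(half_tan * height / diag))
    return horizontal, vertical

# Example: a 100 degree diagonal angle of view on a 16:9 sensor.
print(split_diagonal_fov(100.0))  # roughly (92.2, 60.6) degrees
```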
As shown in
For example, the front narrow-angle camera 43 is provided near the stay of the room mirror 15 (see
The side cameras 44 are provided below a pair of right and left camera supports 24 (under the electronic mirror cameras 23 (see
The rear wide-angle camera 45, mounted for example at the rear end of the vehicle body 12 and at the center in the vehicle width direction, captures a first image G3 (see
The rear narrow-angle camera 46, mounted for example at the rear end of the vehicle body 12 and at the center in the vehicle width direction below the rear wide-angle camera 45, captures a second image G4 (see
The memory card 48 shown in
The main monitor 52 shown in
The recording switch 55, provided on the instrument panel 17, is turned on and off by an occupant (driver) not shown. The signal of the recording switch 55 is output to the ECU 32 (see
The collision prediction sensor 56 shown in
The timer 58 sends the current time information to the ECU 32. In response to an instruction from the ECU 32, the timer 58 is configured to be able to measure the time between two time points. The time information measured by the timer 58 is sent to the ECU 32.
The ignition sensor 59 detects the ON state (start state) or the OFF state (stop state) of the ignition key (not shown) of the vehicle 10. The information on the ON state or the OFF state detected by the ignition sensor 59 is sent to the ECU 32.
The information recording device 30 implements various functions using the above hardware resources when executing the information recording processing programs. The functional configuration implemented by the information recording device 30 will be described below. For the description of each component shown in
As shown in
The first capturing unit 60 includes a front wide-angle capturing unit 62 and a rear wide-angle capturing unit 64.
The front wide-angle capturing unit 62 captures the first image G1 (see
The second capturing unit 70 includes a front narrow-angle capturing unit 72 and a rear narrow-angle capturing unit 74.
The front narrow-angle capturing unit 72 captures the second image G2, which includes the target information, at the second angle of view θ2, which is narrower than the first angle of view θ1, so that the target information on a specific target part S (see
The specific target part S means a target part that is included in at least one of the target vehicle C and the surrounding area P. This target part is a part including the information that needs to be identified and recorded. In the description below, the license plate N of the target vehicle C and the speed sign V (see
The information acquisition unit 80 acquires the target information (number information and regulated speed information) on the specific target part S from the second image G2 and the second image G4. More specifically, the information acquisition unit 80 includes, for example, a narrow-angle image acquisition unit 82, a target identification unit 84, a target analysis unit 86, and a target information recording unit 88.
The narrow-angle image acquisition unit 82 acquires the information on the second image G2 and the second image G4 from the second capturing unit 70. The target identification unit 84 uses, for example, a known pattern matching technique and a known area recognition technique to identify (detect) the license plate N and the speed sign V included in the second image G2 and the second image G4. The target analysis unit 86 uses, for example, a known optical character recognition (OCR) method on the license plate N and the speed sign V identified by the target identification unit 84 to analyze the number information and the regulated speed information, and converts the analyzed result into text.
The target information recording unit 88 temporarily stores the number information and the regulated speed information that have been converted into text by the target analysis unit 86. That is, the number information and the regulated speed information stored in the target information recording unit 88 are automatically deleted or overwritten when a preset time has elapsed. The number information and the regulated speed information are sent to the recording unit 90. Note that, when the trigger signal described later is received by the ECU 32, the number information and the regulated speed information recorded during a period before and after the reception of the trigger signal are saved.
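As one way to picture how units 82 through 88 could fit together, here is a minimal Python sketch. The detection and OCR calls are placeholders standing in for the known pattern-matching and OCR techniques mentioned above, and the retention time, class names, and data layout are assumptions, not the disclosed implementation.

```python
import time
from collections import deque

RETENTION_S = 30.0  # assumed "preset time" after which stored entries expire

def detect_license_plate(narrow_image):
    """Placeholder for the pattern-matching / area-recognition step (unit 84)."""
    return narrow_image  # would return the plate (or speed sign) region

def ocr_to_text(region):
    """Placeholder for the OCR analysis step (unit 86)."""
    return "12-34"       # would return the recognized characters as text

class TargetInformationRecorder:
    """Rough stand-in for the target information recording unit 88."""
    def __init__(self, retention_s=RETENTION_S):
        self.retention_s = retention_s
        self.entries = deque()                     # (timestamp, kind, text)

    def store(self, kind, text, now=None):
        now = time.time() if now is None else now
        self.entries.append((now, kind, text))
        # Automatically drop entries once the preset time has elapsed.
        while self.entries and now - self.entries[0][0] > self.retention_s:
            self.entries.popleft()

    def around_trigger(self, t_trigger, window_s=10.0):
        """Entries recorded before and after a trigger, handed to unit 90."""
        return [e for e in self.entries if abs(e[0] - t_trigger) <= window_s]

def acquire_target_information(narrow_image, recorder):
    """Rough stand-in for units 82, 84, and 86 acting on one second image."""
    region = detect_license_plate(narrow_image)
    if region is not None:
        recorder.store("number", ocr_to_text(region))
```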
The recording unit 90 records the target information (number information and regulated speed information), acquired by the information acquisition unit 80, in association with the first image G1 and the first image G3. More specifically, the recording unit 90 includes a wide-angle image acquisition unit 92, a target identification unit 94, an information combining unit 96, and a combined information recording unit 98.
The wide-angle image acquisition unit 92 acquires the first image G1 and the first image G3 from the first capturing unit 60. The target identification unit 94 identifies the license plate N and the speed sign V included in the first image G1 and the first image G3 acquired by the wide-angle image acquisition unit 92. Note that the identification of a target by the target identification unit 94 means an identification to such an extent that the target can be roughly distinguished from the other parts. Therefore, the target identification unit 94 does not have to obtain the necessary target information.
During the period in which the number information and the regulated speed information can be acquired, the information combining unit 96 associates the license plate N included in the first image G1 and the first image G3 with the number information acquired by the information acquisition unit 80, while keeping the license plate N and the number information synchronized with respect to time. Furthermore, during that period, the information combining unit 96 associates the speed sign V included in the first image G1 and in the first image G3 with the regulated speed information acquired by the information acquisition unit 80, while keeping the speed sign V and the regulated speed information synchronized with respect to time. During the period in which the number information and the regulated speed information cannot be acquired, the information combining unit 96 associates the already acquired number information with the license plate N in the first image G1 and the first image G3, and associates the already acquired regulated speed information with the speed sign V included in the first image G1 and the first image G3.
In the description above, to “associate” means to associate one piece of information with another piece of information. Furthermore, “association” is not limited to the association between one piece of information and another piece of information that have been individually recorded but includes the collective recording (“combining”, “superimposing”) of one piece of information and another piece of information. In the description below, “combined information” means information created by combining target information and a first image in such a way that they are synchronized with respect to time.
The combined information recording unit 98 records the combined information, combined by the information combining unit 96, when the trigger signal (ON) is received. The combined information recording unit 98 stops recording the combined information when the trigger signal (OFF) is received. In the present embodiment, an example is shown in which the target information is combined with the first image G1 or G3 by superimposing (overlapping) the target information on the first image G1 or G3. However, combining the target information with the first image G1 or G3 includes not only superimposing the target information on the first image G1 or G3 but also displaying the target information side by side with the first image G1 or G3 to form one image.
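The association and trigger-controlled recording described above might be organized roughly as follows. The overlay helper, the carry-forward of the last acquired value, and the frame representation are illustrative assumptions rather than the disclosed design.

```python
class InformationCombiner:
    """Rough stand-in for the information combining unit 96 and the
    combined information recording unit 98."""
    def __init__(self):
        self.last_number = None    # already acquired number information
        self.recording = False
        self.saved = []            # combined information kept while recording

    def on_trigger(self, trigger_on):
        # Trigger ON starts recording of combined information; OFF stops it.
        self.recording = trigger_on

    def combine(self, wide_frame, timestamp, number_info=None):
        if number_info is not None:
            # Period in which the number information can be acquired:
            # keep the wide-angle frame and the text synchronized in time.
            self.last_number = number_info
        # Period in which it cannot be acquired: reuse the last acquired value.
        combined = overlay_text(wide_frame, self.last_number)
        if self.recording:
            self.saved.append((timestamp, combined))
        return combined

def overlay_text(frame, text):
    """Placeholder for superimposing (or displaying side by side) the text."""
    return (frame, text)
```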
When the inter-vehicle distance is small, the number information can be identified from the narrow-angle image. Therefore, a recorded image in which the wide-angle image (first image G1 or G3) and the number information are synchronized in real time is recorded in the recording unit 90 (see
When the inter-vehicle distance is large (very long), neither the number information on the target vehicle C nor the presence or absence of the target vehicle C can be identified. Therefore, the number information is not recorded in the recording unit 90. When the inter-vehicle distance is too small (see
Next, the operation of the information recording device 30 of the present embodiment will be described.
In the ECU 32, the CPU 34 reads the information recording processing program from the ROM 35 or the storage 37, loads the program into the RAM 36, and executes the program to perform the information recording processing.
In step S10, the CPU 34 checks the signal detected by the ignition sensor 59 to determine whether the ignition key is ON. When it is determined that the ignition key is ON (S10: Yes), the processing proceeds to step S12. When it is determined that the ignition key is OFF (S10: No), step S10 is repeated.
In step S12, the CPU 34 acquires the narrow-angle image information (second image G4 (see
In step S14, the CPU 34 acquires the wide-angle image information (first image G3 (see
In step S16, the CPU 34 determines whether the number information on the license plate N of the following vehicle CB can be analyzed. Whether the number information can be analyzed is determined, for example, by determining whether the inter-vehicle distance between the vehicle 10 and the following vehicle CB is within the range of a preset inter-vehicle distance. When it is determined that the number information can be analyzed (S16: Yes), the processing proceeds to step S18. When it is determined that the number information cannot be analyzed (S16: No), the processing proceeds to step S12.
In step S18, the CPU 34 analyzes the number information on the following vehicle CB using the target analysis unit 86 (see
In step S20, the CPU 34 stores the analyzed number information in the target information recording unit 88. Then, the processing proceeds to step S22.
In step S22, the CPU 34 determines whether the trigger signal is received. In other words, the CPU 34 determines whether the output signals from the recording switch 55 and the collision prediction sensor 56 are detected. When the trigger signal is received (S22: Yes), the processing proceeds to step S24. When the trigger signal is not received (S22: No), the processing proceeds to step S12.
In step S24, the CPU 34 obtains the combined information (the first image G3 shown in
In step S26, the CPU 34 causes the recording unit 90 to record the combined information obtained in step S24. Then, the processing proceeds to step S28.
In step S28, the CPU 34 determines whether the ignition key is OFF. When it is determined that the ignition key is OFF (S28: Yes), the program ends. When it is determined that the ignition key is ON (S28: No), the processing proceeds to step S12.
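Reading steps S10 through S28 of the flowchart as pseudocode, the control flow might look like the bounded sketch below. Every sensor, camera, and storage call is a trivial placeholder and an assumption; none of these names is an API of an actual automotive platform.

```python
import time

# Trivial placeholder hooks for the vehicle hardware (all assumptions).
def ignition_on():            return True
def ignition_off():           return False
def capture_narrow_image():   return "second image G4"
def capture_wide_image():     return "first image G3"
def can_analyze_number(img):  return True      # e.g. inter-vehicle distance check
def analyze_number(img):      return "12-34"   # OCR of the license plate
def trigger_received():       return False     # recording switch / collision sensor
def record_combined(info):    pass             # recording unit 90

def information_recording_loop(max_cycles=1000):
    while not ignition_on():                    # S10: wait until ignition is ON
        time.sleep(0.1)
    stored_number = None
    for _ in range(max_cycles):                 # bounded here only for the sketch
        narrow = capture_narrow_image()         # S12: acquire narrow-angle image
        wide = capture_wide_image()             # S14: acquire wide-angle image
        if can_analyze_number(narrow):          # S16: number analyzable?
            stored_number = analyze_number(narrow)   # S18, S20: analyze and store
        if trigger_received():                  # S22: trigger signal received?
            combined = (wide, stored_number)    # S24: combine the information
            record_combined(combined)           # S26: record combined information
        if ignition_off():                      # S28: ignition OFF ends the program
            return
        time.sleep(0.03)                        # ~30 ms capturing interval

information_recording_loop()
```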
As described above, the information recording device 30 performs the processing as follows. The first capturing unit 60 captures the first image G3 at the first angle of view θ3 that is relatively wide. The second capturing unit 70 captures the second image G4, which includes the target information, at the second angle of view θ4 that is narrower than the first angle of view θ3. Furthermore, the information acquisition unit 80 acquires the target information from the second image G4. Then, the recording unit 90 records the target information in association with the first image G3. In this case, the specific target part S, even if not identified in the first image G3, can be identified by the target information associated with the first image G3, making it possible to record an image suitable for information recording of the vehicle 10 without adding a new capturing unit.
Since the target information includes the number information, the information recording device 30 can uniquely determine one target vehicle based on the target information, allowing the target vehicle to be identified without recording a plurality of pieces of target information.
In addition, since the horizontal angle of view of the first capturing unit 60 is set to 180° or more in the information recording device 30, the capturing range is larger than that of a configuration in which the horizontal angle of view is less than 180°. This means that more surrounding information on the vehicle 10 can be acquired.
Next, the information recording of a target in front of the vehicle 10 will be described.
In the example shown in
Note that the present disclosure is not limited to the embodiment described above.
Although the three pieces of number information are preferably combined with the following vehicles C1, C2, and C3 in a one-to-one correspondence, a rough information association (combination) may also be used in which one of the three pieces of number information corresponds to any one of the following vehicles C1, C2, and C3. The number of target vehicles C is not limited to one or three, and may be two, or four or more.
In the information recording device 30, the side cameras 44 shown in
In the flowchart shown in
In the embodiment described above, the CPU 34 reads the software (program) for performing the information recording processing. This information recording processing may be performed by various processors other than the CPU 34. Examples of processors that can be used in this case include the following two: (1) a programmable logic device (PLD), such as a field-programmable gate array (FPGA), which is a processor whose circuit configuration can be changed after manufacture, and (2) a dedicated electric circuit, such as an application specific integrated circuit (ASIC), which is a processor having a circuit configuration specifically designed for performing specific processing. The processing described above may be performed by one of these various processors or by a combination of two or more processors of the same type or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). More specifically, the hardware structure of these various processors is an electric circuit that combines circuit elements such as semiconductor devices.
Although stored (installed) in advance in the ROM 35 or the storage 37 in the embodiment described above, the information recording processing program may be stored in other recording media. For example, the program may be stored on a recording medium such as a compact disk read-only memory (CD-ROM), a digital versatile disk read-only memory (DVD-ROM), or a universal serial bus (USB) memory for distribution to the user. Furthermore, the information recording processing program may be downloaded from an external device via a network.