This application is based on and claims priority under 35 U.S.C. § 119 to Japanese Patent Application 2022-144183, filed on Sep. 9, 2022, the entire content of which is incorporated herein by reference.
This disclosure relates to an information processing apparatus, an information processing method, and an information processing program.
JP 2021-147856A (Reference 1) discloses a technique in which when a user of a vehicle approaches a power hinge door, the power hinge door is opened by a power door control unit.
As described above, as a technique in the related art, there is a technique of automatically opening a hinge door of a vehicle without a manual operation.
Here, when automatically opening the hinge door in a situation where an obstacle such as a wall or another vehicle is present around the vehicle, the hinge door must be opened to the extent that it does not come into contact with the obstacle. In order to open the hinge door to the extent that it does not come into contact with the obstacle, it is necessary to accurately specify a three-dimensional position of the obstacle with respect to the vehicle, and there is still room for improvement in a method of specifying a three-dimensional position of an obstacle.
A need thus exists for an information processing apparatus, an information processing method, and an information processing program which are not susceptible to the drawback mentioned above.
An information processing apparatus according to an aspect of this disclosure includes: an acquisition unit configured to acquire an image captured by an imaging unit provided in a hinge door of a vehicle and a door opening angle indicating an angle at which the hinge door is opened from a closed state when the image is captured; a correction unit configured to correct an error of the door opening angle acquired by the acquisition unit; and a specification unit configured to specify a three-dimensional position of an obstacle with respect to the vehicle using corresponding points of the obstacle present around the hinge door, which are determined based on a plurality of images captured by the imaging unit from a plurality of viewpoints with different door opening angles acquired by the acquisition unit, and the door opening angles corrected by the correction unit and associated with the respective plurality of images.
An information processing method according to another aspect of this disclosure is executed by a computer, and the information processing method includes: acquiring an image captured by an imaging unit provided in a hinge door of a vehicle and a door opening angle indicating an angle at which the hinge door is opened from a closed state when the image is captured; correcting an error of the acquired door opening angle; and specifying a three-dimensional position of an obstacle with respect to the vehicle using corresponding points of the obstacle present around the hinge door, which are determined based on a plurality of images captured by the imaging unit from a plurality of viewpoints with different acquired door opening angles, and the corrected door opening angles associated with the respective plurality of images.
An information processing program according to still another aspect of this disclosure causes a computer to execute: acquiring an image captured by an imaging unit provided in a hinge door of a vehicle and a door opening angle indicating an angle at which the hinge door is opened from a closed state when the image is captured; correcting an error of the acquired door opening angle; and specifying a three-dimensional position of an obstacle with respect to the vehicle using corresponding points of the obstacle present around the hinge door, which are determined based on a plurality of images captured by the imaging unit from a plurality of viewpoints with different acquired door opening angles, and the corrected door opening angles associated with the respective plurality of images.
The foregoing and additional features and characteristics of this disclosure will become more apparent from the following detailed description considered with reference to the accompanying drawings, wherein:
Hereinafter, a vehicle 20 according to the present embodiment will be described.
As shown in
The on-board device 15 includes a central processing unit (CPU) 21, a read only memory (ROM) 22, a random access memory (RAM) 23, a storage unit 24, an in-vehicle communication interface (I/F) 25, an input and output I/F 26, and a wireless communication I/F 27. The CPU 21, the ROM 22, the RAM 23, the storage unit 24, the in-vehicle communication I/F 25, the input and output I/F 26, and the wireless communication I/F 27 are communicably connected to each other via an internal bus 28.
The CPU 21 is a central processing unit that executes various programs and controls each unit. That is, the CPU 21 reads a program from the ROM 22 or the storage unit 24, and executes the program using the RAM 23 as a work area. The CPU 21 controls the above-described components and performs various arithmetic processing in accordance with the program recorded in the ROM 22 or the storage unit 24.
The ROM 22 stores various programs and various types of data. The RAM 23 temporarily stores the program or data as the work area.
The storage unit 24 includes a storage device such as a hard disk drive (HDD), a solid state drive (SSD), or a flash memory, and stores various programs and various types of data. The storage unit 24 stores an information processing program for executing at least an opening processing to be described later.
The in-vehicle communication I/F 25 is an interface for connecting with the door ECU 30. A communication standard conforming to the CAN protocol is used for the interface. The in-vehicle communication I/F 25 is connected to an external bus 29.
In the first embodiment, the door ECU 30 is provided as an ECU. Although not shown, a plurality of ECUs are provided for each function of the vehicle 20 and include ECUs other than the door ECU 30.
The actuator 31 and the angle sensor 32 are connected to the door ECU 30.
The actuator 31 automatically opens and closes at least a driver seat door among the doors of the vehicle 20. In the first embodiment, the door ECU 30 causes the actuator 31 to be driven based on the control of the on-board device 15, so that the driver seat door can be automatically opened and closed without an occupant opening and closing the driver seat door.
The angle sensor 32 is provided at least on the driver seat door among the doors of the vehicle 20, and is a sensor for detecting a door opening angle indicating an angle at which the driver seat door is opened from a closed state, that is, the state in which the door is fully closed. The door opening angle detected by the angle sensor 32 is stored in the storage unit 24.
The input and output I/F 26 is an interface for communicating with the microphone 40, the camera 41, the input switch 42, the monitor 43, the speaker 44, and the GPS device 45 mounted on the vehicle 20.
The microphone 40 is provided on a front pillar, a dashboard, or the like of the vehicle 20, and is a device that collects a sound uttered by a user of the vehicle 20.
As an example, the camera 41 includes a solid-state imaging device such as a charge coupled device (CCD) image sensor and a complementary metal oxide semiconductor (CMOS) image sensor. As an example, the camera 41 is provided at least on a door mirror 33 (see
An orientation of the camera 41 in the vehicle body coordinates when the driver seat door is closed is known, and information on the orientation is stored in the storage unit 24.
The input switch 42 is provided on an instrument panel, a center console, a steering wheel, or the like, and is a switch to be operated by a finger of the driver to input an operation. As the input switch 42, for example, a push-button numeric keypad, a touch pad, or the like can be adopted. In the first embodiment, the input switch 42 is provided with at least one opening switch for opening the driver seat door. In the first embodiment, the driver seat door can be automatically opened by operating the opening switch in a state where the vehicle 20 is stopped or parked.
The monitor 43 is provided on an instrument panel, a meter panel, or the like, and is a liquid crystal monitor for displaying an operation proposal for a function of the vehicle 20 and an image for explaining the function. The monitor 43 may be provided as a touch panel that also serves as the input switch 42.
The speaker 44 is provided on an instrument panel, a center console, a front pillar, a dashboard, or the like, and is a device for outputting an operation proposal for a function of the vehicle 20 and a sound for explaining the function. The speaker 44 may be provided on the monitor 43.
The GPS device 45 is a device that measures a current position of the vehicle 20. The GPS device 45 includes an antenna (not shown) that receives signals from GPS satellites. The GPS device 45 may be connected to the on-board device 15 via a car navigation system connected to an ECU (for example, a multimedia ECU).
The wireless communication I/F 27 is a wireless communication module for communicating with other devices. The wireless communication module uses, for example, communication standards such as 5G, LTE, and Wi-Fi (registered trademark).
Next, the functional configuration of the on-board device 15 will be described.
As shown in
The acquisition unit 21A acquires an image captured by the camera 41 and a door opening angle when the image is captured. In the first embodiment, the acquisition unit 21A acquires a plurality of images captured by the camera 41 from a plurality of viewpoints with different door opening angles, and the door opening angles associated with the respective plurality of images.
The correction unit 21B corrects an error of the door opening angle acquired by the acquisition unit 21A. The door opening angle corrected by the correction unit 21B is stored in the storage unit 24. Here, the door opening angle detected by the angle sensor 32 may have an error due to a measurement error dependent on the angle sensor 32 (for example, a sensor mounting error or a sampling error), a measurement error dependent on the driver seat door (for example, an error due to door deflection), or the like. Therefore, in the first embodiment, it is assumed that there is an error in the door opening angle detected by the angle sensor 32, and the error is corrected by the correction unit 21B.
The specification unit 21C specifies a three-dimensional position of an obstacle with respect to the vehicle 20 using corresponding points of the obstacle present around the driver seat door, which are determined based on the plurality of images captured by the camera 41 from the plurality of viewpoints with different door opening angles acquired by the acquisition unit 21A, and the door opening angles corrected by the correction unit 21B and associated with the respective plurality of images. The corresponding points of the obstacle are determined by performing a known processing of extracting a feature point of an image on the plurality of images captured by the camera 41 from the plurality of viewpoints with different door opening angles. The specification unit 21C specifies the three-dimensional position of the obstacle by a multi-view stereo (MVS) method which is a technique of restoring a three-dimensional shape of an object using the plurality of images captured from different viewpoints.
The determination unit 21D determines a maximum door opening angle at which the driver seat door does not come into contact with the obstacle (hereinafter, referred to as a “maximum opening angle”) using the three-dimensional position of the obstacle specified by the specification unit 21C and door information on a shape and a dimension of the driver seat door. The door information is stored in the storage unit 24 in advance.
The control unit 21E determines a door opening angle when the camera 41 captures an image, and performs control to open the driver seat door to the determined door opening angle. In the first embodiment, as an example, the control unit 21E determines a first door opening angle to be "0 degrees" and a second door opening angle to be "7 degrees" when the camera 41 captures an image.
The control unit 21E performs control to open the driver seat door to the maximum opening angle determined by the determination unit 21D.
In step S10 shown in
In step S11, the CPU 21 specifies the three-dimensional position of the obstacle with respect to the vehicle 20 using the corresponding points of the obstacle, which are determined based on the plurality of images captured by the camera 41 from the plurality of viewpoints with different door opening angles acquired in step S10, and the corrected door opening angles associated with the respective plurality of images. Here, the CPU 21 corrects the error of the door opening angle acquired in step S10. Then, the processing proceeds to step S12. The method of specifying the three-dimensional position of the obstacle, including the method of correcting the error of the door opening angle, will be described later.
In step S12, the CPU 21 determines the maximum opening angle using the three-dimensional position of the obstacle specified in step S11 and the door information on the driver seat door. Then, the processing proceeds to step S13. The method of determining the maximum opening angle will be described later.
In step S13, the CPU 21 opens the driver seat door to the maximum opening angle determined in step S12. Then, the opening processing ends.
Next, the method of specifying the three-dimensional position of the obstacle and the method of determining the maximum opening angle will be described with reference to
As shown in
The CPU 21 calculates (Xcα_w, Ycα_w, Zcα_w) using the following equation (3).
(Xcα_w,Ycα_w,Zcα_w)=(Lc cos(α0+α),Yc0_w,Lc sin(α0+α)) (3)
As shown in
Rcα_w=RY_w(α)Rc0_w (4)
As described above, the position and orientation of the camera 41 when the door opening angle is α can be determined using the door opening angle α and the position and orientation of the camera 41 when the door is closed.
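The relationship in equations (3) and (4) can be sketched numerically. The following is a minimal illustration, assuming a rotation about the hinge's vertical (Y) axis with the sign convention implied by equation (3); the function names and sample values are hypothetical:

```python
import math

import numpy as np

def rot_y(angle):
    """Rotation about the Y (hinge) axis, with the sign convention that maps
    (cos b, y, sin b) to (cos(b + angle), y, sin(b + angle)), matching eq. (3)."""
    c, s = math.cos(angle), math.sin(angle)
    return np.array([[c, 0.0, -s],
                     [0.0, 1.0, 0.0],
                     [s, 0.0, c]])

def camera_pose(Lc, alpha0, Yc0_w, Rc0_w, alpha):
    """Camera position (eq. 3) and orientation (eq. 4) for door opening angle alpha.
    Lc: horizontal hinge-to-camera distance; alpha0: hinge-to-camera angle at the
    closed position; Yc0_w: camera height; Rc0_w: closed-door orientation."""
    a = alpha0 + alpha
    position = np.array([Lc * math.cos(a), Yc0_w, Lc * math.sin(a)])
    orientation = rot_y(alpha) @ Rc0_w
    return position, orientation
```

At alpha = 0 the pose reduces to the stored closed-door pose, and rotating the closed-door position by the door angle reproduces equation (3).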
Next,
As shown in
Here, P_w, P_c0, and P_cα represent the same target obstacle in different coordinate systems, and have relationships represented by the following equations (5) and (6).
P_w=Rc0_w P_c0+Tc0_w (5)
P_w=Rcα_w P_cα+Tcα_w (6)
It is assumed that a focal length and an image center in units of pixels, which are internal parameters of the camera 41, are f and Ic_i=(xc_i, yc_i)T, respectively. At this time, a projection equation of the image is the following equations (7) and (8).
From the above equations (7) and (8), the following equations (9), (10), (11), and (12) are obtained.
By substituting the equations (9) and (10) into the above equation (5), the following equation (13) is obtained.
By substituting the equations (11) and (12) into the above equation (6), the following equation (14) is obtained.
Here, in order to simplify the following equations, the constant terms in the above equations (13) and (14) are substituted as the following equations (15) and (16).
Here, assuming that an error of the door opening angle detected by the angle sensor 32 (hereinafter, also referred to as a “door angle error”) is ε and ε is small, the rotation matrix is expressed by the following equation (17).
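As an illustration, assuming equation (17) is the standard first-order (small-angle) approximation of a rotation by ε about the hinge (Y) axis, with the sign convention implied by equation (3), it would take the form:

```latex
R_{\varepsilon} = R_{Y\_w}(\varepsilon) \approx
\begin{pmatrix}
1 & 0 & -\varepsilon \\
0 & 1 & 0 \\
\varepsilon & 0 & 1
\end{pmatrix},
\qquad \cos\varepsilon \approx 1,\quad \sin\varepsilon \approx \varepsilon .
```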
By giving a rotation matrix Rε of the door angle error, the following equations (18) and (19) are obtained.
P_w=A0Z_c0+Tc0_w=RεAαZ_cα+Tcα_w (18)
A0Z_c0−RεAαZ_cα=Tcα_w−Tc0_w (19)
By transforming the above equation (19), the following equation (20) is obtained.
By solving the equation (20) with Z=εZ_cα, the following equation (21) is obtained.
From Z=εZ_cα, the following equation (22) is obtained.
Here, when a value of ε is calculated at a plurality of corresponding points, there is a possibility that the value varies, but since ε is common in the same image pair, for example, the CPU 21 sets an average value of ε calculated at the plurality of corresponding points as ε, configures Rε using the above equation (17), and obtains the following equation (23).
Bα=RεAα (23)
The CPU 21 calculates Z_c0 and Z_cα using the following equation (24).
Here, X_c0 and Y_c0 are calculated by substituting a value of Z_c0 into the above equations (9) and (10), and P_w is calculated by substituting P_c0=(X_c0, Y_c0, Z_c0)T into the above equation (5). Similarly, P_w is calculated by substituting a value of Z_cα into the above equations (11) and (12) and further substituting into the above equation (6).
At this time, since P_w calculated from the above equation (5) and P_w calculated from the above equation (6) generally do not coincide with each other, in the first embodiment, the CPU 21 specifies a final solution, that is, the three-dimensional position of the obstacle with respect to the vehicle 20, by averaging the two P_w values. The disclosure is not limited thereto; the CPU 21 may adopt a predetermined one of the two P_w values or a weighted average of the two P_w values as the final solution.
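The core of this specification step, recovering a point from two views, can be sketched as a two-view triangulation. The following is a minimal illustration, assuming the pinhole projection used in equations (9) through (12) and taking the error rotation Rε as the identity (that is, the door opening angles are assumed already corrected); the function names are illustrative:

```python
import numpy as np

def ray(pixel, f, center):
    """Unit-depth ray in camera coordinates under the assumed pinhole model
    x = f*X/Z + xc, y = f*Y/Z + yc (cf. eqs. (9)-(12))."""
    return np.array([(pixel[0] - center[0]) / f,
                     (pixel[1] - center[1]) / f,
                     1.0])

def triangulate(p0, pa, f, center, R0, T0, Ra, Ta):
    """Two-view triangulation: P_w = A0*Z_c0 + T0 = Aa*Z_ca + Ta (cf. eq. (18))."""
    A0 = R0 @ ray(p0, f, center)
    Aa = Ra @ ray(pa, f, center)
    # A0*Z_c0 - Aa*Z_ca = Ta - T0: three equations in two unknown depths,
    # solved in the least-squares sense (cf. eq. (24))
    M = np.stack([A0, -Aa], axis=1)
    depths, *_ = np.linalg.lstsq(M, Ta - T0, rcond=None)
    Pw_from_0 = A0 * depths[0] + T0
    Pw_from_a = Aa * depths[1] + Ta
    # the two estimates generally do not coincide; average them as in the text
    return 0.5 * (Pw_from_0 + Pw_from_a)
```

With noise-free inputs the two estimates coincide and the average recovers the original point exactly.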
Then, the CPU 21 performs the above specification of the three-dimensional position at a plurality of positions on the obstacle, thereby calculating the three-dimensional positions of the obstacle at N positions.
Here, a door shape of the driver seat door is known, and the CPU 21 can calculate a radius Lh of the driver seat door at a height Yh in the reference coordinate system. The CPU 21 performs the following calculations for all calculated three-dimensional points P_wn=(X_wn, Y_wn, Z_wn)T, (n=0 to N−1) of the obstacle.
First, the CPU 21 obtains the radius Lh of the driver seat door at the height Yh equal to the height Y_wn of the obstacle. Then, when the radius Lh of the driver seat door satisfies a relationship represented by the following equation (25), there is a possibility that the driver seat door comes into contact with the obstacle, so that the CPU 21 calculates an angle θn represented by the following equation (26).
On the other hand, when the radius Lh of the driver seat door satisfies a relationship represented by the following equation (27), the CPU 21 does not calculate the angle θn. Then, the CPU 21 determines the smallest one of all the calculated angles θn as the maximum opening angle.
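As a hypothetical sketch of this determination step (the reachability test and the angle formula below are illustrative stand-ins for equations (25) through (27), assuming a vertical hinge axis at the origin of the reference coordinate system and the angle convention of equation (3)):

```python
import math

def max_opening_angle(points, door_radius_at, full_open=math.radians(70)):
    """For each obstacle point, test whether the door arc at that height can
    reach it, and keep the smallest contact angle; fall back to a full-open
    default when no point is reachable. All names and the default are
    illustrative assumptions, not the embodiment's exact equations."""
    angles = []
    for X_w, Y_w, Z_w in points:
        Lh = door_radius_at(Y_w)                 # door radius at the obstacle height
        if math.hypot(X_w, Z_w) <= Lh:           # door can reach this point (eq. (25) analog)
            angles.append(math.atan2(Z_w, X_w))  # hinge angle to the point (eq. (26) analog)
    return min(angles, default=full_open)
```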
As described above, in the first embodiment, the CPU 21 acquires the image captured by the camera 41 and the door opening angle when the image is captured. The CPU 21 corrects the error of the acquired door opening angle. Then, the CPU 21 specifies the three-dimensional position of the obstacle with respect to the vehicle 20 using the corresponding points of the obstacle, which are determined based on the plurality of images captured by the camera 41 from the plurality of viewpoints with the different acquired door opening angles, and the corrected door opening angles associated with the respective plurality of images. As described above, the door opening angle detected by the angle sensor 32 may have an error due to the measurement error dependent on the angle sensor 32 (for example, the sensor mounting error or the sampling error), the measurement error dependent on the driver seat door (for example, the error due to the door deflection), or the like. Therefore, in the first embodiment, it is assumed that there is an error in the door opening angle detected by the angle sensor 32, and by correcting the error, the three-dimensional position of the obstacle present around the vehicle 20, specifically, around the driver seat door, can be specified with high accuracy.
Here, when specifying the three-dimensional position of the obstacle by a multi-view stereo method, it is necessary to calculate the positions and orientations of the camera in the respective images captured from the plurality of viewpoints. In the related art, this calculation is difficult, so the accuracy of estimating the position and orientation of the camera is insufficient, and a large number of images are therefore required. For a reference on the multi-view stereo method, see, for example, "A Comparison and Evaluation of Multi-View Stereo Reconstruction Algorithms", CVPR 2006.
On the other hand, in the first embodiment, the movement of the camera 41 is restricted because the camera 41 rotates about the hinge 34, which is a common rotation axis. Therefore, by using information indicating the coordinates of the hinge 34, the positions and orientations of the camera in the respective images captured from the plurality of viewpoints can be estimated with high accuracy. Therefore, according to the first embodiment, the three-dimensional position of the obstacle can be specified with fewer images than when specifying the three-dimensional position of the obstacle by the multi-view stereo method in the related art.
In the first embodiment, the three-dimensional position of the obstacle is specified using a door camera which is the camera 41 provided on the door mirror and the angle sensor 32, which are mounted on many vehicles. Therefore, there is no need to add a dedicated part for the specification.
In the first embodiment, the CPU 21 determines the maximum opening angle using the specified three-dimensional position of the obstacle and door information. Accordingly, according to the first embodiment, when opening the driver seat door in a situation where an obstacle is present around the vehicle 20, specifically, around the driver seat door, the driver seat door can be opened to the maximum extent that it does not come into contact with the obstacle.
In the first embodiment, the CPU 21 performs control to open the driver seat door to the determined maximum opening angle. Accordingly, according to the first embodiment, in a situation where an obstacle is present around the driver seat door, the driver seat door can be automatically opened to the maximum extent that it does not come into contact with the obstacle without the occupant performing the opening operation of the driver seat door.
In the first embodiment, the CPU 21 determines the door opening angle when the camera 41 captures an image, and performs control to open the driver seat door to the determined door opening angle. Accordingly, according to the first embodiment, in a situation where an obstacle is present around the driver seat door, the driver seat door can be automatically opened to the maximum extent that it does not come into contact with the obstacle without the occupant performing the opening operation of the driver seat door.
In the first embodiment, even while the CPU 21 is automatically opening the driver seat door, the occupant can manually open and close the driver seat door.
Next, a second embodiment will be described while omitting or simplifying overlapping portions with other embodiments.
As shown in
In the second embodiment, after the determination unit 21D determines a maximum opening angle, the control unit 21E performs control to open a driver seat door to a predetermined angle at which an image is not captured by the camera 41 within a range of the maximum opening angle. At this time, the control unit 21E determines the predetermined angle according to the maximum opening angle determined by the determination unit 21D. As an example, the control unit 21E basically updates the predetermined angle in increments of 10 degrees, and when the maximum opening angle determined by the determination unit 21D is larger than a specific angle (for example, 70 degrees), the control unit 21E updates the predetermined angle in increments of 20 degrees.
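The update rule described above can be sketched as follows; the threshold and step sizes come from the example in the text, while the function name and the clamping to the maximum opening angle are assumptions:

```python
def next_capture_angle(current, max_opening, specific_angle=70,
                       normal_step=10, large_step=20):
    """Advance the capture angle in 10-degree increments, or 20-degree
    increments when the determined maximum opening angle exceeds the
    specific angle (70 degrees in the example), without passing the
    maximum opening angle."""
    step = large_step if max_opening > specific_angle else normal_step
    return min(current + step, max_opening)
```

Starting from the 7-degree capture of the first embodiment, the next capture would fall at 17 degrees, which matches the example given for step S21 below.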
In the second embodiment, the specification unit 21C specifies again the three-dimensional position of the obstacle with respect to the vehicle 20 using the corresponding points of the obstacle, which are determined based on a plurality of images captured by the camera 41 from a viewpoint of the predetermined angle and a viewpoint of another door opening angle, and the door opening angles corrected by the correction unit 21B and associated with the respective plurality of images. The “other door opening angle” may be a door opening angle of “0 degrees” or may be other than the door opening angle of “0 degrees”, that is, a door opening angle of “1 degree” or more.
In the second embodiment, the determination unit 21D determines again the maximum opening angle using door information and the three-dimensional position of the obstacle specified again by the specification unit 21C.
The acceptance unit 21F accepts an input of the number of times the determination unit 21D determines the maximum opening angle (hereinafter, referred to as “the number of times of determination”). For example, the acceptance unit 21F accepts a value designated by an operation of the monitor 43 by an occupant as the number of times of determination.
In step S20 shown in
In step S21, the CPU 21 acquires the image captured by the camera 41 and the door opening angle when the image is captured. As an example, in step S21 for a first time, the CPU 21 acquires the image captured by the camera 41 at the door opening angle of “0 degrees” and the image captured by the camera 41 at the door opening angle of “7 degrees”. In step S21 for a second time, the CPU 21 acquires an image captured by the camera 41 at a door opening angle of “17 degrees” as the predetermined angle. Then, the processing proceeds to step S22.
In step S22, the CPU 21 specifies the three-dimensional position of the obstacle with respect to the vehicle 20 using the corresponding points of the obstacle, which are determined based on the plurality of images captured by the camera 41 from the plurality of viewpoints with different door opening angles acquired in step S21, and the corrected door opening angles associated with the respective plurality of images. Here, the CPU 21 corrects an error of the door opening angle acquired in step S21. As an example, in step S22 for a first time, the CPU 21 determines the corresponding points of the obstacle based on the images captured by the camera 41 from the viewpoint of the door opening angle of “0 degrees” and the viewpoint of the door opening angle of “7 degrees”. In step S22 for a second time, the CPU 21 determines the corresponding points of the obstacle based on the images captured by the camera 41 from the viewpoint of the door opening angle of “17 degrees” and the viewpoint of the door opening angle of “0 degrees”. Then, the processing proceeds to step S23.
In step S23, the CPU 21 determines the maximum opening angle using the three-dimensional position of the obstacle specified in step S22 and the door information on the driver seat door. Then, the processing proceeds to step S24.
In step S24, the CPU 21 determines whether the number of times the maximum opening angle is determined in step S23 reaches the number of times of determination for which the input is accepted in step S20. When the CPU 21 determines that the number of times the maximum opening angle is determined in step S23 reaches the number of times of determination (step S24: YES), the processing proceeds to step S25. On the other hand, when the CPU 21 does not determine that the number of times the maximum opening angle is determined in step S23 reaches the number of times of determination (step S24: NO), the processing returns to step S21.
In step S25, the CPU 21 opens the driver seat door to the maximum opening angle determined in previous step S23. Then, the opening processing ends.
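Steps S20 through S25 can be sketched as a simple loop; the four callables are hypothetical stand-ins for the acquisition, specification, determination, and door-control steps:

```python
def opening_processing(number_of_determinations, acquire, specify,
                       determine_max, open_door):
    """Repeat capture (S21), specification (S22), and determination (S23)
    until the accepted number of determinations (S20) is reached (S24),
    then open the door to the last determined maximum opening angle (S25)."""
    maximum_opening_angle = None
    for _ in range(number_of_determinations):
        images, angles = acquire(maximum_opening_angle)        # S21
        obstacle_points = specify(images, angles)              # S22
        maximum_opening_angle = determine_max(obstacle_points) # S23
    open_door(maximum_opening_angle)                           # S25
    return maximum_opening_angle
```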
As described above, in the second embodiment, after determining the maximum opening angle once, the CPU 21 performs control to open the driver seat door to the predetermined angle at which an image is not captured by the camera 41 within the range of the maximum opening angle. The CPU 21 specifies again the three-dimensional position of the obstacle with respect to the vehicle 20 using the corresponding points of the obstacle, which are determined based on the plurality of images captured by the camera 41 from the viewpoint of the predetermined angle and the viewpoint of the other door opening angle, and the corrected door opening angles associated with the respective plurality of images. Then, the CPU 21 determines again the maximum opening angle using the three-dimensional position of the obstacle specified again and the door information. Accordingly, according to the second embodiment, by determining the maximum opening angle again, the accuracy of the determined maximum opening angle can be improved compared to a configuration in which the maximum opening angle is determined only once.
In the second embodiment, the CPU 21 determines the predetermined angle according to the determined maximum opening angle. Here, when a distance to the obstacle is large (when the distance to the obstacle is greater than or equal to a predetermined distance), the three-dimensional position of the obstacle can be specified with higher accuracy by using an image with a large door opening angle rather than an image with a small door opening angle. Therefore, according to the second embodiment, as an example, when the determined maximum opening angle is larger than the specific angle, the predetermined angle is determined to be larger than a normal angle, so that the three-dimensional position of the obstacle can be specified with high accuracy.
In the second embodiment, the CPU 21 accepts the input of the number of times of determination. Accordingly, according to the second embodiment, for example, when the occupant has time to spare, by repeating the determination of the maximum opening angle many times, the maximum opening angle at which the driver seat door can be opened to just before the obstacle can be determined. According to the second embodiment, when the occupant does not have enough time, the driver seat door can be opened early by ending the determination of the maximum opening angle in a small number of repetitions.
Next, a third embodiment will be described while omitting or simplifying overlapping portions with other embodiments.
As shown in
The sonar sensor 46 is provided at least on the driver seat door, and is a device that uses ultrasonic waves to detect a distance to an obstacle approaching the side of the vehicle. The sonar sensor 46 is an example of a "distance measurement sensor".
An example of a functional configuration of the on-board device 15 in the third embodiment is the same as the example of the functional configuration of the on-board device 15 in the second embodiment shown in
In the third embodiment, the correction unit 21B corrects the image captured by the camera 41 using internal parameters of the camera 41. For example, the correction unit 21B performs distortion correction as correction of the image. At this time, the correction unit 21B uses, as the internal parameters of the camera 41, a parameter for correcting optical distortion for each camera model, a focal length, and the like. The internal parameters are stored in the storage unit 24 in advance.
As an example, the distortion correction by the correction unit 21B is performed using the method described in Scaramuzza, D., A. Martinelli, and R. Siegwart, "A Toolbox for Easily Calibrating Omnidirectional Cameras", Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Oct. 7-15, 2006.
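As an illustrative stand-in for this correction (the embodiment uses the cited Scaramuzza omnidirectional model; the sketch below instead assumes a simple one-parameter radial division model, with hypothetical internal parameters f, center, and k1):

```python
import numpy as np

def undistort_points(pixels, f, center, k1):
    """One-parameter radial 'division model' sketch: normalized distorted
    coordinates p are mapped to p / (1 + k1 * |p|^2). The parameters f,
    center, and k1 are illustrative internal parameters stored in advance,
    as the text describes for the storage unit 24."""
    p = (np.asarray(pixels, dtype=float) - center) / f  # normalize
    r2 = np.sum(p ** 2, axis=-1, keepdims=True)         # squared radius
    return (p / (1.0 + k1 * r2)) * f + center           # undistort, denormalize
```

With k1 = 0 the mapping is the identity; a positive k1 pulls points toward the image center.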
In the third embodiment, the control unit 21E performs control to prohibit opening of the driver seat door based on a detection result of the sonar sensor 46 provided on the driver seat door. Specifically, when the sonar sensor 46 detects an obstacle coming close to or approaching the driver seat door, the control unit 21E performs control to prohibit opening of the driver seat door.
In step S30 shown in
In step S31, the CPU 21 acquires the image captured by the camera 41 and the door opening angle when the image is captured. Then, the processing proceeds to step S32.
In step S32, the CPU 21 corrects the image acquired in step S31 using the internal parameters of the camera 41. Then, the processing proceeds to step S33.
In step S33, the CPU 21 corrects the error of the door opening angle acquired in step S31, and specifies the three-dimensional position of the obstacle with respect to the vehicle 20 using the corresponding points of the obstacle, which are determined based on the plurality of images corrected in step S32, and the corrected door opening angles associated with the respective images. Then, the processing proceeds to step S34.
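The specification in step S33 can be sketched as a triangulation in which each corrected door opening angle places the camera at a pose rotated about the hinge axis. The hinge position, camera offset, and frame conventions below are hypothetical illustrations, not values from the embodiment.

```python
import numpy as np

HINGE = np.array([0.0, 0.0, 0.0])        # hinge axis position (assumed origin)
CAM_OFFSET = np.array([0.9, 0.0, 1.0])   # camera position on the door, hinge frame
# (hypothetical geometry; actual values depend on the vehicle)

def rot_z(theta):
    """Rotation about the vertical hinge axis by the door opening angle."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def camera_center(door_angle_rad):
    """Camera position for a given (corrected) door opening angle."""
    return HINGE + rot_z(door_angle_rad) @ CAM_OFFSET

def triangulate(angles, rays_cam):
    """Least-squares intersection of viewing rays toward one corresponding
    point, observed at several door opening angles. rays_cam[i] is the ray
    in a frame that rotates with the door (camera assumed rigid with it)."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for theta, d_cam in zip(angles, rays_cam):
        c = camera_center(theta)
        d = rot_z(theta) @ d_cam
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        A += P
        b += P @ c
    return np.linalg.solve(A, b)
```

With exact rays from two or more distinct door angles, the solution recovers the corresponding point's three-dimensional position; with noisy rays it is the point minimizing the squared distances to all rays.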
In step S34, the CPU 21 determines the maximum opening angle using the three-dimensional position of the obstacle specified in step S33 and the door information on the driver seat door. Then, the processing proceeds to step S35.
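The determination in step S34 can be illustrated with a simplified plan-view geometry; the door length, full-open angle, and safety margin below are hypothetical stand-ins for the door information on the driver seat door.

```python
import math

DOOR_LENGTH_M = 1.1                     # hinge-to-edge length (hypothetical)
FULL_OPEN_RAD = math.radians(70.0)      # mechanical full-open angle (hypothetical)
MARGIN_RAD = math.radians(2.0)          # margin so the door stops just short

def max_opening_angle(obstacle_xy):
    """Plan-view sketch: the door edge sweeps an arc of radius DOOR_LENGTH_M
    about the hinge; stop just before the swept sector reaches the obstacle."""
    ox, oy = obstacle_xy        # obstacle position in a hinge-centered frame,
                                # x along the closed door, y outward
    if math.hypot(ox, oy) > DOOR_LENGTH_M:
        return FULL_OPEN_RAD    # obstacle is beyond the door's reach
    phi = math.atan2(oy, ox)    # angle at which the door would meet it
    return max(0.0, min(FULL_OPEN_RAD, phi - MARGIN_RAD))
```

A real implementation would sweep the full door contour against the specified three-dimensional obstacle position, including height, rather than a single edge point in the plan view.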
In step S35, the CPU 21 determines whether the number of times the maximum opening angle has been determined in step S34 has reached the number of times of determination accepted in step S30. When the CPU 21 determines that the count has reached the number of times of determination (step S35: YES), the processing proceeds to step S36. On the other hand, when the CPU 21 determines that the count has not reached the number of times of determination (step S35: NO), the processing returns to step S31.
In step S36, the CPU 21 opens the driver seat door to the maximum opening angle most recently determined in step S34. Then, the opening processing ends.
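The flow of steps S30 to S36 can be summarized as the following sketch. The four callables are hypothetical stand-ins for the embodiment's acceptance unit, camera/angle-sensor acquisition, steps S32 to S34, and the door actuator; the assumption that the door moves toward the current estimate between captures (so that later images come from different viewpoints) is illustrative.

```python
def opening_processing(accept_repeat_count, capture, estimate_max_angle, open_door):
    """Illustrative flow of steps S30-S36: repeat the maximum-angle
    determination the accepted number of times, then the final opening
    angle is the last determination."""
    n = accept_repeat_count()            # S30: accept number of determinations
    max_angle = 0.0
    for _ in range(n):                   # S35: loop until the count is reached
        image, door_angle = capture()    # S31: image + door opening angle
        max_angle = estimate_max_angle(image, door_angle)  # S32-S34
        open_door(max_angle)             # drive toward the current estimate
    return max_angle                     # S36: final maximum opening angle
```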
As described above, in the third embodiment, the CPU 21 performs the control to prohibit the opening of the driver seat door based on the detection result of the sonar sensor 46 provided on the driver seat door. Accordingly, in the third embodiment, as an example, opening of the driver seat door can be prohibited when the sonar sensor 46 detects an obstacle approaching the driver seat door.
In the third embodiment, the CPU 21 corrects the image captured by the camera 41 using the internal parameters of the camera 41. Accordingly, in the third embodiment, since the three-dimensional position of the obstacle is specified using the corrected image, the three-dimensional position of the obstacle can be specified with higher accuracy than in a configuration in which the image correction is not performed.
In the above embodiment, the driver seat door of the vehicle 20 is an example of the “hinge door”, but instead of or in addition to this, at least one of a front passenger seat door and a rear door may be an example of the “hinge door”. When at least one of the front passenger seat door and the rear door is an example of the “hinge door”, an actuator that automatically opens and closes the door, an angle sensor that detects a door opening angle of the door, and a camera that is provided in the door and captures an image of a side of the vehicle are mounted on the vehicle 20. When at least one of the front passenger seat door and the rear door is an example of the “hinge door”, a sonar sensor may be provided on the door.
In the above embodiment, an example has been described in which the opening processing is started in a situation where an occupant is inside the vehicle 20, but the disclosure is not limited thereto, and the opening processing may be started in a situation where the occupant is outside the vehicle 20. As an example, the opening processing may be started when an electronic key corresponding to the vehicle 20 is detected in a situation where the occupant is outside the vehicle 20.
In the above embodiment, the camera 41 is provided on the door mirror 33 of the driver seat door of the vehicle 20, but the disclosure is not limited thereto, and the camera 41 may be provided in the driver seat door itself.
In the above embodiment, the obstacle present around the driver seat door may be any object captured in the image by the camera 41; it may be an object at a position that would contact the driver seat door when the driver seat door is opened, or an object at a position that would not contact the driver seat door.
In the above embodiment, the on-board device 15 is an example of the “information processing apparatus”, but the disclosure is not limited thereto, and an external device such as a server connectable to the vehicle 20 may be an example of the “information processing apparatus”. In this case, as an example, the external device may include functions of the acquisition unit 21A, the correction unit 21B, the specification unit 21C, and the determination unit 21D described in the above embodiment, and the vehicle 20 may include functions of the control unit 21E and the acceptance unit 21F.
The opening processing executed by the CPU 21 reading software (program) in the above embodiment may be executed by various processors other than the CPU. In this case, examples of the processor include a programmable logic device (PLD) whose circuit configuration can be changed after manufacture, such as a field-programmable gate array (FPGA), and a dedicated electric circuit such as an application specific integrated circuit (ASIC), which is a processor having a circuit configuration specially designed to execute specific processing. Further, the opening processing may be executed by one of these various processors, or may be executed by a combination of two or more processors of the same type or different types (for example, a combination of a plurality of FPGAs or a combination of a CPU and an FPGA). More specifically, a hardware structure of these various processors is an electric circuit in which circuit elements such as semiconductor elements are combined.
In the above embodiment, a mode has been described in which the information processing program is stored (installed) in advance in the storage unit 24, but the disclosure is not limited thereto. The information processing program may be provided in a form recorded in a recording medium such as a compact disk read only memory (CD-ROM), a digital versatile disk read only memory (DVD-ROM), and a universal serial bus (USB) memory. The information processing program may be downloaded from an external device via a network.
The present disclosure may adopt the following aspects.
(1) An information processing apparatus including:
(2) The information processing apparatus according to (1), further including:
(3) The information processing apparatus according to (2), further including:
(4) The information processing apparatus according to (3), in which
(5) The information processing apparatus according to (3) or (4), in which
(6) The information processing apparatus according to (5), in which
(7) The information processing apparatus according to (5) or (6), further including:
(8) The information processing apparatus according to any one of (3) to (7), in which
(9) The information processing apparatus according to any one of (1) to (8), in which
(10) An information processing method executed by a computer, the information processing method including:
(11) An information processing program for causing a computer to execute:
In the information processing apparatus, the information processing method, and the information processing program according to this disclosure, a three-dimensional position of an obstacle present around a vehicle can be accurately specified.
The principles, preferred embodiment and mode of operation of the present invention have been described in the foregoing specification. However, the invention which is intended to be protected is not to be construed as limited to the particular embodiments disclosed. Further, the embodiments described herein are to be regarded as illustrative rather than restrictive. Variations and changes may be made by others, and equivalents employed, without departing from the spirit of the present invention. Accordingly, it is expressly intended that all such variations, changes and equivalents which fall within the spirit and scope of the present invention as defined in the claims, be embraced thereby.
Number | Date | Country | Kind |
---|---|---|---|
2022-144183 | Sep 2022 | JP | national |