The present application is based upon and claims the benefit of priority under 35 U.S.C. § 119 to Japanese Patent Application No. 2022-188592 filed on Nov. 25, 2022, the entire contents of which are hereby incorporated by reference.
The present disclosure relates to an imaging system, an imaging method, and a medium.
Conventionally, a technique has been known in which an imaging device installed on a vehicle captures images of road attachments, such as road signs and convex traffic mirrors at road curves, as inspection target objects while the vehicle is moving, and deformation such as damage to the inspection target objects is inspected from the captured images. Note that a road attachment is a fixture or structure necessary for maintaining the structure of a road, ensuring safe and smooth road traffic, and other road management.
As one of the techniques described above, a technique is disclosed that detects a pillar area, a specific color area, and a specific shape area from a captured image of a road on which a vehicle is traveling, and determines that these are road attachments arranged in the surroundings of the road (e.g., see Patent Document 1).
However, Patent Document 1 does not disclose obtaining multiple captured images of an inspection target object, such as a road attachment, imaged from different directions at multiple viewpoints.
According to one aspect of the present invention, an imaging system includes an imaging device; and an imaging control device including a processor and a memory configured to control operations of the imaging device. The imaging device is installed on a mobile object to image an inspection target object while the mobile object is moving. A side visible from the imaging device when the inspection target object is positioned in a forward direction with respect to a traveling direction of the mobile object is defined as a front surface of the inspection target object; a side visible from the imaging device when the inspection target object is positioned in a sideward direction with respect to the traveling direction is defined as a side surface of the inspection target object; and a side visible from the imaging device when the inspection target object is positioned in a backward direction with respect to the traveling direction is defined as a rear surface of the inspection target object. The imaging device images the front surface of the inspection target object, at a first timing when the inspection target object is positioned in the forward direction with respect to the traveling direction, to obtain a first captured image. The processor of the imaging control device controls operations of the imaging device, to obtain a second captured image by imaging at least one of the side surface or the rear surface of the inspection target object, at a second timing when the inspection target object is positioned in at least one of the sideward direction or the backward direction with respect to the traveling direction.
An imaging system, an imaging method, and a medium according to embodiments of the present invention will be described in detail with reference to the drawings.
According to an embodiment, multiple captured images of an inspection target object that are imaged from different directions at multiple viewpoints can be obtained.
The following forms are provided to exemplify an imaging system, an imaging method, and a medium that embody the technical concepts of the present embodiments, and the embodiments are not limited to the following. In addition, the dimensions, materials, shapes, and relative arrangements of the constituent elements described in the embodiments are merely illustrative examples and are not intended to limit the scope of the present invention to only those. Note that the sizes and positional relationships of members illustrated in the drawings may be exaggerated for the sake of clarifying the description. In addition, in the following description, the same names and reference symbols indicate the same or similar members, and detailed descriptions thereof may be omitted as appropriate.
In the drawings, orthogonal coordinates having X, Y, and Z axes may be used to represent directions. The X, Y, and Z axes are substantially orthogonal to each other. The Y direction indicates a direction along the traveling direction of a mobile object on which the imaging system according to the embodiment is installed, and the Z direction indicates the vertical direction. However, such representation of directions does not limit the directions in the embodiment, and the orientation of the imaging system can be set discretionarily.
The imaging system 6 includes an imaging device 4 and an imaging control device 5.
The imaging device 4 is installed on the mobile object 3 to image an inspection target object while the mobile object 3 is moving. In the present embodiment, the imaging device 4 images the front surface of the inspection target object at a first timing when the inspection target object is positioned in the forward direction with respect to the traveling direction 30, to obtain a first captured image. This inspection target object is a road attachment such as a guide sign, a regulation sign, a convex traffic mirror at a road curve, or the like.
In the present embodiment, the imaging device 4 includes a forward camera 4F, a sideward camera 4S, and a backward camera 4R. Here, a side visible from the imaging device 4 when an inspection target object is positioned in the forward direction with respect to the traveling direction 30 of the mobile object 3 is defined as a front surface of the inspection target object; a side visible from the imaging device 4 when the inspection target object is positioned in the sideward direction with respect to the traveling direction 30 is defined as a side surface of the inspection target object; and a side visible from the imaging device 4 when the inspection target object is positioned in the backward direction with respect to the traveling direction 30 is defined as a rear surface of the inspection target object. The forward camera 4F is arranged to be capable of imaging the front surface of the inspection target object. The sideward camera 4S is arranged to be capable of imaging the side surface of the inspection target object. The backward camera 4R is arranged to be capable of imaging the rear surface of the inspection target object.
In examples illustrated in the present specification, the forward camera 4F, the sideward camera 4S, and the backward camera 4R each have the same configuration. In the case where the forward camera 4F, the sideward camera 4S, and the backward camera 4R are not distinguished from one another, these are collectively referred to as the imaging device(s) 4. However, the forward camera 4F, the sideward camera 4S, and the backward camera 4R may have different configurations.
The imaging control device 5 controls operations of the imaging device 4. In the present embodiment, the imaging control device 5 controls operations of the imaging device 4 to obtain a second captured image by imaging at least one of the side surface or the rear surface of the inspection target object, at a second timing when the inspection target object is positioned in at least one of the sideward direction or the backward direction with respect to the traveling direction 30.
With reference to the drawings, a hardware configuration of the imaging device 4 will be described.
The CPU 401 controls operations of the entire imaging device 4. The ROM 402 stores programs used by the CPU 401, such as an IPL used for driving the CPU 401. The RAM 403 is used as a work area of the CPU 401. The EEPROM 404 reads or writes various items of data such as programs for the imaging device 4 under control of the CPU 401.
The display 407 is a type of display unit, such as a liquid crystal or organic EL (Electro Luminescence) display, that displays an image of a subject, various icons, and the like. The short-range communication I/F 408 is a communication circuit such as NFC (Near Field Communication) or Bluetooth (registered trademark). The CMOS sensor 409 is a type of built-in imaging unit that captures an inspection target object to obtain image data under control of the CPU 401. Note that the CMOS sensor 409 may be replaced with an imaging unit such as a CCD (Charge Coupled Device) sensor. The imaging element I/F 410 is a circuit that controls driving of the CMOS sensor 409.
The network I/F 411 is an interface for communicating with other devices via the communication network 100. The operation button 412 is a type of input unit for operating the imaging device 4 when being pressed by the user. The media I/F 415 controls reading or writing (storing) of data on the recording medium 414 such as a flash memory. The external device connection I/F 416 is an interface for connecting various external devices.
The sound input/output I/F 417 is a circuit for processing input/output of sound signals through the microphone 418 and the speaker 419 under control of the CPU 401. The microphone 418 is a built-in circuit for converting sound into electric signals. The speaker 419 is a built-in circuit for converting electric signals into physical vibrations to produce sound such as music and speech.
In addition, the imaging device 4 is provided with a bus-line 420. The bus-line 420 is an address bus, a data bus, or the like for electrically connecting the constituent elements including the CPU 401 and the like to each other.
The CPU 501 controls operations of the entire imaging control device 5. The ROM 502 stores a program used for driving the CPU 501, such as an IPL. The RAM 503 is used as a work area of the CPU 501. The EEPROM 504 reads or writes various items of data such as programs for the imaging control device 5 under control of the CPU 501. The HD 505 stores various items of data such as programs. The HDD controller 506 controls reading or writing of various items of data on the HD 505 under control of the CPU 501.
The display 507 displays various information items such as a cursor, a menu, a window, a character, or an image. The short-range communication I/F 508 is a communication circuit such as NFC or Bluetooth. The imaging element I/F 510 is a circuit for controlling communication with the imaging device 4. The network I/F 511 is an interface for data communication using the communication network 100. The touch panel 512 is a type of input unit that operates the imaging control device 5 when the user presses the display 507. The pointing device 513 is a type of input unit that selects and executes various commands, selects a processing target, moves a cursor, and so on. The media I/F 515 controls reading or writing (storage) of data on the recording medium 514 such as a flash memory.
The external device connection I/F 516 is an interface for connecting various external devices. The external device in this case is a USB (Universal Serial Bus) memory, printer, or the like. The sound input/output I/F 517 is a circuit for processing input/output of sound signals through the microphone 518 and the speaker 519 under control of the CPU 501. The microphone 518 is a built-in circuit for converting sound into electric signals. The speaker 519 is a built-in circuit for converting electric signals into physical vibrations to produce sound such as music and voice.
The data bus 520 is an address bus, a data bus, or the like for electrically connecting the constituent elements including the CPU 501 and the like to each other.
The CPU 701 controls operations of the entire image analysis device 7. The ROM 702 stores a program used for driving the CPU 701, such as an IPL. The RAM 703 is used as a work area of the CPU 701. The EEPROM 704 reads or writes various items of data such as programs for the image analysis device 7 under control of the CPU 701. The HD 705 stores various items of data such as programs. The HDD controller 706 controls reading or writing of various items of data on the HD 705 under control of the CPU 701.
The display 707 displays various information items such as a cursor, a menu, a window, a character, or an image. The short-range communication I/F 708 is a communication circuit such as NFC or Bluetooth. The CMOS sensor 709 is a type of built-in imaging unit that captures an inspection target object to obtain image data under control of the CPU 701. Note that the CMOS sensor 709 may be replaced with an imaging unit such as a CCD sensor. The imaging element I/F 710 is a circuit for controlling communication with the imaging device 4.
The network I/F 711 is an interface for data communication using the communication network 100. The touch panel 712 is a type of input unit that operates the image analysis device 7 when the user presses the display 707. The pointing device 713 is a type of input unit that selects and executes various commands, selects a processing target, moves a cursor, and so on. The media I/F 715 controls reading or writing (storage) of data on the recording medium 714 such as a flash memory.
The external device connection I/F 716 is an interface for connecting various external devices. The external device in this case is, for example, a USB memory or a printer. The sound input/output I/F 717 is a circuit for processing input/output of sound signals through the microphone 718 and the speaker 719 under control of the CPU 701. The microphone 718 is a built-in circuit for converting sound into electric signals. The speaker 719 is a built-in circuit for converting electric signals into physical vibrations to produce sound such as music and voice.
The data bus 720 is an address bus, a data bus, or the like for electrically connecting the constituent elements including the CPU 701 and the like to each other.
The CPU 801 controls operations of the entire communication terminal 8. The ROM 802 stores programs used by the CPU 801, such as an IPL used for driving the CPU 801. The RAM 803 is used as a work area of the CPU 801. The EEPROM 804 reads or writes various items of data such as programs for the communication terminal 8 under control of the CPU 801.
The display 807 is a type of display unit, such as a liquid crystal or organic EL display, that displays an image of a subject, various icons, and the like. The short-range communication I/F 808 is a communication circuit such as NFC or Bluetooth. The CMOS sensor 809 is a type of built-in imaging unit that captures an inspection target object to obtain image data under control of the CPU 801. Note that the CMOS sensor 809 may be replaced with an imaging unit such as a CCD sensor. The imaging element I/F 810 is a circuit that controls driving of the CMOS sensor 809.
The network I/F 811 is an interface for communicating with other devices via the communication network 100. The touch panel 812 is a type of input unit that operates the communication terminal 8 when being pressed by the user. The media I/F 815 controls reading or writing (storage) of data on the recording medium 814 such as a flash memory. The external device connection I/F 816 is an interface for connecting various external devices.
The sound input/output I/F 817 is a circuit for processing input/output of sound signals through the microphone 818 and the speaker 819 under control of the CPU 801. The microphone 818 is a built-in circuit for converting sound into electric signals. The speaker 819 is a built-in circuit for converting electric signals into physical vibrations to produce sound such as music and voice.
In addition, the communication terminal 8 is provided with a bus-line 820. The bus-line 820 is an address bus, a data bus, or the like for electrically connecting the constituent elements including the CPU 801 and the like to each other.
The imaging system 6 includes a transceiver unit 61, an operation reception unit 62, a captured image obtainment unit 63, a display control unit 64, a determination unit 65, an imaging setting control unit 66, an image analysis unit 67, a generation unit 68, a storage read unit 69, a target object detection unit 90, a distance detection unit 91, and a storage unit 6000.
Functions of the transceiver unit 61 are implemented by the network I/F 511 or the like.
The transceiver unit 61 controls communication between the imaging system 6 and an external device via the communication network 100. The operation reception unit 62 receives operations performed by the operator of the imaging system 6. The captured image obtainment unit 63 obtains images captured by the imaging device 4. The display control unit 64 controls displaying of a screen on the display 407 in the imaging device 4, the display 507 in the imaging control device 5, or the like. The determination unit 65 executes various determination operations in the imaging system 6. The imaging setting control unit 66 controls settings such as imaging conditions by the imaging device 4. The image analysis unit 67 executes an analysis process of a captured image obtained by the captured image obtainment unit 63. The generation unit 68 generates a screen or the like to be displayed on the display 407, the display 507, or the like. The storage unit 6000 stores an inspection target object front surface management DB (Data Base) 6011, an inspection target object side surface management DB 6012, and an inspection target object rear surface management DB 6013.
In the present embodiment, the target object detection unit 90 detects an inspection target object based on at least one of the first captured image or the second captured image. Here, “detecting an inspection target object” means determining whether a target object to be inspected (a desired target object) appears in an image captured by the imaging system 6. In the following, the desired target object will be referred to as the inspection target object A. Note that the inspection target object does not need to be of one type, but may be of multiple types. The first captured image is an image obtained by imaging the front surface of an inspection target object, at a first timing when the inspection target object is positioned in the forward direction with respect to the traveling direction 30. The second captured image is an image obtained by imaging at least one of the side surface or the rear surface of the inspection target object, at a second timing when the inspection target object is positioned in at least one of the sideward direction or the backward direction with respect to the traveling direction 30.
In the present embodiment, the target object detection unit 90 detects an inspection target object using a deep neural network (DNN) or the like. For example, the inspection target object front surface management DB 6011, the inspection target object side surface management DB 6012, and the inspection target object rear surface management DB 6013 store a trained model or a model under training used for detecting an inspection target object. The target object detection unit 90 detects an inspection target object by the DNN with reference to the trained model or the model under training. The inspection target object front surface management DB 6011 is used when the target object detection unit 90 detects an inspection target object based on a first captured image. The inspection target object side surface management DB 6012 and the inspection target object rear surface management DB 6013 are used when the target object detection unit 90 detects an inspection target object based on a second captured image. Note that in the case of using the model under training, the target object detection unit 90 updates the model under training while using the model under training.
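Although the disclosure does not specify an implementation, the per-surface model selection described above can be sketched as follows in Python; the model interface, the confidence threshold, and the names below are illustrative assumptions, not part of the original description.

```python
# Minimal sketch (assumptions: the Model interface, threshold, and names are
# illustrative; the disclosure states only that a DNN references a trained model
# or a model under training stored in the per-surface management DBs).
from dataclasses import dataclass
from typing import Callable, List, Tuple

Detection = Tuple[str, float]                 # (inspection target object name, confidence)
Model = Callable[[object], List[Detection]]   # stands in for a DNN loaded from a management DB

@dataclass
class TargetObjectDetector:
    front_model: Model   # referencing the inspection target object front surface management DB 6011
    side_model: Model    # referencing the inspection target object side surface management DB 6012
    rear_model: Model    # referencing the inspection target object rear surface management DB 6013
    threshold: float = 0.5

    def detect(self, image, surface: str) -> List[Detection]:
        # A first captured image is matched against the front-surface model;
        # a second captured image against the side- or rear-surface model.
        model = {"front": self.front_model,
                 "side": self.side_model,
                 "rear": self.rear_model}[surface]
        return [(name, conf) for name, conf in model(image) if conf >= self.threshold]
```

For example, calling detect(first_captured_image, "front") applies the model associated with the inspection target object front surface management DB 6011.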
In addition, in the present embodiment, the distance detection unit 91 detects a distance between the inspection target object A and the mobile object 3. In this case, the distance includes distances for imaging the inspection target object A by the forward camera 4F, the sideward camera 4S, and the backward camera 4R, respectively, and means distances between a surface of the inspection target object A facing the mobile object 3, and the respective imaging surfaces of the forward camera 4F, the sideward camera 4S, and the backward camera 4R in the imaging system 6. The distance detection unit 91 can detect a distance between the inspection target object A and the mobile object 3 based on, for example, the size of the inspection target object A included in a captured image. The target object detection unit 90 described above may detect the inspection target object A based on at least one of the distances between the inspection target object A and the mobile object 3 detected by the distance detection unit 91, the first captured image, and the second captured image.
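As one way to realize the size-based distance detection mentioned above, the following sketch assumes a pinhole-camera model; the focal length and sign height are illustrative values, not values from the disclosure.

```python
# Minimal sketch (assumption: pinhole-camera model; numeric values are illustrative).
def estimate_distance_m(real_height_m: float,
                        pixel_height: float,
                        focal_length_px: float) -> float:
    """Estimate the object distance from its apparent size in the image plane.

    Under the pinhole model, pixel_height = focal_length_px * real_height_m / distance,
    hence distance = focal_length_px * real_height_m / pixel_height.
    """
    return focal_length_px * real_height_m / pixel_height

# Example: a 0.6 m tall sign imaged 900 px tall by a camera whose focal length
# corresponds to 1500 px is estimated to be 1.0 m away.
assert abs(estimate_distance_m(0.6, 900.0, 1500.0) - 1.0) < 1e-9
```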
The image analysis device 7 includes a transceiver unit 71, an obtainment unit 72, a calculation estimation unit 73, a display control unit 74, a determination unit 75, an image analysis unit 77, a generation unit 78, a storage read unit 79, a damage detection unit 92, and a storage unit 7000.
Functions of the transceiver unit 71 are implemented by the network I/F 711 or the like.
The transceiver unit 71 controls communication between the image analysis device 7 and an external device via the communication network 100. The obtainment unit 72 obtains a captured image obtained by the imaging system 6 via the communication network 100. The calculation estimation unit 73 executes various estimation processes based on captured images obtained by the obtainment unit 72. The display control unit 74 controls displaying of a screen on the display 707 and the like. The determination unit 75 executes various determination operations in the image analysis device 7. The image analysis unit 77 executes an analysis process of a captured image obtained by the obtainment unit 72. The generation unit 78 generates a screen or the like to be displayed on the display 707 or the like. The storage unit 7000 stores a login information management DB 7001, a front surface deformation state management DB 7011, a side surface deformation state management DB 7012, and a rear surface deformation state management DB 7013.
In the present embodiment, the damage detection unit 92 detects damage to the inspection target object. For example, the inspection target object includes a sign, a pillar part supporting the sign, and a fixing member fixing the sign to the pillar part. A sign is, for example, a road sign. A pillar part is, for example, a pillar that supports the road sign. A fixing member is a screw member, for example, a bolt that is fastened to fix the road sign to the pillar.
In the present embodiment, based on at least one of the first captured image or the second captured image, the damage detection unit 92 detects at least one of deformation of the sign from its predetermined shape, deformation of the pillar part, or loosening of the fixing member. In other words, based on at least one of the first captured image or the second captured image, the damage detection unit 92 determines that damage has occurred when at least one of deformation of the sign from its predetermined shape, deformation of the pillar part, or loosening of the fixing member is detected.
In the present embodiment, the damage detection unit 92 detects damage to the inspection target object using a deep neural network or the like. For example, in the storage unit 7000, the front surface deformation state management DB 7011, the side surface deformation state management DB 7012, and the rear surface deformation state management DB 7013 store a trained model or a model under training used for detecting damage to an inspection target object. The damage detection unit 92 detects damage to an inspection target object by the DNN with reference to the trained model or the model under training. The front surface deformation state management DB 7011 is used when the damage detection unit 92 detects damage to an inspection target object based on a first captured image. The side surface deformation state management DB 7012 and the rear surface deformation state management DB 7013 are used when the damage detection unit 92 detects the damage to the inspection target object based on a second captured image. Note that in the case of using the model under training, the damage detection unit 92 updates the model under training while using the model under training.
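A corresponding sketch for the damage detection side is shown below; the damage class names and the dict-returning model interface are assumptions for illustration, since the disclosure specifies only that a DNN references the per-surface deformation state management DBs.

```python
# Minimal sketch (class names and the model interface are illustrative assumptions).
DAMAGE_CLASSES = ("sign_deformation", "pillar_deformation", "fixing_member_loosening")

def detect_damage(model, image, threshold: float = 0.5):
    """Return the damage classes whose score meets the threshold.

    `model` stands in for a DNN referencing the front/side/rear surface
    deformation state management DB (7011/7012/7013) selected by surface.
    """
    scores = model(image)  # assumed to return a mapping {class_name: score}
    return [c for c in DAMAGE_CLASSES if scores.get(c, 0.0) >= threshold]
```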
Functions of the damage detection unit 92 in the image analysis device 7 may be provided on the imaging system 6.
The communication terminal 8 includes a transceiver unit 81, an operation reception unit 82, a display control unit 84, a start-up processing unit 86, a generation unit 88, a storage read unit 89, and a storage unit 8000.
Functions of the transceiver unit 81 are implemented by the network I/F 811 or the like.
The transceiver unit 81 controls communication between the communication terminal 8 and an external device via the communication network 100. The operation reception unit 82 receives operations performed by the operator of the communication terminal 8. The display control unit 84 controls displaying of a screen on the display 807 and the like. In the present embodiment, the display control unit 84 displays a result of detection of damage obtained by the damage detection unit 92. The start-up processing unit 86 executes a predetermined process upon starting up the communication terminal 8. The generation unit 88 generates a screen or the like to be displayed on the display 807 or the like. The storage unit 8000 stores various items of setting information and the like.
With reference to the drawings, an arrangement of the imaging device 4 on the mobile object 3 will be described. The forward camera 4F, the sideward camera 4S, and the backward camera 4R are installed on the mobile object 3 so as to be capable of imaging the forward direction, the sideward direction, and the backward direction with respect to the traveling direction 30, respectively. In addition, each of the forward camera 4F, the sideward camera 4S, and the backward camera 4R can change its own imaging range.
An example of imaging using the imaging system 6 will be described.
First, definitions of terms will be described. A “captured image for inspection” is defined as a photographic image captured by an imaging device installed on a mobile object. The “captured image for inspection” is further distinguished as follows: a “captured image for inspection” captured by the forward camera 4F is referred to as a “captured image for forward inspection”; one captured by the sideward camera 4S as a “captured image for sideward inspection”; and one captured by the backward camera 4R as a “captured image for backward inspection”. Each of the “captured image for forward inspection”, the “captured image for sideward inspection”, and the “captured image for backward inspection” includes multiple images obtained by continuous imaging.
The imaging device 4 continuously captures still images. However, the imaging system 6 may capture a video instead of continuously capturing still images.
An image for determining whether a subject captured by the imaging device 4 is an inspection target object is defined as an “image for determination”. An image for determining whether to start or stop imaging of an inspection target object is defined as a “reference image for imaging”. An image for inspecting an inspection target object by comparison with multiple captured images for inspection obtained by imaging is defined as a “reference image for inspection”. The “image for determination”, the “reference image for imaging”, and the “reference image for inspection” are all images imaged at substantially the same imaging position and imaging angle as in the case where the inspection target object A is imaged from the imaging device 4 installed on the mobile object 3.
The definitions of the images used for imaging operations and inspection operations will be described in more detail. Note that values such as distances and numbers of images are provided for the sake of convenience in order to make the description easier to understand.
The “image for determination” is an image for determining whether a subject imaged (detected) by the imaging device is the inspection target object A in detection operations of an inspection target object, and for determining whether to start the imaging control operations (Step S223 described below).
The “reference image for imaging” is an image for determining whether to save the subject imaged by the imaging device 4 as a captured image for inspection used for inspection in the imaging operations of the inspection target object A. The “captured image for inspection” is imaged at predetermined intervals within a predetermined range forward and backward along the moving direction of the mobile object 3, centered on the position where the “reference image for imaging” is imaged. The “reference image for imaging” is an image captured near the center of this predetermined range. There are three types of “reference images for imaging”: a “reference image for forward imaging” for the forward camera 4F; a “reference image for sideward imaging” for the sideward camera 4S; and a “reference image for backward imaging” for the backward camera 4R.
For example, in the case where the inspection target object A present in the forward direction of the mobile object 3 is imaged by the forward camera 4F installed in the forward direction of the mobile object 3, if the inspection target object A is imaged within a range from 1 m to 20 cm in the forward direction of the mobile object 3, the “reference image for forward imaging” is a captured image of the inspection target object A imaged at a position 60 cm away in the forward direction of the mobile object 3.
In the case where the inspection target object A present in the sideward direction of the mobile object 3 is imaged by the sideward camera 4S installed in the sideward direction of the mobile object 3, the “reference image for sideward imaging” is a captured image of the inspection target object positioned on the optical axis of the sideward camera 4S.
For example, in the case where the inspection target object A present in the backward direction of the mobile object 3 is imaged by the backward camera 4R installed in the backward direction of the mobile object 3, if the inspection target object A is imaged within a range from 1 m to 20 cm in the backward direction of the mobile object 3, the “reference image for backward imaging” is a captured image of the inspection target object A imaged at a position 60 cm away in the backward direction of the mobile object 3.
The “reference image for inspection” is an image for extracting deformation of the inspection target object A, and the deformation extraction is executed by comparing the “reference image for inspection” with the “captured image for inspection”. With respect to the inspection target object A before use that has not deteriorated, an image captured at substantially the same imaging position and imaging angle as in the case where the inspection target object A is captured from the imaging device 4 installed on the mobile object 3 is set as the “reference image for inspection”. The “reference image for inspection” may also be set as the “reference image for imaging”. However, the inspection target object A may have deteriorated due to color fading or change in shape. Therefore, among the “captured images for inspection” imaged when the previous inspection was executed, a “captured image for inspection” imaged at substantially the same imaging position and imaging angle as the “reference image for imaging” may be set as the “reference image for inspection”.
There are three types of “reference images for inspection”: a “reference image for forward inspection” for inspecting a “captured image for forward inspection”; a “reference image for sideward inspection” for inspecting a “captured image for sideward inspection”; and a “reference image for backward inspection” for inspecting a “captured image for backward inspection”.
For example, in the case of inspecting a “captured image for forward inspection” imaged by the forward camera 4F installed in the forward direction of the mobile object 3, in which the inspection target object A is present in the forward direction of the mobile object 3, if inspection is to be executed on the “captured image for forward inspection” in which the inspection target object A is imaged within a range from 1 m to 20 cm in the forward direction of the mobile object 3, the “reference image for forward inspection” includes multiple images captured in predetermined increments within the range from 1 m to 20 cm in the forward direction of the mobile object 3. For example, in the case where the inspection target object A is imaged in increments of 20 cm, the images are five images of the inspection target object A captured at positions 1 m, 80 cm, 60 cm, 40 cm, and 20 cm in the forward direction.
In the case of inspecting a “captured image for sideward inspection” imaged by the sideward camera 4S installed in the sideward direction of the mobile object 3, in which the inspection target object A is present in the sideward direction of the mobile object 3, the “reference image for sideward inspection” includes a captured image of the inspection target object A positioned on the optical axis of the sideward camera 4S, and captured images of the inspection target object positioned at predetermined angles from the optical axis within the angle of view of the photographic lens of the sideward camera 4S. For example, the images are five captured images of the inspection target object A positioned at +40 degrees, +20 degrees, 0 degrees, −20 degrees, and −40 degrees from the optical axis.
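The timings at which the inspection target object appears at these angles can be derived geometrically; the following sketch assumes that the optical axis of the sideward camera 4S is perpendicular to the traveling direction, and the lateral distance and speed are illustrative values.

```python
# Minimal sketch (assumptions: optical axis perpendicular to the traveling
# direction; lateral distance and speed are illustrative values).
import math

def sideward_offsets_s(angles_deg, lateral_distance_m, speed_mps):
    """Along-track time offsets (s), relative to the moment the inspection
    target object crosses the optical axis, at which it is seen at each angle.

    At angle theta, the object lies lateral_distance_m * tan(theta) ahead of
    (positive) or behind (negative) the optical axis along the traveling direction.
    """
    return [lateral_distance_m * math.tan(math.radians(a)) / speed_mps
            for a in angles_deg]

# Example: 3 m lateral distance at 2 m/s for the five angles in the text.
print(sideward_offsets_s([40, 20, 0, -20, -40], 3.0, 2.0))
# -> approximately [1.26, 0.55, 0.0, -0.55, -1.26]
```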
In the case of inspecting a “captured image for backward inspection” imaged by the backward camera 4R installed in the backward direction of the mobile object 3, in which the inspection target object A is present in the backward direction of the mobile object 3, if inspection is to be executed on the “captured image for backward inspection” in which the inspection target object A is imaged within a range from 20 cm to 1 m in the backward direction of the mobile object 3, the “reference image for backward inspection” includes multiple images captured in predetermined increments within the range from 20 cm to 1 m in the backward direction of the mobile object 3. For example, in the case where the inspection target object A is imaged in increments of 20 cm, the images are five images of the inspection target object A captured at positions 20 cm, 40 cm, 60 cm, 80 cm, and 1 m in the backward direction.
With reference to the drawings, captured images obtained by the forward camera 4F will be described. As the mobile object 3 approaches the inspection target object A, the forward camera 4F obtains a captured image If1, a captured image If2, and a captured image If3 in this order.
The captured image If1 or the captured image If2 corresponds to an example of the image for determination. The captured image If3 corresponds to an example of the reference image for forward inspection or an example of the captured image for forward inspection. The captured image If1, the captured image If2, and the captured image If3 each correspond to an example of a first captured image.
The imaging system 6 can obtain the reference image for sideward inspection and the captured image for sideward inspection according to the distance or the like also for the captured image Is by the sideward camera 4S, in substantially the same way as the image If captured by the forward camera 4F. In addition, the imaging system 6 can obtain the reference image for backward inspection and the captured image for backward inspection according to the distance or the like also for the captured image Ir by the backward camera 4R, in substantially the same way as the image If captured by the forward camera 4F.
With reference to the drawings, the databases stored in the storage unit 6000 and the storage unit 7000 will be described.
The inspection target object front surface management DB 6011 stores inspection target object identification information, an inspection target object name, a captured image sample of the inspection target object, and an estimated distance between the mobile object 3 and the inspection target object A in association with one another.
The inspection target object identification information is information used for identifying an inspection target object A. For example, a code having arbitrary alphanumeric characters is used as the inspection target object identification information. The inspection target object name is information indicating a classification of the inspection target object; names indicating classifications of inspection target objects A, such as guide sign and regulation sign, may be enumerated. The captured image sample of the inspection target object is used as a reference training image in the DNN. The estimated distance between the mobile object 3 and the inspection target object A is a distance detected by the image analysis unit 67 described above.
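For illustration only, one record of the inspection target object front surface management DB 6011 could be represented as follows; the field types are assumptions, since the disclosure describes the DB only at the level of the items it associates.

```python
# Minimal sketch (field types are assumptions).
from dataclasses import dataclass

@dataclass
class FrontSurfaceRecord:
    identification_info: str     # code of arbitrary alphanumeric characters identifying the object
    name: str                    # classification, e.g. "guide sign" or "regulation sign"
    image_sample: str            # path to the captured image sample used as a reference training image
    estimated_distance_m: float  # estimated distance between the mobile object 3 and the object
```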
The front surface deformation state management DB 7011 stores the inspection target object identification information, the inspection target object name, and the like.
The inspection target object identification information and the inspection target object name are the same as those stored in the inspection target object front surface management DB 6011 described above.
First, at Step S211, the imaging system 6 starts up the imaging device 4.
Next, at Step S212, the imaging system 6 executes detection and imaging processes of the inspection target object A by the determination unit 65. At this time, the imaging system 6 executes the detection and imaging processes by referring to the inspection target object front surface management DB 6011, the inspection target object side surface management DB 6012, and the inspection target object rear surface management DB 6013. After executing the detection and imaging processes, the imaging system 6 stores the results in the storage unit 6000.
Next, at Step S213, the image analysis device 7 requests the imaging system 6 to obtain captured image data.
Next, at Step S214, in response to the request from the image analysis device 7, the imaging system 6 reads captured image data from the storage unit 6000.
Next, at Step S215, the imaging system 6 transmits the captured image data related to the inspection target object A to the image analysis device 7 as a response to the request.
Next, at Step S216, the image analysis device 7 stores the captured image data received from the imaging system 6 in the storage unit 7000.
Next, at Step S217, the image analysis device 7 refers to the front surface deformation state management DB 7011, the side surface deformation state management DB 7012, and the rear surface deformation state management DB 7013 by the determination unit 75, and executes a detection process of damage to the inspection target object A as an image analysis process.
Next, at Step S218, the image analysis device 7 notifies the communication terminal 8 of the analysis result.
Next, at Step S219, the communication terminal 8 displays the analysis result by the image analysis device 7.
Next, at Step S220, the communication terminal 8 transmits the analysis result response to the image analysis device 7.
As described above, the information processing system 1 can execute the process of detecting the inspection target object A and detecting its damage, and display the result on the display 807 or the like of the communication terminal 8.
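The request/response flow of Steps S213 to S220 can be summarized by the following sketch; the class and method names are illustrative and not the disclosed interfaces.

```python
# Minimal sketch (illustrative pseudo-protocol; names are assumptions).
class ImagingSystem:
    def __init__(self, storage):
        self.storage = storage                    # storage unit 6000

    def handle_request(self):
        return self.storage["captured_images"]    # S214: read, S215: transmit

class CommunicationTerminal:
    def show(self, result):
        print("analysis result:", result)         # S219: display the analysis result
        return "ack"                              # S220: analysis result response

class ImageAnalysisDevice:
    def __init__(self, analyzer):
        self.storage = {}                         # storage unit 7000
        self.analyzer = analyzer                  # damage detection as image analysis

    def run(self, imaging_system, terminal):
        data = imaging_system.handle_request()    # S213: request captured image data
        self.storage["captured_images"] = data    # S216: store received data
        result = self.analyzer(data)              # S217: detect damage
        return terminal.show(result)              # S218: notify the terminal

# Example wiring with a trivial analyzer.
device = ImageAnalysisDevice(analyzer=lambda imgs: {"damage": []})
system = ImagingSystem({"captured_images": ["If1", "If2", "If3"]})
device.run(system, CommunicationTerminal())
```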
The imaging system 6 mainly executes “detection of an inspection target object” and “imaging of an inspection target object” as imaging control operations. The “detection of an inspection target object” is an operation to determine whether an inspection target object appears in an image captured by the forward camera 4F imaging the forward direction of the mobile object 3. Here, for “detection of an inspection target object”, it is not necessary to use the same imaging device as used for “imaging of an inspection target object”. In addition, a method other than the method of detecting the inspection target object A from an image in which the forward direction of the mobile object 3 is imaged may be used. In other words, “detection of an inspection target object” is not limited to imaging an inspection target object. In the following, although an example is described that uses, for “detection of an inspection target object”, the same imaging device as used in “imaging of an inspection target object”, specifically, the forward camera 4F for imaging the front surface of an inspection target object A, this is merely an example.
The operation mode of the imaging control device 5 includes a “detection mode” for detecting an inspection target object A and an “imaging mode” for imaging the detected inspection target object. These two operation modes are exclusive in the examples illustrated in the present specification: the operations are executed in the “detection mode” at the initial stage when the mobile object 3 starts moving, and in the “imaging mode” once the inspection target object A is detected. Once imaging of the detected inspection target object A is completed, the operation returns to the “detection mode”.
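The exclusive switching between the two modes corresponds to the loop of Steps S221 to S223 described below, and can be sketched as follows; detect() and image_target() stand in for the detailed flows and are illustrative assumptions.

```python
# Minimal sketch (illustrative; the callbacks stand in for the flows below).
def control_loop(frames, detect, image_target):
    mode = "detection"                  # initial mode when the mobile object 3 starts moving
    for frame in frames:
        if mode == "imaging":           # S221: imaging operation in progress?
            if image_target(frame):     # S223: execute/continue the imaging process
                mode = "detection"      # imaging completed: return to detection
        elif detect(frame):             # S222: detection process
            mode = "imaging"            # target detected: switch to imaging

# Example: the target is detected at frame 2 and imaged over frames 3 and 4.
remaining = {"n": 2}
def image_target(f):
    remaining["n"] -= 1
    return remaining["n"] == 0
control_loop(range(6), detect=lambda f: f == 2, image_target=image_target)
```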
First, at Step S221, the imaging system 6 determines whether an imaging operation of the inspection target object is in progress.
If it is determined at Step S221 that an imaging operation is in progress (YES at Step S221), the imaging system 6 executes or continues the imaging process of the inspection target object A at Step S223. On the other hand, if it is determined that an imaging operation is not in progress (NO at Step S221), the imaging system 6 executes a detection process of the inspection target object A at Step S222.
Flow charts of the detailed processing at Step S222, i.e., the “process of detecting a target object to be inspected”, will be described below.
Two cases can be considered upon executing the detection operations of an inspection target object: a case where the detection is executed automatically, and a case where the detection is executed manually by an operator of the imaging system 6.
In the analysis of the captured image for detecting an inspection target object, it is determined whether a subject as a candidate for the inspection target object A appears in the image plane of the captured image. Whether or not the subject as a candidate for the inspection target object A appears in the image plane of the captured image can be determined using a known image analysis process (see, for example, Patent Document 1).
In the case where a subject as a candidate for the inspection target object A appears in the image plane, whether the subject is the inspection target object A is determined. This determination is made using a reference image (image for determination) for determining whether the subject is the inspection target object A. The determination of whether the subject is the inspection target object A is made in a state where the distance from the mobile object 3 to the subject is greater than in the imaging for inspection. In other words, the size of the subject (a candidate for the inspection target object A) in the image plane is smaller than the size of the subject (the inspection target object A) in the image plane in the imaging for inspection. Therefore, the reference image (image for determination) for determining whether the subject is the inspection target object A is a different image from the “reference image for imaging” used for imaging determination of the inspection image.
As described above, the reference image (image for determination) for determining whether the subject is the inspection target object A is, for example, the captured image If1 described above.
In the case where it is determined that a candidate of an inspection target object in the image plane is the inspection target object A, the imaging system 6 determines the inspection target object A as an object to be imaged, sets the “reference image for imaging” corresponding to the determined inspection target object A, and executes a setting process for imaging the inspection target object A.
The imaging start timing is calculated based on the imaging position of the image used for determining whether the candidate of the inspection target object in the image plane is the inspection target object A, the moving speed of the mobile object 3, and the like. An imaging start timing is set for each of the forward camera 4F, the sideward camera 4S, and the backward camera 4R.
In the calculation of the imaging start timing, a ratio is calculated between the size (range in the image plane) of the inspection target object A in the image used for determining whether the candidate in the image plane is the inspection target object A, and the size (range in the image plane) of the inspection target object A in the “reference image for imaging” used for imaging determination of the inspection image. From this ratio, the distance from the imaging position of the image used for the determination to the imaging start position is calculated, and the imaging start timing is calculated based on this distance and the moving speed of the mobile object 3.
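As a worked example of this calculation, the following sketch assumes that the apparent size of the inspection target object A is inversely proportional to its distance (a pinhole-camera approximation); the numeric values are illustrative.

```python
# Minimal sketch (assumption: apparent size scales as 1/distance).
def imaging_start_delay_s(size_at_detection_px: float,
                          size_in_reference_px: float,
                          reference_distance_m: float,
                          speed_mps: float) -> float:
    # Size ratio -> distance at the detection position.
    distance_at_detection_m = (reference_distance_m
                               * size_in_reference_px / size_at_detection_px)
    # Time to travel from the detection position to the imaging start position.
    return (distance_at_detection_m - reference_distance_m) / speed_mps

# Example: the target appears 100 px tall at detection and 500 px tall in the
# "reference image for imaging" taken at 0.6 m; at 2 m/s, imaging starts in 1.2 s.
print(imaging_start_delay_s(100.0, 500.0, 0.6, 2.0))  # -> 1.2
```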
Note that in the imaging of the inspection image, multiple images are captured in the vicinity (forward and backward) of the “reference image for imaging” used for imaging the inspection image. Reasons for this include, for example, the calculation error of the distance from the position where the inspection target object A is detected to the position where it is imaged, and changes in the moving speed of the mobile object 3, as described later.
For example, the automatic detection operations at Step S222 are executed as follows. First, at Step S231, the imaging system 6 determines whether it is a detection timing.
If it is determined at Step S231 that it is not a detection timing (NO at Step S231), the imaging system 6 executes the operation at Step S231 again. On the other hand, if it is determined that it is a detection timing (YES at Step S231), at Step S232, the imaging system 6 executes imaging by the forward camera 4F.
Next, at Step S233, the imaging system 6 stores the captured image in the RAM 503 or the like.
Next, at Step S234, the imaging system 6 executes analysis operations of the captured image, i.e., operations to determine whether the inspection target object A appears in the captured image. The operations at Step S234 will be described later in detail using a flow chart.
Next, at Step S235, the imaging system 6 determines whether the inspection target object A appears in the captured image.
If it is determined at Step S235 that the inspection target object A does not appear in the captured image (NO at Step S235), the imaging system 6 executes the operations at Step S231 and thereafter again. On the other hand, if it is determined that the inspection target object A appears in the captured image (YES at Step S235), at Step S236, the imaging system 6 executes setting operations for imaging the inspection target object A. The operations at Step S236 will be described later in detail using a flow chart.
As above, the imaging system 6 can execute the automatic detection operations.
If it is determined at Step S241 that an imaging command operation is not executed (NO at Step S241), the imaging system 6 executes the operation at Step S241 again. On the other hand, if it is determined that an imaging command operation is executed (YES at Step S241), the imaging system 6 transitions to an operation at Step S242.
As the operations in the subsequent Steps S242 to S246 are the same as the operations in Steps S232 to S236 described above, duplicated description will be omitted.
As above, the imaging system 6 can execute the manual detection operations.
If it is determined at Step S251 that an imaging command operation is not executed (NO at Step S251), at Step S252, the imaging system 6 determines whether it is a detection timing. On the other hand, if it is determined that an imaging command operation is executed (YES at Step S251), the imaging system 6 transitions to an operation at Step S253.
If it is determined at Step S252 that it is not a detection timing (NO at Step S252), the imaging system 6 executes the operation at Step S251 again. On the other hand, if it is determined that it is a detection timing (YES at Step S252), the imaging system 6 transitions to an operation at Step S253.
As the operations in the subsequent Steps S253 to S257 are the same as the operations in Steps S232 to S236 described above, duplicated description will be omitted.
As above, the imaging system 6 can execute the automatic and manual detection operations.
First, at Step S261, the imaging system 6 analyzes the captured image.
Next, at Step S262, the imaging system 6 determines whether a subject as a candidate for the inspection target object A is present in the captured image.
If it is determined at Step S262 that such a subject is not present (NO at Step S262), the imaging system 6 terminates the operations. On the other hand, if it is determined that such a subject is present (YES at Step S262), at Step S263, the imaging system 6 compares the captured image with the image for determination.
Next, at Step S264, the imaging system 6 determines whether the subject present in the captured image is the inspection target object A.
If it is determined at Step S264 that the subject is not the inspection target object A (NO at Step S264), the imaging system 6 terminates the operations. On the other hand, if it is determined that the subject is the inspection target object A (YES at Step S264), at Step S265, the imaging system 6 determines the inspection target object A to be imaged, and sets the “reference image for imaging” used for the imaging determination.
As described above, the imaging system 6 can execute analysis operations of the captured image.
First, at Step S271, the imaging system 6 calculates an imaging start timing from the imaging position of the analyzed image and the moving speed of the mobile object 3.
Next, at Step S272, the imaging system 6 sets the imaging timing of the forward camera 4F.
Next, at Step S273, the imaging system 6 sets the imaging timing of the sideward camera 4S.
Next, at Step S274, the imaging system 6 sets the imaging timing of the backward camera 4R.
After calculating the imaging timing of each of the forward camera 4F, the sideward camera 4S, and the backward camera 4R, at Step S275, the imaging system 6 sets the operation mode to imaging operations of the inspection target object A (“imaging mode”).
As described above, the imaging system 6 can execute the setting operations of the captured image.
After the inspection target object is detected, the imaging system 6 returns to Step S221 described above.
Upon completion of the detection operations of the inspection target object A, the imaging system 6 starts the imaging operations described below.
First, at Step S281, the imaging system 6 executes the imaging operations of the front surface of the inspection target object A by the forward camera 4F.
Next, at Step S282, the imaging system 6 executes the imaging operations of the side surface of the inspection target object A by the sideward camera 4S.
Next, at Step S283, the imaging system 6 executes the imaging operations of the rear surface of the inspection target object A by the backward camera 4R.
Considering a possibility that the imaging ranges (angles of view) of the forward camera 4F, the sideward camera 4S, and the backward camera 4R overlap, the imaging operations at Steps S281 to S283 may be executed so as to partially overlap one another in time.
In imaging the inspection target object A, the imaging system 6 determines whether the subject in the captured image is the same as the detected inspection target object A. Then, upon determining that the subject in the captured image is the same as the detected inspection target object A, the imaging system 6 determines whether the captured image is an image in the vicinity of the “reference image for imaging” used for imaging determination of the inspection image. In the case of determining that it is an image in the vicinity of the “reference image for imaging”, the imaging system 6 stores the captured image as an inspection image in the HD 505 described above.
In this way, in the imaging operations of the inspection target object A, the imaging system 6 determines whether the subject in the captured image is the same as the detected inspection target object A, and whether the captured image is an image in the vicinity of the “reference image for imaging”. Accordingly, with regard to the imaging start timing (the timing for starting the first imaging) set in the detection operations of the inspection target object A, the imaging system 6 can provide a margin in the setting of the imaging timing, in consideration of the calculation error of the distance from the position where the inspection target object A is detected to the position where the inspection target object A is imaged, and of changes in the moving speed of the mobile object 3. The margin in the setting of the imaging timing is approximately half to one tenth of the required imaging time interval.
The subsequent imaging may be set based on the moving speed of the mobile object so as to make the imaging direction of each captured image appropriately different. The method of determining whether the image is in the vicinity of the “reference image for imaging” is different among imaging in the forward direction, imaging in the sideward direction, and imaging in the backward direction.
In the case of imaging the front surface on the mobile object 3 side of the inspection target object A, the imaging system 6 makes a determination using the fact that the inspection target object A in the captured image plane is small at first, and then becomes larger as the mobile object 3 moves. In the case of imaging the side surface on the mobile object 3 side of the inspection target object A, the imaging system 6 makes a determination using the fact that the inspection target object A moves in the horizontal direction in the image plane. In the case of imaging the rear surface on the mobile object 3 side of the inspection target object A, the imaging system 6 makes a determination using the fact that the inspection target object A in the captured image plane is large at first, and then becomes smaller as the mobile object 3 moves.
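These direction-specific behaviors can be expressed as simple tracking heuristics, as in the following sketch; the bounding-box representation and the comparisons are illustrative assumptions.

```python
# Minimal sketch (illustrative heuristics; boxes are (x_center_px, height_px)).
def consistent_with_direction(direction: str, prev_box, curr_box) -> bool:
    """Check that the tracked object behaves as expected for each camera."""
    (prev_x, prev_h), (curr_x, curr_h) = prev_box, curr_box
    if direction == "forward":    # object grows in the image plane while approaching
        return curr_h >= prev_h
    if direction == "sideward":   # object shifts horizontally across the image plane
        return curr_x != prev_x
    if direction == "backward":   # object shrinks in the image plane while receding
        return curr_h <= prev_h
    raise ValueError(direction)
```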
The imaging system 6 also determines whether to terminate the imaging (the imaging operations for inspection images) by using a result of the determination of whether the image is in the vicinity of the “reference image for imaging”. In the case of continuing the imaging, the imaging system 6 sets the next imaging timing, and terminates the process. In the case of terminating the imaging, the imaging system 6 terminates the process without setting the next imaging timing.
First, at Step S291, the imaging system 6 determines whether it is the imaging timing.
If it is determined at Step S291 that it is not the imaging timing (NO at Step S291), the imaging system 6 terminates the operations. On the other hand, if it is determined that it is the imaging timing (YES at Step S291), at Step S292, the imaging system 6 executes imaging of the front surface of the inspection target object A by the forward camera 4F. In other words, the imaging device 4 obtains a first captured image by imaging the front surface of the inspection target object A.
Next, at Step S293, the imaging system 6 stores the captured image in the RAM 503.
Next, at Step S294, the imaging system 6 executes a determination operation on the captured image.
Next, at Step S295, the imaging system 6 determines whether the subject is the inspection target object A.
If it is determined at Step S295 that the subject is not the inspection target object A (NO at Step S295), the imaging system 6 terminates the operations. On the other hand, if it is determined that the subject is the inspection target object A (YES at Step S295), at Step S296, the imaging system 6 determines whether the size of the image corresponding to the inspection target object A is greater than or equal to a predetermined size to start recording.
If it is determined at Step S296 that the size is not greater than or equal to the size to start recording (NO at Step S296), the imaging system 6 terminates the operations. On the other hand, if it is determined that the size is greater than or equal to the size to start recording (YES at Step S296), at Step S297, the imaging system 6 records the captured image on the HD 505 described above.
Next, at Step S298, the imaging system 6 determines whether the size of the image corresponding to the inspection target object A is greater than or equal to a predetermined size to stop the recording.
If it is determined at Step S298 that the size is greater than or equal to the size to stop the recording (YES at Step S298), the imaging system 6 terminates the operations. On the other hand, if it is determined that the size is not greater than or equal to the size to stop the recording (NO at Step S298), at Step S299, the imaging system 6 sets the next imaging timing, and then terminates the operations.
As described above, the imaging system 6 can execute the imaging operations of the front surface of the inspection target object A.
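The size-driven start/stop logic of Steps S296 to S299 can be sketched as follows; the pixel thresholds are assumed values for illustration.

```python
# Minimal sketch (RECORD_START_PX and RECORD_STOP_PX are assumed values).
RECORD_START_PX = 300   # size to start recording (S296)
RECORD_STOP_PX = 900    # size to stop recording (S298)

def front_surface_step(target_height_px: float, recorded: list) -> bool:
    """One pass of the front-surface imaging loop; True while imaging continues."""
    if target_height_px < RECORD_START_PX:
        return True                              # S296 NO: wait for the next timing
    recorded.append(target_height_px)            # S297: record the captured image
    return target_height_px < RECORD_STOP_PX     # S298/S299: continue until stop size

# Example: the object grows in the image plane as the mobile object 3 approaches.
images = []
for h in [100, 250, 400, 700, 950]:
    if not front_surface_step(h, images):
        break
print(images)  # -> [400, 700, 950]
```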
Examples of the operations to start or stop the recording described above are as follows. The same applies to the imaging operations of the side surface and the rear surface described later.
In the case where the next imaging is to be executed, the imaging system 6 sets a timer to count the time until the imaging starts. The determination operation of “imaging timing?” at the beginning of each imaging flow is executed based on this timer.
First, at Step S301, the imaging system 6 determines whether it is the imaging timing.
If it is determined at Step S301 that it is not the imaging timing (NO at Step S301), the imaging system 6 terminates the operations. On the other hand, if it is determined that it is the imaging timing (YES at Step S301), at Step S302, the imaging system 6 executes imaging of the side surface of the inspection target object A by the sideward camera 4S. In other words, the imaging control device 5 in the imaging system 6 controls operations of the imaging device 4 to image the side surface of the inspection target object A so as to obtain a second captured image.
As the operations at Steps S303 to S307 are the same as the operations at Steps S293 to S297 described above, their descriptions are omitted.
Next, at Step S308, the imaging system 6 determines whether an edge of an image corresponding to the inspection target object A is closer to an edge of the captured image than a predetermined position.
If it is determined at Step S308 as being closer to the edge (YES at Step S308), the imaging system 6 executes the operations at Step S301 and thereafter again. On the other hand, if it is determined as not being closer to the edge (NO at Step S308), at Step S309, the imaging system 6 sets the next imaging timing, and then, terminates the operation.
As described above, the imaging system 6 can execute the imaging operations of the side surface of the inspection target object A.
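By way of illustration, the edge-proximity determination at Step S308 could be sketched as follows; the bounding-box representation and the margin value are assumptions.

    # Minimal sketch of the Step S308 determination: is an edge of the image of
    # object A closer to an edge of the captured image than a predetermined
    # position? The margin value is a hypothetical placeholder.
    EDGE_MARGIN_PX = 20  # assumed "predetermined position" from the frame edge

    def near_frame_edge(bbox, frame_w, frame_h, margin=EDGE_MARGIN_PX):
        left, top, right, bottom = bbox  # pixel coordinates of object A's image
        return (left <= margin or top <= margin
                or right >= frame_w - margin or bottom >= frame_h - margin)

    # YES at Step S308 -> repeat from Step S301; NO -> set the next imaging timing.
    print(near_frame_edge((1700, 400, 1915, 700), 1920, 1080))  # True (right edge)
    print(near_frame_edge((800, 300, 1100, 700), 1920, 1080))   # False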
Here, examples of the operations to start or stop recording are as follows.
First, at Step S311, the imaging system 6 determines whether it is the imaging timing.
If it is determined at Step S311 that it is not the imaging timing (NO at Step S311), the imaging system 6 terminates the operations. On the other hand, if it is determined that it is the imaging timing (YES at Step S311), at Step S312, the imaging system 6 executes imaging of the rear surface of the inspection target object A by the backward camera 4R. In other words, the imaging control device 5 in the imaging system 6 controls operations of the imaging device 4 to image the rear surface of the inspection target object A so as to obtain a second captured image.
As the operations at Steps S313 to S319 are the same as the operations at Steps S293 to S299 described above, their descriptions are omitted.
If it is determined at Step S318 that the size is greater than or equal to the size to stop recording (YES at Step S318), the imaging system 6 sets the operation mode to the detection operations ("detection mode") of the inspection target object A, and clears the state of executing the imaging operation of the inspection target object A in the operation mode (Step S320). Thereafter, the imaging system 6 returns to Step S221.
As described above, the imaging system 6 can execute the imaging operations of the rear surface of the inspection target object A.
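As a further illustrative sketch, the transition at Steps S318 to S320 could be expressed as follows; the mode names and the threshold value are hypothetical.

    # Minimal sketch of Steps S318 to S320: once the recording-stop size is
    # reached, return to the detection operations ("detection mode") and clear
    # the imaging-in-progress state. Names and the threshold are assumptions.
    SIZE_TO_STOP = 0.60  # hypothetical recording-stop size

    def rear_surface_post_step(state, target_size):
        if target_size >= SIZE_TO_STOP:           # Step S318
            state["mode"] = "detection"           # Step S320: back to detection mode
            state["imaging_in_progress"] = False  # clear the imaging state
        else:
            state["next_timing_set"] = True       # Step S319: continue imaging
        return state

    print(rear_surface_post_step({"mode": "imaging"}, 0.72))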
With reference to the drawings, a positional relationship between the mobile object 3 and the inspection target object A at the respective timings will be described.
The position P1 represents a position of the mobile object 3 at the timing t1. The position P2 represents a position of the mobile object 3 at the timing t2. The position P3 represents a position of the mobile object 3 at the timing t3. The position P4 represents a position of the inspection target object A.
The distance between the mobile object 3 and the inspection target object A at the position P1 is, for example, 1.2 m. The distance between the mobile object 3 and the inspection target object A at the position P2 is, for example, 1.0 m. The distance between the mobile object 3 and the inspection target object A at the position P3 is, for example, 0.2 m. Note that the distance here means a minimum distance between the front surface of the inspection target object A and the imaging surface of the forward camera 4F.
The position P4 represents a position to start the imaging of the side surface of the inspection target object A by the imaging system 6. The position P5 represents a position to stop the imaging of the side surface of the inspection target object A by the imaging system 6. Note that the distance here means a minimum distance between the side surface of the inspection target object A and the imaging surface of the sideward camera 4S.
The position P6 represents a position of the inspection target object A. The position P7 represents a position of the mobile object 3 at the timing t10. The position P8 represents a position of the mobile object 3 at the timing t11.
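Purely as an illustration of the example distances above, the following sketch gates front-surface imaging on the minimum distance; treating 1.2 m and 0.2 m as the start and stop distances is an assumption, not a value disclosed as such.

    # Hypothetical sketch: gate front-surface imaging on the minimum distance
    # between the front surface of object A and the imaging surface of the
    # forward camera 4F. The start/stop interpretation of 1.2 m / 0.2 m is an
    # assumption based on the example values above.
    START_DISTANCE_M = 1.2
    STOP_DISTANCE_M = 0.2

    def within_front_imaging_range(distance_m):
        return STOP_DISTANCE_M <= distance_m <= START_DISTANCE_M

    for timing, dist in [("t1", 1.2), ("t2", 1.0), ("t3", 0.2)]:
        print(timing, dist, within_front_imaging_range(dist))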
In the embodiment described above, although it has been described that an inspection target object is imaged by using three cameras (the forward camera 4F, the sideward camera 4S, and the backward camera 4R), the inspection target object may be imaged by a single camera that changes its orientation (imaging direction).
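By way of illustration, a single-camera variant could select its orientation from the bearing of the target relative to the traveling direction, as in the following sketch; the 45-degree and 135-degree thresholds are assumptions, not values from the embodiment.

    # Hypothetical sketch: choose the imaging direction of a single camera from
    # the bearing of object A in the mobile-object frame. Angle thresholds are
    # placeholders.
    import math

    def camera_orientation(dx, dy):
        """dx: offset of object A along the traveling direction; dy: lateral offset."""
        bearing = math.degrees(math.atan2(dy, dx))  # 0 deg = straight ahead
        if abs(bearing) < 45:
            return "front"  # role of the forward camera 4F
        if abs(bearing) <= 135:
            return "side"   # role of the sideward camera 4S
        return "rear"       # role of the backward camera 4R

    print(camera_orientation(5.0, 1.0))   # front
    print(camera_orientation(0.0, 2.0))   # side
    print(camera_orientation(-3.0, 0.5))  # rear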
A table 8111 shows information on detailed inspection results such as positional information on the inspection target object on a map, map information on the inspection target object, presence or absence of a defect on the inspection target object A, an image showing a defect on the inspection target object A, and comments indicating the analysis result. The button 8251 is a UI (User Interface) part operated by an observer of the screen 8110 in the case where the observer confirms the inspection results.
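For illustration only, one row of the table 8111 could be represented by a record such as the following; the field names are placeholders inferred from the description above, not identifiers disclosed in the embodiment.

    # Hypothetical sketch of one row of the detailed-inspection-result table 8111.
    # Field names are placeholders inferred from the description above.
    from dataclasses import dataclass

    @dataclass
    class InspectionResultRow:
        position: tuple         # positional information on the map (e.g., lat, lon)
        map_info: str           # map information on the inspection target object
        has_defect: bool        # presence or absence of a defect on object A
        defect_image_path: str  # image showing the defect on object A
        comments: str           # comments indicating the analysis result

    row = InspectionResultRow((35.68, 139.77), "Route 1, km 12.3", True,
                              "defects/sign_0042.png", "Loosened fixing member.")
    print(row)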
As described above, the imaging system 6 according to the present embodiment includes the imaging device 4 and the imaging control device 5 to control operations of the imaging device 4. The imaging device 4 is installed on the mobile object 3 to image an inspection target object A while the mobile object 3 is moving. A side visible from the imaging device 4 when the inspection target object A is positioned in the forward direction with respect to the traveling direction 30 of the mobile object 3 is defined as the front surface Af of the inspection target object A; a side visible from the imaging device 4 when the mobile object 3 is positioned in the sideward direction with respect to the traveling direction 30 is defined as the side surface As of the inspection target object A; and a side visible from the imaging device 4 when the mobile object 3 is positioned in the backward direction with respect to the traveling direction 30 is defined as the rear surface Ar of the inspection target object A. The imaging device 4 images the front surface of the inspection target object A at a first timing when the inspection target object A is positioned in the forward direction with respect to the traveling direction 30, to obtain a first captured image. The imaging control device 5 controls operations of the imaging device 4, to obtain a second captured image by imaging at least one of the side surface or the rear surface of the inspection target object A, at a second timing when the inspection target object A is positioned in at least one of the sideward direction or the backward direction with respect to the traveling direction 30. In the present embodiment, by using the forward camera 4F, the sideward camera 4S, and the backward camera 4R, multiple captured images of the inspection target object captured from different directions at multiple viewpoints can be obtained.
In addition, in the present embodiment, the information processing system 1 inspects the inspection target object A using the imaging system 6. Accordingly, multiple captured images captured from different directions at multiple viewpoints can be obtained that are useful for three-dimensional inspection of the inspection target object A from various viewpoints. By analyzing the multiple captured images captured from the different directions at the multiple viewpoints, the information processing system 1 can three-dimensionally inspect the inspection target object from the various viewpoints. In addition, the information processing system 1 prepares an imaging reference image for capturing an image that is easily compared with an inspection reference image, and executes imaging control based on the imaging reference image; hence, the image used for inspection can be captured efficiently. By preparing the inspection reference image to be compatible with the method of obtaining the inspection image, and capturing the inspection image so as to be easily compared with the inspection reference image, the inspection can be executed efficiently.
In addition, the imaging system 6 includes the target object detection unit 90 that detects the inspection target object A based on at least one of the first captured image or the second captured image; the damage detection unit 92 that detects damage to the inspection target object A based on at least one of the first captured image or the second captured image; and the display control unit 84 that displays a detection result of the damage. With this configuration, the detection result of the damage to the inspection target object A can be displayed on the display 807 of the communication terminal 8 or the like, and thereby, an operator of the communication terminal 8 or the like can easily recognize the inspection result of the inspection target object A.
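As an illustrative sketch, the chaining of the three units could be organized as follows; the function bodies are stand-ins keyed on file names, not the disclosed implementations.

    # Hypothetical sketch of chaining the target object detection unit 90, the
    # damage detection unit 92, and the display control unit 84. Bodies are
    # placeholders, not the disclosed implementations.
    def detect_target(images):          # role of the target object detection unit 90
        return [img for img in images if "target" in img]

    def detect_damage(target_images):   # role of the damage detection unit 92
        return [{"image": img, "damaged": "damaged" in img} for img in target_images]

    def display_results(results):       # role of the display control unit 84
        for r in results:               # e.g., rendered on the display 807
            print(r["image"], "->", "damage" if r["damaged"] else "no damage")

    captured = ["target_damaged_01.png", "target_ok_02.png", "background_03.png"]
    display_results(detect_damage(detect_target(captured)))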
In addition, in the present embodiment, the inspection target object A includes a sign A1, a pillar part A2 supporting the sign A1, and a fixing member A3 fixing the sign A1 to the pillar part A2. The damage detection unit 92 may detect at least one of deformation of the sign A1 from its predetermined shape, deformation of the pillar part A2, or loosening of the fixing member A3, based on at least one of the first captured image or the second captured image. Accordingly, for the inspection target object A in which the sign A1, such as a road sign, is fixed to the pillar part A2 by the fixing member A3, each of the sign A1, the pillar part A2, and the fixing member A3 can be inspected in various aspects.
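For illustration only, the per-part checks could be organized as follows; the three stub checks are placeholders keyed on file names, standing in for image analysis.

    # Hypothetical sketch of per-part damage checks on the sign A1, the pillar
    # part A2, and the fixing member A3. The stub checks are placeholders.
    def sign_deformed(image):    return "bent_sign" in image
    def pillar_deformed(image):  return "bent_pillar" in image
    def fixing_loosened(image):  return "loose_bolt" in image

    def inspect_parts(first_image, second_image):
        return {
            "sign_A1_deformed": sign_deformed(first_image),
            "pillar_A2_deformed": pillar_deformed(second_image),
            "fixing_A3_loosened": fixing_loosened(second_image),
        }

    print(inspect_parts("front_bent_sign.png", "side_loose_bolt.png"))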
In addition, in the present embodiment, the imaging system 6 may further include the distance detection unit 91 to detect a distance between the inspection target object A and the mobile object 3, and the target object detection unit 90 may detect the inspection target object A based on at least one of the first captured image, the second captured image, or the distance between the inspection target object A and the mobile object 3. By using information on the distance between the inspection target object A and the mobile object 3, the accuracy of detection of the inspection target object A can be improved.
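As a final illustrative sketch, the distance information could gate image-based detections as follows; the threshold values and the fusion rule are assumptions, not the disclosed detection method.

    # Hypothetical sketch: combine an image-based detection score with the
    # distance from the distance detection unit 91. Thresholds are assumptions.
    MAX_RANGE_M = 5.0      # assumed maximum plausible distance to object A
    SCORE_THRESHOLD = 0.5  # assumed image-detection score threshold

    def accept_candidate(image_score, distance_m):
        if distance_m > MAX_RANGE_M:   # too far away: likely a false detection
            return False
        return image_score >= SCORE_THRESHOLD

    print(accept_candidate(0.8, 1.2))   # True: plausible distance, high score
    print(accept_candidate(0.8, 12.0))  # False: rejected by the distance gate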
In the embodiment described above, although a vehicle is taken as an example of the mobile object, the mobile object according to the embodiment is not limited to a vehicle, and may be a flying object, a ship, or the like. The flying object includes an aircraft, a drone, and the like. The imaging system, the imaging method, the program, and the information processing system according to the embodiment can be applied to inspection of "road attachments" and the like in daily road inspection services in the social infrastructure business.
As above, although the embodiments have been described, the present invention is not limited to the above embodiments, and various modifications and improvements can be made within the scope of the present invention.
The respective functions of the embodiments may be implemented by one or more processing circuits. Here, in the present specification, a "processing circuit" includes a processor programmed by software to execute the respective functions, such as a processor implemented by an electronic circuit, as well as a device designed to execute the respective functions described above, such as an ASIC (Application Specific Integrated Circuit), a DSP (Digital Signal Processor), an FPGA (Field Programmable Gate Array), or a conventional circuit module.