REMOTE OPERATION SYSTEM, REMOTE OPERATION CONTROL METHOD, AND REMOTE OPERATOR TERMINAL

Abstract
A remote operation system controls a remote operation of a moving body performed by a remote operator. The remote operation system acquires an image captured by a camera installed on the moving body. The remote operation system determines, based on the image, an environmental condition under which the image is captured. The remote operation system performs visibility improvement processing that improves visibility of the image according to the environmental condition, and presents an improved image with the improved visibility to the remote operator. When the visibility improvement processing according to a weather condition among environmental conditions is performed, the remote operation system determines whether a travel restriction condition is satisfied based on weather information at a position of the moving body. When the travel restriction condition is satisfied, the remote operation system performs travel restriction processing that restricts travel of the moving body.
Description
CROSS-REFERENCE TO RELATED APPLICATION

This application claims priority to Japanese Patent Application No. 2022-018147 filed on Feb. 8, 2022, the entire contents of which are incorporated by reference herein.


BACKGROUND
Technical Field

The present disclosure relates to a technique for controlling a remote operation of a moving body performed by a remote operator.


Background Art

Patent Literature 1 discloses a technique for improving visibility of a local region with poor visibility while maintaining visibility of an entire image. More specifically, a shadow region in an image captured by an imaging device is recognized. Then, a pixel value of each pixel belonging to the shadow region is changed such that a feature amount (for example, luminance) of the shadow region coincides with the feature amount of the other region.


Non-Patent Literature 1 discloses an image recognition technique using ResNet (Deep Residual Net).


Non-Patent Literature 2 discloses a technique for recognizing a scene such as weather from an image by using Deep Residual Learning.


Non-Patent Literature 3 discloses a technique that uses a convolutional neural network (CNN) to improve a hazy image caused by fog and the like (dehazing, defogging).


Non-Patent Literature 4 discloses a technique (EnlightenGAN) that converts a low-illuminance image into a normal-light image by using deep learning. For example, this makes it possible to correct an image captured in a scene such as nighttime or backlight to have appropriate brightness.


Non-Patent Literature 5 discloses a technique for improving a hazy image caused by fog, rain, and the like (dehazing, deraining).


List of Related Art

Patent Literature 1: Japanese Patent Application Laid-Open No. JP-2007-272477


Non-Patent Literature 1: Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun, “Deep Residual Learning for Image Recognition”, arXiv:1512.03385 [cs.CV], Dec. 10, 2015 (https://arxiv.org/pdf/1512.03385.pdf)


Non-Patent Literature 2: Mohamed R. Ibrahim, James Haworth, and Tao Cheng, “WeatherNet: Recognising weather and visual conditions from street-level images using deep residual learning”, arXiv:1910.09910 [cs.CV], Oct. 22, 2019 (https://arxiv.org/ftp/arxiv/papers/1910/1910.09910.pdf)


Non-Patent Literature 3: Boyi Li, Xiulian Peng, Zhangyang Wang, Jizheng Xu, and Dan Feng, “AOD-Net: All-in-One Dehazing Network”, ICCV, 2017 (https://openaccess.thecvf.com/content_ICCV_2017/papers/Li_AOD-Net_All-In-One_Dehazing_ICCV_2017_paper.pdf)


Non-Patent Literature 4: Yifan Jiang, Xinyu Gong, Ding Liu, Yu Cheng, Chen Fang, Xiaohui Shen, Jianchao Yang, Pan Zhou, and Zhangyang Wang, “EnlightenGAN: Deep Light Enhancement without Paired Supervision”, arXiv:1906.06972 [cs.CV], Jun. 17, 2019 (https://arxiv.org/pdf/1906.06972.pdf)


Non-Patent Literature 5: Dongdong Chen, Mingming He, Qingnan Fan, Jing Liao, Liheng Zhang, Dongdong Hou, Lu Yuan, and Gang Hua, “Gated Context Aggregation Network for Image Dehazing and Deraining”, arXiv:1811.08747 [cs.CV], Dec. 15, 2018 (https://arxiv.org/abs/1811.08747)


SUMMARY

A remote operation of a moving body (e.g., a vehicle, a robot) performed by a remote operator is considered. In the remote operation of the moving body, an image captured by a camera installed on the moving body is used. Visibility of the image captured by the camera is affected by environmental conditions such as weather and time. Therefore, in order to improve accuracy of the remote operation, it is conceivable to perform image processing for improving the visibility of the image. In that case, however, although the visibility is improved, a gap arises between the image and the actual environment around the moving body. Since the image viewed by the remote operator looks better than the actual environment around the moving body, the remote operator may perform a remote operation that is not appropriate for the actual environment.


An object of the present disclosure is to provide a technique capable of securing safety of a remote operation of a moving body performed by a remote operator.


A first aspect is directed to a remote operation system that controls a remote operation of a moving body performed by a remote operator.


The remote operation system includes one or more processors.


The one or more processors are configured to:


acquire an image captured by a camera installed on the moving body;


determine, based on the image, an environmental condition under which the image is captured;


perform visibility improvement processing that improves visibility of the image according to the environmental condition;


present an improved image with the improved visibility to the remote operator;


when the visibility improvement processing according to a weather condition among environmental conditions is performed, determine whether or not a travel restriction condition is satisfied based on weather information at a position of the moving body; and


when the travel restriction condition is satisfied, perform travel restriction processing that restricts travel of the moving body.


A second aspect is directed to a remote operation control method for controlling a remote operation of a moving body performed by a remote operator.


The remote operation control method includes:


acquiring an image captured by a camera installed on the moving body;


determining, based on the image, an environmental condition under which the image is captured;


performing visibility improvement processing that improves visibility of the image according to the environmental condition;


presenting an improved image with the improved visibility to the remote operator;


when the visibility improvement processing according to a weather condition among environmental conditions is performed, determining whether or not a travel restriction condition is satisfied based on weather information at a position of the moving body; and


when the travel restriction condition is satisfied, performing travel restriction processing that restricts travel of the moving body.


A third aspect is directed to a remote operator terminal on a side of a remote operator performing a remote operation of a moving body.


The remote operator terminal includes one or more processors.


The one or more processors are configured to:


acquire an image captured by a camera installed on the moving body;


determine, based on the image, an environmental condition under which the image is captured;


perform visibility improvement processing that improves visibility of the image according to the environmental condition;


present an improved image with the improved visibility to the remote operator;


when the visibility improvement processing according to a weather condition among environmental conditions is performed, determine whether or not a travel restriction condition is satisfied based on weather information at a position of the moving body; and


when the travel restriction condition is satisfied, perform travel restriction processing that restricts travel of the moving body.


According to the present disclosure, the visibility improvement processing is performed according to the environmental condition under which the image is captured by the camera. When the visibility improvement processing according to the weather condition among the environmental conditions is performed, not only is the improved image presented to the remote operator, but the travel of the moving body is also restricted as necessary. Even if the remote operator performs a remote operation that is not appropriate for the actual environment around the moving body, the travel of the moving body is restricted, and thus the safety is secured. That is, the safety of the remote operation of the moving body performed by the remote operator is secured.





BRIEF DESCRIPTION OF DRAWINGS


FIG. 1 is a schematic diagram showing a configuration example of a remote operation system according to an embodiment of the present disclosure;



FIG. 2 is a conceptual diagram for explaining an overview of an image improvement unit according to an embodiment of the present disclosure;



FIG. 3 is a block diagram showing a functional configuration example of the image improvement unit according to an embodiment of the present disclosure;



FIG. 4 is a flowchart showing processing by the image improvement unit according to an embodiment of the present disclosure;



FIG. 5 is a conceptual diagram for explaining environmental condition determination processing (Step S20) according to an embodiment of the present disclosure;



FIG. 6 is a flowchart showing an example of visibility improvement processing (Step S30) according to an embodiment of the present disclosure;



FIG. 7 is a conceptual diagram for explaining an overview of travel restriction processing according to an embodiment of the present disclosure;



FIG. 8 is a block diagram showing a functional configuration example related to the travel restriction processing according to an embodiment of the present disclosure;



FIG. 9 is a flowchart showing processing related to the travel restriction processing according to an embodiment of the present disclosure;



FIG. 10 is a diagram showing an example of a correspondence relationship between weather information and travel restriction according to an embodiment of the present disclosure;



FIG. 11 is a block diagram showing a functional configuration example related to travel restriction processing according to a modification example;



FIG. 12 is a flowchart showing processing related to the travel restriction processing according to the modification example;



FIG. 13 is a block diagram showing a configuration example of a vehicle according to an embodiment of the present disclosure;



FIG. 14 is a block diagram showing a configuration example of a remote operator terminal according to an embodiment of the present disclosure; and



FIG. 15 is a block diagram showing a configuration example of a management device according to an embodiment of the present disclosure.





EMBODIMENTS

Embodiments of the present disclosure will be described with reference to the accompanying drawings.


1. OVERVIEW OF REMOTE OPERATION SYSTEM

A remote operation (remote driving) of a moving body is considered. Examples of the moving body being a target of the remote operation include a vehicle, a robot, a flying object, and the like. The vehicle may be an autonomous driving vehicle or may be a vehicle driven by a driver. Examples of the robot include a logistics robot, a work robot, and the like. Examples of the flying object include an airplane, a drone, and the like.


As an example, in the following description, a case where the moving body being the target of the remote operation is a vehicle will be considered. When generalizing, “vehicle” in the following description shall be deemed to be replaced with “moving body.”



FIG. 1 is a schematic diagram showing a configuration example of a remote operation system 1 according to the present embodiment. The remote operation system 1 includes a vehicle 100, a remote operator terminal 200, and a management device 300. The vehicle 100 is the target of the remote operation. The remote operator terminal 200 is a terminal device used by a remote operator O when remotely operating the vehicle 100. The remote operator terminal 200 can also be referred to as a remote operation human machine interface (HMI). The management device 300 manages the remote operation system 1. The management of the remote operation system 1 includes, for example, assigning a remote operator O to a vehicle 100 that requires the remote operation. The management device 300 is able to communicate with the vehicle 100 and the remote operator terminal 200 via a communication network. Typically, the management device 300 is a management server on a cloud. The management server may be configured by a plurality of servers that perform distributed processing.


Various sensors including a camera C are installed on the vehicle 100. The camera C images a situation around the vehicle 100 to acquire an image IMG indicating the situation around the vehicle 100. Vehicle information VCL is information acquired by the various sensors and includes the image IMG captured by the camera C. The vehicle 100 transmits the vehicle information VCL to the remote operator terminal 200 via the management device 300. That is, the vehicle 100 transmits the vehicle information VCL to the management device 300, and the management device 300 transfers the received vehicle information VCL to the remote operator terminal 200.


The remote operator terminal 200 receives the vehicle information VCL transmitted from the vehicle 100. The remote operator terminal 200 presents the vehicle information VCL to the remote operator O. More specifically, the remote operator terminal 200 includes a display device, and displays the image IMG and the like on the display device. The remote operator O views the displayed information, recognizes the situation around the vehicle 100, and performs the remote operation of the vehicle 100. Remote operation information OPE is information relating to the remote operation performed by the remote operator O. For example, the remote operation information OPE includes an amount of operation performed by the remote operator O. The remote operator terminal 200 transmits the remote operation information OPE to the vehicle 100 via the management device 300. That is, the remote operator terminal 200 transmits the remote operation information OPE to the management device 300, and the management device 300 transfers the received remote operation information OPE to the vehicle 100.


The vehicle 100 receives the remote operation information OPE transmitted from the remote operator terminal 200. The vehicle 100 performs vehicle travel control in accordance with the received remote operation information OPE. In this manner, the remote operation of the vehicle 100 is realized.


2. IMAGE IMPROVEMENT UNIT
2-1. Overview


FIG. 2 is a conceptual diagram for explaining an overview of an image improvement unit 10 included in the remote operation system 1 according to the present embodiment. The image improvement unit 10 acquires the image IMG captured by the camera C and improves the image IMG. In particular, the image improvement unit 10 improves “visibility” of the image IMG. The processing for improving the visibility of the image IMG is hereinafter referred to as “visibility improvement processing.” The image whose visibility is improved is hereinafter referred to as an “improved image IMG_S.” The improved image IMG_S with the improved visibility is presented to the remote operator O. As a result, accuracy of recognition by the remote operator O is improved, thereby improving the accuracy of the remote operation.


Various factors that reduce the visibility of the image IMG captured by the camera C can be considered. In the present embodiment, the influence on the visibility of the “environmental condition (scene)” under which the image IMG is captured is considered in particular. The environmental condition (scene) means weather, hour, presence or absence of backlight, presence or absence of fog, and the like. For example, the visibility of the image IMG captured in rainy weather is low. As another example, the visibility of the image IMG captured in a dark situation such as nighttime is low. As still another example, the visibility of the image IMG captured under a backlight condition is low. As still another example, the visibility of the image IMG captured in a foggy situation is low. As described above, examples of the factors reducing the visibility of the image IMG captured by the camera C include rain, darkness, backlight, fog, and the like.


It is desired to improve the visibility of the image IMG in consideration of such an environmental condition and to acquire a clear improved image IMG_S. However, it is difficult and cumbersome for the remote operator O to decide what processing should be performed in what order to improve the visibility of the image IMG. In view of the above, the image improvement unit 10 according to the present embodiment is configured to automatically determine the factor reducing the visibility of the image IMG captured by the camera C and to execute appropriate visibility improvement processing according to the factor in an appropriate order.


Hereinafter, processing performed by the image improvement unit 10 according to the present embodiment will be described in more detail.


2-2. Functional Configuration Example and Processing Example


FIG. 3 is a block diagram showing a functional configuration example of the image improvement unit 10 according to the present embodiment. The image improvement unit 10 includes an environmental condition determination unit 20 and a visibility improvement processing unit 30.



FIG. 4 is a flowchart showing the processing performed by the image improvement unit 10 according to the present embodiment. An example of the processing performed by the image improvement unit 10 will be described below with reference to FIGS. 3 and 4.


2-2-1. Image Acquisition Processing (Step S10)

The image improvement unit 10 acquires the image IMG captured by the camera C. The image improvement unit 10 transmits the acquired image IMG to the environmental condition determination unit 20 and the visibility improvement processing unit 30.


2-2-2. Environmental Condition Determination Processing (Step S20)

The environmental condition determination unit 20 automatically determines, based on the acquired image IMG, the environmental condition (scene) under which the image IMG is captured. Examples of the technique for determining the environmental condition based on the image IMG include the techniques described in Non-Patent Literature 1 and Non-Patent Literature 2 described above.



FIG. 5 is a conceptual diagram for explaining the environmental condition determination processing (Step S20). The environmental condition determination unit 20 includes a weather determination unit 21, an hour determination unit 22, a glare determination unit 23, and a fog determination unit 24.


Based on the image IMG, the weather determination unit 21 determines the weather when the image IMG is captured. Examples of the weather include sunny, cloudy, rainy, and snowy. The weather determination unit 21 outputs the determined weather.


Based on the image IMG, the hour determination unit 22 determines an hour when the image IMG is captured. Examples of the hour include day, dawn/dusk, and night. The “night” corresponds to “darkness.” The hour determination unit 22 outputs the determined hour.


Based on the image IMG, the glare determination unit 23 determines whether or not the image IMG is captured under a backlight condition. The glare determination unit 23 outputs whether or not it is the backlight condition.


Based on the image IMG, the fog determination unit 24 determines presence or absence of fog when the image IMG is captured. The fog determination unit 24 outputs the presence or absence of fog.


The environmental condition under which the image IMG is captured is a combination of outputs from the weather determination unit 21, the hour determination unit 22, the glare determination unit 23, and the fog determination unit 24. In the example shown in FIG. 5, the environmental condition is “rainy & night (darkness) & no backlight & fog.” The environmental condition determination unit 20 outputs information on the acquired environmental condition to the visibility improvement processing unit 30.
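Although the present disclosure does not specify an implementation, the combination of the four determination-unit outputs into one environmental condition can be sketched as follows. The function and field names are hypothetical placeholders; in practice, each determination unit would be a learned classifier such as the ResNet-based WeatherNet of Non-Patent Literature 2.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass(frozen=True)
class EnvironmentalCondition:
    """Combination of the four determination-unit outputs (cf. FIG. 5)."""
    weather: str     # e.g. "sunny", "cloudy", "rainy", "snowy"
    hour: str        # e.g. "day", "dawn/dusk", "night"
    backlight: bool  # output of the glare determination unit
    fog: bool        # output of the fog determination unit

def determine_environmental_condition(
    image: Any,
    weather_unit: Callable[[Any], str],
    hour_unit: Callable[[Any], str],
    glare_unit: Callable[[Any], bool],
    fog_unit: Callable[[Any], bool],
) -> EnvironmentalCondition:
    # The environmental condition is simply the combination of the
    # outputs of the four determination units.
    return EnvironmentalCondition(
        weather=weather_unit(image),
        hour=hour_unit(image),
        backlight=glare_unit(image),
        fog=fog_unit(image),
    )
```

With stub classifiers returning the FIG. 5 example, this yields the condition “rainy & night (darkness) & no backlight & fog.”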


2-2-3. Visibility Improvement Processing (Step S30)

The visibility improvement processing unit 30 receives the image IMG and the information on the environmental condition under which the image IMG is captured. Then, the visibility improvement processing unit 30 specifies the visibility improvement processing required for improving the visibility of the image IMG according to the environmental condition.


The visibility improvement processing required when the environmental condition includes “fog” is “fog removing processing (defogging).” The defogging removes haze caused by fog in the image IMG to improve the visibility. This defogging is realized by, for example, the technique described in the above-mentioned Non-Patent Literature 3.


The visibility improvement processing required when the environmental condition includes “darkness” or “backlight” is “brightness correction processing.” The brightness correction processing corrects the image IMG captured in the scene such as nighttime or backlight to have appropriate brightness to improve the visibility. The brightness correction processing is realized by, for example, the technique described in the above-mentioned Non-Patent Literature 4.


The visibility improvement processing required when the environmental condition includes “rain” is “rain removing processing (deraining).” The deraining removes haze caused by rain in the image IMG to improve the visibility. This deraining is realized by, for example, the technique described in the above-mentioned Non-Patent Literature 5.


As described above, there are three types of processing as candidates for the visibility improvement processing related to the environmental condition: the defogging, the brightness correction processing, and the deraining. Research was conducted as to the order in which the multiple types of visibility improvement processing should be performed to obtain the highest visibility improvement effect. As a result, it was found that the highest visibility improvement effect is obtained when “1. defogging,” “2. brightness correction processing,” and “3. deraining” are performed in this order. This order is adopted in the present embodiment. That is, the processing order is predetermined such that the defogging is performed before the brightness correction processing and the brightness correction processing is performed before the deraining.


The visibility improvement processing unit 30 specifies necessary visibility improvement processing from among the multiple types of processing candidates (i.e., defogging, brightness correction processing, and deraining) according to the environmental condition determined by the environmental condition determination unit 20. The processing order of the multiple types of processing candidates is predetermined. The visibility improvement processing unit 30 applies the specified necessary visibility improvement processing to the image IMG in the predetermined order to generate the improved image IMG_S with the improved visibility. In other words, the visibility improvement processing unit 30 performs the necessary visibility improvement processing not blindly but according to the predetermined order. As a result, an excellent visibility improvement effect can be obtained, and thus the improved image IMG_S that is as clear as possible can be obtained.


It should be noted that the multiple types of processing candidates related to the environmental condition may include any two of the defogging, the brightness correction processing, and the deraining. The processing order in that case is also the same.


The visibility improvement processing unit 30 may further perform visibility improvement processing that is unrelated to the environmental condition. For example, the visibility improvement processing unit 30 may perform well-known image processing such as camera-shake correction processing and contrast adjustment processing (averaging).


Hereinafter, an example of the visibility improvement processing by the visibility improvement processing unit 30 will be described. As shown in FIG. 3, the visibility improvement processing unit 30 includes a camera-shake correction unit 31, a defogging unit 33, a brightness correction unit 35, a deraining unit 37, and a contrast adjustment unit 39. FIG. 6 is a flowchart showing an example of the visibility improvement processing (Step S30).


In Step S31, the camera-shake correction unit 31 performs the well-known camera-shake correction processing with respect to the image IMG. The camera-shake correction unit 31 outputs the image IMG after the camera-shake correction processing to the defogging unit 33.


In subsequent Step S32, the defogging unit 33 determines whether or not the environmental condition determined by the environmental condition determination unit 20 includes “fog.” When the environmental condition includes “fog” (Step S32; Yes), the defogging unit 33 determines that the defogging is necessary, and performs the defogging (Step S33). Then, the defogging unit 33 outputs the image IMG after the defogging to the brightness correction unit 35. On the other hand, when the environmental condition does not include “fog” (Step S32; No), the defogging unit 33 outputs the image IMG to the brightness correction unit 35 without performing the defogging.


In subsequent Step S34, the brightness correction unit 35 determines whether or not the environmental condition determined by the environmental condition determination unit 20 includes “darkness” or “backlight.” When the environmental condition includes “darkness” or “backlight” (Step S34; Yes), the brightness correction unit 35 determines that the brightness correction processing is necessary, and performs the brightness correction processing (Step S35). Then, the brightness correction unit 35 outputs the image IMG after the brightness correction processing to the deraining unit 37. On the other hand, when the environmental condition includes neither “darkness” nor “backlight” (Step S34; No), the brightness correction unit 35 outputs the image IMG to the deraining unit 37 without performing the brightness correction processing.


In subsequent Step S36, the deraining unit 37 determines whether or not the environmental condition determined by the environmental condition determination unit 20 includes “rain.” When the environmental condition includes “rain” (Step S36; Yes), the deraining unit 37 determines that the deraining is necessary, and performs the deraining (Step S37). Then, the deraining unit 37 outputs the image IMG after the deraining to the contrast adjustment unit 39. On the other hand, when the environmental condition does not include “rain” (Step S36; No), the deraining unit 37 outputs the image IMG to the contrast adjustment unit 39 without performing the deraining.


In subsequent Step S39, the contrast adjustment unit 39 performs the well-known contrast adjustment processing with respect to the image IMG.


The image IMG thus subjected to the visibility improvement processing step by step is the improved image IMG_S.
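As a rough sketch, Steps S31 to S39 can be expressed as the following pipeline. The `ops` table of processing implementations and the condition keys are hypothetical stand-ins; the real defogging, brightness correction, and deraining would be realized by models such as those of Non-Patent Literatures 3 to 5.

```python
from typing import Any, Callable, Dict

def improve_visibility(image: Any,
                       condition: Dict[str, Any],
                       ops: Dict[str, Callable[[Any], Any]]) -> Any:
    """Apply the necessary visibility improvement processing in the
    predetermined order (cf. FIG. 6).

    condition: dict with keys "weather", "hour", "backlight", "fog".
    ops: maps a processing name to its implementation.
    """
    image = ops["camera_shake"](image)                          # Step S31
    if condition["fog"]:                                        # Step S32
        image = ops["defog"](image)                             # Step S33
    if condition["hour"] == "night" or condition["backlight"]:  # Step S34
        image = ops["brightness"](image)                        # Step S35
    if condition["weather"] == "rainy":                         # Step S36
        image = ops["derain"](image)                            # Step S37
    return ops["contrast"](image)                               # Step S39
```

Note that the predetermined order, defogging before brightness correction before deraining, is enforced by the control flow itself, so no individual judgment about ordering is required.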


2-2-4. Image Output Processing (Step S40)

The image improvement unit 10 outputs the improved image IMG_S thus generated to the outside. For example, the improved image IMG_S is presented to the remote operator O by the remote operator terminal 200.


2-3. Effects

As described above, the image improvement unit 10 according to the present embodiment determines, based on the image IMG captured by the camera C, the environmental condition under which the image IMG is captured. Further, the image improvement unit 10 specifies the necessary visibility improvement processing according to the environmental condition, and applies the necessary visibility improvement processing to the image IMG in the predetermined order to generate the improved image IMG_S. Since the appropriate visibility improvement processing according to the factor reducing the visibility is executed in the appropriate order, an excellent visibility improvement effect can be obtained. In addition, since individual judgment by the remote operator O is unnecessary, the load on the remote operator O is reduced. The remote operator O is able to easily acquire the improved image IMG_S with the improved visibility.


The remote operator O is able to perform the remote operation based on the improved image IMG_S. The visibility of the image IMG may be reduced depending on the environmental condition under which the vehicle 100 is placed. Even in such a case, the clear improved image IMG_S in which the influence of the environmental condition is reduced can be used. As a result, the accuracy of recognition by the remote operator O is improved, and thus the accuracy of the remote operation is also improved. In addition, since the influence of the environmental condition is reduced, it is possible to expand an operational design domain (ODD). This is preferable from a viewpoint of service improvement.


It should be noted that the image improvement unit 10 according to the present embodiment may be included in any of the vehicle 100, the remote operator terminal 200, and the management device 300. That is, at least one of the vehicle 100, the remote operator terminal 200, and the management device 300 has the function of the image improvement unit 10. For example, the image improvement unit 10 is incorporated in the management device 300. In this case, the management device 300 generates the improved image IMG_S by improving the visibility of the image IMG received from the vehicle 100, and transmits the improved image IMG_S to the remote operator terminal 200. As another example, the image improvement unit 10 may be incorporated in the remote operator terminal 200. In this case, the remote operator terminal 200 improves the visibility of the image IMG received from the vehicle 100 via the management device 300 to generate the improved image IMG_S. In either case, the remote operator terminal 200 is able to present the improved image IMG_S with the improved visibility to the remote operator O.


3. TRAVEL RESTRICTION PROCESSING
3-1. Overview

Due to the visibility improvement processing described above, the remote operator O is able to perform the remote operation based on the improved image IMG_S with the improved visibility, and thus the accuracy of the remote operation is also improved. In that case, however, although the visibility is improved, a gap arises between the improved image IMG_S and the actual environment around the vehicle 100. Since the improved image IMG_S viewed by the remote operator O looks better than the actual environment around the vehicle 100, the remote operator O may perform a remote operation that is not appropriate for the actual environment.


For example, in a case of rainy or snowy weather, a road surface friction coefficient (road surface μ) decreases, and thus a driver would normally drive the vehicle 100 while suppressing the vehicle speed and the steering speed. However, as a result of the visibility of the image IMG being improved by the visibility improvement processing, the actual road surface condition may not be correctly communicated to the remote operator O. Since the improved image IMG_S is clear, the remote operator O may drive the vehicle 100 at a usual vehicle speed and a usual steering speed. This is not preferable from a viewpoint of safety of the remote operation.


In view of the above, the remote operation system 1 according to the present embodiment is configured to restrict travel of the vehicle 100 as necessary. Restricting the travel of the vehicle 100 means setting an upper limit value of a travel parameter of the vehicle 100 to be lower than a default value. The travel parameter includes at least one of a speed, a steering angle, and a steering speed. Such processing of restricting the travel of the vehicle 100 is hereinafter referred to as "travel restriction processing."
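The notion of lowering upper limit values of travel parameters below their defaults can be sketched as follows. This is a minimal illustration only: the parameter names, default values, and the uniform scaling factor are assumptions for the sketch, not values from the present disclosure.

```python
from dataclasses import dataclass

@dataclass
class TravelLimits:
    """Upper limit values of the travel parameters (illustrative names)."""
    speed_kph: float          # upper limit on vehicle speed
    steering_angle_deg: float # upper limit on steering angle
    steering_rate_dps: float  # upper limit on steering speed

# Hypothetical default (unrestricted) upper limit values.
DEFAULT_LIMITS = TravelLimits(
    speed_kph=60.0, steering_angle_deg=540.0, steering_rate_dps=360.0)

def restricted_limits(default: TravelLimits, factor: float) -> TravelLimits:
    """Travel restriction: set each upper limit lower than its default value."""
    assert 0.0 < factor < 1.0  # a factor below 1 lowers every limit
    return TravelLimits(
        speed_kph=default.speed_kph * factor,
        steering_angle_deg=default.steering_angle_deg * factor,
        steering_rate_dps=default.steering_rate_dps * factor,
    )
```

In practice each parameter could be lowered independently; a single scaling factor is used here only to keep the sketch short.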



FIG. 7 is a conceptual diagram for explaining an overview of the travel restriction processing. The “environmental conditions” under which the image IMG is captured by the camera C are classified into a “weather condition” and other conditions. Examples of the weather condition include sunny, cloudy, rainy, snowy, foggy, etc. Examples of the environmental condition other than the weather condition include darkness, backlight, and the like. Examples of the visibility improvement processing according to the weather condition among the environmental conditions include the defogging (FIG. 6; Step S33) and the deraining (FIG. 6; Step S37).


A case in which the visibility improvement processing according to the weather condition among the environmental conditions is performed is considered. In this case, the improved image IMG_S with the improved visibility is presented to the remote operator O. Furthermore, the remote operation system 1 determines whether or not a “travel restriction condition” is satisfied. The travel restriction condition is a weather condition under which the travel restriction processing should be executed. For example, when it is raining heavily at a position of the vehicle 100, it is considered that the travel restriction processing should be executed. Therefore, the remote operation system 1 determines whether or not the travel restriction condition is satisfied based on weather information at the position of the vehicle 100. That is, triggered by the fact that the visibility improvement processing according to the weather condition is performed, the remote operation system 1 determines whether or not the travel restriction condition is satisfied.


When the travel restriction condition is satisfied, the remote operation system 1 performs the travel restriction processing that restricts the travel of the vehicle 100. Accordingly, even if the remote operator O performs a remote operation that is not appropriate for an actual environment around the vehicle 100, the travel of the vehicle 100 is restricted and thus the safety is secured. That is, the safety of the remote operation of the vehicle 100 performed by the remote operator O is secured.


It should be noted that when the visibility improvement processing according to the weather condition is not performed, it is not necessary to perform the travel restriction processing. For example, in a case where the visibility improvement processing according to the weather condition is not performed and only the visibility improvement processing (the brightness correction processing) for darkness and backlight is performed, the improved image IMG_S is presented to the remote operator O, but the travel restriction processing is not performed. As another example, in a case where the visibility improvement processing is not performed at all, the original image IMG is presented to the remote operator O, and the travel restriction processing is not performed. Since the travel of the vehicle 100 is not restricted more than necessary, the remote operator O is prevented from feeling annoyed.


Hereinafter, the travel restriction processing according to the present embodiment will be described in more detail.


3-2. Functional Configuration Example and Processing Example


FIG. 8 is a block diagram showing a functional configuration example related to the travel restriction processing. The remote operation system 1 includes the image improvement unit 10, a display unit 40, and an operation amount adjustment unit 50.


The image improvement unit 10 is included in any of the vehicle 100, the remote operator terminal 200, and the management device 300. The image improvement unit 10 performs the visibility improvement processing with respect to the image IMG as necessary to output the improved image IMG_S. Furthermore, the image improvement unit 10 outputs flag information FLG indicating a content of the visibility improvement processing. For example, the flag information FLG indicates performed processing among the defogging (FIG. 6; Step S33), the brightness correction processing (FIG. 6; Step S35), and the deraining (FIG. 6; Step S37).
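One possible encoding of the flag information FLG is sketched below; the disclosure only states that FLG indicates which processing was performed, so the bit-flag representation and the helper for the later Step S51 check are assumptions.

```python
from enum import Flag, auto

class VisibilityProcessing(Flag):
    """Hypothetical encoding of the flag information FLG."""
    NONE = 0
    DEFOGGING = auto()              # FIG. 6; Step S33
    BRIGHTNESS_CORRECTION = auto()  # FIG. 6; Step S35
    DERAINING = auto()              # FIG. 6; Step S37

# Processing types that respond to the weather condition among the
# environmental conditions (brightness correction addresses darkness/backlight
# and is therefore excluded).
WEATHER_RELATED = VisibilityProcessing.DEFOGGING | VisibilityProcessing.DERAINING

def weather_processing_performed(flg: VisibilityProcessing) -> bool:
    """Step S51 sketch: was weather-related visibility improvement performed?"""
    return bool(flg & WEATHER_RELATED)
```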


The display unit 40 is included in the remote operator terminal 200. The display unit 40 displays the original image IMG or the improved image IMG_S on a display device.


The operation amount adjustment unit 50 is included in any of the vehicle 100, the remote operator terminal 200, and the management device 300. The operation amount adjustment unit 50 receives the remote operation information OPE including an amount of operation performed by the remote operator O. Then, the operation amount adjustment unit 50 restricts the operation amount as necessary to output remote operation information OPE′ in which the operation amount is restricted. The operation amount adjustment unit 50 includes a determination unit 51 and a travel restriction unit 53.



FIG. 9 is a flowchart showing processing related to the travel restriction processing. Hereinafter, the processing related to the travel restriction processing will be described in more detail with reference to FIGS. 8 and 9.


3-2-1. Step S51

In Step S51, the determination unit 51 determines whether or not the visibility improvement processing according to the weather condition among the environmental conditions is performed. More specifically, the determination unit 51 receives the flag information FLG output from the image improvement unit 10. The flag information FLG indicates the content of the visibility improvement processing performed by the image improvement unit 10. Based on the flag information FLG, the determination unit 51 can determine whether or not the visibility improvement processing according to the weather condition is performed.


When the visibility improvement processing according to the weather condition is performed (Step S51; Yes), the processing proceeds to Step S52. On the other hand, when the visibility improvement processing according to the weather condition is not performed (Step S51; No), subsequent steps S52 and S53 are skipped, and the processing in the current cycle ends.


3-2-2. Step S52

In Step S52, the determination unit 51 determines whether or not the travel restriction condition is satisfied based on weather information WX at the position of the vehicle 100. The travel restriction condition is the weather condition under which the travel restriction processing should be executed. The weather information WX is distributed by, for example, a weather information service center. The determination unit 51 can communicate with the weather information service center to acquire the weather information WX at the position of the vehicle 100.



FIG. 10 shows an example of a correspondence relationship between the weather information WX and the travel restriction. Cross marks (x) mean that the travel restriction processing is not performed. On the other hand, circles (○, ⊚) mean that the travel restriction processing is performed. For example, in a case where the amount of rainfall per hour is equal to or greater than a threshold value (10 mm/h), the travel restriction processing is performed.


For example, the determination unit 51 acquires a “degree of heavy weather” at the position of the vehicle 100 based on the weather information WX at the position of the vehicle 100. When the degree of heavy weather is equal to or greater than a first threshold value, the determination unit 51 determines that the travel restriction condition is satisfied. On the other hand, when the degree of heavy weather is less than the first threshold, the determination unit 51 determines that the travel restriction condition is not satisfied.
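The Step S52 determination can be sketched as a threshold comparison. The 10 mm/h rainfall threshold appears in FIG. 10; using the hourly rainfall directly as the "degree of heavy weather" is an assumption made only for this illustration.

```python
# Hypothetical first threshold value for the degree of heavy weather,
# taken here as the rainfall threshold (10 mm/h) from FIG. 10.
FIRST_THRESHOLD_MM_PER_H = 10.0

def travel_restriction_condition_satisfied(rainfall_mm_per_h: float) -> bool:
    """Step S52 sketch: condition is satisfied when the degree of heavy
    weather is equal to or greater than the first threshold value."""
    return rainfall_mm_per_h >= FIRST_THRESHOLD_MM_PER_H
```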


When the travel restriction condition is satisfied (Step S52; Yes), the processing proceeds to Step S53. On the other hand, when the travel restriction condition is not satisfied (Step S52; No), Step S53 is skipped, and the processing in the current cycle ends.


3-2-3. Step S53

In Step S53, the travel restriction unit 53 restricts the operation amount for operating the vehicle 100. More specifically, the travel restriction unit 53 sets an upper limit value of the operation amount for operating the vehicle 100 to be lower than a default value. Further, the travel restriction unit 53 receives the remote operation information OPE including the amount of operation performed by the remote operator O. When the operation amount included in the remote operation information OPE exceeds the upper limit value, the travel restriction unit 53 restricts (corrects) the operation amount to be equal to or less than the upper limit value. Then, the travel restriction unit 53 outputs the remote operation information OPE′ in which the operation amount is restricted. The vehicle 100 performs the vehicle travel control according to the remote operation information OPE′ in which the operation amount is restricted. As a result, the travel of the vehicle 100 is restricted.
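The clamping performed in Step S53 can be sketched as follows. The dictionary representation of the remote operation information OPE and the parameter names are illustrative assumptions; the disclosure only specifies that operation amounts exceeding the lowered upper limit value are restricted to it.

```python
def restrict_operation_amount(ope: dict, upper_limits: dict) -> dict:
    """Step S53 sketch: produce OPE' by clamping each operation amount in OPE
    to its (lowered) upper limit value, when such a limit is set."""
    ope_restricted = {}
    for name, amount in ope.items():
        limit = upper_limits.get(name)
        if limit is not None and amount > limit:
            amount = limit  # restrict (correct) the operation amount
        ope_restricted[name] = amount
    return ope_restricted
```

A parameter with no entry in `upper_limits` passes through unchanged, corresponding to a travel parameter left at its default.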


As described above, “restricting the operation amount for operating the vehicle 100” is equivalent to “restricting the travel of the vehicle 100.” Further, “setting the upper limit value of the operation amount for operating the vehicle 100 to be lower than the default value” is equivalent to “setting the upper limit value of the travel parameter (e.g., speed, steering angle, steering speed) of the vehicle 100 to be lower than the default value.” It can be said that the travel restriction unit 53 performs the travel restriction processing through the operation amount.


The travel restriction unit 53 may strengthen the restriction according to the degree of heavy weather at the position of the vehicle 100. Strengthening the restriction means further decreasing the upper limit value of the operation amount (that is, the travel parameter). In the example shown in FIG. 10, the upper limit value in the case of the double circle (⊚) is set to be lower than the upper limit value in the case of the single circle (○). That is, the travel restriction unit 53 sets the upper limit value in a case where the degree of heavy weather is equal to or greater than a second threshold value to be lower than the upper limit value in a case where the degree of heavy weather is less than the second threshold value. Here, the second threshold value is higher than the first threshold value used in Step S52 described above. Since the upper limit value decreases (i.e., the restriction becomes stronger) as the degree of heavy weather increases, the safety of the remote operation of the vehicle 100 is more appropriately secured.
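The two-level restriction of FIG. 10 can be sketched as below. The second threshold and the concrete upper limit values are assumptions for the illustration; only the ordering (second threshold higher than the first, and a lower upper limit above the second threshold) follows the description.

```python
# Hypothetical thresholds for the degree of heavy weather (rainfall, mm/h).
FIRST_THRESHOLD = 10.0   # Step S52 threshold
SECOND_THRESHOLD = 30.0  # higher than the first threshold
DEFAULT_SPEED_LIMIT = 60.0  # km/h, default upper limit (assumed)

def upper_speed_limit(rainfall_mm_per_h: float) -> float:
    """Select the upper limit value according to the degree of heavy weather."""
    if rainfall_mm_per_h >= SECOND_THRESHOLD:
        return 20.0  # strongest restriction (double circle in FIG. 10)
    if rainfall_mm_per_h >= FIRST_THRESHOLD:
        return 40.0  # moderate restriction (single circle in FIG. 10)
    return DEFAULT_SPEED_LIMIT  # no restriction (cross mark in FIG. 10)
```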


3-3. Notification Processing

As shown in FIG. 8, the remote operation system 1 may further include a notification unit 80. The notification unit 80 is included in the remote operator terminal 200. During execution of the travel restriction processing, the notification unit 80 notifies the remote operator O of a fact that the travel of the vehicle 100 is restricted. The notification may be performed visually or auditorily. For example, the notification unit 80 displays a notification (e.g., "vehicle speed restricted: upper limit vehicle speed=** km/h") on a display device. As another example, the notification unit 80 outputs an audio notification through a speaker. As a result, the remote operator O is prevented from feeling a sense of discomfort with respect to the travel restriction processing.


3-4. Effects

As described above, according to the present embodiment, the visibility improvement processing is performed according to the environmental condition under which the image IMG is captured by the camera C. When the visibility improvement processing according to the weather condition among the environmental conditions is performed, not only the improved image IMG_S is presented to the remote operator O but also the travel of the vehicle 100 is restricted as necessary. Even if the remote operator O performs a remote operation that is not appropriate for an actual environment around the vehicle 100, the travel of the vehicle 100 is restricted and thus the safety is secured. That is, the safety of the remote operation of the vehicle 100 performed by the remote operator O is secured.


When the visibility improvement processing according to the weather condition is not performed, the travel restriction processing is not performed. Since the travel of the vehicle 100 is not restricted more than necessary, the remote operator O is prevented from feeling annoyed.


Furthermore, when the travel restriction processing is executed, the remote operator O is notified of that fact. Therefore, the remote operator O is prevented from feeling a sense of discomfort with respect to the travel restriction processing.


4. MODIFICATION EXAMPLE

When a communication quality (e.g., a throughput, a delay) of the communication from the vehicle 100 to the remote operator terminal 200 is decreased, an image quality of the image IMG received by the remote operator terminal 200 may also be decreased. For example, when the decrease in the communication quality is detected, the vehicle 100 on the transmission side may perform congestion control. When the congestion control is performed, a resolution of the image IMG transmitted from the vehicle 100 decreases. Even in this case, the remote operator terminal 200 on the reception side is able to perform the visibility improvement processing by utilizing a super-resolution technique or the like to generate the improved image IMG_S. The remote operator O is able to perform the remote operation based on the improved image IMG_S with the improved visibility.


However, when the communication quality is decreased, there is a possibility that a communication blackout will occur in the near future. For the sake of safety, it is desirable that the remote operator O carefully perform the remote operation of the vehicle 100, for example, by reducing the vehicle speed. However, it is difficult for the remote operator O to perceive the decrease in communication quality from the clear improved image IMG_S acquired by the visibility improvement processing. Therefore, the above-described travel restriction processing is also useful for securing the safety in the case where the visibility improvement processing is performed due to the decrease in the communication quality.



FIG. 11 is a block diagram showing an example of a functional configuration related to the travel restriction processing according to a modification example. The remote operation system 1 includes the image improvement unit 10, the display unit 40, a communication quality acquisition unit 60, an operation amount adjustment unit 70, and the notification unit 80. A description overlapping with the above Section 3 will be omitted as appropriate.


The image improvement unit 10 is included in the remote operator terminal 200. The image improvement unit 10 performs the visibility improvement processing with respect to the image IMG as necessary to output the improved image IMG_S. For example, the image improvement unit 10 automatically determines presence or absence of the congestion control based on the received image IMG. When the congestion control is being performed, the image improvement unit 10 applies the super-resolution technique to the received image IMG to generate the improved image IMG_S with the improved visibility. Further, the image improvement unit 10 outputs flag information FLG indicating whether or not the visibility improvement processing is performed.
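One simple way the image improvement unit 10 might infer that congestion control is active is to compare the resolution of the received image IMG with the camera's nominal resolution; a lower received resolution suggests that the transmission side reduced it. This heuristic and the nominal resolution value are assumptions, not part of the disclosure.

```python
# Hypothetical nominal resolution of the camera C (width, height).
NOMINAL_RESOLUTION = (1920, 1080)

def congestion_control_detected(received_resolution: tuple) -> bool:
    """Sketch: infer congestion control from a reduced received resolution."""
    width, height = received_resolution
    nominal_width, nominal_height = NOMINAL_RESOLUTION
    return width < nominal_width or height < nominal_height
```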


The communication quality acquisition unit 60 acquires information on the communication quality (e.g., a throughput, a delay) of the communication from the vehicle 100 to the remote operator terminal 200. For example, the communication quality acquisition unit 60 acquires the communication quality information based on a reception state of the vehicle information VCL received from the vehicle 100.


The operation amount adjustment unit 70 is included in any of the vehicle 100, the remote operator terminal 200, and the management device 300. The operation amount adjustment unit 70 includes a determination unit 71 and a travel restriction unit 73.



FIG. 12 is a flowchart showing processing related to the travel restriction processing according to the modification example. Hereinafter, the processing related to the travel restriction processing according to the modification example will be described in more detail with reference to FIGS. 11 and 12.


In Step S71, the determination unit 71 receives the flag information FLG output from the image improvement unit 10. The determination unit 71 determines, based on the flag information FLG, whether or not the visibility improvement processing caused by the decrease in communication quality is performed. When the visibility improvement processing caused by the decrease in communication quality is performed (Step S71; Yes), the processing proceeds to Step S72. On the other hand, when the visibility improvement processing caused by the decrease in communication quality is not performed (Step S71; No), subsequent steps S72 and S73 are skipped, and the processing in the current cycle ends.


In Step S72, the determination unit 71 acquires the communication quality information from the communication quality acquisition unit 60. Then, the determination unit 71 determines whether or not a travel restriction condition is satisfied based on the communication quality information. For example, when the communication quality is lower than a first threshold, the determination unit 71 determines that the travel restriction condition is satisfied. On the other hand, when the communication quality is equal to or higher than the first threshold, the determination unit 71 determines that the travel restriction condition is not satisfied. When the travel restriction condition is satisfied (Step S72; Yes), the processing proceeds to Step S73. On the other hand, when the travel restriction condition is not satisfied (Step S72; No), Step S73 is skipped, and the processing in the current cycle ends.


In Step S73, the travel restriction unit 73 performs the travel restriction processing. The function of the travel restriction unit 73 is the same as that of the travel restriction unit 53 in Section 3 described above. Due to the travel restriction processing, the safety of the remote operation of the vehicle 100 is secured.


The travel restriction unit 73 may strengthen the restriction as the communication quality decreases. Strengthening the restriction means further decreasing the upper limit value of the operation amount (that is, the travel parameter). For example, the travel restriction unit 73 sets the upper limit value in a case where the communication quality is lower than a second threshold value to be lower than the upper limit value in a case where the communication quality is equal to or higher than the second threshold value. Here, the second threshold value is lower than the first threshold value used in Step S72 described above. Since the upper limit value decreases (the restriction becomes stronger) as the communication quality decreases, the safety of the remote operation of the vehicle 100 is more appropriately secured.
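The combined flow of Steps S72 and S73 in the modification example can be sketched as below: the restriction triggers when the communication quality falls below the first threshold and strengthens below the lower second threshold. The throughput-based quality measure, the threshold values, and the upper limit values are assumptions for the illustration.

```python
# Hypothetical communication-quality thresholds (throughput, Mbps).
FIRST_THRESHOLD_MBPS = 10.0  # Step S72 threshold
SECOND_THRESHOLD_MBPS = 3.0  # lower than the first threshold
DEFAULT_SPEED_LIMIT = 60.0   # km/h, default upper limit (assumed)

def upper_speed_limit(throughput_mbps: float) -> float:
    """Select the upper limit value according to the communication quality."""
    if throughput_mbps < SECOND_THRESHOLD_MBPS:
        return 20.0  # strongest restriction
    if throughput_mbps < FIRST_THRESHOLD_MBPS:
        return 40.0  # moderate restriction
    return DEFAULT_SPEED_LIMIT  # travel restriction condition not satisfied
```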


The notification unit 80 is the same as that in Section 3 described above. The remote operator O is prevented from feeling a sense of discomfort with respect to the travel restriction processing.


5. EXAMPLE OF VEHICLE
5-1. Configuration Example


FIG. 13 is a block diagram showing a configuration example of the vehicle 100. The vehicle 100 includes a communication device 110, a sensor group 120, a travel device 130, and a control device (controller) 150.


The communication device 110 communicates with the outside of the vehicle 100. For example, the communication device 110 communicates with the remote operator terminal 200 and the management device 300.


The sensor group 120 includes a recognition sensor, a vehicle state sensor, a position sensor, and the like. The recognition sensor recognizes (detects) a situation around the vehicle 100. Examples of the recognition sensor include the camera C, a laser imaging detection and ranging (LIDAR) sensor, a radar, and the like. The vehicle state sensor detects a state of the vehicle 100. Examples of the vehicle state sensor include a speed sensor, an acceleration sensor, a yaw rate sensor, a steering angle sensor, and the like. The position sensor detects a position and an orientation of the vehicle 100. For example, the position sensor includes a global navigation satellite system (GNSS) receiver.


The travel device 130 includes a steering device, a driving device, and a braking device. The steering device turns wheels. For example, the steering device includes an electric power steering (EPS) device. The driving device is a power source that generates a driving force. Examples of the driving device include an engine, an electric motor, an in-wheel motor, and the like. The braking device generates a braking force.


The control device 150 is a computer that controls the vehicle 100. The control device 150 includes one or more processors 160 (hereinafter simply referred to as a processor 160) and one or more memory devices 170 (hereinafter simply referred to as a memory device 170). The processor 160 executes a variety of processing. For example, the processor 160 includes a central processing unit (CPU). The memory device 170 stores a variety of information necessary for the processing by the processor 160. Examples of the memory device 170 include a volatile memory, a non-volatile memory, a hard disk drive (HDD), a solid state drive (SSD), and the like. The control device 150 may include one or more electronic control units (ECUs).


A vehicle control program PROG1 is a computer program executed by the processor 160. The functions of the control device 150 are implemented by the processor 160 executing the vehicle control program PROG1. The vehicle control program PROG1 is stored in the memory device 170. The vehicle control program PROG1 may be recorded on a non-transitory computer-readable recording medium.


5-2. Driving Environment Information

The control device 150 uses the sensor group 120 to acquire driving environment information ENV indicating a driving environment for the vehicle 100. The driving environment information ENV is stored in the memory device 170.


The driving environment information ENV includes surrounding situation information indicating a result of recognition by the recognition sensor. For example, the surrounding situation information includes the image IMG captured by the camera C. The surrounding situation information further includes object information regarding an object around the vehicle 100. Examples of the object around the vehicle 100 include a pedestrian, another vehicle (e.g., a preceding vehicle, a parked vehicle, etc.), a white line, a traffic signal, a sign, a roadside structure, and the like. The object information indicates a relative position and a relative velocity of the object with respect to the vehicle 100.


In addition, the driving environment information ENV includes vehicle state information indicating the vehicle state detected by the vehicle state sensor.


Furthermore, the driving environment information ENV includes vehicle position information indicating the position and the orientation of the vehicle 100. The vehicle position information is acquired by the position sensor. Highly accurate vehicle position information may be acquired by performing a well-known localization using map information and the surrounding situation information (the object information).


5-3. Vehicle Travel Control

The control device 150 executes vehicle travel control that controls travel of the vehicle 100. The vehicle travel control includes steering control, driving control, and braking control. The control device 150 executes the vehicle travel control by controlling the travel device 130 (i.e., the steering device, the driving device, and the braking device).


The control device 150 may execute autonomous driving control based on the driving environment information ENV. More specifically, the control device 150 generates a travel plan of the vehicle 100 based on the driving environment information ENV. Further, the control device 150 generates, based on the driving environment information ENV, a target trajectory required for the vehicle 100 to travel in accordance with the travel plan. The target trajectory includes a target position and a target speed. Then, the control device 150 executes the vehicle travel control such that the vehicle 100 follows the target trajectory.


5-4. Processing Related to Remote Operation

Hereinafter, the case where the remote operation of the vehicle 100 is performed will be described. The control device 150 communicates with the remote operator terminal 200 via the communication device 110.


The control device 150 transmits the vehicle information VCL to the remote operator terminal 200. The vehicle information VCL is information necessary for the remote operation by the remote operator O, and includes at least a part of the driving environment information ENV described above. For example, the vehicle information VCL includes the surrounding situation information (especially, the image IMG). The vehicle information VCL may further include the vehicle state information and the vehicle position information.


In addition, the control device 150 receives the remote operation information OPE from the remote operator terminal 200. The remote operation information OPE is information regarding the remote operation by the remote operator O. For example, the remote operation information OPE includes an amount of operation performed by the remote operator O. The control device 150 performs the vehicle travel control in accordance with the received remote operation information OPE.


Furthermore, the control device 150 may have the function of the image improvement unit 10 described above. The image improvement unit 10 performs the visibility improvement processing with respect to the image IMG as necessary to output the improved image IMG_S. In addition, the image improvement unit 10 outputs the flag information FLG indicating the content of the visibility improvement processing. The improved image IMG_S and the flag information FLG are transmitted as a part of the vehicle information VCL to the management device 300 and the remote operator terminal 200.


Furthermore, the control device 150 may have the functions of the operation amount adjustment units 50 and 70 described above. The operation amount adjustment units 50 and 70 acquire the remote operation information OPE received from the remote operator terminal 200. Then, the operation amount adjustment units 50 and 70 restrict the operation amount as necessary to output the remote operation information OPE′ in which the operation amount is restricted. The control device 150 performs the vehicle travel control in accordance with the remote operation information OPE′.


6. EXAMPLE OF REMOTE OPERATOR TERMINAL


FIG. 14 is a block diagram showing a configuration example of the remote operator terminal 200. The remote operator terminal 200 includes a communication device 210, an output device 220, an input device 230, and a control device (controller) 250.


The communication device 210 communicates with the vehicle 100 and the management device 300.


The output device 220 outputs a variety of information. For example, the output device 220 includes a display device. The display device presents a variety of information to the remote operator O by displaying the variety of information. As another example, the output device 220 may include a speaker. The output device 220 has the functions of the display unit 40 and the notification unit 80.


The input device 230 receives an input from the remote operator O. For example, the input device 230 includes a remote operation member that is operated by the remote operator O when remotely operating the vehicle 100. The remote operation member includes a steering wheel, an accelerator pedal, a brake pedal, a direction indicator, and the like.


The control device 250 controls the remote operator terminal 200. The control device 250 includes one or more processors 260 (hereinafter simply referred to as a processor 260) and one or more memory devices 270 (hereinafter simply referred to as a memory device 270). The processor 260 executes a variety of processing. For example, the processor 260 includes a CPU. The memory device 270 stores a variety of information necessary for the processing by the processor 260. Examples of the memory device 270 include a volatile memory, a non-volatile memory, an HDD, an SSD, and the like.


A remote operation program PROG2 is a computer program executed by the processor 260. The functions of the control device 250 are implemented by the processor 260 executing the remote operation program PROG2. The remote operation program PROG2 is stored in the memory device 270. The remote operation program PROG2 may be recorded on a non-transitory computer-readable recording medium. The remote operation program PROG2 may be provided via a network.


The control device 250 communicates with the vehicle 100 via the communication device 210. The control device 250 receives the vehicle information VCL transmitted from the vehicle 100. The control device 250 presents the vehicle information VCL to the remote operator O by displaying the vehicle information VCL, including the image IMG, on the display device. The remote operator O is able to recognize the state of the vehicle 100 and the situation around the vehicle 100 based on the vehicle information VCL displayed on the display device.


The remote operator O operates the remote operation member of the input device 230. An operation amount of the remote operation member is detected by a sensor installed on the remote operation member. The control device 250 generates the remote operation information OPE reflecting the operation amount of the remote operation member operated by the remote operator O. Then, the control device 250 transmits the remote operation information OPE to the vehicle 100 via the communication device 210.


Furthermore, the control device 250 may have the function of the image improvement unit 10 described above. The image improvement unit 10 performs the visibility improvement processing with respect to the image IMG as necessary to output the improved image IMG_S. In addition, the image improvement unit 10 outputs the flag information FLG indicating the content of the visibility improvement processing.


Furthermore, the control device 250 may have the functions of the operation amount adjustment units 50 and 70 described above. The operation amount adjustment units 50 and 70 acquire the remote operation information OPE. Then, the operation amount adjustment units 50 and 70 restrict the operation amount as necessary to output the remote operation information OPE′ in which the operation amount is restricted. Then, the control device 250 transmits the remote operation information OPE′ to the vehicle 100 via the communication device 210.


7. EXAMPLE OF MANAGEMENT DEVICE


FIG. 15 is a block diagram showing a configuration example of the management device 300. The management device 300 includes a communication device 310 and a control device 350.


The communication device 310 communicates with the vehicle 100 and the remote operator terminal 200.



The control device (controller) 350 controls the management device 300. The control device 350 includes one or more processors 360 (hereinafter simply referred to as a processor 360) and one or more memory devices 370 (hereinafter simply referred to as a memory device 370). The processor 360 executes a variety of processing. For example, the processor 360 includes a CPU. The memory device 370 stores a variety of information necessary for the processing by the processor 360. Examples of the memory device 370 include a volatile memory, a non-volatile memory, an HDD, an SSD, and the like.


A management program PROG3 is a computer program executed by the processor 360. The functions of the control device 350 are implemented by the processor 360 executing the management program PROG3. The management program PROG3 is stored in the memory device 370. The management program PROG3 may be recorded on a non-transitory computer-readable recording medium. The management program PROG3 may be provided via a network.


The control device 350 communicates with the vehicle 100 and the remote operator terminal 200 via the communication device 310. The control device 350 receives the vehicle information VCL transmitted from the vehicle 100. Then, the control device 350 transmits the received vehicle information VCL to the remote operator terminal 200. In addition, the control device 350 receives the remote operation information OPE transmitted from the remote operator terminal 200. Then, the control device 350 transmits the received remote operation information OPE to the vehicle 100.
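The relay role of the control device 350, forwarding the vehicle information VCL upstream and the remote operation information OPE downstream, can be sketched with in-memory queues standing in for the communication device 310 (the queues and function names are illustrative plumbing, not part of the disclosure):

```python
import queue


def relay_vcl(from_vehicle: queue.Queue, to_terminal: queue.Queue):
    """Forward one vehicle information VCL message received from the
    vehicle 100 on toward the remote operator terminal 200."""
    vcl = from_vehicle.get()
    to_terminal.put(vcl)
    return vcl


def relay_ope(from_terminal: queue.Queue, to_vehicle: queue.Queue):
    """Forward one remote operation information OPE message received
    from the remote operator terminal 200 on toward the vehicle 100."""
    ope = from_terminal.get()
    to_vehicle.put(ope)
    return ope
```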


Furthermore, the control device 350 may have the function of the image improvement unit 10 described above. When the image IMG is included in the vehicle information VCL received from the vehicle 100, the image improvement unit 10 performs the visibility improvement processing with respect to the image IMG as necessary to output the improved image IMG_S. In addition, the image improvement unit 10 outputs the flag information FLG indicating the content of the visibility improvement processing. The improved image IMG_S and the flag information FLG are transmitted as a part of the vehicle information VCL to the remote operator terminal 200.


Furthermore, the control device 350 may have the functions of the operation amount adjustment units 50 and 70 described above. The operation amount adjustment units 50 and 70 acquire the remote operation information OPE received from the remote operator terminal 200. Then, the operation amount adjustment units 50 and 70 restrict the operation amount as necessary to output the remote operation information OPE′ in which the operation amount is restricted. The control device 350 transmits the remote operation information OPE′ to the vehicle 100 via the communication device 310.
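The travel restriction processing recited in the claims, in which an upper limit value of a travel parameter is set lower than a default value when the travel restriction condition is satisfied, and is decreased further according to the degree of heavy weather, can be sketched as follows. The numeric degree scale, the threshold, the speed values, and the linear scaling are all assumptions for illustration.

```python
def speed_upper_limit(weather_processing_applied: bool,
                      heavy_weather_degree: float,
                      default_limit: float = 60.0,
                      threshold: float = 0.5) -> float:
    """Return the speed upper limit after travel restriction processing.

    heavy_weather_degree: degree of heavy weather at the position of
    the moving body, here assumed to be normalized to [0.0, 1.0].
    """
    if not weather_processing_applied:
        # No weather-dependent visibility improvement was performed,
        # so the travel restriction processing is not performed.
        return default_limit
    if heavy_weather_degree < threshold:
        # Travel restriction condition is not satisfied.
        return default_limit
    # Lower the upper limit, decreasing it further as the degree of
    # heavy weather increases (floor of 20% of the default).
    scale = max(0.2, 1.0 - heavy_weather_degree)
    return default_limit * scale
```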

Claims
  • 1. A remote operation system that controls a remote operation of a moving body performed by a remote operator, the remote operation system comprising one or more processors configured to: acquire an image captured by a camera installed on the moving body; determine, based on the image, an environmental condition under which the image is captured; perform visibility improvement processing that improves visibility of the image according to the environmental condition; present an improved image with the improved visibility to the remote operator; when the visibility improvement processing according to a weather condition among environmental conditions is performed, determine whether or not a travel restriction condition is satisfied based on weather information at a position of the moving body; and when the travel restriction condition is satisfied, perform travel restriction processing that restricts travel of the moving body.
  • 2. The remote operation system according to claim 1, wherein when the visibility improvement processing according to the weather condition is not performed, the one or more processors refrain from performing the travel restriction processing.
  • 3. The remote operation system according to claim 1, wherein during execution of the travel restriction processing, the one or more processors notify the remote operator of a fact that the travel of the moving body is being restricted.
  • 4. The remote operation system according to claim 1, wherein the one or more processors are further configured to: acquire a degree of heavy weather at the position of the moving body based on the weather information at the position of the moving body; and when the degree of heavy weather is equal to or greater than a threshold value, determine that the travel restriction condition is satisfied.
  • 5. The remote operation system according to claim 1, wherein a travel parameter includes at least one of a speed, a steering angle, and a steering speed of the moving body, and the travel restriction processing includes setting an upper limit value of the travel parameter to be lower than a default value.
  • 6. The remote operation system according to claim 5, wherein the one or more processors are further configured to: recognize a degree of heavy weather at the position of the moving body based on the weather information at the position of the moving body; and decrease the upper limit value of the travel parameter according to the degree of heavy weather.
  • 7. A remote operation control method for controlling a remote operation of a moving body performed by a remote operator, the remote operation control method comprising: acquiring an image captured by a camera installed on the moving body; determining, based on the image, an environmental condition under which the image is captured; performing visibility improvement processing that improves visibility of the image according to the environmental condition; presenting an improved image with the improved visibility to the remote operator; when the visibility improvement processing according to a weather condition among environmental conditions is performed, determining whether or not a travel restriction condition is satisfied based on weather information at a position of the moving body; and when the travel restriction condition is satisfied, performing travel restriction processing that restricts travel of the moving body.
  • 8. A remote operator terminal on a side of a remote operator performing a remote operation of a moving body, the remote operator terminal comprising one or more processors configured to: acquire an image captured by a camera installed on the moving body; determine, based on the image, an environmental condition under which the image is captured; perform visibility improvement processing that improves visibility of the image according to the environmental condition; present an improved image with the improved visibility to the remote operator; when the visibility improvement processing according to a weather condition among environmental conditions is performed, determine whether or not a travel restriction condition is satisfied based on weather information at a position of the moving body; and when the travel restriction condition is satisfied, perform travel restriction processing that restricts travel of the moving body.
Priority Claims (1)
Number Date Country Kind
2022-018147 Feb 2022 JP national