The present disclosure relates to work vehicles, and methods for controlling work vehicles.
As part of a move toward next-generation agriculture, research and development of smart agriculture utilizing ICT (Information and Communication Technology) and IoT (Internet of Things) is under way. Research and development efforts are also directed to the automation and unmanned operation of tractors and other work vehicles used in the field. For example, work vehicles which travel via automatic steering by utilizing a positioning system that is capable of precise positioning, e.g., a GNSS (Global Navigation Satellite System), are coming into practical use.
On the other hand, development of movable units which autonomously move by utilizing distance sensors, e.g., LiDAR (Light Detection and Ranging) is also under way. For example, Japanese Laid-Open Patent Publication No. 2019-154379 discloses an example of a work vehicle which performs self-traveling in between crop rows in a field by utilizing LiDAR.
In an environment in which trees or crops are distributed with a high density, e.g., vineyards or other orchards or forests, leaves thriving in the upper portions of the trees create canopies, each of which serves as an obstacle or a multiple reflector against radio waves from a satellite. Such an environment hinders accurate positioning using a GNSS. In an environment where a GNSS cannot be used, SLAM (Simultaneous Localization and Mapping), in which localization and map generation take place simultaneously, might be usable instead. However, various challenges exist in the practical application of a work vehicle that uses SLAM to travel automatically in an environment with a multitude of trees. For example, one challenge is that the distribution of tree leaves changes significantly with the seasons, so that maps created in the past cannot continue to be used.
A work vehicle according to an illustrative example embodiment of the present disclosure performs self-traveling among a plurality of crop rows. The work vehicle includes an exterior sensor to output sensor data indicating a distribution of geographic features around the work vehicle, and a controller configured or programmed to control self-traveling of the work vehicle in an inter-row travel mode of causing the work vehicle to travel along a target path between two adjacent crop rows that are detected based on the sensor data, and in a turning travel mode of causing the work vehicle to turn in a headland before and after the inter-row travel mode. In the turning travel mode, the controller is configured or programmed to calculate an amount of positional deviation and an amount of directional deviation of the work vehicle with respect to a target path in a next instance of the inter-row travel mode based on the sensor data. The controller is configured or programmed to switch from the turning travel mode to the inter-row travel mode upon satisfying a plurality of conditions including a first condition that the amount of positional deviation is smaller than a first threshold, and a second condition that the amount of directional deviation is smaller than a second threshold.
Example embodiments of the present disclosure may be implemented using devices, systems, methods, integrated circuits, computer programs, non-transitory computer-readable storage media, or any combination thereof. The non-transitory computer-readable storage media may be inclusive of volatile storage media, or non-volatile storage media. The devices each may include a plurality of devices. In the case where the devices each include two or more devices, the two or more devices may be included within a single apparatus, or divided over two or more separate apparatuses.
According to example embodiments of the present disclosure, it is possible to realize work vehicles that each smoothly perform self-traveling among a plurality of crop rows (e.g., rows of trees) even in an orchard, a forest, or any other environment where GNSS-based positioning is difficult.
The above and other elements, features, steps, characteristics and advantages of the present invention will become more apparent from the following detailed description of the example embodiments with reference to the attached drawings.
In the present disclosure, a “work vehicle” means a vehicle for use in performing work in a work area. A “work area” is any place where work may be performed, e.g., a field, a mountain forest, or a construction site. A “field” is any place where agricultural work may be performed, e.g., an orchard, an agricultural field, a paddy field, a cereal farm, or a pasture. A work vehicle can be an agricultural machine such as a tractor, a rice transplanter, a combine, a vehicle for crop management, or a riding mower, or a vehicle for non-agricultural purposes such as a construction vehicle or a snowplow vehicle. A work vehicle may be configured so that an implement that is suitable for the content of work can be attached to at least one of its front and its rear. Traveling of a work vehicle that is made while it performs work by using an implement may be referred to as “tasked travel”.
“Self-driving” means controlling the travel of a vehicle based on the action of a controller, rather than through manual operation of a driver. During self-driving, not only the travel of the vehicle, but also the task operation (e.g., the operation of the implement) may be automatically controlled. A vehicle that is traveling via self-driving is said to be “self-traveling”. The controller may be configured or programmed to control at least one of steering, adjustment of traveling speed, and starting and stopping of travel as necessary for the travel of the vehicle. In the case of controlling a work vehicle having an implement attached thereto, the controller may be configured or programmed to control operations such as raising or lowering of the implement, starting and stopping of the operation of the implement, and the like. Travel via self-driving includes not only the travel of a vehicle toward a destination along a predetermined path, but also the travel of merely following a target of tracking. A vehicle performing self-driving may travel based in part on a user's instruction. A vehicle performing self-driving may operate not only in a self-driving mode but also in a manual driving mode of traveling through manual operation of the driver. The steering of a vehicle that is based on the action of a controller, rather than manually, is referred to as “automatic steering”. A portion or a whole of the controller may be external to the vehicle. Between the vehicle and a controller that is external to the vehicle, communication of control signals, commands, data, or the like may be performed. A vehicle performing self-driving may autonomously travel while sensing the surrounding environment, without any person being involved in the control of the travel of the vehicle. A vehicle that is capable of autonomous travel can travel in an unmanned manner. During autonomous travel, detection of obstacles and avoidance of obstacles may be performed.
An “exterior sensor” is a sensor that senses the external state of the work vehicle. Examples of exterior sensors include LiDAR sensors, cameras (or image sensors), laser range finders (also referred to as “range sensors”), ultrasonic sensors, millimeter wave radars, and magnetic sensors.
A “crop row” is a row of crops, trees, or other plants grown in rows in a field, e.g., an orchard or an agricultural field, or in a forest or the like. In the present disclosure, a “crop row” is a notion that encompasses a “row of trees”.
An “obstacle map” is local map data in which the position or a region of an object around the work vehicle is expressed in a predetermined coordinate system. A coordinate system defining an obstacle map may be a vehicle coordinate system that is fixed to the work vehicle, or a world coordinate system that is fixed to the globe (e.g., a geographic coordinate system), for example. An obstacle map may include information other than position (e.g., attribute information) of an object around the work vehicle. The obstacle map may be expressed in various formats, e.g., a grid map or a point cloud map.
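For illustration only (this sketch is not part of the disclosure), the following Python code shows one minimal way a grid-format obstacle map in a vehicle coordinate system could be represented; the class name, map size, and resolution are assumptions.

```python
import numpy as np

class ObstacleMap:
    """Minimal occupancy-grid obstacle map in a vehicle coordinate system.

    The grid covers a square region centered on the vehicle; each cell
    holds 1 if at least one reflection point fell inside it, else 0.
    """

    def __init__(self, size_m: float = 40.0, resolution_m: float = 0.2):
        self.resolution = resolution_m
        self.cells = int(size_m / resolution_m)
        self.grid = np.zeros((self.cells, self.cells), dtype=np.uint8)

    def add_points(self, points_xy: np.ndarray) -> None:
        """Mark cells containing points (N x 2 array of x, y in meters)."""
        ij = (points_xy / self.resolution + self.cells / 2).astype(int)
        ok = ((ij >= 0) & (ij < self.cells)).all(axis=1)
        self.grid[ij[ok, 1], ij[ok, 0]] = 1

# Example: two reflection points ahead-left and ahead-right of the vehicle
m = ObstacleMap()
m.add_points(np.array([[3.0, 1.5], [3.0, -1.5]]))
```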
Hereinafter, example embodiments of the present disclosure will be described more specifically. Note however that unnecessarily detailed descriptions may be omitted. For example, detailed descriptions on what is well known in the art or redundant descriptions on what is substantially the same configuration may be omitted. This is to avoid lengthy description, and facilitate the understanding of those skilled in the art. The accompanying drawings and the following description, which are provided by the present inventors so that those skilled in the art can sufficiently understand the present disclosure, are not intended to limit the scope of claims. In the following description, component elements having identical or similar functions are denoted by identical reference numerals.
The following example embodiments are only exemplary, and the techniques according to the present disclosure are not limited to the following example embodiments. For example, numerical values, shapes, materials, steps, orders of steps, etc., that are indicated in the following example embodiments are only exemplary, and allow for various modifications so long as they make technological sense. Any one implementation may be combined with another.
Hereinafter, as one example, an example embodiment where the work vehicle is a tractor for use in agricultural work in a field such as an orchard will be described. Without being limited to tractors, the techniques according to example embodiments of the present disclosure are also applicable to other types of agricultural machines such as rice transplanters, combines, vehicles for crop management, or riding lawn mowers, for example. The techniques according to example embodiments of the present disclosure are also applicable to vehicles for non-agricultural purposes such as a construction vehicle or a snowplow vehicle.
The work vehicle 100 includes a plurality of exterior sensors to sense the surroundings of the work vehicle 100. In the example described here, the exterior sensors include cameras 120, obstacle sensors 130, and LiDAR sensors 140.
The cameras 120 may be provided at the front, rear, right, and left of the work vehicle 100, for example. The cameras 120 image the surrounding environment of the work vehicle 100 and generate image data. The images acquired with the cameras 120 may be transmitted to a terminal device that is responsible for remote monitoring, for example. The images may be used to monitor the work vehicle 100 during unmanned driving. Any number of cameras 120 may be provided, according to need.
The LiDAR sensors 140 are one example of exterior sensors that output sensor data indicating a distribution of geographic features around the work vehicle 100.
The LiDAR sensor(s) 140 may be configured to output two-dimensional or three-dimensional point cloud data as sensor data. In the present specification, “point cloud data” broadly means data indicating a distribution of multiple reflection points that are observed with a LiDAR sensor(s) 140. The point cloud data may include coordinate values of each reflection point in a two-dimensional space or a three-dimensional space, or information indicating the distance and direction of each reflection point, for example. The point cloud data may also include luminance information of each reflection point. The LiDAR sensor(s) 140 may be configured to repeatedly output point cloud data with a pre-designated cycle, for example. Thus, the exterior sensors may include one or more LiDAR sensors 140 that output point cloud data as sensor data.
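As a hedged illustration of what such point cloud data can contain, the sketch below converts a reflection point given as distance and direction into sensor-coordinate values; the axis convention (u forward, v left, w up) is an assumption, not something the disclosure specifies.

```python
import numpy as np

def polar_to_sensor_xyz(distance_m, azimuth_rad, elevation_rad):
    """Convert a reflection point given as (distance, direction) into
    (u, v, w) coordinates in the sensor coordinate system.

    Convention assumed here: u forward, v left, w up; azimuth measured
    from the u axis toward v, elevation from the u-v plane toward w.
    """
    u = distance_m * np.cos(elevation_rad) * np.cos(azimuth_rad)
    v = distance_m * np.cos(elevation_rad) * np.sin(azimuth_rad)
    w = distance_m * np.sin(elevation_rad)
    return np.stack([u, v, w], axis=-1)

# A point 10 m away, 30 degrees to the left, on the horizontal plane
print(polar_to_sensor_xyz(10.0, np.deg2rad(30.0), 0.0))
```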
The sensor data that is output from the LiDAR sensor(s) 140 is processed by a controller configured or programmed to control self-traveling of the work vehicle 100. During travel of the work vehicle 100, based on the sensor data that is output from the LiDAR sensor(s) 140, the controller can be configured or programmed to consecutively generate an obstacle map indicating a distribution of objects existing around the work vehicle 100. The controller may be configured or programmed to generate an environment map by joining together obstacle maps with the use of an algorithm such as SLAM, for example, during self-traveling. The controller can be configured or programmed to perform estimation of the position and orientation of the work vehicle 100 (i.e., localization) by matching the sensor data against the environment map.
The work vehicle 100 also includes a plurality of obstacle sensors 130, which detect objects around the work vehicle 100 and are described in more detail below.
The work vehicle 100 further includes a GNSS unit 110. GNSS is a collective term for satellite positioning systems such as the GPS (Global Positioning System), QZSS (Quasi-Zenith Satellite System, e.g., MICHIBIKI), GLONASS, Galileo, and BeiDou. The GNSS unit 110 receives satellite signals (also referred to as GNSS signals) that are transmitted from a plurality of GNSS satellites, and performs positioning based on the satellite signals. Although the GNSS unit 110 in the present example embodiment is disposed above the cabin 105, it may be disposed at any other position. The GNSS unit 110 includes an antenna to receive signals from the GNSS satellites, and a processing circuit. The work vehicle 100 in the present example embodiment is used in environments where multiple trees grow to make it difficult to use a GNSS, e.g., a vineyard. In such environments, the LiDAR sensor(s) 140 is mainly used in positioning. However, in an environment where it is possible to receive GNSS signals, positioning may be performed by using the GNSS unit 110. By combining the positioning based on the LiDAR sensor(s) 140 and the positioning based on the GNSS unit 110, the stability or accuracy of positioning can be improved.
The GNSS unit 110 may include an inertial measurement unit (IMU). Signals from the IMU can be used to complement position data. The IMU can measure a tilt or a small motion of the work vehicle 100. The data acquired by the IMU can be used to complement the position data based on the satellite signals, so as to improve the performance of positioning.
The prime mover 102 may be a diesel engine, for example. Instead of a diesel engine, an electric motor may be used. The transmission 103 can change the propulsion and the moving speed of the work vehicle 100 through a speed changing mechanism. The transmission 103 can also switch between forward travel and backward travel of the work vehicle 100.
The steering device 106 includes a steering wheel, a steering shaft connected to the steering wheel, and a power steering device to assist in the steering by the steering wheel. The front wheels 104F are the wheels responsible for steering, such that changing their angle of turn (also referred to as “steering angle”) can cause a change in the traveling direction of the work vehicle 100. The steering angle of the front wheels 104F can be changed by manipulating the steering wheel. The power steering device includes a hydraulic device or an electric motor to supply an assisting force to change the steering angle of the front wheels 104F. When automatic steering is performed, under the control of the controller disposed in the work vehicle 100, the steering angle may be automatically adjusted by the power of the hydraulic device or the electric motor.
A linkage device 108 is provided at the rear of the vehicle body 101. The linkage device 108 includes, e.g., a three-point linkage (also referred to as a “three-point link” or a “three-point hitch”), a PTO (Power Take Off) shaft, a universal joint, and a communication cable. The linkage device 108 allows the implement 300 to be attached to, or detached from, the work vehicle 100. The linkage device 108 is able to raise or lower the three-point link with a hydraulic device, for example, thus changing the position or attitude of the implement 300. Moreover, motive power can be sent from the work vehicle 100 to the implement 300 via the universal joint. While towing the implement 300, the work vehicle 100 allows the implement 300 to perform a predetermined task. The linkage device may be provided at the front portion of the vehicle body 101. In that case, the implement can be connected at the front portion of the work vehicle 100.
The implement 300 may be, for example, a sprayer to spread a chemical agent, or any other implement that is suitable for the content of work.
In addition to the GNSS unit 110, the camera(s) 120, the obstacle sensors 130, the LiDAR sensor(s) 140, and the operational terminal 200, the work vehicle 100 further includes sensors 150 to detect the operating status of the work vehicle 100, a travel control system 160, a communicator 190, operation switches 210, and a drive device 240.
The GNSS unit 110 includes a GNSS receiver 111, an RTK receiver 112, an inertial measurement unit (IMU) 115, and a processing circuit 116. The sensors 150 include a steering wheel sensor 152, an angle-of-turn sensor 154, and an axle sensor 156. The travel control system 160 includes a storage 170 and a controller 180. The controller 180 includes a plurality of electronic control units (ECUs) 181 to 184. The implement 300 includes a drive device 340, a controller 380, and a communicator 390.
The GNSS receiver 111 in the GNSS unit 110 receives satellite signals transmitted from the plurality of GNSS satellites and generates GNSS data based on the satellite signals. The GNSS data is generated in a predetermined format such as, for example, the NMEA-0183 format. The GNSS data may include, for example, the ID number, the angle of elevation, the azimuth angle, and a value representing the reception intensity of each of the satellites from which the satellite signals are received.
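Because NMEA-0183 is a public standard, a minimal sketch of extracting latitude, longitude, and fix quality from a GGA sentence can illustrate what such GNSS data looks like; this is an illustrative parser that skips checksum validation, not the unit's actual firmware.

```python
def parse_gga(sentence: str):
    """Extract latitude/longitude (decimal degrees) and fix quality
    from an NMEA-0183 GGA sentence. Minimal sketch: no checksum check.
    """
    fields = sentence.split(",")
    if not fields[0].endswith("GGA"):
        raise ValueError("not a GGA sentence")

    def dm_to_deg(dm: str, hemi: str, deg_digits: int) -> float:
        # NMEA encodes latitude as ddmm.mmmm and longitude as dddmm.mmmm
        deg = float(dm[:deg_digits]) + float(dm[deg_digits:]) / 60.0
        return -deg if hemi in ("S", "W") else deg

    lat = dm_to_deg(fields[2], fields[3], 2)
    lon = dm_to_deg(fields[4], fields[5], 3)
    quality = int(fields[6])  # e.g., 1 = standalone GNSS, 4 = RTK fixed
    return lat, lon, quality

print(parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,4,08,0.9,545.4,M,46.9,M,,*47"))
```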
The GNSS unit 110 may perform positioning of the work vehicle 100 by utilizing an RTK (Real Time Kinematic)-GNSS. In the positioning based on the RTK-GNSS, not only satellite signals transmitted from a plurality of GNSS satellites, but also a correction signal that is transmitted from a reference station is used. The reference station may be disposed near the work area where the work vehicle 100 performs tasked travel (e.g., at a position within 10 km of the work vehicle 100). The reference station generates a correction signal of, for example, an RTCM format based on the satellite signals received from the plurality of GNSS satellites, and transmits the correction signal to the GNSS unit 110. The RTK receiver 112, which includes an antenna and a modem, receives the correction signal transmitted from the reference station. Based on the correction signal, the processing circuit 116 of the GNSS unit 110 corrects the results of the positioning performed by the GNSS receiver 111. Use of the RTK-GNSS enables positioning with an error on the order of several centimeters, for example. Positional information including latitude, longitude, and altitude information is acquired through the highly accurate positioning by the RTK-GNSS. The GNSS unit 110 calculates the position of the work vehicle 100 as frequently as, for example, one to ten times per second. Note that the positioning method is not limited to an RTK-GNSS; any arbitrary positioning method (e.g., an interferometric positioning method or a relative positioning method) that provides positional information with the necessary accuracy can be used. For example, positioning may be performed by utilizing a VRS (Virtual Reference Station) or a DGPS (Differential Global Positioning System).
The GNSS unit 110 according to the present example embodiment further includes the IMU 115. The IMU 115 may include a 3-axis accelerometer and a 3-axis gyroscope. The IMU 115 may include a direction sensor such as a 3-axis geomagnetic sensor. The IMU 115 functions as a motion sensor which can output signals representing parameters such as acceleration, velocity, displacement, and attitude of the work vehicle 100. Based not only on the satellite signals and the correction signal but also on a signal that is output from the IMU 115, the processing circuit 116 can estimate the position and orientation of the work vehicle 100 with a higher accuracy. The signal that is output from the IMU 115 may be used for the correction or complementation of the position that is calculated based on the satellite signals and the correction signal. The IMU 115 outputs a signal more frequently than the GNSS receiver 111. For example, the IMU 115 outputs a signal as frequently as several tens to several thousands of times per second. Utilizing this frequently output signal, the processing circuit 116 allows the position and orientation of the work vehicle 100 to be measured more frequently (e.g., about 10 Hz or above). Instead of the IMU 115, a 3-axis accelerometer and a 3-axis gyroscope may be separately provided. The IMU 115 may be provided as a separate device from the GNSS unit 110.
The cameras 120 are imagers that image the surrounding environment of the work vehicle 100. Each camera 120 includes an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor), for example. In addition, each camera 120 may include an optical system including one or more lenses and a signal processing circuit. During travel of the work vehicle 100, the cameras 120 image the surrounding environment of the work vehicle 100, and generate image (e.g., motion picture) data. The cameras 120 are able to capture motion pictures at a frame rate of 3 frames/second (fps: frames per second) or greater, for example. The images generated by the cameras 120 may be used by a remote supervisor to check the surrounding environment of the work vehicle 100 with the terminal device 400, for example. The images generated by the cameras 120 may also be used for the purpose of positioning or detection of obstacles.
An obstacle sensor 130 detects objects around the work vehicle 100. The obstacle sensor 130 may include a laser scanner or an ultrasonic sonar, for example. When an object exists at a position closer to the obstacle sensor 130 than a predetermined distance, the obstacle sensor 130 outputs a signal indicating the presence of an obstacle. A plurality of obstacle sensors 130 may be provided at different positions of the work vehicle 100. For example, a plurality of laser scanners and a plurality of ultrasonic sonars may be disposed at different positions of the work vehicle 100. Providing a multitude of obstacle sensors 130 can reduce blind spots in monitoring obstacles around the work vehicle 100.
The steering wheel sensor 152 measures the angle of rotation of the steering wheel of the work vehicle 100. The angle-of-turn sensor 154 measures the angle of turn of the front wheels 104F, which are the wheels responsible for steering. Measurement values by the steering wheel sensor 152 and the angle-of-turn sensor 154 may be used for steering control by the controller 180.
The axle sensor 156 measures the rotational speed, i.e., the number of revolutions per unit time, of an axle that is connected to the wheels 104. The axle sensor 156 may be a sensor including a magnetoresistive (MR) element, a Hall generator, or an electromagnetic pickup, for example. The axle sensor 156 outputs a numerical value indicating the number of revolutions per minute (unit: rpm) of the axle, for example. The axle sensor 156 is used to measure the speed of the work vehicle 100. Measurement values from the axle sensor 156 can be utilized for the speed control by the controller 180.
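As a worked example of how an axle rotational speed translates into vehicle speed (assuming the wheel rolls without slip; the tire diameter value is a placeholder, not taken from the disclosure):

```python
import math

def vehicle_speed_mps(axle_rpm: float, tire_diameter_m: float) -> float:
    """Estimate vehicle speed from the axle rotational speed.

    Assumes the wheel rolls without slip: one axle revolution moves the
    vehicle forward by one tire circumference.
    """
    return axle_rpm / 60.0 * math.pi * tire_diameter_m

# e.g., 50 rpm on a 1.2 m diameter tire -> about 3.1 m/s (11 km/h)
print(vehicle_speed_mps(50.0, 1.2))
```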
The drive device 240 includes various types of devices required to cause the work vehicle 100 to travel and to drive the implement 300, for example, the prime mover 102, the transmission 103, the steering device 106, and the linkage device 108 described above. The prime mover 102 may include an internal combustion engine such as, for example, a diesel engine. The drive device 240 may include an electric motor for traction instead of, or in addition to, the internal combustion engine.
The storage 170 includes one or more storage media such as a flash memory or a magnetic disc. The storage 170 stores various data that is generated by the GNSS unit 110, the camera(s) 120, the obstacle sensor(s) 130, the LiDAR sensor(s) 140, the sensors 150, and the controller 180. The data that is stored by the storage 170 may include an environment map of the environment where the work vehicle 100 travels, an obstacle map that is consecutively generated during travel, and path data for self-driving. The storage 170 also stores a computer program(s) to cause each of the ECUs in the controller 180 to perform various operations described below. Such a computer program(s) may be provided to the work vehicle 100 via a storage medium (e.g., a semiconductor memory, an optical disc, etc.) or through telecommunication lines (e.g., the Internet). Such a computer program(s) may be marketed as commercial software.
The controller 180 is configured or programmed to include the plurality of ECUs. The plurality of ECUs include, for example, the ECU 181 for speed control, the ECU 182 for steering control, the ECU 183 for implement control, and the ECU 184 for self-driving control.
The ECU 181 controls the prime mover 102, the transmission 103, and brakes included in the drive device 240, thus controlling the speed of the work vehicle 100.
The ECU 182 controls the hydraulic device or the electric motor included in the steering device 106 based on a measurement value of the steering wheel sensor 152, thus controlling the steering of the work vehicle 100.
In order to cause the implement 300 to perform a desired operation, the ECU 183 controls the operations of the three-point link, the PTO shaft and the like that are included in the linkage device 108. Also, the ECU 183 generates a signal to control the operation of the implement 300, and transmits this signal from the communicator 190 to the implement 300.
Based on data output from the GNSS unit 110, the camera(s) 120, the obstacle sensor(s) 130, the LiDAR sensor(s) 140, and the sensors 150, the ECU 184 performs computation and control for achieving self-driving. For example, the ECU 184 estimates the position of the work vehicle 100 based on the data output from at least one of the GNSS unit 110, the camera(s) 120, and the LiDAR sensor(s) 140. In a situation where a sufficiently high reception intensity exists for the satellite signals from the GNSS satellites, the ECU 184 may determine the position of the work vehicle 100 based only on the data output from the GNSS unit 110. On the other hand, in an environment where obstructions, such as trees, that may hinder reception of the satellite signals exist around the work vehicle 100, e.g., an orchard, the ECU 184 estimates the position of the work vehicle 100 by using the data output from the LiDAR sensor(s) 140 or the camera(s) 120. During self-driving, the ECU 184 performs computation necessary for the work vehicle 100 to travel along a target path, based on the estimated position of the work vehicle 100. The ECU 184 sends the ECU 181 a command to change the speed, and sends the ECU 182 a command to change the steering angle. In response to the command to change the speed, the ECU 181 controls the prime mover 102, the transmission 103, or the brakes to change the speed of the work vehicle 100. In response to the command to change the steering angle, the ECU 182 controls the steering device 106 to change the steering angle.
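The disclosure does not specify the control law by which the steering-angle command is computed; purely as a hedged sketch, the following code applies the well-known Stanley lateral-control law to the positional and directional deviations described above. The gain, steering limit, and sign convention are assumptions.

```python
import math

def steering_command(lateral_error_m: float, heading_error_rad: float,
                     speed_mps: float, k_gain: float = 0.8) -> float:
    """Compute a target steering angle that reduces the deviation of the
    vehicle from the target path (Stanley-style lateral control).

    lateral_error_m: signed offset to the path (positive when the path
        lies to the vehicle's left).
    heading_error_rad: path direction minus vehicle direction.
    Returns a steering angle in radians, clipped to +/-35 degrees;
    positive values steer left under the convention assumed here.
    """
    steer = heading_error_rad + math.atan2(k_gain * lateral_error_m,
                                           max(speed_mps, 0.1))
    limit = math.radians(35.0)
    return max(-limit, min(limit, steer))

# Path 0.5 m to the vehicle's left, 5 deg heading error, 1.5 m/s
# -> about 20 degrees of left steer
print(math.degrees(steering_command(0.5, math.radians(5.0), 1.5)))
```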
Through the actions of these ECUs, the controller 180 realizes self-traveling. During self-traveling, the controller 180 controls the drive device 240 based on the measured or estimated position of the work vehicle 100 and on the consecutively-generated target path. As a result, the controller 180 can cause the work vehicle 100 to travel along the target path.
The plurality of ECUs included in the controller 180 can communicate with one another in accordance with a vehicle bus standard such as, for example, a CAN (Controller Area Network). Instead of a CAN, faster communication methods such as Automotive Ethernet (registered trademark) may be used. Although the ECUs 181 to 184 are illustrated as individual blocks in
The communicator 190 includes a circuit communicating with the implement 300 and the terminal device 400. The communicator 190 includes circuitry to perform exchanges of signals complying with an ISOBUS standard such as ISOBUS-TIM, for example, between itself and the communicator 390 of the implement 300. This allows the implement 300 to perform a desired operation, or allows information to be acquired from the implement 300. The communicator 190 may further include an antenna and a communication circuit to exchange signals via the network 80 with the communicator of the terminal device 400. The network 80 may include a 3G, 4G, 5G, or any other cellular mobile communications network and the Internet, for example. The communicator 190 may have a function of communicating with a mobile terminal that is used by a supervisor who is situated near the work vehicle 100. With such a mobile terminal, communication may be performed based on any arbitrary wireless communication standard, e.g., Wi-Fi (registered trademark), 3G, 4G, 5G or any other cellular mobile communication standard, or Bluetooth (registered trademark).
The operational terminal 200 is a terminal for the user to perform a manipulation related to the travel of the work vehicle 100 and the operation of the implement 300, and is also referred to as a virtual terminal (VT). The operational terminal 200 may include a display device such as a touch screen panel, and/or one or more buttons. The display device may be a display such as a liquid crystal display or an organic light-emitting diode (OLED) display, for example. By manipulating the operational terminal 200, the user can perform various manipulations, such as, for example, switching ON/OFF the self-driving mode, and switching ON/OFF the implement 300. At least a portion of these manipulations may also be realized by manipulating the operation switches 210. The operational terminal 200 may be configured so as to be detachable from the work vehicle 100. A user who is at a remote place from the work vehicle 100 may manipulate the detached operational terminal 200 to control the operation of the work vehicle 100.
The drive device 340 in the implement 300 includes various devices that are needed for the implement 300 to perform a predetermined task.
Next, a configuration example of the LiDAR sensor(s) 140 will be described. The LiDAR sensor 140 in this example emits N laser beams, where N is an integer of 1 or more, in outgoing directions that differ in elevation angle, and scans the surrounding space with these beams.
A LiDAR sensor having an N of 1 may be referred to as a “two-dimensional LiDAR”, while a LiDAR sensor having an N of 2 or more may be referred to as a “three-dimensional LiDAR”. When N is 2 or more, the angle made by the first laser beam and an Nth laser beam is referred to as the “vertical viewing angle”. The vertical viewing angle may be set in a range from about 20° to 60°, for example.
The LiDAR sensor 140 includes a plurality of laser units 141, a motor 144, a control circuit 145, a signal processing circuit 146, and a memory 147. Each laser unit 141 includes a laser light source 142 and a photodetector 143.
Each laser light source 142 includes a laser diode, and emits a pulsed laser beam of a predetermined wavelength in response to a command from the control circuit 145. The wavelength of the laser beam may be a wavelength that is included in the near-infrared wavelength region (approximately 700 nm to 2.5 μm), for example. The wavelength used depends on the material of the photoelectric conversion element used for the photodetector 143. In the case where silicon (Si) is used as the material of the photoelectric conversion element, for example, a wavelength around 900 nm may be mainly used. In the case where indium gallium arsenide (InGaAs) is used as the material of the photoelectric conversion element, a wavelength of not less than 1000 nm and not more than 1650 nm may be used, for example. Note that the wavelength of the laser beam is not limited to the near-infrared wavelength region. In applications where influences of ambient light are not a problem (e.g., for nighttime use), a wavelength included in the visible region (e.g., approximately 400 nm to 700 nm) may be used. Depending on the application, the ultraviolet wavelength region may also be used. In the present specification, any radiation in the ultraviolet, visible light, and infrared wavelength regions in general is referred to as “light”.
Each photodetector 143 is a device to detect laser pulses that are emitted from the laser light source 142 and reflected or scattered by an object. The photodetector 143 includes a photoelectric conversion element such as an avalanche photodiode (APD), for example. The photodetector 143 outputs an electrical signal which is in accordance with the amount of received light.
In response to a command from the control circuit 145, the motor 144 rotates the mirror that is placed on the optical path of a laser beam emitted from each laser light source 142. This realizes a scan operation that changes the outgoing directions of laser beams.
The control circuit 145 controls emission of laser pulses by the laser light sources 142, detection of reflection pulses by the photodetectors 143, and rotational operation by the motor 144. The control circuit 145 can be implemented by a circuit that includes a processor, e.g., a microcontroller unit (MCU), for example.
The signal processing circuit 146 is a circuit to perform computations based on signals that are output from the photodetectors 143. The signal processing circuit 146 uses ToF (Time of Flight) techniques to calculate a distance to an object that has reflected a laser pulse emitted from a laser light source 142, for example. ToF techniques include direct ToF and indirect ToF. Under direct ToF, the time from the emission of a laser pulse from the laser light source 142 until reflected light is received by the photodetector 143 is directly measured to calculate the distance to the reflection point. Under indirect ToF, a plurality of exposure periods are set in the photodetector 143, and the distance to each reflection point is calculated based on a ratio of light amounts detected in the respective exposure periods. Either the direct ToF or indirect ToF method may be used. The signal processing circuit 146 generates and outputs sensor data indicating the distance to each reflection point and the direction of that reflection point, for example. Furthermore, the signal processing circuit 146 may calculate coordinates (u,v) or (u,v,w) in the sensor coordinate system based on the distance to each reflection point and the direction of that reflection point, and include these in the sensor data for output.
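As a worked illustration of the two ToF variants described above (the two-exposure formulation shown for indirect ToF is one common scheme, assumed here rather than taken from the disclosure):

```python
C = 299_792_458.0  # speed of light, m/s

def direct_tof_distance(round_trip_s: float) -> float:
    """Direct ToF: the measured round-trip time corresponds to twice
    the sensor-to-reflection-point distance."""
    return C * round_trip_s / 2.0

def indirect_tof_distance(q1: float, q2: float, pulse_s: float) -> float:
    """Indirect ToF (two-exposure scheme): q1 and q2 are the light
    amounts collected in two consecutive exposure windows, each of
    length pulse_s; their ratio encodes the pulse delay."""
    return C * pulse_s / 2.0 * (q2 / (q1 + q2))

print(direct_tof_distance(66.7e-9))             # ~10 m
print(indirect_tof_distance(0.3, 0.1, 100e-9))  # ~3.75 m
```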
Although the control circuit 145 and the signal processing circuit 146 are two separate circuits in this example, they may instead be implemented as a single circuit.
The memory 147 is a storage medium to store data that is generated by the control circuit 145 and the signal processing circuit 146. For example, the memory 147 stores data that associates the emission timing of a laser pulse emitted from each laser unit 141, the outgoing direction, the reflected light intensity, the distance to the reflection point, and the coordinates (u,v) or (u,v,w) in the sensor coordinate system. Such data is generated each time a laser pulse is emitted, and recorded to the memory 147. The control circuit 145 outputs such data with a predetermined cycle (e.g., the length of time required to emit a predetermined number of pulses, a half scan period, or one scan period). The output data is recorded in the storage 170 of the work vehicle 100.
The LiDAR sensor 140 outputs sensor data with a frequency of about 1 to 20 times per second, for example. This sensor data may include the coordinates of multiple points expressed in the sensor coordinate system, and time stamp information. The sensor data may include the information of distance and direction toward each reflection point but not include coordinate information. In such cases, the controller 180 performs conversion from the distance and direction information into coordinate information.
Note that the method of distance measurement is not limited to the ToF techniques, but other methods such as the FMCW (Frequency Modulated Continuous Wave) techniques may also be used. In the FMCW techniques, light whose frequency is linearly changed is emitted, and distance is calculated based on the frequency of beats that occur due to interferences between the emitted light and the reflected light.
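A short worked example of the FMCW relation described above; the sweep bandwidth, chirp duration, and beat frequency are assumed values for illustration.

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_distance(beat_hz: float, bandwidth_hz: float, chirp_s: float) -> float:
    """FMCW ranging: for a linear frequency sweep, the beat frequency
    between the emitted and reflected signals is proportional to distance:
        d = c * f_beat / (2 * S),  S = bandwidth / chirp duration.
    """
    slope = bandwidth_hz / chirp_s
    return C * beat_hz / (2.0 * slope)

# A 1 GHz sweep over 10 us and a 2 MHz beat -> about 3 m
print(fmcw_distance(2e6, 1e9, 10e-6))
```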
As described above, the LiDAR sensor(s) 140 according to the present example embodiment may be scan-type sensors, which acquire information on the distance distribution of objects in space by scanning a laser beam. However, the LiDAR sensors 140 are not limited to scan-type sensors. For example, the LiDAR sensor(s) 140 may be flash-type sensors, which acquire information on the distance distribution of objects in space by using light diffused over a wide area. A scan-type LiDAR sensor uses a higher intensity light than does a flash-type LiDAR sensor, and thus can acquire distance information at a greater distance. On the other hand, flash-type LiDAR sensors are suitable for applications that do not require intense light because they are simple in structure and can be manufactured at low cost.
Next, an operation of the work vehicle 100 will be described.
As described above, accurate positioning using a GNSS may be difficult in an environment such as an orchard. Accordingly, the controller 180 according to the present example embodiment detects two crop rows existing on opposite sides of the work vehicle 100 based on sensor data that is output from the LiDAR sensor(s) 140, and causes the work vehicle 100 to travel along a path between the two crop rows. Furthermore, upon detecting an end of a crop row based on the sensor data, the controller 180 sets a coordinate system for turning travel and a target point for the turning travel, and causes the work vehicle 100 to turn toward the target point based on the coordinate system. During turning travel, based on the sensor data, the controller 180 detects two adjacent crop rows that are near the target point, and sets a target path for a next instance of inter-row travel in between these two crop rows. The controller 180 calculates an amount of positional deviation and an amount of directional deviation of the work vehicle 100 with respect to the target path based on the sensor data. When a plurality of conditions are satisfied, including a first condition that the amount of positional deviation is smaller than a first threshold, and a second condition that the amount of directional deviation is smaller than a second threshold, the controller 180 ends control of turning travel, and starts control of travel between crop rows. Such an operation allows the work vehicle 100 to smoothly perform self-traveling among a plurality of crop rows, including making turns.
At timings when the GNSS unit 110 is able to receive GNSS signals, positioning may be conducted based on the GNSS signals. For example, positioning based on the GNSS signals may be performed at a timing of turning around along the path 30 in a headland.
When self-traveling is started, the controller 180 first causes the work vehicle 100 to travel in the inter-row travel mode (step S101).
At step S102, the controller 180 determines whether the condition for starting turning travel is satisfied or not. If the condition for starting turning travel is satisfied, control proceeds to step S103. If the condition for starting turning travel is not satisfied, control proceeds to step S111.
The determination of step S102 may be made through the following flow, for example. First, upon detecting an end of at least the row of trees on the turning-direction side (right or left) of the two adjacent rows of trees, the controller 180 sets a coordinate system for turning travel and a target point of turn. Hereinafter, a coordinate system for turning travel may be referred to as a “turning coordinate system”. A turning coordinate system is a coordinate system that is fixed to the ground surface, and is used to control the turning travel. The target point is an entrance point of a next instance of inter-row travel. After setting the turning coordinate system and the target point of turn, once the work vehicle 100 passes the end of the row of trees, the controller 180 determines that the condition for starting turning travel is satisfied, and transitions to the turning travel mode (step S103). If the work vehicle 100 has not passed the end of the row of trees, the controller 180 determines that the condition for starting turning travel is not satisfied, and proceeds to step S111.
At step S111, the controller 180 determines whether the condition for stopping travel is satisfied or not. For example, the controller 180 determines that the condition for stopping travel is satisfied and halts the work vehicle 100, if any of the following is true: (a) an obstacle is detected based on sensor data; (b) the remaining fuel of the work vehicle 100 is smaller than a predetermined amount; (c) the amount of material possessed by the work vehicle 100 is smaller than a predetermined amount; (d) a problem of the work vehicle 100 is detected; (e) an instruction to halt is received from the user; (f) an angle of tilt of the work vehicle 100 is larger than a predetermined value; or (g) work to be performed by the work vehicle 100 is finished (step S113). As used herein, a “material” is a substance to be consumed in agricultural work. Examples of materials include chemical agents such as sprayer liquids, fertilizers, seedlings of crops, and seeds, for instance. A material may be carried on the work vehicle 100 or the implement 300. An example of when an angle of tilt of the work vehicle 100 is larger than a predetermined value may be a case where the work vehicle 100 is traveling on a steep uphill or a steep downhill. The controller 180 can determine whether each of cases (a) to (g) above is true or not by relying on various sensors provided on the work vehicle 100. If the condition for stopping travel is not satisfied, control returns to step S102.
Once transitioning to the turning travel mode at step S103, the controller 180 causes the work vehicle 100 to travel along a turning path that is set on the turning coordinate system. Specifically, based on the sensor data being consecutively output from the LiDAR sensor(s) 140, the controller 180 causes the work vehicle 100 to travel along the turning path, while performing localization for the work vehicle 100 on the turning coordinate system. In the turning travel mode, the controller 180 may utilize not only the sensor data but also a signal that is output from the GNSS receiver 111 and/or a signal that is output from the IMU 115 to perform positioning. While the work vehicle 100 is performing turning travel, the controller 180 determines whether the condition for starting inter-row travel is satisfied or not (step S104). For example, when the work vehicle 100 has come near the target point of turn so that any deviation in the position and orientation of the work vehicle 100 with respect to the target path in a next instance of the inter-row travel mode has become sufficiently small, it may be determined that the condition for starting inter-row travel is satisfied. Specifically, the controller 180 determines that the condition for starting inter-row travel is satisfied upon satisfying a plurality of conditions, including: (1) a first condition that the amount of positional deviation is smaller than a first threshold; and (2) a second condition that the amount of directional deviation is smaller than a second threshold. If the condition for starting inter-row travel is satisfied, the controller 180 switches from the turning travel mode to the inter-row travel mode (step S105). If the condition for starting inter-row travel is not satisfied, control proceeds to step S112.
At step S112, the controller 180 determines whether the condition for stopping travel is satisfied or not. This process of determination is similar to the process of determination at step S111: if any of conditions (a) to (g) described above is true, the controller 180 determines that the condition for stopping travel is satisfied and halts the work vehicle 100 (step S113). If the condition for stopping travel is not satisfied, control returns to step S104.
After transitioning to the inter-row travel mode at step S105, control returns to step S102. Thereafter, similar operations are repeated until the last instance of inter-row travel is finished. Through the above operation, self-traveling between rows of trees 20 is achieved. The above control is realized by the ECU 184 of the controller 180.
Through the above operation, while repeating the inter-row travel and the turning travel, the work vehicle 100 can perform self-traveling from the start point to the end point of a row of trees 20. According to the present example embodiment, even in an orchard or other environments where highly accurate positioning using a GNSS is difficult, it is possible to perform agricultural tasks such as spreading of a chemical agent while performing self-traveling.
Hereinafter, more specific examples of the operation of the work vehicle 100 according to the present example embodiment will be described.
Based on the two detected rows of trees, the controller 180 sets a target path 45 that passes between them, for example, at positions equidistant from the two rows.
The controller 180 causes the work vehicle 100 to travel along the target path 45 that has been set. For example, the controller 180 performs steering control for the work vehicle 100 so as to reduce or minimize the deviation of the position and orientation of the work vehicle 100 with respect to the target path 45. As a result, the work vehicle 100 can be made to travel along the target path 45.
The obstacle map 40 indicates a distribution of objects, such as trees, in a region around the work vehicle 100, expressed in a predetermined coordinate system, and is consecutively generated during travel.
The controller 180 may generate the obstacle map 40 by eliminating data of any points that are estimated as corresponding to unwanted objects, e.g., the ground surface and weeds, from the sensor data that is output from the LiDAR sensor(s) 140. In a case where three-dimensional point cloud data is output as the sensor data, from the point cloud data, the controller 180 may extract only the data of points whose height from the ground surface is within a predetermined range (e.g., within a range of 0.1 m to 1.5 m), and generate the obstacle map from the extracted data of points. By such a method, an obstacle map indicating a distribution of trees (mainly the trunks) can be generated.
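A minimal sketch of this height-based filtering, assuming the point cloud has already been expressed with z as height above the ground surface (function and variable names are hypothetical):

```python
import numpy as np

def filter_by_height(points_xyz: np.ndarray,
                     z_min: float = 0.1, z_max: float = 1.5) -> np.ndarray:
    """Keep only reflection points whose height above the ground surface
    lies in [z_min, z_max], removing ground returns and low weeds as well
    as the canopy, so that mainly tree trunks remain.

    points_xyz: N x 3 array (x, y, z) with z measured from the ground.
    """
    z = points_xyz[:, 2]
    return points_xyz[(z >= z_min) & (z <= z_max)]

pts = np.array([[3.0, 1.0, 0.02],   # ground return -> removed
                [3.0, 1.1, 0.8],    # trunk return  -> kept
                [3.1, 1.0, 2.4]])   # canopy return -> removed
print(filter_by_height(pts))
```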
Based on the obstacle map 40, the controller 180 detects two rows of trees 20R and 20L existing on opposite sides of the work vehicle 100. For example, the controller 180 calculates, from the obstacle map 40, approximate straight lines that respectively fit the rows of trees 20R and 20L.
After detecting the end of the row of trees 20R in the turning direction, the controller 180 sets the turning coordinate system. First, the controller 180 sets an origin of the turning coordinate system based on the position of the work vehicle 100 at the time of detecting the end of the row of trees 20R in the turning direction. For example, the controller 180 may set the position of the work vehicle 100 at the time of detecting the end of the row of trees 20R as an origin of the turning coordinate system. Alternatively, a position that is shifted in a predetermined direction and by a predetermined distance from the position of the work vehicle 100 at the time of detecting the end of the row of trees 20R may be set as the origin of the turning coordinate system.
Thus, based on the obstacle map 40, the controller 180 according to the present example embodiment sets the turning coordinate system Et, calculates the lengths Lr and Ll of the rows of trees 20R and 20L as well as the interval Lg between rows of trees, and based on these values, determines the target point, which is an entrance of the passage between the rows of trees to be next traveled through.
After setting the turning coordinate system Et and the target point P of turn in the inter-row travel mode, the controller 180 determines whether turning is possible or not, based on the sensor data that is output from the LiDAR sensor(s) 140. If determining that turning is possible, the controller 180 switches to the turning travel mode. For example, upon determining based on the sensor data that a space needed for turning exists and that the work vehicle 100 (including the implement 300, if the implement 300 is attached) has passed the end of the row of trees 20R in the turning direction, the controller 180 switches to the turning travel mode.
Next, a more detailed flow of the operation of the work vehicle 100 will be described.
In an initial state, it is assumed that the work vehicle 100 is located at the entrance of a passage between the first rows of trees.
At step S301, the controller 180 causes the work vehicle 100 to travel in the inter-row travel mode. In the inter-row travel mode, the controller 180 performs the following operation. First, based on the sensor data that is output from the LiDAR sensor(s) 140, the controller 180 generates the obstacle map 40. Next, from the obstacle map 40, approximate straight lines of two adjacent rows of trees that are located on opposite sides of the work vehicle 100 are calculated. The approximate straight lines can be calculated by performing a Hough transform for the obstacle map 40 in order to extract two line segments extending in directions close to the traveling direction in the vicinity of the work vehicle 100, for example. The controller 180 may be configured or programmed to set a target path by setting a plurality of waypoints at positions that are equidistant from the two approximate straight lines. In a case where the positions to be worked on by the implement 300 attached to the work vehicle 100 are located closer to either one of the right or left, a target path may be set at a position that is shifted from a position that is equidistant from the two approximate straight lines. In a case where the position and orientation of the work vehicle 100 are deviated from the position and direction of the target path, the controller 180 performs steering control so as to reduce the deviation. This allows the work vehicle 100 to travel along a target path that is between the rows of trees. If during travel an obstacle is detected on or near the target path, the controller 180 may halt the work vehicle 100, or change the target path in the middle so as to avoid the obstacle. At this time, the controller 180 may send an alert to the terminal device 400 for monitoring purposes.
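The disclosure names a Hough transform for extracting the two row lines; purely as an illustrative sketch, the code below substitutes a least-squares line fit per row (which behaves similarly on clean trunk points) and then places waypoints equidistant from the two fitted lines. All function names and sample coordinates are hypothetical.

```python
import numpy as np

def fit_row_line(points_xy: np.ndarray):
    """Fit y = a*x + b to one row's trunk points (least squares).
    A Hough transform, as mentioned in the text, would be more robust
    to outliers; the simple fit is used here for brevity."""
    a, b = np.polyfit(points_xy[:, 0], points_xy[:, 1], 1)
    return a, b

def midline_waypoints(left_xy, right_xy, x_ahead=np.linspace(1.0, 10.0, 10)):
    """Place waypoints equidistant from the two fitted row lines."""
    al, bl = fit_row_line(left_xy)
    ar, br = fit_row_line(right_xy)
    y_mid = ((al + ar) * x_ahead + (bl + br)) / 2.0
    return np.column_stack([x_ahead, y_mid])

left = np.array([[1.0, 1.6], [4.0, 1.5], [7.0, 1.6]])      # left trunks
right = np.array([[2.0, -1.5], [5.0, -1.6], [8.0, -1.5]])  # right trunks
print(midline_waypoints(left, right))
```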
At step S302, based on the obstacle map 40, the controller 180 performs a process of detecting an end of a row of trees. First, the controller 180 determines the lengths Lr and Ll of the rows of trees on the right and the left of the work vehicle 100.
At step S303, the controller 180 determines whether the row of trees whose end has been detected is the last row of trees or not. The determination as to whether the row of trees is the last row of trees or not can be made based on information concerning the distribution of the row of trees or the number of rows that is stored in the storage 170 in advance. If the row of trees whose end has been detected is the last row of trees, the controller 180 ends the tasked travel of the work vehicle 100. If the row of trees whose end has been detected is not the last row of trees, control proceeds to step S304.
At step S304, the controller 180 sets the turning coordinate system Et. As described above, the controller 180 sets the origin of the turning coordinate system Et based on the position of the work vehicle 100 at the time of detecting the end of the row of trees.
At step S305, the controller 180 sets coordinates (px, py) of the target point P of turn in the turning coordinate system Et. For example, the coordinates (px, py) may be determined based on the lengths Lr and Ll of the rows of trees and the interval Lg between rows of trees.
At step S306, the controller 180 determines whether the work vehicle 100 has gone out of the row of trees in the turning direction. This determination can be made based on the position of the work vehicle 100 (including the implement 300, if attached) relative to the detected end of the row of trees.
At step S307, the controller 180 determines whether a space needed for the turning exists or not. This determination may be made based on the obstacle map 40. For example, the controller 180 determines whether a region that is free of obstacles and is large enough to contain the turning path exists in the headland.
At step S308, the controller 180 sets the turning path 46, and transitions from the inter-row travel mode to the turning travel mode. The turning path 46 may be an arc-shaped path connecting the point P0 and the point P.
At step S309, the controller 180 determines whether the condition for starting inter-row travel is satisfied or not. For example, the controller 180 determines whether a plurality of conditions that have been set as conditions for starting inter-row travel are satisfied or not. The plurality of conditions at least include: a first condition that the amount of positional deviation of the work vehicle 100 with respect to the target path in a next instance of the inter-row travel mode is smaller than a first threshold; and a second condition that the amount of directional deviation of the work vehicle 100 with respect to that target path is smaller than a second threshold.
In other words, when the amount of positional deviation and the amount of directional deviation with respect to the next target path are sufficiently small, it may be determined that the condition for starting inter-row travel is satisfied.
Furthermore, the controller 180 may calculate a curvature of the target path in a next instance of the inter-row travel mode, and determine that the condition for starting inter-row travel is satisfied when the curvature is smaller than a threshold. In other words, the aforementioned plurality of conditions may include a third condition that the curvature of the target path in a next instance of the inter-row travel mode is smaller than a third threshold.
In this case, it is determined that the condition for starting inter-row travel is satisfied when, in addition to the aforementioned first condition and second condition being met, the curvature of the target path in a next instance of the inter-row travel mode is sufficiently small.
Moreover, when the work vehicle 100 has the implement 300 linked thereto, the controller 180 may calculate a distance between the two crop rows in a next instance of the inter-row travel mode that are detected based on the sensor data, and determine that the condition for starting inter-row travel is satisfied when the distance is larger than the width of the implement 300. In other words, the aforementioned plurality of conditions may include a fourth condition that the distance between the two crop rows in a next instance of the inter-row travel mode is larger than the width of the implement 300.
In this case, it is determined that the condition for starting inter-row travel is satisfied only when, in addition to the aforementioned conditions being met, the width of the implement 300 is smaller than the interval between the two crop rows.
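Gathering the first to fourth conditions, a hedged sketch of the switching decision might look as follows; all threshold defaults are placeholders chosen within the ranges the description suggests (e.g., 5 to 20 degrees for the second threshold), not values fixed by the disclosure.

```python
import math

def may_start_inter_row_travel(pos_dev_m: float, dir_dev_rad: float,
                               path_curvature_1pm: float,
                               row_gap_m: float, implement_width_m: float,
                               first_threshold_m: float = 0.3,
                               second_threshold_rad: float = math.radians(10.0),
                               third_threshold_1pm: float = 0.2) -> bool:
    """Evaluate the first to fourth conditions for switching from the
    turning travel mode to the inter-row travel mode. Threshold values
    are placeholders; the disclosure leaves them implementation-dependent
    (e.g., vehicle width, implement width, row interval)."""
    return (pos_dev_m < first_threshold_m                  # first condition
            and dir_dev_rad < second_threshold_rad         # second condition
            and path_curvature_1pm < third_threshold_1pm   # third condition
            and row_gap_m > implement_width_m)             # fourth condition

# 0.2 m and 5 deg of deviation, gentle path, 2.8 m gap, 1.8 m implement
print(may_start_inter_row_travel(0.2, math.radians(5.0), 0.05, 2.8, 1.8))
```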
If the condition for starting inter-row travel is not satisfied, the controller 180 continues its control in the turning travel mode. If the condition for starting inter-row travel is satisfied, control proceeds to step S310, and the controller 180 transitions to the inter-row travel mode. Thereafter, control returns to step S301, and a similar operation is repeated.
Now, a more specific example of the operation from steps S308 to S310, i.e., from transitioning into the turning travel mode to again transitioning into the inter-row travel mode, will be described.
At step S401, the controller 180 acquires sensor data that is output from the LiDAR sensor(s) 140.
At step S402, based on the sensor data, the controller 180 performs a process of detecting two rows of trees in a next instance of inter-row travel. For example, the controller 180 generates the aforementioned obstacle map based on the sensor data, and detects two rows of trees in the next instance of inter-row travel based on the obstacle map. If the two rows of trees are detected, control proceeds to step S403. If the two rows of trees are not detected, control returns to step S401.
When the two rows of trees 20C and 20D are detected at step S402, the controller 180 sets a target path for a next instance of inter-row travel in between the two rows of trees 20C and 20D (step S403).
At step S404, the controller 180 calculates an amount of positional deviation and an amount of directional deviation of the work vehicle 100 with respect to the target path. The controller 180 can estimate the position and direction (orientation) of the work vehicle 100 through a matching between the sensor data and the environment map. Based on a comparison between the estimated position and direction of the work vehicle 100 and the position and direction of the target path, the controller 180 can calculate the amount of positional deviation and the amount of directional deviation.
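As a sketch of this deviation computation under an assumed planar pose representation (x, y, heading), with the target path represented by its nearest pose; the sign conventions are assumptions.

```python
import math

def path_deviation(vehicle_x, vehicle_y, vehicle_heading_rad,
                   path_x, path_y, path_heading_rad):
    """Compute the amount of positional deviation (lateral distance from
    the target path) and the amount of directional deviation (heading
    difference) from the estimated pose and the nearest path pose."""
    dx, dy = vehicle_x - path_x, vehicle_y - path_y
    # Signed lateral offset: error component perpendicular to the path
    lateral = -dx * math.sin(path_heading_rad) + dy * math.cos(path_heading_rad)
    # Wrap the heading difference into (-pi, pi]
    ddir = (vehicle_heading_rad - path_heading_rad + math.pi) % (2 * math.pi) - math.pi
    return abs(lateral), abs(ddir)

# Vehicle 0.3 m beside a path segment aligned with the x axis,
# heading 8 degrees off -> (0.3, ~0.14 rad)
print(path_deviation(5.0, 0.3, math.radians(8.0), 5.0, 0.0, 0.0))
```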
When the amount of positional deviation and the amount of directional deviation are calculated at step S404, the controller 180 determines whether the amount of positional deviation is smaller than a first threshold or not (step S405). The first threshold may be set to an appropriate value which is in accordance with the width of the work vehicle 100, the width of the implement 300, the interval between rows of trees, or the like, for example. If the amount of positional deviation is smaller than the first threshold, control proceeds to step S406. If the amount of positional deviation is equal to or greater than the first threshold, control proceeds to step S408.
At step S406, the controller 180 determines whether the amount of directional deviation is smaller than a second threshold or not. The second threshold may also be set to an appropriate value which is in accordance with the width of the work vehicle 100, the width of the implement 300, the interval between rows of trees, or the like, for example. The second threshold may be set to a value of, e.g., 5 degrees to 20 degrees. If the amount of directional deviation is smaller than the second threshold, control proceeds to step S407. If the amount of directional deviation is equal to or greater than the second threshold, control proceeds to step S408.
At step S407, the controller 180 calculates a curvature of the target path in a next instance of the inter-row travel mode, and determines whether the curvature is smaller than a third threshold or not. The third threshold may be set to an appropriate value which is in accordance with the width of the work vehicle 100, the width of the implement 300, the interval between rows of trees, the length of the work vehicle 100, the turning performance of the work vehicle 100, or the like, for example. If the curvature is smaller than the third threshold, control proceeds to step S410. If the curvature is equal to or greater than the third threshold, control proceeds to step S408.
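The determinations of steps S405 to S407 amount to three threshold comparisons, as in the following sketch. The threshold values shown are placeholders; as described above, actual values would be chosen in accordance with the vehicle width, the implement width, the row interval, and the like.

    import math

    # Placeholder thresholds, assumed for illustration only.
    FIRST_THRESHOLD_M = 0.3                   # positional deviation (assumed)
    SECOND_THRESHOLD_RAD = math.radians(10)   # directional deviation (within 5-20 degrees)
    THIRD_THRESHOLD_PER_M = 0.05              # path curvature, 1/m (assumed)

    def start_condition_satisfied(positional_dev_m: float,
                                  directional_dev_rad: float,
                                  curvature_per_m: float) -> bool:
        """All three determinations must succeed before switching from
        the turning travel mode to the inter-row travel mode."""
        return (positional_dev_m < FIRST_THRESHOLD_M            # step S405
                and directional_dev_rad < SECOND_THRESHOLD_RAD  # step S406
                and curvature_per_m < THIRD_THRESHOLD_PER_M)    # step S407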
At step S408, the controller 180 determines whether the work vehicle 100 has reached a region between the two rows of trees in a next instance of the inter-row travel mode or not. Based on the position of the work vehicle 100 as estimated from the sensor data, relative to the two detected rows of trees, the controller 180 can determine whether the work vehicle 100 has reached a region between the two rows of trees or not. If the work vehicle 100 has not reached a region between the two rows of trees, control returns to step S401, and the aforementioned operation is performed again. If the work vehicle 100 has reached a region between the two rows of trees, control proceeds to step S409.
At step S409, the controller 180 stops the turning travel mode, and halts the work vehicle 100. This is because, if the amount of positional deviation or the amount of directional deviation of the work vehicle 100 with respect to the target path is large, or if the curvature of the target path is large, performing inter-row travel may result in contact between the work vehicle 100 and a tree. Note that, if it is determined at step S407 that the curvature of the target path is equal to or greater than the third threshold, control may immediately proceed to step S409 to halt the work vehicle 100, without going through step S408. The reason is that the amount of positional deviation and the amount of directional deviation can be reduced as the work vehicle 100 approaches the two rows of trees, whereas the curvature of the target path often does not decrease even if the work vehicle 100 approaches the two rows of trees.
At step S410, the controller 180 transitions to the inter-row travel mode. The controller 180 ends its control of turning travel using the aforementioned turning coordinate system, and restarts a control of inter-row travel as described above.
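The overall branching of steps S408 through S410 might be summarized as in the following sketch, under the same illustrative assumptions as above; the action names are placeholders introduced here.

    # Illustrative sketch of the fall-through logic of steps S408 to S410.
    def next_action(start_condition_ok: bool,
                    reached_region_between_rows: bool) -> str:
        """Decide what follows the determinations of steps S405 to S407."""
        if start_condition_ok:
            return "SWITCH_TO_INTER_ROW_MODE"   # step S410
        if reached_region_between_rows:
            return "HALT_VEHICLE"               # step S409
        return "REPEAT_SENSING"                 # back to step S401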
Note that the order of steps S405, S406, and S407 may be changed, without being limited to the order shown in the figure.
Furthermore, in the present example embodiment, it is only when the curvature of the target path is determined to be smaller than the third threshold at step S407 that the controller 180 switches from the turning travel mode to the inter-row travel mode.
Note that the determination of step S407 may be omitted if it is previously known that the curvature of the target path is sufficiently small, as in a case where each row of trees includes a plurality of trees that are in a linear arrangement. In other words, without calculating a curvature of the target path, the controller 180 may switch from the turning travel mode to the inter-row travel mode when the two conditions are satisfied, i.e., the first condition that the amount of positional deviation is smaller than the first threshold and the second condition that the amount of directional deviation is smaller than the second threshold.
The controller 180 may switch from the turning travel mode to the inter-row travel mode when, in addition to the aforementioned first condition and second condition being met, the fourth condition that the width of the implement 300 is smaller than an interval between the two crop rows in a next instance of the inter-row travel mode is satisfied. By imposing the fourth condition, it becomes possible to reduce the possibility that the implement 300 may come in contact with the crops. Furthermore, the controller 180 may switch from the turning travel mode to the inter-row travel mode when, in addition to the first condition, second condition, and fourth condition being met, the third condition that a curvature of the target path in a next instance of the inter-row travel mode is smaller than the third threshold is satisfied, as in step S407 described above.
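For illustration, the composition of the start condition from the first and second conditions plus the optional third and fourth conditions might be expressed as follows; the parameter names, and the use of None to disable a condition, are assumptions made for illustration.

    # Illustrative sketch of composing the start condition. The first and
    # second conditions are always checked; the third and fourth may be
    # enabled or omitted, as described above.
    def conditions_for_start(pos_dev, dir_dev, first_thr, second_thr,
                             curvature=None, third_thr=None,
                             implement_width=None, row_interval=None):
        checks = [pos_dev < first_thr,                         # first condition
                  dir_dev < second_thr]                        # second condition
        if curvature is not None and third_thr is not None:
            checks.append(curvature < third_thr)               # third condition
        if implement_width is not None and row_interval is not None:
            checks.append(implement_width < row_interval)      # fourth condition
        return all(checks)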
At step S409, instead of halting the work vehicle 100, the controller 180 may cause the work vehicle 100 to travel backward, and thereafter restart a control of self-traveling of the work vehicle 100 in the turning travel mode. The reason is that, in some cases, causing the work vehicle 100 to travel backward and moving it again closer to the two rows of trees may help the position and orientation of the work vehicle 100 to be adjusted, such that the condition for starting inter-row travel becomes satisfied.
Through the above operation, the work vehicle 100 can automatically perform travel between rows of trees and turns. According to the present example embodiment, when the end of a row of trees is detected, the turning coordinate system and the target point are set, and travel control along the turning path is performed based on the turning coordinate system. Furthermore, upon confirming that any deviation in the position and direction of the work vehicle 100 with respect to the target path in a next instance of inter-row travel is sufficiently small, turning travel is switched to inter-row travel. This makes it possible to smoothly perform a turn for changing the row to be traveled, and to smoothly switch between the turning travel mode and the inter-row travel mode. Even in an environment in which GNSS-based positioning is difficult, e.g., an orchard or a forest, or in an environment where localization based on a matching between the sensor data and a previously generated environment map is difficult, it becomes possible to smoothly perform self-traveling between rows of trees.
Although the above example illustrates modifying the target point P, the turning path 46 may instead be modified to avoid contact with another row of trees, without modifying the target point P. For example, after setting the turning path, if any other row of trees is detected based on the sensor data, the controller 180 may modify the turning path in accordance with a positional relationship between the other row of trees and the turning path.
In the above example embodiments, the one or more exterior sensors included in the work vehicle are LiDAR sensors that output two-dimensional or three-dimensional point cloud data as sensor data through scanning with a laser beam. However, the exterior sensors are not limited to such LiDAR sensors. For example, other types of sensors, such as flash-type LiDAR sensors or image sensors, may be used. Such other types of sensors may also be used in combination with scan-type LiDAR sensors.
Although in the above example embodiments the work vehicles perform self-traveling between rows of trees in an orchard, the work vehicles may be used for the purposes of self-traveling between crop rows other than rows of trees. For example, the techniques according to the present disclosure are applicable to work vehicles, such as tractors, that perform self-traveling among a plurality of crop rows in an agricultural field.
A device that performs processing needed for self-traveling of a work vehicle according to an example embodiment of the present disclosure may be mounted, as an add-on, to a work vehicle lacking such functionality. For example, a controller configured or programmed to control the operation of a work vehicle that travels among a plurality of crop rows may be attached to a work vehicle already in use.
The example embodiments and techniques according to the present disclosure are applicable to work vehicles, e.g., tractors movable in an environment in which a plurality of crop rows (e.g., rows of trees) exist, such as an orchard, an agricultural field, or a mountain forest.
While example embodiments of the present invention have been described above, it is to be understood that variations and modifications will be apparent to those skilled in the art without departing from the scope and spirit of the present invention. The scope of the present invention, therefore, is to be determined solely by the following claims.
Number | Date | Country | Kind
---|---|---|---
2022-103963 | Jun 2022 | JP | national
This application claims the benefit of priority to Japanese Patent Application No. 2022-103963 filed on Jun. 28, 2022 and is a Continuation Application of PCT Application No. PCT/JP2023/021410 filed on Jun. 8, 2023. The entire contents of each application are hereby incorporated herein by reference.
 | Number | Date | Country
---|---|---|---
Parent | PCT/JP2023/021410 | Jun 2023 | WO
Child | 18979713 |  | US