The present disclosure relates to a sensing device, a processing device, and a method of processing data.
Conventionally, a variety of devices have been proposed that scan a space with light, detect light reflected from a physical object, and measure the distance to the physical object. Distance information on a target scene may be converted into, for example, three-dimensional point cloud data and utilized. Typically, point cloud data is data in which the distribution of points in a scene where physical objects are present is expressed in three-dimensional coordinates.
Japanese Unexamined Patent Application Publication Nos. 2019-135446 and 2011-027457 disclose examples of ranging devices based on the FMCW (Frequency Modulated Continuous Wave) method. Ranging devices based on the FMCW method send out electromagnetic waves whose frequencies are modulated at a fixed cycle and measure a distance based on the difference between the frequencies of the transmitted waves and the reflected waves. When the electromagnetic waves are light such as visible light or infrared light, a ranging device of the FMCW method is called an FMCW LiDAR (Light Detection and Ranging). An FMCW LiDAR divides light whose frequency is modulated at a fixed cycle into output light and reference light, and detects interference light between the reference light and reflected light generated by the output light being reflected by a physical object. A distance to the physical object and a velocity of the physical object can be calculated based on the frequencies of the interference light. Japanese Unexamined Patent Application Publication Nos. 2019-135446 and 2011-027457 disclose performing ranging and velocity measurement using ranging devices based on the FMCW method.
One non-limiting and exemplary embodiment provides techniques of facilitating integration or utilization of data acquired by one or more sensing devices.
In one general aspect, the techniques disclosed here feature a sensing device including a light source that emits light with modulated frequencies; an interference optical system that separates the light emitted from the light source into reference light and output light and generates interference light between reflected light and the reference light, the reflected light being generated by the output light being reflected at a reflecting point of a physical object; a photodetector that receives the interference light and outputs a detection signal according to intensity of the interference light; and a processing circuit that processes the detection signal. The processing circuit selects a specific data format from a plurality of data formats that the processing circuit can generate based on the detection signal, and outputs output data including measurement data having the selected specific data format.
A general or specific aspect of the present disclosure may be implemented by a system, a device, a method, an integrated circuit, a computer program, or a recording medium such as a computer-readable recording disc, or by any combination of a system, a device, a method, an integrated circuit, a computer program, and a recording medium. The computer-readable recording medium may include a volatile recording medium, or may include a non-volatile recording medium such as a CD-ROM (Compact Disc-Read Only Memory). The device may include one or more devices. If the device includes two or more devices, the two or more devices may be located in one apparatus or may be separately located in two or more separate apparatuses. In this specification and Claims, a “device” may mean not only a single device but also a system including a plurality of devices.
According to one aspect of the present disclosure, it is possible to facilitate integration or utilization of data acquired by one or more sensing devices.
Additional benefits and advantages of the disclosed embodiments will become apparent from the specification and drawings. The benefits and/or advantages may be individually obtained by the various embodiments and features of the specification and drawings, which need not all be provided in order to obtain one or more of such benefits and/or advantages.
In the present disclosure, all or some of a circuit, a unit, a device, a member, or a section, or all or some of the functional blocks in a block diagram, may be implemented by one or more electronic circuits including, for example, a semiconductor device, a semiconductor integrated circuit (IC), or an LSI (large scale integration). An LSI or an IC may be integrated on a single chip or may be a combination of a plurality of chips. For example, functional blocks other than a storage element may be integrated on a single chip. Although the terms LSI and IC are used here, the name varies depending on the degree of integration, and the circuit may also be called a system LSI, a VLSI (very large scale integration), or a ULSI (ultra large scale integration). An FPGA (Field Programmable Gate Array) that is programmed after the LSI is manufactured, or an RLD (reconfigurable logic device) in which connection relationships inside the LSI can be reconfigured or circuit sections inside the LSI can be set up, can also be used for the same purpose.
Furthermore, functions or operations of all or some of the circuit, the unit, the device, the member, or the section can be performed through software processing. In this case, the software is recorded in one or more non-transitory recording media such as ROMs, optical discs, or hard disk drives. When the software is executed by a processing device (processor), the function specified by the software is performed by the processing device and peripheral devices. A system or a device may include the one or more non-transitory recording media having the software recorded therein, the processing device, and necessary hardware devices such as an interface.
Before describing embodiments of the present disclosure, a description is given of an example of a system to which a sensing device and a processing device according to an embodiment of the present disclosure are applicable.
The sensing device 100 mounted on the fixed body 400 includes one or more sensors. Each sensor may be, for example, a sensor that acquires data for ranging, such as an FMCW LiDAR including a light source and a light sensor. The sensing device 100 may include a plurality of sensors that are located at different positions and in different orientations. The sensing device 100 can sequentially generate and output measurement data including positional information and velocity information of a physical object present in the surrounding area. The measurement data may include, for example, data indicating a position of each point in a three-dimensional point cloud and data indicating a velocity at each point. In the following description, unless otherwise stated, the three-dimensional point cloud is simply referred to as a “point cloud”. Data including positional information of each point in the three-dimensional point cloud is referred to as “point cloud data”.
The sensing device 100 is not limited to a ranging device including a light source and a light sensor and may be a ranging device that performs ranging and velocity measurement with another method. For example, a ranging device that performs ranging using radio waves such as millimeter waves may also be used. Alternatively, instead of a device that performs ranging, a device that outputs measurement data including spectrum information of interference light between light emitted from a light source and reflected light reflected by a physical object may be used as the sensing device 100. In that case, the server 500 that has acquired the measurement data from the sensing device 100 can calculate the distance from the sensing device 100 to the physical object and the velocity of the physical object based on the spectrum information of the interference light.
The server 500 includes a processing device 530 and a storage device 540. The server 500 acquires, for example, data indicating the position and attitude of the sensing device 100 and the measurement data, from the sensing device 100 in each of the fixed bodies 400. The processing device 530 can integrate the measurement data acquired from each sensing device 100 to sequentially generate data indicating the road environment and record the data in the storage device 540. The processing device 530 can generate point cloud data that is expressed in a predetermined three-dimensional coordinate system, for example, and to which velocity information for each point has been added. When an accident occurs, for example, such data may be utilized for examining the cause of the accident. The coordinate system of the point cloud data generated by the server 500 may be a coordinate system specific to the server 500 or may match the coordinate system of three-dimensional map data utilized by the server 500. Alternatively, an administrator of the server 500, or the like, may operate the server 500 to specify the coordinate system of the point cloud data.
Note that although the sensing device 100 is provided in the fixed body 400 in each example described above, the sensing device 100 may be provided in the mobile object 300. The server 500 may similarly acquire measurement data from the sensing device provided in the mobile object 300 as well as from the sensing device 100 provided in the fixed body 400. Data to be transmitted from the sensing device provided in the mobile object 300 to the server 500 may include, for example, data indicating the velocity of the mobile object 300 itself, in addition to data indicating the position and the attitude of the sensing device and measurement data for calculating positions and velocities of surrounding physical objects. In that case, the processing device 530 of the server 500 can generate data indicating more details of the road environment based on the data acquired from the sensing device in each of the fixed body 400 and the mobile object 300.
The server 500 may store, in the storage device 540, the information indicating the position and the attitude of the sensing device 100 in each of the fixed bodies 400 in advance. In that case, the data acquired from each sensing device 100 by the server 500 does not have to include the information indicating the position and the attitude of the sensing device 100. Alternatively, the server 500 may estimate the position, the attitude, and velocity of each of the sensing devices 100 based on the measurement data acquired from the sensing device 100 in each of the fixed bodies 400 and the mobile objects 300.
Next, an example of a configuration of the system illustrated in
Each of the sensing devices 100 illustrated in
The sensing device 100 generates measurement data by subjecting data outputted from each of the sensors 200 to necessary processing such as coordinate transformation, and transmits the measurement data from the communication circuit 120 to the server 500. The sensing device 100 transmits a batch of data to the server 500, for example, at fixed time intervals.
In the present disclosure, a batch of data transmitted from the sensing device 100 may be referred to as a “frame”. This may or may not match a “frame” that is a unit of image data outputted from an image sensor. The sensing device 100 may repeatedly output, at a fixed frame rate, for example, point cloud data including information on the velocity at each point and the measurement time. It is also possible to associate one frame of point cloud data with one time and output the point cloud data.
In this manner, each of the sensors 200 utilizes the FMCW technology to acquire data necessary for ranging and velocity measurement of the target scene. Note that not every sensor 200 needs to be an FMCW LiDAR. Some of the sensors 200 may be radars using radio waves such as millimeter waves.
Note that the processing circuit 240 may output, as the sensor data, spectrum information of the detection signal, which is information from the preceding processing stage, without calculating the distance and the velocity of each reflecting point. The spectrum information may include, for example, information indicating the power spectrum of the detection signal or a peak frequency in the power spectrum of the detection signal. If the processing circuit 240 outputs the spectrum information of the detection signal, the calculation of the distance and the velocity is performed not by the sensor 200 but by, for example, the processing device 530 of the server 500.
The server 500 includes a communication device 550, in addition to the processing device 530 and the storage device 540. The processing device 530 sequentially acquires measurement data from each of the sensing devices 100 via the communication device 550 and records the measurement data in the storage device 540. The processing device 530 performs necessary processing such as time checking and coordinate transformation of the acquired measurement data. This allows the processing device 530 to generate point cloud data integrated at a specific time and at a specific location and velocity data of each point.
Each of the mobile objects 300 in the example of
Each of the sensing devices 100 repeatedly performs sensing and sequentially generates measurement data including information on the position of each reflecting point on a physical object surface in the scene, the velocity at each reflecting point, and the time. The measurement data is transmitted to the server 500. The processing device 530 of the server 500 performs the necessary processing, such as the time checking and the coordinate transformation, on the acquired measurement data and records the measurement data in the storage device 540. Such operations may be repeated at a fixed cycle, for example.
The server 500 may receive, from outside, a command requesting analysis of the road environment at a specific date and time and at a specific location. In that case, the processing device 530 of the server 500 acquires the data for the relevant date, time, and location from the storage device 540 and generates and outputs data corresponding to the request. Such operations allow for acquisition of data that helps, for example, in elucidating the cause of an accident.
Similarly to the example of
Each of the mobile objects 300 transmits its own positional data to the server 500, for example, at a fixed cycle or at necessary timing. When receiving the positional data of the mobile object 300, the server 500 determines whether or not the mobile object 300 is approaching a specific area such as an expressway junction. When recognizing that the mobile object 300 is approaching the specific area, the server 500 transmits point cloud data with velocity information for the specific area to the mobile object 300. The controller 320 of the mobile object 300 controls the drive device 330 based on the transmitted point cloud data. As a result, the mobile object 300 performs traveling control such as deceleration according to road conditions or avoidance of obstacles.
According to the system described above, point cloud data to which velocity information is attached can be generated for each reflecting point, based on the data acquired by each of the sensing devices 100. This makes it possible to generate traffic information including information on the traveling velocities, in addition to the positions, of physical objects such as other mobile objects present in the environment through which the mobile object 300 travels. Such traffic information makes it possible to confirm accident conditions in detail and to accurately notify a mobile object of other mobile objects approaching in dangerous areas and their surroundings, such as expressway junctions that are not easy to recognize visually.
In addition, although the server 500 and each of the sensing devices 100 communicate via the network 600 in each system described above, the present embodiment is not limited to such a configuration. For example, the server 500 and each of the sensing devices 100 may communicate via a dedicated communication line. The communication line may be wired or wireless. Alternatively, a processing device having functions similar to those of the above-mentioned server 500 and the sensing device 100 may be configured to be directly connected and communicate within one system.
In the system described above, the sensing devices 100 do not necessarily all have an identical configuration, and sensing devices 100 having different specifications and performance may coexist. For example, a plurality of sensing devices 100 manufactured by different manufacturers, or a plurality of sensing devices 100 manufactured by the same manufacturer but of different models, may coexist in one system. Such sensing devices 100 may have mutually different data output formats, or some models may be able to select from more than one output format. For example, there may be situations in which one sensing device 100 outputs measurement data including information on the position and the velocity of each reflecting point, while another sensing device 100 outputs measurement data including information on the position of each reflecting point and a spectrum of a detection signal. In such cases, it is difficult for a processing device such as the server 500 to integrate the data outputted from the plurality of sensing devices 100 and generate point cloud data for a specific time and area.
In order to solve the problem described above, each of the sensing devices may include, in the measurement data, identification information indicating the format of the measurement data that it outputs, and transmit the data to the processing device. The processing device may change the arithmetic processing applied to the measurement data on the basis of the identification information indicating the format of the measurement data. This can facilitate integration of sensor data having different data formats.
In a certain embodiment, the processing device may transmit a request signal specifying a format of measurement data to each of the sensing devices. The sensing devices that have received the request signal may generate measurement data in the format specified by the request signal and transmit the measurement data to the processing device. For example, the sensing devices can output the measurement data in a plurality of different formats and may be able to select the data format according to the request signal. Such a configuration allows the processing device to acquire necessary data from each of the sensing devices according to the circumstances. This facilitates the integration of the data acquired by the plurality of sensing devices, and makes it possible to realize detailed analysis of an environment surrounding the sensing devices, provision of appropriate information to mobile objects present in the environment, or the like.
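As a purely illustrative sketch (the present disclosure does not define a concrete wire format for the request signal), such a request might be expressed as follows in Python; every field name and format label is a hypothetical assumption, not part of the disclosure.

# Hypothetical request signal sent from the processing device to a sensing device.
# The field names and format labels below are illustrative assumptions.
request_signal = {
    "type": "data_request",
    "sensing_device_id": "device-042",                # hypothetical device identifier
    "requested_format": "RADIAL_VELOCITY_PER_POINT",  # e.g., per-point velocity components
}

A sensing device receiving such a signal would generate measurement data in the requested format and transmit it to the processing device, as described above.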
Hereinafter, an overview of the embodiments of the present disclosure will be described.
A sensing device according to one embodiment of the present disclosure includes a light source that emits frequency-modulated light; an interference optical system that separates the light emitted from the light source into reference light and output light and generates interference light between the reference light and reflected light generated by the output light being reflected at a reflecting point of a physical object; a photodetector that receives the interference light and outputs a detection signal according to intensity of the interference light; and a processing circuit that processes the detection signal. The processing circuit selects a specific data format from a plurality of data formats that can be outputted by the processing circuit, generates measurement data having the selected specific data format based on the detection signal, and outputs output data including the measurement data.
According to the configuration described above, a specific data format can be selected from a plurality of data formats, and output data including measurement data having the specific data format can be outputted. Therefore, a data format to be outputted can be changed flexibly in accordance with, for example, the data format requested by another device or an output data format of other sensing data. This can facilitate the integration or utilization of data outputted from the sensing device and other sensing devices.
The processing circuit may be a collection of a plurality of processing circuits. Functions of the above processing circuit can be implemented in collaboration with, for example, the processing circuit 240 in each sensor 200 illustrated in
In the present disclosure, “measurement data” may be data that is generated based on a detection signal and that includes information on the positions or the velocity of one or more reflecting points or physical objects. “Output data” may be, for example, data to be transmitted to another device such as a storage device or a server. In addition to the measurement data, the output data may include various types of data used in processing performed by other devices. For example, the output data may include an identification number of the sensing device, information indicating the position and the orientation of the sensing device, and identification information indicating the data format of the measurement data, or the like.
The processing circuit may generate positional information of the reflecting point based on the detection signal, and generate the measurement data including the positional information. The processing circuit may generate, for example, point cloud data including the positional information of a plurality of the reflecting points as the measurement data.
The processing circuit may generate velocity information of the reflecting point based on the detection signal, and generate the measurement data including the velocity information. The velocity information may be information indicating, for example, a relative velocity vector of the reflecting point with respect to the sensing device or a component of the relative velocity vector in a direction along a straight line connecting the sensing device and the reflecting point.
The processing circuit may generate spectrum information of the interference light based on the detection signal and generate the measurement data including the spectrum information. This makes it possible to output output data including the spectrum information of interference light. The spectrum information may include, for example, information on a power spectrum of the detection signal or a peak frequency of the power spectrum. Another device that has acquired the information can generate velocity information of the reflecting point based on the information.
The processing circuit may generate the positional information and the velocity information of the reflecting point based on the detection signal; generate information indicating a degree of danger of the physical object based on the velocity information; and generate the measurement data including the positional information and the information indicating the degree of danger. This allows output data including the positional information and the information indicating the degree of danger of the reflecting point to be outputted.
The processing circuit may generate positional information and velocity information of each of a plurality of reflecting points irradiated with the output light; divide the plurality of reflecting points into one or more clusters based on the positional information and determine one velocity vector for each cluster based on the velocity information of three or more reflecting points included in the cluster; and generate the measurement data including information indicating the velocity vector of each cluster. This allows output data including the information on the velocity vector of each cluster to be outputted.
The processing circuit may include the identification information indicating the specific data format in the output data and output the output data. This makes it possible for another device that has acquired the output data to recognize the data format of the output data based on the identification information and perform arithmetic processing according to the data format.
The processing circuit may select the specific data format from the plurality of data formats according to a request signal inputted from another device. This makes it possible to generate measurement data having the specific data format requested by the other device.
The sensing device may further include a communication circuit that transmits the output data to the other device. This makes it possible to transmit the output data to the other device such as a processing device (server, for example) connected to the sensing device via, for example, a network or a line within the system.
A method according to another embodiment of the present disclosure is a method of processing output data outputted from one or more sensing devices. The one or more sensing devices each include a light source that emits frequency-modulated light; an interference optical system that separates the light emitted from the light source into reference light and output light and generates interference light between reflected light and the reference light, the reflected light being generated by the output light being reflected at a reflecting point of a physical object; a photodetector that receives the interference light and outputs a detection signal according to intensity of the interference light; and a processing circuit that generates the measurement data based on the detection signal. The method includes obtaining output data including the measurement data; discriminating a data format of the measurement data; and generating positional information of the physical object by applying arithmetic processing according to the discriminated data format to the measurement data.
According to the method described above, the positional information of the physical object can be generated through the arithmetic processing according to the data format of the measurement data in the output data acquired from the sensing device. Consequently, even if measurement data in a different data format is acquired from a plurality of sensing devices, for example, integration of the measurement data can be facilitated by the arithmetic processing according to the data format.
The method may further include generating velocity information of the physical object by applying the arithmetic processing according to the discriminated data format to the measurement data.
When the data format of the measurement data is a data format that includes velocity component information indicating the component of the relative velocity vector in the direction along the straight line connecting the sensing device and the reflecting point, the method may further include generating velocity vector information of the physical object based on the velocity component information.
The measurement data may include positional information and velocity information of each of a plurality of reflecting points irradiated with the output light. The method may include dividing the plurality of reflecting points into one or more clusters based on the positional information, determining one velocity vector for each cluster based on the velocity information of three or more reflecting points included in each cluster, and outputting information on the velocity vector of each cluster as velocity vector information of the physical object.
The one or more sensing devices may be a plurality of sensing devices. The output data acquired from each of the plurality of sensing devices may include information indicating positions and orientations of the sensing devices. The positional information of the physical object may be generated based on the information indicating the positions and the orientations of the plurality of sensing devices.
When the data format of the measurement data is a data format including spectrum information of the detection signal, the method may include generating positional information of the physical object and velocity vector information of the physical object based on the spectrum information.
The spectrum information may include, for example, information on a power spectrum of the detection signal or the peak frequency of the power spectrum.
When the data format of the measurement data is a data format that includes power spectrum information indicating the power spectrum of the detection signal or a data format that includes peak frequency information indicating the peak frequency of the power spectrum of the detection signal, the method may include generating positional information of the physical object and velocity vector information of the physical object, based on the power spectrum information or the peak frequency information.
The method may further include transmitting a request signal specifying the data format of the measurement data to the one or more sensing devices.
The one or more sensing devices may be mounted on a mobile object. The request signal may be transmitted to the one or more sensing devices when an abnormality is detected in the mobile object itself or in the environment in which the mobile object travels.
The output data may include identification information indicating the data format of the measurement data. The discrimination of the data format may be performed based on the identification information.
The method may further include outputting a signal for controlling operations of a mobile object based on the positional information of the physical object.
A processing device according to yet another embodiment of the present disclosure includes a processor and a memory in which a computer program executed by the processor is stored. The processor performs obtaining output data including measurement data, from one or more sensing devices including a light source that emits frequency-modulated light; an interference optical system that separates the light emitted from the light source into reference light and output light and generates interference light between reflected light and the reference light, the reflected light being generated by the output light being reflected at a reflecting point of a physical object; a photodetector that receives the interference light and outputs a detection signal according to intensity of the interference light; and a processing circuit that generates the measurement data based on the detection signal; discriminating a data format of the measurement data; and generating positional information of the physical object by applying arithmetic processing according to the discriminated data format to the measurement data.
Hereinafter, exemplary embodiments of the present disclosure will be described specifically. Note that each of the embodiments described below represents a general or specific example. Numeric values, shapes, components, arrangement positions and connection forms of components, steps, the order of steps, and the like illustrated in the embodiments below are merely examples and are not intended to limit the present disclosure. In addition, among the components in the following embodiments, a component that is not described in the independent claims representing the most generic concept will be described as an optional component. In addition, each figure is a schematic diagram and is not necessarily illustrated strictly. Furthermore, in the figures, the same or similar components are denoted by the same reference numerals. Overlapping descriptions may be omitted or simplified.
First, a first embodiment of the present disclosure will be described.
The sensing device according to the present embodiment is a ranging device that includes one or more sensors and a communication circuit. Each sensor has FMCW LiDAR functionality. Each sensor generates and outputs sensor data including information related to the positions and velocities of a plurality of reflecting points in a scene to be observed. The communication circuit transmits the sensor data outputted from each sensor to the server 500 illustrated in
The processing circuit 110 is, for example, a circuit including a processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit). The processing circuit 110 operates by executing a computer program stored in the storage device 130, for example. The processing circuit 110 acquires sensor data outputted from the one or more sensors 200 included in the sensing device 100, and converts positional information and velocity information of a plurality of points into output data in a predefined data format for communication, the positional information and the velocity information being measured during a predefined time section.
The communication circuit 120 is a communication module that performs data transmission and reception. The communication circuit 120 transmits output data generated by the processing circuit 110 to the server 500. The communication circuit 120 may be configured to receive a data request signal specifying a specific data format from an external device such as the server 500. In that case, the processing circuit 110 generates output data in the specified data format.
The storage device 130 includes any one or more storage media, such as a semiconductor memory, a magnetic storage medium, or an optical storage medium. The storage device 130 stores data generated by the processing circuit 110 and a computer program executed by the processing circuit 110. The storage device 130 also stores information related to the position and attitude of the sensing device 100, such as fixed information indicating the position and attitude of each of the sensors 200 included in the sensing device 100. The storage device 130 further stores the sensor data that the processing circuit 110 acquires from each of the sensors 200 in the course of its processing.
The sensor 200 illustrated in
The photodetector 230 receives the interference light and generates and outputs an electric signal according to intensity of the interference light. The electric signal is referred to as a “detection signal”. The photodetector 230 includes one or more light receiving elements. A light receiving element includes, for example, a photoelectric converter such as a photodiode. The photodetector 230 may be a sensor such as an image sensor in which a plurality of light receiving elements is two-dimensionally arranged.
The processing circuit 240 is an electronic circuit that controls the light source 210 and performs processing based on the detection signal outputted from the photodetector 230. The processing circuit 240 may include a control circuit that controls the light source 210 and a signal processing circuit that performs signal processing based on the detection signal. The processing circuit 240 may be configured as a single circuit or may be a collection of a plurality of separate circuits. The processing circuit 240 transmits a control signal to the light source 210. The control signal causes the light source 210 to periodically change the frequency of the emitted light within a predetermined range. In other words, the control signal is a signal to sweep the frequency of the light emitted from the light source 210.
The light source 210 in this example includes a drive circuit 211 and a light emitting element 212. The drive circuit 211 receives the control signal outputted from the processing circuit 240, generates a drive current signal according to the control signal, and inputs the drive current signal to the light emitting element 212. The light emitting element 212 may be, for example, an element, such as a semiconductor laser element, that emits highly coherent laser light. The light emitting element 212 emits frequency-modulated laser light in response to the drive current signal.
The frequency of the laser light emitted from the light emitting element 212 is modulated at a fixed cycle. The frequency modulation cycle may be, for example, greater than or equal to 1 microsecond (μs) and less than or equal to 10 milliseconds (ms). The frequency modulation amplitude may be, for example, greater than or equal to 100 MHz and less than or equal to 1 THz. The wavelength of the laser light may be included in, for example, a near-infrared wavelength band of greater than or equal to 700 nm and less than or equal to 2000 nm. In sunlight, the amount of near-infrared light is less than that of visible light. Therefore, using near-infrared light as the laser light can reduce the influence of sunlight. Depending on the application, the wavelength of the laser light may be included in the visible wavelength band of greater than or equal to 400 nm and less than or equal to 700 nm, or in an ultraviolet wavelength band.
The control signal inputted from the processing circuit 240 to the drive circuit 211 is a signal whose voltage fluctuates at a predetermined cycle and with a predetermined amplitude. The voltage of the control signal may be modulated like a triangular wave or a sawtooth wave, for example. A control signal whose voltage repeatedly changes linearly, like a triangular wave or a sawtooth wave, makes it possible to sweep the frequency of the light emitted from the light emitting element 212 in a nearly linear form.
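A minimal sketch of such a triangular-wave control signal follows, written in Python with NumPy; the sample count and the normalization of the voltage to [0, 1] are illustrative assumptions.

import numpy as np

def triangular_sweep(n_samples):
    # One modulation period of a triangular wave, normalized to [0, 1].
    # The voltage rises linearly from 0 to 1 and then falls back to 0,
    # so the optical frequency is swept up and then down nearly linearly.
    t = np.linspace(0.0, 1.0, n_samples, endpoint=False)
    return 1.0 - np.abs(2.0 * t - 1.0)

The drive circuit 211 would scale such a waveform to the voltage amplitude required for the light emitting element 212 to sweep over the modulation width Δf.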
The interference optical system 220 in the example illustrated in
The interference optical system 220 is not limited to the configuration illustrated in
The sensor 200 may further include an optical deflector that changes the direction of emitted light.
In the example of
1-2. Measurement Data Outputted from Sensing Device
Next, a description will be given of ranging and velocity measurement by the FMCW LiDAR used in the present embodiment. Ranging and velocity measurement by the FMCW LiDAR method are carried out by analyzing the frequency of the interference light generated by the interference between the frequency-modulated reference light and the reflected light.
Here, the speed of light is c, the modulation frequency of the emitted light is fFMCW, the width of the frequency modulation of the emitted light (that is, the difference between the highest frequency and the lowest frequency) is Δf, the beat frequency is fb (=fup=fdown), and the distance from the sensor 200 to the object is d. The modulation frequency fFMCW is the inverse of the cycle of the frequency modulation of the emitted light. The distance d can be calculated based on the following expression (1):
d = c × fb/(Δf × fFMCW) × (1/4)  (1).
Here, the component, in the direction toward the sensor 200, of the relative velocity of the object with respect to the sensor 200 is vc, the wavelength of the emitted light is λ, and the amount of frequency shift due to the Doppler effect is fd. The amount of the frequency shift fd is expressed as fd = (fdown − fup)/2. In this case, the velocity component vc can be calculated based on the following expression (2):
vc = fdλ/2 = (fdown − fup)λ/4  (2).
When vc is positive, the object is moving in the direction approaching the sensor 200. Conversely, when vc is negative, the object is moving in the direction away from the sensor 200.
In this manner, the Doppler shift occurs only with respect to the direction along the line of sight of the reflected light; that is, the Doppler shift is caused by the velocity component in the direction from the object toward the sensor 200. Therefore, the processing circuit 240 can determine, by the above calculation based on the detection signal, the component of the relative velocity of the object with respect to the sensor 200 in the direction toward the sensor 200.
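The calculations of expressions (1) and (2) can be summarized in the following sketch, assuming that the beat frequencies fup and fdown of the up-chirp and down-chirp sections have already been extracted from the power spectrum of the detection signal. Taking fb as the average of fup and fdown for a moving object is an assumption here; the expressions above define fb for the case fup = fdown.

C = 299_792_458.0  # speed of light [m/s]

def fmcw_range_and_radial_velocity(f_up, f_down, f_fmcw, delta_f, wavelength):
    # f_up, f_down: beat frequencies of the up/down chirp sections [Hz]
    # f_fmcw: modulation frequency [Hz]; delta_f: modulation width [Hz]
    # wavelength: wavelength of the emitted light [m]
    f_b = 0.5 * (f_up + f_down)               # range-induced beat frequency
    d = C * f_b / (delta_f * f_fmcw) / 4.0    # expression (1)
    v_c = (f_down - f_up) * wavelength / 4.0  # expression (2); v_c > 0 means approaching
    return d, v_c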
The sensing device 100 of the present embodiment collectively outputs, as a frame, data including the positional information and the velocity information of the plurality of reflecting points, the data being outputted from the plurality of sensors 200 during a time section of a specific length. A reflecting point is a point at which light from the light source 210 is reflected during the time section and is also referred to herein as a “data point”. In the present embodiment, the sensing device 100 generates and outputs output data including information related to the position and the velocity of each reflecting point, expressed in a coordinate system with a reference position of the sensing device 100 as the origin. As the information related to the velocity of each reflecting point, the sensing device 100 generates output data including information indicating either the true relative velocity vector v at that point or the velocity component vc in the direction along the straight line connecting that point and the origin. Whether the sensing device 100 outputs the velocity information in the format of the true relative velocity vector v or of the velocity component vc varies depending on the model or the setting. The sensing device 100 may output the velocity information in a format specified by a data request signal from an external device such as the server 500. In this manner, the processing circuit 110 in the sensing device 100 can select a specific data format from a plurality of data formats that the processing circuit 110 can output, and generate measurement data including the velocity information expressed in the selected specific data format. The processing circuit 110 generates output data including the measurement data and identification information indicating the selected specific data format. The communication circuit 120 transmits the output data to another device such as the server 500.
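As a sketch of what one frame of such output data could contain (the field names below are assumptions chosen for illustration, not a format defined by the present embodiment):

from dataclasses import dataclass, field

@dataclass
class OutputFrame:
    # Hypothetical container for one frame of output data.
    sensing_device_id: str                      # identification number of the device
    device_position: tuple                      # position of the sensing device
    device_attitude: tuple                      # attitude of the sensing device
    format_id: str                              # identification information for the data format
    timestamp: float                            # measurement time of the frame
    points: list = field(default_factory=list)  # per-point position and velocity data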
In the example illustrated in
As described above, the processing circuit 110 may output, as the velocity information of each point, the value of the velocity component measured at that point, or information on the relative velocity vector of that point with respect to the sensing device 100. When outputting the information on the relative velocity vector of each point, the processing circuit 110 can calculate the relative velocity vector based on the values of the velocity components measured at a plurality of points. Instead of outputting the information on the relative velocity vector of each point, information on the traveling velocity of the object at the position of each point (that is, the relative velocity of the object with respect to the sensing device 100) may be outputted. In that case, the velocities of the points on the same object are all the same. Rather than attaching velocity information to every point in the point cloud data, the points having the same velocity information can be grouped into one cluster, and only one piece of velocity information can be written for each cluster. Grouping points into clusters can reduce the volume of data to be transmitted.
Note that in the example of
Similarly to the example of
Like the example illustrated in
Next, a specific example of operations of the sensing device 100 will be described.
The sensing device 100 starts the operation when receiving input of a start signal from input means not illustrated.
(Step S1100) The processing circuit 110 determines whether or not an end signal has been inputted from the input means. If the end signal has been inputted, the sensing device 100 ends its operation. If no end signal has been inputted, processing advances to step S1200.
(Step S1200) The processing circuit 110 determines whether or not a frame period that has been predefined as time to acquire one frame of data has ended. If the frame period has ended, processing advances to step S1500. If the frame period has not ended, processing advances to step S1300.
(Step S1300) The processing circuit 110 determines whether or not it has acquired data from any of the one or more sensors 200. When the processing circuit 110 has acquired data from any of the sensors 200, processing advances to step S1400. When the processing circuit 110 has not acquired data from any of the sensors 200, processing returns to step S1200.
(Step S1400) The processing circuit 110 causes the storage device 130 to store sensor data acquired from the sensor 200, information on the sensor 200 that has generated the sensor data, and data acquisition time. The sensor data may include, for example, information indicating the position of the reflecting point in the coordinate system of the sensing device 100, and information indicating the component of the relative velocity vector of the reflecting point with respect to the sensor 200 along the straight line connecting the sensor 200 and the reflecting point. After step S1400, processing returns to step S1200.
The sensing device 100 can store information on the position and the relative velocity component of the point cloud measured by the one or more sensors 200 for every fixed frame period (1/30 second, for example), by repeating steps S1200 to S1400.
(Step S1500) When the acquisition of the one frame of the data ends, the processing circuit 110 performs clustering of the point cloud based on the positional information of each point in the point cloud that is measured in the frame recorded in the storage device 130. For example, the processing circuit 110 classifies the point cloud into one or more clusters, with a plurality of points located relatively close to each other as one cluster. The processing circuit 110 sets a cluster ID for each generated cluster. As illustrated in
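The present embodiment does not prescribe a particular clustering algorithm for step S1500. As one minimal sketch, a density-based clusterer such as DBSCAN groups points located relatively close to each other; the distance threshold eps and the minimum cluster size are assumed tuning parameters.

import numpy as np
from sklearn.cluster import DBSCAN

def cluster_point_cloud(xyz, eps=0.5, min_samples=5):
    # xyz: (N, 3) array of point positions in the sensing device's coordinate system.
    # Returns one cluster ID per point; -1 marks points assigned to no cluster.
    return DBSCAN(eps=eps, min_samples=min_samples).fit_predict(np.asarray(xyz))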
(Step S1600) The processing circuit 110 determines whether or not processing in steps S1700 and S1800 has been completed for all clusters generated in step S1500. If the processing has been completed for all the clusters, processing advances to step S1900. If there are unprocessed clusters, processing advances to step S1700.
(Step S1700) The processing circuit 110 selects one cluster from clusters for which calculation of the relative velocity vector has not yet been completed, among the clusters generated in step S1500.
(Step S1800) The processing circuit 110 calculates a relative velocity vector common to all the points in the cluster, based on information on the relative velocity components of the plurality of points included in the cluster selected in step S1700.
The relative velocity vector common to all the points in the cluster may be calculated based on, for example, the velocity component vectors measured at three or more data points in the same cluster. Similarly to the example illustrated in
|vc|² = |v·vc|  (3)
The processing circuit 110 can estimate the relative velocity vector v common to the data points in the cluster by applying the expression (3) above to the velocity component vectors vc at the three or more data points. Details of step S1800 will be described below.
The processing circuit 110 can calculate the relative velocity vector v for all the clusters in the point cloud data measured within the frame period, by repeating steps S1600 to S1800. The relative velocity vector v represents a relative velocity of an object corresponding to the cluster, with respect to the sensing device 100. When calculating the relative velocity vector v of the cluster, the processing circuit 110 causes the storage device 130 to store the information.
(Step S1900) When the relative velocity vectors have been determined for all the clusters, the processing circuit 110 generates output data for the frame. The processing circuit 110 generates the output data in the format exemplified in
If the format exemplified in
If the format exemplified in
(Step S2000) The processing circuit 110 outputs one frame of the output data generated in step S1900 to the communication circuit 120. The communication circuit 120 transmits the output data to the server 500 via the network 600. After the data is transmitted, processing returns to step S1100.
Through the operations from step S1100 to step S2000, the sensing device 100 can generate data for communication and transmit it to the server 500, the data for communication including information on the position, time, and velocity of the point cloud measured within the time of the frame. By repeating the operations described above, the sensing device 100 can transmit the measurement data including the positional information and the velocity information of each point to the server 500, for each frame.
1-3-1. Operation to Estimate Velocity Vector
Next, details of an operation of estimating the relative velocity vector common to all the points in the cluster in step S1800 will be described.
(Step S1810) For the cluster selected in step S1700, the processing circuit 110 selects three points whose velocity component magnitudes are not 0 from among the data points in the cluster. The three points may be selected based on, for example, their distances from the center of gravity of the cluster.
(Step S1820) The processing circuit 110 calculates the common relative velocity vector v from the velocity component vectors of the three data points selected in step S1810, based on the above expression (3). As illustrated in
|vc1|² = |v·vc1|
|vc2|² = |v·vc2|
|vc3|² = |v·vc3|
The processing circuit 110 can determine the common relative velocity vector v by solving the above simultaneous equations using the known velocity component vectors vc1, vc2, and vc3.
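For the true velocity vector, the inner product v·vci is nonnegative, so each equation reduces to v·vci = |vci|², which is linear in v. The three equations therefore form a 3 × 3 linear system, as in the following sketch (assuming the three component vectors are nonzero and point in linearly independent directions, which the selection in step S1810 is meant to ensure):

import numpy as np

def common_velocity_from_three(vc1, vc2, vc3):
    # Solve v . vc_i = |vc_i|^2 (i = 1, 2, 3) for the common relative velocity vector v.
    A = np.stack([vc1, vc2, vc3])    # rows: the measured velocity component vectors
    b = np.einsum('ij,ij->i', A, A)  # |vc_i|^2 for each row
    return np.linalg.solve(A, b)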
Note that three points are selected in this example, but four or more points may be selected. When four or more points are selected, the vector estimated as the common relative velocity vector in the cluster is not determined uniquely. In that case, an average of the vectors estimated from combinations of three points, or a representative value derived by a method other than averaging, may be adopted as the common relative velocity vector.
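As an illustrative alternative, not a method prescribed by the present embodiment, the overdetermined system for four or more points can also be solved in a least-squares sense:

import numpy as np

def common_velocity_least_squares(vcs):
    # vcs: (N, 3) array of velocity component vectors, N >= 3.
    A = np.asarray(vcs, dtype=float)
    b = np.einsum('ij,ij->i', A, A)  # |vc_i|^2 for each row
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v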
The processing circuit 110 may perform operations illustrated in
(Step S1830) The processing circuit 110 divides the cluster selected in step S1700 into a plurality of regions. As illustrated in
(Step S1840) The processing circuit 110 determines whether or not the processing from steps S1850 to S1870 has been completed for all the regions divided in step S1830. If any region remains unprocessed, processing advances to step S1850. If the processing has been completed for all the regions, processing advances to step S1880.
(Step S1850) The processing circuit 110 selects one region that has not been processed yet from among the regions divided in step S1830.
(Step S1860) The processing circuit 110 selects three data points whose velocity component magnitudes are not 0 from among the data points within the region selected in step S1850.
(Step S1870) The processing circuit 110 calculates a relative velocity vector common to the three points by solving the above simultaneous equations using the known velocity component vectors vc1, vc2, and vc3 of the three points selected in step S1860.
By repeating steps S1840 to S1870, a relative velocity vector can be estimated without bias with respect to the data points distributed over the entire cluster.
(Step S1880) When the estimation processing of the relative velocity vector has been completed for all the regions, the processing circuit 110 generates an average vector of the relative velocity vectors in the respective regions estimated in steps S1840 to S1870, as a representative relative velocity vector of the entire cluster.
Note that in the example illustrated in
In the example illustrated in
On the other hand, in the format illustrated in
Hereinafter, a description will be given of an example of operations of the sensing device 100 when the velocity vector component along the straight line connecting the sensor 200 and the data point is transmitted in the format illustrated in
In step S1910, the processing circuit 110 generates output data for that frame. The processing circuit 110 generates the output data in the format exemplified in
As described above, the sensing device 100 in the present embodiment can generate data to be transmitted to the server 500, for example, using the following method in (a) or (b).
(a) The processing circuit 110 performs clustering of a point cloud and associates each cluster with one physical object. Assuming that the points included in the same cluster have the same velocity, the processing circuit 110 calculates the relative velocity vector, with respect to the sensing device 100, of the physical object corresponding to each cluster, and generates point cloud data including positional information and relative velocity vector information, as exemplified in any of
(b) The processing circuit 110 generates point cloud data including the positional information of each point in the point cloud and information on the relative velocity component of each point, that is, the relative velocity component along the straight line connecting the coordinate origin of the sensor 200 and the point, the positional information and the relative velocity component information being measured by the sensor 200 as exemplified in
As such, the velocity information of each point included in the output data may be information indicating the actual relative velocity vector of the point, or information indicating the relative velocity component of the point in the direction along the straight line connecting the coordinate origin set in the sensing device 100 and that point. The processing circuit 110 may be configured to select between these two types of velocity data as the velocity information and generate the output data accordingly. The processing circuit 110 may also be configured to be able to select the format of the output data from the plurality of formats exemplified in
Next, a description of a configuration example of the server 500 will be given. As described with reference to
The input device 510 is a device that accepts input requesting detailed road condition information at specific time and in a specific space. The input device 510 may include, for example, a keyboard or voice character input means. The input device 510 may include a pointing device that allows a user to specify a specific point on a map.
The output device 520 is a device that outputs the detailed road condition information in response to a request for the detailed road condition information at a specific time and in a specific space, which is inputted using the input device 510. The road condition information may include information related to, for example, the arrangement of the fixed bodies 400 and the mobile objects 300 as well as the traveling velocity of each mobile object 300. The output device 520 may include a display, for example. The display displays a map of the road environment, for example, and draws the fixed bodies 400 and the mobile objects 300 on the map. Alternatively, the output device 520 may be a three-dimensional display or a hologram display device that three-dimensionally displays the fixed bodies and the mobile objects in a specific space.
The communication device 550 is a device that communicates with each of the sensing devices 100 via the network 600. Data received by the communication device 550 is transmitted to the processing device 530.
The processing device 530 is, for example, a device including one or more processors, such as a CPU or a GPU, and a memory. The memory stores a computer program to be executed by the processor. The processor of the processing device 530 generates positional information and velocity information of a physical object by acquiring, via the communication device 550, output data including measurement data from the one or more sensing devices 100, discriminating the data format of the measurement data, and applying arithmetic processing according to the discriminated data format to the measurement data. The processing device 530 performs processing such as coordinate transformation of the point cloud data included in the output data, conversion from a relative velocity to an absolute velocity of each point, and detailed time adjustment, and causes the storage device 540 to store the resulting information. In response to a request inputted from the input device 510 for detailed road condition information at a specific time and in a specific space, the processing device 530 also acquires data on the relevant time and area from the storage device 540 and transmits a signal instructing the output device 520 to output the data.
The storage device 540 is a device including one or more storage media such as semiconductor storage media (memory, for example), magnetic storage media, or optical storage media. The storage device 540 stores information on measurement time, a position, and a velocity of each point in the point cloud.
(Step S3100) The processing device 530 determines whether or not an end signal has been inputted from the input device 510. If the end signal has been inputted, the server 500 ends its operation. If no end signal has been inputted, processing advances to step S3200.
(Step S3200) The processing device 530 determines whether or not the communication device 550 has received data from the sensing device 100. If the data has been received, processing advances to step S3300. If no data has been received, step S3200 is repeated until the data is received. The operation of step S3200 may be performed for every time unit of the point cloud processing of the server 500. For example, processing of step S3200 may be performed at predefined fixed time intervals. The predefined fixed time intervals may be referred to as a processing frame of the server 500.
(Step S3300) The processing device 530 reads the format of velocity information included in the fixed value of data acquired in step S3200, and determines whether the velocity information included in the acquired data represents a relative velocity vector of a physical object with respect to the sensing device 100 or represents a relative velocity component vector in a direction along the straight line connecting the sensor 200 and the data point. When the velocity information represents the relative velocity vector of the physical object with respect to the sensing device 100, processing advances to step S3800. When the velocity information represents the relative velocity component vector in the direction along the straight line connecting the sensor 200 and the data point, processing advances to step S3400.
(Step S3400) The processing device 530 clusters data points of the acquired point cloud data based on the position or the distance from the sensing device 100, and classifies or groups the plurality of data points into one or more clusters.
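By way of a non-limiting illustration, the position-based clustering in step S3400 may be realized with any standard point cloud clustering technique. The following sketch uses density-based clustering (DBSCAN); the parameter values eps and min_samples are illustrative assumptions, not values prescribed by the present disclosure.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_points(xyz: np.ndarray, eps: float = 0.5, min_samples: int = 5) -> np.ndarray:
    """Group an (N, 3) array of data point positions into clusters.

    Points within eps of each other are chained into one cluster; the
    returned (N,) label array marks isolated points with -1.
    """
    return DBSCAN(eps=eps, min_samples=min_samples).fit(xyz).labels_
```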
(Step S3500) The processing device 530 determines whether or not the velocity vector estimation processing has been completed for all clusters generated in step S3400. If unprocessed clusters remain, processing advances to step S3600. If the velocity vector estimation processing has been completed for all clusters, processing advances to step S3800.
(Step S3600) The processing device 530 selects one cluster from clusters for which the velocity vector estimation has not yet been completed, among the clusters generated in step S3400.
(Step S3700) The processing device 530 estimates a relative velocity vector common to all points in the cluster, based on the relative velocity component vectors of the plurality of points in the cluster selected in step S3600. An estimation method is similar to the method in step S1800 illustrated in
By repeating steps S3500 to S3700, the relative velocity vector of each cluster can be estimated for all the clusters generated in step S3400.
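Although the estimation method of step S3700 is the one described with reference to the earlier figure, one common way to pose such a problem is as a linear least-squares fit: each measured relative velocity component constrains the projection of the unknown common velocity vector onto the corresponding line of sight. The sketch below is an illustration under that assumption, not a restatement of the disclosed method.

```python
import numpy as np

def estimate_cluster_velocity(points: np.ndarray,
                              radial_speeds: np.ndarray,
                              sensor_origin: np.ndarray) -> np.ndarray:
    """Estimate one velocity vector shared by all points of a cluster.

    Each radial speed s_i is modeled as u_i . v, where u_i is the unit
    vector from the sensor to point i. Stacking one equation per point
    gives an overdetermined system solved in the least-squares sense;
    the cluster should span at least three distinct line-of-sight
    directions for the solution to be well conditioned.
    """
    rays = points - sensor_origin                         # (N, 3)
    u = rays / np.linalg.norm(rays, axis=1, keepdims=True)
    v, *_ = np.linalg.lstsq(u, radial_speeds, rcond=None)
    return v                                              # common velocity (3,)
```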
(Step S3800) The processing device 530 transforms the position of each point in the data acquired in step S3200 and the relative velocity vector of the cluster to which each point belongs, into data expressed in the coordinate system of the server 500. The coordinate transformation may be performed based on information indicating the position and the direction of the sensing device 100 that is included as the fixed value in the data transmitted from each sensing device 100.
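As a minimal sketch of the coordinate transformation in step S3800, assuming the fixed values yield a rotation matrix R and a translation t of the sensing device 100 in the server coordinate system: positions are rotated and translated, while velocity vectors, being free vectors, are only rotated.

```python
import numpy as np

def to_server_frame(points: np.ndarray, velocities: np.ndarray,
                    R: np.ndarray, t: np.ndarray):
    """Transform (N, 3) positions and velocities from the sensing-device
    frame into the server frame.

    R is the 3x3 rotation (device orientation) and t the (3,) translation
    (device position), both derived from the fixed values transmitted by
    the sensing device.
    """
    points_srv = points @ R.T + t        # positions: rotate then translate
    velocities_srv = velocities @ R.T    # free vectors: rotate only
    return points_srv, velocities_srv
```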
(Step S3900) The processing device 530 causes the storage device 540 to store information on the position and the velocity of each point subjected to the coordinate transformation in step S3800. As illustrated in
By repeating the processing in step S3100 to step S3900, the server 500 can acquire the measurement data from the sensing device 100, and record the information on the position and the velocity for each point expressed in the coordinate system of the server 500, together with the detailed time information. In the present embodiment, as exemplified in
In the example described above, the server 500 processes the measurement data received from the one sensing device 100. If the system includes the plurality of sensing devices 100, the server 500 may receive measurement data in different formats from the plurality of sensing devices 100. Hereinafter, a description will be given of an example of operations of the server 500 in such a configuration.
(Step S4100) The processing device 530 determines whether or not an end signal has been inputted from the input device 510. If the end signal has been inputted, the server 500 ends its operation. If no end signal has been inputted, processing advances to step S4200.
(Step S4200) The processing device 530 determines whether or not the communication device 550 has received data from the sensing device 100. If the data has been received, processing advances to step S4300. If no data has been received, step S4200 is repeated until the data is received. The operation of step S4200 may be performed for every time unit of the point cloud processing of the server 500. For example, processing of step S4200 may be performed at predefined fixed time intervals.
(Step S4300) The processing device 530 transforms the information on the position and the velocity vector of each data point in the data acquired in step S4200 into data expressed in the coordinate system of the server 500. The velocity vector information may be expressed, for example, by coordinate values indicated by an end point of the velocity vector whose starting point is the position of the data point.
(Step S4400) The processing device 530 reads the format of the velocity information included in the fixed value of the data acquired in step S4200, and determines whether the velocity information included in the acquired data represents the relative velocity vector of the physical object with respect to the sensing device 100 or represents the relative velocity component vector in the direction along the straight line connecting the sensor 200 and the data point. If the velocity information represents the relative velocity vector of the physical object with respect to the sensing device 100, processing advances to step S5300. If the velocity information represents the relative velocity component vector in the direction along the straight line connecting the sensor 200 and the data point, processing advances to step S4500.
(Step S4500) The processing device 530 performs clustering processing on the point cloud subjected to the coordinate transformation in step S4300, by combining it with the data points in the point cloud recorded in the storage device 540 and acquired from another sensing device 100 in the same frame. Based on the position of each data point, the processing device 530 combines the point cloud subjected to the coordinate transformation in step S4300 with the point cloud recorded in the storage device 540 and acquired within the time of the frame, and divides them into one or more clusters.
(Step S4600) The processing device 530 determines whether or not the processing of estimating the velocity vector common to each cluster has been completed for all clusters in the point cloud generated in step S4500. If the common velocity vector has been estimated for all clusters, processing advances to step S5300. If, among the clusters generated in step S4500, there are clusters for which the estimation processing has not yet been performed, processing advances to step S4700.
(Step S4700) The processing device 530 selects one cluster from clusters for which the velocity vector common to the data points included in the cluster has not been calculated yet, among the clusters generated in step S4500.
(Step S4800) The processing device 530 determines whether or not there is any velocity vector information already calculated for the cluster selected in step S4700. The processing device 530 determines, for each data point included in the cluster, whether the velocity information corresponding to the data point represents the relative velocity component vector on the straight line connecting the sensor and the data point, or represents the velocity vector of the object at the data point. If the velocity information of every data point in the cluster represents the relative velocity component vector on the straight line connecting the sensor and the data point, that is, if the velocity vector of the cluster has not yet been estimated, processing advances to step S5100. If, for one or more data points, there is velocity vector information already estimated as the velocity vector common to the data points included in the cluster, processing advances to step S4900.
(Step S4900) The processing device 530 determines whether or not the velocity vectors already calculated for the cluster selected in step S4700 are inconsistent with the velocity vectors or the relative velocity component vectors corresponding to other data points in the cluster.
As a method of determining inconsistency, for example, it can be determined that there is inconsistency when a difference between the velocity vectors already calculated at the plurality of data points in the cluster is larger than or equal to a predefined reference. The vector difference may be, for example, a sum of absolute values of differences between respective three-dimensional coordinate values. Alternatively, presence or absence of inconsistency may be determined based on a vector difference calculated, for example, by assigning a large weight to a difference in vector orientations and a small weight to a difference in vector magnitudes. In addition, it may be determined that there is inconsistency when, for one or more data points in the cluster, the difference between the magnitude of the relative velocity component vector and the magnitude of the component of the already calculated velocity vector in the same direction as that relative velocity component vector is larger than or equal to a reference value. Such a method makes it possible to detect inconsistency, for example, when objects that are located close to each other and have different velocities are grouped as one cluster.
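A sketch of both tests is shown below; the weights and thresholds are illustrative assumptions, since the present disclosure requires only that some predefined reference be used.

```python
import numpy as np

def vectors_inconsistent(v1, v2, w_dir=0.7, w_mag=0.3, threshold=1.0):
    """Weighted inconsistency test between two velocity vectors already
    calculated at data points in the same cluster: the orientation
    difference (1 - cosine similarity) is weighted more heavily than
    the magnitude difference."""
    n1, n2 = np.linalg.norm(v1), np.linalg.norm(v2)
    if n1 == 0.0 or n2 == 0.0:
        return False
    dir_term = 1.0 - float(np.dot(v1, v2)) / (n1 * n2)
    mag_term = abs(n1 - n2)
    return w_dir * dir_term + w_mag * mag_term >= threshold

def radial_inconsistent(v_cluster, u_radial, radial_speed, reference=0.5):
    """Compare the already-calculated cluster velocity's component along
    the line of sight u_radial (a unit vector) with the directly measured
    relative velocity component magnitude."""
    return abs(float(np.dot(v_cluster, u_radial)) - radial_speed) >= reference
```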
If inconsistency is found between the velocity vectors corresponding to the data points in the cluster, processing advances to step S5000. If no inconsistency is found between the velocity vectors corresponding to the data points in the cluster, processing advances to step S5200.
(Step S5000) The processing device 530 divides the data points in the cluster into a plurality of groups based on data attributes or by clustering. The division may be performed, for example, based on the time associated with each data point. A plurality of data points that correspond to a plurality of objects and that do not overlap in spatial position at the same instant may nevertheless be measured in an overlapped state in the same space because of a time difference in ranging. In such a case, dividing the data points based on the detailed time of ranging resolves the situation in which objects with a plurality of different velocities overlap or are proximate to each other in the same space. Other possible methods of division include dividing the data points for each sensing device 100 that performed the ranging, or dividing the point cloud based on the format of the velocity information at the time of the processing of step S5000.
(Step S5100) The processing device 530 estimates a common velocity vector for each group or cluster of the divided point clouds. The method of estimation is similar to the method of step S3700 illustrated in
(Step S5200) If there is no inconsistency between the velocity vector already estimated for the selected cluster and other velocity information, the processing device 530 sets the already estimated velocity vector as the common velocity vector for the cluster. After the operation in step S5200, processing returns to step S4600.
By repeating the operation of step S4600 to step S5100 or step S5200, it is possible to divide the point cloud acquired during the frame period into clusters or groups, estimate a common velocity vector for each of the divided clusters or groups, and generate velocity vector information for each data point in the point cloud.
(Step S5300) The processing device 530 causes the storage device 540 to store the information on the position and the velocity of each point that has been subjected to the coordinate transformation. As illustrated in
By repeating the processing of step S4100 to step S5300, it is possible to transform the point cloud data acquired from each of the plurality of sensing devices 100 to the data expressed in the coordinate system set for the server 500, and associate and store the position, the detailed time, and the velocity vector of each data point.
In the example illustrated in
Next, a description will be given of an example of a case where the server 500 receives, from the sensing device 100, data including ID information indicating the format of the velocity information for each frame, as exemplified in
(Step S6100) When acquiring measurement data from the sensing device 100, the processing device 530 determines whether or not processing on each frame has been completed for all frames of the acquired measurement data. If there are unprocessed frames, processing advances to step S3300. If processing has been completed on all the frames, processing advances to step S3900.
The operations from step S3300 to step S3900 are similar to the operations illustrated in
By repeating step S6100 to step S3800, the processing device 530 can process the velocity information of each point frame by frame according to the format specified for each frame, convert it into point cloud data including information on the velocity vector expressed in the coordinate system of the server 500 for each data point, and record the point cloud data in association with the detailed time information.
(Step S6100) After the coordinate transformation processing in step S4300, the processing device 530 determines whether or not processing for each frame has been completed for all the frames. If there are unprocessed frames, processing advances to step S4400. If processing has been completed for all the frames, processing returns to step S4100.
The operations from step S4400 to step S5300 are similar to the operations illustrated in
Next, a description will be given of an example of the operation in which the server 500 outputs information on road conditions at specific time and in a specific space, in response to input of an information request.
(Step S7000) The processing device 530 determines whether or not an end signal has been inputted from the input device 510. If the end signal has been inputted, the server 500 ends its operation. If no end signal has been inputted, processing advances to step S7100.
(Step S7100) The processing device 530 determines whether or not an information request signal has been inputted from the input device 510. If the information request signal has been inputted, processing advances to step S7200. If no information request signal has been inputted, step S7100 is repeated until the information request signal is inputted.
The information request signal is a signal that specifies a specific time range and a specific spatial range. The input device 510 may determine the time range, for example, by setting a predefined fixed time width before and after a specific date and time specified by a user or the like. Alternatively, the time range may be determined according to start time and end time entered by the user or the like. The spatial range can be determined, for example, by the user inputting latitude and longitude or entering an address as character strings to specify a specific point, with an area surrounding that point set as the spatial range. Alternatively, the spatial range may be specified by the user designating an area on a map.
(Step S7200) Based on the time range and the spatial range indicated by the information request signal inputted in step S7100, the processing device 530 acquires, from the point cloud data recorded in the storage device 540, data on points whose measurement time falls within the inputted time range and whose position coordinates fall within the inputted spatial range.
(Step S7300) The processing device 530 generates display data for three-dimensionally displaying the point cloud based on the positional information of each point in the data acquired in step S7200. For example, the processing device 530 may generate display data that three-dimensionally represents the velocity vector of each point in the data acquired in step S7200, as a vector having a starting point at the position of each point. Such display data represents distribution of physical objects and motion thereof in the specified time range and spatial range.
(Step S7400) The processing device 530 outputs the display data generated in step S7300 to the output device 520 such as a display. The output device 520 displays an image indicating the three-dimensional distribution of the physical objects in a specific location based on the display data. When the specified time range is long, moving image data may be generated as display data. After the operation in step S7400, processing returns to step S7000.
In the example illustrated in
Next, a description will be given of an example of an operation in which the server 500 in the system illustrated in
(Step S8000) The processing device 530 determines whether or not an end signal has been inputted from the input device 510. If the end signal has been inputted, the server 500 ends its operation. If no end signal has been inputted, processing advances to step S8100.
(Step S8100) The processing device 530 determines whether or not the current time is predefined transmission time of the road information. If the current time is the transmission time of the road information, processing advances to step S7200. If the current time is not the predefined transmission time of the road information, step S8100 is repeated until the transmission time is reached.
The server 500 may transmit the road information, for example, at fixed time intervals. In the environment illustrated in
(Step S8200) Based on the preset time range and spatial range, the processing device 530 acquires, from the point cloud data recorded in the storage device 540, data on points whose measurement time falls within the time range and whose position coordinates fall within the spatial range. As the time range, for example, a range from 0.05 seconds before the current time until the current time may be set. As the spatial range, for example, an area including the main road and the confluent road within 100 m before the junction point may be set. The time range and the spatial range may be defined as the ranges necessary for avoiding danger, for example, by taking the road conditions, in particular the vehicle velocities and the road structure, into consideration.
(Step S8300) The processing device 530 converts the data acquired in step S8200 into output data in the data format of the point cloud including the velocity information as exemplified in
Note that it is not necessary to convert all data points into output data in step S8300. When the spatial density of data points is larger than or equal to a certain value, the processing device 530 may reduce the number of data points before converting them into output data. The data points may be reduced according to the spatial density or based on the number of points in a cluster. If the data acquired from the sensing device 100 includes supplementary information such as likelihood or reliability of the measurement result of each data point, the data may be reduced based on that supplementary information, as in the sketch below.
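A minimal sketch of such reduction, assuming a simple voxel-grid rule that keeps one point per spatial cell; a reliability-weighted selection could replace the first-in-cell rule used here.

```python
import numpy as np

def voxel_downsample(points: np.ndarray, attributes: np.ndarray, voxel: float = 0.2):
    """Reduce point density by keeping one data point per voxel of edge
    length `voxel` (in the same units as the positions).

    `attributes` holds the per-point companion data (time, velocity,
    reliability) so that it stays aligned with the surviving positions.
    """
    keys = np.floor(points / voxel).astype(np.int64)
    _, keep = np.unique(keys, axis=0, return_index=True)
    keep.sort()
    return points[keep], attributes[keep]
```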
(Step S8400) The communication device 550 transmits the road information indicating the latest road condition generated in step S8300 to the mobile object 300. After the operation in step S8400, processing returns to step S8000.
By repeating the operations from step S8000 to step S8400, the server 500 can periodically transmit the current road information and provide vehicles running on roads that have blind spots, such as junction points, with information on the conditions of the surrounding roads.
(Step S8110) The processing device 530 determines whether or not the communication device 550 has received data including valid point cloud information from the one or more sensing devices 100. If the communication device 550 has not received valid data from the sensing devices 100 at the time immediately before the current time, step S8110 is repeated until the valid data is received. If the communication device 550 has received the valid data from the sensing devices 100 at the time immediately before the current time, processing advances to step S8200.
In a system that monitors blind spots on roads as in the example of
As described above, the sensing device 100 in the present embodiment outputs the information on the position, the detailed time, and the velocity of each data point as the output data. The velocity related information represents either the relative velocity vector of the data point with respect to the sensor 200 that performed ranging of that data point, or the relative velocity vector component indicating the component of that relative velocity vector in the direction along the straight line connecting the sensor 200 and the data point. The relative velocity vector and the relative velocity vector component may both be expressed as vectors in the coordinate system set in the sensing device 100. Note that the relative velocity vector component may instead be expressed as a scalar representing its magnitude. If the positional information of the data point and the information on the reference position (that is, the coordinate origin) of the sensing device 100 are known, the relative velocity vector component as a vector can be calculated from the information on its magnitude, as in the sketch following this paragraph. The sensing device 100 generates and outputs the output data including, as identification information, a code representing the format of the velocity, which indicates whether the velocity information of each data point included in the output data is the relative velocity vector or the relative velocity vector component. Because such a code is included in the output data, the server 500 that receives the output data can discriminate the type of the velocity information accompanying the point cloud data based on the code. The server 500 performs different processing according to the discriminated type of velocity information. For example, when the velocity information represents the relative velocity vector component, the server 500 performs processing to transform the relative velocity vector component of each point into the actual velocity vector expressed in the coordinate system of the server 500. Such processing facilitates integration of the output data even when output data with different velocity expression formats is acquired in many batches from the one or more sensing devices 100.
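The reconstruction of the component vector from its scalar magnitude can be written compactly, as sketched below; the sign convention (positive meaning motion away from the sensor) is an assumption and is in practice system-defined.

```python
import numpy as np

def radial_component_vector(point: np.ndarray, origin: np.ndarray,
                            speed: float) -> np.ndarray:
    """Recover the relative velocity component as a 3-D vector from its
    scalar magnitude, the data point position, and the reference position
    (coordinate origin) of the sensing device."""
    ray = point - origin
    u = ray / np.linalg.norm(ray)     # unit line-of-sight direction
    return speed * u                  # positive speed = receding (assumed)
```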
Note that the sensing device 100 may be mounted not only in the fixed bodies 400 but also in the mobile objects 300 such as vehicles equipped with, for example, self-driving capability. In that case, velocity information of the point cloud acquired by the sensing device 100 is influenced by the traveling velocity of the mobile objects 300. Therefore, the processing circuit 110 of the sensing device 100 in the mobile object 300 may acquire the information on the position and a traveling velocity vector of the mobile object 300 from the controller 320 of the mobile object 300, include the information in the output data, and output the data. In that case, in the data formats exemplified in
In the present embodiment, the sensing device 100 performs measurements at any timing without receiving instructions from the server 500, determines the type of the velocity information, generates data, and transmits the data. At this time, the sensing device 100 may determine the type of the velocity information to be transmitted to the server 500, according to changes in the communication rate with the server 500. Instead of such an operation, the sensing device 100 may perform measurements based on instructions from the server 500. Alternatively, the sensing device 100 may generate data to be transmitted to the server 500 based on the specification of the type of the velocity information from the server 500. The server 500 may transmit different signals such as a signal instructing to start measurement, a signal specifying frequency of measurements, or a signal specifying a type of velocity information, depending on contents of the instruction to the sensing device 100.
In the present embodiment, the sensing device 100 and the server 500 in the system communicate via the network 600, but the communication does not necessarily have to go through the network 600. For example, the sensing device 100 and the server 500 may be connected through communications within a system separated from other communication networks. For example, a system may be configured in which a control circuit that controls operations of a mobile object and one or more sensing devices communicate via a communication network within the system, and the control circuit monitors the situation surrounding the mobile object. In addition, the technology of the present embodiment may be applied to a system that constantly monitors a certain spatial area, such as a security system, or a monitoring system for nursing care facilities or hospitals. Such systems may be configured such that a circuit controlling operations such as warning or calling and one or more sensing devices communicate through a communication network within the system, rather than going through an external network.
The various modification examples described above may be applied not only to the present embodiment but also to respective embodiments to be described below.
Next, a second embodiment of the present disclosure will be described.
A sensing device in the present embodiment outputs information indicating an emission direction of light and a spectrum of interference light, instead of outputting the velocity information of the point cloud. A server that receives data from the sensing device 100 calculates a distance and a velocity of an object from the light emission direction and information on the spectrum of the interference light, and can generate the point cloud data including the velocity information of each point. Hereinafter, different points from Embodiment 1 will be mainly described.
A physical configuration of the sensing device 100 in the present embodiment is similar to the configuration illustrated in
Operations of the sensing device 100 in the present embodiment are similar to the operations illustrated in
(Step S1400) The processing circuit 110 acquires the information on the light emission direction and the spectrum information from the sensor 200. The spectrum information is information indicating a result of frequency analysis of interference light generated by the interference optical system 220 illustrated in
By repeating the operations from step S1200 to step S1400, the sensing device 100 stores the information indicating the light emission direction acquired by one or more sensors 200 in a predetermined frame period and the corresponding spectrum information.
(Step S1910) The processing circuit 110 generates output data of the frame. The processing circuit 110 generates output data according to the data format illustrated in
The data of normal frames describes information that varies from frame to frame. In this example, the processing circuit 110 generates output data including the measurement time for each emission direction, the laser light emission direction, and the spectrum information, by referring to the data recorded in the storage device 130 exemplified in
In the example illustrated in
As data outputted for each frame, in the example of
The processing circuit 110 may generate output data, for example, in the format illustrated in
In the examples illustrated in
As data for each frame, for example, for each emission direction, information on the detailed irradiation time may be written in five bytes, the irradiation direction in two bytes, the spectral peak frequency in one byte, and the signal intensity at that frequency in one byte. The set of the frequency and the signal intensity is repeated for the number of spectral peaks written as the fixed value, for each of the up-chirp and the down-chirp. The data string of the sets of the frequency and the signal intensity for the number of peaks is repeated for the number of irradiation points written in the fixed value, forming one frame of data.
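A serialization sketch of this per-point layout follows; big-endian byte order and the helper name are assumptions for illustration, since the description above fixes only the field widths and the repetition structure.

```python
def pack_irradiation_point(time_code: int, direction_code: int,
                           up_peaks, down_peaks) -> bytes:
    """Serialize one irradiation point: 5-byte detailed time, 2-byte
    emission direction, then one (1-byte frequency, 1-byte intensity)
    pair per spectral peak, up-chirp peaks first, then down-chirp peaks.

    The number of peaks per chirp must equal the count written in the
    fixed values; one frame is the concatenation of these records for
    all irradiation points.
    """
    record = time_code.to_bytes(5, "big") + direction_code.to_bytes(2, "big")
    for freq, intensity in list(up_peaks) + list(down_peaks):
        record += freq.to_bytes(1, "big") + intensity.to_bytes(1, "big")
    return record
```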
Next, the operations of the server 500 in the present embodiment will be described. In the present embodiment, the server 500 may receive from the sensing device 100 not only output data including the positional information and the velocity information of the point cloud but also output data including the spectrum information for calculating a distance and a velocity. For example, the server 500 may receive, from the sensing device 100, output data including the spectrum information on the interference waves detected in each of the up-chirp period and the down-chirp period, the information being acquired by an FMCW sensor. Therefore, the server 500 performs different signal processing depending on the type of the received data, according to the code indicating the data type included in the output data.
When the processing device 530 of the server 500 acquires data from any sensing device 100 within a certain frame period in step S3200, processing advances to step S9100.
(Step S9100) The processing device 530 determines whether or not the processing of generating point cloud data including velocity information expressed in the coordinate system of the server 500 (that is, the relative velocity vector or the relative velocity component) has been finished for all of the data received from the one or more sensing devices 100 in step S3200, whether that data carries the information on the position and the velocity of the point cloud or the spectrum information for calculating them. When the generation processing of the point cloud data has been completed for all the data, processing advances to step S9800. When there remains unprocessed data, processing advances to step S9200.
(Step S9200) The processing device 530 selects one piece of data for which processing has not been performed, among the data acquired in step S3200. The data selected here is data corresponding to one data point in the point cloud.
(Step S9300) The processing device 530 discriminates the format of the data acquired in step S3200. The processing device 530 discriminates the format of the data based on the code indicating the type or format of the data included in the fixed value or information for each frame in the acquired data. In the example of
A first format is a format of the point cloud data including the information on the relative velocity vector of each data point with respect to the sensing device 100. The data in the first format may be written in any of the formats in
A second format is a format including the information on the component of the relative velocity vector of each data point with respect to the sensing device 100 along the straight line connecting the sensing device 100 and the data point. The data in the second format may be written in the format illustrated in
A third format is a format including the information on the emission direction of laser light emitted for measurement, and the information on the spectral peak of interference light obtained when the laser light is emitted. The data in the third format may be written in the format illustrated in
A fourth format is a format including the information on the emission direction of laser light emitted for measurement, and the information on the power spectrum calculated by the frequency analysis of the interference light obtained when the laser light is emitted. The data in the fourth format may be written in the format illustrated in
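A dispatch sketch over these four formats is shown below. The numeric code values and the helper functions are hypothetical names introduced only for illustration; the actual codes are those written in the fixed values or the per-frame information.

```python
def process_measurement(record: dict):
    """Branch on the format code, mirroring steps S9300 to S9450."""
    fmt = record["format_code"]                  # hypothetical field name
    if fmt == 1:    # first format: relative velocity vector per point
        return extract_time_position_velocity(record)        # step S9410
    if fmt == 2:    # second format: radial velocity component per point
        return extract_time_position_component(record)       # step S9420
    if fmt == 3:    # third format: spectral peaks per chirp
        return peaks_to_distance_velocity(record)            # steps S9430-S9500
    if fmt == 4:    # fourth format: full power spectrum per chirp
        return spectrum_to_distance_velocity(record)         # steps S9450-S9500
    raise ValueError(f"unknown format code: {fmt}")
```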
(Step S9410) When the data format is the first format, the processing device 530 extracts, from the acquired data, information on the measurement time, the position, and the relative velocity vector of the data point.
(Step S9420) When the data format is the second format, the processing device 530 extracts, from the acquired data, information on the measurement time, the position, and the relative velocity component vector of the data point.
(Step S9430) When the data format is the third format, the processing device 530 extracts data on the spectral peaks in each of up-chirp and down-chirp from the acquired data.
(Step S9440) The processing device 530 determines whether or not there is a peak exceeding a predefined threshold among the spectral peaks in each of the up-chirp and the down-chirp. If there is a corresponding peak in both the up-chirp and the down-chirp, processing advances to step S9500. If there is no peak exceeding the predefined threshold in at least one of the up-chirp or the down-chirp, processing returns to step S9100. If no spectral peak can be identified, there is a possibility that the sensor 200 of the sensing device 100 could not acquire a data point because no interference due to reflected light occurred in that emission direction. In such a case, processing does not proceed further and returns to step S9100.
(Step S9450) When the data format is the fourth format, the processing device 530 extracts data on the power spectrum in each of the up-chirp and the down-chirp from the acquired data.
(Step S9460) The processing device 530 determines, from the data on the power spectrum in each of the up-chirp and the down-chirp, whether or not there is any peak exceeding the predefined threshold. If there are corresponding peaks in both the up-chirp and the down-chirp, processing advances to step S9500. If there is no peak exceeding the predefined threshold in at least one of the up-chirp or the down-chirp, processing returns to step S9100.
(Step S9500) Based on the spectral peak values selected or extracted in step S9440 or step S9460, the processing device 530 calculates a distance to the data point and a relative velocity component of the data point in the direction along the straight line connecting the sensor and the data point. Here, if there is a plurality of valid spectral peaks in the up-chirp, the down-chirp, or both, the processing device 530 selects one spectral peak for each of the up-chirp and the down-chirp. One selection method is, for example, to select the peak with the maximum signal intensity for each of the up-chirp and the down-chirp. Another method is, for example, to select a peak in a specific frequency band. The processing device 530 calculates the distance from the sensor to the data point and the velocity with the method described with reference to
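The distance and velocity computation itself follows the standard FMCW relations. Assuming a triangular chirp of bandwidth B swept over each half of the chirp period T, the range-induced beat f_range and the Doppler shift f_doppler are the half-sum and half-difference of the selected up-chirp and down-chirp peak frequencies. The sketch below states these standard relations and is not a restatement of the disclosed expression.

```python
C = 299_792_458.0  # speed of light [m/s]

def fmcw_distance_velocity(f_up: float, f_down: float, bandwidth: float,
                           chirp_period: float, f_center: float):
    """Distance and radial velocity from up/down-chirp beat frequencies,
    assuming triangular FMCW modulation:

        f_up   = f_range - f_doppler
        f_down = f_range + f_doppler

    The sign of the returned velocity (approaching vs. receding) depends
    on the system's convention.
    """
    f_range = 0.5 * (f_up + f_down)
    f_doppler = 0.5 * (f_down - f_up)
    slope = bandwidth / (chirp_period / 2.0)      # sweep rate of one ramp [Hz/s]
    distance = C * f_range / (2.0 * slope)
    velocity = C * f_doppler / (2.0 * f_center)   # radial component [m/s]
    return distance, velocity
```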
(Step S9600) The processing device 530 transforms the positional information and the velocity information of the data point from the coordinate system of the sensing device 100 to the coordinate system of the server 500, and records the converted data in the storage device 540. After step S9600, processing returns to step S9100.
By repeating the operations from step S9100 to step S9600, the processing device 530 can generate and record point cloud data including the positional information and the velocity information of each point, irrespective of the format of the data in the frame.
(Step S9800) When the coordinate transformation processing is finished for all the data points in the frame, the processing device 530 performs the clustering processing and the velocity vector estimation processing. With the processing described above, the positional information and the velocity information are recorded in the storage device 540 for all the data points in the frame. However, for data processed through step S9410, information on the actual relative velocity vector with respect to the sensing device 100 is recorded as the velocity information of each point. On the other hand, for data processed through steps S9420, S9430, and S9450, the information on the relative velocity component vector is recorded as the velocity information of each point. The processing device 530 performs the clustering processing to integrate the point cloud data that may have different types of velocity information, and estimates the velocity vector of the physical object corresponding to each data point. The processing here is similar to, for example, the processing from steps S3400 to S3700 in
(Step S3900) The processing device 530 causes the storage device 540 to store the positional information and the velocity vector information of each data point estimated in step S9800.
By repeating the processing from step S3100 to step S3900 illustrated in
Next, a third embodiment of the present disclosure will be described.
In the present embodiment, the sensing device 100 is mounted on the mobile object 300. The server 500 specifies a type of data to be generated by the sensing device 100, depending on the environmental situation in which the mobile object 300 operates. The sensing device 100 generates data according to the type of data specified by the server and transmits the data. This can cause the sensing device 100 to output data including the spectrum information of interference light that can be analyzed in detail, for example, when the server 500 needs detailed information in the environment. Hereinafter, different points from Embodiments 1 and 2 will be described.
A configuration of the sensor 200 included in the sensing device 100 is similar to the configuration of the sensor 200 in Embodiments 1 and 2. As illustrated in, for example,
The communication device 310 transmits, to the server 500, data including information on the distance and the velocity measured by the sensing device 100 or the spectrum information for calculating the distance and the velocity. The communication device 310 also receives a request signal specifying a format of the data transmitted from the server 500.
The processing circuit 110 processes the information on the distance and the velocity outputted from the one or more sensors 200, or the spectrum information for calculating the distance and the velocity, and generates output data including either the information on the distance or the position and the velocity of the object, or the spectrum information from which that information can be derived. The output data is transmitted to the server 500 by the communication device 310. The processing circuit 110 determines the measurement operation of the sensors 200 according to the request signal that the communication device 310 acquires from the server 500, and generates a control signal for controlling the sensors 200.
The controller 320 may be a device, such as an electronic control unit (ECU), that includes a processor and a memory. The controller 320 determines the position, direction, and course of the mobile object 300 itself, based on map information that is recorded in advance in a storage device and indicates the road environment, and on the information on the distance or the position and the velocity of the object generated by the processing circuit 110. The controller 320 generates a control signal for controlling the drive device 330 based on those pieces of information. The controller 320 may also transmit the control signal for the drive device 330 to the processing circuit 110. Based on the control signal from the controller 320, the processing circuit 110 can acquire information on the operation of the mobile object 300, such as a traveling direction, a traveling velocity, and acceleration.
The drive device 330 may include various devices used in movement of the mobile object 300, such as wheels, an engine or an electric motor, a transmission, or a power steering device. The drive device 330 operates based on a control signal outputted from the controller 320, and causes the mobile object 300 to perform operations such as accelerating, decelerating, turning to the right, turning to the left, or the like.
During normal operation, the server 500 may receive a notice from an external device indicating that an abnormality has occurred in the environment in which the mobile object 300 runs. The notice may be, for example, a notice of entry of a person into an operating area of the mobile object 300. For example, when it is known in advance that a worker will enter for road inspection or construction, a notice may be transmitted from the external device to the server 500. The server 500 can request one or more mobile objects 300 to transmit detailed information that allows detailed positional information and velocity information to be analyzed, in order to acquire detailed information on the intruder. The detailed information is used to estimate the position and the velocity of a physical object in the environment with higher accuracy than the positional and velocity information generated by the sensing device 100. The detailed information may be, for example, measurement data in a format that includes the spectrum information described in Embodiment 2.
According to the request from the server 500, the mobile object 300 transmits, to the server 500, the spectrum information from which the detailed positional information and velocity information can be generated.
The server 500 receives the spectrum data transmitted by the mobile object 300, calculates the distance and the velocity from the spectrum data, and monitors the intruder through coordinate transformation into the coordinate system of the server 500 as well as through discrimination processing on the point cloud data, tracking processing, and the like.
(Step S10010) The processing device 530 determines whether or not there is a request to perform special processing different from normal operation, for example due to intrusion of a person. If there is a request for special processing, processing advances to step S10100. If there is no such request, processing advances to step S10020.
(Step S10020) The processing device 530 determines whether or not the communication device 550 has received data from the mobile object 300. If the data is received from the mobile object 300, processing advances to step S10030. If no data is received from the mobile object 300, step S10020 is repeated.
(Step S10030) When the data is received in step S10020, the processing device 530 determines whether or not the format of the received data is a data format for normal processing. The data format for normal processing may be a format including information on the position of the object. The information on the position may be information indicating a distance between the sensing device 100 and the object. If the format of the received data is the data format for normal processing, processing advances to step S10040. If the format of the received data is not the data format for normal processing, that is, when it is a data format for detailed analysis, processing advances to the special processing in step S10100.
The data format for normal processing may be, for example, the format illustrated in any of
The data format for detailed analysis may be, for example, the format illustrated in any of
(Step S10040) The processing device 530 acquires the positional information from the data received in step S10020.
(Step S10060) The processing device 530 checks the point cloud data generated based on the positional information acquired in step S10040 against the map data recorded in the storage device 540, and determines the position of the mobile object 300.
(Step S10070) The processing device 530 records, in the storage device 540, the position of the mobile object 300 determined in step S10060 together with the data acquisition time.
After the operation in step S10070, processing returns to step S10010.
Next, an example of the special processing in step S10100 will be described.
(Step S10110) If it is determined that special operation different from the normal operation of receiving ranging results from the mobile object 300 is necessary, the processing device 530 determines whether or not a person has intruded into the range of movement of the mobile object 300. If there is information or input indicating that a person has intruded into the range of movement of the mobile object 300, processing advances to step S10120. If there is neither information nor input indicating such an intrusion, processing advances to step S10190.
(Step S10120) The processing device 530 instructs the communication device 550 to transmit, to the mobile object 300, a transmission request for data for detailed analysis. The data for detailed analysis may be, for example, data including the detailed time information and the spectrum information of the interference wave. The spectrum information is source data for generating positional and velocity information, and the server 500 can perform a detailed analysis based on it.
(Step S10130) The processing device 530 determines whether or not data has been acquired from the mobile object 300. If the data has been acquired from the mobile object 300, processing advances to step S10140. If no data has been acquired from the mobile object 300, step S10130 is repeated.
(Step S10140) The processing device 530 determines whether or not the data from the mobile object 300 acquired in step S10130 is data for detailed analysis including the spectrum information. If the acquired data is the data for detailed analysis, processing advances to step S10150. If the acquired data is not the data for detailed analysis, processing returns to step S10120.
The determination of the data format in step S10140 is performed based on a data format code included in the data transmitted from the mobile object 300. The data format code is predefined in the system, and may be written, for example, as 0 for data for normal processing and as 1 for data for detailed analysis, at the beginning of the transmission data or at a fixed position close to the beginning.
(Step S10150) Based on the measurement values in the data for detailed analysis acquired in step S10130, the processing device 530 calculates, for each sensor and for each emission direction in which there is reflection from a physical object, a distance to the physical object and a velocity component vector of the physical object in that emission direction. The processing device 530 calculates the position of a data point from the emission direction and the distance, and associates the velocity component vector with that data point. This allows the processing device 530 to generate point cloud data including the velocity component information.
(Step S10160) The processing device 530 detects a person who is present around the mobile object 300, based on the point cloud data including the velocity component information calculated in step S10150. A method of detecting a person will be described below.
(Step S10170) The processing device 530 checks the point cloud data generated in step S10150, excluding the points included in the cluster of the person detected in step S10160, against the map data stored in the storage device 540, and determines the position and the direction of the mobile object 300.
(Step S10180) The processing device 530 transforms the coordinates of the point cloud data for the person detected in step S10160 into the coordinates of the map data in the server 500, and causes the storage device 540 to store the position of the person and the data acquisition time together.
After the operation in step S10180, processing returns to step S10110.
(Step S10190) If there is neither information nor input indicating that a person has intruded the range of movement of the mobile object 300 in step S10110, the processing device 530 instructs the communication device 550 to transmit a signal requesting data for normal processing to the mobile object 300. After the operation in step S10190, processing returns to step S10010 of normal operation.
Next, a description will be given of a specific example of the person detection processing in step S10160 with reference to
(Step S10161) The processing device 530 performs the clustering processing on the point cloud data including information on the velocity component vector calculated in step S10150. The processing device 530 classifies the point cloud into one or more clusters by grouping points that are close to each other as one cluster based on the positional information of the point cloud.
(Step S10162) The processing device 530 determines whether or not the processing of determining whether a cluster represents a person has been completed for all clusters generated in step S10161. If the determination processing has been completed for all the clusters, processing advances to step S10170. If there are clusters for which the determination has not yet been performed, processing advances to step S10163.
(Step S10163) The processing device 530 selects one cluster from clusters for which processing of determining whether the cluster represents a person has not yet been performed, among the clusters generated in step S10161.
(Step S10164) For the cluster selected in step S10163, the processing device 530 determines whether or not the distribution of positions of the point cloud data in the cluster matches a predefined range of human sizes. If the distribution of the positions matches the human size range, processing advances to step S10165. If it does not, processing returns to step S10162.
(Step S10165) The processing device 530 further performs clustering on the point cloud data in the cluster based on the velocity component vector of each point. The processing device 530 classifies a plurality of points included in the cluster into one or more partial clusters by grouping points whose velocity component vectors are similar into a smaller partial cluster.
(Step S10166) As a result of the processing in step S10165, the processing device 530 determines whether or not the point cloud included in the cluster selected in step S10163 has been divided into a plurality of smaller partial clusters based on the velocity information. If the point cloud has been divided into a plurality of partial clusters, processing advances to step S10167. If it has not, processing returns to step S10162.
In step S10164, a point cloud that may be a person is discriminated by a cluster generated based on the positional information of the point cloud. However, because a human shape in the three-dimensional space varies depending on the attitude and motion, it is difficult to detect a person based only on the shape of a cluster. On the other hand, owing to motion, animals including humans have different directions and velocities of motion for each body part. Hence, utilizing the fact that the sensing device 100 can acquire the velocity information, the processing device 530 of the present embodiment further performs clustering on the point cloud belonging to a cluster based on the velocity information. Consequently, when a point cloud clustered based on the positional information can be divided into smaller clusters based on the velocity information, it can be determined that the cluster is highly likely to correspond to an animal such as a human.
Note that the processing device 530 in this example determines whether or not a cluster corresponds to a human depending on whether or not the cluster is further divided into a plurality of partial clusters based on the velocity information, but the determination may be made using other methods. For example, the processing device 530 may determine whether the cluster corresponds to a human based on the velocity information of the point cloud included in the cluster, by determining, from the size of each partial cluster, whether a partial cluster has been generated that corresponds to a central part, that is, the body trunk, or to a peripheral attached part, that is, the head or a limb.
(Step S10167) The processing device 530 detects, as a human, the cluster that has been divided into the plurality of partial clusters based on the velocity information. After the operation in step S10167, processing returns to step S10162.
By repeating the processing from step S10162 to step S10167, the processing device 530 can determine, for all of the clusters generated in step S10161, whether or not each cluster is a human.
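A compact sketch of this two-stage heuristic (steps S10161 to S10167) is given below. The clustering parameters and the human-size gate are illustrative assumptions; any clustering method and size criterion meeting the description above may be substituted.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def detect_person_clusters(xyz: np.ndarray, vel_vectors: np.ndarray,
                           size_min=(0.3, 0.3, 1.0), size_max=(1.2, 1.2, 2.2)):
    """Return labels of position clusters judged likely to be a person.

    Stage 1 clusters by position (step S10161); clusters whose bounding
    box fits a human-scale range pass the size gate (step S10164);
    stage 2 re-clusters each candidate by velocity component vector
    (step S10165) and flags clusters that split into two or more
    velocity groups (steps S10166 and S10167).
    """
    persons = []
    labels = DBSCAN(eps=0.4, min_samples=8).fit(xyz).labels_
    for cid in set(labels) - {-1}:
        idx = labels == cid
        extent = xyz[idx].max(axis=0) - xyz[idx].min(axis=0)
        if not (np.all(extent >= size_min) and np.all(extent <= size_max)):
            continue
        vlabels = DBSCAN(eps=0.15, min_samples=4).fit(vel_vectors[idx]).labels_
        if len(set(vlabels) - {-1}) >= 2:
            persons.append(cid)
    return persons
```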
Next, a description will be given of the operation related to data generation and transmission by the sensing device 100 in the mobile object 300.
(Step S20010) The processing circuit 110 of the sensing device 100 determines whether or not there is input of an operation end signal from the server 500 or other devices. If there is the input of the operation end signal, the mobile object 300 ends its operation. If there is no input of the operation end signal, processing advances to step S20020.
(Step S20020) The processing circuit 110 determines whether or not the communication device 310 has received a request signal transmitted from the server 500 instructing on the data format. If the request signal instructing on the data format has been received, processing advances to step S20030. If the request signal has not been received, processing advances to step S20040.
(Step S20030) The processing circuit 110 rewrites the setting of output data of the sensing device 100 to the data format indicated by the request signal. An initial value of the data format may be the format for normal processing, for example. If the data format indicated by the received request signal is data for detailed analysis, the processing circuit 110 rewrites the data format from the format for normal processing to the format for detailed analysis.
(Step S20040) The processing circuit 110 causes the light source 210 of each sensor 200 in the sensing device 100 to emit laser light and performs measurements. Measurements may be repeatedly performed, for example, while changing the emission direction over one frame period.
(Step S20050) The processing circuit 110 acquires, as a measurement result, a detection signal obtained by detecting interference light from each sensor 200 in each emission direction. The processing circuit 110 analyzes a waveform of each acquired detection signal and generates spectrum data for each emission direction of each sensor 200. The spectrum data may be, for example, data on power spectrum representing signal intensity of each frequency band. The spectrum data is used to generate the positional information and the velocity information of the point cloud.
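As a minimal sketch of this frequency analysis, assuming the detection signal is available as uniformly sampled values: a windowed FFT yields the power spectrum (signal intensity per frequency band) that constitutes the spectrum data.

```python
import numpy as np

def power_spectrum(detection_signal: np.ndarray, sample_rate: float):
    """Power spectrum of one interference-light detection signal.

    A Hann window suppresses spectral leakage before the real FFT; the
    squared magnitude gives the signal intensity in each frequency bin.
    """
    n = len(detection_signal)
    windowed = detection_signal * np.hanning(n)
    freqs = np.fft.rfftfreq(n, d=1.0 / sample_rate)
    return freqs, np.abs(np.fft.rfft(windowed)) ** 2
```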
(Step S20060) The processing circuit 110 determines whether or not data for detailed analysis is requested. If the currently specified output data format is the data format for detailed analysis, processing advances to step S20080. If the currently specified data format is the data format for normal processing, processing advances to step S20070.
(Step S20070) The processing circuit 110 generates the positional information of the object based on the spectrum data of the interference light generated in step S20050. Specifically, the processing circuit 110 identifies a peak frequency from the spectrum data and calculates the distance to the object with the peak frequency as beat frequency fb, based on the expression (1) described above with respect to
(Step S20080) The processing circuit 110 generates a data string for transmitting the spectrum data of each interference light generated in step S20050 or the point cloud data generated in step S20070. The processing circuit 110 generates transmission data according to the currently specified data format. If the specified data format is the data format for normal processing, the processing circuit 110 generates transmission data including the point cloud data generated in step S20070. On the other hand, if the specified data format is the data format for detailed analysis, the processing circuit 110 generates transmission data including the spectrum data generated in step S20050.
The data format for normal processing and the data format for detailed analysis may be the formats illustrated in the respective drawings described above.
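The format selection in step S20080 may be sketched as follows; the dictionary-based payload and its field names are illustrative assumptions, not the actual data formats referenced above.

```python
FORMAT_NORMAL = "normal_processing"
FORMAT_DETAILED = "detailed_analysis"

def build_transmission_data(output_format, point_cloud, spectra):
    """Assemble transmission data according to the currently specified
    data format (step S20080)."""
    if output_format == FORMAT_DETAILED:
        # data for detailed analysis: spectrum data per emission direction
        return {"format": FORMAT_DETAILED, "spectra": spectra}
    # data for normal processing: point cloud data from step S20070
    return {"format": FORMAT_NORMAL, "points": point_cloud}
```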
(Step S20090) The communication device 310 transmits the transmission data generated in step S20080 to the server 500. After the operation in step S20090, processing returns to step S20010.
By repeating the operations from step S20010 to step S20090, the mobile object 300 can transmit the measurement data in the format according to the request of the server 500.
As described above, in the present embodiment, as an example of special processing, when a person intrudes, processing to detect the person is performed. In the course of the person detection processing, the server 500 requests the sensing device 100 in the mobile object 300 to generate data for detailed analysis, in order to generate velocity information that is not utilized in the normal processing. The sensing device 100 generates data including information on the light emission direction and the power spectrum of the interference light as the data for detailed analysis. Based on such data for detailed analysis, the server 500 can generate detailed point cloud data including the velocity information and detect a person who has intruded, based on the point cloud data.
Special processing may be performed for any purpose other than person detection. For example, special processing may be performed when it is necessary to analyze the positions and operations of physical objects around the mobile object 300, such as when the mobile object 300 transmits an abnormal signal, such as a failure, to the server 500. In addition, the data for detailed analysis is not limited to the spectrum information of the interference light, and may include other types of information that allows the position and the velocity of the physical object to be derived. For example, as described in Embodiments 1 and 2, the data for detailed analysis may include the positional information and the velocity vector information of the physical object detected by the clustering processing.
In this manner, in the present embodiment, the communication device 550 in the server 500 transmits a request signal requesting measurement data for detailed analysis to the sensing device 100 of the mobile object 300 when abnormality is detected in the mobile object 300 itself or in the environment in which the mobile object 300 runs. The request signal specifies the data format of the measurement data. The sensing device 100 that has received the request signal generates the measurement data having the data format specified by the request signal and transmits output data including the measurement data to the server 500 via the communication device 310. This allows the server 500 to perform detailed analysis of the surroundings of the sensing device 100, which makes it easy to identify a cause of the abnormality.
Next, a fourth embodiment of the present disclosure will be described.
In the systems in Embodiments 1 to 3, the server 500, which communicates with the sensing device 100 provided in the fixed body 400 or the mobile object 300, monitors or records the conditions such as operations of physical objects around the fixed body 400 or the mobile object 300. In contrast, in the present embodiment, the mobile object 300 capable of autonomous movement includes the sensing device 100 and a processing device that can perform arithmetic processing similar to the above-described server 500. Within the mobile object 300, data is transmitted and received between the sensing device 100 and the processing device. The processing device generates positional information of surrounding physical objects based on the output data outputted from the sensing device 100 and generates and outputs a signal for controlling the operation of the mobile object 300 based on the positional information of the physical object.
The sensor 200 uses laser light to perform ranging and velocity measurement by the FMCW method. The sensor 200 has a configuration similar to that of the sensor 200 illustrated in any of the drawings described above.
The input device 360 is a device for inputting an instruction to the mobile object 300, such as starting or ending operations. The input device 360 may include a device such as a button, a lever, a switch, or a keyboard.
The processing device 340 is a device including one or more processors (that is, processing circuits) such as a CPU or a GPU, and a storage medium such as a memory. The processing device 340 can process sensor data outputted from the sensor 200 and generate point cloud data for the surrounding environment of the mobile object 300. The processing device 340 can perform processing to determine the operation of the mobile object 300, such as detecting an obstacle based on the point cloud data, and determining a course of the mobile object 300. The processing device 340 transmits, for example, a signal indicating the course of the mobile object 300 to the controller 320.
The controller 320 generates a control signal and outputs the control signal to the drive device 330, in order to implement the operation of the mobile object 300 determined by the processing device 340.
The drive device 330 operates according to the control signal outputted from the controller 320. The drive device 330 may include various actuating parts such as an electric motor, wheels, or arms.
The storage device 350 is a device including one or more storage media such as semiconductor storage media, magnetic storage media, or optical storage media. The storage device 350 stores data related to the operating environment and the operating conditions necessary for the mobile object 300 to move, such as the map data of the environment in which the mobile object 300 moves.
(Step S30010) The processing device 340 determines whether or not special processing different from normal operation is required. The processing device 340 determines that special processing is required, for example, when the mobile object 300 is in some abnormal conditions. Examples of the abnormal conditions may include a condition in which running cannot be continued under normal operation because guidelines are not detected on the floor, a condition in which arrangement of surrounding physical objects does not match the map data recorded in the storage device 350, or a condition in which equipment has failed. When special processing is required, processing advances to step S30100. When special processing is not required, processing advances to step S30020.
(Step S30020) The processing device 340 causes the sensing device 100 to perform ranging. Each sensor 200 in the sensing device 100 emits laser light to measure a distance. The sensor 200 detects interference light between light emitted from the light source and reflected light from the physical object, and measures the distance to the reflecting point of the physical object based on the frequency of the interference light. The sensor 200 calculates three-dimensional coordinates of the reflecting point based on the distance and information on the laser light emission direction. The sensor 200 repeats the above-described operations over the entire measurement target area while changing the laser light emission direction. As a result, the sensor 200 generates point cloud data including positional information of each of the plurality of reflecting points included in the target area.
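The conversion from a measured distance and an emission direction to three-dimensional coordinates of the reflecting point may, for example, use a spherical-to-Cartesian conversion such as the following sketch; parameterizing the emission direction by azimuth and elevation is an assumption for illustration.

```python
import numpy as np

def reflecting_point_xyz(distance_m, azimuth_rad, elevation_rad):
    """3-D coordinates of a reflecting point from the measured distance
    and the laser emission direction (sensor-centered coordinates)."""
    x = distance_m * np.cos(elevation_rad) * np.cos(azimuth_rad)
    y = distance_m * np.cos(elevation_rad) * np.sin(azimuth_rad)
    z = distance_m * np.sin(elevation_rad)
    return np.array([x, y, z])
```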
(Step S30030) The processing device 340 checks the point cloud data generated by the sensor 200 in step S30020 against the map data recorded in the storage device 350, and determines the position of the mobile object 300 on the map.
(Step S30040) The processing device 340 transforms the coordinates of each point in the point cloud data acquired in step S30020 into coordinates in the coordinate system used in the map data.
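Step S30040 may be sketched as a rigid transformation, assuming that the pose determined in step S30030 is expressed as a rotation matrix and a translation vector:

```python
import numpy as np

def to_map_frame(points_sensor, rotation, translation):
    """Transform Nx3 sensor-frame points into the map coordinate system
    (step S30040), given the pose estimated in step S30030.

    rotation: 3x3 rotation matrix, translation: 3-vector.
    """
    points_sensor = np.asarray(points_sensor, dtype=float)
    return points_sensor @ rotation.T + translation
```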
(Step S30050) The processing device 340 generates the course of the mobile object 300 according to the position of the mobile object 300 determined in step S30030 and the map data. For a position close to the mobile object 300, the processing device 340 determines, for example, a detailed course where no collision with an obstacle occurs, based on the point cloud data subjected to the coordinate transformation in step S30040.
(Step S30060) The controller 320 generates a control signal for controlling the drive device 330 according to the course generated by the processing device 340 in step S30050.
(Step S30070) The controller 320 outputs the control signal generated in step S30060 to the drive device 330. The drive device 330 operates according to the control signal. After the operation in step S30070, processing returns to step S30010.
By repeating the operations from step S30010 to step S30070, navigation of the mobile object 300 under normal conditions without abnormality is realized.
(Step S30100) If it is determined in step S30010 that special processing is required, the mobile object 300 performs special processing.
(Step S30110) The processing device 340 instructs the controller 320 to reduce the traveling velocity. This is to ensure safety under abnormal conditions.
(Step S30120) The controller 320 generates a control signal for velocity reduction and outputs the control signal to the drive device 330.
(Step S30130) The drive device 330 operates according to the control signal outputted from the controller 320 and reduces the velocity of the mobile object 300. For example, the drive device 330 may stop the mobile object 300.
(Step S30140) The processing device 340 requests each sensor 200 of the sensing device 100 to generate data for detailed analysis. Under normal conditions, the sensor 200 outputs point cloud data that can be checked against the map data recorded in the storage device 350. Under abnormal conditions, since the point cloud data cannot be checked against the map data, the processing device 340 requests from each sensor 200 the point cloud data to which the velocity information is attached, in order to acquire the detailed information necessary for operating without referring to the map data.
(Step S30150) The processing device 340 determines whether or not the measurement data has been acquired from the sensor 200. If the measurement data has been acquired, processing advances to step S30160. If the measurement data has not been acquired, step S30150 is repeated.
(Step S30160) The processing device 340 determines whether or not the format of the measurement data acquired from the sensor 200 in step S30150 is the format for the data for detailed analysis requested in step S30140. If the acquired data is in the format for the data for detailed analysis, processing advances to step S30170. If the acquired data is not in the format for the data for detailed analysis, that is, if the acquired data is in the format for the data for normal processing, processing returns to step S30140.
By repeating step S30140 to step S30160, the processing device 340 can acquire the data for detailed analysis from the sensor 200.
(Step S30170) The processing device 340 extracts the point cloud data to which the velocity information is added for each data point, from the data acquired from the sensor 200. The velocity information for each data point in the present embodiment represents the relative velocity component of the data point in the direction along the straight line connecting the sensor 200 and the data point.
(Step S30180) The processing device 340 performs clustering on the point cloud data according to the positional information of each data point. The clustering method is similar to the method described in Embodiment 1. The clustering makes it possible to identify physical objects (obstacles or people, for example) that are present around the mobile object 300.
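As one possible stand-in for the clustering of Embodiment 1 (whose details are not reproduced here), the following sketch groups points by Euclidean distance; the radius and minimum cluster size are illustrative values.

```python
import numpy as np

def euclidean_clusters(points, radius=0.3, min_points=5):
    """Group Nx3 points whose mutual distance is below `radius`.

    O(N^2) per neighborhood search, so this is only suitable for
    small point clouds.
    """
    points = np.asarray(points, dtype=float)
    n = len(points)
    labels = np.full(n, -1)   # -1: unvisited, -2: noise, >=0: cluster id
    current = 0
    for seed in range(n):
        if labels[seed] != -1:
            continue
        labels[seed] = current
        stack, members = [seed], []
        while stack:
            i = stack.pop()
            members.append(i)
            dist = np.linalg.norm(points - points[i], axis=1)
            for j in np.where((dist < radius) & (labels == -1))[0]:
                labels[j] = current
                stack.append(j)
        if len(members) < min_points:
            labels[np.asarray(members)] = -2   # too small: mark as noise
        else:
            current += 1
    return labels
```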
(Step S30190) For each cluster in the point cloud data clustered in step S30180, the processing device 340 classifies the type of the physical object corresponding to the cluster, based on the velocity information corresponding to each data point included in the cluster. For example, the processing device 340 checks a sign of the velocity information of each data point. If a direction moving away from the mobile object 300 is a positive direction of the velocity, a cluster having more data points whose velocity information is smaller than 0 is likely to be a cluster corresponding to a physical object approaching the mobile object 300. Therefore, the processing device 340 determines that such a cluster is a dangerous moving body, and records information thereon. A cluster having more data points whose velocity information is larger than 0 is likely to be a physical object moving away from the mobile object 300. Therefore, the processing device 340 determines that such a cluster is a moving body with a lower degree of danger, and records information thereon. The processing device 340 determines that a cluster in which data points with positive velocity information and data points with negative velocity information are roughly balanced, or a cluster having more data points whose velocity information is 0, is a stationary object, and records information thereon.
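The sign-based classification of step S30190 may be sketched as follows; the dead band `eps` that treats near-zero velocities as "velocity 0" is an assumption introduced for illustration.

```python
import numpy as np

def classify_cluster(radial_velocities, eps=0.05):
    """Classify a cluster from per-point radial velocities
    (positive = moving away from the mobile object), as in step S30190.
    """
    v = np.asarray(radial_velocities, dtype=float)
    approaching = np.sum(v < -eps)
    receding = np.sum(v > eps)
    stationary = np.sum(np.abs(v) <= eps)
    if approaching > receding and approaching > stationary:
        return "dangerous_moving_body"     # approaching the mobile object
    if receding > approaching and receding > stationary:
        return "low_danger_moving_body"    # moving away
    return "stationary_object"             # balanced or mostly zero
```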
(Step S30200) The processing device 340 generates a course in a direction that moves away from the point cloud position of the cluster determined as a dangerous moving body and that contains no data points.
(Step S30210) The controller 320 generates a control signal for controlling the drive device 330 according to the course generated by the processing device 340 in step S30200.
(Step S30220) The controller 320 outputs the control signal generated in step S30210 to the drive device 330. The drive device 330 operates according to the control signal. After the operation in step S30220, processing returns to normal operation in step S30010.
Through the operations from step S30110 to step S30220 it is possible to avoid danger and determine a course, even in a case where, for example, guidelines are not found on the floor. The mobile object 300 can autonomously move while avoiding danger, for example, until guidelines can be detected.
(Step S30340) The processing device 340 in this example requests data for autonomous movement from each sensor 200, after the mobile object 300 is decelerated. The data for autonomous movement is the point cloud data to which hazard classification information of each point is attached. The hazard classification information may be, for example, a code for discriminating whether or not a physical object corresponding to the point is a dangerous moving body. The hazard classification information may be a code that represents the degree of danger of the physical object corresponding to the point at a plurality of levels. The hazard classification information may indicate, for example, whether the physical object corresponding to the point is a dangerous moving body, a non-dangerous moving body, or a stationary object.
(Step S30360) When acquiring data from the sensor 200, the processing device 340 determines whether or not the data format thereof is the format of the data for autonomous movement. The processing device 340 can determine whether or not the data is the data for autonomous movement, based on the code indicating the data format included in the acquired data. If the acquired data is the data for autonomous movement, processing advances to step S30370. If not, processing returns to step S30340.
(Step S30370) The processing device 340 extracts the point cloud data to which the hazard classification information is added for each data point, from the acquired data.
(Step S30380) The processing device 340 generates a course in a direction that moves away from the data points to which the code indicating a dangerous moving body is added and that contains no data points.
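A deliberately simple sketch of such course generation is shown below: it heads away from the centroid of the points coded as dangerous moving bodies. A practical implementation would additionally verify that the chosen direction is free of data points, as stated above.

```python
import numpy as np

def escape_direction(dangerous_points, position):
    """Unit vector pointing away from the centroid of the points coded
    as dangerous moving bodies (step S30380). A heuristic for
    illustration only.
    """
    centroid = np.mean(np.asarray(dangerous_points, dtype=float), axis=0)
    away = np.asarray(position, dtype=float) - centroid
    return away / np.linalg.norm(away)
```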
The processing of generating the point cloud data to which the hazard classification information is attached is as illustrated in the corresponding drawing.
Next, a fifth embodiment of the present disclosure will be described.
The calibration device 700 determines values of parameters used in distance calculation and velocity calculation from the known distance of the object held by the object holding device 800 and the spectrum information of the interference light received from the sensing device 100, and transmits the parameters to the sensing device 100. By using the parameters, the distance measurement value and the velocity measurement value of the sensing device 100 are calibrated. The object is, for example, a white board, and is placed parallel to a lens surface of the sensor 200. With this arrangement, the light emitted from the sensor 200 is scattered by the object, and the scattered light efficiently enters the lens of the sensor 200. As a result, the intensity of the detected interference light increases relative to noise, and the accuracy of the calibration increases.
The calibration by this system is performed, for example, when the sensor 200 is manufactured or shipped. It may also be performed as re-calibration when the sensing device 100 including the sensor 200 is installed or during an inspection. The sensor 200 is configured similarly to the configurations illustrated in the foregoing embodiments, for example.
The calibration that the calibration device 700 performs on the sensing device 100 is not limited to the calibration of the distance measurement value or the velocity measurement value. Other examples include checking the noise occurrence status when the distance or the velocity is measured and determining measurement conditions that account for the noise. Examples of the measurement conditions include a detection threshold of the interference light, the number of measurements used when calculating the measurement values, and the number of values averaged when calculating the measurement values.
As described above, in the system of the present embodiment, the calibration device 700 acquires the spectrum information measured at the first distance and the spectrum information measured at the second distance. Based on the acquired spectrum information, the calibration device 700 determines parameters for calculating the distance and the velocity from the frequency of the interference light, and transmits the determined parameters to the sensing device 100. The sensing device 100 receives the parameters transmitted from the calibration device 700 and saves the acquired parameters. This allows the sensing device 100 to generate and output accurate distance and velocity from the spectrum information of the measured interference light.
Note that here, although the distance at which the object is held is the predefined distance, the calibration device 700 may instruct the object holding device 800 on a distance at which the object is held. Alternatively, a distance at which the object is held may be determined by user input. The object holding device 800 has a mechanism for changing a position to hold the object so that the distance from the sensor 200 to the object will be the instructed or input distance.
Note that in the above example, the object holding device 800 automatically adjusts the position at which the object is held so that the distance between the sensor 200 and the object will be the determined value; however, the object holding device 800 may instead have a jig that does not operate. In this case, a user may determine the distance between the sensor 200 and the object, use the jig to set that distance, and input the set distance into the calibration device 700. In this case, communications only have to be performed between the calibration device 700 and the sensing device 100, and communication between the object holding device 800 and the calibration device 700 is not required. The calibration device 700 stores the distance between the object and the sensor 200 that is inputted by the user.
In addition, in the above system, the object holding device 800, the calibration device 700, and the sensor 200 are connected to each other by direct wired communication and transmit and receive signals via signal lines; however, they may instead be connected by direct wireless communication or via a network.
The processing circuit 710 of the calibration device 700 outputs a control signal for instructing a calibration operation to the communication circuit 720. The processing circuit 710 also determines parameters based on the acquired data and outputs the parameters to the communication circuit 720 and the display device 730. The communication circuit 720 transmits the control signal to the sensing device 100 and the object holding device 800. The communication circuit 720 further receives measurement data outputted by the sensing device 100 and distance data of the object outputted by the object holding device 800.
The grasping device 810 of the object holding device 800 selects one of a plurality of predefined distances stored in the storage device 830, based on the control signal outputted from the calibration device 700. Furthermore, the grasping device 810 adjusts the distance from the collimator 223 of the sensor 200 to the object based on the selected distance and outputs the value of the selected distance to the communication circuit 820. The communication circuit 820 outputs the distance between the object and the collimator 223 of the sensor 200 to the calibration device 700.
(Step S40010) In step S40010, the processing circuit 710 of the calibration device 700 selects one sensor from the sensors 200 included in the sensing device 100. The selected sensor is a sensor that has been determined to require calibration and for which calibration has not yet been completed.
(Step S40020) In step S40020, the processing circuit 710 of the calibration device 700 determines whether or not the pieces of spectrum data necessary for calibration are stored. The spectrum data necessary for calibration is, for example, spectrum data measured at two or more distances. Here, it is assumed that two pieces of spectrum data are necessary. If the necessary pieces of spectrum data are stored (yes in step S40020), processing advances to step S40080. If not (no in step S40020), processing advances to step S40030.
(Step S40030) In step S40030, the calibration device 700 outputs a control signal instructing holding of the object for measurement to the object holding device 800.
(Step S40040) In step S40040, the object holding device 800 selects a distance at which the object has not yet been held from among the predefined distances stored in the storage device 830, and holds the object at a position where the distance between the sensor 200 and the object equals the selected distance.
(Step S40050) The object holding device 800 transmits the distance determined in step S40040 to the calibration device 700. In step S40050, the calibration device 700 receives the signal outputted from the object holding device 800 and acquires the distance between the object and the collimator 223 of the sensor 200.
(Step S40060) In step S40060, the calibration device 700 outputs, to the sensing device 100, a control signal instructing the sensor 200 to perform ranging measurement on the object. The control signal includes a signal specifying the data format outputted by the sensing device 100, together with an instruction signal instructing starting of the measurement. The specified data format is, for example, the frequency of the spectral peak of the interference light detected by the sensor 200.
(Step S40070) The sensing device 100 receives the control signal outputted by the calibration device 700 in step S40060 and performs a measurement operation. The measurement operation is similar to those described in the previous embodiments. That is, the sensor 200 emits laser light with a periodically modulated frequency toward the object, receives the light reflected by the object, and causes the reflected light to interfere with the reference light. Furthermore, the sensor 200 detects the resulting interference light with the photodetector 230 and performs frequency analysis on the detection signal to determine a spectral peak.
By repeating the operations from step S40020 to step S40070, it is possible to acquire all spectrum data necessary to determine the parameters for calculating the distance.
(Step S40080) In step S40080, the processing circuit 710 of the calibration device 700 calculates the parameters used when calculating the distance from the frequency of the spectral peak.
A description is given of an example of processing performed by the processing circuit 710 of the calibration device 700. Based on the expression (1) mentioned above, the relationships between the holding distances L1 and L2 and the interference light frequencies are as shown in the expressions (4) and (5), respectively. Here, fb1 and fb2 are the frequencies of the spectral peaks of the interference light detected at the holding distance L1 and the holding distance L2, respectively. fb1 may be the average of the spectral peak frequency in the up-chirp and the spectral peak frequency in the down-chirp detected at the holding distance L1, or either one of the two. The same applies to fb2. A is a zero-point offset of the distance caused by the difference in length between the waveguide of the reference light and the waveguide through which the received reflected light travels until it interferes with the reference light in the actual interference optical system, and is a constant defined for each interference optical system. The other symbols have the same meanings as in the expression (1).
L1 = c × fb1/(Δf × fFMCW) × (1/4) + A (4)

L2 = c × fb2/(Δf × fFMCW) × (1/4) + A (5)
As the holding distance L1 and the holding distance L2 have known values, the processing circuit 710 of the calibration device 700 can calculate Δf and A using expression (4) and expression (5).
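Concretely, both expressions have the linear form L = k × fb + A with k = c/(4 × Δf × fFMCW), so the two known holding distances yield two linear equations in k and A. The following sketch solves them; only the function and variable names are assumptions.

```python
C = 299_792_458.0  # speed of light [m/s]

def solve_calibration(L1, L2, fb1, fb2, f_fmcw):
    """Solve expressions (4) and (5) for delta_f and A (step S40080),
    given the known holding distances L1, L2 and the measured peak
    frequencies fb1, fb2.
    """
    k = (L1 - L2) / (fb1 - fb2)       # slope of L as a function of fb
    A = L1 - k * fb1                  # zero-point offset of the distance
    delta_f = C / (4.0 * k * f_fmcw)  # from k = c / (4 * delta_f * f_fmcw)
    return delta_f, A
```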
(Step S40090) In step S40090, the calibration device 700 transmits Δf and A determined in step S40080 to the sensing device 100 via the communication circuit 720. Δf and A are the parameters used when calculating the distance from the frequency of the spectral peak. During subsequent measurement operations, the sensing device 100 calculates the distance using the detection signal of the interference light acquired by the sensor 200 and these parameters. With the values of Δf and A updated, the sensing device 100 and the sensor 200 are calibrated.
As described above, in the system in the present embodiment, calibration of the sensor 200 is easily performed, so that ranging can be achieved and maintained with high accuracy.
Note that the operation when calibrating one of the sensors 200 is described here, but a plurality of the sensors may be calibrated. Data for calibration may be transmitted or received for each sensor or transmitted or received collectively for a plurality of sensors.
In the system of the present embodiment, the calibration device 700 transmits, to the sensing device 100, the control signal specifying output of a spectral peak that is an example of the spectrum information of interference light. The sensing device 100 outputs the spectral peak information to the calibration device 700, according to the specified data format. This allows the calibration device 700 to calibrate parameters when converting the spectrum information of interference light detected by the sensor 200 into distance or velocity. Use of the spectrum information as raw data before calculating distance or velocity makes it possible to easily perform calibration of the parameters when calculating distance or velocity by using the frequency analysis of interference light. Such a calibration may be performed in scenes such as when the sensing device 100 is installed, when there is a change in the usage environment of the sensing device 100, during maintenance for abnormality in the sensing device 100, or during regular maintenance of the sensing device 100, or the like, as well as when the sensing device 100 is shipped.
As described above, according to the system of the present embodiment, calibration can be easily performed for deterioration of the measurement accuracy of the sensing device 100 due to age deterioration in the laser characteristics, or the like, so that high reliability of measurements of the sensing device 100 can be maintained.
Next, Modification Example 1 of the fifth embodiment of the present disclosure will be described.
In the fifth embodiment, the spectral peak, that is, the frequency showing the maximum energy in the measured frequency range, was used as the data format of the spectrum information outputted from the sensing device 100 to the calibration device 700. In this modification example, a power spectrum is used as the data format of the spectrum information instead.
(Step S41060) In step S41060, the calibration device 700 outputs a control signal instructing the sensing device 100 to perform measurement. The control signal includes a signal such as a sensor number for specifying the sensor determined in step S40010 and a signal specifying the data format to be outputted. As described earlier, the power spectrum is specified as the output data format in this modification example. The sensing device 100 acquires the control signal from the calibration device 700 through the communication circuit 120.
(Step S41070) The sensing device 100 receives the control signal outputted by the calibration device 700 in step S41060 and performs the measurement operation. The sensing device 100 determines a power spectrum by frequency-analyzing the interference light detected by the photodetector 230. The data format of the power spectrum is, for example, similar to the format described above.
(Step S41080) In step S41080, the processing circuit 710 of the calibration device 700 calculates the standard noise power based on the power spectrum values acquired from the sensing device 100. Then, the processing circuit 710 determines a value exceeding the standard noise power as the extraction threshold of the spectral peak. The extraction threshold of the spectral peak is determined for each of the up-chirp and the down-chirp. The threshold may instead be determined based on a value inputted by the user through input means (not illustrated). In this case, the calibration device 700 displays the power spectrum on the display device 730.
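One way to derive such a threshold from the acquired power spectrum is sketched below; using the median as the standard noise power and a fixed margin is an assumption for illustration, not the method prescribed by this disclosure.

```python
import numpy as np

def peak_extraction_threshold(power, margin=3.0):
    """Determine a spectral-peak extraction threshold above the standard
    noise power (step S41080), per chirp direction.
    """
    power = np.asarray(power, dtype=float)
    noise_floor = np.median(power)     # robust against a few sharp peaks
    noise_spread = np.median(np.abs(power - noise_floor))
    return noise_floor + margin * noise_spread
```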
(Step S41090) In step S41090, the calibration device 700 transmits, to the sensing device 100, the extraction thresholds of the spectral peaks for the up-chirp and the down-chirp determined in step S41080. The extraction thresholds of the spectral peaks are examples of the parameters for calibrating the sensing device 100 in this modification example. In subsequent measurement operations, the sensing device 100 can accurately acquire the spectral peak of the interference light by using the extraction thresholds to distinguish the spectral peak of the interference light from peaks due to noise.
Note that the data format of the power spectrum only has to include, for each data point, a frequency or a numeric value corresponding to the frequency, and an intensity, for the number of data points.
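A minimal sketch of such a layout is shown below, assuming little-endian float32 pairs preceded by a point count; this byte layout is an illustrative choice, not a format defined by this disclosure.

```python
import struct

def pack_power_spectrum(freq_bins, intensities):
    """Pack a power spectrum as (frequency, intensity) pairs, carrying
    only what the format above requires: per data point, a frequency
    (or an equivalent numeric value such as a bin index) and an
    intensity, for the number of data points.
    """
    assert len(freq_bins) == len(intensities)
    out = struct.pack("<I", len(freq_bins))            # number of data points
    for f, p in zip(freq_bins, intensities):
        out += struct.pack("<ff", float(f), float(p))  # one data point
    return out
```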
According to a calibration system of this modification example, the user or the calibration device 700 can confirm a noise state of the interference light by checking S/N on the power spectrum. This makes it possible to determine presence or absence of abnormality in the sensor 200 or necessity of adjustment, and also facilitates identification of a cause of abnormality. Therefore, adjustment of the sensing device 100 at the time of shipment, adjustment at the time of maintenance, or repair at the time of maintenance is facilitated.
Next, Modification Example 2 of the fifth embodiment of the present disclosure will be described.
In the fifth embodiment and Modification Example 1, the sensing device 100 outputs the spectrum information such as the spectral peak or the power spectrum, according to the specification of the data format included in the control signal outputted by the calibration device 700. In contrast, in this modification example, the calibration device 700 outputs a control signal specifying an interference light waveform as the output data format, and the sensing device 100 outputs data on the interference light waveform according to the received control signal. The data on the interference light waveform is generated by digitizing the waveform of the interference light outputted from the photodetector 230 of the sensor 200. A configuration of the system of this modification example is similar to the configuration described using
(Step S42060) In step S42060, the calibration device 700 outputs a control signal instructing the sensing device 100 to perform measurement. The control signal includes a signal such as a sensor number for specifying the sensor determined in step S40010 and a signal specifying the output data format. The interference light waveform is specified as the output data format in this Modification Example 2. The sensing device 100 acquires the control signal from the calibration device 700 through the communication circuit 120.
(Step S42070) The sensing device 100 receives the control signal outputted by the calibration device 700 in step S42060 and performs the measurement operation. The sensing device 100 cuts out a waveform in a predefined section from the interference light detected by the photodetector 230. The cut-out waveform is, for example, a signal value digitized every fixed period of time. The sensing device 100 groups the cut-out interference light waveforms according to the predefined output format, and transmits the waveforms to the calibration device 700. In step S42070, the calibration device 700 acquires data on the interference light waveform transmitted by the sensing device 100.
(Step S42080) In step S42080, the processing circuit 710 of the calibration device 700 generates, based on the interference light waveform data acquired from the sensor 200 to be calibrated, a correction value that corrects the sampling interval used when fast Fourier transforming the interference light waveform. The correction value is determined so that, for example, distortion of the interference light waveform due to nonlinearity of the light emitting element 212 can be corrected. In addition, the correction value is determined for each of the up-chirp waveform and the down-chirp waveform.
(Step S42090) In step S42090, the calibration device 700 transmits the correction values for up-chirp and down-chirp, respectively, determined in step S42080 to the sensing device 100. The correction values are examples of the parameters for calibrating the sensing device 100 in this modification example.
During subsequent measurement operations, the sensing device 100 performs the fast Fourier transform based on the correction values when performing the frequency analysis on the detection signal of the photodetector 230. This allows the sensing device 100 to correct the distortion of the interference light waveform due to the nonlinearity of the light emitting element 212 and to perform measurement with high accuracy.
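The correction may be realized, for example, by resampling the digitized waveform at corrected sample times before the FFT, as in the following sketch. Representing the correction values as per-sample corrected times and using linear interpolation are assumptions for illustration.

```python
import numpy as np

def corrected_fft(waveform, corrected_sample_times):
    """Resample the digitized interference waveform at corrected sample
    times before the FFT (steps S42080-S42090), compensating for chirp
    nonlinearity of the light emitting element.
    """
    waveform = np.asarray(waveform, dtype=float)
    uniform_t = np.arange(len(waveform), dtype=float)  # original indices
    resampled = np.interp(corrected_sample_times, uniform_t, waveform)
    return np.fft.rfft(resampled * np.hanning(len(resampled)))
```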
As described above, according to the calibration system of this modification example, it is possible to easily correct the distortion of the interference light waveform due to differences in characteristics of each component of the light emitting element 212 or age deterioration of the light emitting element 212.
The techniques of the present disclosure can be widely used in devices or systems that acquire positional information of physical objects by sensing the surrounding environment. For example, the techniques of the present disclosure can be used in devices or systems that utilize FMCW LiDAR.
Priority application: No. 2021-137762, filed August 2021, Japan (national).
Related applications: parent application No. PCT/JP2022/031064 (WO), filed August 2022; child application No. 18441022 (US).