This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2021-010001, filed on Jan. 26, 2021, the entire contents of which are incorporated herein by reference.
The embodiments discussed herein are related to a method of collecting data and a computer-readable recording medium storing a data collection program.
There have been a known technique by which vehicle data is collected from a plurality of vehicles and a known technique by which a waste of communication costs is suppressed when sensing information obtained by sensors of target vehicles is collected.
Examples of the related art include the following: Japanese Laid-open Patent Publication No. 2019-040305.
According to an aspect of the embodiments, there is provided a computer-implemented method of collecting data. In an example, the method includes: collecting pieces of metadata associated with pieces of image data from a plurality of moving objects that hold the pieces of image data; and determining, when a specific piece of metadata that satisfies a condition is found in the collected pieces of metadata, whether to request a specific moving object from which the specific piece of metadata is collected to transmit a specific piece of image data with which the specific piece of metadata is associated, the determining being based on information for making collected numbers of the pieces of image data close to an equalized value and on a map that manages the collected numbers in a mesh shape.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention.
In some cases, image data from a camera mounted on a moving object such as a vehicle is collected from the moving object to create, for example, a map. In such cases, it is desired that the image data be collected entirely evenly. However, the image data is not necessarily collected entirely evenly. For example, pieces of image data at one specific position may be collected in a concentrated manner, while pieces of image data at another specific position may not be collected at all. In other words, variation may occur in collecting the image data, and accordingly, the collected numbers of pieces of image data are not equalized.
Accordingly, in one aspect, it is an object to provide a method of collecting data and a data collection program that make collected numbers of pieces of image data close to an equalized value.
Hereinafter, embodiments of the present disclosure are described with reference to the drawings.
First, an outline of a collection server 100 that executes a method of collecting data is described with reference to
Each of the vehicles 300 periodically transmits metadata D1 to the collection server 100. The metadata D1 is data for explaining image data D2 of an image captured by a camera (not illustrated) installed in the vehicle 300. The metadata D1 is associated with the image data D2. The metadata D1 includes a vehicle identifier (ID) for identifying the vehicle 300, positional information of the vehicle 300, the time when the image is obtained, controller area network (CAN) bus information, and so forth. The CAN bus information is information that flows through a bus of an onboard network called the CAN (CAN bus) and is, for example, detected by various sensors (for example, onboard sensors) such as an acceleration sensor and a vehicle speed sensor. The image may be a still image or a moving image in which still images continue in time series. A moving image may also be referred to as, for example, a video.
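As a concrete illustration, a piece of metadata D1 of the kind described above might be represented by a record such as the following; the field names and sample values are hypothetical and are not taken from the embodiments.

```python
from dataclasses import dataclass, field

@dataclass
class Metadata:
    """Hypothetical sketch of a piece of metadata D1 transmitted by a vehicle."""
    vehicle_id: str                # identifies the vehicle 300, e.g. "#A"
    latitude: float                # positional information of the vehicle
    longitude: float
    obtained_at: str               # time when the image was obtained
    can_bus: dict = field(default_factory=dict)  # CAN bus information from onboard sensors

# A vehicle would periodically transmit records like this to the collection server.
sample = Metadata("#A", 35.6812, 139.7671, "2021-01-26T10:00:00",
                  {"speed_kmh": 40.0, "acceleration": 0.2})
```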
The collection server 100 collects and stores various pieces of the metadata D1 periodically transmitted from each of the vehicles 300. When a user 10 operates an input device 11 to input predetermined conditions, the collection server 100 determines whether there is a specific piece of metadata D1 that satisfies the predetermined conditions among the various pieces of metadata D1. When the collection server 100 finds the specific piece of metadata D1, the collection server 100 identifies the vehicle 300 from which the identified piece of metadata D1 is collected and makes a request, to the identified vehicle 300, of transmission of a specific piece of image data D2 associated with the specific piece of metadata D1.
Thus, the vehicle 300 identified by the collection server 100 transmits to the collection server 100 the piece of image data D2 that the vehicle itself holds. When the piece of image data D2 is transmitted from the vehicle 300, the collection server 100 collects and stores the piece of image data D2 transmitted from the vehicle 300. The user 10 is able to check the image of the piece of image data D2 via a display device 12 by operating the input device 11 to access the collection server 100.
With reference to
The collection server 100 includes, as a processor, a central processing unit (CPU) 100A and, as memory, a random-access memory (RAM) 100B and a read-only memory (ROM) 100C. The collection server 100 also includes a network interface (I/F) 100D and a hard disk drive (HDD) 100E. A solid-state drive (SSD) may be used instead of the HDD 100E.
The collection server 100 may include, as desired, at least one of an input I/F 100F, an output I/F 100G, an input and output I/F 100H, and a drive device 100I. The elements from the CPU 100A to the drive device 100I are coupled to each other via an internal bus 100J. For example, the collection server 100 may be realized by a computer.
The input device 11 is coupled to the input I/F 100F. Examples of the input device 11 include a keyboard, a mouse, and a touch pad. The display device 12 is coupled to the output I/F 100G. Examples of the display device 12 include a liquid crystal display. A semiconductor memory 13 is coupled to the input and output I/F 100H. Examples of the semiconductor memory 13 include a Universal Serial Bus (USB) memory and a flash memory. The input and output I/F 100H reads a data collection program stored in the semiconductor memory 13. The input I/F 100F and the input and output I/F 100H include, for example, a USB port. The output I/F 100G includes, for example, a display port.
A portable recording medium 14 is inserted into the drive device 100I. Examples of the portable recording medium 14 include a removable disc such as a compact disc read-only memory (CD-ROM) and a Digital Versatile Disc (DVD). The drive device 100I reads the data collection program recorded in the portable recording medium 14. The network I/F 100D includes, for example, a local area network (LAN) port and a communication circuit.
The data collection program stored in at least one of the ROM 100C, the HDD 100E, and the semiconductor memory 13 is temporarily stored in the RAM 100B by the CPU 100A. The data collection program recorded in the portable recording medium 14 is temporarily stored in the RAM 100B by the CPU 100A. When the stored data collection program is executed by the CPU 100A, the CPU 100A realizes various types of functions to be described later and executes various types of processes to be described later. The data collection program may be configured to perform processing of a flowchart to be described later.
A functional configuration of the collection server 100 according to a first embodiment is described with reference to
As illustrated in
The storage unit 110 includes, as elements thereof, a metadata database (DB) 111, a data request queue 112, a data request management DB 113, a management map DB 114, and an image data DB 115. At least one of the elements of the storage unit 110 may be distributed to and provided in another server (not illustrated) different from the collection server 100.
The processing unit 120 includes, as elements thereof, a metadata collection unit 121, a collection determination unit 122, a data request distribution unit 123, and an image data storing unit 124. At least one of the elements of the processing unit 120 selectively accesses an element of the storage unit 110 to execute various types of processes. For example, the metadata collection unit 121 collects the metadata D1 transmitted from the vehicle 300 via the communication unit 150 and stores the collected metadata D1 in the metadata DB 111. In this way, the metadata DB 111 stores the metadata D1. The other elements will be described in detail in the description of operations of the collection server 100.
A hardware configuration and a functional configuration of the vehicle 300 are described with reference to
As illustrated in
As illustrated in
The information detection unit 320 detects various types of information such as the speed and the acceleration of the vehicle 300 and outputs the detected information as the CAN bus information to the CAN bus 300G. The position obtaining unit 330 obtains positional information of the vehicle 300 based on a Global Positioning System (GPS) function. The positional information may be information on a running position of the vehicle 300 or information on a stop position of the vehicle 300. The imaging unit 340 captures an image within a predetermined field angle range in front of the vehicle 300 and generates and holds the image data D2 of the image within the predetermined field angle range. The onboard communication unit 350 receives a data request to be described later and transmits the metadata D1 and the image data D2.
The control unit 310 controls operations of the entirety of the vehicle 300 including the information detection unit 320, the position obtaining unit 330, the imaging unit 340, and the onboard communication unit 350. For example, the control unit 310 obtains the image data D2 generated and held by the imaging unit 340 and associates the image data D2 with, for example, the vehicle ID, the positional information, the CAN bus information, and the time when the image data D2 is obtained as the metadata D1 and holds the image data D2. Instead of the time when the image data D2 is obtained, the time when the imaging unit 340 captures the image may be used. The control unit 310 periodically transmits the metadata D1 via the onboard communication unit 350. Upon receiving the data request via the onboard communication unit 350, the control unit 310 transmits the image data D2 corresponding to the data request via the onboard communication unit 350.
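The vehicle-side behavior described above can be summarized by the following minimal sketch; the class and method names are hypothetical, and image data is keyed by the time associated with it as metadata, as in the embodiment.

```python
class VehicleController:
    """Minimal sketch of the control unit 310's behavior (names are hypothetical)."""

    def __init__(self, vehicle_id):
        self.vehicle_id = vehicle_id
        self.images = {}  # time when obtained -> piece of image data D2 held by the vehicle

    def capture(self, time, image_bytes, position, can_info):
        # Hold the image data and build the piece of metadata D1 associated with it.
        self.images[time] = image_bytes
        return {"vehicle_id": self.vehicle_id, "time": time,
                "position": position, "can": can_info}

    def on_data_request(self, request):
        # Identify the piece of image data D2 whose associated time matches the
        # data request, and transmit it together with the request ID.
        return {"request_id": request["request_id"],
                "image": self.images.get(request["time"])}

ctrl = VehicleController("#C")
meta = ctrl.capture("t1", b"jpeg-bytes", (35.68, 139.76), {"speed": 30})
reply = ctrl.on_data_request({"request_id": "3", "time": "t1"})
```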
Next, the operations of the collection server 100 are described with reference to
First, operations of the metadata collection unit 121 are described with reference to
Thus, for example, when a piece of metadata D1 including the vehicle ID “#A”, a piece of metadata D1 including the vehicle ID “#B”, and the like are stored in the metadata DB 111 as illustrated in the upper part of
Next, operations of the collection determination unit 122 according to the first embodiment are described with reference to
The extraction conditions are conditions related to the positional information and the CAN bus information when a specific piece of metadata D1 is found and extracted from the various pieces of metadata D1 stored in the metadata DB 111. A piece of image data D2 associated with the piece of metadata D1 extracted according to the extraction conditions is to be collected by the collection server 100. Accordingly, the extraction conditions may also be referred to as the collection conditions of the image data D2.
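The check of a piece of metadata D1 against the extraction conditions might look like the following sketch; the concrete condition keys (a latitude/longitude bounding box and a minimum speed from the CAN bus information) are assumptions chosen for illustration.

```python
def satisfies_conditions(metadata, conditions):
    """Return True when a piece of metadata D1 meets the extraction conditions (sketch)."""
    lat, lon = metadata["position"]
    # Condition on the positional information: a hypothetical bounding box.
    in_area = (conditions["lat_min"] <= lat <= conditions["lat_max"]
               and conditions["lon_min"] <= lon <= conditions["lon_max"])
    # Condition on the CAN bus information: a hypothetical minimum speed.
    fast_enough = metadata["can"]["speed"] >= conditions["min_speed"]
    return in_area and fast_enough

conds = {"lat_min": 35.0, "lat_max": 36.0,
         "lon_min": 139.0, "lon_max": 140.0, "min_speed": 20}
hit = satisfies_conditions({"position": (35.68, 139.76), "can": {"speed": 30}}, conds)
miss = satisfies_conditions({"position": (34.00, 139.76), "can": {"speed": 30}}, conds)
```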
The equalization conditions include a mesh size and a collection upper limit. The mesh size is the level that defines the granularity of a mesh (grid). For example, as illustrated in
Although it is not illustrated, the third-level mesh is a mesh region defined by equally dividing the second-level mesh by ten in each of the latitude direction and the longitude direction. The third-level mesh has a latitude interval of 30 seconds and a longitude interval of 45 seconds. The fourth-level mesh is a mesh region defined by equally dividing the third-level mesh by two in each of the latitude direction and the longitude direction. The fourth-level mesh has a latitude interval of 15 seconds and a longitude interval of 22.5 seconds. Although description of the fifth-level mesh and the sixth-level mesh is omitted, these meshes may be viewed in a predetermined web page the uniform resource locator (URL) of which is https://www.fttsus.jp/worldgrids/ja/top-ja/. According to the present embodiment, a fourth-level mesh defined as follows is described as an example: the third-level mesh is equally divided by three in each of the latitude direction and the longitude direction to have nine mesh regions having a latitude interval of 10 seconds and a longitude interval of 15 seconds. The collection upper limit included in the equalization conditions represents an upper limit number when pieces of image data D2 are collected. The collection determination unit 122 generates a management map in accordance with the mesh size, sets the collection upper limit of the pieces of image data D2 for individual sections (hereafter, referred to as mesh regions), and stores the collection upper limit in the management map DB 114.
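Under the fourth-level mesh used in the present embodiment (a latitude interval of 10 seconds and a longitude interval of 15 seconds), a mesh index for a position could be computed as follows. The embodiment specifies only the intervals, so the particular index scheme (row and column counted from latitude 0, longitude 0, with positions rounded to whole seconds) is an assumption for illustration.

```python
LAT_INTERVAL_SEC = 10   # latitude interval of the assumed fourth-level mesh
LON_INTERVAL_SEC = 15   # longitude interval of the assumed fourth-level mesh

def mesh_index(lat_deg, lon_deg):
    """Map a position in degrees to the (row, column) of its mesh region."""
    lat_sec = round(lat_deg * 3600)   # position in whole arc-seconds (sketch precision)
    lon_sec = round(lon_deg * 3600)
    return (lat_sec // LAT_INTERVAL_SEC, lon_sec // LON_INTERVAL_SEC)

# Positions less than one interval apart fall in the same mesh region;
# positions a full interval apart fall in different regions.
a = mesh_index(35.0000, 139.0000)
b = mesh_index(35.0010, 139.0000)   # about 3.6 arc-seconds north of a
c = mesh_index(35.0030, 139.0000)   # about 10.8 arc-seconds north of a
```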
As illustrated in
When the collection determination unit 122 determines that the conditions are not satisfied (step S14: NO), the processing ends. In contrast, when the collection determination unit 122 determines that the conditions are satisfied (step S14: YES), the collection determination unit 122 calculates the mesh ID (step S15). According to the present embodiment, as illustrated in
When the mesh ID is calculated, the collection determination unit 122 obtains a collection ratio (step S16). As described above, when the mesh ID “#5” is calculated, as illustrated in
After obtaining the collection ratio, the collection determination unit 122 next determines whether the collection ratio is less than the collection upper limit (step S17). When the collection ratio is not less than the collection upper limit (step S17: NO), the collection determination unit 122 ends the processing. In contrast, when the collection ratio is less than the collection upper limit (step S17: YES), the collection determination unit 122 generates management information and a data request (step S18). For example, the collection determination unit 122 issues a request ID to identify the data request, and as illustrated in
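Steps S16 and S17 above can be sketched as a small management-map structure that tracks a collected number per mesh region against a shared collection upper limit; the dictionary representation is an assumption.

```python
class ManagementMap:
    """Sketch of the management map: collected numbers managed per mesh region."""

    def __init__(self, upper_limit):
        self.upper_limit = upper_limit   # collection upper limit per mesh region
        self.collected = {}              # mesh ID -> collected number

    def ratio(self, mesh_id):
        # Step S16: obtain the collection ratio for the mesh region.
        return f"{self.collected.get(mesh_id, 0)}/{self.upper_limit}"

    def should_request(self, mesh_id):
        # Step S17: request image data only while below the collection upper limit.
        return self.collected.get(mesh_id, 0) < self.upper_limit

m = ManagementMap(upper_limit=3)
m.collected["#5"] = 2   # two pieces of image data already collected for mesh "#5"
```

With two of three pieces collected, the ratio is "2/3" and a further data request is still generated; once the collected number reaches the upper limit, `should_request` becomes false and the processing ends at step S17.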
When the collection determination unit 122 generates the data request and the management information, the collection determination unit 122 stores the management information and the data request (step S19) and ends the processing. For example, as illustrated in
Next, operations of the data request distribution unit 123 are described with reference to
Upon receiving the obtaining request, the data request distribution unit 123 subsequently refers to the data request queue 112 (step S22) and determines whether there is a data request (step S23). In more detail, the data request distribution unit 123 refers to the data request queue 112 based on the vehicle ID “#C” included in the obtaining request and determines whether there is a data request including the vehicle ID “#C”.
When there is the data request (step S23: YES), the data request distribution unit 123 obtains and distributes the data request (step S24) and ends the processing. According to the present embodiment, as described above, the data request queue 112 stores the data request for the vehicle ID “#C” (see
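Steps S22 through S24 can be sketched as follows; representing the data request queue 112 as a per-vehicle-ID queue, and the shape of the empty response, are assumptions for illustration.

```python
from collections import deque

class DataRequestQueue:
    """Sketch of the data request queue 112, keyed by vehicle ID (an assumption)."""

    def __init__(self):
        self._queues = {}

    def push(self, request):
        self._queues.setdefault(request["vehicle_id"], deque()).append(request)

    def distribute(self, vehicle_id):
        # Steps S22-S24: hand over a pending data request if one exists;
        # otherwise, answer the obtaining request with an empty response.
        q = self._queues.get(vehicle_id)
        if q:
            return q.popleft()
        return {"empty": True}

queue = DataRequestQueue()
queue.push({"vehicle_id": "#C", "request_id": "3", "time": "t1"})
first = queue.distribute("#C")    # the stored data request is distributed
second = queue.distribute("#C")   # nothing is left: an empty response
```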
In the vehicle 300 with the vehicle ID “#C”, different processes are executed depending on whether the vehicle 300 receives the data request or the empty response. When the onboard communication unit 350 receives the data request, the control unit 310 identifies the piece of image data D2 with which the time included in the data request is associated as the metadata D1. When the control unit 310 identifies the piece of image data D2, the onboard communication unit 350 associates the identified piece of image data D2 with the request ID “3” (see
Next, operations of the image data storing unit 124 are described with reference to
Upon receiving the piece of image data D2, as illustrated in
When the piece of image data D2 is stored, as illustrated in
The image data storing unit 124 obtains the positional information from the management information including the request ID “3” before or after the registration of the address. According to the present embodiment, the image data storing unit 124 obtains the positional information (Ing1, Int1) (see
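When a piece of image data D2 is stored, the collected number for the corresponding mesh region is incremented; a minimal sketch of that update, assuming the collection upper limit "3" of the running example:

```python
UPPER_LIMIT = 3  # collection upper limit assumed in the running example

def on_image_stored(collected, mesh_id):
    """Increment the collected number for a mesh region and return the new ratio."""
    collected[mesh_id] = collected.get(mesh_id, 0) + 1
    return f"{collected[mesh_id]}/{UPPER_LIMIT}"

collected = {"#5": 2}                          # ratio "2/3" before storing
new_ratio = on_image_stored(collected, "#5")   # the mesh region reaches "3/3"
```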
Thus, the collected number “3” reaches the collection upper limit, and thereafter, the piece of image data D2 corresponding to the mesh region of the mesh ID “#5” is not requested. Accordingly, when the above-described processing is similarly executed for the mesh regions other than that of the mesh ID “#5”, all the mesh regions converge to the collection ratio “3/3”. Thus, the collected numbers of the pieces of image data D2 are equalized without variation occurring in the collection of the pieces of image data D2. Also, since the collection of the image data D2 is stopped at the collection upper limit, the collection efficiency of the image data D2 is improved. When the user 10 operates the input device 11 to access the image data DB 115, the user 10 may view the image data D2 by causing the display device 12 to display the image data D2.
Referring next to
As illustrated in
Next, operations of the collection determination unit 122 according to the second embodiment are described with reference to
When the collection determination unit 122 obtains the average of the collected numbers, the collection determination unit 122 next generates the management information and the data request (step S53). In the process in step S53, the collection determination unit 122 generates the management information and the data request in a similar manner to that in step S18. Thus, the management information and the data request described with reference to
When the collection determination unit 122 generates the management information and the data request, the collection determination unit 122 next determines whether the collected number is greater than the average (step S54). For example, the collection determination unit 122 determines whether the collected number of the pieces of image data D2 corresponding to the target mesh ID is greater than the collected numbers of the pieces of image data D2 corresponding to the mesh IDs other than the target mesh ID.
When the collected number is greater than the average (YES in step S54), as illustrated in
In contrast, when the collected number is smaller than or equal to the average (NO in step S54), although it is not illustrated, the collection determination unit 122 sets a priority Mid in the data request (step S56). The priority Mid is information for not adjusting the distribution order of the data requests. For example, when the collected number of the pieces of image data D2 corresponding to the target mesh ID is relatively small, collection of the image data D2 is promoted so as to make the collected numbers of the pieces of image data D2 close to an equalized value. When the collected number is 0 (zero), the collection determination unit 122 may set a priority High in the data request.
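Steps S54 through S56 amount to choosing a priority from the collected number and the average; a sketch follows, in which the zero-case High rule reflects the optional behavior mentioned above and the label strings are assumed.

```python
def priority_for(collected_number, average):
    """Choose a distribution priority for a data request (sketch of steps S54-S56)."""
    if collected_number == 0:
        return "High"   # optionally promote mesh regions with no images at all
    if collected_number > average:
        return "Low"    # postpone requests for over-collected mesh regions
    return "Mid"        # leave the distribution order unadjusted

# Example: three mesh regions with uneven collected numbers.
counts = {"#1": 5, "#2": 2, "#3": 0}
avg = sum(counts.values()) / len(counts)            # 7/3, about 2.33
labels = {mid: priority_for(n, avg) for mid, n in counts.items()}
```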
When the collection determination unit 122 sets the priority Low or Mid in the data request, the collection determination unit 122 next stores the management information generated in the process in Step S53 (Step S57). For example, similarly to the first embodiment, the collection determination unit 122 stores the management information in the data request management DB 113. When the collection determination unit 122 stores the management information, as illustrated in
As illustrated in
Next, operations of the data request adjustment unit 125 are described with reference to
When the data request storing unit 126 determines the priority, the data request storing unit 126 stores the data request in the corresponding queue (step S63) and ends the processing. For example, when it is determined that the priority Low is set in the data request, the data request storing unit 126 stores the data request in the third queue 128C as illustrated in
When the data request storing unit 126 stores the data request, as illustrated in
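The behavior of the priority-divided queues can be sketched as follows; the mapping of the priorities High, Mid, and Low to a first, second, and third queue, and the distribution order among them, are assumptions based on the description above (the third queue 128C holding Low-priority requests).

```python
from collections import deque

class PriorityQueues:
    """Sketch of priority-divided data request queues (the mapping is an assumption)."""

    ORDER = ("High", "Mid", "Low")  # assumed distribution order

    def __init__(self):
        self._queues = {p: deque() for p in self.ORDER}

    def store(self, request):
        # Step S63: store the data request in the queue matching its priority.
        self._queues[request["priority"]].append(request)

    def next_request(self):
        # Distribute higher-priority data requests first so that
        # under-collected mesh regions catch up.
        for p in self.ORDER:
            if self._queues[p]:
                return self._queues[p].popleft()
        return None

pq = PriorityQueues()
pq.store({"request_id": "1", "priority": "Low"})
pq.store({"request_id": "2", "priority": "High"})
```

Even though the Low-priority request was stored first, the High-priority request is distributed first, which defers collection for over-collected mesh regions.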
Upon receiving the data request, as illustrated in
Although the preferred embodiments according to the present disclosure have been described in detail above, the present disclosure is not limited to these specific embodiments, and various modifications and changes may be made without departing from the gist of the present disclosure described in the claims.
For example, according to the embodiments described above, it has been described that the collection determination unit 122 receives and holds the collection request D3 input by the user 10 in advance and checks the extraction conditions of the collection request D3 against the metadata D1 every time the metadata D1 is collected. In contrast, the metadata D1 may be periodically collected and stored and the extraction conditions of the collection request D3 may be checked against the metadata D1 when the collection determination unit 122 receives and holds the collection request D3 input by the user 10 afterward.
Although the management map corresponding to the positional information is used according to the above-described embodiments, a management map corresponding to time or to the vehicle ID may be used instead. This may suppress a situation in which, for example, images are collected in a concentrated manner for a specific one minute even though the user 10 wants to view a change over time at predetermined intervals before and after an accident, and the collection of the image data D2 may be equalized.
All examples and conditional language provided herein are intended for the pedagogical purposes of aiding the reader in understanding the invention and the concepts contributed by the inventor to further the art, and are not to be construed as limitations to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although one or more embodiments of the present invention have been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
Number | Date | Country | Kind
---|---|---|---
2021-010001 | Jan 2021 | JP | national