The present invention relates to a data processing apparatus, a data processing system, and a data processing method.
As a method to display a site that includes industrial facilities and equipment, common structures, a natural environment, and the like as a display object, for example, LiDAR (Light Detection and Ranging) is used. The LiDAR can acquire the shape of an object as point group data by scanning the object with irradiation light. For example, the principle of ToF (Time of Flight) LiDAR is as follows. The LiDAR includes a light emission unit emitting the irradiation light, and a detection unit detecting reflected light, that is, the irradiation light reflected by the object. The LiDAR detects the reflected light reflected by the object while scanning the object with the irradiation light at a predetermined angle of view. Further, the LiDAR calculates a distance D to the object from the expression D=(t2−t1)/2×(light velocity), where t1 is the time at which the irradiation light is emitted toward the object and t2 is the time at which the reflected light reaches the detection unit. As a result, the LiDAR can acquire, as scan data in the scanned range, the point group data including the distance to the object.
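The distance calculation above can be made concrete with a minimal sketch. It assumes that t1 and t2 are timestamps on a common clock, t1 at emission and t2 at detection, so that t2−t1 is the round-trip time; the names are illustrative, not from the disclosure.

```python
# Minimal ToF sketch: D = (t2 - t1) / 2 * (light velocity).
C = 299_792_458.0  # speed of light in m/s

def tof_distance(t1: float, t2: float) -> float:
    """Distance to the object, given emission time t1 and detection time t2 (s)."""
    return (t2 - t1) / 2.0 * C

# A round trip of 200 ns corresponds to an object roughly 30 m away.
print(tof_distance(0.0, 200e-9))  # ~29.98
```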
For example, Non-Patent Literature 1 discloses that RGB information acquired by a camera is imparted to each point of the point group data.
Non-Patent Literature 1: “FARO 3D Digital Documentation Software for Terrestrial Laser Scanner and Hand Scanner”, SCENE 2019, [online], [searched on Feb. 21, 2020], Internet <URL: http://www.faro.com/ja-jp/products/3d-design/faro-scene>
In the display disclosed in Non-Patent Literature 1, each point of the point group data is colored based on the RGB information. Therefore, other positional information cannot be imparted, for example, coloring with a specific color to highlight an abnormal position found in an inspection result. Further, in the display disclosed in Non-Patent Literature 1, it is not possible to acquire color information on a position, such as a far place or the sky, where the reflected light is not acquirable because the irradiation light does not reach it and thus the point group data is not acquirable. As described above, the display of the point group data disclosed in Non-Patent Literature 1 has an issue in terms of convenience.
The present disclosure has been made to solve such issues, and an objective of the present disclosure is to provide a data processing apparatus, a data processing system, and a data processing method that can improve convenience of display of point group data.
A data processing apparatus according to the present disclosure includes a point group data processing means configured to display, when a viewpoint for display of point group data obtained by scanning a site is input, the point group data viewed from the viewpoint, and an image data processing means configured to determine image data corresponding to the point group data to be displayed, from a plurality of pieces of image data obtained by imaging the site from different positions, and to display the determined image data.
A data processing system according to the present disclosure includes the above-described data processing apparatus, and a point group data acquisition means configured to acquire the point group data.
A data processing method according to the present disclosure includes a point group data display step of displaying, when a viewpoint for display of point group data obtained by scanning a site is input, the point group data viewed from the viewpoint, and an image data display step of determining image data corresponding to the point group data to be displayed, from a plurality of pieces of image data obtained by imaging the site from different positions, and displaying the determined image data.
According to the present disclosure, it is possible to provide the data processing apparatus, the data processing system, and the data processing method that can improve convenience of display of the point group data.
Some example embodiments are described below with reference to the drawings. To clarify the description, the following description and the drawings are omitted and simplified as appropriate. Further, in the drawings, the same elements are denoted by the same reference numerals, and repetitive descriptions are omitted as necessary.
(Outline of Example Embodiment)
First, an outline of an example embodiment according to the present disclosure is described.
As illustrated in the drawings, a data processing apparatus 1 according to the present example embodiment includes a point group data processing unit 10 and an image data processing unit 20, which have functions as a point group data processing means and an image data processing means, respectively. When a viewpoint for display of point group data obtained by scanning a site is input, the point group data processing unit 10 displays the point group data viewed from the viewpoint. The image data processing unit 20 determines image data corresponding to the point group data to be displayed, from a plurality of pieces of image data obtained by imaging the site from different positions, and displays the determined image data.
On the other hand, as illustrated in the drawings, the point group data processing unit 10 and the image data processing unit 20 may arrange and display the point group data and the image data together, for example, in a vertical direction.
The display arrangement of the point group data and the image data is not limited to the arrangement in the vertical direction. The point group data processing unit 10 and the image data processing unit 20 may arrange and display the point group data and the image data in a horizontal direction, in an oblique direction inclined from the horizontal direction, or in a partially overlapping manner. Further, a lateral display size and a vertical display size of the image data may be different from a lateral display size and a vertical display size of the point group data. The arrangement direction, a size of the overlapping area, and the display size are not limited as long as the image data and the point group data are viewable together.
The image data processing unit 20 may display the image data in synchronization with a timing when the point group data processing unit 10 displays the point group data. For example, the image data processing unit 20 displays the image data at the same time when the point group data is displayed. The point group data processing unit 10 and the image data processing unit 20 may adjust timings when the point group data and the image data are displayed, by using a buffer or the like.
As described above, the data processing apparatus 1 according to the present example embodiment can display the image data corresponding to the display of the point group data. For example, the data processing apparatus 1 according to the present example embodiment can display the image data corresponding to the display of the point group data, alongside the point group data. This makes it possible to improve visibility of the point group data obtained by scanning the site and the image data obtained by imaging the site, and to improve browsability. Further, when the point group data is moved or rotated, the image data matching with the displayed point group data can be displayed. This makes it possible to improve convenience of the display of the point group data and the image data. Further, the timing when the point group data is displayed and the timing when the image data is displayed are synchronized with each other. This makes it possible to further improve visibility and browsability.
The data processing apparatus 1 may be configured by hardware including a microcomputer. The microcomputer includes, for example, a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and an interface unit (I/F). The CPU performs data processing such as calculation processing and control processing. The ROM stores data processing programs such as a calculation processing program and a control processing program to be executed by the CPU. The RAM stores various kinds of data such as point group data and image data. The interface unit (I/F) performs input/output of signals and data with outside. The CPU, the ROM, the RAM, and the interface unit are mutually connected through a data bus and the like.
Next, details of the data processing apparatus 1 according to a first example embodiment are described.
As illustrated in the drawings, the point group data processing unit 10 includes a point group data holding unit 11, a viewpoint operation input unit 12, a sight line calculation unit 13, and a point group data display unit 14. The point group data holding unit 11, the viewpoint operation input unit 12, the sight line calculation unit 13, and the point group data display unit 14 have functions as a point group data holding means, a viewpoint operation input means, a sight line calculation means, and a point group data display means, respectively.
The point group data holding unit 11 holds at least one piece of point group data obtained by scanning the site. The point group data is data indicating, for example, the structures constituting the site as a group of a plurality of points, each represented by a coordinate in a three-dimensional space. The point group data is obtained by, for example, measuring a distance from a predetermined position to an object, and may be data on the site acquired by the LiDAR. Note that the point group data is not limited to data acquired by the LiDAR, and may be data acquired by another method of measuring the distance from the predetermined position to the object, for example, by a radar using radio waves. The point group data is displayed by using, for example, a point group viewer. The point group data holding unit 11 is connected to the point group data display unit 14, and outputs the point group data held by itself to the point group data display unit 14.
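As a minimal sketch of the data structure described above, the point group data can be held as an N×3 array of three-dimensional coordinates; the class and method names below are illustrative assumptions, not from the disclosure.

```python
import numpy as np

class PointGroupHolder:
    """Holds point group data as an (N, 3) array of XYZ coordinates."""
    def __init__(self) -> None:
        self._points = np.empty((0, 3), dtype=np.float64)

    def hold(self, points: np.ndarray) -> None:
        # Append one scan's worth of points to the held data.
        self._points = np.vstack([self._points, points])

    def output(self) -> np.ndarray:
        # Output the held point group data, e.g., to a display unit.
        return self._points
```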
The viewpoint operation input unit 12 receives input of a viewpoint for display of the point group data. A user who wants to display the point group data inputs the viewpoint for display of the point group data through the viewpoint operation input unit 12. The viewpoint operation input unit 12 is, for example, a keyboard, a mouse, an input unit of the point group viewer, or the like. The viewpoint operation input unit 12 is connected to the sight line calculation unit 13, and outputs the input viewpoint to the sight line calculation unit 13.
The sight line calculation unit 13 calculates sight line information including relationship between the input viewpoint and the point group data. The sight line information includes information on a position of the viewpoint for display of the point group data, and information on a direction, an orientation, and the like from the viewpoint to the point group data. The sight line calculation unit 13 is connected to the point group data display unit 14, and outputs the calculated sight line information to the point group data display unit 14. Further, the sight line calculation unit 13 is connected to an image determination unit 26, and outputs the calculated sight line information to the image determination unit 26.
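The sight line information can be pictured as a viewpoint position plus a direction and orientation toward the point group. The following is a rough sketch under the assumption that the direction is taken toward the centroid of the point group; the field and function names are illustrative, not from the disclosure.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class SightLine:
    position: np.ndarray   # position of the viewpoint
    direction: np.ndarray  # unit vector from the viewpoint toward the point group
    up: np.ndarray         # orientation of the view (an "up" vector)

def calculate_sight_line(viewpoint: np.ndarray, points: np.ndarray) -> SightLine:
    target = points.mean(axis=0)            # e.g., aim at the point group centroid
    direction = target - viewpoint
    direction = direction / np.linalg.norm(direction)
    return SightLine(viewpoint, direction, np.array([0.0, 0.0, 1.0]))
```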
The point group data display unit 14 displays the point group data viewed from the viewpoint, based on the calculated sight line information. The point group data display unit 14 is, for example, a GUI (Graphical User Interface), and displays the point group data viewed from the viewpoint to the user.
The image data processing unit 20 includes an image data holding unit 21, an image information holding unit 22, a site map holding unit 23, an image data display unit 24, an image area calculation unit 25, and the image determination unit 26. The image data holding unit 21, the image information holding unit 22, the site map holding unit 23, the image data display unit 24, the image area calculation unit 25, and the image determination unit 26 have functions as an image data holding means, an image information holding means, a site map holding means, an image data display means, an image area calculation means, and an image determination means, respectively.
The image data holding unit 21 holds a plurality of pieces of image data obtained by imaging the site from different positions. The image data may be color image data including RGB pixels, or gray-scale image data. The image data is image data captured by, for example, a camera. The image data holding unit 21 is connected to the image data display unit 24, and outputs the image data to the image data display unit 24.
The image information holding unit 22 holds imaging information on the plurality of pieces of image data held by the image data holding unit 21. The imaging information includes imaging conditions such as the position, posture, and angle of view of the camera capturing the image data. The image information holding unit 22 is connected to the image area calculation unit 25, and outputs the imaging information held by itself to the image area calculation unit 25.
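As a small illustration, the imaging information for one piece of image data might be held as follows; the field names, and the reduction of the camera posture to a single heading angle on the structure map, are assumptions for the sketch, not from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class ImagingInfo:
    position: tuple[float, float]  # camera position on the structure map
    heading: float                 # camera posture as a heading angle (radians)
    fov: float                     # angle of view (radians)

# Example: imaging information for image data captured from imaging position CA.
info_ca = ImagingInfo(position=(0.0, 0.0), heading=0.0, fov=1.2)
```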
The site map holding unit 23 holds site information. The site information includes, for example, a structure map indicating layout of structures in the site, and structure data indicating positions of the structures disposed in the site.
As illustrated in the drawings, the structure map KM indicates the layout of the structures KD1 to KD3 in the site, and the structure data indicates the positions of the structures KD1 to KD3 disposed on the structure map KM.
The site map holding unit 23 may include site information other than the structure map KM and the structure data. For example, the site map holding unit 23 may hold site information such as a design drawing indicating the structures in the site. Further, the site map holding unit 23 may hold site information generated by a technique such as SLAM (Simultaneous Localization and Mapping) simultaneously performing self-position estimation and site map creation. The site map holding unit 23 is connected to the image area calculation unit 25, and outputs the site information held by itself, to the image area calculation unit 25. Further, the site map holding unit 23 is connected to the image determination unit 26, and outputs the site information held by itself to the image determination unit 26.
The image data display unit 24 displays the image data. For example, the image data display unit 24 displays the image data alongside the point group data. The image data to be displayed is the image data determined by the image determination unit 26 described below. The image data display unit 24 may display the image data in synchronization with a timing when the point group data display unit 14 displays the point group data. In a case where a time difference occurs between the display by the point group data display unit 14 and the display by the image data display unit 24, the display timings may be adjusted. The image data display unit 24 is, for example, a GUI, and presents the image data to the user.
The image area calculation unit 25 calculates imaged structure portions common to an imaging area of the site captured in the image data and the structure data indicating the positions of the structures KD1 to KD3 disposed in the site. A method of calculating the imaged structure portions is described below with reference to drawings.
For example, the image area calculation unit 25 calculates the imaging area SA on the structure map KM by using imaging information on image data captured from an imaging position CA. Further, the image area calculation unit 25 calculates the imaging area SB on the structure map KM by using imaging information on image data captured from an imaging position CB. On the other hand, as described above, the structures KD1 to KD3 are disposed on the structure map KM in the site information.
The image area calculation unit 25 calculates the imaged structure portions HA1 to HA3 and HB1 to HB3 common to the plurality of imaging areas SA and SB calculated by using the imaging information, and the structure data held by the site map holding unit 23.
Further, the image area calculation unit 25 divides the imaging area SA into minute angles, and determines, among intersections between straight lines indicating boundaries of the minute angles and the structures KD1 to KD3 indicated by the structure data, the intersections closest to the imaging position CA. In a case where the structure data is discrete data, a point whose distance from any of the straight lines indicating the boundaries of the minute angles is less than or equal to a predetermined value may be regarded as an intersection. Among the acquired intersections, points close to one another are connected by a line. The portions each formed by the connected intersections correspond to the imaged structure portions HA1, HA2, and HA3. For example, the imaged structure portion HA1 includes minute imaging areas SA1 and SA2, the imaged structure portion HA2 includes minute imaging areas SA8 to SA12, and the imaged structure portion HA3 includes minute imaging areas SA14 and SA15.
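To make the minute-angle search concrete, the following is a minimal two-dimensional sketch on the structure map. It assumes the structure data can be modeled as line segments and the imaging information as a position, heading, and angle of view; every name here is illustrative, not from the disclosure. Each ray stands for a boundary of a minute angle, only the intersection nearest the imaging position is kept per ray, and grouping the kept points by structure corresponds to forming the imaged structure portions.

```python
import numpy as np

def _cross2(v, w):
    """Scalar 2D cross product."""
    return v[0] * w[1] - v[1] * w[0]

def ray_segment_hit(p, r, a, b):
    """Distance t along the ray p + t*r (t >= 0) to segment a-b, or None."""
    d, q = b - a, a - p
    rxd = _cross2(r, d)
    if abs(rxd) < 1e-12:
        return None  # ray parallel to the segment
    t = _cross2(q, d) / rxd  # position along the ray
    u = _cross2(q, r) / rxd  # position along the segment
    return t if t >= 0.0 and 0.0 <= u <= 1.0 else None

def imaged_structure_portions(origin, heading, fov, segments, n_rays=64):
    """Sweep the angle of view in minute angles and keep, per ray, the
    intersection closest to `origin`, grouped by structure index."""
    origin = np.asarray(origin, dtype=float)
    portions = {}  # structure index -> nearest intersection points on it
    for ang in np.linspace(heading - fov / 2.0, heading + fov / 2.0, n_rays):
        r = np.array([np.cos(ang), np.sin(ang)])
        best = None  # (distance, structure index) of the closest hit
        for k, (a, b) in enumerate(segments):
            t = ray_segment_hit(origin, r, np.asarray(a, float), np.asarray(b, float))
            if t is not None and (best is None or t < best[0]):
                best = (t, k)
        if best is not None:
            portions.setdefault(best[1], []).append(origin + best[0] * r)
    return portions
```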
As illustrated in the drawings, the image area calculation unit 25 similarly calculates the imaged structure portions HB1 to HB3 by using the imaging information on the image data captured from the imaging position CB.
When a viewpoint for display of the point group data obtained by scanning the site is input, the image determination unit 26 calculates visible structure portions common to visible areas facing the viewpoint side in the point group data and the structure data on the site. A method of calculating the visible structure portions is described below with reference to drawings.
Thereafter, the image determination unit 26 determines, among intersections between straight lines indicating boundaries of the minute angles and the structures KD1 to KD3 indicated by the structure data, the intersections closest to the viewpoint PA. In the case where the structure data is discrete data, a point whose distance from any of the straight lines indicating the boundaries of the minute angles is less than or equal to a predetermined value may be regarded as an intersection. Among the acquired intersections, points close to one another are connected by a line. As a result, as illustrated in the drawings, the visible structure portion MA common to the visible area RA and the structure data is obtained.
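Note that this is the same minute-angle sweep as for the imaged structure portions. In terms of the sketch above, calling imaged_structure_portions with the viewpoint PA and the display angle of view in place of the imaging position CA and the camera's angle of view would yield the visible structure portions such as MA; this reuse is an observation about the sketch, not a statement of the disclosure.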
For example, in the example illustrated in the drawings, the image determination unit 26 determines the image data captured from the imaging position CA, which includes the imaged structure portion HA2 corresponding to the visible structure portion MA, as the image data to be displayed.
The image data display unit 24 displays the determined image data. The image data display unit 24 receives the determined image data from the image data holding unit 21 based on the information output from the image determination unit 26, and displays the image data. For example, as illustrated in the drawings, the image data display unit 24 displays the determined image data alongside the point group data.
Next, a data processing method using the data processing apparatus according to the first example embodiment is described.
As illustrated in the drawings, the data processing method includes a point group data display step S10 and an image data display step S20. First, in step S10, when the viewpoint for display of the point group data obtained by scanning the site is input, the point group data processing unit 10 displays the point group data viewed from the viewpoint.
Further, in step S20, the image data processing unit 20 determines the image data corresponding to the point group data to be displayed, from the plurality of pieces of image data obtained by imaging the site from different positions, and displays the determined image data, as illustrated in the drawings.
In the image data display step S20, the image data may be displayed in synchronization with a timing when the point group data is displayed. Specifically, the image data processing unit 20 may display the image data in synchronization with the display of the point group data by the point group data processing unit 10.
The data processing method according to the example embodiment is more specifically described below. First, the point group data display step S10 is described.
As illustrated in the drawings, the point group data display step S10 includes steps S11 to S14.
First, as illustrated in step S11, the point group data is held. For example, the point group data holding unit 11 holds the point group data obtained by scanning the site.
Next, as illustrated in step S12, it is determined whether the viewpoint PA has been input. For example, the viewpoint operation input unit 12 determines whether the viewpoint PA for display of the point group data has been input. In a case where the viewpoint PA has not been input in step S12, the processing waits as it is. In contrast, in a case where the viewpoint PA has been input in step S12, the viewpoint operation input unit 12 outputs the input viewpoint PA to the sight line calculation unit 13.
Next, as illustrated in step S13, the sight line information is calculated and the sight line information is output. For example, the sight line calculation unit 13 calculates the sight line information including relationship between the input viewpoint PA and the point group data. The sight line calculation unit 13 outputs the calculated sight line information to the point group data display unit 14. Further, the sight line calculation unit 13 outputs the calculated sight line information to the image determination unit 26.
Next, as illustrated in step S14, the point group data viewed from the viewpoint PA is displayed based on the calculated sight line information. For example, the point group data display unit 14 displays the point group data viewed from the viewpoint PA. The point group data display step S10 may further include a step of acquiring the point group data before step S11 of holding the point group data. For example, the point group data display step S10 may include a step of acquiring the point group data on the site by the LiDAR.
Next, the image data display step S20 is described.
As illustrated in the drawings, the image data display step S20 includes steps S21 to S27.
First, as illustrated in step S21, the plurality of pieces of image data are held. For example, the image data holding unit 21 holds the plurality of pieces of image data obtained by imaging the site from different positions.
Next, as illustrated in step S22, the imaging information is held. For example, the image information holding unit 22 holds the imaging information including the imaging conditions of the plurality of pieces of image data held by the image data holding unit 21.
Next, as illustrated in step S23, the site information is held. For example, the site map holding unit 23 holds the site information including the structure map indicating the layout of the structures in the site and the structure data indicating the positions of the structures disposed in the site. The order of steps S21 to S23 is not limited to the above. For example, the step of holding the plurality of pieces of image data may be performed in step S22 or step S23, the step of holding the imaging information may be performed in step S21 or step S23, and the step of holding the site information may be performed in step S21 or step S22.
Next, as illustrated in step S24, the imaged structure portions HA1 to HA3 and HB1 to HB3 are calculated. For example, the image area calculation unit 25 calculates the imaged structure portions HA1 to HA3 and HB1 to HB3 common to the imaging areas SA and SB of the site captured in the image data and the structure data indicating the positions of the structures disposed in the site. Specifically, the image area calculation unit 25 calculates the imaging areas SA and SB on the structure map of the site by using the imaging information. Further, the image area calculation unit 25 calculates the imaged structure portions HA1 to HA3 and HB1 to HB3 by using the calculated imaging areas SA and SB.
Next, as illustrated in step S25, it is determined whether the sight line information has been input. For example, the image determination unit 26 determines whether the sight line information has been input from the sight line calculation unit 13. In a case where the sight line information has not been input in step S25, the processing waits as it is.
In contrast, in a case where the sight line information has been input in step S25, the image data to be displayed is determined as illustrated in step S26. For example, the image determination unit 26 calculates the visible structure portion MA common to the visible area RA facing the viewpoint PA side in the point group data and the structure data. For example, the image determination unit 26 calculates the visible area RA on the structure map KM by using the sight line information input from the sight line calculation unit 13, and calculates the visible structure portion MA by using the calculated visible area RA.
Further, the image determination unit 26 determines the image data to be displayed among the plurality of pieces of image data, based on the imaged structure portion HA2 corresponding to the calculated visible structure portion MA. At this time, the image determination unit 26 determines the image data including the most imaged structure portions corresponding to the visible structure portion MA, as the image data to be displayed.
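The selection rule in step S26 can be sketched as a simple overlap count. Here each image's imaged structure portions, and the visible structure portions, are reduced to sets of structure indices; a fuller implementation would compare the portions geometrically, and all names are illustrative assumptions.

```python
def select_image(visible: set, imaged_per_image: dict) -> str:
    """Return the image id whose imaged structure portions overlap the
    visible structure portions the most."""
    return max(imaged_per_image,
               key=lambda img: len(imaged_per_image[img] & visible))

# Example: the viewpoint PA sees structure 1; the image from CA covers
# structures 0 to 2, the image from CB covers only structure 2, so CA wins.
print(select_image({1}, {"CA": {0, 1, 2}, "CB": {2}}))  # -> CA
```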
Next, as illustrated in step S27, the determined image data is displayed. For example, the image data display unit 24 displays the determined image data alongside the point group data. The image data display unit 24 may display the image data in synchronization with the timing when the point group data is displayed. The data processing to display the point group data and the image data can be performed in the above-described manner.
Next, effects of the present example embodiment are described. In the data processing apparatus 1 according to the present example embodiment, the image data display unit 24 can display the image data corresponding to the display of the point group data by the point group data display unit 14. For example, the image data display unit 24 can display the image data alongside the point group data. This makes it possible to improve visibility of the point group data obtained by scanning the site and the image data obtained by imaging the site, and to improve browsability. Further, the image data is displayed in synchronization with the display of the point group data, which makes it possible to further improve visibility and browsability.
Further, when the viewpoint for the point group data is moved or rotated by the viewpoint operation input unit 12, the image data matching with the displayed point group data can be displayed. This makes it possible to improve convenience of the display of the point group data.
For example, Non-Patent Literature 1 discloses that the RGB information acquired by the camera is imparted to each point of the point group data and is displayed. In contrast, the data processing apparatus 1 according to the present example embodiment displays the point group data and the image data separately, side by side. Accordingly, the RGB information acquired by the camera is not imparted to the point group data, which enables coloring of the point group data based on other information. For example, positional information can be displayed by coloring the point group data to highlight an abnormal position found in an inspection result.
Further, in the method disclosed in Non-Patent Literature 1, the RGB information is imparted to the point group data and is displayed. Therefore, it is not possible to acquire color information on a position such as a far place or sky where reflected light is not acquirable because irradiation light does not reach and point group data is not acquirable. In contrast, in the method according to the present example embodiment, the image data including the RGB information can be separately displayed alongside the point group data. Accordingly, it is possible to acquire color information on the position where the point group data is not acquirable.
Next, a modification of the first example embodiment is described. The image data display unit 24 may display three-dimensional image data constructed from the plurality of pieces of image data by a technique such as SfM (Structure from Motion). Specifically, the image data holding unit 21 holds the three-dimensional image data on the site constructed from the plurality of pieces of image data. The image area calculation unit 25 calculates the imaged structure portions at each viewpoint position from which the three-dimensional image data can be viewed. The image determination unit 26 calculates the visible areas by using the sight line information, and determines the viewpoint position from which the three-dimensional image data is to be displayed, based on the imaged structure portions corresponding to the calculated visible structure portions. As a result, the image data display unit 24 can display the three-dimensional image data from the determined viewpoint position.
According to the present modification, it is possible to continuously change the viewpoint position of the image data to be displayed. This makes it possible to improve similarity of the image data to be displayed, to the point group data, and to further improve visibility and convenience of the point group data and the image data.
Next, a second modification of the first example embodiment is described. In the above-described first example embodiment, when determining the image data corresponding to the point group data to be displayed, the image determination unit 26 determines the image data to be displayed, based on the imaged structure portions corresponding to the visible structure portions. For example, the image determination unit 26 determines the image data including the most imaged structure portions corresponding to the visible structure portions, as the image data to be displayed. In the present modification, the image determination unit 26 determines the image data to be displayed, by using characteristic points of the point group data and characteristic points of the image data.
Specifically, the image data processing unit 20 includes the image determination unit 26 that determines the image data to be displayed from the plurality of pieces of image data based on characteristic points of the point group data to be displayed and characteristic points of the plurality of pieces of image data. The image determination unit 26 determines image data including the most characteristic points common to the characteristic points of the point group data to be displayed, as the image data to be displayed.
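The second modification can be sketched as counting matching characteristic points. The sketch below assumes the characteristic points of the point group data and of each image have been described by feature vectors of a common dimension, and that "common" means a nearest neighbor within a distance threshold; both assumptions, like all names here, are illustrative and not specified by the disclosure.

```python
import numpy as np

def count_common_points(cloud_feats: np.ndarray, image_feats: np.ndarray,
                        thresh: float = 0.5) -> int:
    """Count point-group characteristic points with a close match in the image."""
    # Pairwise Euclidean distances between the two descriptor sets (N x M).
    dists = np.linalg.norm(cloud_feats[:, None, :] - image_feats[None, :, :], axis=2)
    return int((dists.min(axis=1) < thresh).sum())

def select_image_by_features(cloud_feats: np.ndarray, feats_per_image: dict) -> str:
    """Return the image id with the most characteristic points in common."""
    return max(feats_per_image,
               key=lambda img: count_common_points(cloud_feats, feats_per_image[img]))
```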
According to the present modification, since the image data to be displayed is determined by using the characteristic points of the point group data and the characteristic points of the image data, the image data to be displayed can be matched with the point group data to be displayed, and visibility and convenience of the point group data and the image data can be improved.
Next, a data processing system according to a second example embodiment is described. The data processing system includes the above-described data processing apparatus 1 and a point group data acquisition unit acquiring the point group data. The point group data acquisition unit has a function as a point group data acquisition means. The point group data acquisition unit is, for example, the LiDAR; in this case, the point group data is data on the site acquired by the LiDAR. The point group data acquisition unit may also be, for example, a flash LiDAR; in this case, the point group data is data acquired by the flash LiDAR at a predetermined frequency. First, the LiDAR is described as the point group data acquisition unit, and the flash LiDAR is then described.
The LiDAR 30 calculates the distance D to the object OB from the expression D=(t2−t1)/2×(light velocity), where t1 is the time at which the irradiation light 33 is emitted toward the object OB and t2 is the time at which the reflected light 34 reaches the detection unit 32. As a result, the LiDAR 30 can acquire, as scan data in the scanned range, the point group data including the distance to the object OB.
The flash LiDAR acquires the point group data on the site at a predetermined frequency. The point group data acquired by the flash LiDAR may be held by the point group data holding unit 11, or may be held by a memory provided in the flash LiDAR. The point group data display unit 14 may acquire the point group data from the memory provided in the flash LiDAR in real time, and display the acquired point group data.
Although the first and second example embodiments and the first and second modifications are described above, the present invention is not limited to the first and second example embodiments and the first and second modifications, and can be appropriately modified without departing from the spirit. For example, example embodiments obtained by combining the respective configurations of the first and second example embodiments and the first and second modifications are also included in the scope of the technical idea. Further, a data processing program causing a computer to execute the data processing method according to the first example embodiment is also included in the technical scope of the first and second example embodiments and the first and second modifications.
A part or all of the above-described example embodiments can be described as the following supplementary notes, but are not limited to the following description.
(Supplementary note 1)
A data processing apparatus, comprising:
a point group data processing means configured to display, when a viewpoint for display of point group data obtained by scanning a display object is input, the point group data viewed from the viewpoint; and
an image data processing means configured to determine image data corresponding to the point group data to be displayed, from a plurality of pieces of image data obtained by imaging the display object from different positions, and to display the determined image data.
(Supplementary note 2)
The data processing apparatus according to the supplementary note 1, wherein the image data processing means displays the image data in synchronization with a timing when the point group data processing means displays the point group data.
(Supplementary note 3)
The data processing apparatus according to the supplementary note 1 or 2, wherein the image data processing means includes an image area calculation means configured to calculate imaged structure portions common to an imaging area of the display object captured in the image data and structure data indicating positions of structures disposed in the display object, and an image determination means configured to calculate visible structure portions common to visible areas facing the viewpoint side in the point group data and the structure data, and to determine the image data to be displayed, from the plurality of pieces of image data based on the imaged structure portions corresponding to the calculated visible structure portions.
(Supplementary note 4)
The data processing apparatus according to the supplementary note 3, wherein the image determination means determines the image data including the most imaged structure portions corresponding to the visible structure portions, as the image data to be displayed.
(Supplementary note 5)
The data processing apparatus according to the supplementary note 3 or 4, wherein the image data processing means includes an image data holding means configured to hold the plurality of pieces of image data, an image information holding means configured to hold imaging information including imaging conditions of the plurality of pieces of image data held by the image data holding means, a site map holding means configured to hold site information including the structure data, and an image data display means configured to display the image data determined by the image determination means.
(Supplementary note 6)
The data processing apparatus according to the supplementary note 5, wherein the image area calculation means calculates the imaging area on a structure map indicating layout of the structures in the display object by using the imaging information, and calculates the imaged structure portions by using the calculated imaging area.
(Supplementary note 7)
The data processing apparatus according to any one of the supplementary notes 3 to 6, wherein the point group data processing means includes a point group data holding means configured to hold the point group data obtained by scanning the display object, a viewpoint operation input means configured to receive input of the viewpoint for display of the point group data, a sight line calculation means configured to calculate sight line information including relationship between the input viewpoint and the point group data, and a point group data display means configured to display the point group data viewed from the viewpoint based on the calculated sight line information.
(Supplementary note 8)
The data processing apparatus according to the supplementary note 7, wherein the image determination means calculates the visible areas on a structure map indicating layout of the structures in the display object by using the sight line information, and calculates the visible structure portions by using the calculated visible areas.
(Supplementary note 9)
The data processing apparatus according to any one of the supplementary notes 1 to 8, wherein the point group data is data on the display object acquired by LiDAR.
(Supplementary note 10)
A data processing system, comprising:
the data processing apparatus according to any one of the supplementary notes 1 to 9; and
a point group data acquisition means configured to acquire the point group data.
(Supplementary note 11)
A data processing method, comprising:
a point group data display step of displaying, when a viewpoint for display of point group data obtained by scanning a display object is input, the point group data viewed from the viewpoint; and
an image data display step of determining image data corresponding to the point group data to be displayed, from a plurality of pieces of image data obtained by imaging the display object from different positions, and displaying the determined image data.
(Supplementary note 12)
The data processing method according to the supplementary note 11, wherein, in the image data display step, the image data is displayed in synchronization with a timing when the point group data is displayed.
(Supplementary note 13)
The data processing method according to the supplementary note 11 or 12, wherein the image data display step includes a step of calculating imaged structure portions common to an imaging area of the display object captured in the image data and structure data indicating positions of structures disposed in the display object, and a step of calculating visible structure portions common to visible areas facing the viewpoint side in the point group data and the structure data, and determining the image data to be displayed, from the plurality of pieces of image data based on the imaged structure portions corresponding to the calculated visible structure portions.
(Supplementary note 14)
The data processing method according to the supplementary note 13, wherein, in the step of determining the image data to be displayed, the image data including the most imaged structure portions corresponding to the visible structure portions is determined as the image data to be displayed.
(Supplementary note 15)
The data processing method according to the supplementary note 13 or 14, wherein the image data display step further includes a step of holding the plurality of pieces of image data, a step of holding imaging information including imaging conditions of the plurality of pieces of held image data, a step of holding site information including the structure data, and a step of displaying the determined image data.
(Supplementary note 16)
The data processing method according to the supplementary note 15, wherein, in the step of calculating the imaged structure portions, the imaging area on a structure map indicating layout of the structures in the display object is calculated by using the imaging information, and the imaged structure portions are calculated by using the calculated imaging area.
(Supplementary note 17)
The data processing method according to any one of the supplementary notes 13 to 16, wherein the point group data display step includes a step of holding the point group data obtained by scanning the display object, a step of determining whether the viewpoint for display of the point group data has been input, a step of calculating sight line information including relationship between the input viewpoint and the point group data, and a step of displaying the point group data viewed from the viewpoint based on the calculated sight line information.
(Supplementary note 18)
The data processing method according to the supplementary note 17, wherein, in the step of determining the image data to be displayed, the visible areas on a structure map indicating layout of the structures in the display object are calculated by using the sight line information, and the visible structure portions are calculated by using the calculated visible areas.
(Supplementary note 19)
The data processing method according to any one of the supplementary notes 11 to 18, wherein the point group data is data on the display object acquired by LiDAR.
(Supplementary note 20)
The data processing method according to any one of the supplementary notes 11 to 19, further comprising a step of acquiring the point group data.
(Supplementary note 21)
A non-transitory computer-readable medium that stores a data processing program causing a computer to execute:
displaying, when a viewpoint for display of point group data obtained by scanning a display object is input, the point group data viewed from the viewpoint; and
determining image data corresponding to the point group data to be displayed, from a plurality of pieces of image data obtained by imaging the display object from different positions, and displaying the determined image data.
(Supplementary note 22)
The non-transitory computer-readable medium that stores the data processing program according to the supplementary note 21, in which the data processing program causes the computer to execute, when the determined image data is displayed, displaying the image data in synchronization with a timing when the point group data is displayed.
(Supplementary note 23)
The non-transitory computer-readable medium that stores the data processing program according to the supplementary note 21 or 22, in which the data processing program causes the computer to execute, when the determined image data is displayed, calculating imaged structure portions common to an imaging area of the display object captured in the image data and structure data indicating positions of structures disposed in the display object, and calculating visible structure portions common to visible areas facing the viewpoint side in the point group data and the structure data, and determining the image data to be displayed, from the plurality of pieces of image data based on the imaged structure portions corresponding to the calculated visible structure portions.
(Supplementary note 24)
The non-transitory computer-readable medium that stores the data processing program according to the supplementary note 23, in which the data processing program causes the computer to execute, when the image data to be displayed is determined, determining the image data including the most imaged structure portions corresponding to the visible structure portions, as the image data to be displayed.
(Supplementary note 25)
The non-transitory computer-readable medium that stores the data processing program according to the supplementary note 23 or 24, in which the data processing program causes the computer to execute, when the determined image data is displayed, holding the plurality of pieces of image data, holding imaging information including imaging conditions of the plurality of pieces of held image data, holding site information including the structure data, and displaying the determined image data.
(Supplementary note 26)
The non-transitory computer-readable medium that stores the data processing program according to the supplementary note 25, in which the data processing program causes the computer to execute, when the imaged structure portions are calculated, calculating the imaging area on a structure map indicating layout of the structures in the display object by using the imaging information, and calculating the imaged structure portions by using the calculated imaging area.
(Supplementary note 27)
The non-transitory computer-readable medium that stores the data processing program according to any one of the supplementary notes 23 to 26, in which the data processing program causes the computer to execute, when the point group data viewed from the viewpoint is displayed, holding the point group data obtained by scanning the display object, determining whether the viewpoint for display of the point group data has been input, calculating sight line information including relationship between the input viewpoint and the point group data, and displaying the point group data viewed from the viewpoint based on the calculated sight line information.
(Supplementary note 28)
The non-transitory computer-readable medium that stores the data processing program according to the supplementary note 27, in which the data processing program causes the computer to execute, when the image data to be displayed is determined, calculating the visible areas on a structure map indicating layout of the structures in the display object by using the sight line information, and calculating the visible structure portions by using the calculated visible areas.
(Supplementary note 29)
The non-transitory computer-readable medium that stores the data processing program according to any one of the supplementary notes 21 to 28, in which the point group data is data on the display object acquired by LiDAR.
(Supplementary note 30)
The non-transitory computer-readable medium that stores the data processing program according to any one of the supplementary notes 21 to 29, in which the data processing program causes the computer to execute acquiring the point group data.
In the above-described examples, the program can be stored by using various types of non-transitory computer-readable media, and supplied to a computer. The non-transitory computer-readable media include various types of tangible storage media. Examples of the non-transitory computer-readable media include a magnetic recording medium (for example, a flexible disk, a magnetic tape, and a hard disk drive), a magnetooptical recording medium (for example, a magnetooptical disk), a CD-ROM (Read Only Memory), a CD-R, a CD-R/W, and a semiconductor memory (for example, a mask ROM, a PROM (Programmable ROM), an EPROM (Erasable PROM), a flash ROM, and a RAM (Random Access Memory)). Further, the program may be supplied to the computer by various types of transitory computer-readable media. Examples of the transitory computer-readable media include an electric signal, an optical signal, and an electromagnetic wave. The transitory computer-readable media can supply the program to the computer through a wired communication channel such as an electric cable and an optical fiber, or through a wireless communication channel.