This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2012-145885, filed on Jun. 28, 2012; the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to a measurement support device, a method therefor and a computer program product.
In three-dimensional measurement of an object, a measurer needs to check the status of the measurement of the object to determine a part for which the measurement is not sufficient and decide on the next part to be measured. In order to facilitate the measurement status check, a technique for displaying an image of three-dimensional shape data representing measured parts of an object as viewed in an arbitrary direction is known.
With such a known technique as described above, a user specifies the position of the point of view from which the three-dimensional shape data representing measured parts of an object are viewed, in which there is room for improvement for increasing the efficiency of the measurement work.
According to an embodiment, a measurement support device includes a first calculator configured to calculate, when three-dimensional shape data representing a measured part of an object are viewed from a plurality of points of view, a plurality of first information quantities representing visibility of the three-dimensional shape; a second calculator configured to calculate a second information quantity by multiplying a maximum value of the first information quantities by a predetermined proportion; a selector configured to select a point of view, which has a smaller difference between the first information quantity and the second information quantity, from the points of view; and a display controller configured to display the three-dimensional shape data as viewed from the selected point of view on a display unit.
Embodiments will be described in detail below with reference to the accompanying drawings.
The operating unit 11 receives various operation inputs and can be implemented by an input device such as a keyboard, a mouse, a touch pad or a touch panel.
The display unit 13 displays various screens and can be implemented by a display device such as a liquid crystal display or a touch screen display.
The storage unit 15 stores therein various programs executed in the measurement support device 10, data used for various types of processing performed by the measurement support device 10, and the like. The storage unit 15 can be implemented by a storage device that is magnetically, optically or electrically recordable such as a hard disk drive (HDD), a solid state drive (SSD), a memory card, an optical disc, a read only memory (ROM), and a random access memory (RAM).
The storage unit 15 stores therein three-dimensional shape data representing measured parts of an object to be measured existing in the real world. The three-dimensional shape data is a result of three-dimensional measurement of a measured part of an object to be measured and is data representing the shape of the measured part of the object.
In the first embodiment, the three-dimensional shape data is a set of points constituting the shape of a measured part of an object to be measured, that is, point cloud data. Points belonging to the three-dimensional shape data (point cloud data) each have coordinates representing a position in a three-dimensional space. Any three-dimensional coordinate system can be used as the reference for these coordinates.
The three-dimensional shape data, however, is not limited to point cloud data but may be mesh data representing the shape of a measured part of an object to be measured as a mesh, polygon data, or the like, for example.
Moreover, the three-dimensional shape data may include a normal vector of a surface of a measured part of an object to be measured and texture information of the object to be measured in addition to the data, such as point cloud data, representing the shape of the measured part of the object to be measured.
The storage unit 15 also stores therein a history of points of view previously selected by the selecting unit 25 that will be described later.
The first calculating unit 21, the second calculating unit 23, the selecting unit 25 and the display control unit 27 can be implemented by causing a processing device such as a central processing unit (CPU) to execute programs, that is, by software.
The first calculating unit 21 calculates a plurality of first information quantities representing the visibilities of the three-dimensional shape when the three-dimensional shape data of an object are viewed from a plurality of points of view. In the first embodiment, the first calculating unit 21 acquires three-dimensional shape data of an object to be measured from the storage unit 15, and calculates a first information quantity when the three-dimensional shape data is viewed from each of a plurality of points of view.
The points of view belong to a set of candidate points of view provided in advance as points of view from which the three-dimensional shape data is viewed. Specifically, the points of view form the set of candidates from which the selecting unit 25, which will be described later, makes its selection. The candidate set may be all of the points of view provided in advance, a set of points of view within a certain distance from specified coordinates, or a set of points of view at a certain distance or farther from specified coordinates, for example. The specified coordinates may be those of a point of view previously selected by the selecting unit 25. Note that the candidate set may be a set of points of view in a three-dimensional space or in any two-dimensional space.
The first information quantity may be any information that represents the visibility of a three-dimensional shape viewed from a point of view. In the first embodiment, it is assumed that the first information quantity is a projected area of the three-dimensional shape data projected on a projection plane in the line-of-sight direction or the direction opposite thereto. The first calculating unit 21 can employ various projection methods such as perspective projection and parallel projection for the projection of the three-dimensional shape data.
Note that, when the three-dimensional shape data is mesh data, polygon data or the like, the first calculating unit 21 can calculate the projected area directly because the three-dimensional shape data is constructed of a plurality of facets. When the three-dimensional shape data is point cloud data, however, the first calculating unit 21 cannot calculate the projected area directly because the data is not constructed of facets. Thus, the first calculating unit 21 calculates the projected area by approximating each point datum as a microsphere or a microcube. Alternatively, the first calculating unit 21 may calculate the projected area by forming facets, that is, by assigning a mesh to the point cloud data.
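One way to realize this projected-area computation for point cloud data is to project each point onto a plane orthogonal to the line of sight and count the occupied cells of a small grid, each cell playing the role of a micro-facet. The following is a minimal sketch under that assumption; the parallel projection, the grid resolution `cell`, and the function name are illustrative choices, not part of the embodiment.

```python
import numpy as np

def projected_area(points, view_dir, cell=0.05):
    """Approximate the projected area of a point cloud viewed along
    view_dir by quantizing the projected points onto a grid and
    counting occupied cells (each cell contributes cell**2 area)."""
    d = np.asarray(view_dir, dtype=float)
    d /= np.linalg.norm(d)
    # Build an orthonormal basis (u, v) spanning the projection plane.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(d[0]) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(d, helper)
    u /= np.linalg.norm(u)
    v = np.cross(d, u)
    pts = np.asarray(points, dtype=float)
    # Parallel projection: plane coordinates of each point.
    uv = np.stack([pts @ u, pts @ v], axis=1)
    # Quantize onto the grid and count distinct occupied cells.
    cells = np.unique(np.floor(uv / cell).astype(int), axis=0)
    return len(cells) * cell ** 2
```

A flat patch of points yields a large area when viewed face-on and a small one when viewed edge-on, which is the behavior the first information quantity is meant to capture.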
In calculating the first information quantity, the first calculating unit 21 need not use all the point data constituting the three-dimensional shape data but may use only a part thereof. For example, the first calculating unit 21 may calculate the first information quantity by using only point data with coordinate values within a specified range. The specified range can be, for example, a certain distance or shorter from the point of view previously selected by the selecting unit 25 that will be described later.
The second calculating unit 23 calculates a second information quantity by multiplying a maximum first information quantity, which is the maximum value of the plurality of first information quantities calculated by the first calculating unit 21, by a predetermined proportion.
The second calculating unit 23 calculates the maximum first information quantity by using expression (1), for example:

Vmax = max{V(P) | P ∈ S}   (1)
In the expression, Vmax represents the maximum first information quantity, and satisfies Vmax=V(Pmax). Pmax represents the point of view with the maximum first information quantity, and V(Pmax) represents the first information quantity of the point of view Pmax. Specifically, in the expression (1), the maximum first information quantity is determined among the first information quantities of the respective points of view P to be selected that constitute the set S. Note that Pmax is not limited to a single point of view but there may be a plurality of Pmax.
The second calculating unit 23 also calculates the second information quantity by using expression (2), for example:

Vc = r × Vmax   (2)
In the expression, Vc represents the second information quantity, and r is a real constant in the range 0&lt;r&lt;1. Specifically, by expression (2), the second information quantity has a value of (100×r)% of the maximum first information quantity.
The selecting unit 25 selects a point of view from a plurality of points of view by using a difference between each of a plurality of first information quantities calculated by the first calculating unit 21 and the second information quantity calculated by the second calculating unit 23. In the first embodiment, the selecting unit 25 selects a point of view with the first information quantity whose difference from the second information quantity calculated by the second calculating unit 23 is within a predetermined range among a plurality of first information quantities calculated by the first calculating unit 21. The selecting unit 25 then stores information, such as coordinate information and an identifier of the selected point of view with which the selected point of view can be identified, in the storage unit 15 as a history.
Note that the point of view selected by the selecting unit 25 is a point of view with the first information quantity whose difference from the second information quantity is within a predetermined range, because a point of view whose first information quantity exactly equals the second information quantity does not always exist in the set of points of view to be selected (the points of view having the first information quantities).
The selecting unit 25 selects a point of view that satisfies expression (3) from among the points of view to be selected, for example.
|V(Pout)−Vc|≦s (3)
In the expression, Pout represents a point of view selected by the selecting unit 25, and s is a predetermined positive real number representing the predetermined range. Specifically, by expression (3), a point of view is selected whose first information quantity differs from the second information quantity by an absolute value equal to or less than s. Note that, if no point of view among the points of view to be selected satisfies expression (3), the selecting unit 25 may select the point of view with the first information quantity whose difference from the second information quantity is the smallest.
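The selection logic of expressions (1) to (3), including the fallback to the smallest difference, can be sketched as follows. The dictionary interface, the default values of `r` and `s`, and the function name are assumptions made for illustration.

```python
def select_viewpoint(first_quantities, r=0.5, s=1e-3):
    """Select a viewpoint id following expressions (1)-(3).

    first_quantities: mapping of viewpoint id -> first information
    quantity V(P), e.g. a projected area.
    r: predetermined proportion, 0 < r < 1.
    s: tolerance of expression (3).
    """
    v_max = max(first_quantities.values())   # expression (1)
    v_c = r * v_max                          # expression (2)
    # Candidates whose difference from Vc is within the range s.
    within = {p: abs(v - v_c) for p, v in first_quantities.items()
              if abs(v - v_c) <= s}          # expression (3)
    if within:
        return min(within, key=within.get)
    # Fallback: the viewpoint whose difference from Vc is smallest.
    return min(first_quantities,
               key=lambda p: abs(first_quantities[p] - v_c))
```

With quantities {a: 10, b: 5, c: 2} and r = 0.5, Vc is 5, so viewpoint b is selected rather than the maximally visible viewpoint a.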
The display control unit 27 displays the three-dimensional shape data as viewed from the point of view selected by the selecting unit 25 on the display unit 13. Specifically, the display control unit 27 acquires the three-dimensional shape data from the storage unit 15 and displays the three-dimensional shape data as viewed from the point of view selected by the selecting unit 25 on the display unit 13.
The display control unit 27 can use various display techniques. For example, the display control unit 27 may display the three-dimensional shape data like computer aided design (CAD) software, or may display the three-dimensional shape data by using a technology disclosed in Japanese Patent Application Laid-open No. 2002-352271.
First, the first calculating unit 21 acquires three-dimensional shape data of an object to be measured (measured part of the object to be measured) from the storage unit 15 (step S101).
Subsequently, the first calculating unit 21 calculates a first information quantity representing the visibility of the three-dimensional shape as viewed from each of a plurality of points of view (step S103).
Subsequently, the second calculating unit 23 determines a maximum first information quantity that is the maximum value among the first information quantities calculated by the first calculating unit 21 (step S105).
Subsequently, the second calculating unit 23 calculates a second information quantity by multiplying the determined maximum first information quantity by a predetermined proportion (step S107).
Subsequently, the selecting unit 25 selects a point of view with the first information quantity whose difference from the second information quantity calculated by the second calculating unit 23 is within a predetermined range among the first information quantities calculated by the first calculating unit 21 (step S109).
Subsequently, the display control unit 27 displays the three-dimensional shape data as viewed from the point of view selected by the selecting unit 25 on the display unit 13 (step S111).
Afterwards, the user (measurer) determines, on the basis of the three-dimensional shape data displayed on the display unit 13, a point of view in the real space (real world) from which the next observation is to be made, and observes and measures the object to be measured from that point of view, whereupon the three-dimensional shape data stored in the storage unit 15 is updated; this will be described in detail in a second embodiment. Then, when the user performs a display operation input through the operating unit 11 for displaying the three-dimensional shape data from a new point of view, the process described above is performed again.
As described above, in the first embodiment, the second information quantity is calculated by multiplying the maximum first information quantity that is the maximum value of a plurality of first information quantities by the predetermined proportion, and the three-dimensional shape data as viewed from the point of view with the first information quantity whose difference from the second information quantity is within the predetermined range are displayed.
Note that the point of view with the maximum first information quantity is a point of view from which the three-dimensional shape data, that is, the measured part of the object to be measured can be most efficiently figured out, but is not a point of view from which an unmeasured part of the object to be measured can be efficiently figured out. In contrast, the point of view with the first information quantity whose difference from the second information quantity is within the predetermined range is a point of view from which both the measured part and the unmeasured part of the object to be measured can be efficiently figured out since the second information quantity is a value obtained by multiplying the maximum first information quantity by the predetermined proportion.
According to the first embodiment, it is therefore possible to automatically select a point of view from which the user (measurer) can easily figure out the measurement status of an object to be measured and display three-dimensional shape data therefrom, which can increase the efficiency of the measurement work on the object to be measured.
In the second embodiment, an example of generating (updating) the three-dimensional shape data will be described. In the following, differences from the first embodiment will be mainly described, and components having the same functions as those in the first embodiment will be designated by the same names and reference numerals as in the first embodiment and description thereof will not be repeated.
The observing unit 117 observes an object to be measured for three-dimensional measurement of the object, and can be implemented by various devices such as a visible light camera, a laser scanner, a laser range sensor, and a camera with a projector that are typically used for three-dimensional measurement. Specifically, the observing unit 117 observes an object to be measured from a point of view in the real space determined by the user (measurer) as a point of view from which the next observation is to be made on the basis of three-dimensional shape data displayed on the display unit 13. In the observation of an object to be measured, the observing unit 117 may observe the object from a single point of view or from a plurality of points of view.
The generating unit 120 generates three-dimensional shape data by using a result of observation (observation data) from the observing unit 117, and stores the generated data in the storage unit 15. When three-dimensional shape data are already stored in the storage unit 15, the generating unit 120 updates the three-dimensional shape data stored in the storage unit 15 with the generated three-dimensional shape data. Since the technique for generating the three-dimensional shape data is known, description thereof will not be provided herein.
First, the observing unit 117 observes an object to be measured and obtains observation data (step S201). Note that, in the initial observation, the observing unit 117 performs observation from an arbitrary point of view in the real space since there are no three-dimensional shape data to be displayed on the display unit 13, and in the second and subsequent observations, the observing unit 117 performs observation from a point of view in the real space from which the user (measurer) determines to perform the next observation on the basis of three-dimensional shape data displayed on the display unit 13.
Subsequently, the generating unit 120 generates three-dimensional shape data by using observation data from the observing unit 117 and stores the generated data in the storage unit 15 (step S203).
According to the second embodiment, similarly to the first embodiment, it is therefore possible to automatically select a point of view from which the user can easily figure out the measurement status of an object to be measured and display three-dimensional shape data therefrom, which can increase the efficiency of the measurement work on the object to be measured.
While examples in which a point of view is selected are described in the embodiments above, a combination of points of view may be selected.
In this case, the first calculating unit 21 calculates a first information quantity for each combination of points of view by summing up the visibilities of parts constituting the three-dimensional shape data.
Specifically, when the set S of points of view to be selected is {P1, . . . , PN} (N>1), the first calculating unit 21 calculates a first information quantity for each combination of points of view constituted by M (1<M≦N) points of view selected from the points of view to be selected {P1, . . . , PN}. In the following description, a combination of points of view will be referred to as {P′1, . . . , P′M} for convenience of explanation.
It is assumed here that the three-dimensional shape data is constituted by Q (Q&gt;1) parts (shapes). In this case, the first calculating unit 21 calculates a partial first information quantity vj(P′i) (j=1, . . . , Q) representing the visibility of each part constituting the three-dimensional shape data for each of the points of view P′i (i=1, . . . , M) constituting the combination of points of view {P′1, . . . , P′M}. Since the technique for calculating a partial first information quantity is the same as that for calculating the first information quantity in the first embodiment, the description thereof will not be repeated here. The first calculating unit 21 then obtains the maximum value of the calculated partial first information quantities vj(P′i) for each part of the three-dimensional shape data, and calculates the first information quantity by summing up the obtained maximum values.
The first calculating unit 21 calculates the first information quantity by using expression (4), for example:

V(P′1, . . . , P′M) = Σ_{j=1, . . . , Q} max_{i=1, . . . , M} vj(P′i)   (4)

In the expression, V(P′1, . . . , P′M) represents the first information quantity of the combination of points of view {P′1, . . . , P′M}.
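Expression (4) can be sketched directly: for each part of the shape, take the best partial visibility over the viewpoints in the combination, then sum over all parts. The list-of-lists layout of the partial quantities is an assumed representation.

```python
def combination_first_quantity(partial):
    """First information quantity of a combination of viewpoints per
    expression (4).

    partial: partial[i][j] = visibility v_j(P'_i) of part j from
    viewpoint P'_i; a list of M rows, each with Q values.
    """
    q = len(partial[0])
    # For each part j, keep the maximum visibility over the M
    # viewpoints, then sum the per-part maxima.
    return sum(max(row[j] for row in partial) for j in range(q))
```

For two viewpoints where the first sees only part 0 (visibility 1) and the second sees only part 1 (visibility 2), the combination scores 3: the combination is rewarded for covering parts that no single viewpoint covers alone.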
The method for calculating the first information quantity according to the modified example 1 will be described below in detail with reference to the accompanying drawings.
The selecting unit 25 selects a combination of points of view from a plurality of points of view by using the difference between each of the first information quantities calculated by the first calculating unit 21 and the second information quantity calculated by the second calculating unit 23.
The display control unit 27 displays the three-dimensional shape data as viewed from the combination of points of view selected by the selecting unit 25 on the display unit 13.
According to the modified example 1, even in a case where a combination of points of view is selected, it is possible to automatically select a combination of points of view from which the user can easily figure out the measurement status of an object to be measured and display three-dimensional shape data therefrom, which can increase the efficiency of the measurement work on the object to be measured.
In the embodiments and the modified example 1 described above, points of view may be selected further taking the relation with a previously selected point of view into account.
In the first embodiment described above, when the relation with a previously selected point of view is taken into account and a point of view in an unobserved direction is to be selected first, the selecting unit 25 may select a point of view by using expressions (5) and (6), for example:

Dmin = min_{t=1, . . . , T} d(P, Pt)   (5)

In the expression, Pt represents a point of view from which observation was previously performed, with t=1, . . . , T (T≧1), and d(P, Pt) represents the distance between the points of view P and Pt. Dmin represents the smallest of the distances between a point of view P to be selected and the points of view Pt from which observation was previously performed. Specifically, in expression (5), the smallest distance between a candidate point of view and any previously used point of view is obtained.
αf(|V(P)−Vc|)+(1−α)g(Dmin) (6)
In the expression, α is a real number between 0 and 1 representing a balance of weight. The function f( ) is a monotonically increasing function while the function g( ) is a monotonically decreasing function. That is, expression (6) gives a weighted sum of a value obtained by transforming the absolute value of the difference between the first information quantity and the second information quantity by the function f( ), and a value obtained by transforming Dmin by the function g( ).
In this case, the selecting unit 25 obtains the weighted sum for each of the points of view to be selected by using the expressions (5) and (6), and may select a point of view P with a smaller weighted sum first, or a point of view P with the smallest weighted sum, for example. This allows the user to easily figure out the measurement status of an object to be measured and a point of view in an unobserved direction can be selected first.
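The weighted-sum selection of expressions (5) and (6) can be sketched as follows. Here f is taken as the identity and g as x → −x; the embodiment only requires f to be monotonically increasing and g monotonically decreasing, so these concrete choices, like the data layout and function name, are assumptions.

```python
import math

def select_with_history(candidates, previous, v_c, alpha=0.5):
    """Select a viewpoint id balancing closeness of V(P) to Vc
    against distance from previously used viewpoints, per
    expressions (5) and (6).

    candidates: mapping of viewpoint id -> (position, V(P)).
    previous: list of positions of previously used viewpoints.
    alpha: weight balance in [0, 1].
    """
    def score(item):
        pos, v = item
        # Expression (5): distance to the nearest previous viewpoint.
        d_min = min(math.dist(pos, pt) for pt in previous)
        # Expression (6) with f(x) = x and g(x) = -x.
        return alpha * abs(v - v_c) + (1 - alpha) * (-d_min)

    # A smaller weighted sum is preferred.
    return min(candidates, key=lambda p: score(candidates[p]))
```

Between two candidates with equally suitable first information quantities, the one farther from all previously used viewpoints scores lower and is selected first, steering the user toward unobserved directions.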
Moreover, in the modified example 1 described above, when the relation with a previously selected point of view is taken into account and a combination of points of view in an unobserved direction is to be selected first, the selecting unit 25 may select a combination of points of view by using expressions (7) and (8), for example:

D′min = Σ_{i=1, . . . , M} min_{t=1, . . . , T} d(P′i, Pt)   (7)

In the expression, D′min represents the sum of the smallest distances between the respective points of view P′i constituting a combination of points of view and the points of view Pt from which observation was previously performed. Note that the point of view Pt from which observation was previously performed may be any point of view included in a combination of points of view from which observation was previously performed.
αf(|V(P′1, . . . , P′M)−Vc|)+(1−α)g(D′min) (8)
Similarly, expression (8) gives a weighted sum of a value obtained by transforming the absolute value of the difference between the first information quantity and the second information quantity by the function f( ), and a value obtained by transforming D′min by the function g( ).
In this case, the selecting unit 25 obtains the weighted sum for each combination of points of view by using the expressions (7) and (8), and may select a combination of points of view {P′1, . . . , P′M} with a smaller weighted sum first, or a combination of points of view {P′1, . . . , P′M} with the smallest weighted sum, for example. This allows the user to easily figure out the measurement status of an object to be measured and a combination of points of view in an unobserved direction can be selected first.
While a case in which a point of view in an unobserved direction is selected first has been described in the modified example 2 as an example of taking the relation with a previously used point of view into account, a point of view at a certain distance or shorter from a previously used point of view (for example, the point of view from which observation was performed last time) may instead be selected. In this case, the selecting unit 25 may select such a point of view from among the points of view selected by using expression (3), expressions (5) and (6), or expressions (7) and (8).
While cases in which the first information quantity is a projected area or a value based on a projected area are described in the embodiments and modified examples described above, the first information quantity may be a sum of absolute values of scalar products of normal vectors at respective point data of the three-dimensional shape data (point cloud data) and the line-of-sight direction. As described above, the normal vectors may be included in the three-dimensional shape data or may be calculated from point cloud data.
In the embodiments and the modified examples described above, imaged data obtained by imaging an object and the three-dimensional shape data may be superimposed for display.
In this case, the storage unit 15 stores in advance a plurality of imaged data pieces obtained by imaging the object to be measured. The imaged data pieces are preferably obtained by imaging the object to be measured equally from various angles.
In selecting a point of view, the selecting unit 25 selects the point of view from the points of view to be selected and the points of view from which the imaged data are taken. This can significantly reduce the computation for selecting points of view and make the points of view from which the three-dimensional shape data are viewed and the points of view from which the imaged data are taken correspond to each other, which leads to improvement in the quality of superimposed display.
In addition, the display control unit 27 superimposes the three-dimensional shape data as viewed from the point of view selected by the selecting unit 25 and the imaged data obtained by imaging the object from the point of view selected by the selecting unit 25 and displays the superimposed result on the display unit 13. A technique disclosed in Japanese Patent Application Laid-open No. 2009-075117 may be used as the technique for displaying a superimposed display.
This allows the user to more easily figure out the measurement status of an object to be measured, which can increase the efficiency of the measurement work on the object to be measured.
Hardware Configuration
An example of a hardware configuration of a measurement support device according to the embodiments and modified examples described above will be described. The measurement support device according to the embodiments and modified examples described above includes a control device such as a CPU, a storage device such as a ROM and a RAM, an input device such as a keyboard and a mouse, and a communication device such as a communication interface, which is a hardware configuration utilizing a common computer system.
Programs to be executed by the measurement support device according to the embodiments and modified examples described above may be recorded on a computer readable recording medium such as a CD-ROM, a CD-R, a memory card, a digital versatile disc (DVD), or a flexible disk (FD), in the form of a file that can be installed or executed, and provided therefrom.
Alternatively, the programs to be executed by the measurement support device according to the embodiments and modified examples described above may be stored on a computer system connected to a network such as the Internet, and provided by being downloaded via the network. Still alternatively, the programs to be executed by the measurement support device according to the embodiments and modified examples described above may be provided or distributed via a network such as the Internet. Still alternatively, the programs to be executed by the measurement support device according to the embodiments and modified examples described above may be embedded on a ROM or the like in advance and provided therefrom.
The programs to be executed by the measurement support device according to the embodiments and modified examples described above have modular structures including the respective units described above. In an actual hardware configuration, the CPU reads the programs from the HDD and executes the programs, whereby the respective units are implemented on the computer system.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
For example, the steps in the flowcharts in the embodiments may be changed in the order in which the steps are performed, may be performed simultaneously or may be performed in a different order each time the steps are performed as long as the change is not inconsistent with the nature thereof.
As described above, according to the embodiments and modified examples described above, the work efficiency of measurement can be increased.
Number | Date | Country | Kind |
---|---|---|---
2012-145885 | Jun 2012 | JP | national |