The invention relates to a system and method for recording and representing measurement points on a body surface.
The geometric measurement of body surfaces can serve, for example, for the purpose of evaluating measurement points on the body surface in regard to their position with respect to a reference point, a reference line, or a reference plane inside of or outside of the body. For example, points of the body surface can be analyzed in regard to their planarity relative to a reference plane.
Described in EP 2 998 694 A9 is a method for determining the 3D coordinates of an object by means of a 3D measurement device, wherein the 3D measurement device is connected to a pair of smart glasses, which serves to present to the user data relating to the 3D coordinate measurement by means of a display, wherein the data can be superimposed on images or videos recorded by the smart glasses, in order to make possible an “augmented reality” representation.
Described in EP 2 916 099 A1 is a similar system, in which the recording of the 3D coordinates is produced by means of an articulated arm, and results are represented by means of a pair of smart glasses.
Described in WO 2014/014786 A1 is a system that projects data directly onto a surface of a body in order to obtain an augmented reality function.
Described in DE 10 2008 020 771 A1 is a method for determining deviations between a target state and an actual state of a workpiece, wherein corresponding data are represented on the workpiece in correct position by means of a pair of smart glasses, and wherein the position and viewing direction of the user who is wearing the smart glasses can be recorded by means of a tracking sensor integrated in the smart glasses.
Described in DE 10 2008 020 772 A1 is a similar method, wherein the pair of smart glasses is designed to identify predefined gestures of the user in order to select, on the basis of the recorded gestures, the measurement results to be represented.
Described in WO 2016/085682 A1 is a pair of smart glasses that determines 3D coordinates of bodies, such as, for example, pieces of furniture, and then calculates distances, surfaces, or volumes between points on the body surfaces.
DE 10 2015 201 290 A1 relates to a method for aligning two bodies with respect to each other by means of a laser alignment system and a pair of smart glasses connected thereto, which has a head-up display (“HUD”) in order to represent to the user information that is helpful in carrying out the alignment operation, including measurement results relative to the alignment.
The object of the present invention is to create a system and a method that make possible the recording and representation of measurement points on a body surface in an especially efficient way.
In accordance with the invention, this object is achieved by a system according to claim 1 as well as a method according to claim 14.
In the solution according to the invention, a portion of the coordinates of each measurement point is recorded by means of a pair of smart glasses or, provided it is already known, specified to the user in a convenient way, whereas the remaining portion of the coordinates of each measurement point is determined by means of a position measurement device, wherein, by means of the smart glasses, a result derived from the portion of the coordinates recorded for each measurement point by means of the position measurement device is represented graphically on the body surface at the position of the respective measurement point. Because the recording of the coordinates can be produced with relatively high accuracy by means of the position measurement device and in a relatively simple manner by means of the smart glasses, the portion of the coordinates for which a high accuracy is not required can be recorded or specified in a fast and convenient way by means of the smart glasses, whereas the portion of the coordinates that is to be used for the result of interest can be determined with high accuracy by means of the position measurement device. The pair of smart glasses further ensures a clear, convenient, and efficient representation of the results.
Preferably, the pair of smart glasses is used to record or graphically specify the two horizontal coordinates of each measurement point, whereas the position measurement device is used to record the vertical coordinate.
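The division of the coordinate recording described above can be illustrated by a short sketch (Python; all names and data structures are illustrative assumptions, not prescribed by the description): the two horizontal coordinates come from the smart glasses, the vertical coordinate from the position measurement device, and both are merged into one measurement point.

```python
from dataclasses import dataclass

@dataclass
class MeasurementPoint:
    x: float  # horizontal coordinate, recorded or specified via the smart glasses
    y: float  # horizontal coordinate, recorded or specified via the smart glasses
    z: float  # vertical coordinate, recorded by the position measurement device

def merge_coordinates(horizontal_xy, vertical_z):
    """Combine the (x, y) pair supplied by the smart glasses with the
    z value supplied by the position measurement device."""
    x, y = horizontal_xy
    return MeasurementPoint(x=x, y=y, z=vertical_z)

point = merge_coordinates((1.25, 0.40), 0.003)
```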
Further preferred embodiments of the invention ensue from the dependent claims.
The invention will be explained in detail below by way of example on the basis of the appended drawing.
Shown schematically in the drawing is a system for recording and representing measurement points on a body surface 10. The system comprises a position measurement device with a laser light source 16, which emits a laser beam 18, and an optical detector 20 with a detector surface 22, as well as a pair of smart glasses 24, which has a head-up display 26, a camera 28, a processor 30, and a position sensor unit 36. The detector 20 and the smart glasses 24 can communicate with each other via interfaces 32.
Provided that the x and y coordinates of a measurement point are not known to the smart glasses 24, the pair of smart glasses 24 determines, by means of the position sensor unit 36, the x and y coordinates of the current measurement point, that is, the point at which the user has positioned the optical detector 20 on the body surface 10.
Provided that the pair of smart glasses 24 already knows the x and y coordinates of the current measurement point (for example, when the measurement points are specified externally on the basis of a specific diagram), the pair of smart glasses 24 graphically represents to the user, by means of the head-up display 26, the position on the body surface 10 corresponding to the specified coordinates, so that the user knows at which position the optical detector 20 is to be placed on the body surface 10. This can be done, in a way that is known as such, in the form of an “augmented reality” representation by use of the camera 28. In this case, the representation of the position of the measurement point can be produced, for example, by using virtual 3D or 2D markup objects.
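One conceivable way for the smart glasses to decide whether the detector has actually been placed at a specified measurement point is a simple tolerance check on the horizontal coordinates (a hypothetical sketch; the tolerance value and all names are assumptions, not taken from the description):

```python
import math

def detector_at_target(current_xy, target_xy, tolerance=0.01):
    """True when the detector position reported by the position sensor
    unit lies within `tolerance` (here: 1 cm) of the specified (x, y)
    position of the measurement point."""
    dx = current_xy[0] - target_xy[0]
    dy = current_xy[1] - target_xy[1]
    return math.hypot(dx, dy) <= tolerance
```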
Once the optical detector 20 has been placed at the desired measurement point, the z coordinate of the measurement point is measured by means of the laser beam 18. For this purpose, the points of impingement of the laser beam 18 on the detector surface 22 are recorded in a way that is known as such, from which the z coordinate of the corresponding measurement point can be determined (a known offset, corresponding to the distance between a reference point or a horizontal reference line on the detector surface and the point of placement of the detector on the body surface, essentially enters into this determination). In this case, the laser beam 18 can establish, for example, a horizontal laser plane by corresponding rotation of the laser light source 16, so that the relative z coordinate of the measurement point, which results via the offset from the vertical component of the point of impingement of the laser beam 18, indicates the position of the measurement point relative to the horizontal plane established by the laser beam 18.
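Numerically, the determination described above amounts to correcting the vertical component of the point of impingement by the known detector offset; under one plausible sign convention (an assumption, since the description leaves the convention open), a point below the laser plane receives a negative relative z coordinate:

```python
def relative_z(impingement_from_reference, reference_offset):
    """Relative z coordinate of the measurement point with respect to the
    horizontal laser plane. The laser plane lies (impingement_from_reference
    + reference_offset) above the point of placement of the detector, so the
    measurement point lies that far below the plane (sign convention assumed)."""
    plane_above_point = impingement_from_reference + reference_offset
    return -plane_above_point
```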
The detector 20 transmits the data relating to the points of impingement of the laser beam 18 (or the already determined corresponding z coordinate) via the interfaces 32 to the smart glasses 24, where the processor 30 then serves as an analysis unit in order to derive a result for the measurement point from the determined z coordinate. This result can be, for example, the amount and/or the direction of the deviation from a specified target value.
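The derivation of such a result from the determined z coordinate can be sketched as follows (illustrative names; the description mentions only the amount and/or direction of the deviation from a target value):

```python
def deviation(z_measured, z_target):
    """Result for a measurement point: amount and direction of the
    deviation of the determined z coordinate from the target value."""
    delta = z_measured - z_target
    direction = "high" if delta > 0 else "low" if delta < 0 else "on target"
    return abs(delta), direction
```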
This procedure is then repeated for each of the other measurement points provided.
The results for the individual measurement points are graphically represented to the user by means of the head-up display 26, for example superimposed, in correct position, on the body surface 10 at the respective measurement points.
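A conceivable rendering rule for the head-up display (purely illustrative; the description leaves the concrete form of the graphical representation open) turns each result into a color-coded marker anchored at the (x, y) position of the measurement point:

```python
def result_marker(x, y, delta, tolerance=0.001):
    """Marker description for the head-up display: green when the
    deviation is within tolerance, otherwise red, with the signed
    deviation shown in millimetres."""
    color = "green" if abs(delta) <= tolerance else "red"
    label = f"{delta * 1000:+.1f} mm"
    return {"position": (x, y), "color": color, "label": label}
```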
Foreign Application Priority Data

| Number | Date | Country | Kind |
| --- | --- | --- | --- |
| 10 2017 128 588 | Dec 2017 | DE | national |
Foreign Patent Documents

| Number | Date | Country |
| --- | --- | --- |
| 103 19 369 | Dec 2004 | DE |
| 10 2008 020 771 | Jul 2009 | DE |
| 10 2008 020 772 | Oct 2009 | DE |
| 10 2015 201 290 | Jul 2016 | DE |
| 2 916 099 | Sep 2015 | EP |
| 2 998 694 | Apr 2016 | EP |
| 2014/014786 | Jan 2014 | WO |
| 2016/085682 | Jun 2016 | WO |
Other Publications

| Entry |
| --- |
| European Search Report for EP18207699.2 dated May 6, 2019, European Patent Office. |
| German Search Report for DE Application No. 102017128588.1 dated Aug. 13, 2018. |
Publication Data

| Number | Date | Country |
| --- | --- | --- |
| 20190171015 A1 | Jun 2019 | US |