This application claims Convention priority of German patent application 10 2016 106 696.6, filed on Apr. 12, 2016. The entire content of this priority application is incorporated herein by reference.
The disclosure relates to a system for measuring spatial coordinates of a measurement object. The system according to this disclosure may also be referred to as a mobile optical coordinate measuring machine.
Coordinate measuring machines serve for checking workpieces, for example as part of quality assurance, or for ascertaining the geometry of a workpiece as part of what is known as “reverse engineering.” Moreover, diverse further applications are conceivable, such as process-controlling applications, in which the measurement technique is applied directly for online monitoring and regulation of manufacturing and processing processes.
In coordinate measuring machines, different types of sensors may be used to capture the workpiece to be measured. By way of example, sensors that measure in tactile fashion are known in this respect, as are sold by the applicant under the product designation “VAST XT” or “VAST XXT”. Here, the surface of the workpiece to be measured is scanned with a stylus, the coordinates of said stylus in the measurement space being known at all times. Such a stylus may also be moved along the surface of a workpiece in a manner such that a multiplicity of measurement points can be captured at set time intervals during such a measurement process as part of a so-called “scanning method”.
Moreover, it is known to use optical sensors that enable non-contact capture of the coordinates of a measurement object or workpiece. The present disclosure relates to such a coordinate measuring machine or coordinate measuring system comprising an optical sensor.
In optical dimensional metrology, great outlays regularly arise if the form of measurement objects or workpieces is intended to be measured with accuracies in the range of single micrometers. This is generally attributable to the fact that comparatively complex and heavy sensors are guided by comparatively complex machines along preplanned trajectories. Subsequently or in parallel, the optically captured information is then related to the spatial information provided by the machine actuator system, such that the surface of the object to be measured can be reconstructed. One example of such an optical sensor is the optical sensor sold by the applicant under the product designation “ViScan”. An optical sensor of this type can be used in various types of measurement setups or coordinate measuring machines. Examples of such coordinate measuring machines are the products “O-SELECT” and “O-INSPECT”, which are sold by the applicant.
The issue of the mobile usability of such coordinate measuring machines is increasingly gaining in importance, since more flexible usability would extend the spectrum of use of the coordinate measuring machines even further. However, the extremely stringent requirements in terms of the measurement accuracy that these coordinate measuring machines are intended to deliver often militate against the mobile usability of such a coordinate measuring machine. It is true that manifold digital-optical possibilities now exist, in particular software methods, for deducing the spatial structure of the imaged objects in a scene from images or films of objects or scenes. For this purpose, a 3D point cloud is usually generated computationally from the image or video material. Many of these possibilities are even accessible at no cost. In principle, however, these methods have some deficiencies which have the consequence that they are currently still not appropriate for the highly accurate measurements demanded. The most serious deficiency is the failure to achieve the measurement accuracy required for industrial applications.
In the digital-optical methods known heretofore, which involve the use of conventional photographic or video apparatuses, for example, achieving the demanded measurement accuracy is usually not possible, in particular for the following reasons: A simple calibration of “imaging scales” is ruled out since the optical units used for imaging in mobile terminals, but also in expensive cameras, are not designed for metrological purposes. Inter alia, they are generally not telecentric, which leads to unquantifiable, defocus-dependent imaging scale variations during operation. Their deformations and distortions are generally unknown and, under certain circumstances, not reproducible during the operation of the optical unit. This is applicable particularly if video or photographic apparatuses with moving zoom and/or autofocus optical units are involved.
One possible solution would consist in adding reference objects of known size to the scene. This would enable a calibration of the imaging conditions that were present when the respective image was recorded. Particularly for the measurement of relatively large parts, however, this reference object would then have to be carried along. Alternatively, a large number of reference objects would have to be available. Both are impractical if only from a workflow standpoint.
A further problem is achieving high accuracy over relatively large measurement distances. The known approaches are based in principle on so-called stitching, i.e. correlative methods for determining the offset of the individual images of an image sequence. This approach in principle allows measurement errors to increase without limit and, in addition, is greatly restricted regarding achievable accuracies if the imaging aberrations vary within the image sequence. Moreover, the stability of the correlation calculation is greatly influenced by the image content. Feature-poor objects, for example cleanly manufactured small drilled holes in groove-free surroundings, which then typically also have to be measured accurately, constitute particularly poor image contents for correlative methods.
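The unbounded error growth of correlative stitching can be illustrated with a short simulation. The following is a purely illustrative Python sketch; the function name and the numerical values are chosen here for demonstration and do not appear in the disclosure. Each pairwise image offset carries a small random error, and the absolute position of the n-th image accumulates all n of these errors.

```python
import random

def stitched_position(true_step, n_images, sigma, seed=0):
    """Accumulate per-image offset estimates, each with error sigma.

    In correlative stitching, the absolute position of image n is the
    sum of n noisy pairwise offsets, so the position error grows with
    the number of images rather than staying bounded.
    """
    rng = random.Random(seed)
    position = 0.0
    for _ in range(n_images):
        position += true_step + rng.gauss(0.0, sigma)
    return position

# Error of the accumulated position vs. the true end position after
# 100 images with a true step of 0.5 mm and 1 um offset noise:
true_end = 100 * 0.5
estimate = stitched_position(0.5, 100, sigma=0.001)
drift = abs(estimate - true_end)
```

With zero noise the accumulated position matches the true end position exactly; with noise, the residual drift is the accumulated stitching error that a pose-tracked system avoids.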
It is thus an object to provide a coordinate measuring system, that is to say a system for measuring spatial coordinates of a measurement object, which overcomes the disadvantages mentioned above. In this case, it is an object, in particular, to provide a solution which is capable of mobile use and is comparatively cost-effective and which nevertheless makes it possible to ensure the measurement accuracy required for industrial metrology.
In accordance with one aspect of the present disclosure, a system for measuring spatial coordinates of a measurement object is provided, comprising:
In accordance with a further aspect of the present disclosure, a method is provided comprising the following steps:
As far as the essential component parts of the herein presented system are concerned, said system is similar to a commercially available optical coordinate measuring machine insofar as here, too, the three customary modules, namely sensor system, actuator system and control unit, are used for generating the 3D information of the measurement object.
In contrast to the customary optical coordinate measuring machines, the sensor system for capturing the data of the measurement object comprises a mobile computer device and a pose determination unit having an external tracking sensor, which is configured to capture data with regard to the pose of the mobile computer device, i.e. the position and location of the mobile computer device. The mobile computer device is equipped with an optical sensor, which is designated as “first optical sensor” in the present case for differentiation from further optical sensors. Said first optical sensor is preferably a camera that can be used to gather image data from the measurement object and, if appropriate, also the environment thereof. Said image data may comprise one or a plurality of images or an entire image sequence, that is to say also a video. The mobile computer device is preferably a tablet computer, a smartphone or a laptop. In principle, however, other mobile terminals are also appropriate.
In comparison with a conventional optical coordinate measuring machine, the herein presented system does not have an actuator controlled in an automated manner. Instead, in the present system the human acts as an actuator, moving the sensor, i.e. the mobile computer device with the camera arranged thereon (first optical sensor), relative to the measurement object. However, since the human cannot provide information in the micrometer range with regard to this movement, the position and location information (pose data) of the mobile computer device is generated via the already mentioned external tracking sensor of the pose determination unit. With the aid of said external tracking sensor, which is embodied separately from the mobile computer device, the position and location of the mobile computer device in space is known at every point in time during the movement of said mobile computer device.
In a manner similar to that in the case of conventional optical coordinate measuring machines, the processing of the image data captured from the measurement object and also of the pose data of the mobile computer device is effected in a control unit, which calculates the spatial coordinates of the measurement object on the basis of said data. Said control unit preferably comprises a processor on which corresponding metrological software is implemented, with the aid of which said data can be evaluated and the spatial coordinates of the measurement object can be calculated on the basis thereof. Moreover, the control unit preferably has the possibility of retrieving predefined test plans or of storing the progression of a measurement carried out together with the results thereof in a manner such that they are retrievable again.
The performance or the demanded measurement accuracy is achieved in particular by virtue of the fact that the spatial position and location of the image recording component, i.e. of the mobile computer device, are known unambiguously at any time, since they are captured unambiguously by the pose determination unit of the system.
The fact that for example a conventional tablet computer can be used as a mobile computer device affords not only the advantages already mentioned above regarding the very mobile usability of the herein presented system, but also enormous cost advantages compared with conventional optical coordinate measuring machines. Nevertheless, an accuracy of the measurement of the measurement object in the range of one or a few micrometers can be achieved with the aid of the herein presented system.
In a refinement, the control unit is configured to assume the measurement object to be time-invariant when evaluating the image data of the measurement object to determine the spatial coordinates.
In other words, the metrological software implemented on the control unit contains an algorithm which, in the evaluation of said image and pose data, assumes the measurement object itself to be time-invariant. Together with the fact that the spatial position and location (pose) of the mobile computer device and thus also the spatial position and location (pose) of the first optical sensor are known at any time, this additional input information or condition makes it possible to correct imaging differences in the individual images captured from the measurement object (for example a magnification caused by a relatively small distance between the mobile computer device and the measurement object) and also imaging aberrations such as e.g. distortions. A very accurate 3D reconstruction of the measurement object can thus be created from the image data captured from the measurement object.
The control unit can be integrated either directly into the mobile computer device or at least partly on an external computer or server that is connected to the mobile computer device via a corresponding data connection. A partial or total integration of the control unit into an external computer or server has the following advantages in comparison with its integration into the mobile computer device: A possibly limited performance of the mobile computer device is then of less significance. An increased power demand, which often leads to the mobile computer device heating up, can thus be prevented as well. This is advantageous particularly because instances of sensor heating, which are often accompanied by deformations, are extremely disadvantageous for metrological applications. Moreover, rechargeable battery power of the mobile computer device can also be saved. The communication of the data between the mobile computer device, the pose determination unit and the control unit can be effected both in a wired fashion and wirelessly.
In a further refinement, the external tracking sensor of the pose determination unit comprises a second optical sensor, wherein the pose data of the mobile computer device comprise image data of a monitoring region including the mobile computer device.
Preferably, said second optical sensor comprises two stationary cameras. These two stationary cameras are preferably arranged offset with respect to one another in space, such that 3D image data can be put together from the image data obtained by said cameras in a known manner. Alternatively, said second optical sensor may also comprise more than only two cameras or be realized as a 3D camera, for example a stereo camera, a plenoptic camera or a TOF camera.
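For a rectified pinhole stereo pair, the depth of a point follows from its disparity between the two stationary cameras via the standard relation Z = f·B/d. The sketch below is illustrative only; the function name and the numerical values are assumptions for demonstration and are not taken from the disclosure.

```python
def triangulate_depth(x_left, x_right, baseline_mm, focal_px):
    """Depth of a point from its pixel disparity in a rectified stereo pair.

    Standard pinhole stereo relation: Z = f * B / d, where d is the
    disparity between the two image x-coordinates, B the camera baseline
    and f the focal length in pixels.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_mm / disparity

# A point seen at x = 640 px in the left and x = 600 px in the right
# camera, with a 500 mm baseline and a 1200 px focal length:
depth = triangulate_depth(640.0, 600.0, baseline_mm=500.0, focal_px=1200.0)
# depth = 1200 * 500 / 40 = 15000 mm
```

The 3D image data of the monitoring region mentioned above would, in practice, be assembled from many such triangulated points after a full stereo calibration of the two cameras.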
In a further refinement, the pose determination unit furthermore comprises an internal position and/or location capture sensor, which is integrated into the mobile computer device and is configured to capture further data with regard to position and/or location of the mobile computer device, wherein the control unit is configured to determine the spatial coordinates of the measurement object also on the basis of the data captured by the internal position and/or location capture sensor.
Therefore, in this refinement, the pose of the mobile computer device is determined not only via the external sensor, but also with the aid of further sensors integrated into the mobile computer device. Measurement accuracy, measurement speed and long-term stability can thereby be increased. Examples of such internal position and/or location capture sensors are: a GPS/GLONASS sensor, a gyrometer, one or more acceleration sensors, a barometer, etc. Such sensors are already contained in commercially available tablet computers. By way of example, speed and location of the mobile computer device can be calculated by single or double integration of the data of an acceleration sensor integrated into the mobile computer device. Similar evaluations are possible via a gyrometer which is integrated into the mobile computer device and which can be used to ascertain angles and/or locations in space and angular velocities of the mobile computer device. By combining temporal, spatial and/or Fourier frequency filtering and/or Kalman filtering and/or other methods for so-called sensor data fusion of the measurement values of the individual sensors of the pose determination unit, it is possible simultaneously to increase the accuracy and the measurement speed of the capture of the pose of the mobile computer device.
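The single and double integration of acceleration data mentioned above can be sketched as follows. This is a minimal one-axis sketch; the trapezoidal scheme and all names are illustrative assumptions, and in practice the quadratic drift from sensor bias is exactly why fusion with the external tracking sensor is needed.

```python
def integrate_acceleration(accel_samples, dt, v0=0.0, x0=0.0):
    """Single and double trapezoidal integration of accelerometer samples.

    Returns the velocity and position traces for one axis. Any constant
    sensor bias grows linearly in velocity and quadratically in position,
    so stand-alone integration drifts and must be corrected, e.g. by
    Kalman-filter fusion with the external tracking sensor.
    """
    v, x = v0, x0
    velocities, positions = [v], [x]
    for a_prev, a_next in zip(accel_samples, accel_samples[1:]):
        v += 0.5 * (a_prev + a_next) * dt                    # trapezoidal rule
        velocities.append(v)
        x += 0.5 * (velocities[-2] + velocities[-1]) * dt    # integrate velocity
        positions.append(x)
    return velocities, positions

# Constant 1 m/s^2 acceleration for 1 s, sampled at 10 Hz:
vel, pos = integrate_acceleration([1.0] * 11, dt=0.1)
# expected: final velocity 1 m/s, final position 0.5 m
```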
In accordance with a further refinement, the mobile computer device furthermore comprises a third optical sensor for capturing image data from the environment of the mobile computer device, wherein the control unit is configured to identify at least one stationary reference point in the image data captured by the third optical sensor and to ascertain the position and location of said reference point with regard to the mobile computer device, and wherein the control unit is configured to determine the spatial coordinates of the measurement object also on the basis of the ascertained position and location of the at least one identified reference point relative to the mobile computer device.
The first and the third optical sensors, which are both integrated into the mobile computer device, preferably each comprise a camera. The camera of the first sensor and the camera of the third sensor preferably have opposite viewing directions. The mobile computer device preferably additionally comprises a display. The viewing direction of the camera of the first sensor is preferably opposite to the emission direction of the display. By contrast, the camera of the third sensor is preferably arranged on an opposite side with respect to the camera of the first sensor, that is to say preferably on the same side of the mobile computer device as the display.
In the refinement mentioned last, the third optical sensor is thus likewise part of the pose determination unit. Preferably, in particular the location of the mobile computer device in space is determined with the aid of the evaluation of the image data captured by the third optical sensor. For support, 2D or 3D objects which are stably localizable and easily recognizable for image-processing algorithmic procedures can also be added to the object space in order to be better able to ascertain spatial relations in the image data.
In a further refinement, the control unit is configured to determine whether the external tracking sensor is imaged in the image data captured by the third optical sensor.
With the aid of this evaluation, it is possible to check whether the external tracking sensor, that is to say for example the two stationary cameras for externally determining the pose of the mobile computer device, have a free view of the mobile computer device. Consequently, it would be possible to recognize, for example, if one or both stationary cameras of the external tracking sensor temporarily cannot optically capture the mobile terminal at all, since for example a human or an object is obstructing the field of view. The control unit may be configured not to use, or to use only in part, the image data of the external tracking sensor for the determination of the spatial coordinates of the measurement object if, on the basis of the image data captured by the third optical sensor, it is ascertained that the external tracking sensor is not imaged or is only partly imaged in said image data. In this case, therefore, data of one or both stationary cameras of the tracking sensor which at times do not optically capture the mobile computer device would not be taken into account in these time intervals. As a result, the bandwidth or computing power can be saved and the stability of the so-called position fix can also be increased.
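The described discarding of tracking data during obstructed intervals amounts to a simple visibility gate. The sketch below assumes, purely for illustration, that the image analysis of the third sensor sets a per-frame flag; this data layout and all names are assumptions, not part of the disclosure.

```python
def filter_tracking_frames(frames):
    """Keep only tracking frames during which the external cameras were
    visible in the image of the device's third optical sensor.

    Each frame is a dict with a 'cameras_visible' flag set by the image
    analysis of the third sensor (illustrative layout).
    """
    return [f for f in frames if f["cameras_visible"]]

frames = [
    {"t": 0.0, "cameras_visible": True},
    {"t": 0.1, "cameras_visible": False},  # view obstructed: discard
    {"t": 0.2, "cameras_visible": True},
]
usable = filter_tracking_frames(frames)
```

Discarding the obstructed frame saves bandwidth and prevents the momentarily blind camera from corrupting the position fix.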
In a further refinement, the mobile computer device comprises a display and an optical marker, wherein the optical marker either is arranged fixedly on the mobile computer device or is generated on the display, and wherein the control unit is configured to determine the pose of the mobile computing device with the aid of the optical marker.
With the aid of one such optical marker or with the aid of a plurality of such optical markers, the pose of the mobile computer device can be determined even more precisely on the basis of the image data obtained by the external tracking sensor. The markers may be binary or black-white, grayscale-gradated or else colored structures. These structures can be identified with the aid of the cameras of the tracking sensor relatively simply in the image data thereof, such that a tracking of said markers in space can be ensured relatively simply. The optical markers may be static or variable markers.
In a further refinement, the control unit is configured to generate the optical marker on the display and to vary a representation and/or position of the optical marker on the display over time.
Such a temporal variation of the markers has a number of advantages: Firstly, it is thereby possible to synchronize parts of the system according to the disclosure with one another. Secondly, the structures represented on the display can be adapted variably to the external conditions.
In a further refinement, the control unit is configured to vary the representation and/or position of the optical marker on the display depending on the pose data determined by the pose determination unit.
A modification of the marker structures represented on the display depending on the pose data of the mobile computer device has the particular advantage that the optical marker is adaptable variably to the changing viewing direction for the external tracking sensor, such that the external tracking sensor can determine the pose of the mobile computer device with uniformly high measurement accuracy. Also in this case, the pose data may comprise the data determined by the external tracking sensor and the data determined by the internal position and/or location capture sensor(s).
In a further refinement, the control unit is configured to synchronize the image data of the measurement object captured by the first optical sensor with the image data of the monitoring region captured by the second optical sensor, on the basis of the temporally varied representation and/or position of the optical marker.
Further possibilities for synchronizing the captured data include access to universal time. In this case, however, both the second optical sensor or the external tracking sensor and the mobile computer device would have to have access to the universal time clock. Even though commercially available tablet computers normally have such access to the universal time clock anyway, this would necessitate a further data connection for the external tracking sensor. A synchronization as presented above with the aid of the temporally varied representation and/or position of the optical marker on the display is thus significantly more elegant and more autonomous.
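A synchronization via the temporally varied marker can be sketched as a clock-offset estimate: the device records when it changed the on-screen marker, the tracking sensor records when its cameras observed each change, and a robust single-parameter fit recovers the offset. The matching of events, the median fit and all names are illustrative assumptions.

```python
def estimate_clock_offset(display_events, observed_events):
    """Estimate the clock offset between the mobile device and the
    external tracking sensor from matched marker-change events.

    display_events: timestamps (device clock) at which the on-screen
    marker changed; observed_events: timestamps (tracker clock) at which
    the tracker's cameras detected each change. The median of the
    pairwise differences is a robust one-parameter estimate.
    """
    diffs = sorted(o - d for d, o in zip(display_events, observed_events))
    mid = len(diffs) // 2
    if len(diffs) % 2:
        return diffs[mid]
    return 0.5 * (diffs[mid - 1] + diffs[mid])

# Tracker clock runs 0.25 s ahead of the device clock; detections
# carry ~10 ms of jitter:
offset = estimate_clock_offset([0.0, 1.0, 2.0], [0.25, 1.26, 2.24])
```

Once the offset is known, the image data of the first and second optical sensors can be placed on a common time base without any additional data connection for the tracking sensor.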
In accordance with a further refinement, the first optical sensor comprises a telecentric optical unit or plenoptic optical unit, which is integrated into the first optical sensor or is arranged in a releasable manner on the latter.
The second variant of a releasable arrangement, for example with the aid of a clip-on optical unit, is preferred in the present case since the mobile computer device does not have to be permanently modified for this purpose. Such clip-on optical units are advantageous particularly for cases in which the reproducibilities of the imaging relationships of the moving optical units of the mobile computer device are not sufficient to be able to achieve the desired accuracies. For these cases a clip-on optical unit could be designed such that the optical unit in the mobile computer device need no longer be adjusted. That is to say that possible desired changes, e.g. in the working distance and/or the magnification, would be transferred to the clip-on optical unit. This adjustable optical unit is preferably controlled via the mobile computer device or the control unit. Clip-on optical units can also be used for the display of the mobile computer device, as will also be explained in detail further below.
In accordance with a further refinement, the system furthermore comprises an illumination device for illuminating the measurement object, wherein the control unit is configured to control the illumination device in such a way that the latter periodically changes its brightness.
Such an illumination device is advantageous particularly if the ambient illumination or the display luminous intensity of the mobile computer device does not suffice to be able to measure sufficiently rapidly and/or accurately. A periodic change in brightness, that is to say for example a blinking or flashing of the illumination device, has the following advantages: a relatively low energy consumption; and shorter shutter times for the recordings of the scenes by the mobile computer device, such that the recordings of the first optical sensor, for example, are less blurred if the mobile computer device is moved during the capture of the image data. Moreover, less heat is generated, which is always advantageous in metrological applications.
With the use of such an illumination device it is advantageous if the display of the mobile computer device is at least partly antireflection-coated. This ensures that the external tracking sensor can reliably identify the above-mentioned markers that are generated on the display.
In principle, however, it is also possible to use a spectral and/or temporal separation of the image recordings and/or of the tracking of the position and location of the mobile computer device and/or of the ambient illumination. In this case, the possibility is then also afforded of separately optimizing the illumination and imaging conditions for the two measurement tasks (capturing image data from the measurement object and capturing image data from the mobile computer device).
Furthermore, it is possible to use the display of the mobile computer device for illuminating the measurement object. By way of example, the control unit may be configured to drive the display in such a way that a stripe projection is imaged on the measurement object. Further possibilities for the optimized illumination of the measurement object are explained in greater detail further below with reference to the figures.
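Driving the display as a stripe projector amounts to rendering a periodic bright/dark pattern. The following sketch generates binary vertical stripes; the function name, the phase-shift parameter and the values are assumptions for illustration only.

```python
def stripe_pattern(width, height, period_px, phase=0):
    """Binary vertical stripe pattern for display-based stripe projection.

    Returns rows of 0/1 pixel values; the pattern repeats every
    period_px pixels, and shifting `phase` produces the displaced
    patterns used in phase-shift stripe-projection methods.
    """
    row = [1 if ((x + phase) // (period_px // 2)) % 2 == 0 else 0
           for x in range(width)]
    return [row[:] for _ in range(height)]

# An 8 x 2 pattern with a 4 px period: two bright, two dark pixels.
pattern = stripe_pattern(width=8, height=2, period_px=4)
# pattern[0] == [1, 1, 0, 0, 1, 1, 0, 0]
```

In a real measurement, the first optical sensor would observe the deformation of these stripes on the measurement object surface, from which its relief can be reconstructed.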
In a further refinement, the mobile computer device furthermore comprises a loudspeaker and/or a vibration actuator, wherein the control unit is configured to drive the vibration actuator and/or to output information via the display and/or the loudspeaker depending on the image data captured by the first optical sensor and/or depending on the data captured by the pose determination unit.
In measurement applications in which the display faces the user, besides the optical markers mentioned above it is also possible to represent supporting optical information for the user on the display. Said information may include e.g. feedback messages as to whether enough image data have already been captured for the 3D reconstruction of the measurement object. Likewise, it is also possible to pass on instructions to the user indicating for which parts of the measurement object image data must still be captured in order to capture said measurement object completely. Parts of the display may, if appropriate, also be provided for the control of the system via a touchscreen function. Alternatively, a control of the system may also be effected by voice control via built-in microphones. Further supporting information can also be communicated to the user acoustically via the loudspeaker. Since, in the present use, the user moves the mobile computer device preferably using his/her hands, the output of vibration signals is particularly advantageous since this directly appeals to the sense of touch that is used by the user anyway in this activity. The control unit may be configured to support the user when implementing a predefined test plan. In this regard, said control unit may instruct the user e.g. with regard to the positions “to be moved to”, and it may give feedback messages optically, acoustically and/or haptically in the course of measurement operation. Said feedback messages for the user may concern e.g. error messages, achieved accuracies, permissible speeds or measurement distances, etc.
It is understood that the aforementioned features and those yet to be explained below may be used not only in the respectively specified combination but also in other combinations or on their own, without departing from the spirit and scope of the present disclosure. It should likewise be pointed out that the abovementioned embodiments described essentially in relation to the herein presented system relate in a corresponding manner to the herein presented method.
Exemplary embodiments of the invention are shown in the drawings and are explained in greater detail in the following description. In the figures:
The system 10 is illustrated schematically in
Instead of a large, structurally complex, relatively immobile measurement set-up of a coordinate measuring machine usually used for such tasks, the system 10 according to the disclosure for measuring the spatial coordinates of the measurement object 14 is comparatively small and capable of mobile use. The system 10 comprises a mobile computer device 20. Said mobile computer device 20 is preferably a tablet computer. By way of example, it is possible to use an iPad Air 2 WiFi plus cellular™, since this device combines a large number of the functions which are required for the system 10 according to the disclosure and are explained below. In principle, however, the use of a smartphone or laptop is also conceivable.
A first exemplary embodiment of such a mobile computer device 20 is illustrated schematically in
In the present exemplary embodiment, the mobile computer device 20 furthermore also comprises a display 24 and a further optical sensor 26, which is designated as third optical sensor 26 in the present case. The third optical sensor 26 is preferably arranged on the same side of the mobile computer device 20 as the display 24. By contrast, the first optical sensor 22 is preferably arranged on the opposite side, such that the optical sensors 22, 26 have opposite viewing directions, as is illustrated schematically with the aid of the arrows 28 (see
A further component part of the system 10 according to the disclosure is a pose determination unit 30 comprising a tracking sensor 32 for capturing data with regard to the position and location of the mobile computer device 20. Said tracking sensor 32 is embodied as an external tracking sensor, that is to say that it is not integrated into the mobile computer device 20, but rather tracks the position and location thereof externally. The external tracking sensor 32 preferably comprises one or more cameras. In the exemplary embodiment illustrated schematically in
The spatial position and location (pose) of the image recording system 20 are thus known unambiguously and at every point in time since they are supplied by the external tracking sensor 32. The additional assumption of the time invariance of the measurement object 14 then allows correction of the imaging differences (such as e.g. different working distance and hence magnification or reduction) and the imaging aberrations (such as e.g. distortion) in the individual images of an image sequence that are supplied by the first optical sensor 22, and creation of a continuous and accurate 3D reconstruction of the measurement object 14 from the entire corrected image sequence. For this purpose, by way of example, at least two images are recorded from the measurement object 14 with the aid of the first optical sensor 22, wherein these two images are recorded in different positions and/or locations of the mobile computer device 20 and thus also of the first optical sensor 22. The position and location of the mobile computer device 20 (and thus also of the first optical sensor 22) at the point in time of capturing the two images mentioned can be ascertained exactly on the basis of the image data obtained by the cameras 34, 36. Size and position changes of the imaging of the measurement object 14 from one of the two images to the other can then be linked with the ascertained position and location change of the first optical sensor 22, such that ultimately the real dimensions within the two images captured by the first optical sensor 22 can be determined unambiguously. The control unit 38 is preferably configured ultimately to calculate a 3D point cloud of the measurement object 14 with the aid of the method mentioned above, wherein the coordinates of these points can be represented in an unambiguously defined, known coordinate system. Measurements with an accuracy in the range of one or a few micrometers are possible in this way.
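The linking of the size change of the imaged object with the tracked position change can be illustrated for the simplest case of a pinhole camera moved purely along its optical axis between the two images. All names and numerical values in this sketch are assumptions for illustration and are not taken from the disclosure.

```python
def object_size_from_two_views(s1_px, s2_px, delta_mm, focal_px):
    """Recover the real size of a feature from two images taken at a
    known change in working distance (pinhole model, axial motion).

    s1_px: image size of the feature in the first view; s2_px: image
    size in the second view after moving delta_mm closer to the object
    (delta_mm as supplied by the tracking sensor). From s = f*S/Z it
    follows that s2/s1 = Z1/(Z1 - delta), which fixes the absolute
    working distance Z1 and hence the real size S.
    """
    if s2_px <= s1_px:
        raise ValueError("second view must be closer (larger image)")
    z1_mm = delta_mm * s2_px / (s2_px - s1_px)   # distance in first view
    return s1_px * z1_mm / focal_px              # real feature size

# A feature appears 100 px wide, then 125 px wide after moving 100 mm
# closer, with a 1000 px focal length:
size_mm = object_size_from_two_views(100.0, 125.0, 100.0, 1000.0)
# z1 = 100 * 125 / 25 = 500 mm; size = 100 * 500 / 1000 = 50 mm
```

This is the essential mechanism by which the externally tracked pose change resolves the absolute scale that a single uncalibrated image cannot provide.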
Instead of an embodiment of the control unit 38 as a component completely integrated into the mobile computer device 20, an embodiment is likewise conceivable in which at least parts of the control unit 38 and/or of the data storage are implemented in an external computer or in a cloud.
The exemplary embodiment illustrated in
The above-explained measurement principle of the system 10 according to the disclosure can be optimized with regard to its precision with the aid of a multiplicity of further system features. In order to simplify the optical capture of the position and location of the mobile computer device 20, a plurality of optical markers 48 can, for example, be represented on the display 24. In accordance with one exemplary embodiment of the present disclosure, the shape and/or position of said optical markers 48 on the display 24 can be changed in a predefined manner over time. This enables, for example, automated temporal synchronization of the external tracking sensor 32 (cameras 34, 36) and of the mobile computer device 20 with the first optical sensor 22 incorporated therein. Likewise, however, it is also possible to change the optical markers 48 represented on the display 24 depending on the position and location of the mobile computer device 20. For this purpose, the mobile computer device 20 preferably comprises an internal pose determination sensor 50 (see
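By way of illustration only, a time-varying marker pattern of the kind described above could encode a frame counter in the on/off states of the markers 48, so that the image streams of the tracking cameras 34, 36 can be aligned in time with the display 24. The binary on/off markers and the Gray-code encoding below are assumptions for this sketch, not features mandated by the disclosure.

```python
def encode_marker_pattern(frame_index, n_markers=8):
    """Encode a frame counter in the on/off states of the display markers.

    A Gray code is used so that consecutive frames differ in exactly one
    marker, which keeps a single mis-detected marker from producing a
    large synchronization error.
    """
    gray = frame_index ^ (frame_index >> 1)
    return [(gray >> i) & 1 for i in range(n_markers)]

def decode_marker_pattern(states):
    """Recover the frame counter from the marker states observed by the
    tracking cameras (inverse of the Gray-code encoding)."""
    gray = sum(bit << i for i, bit in enumerate(states))
    num = gray
    shift = 1
    while (num >> shift) > 0:   # standard Gray-to-binary conversion
        num ^= num >> shift
        shift <<= 1
    return num
```

Matching the decoded counter against the display's own frame clock then yields the time offset between the tracking sensor 32 and the mobile computer device 20.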
In comparison with the embodiments illustrated schematically in
A further possible application of the third optical sensor 26 in the system 10 according to the disclosure is as follows: The image data captured by the third sensor 26 can also be evaluated as to whether the cameras 34, 36 are visible therein. This evaluation is based on the consideration that a lack of visibility of the cameras 34, 36 in the image data captured by the third optical sensor 26 is a strong indication that the cameras 34, 36 likewise do not have an unrestricted view of the mobile computer device 20. If such a case is detected, the control unit 38 can discard the corresponding image data of one or both cameras 34, 36. This saves data capacity and increases the robustness of the position and location determination.
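The discarding of presumably occluded pose data described above can be sketched as a simple filter. Illustrative only: the per-frame dictionary layout and the upstream detector that fills the "cameras_seen" field are assumptions made for this sketch.

```python
def filter_pose_frames(frames, min_cameras_visible=1):
    """Keep only frames in which the tracking cameras are visible to the
    third (rear-facing) optical sensor.

    Each frame is a dict with:
      "cameras_seen": identifiers of tracking cameras detected in the
                      third sensor's image (detector assumed upstream),
      "pose":         the pose data supplied by the tracking sensor.
    Frames in which too few cameras are visible are discarded, since
    the cameras then likely have no unrestricted view of the device.
    """
    return [
        frame for frame in frames
        if len(frame["cameras_seen"]) >= min_cameras_visible
    ]
```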
Further exemplary embodiments of the mobile computer device 20 and of the system 10 according to the disclosure are illustrated schematically in
Telecentric clip-on optical units 54, 56 are advantageous in particular for metrological applications or dimensional measurement. Telecentric optical units differ fundamentally from the customary optical units installed in mobile computer devices 20: they image objects at a scale that is independent of the working distance. This is ideal for measurement, since uncertainties in the positioning of the imaging system then do not translate into uncertainties of the imaging scale, which would directly limit the achievable measurement accuracy.
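The benefit of a distance-independent imaging scale can be quantified with a small sketch: for a conventional (entocentric) lens the magnification varies roughly as the inverse of the object distance, so a working-distance error produces a proportional scale error, whereas an ideal telecentric optical unit exhibits none within its telecentric range. The thin-lens approximation below is an illustrative assumption, not a statement about the specific clip-on units 54, 56.

```python
def entocentric_scale_error(working_distance_mm, distance_error_mm):
    """Relative imaging-scale error of a conventional (entocentric) lens.

    Magnification varies as ~1/d (thin-lens approximation), so a working
    distance that is off by distance_error_mm changes the scale by
    |delta| / (d + delta).
    """
    d = working_distance_mm
    return abs(distance_error_mm) / (d + distance_error_mm)

def telecentric_scale_error(working_distance_mm, distance_error_mm):
    """An ideal telecentric optical unit images at a distance-independent
    scale, so positioning errors within the telecentric range cause no
    scale error at all."""
    return 0.0
```

For a 100 mm working distance, a 1 mm positioning error of the mobile computer device 20 already yields a scale error of roughly one percent, i.e. about 0.5 mm across a 50 mm feature, which is far beyond micrometer accuracy; a telecentric unit avoids this error source entirely.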
Furthermore, the clip-on optical units 54, 56 can also be configured such that a so-called plenoptic or light-field recording becomes possible with the sensors 22′, 26′ of the mobile computer device 20. Image chips having a very high number of pixels (greater than 40 MPx), such as are becoming increasingly widespread in many mobile computer devices 20, offer a particularly good basis for this. The "transformation" of the normal terminal cameras 22, 26 into a plenoptic camera has the advantage that the computer device 20 is directly able to generate 3D information about the measurement object 14 from individual recordings. Accordingly, the use of a plenoptic camera in combination with the position and location ascertainment effected by the external tracking sensor 32 increases the stability and/or accuracy with which the system 10 according to the disclosure reconstructs the 3D contour of the imaged measurement object 14 from the image data.
Furthermore, as indicated schematically in
Instead of or in addition to the clip-on optical units 54, 56 pushed onto the sensors 22′, 26′, a corresponding clip-on optical unit 60 can also be placed onto the display 24 (see
With the aid of such clip-on optical units 60 it would also be possible to use the display 24 as illumination for the measurement object 14. This is illustrated schematically in the situation depicted in
Further sensors of the mobile computer device 20 may support the system 10 according to the disclosure as follows: The identity of the user 12 can be captured via a face recognition or fingerprint sensor. If appropriate, preset user parameters can thereby be loaded from archives. The identity of the user 12 may equally well be stored together with the measurement results in corresponding databases in an automated manner. Furthermore, it is possible to capture the motor characteristics or idiosyncrasies of the user 12 in order, depending thereon, to examine the quality of the measurement results or the speed of the measurements and to relate these to usage and/or trajectory parameters and/or environmental parameters. Measurement sequences can possibly be optimized as a result. Feedback messages or instructions can equally well be passed on to the user 12 by being output acoustically with the aid of a loudspeaker, passed on in tactile form with the aid of vibration actuators, or displayed via the display 24.
Overall, a multiplicity of application possibilities are thus conceivable with the system 10 according to the disclosure. Above all, however, the system 10 according to the disclosure affords the advantage that a comparatively exact coordinate measuring machine which is exceptionally capable of mobile use can be emulated with commercially available standard components.
Number | Date | Country | Kind
---|---|---|---
10 2016 106 696.6 | Apr 2016 | DE | national