This application claims the benefit of German Patent Application DE 10 2010 043136.2, filed Oct. 29, 2010, which is hereby incorporated by reference herein.
The present invention relates to a measuring device, in particular in the form of a handheld device, for a noncontact measurement of a distance to a target object. The present invention also relates to a method for noncontact measurement of distances on a target object.
A measuring device may be used as a handheld distance meter using a suitably designed laser measuring unit, for example.
A noncontact measurement of a distance to a target object is usually made with the aid of an optical measuring beam, for example, a laser beam. Regardless of the measuring beam used, fundamentally different methods are known for distance measuring; for example, a distance to a target object may be determined in a noncontact method with the aid of a travel time measurement, a phase measurement or laser triangulation. For implementing these or similar methods, the measuring device has a housing in which a distance measuring unit utilizing an optical measuring beam is situated, with the aid of which the distance to the target object is measurable without contact. An exemplary distance measuring unit which is advantageously designed for noncontact distance measuring via a travel time measurement is described in DE 101 12 833 C1, for example. It has a beam emitting unit in the form of a laser unit. In addition, an optical unit having optical elements for beam guidance is provided. The optical elements include at least one transmission and reception optical unit. A transmission optical unit is situated in an optical transmission path having an optical axis for emitting a measuring beam to the target object. A receiving optical unit is situated in an optical reception path having an optical axis for receiving the measuring beam reflected or backscattered by the target object.
Distance is understood within the scope of this patent application in two senses, which are distinguished in the following. A distance to the target object is understood to be a distance measured between the measuring device and the measuring point on the target object, concretely between a reference point on the measuring device and the measuring point on the target object; such a distance is oriented essentially transversely (usually at a right angle) to a lateral surface of the target object. Inasmuch as the term distance is used more generally, it thus includes distances to the target object. If the term distance is used specifically, this refers in particular to a lateral distance between target points present on the target object, i.e., in particular a distance on, or essentially aligned with, a lateral surface of the target object. In the present case, such a lateral distance refers specifically to a distance in the lateral surface, or in a lateral plane, of the target object or allocated to the target object. A lateral distance is thus measurable specifically on the target object itself in particular.
The lateral distances in the present specific case are, in contrast with the aforementioned distances to the target object, not accessible to direct measurement by a distance meter of the aforementioned type. For example, this relates initially to lengths, but also to surface areas which may be related thereto and are to be found on a building façade or the like, for example. These lengths are not measurable via a conventional distance measurement of the type described above. A simple, effective and preferably as accurate as possible measurement of lateral distances in surfaces or planes on a target object would be desirable in practice.
Essentially known methods of photogrammetry, for example, are usually limited to a mere visual so-called 3D modeling of camera shots without any dimensions of lateral distances on the target object, or in the surface or plane of the target object, being associated with them. For example, WO 00/25089 describes a device for three-dimensional representation of an object in which a distance measuring unit records distances of a number of points of an object within a target region. A link to a two-dimensional image of the target object is made only to generate a three-dimensional representation of the object therefrom, without providing any quantitative information about distances in the plane of the target object or on the target object.
EP 2 026 077 A1 describes a system for noncontact recording of three-dimensional coordinates which is relatively complex, as are other systems of this type. An image coordinate system, which refers to the recorded three-dimensional image, is transformed into the object coordinate system within which the object is to be measured. On the one hand, however, two or more cameras recording the target object from different positions at the same time are necessary. On the other hand, marks in the object coordinate system, on which the aforementioned transformation is based, are required.
Such systems may be too complex for applications at construction sites or in construction and renovation jobs, are susceptible to problems, and are thus ultimately not manageable. In particular, providing marks in an object coordinate system should be avoided if at all possible, because this is obviously impossible or very difficult in the case of target objects at a great distance or target objects that are simply inaccessible. In particular, placing marks on the target object entails a risk of accidents, which should ultimately be avoided.
Instead, it is desirable to simplify the measurement of lateral distances on a target object—if necessary, also the measurement of distances to the target object—and to make it more reliable and more efficient. It has also been found that an accuracy in the percentage range is sufficient for the profile of requirements usually encountered at construction sites or the like and meets initial user needs. Comparatively simple profiles of requirements exist, for example, when determining surfaces on the target object—in particular lateral surfaces and the lateral distances on the target object characterizing these surfaces.
A measuring device mentioned in the introduction, having a housing as well as a distance measuring unit situated in the housing and utilizing an optical measuring beam, and having a photoelectric image acquisition unit situated in the housing, offers an approach for doing so. However, this approach may be made simpler than is described in EP 2 026 077 A1, for example. It is fundamentally known that a distance measuring unit for distance measuring and a photoelectric image acquisition unit may be combined in one housing—as in a measuring device of the type defined in the introduction.
A measuring device of the type defined in the introduction is known from WO 2008/155657 or JP 08021878, for example. A distance measuring unit and a photoelectric image acquisition unit are implemented in such measuring devices, but they are situated in the housing in an uncoupled manner, an image of a measuring point on the target object being superimposed on a photoelectric image merely through the use of software. For example, JP 08021878 describes how the position of a scanned measuring point of the distance measuring unit, detected with the aid of a photodiode array, is superimposed on the photoelectric image of the target object within the context of the software application, and only then is the image displayed on a display screen. Similarly, in WO 2008/155657 the display of the distance meter and the photoelectric image of a camera are superimposed. Such software-based approaches have proven to be inadequate in compact measuring devices, in particular handheld ones.
Accordingly, approaches such as that described in EP 1 407 227 B1 merely visualize a measuring point on the target object via the photoelectric image acquisition unit—in other words, a photoelectric image acquisition unit in these systems acts like a telescopic sight to make the measuring point of a distance measuring unit on the target object visible to the eye of the user. It is thus impossible to measure lateral distances on the target object, in particular on surfaces or lateral surfaces of the target object.
DE 100 55 510 B4 by the present applicant discloses a measuring device of the type defined in the introduction in which a distance measuring unit and also a photoelectric image acquisition unit are provided in a housing. A control and computation unit calculates a virtual measuring spot and displays it graphically on the display screen, so that a parallax error for a distance measurement is correctable. Such a measuring device also measures only distances between the measuring device and the target object without lateral distances on the target object itself being determinable.
It is an object of the present invention to provide a measuring device and a method of the type defined in the introduction with the aid of which distances to a target object are determinable in a comparatively efficient and, in particular, simple manner. Lateral distances on a surface of the target object, such as, for example, lateral distances on a plane of the target object, are to be determinable in particular. It should be possible in particular to indicate the lateral distances on the target object at least approximately. In particular, the basis should be created for enabling a documentation and attribution of measured distances, i.e., in particular lateral distances on the target object, in an easily visualized manner using an easily handled measuring device, in particular in the form of a handheld device. In particular, it should additionally be possible to indicate distances between the measuring device and the target object.
The present invention is based on the consideration that a control and computation unit, having a memory and taking into account the available computation and storage capacities, may also be configured for handheld measuring devices in such a way that it is possible to provide at least approximate information about lateral distances on the target object by using the results of a distance measuring unit and a photoelectric image acquisition unit. The present invention is also based on the consideration that a simplification of a processing operation in the control and computation unit according to the proposed concept is advantageous. For this purpose, the present invention initially provides an image processing unit which is designed to define at least a number of target points on the target object as a number of corresponding pixels in at least one photoelectric image. A number is understood in the present case to be an integer number of points, i.e., one, two, three, four or more points. The measuring point may, but need not necessarily, be part of the image; advantageously, it is part of the image. In particular the measuring point may be one of the target points; this is not normally the case, but it may prove to be advantageous. In other words, the concept of the present invention is based on the analysis of initially exactly one photoelectric image and the single measuring point recorded with the photoelectric image. In particular, the concept of the present invention may already be implemented advantageously on the basis of a single photoelectric image obtained with a single photographic shot. In particular, it is sufficient for implementing the concept that the photoelectric image acquisition unit has a single viewfinder and camera lens. However, the present invention is not limited thereto. A significant advantage of the concept likewise lies in its comparatively simple implementability.
The present invention has recognized that the distance measuring unit supplies a distance between a reference point and the measuring point on a target object simultaneously with, or in real time relative to, the recording of the photoelectric image. Accordingly, the present invention additionally provides that the control and computation unit is designed to assign the defined pixel corresponding to a target point to the distance of the reference point from the measuring point. An allocation formed in this way between the distance supplied by the distance measuring unit and the measuring point on the one hand, and a pixel defined in the image of the photoelectric image acquisition unit on the other hand, has proven to be a sufficient basis for at least approximately determining lateral distances on the target object in particular.
Such an allocation may be implemented in fundamentally different ways within the scope of the concept of the present invention, for example, by a suitable reference to a corresponding value for the distance and the pixel. The values may be compiled in lists, fields or other allocations which are used for assigning values. Within the scope of a refinement, the allocation may be available as a separate numerical field or a numerical field formed by mutual reference, for example. The definition of a pixel may preferably be available as pixel coordinates and the distance as a distance measure. Advantageously, the allocation, as a triple of numbers, in particular includes the pixel coordinate difference of two pixels and the distance of the reference point from the measuring point as a distance measure. The control and computation unit may advantageously be designed with a memory, the distance and the at least one pixel being stored as an allocation in the memory.
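Purely by way of illustration, one possible in-memory representation of such an allocation—storing pixel coordinates and the distance measure as a triple of numbers—could look as follows; this is a minimal sketch in Python, and all names are illustrative rather than prescribed by the refinement:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Allocation:
    """Allocation of a defined pixel to the measured distance,
    stored as a triple of numbers in the memory."""
    x: int        # pixel coordinate (column) in the photoelectric image
    y: int        # pixel coordinate (row) in the photoelectric image
    z_m: float    # distance of the reference point to the measuring point, in meters

# Example: a measuring point image at pixel (412, 305) with measured z = 7.43 m.
p1 = Allocation(x=412, y=305, z_m=7.43)
```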
Within the scope of a particularly advantageous refinement of the present invention, it is provided that exactly one single photoelectric image results from a single recording of the target object. In particular it is provided for this purpose that precisely one measuring point is allocated to the photoelectric image. The measuring point is advantageously allocated to one of the number of pixels in the single photoelectric image but not to one of those pixels allocated to one target point. Such a recording of a photoelectric image with a measuring point and an additional distance measurement to the measuring point may be accomplished using a single viewfinder lens and camera lens, which greatly simplifies the design of the measuring device.
Additional advantageous refinements of the present invention are characterized in the subclaims and specify the details of advantageous possibilities of implementing the concept explained above within the scope of the object of the present invention and also with regard to additional advantages.
Within the scope of one advantageous refinement, the allocation may be made available as a triple of numbers, for example—including the definition of a pixel as pixel coordinates and the distance as a distance measure. The triple of numbers is advantageously suitable for further processing by the control and computation unit. In a particularly preferred manner, the control and computation unit may therefore have a distance module, for example.
The distance module is advantageously designed to define a distance between a first pixel and a second pixel as a pixel distance and to allocate to the pixel distance a distance measure corresponding to the distance between the target points on the target object that correspond to the pixels. This may be implemented, for example, within the scope of a suitably programmed software module for image processing. Within the scope of a particularly preferred refinement, it is provided that the distance module has an input for a reference measure and is designed to determine an image scale as an image conversion factor at least approximately from the reference measure and a distance measure to the measuring point. The image conversion factor is used in a particularly preferred manner to allocate a distance measure to the aforementioned pixel distance. This approach makes it possible to detect lateral distances on the target object using a distance measure even with the computation power available in a handheld device. Within the scope of a particularly efficiently devised refinement of the present invention, it has proven advantageous that a first reference measure is formed as a focal length of the viewfinder and/or camera lens. To this extent, the focal length of the viewfinder and/or camera lens is advantageously used to be able to allocate a distance measure to the pixel distance. A second reference measure is advantageously formed as a pixel variable. The pixel variable may be different in different pixel coordinate directions of the image; it may advantageously be approximated isotropically. An image scale is preferably determined in particular as the ratio of the distance to the focal length, multiplied by a pixel variable. As recognized in the refinement, the use of the first and second reference measures in particular results in a preferred determination of the image scale which is performable with comparatively little computation power and leads to results that are at least approximately usable in any case. It is possible in this way in particular to identify a measuring point of the distance measuring unit unambiguously in the photoelectric image of the target object, following the concept of the present invention, practically together with a distance measurement and recording of a photoelectric image, and to specify lateral distances on the target object with reference to a reference measure.
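As a minimal sketch of such a distance module—assuming the image scale relation just described; the parameter names and example values are illustrative, not the device's actual implementation:

```python
import math

def image_scale(z_m: float, focal_length_m: float, pixel_size_m: float) -> float:
    """Image conversion factor M = (z / f) * b in object-side meters per pixel,
    formed from the distance measure z, the focal length f of the viewfinder
    and/or camera lens (first reference measure) and the pixel variable b
    (second reference measure, approximated isotropically here)."""
    return (z_m / focal_length_m) * pixel_size_m

def distance_measure(px1: tuple, px2: tuple, scale_m_per_px: float) -> float:
    """Allocate a distance measure to the pixel distance of two pixels."""
    dx, dy = px2[0] - px1[0], px2[1] - px1[1]
    return math.hypot(dx, dy) * scale_m_per_px

# Example with assumed values: f = 25 mm, b = 5 um, z = 10 m.
M = image_scale(10.0, 0.025, 5e-6)                   # 0.002 m per pixel
print(distance_measure((412, 305), (414, 318), M))   # ~0.026 m
```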
The exactly one single photoelectric image preferably results from a single recording of the target object, and precisely one measuring point is allocated to the photoelectric image. In particular, it is provided that the number of target points in the exactly one single photoelectric image is defined essentially in a plane in which the measuring point is also located. The plane, which is also referred to as a reference plane, may advantageously be defined by the measuring point and a normal vector, which is advantageously given approximately by the direction of view of the camera. This definition of the reference plane is based in particular on the advantageous assumption that a user triggers a measurement essentially from a direction perpendicular to a plane of the target object which is to be measured. In particular, the concept of the refinement may be applied with justifiable precision to distances to be measured in a reference plane forming approximately an angle of 90°±25° to the direction of view of the camera; a corresponding check is sketched below. In other cases, in particular in cases in which the measuring point lies in a plane having distances forming an acute angle to the direction of view, it has proven advantageous to transform the target points defining the distances to be measured into a reference plane which fulfills the above advantageous prerequisites, through suitable image processing algorithms, if necessary using additional sensor data and user interactions. This may be done, for example, by rotation of the plane about the measuring point. A measuring point should be located as close as possible to the lateral distance to be measured. In general, a reference plane may be defined on the target object at least through the measuring point on the target object and its distance—in particular also the direction of the distance—to the reference point in the measuring device, supported if necessary by processing of the photoelectric image in an image processing unit and/or by a user interaction. A position of a number of target points may be defined in the reference plane—merely approximately if necessary. In particular, it has proven sufficient in general if the number of target points is located only near the reference plane, within the scope of an acceptable deviation which is small in relation to the distance. This includes, for example, the frequently encountered situation in which the measuring point is placed on a mostly essentially planar surface of the target object, such as a building façade or the like, while the target points defining the lateral distances to be measured are themselves in front of or behind the aforementioned planar surface. This is often the case for balconies, window ledges, door recesses and the like in the building façade, for example. This refinement, which includes approximations, is advantageous for most applications in which planar surfaces are to be measured, for example, walls or the like. A comparatively accurate determination of lateral distances in such a plane is advantageously achieved when this plane is oriented as a reference plane practically at a right angle to the direction of view of the photoelectric image acquisition unit (normal direction) and when the measuring point is in this plane. In particular, such planes are referred to here as the reference plane and their normals as the reference normals.
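A minimal numerical sketch of this plausibility check, assuming the direction of view and an estimated surface normal are available as vectors (NumPy is used here for illustration; the names are not taken from the embodiment):

```python
import numpy as np

def reference_plane(view_dir: np.ndarray, z1_m: float):
    """Reference plane defined by the measuring point and a normal vector,
    the normal being approximated by the camera's direction of view."""
    n = view_dir / np.linalg.norm(view_dir)
    measuring_point = n * z1_m   # measuring point at distance z1 along the view
    return measuring_point, n

def within_view_tolerance(surface_normal: np.ndarray,
                          view_dir: np.ndarray,
                          tol_deg: float = 25.0) -> bool:
    """Check the 90 deg +/- 25 deg criterion: distances in the plane should be
    roughly perpendicular to the direction of view, i.e., the surface normal
    roughly parallel to it (angle between the normals close to 0 deg)."""
    a = surface_normal / np.linalg.norm(surface_normal)
    b = view_dir / np.linalg.norm(view_dir)
    angle_deg = np.degrees(np.arccos(abs(np.dot(a, b))))
    return angle_deg <= tol_deg
```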
Within the scope of another advantageous refinement of the present invention, it is provided that the control and computation unit has a joining module which is designed to process together a number of individual photoelectric images, each resulting from a single recording of the target object and each having exactly one allocated measuring point, in particular to process them in combination, in particular to combine them to form a panoramic image. Multiple photoelectric images may advantageously result from multiple recordings of the target object. In this regard, it is provided in particular that exactly one measuring point is allocated to each photoelectric image, namely as one of the number of pixels in the corresponding photoelectric image. In particular, a lateral distance between a first pixel in a first image and a second pixel in a second image may be defined as the pixel distance, and a distance measure of the target points may be allocated to the pixel distance. The image processing may advantageously be designed to form relationships between the first image and the second image in such a way that a pixel distance may be specified and a distance measure of the target points may be allocated to the pixel distance—if necessary with a different image scale in the first and second images.
Within the context of an additional advantageous refinement of the present invention, it is provided that exactly one photoelectric image results from multiple individual recordings of the target object and is compiled as a panoramic image from these individual recordings. Exactly one measuring point of the distance measurement is allocated to each individual recording of the target object. Thus, multiple measuring points corresponding to the individual images are allocated to the single compiled photoelectric image, namely each as one of the number of pixels in the single composite photoelectric image. The joining module may advantageously be designed to form, from the allocated measuring points, a reduced number of measuring points, in particular to average them to a single measuring point, if necessary.
The distance module may be refined in a particularly advantageous manner to define a number of distance measures between a number of target points in a surface of the exactly one single photoelectric image. In this way, a number of lengths in a lateral surface of the target object may advantageously be determined. For example, these may be anchored at distinctive positions of the image. A definition of the positions may take place automatically, either entirely or partially, for example, based on contrast or with the aid of some other image processing filter function of an image processing unit and/or control and computation unit using optical analysis. A definition may also be provided by user interaction, again either partially or entirely, in particular in the form of a user interaction via an input device or the electronic display unit. The electronic display unit may be implemented, for example, as a touchscreen system having a suitable functionality, for example, a snap function (allocation of an approximate touch position to a distinctive image position in the vicinity).
In concrete terms, a façade of a building may be completely surveyed in this manner. Building on this, it has also proven advantageous to define a surface measure of a polygon formed by a number of target points in a surface of the exactly one single photoelectric image; a sketch of such a surface computation follows below. A user may utilize such an at least approximately determined surface measure in an appropriate manner, for example, to be able to estimate the construction material for a surface to be processed.
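A surface measure for such a polygon could be approximated, for example, with the shoelace formula applied to the pixel coordinates of the target point images and scaled by the square of the image conversion factor; this is a hedged sketch with illustrative names, not the module's prescribed implementation:

```python
def polygon_area_m2(pixels: list, scale_m_per_px: float) -> float:
    """Approximate surface measure of a polygon whose corners are target
    point images (pixel coordinates) in the reference plane (shoelace formula)."""
    n = len(pixels)
    twice_area_px2 = sum(
        pixels[i][0] * pixels[(i + 1) % n][1] - pixels[(i + 1) % n][0] * pixels[i][1]
        for i in range(n)
    )
    return abs(twice_area_px2) / 2.0 * scale_m_per_px ** 2

# Example: a window imaged as a 100 px x 250 px rectangle at M = 2 mm per pixel.
print(polygon_area_m2([(0, 0), (100, 0), (100, 250), (0, 250)], 0.002))  # 0.1 m^2
```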
In a refinement, the concept of the present invention is in particular suitable for designing an electronic display unit to visualize the documentation and attribution of measured distances on a lateral surface or area of the target object satisfactorily for a user. For example, the electronic display unit may be designed to display at least one distance measure and/or surface measure between at least one first pixel and one second pixel in the image.
Such a display may in particular be made without displaying the distance to the measuring point of the distance measuring unit. It has been found that a user does not need the distance to the measuring point in every case but instead is more interested in lateral distances on a lateral surface of the target object. Likewise, in one advantageous refinement, an electronic display unit in the housing may be configured to display a distance to the measuring point as an alternative or in addition to the distance measure.
The distance measuring unit advantageously has: a beam emitting unit, in particular a laser unit; an optical unit having optical elements, including at least transmission and reception optics; an optical transmission path having an optical axis for emitting the measuring beam to the target object; and an optical reception path having an optical axis for receiving the measuring beam reflected or backscattered by the measuring point.
The transmission path is advantageously guided biaxially to the reception path via a separate output element of the transmission optics, in particular an output lens. Alternatively, the transmission path may also be guided coaxially to the reception path via a shared output element of the transmission and reception optics, in particular via a collimator lens.
The distance measuring unit, which utilizes the optical measuring beam with the aid of which the distance to the target object is measurable without contact, may advantageously be implemented in a so-called biaxial variant or, likewise advantageously, in a so-called coaxial variant. The aforementioned naming refers to the configuration of the transmission path and the reception path relative to one another. In the biaxial variant, it is advantageously provided that the transmission path is guided biaxially to the reception path via a separate output element of the transmission optics. The output element of the transmission optics may advantageously be an output lens or the like.
The distance measuring unit and the photoelectric image acquisition unit may advantageously be implemented constructively in the measuring device, but different variants are possible as needed. Essentially a transmission path, a reception path and an image path of the distance measuring unit and the photoelectric image acquisition unit may be implemented separately (also referred to as biaxially) or at any rate may be partially combined (also known as coaxial). For a complete coaxial configuration of the paths, a shared output element of the image path and of the transmission and/or reception paths may be provided in particular.
It has been found that, within the context of one refinement, the control and computation unit may be expanded to correct for optical distortions in the photoelectric image acquisition unit. The control and computation unit and/or the image processing unit advantageously has/have a transformation module which is designed to make available to the distance module a correction measure, in particular for a perspective distortion of a polygon which is formed by a number of target points. With the transformation module, target points may additionally or alternatively be transformed into the reference plane, if necessary using additional sensor data of an inclination sensor, a yaw rate sensor or the like, for example, or using a user interaction; a sketch of such a rectification follows below. In particular, this relates to corrections of perspective distortions with respect to a vanishing point. A correction module for correction of image distortions caused by elements of the image acquisition unit is also advantageously provided, so that even temperature-dependent effects are correctable based on a model or using tabular values.
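One possible form of such a perspective rectification is sketched below, under the assumption that four target points with known rectified positions are available (for example, the corners of a window known to be rectangular); this is a standard direct linear transform presented for illustration, not necessarily the transformation module's actual method:

```python
import numpy as np

def homography(src: list, dst: list) -> np.ndarray:
    """3x3 homography mapping four source pixels to four destination pixels,
    estimated with the direct linear transform (DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def rectify(h: np.ndarray, pixel: tuple) -> tuple:
    """Apply the homography to a pixel (homogeneous coordinates)."""
    x, y, w = h @ np.array([pixel[0], pixel[1], 1.0])
    return x / w, y / w
```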
The measuring device is suitable in a particularly advantageous manner for selecting distinctive target points—such as, for example, edge end points or intersection points of two or three edges—as the number of pixels. In other words, pixels may be predefined with the aid of the measuring device in an advantageous manner, in such a way that distinctive target points on the target object are definable as pixels in the photoelectric image.
This may be accomplished, for example, through a choice by the user, for example, via a control panel. This may also be done automatically, for example, with the aid of the image processing unit, e.g., based on contrast analyses, a Hough transformation or similar image filters. The measuring device is preferably expandable with a coupling module which allows the coupling of additional applications such as a plan memory, a GPS system or other available carriers of distance information in a suitable manner. This is suitable in a particularly advantageous manner for reconciling the distance measures which may be defined by the distance module with other distance information. The coupling module is advantageously designed to allocate a distance of the distance module to a distance of a distance information carrier. This may be utilized advantageously, for example, to relate plans, locations or orientations of the target object or measuring device to specifications and to identify them accordingly. This may be utilized in an advantageous manner for BIM (Building Information Modeling) applications.
Exemplary embodiments of the present invention will now be described below with reference to the drawings, which are not necessarily true-to-scale representations of the exemplary embodiments; rather, the drawings are schematic and/or slightly distorted for the purpose of illustration. With regard to additions to the teachings which are directly recognizable from the drawings, reference is made to the related art. It should be taken into account here that a variety of modifications and changes with respect to the form and the detail of a specific embodiment may be made without deviating from the general idea of the present invention. The features of the present invention disclosed in the drawings and in the claims may be essential to the refinement of the present invention either individually or in any combination. Furthermore, all combinations of at least two of the features disclosed in the description, the drawings and/or the claims fall within the scope of the present invention. The general idea of the present invention is not limited to the precise form or detail of the preferred specific embodiments shown and described below, nor is it limited to a subject matter which would be restricted in comparison with the subject matter claimed in the claims. With stated dimension ranges, values within the specified limits are also disclosed as limiting values and may be used and claimed as desired. For the sake of simplicity, the same reference numerals are used below for identical or similar parts or parts having identical or similar functions.
Additional advantages, features and details of the present invention are derived from the following description of preferred exemplary embodiments and on the basis of the drawings.
Housing 10 of measuring device 100, which is designed in the form of a laser distance measuring device, for example, is designed for manual use—in the present case, it is thus not substantially larger than the area of a hand, with corresponding haptics and possibly also ergonomics. For the sake of simplicity, housing 10 is shown as a rectangle. Housing 10 accommodates distance measuring unit 20 in the form of a laser distance measuring unit utilizing optical measuring beam 1. Possible variants of distance measuring unit 20 are shown in the figures described below.
Measuring device 100 has an operating and input configuration 30, which is situated on housing 10 and is formed in the present case as a keypad embedded in the operating side of housing 10. A visual display 40 is embedded in the operating side of housing 10, so that in the present case both measured distance z between distance measuring device 100 and a target object 200 and the operating state of distance measuring device 100 may be displayed there. Distance measuring unit 20 is operable via operating and input configuration 30. One of the reference stops 50A, 50B, 50C or 50D of housing 10, which are explained below, may be selected, for example. Whereas the measurement via optical measuring beam 1 (a laser beam here, for example) is based on a reference point NP within the housing, a user will usually want to measure the distance to target object 200 with respect to one of reference stops 50A, 50B, 50C or 50D. When a reference stop is selected by a user, for example, via operating and input configuration 30, distance z may be referred to the various reference stops using fixed addition constants. The most important reference stop 50A is situated on the rear side 10A of the instrument. Furthermore, there are other reference stops 50B, 50C, 50D, for example, on the front side 10B of the instrument, on a tip 10D of a measurement extension, or on a fastening 10C for a stand thread, whose midpoint may also function as reference stop 50C.
To determine a distance z between a target object 200 (see the figures) and measuring device 100, distance measuring unit 20 directs optical measuring beam 1 at a measuring point P1 on target object 200.
In concrete terms—as is also apparent from FIG. 3A—in such a distance measuring unit 20 designed as a laser distance measuring unit or the like, measuring beam 1 of a laser unit 21 in the form of a laser diode is bundled using an optical lens of transmission optics 22. Bundled measuring beam 1 is directed from front side 10B of the housing at target object 200—for example, at a measuring point P1 there—and forms a light spot on measuring point P1. Using an optical lens of reception optics 23, measuring beam 2 of this light spot, which is reflected or backscattered and is referred to as scattered light, is imaged on the active surface of a photodiode of detector 26 in the manner explained. Distance measuring unit 20 may be designed to be biaxial or coaxial. To determine the distance from target object 200 to reference point NP of measuring device 100—corresponding to the path back and forth—the laser light of the laser beam is modulated as measuring beam 1. A modulation may be pulsed or sinusoidal; other forms of modulation are also possible. The modulation takes place in such a way that the time difference between an emitted measuring beam modulation and a received measuring beam modulation is measurable. A simple distance between reference zero point NP of measuring device 100 and target object 200 may thus be inferred using the speed of light as a factor; this may be calculated in a control unit, for example, as sketched below.
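The underlying travel-time relation may be illustrated by a minimal sketch (the variable names are illustrative and not taken from the device):

```python
C = 299_792_458.0  # speed of light in m/s

def distance_from_travel_time(dt_s: float) -> float:
    """Distance between reference zero point NP and the target object:
    the measured time difference covers the path back and forth, hence / 2."""
    return C * dt_s / 2.0

# Example: a modulation time difference of about 66.7 ns corresponds to ~10 m.
print(distance_from_travel_time(66.7e-9))  # ~10.0
```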
Measuring devices 100A, 100B differ in the area of the paths and the output optics, which may be implemented as needed, with advantages that tend to be different. Photoelectric image acquisition units 60A and 60B are situated differently in relation to distance measuring units 20A and 20B. In measuring device 100A of the first variant, the image path of photoelectric image acquisition unit 60A is implemented separately (biaxially) from the transmission and reception paths of distance measuring unit 20A.
In measuring device 100B of the second variant, the image path of photoelectric image acquisition unit 60B is at least partially combined (coaxially) with the transmission and/or reception path of distance measuring unit 20B via a shared output element.
For further processing of photoelectric image 4, camera lens 61 is connected to image processing unit 62 via a suitable image data line 63. Image processing unit 62 is connected to control and computation unit SE via another image data line 64. Control and computation unit SE thus has access to information about photoelectric image 4 of photoelectric image acquisition units 60A, 60B. Likewise, control and computation unit SE also has access, via a detector signal line 29, to detector signals, from which a value for distance z1 of measuring point P1 on target object 200 is calculated in control and computation unit SE. Information about a photoelectric image 4 processed with the aid of the image processing unit, as well as a distance measure of a distance z1 between measuring point P1 and reference point NP, may thus be made available with the aid of control and computation unit SE for further processing and/or for the user.
As shown symbolically in the drawings, measuring point P1 is in the present case part of the image scope of photoelectric image 4 of target object 200.
In a situation which is not depicted here, however, measuring point P1 need not necessarily be part of the scope of the image. It is adequate if a plane is definable as the reference plane with the aid of measuring point P1 and distance z1 of measuring point P1 from reference point NP. At any rate, target points Z1, Z2 may be allocated at least approximately to the reference plane; advantageously, target points Z1, Z2 are situated in the reference plane. In particular, measuring point P1 does not usually form a target point Z1, i.e., it does not form an end point of a lateral distance A which is to be measured. A measuring point P1 is usually in particular not a distinctive position, because the user will, if necessary, define the measuring point by aligning the measuring device with any free point on a surface, for example, a building façade. For example, if one wants to measure a window width, then measuring point P1 is situated somewhere on a wall as a reference plane, for example. Measuring point P1 is relevant for the measurement of distance z1 from device 100A, 100B to the wall as the reference plane. However, in contrast with the exemplary situation described below, in which measuring point P1 simultaneously serves as first target point Z1, measuring point P1 here does not itself form an end point of lateral distance A to be measured.
If the latter is not the case, an improvement may be achieved by a perspective rectification. For this purpose, a transformation module may be used, which is explained further below.
A recording of target object 200 in the area of measuring point P1 is possible using a single viewfinder and camera lens 61 having a comparatively simple design, as shown in the drawings.
According to the concept of the present invention, this information is present in a mutually self-referencing form, i.e., a measuring point image P1′ (x1′, y1′) defined according to measuring point P1 is allocated to the thusly designated distance z1.
Based on this, a distance module of control and computation unit SE—explained in the following—may allocate a distance measure to a pixel distance.
Specifically, measuring point P1 (as first target point Z1 here, for example), which is visible in a lateral surface of target object 200—the reference plane—is also imaged in photoelectric image 4. Measuring point P1 has pixel coordinates x1′, y1′ as measuring point image P1′ (x1′, y1′). Photoelectric image 4 is the result of a single recording by photoelectric image acquisition unit 60A, 60B. To determine a distance measure of lateral distance A [m] between measuring point P1/first target point Z1 and a second target point Z2, the latter is imaged with pixel coordinates x2′, y2′ as second target point image Z2′ (x2′, y2′) in photoelectric image 4. Within the scope of the present specific embodiment, no additional recording of a photoelectric image is initially necessary. Instead, image processing unit 62 is designed to define at least measuring point image P1′ and target point image Z2′ in photoelectric image 4 via pixel coordinates x1′, y1′ and x2′, y2′ and to define a distance between these pixels, namely between measuring point image P1′ (x1′, y1′) and target point image Z2′ (x2′, y2′), as pixel distance Δ′. This is done, for example, via the pixel coordinate differences Δx′ = x2′ − x1′ and Δy′ = y2′ − y1′. In the present case, pixel distance Δ′ may be chosen arbitrarily for illustration, for example, with pixel coordinate differences (Δx′, Δy′) = (2, 13). A distance measure Δ may be allocated to such a pixel distance Δ′ by the distance module described below.
In the present case (a corresponding design of the distance module is shown in the drawings), the distance module uses the focal length f of viewfinder and camera lens 61 as a first reference measure and the pixel variables bx, by of the image sensor as a second reference measure, together with measured distance z1. It follows from this for distance Δ of target points Z1 (here measuring point P1) and Z2 approximately:

Δ ≈ (z1/f)·√((bx·Δx′)² + (by·Δy′)²).

For the same pixel variables bx = by = b, distance Δ of the target points is simplified to

Δ ≈ (z1/f)·b·√(Δx′² + Δy′²) = M·Δ′.
In abbreviated form, this procedure is illustrated in the drawings.
Due to this clear allocation of measuring point P1 in the lateral plane of target object 200 to a measuring point image P1′ (x1′, y1′) in photoelectric image 4, measured distance z1 to measuring point P1 may be utilized—in particular together with the focal length f of viewfinder and camera lens 61—to ascertain at least approximately an image conversion factor as image scale M = Δ/Δ′ = (z1/f)·b for photoelectric image 4. Photoelectric image 4 may thus be quantitatively related to the actual lateral plane of target object 200. Objects such as an edge defined by the pixel coordinate difference (Δx′, Δy′) between P1′ (x1′, y1′) and Z2′ (x2′, y2′) may thus be measured at least approximately.
In such a measurement, a measurement error is least when the objects are in a plane in which measuring point P1 is also located and which is preferably aligned perpendicularly to the direction of view of the photoelectric image acquisition unit (reference normal). To this extent, a measurement error is minor in particular when the aforementioned lateral plane stands at least approximately perpendicular to measuring beam 1 on the lateral surface of target object 200. In a subsequent method step, the procedure depicted above may be repeated, for example, in order to measure further lateral distances.
Such a situation is illustrated in the drawings.
Photoelectric image 4 may be displayed in the form shown in the drawings on display 40.
The additional display of edge dimensions or other distance dimensions described above may be provided there as well.
With reference to the drawings, additional applications of coupling module K are explained below.
Another advantageous application of coupling module K is facial recognition using the camera. For example, a laser of distance measuring unit 20A, 20B could be deactivated when a person is in the beam path of measuring beam 1.
Data from other devices required for special applications may, if necessary, be read in or input via interface 71 or operating and input configuration 30 and made available to control and computation unit SE. These may be, for example, data for construction materials such as thermal conductivity values, cost units or the like. Control and computation unit SE may also be equipped in such a way that distance measures on a lateral surface of a target object 200 (i.e., the reference surface)—for example, those shown in FIG. 10—may be utilized to make available an at least approximate cost analysis or heat loss information; a sketch of such an estimate follows below. In other words, a measuring device 100A, 100B may already be equipped for making available distance measures together with additional information. Thus, a user at a construction site may already make important estimates of costs and required measures, as well as the extent thereof, on site. This may pertain to the renovation of a façade or its thermal insulation, or also the renovation of an interior or the like, for example.
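A hedged sketch of such an on-site estimate follows; the U-value, temperature difference and unit price below are assumed illustrative inputs, not values from the embodiment:

```python
def heat_loss_watts(area_m2: float, u_value_w_m2k: float, delta_t_k: float) -> float:
    """Steady-state transmission heat loss Q = U * A * dT."""
    return u_value_w_m2k * area_m2 * delta_t_k

def material_cost(area_m2: float, price_per_m2: float) -> float:
    """Rough material cost for a surface to be processed."""
    return area_m2 * price_per_m2

facade_area_m2 = 3.2 * 8.5  # e.g., from two measured lateral distances
print(heat_loss_watts(facade_area_m2, u_value_w_m2k=1.4, delta_t_k=20.0))  # ~762 W
print(material_cost(facade_area_m2, price_per_m2=35.0))                    # ~952
```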
Such data as well as other additional data may be supplied to measuring device 100A, 100B, advantageously to achieve a better attribution of the distance measurements described above. The additional information which is available or may be input via coupling modules K, interface 71 or operating and input configuration 30—optionally including handwritten diagrams, comments or the like—may be placed in photoelectric image 4, for example.
Symbol 5 shown in the drawings indicates such additional information placed in photoelectric image 4.
With additional input of the construction material used, the price may be ascertained automatically on site. Additional information such as GPS data, compass data and input of thermal conductivity values (K value) permits an on-site calculation of heat loss and the corresponding costs.
Various distance measurements, each belonging to one photoelectric image, may be provided, for example, each for one measuring point P1, P2, P3, etc., and also permit a more accurate perspective rectification. This is due to the fact that additional information about the angle of the object planes may be derived from the various measured values and measuring spot positions. In addition, the movement of the measuring device during the recording of the measured data may also be taken into account by using an inertial navigation system (triaxial acceleration sensors, a triaxial gyroscope or the like). Such transformations, as well as others, may be implemented within the scope of transformation module T in order ultimately to make available to the user a corrected and rectified measuring surface including dimensioning, as shown in the drawings.
Furthermore, the control and computation unit may include an evaluation algorithm module which tests the quality of the measuring points and, on the one hand, eliminates invalid measuring points (e.g., measurements through doors or windows or bypassing the house wall) or, on the other hand, proposes suitable measuring points in the camera image to the user.
Number | Date | Country | Kind
---|---|---|---
DE 10 2010 043136.2 | Oct. 29, 2010 | DE | national