The present invention relates to a projection device.
Conventionally, projection devices that can project an image onto a screen, for example, have been known. As a technique related to projection devices, Patent Literature (PTL) 1 discloses a device that changes its projection angle when projecting an image.
However, with the technique described in PTL 1, when an image is to be projected at a wide projection angle and with a high degree of accuracy toward, for example, a wide-open indoor space, it may not be possible to project the image accurately.
In view of this, the present invention has an object to provide a projection device which, by performing image correction with a high degree of accuracy, can project an image at a target position with reduced error, even when projecting images across a wide-open space at a wide projection angle.
In order to achieve the above-mentioned object, a projection device according to one aspect of the present invention is a projection device for use at a construction site and includes: a projector that projects an image onto a projection plane; a distance meter that measures a distance to the projection plane; a rotation driver that rotates the projector and the distance meter; an angle sensor that measures a rotation angle of the rotation driver; and a data processor that corrects the image that is projected from the projector. At least one of the distance meter or the projector is offset from a center line of a rotation axis of the rotation driver. The data processor corrects the image by using the distance measured by the distance meter and the rotation angle measured by the angle sensor.
According to the present invention, by performing image correction with a high degree of accuracy, errors in projecting an image at a target position can be reduced, even when projecting images across a wide-open space at a wide projection angle.
Hereinafter, an embodiment will be described in detail with reference to the drawings. However, the embodiment described below is merely one example of various embodiments of the present invention. Various modifications can be made to the following embodiment in accordance with design requirements and other factors, as long as the object of the present invention can be achieved. It should be noted that the figures referred to in the following embodiment are schematic diagrams, and elements that are not needed to describe the present invention have been omitted. Furthermore, in the respective figures, elements that are substantially the same are given the same reference signs, and redundant descriptions may be omitted or simplified. It should be noted that the accompanying drawings and the following descriptions are provided to assist those skilled in the art in gaining a thorough understanding of the present invention, and are not intended to limit the scope of the subject matter recited in the Claims.
(1) Overview
First, an overview of projection device 10 according to an embodiment will be described.
For the image projected from projection device 10, a projection error occurs in which the actual projection position differs from the desired projection position. When projection device 10 projects an image at a wide projection angle across a wide-open space 100, such as the interior of a building under construction, as described above, projection plane 110 is far from projection device 10, so the error (projection error) between the intended projection position of the image and the actual projection position on projection plane 110 becomes large. In view of this, the following describes the configuration and operation of projection device 10, which has a function of accurately correcting the image to be projected in order to enable highly accurate, wide-area projection at the intended position.
(2) Configuration
A configuration of projection device 10 according to the embodiment will be described.
As illustrated in
In particular, for
Projection device 10 includes projector 21, angle sensor 22, rotation driver 23, distance meter 24, controller 25, and storage 26.
Projector 21 is a projection module for projecting an image onto projection plane 110. Projector 21 includes light source 21a and scanner 21b. It should be noted that projector 21 also includes other optical components not shown in the figures, such as a lens and a mirror.
Light source 21a is a laser light source implemented by a semiconductor light-emitting element, for example. It should be noted that light source 21a may include multiple light-emitting elements that emit lights of different colors (e.g., a red light-emitting element, a green light-emitting element, and a blue light-emitting element), and may have a configuration that can switch the color of light emitted.
Scanner 21b is a mechanism that projects light onto projection plane 110 by scanning the light emitted by light source 21a, and may be implemented by a micro electro-mechanical system (MEMS) mirror, for example.
Projector 21 is an integrated module that includes light source 21a and scanner 21b, and is provided in the vicinity of the outer shell of housing 27.
Distance meter 24 measures the distance from projection device 10 to the structure that constitutes projection plane 110. Distance meter 24 is a distance measurement sensor, such as a time of flight (TOF) sensor, for example. Distance meter 24 may, for example, be a distance measurement sensor that uses a phase-difference detection method, a distance measurement sensor that uses a triangulation method, or another type of distance measurement sensor. Distance meter 24 includes a light source for distance measurement and a light-receiving element, such as a photodiode. The light source for distance measurement is a light source that emits light toward a structure. The light source for distance measurement is implemented by a light-emitting element that emits infrared light, for example, but may be implemented by a light-emitting element that emits visible light. As described later, distance meter 24 has a laser pointer function for indicating the current distance measurement target point to a user. Although this function is implemented, for example, by a light source separate from the light source for distance measurement, when the light source for distance measurement emits visible light, the function may be implemented by the light source for distance measurement.
Rotation driver 23 is a rotation mechanism for changing the orientation of projection device 10 (i.e., the orientation and angles of distance meter 24). Rotation driver 23 includes first rotation driver 23a for changing the orientation of projection device 10 in the tilt direction and second rotation driver 23b for changing the orientation of projection device 10 in the pan direction. In
Angle sensor 22 measures the orientation of projection device 10 (i.e., the orientation and angles of distance meter 24). Specifically, angle sensor 22 is an angle sensor that measures the drive amount of rotation driver 23. Angle sensor 22 includes first angle sensor 22a that measures the angle of first rotation driver 23a, which rotates in the tilt direction, and second angle sensor 22b that measures the angle of second rotation driver 23b, which rotates in the pan direction. It should be noted that when rotation driver 23 includes a third rotation driver for changing the orientation of projection device 10 in the roll direction, angle sensor 22 may measure the roll angle as a drive amount of rotation driver 23.
Controller 25 is a control device that controls individual components of projection device 10, including projector 21 and rotation driver 23, in order to project an image onto projection plane 110. Controller 25 includes data processor 25a and inclination calculator 25b as functional components. Data processor 25a corrects the image to be projected by using information on the rotation angles of rotation driver 23 measured by angle sensor 22 and the distance measured by distance meter 24. Details of the correction will be described later. Furthermore, inclination calculator 25b calculates the inclination angle of projection plane 110 based on the distances from projection device 10 to at least three arbitrary points on projection plane 110. Details of the calculation will be described later.
Controller 25 is implemented by a microcomputer or processor, for example. Furthermore, controller 25 may include a drive circuit for driving projector 21 and a drive circuit for driving rotation driver 23, or controller 25 may include a drive circuit for driving distance meter 24.
Storage 26 is a storage device implemented by semiconductor memory, for example. Storage 26 stores an image to be projected onto projection plane 110, a program for implementing data processor 25a of controller 25, a program for controlling other elements of projection device 10, information temporarily required for control performed by controller 25, information measured by angle sensor 22 and distance meter 24, information generated by data processor 25a, and the like. Furthermore, storage 26 stores information on the positional relationship between distance meter 24 and projector 21.
Housing 27 is a housing that houses projector 21, angle sensor 22, distance meter 24, controller 25, and storage 26. Housing 27 is formed from resin, for example, but may be formed from metal. Furthermore, in the present embodiment, housing 27 is of a rectangular-cuboid shape, and is disposed in the gap between the opposing ends of the approximate U-shape formed by rotation driver 23 in a front view. As long as housing 27 is connected to rotation driver 23 and can be rotated in the tilt direction and pan direction, housing 27 is not limited to such a shape and arrangement.
While distance meter 24 has been described as being housed in housing 27, distance meter 24 does not necessarily need to be housed in housing 27, and distance meter 24 may, for example, be fixed to the outside of housing 27.
It should be noted that projection device 10 may include a display component and operation component that are not shown in the figures. Accordingly, a user can check the status related to projection device 10 that is displayed on the display component. Furthermore, a user can operate and drive the individual components of projection device 10 by operating the operation component. In this case, the display component is a liquid crystal display (LCD), or the like. The operation component is an input device, such as a touch panel or keyboard.
It should be noted that projection device 10 may include a communication component that is not shown in the figures. In this case, projection device 10 may be operated from an external electronic device, with instructions being sent from the external electronic device to controller 25 via the communication component through wired or wireless communication. There are no particular limitations on the communication standard used in the wired or wireless communication by the communication component, and any method may be used.
With the above-mentioned configuration, projection device 10 can project as an image, onto projection plane 110, light for assisting in work performed during the construction of a structure.
(3) Operation Example
Next, an operation example of projection device 10 will be described.
First, a user places projection device 10 in space 100 and rotates rotation driver 23 by arbitrary angles to point housing 27, which includes projector 21 and distance meter 24, in the desired direction for projecting onto projection plane 110 (S11).
Next, angle sensor 22 measures and obtains the rotation angles of rotation driver 23 at the point in time when the process for step S11 was completed (S12). Specifically, first angle sensor 22a measures tilt angle θ of first rotation driver 23a. Second angle sensor 22b measures pan angle φ of second rotation driver 23b. The tilt angle θ and pan angle φ obtained through these measurements are stored in storage 26.
Next, distance meter 24 measures and obtains the distance from projection device 10 to projection plane 110 according to their positional relationship at the point in time when the process for step S11 was completed (S13). Specifically, distance meter 24 measures and obtains the straight-line distance between distance meter 24 and projection plane 110 (Lmeasured) as shown in
However, as illustrated in
Next, controller 25 checks whether measurement of the rotation angles of rotation driver 23 and the distance from distance meter 24 to projection plane 110 (Lmeasured) is complete (S14). If measurement of the rotation angles and Lmeasured is not complete (“No” for S14), the process returns to step S11, and the above-mentioned process is performed again.
If measurement of the rotation angles and Lmeasured is complete (“Yes” for S14), distance meter 24 measures the distances from projection device 10 to three arbitrary measurement points on projection plane 110 that are not aligned in a straight line, and angle sensor 22 obtains the rotation angles of rotation driver 23 for each of the three points (S15).
When doing so, point B and point C are selected so that point A, point B, and point C are not aligned in a straight line. For example, arbitrary point B is selected starting from a state where distance meter 24 is pointed at point A of projection plane 110 based on the rotation angles of rotation driver 23 at the point in time when the process for step S14 was completed, and first rotation driver 23a and second rotation driver 23b are then rotated so that distance meter 24 is pointed at point B. Once distance meter 24 is in a state where it is pointed at point B, distance meter 24 measures the distance from distance meter 24 to point B and the distance is stored in storage 26. Furthermore, angle sensor 22 measures tilt angle θ and pan angle φ of rotation driver 23 while in this state and the angles are stored in storage 26. Likewise for point C, once information on the distance and angles for point B is obtained, rotation driver 23 rotates so that distance meter 24 is pointed at point C, and the distance from distance meter 24 to point C and the rotation angles of rotation driver 23 at that time are obtained and stored in storage 26. The selection of point B and point C may be performed by the user sending instructions by operating projection device 10 while checking a laser pointer light emitted from distance meter 24. Alternatively, the selection may be performed automatically according to a program that has been stored in advance in storage 26.
Next, inclination calculator 25b calculates the inclination angle of projection plane 110 (S16). Inclination calculator 25b calculates orthogonal coordinates (x, y, and z coordinates) of the three measurement points based on stored information (specifically, measurement results for the distances and rotation angles, as well as the positional relationship between distance meter 24 and projector 21).
To calculate the orthogonal coordinates of the three measurement points, inclination calculator 25b first calculates the polar coordinates of the three measurement points. In order to calculate the polar coordinates, inclination calculator 25b calculates distance r from each of the three measurement points to origin O, from the distance information measured by distance meter 24 and stored in storage 26, as well as information on the positional relationship between distance meter 24 and projector 21 stored in storage 26. The distance information measured by distance meter 24 indicates the distances from distance meter 24 to the measurement points. Distance r from origin O to each measurement point is calculated by using the information on the positional relationship between distance meter 24 and projector 21 (information on this positional relationship also includes the positional relationship between origin O, distance meter 24, and projector 21) together with this distance information measured by distance meter 24. In other words, distances r are calculated in a manner that reflects the error from the positional relationship where distance meter 24 and projector 21 are offset from the center lines of the rotation axes of first rotation driver 23a and second rotation driver 23b. The polar coordinates of the three measurement points are calculated by combining distance r calculated as described above with pan angle φ and tilt angle θ measured for each measurement point.
Once the polar coordinates are calculated, inclination calculator 25b converts the polar coordinates calculated for the three measurement points to orthogonal coordinates.
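As one way to make this coordinate calculation concrete, the following Python sketch converts a measured distance and the measured rotation angles into orthogonal coordinates. The axis convention and the way the offset between distance meter 24 and origin O is folded into distance r are assumptions made for illustration only, not the exact formulation used in projection device 10.

import math

def distance_from_origin(l_measured, offset):
    # Convert the distance returned by distance meter 24 (measured from the
    # distance meter itself) into distance r from origin O. Modeling the
    # stored positional relationship as a single offset along the measurement
    # axis is an assumption made for this sketch.
    return l_measured + offset

def to_orthogonal(r, theta_deg, phi_deg):
    # Convert polar coordinates (distance r from origin O, tilt angle theta,
    # pan angle phi) into orthogonal (x, y, z) coordinates.
    # Assumed convention: theta is measured from the horizontal plane and
    # phi around the vertical axis.
    theta = math.radians(theta_deg)
    phi = math.radians(phi_deg)
    x = r * math.cos(theta) * math.cos(phi)
    y = r * math.cos(theta) * math.sin(phi)
    z = r * math.sin(theta)
    return (x, y, z)

# Example: a point measured 5,000 mm away with a 30 mm offset,
# at a tilt angle of 10 degrees and a pan angle of 45 degrees.
r = distance_from_origin(5000.0, 30.0)
print(to_orthogonal(r, 10.0, 45.0))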
Once the orthogonal coordinates of the three measurement points are calculated, inclination calculator 25b calculates, using the orthogonal coordinates of the three measurement points, the distance from projection device 10 to projection plane 110 (i.e., a plane that passes through measurement point A, measurement point B, and measurement point C) and the inclination angle of projection plane 110 relative to projection device 10. Assuming that the equation for projection plane 110 is ax + by + cz = d, where the coordinates for point A are (xa, ya, za), the coordinates for point B are (xb, yb, zb), and the coordinates for point C are (xc, yc, zc), inclination calculator 25b calculates, from this information, the normal vector of projection plane 110, which is n = (a, b, c). Normal vector n indicates the inclination angle of projection plane 110 in orthogonal coordinates, and the length of normal vector n indicates the distance from projection device 10 to projection plane 110. That is to say, calculating normal vector n is equivalent to calculating the distance from projection device 10 to projection plane 110 and the inclination angle of projection plane 110 relative to projection device 10.
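A minimal Python sketch of this plane calculation follows, assuming that the normal vector is obtained as the cross product of two edge vectors of the triangle formed by the three measurement points and that the perpendicular distance from origin O is derived from the plane equation; these formulation choices are illustrative assumptions rather than the exact computation performed by inclination calculator 25b.

import numpy as np

def plane_from_points(point_a, point_b, point_c):
    # Compute the plane a*x + b*y + c*z = d passing through three measurement
    # points given in orthogonal coordinates, together with its unit normal
    # vector and the perpendicular distance from origin O to the plane.
    pa = np.asarray(point_a, dtype=float)
    pb = np.asarray(point_b, dtype=float)
    pc = np.asarray(point_c, dtype=float)
    n = np.cross(pb - pa, pc - pa)    # normal vector n = (a, b, c)
    d = float(np.dot(n, pa))          # plane offset: n . x = d
    norm = np.linalg.norm(n)
    unit_n = n / norm                 # direction = inclination of the plane
    dist = abs(d) / norm              # perpendicular distance from origin O
    return unit_n, dist

# Example with three non-collinear measurement points (coordinates in mm).
normal, distance = plane_from_points((0, 0, 1000), (500, 0, 1000), (0, 500, 950))
print(normal, distance)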
Next, data processor 25a corrects the image to be projected based on the distances measured by distance meter 24 from projection device 10 to the respective points, the rotation angles of rotation driver 23 at that time, and the inclination angle of projection plane 110 calculated by inclination calculator 25b, and the corrected image is projected from projector 21 (S17). Specifically, data processor 25a corrects distortion of the image according to the calculated inclination angle of projection plane 110 and rotation angles of rotation driver 23, and corrects the projection scaling factor of the architectural drawing data based on the calculated distance to projection plane 110. Furthermore, in this correction, as described earlier, when distance meter 24 and projector 21 form an angle with projection plane 110, corrections are performed taking into account the fact that the straight-line distance (Lmeasured) between distance meter 24 and projection plane 110 and the straight-line distance (Lprojected) between projector 21 and projection plane 110 are not equal to each other.
Due to the operation described in the above-mentioned operation example, projection device 10 can project an image at the intended position with a high degree of accuracy, even in a wide-open space 100 in a building under construction, by performing highly accurate image correction. In projection device 10 according to the present configuration, since image correction that also takes into consideration the inclination angle of projection plane 110 is performed by using information on tilt angles θ and pan angles φ of projection device 10 provided by first angle sensor 22a and second angle sensor 22b, highly accurate correction of the image to be projected can be performed.
Furthermore, in the calculation of the inclination angle of projection plane 110, the inclination angle is calculated in a manner that eliminates the error arising from the positional relationship between distance meter 24 and origin O, in which distance meter 24 and projector 21 are offset from the center lines of the rotation axes of first rotation driver 23a and second rotation driver 23b. As a result, the inclination angle of projection plane 110 can be calculated with a high degree of accuracy, and thus the image to be projected can also be corrected with high accuracy.
Furthermore, in the correction of the image to be projected, when distance meter 24 and projector 21 form a relative tilt angle θ with projection plane 110, the straight-line distance (Lmeasured) between distance meter 24 and projection plane 110 and the straight-line distance (Lprojected) between projector 21 and projection plane 110 satisfy Lmeasured = Lprojected + h × tan θ, and correction is performed using this relationship to rectify the distance measurement error caused by the positional relationship between distance meter 24 and projector 21. Accordingly, correction of the image to be projected can be performed with a high degree of accuracy.
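A short sketch of this distance correction follows; treating h as the offset between distance meter 24 and projector 21 and θ as the tilt angle formed with projection plane 110 is an interpretation assumed here for illustration.

import math

def projected_distance(l_measured, h, theta_deg):
    # Recover the straight-line distance Lprojected between projector 21 and
    # projection plane 110 from the distance Lmeasured returned by distance
    # meter 24, using the relationship Lmeasured = Lprojected + h * tan(theta).
    # Interpreting h as the offset between distance meter 24 and projector 21
    # is an assumption made for this sketch.
    return l_measured - h * math.tan(math.radians(theta_deg))

# Example: with a 50 mm offset and a tilt angle of 45 degrees, a measured
# distance of 1,000 mm corresponds to a projection distance of 950 mm.
print(projected_distance(1000.0, 50.0, 45.0))  # -> approximately 950.0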
Here, a specific example will be given to describe the advantageous effects of projection device 10 according to the present embodiment.
In the case where the above-described correction that takes into consideration the 50 mm error is not performed, since projection plane 110 is actually positioned so that Lprojected = 950 mm, each of the nine circular symbols is instead projected at the position indicated by the corresponding one of the nine triangular symbols. Specifically, the uncorrected image is projected at a scale of 950/1,000 and at a position that is shifted toward the minus side of the Y-axis by 50 mm / cos 45° ≈ 70.7 mm. The specific correspondence between the circular symbols and the triangular symbols is as illustrated in
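The figures in this example can be reproduced with the following short calculation; only the scale and the shift are computed, since the individual symbol positions depend on the drawing.

import math

l_assumed = 1000.0   # distance assumed by the uncorrected image (mm)
l_actual = 950.0     # actual distance Lprojected to projection plane 110 (mm)
error = 50.0         # uncorrected distance error (mm)
theta_deg = 45.0     # tilt angle in this example

scale = l_actual / l_assumed                        # 950/1,000
shift = error / math.cos(math.radians(theta_deg))   # about 70.7 mm toward the minus side of the Y-axis
print(scale, shift)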
Such a degree of error can be tolerated in projection devices for entertainment or presentation use. However, for projection device 10, which is used at construction sites and projects guide lights to indicate work locations, such projection errors cannot be tolerated, since they result in the construction of structures that do not conform to the dimensions given in their design. Since projection device 10 according to the present embodiment performs image correction based on distances and rotation angles, as described above, correction of the image to be projected can be performed with a high degree of accuracy.
Furthermore, in the present embodiment, distance meter 24 and projector 21 are each disposed at positions that are offset from the center lines of the rotation axes of first rotation driver 23a and second rotation driver 23b. Due to this positional relationship between distance meter 24 and projector 21, a balanced weight distribution can be achieved inside housing 27, including the electronic substrate (not shown in the figures) for driving projection device 10. As a result, the force required by rotation driver 23 to rotate housing 27 can be reduced, thereby allowing the device configuration of projection device 10 to be simplified and miniaturized.
An embodiment according to the present invention has been described above with reference to the drawings. However, the scope of the present invention is not limited by the foregoing description. Various variations and modifications can be made to the present invention within the essence of the present invention as described in the Claims. For example, although a laser-scanning projection device is described in the embodiment of the present invention, the present invention may be implemented as a different type of projection device. Furthermore, the projection device may be implemented as a client-server system. In this case, part of the processes described as being performed by the projection device in the above-mentioned embodiment is performed by a server device, with information being transmitted and received via a communication component.
Furthermore, the configuration described in the above-mentioned operation example is merely an example, and the present invention is not limited to the foregoing configuration. For example, various variations and modifications may be made to the sequence and the contents of the processes of the individual steps indicated by reference signs marked with an “S” as long as there is no deviation in the advantageous effects of the present invention. Forms obtained through various modifications to the foregoing embodiments that can be conceived by those skilled in the art, as well as forms realized by combining elements and functions in the foregoing embodiments without departing from the essence of the present invention are included within the scope of the present invention.
The present invention can be suitably utilized when projecting, in a relatively wide-open space, such as in a building under construction, for example, an image indicating work locations at which workers should perform work.
This application is the U.S. National Phase under 35 U.S.C. § 371 of International Patent Application No. PCT/JP2021/042025, filed on Nov. 16, 2021, which in turn claims the benefit of Japanese Patent Application No. 2020-194837, filed on Nov. 25, 2020, the entire disclosures of which Applications are incorporated by reference herein.