This application is a 35 U.S.C. § 371 National Stage Application of PCT/CN2018/105144, filed on Sep. 12, 2018 in China, the disclosure of which is incorporated herein by reference in its entirety.
The disclosure relates to a laser leveling tool having gesture control functionalities for controlling the operation of the laser leveling tool.
Laser leveling tools or levelers are generally used in construction and decoration industries. A conventional laser leveling tool comprises at least one laser beam emitter for emitting a laser beam which is projected onto a target surface to form a predefined pattern serving as a reference for facilitating operation during construction or remodeling.
Laser leveling tools generally have human-machine interfaces (HMIs) via which users can control their operation. The HMIs of current laser leveling tools comprise physical press buttons for inputting user commands, such as entering initial settings, selecting and toggling between different projection modes, or the like. A laser leveling tool is generally levelled in a proper position and orientation before operation, so the press actions may disturb the levelled position and orientation of the tool, and errors or deflections may thus appear in the projected pattern.
In addition, when a leveling tool is designed to project more types of patterns (for example, dot, line, plane, cross, etc.), more buttons need to be provided in the HMI, so the HMI occupies a large area of the tool.
In order to reduce the number of buttons, a single-button design has been proposed, which allows users to cycle through the ON/OFF states of the different types of laser patterns. This may confuse users who want to control only one or a few specific laser patterns.
An object of the disclosure is to alleviate the deficiencies of laser leveling tools that result from physical press buttons.
For achieving this object, in one aspect, the disclosure provides a laser leveling tool comprising: gesture control means comprising at least one sensor adapted to sense gestures and/or movements of a user; and a controller in communication with the gesture control means for receiving signals from the sensor and generating control commands for the tool based on the received signals, the control commands comprising at least ON/OFF states of laser beam patterns that the tool can project.
According to a possible embodiment, there are at least two sensors arranged at substantially the same height, and the controller is configured to generate a control command to activate a horizontal laser beam when the user is swiping from a first one of the at least two sensors to a second one of the at least two sensors and to generate a control command to deactivate the horizontal laser beam when the user is swiping from the second one of the at least two sensors to the first one of the at least two sensors.
According to a possible embodiment, there are at least two sensors arranged at different heights, and the controller is configured to generate a control command to activate a vertical laser beam when the user is swiping from one of the at least two sensors to another one of the at least two sensors and to generate a control command to deactivate the vertical laser beam when the user is swiping from the another one of the at least two sensors to the one of the at least two sensors.
According to a possible embodiment, there is at least one sensor to be tapped on by the user, and the controller is configured to generate a control command to activate/deactivate a plumb laser dot when the user is tapping on this sensor.
According to a possible embodiment, there are at least two sensors to be swiped by the user, and the controller is configured to generate a control command to activate/deactivate a plumb laser dot when the user is swiping from one of the at least two sensors to another one of the at least two sensors or from the another one of the at least two sensors to the one of the at least two sensors.
According to a possible embodiment, the tool comprises an HMI, and at least some of the sensors are arranged on or under a screen of the HMI.
According to a possible embodiment, the screen is provided with marks corresponding to the sensors for directing the user's swiping and/or tapping actions.
According to a possible embodiment, the tool comprises a housing, and at least some of the sensors are arranged on the housing or at the inner side of the housing.
According to a possible embodiment, when the user is sensed by two sensors in sequence, there is a time delay between their signals, and the controller is configured to determine different laser on/off commands based on the signals of the sensors and the time delays.
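The logic of the embodiments above can be illustrated with a short sketch: two sensor trigger events, ordered in time, are interpreted as a swipe, and the swipe direction selects the activate or deactivate command. The sensor identifiers, function names and the delay window below are illustrative assumptions, not taken from the disclosure.

```python
# Hypothetical sketch: derive swipe direction from the order and time delay
# of two sensor trigger events, then map the direction to an ON/OFF command.
MAX_SWIPE_DELAY = 0.5  # seconds; triggers further apart are not one swipe (assumed value)

def command_from_triggers(events):
    """events: ordered list of (sensor_id, timestamp) tuples, e.g. [("S1", t0), ("S2", t1)]."""
    if len(events) < 2:
        return None
    (first, t0), (second, t1) = events[0], events[-1]
    delay = t1 - t0
    if not (0 < delay <= MAX_SWIPE_DELAY):
        return None  # too slow to count as a single swipe
    # Swiping S1 -> S2 activates the horizontal beam; S2 -> S1 deactivates it.
    if (first, second) == ("S1", "S2"):
        return "HORIZONTAL_ON"
    if (first, second) == ("S2", "S1"):
        return "HORIZONTAL_OFF"
    return None

print(command_from_triggers([("S1", 0.00), ("S2", 0.12)]))  # HORIZONTAL_ON
print(command_from_triggers([("S2", 0.00), ("S1", 0.12)]))  # HORIZONTAL_OFF
```

The same pattern extends to vertically arranged sensors for the vertical beam: only the mapping table changes, not the timing logic.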
According to a possible embodiment, the sensors have a sensor range of 10 to 20 mm.
According to a possible embodiment, the sensor is selected from an infrared sensor, a capacitive sensor, a passive infrared sensor, a resistive sensor and/or a magnetic sensor.
According to a possible embodiment, the sensor comprises a sensor that can capture gestures and/or movements of the user when the user is within a capturing area of the sensor.
According to a possible embodiment, the sensor is configured to capture images of the user when the user is exposed to irradiation in the capturing area of the sensor.
According to a possible embodiment, the irradiation comprises visible or invisible light.
According to a possible embodiment, the laser leveling tool further comprises an emitter configured to emit the visible or invisible light.
According to a possible embodiment, the sensors have a sensor range of several meters to dozens of meters.
According to a possible embodiment, the sensor is selected from a camera, an infrared sensor and/or a passive infrared sensor.
According to a possible embodiment, the sensor has an adjustable sensor range.
In accordance with the disclosure, the leveling tool comprises gesture control functionalities to substitute for the physical press buttons used in traditional leveling tools. In operation of the leveling tool, commands can be inputted by swiping or tapping actions of the user's body, and no press-down action is needed. Deficiencies in laser leveling tools resulting from physical press buttons are thereby eliminated.
Other advantages and aspects of the disclosure will be described in the detailed description.
The disclosure will be further understood by reading the following detailed description with reference to the drawings in which:
In general, the disclosure relates to a laser leveling tool. The laser leveling tool may be of any type and have any form. Thus, the disclosure will be described with reference to some particular forms, but the scope of the disclosure also covers other forms of laser leveling tools.
On an oblique back side portion of the housing 1, there is provided an HMI 4 by means of which a user can input commands to the tool and which can display information about the state and operation of the tool.
The basic idea of the disclosure is to remove the press-down physical buttons used in the HMIs of traditional tools, and for this purpose the tool is equipped with gesture control means for sensing the user's gesture.
As an example, the HMI 4 is configured as the gesture control means that can sense gestures and/or motions of the user's body, especially hand or finger, and transform the sensed gestures into input commands for controlling the operation of the tool.
For sensing the gestures of the user, it is believed that a short-distance approach and a long-distance approach can be used to achieve gesture sensing.
As the short-distance approach, some proximity sensors were studied and evaluated. In general, the user's finger is near the gesture control means, for example within a short distance of 20 mm, so the working range of the sensors shall cover this distance range. Further, sensing precision, stability (under various environmental conditions, such as light, temperature and the like), sensitivity, etc. are also key factors in selecting sensors. It is found that IR (infrared) sensors and capacitive sensors meet almost all these requirements.
An IR sensor used here is equipped with an IR emitter for emitting IR rays and an IR receiver for sensing IR rays reflected from the user's hand or finger. When the user's finger enters the path of the IR light emitted from the IR emitter, the IR light is reflected by the finger, and the reflected IR light is sensed by the IR receiver. The working range of the IR sensor is adjustable, and is suitable for short-distance use (for example, 10 to 20 mm).
A capacitive sensor used here can detect the change in capacitance when a user's finger touches or approaches the capacitive sensor. The sensor field can pass through glass or plastic, so the capacitive sensor can be mounted inside the housing. The working range of the capacitive sensor is also adjustable, and is suitable for short-distance use (for example, 10 to 20 mm).
It is understood that other types of proximity sensors, like passive IR sensors, resistive sensors, laser beam sensors, magnetic sensors, etc. may also meet the short-distance requirements, possibly with some necessary adaptation; for example, when passive IR sensors are used, some particularly designed lenses may be needed for hand gesture sensing at a short distance.
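However a proximity reading is obtained (IR reflectance, capacitance, or otherwise), it still has to be converted into a stable "finger present" decision near the 10 to 20 mm boundary. The following sketch is an illustrative assumption, not part of the disclosure: it applies hysteresis so that noise around the threshold does not produce spurious triggers. The class name and the two threshold values are hypothetical.

```python
# Illustrative debouncing sketch (assumed, not from the disclosure):
# a finger is declared present below ENTER_MM and absent only once the
# estimated distance rises above EXIT_MM, suppressing threshold chatter.
ENTER_MM = 18.0  # finger considered present below this distance (assumed)
EXIT_MM = 22.0   # finger considered gone above this distance (assumed)

class ProximityDebouncer:
    def __init__(self):
        self.present = False

    def update(self, distance_mm):
        """Feed one distance estimate; return the debounced presence state."""
        if not self.present and distance_mm < ENTER_MM:
            self.present = True
        elif self.present and distance_mm > EXIT_MM:
            self.present = False
        return self.present

d = ProximityDebouncer()
print([d.update(x) for x in (30, 15, 20, 25, 30)])
# [False, True, True, False, False]: 20 mm stays "present" thanks to hysteresis
```

The gap between the two thresholds trades responsiveness against noise immunity and would be tuned to the sensor's actual precision.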
Then, the HMI 4 is modified as gesture control means that incorporates proximity sensors having a working range of 10 to 20 mm. The proximity sensors are arranged in a pattern, suitable for sensing the gesture of a user's finger, within the range of the HMI 4. A particular pattern is shown in
When the sensor fields of the sensors S1 to S6 (for example, capacitive sensors) can pass through the screen of the HMI 4, the sensors can be mounted under the screen. On the other hand, when the sensor fields of the sensors S1 to S6 cannot pass through the screen of the HMI 4, the sensors can be mounted to the screen.
The transverse sensors S1, S2 and S3 can be used for sensing a substantially transverse (horizontal) movement of the user's finger, the longitudinal sensors S4, S2 and S5 can be used for sensing a substantially longitudinal (vertical) movement of the user's finger, and the sensor S6 can be used for sensing the tap of the user's finger. For directing the actions of the user's finger, a cross mark 5 is formed on the screen corresponding to the transverse sensors S1, S2 and S3 and the longitudinal sensors S4, S2 and S5, and a spot or circular tap mark 6 is formed on the screen corresponding to the sensor S6.
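The cross pattern above can be decoded by matching the ordered sequence of triggered sensors against the horizontal arm (S1-S2-S3), the vertical arm (S4-S2-S5), and the tap sensor S6. The decoder below is a minimal sketch; the gesture names and the command hints in the comments are illustrative assumptions.

```python
# Possible decoder for the cross-shaped sensor layout: S1-S2-S3 form the
# horizontal arm, S4-S2-S5 the vertical arm, and S6 is a standalone tap sensor.
HORIZONTAL_ROW = ["S1", "S2", "S3"]
VERTICAL_COL = ["S4", "S2", "S5"]

def classify(sequence):
    """sequence: ordered list of sensor ids triggered by the finger."""
    if sequence == ["S6"]:
        return "TAP"                                  # e.g. toggle plumb dot
    if sequence == HORIZONTAL_ROW:
        return "SWIPE_RIGHT"                          # e.g. horizontal beam ON
    if sequence == list(reversed(HORIZONTAL_ROW)):
        return "SWIPE_LEFT"                           # e.g. horizontal beam OFF
    if sequence == VERTICAL_COL:
        return "SWIPE_DOWN"                           # e.g. vertical beam ON
    if sequence == list(reversed(VERTICAL_COL)):
        return "SWIPE_UP"                             # e.g. vertical beam OFF
    return "UNKNOWN"

print(classify(["S1", "S2", "S3"]))  # SWIPE_RIGHT
print(classify(["S6"]))              # TAP
```

Requiring all three sensors of an arm to fire in order is one way to obtain the reliability advantage of swiping over tapping noted later in the description.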
As shown in
As shown in
Alternatively or in addition, the tool may comprise sensors arranged in other locations to form gesture control means. As an example, as shown in
The controller of the tool can receive signals from the sensors Sa to Sd and generate corresponding commands to control the tool to turn on/off the laser beams. For example, as shown in
Other types of actions of the finger, such as tapping on two sensors at the same time, swiping over three or more sensors in a non-linear path, etc., may be used by the controller to generate other commands.
Other types of laser beam on/off operations can be achieved by arranging corresponding sensors and correlating the input commands with the sensed signals of the sensors.
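One way the controller could hold the ON/OFF state of each beam and apply recognized swipes is sketched below. The mapping of the Sa to Sd swipes to particular beams, and the choice of toggling on either swipe direction, are assumptions for illustration; the disclosure itself leaves the correlation between gestures and commands open.

```python
# Hedged sketch of controller state: each recognized swipe between a mapped
# sensor pair toggles the corresponding laser beam. Beam names and the
# (sensor pair -> beam) mapping are hypothetical.
class LaserController:
    def __init__(self):
        self.state = {"horizontal": False, "vertical": False, "plumb_dot": False}

    def handle_swipe(self, start, end):
        """Apply one recognized swipe (start sensor -> end sensor); return the new state."""
        mapping = {("Sa", "Sb"): "horizontal", ("Sc", "Sd"): "vertical"}
        beam = mapping.get((start, end)) or mapping.get((end, start))
        if beam:
            self.state[beam] = not self.state[beam]  # toggle on each swipe
        return self.state

ctrl = LaserController()
print(ctrl.handle_swipe("Sa", "Sb")["horizontal"])  # True  (beam turned on)
print(ctrl.handle_swipe("Sb", "Sa")["horizontal"])  # False (beam turned off)
```

A direction-sensitive variant (one direction always activates, the opposite always deactivates), as in the earlier embodiments, would simply map each ordered pair to an explicit ON or OFF command instead of a toggle.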
The gesture control means formed by sensors of the disclosure is also applicable in other types of tools. For example, for a tool with turrets 7 and 8 as shown in
It is understood that, for all the embodiments of the disclosure, the turning on or off of all the laser beams (horizontal, vertical, dot, cross or the like) can be achieved by tapping on corresponding sensors or swiping (moving) between corresponding sensors. However, swiping between sensors can provide higher reliability, which can be understood with reference to
According to another aspect of the disclosure as shown in
The sensor 10 of the long-distance approach may be a sensor that can capture images of the user's body when the user is exposed to irradiation in a capturing area of the sensor 10. For example, the irradiation may be visible or invisible light. The irradiation may come from environment light or from an irradiation emitter 11 of the laser leveling tool.
The irradiation emitter 11, if any, may also be mounted to or in the housing 1, and there may be more than one irradiation emitter 11 in the laser leveling tool.
As an example of the long-distance approach sensor 10, those that can capture images of the user can be used here. For example, the sensor 10 may be a micro camera that can capture the user's image in visible light.
As an example of the long-distance approach sensor 10, IR sensors or passive IR sensors may be used here for capturing the user's images in invisible light. An IR sensor comprises an IR emitter and an IR receiver. The IR emitter emits IR rays which are irradiated onto the user's body, and images of the user's body exposed to the IR rays can then be sensed by the IR receiver. A passive IR sensor may comprise only an IR receiver for sensing the user's body when exposed to IR rays coming from environment light.
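A common way to turn such captured images into gesture input is frame differencing: pixels that change between consecutive frames mark the moving hand, and the drift of their centroid gives the wave direction. The sketch below is an illustrative assumption, not the disclosure's method; frames are simplified to 1-D brightness rows and the threshold is hypothetical.

```python
# Illustrative long-distance sketch: detect a left/right hand wave by frame
# differencing on images captured under visible or IR irradiation.
DIFF_THRESHOLD = 10  # per-pixel brightness change counted as motion (assumed)

def motion_centroid(prev_frame, frame):
    """Return the mean position of changed pixels, or None if no motion."""
    changed = [i for i, (a, b) in enumerate(zip(prev_frame, frame))
               if abs(a - b) > DIFF_THRESHOLD]
    return sum(changed) / len(changed) if changed else None

def wave_direction(frames):
    """Classify a left/right wave from a sequence of frames."""
    centroids = [motion_centroid(a, b) for a, b in zip(frames, frames[1:])]
    centroids = [c for c in centroids if c is not None]
    if len(centroids) < 2:
        return None
    return "RIGHT" if centroids[-1] > centroids[0] else "LEFT"

# A bright blob moving rightwards across three frames:
f0 = [0, 80, 0, 0, 0]
f1 = [0, 0, 80, 0, 0]
f2 = [0, 0, 0, 80, 0]
print(wave_direction([f0, f1, f2]))  # RIGHT
```

With 2-D frames from a micro camera or IR receiver, the same centroid-drift idea applies per axis, allowing horizontal and vertical waves to be distinguished.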
Other types of long distance approach sensors (with or without an irradiation source) can also be used here.
The short and long distance approaches may be used in combination in the same laser leveling tool.
As can be seen, according to the disclosure, the laser leveling tool comprises gesture control means to substitute for the physical press buttons used in traditional leveling tools. In operation of the leveling tool, commands can be inputted by swiping, tapping or other types of actions of the user's finger, hand or body, and no press-down action is needed. Deficiencies in laser leveling tools resulting from physical press buttons are eliminated.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the disclosure. The disclosure is intended to cover all modifications, substitutions and changes as would fall within the scope and spirit of the disclosure.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/CN2018/105144 | 9/12/2018 | WO |

Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO2020/051783 | 3/19/2020 | WO | A
Entry
---
International Search Report corresponding to PCT Application No. PCT/CN2018/105144, dated May 29, 2019 (English language document) (4 pages).

Number | Date | Country
---|---|---
20210341287 A1 | Nov 2021 | US