The present invention relates to detecting objects during operation of power tools. More specifically, the present invention relates to changing operation of a power tool based on detection of an object.
In one embodiment, a power tool is provided including a housing, a motor within the housing, an output component coupled to the motor and defining an operating path of the power tool, an object detection sensor, and a controller. The operating path of the power tool is defined by the direction of motion of the output component when driven by the motor. The object detection sensor is configured to generate an output signal indicative of a material in the operating path of the power tool. The controller is coupled to the motor and the object detection sensor. The controller is configured to receive the output signal from the object detection sensor, and determine, based on the output signal, when the material in the operating path of the power tool changes. The controller is also configured to change an operation of the motor in response to determining that the material in the operating path of the power tool changed.
In another embodiment, a method is provided for operating a power tool including a motor and an output component. The power tool defines an operating path of the power tool based on the direction of motion in which an output component of the power tool is driven by the motor. The method includes receiving, at a controller, an output signal from an object detection sensor of the power tool, and determining, by the controller, whether flesh is present in the operating path of the power tool. The method also includes interrupting power to the motor in response to determining that flesh is present in the operating path of the power tool.
In another embodiment, a method is provided for operating a power tool including a motor and an output component. The power tool defines an operating path of the power tool based on the direction of motion in which an output component of the power tool is driven by the motor. The method includes receiving, at a controller, an output signal from an object detection sensor of the power tool, and determining, by the controller based on the output signal, that an object is present in the operating path of the power tool and that the object is of a certain type. The method also includes interrupting power to the motor in response to determining that the object that is present in the operating path of the power tool is of the certain type.
Other aspects will become apparent by consideration of the detailed description and accompanying drawings.
Before any embodiments of the invention are explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms “mounted,” “connected” and “coupled” are used broadly and encompass both direct and indirect mounting, connecting and coupling. Further, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings, and can include electrical connections or couplings, whether direct or indirect.
It should be noted that a plurality of hardware and software based devices, as well as a plurality of different structural components, may be utilized to implement the invention. Furthermore, and as described in subsequent paragraphs, the specific configurations illustrated in the drawings are intended to exemplify embodiments of the invention, and other alternative configurations are possible. The terms “processor,” “central processing unit” and “CPU” are interchangeable unless otherwise stated. Where the terms “processor” or “central processing unit” or “CPU” are used as identifying a unit performing specific functions, it should be understood that, unless otherwise stated, those functions can be carried out by a single processor, or multiple processors arranged in any form, including parallel processors, serial processors, tandem processors or cloud processing/cloud computing configurations.
The power tool 100 defines a longitudinal axis A. The power tool 100 generally includes a shoe assembly 105, and a main body 110 having a motor 115. The motor 115 receives power from an electric cord (for example, in an AC version), a battery pack (for example, in a DC version), a source of compressed air (for example, in a pneumatic version), or a combination thereof. A drive mechanism 120 converts rotational motion of the motor 115 to reciprocating motion of a reciprocating spindle 125 to reciprocate a saw blade 130 (i.e., the output component) in a direction substantially parallel to the longitudinal axis A of the power tool 100. The power tool 100 also includes a handle assembly 135 positioned at a distal end of the main body 110 opposite the shoe assembly 105. The handle assembly 135 includes a grip portion 140 and a trigger 145 adjacent the grip portion 140 for actuating the motor 115. The trigger 145 is positioned such that a user can actuate the trigger 145 using the same hand that is holding the grip portion 140, for example, with an index finger. The power tool 100 further includes a mode pad 150. The mode pad 150 allows a user to select a mode of the power tool 100 and indicates to the user the currently selected mode of the power tool 100, which is described in greater detail below.
The shoe assembly 105 includes a shoe post 155 and a shoe 160. The shoe 160 is pivotally mounted on a distal end of the shoe post 155 away from the main body 110. In other constructions, the shoe 160 may be fixedly mounted to the shoe post 155, or mounted in other suitable ways. In other constructions, other types of shoe assemblies may be employed. The shoe assembly 105 is secured relative to the main body 110 of the power tool 100 and provides a guiding surface 165 for resting the power tool 100 against a workpiece (not shown) during cutting operations. The shoe assembly 105 includes the longitudinally-extending shoe post 155, extending substantially parallel to the longitudinal axis A of the power tool 100, which is at least partially disposed within an orifice of the main body 110 of the power tool 100. The shoe post 155 is axially movable relative to the main body 110 of the power tool 100 in a direction substantially parallel to the axis A and includes a locking mechanism 170 for stabilizing the shoe assembly 105 in one of a plurality of axial positions relative to the main body 110. For example, the locking mechanism 170 may include a ball detent system. In other constructions, other suitable types of locking mechanisms may be employed, such as magnets, cams, other types of detent mechanisms, etc.
As shown in
In some embodiments, the power tool 100 includes a battery pack interface, such as illustrated in
The switching network 216 enables the controller 226 to control the operation of the motor 115. Generally, when the trigger 145 is depressed as indicated by an output of the trigger switch 213, electrical current is supplied from the power source 205 to the motor 115, via the switching network 216. When the trigger 145 is not depressed, electrical current is not supplied from the power source 205 to the motor 115.
In response to the controller 226 receiving the activation signal from the trigger switch 213 indicating at least partial depression of the trigger 145, the controller 226 activates the switching network 216 to provide power to the motor 115. The switching network 216 controls the amount of current available to the motor 115 and thereby controls the speed and torque output of the motor 115. The switching network 216 may include numerous FETs, bipolar transistors, or other types of electrical switches. For instance, the switching network 216 may include a six-FET bridge that receives pulse-width modulated (PWM) signals from the controller 226 to drive the motor 115.
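The trigger-to-motor-power relationship described above can be illustrated with a minimal sketch. The linear mapping, the 0.0-1.0 depression scale, and the minimum starting duty cycle below are assumptions for illustration only, not details of any particular embodiment; actual tools may use nonlinear curves or discrete speed steps.

```python
def trigger_to_pwm_duty(trigger_depression, min_duty=0.1, max_duty=1.0):
    """Map trigger depression (0.0 = released, 1.0 = fully depressed)
    to a PWM duty cycle for the switching network.

    Hypothetical linear mapping: a released trigger supplies no
    current, and any depression starts from a minimum duty cycle.
    """
    if trigger_depression <= 0.0:
        return 0.0  # trigger released: no power supplied to the motor
    # Linear interpolation between the minimum starting duty and full duty
    return min_duty + (max_duty - min_duty) * min(trigger_depression, 1.0)
```

The controller would apply the resulting duty cycle to the PWM signals driving the six-FET bridge, thereby controlling motor speed and torque.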
The sensors 218 are coupled to the controller 226 and communicate to the controller 226 various signals indicative of different parameters of the power tool 100 or the motor 115. The sensors 218 may include Hall sensors 218a, one or more current sensors 218b, one or more object detection sensors 218c, one or more distance sensors 218d, one or more shoe contact sensors 218e, among other sensors, such as, for example, one or more voltage sensors, one or more temperature sensors, and one or more torque sensors. While these respective sensors are generally referred to in the singular herein (e.g., the distance sensor 218d), these sensors may include one sensor (e.g., one distance sensor 218d) or more than one sensor (e.g., two or more distance sensors 218d) in some embodiments. The controller 226 can monitor the current drawn by the motor 115 using the current sensor 218b. The distance sensor 218d may be an induction sensor that determines the distance between the material being cut and the shoe 160. Additionally, the shoe contact sensor 218e may be an induction sensor that determines whether material is contacting the shoe 160. As explained in further detail below, the object detection sensor 218c may include capacitive sensors, RF sensors, ultrasound sensors, visual sensors (e.g., cameras), conductivity sensors, noncontact voltage sensors, or a combination thereof. The object detection sensor(s) 218c generates an output indicative of whether a particular object is proximate the operating path of the power tool 100. The operating path of the power tool 100 is defined by the direction of travel of the output component of the power tool 100 when the power tool 100 is operated. In other words, the operating path refers to the area in which the output component (e.g., a blade, a drill bit, and the like) of the power tool 100 is being driven, or is expected to be driven in the current operation. As just two examples,
Each Hall sensor 218a outputs motor feedback information to the controller 226, such as an indication (e.g., a pulse) when a magnet of the motor's rotor rotates across the face of that Hall sensor. Based on the motor feedback information from the Hall sensors 218a, the controller 226 can determine the position, velocity, and acceleration of the rotor. In response to the motor feedback information and the signals from the trigger switch 213, the controller 226 transmits control signals to control the switching network 216 to drive the motor 115. For instance, by selectively enabling and disabling the FETs of the switching network 216, power received via the power source 205 is selectively applied to stator coils of the motor 115 to cause rotation of its rotor. The motor feedback information is used by the controller 226 to ensure proper timing of control signals to the switching network 216 and, in some instances, to provide closed-loop feedback to control the speed of the motor 115 to be at a desired level.
The object detection sensor(s) 218c indicate whether an extraneous object is present in the operating path of the power tool 100. An object may be present in the operating path when it is within or proximate to the operating path. The object detection sensor(s) 218c may be implemented, for example, to detect whether flesh (e.g., a finger, hand, arm, or other limb of a living being) is within the operating path of the power tool 100. In some embodiments, however, the object detection sensor 218c may also determine the type of object that is present in the operating path of the power tool 100.
In one embodiment, the object detection sensor 218c includes a radio frequency (RF) sensor. The RF sensor transmits RF signals, and analyzes changes in the response signal (e.g., a received signal that is a reflection of or otherwise corresponds to the originally transmitted RF signal) to determine whether an extraneous object is present in the operating path of the power tool 100. When an object is in the travel path of the RF signal, a parameter of the signal changes and the corresponding received signal is different than the originally transmitted signal. As discussed in further detail below, the controller 226 receives information regarding the originally transmitted RF signal, the response signal, the difference(s) between the originally transmitted RF signal and the response signal, or a combination thereof, and determines, based on the received information, whether an object is in the operating path of the power tool 100 and, in some embodiments, the type of object that is in the operating path of the power tool 100.
In another embodiment, the object detection sensor 218c includes a capacitive sensor. The capacitive sensor monitors the changes in a capacitive field in the operating path of the power tool 100 to determine whether an object is present in the operating path of the power tool 100 and, in some embodiments, to determine which type of object is present in the operating path of the power tool 100. In some embodiments, the capacitive sensor includes a single capacitive probe that forms a capacitor with a detected object. A voltage signal is applied to the capacitive probe, and the capacitance changes based on the object that forms the capacitor with the capacitive probe. Based on the distance between the capacitive probe and the detected object, the material properties of the detected object, the material between the capacitive probe and the detected object, or a combination thereof, the capacitance at the capacitive probe changes. In other embodiments, the capacitive sensor includes two capacitive probes, for example, that are positioned opposite to each other. When an object enters the area between the two capacitive probes, the capacitance between the two probes changes. The capacitive sensor generates an output indicative of the capacitance sensed by the capacitive probe (or capacitive probes), and transmits the output to the controller 226 periodically. In other embodiments, the capacitive sensor generates an output indicative of the change in capacitance at the single capacitive probe or the pair of capacitive probes, and transmits the output to the controller 226.
In other embodiments, the object detection sensor 218c includes an ultrasound probe. Similar to the RF sensor, the ultrasound probe generates and transmits an ultrasound signal and analyzes the response signal (a reflection of or otherwise corresponding to the originally transmitted signal) to determine whether an object is present in the operating pathway of the power tool 100, and, in some embodiments, a type of object that is present in the operating pathway of the power tool 100. When an object is in the travel path of the ultrasound signal, a parameter of the signal changes and the corresponding response signal differs from the originally transmitted ultrasound signal. As discussed in further detail below, the controller 226 receives from the ultrasound sensor information regarding the originally transmitted ultrasound signal, the response signal, the difference(s) between the originally transmitted ultrasound signal and the response signal, or a combination thereof, and determines based on the received information whether an object is in the operating path of the power tool 100 and, in some embodiments, the type of object that is in the operating path of the power tool 100.
In yet other embodiments, the object detection sensor 218c includes a camera (or an array of cameras) mounted on the power tool 100. The camera visually monitors the operating path of the power tool 100 and outputs image data to the controller 226 to enable the controller 226 to determine whether an object is present in the operating pathway of the power tool 100 and, in some embodiments, the type of object that is present in the operating pathway of the power tool 100. In some embodiments, a single camera is mounted to the power tool 100. The camera captures image data including, for example, light patterns in the operating pathway of the power tool 100, surface textures in the operating pathway, and the like, which is used by the controller 226 to identify when an object (and, in some embodiments, the type of object) is present in the operating pathway. In some embodiments, two or more cameras are mounted on the power tool 100. The cameras may be mounted such that depth information (e.g., depth perception) may be obtained by analyzing the image data from both cameras. The image data obtained by the camera (or camera array) may include visual information in grayscale, color, black and white, or other formats (e.g., HSV, LSL, and the like).
In further embodiments, the object detection sensor 218c includes a conductivity probe. In some embodiments, the conductivity probe may be coupled to the output component of the power tool 100 (e.g., the blade or drill bit). The conductivity probe generates an output indicative of the conductivity of the output component. When an object touches the output component, the conductivity of the output component changes, and such a change is registered by the conductivity probe. In other embodiments, two conductivity probes are used to measure the conductivity of the space or material in the operating path of the power tool 100. When an object enters the operating pathway of the power tool 100, the conductivity changes and such a change is registered by the conductivity probes.
The indicators 220 are also coupled to the controller 226 and receive control signals from the controller 226 to turn on and off or otherwise convey information based on different states of the power tool 100. The indicators 220 include, for example, one or more light-emitting diodes (“LED”), or a display screen. The indicators 220 can be configured to display conditions of, or information associated with, the power tool 100. For example, the indicators 220 are configured to indicate measured electrical characteristics of the power tool 100, the status of the power tool 100, the mode of the power tool (discussed below), etc. The indicators 220 may also include elements to convey information to a user through audible or tactile outputs.
As described above, the controller 226 is electrically and/or communicatively connected to a variety of modules or components of the power tool 100. In some embodiments, the controller 226 includes a plurality of electrical and electronic components that provide power, operational control, and protection to the components and modules within the controller 226 and/or power tool 100. For example, the controller 226 includes, among other things, a processing unit 230 (e.g., a microprocessor, a microcontroller, or another suitable programmable device), a memory 232, input units 234, and output units 236. The processing unit 230 (herein, electronic processor 230) includes, among other things, a control unit 240, an arithmetic logic unit (“ALU”) 242, and a plurality of registers 244 (shown as a group of registers in
The memory 232 includes, for example, a program storage area and a data storage area. The program storage area and the data storage area can include combinations of different types of memory, such as read-only memory (“ROM”), random access memory (“RAM”) (e.g., dynamic RAM [“DRAM”], synchronous DRAM [“SDRAM”], etc.), electrically erasable programmable read-only memory (“EEPROM”), flash memory, a hard disk, an SD card, or other suitable magnetic, optical, physical, or electronic memory devices. The electronic processor 230 is connected to the memory 232 and executes software instructions that are capable of being stored in a RAM of the memory 232 (e.g., during execution), a ROM of the memory 232 (e.g., on a generally permanent basis), or another non-transitory computer readable medium such as another memory or a disc. Software included in the implementation of the power tool 100 can be stored in the memory 232 of the controller 226. The software includes, for example, firmware, one or more applications, program data, filters, rules, one or more program modules, and other executable instructions. The electronic processor 230 is configured to retrieve from memory and execute, among other things, instructions related to the control processes and methods described herein. The electronic processor 230 is also configured to store power tool information on the memory 232 including operational data, information identifying the type of tool, a unique identifier for the particular tool, and other information relevant to operating or maintaining the power tool 100. The tool usage information, such as current levels, motor speed, motor acceleration, motor direction, may be captured or inferred from data output by the sensors 218.
The input units 234 and output units 236 enable the processing unit 230 to communicate with various components, such as the mode pad 150, the trigger switch 213, the switching network 216, the indicators 220, the wireless controller 228, and the user interface 222. For example, the input units 234 and the output units 236 may include conductive pins, driver circuitry, buffers, or a combination thereof.
The power tool 100 operates in various modes. Each mode enables different features to be executed by the power tool 100 and facilitates certain applications for the user. The current operational mode of the power tool 100 is selected by the user, for instance, using the mode pad 150. For example, with respect to
The controller 226 receives the signal(s) from the object detection sensor 218c and determines, based on the output signals from the sensor 218c, whether an object is present in the operating pathway of the power tool 100, and, in some embodiments, a type of object that is present in the operating pathway. When the object detection sensor 218c includes the RF sensor, the controller 226 receives various measured parameters of the originally transmitted signal and the response signal, and determines based on the difference in parameters whether an object is in the operating path of the power tool 100. For example, since different materials reflect different frequencies better than others, in one embodiment, the RF sensor transmits a wideband frequency signal. The transmitted signal travels until it reaches a reflecting object or surface (for example, an object in the operating path of the power tool 100). The object then reflects a response signal back toward the RF sensor. The frequency of the response signal, however, depends on the material of the object from which the response signal reflects. For example, drywall may reflect back a first group of frequencies better than a second group of frequencies, while metal may reflect the second group of frequencies better than the first group of frequencies. Accordingly, when drywall is in the operating path of the power tool 100, the response signal will be primarily in the first group of frequencies. However, when a metal pipe, for example, enters the operating path of the power tool 100, the response signal changes to include more frequencies from the second group of frequencies. Therefore, by determining a primary frequency (or group of frequencies) of the response signal, the controller 226 may determine whether an object is present in the operating pathway of the power tool 100.
Additionally, the controller 226 may detect a change in the primary frequency (or group of frequencies) during operation of the power tool 100 and, in response, determine that a new object is present in the operating pathway of the power tool 100. Moreover, the controller 226 is also configured to determine a type of object that is in the operating pathway of the power tool 100 using predetermined frequency responses associated with a variety of object types. In other words, by comparing the frequency of the response signal to predetermined frequency responses for various materials (e.g., in a lookup table in the memory 232) and identifying a match or similar frequency response, the controller 226 determines the type of material present in the operating pathway of the power tool 100.
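The lookup described above can be sketched as follows. The specific frequency bands are hypothetical placeholders, not calibrated values; an actual embodiment would store calibrated frequency responses in the memory 232.

```python
# Hypothetical reflected-frequency bands (in Hz) per material type;
# real values would be calibrated and stored in the memory 232.
FREQUENCY_PROFILES = {
    "drywall": (1.0e9, 2.0e9),  # the "first group of frequencies"
    "metal":   (4.0e9, 6.0e9),  # the "second group of frequencies"
    "flesh":   (2.5e9, 3.5e9),
}

def classify_by_primary_frequency(primary_hz):
    """Return the material whose stored frequency band contains the
    response signal's primary frequency, or None when no band matches."""
    for material, (low, high) in FREQUENCY_PROFILES.items():
        if low <= primary_hz <= high:
            return material
    return None
```

A change in the classification result during a cut would indicate that a new object has entered the operating pathway.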
Based on the material determined by the controller 226, the controller 226 can, for example, interrupt the power to the motor 115. For example, when the controller 226 determines that a part of a living being is in the operating path of the power tool 100, the controller 226 interrupts power to the motor 115 to inhibit the power tool 100 from operating the output component in the same area as the detected part of the living being. In other embodiments, the controller 226 is programmed to avoid operating on certain materials, for example, metal pipes, water conduits, or electrical conduits, so as to avoid damaging other components of the construction. In such embodiments, the controller 226 may first determine whether the certain materials are present in the operating path of the power tool 100 and allow the motor 115 to be activated only when the certain materials are determined to not be proximate to the operating path of the power tool 100.
In other embodiments, the controller 226 determines the phase angle of the originally transmitted RF signal and the corresponding response signal. Because different materials reflect RF signals differently, the phase angle of the response signal may be indicative of the type of material that reflected the originally transmitted RF signal. Accordingly, the controller 226 may calculate a phase difference between the originally transmitted signal and the response signal. The controller 226 then compares the phase difference to predetermined phase differences corresponding to different types of materials. For example, a metal pipe may be expected to cause a phase difference of approximately 25 degrees, while flesh (e.g., a part of a living being) may be expected to cause a phase difference of 120 degrees. By comparing the calculated phase difference to these predetermined phase differences, the controller 226 is configured to determine the material that reflected the originally transmitted RF signal. The predetermined phase differences and associated material types may be stored in a lookup table in the memory 232.
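A nearest-match comparison of this kind might be sketched as below. The 25-degree and 120-degree values come from the example above; the drywall entry and the tolerance are hypothetical additions for illustration.

```python
# Predetermined phase differences (degrees) per material, mirroring
# the lookup table described for the memory 232. The drywall value
# and the matching tolerance are hypothetical.
PHASE_TABLE = {"metal pipe": 25.0, "flesh": 120.0, "drywall": 70.0}

def material_from_phase(measured_deg, tolerance_deg=15.0):
    """Return the material whose predetermined phase difference is
    closest to the measured one, or None when no entry is within
    the tolerance."""
    material, expected = min(
        PHASE_TABLE.items(), key=lambda kv: abs(kv[1] - measured_deg)
    )
    if abs(expected - measured_deg) <= tolerance_deg:
        return material
    return None  # no confident match
```

Returning None for out-of-tolerance measurements lets the controller treat an unrecognized reflection conservatively, for example by interrupting power.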
In yet another embodiment, the controller 226 analyzes the dispersion pattern of the transmitted RF signal. Because different materials reflect the RF signal differently, by analyzing the dispersion pattern of the transmitted RF signal, the controller 226 can determine the material of the object in the operating path of the power tool 100. In such embodiments, the RF sensor may also include an RF sensor array to detect the dispersion pattern of the response signal(s). Similar to identifying materials based on phase angle, predetermined dispersion patterns and associated materials may be stored in a lookup table in the memory 232 and accessed by the controller 226 to identify the type of material based on the dispersion pattern detected.
When the object detection sensor 218c includes the capacitive sensor, the controller 226 detects the changes in capacitance and compares the capacitance measurements to, for example, threshold(s) or ranges that correspond to different materials (e.g., which may be stored in a lookup table in the memory 232). For example, a capacitance of less than 100 pF may be indicative of flesh in the operating path of the power tool 100. In such embodiments, when the controller 226 determines that the capacitance detected by the capacitive sensor (or sensors) is less than 100 pF, the controller 226 determines that flesh is in the operating path of the power tool 100 and, in response, interrupts power to the motor 115. The controller 226 may access, for example, a lookup table to determine the material of the object in the operating path of the power tool 100. For example, the controller 226 may determine that a capacitance between 150 pF and 200 pF indicates that a metal object (for example, a metal pipe) may be in the operating path of the power tool 100. As discussed below, the controller 226 adjusts or interrupts power to the motor 115 based on the material of the object determined to be present in the operating path of the power tool 100.
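The threshold comparison above can be sketched directly from the example values in the text (below 100 pF for flesh, 150 pF to 200 pF for metal); readings outside these hypothetical ranges are treated as unclassified.

```python
def material_from_capacitance(capacitance_pf):
    """Classify a capacitance reading (picofarads) using the example
    thresholds from the description; these ranges are illustrative
    and would be stored as a lookup table in the memory 232."""
    if capacitance_pf < 100.0:
        return "flesh"   # triggers an interruption of motor power
    if 150.0 <= capacitance_pf <= 200.0:
        return "metal"   # e.g., a metal pipe in the operating path
    return "unknown"
```

In practice, the stored ranges would depend on probe geometry, drive voltage, and the expected distance to the detected object.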
When the object detection sensor 218c includes the ultrasound sensor, the controller 226 may analyze similar characteristics or parameters as those analyzed with respect to the RF sensor. For example, the controller 226 receives parameters for the transmitted ultrasound signal and the response ultrasound signal. The controller 226 analyzes, for example, the phase angle between the two ultrasound signals, the frequencies of the two ultrasound signals, the dispersion patterns of the ultrasound signals, or a combination thereof. As discussed above, the controller 226 may access a table that indicates how the different materials of the reflecting object in the operating path of the power tool 100 affect the response signal. Based on these comparisons, the controller 226 may determine the material of the object in the operating path of the power tool 100.
When the object detection sensor 218c includes a camera or camera array, the controller 226 receives visual information (e.g., image data) regarding the operating path of the power tool 100. The controller 226 may then analyze the visual information to extract light patterns associated with a particular object in the operating path of the power tool 100. For example, when the power tool 100 is operating on drywall, the camera may detect a first set of light patterns, but when the power tool 100 is operating on, for example, a metal pipe, the light patterns may be different. In some embodiments, the power tool 100 also includes a light that illuminates the operating path of the power tool 100. The light may cause the light patterns detected by the camera (or camera array). In other embodiments, the controller 226 may implement object detection (for example, using auto-encoders) to analyze the visual information detected by the camera (or camera arrays) and determine whether an extraneous object is present in the operating path of the power tool 100. In embodiments in which more than one camera is used and depth information is gathered, the controller 226 may also determine an estimated distance of an object with respect to the output component of the power tool 100.
When the object detection sensor 218c includes a conductivity probe, the controller 226 receives the conductivity measurement from the conductivity probe (or probes) and compares the conductivity measurement to predetermined conductivities corresponding to different materials. In some embodiments, the predetermined conductivities are not the conductivities of the different materials but, rather, correspond to the conductivity of, for example, the air when an object of a particular material is present. That is, instead of comparing the conductivity measurement to the conductivity of copper, the controller 226 accesses a stored table that indicates that when a copper object is present in air, the conductivity of air, for example, increases. Accordingly, by comparing the conductivity measurements to predetermined conductivity measurements for various materials, the controller 226 can estimate which material corresponds to the object in the operating path of the power tool 100.
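A nearest-match estimate against such a stored table might look like the sketch below. All numeric readings are hypothetical placeholders in arbitrary units; as the text notes, each entry represents the probe reading expected when an object of that material is present, not the material's bulk conductivity.

```python
# Hypothetical stored probe readings (arbitrary units): the value the
# conductivity probe is expected to report when an object of each
# material is present near the operating path. "none" is the baseline
# reading with no extraneous object present.
STORED_READINGS = {"copper": 0.80, "flesh": 0.40, "drywall": 0.05, "none": 0.01}

def estimate_material(reading):
    """Return the material whose stored probe reading is closest to
    the measured conductivity reading."""
    return min(STORED_READINGS, key=lambda m: abs(STORED_READINGS[m] - reading))
```

As with the phase-difference example, a confidence tolerance could be added so that ambiguous readings default to a safe action.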
Based on which material the controller 226 determines is present in the operating path of the power tool 100, the controller 226 changes the operation of the motor 115. For example, when the controller 226 determines that drywall is present in the operating path of the power tool 100, the controller 226 maintains the operation of the motor 115. However, when the controller 226 determines that flesh (e.g., a part of a living being) is present in the operating path of the power tool 100, the controller 226 interrupts power to the motor 115 to inhibit the power tool 100 from harming the detected flesh. In some embodiments, the controller 226 receives a user input indicating the materials that cause changes to the operation of the motor 115 and indicating the particular changes to occur. For example, the user input may indicate that when flesh is detected in the operating path of the power tool 100, power to the motor 115 is to be stopped, but, when a metal pipe is detected in the operating path of the power tool 100, power to the motor 115 is to be increased. In some embodiments, the user input indicates which materials are to cause the power to the motor 115 to be interrupted, and which materials are to cause the operation of the motor 115 to remain unchanged.
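The material-to-action mapping described above, including the user-configurable entries, can be sketched as follows. The default table mirrors the examples in the text (maintain for drywall, interrupt for flesh, increase for a metal pipe per the example user input); treating unrecognized materials as an interruption is an assumed safe default, not a stated requirement.

```python
# User-configurable material-to-action table; these defaults mirror
# the example behaviors described in the text.
DEFAULT_ACTIONS = {
    "drywall": "maintain",      # continue normal operation
    "flesh": "interrupt",       # cut power to the motor 115
    "metal pipe": "increase",   # per the example user configuration
}

def motor_action(material, actions=DEFAULT_ACTIONS):
    """Return the change in motor operation for a detected material.

    Unrecognized materials conservatively interrupt power (an assumed
    default; an embodiment could instead maintain operation).
    """
    return actions.get(material, "interrupt")
```

A user input received via the user interface 222 would simply replace or update entries in the actions table.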
The user input may be received by the controller 226, for example, directly via an actuator or user interface 222 on the power tool 100. For example, the user interface 222 on the power tool 100 may include one or more of a rotating knob, a dial, an actuator, a sliding actuator, a touch screen, or a pushbutton that selects a material (or materials) for which the power to the motor 115 is to be interrupted. In other embodiments, the controller 226 receives the user input via the wireless communication controller 228 of the power tool 100. In such embodiments, the wireless communication controller 228 is coupled to the controller 226 and is configured to communicate with an external device 223, which may incorporate the user interface 222, the external device 223 being, for example, a smartphone, a tablet computer, a laptop, or the like. The wireless communication controller 228 may communicate, for example, via Bluetooth®, Wi-Fi, or another similar wireless communication protocol. In such embodiments, the external device 223 includes the user interface 222 or another user interface that receives the user input. For example, the user interface may receive a user input indicating the materials that are to cause a change in operation of the motor 115, the particular change or changes in operation, or both, and the external device 223 relays the indications to the controller 226 via the wireless communication controller 228. The controller 226 may receive other types of user input via the user interface 222 on the power tool 100, or via the user interface on the external device 223 and the wireless communication controller 228.
In some embodiments, the controller 226 also receives a user input, via one of the aforementioned techniques, indicating the desired sensitivity for the object detection sensor 218c. Based on the received user input, the controller 226 may adapt, for example, the voltage or other drive signals provided to the object detection sensor 218c or may adapt the thresholds to which the output signals from the object detection sensor 218c are compared to determine which material is in the operating path of the power tool 100. In one example, the controller 226 receives a user input via the user interface 222 indicating to increase the sensitivity of the object detection sensor 218c. The controller 226 then increases the voltage signal applied to the single capacitive probe, which allows the capacitive sensor to detect objects that are farther away.
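The sensitivity adjustment described above can be sketched as scaling the probe drive voltage with a user-selected level. The baseline voltage, step size, and clamp value below are illustrative assumptions; the disclosure states only that a higher drive signal allows the capacitive sensor to detect objects farther away.

```python
BASE_VOLTAGE = 3.3   # assumed baseline drive voltage (volts)
STEP = 0.5           # assumed voltage increase per sensitivity level

def drive_voltage(sensitivity_level, v_max=5.0):
    """Scale the probe drive voltage with sensitivity, clamped to a safe maximum."""
    return min(BASE_VOLTAGE + STEP * sensitivity_level, v_max)
```

An equivalent alternative, also mentioned in the passage above, would be to leave the drive signal fixed and instead adjust the comparison thresholds applied to the sensor's output signals.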
In some embodiments, the controller 226 may also receive a user input indicating whether the object detection feature is to be enabled or disabled. That is, the controller 226 receives a user input that indicates whether the output signals from the object detection sensor 218c are to be analyzed and whether the operation of the motor 115 is to change based on the output signals from the object detection sensor 218c. The controller 226 may receive the user input via the user interface 222 discussed above.
In some embodiments, the controller 226 defaults to operating the power tool 100 with the object detection feature enabled. In such embodiments, the controller 226 operates the motor 115 according to the trigger 145 and determines whether a particular type of object (for example, flesh) is in the operating path of the power tool 100. The controller 226 may then receive a toggle signal (e.g., from a push button or other toggle switch on the user interface 222 on the power tool 100 or from the external device 223 via the wireless communication controller 228) to change the state of the object detection feature. The controller 226 then disables the object detection feature in response to receiving the toggle signal, and continues to operate the motor 115 based on the trigger 145 without regard for outputs from the object detection sensor 218c.
In other embodiments, the controller 226 defaults to operating the power tool 100 with the object detection feature disabled. In such embodiments, the controller 226 operates the motor 115 according to the trigger 145. The controller 226 then receives a toggle signal (e.g., from a push button or other toggle switch on the user interface 222 on the power tool 100 or on the external device 223 via the wireless communication controller 228) to change the state of the object detection feature. In response to receiving the toggle signal, the controller 226 enables the object detection feature. When the controller 226 determines that an extraneous object is in the operating path of the power tool 100, the controller 226 changes the operation of the motor 115.
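The enable/disable behavior described in the two preceding paragraphs amounts to a one-bit state toggled by a signal. A minimal sketch, with hypothetical class and method names, covering both default states:

```python
class ObjectDetectionFeature:
    """Tracks whether object-detection outputs influence motor control."""

    def __init__(self, default_enabled=True):
        # The controller may default to either state, per the embodiments above.
        self.enabled = default_enabled

    def on_toggle_signal(self):
        """Flip the feature state when a toggle signal arrives
        (push button on the tool or command from the external device)."""
        self.enabled = not self.enabled
```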
The controller 226 receives an input indicating the first type of object for which the operation of the motor 115 is to change (step 305). For example, the controller 226 may receive the indication from the user interface 222 (e.g., a knob, sliding selector, or push button) on the power tool 100, or from the user interface 222 on an external device 223 (e.g., a smartphone or tablet) via a wired or wireless connection with the wireless communication controller 228. In some embodiments, the indication is input at the time of manufacturing and is hard-coded into the controller 226, rather than being input by an end user. The controller 226 then receives an activation signal from the trigger switch 213, for example, when the trigger 145 is activated by the user (step 310). The controller 226 also receives signals from the object detection sensor 218c (step 315). In response to receiving the activation signal, the controller 226 determines, based on the output signals from the object detection sensor 218c, whether the first type of object is in the operating path (step 320). Examples of object detection sensors 218c and techniques for object type determination are explained in further detail above. While the controller 226 determines that the first type of object is absent from the operating path, the controller 226 drives the motor 115 according to the trigger 145 (step 325). The controller 226 loops back to step 315 to receive updated signals from the object detection sensor 218c and again determines, in step 320, whether the first type of object is present. When, in step 320, the controller 226 determines that the first type of object is present in the operating path, the controller 226 changes the operation of the motor 115 (step 330). For example, when the controller 226 determines that flesh is in the operating path, the controller 226 interrupts power to the motor 115.
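The loop of steps 315 through 330 can be sketched as follows. Sensor readings are modeled here as an iterable of detected object types (`None` when nothing is detected); the function and command names are illustrative assumptions.

```python
def run_detection_loop(readings, target_type="flesh"):
    """Return the sequence of motor commands issued while processing readings.

    Drives the motor per the trigger while the configured object type is
    absent (step 325), and changes motor operation when it is detected
    (step 330).
    """
    commands = []
    for detected in readings:             # step 315: receive sensor signal
        if detected == target_type:       # step 320: target object present?
            commands.append("interrupt")  # step 330: change motor operation
            break
        commands.append("drive")          # step 325: drive motor per trigger
    return commands
```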
In some embodiments, to reset the operation of the power tool 100 after the controller 226 changes the operation of the motor 115, the power tool 100 is restarted by, for example, reconnecting the power source (e.g., the battery pack). In other embodiments, the operation of the power tool 100 may be reset by, for example, releasing the trigger 145 and re-engaging the trigger 145.
Thus, embodiments described herein provide, among other things, a power tool including an object detection feature that enables the operation of the power tool to change based on the materials that are present in the operating path of the power tool.
This application is a continuation of U.S. patent application Ser. No. 17/370,192, filed Jul. 8, 2021, which is a division of U.S. patent application Ser. No. 16/115,087, filed Aug. 28, 2018, which claims priority to U.S. Provisional Patent Application No. 62/552,105, filed Aug. 30, 2017, the entire content of each of which is hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
6080041 | Greenland | Jun 2000 | A |
6536536 | Gass et al. | Mar 2003 | B1 |
6813983 | Gass et al. | Nov 2004 | B2 |
6826988 | Gass et al. | Dec 2004 | B2 |
6834730 | Gass et al. | Dec 2004 | B2 |
6857345 | Gass et al. | Feb 2005 | B2 |
6880440 | Gass et al. | Apr 2005 | B2 |
6920814 | Gass et al. | Jul 2005 | B2 |
6922153 | Pierga et al. | Jul 2005 | B2 |
6945148 | Gass et al. | Sep 2005 | B2 |
6945149 | Gass et al. | Sep 2005 | B2 |
6957601 | Gass et al. | Oct 2005 | B2 |
6994004 | Gass et al. | Feb 2006 | B2 |
6997090 | Gass et al. | Feb 2006 | B2 |
7000514 | Gass et al. | Feb 2006 | B2 |
7024975 | Gass et al. | Apr 2006 | B2 |
7055417 | Gass | Jun 2006 | B1 |
7077039 | Gass et al. | Jul 2006 | B2 |
7093668 | Gass et al. | Aug 2006 | B2 |
7098800 | Gass | Aug 2006 | B2 |
7100483 | Gass et al. | Sep 2006 | B2 |
7121358 | Gass et al. | Oct 2006 | B2 |
7137326 | Gass et al. | Nov 2006 | B2 |
7171879 | Gass et al. | Feb 2007 | B2 |
7197969 | Gass et al. | Apr 2007 | B2 |
7210383 | Gass et al. | May 2007 | B2 |
7225712 | Gass et al. | Jun 2007 | B2 |
7228772 | Gass | Jun 2007 | B2 |
7231856 | Gass et al. | Jun 2007 | B2 |
7284467 | Gass et al. | Oct 2007 | B2 |
7290472 | Gass et al. | Nov 2007 | B2 |
7308843 | Gass et al. | Dec 2007 | B2 |
7328752 | Gass et al. | Feb 2008 | B2 |
7347131 | Gass | Mar 2008 | B2 |
7350444 | Gass et al. | Apr 2008 | B2 |
7350445 | Gass et al. | Apr 2008 | B2 |
7353737 | Gass et al. | Apr 2008 | B2 |
7357056 | Gass et al. | Apr 2008 | B2 |
7359174 | Gass | Apr 2008 | B2 |
7377199 | Gass et al. | May 2008 | B2 |
7421315 | Gass et al. | Sep 2008 | B2 |
7421932 | Heinzmann et al. | Sep 2008 | B1 |
7472634 | Gass et al. | Jan 2009 | B2 |
7481140 | Gass et al. | Jan 2009 | B2 |
7509899 | Gass et al. | Mar 2009 | B2 |
7525055 | Gass et al. | Apr 2009 | B2 |
7536238 | Gass | May 2009 | B2 |
7540334 | Gass et al. | Jun 2009 | B2 |
7591210 | Gass et al. | Sep 2009 | B2 |
7600455 | Gass et al. | Oct 2009 | B2 |
7610836 | Gass et al. | Nov 2009 | B2 |
7621205 | Gass | Nov 2009 | B2 |
7628101 | Knapp et al. | Dec 2009 | B1 |
7640835 | Gass | Jan 2010 | B2 |
7640837 | Gass et al. | Jan 2010 | B2 |
7644645 | Gass et al. | Jan 2010 | B2 |
7647752 | Magnell | Jan 2010 | B2 |
7661343 | Gass et al. | Feb 2010 | B2 |
7681479 | Gass et al. | Mar 2010 | B2 |
7685912 | Gass et al. | Mar 2010 | B2 |
7698976 | Gass | Apr 2010 | B2 |
7707918 | Gass et al. | May 2010 | B2 |
7707920 | Gass et al. | May 2010 | B2 |
7712403 | Gass et al. | May 2010 | B2 |
7739934 | Tetelbaum et al. | Jun 2010 | B2 |
7784507 | Gass et al. | Aug 2010 | B2 |
7788999 | Gass et al. | Sep 2010 | B2 |
7789002 | Gass et al. | Sep 2010 | B2 |
7804204 | Shafer et al. | Sep 2010 | B1 |
7827889 | Carrier | Nov 2010 | B2 |
7827890 | Gass et al. | Nov 2010 | B2 |
7827893 | Gass et al. | Nov 2010 | B2 |
7832314 | Gass | Nov 2010 | B2 |
7836804 | Gass | Nov 2010 | B2 |
7845258 | Gass et al. | Dec 2010 | B2 |
7866239 | Gass et al. | Jan 2011 | B2 |
7888826 | Shafer et al. | Feb 2011 | B1 |
7895927 | Gass | Mar 2011 | B2 |
7900541 | Gass et al. | Mar 2011 | B2 |
7908950 | Gass et al. | Mar 2011 | B2 |
7921754 | Gass et al. | Apr 2011 | B2 |
7958806 | Gass et al. | Jun 2011 | B2 |
7971613 | Gass et al. | Jul 2011 | B2 |
7991503 | Gass | Aug 2011 | B2 |
7997176 | Gass et al. | Aug 2011 | B2 |
8006595 | Gass | Aug 2011 | B2 |
8011279 | Gass et al. | Sep 2011 | B2 |
8051759 | Gass et al. | Nov 2011 | B2 |
8061245 | Gass | Nov 2011 | B2 |
8061246 | Gass et al. | Nov 2011 | B2 |
8065943 | Gass et al. | Nov 2011 | B2 |
8074546 | Knapp et al. | Dec 2011 | B1 |
8079292 | Gass et al. | Dec 2011 | B2 |
8079295 | Gass | Dec 2011 | B2 |
8087438 | Gass | Jan 2012 | B2 |
8100039 | Gass | Jan 2012 | B2 |
8122798 | Shafer et al. | Feb 2012 | B1 |
8122807 | Gass et al. | Feb 2012 | B2 |
8151675 | Gass et al. | Apr 2012 | B2 |
8186253 | Tetelbaum et al. | May 2012 | B2 |
8186255 | Gass et al. | May 2012 | B2 |
8191450 | Gass | Jun 2012 | B2 |
8196499 | Gass | Jun 2012 | B2 |
8246059 | Gass et al. | Aug 2012 | B2 |
8266997 | Gass et al. | Sep 2012 | B2 |
8291797 | Gass et al. | Oct 2012 | B2 |
8371196 | Gass et al. | Feb 2013 | B2 |
8386067 | Krapf | Feb 2013 | B2 |
8402869 | Gass et al. | Mar 2013 | B2 |
8408106 | Gass | Apr 2013 | B2 |
8413559 | Gass | Apr 2013 | B2 |
8424429 | Knapp et al. | Apr 2013 | B1 |
8430005 | Gass et al. | Apr 2013 | B2 |
8438958 | Gass et al. | Apr 2013 | B2 |
8459157 | Gass et al. | Jun 2013 | B2 |
8469067 | Gass et al. | Jun 2013 | B2 |
8489223 | Gass | Jul 2013 | B2 |
8490527 | Gass et al. | Jul 2013 | B2 |
8498732 | Gass | Jul 2013 | B2 |
8505424 | Gass et al. | Aug 2013 | B2 |
8511693 | Gass et al. | Aug 2013 | B2 |
8522655 | Gass et al. | Sep 2013 | B2 |
8534174 | Kajita et al. | Sep 2013 | B2 |
8640583 | Pierga et al. | Feb 2014 | B2 |
8646369 | Gass et al. | Feb 2014 | B2 |
8919231 | Butler et al. | Dec 2014 | B2 |
8950305 | Shiban | Feb 2015 | B1 |
9038515 | Gass | May 2015 | B2 |
9522476 | Gass | Dec 2016 | B2 |
9555491 | Gass et al. | Jan 2017 | B2 |
9623498 | Gass et al. | Apr 2017 | B2 |
9702504 | Pierga et al. | Jul 2017 | B2 |
9724840 | Gass | Aug 2017 | B2 |
9757871 | Burke et al. | Sep 2017 | B2 |
9844891 | Gass et al. | Dec 2017 | B2 |
9877410 | Teraki et al. | Jan 2018 | B2 |
9878380 | Gass et al. | Jan 2018 | B2 |
9908189 | Gass et al. | Mar 2018 | B2 |
9919369 | Gass et al. | Mar 2018 | B2 |
9925683 | Gass et al. | Mar 2018 | B2 |
9927796 | Gass | Mar 2018 | B2 |
9937573 | Haldar | Apr 2018 | B2 |
9962778 | Talesky et al. | May 2018 | B2 |
9969014 | Gass | May 2018 | B2 |
9981326 | Gass et al. | May 2018 | B2 |
20020017179 | Gass et al. | Feb 2002 | A1 |
20020066346 | Gass et al. | Jun 2002 | A1 |
20030015253 | Gass et al. | Jan 2003 | A1 |
20030019341 | Gass et al. | Jan 2003 | A1 |
20030037651 | Gass et al. | Feb 2003 | A1 |
20030056853 | Gass et al. | Mar 2003 | A1 |
20030131703 | Gass et al. | Jul 2003 | A1 |
20030140749 | Gass et al. | Jul 2003 | A1 |
20040040426 | Gass et al. | Mar 2004 | A1 |
20040200329 | Sako | Oct 2004 | A1 |
20040265079 | Dils et al. | Dec 2004 | A1 |
20050041359 | Gass | Feb 2005 | A1 |
20050139056 | Gass et al. | Jun 2005 | A1 |
20050139459 | Gass et al. | Jun 2005 | A1 |
20060123960 | Gass et al. | Jun 2006 | A1 |
20060123964 | Gass et al. | Jun 2006 | A1 |
20060197020 | Trzecieski | Sep 2006 | A1 |
20060219076 | Gass et al. | Oct 2006 | A1 |
20060225551 | Gass et al. | Oct 2006 | A1 |
20070028733 | Gass | Feb 2007 | A1 |
20070157784 | Gass et al. | Jul 2007 | A1 |
20080110653 | Zhang et al. | May 2008 | A1 |
20080178722 | Gass et al. | Jul 2008 | A1 |
20080295660 | Gass et al. | Dec 2008 | A1 |
20090114070 | Gass | May 2009 | A1 |
20090174162 | Gass et al. | Jul 2009 | A1 |
20090178524 | Gass et al. | Jul 2009 | A1 |
20100083804 | Gass et al. | Apr 2010 | A1 |
20100180741 | Gass et al. | Jul 2010 | A1 |
20100213018 | Gass et al. | Aug 2010 | A1 |
20110048197 | Winkler | Mar 2011 | A1 |
20110056351 | Gass et al. | Mar 2011 | A1 |
20110138978 | Gass et al. | Jun 2011 | A1 |
20110203438 | Nenadic et al. | Aug 2011 | A1 |
20120216665 | Gass et al. | Aug 2012 | A1 |
20140260852 | Laliberte | Sep 2014 | A1 |
20140290455 | Gass | Oct 2014 | A1 |
20140331833 | Gass et al. | Nov 2014 | A1 |
20140360819 | Gass et al. | Dec 2014 | A1 |
20150075342 | Butler et al. | Mar 2015 | A1 |
20150107427 | Gass et al. | Apr 2015 | A1 |
20150151371 | Gass et al. | Jun 2015 | A1 |
20150165641 | Gass et al. | Jun 2015 | A1 |
20150217421 | Gass | Aug 2015 | A1 |
20150273723 | Gass et al. | Oct 2015 | A1 |
20150273725 | Gass et al. | Oct 2015 | A1 |
20150283630 | Gass et al. | Oct 2015 | A1 |
20150321271 | Nenadic et al. | Nov 2015 | A1 |
20150321365 | Lauritsen | Nov 2015 | A1 |
20160046034 | Burke et al. | Feb 2016 | A1 |
20160082529 | Gass et al. | Mar 2016 | A1 |
20160158959 | Gass et al. | Jun 2016 | A9 |
20160263769 | Laliberte et al. | Sep 2016 | A1 |
20160318142 | Maharshi Ramaswamy et al. | Nov 2016 | A1 |
20160319989 | Ramaswamy et al. | Nov 2016 | A1 |
20160346849 | Gass | Dec 2016 | A1 |
20170190012 | Gass | Jul 2017 | A9 |
20170190041 | Dey, IV et al. | Jul 2017 | A1 |
20170216986 | Dey, IV et al. | Aug 2017 | A1 |
20170312837 | Gass et al. | Nov 2017 | A1 |
20170334087 | Gass | Nov 2017 | A1 |
20170368710 | Burke et al. | Dec 2017 | A1 |
20190063679 | Mergener | Feb 2019 | A1 |
20230356341 | Volpert | Nov 2023 | A1 |
Number | Date | Country |
---|---|---|
102187176 | Sep 2011 | CN |
102574281 | Jul 2012 | CN |
103118844 | May 2015 | CN |
102007032221 | Jan 2009 | DE |
102008055062 | Jun 2010 | DE |
202008018113 | Dec 2011 | DE |
2331905 | May 2012 | EP |
2826604 | Jan 2015 | EP |
2621692 | Jul 2016 | EP |
2010027598 | Mar 2010 | WO |
2010059786 | May 2010 | WO |
2012044377 | Apr 2012 | WO |
2014102811 | Jul 2014 | WO |
2015073405 | May 2015 | WO |
2015091245 | Jun 2015 | WO |
2017093877 | Jun 2017 | WO |
Entry |
---|
International Search Report and Written Opinion for Application No. PCT/US2018/048344 dated Dec. 26, 2018 (23 pages). |
Extended European Search Report for Application No. 18850890.7 dated Apr. 7, 2021 (9 pages). |
Chinese Patent Office Action for Application No. 201880057191.1 dated Aug. 10, 2022 (23 pages including machine English Translation). |
Number | Date | Country | |
---|---|---|---|
20230400150 A1 | Dec 2023 | US |
Number | Date | Country | |
---|---|---|---|
62552105 | Aug 2017 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16115087 | Aug 2018 | US |
Child | 17370192 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17370192 | Jul 2021 | US |
Child | 18332963 | US |