Embodiments described herein relate to a touch-based control device that detects touch input.
Home control systems, such as lighting control systems used for lighting fixtures, include binary analog switches and analog dimmer switches that enable users to control one or more lights wired to the electrical box to which such switches are connected. Furthermore, when a person wishes to activate or interact with home systems, the person typically must interact with an actual device of the system, or with a dedicated or universal remote control, and manually create an environment comprising activated or dimmed lights, audio system output, visual system output (e.g., a television or digital picture frame output), temperature, and the like.
Touch-sensitive sensing technology has emerged as a popular human interface for many types of devices. Devices such as portable computers, media players, and IoT devices provide capacitive touch-sensing technology as a mechanism for enabling user input. Typically, touch-sensing functionality is implemented through use of a circuit board that distributes capacitive elements over an area of the circuit board.
Typically, the sensing layer 510 has substantially the same dimensions as the underlying reference plane 506. Additionally, the electric field to which each sensing element is reactive directly overlays the respective sensing element. Further, the sensing logic is tuned to interpret fluctuations in the electric field directly overlaying an individual sensing element as touch-input on the external surface 502. As a result, blind spots 512 can exist over portions of the external surface 502 that overlay (i) regions of the circuit board where no capacitive touch-sensing elements exist, and (ii) underlying regions where the circuit board itself is absent (e.g., regions beyond the perimeter of the circuit board). If a user touches the external surface 502 where there is a blind spot 512, the touch-input is not detected or registered as an input. As a result, under such conventional approaches, areas which are located beyond a perimeter of a touch region are not responsive to touch-input. Likewise, areas of the external surface 502 which overlay a region of a circuit board where no touch sensors are provided (e.g., a region of the circuit board where an antenna structure is provided) may also be non-responsive under conventional approaches.
The disclosure herein is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like reference numerals refer to similar elements, and in which:
Embodiments include a touch-based control device that can detect touch input on regions of an exterior panel without so-called “blind spots.” In some examples, a touch-based control device is operable to detect touch input over an entirety of the control device's exterior, including at regions of an exterior panel that (i) overlay portions of a circuit board where no touch sensors exist, and/or (ii) extend beyond a perimeter of a touch region or circuit board on which touch sensors are provided.
In some variations, the control device is substantially touch-sensitive across an area of an external panel (e.g., the device is “touch-anywhere” responsive). Thus, the user can provide touch-input (e.g., tap) or initiate a gesture on a perimeter or corner region of the external panel to register input. Moreover, the touch-input on the perimeter or corner region of the external panel can be detected and processed so as to have the same effect as a touch-input that is more towards the center of the device.
The term “substantially” as used with examples herein means at least 90% of an indicated quantity (e.g., such as an area of an exterior panel).
In some examples, the control device is a wall-mounted device that can operate as or similar to a light switch to control one or multiple different devices (e.g., illumination device, smart device, etc.).
In examples, the control device operates to: (a) interpret any one of multiple possible touch-inputs of a user, and (b) generate one or more output control signals that are pre-associated with the interpreted touch-input. In examples, the multiple possible touch-inputs can correspond to gestures, such as tap actions (e.g., single, double, or triple (or more) tap), one or more long touches, and slides (e.g., continuous touching motion of user in leftward, rightward, upward or downward direction). Still further, gestures can include more complex motions, such as multi-directional inputs, or patterned inputs (e.g., long touch followed by short tap).
In some examples, the control device is a home device controller, such as an intelligent light switch that includes functionality for switching and dimming lights of a dwelling, as well as controlling operation of one or more connected devices (e.g., ceiling fan, thermostat, appliance, etc.). In such implementations, the control device can connect to the mains of the house, and/or communicate with connected devices through a wireless or wireline network or communication interface. A user can use touch-input to interact with the control device to control the operation of the connected devices.
In at least some examples, a control device includes at least one groove formed on the exterior panel to receive at least a first type of input. The groove can correlate to a region where one type of input is recognized or favored over another type of input. Additionally, the touch-sensitive panel includes one or more surrounding regions that are capable of receiving at least a second type of input.
Accordingly, in response to receiving touch-input, the control device performs an output function, such as to control one or more connected devices, either wired or wirelessly, based on the interpreted touch-input.
In variations, the control device 100 can connect to and control other types of connected devices, such as ceiling fans, appliances, thermostats, audio/video devices, etc. In such variations, the sensing capabilities of the control device 100 can detect and interpret different inputs that result in output signals to control operation of such other types of devices (e.g., on/off, speed control for ceiling fan, connected device setting, etc.).
In examples as shown, control device 100 includes an exterior panel 110 to receive touch-input from a user. The exterior panel 110 can include one or multiple sensing zones where the control device 100 detects, or is more likely to detect particular types of touch-inputs. In some examples, the exterior panel 110 includes a groove 116, corresponding to a vertically elongated indentation or recess on the exterior panel 110. As described in greater detail, groove 116 is an example of a zone or region on exterior panel 110 where a particular type of touch-input is detectable, or more likely to be detected, as compared to a non-groove region of the exterior panel 110. In variations, the control device may not have a groove 116. For example, the control device 100 may be planar. Furthermore, the control device can include other types of input/output features, such as a display.
As shown, the control device 100 can be installed as a wall-mounted light controller for a room. In such implementations, the control device 100 can control (i) one or more connected lights (e.g., set of lights in a room) using electrical and switching components, and/or (ii) other connected devices using wireline or wireless communications. As a light controller, for example, the control device 100 can interpret a first user gesture as on/off input, in which case the control device 100 can cause a connected light to switch on or off. Additionally, the control device 100 can interpret a second gesture as dimming control, in which case the control device can implement dimming control on a connected light. The on/off and/or dimming control can be implemented through the control device 100 interfacing with electrical and switching components that connect to the target lights. In variations, the control device 100 can detect touch-input to similarly control other types of connected devices using a wireline or wireless channel.
According to examples, the control module 120 includes components for implementing touch sensing functionality at any location of the exterior panel 110, including on perimeter and corner regions of the exterior panel 110. The control module can respond to touch-inputs by signaling commands or control output signals to control switching and/or settings for operating lights 25 and/or other connected target device(s) 30.
In some examples, the control module 120 interprets touch-input by (i) identifying locations on the exterior panel 110 where the touch occurred, and (ii) interpreting the touch as a particular type of input (e.g., gesture input). By identifying the locations on the exterior panel 110 where touch-input occurred, the control module 120 can modulate the sensitivity of the detection, so as to better detect touch-input on regions of the exterior panel which do not overlay sensing elements of the control module 120.
The control module 120 can implement a control operation or function in response to detecting and interpreting touch-input. For connected lights, the control operation or function can include (i) switching the connected light(s), (e.g., from on to off or vice versa), and (ii) dimming the connected light from high-to-low or low-to-high. For other target devices, the control operation can include communicating wirelessly to switch the target device power state and/or implement a setting or other control parameter (e.g., raise volume).
In some examples, the control module 120 distinguishes different touch-inputs and further selects commands or output signals based on the type of touch-input and/or attributes of the detected touch-input. In examples, the control module 120 can define a gesture based on one or more characteristics of the touch-input, including (i) an amount of movement which occurred while the user contacted the panel 110, (ii) a path or linearity of the movement, (iii) a duration of the contact, (iv) a location on the panel where the touch was detected (or initiated and/or completed), and/or (v) whether another touch was detected on the external panel 110 within a given time interval after removal of a prior touch or contact with the external panel 110 (e.g., such as in the case of a double tap). Additionally, in variations, the control module 120 can determine which control operations to perform based on the detected gesture and a scene or setting associated with the detected gesture.
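By way of a hypothetical illustration, the characteristics above can be reduced to a simple classifier. The following Python sketch is illustrative only; the attribute names, numeric thresholds, and gesture labels are assumptions, not part of the disclosure:

```python
from dataclasses import dataclass

# Hypothetical attribute record for one touch contact; field names and
# units are illustrative, not taken from the disclosure.
@dataclass
class TouchAttributes:
    travel_mm: float          # amount of movement while in contact
    linearity: float          # 0..1, how straight the movement path was
    duration_ms: float        # how long the contact lasted
    start_xy: tuple           # location where the contact began
    gap_since_last_ms: float  # time since a prior contact ended

DOUBLE_TAP_WINDOW_MS = 300    # assumed interval for chaining taps

def classify_gesture(t: TouchAttributes) -> str:
    """Map raw touch attributes to a gesture label."""
    if t.travel_mm < 2.0:                       # essentially stationary
        if t.duration_ms >= 500:
            return "long_touch"
        if t.gap_since_last_ms <= DOUBLE_TAP_WINDOW_MS:
            return "double_tap"
        return "tap"
    if t.linearity > 0.8:                       # mostly straight movement
        return "slide"
    return "complex_motion"
```

A patterned input such as a long touch followed by a short tap could then be recognized by sequencing these labels over time.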
As described with various examples, the control device 100 can detect and respond to touch-input that is received at any location along the surface of the panel 110. Accordingly, the control device 100 can receive and interpret touch-input based on attributes of the touch-input, where the attributes include (as described in other examples) the distance of touch travel (or starting and completion points), the path of touch travel, the duration of touch travel, etc. Additionally, as the control device 100 is “touch-anywhere” responsive, the control module 120 can (i) interpret a user gesture, and/or (ii) select a scene (e.g., predetermined settings or actions to perform) or operation to perform based on the detected location(s) of the touch-input.
With reference to
When installed, the exterior panel 110 can mount directly over or in close proximity to the sensing layer 210, such that the individual sensing elements of the sensing layer 210 can detect fluctuations in an electric field caused by introduction of a capacitive object, such as a human finger which inherently carries capacitance. With reference to
Still further, in some implementations, the reference plane 220 can include one or more sensor void regions 244 that are intended to accommodate design aspects of the sensing layer 210. For example, the control module 120 can include a sensor void region 244 where no sensing elements are provided, so as to prevent interference with an antenna element of a wireless transceiver 234.
With reference to
Accordingly, with reference to
Additionally, examples provide that the sensing control logic 230 can implement logic that is specific to a particular area or location of contact on the exterior panel. In some examples, the sensitivity of the sensing control logic 230 in how it interprets raw sensor data generated from the sensing layer 210 can be tuned based on the location (e.g., X/Y coordinates) of the touch contact. For example, to detect touch contact that occurs over structure void regions 242, sensor void regions 244, and/or perimeter regions 246, the sensing control logic 230 can implement a lower threshold variance as between the detected capacitance and a baseline level for sensing layer 210. Moreover, the sensing control logic 230 may determine different types of touch-input based on the location of the touch contact (e.g., X/Y coordinate). For example, the sensing control logic 230 may detect a touch-input as a stroke or movement when the touch-input overlaps with the groove 116. As another example, the sensing control logic 230 can detect a touch-input as a tap, or double tap, when the touch-input occurs over one of the structure void regions.
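The location-specific tuning described above might, for illustration, be sketched as follows. The region coordinates, threshold factor, and function names are assumptions invented for the example:

```python
# Illustrative panel regions in assumed X/Y coordinates.
GROOVE = {"x": (40, 60), "y": (10, 90)}        # vertical groove region
VOID_REGIONS = [{"x": (0, 10), "y": (0, 10)}]  # e.g., antenna cutout

def in_region(x, y, r):
    return r["x"][0] <= x <= r["x"][1] and r["y"][0] <= y <= r["y"][1]

def trigger_threshold(x, y, base=100):
    """Lower the capacitance-variance threshold for touch contact over
    regions with no sensing element beneath them, raising sensitivity."""
    if any(in_region(x, y, r) for r in VOID_REGIONS):
        return base * 0.6
    return base

def favored_input(x, y):
    """Bias interpretation by contact location: the groove favors
    strokes/slides, while void regions favor taps."""
    if in_region(x, y, GROOVE):
        return "slide"
    if any(in_region(x, y, r) for r in VOID_REGIONS):
        return "tap"
    return "any"
```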
In examples, the sensing control logic 230 includes detection logic 310 which can continuously monitor the sensor signals 301 to detect the occurrence of a touch-input. The detection logic 310 can detect a touch-input as a change in a value of one or more sensor signals 301, where the change is in reference to the baseline or noise signal value for the sensing element. In examples, the detection logic 310 can register a touch-input when the value of one or more sensor signals 301 varies from the baseline by more than a given minimum threshold (“touch trigger threshold”).
In variations, the detection logic 310 can implement additional conditions for registering changes in values of the sensor signals 301 as touch-input. By way of example, the additional conditions can include (i) a minimum threshold number of sensing elements that generate sensor signals 301 which vary from the baseline by more than the touch trigger threshold, and (ii) a minimum threshold time interval during which the change in the sensor signals 301 was detected.
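A minimal sketch of the touch trigger threshold together with the additional conditions above, assuming frame-based sampling of the sensing elements; all parameter values are illustrative:

```python
def touch_detected(samples, baselines, trigger=8.0,
                   min_elements=2, min_duration_frames=3):
    """samples: a list of frames, each frame a list of per-element
    capacitance readings. A touch registers only if, for at least
    `min_duration_frames` consecutive frames, at least `min_elements`
    elements deviate from their baseline by more than `trigger`."""
    consecutive = 0
    for frame in samples:
        deviating = sum(
            1 for value, base in zip(frame, baselines)
            if abs(value - base) > trigger
        )
        consecutive = consecutive + 1 if deviating >= min_elements else 0
        if consecutive >= min_duration_frames:
            return True
    return False
```

Small single-element fluctuations thus fail both conditions and are treated as noise rather than touch-input.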
Additionally, in detecting touch-inputs, the detection logic 310 can implement calibration or sensitivity adjustments that are specific to the location of a sensing element. The calibration or sensitivity adjustments can be made in the context of determining whether a value of a sensor signal 301, individually or in combination with other signals, is indicative of touch input as opposed to noise. In examples, the detection logic 310 incorporates calibration or sensitivity adjustments for sensor signals 301 of sensing elements that are adjacent or proximate to a location of the touch region 225 which does not directly overlay any sensing element. For example, sensor signals 301 that are generated adjacent or proximate to one of the structure void regions 242, sensor void regions 244 and/or perimeter regions 246 of the circuit board can be calibrated to reflect greater sensitivity as compared to sensor signals 301 that are generated from a region of the sensor layer which directly coincides with the presence of one or multiple sensing elements. The detection logic 310 can, for example, vary the touch trigger threshold for individual sensing elements based on the location of the respective sensing elements, with the touch trigger threshold being less for those sensing elements that are proximate to one of the structure void regions 242, sensor void regions 244 and/or perimeter regions 246. In this way, the detection logic 310 can be more sensitive to touch-inputs which occur on locations of the touch region 225 that do not, for example, overlay a sensing element (e.g., a location beyond the perimeter edge of PCB 202).
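The per-element sensitivity adjustment described above can be pictured with a short sketch; the `near_void` layout flag and the numeric factors are assumptions for illustration:

```python
def per_element_thresholds(elements, base_threshold=10.0, void_factor=0.5):
    """elements: list of dicts like {"id": 3, "near_void": True}, an
    assumed description of the sensing-layer layout. Elements adjacent
    to a structure void, sensor void, or perimeter region get a reduced
    touch trigger threshold, i.e., greater sensitivity. Returns a
    mapping from element id to its trigger threshold."""
    return {
        e["id"]: base_threshold * (void_factor if e["near_void"] else 1.0)
        for e in elements
    }
```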
Still further, some examples recognize that a touch-input can impact the sensor signals 301 of multiple sensing elements (e.g., a cluster) at one time, and over a given time interval during which the touch-input occurred, the number of sensing elements and the degree to which they are impacted may vary based on attributes of the touch-input. In determining whether a touch-input occurs, the detection logic 310 can process the sensor signals 301 for attributes which are indicative of a potential touch event, and the attributes can be analyzed to determine whether a touch-input occurred. The attributes can reflect, for example, (i) the number of sensing elements which are impacted by a touch-input, (ii) the variation amongst the modulated sensor signals 301, (iii) the degree and/or duration to which the sensor signals 301 are modulated, and/or (iv) the location of the sensing elements that generated the modulated sensor signals 301. The detection logic 310 can incorporate calibration or sensitivity adjustments based on the location of the sensing elements from which the respective modulated sensor signals 301 are detected. In some examples, the calibration or sensitivity adjustments can include weighting one or more attributes that are determined from sensor signals 301 that are near a void or perimeter region where no other sensing element is provided. As an addition or variation, the detection logic 310 can pattern match detected attributes of sensor signals 301, such as by (i) representing attributes of a number of modulated signals as a feature vector, and (ii) comparing the determined feature vector with known feature vectors that are labeled to reflect input or no input (or alternatively, a particular type of input). In this way, the detection logic 310 can associate a touch-input with attributes such as its location at multiple instances of time during the interval when the touch-input was detected.
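The feature-vector pattern matching described above might, for illustration, be sketched as a nearest-neighbor comparison; the feature layout, normalization, and labeled vectors are invented for the example:

```python
import math

# Labeled reference vectors: (num_elements, mean_deviation, duration),
# each normalized to 0..1 ad hoc; labels and values are illustrative.
LABELED = [
    ((0.1, 0.05, 0.1), "no_input"),   # weak, brief modulation = noise
    ((0.6, 0.7, 0.2), "tap"),
    ((0.5, 0.6, 0.9), "long_touch"),
]

def match_touch(feature_vector):
    """Nearest-neighbor match of a detected attribute vector against
    known, labeled feature vectors."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(LABELED, key=lambda lv: dist(feature_vector, lv[0]))[1]
```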
In examples, the sensing control logic 230 may also include touch interpretation logic 320, which can associate the detected attributes of the touch-input with an input type and/or value. By way of example, the determined input types or values can correspond to a single-tap, double-tap, long touch, slide or swipe, etc. In some variations, the input type and/or value can also be associated with one or more location values. For example, a touch-input in a first region of the touch region 225 may be interpreted differently than the same touch-input in a second region of the touch region 225.
In examples, the sensing control logic 230 can include correlation logic 330 to correlate the sensor change value, the detected attributes and the input type to an output signal 305. The output signal 305 can be selected for one of multiple connected devices 325 (e.g., light(s) 25, target device(s) 30). Additionally, the output signal 305 can specify a setting or command based on the connected device 325. In some variations, the output signal can be specific to the type or functionality of the connected device 325.
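The correlation of an input type to an output signal 305 for a connected device 325 can be pictured as a lookup, as in the following illustrative sketch; the device types and command names are assumptions, not part of the disclosure:

```python
# Hypothetical command table keyed by (device type, input type); the
# selected command would be carried by output signal 305 over a wired
# or wireless channel.
COMMAND_TABLE = {
    ("light", "tap"): "toggle_power",
    ("light", "slide_up"): "dim_up",
    ("light", "slide_down"): "dim_down",
    ("fan", "tap"): "toggle_power",
    ("fan", "slide_up"): "speed_up",
    ("speaker", "slide_up"): "volume_up",
}

def output_signal(device_type, input_type):
    """Select an output command for a connected device based on the
    interpreted input type; unmapped pairs yield no signal."""
    return COMMAND_TABLE.get((device_type, input_type))
```

This reflects how the same gesture (e.g., a tap) can map to device-specific settings or commands depending on the type or functionality of the connected device 325.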
Methodology
According to examples, the control module 120 continuously monitors sensor signals 301 generated by sensing elements of the sensing layer 210 (410). The control module 120 can further detect instances when one or multiple sensor signals 301 modulate in a manner that is potentially indicative of a touch-input (420). For example, the control module 120 can detect when the modulating sensor signal(s) 301 exceed a corresponding baseline value by an amount which exceeds the touch trigger threshold.
The control module 120 can process the modulating sensor signals 301 to determine whether a touch input has occurred (430). Further, in making the determination, the control module 120 can implement calibration and/or sensitivity adjustments that are based on the location of the sensor signals 301 (432). In particular, the control module 120 can implement the calibration and/or sensitivity adjustments so that modulated sensor signals 301, resulting from one or multiple sensing elements that are adjacent to a void or perimeter region, can properly be detected and interpreted as touch input.
As an addition or alternative, the control module 120 can analyze modulating sensor signal(s) 301 to identify attributes that include (i) a number of modulating sensing elements, (ii) the variation amongst the modulated sensor signals 301, (iii) the degree and/or duration to which the sensor signals 301 are modulated, and (iv) the location of the modulated sensor signals 301. Additionally, the control module 120 can weight attributes determined from sensing elements that are proximate or adjacent to void or perimeter regions to reflect greater sensitivity, so as to better detect touch-input that occurs over a void or perimeter region.
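The monitoring steps (410)-(432) above can be tied together in one illustrative pass; the weighted decision score, the parameter values, and the function names are assumptions for this sketch:

```python
def monitor_loop(read_frame, baselines, thresholds, weights):
    """One pass of the monitoring pipeline: read a frame of sensor
    signals (410), flag elements modulating beyond their per-element
    threshold (420), apply per-element calibration weights (432), and
    score the result to decide whether a touch occurred (430).
    Elements adjacent to void or perimeter regions would carry lower
    thresholds and higher weights."""
    frame = read_frame()
    score = 0.0
    for i, (value, base) in enumerate(zip(frame, baselines)):
        deviation = abs(value - base)
        if deviation > thresholds[i]:        # element is modulating
            score += deviation * weights[i]  # calibration weighting
    return score >= 1.0                      # assumed decision score
```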
Among other advantages, examples such as described with
It is contemplated for examples described herein to extend to individual elements and concepts described herein, independently of other concepts, ideas or systems, as well as for examples to include combinations of elements recited anywhere in this application. Although examples are described in detail herein with reference to the accompanying drawings, it is to be understood that the concepts are not limited to those precise examples. As such, many modifications and variations will be apparent to practitioners skilled in this art. Accordingly, it is intended that the scope of the concepts be defined by the following claims and their equivalents. Furthermore, it is contemplated that a particular feature described either individually or as part of an example can be combined with other individually described features, or parts of other examples, even if the other features and examples make no mention of the particular feature. Thus, the absence of describing combinations should not preclude claiming rights to such combinations.
This application is a continuation of U.S. patent application Ser. No. 17/142,102, filed on Jan. 5, 2021; which claims benefit of priority to Provisional U.S. Patent Application No. 62/957,302, filed Jan. 5, 2020; the aforementioned applications being hereby incorporated by reference in their respective entireties.
Number | Date | Country | |
---|---|---|---|
20230155587 A1 | May 2023 | US |
Number | Date | Country | |
---|---|---|---|
62957302 | Jan 2020 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17142102 | Jan 2021 | US |
Child | 17987617 | US |