The present invention relates to a touch-sensitive control device, and more particularly to a touch-sensitive control device supporting remote control. The present invention also relates to a touch-sensitive remote-control system.
With the development of interactive electronic devices, particularly portable electronic communication devices such as smart phones and tablet computers, touch sensing has become increasingly popular as a human-machine interface owing to its intuitive and easy manipulation. So far, capacitive touch sensors have been the mainstream of touch sensors.
For devices controlled with switch buttons or keys, mechanical structures are disadvantageous in terms of compactness and maintenance. Furthermore, if the devices are distributed over a relatively large area and need to be controlled as a group, control via a mechanical interface would be difficult. Therefore, a touch-sensitive control device is advantageously applied to a remote-control system.
For example, US Patent Publication No. 2015/0341184 A1 discloses a control device, which may execute an application that presents a custom UI to a user, and relays control commands back to the host controller of a home automation system. A conversion engine may convert a service implementation into a configuration database, a copy of which may also be maintained on the host controller. The configuration database utilizes special logical representations to describe the configuration of the home automation system. To produce a custom UI on a given control device, the configuration database is transferred to (e.g., downloaded by) the control device and encapsulated by a control SDK. The control SDK, among other functionality, provides methods for querying the configuration database. A mobile app executing on the control device utilizes the control SDK to systematically query the configuration database, to retrieve information concerning the logical representations present, and thereby the configuration of the home automation system. The mobile app then translates the returned information into UI elements to create a custom UI of the mobile app, the translation using predefined mappings. The custom UI is displayed on the control device, for use by a user to control the home automation system. The above-described UI elements of the custom UI are shown on the display as a listing of sliders, buttons or knobs, and are user-selectable to indicate desired control of the related entities.
Another US Patent Publication No. 2014/0098247 proposes smart home control using mobile devices, cellular telephones, smart devices and smart phones. Activities in the house may be viewed on the Mobile Device/Mobile Phone, including the current state of various appliances, events, and authorized users with permissions to control and access various appliances. Events may be searched, assigned to, or organized by users in the household. Temporary access to the house or an appliance may be enabled by adding a user and setting a duration of access. The location of individuals may be mapped, geo-fenced, and determined using GPS, Access Point connections and names and locations, network IP address, RFID, NFC, or other location mapping techniques. Each appliance may be mapped to a specific location in the house or office and identified with a description, photo, or internal home map. A slider bar may allow a user to dim lights by making contact with the screen and moving the slider bar from one end to the other.
In the above-described systems, the controlled elements are operated individually. That is, one element is selected and controlled at one time by triggering and moving a corresponding slider bar. Such control mechanisms do not actually take advantage of capabilities of touch-sensing control.
Therefore, the present invention provides a touch-sensitive control device supporting remote control in an intuitive and flexible way.
The present invention further provides a touch-sensitive control device supporting remote control in a grouped manner.
The present invention provides a control device for controlling a controlled system, which includes a plurality of controlled devices allocated in a physical layout. The control device comprises: a touch-sensing and displaying panel, which detects a touching operation or gesture thereon or thereover, generates position information in response to the touching operation or gesture, and displays prompt information according to the position information, wherein the touch-sensing and displaying panel has a default virtual prompt layout corresponding to the physical layout of the controlled devices and consisting of a plurality of default prompts, and the prompt information includes a prompt pattern consisting of a selected portion of the default prompts and is changeable with the position information generated in response to the touching operation or gesture; and a driver in communication with the touch-sensing and displaying panel and the controlled system, which issues a first driving signal to the controlled system according to the position information for triggering a selected group of the controlled devices in compliance with the prompt pattern.
In an embodiment, the default virtual prompt layout is consistent with the physical layout of the controlled devices.
In an embodiment, the selected group of the controlled devices are simultaneously controlled by another touching operation or gesture on or over the touch-sensing and displaying panel.
In an embodiment, the touching operation or gesture includes multiple moves simultaneously or sequentially conducted at multiple positions on or over the touch-sensing and displaying panel to select the default prompts. Alternatively, the touching operation or gesture passes some of the default prompts to define a closed loop, so that the default prompts located inside the closed loop are automatically selected. Preferably, the automatic selection of certain default prompts can be manually cancelled by a further touching operation or gesture thereon or thereover.
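For illustration only, the closed-loop selection described above may be sketched as a point-in-polygon test over the grid of default prompts. The function names (`point_in_polygon`, `select_enclosed`) and the grid representation are assumptions for this sketch, not part of the disclosure.

```python
# Hypothetical sketch of closed-loop prompt selection: prompts whose
# positions fall inside the loop traced by the gesture are selected.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: is point (x, y) inside the closed polygon?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where the polygon edge crosses the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def select_enclosed(prompt_positions, gesture_path):
    """Return indices of default prompts lying inside the closed loop
    traced by the touching operation or gesture."""
    return [i for i, (x, y) in enumerate(prompt_positions)
            if point_in_polygon(x, y, gesture_path)]
```

A manual cancellation could then simply remove an index from the returned list when the corresponding prompt is touched again.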
In an embodiment, the controlled devices are allocated as an array, and are selected from a group consisting of lamps, sprinklers, electrochromic members and electric curtains, and the touch-sensing and displaying panel includes an LED array adaptively emitting light to show the prompt pattern.
In an embodiment, the prompt pattern is a pictorial and/or literal pattern.
According to the present invention, a user can clearly identify the relative positions of the near-end control device and the remote-end controlled devices, thereby supporting remote control by way of touch-sensing means. The operation interface is easy, flexible and intuitive, and the structure is simplified.
The invention will become more readily apparent to those ordinarily skilled in the art after reviewing the following detailed description and accompanying drawings, in which:
The invention will now be described more specifically with reference to the following embodiments. It is to be noted that the following descriptions of preferred embodiments of this invention are presented herein for purpose of illustration and description only. It is not intended to be exhaustive or to be limited to the precise form disclosed.
Referring to
The controlled units 101˜10n may be allocated as an array or in any other form, depending on practical requirements. The controlled units 101˜10n may be similar or different devices. Even if the controlled units 101˜10n are identical, they can still be readily identified according to the present invention, compared to the prior art. The touch-sensing and displaying panel 110 of the touch-sensing control device 11 according to the present invention has a default virtual layout 21 consisting of a plurality of default prompts, which corresponds to the physical layout 20 consisting of the controlled units 101˜10n. In an embodiment, the default virtual layout 21 is consistent or equivalent to the physical layout 20, as illustrated in
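The correspondence between the default virtual layout and the physical layout may be sketched, purely for illustration, as a row-major grid mapping; the function names and panel dimensions below are assumptions, not from the disclosure.

```python
# Illustrative sketch: prompts on the panel lie on the same rows and
# columns as the physical array of controlled units, so a touched
# prompt maps directly to a controlled-unit index.

def touch_to_prompt(x, y, panel_w, panel_h, n_rows, n_cols):
    """Quantize a touch position on the panel into the grid
    coordinate (row, col) of the nearest default prompt."""
    col = min(int(x / panel_w * n_cols), n_cols - 1)
    row = min(int(y / panel_h * n_rows), n_rows - 1)
    return row, col

def prompt_to_unit(row, col, n_cols):
    """Map a prompt's grid coordinate to the index of the controlled
    unit in a row-major physical array 101..10n."""
    return row * n_cols + col
```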
According to the position information or the set of position information, prompt information is displayed on the panel 110. The prompt information may include one or more pictorial and/or literal patterns. Meanwhile, the driver 111, which is in communication with the touch-sensing and displaying panel 110 and the controlled system 10, issues a first driving signal to the controlled system 10 according to the position information or the set of position information, thereby driving one or more of the controlled units 101˜10n associated with the displayed pattern or patterns to conduct a specific operation. For example, in response to a user's touching operation or gesture, corresponding position information is generated and a prompt pattern 21 is displayed on the panel 110. The prompt pattern 21 is preset to correspond to selected controlled units. Accordingly, the driver 111 issues a driving signal to enable a default action of the controlled units.
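The driver step may be sketched as follows; the signal structure and the action name are hypothetical placeholders, since the disclosure does not specify a wire format.

```python
# Hypothetical sketch of the first driving signal issued by the
# driver: the selected group of unit indices plus a default action.

def make_driving_signal(selected_units, action="default_on"):
    """Build the driving signal sent to the controlled system for
    the group of units matching the displayed prompt pattern."""
    return {"units": sorted(selected_units), "action": action}
```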
In this example, the controlled units 101˜10n can be lamps, and selectively triggered to illuminate in a variety of combinations under the control of the control device 11 as described above.
Extensively, the selected controlled unit or units, or all the controlled units, can be fine-tuned as a whole by the control device 11 based on another touching operation or gesture of the user. For example, in response to a sliding operation or gesture, the touch sensor generates shift information. The driver 111 issues a second driving signal to the controlled system 10 to trigger the fine-tuning according to the shift information. For example, when the touch sensor 1101 detects a sliding shift from right to left in a specified or designated region, corresponding shift information is generated. The driver 111 receives the shift information and, in response, issues a driving signal to the controlled system 10 to trigger a fine-tuning operation of the controlled system, e.g. to raise the luminance of all or selected lamps. Conversely, when the touch sensor 1101 detects a sliding shift from left to right in the specified or designated region, corresponding shift information is generated. The driver 111 receives the shift information and, in response, issues a driving signal to the controlled system 10 to trigger another fine-tuning operation of the controlled system, e.g. to lower the luminance of all or selected lamps.
In another example, when the touch sensor 1101 detects a downward sliding shift in the specified or designated region, corresponding shift information is generated. The driver 111 receives the shift information and, in response, issues a driving signal to the controlled system 10 to trigger a fine-tuning operation of the controlled system, e.g. to raise the color temperature of all or selected lamps. Conversely, when the touch sensor 1101 detects an upward sliding shift in the specified or designated region, corresponding shift information is generated. The driver 111 receives the shift information and, in response, issues a driving signal to the controlled system 10 to trigger another fine-tuning operation of the controlled system, e.g. to lower the color temperature of all or selected lamps. The upward or downward sliding shift described herein may be a horizontal shift parallel to the housing surface of the control device 11. Alternatively, with a specifically designed touch sensor, the upward or downward sliding shift described herein may also be a vertical shift normal to the housing surface of the control device 11.
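The two shift-based fine-tuning examples above (horizontal shift for luminance, vertical shift for color temperature) may be sketched together as follows. The gain constants and clamping ranges are assumptions for illustration only.

```python
# Minimal sketch of shift-based fine-tuning. A right-to-left shift
# (dx < 0) raises luminance; a downward shift (dy > 0, in screen
# coordinates) raises color temperature.

def fine_tune(state, dx, dy, lum_gain=0.5, cct_gain=10.0):
    """Apply a sliding shift (dx, dy), in panel pixels, to the
    selected lamps' luminance (percent) and color temperature
    (kelvin), clamped to illustrative operating ranges."""
    lum = state["luminance"] - dx * lum_gain   # leftward slide raises
    cct = state["color_temp"] + dy * cct_gain  # downward slide raises
    return {
        "luminance": max(0.0, min(100.0, lum)),
        "color_temp": max(2700.0, min(6500.0, cct)),
    }
```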
Further extensively, a mode of the selected controlled unit or units, or all the controlled units, can be switched as a whole by the control device 11 based on another touching operation or gesture of the user. For example, in response to a tapping operation or gesture, the touch sensor generates count information. The driver 111 issues a third driving signal to the controlled system 10 to trigger the mode-switching according to the count information. For example, when the touch sensor 1101 detects a specified count of taps, corresponding count information is generated. The driver 111 receives the count information and, in response, issues a driving signal to the controlled system 10 to trigger a mode-switching operation of the controlled system, e.g. to change the colors of all or selected lamps. For example, different tapping counts and/or sequences result in different default colors.
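The count-based mode switching may be sketched as a lookup from tap count to a default color; the color table below is purely illustrative.

```python
# Illustrative tap-count-to-color table (an assumption, not from
# the disclosure): each detected tap count selects a default mode.

TAP_COLORS = {1: "warm_white", 2: "cool_white", 3: "red", 4: "blue"}

def mode_for_taps(count):
    """Map a detected tap count to a default lamp color; an
    unrecognized count leaves the mode unchanged (None)."""
    return TAP_COLORS.get(count)
```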
Further extensively, a specific mode of the selected controlled unit or units, or all the controlled units, can be triggered as a whole by the control device 11 based on another touching operation or gesture of the user. For example, a power-on or power-off control, or a power level control, of the selected controlled unit or units or all the controlled units as a whole can be triggered by the control device 11 based on another touching operation or gesture of the user. For example, in response to a pressing operation or gesture, the touch sensor generates duration information. The driver 111 issues a fourth driving signal to the controlled system 10 to trigger the power-switching or power-adjusting operation according to the duration information. For example, when the touch sensor 1101 detects a pressing duration exceeding a threshold, corresponding duration information is generated. The driver 111 receives the duration information and, in response, issues a driving signal to the controlled system 10 to trigger a power-switching operation of the controlled system, e.g. to power on or power off all or selected lamps, or to change the supplied power level of all or selected lamps.
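The duration-based power control may be sketched as a simple threshold classification; the threshold values and action names are assumptions for illustration.

```python
# Sketch of duration-based power control: a long press toggles
# power, an even longer press steps the supplied power level.
# (Thresholds in seconds are illustrative assumptions.)

def power_action(duration_s, long_press=0.8, very_long=2.0):
    """Classify a press by its duration into a driving action."""
    if duration_s >= very_long:
        return "step_power_level"
    if duration_s >= long_press:
        return "toggle_power"
    return None  # short tap: handled by the count-based logic
```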
In addition to lamps, the controlled units may also be, for example, sprinklers, electrochromic glass members, electric curtains or any other suitable devices or members to be group-controlled. According to the present invention, owing to the clear position correlation between the controlled system and the control device, remote control of the controlled system can be achieved by conducting a touching operation or gesture on or over the control device. The remote control may include simple switch-on and switch-off operations. Furthermore, a variety of fine-tuning operations may also be included through corresponding designs. The parameters to be fine-tuned may include, for example, a sprayed water level of all or selected sprinklers, a transmittance of all or selected electrochromic glass members, an opening level of all or selected electric curtains, etc. The fine-tuning operations are conducted according to shift information generated in response to the user's sliding shifts, as mentioned above. Likewise, a prompt pattern or patterns generated in response to the user's touching operation(s) or gesture(s) and corresponding to the selected controlled unit or units are shown on the display for reference or confirmation.
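For illustration, the same shift information may tune a different parameter depending on the kind of controlled unit. The device names follow the disclosure; the parameter keys and percentage scale are assumptions.

```python
# Illustrative dispatch: one shift-derived delta tunes the parameter
# appropriate to the kind of controlled unit, applied to the whole
# selected group at once.

TUNABLE_PARAM = {
    "lamp": "luminance",
    "sprinkler": "water_level",
    "electrochromic": "transmittance",
    "curtain": "open_level",
}

def tune_group(device_kind, units, delta):
    """Apply the same shift-derived delta (percent) to every
    selected unit, clamped to 0..100."""
    param = TUNABLE_PARAM[device_kind]
    for unit in units:
        unit[param] = max(0.0, min(100.0, unit[param] + delta))
    return units
```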
The touch-sensing and displaying panel 110 may alternatively be implemented with a structure as shown in
In view of the foregoing, by corresponding the configuration of the near-end control device to the layout of the remote-end controlled system, remote control can be achieved via an easy and intuitive touch-sensing interface according to the present invention. Furthermore, the touch-sensing interface avoids the bulkiness and damage problems generally encountered with mechanical structures.
While the invention has been described in terms of what is presently considered to be the most practical and preferred embodiments, it is to be understood that the invention need not be limited to the disclosed embodiment. On the contrary, it is intended to cover various modifications and similar arrangements included within the spirit and scope of the appended claims, which are to be accorded the broadest interpretation so as to encompass all such modifications and similar structures.
Number | Date | Country | Kind |
---|---|---|---|
201410401129.3 | Aug 2014 | CN | national |
The present application is a continuation-in-part application claiming benefit from a pending U.S. patent application bearing Ser. No. 14/827,376 and filed Aug. 17, 2015, the contents of which are incorporated herein by reference.
Number | Date | Country | |
---|---|---|---|
Parent | 14827376 | Aug 2015 | US |
Child | 15674713 | US |