A user may automatically control controllable devices using input devices. For example, the user may control the dimming of different lights, the unlocking or locking of doors, the playing of media programs, etc. using the input device. In one example, the input device may display a user interface that includes a plurality of objects. Each object may represent a controllable device that a user can control automatically.
If a user wants to control a first controllable device, the user would locate a first object on the user interface that corresponds to the first controllable device. For example, the first object may be an icon that is displayed on the user interface. The user would then select the first object and apply a control command that the user desires. For example, a user may turn off a living room light.
If the user wants to perform a subsequent command with a second controllable device, the user would locate a second object on the user interface for the second controllable device. The user would then select the second object and apply the desired control command for the second object. Then, the command is applied to the second controllable device. For example, the user may turn off a bedroom light.
Although the user can automatically control multiple controllable devices, it may be burdensome for the user to serially control multiple controllable devices. That is, for each controllable device, the user must select an object corresponding to each controllable device and individually apply the desired commands via the objects.
Described herein are techniques for applying gestures to group objects. In the following description, for purposes of explanation, numerous examples and specific details are set forth in order to provide a thorough understanding of particular embodiments. Particular embodiments as defined by the claims may include some or all of the features in these examples alone or in combination with other features described below, and may further include modifications and equivalents of the features and concepts described herein.
In one embodiment, a method receives a gesture via a touchscreen of an electronic device. The touchscreen displays a set of objects that are used to control a set of controllable devices. The method then determines the gesture is a command to group a plurality of objects together and joins the plurality of objects as a single group. The plurality of objects corresponds to a plurality of controllable devices. A control to apply to the single group is determined in response to receiving the gesture, where the control is applied to the plurality of controllable devices to cause the plurality of controllable devices to perform a function together.
In one embodiment, a non-transitory computer-readable storage medium is provided that contains instructions that, when executed, control a computer system to be configured for: receiving a gesture via a touchscreen of an electronic device, the touchscreen displaying a set of objects that are used to control a set of controllable devices; determining the gesture is a command to group a plurality of objects together; joining the plurality of objects as a single group, wherein the plurality of objects correspond to a plurality of controllable devices; and associating a control to apply to the single group in response to receiving the gesture, wherein the control is applied to the plurality of controllable devices to cause the plurality of controllable devices to perform a function together.
In one embodiment, a system is provided comprising: a plurality of controllable devices, wherein the plurality of controllable devices correspond to a set of objects that are used to control the plurality of controllable devices via a touchscreen of an input device; and a control device coupled to the plurality of controllable devices, wherein: a set of controllable devices are grouped together into a single group based on a gesture received via the touchscreen of the input device, the touchscreen displaying the set of objects, the control device receives a control to apply to the single group in response to the gesture, and the control device applies the control to the plurality of controllable devices to cause the plurality of controllable devices to perform a function together.
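The three embodiments above share a common flow: receive a gesture, join the corresponding objects as a single group, and associate a control that is applied to every member's controllable device. A minimal sketch of that flow follows; the class and function names (`ControllableDevice`, `DeviceObject`, `Group`, `handle_group_gesture`) are hypothetical illustrations, not the claimed implementation.

```python
from dataclasses import dataclass, field

@dataclass
class ControllableDevice:
    """A controllable device 104 (light, lock, media player, etc.)."""
    name: str
    state: dict = field(default_factory=dict)

@dataclass
class DeviceObject:
    """An on-screen object 110 that represents a controllable device 104."""
    device: ControllableDevice

@dataclass
class Group:
    """A single group 202 of objects joined by a gesture."""
    members: list

    def apply_control(self, function: str) -> None:
        # The control is applied to every member, so the corresponding
        # controllable devices perform the function together.
        for obj in self.members:
            obj.device.state["function"] = function

def handle_group_gesture(objects, function):
    """Join the objects as a single group and associate a control."""
    group = Group(members=list(objects))
    group.apply_control(function)
    return group
```

For example, grouping the objects for two media players and applying a "play playlist" control would cause both devices to carry the same function in their state.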
Input device 102 includes a user interface 106 and a gesture control manager 108. User interface 106 displays objects 110-1-110-4 that correspond to controllable devices 104-1-104-4, respectively. User interface 106 may display each object 110 as an icon or other graphical representation. A user may use an object 110-1 to automatically control controllable device 104-1. Likewise, objects 110-2, 110-3, and 110-4 may be used to control controllable devices 104-2, 104-3, and 104-4, respectively. It will be understood that although a 1:1 relationship of objects 110 to controllable devices 104 is described, a single object 110 may control multiple controllable devices 104. In one embodiment, input device 102 communicates with a gateway 112 to send commands to control controllable devices 104. Gateway 112 may also communicate with a number of control points 114-1-114-2 that may be connected to controllable devices 104. Although this system configuration is described, it will be understood that other systems for distributing commands to controllable devices 104 may be used, such as a system with a single gateway or a single control point.
Particular embodiments allow a user to use a gesture, such as a multi-touch gesture, to combine objects 110 into a group. In one embodiment, a gesture control manager 108 detects a multi-touch gesture on user interface 106 and groups objects 110 together accordingly. When combined into a group, a control is associated with all objects 110 in the group. For example, input device 102 may control all objects 110 in the group where a control command is applied to the group. In another example, when an object 110 is added to a group, a control is automatically applied to object 110. For example, a controllable device 104 corresponding to object 110 is automatically controlled to start playing a media program. The various scenarios will be described in more detail below. Also, other input devices 102 (not shown) may control the group where all control commands are applied to controllable devices 104 corresponding to objects 110 in the group.
A user may then make a gesture that indicates a desire to group the two objects 110-1 and 110-2 together. For example, the user may make a “pinching” gesture that moves objects 110-1 and 110-2 towards each other. In one example, gesture control manager 108 determines that a pinching gesture has been performed when objects 110-1 and 110-2 overlap or touch. However, objects 110-1 and 110-2 do not need to touch for gesture control manager 108 to determine a pinching gesture. For example, gesture control manager 108 may analyze a speed of pinching and determine a pinching gesture has been performed when the speed of movement of objects 110-1 and 110-2 is above a threshold. Other ways of forming the group may be used. For example, the user may touch both objects 110-1 and 110-2 and then indicate the desire for grouping by pressing a separate button or icon to indicate the desire to create a group.
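One way to realize the speed-based determination described above is to track the two touch points over an interval and test whether they close on each other faster than a threshold. The sketch below is illustrative only; the sampling interval and the threshold value are assumptions, not values taken from this disclosure.

```python
import math

def is_pinch(p1_start, p1_end, p2_start, p2_end, dt, speed_threshold=200.0):
    """Return True when two touch points approach each other faster than
    speed_threshold (in pixels per second) over the interval dt (seconds)."""
    d_start = math.dist(p1_start, p2_start)  # separation at the start
    d_end = math.dist(p1_end, p2_end)        # separation at the end
    closing_speed = (d_start - d_end) / dt   # positive when moving together
    return closing_speed > speed_threshold
```

With such a test, objects 110-1 and 110-2 need not overlap or touch; a fast inward motion alone can be interpreted as the pinching gesture.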
After forming group 202-1, gesture control manager 108 associates a control for the group that applies to all objects 110 of the group. A control may apply a function for objects 110-1 and 110-2 to perform. For example, a command to play a football game or play a playlist of songs is applied to all objects 110 of group 202-1, and thus causes corresponding controllable devices 104-1 and 104-2 to start playing the football game or play the playlist.
The above gesture forms a new group; however, objects 110 may be moved into an already existing group 202.
As shown, an area 302 may be formed using the three areas of touch detected from the user's fingers on user interface 106. The areas of touch may or may not contact an object 110. Gesture control manager 108 may detect the touch and area 302 using known methods. Once area 302 is detected, gesture control manager 108 then determines objects 110 within the area. In this case, objects 110-1, 110-2, and 110-3 are found within area 302. The objects within the area may be objects that are totally within the area, objects that are only partially within the area, or both.
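A simple way to sketch this selection step is to form a bounding region from the touch points and test each object's rectangle against it. Treating area 302 as the axis-aligned bounding box of the touch points is an assumption for illustration; any region construction (e.g., a convex hull) could substitute.

```python
def touch_area(touch_points):
    """Bounding box of the touch points, standing in for area 302,
    returned as (left, top, right, bottom)."""
    xs = [x for x, _ in touch_points]
    ys = [y for _, y in touch_points]
    return (min(xs), min(ys), max(xs), max(ys))

def rects_overlap(a, b):
    """True when rectangles a and b intersect at all."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def objects_in_area(object_rects, touch_points, partial=True):
    """Select object rectangles totally within the touch area, or,
    when partial=True, also those only partially within it."""
    area = touch_area(touch_points)
    selected = []
    for rect in object_rects:  # each rect is (left, top, right, bottom)
        fully = (rect[0] >= area[0] and rect[1] >= area[1]
                 and rect[2] <= area[2] and rect[3] <= area[3])
        if fully or (partial and rects_overlap(rect, area)):
            selected.append(rect)
    return selected
```

The `partial` flag mirrors the alternatives above: counting only objects totally within the area, or also objects that merely intersect it.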
The user may indicate a desire to group objects 110-1-110-3 by pinching the three fingers together. As noted, the user does not need to contact objects 110 specifically to have them grouped. In other examples, the user may touch the screen with the three fingers and then also indicate the desire for grouping by pressing a separate button or icon to indicate the desire to create a group.
Once group 202-3 is created, gesture control manager 108 associates a control for the group that applies to all objects 110 of the group. For example, a command to play a football game or play a playlist of songs is applied to all objects 110 of group 202-3, and thus causes corresponding controllable devices 104-1, 104-2, and 104-3 to start playing the football game or play the playlist.
To cause controllable devices 104-1 and 104-2 to perform function #3, a command processor 402 may send a control command for group 202. For example, the command causes controllable devices 104-1 and 104-2 to play the master playlist. Command processor 402 receives a signal from gesture control manager 108 indicating a group has been formed. Command processor 402 determines a control to apply to the group and sends a command to control controllable devices 104-1 and 104-2.
Users may also control objects 110 within group 202 after forming the group. For example, a user may use interface 106 of any input device 102 to apply a control command to group 202. A command processor 402 detects the control command for group 202. Command processor 402 may then determine objects 110 that are included in group 202. For example, in this case, objects 110-1 and 110-2 are included in group 202. Command processor 402 then sends a command for corresponding controllable devices 104-1 and 104-2 for objects 110-1 and 110-2.
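The fan-out performed by command processor 402 can be sketched as follows. The `CommandProcessor` class and the `gateway_send` callable are hypothetical names; `gateway_send` stands in for the path through gateway 112 and control points 114.

```python
class CommandProcessor:
    """Illustrative sketch of command processor 402: a control command
    addressed to a group is fanned out to every member's device."""

    def __init__(self, gateway_send):
        # gateway_send is any callable(device_id, command) that forwards
        # a command toward the corresponding controllable device 104.
        self.gateway_send = gateway_send
        self.groups = {}  # group id -> list of member device ids

    def apply_group_command(self, group_id, command):
        # Determine the objects included in the group, then send a
        # command for each corresponding controllable device.
        for device_id in self.groups.get(group_id, []):
            self.gateway_send(device_id, command)
```

For example, a "play master playlist" command for group 202 would be forwarded once for device 104-1 and once for device 104-2.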
In one embodiment, gateway 112 receives the command and applies the command to controllable devices 104-1 and 104-2. For example, control point 114-1 receives a command for a controllable device 104-1. Control point 114-1 then applies the command to controllable devices 104-1 and 104-2. For example, controllable devices 104-1 and 104-2 may start playing the master playlist. Thus, both controllable devices 104-1 and 104-2 start playing the master playlist in response to the control command received for group 202.
To illustrate the above, an example will now be described.
At 504, command processor 402 combines the first function and the second function. For example, command processor 402 combines the first playlist and the second playlist. The order of the songs within the playlist may vary. For example, command processor 402 may put songs in the first playlist first followed by songs in the second playlist. Alternatively, command processor 402 may interleave the songs from the first playlist and the second playlist.
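The two ordering alternatives at 504 can be sketched directly; both helper names below are hypothetical.

```python
from itertools import chain, zip_longest

def combine_append(first, second):
    """Songs from the first playlist followed by songs from the second."""
    return list(first) + list(second)

def combine_interleave(first, second):
    """Alternate songs from the two playlists; when one playlist runs
    out, the remainder of the other plays through."""
    sentinel = object()
    merged = chain.from_iterable(zip_longest(first, second, fillvalue=sentinel))
    return [song for song in merged if song is not sentinel]
```

Either result becomes the combined playlist that is sent to both controllable devices at 506.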
At 506, command processor 402 sends a command to gateway 112 to have controllable devices 104-1 and 104-2 perform the combined function. For example, command processor 402 sends the new playlist to both controllable devices 104-1 and 104-2, which then start playing the new playlist.
If an existing group 202 has already been formed, when an object 110 is added to group 202, then command processor 402 generates a command to cause the added controllable device 104 to perform the function of group 202. For example, command processor 402 automatically generates a command to play a football game and sends the command to the added controllable device 104.
When combining objects 110, a tiered structure may be used. For example, a user may move an object 110 from one group to another group using a multi-touch gesture. Then, when the user wants to remove object 110 from the second group, the user may use a de-pinch gesture and object 110 is reinserted back into the first group.
At 602, gesture control manager 108 receives a multi-touch gesture to move an object 110 from a first group 202-1 to a second group 202-2.
At 606, command processor 402 applies a control for the second group 202-2 to object 110-1. For example, a function associated with the second group is applied to object 110-1; a controllable device 104 associated with object 110-1 may start playing a football game that other controllable devices 104 in the second group are already playing.
At 608, gesture control manager 108 receives a de-pinch gesture.
An example of using the tiered structure will now be described. In one example, a first group 202-1 may be designated as a baseball zone. A second group 202-2 may be designated as a football zone. Controllable devices 104 within first group 202-1 and second group 202-2 may be televisions. The televisions may be interspersed throughout a location, such as a bar. At one point, a user that is watching a television may not want to watch a baseball game, but rather wants to watch a football game. In this case, a bartender may pinch an object 110-1 corresponding to the television from first group 202-1 into second group 202-2. This causes the television to automatically start playing a football game because it has been added to the football zone.
At some point, the user who wanted to watch the football game may leave the bar. At this point, the bartender may de-pinch object 110-1 from the second group 202-2. Gesture control manager 108 then automatically removes object 110-1 from second group 202-2 and places object 110-1 back within first group 202-1. In this case, the television starts playing the baseball game again.
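The tiered structure in the baseball/football example amounts to remembering each object's previous group so a de-pinch can restore it. A minimal sketch, with `TieredGroups` and its attribute names being hypothetical:

```python
class TieredGroups:
    """Sketch of the tiered structure: pinching an object into a new
    group records its previous group so a de-pinch can restore it."""

    def __init__(self):
        self.group_of = {}     # object id -> current group
        self.previous_of = {}  # object id -> group before the last move
        self.function_of = {}  # group -> function its devices perform

    def pinch_into(self, obj, group):
        """Move obj into group; return the function it should now perform."""
        self.previous_of[obj] = self.group_of.get(obj)
        self.group_of[obj] = group
        # The object's device automatically takes on the group's function.
        return self.function_of.get(group)

    def de_pinch(self, obj):
        """Reinsert obj into its previous group; return that group's function."""
        previous = self.previous_of.get(obj)
        self.group_of[obj] = previous
        return self.function_of.get(previous)
```

In the bar example, pinching the television's object into the football zone returns the football function, and the later de-pinch restores the baseball zone's function.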
Accordingly, particular embodiments allow users to use gestures to group objects together. Then, control commands may be applied to the group. This provides a convenient way for users to control multiple controllable devices 104 together. For example, once a group is formed, control commands from any input device 102 may be applied to the group. For instance, a first input device 102 groups two audio zones to play the same song using a multi-touch gesture. At that point, commands (from any input device 102) to one of the audio zones are echoed to the other audio zone. Another example is when a first input device groups multiple televisions into the same group, such as in a sports bar. Then, any control command by any input device 102 performed on the group is echoed to all controllable devices 104 in the group.
Particular embodiments may be implemented in a non-transitory computer-readable storage medium for use by or in connection with an instruction execution system, apparatus, or machine. The computer-readable storage medium contains instructions for controlling a computer system to perform a method described by particular embodiments. The computer system may include one or more computing devices. The instructions, when executed by one or more computer processors, may be operable to perform that which is described in particular embodiments.
As used in the description herein and throughout the claims that follow, “a”, “an”, and “the” includes plural references unless the context clearly dictates otherwise. Also, as used in the description herein and throughout the claims that follow, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
The above description illustrates various embodiments along with examples of how aspects of particular embodiments may be implemented. The above examples and embodiments should not be deemed to be the only embodiments, and are presented to illustrate the flexibility and advantages of particular embodiments as defined by the following claims. Based on the above disclosure and the following claims, other arrangements, embodiments, implementations and equivalents may be employed without departing from the scope hereof as defined by the claims.