The present invention is related generally to sending and receiving information among computers.
Many conventional computers, in particular portable computers, e.g., smartphones, tablet computers, etc., comprise touch-sensitive systems, e.g., touch-screen displays and touch-sensitive bezels.
Use of these computers often includes sending information (e.g., multimedia content) between two or more different devices.
Thus, there is a need for easy and intuitive ways of sending information between two or more computers that comprise touch-sensitive systems.
While the appended claims set forth the features of the present invention with particularity, the invention, together with its objects and advantages, may be best understood from the following detailed description taken in conjunction with the accompanying drawings of which:
Turning to the drawings, wherein like reference numerals refer to like elements, the invention is illustrated as being implemented in a suitable environment. The following description is based on embodiments of the invention and should not be taken as limiting the invention with regard to alternative embodiments that are not explicitly described herein.
Embodiments of the invention provide methods and apparatus for sharing information (e.g., multimedia content) among devices that may comprise touch-screen displays and touch-sensitive bezels. The sending and receiving of information from a first computer to a second computer may comprise performing (by a user of the first computer) a directional gesture, i.e., a gesture that specifies a direction. This direction may be used to identify the second computer.
Apparatus for implementing any of the below described arrangements, and for performing any of the below described method steps, may be provided by configuring or adapting any suitable apparatus, for example one or more computers or other processing apparatus or processors, or providing additional modules. The apparatus may comprise a computer, a network of computers, or one or more processors for implementing instructions and using data, including instructions and data in the form of a computer program or plurality of computer programs stored in or on a machine-readable storage medium such as computer memory, a computer disk, ROM, PROM, etc., or any combination of these or other storage media.
It should be noted that certain of the process steps depicted in the below described process flowcharts may be omitted or such process steps may be performed in an order differing from that presented below and shown in those process flowcharts. Furthermore, although all the process steps have, for convenience and ease of understanding, been depicted as discrete temporally-sequential steps, nevertheless some of the process steps may in fact be performed simultaneously or at least overlapping to some extent temporally.
Referring now to the Figures,
The first computer 2 comprises a bezel 4, a display 6, a transceiver 7, a bezel-gesture module 8, a display-gesture module 10, and a device-detection module 11.
The bezel 4 forms part of the housing of the first computer 2. The bezel 4 comprises a frame structure that may be adjacent to (e.g., at least partly surrounding) the display 6.
The display 6 may be a touch-screen display. Some or all of the display 6 may extend underneath the bezel 4 to some extent. Alternatively, the display 6 may not extend underneath the bezel 4, and instead at least a portion of the display 6 may lie flush with the bezel 4.
The transceiver 7 is a conventional transceiver that may transmit information from the first computer 2 for use by an entity remote from the first computer 2 and may receive information from an entity that is remote from the first computer 2. The transceiver may be connected to the gesture modules 8, 10 and to the device-detection module 11.
The gesture modules 8, 10 may each comprise one or more processors. The functionality of the bezel-gesture module 8 and of the display-gesture module 10 may be to recognize bezel gestures and display gestures (e.g., gestures made by a user of the first computer 2) respectively. Further functionality of the gesture modules 8, 10 may be to cause operations that correspond to the gestures to be performed.
The bezel-gesture module 8 is configured to recognize a touch input to the bezel 4. Such a touch input may, for example, be made by a user of the first computer 2 touching the bezel 4 (or a portion of the first computer 2 proximate to the bezel 4) with his finger (i.e., one of his digits). Such a touch input may, for example, initiate or end a gesture. Any suitable technology may be utilized to sense such a touch input.
The display-gesture module 10 is configured to recognize a touch input to the display 6. Such a touch input may, for example, be made by a user of the first computer 2 touching the display 6 (or a portion of the first computer 2 proximate to the display) with his finger. Such a touch input may, for example, initiate or end a gesture. Any suitable technology may be utilized to sense such a touch input.
The gesture modules 8, 10 may be connected together such that information may be sent between the modules 8, 10. This allows gestures that involve touch inputs to both the bezel 4 and the display 6 to be processed. The gesture modules 8, 10 may be implemented using any suitable type of hardware, software, firmware, or combination thereof. In other embodiments, the functionality provided by the gesture modules 8, 10 may be provided by a single module. The gesture modules 8, 10 may be configured such that they can detect a change from a touch input to the bezel 4 to a touch input to the display 6, and vice versa.
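Purely by way of illustration, the tracking of touch-region transitions described above may be sketched as follows. The class name, method names, and region labels are assumptions made for this sketch and are not part of any embodiment:

```python
# Illustrative sketch: a combined gesture module that records when a
# touch crosses between the display and the bezel. All names here are
# hypothetical, not part of the described embodiments.

class GestureModule:
    """Tracks a single touch and records display/bezel transitions."""

    def __init__(self):
        self.last_region = None   # "display", "bezel", or None
        self.transitions = []     # e.g., [("display", "bezel")]

    def on_touch(self, region):
        # region is "display" or "bezel", as reported by the touch sensors
        if self.last_region is not None and region != self.last_region:
            self.transitions.append((self.last_region, region))
        self.last_region = region

    def on_release(self):
        # Finger lifted: report and reset the recorded transitions
        transitions, self.transitions = self.transitions, []
        self.last_region = None
        return transitions

module = GestureModule()
for region in ["display", "display", "bezel"]:   # finger slides onto bezel
    module.on_touch(region)
print(module.on_release())   # [('display', 'bezel')]
```

A module of this kind could feed its recorded transitions to gesture-classification logic, allowing display-to-bezel and bezel-to-display gestures to be distinguished.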
The device-detection module 11 may be configured to detect or identify other systems or apparatus (e.g., other computers) that may be in the vicinity of the first computer 2. The functionality of the device-detection module 11 is described in more detail below with reference to
The information being sent from the first computer 2 to the second computer 16 may be any type of digital information (e.g., a computer file, a computer program, a web-link, etc).
At step s2 of
At step s4, the first user 12 may slide his finger 20 across the display 6 towards an edge of the display 6, i.e., towards the bezel 4. Movement of the first user's finger 20 across the display 6 may be detected by the display-gesture module 10. The display-gesture module 10 may then recognize or identify this movement as indicating a “drag” operation. The position of the icon 18 on the display 6 may be changed so that the icon 18 is positioned at the point on the display 6 that is being touched by the first user's finger 20.
At step s6, the first user 12 continues to slide his finger 20 across the display 6 until his finger 20 contacts the bezel 4. Contact of the first user's finger 20 with the bezel 4 may be detected by the bezel-gesture module 8.
The bezel-gesture module 8 and the display-gesture module 10 may recognize or identify the gesture performed during steps s2 through s6 as corresponding to a “select and send” operation, i.e., an operation by which information to be sent may be selected and sent from the first computer 2. In other embodiments, the “select and send” operation may additionally comprise the first user 12 moving his finger 20 so that it no longer touches the first computer 2 (e.g., by sliding his finger 20 off the edge of the bezel 4). In other words, the gesture performed by the first user 12 with his finger 20, comprising dragging the icon or content across the display 6, then simultaneously touching the display 6 and the bezel 4, and then continuing this motion across the bezel 4 alone, may represent the first user's intention to copy or move content to another computer.
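The classification of a completed gesture from its touch-region transitions may, for example, be sketched as follows. The operation names and the transition representation are illustrative assumptions:

```python
# Illustrative sketch: map the recorded touch-region transitions of a
# completed gesture onto an operation. Names are hypothetical.

def classify_gesture(transitions):
    if ("display", "bezel") in transitions:
        # Drag started on the display and crossed onto the bezel:
        # treat as "select and send" rather than an on-screen drag.
        return "select_and_send"
    if ("bezel", "display") in transitions:
        # Drag started on the bezel and crossed onto the display.
        return "receive_information"
    # No display/bezel crossing: a conventional on-screen gesture.
    return "drag_and_drop"

print(classify_gesture([("display", "bezel")]))  # select_and_send
print(classify_gesture([]))                      # drag_and_drop
```

Because the display-to-bezel crossing is part of the gesture, such logic can separate a “select and send” operation from a conventional on-screen “drag and drop.”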
At step s7, the direction in which the first user 12 moves his finger 20 across the display 6 and the bezel 4 (i.e., the direction of the “user swipe”) may be used to select a device to which the selected information is to be sent. In this embodiment, the first user 12 may swipe in the direction of the second computer 16, thereby, in effect, selecting the second computer 16 as the desired recipient for the information. (Note that different detection technologies allow different levels of precision when detecting the direction of the second computer 16 relative to the first computer 2. Human imprecision also limits the exactness that can be expected. With these considerations in mind, a second computer 16 may be “substantially” in the required direction even with an error of up to 45 degrees in any direction.)
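The matching of the swipe direction against device bearings, with the 45-degree tolerance noted above, may be sketched as follows. The angle convention and all names are assumptions for illustration:

```python
# Illustrative sketch: select the devices whose bearing from the
# sending computer lies within 45 degrees of the swipe direction.
# Angles are in degrees; names and conventions are hypothetical.

def angular_difference(a, b):
    """Smallest absolute difference between two angles, in degrees."""
    return abs((a - b + 180.0) % 360.0 - 180.0)

def devices_in_swipe_direction(swipe_angle, device_bearings, tolerance=45.0):
    """device_bearings maps a device name to its bearing in degrees."""
    return [name for name, bearing in device_bearings.items()
            if angular_difference(swipe_angle, bearing) <= tolerance]

bearings = {"second": 0.0, "third": 90.0, "fourth": 10.0}
print(devices_in_swipe_direction(0.0, bearings))   # ['second', 'fourth']
```

A swipe to the right (0 degrees here) selects both the second and fourth computers in this example, which corresponds to the multi-candidate scenario discussed below.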
For example,
In this further scenario 102, the second computer 16 is located to the right of the first computer 2. Also, there are two further computers, namely a third computer 104 and a fourth computer 106. The third computer 104 is located in front of the first computer 2. The fourth computer 106 is located to the right of the first computer 2 (i.e., in the same direction as the second computer 16). The third and fourth computers 104, 106 may be the same type of computers as the first and second computers 2, 16.
Each of the computers 2, 16, 104, 106 may comprise device-detection modules (such as the device-detection module 11 described above with reference to
In this scenario, the second computer 16 is identified as the target for the content by the first computer 2 or the first user 12.
Returning to
At step s10, the second computer 16 informs its second user 14 that information is to be sent to the second computer 16. This may, for example, be performed by displaying, to the second user 14, an indication or notification, e.g., on a display of the second computer 16 (hereinafter referred to as “the further display”). This displayed indication may, for example, give the second user 14 an option to “accept” the information (i.e., allow information sent from the first computer 2 to be received by the second computer 16) or “decline” the information (i.e., not allow information sent from the first computer 2 to be received by the second computer 16) from the first computer 2.
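The accept/decline handling at the receiving computer, including a time limit of the kind described further in this document, may be sketched as follows. The class name, method names, and the 30-second limit are assumptions:

```python
# Illustrative sketch only: a pending transfer that the receiving user
# may accept or decline, with an assumed time limit after which the
# transfer is treated as declined. All names and the 30-second limit
# are hypothetical, not part of the described embodiments.

class PendingTransfer:
    def __init__(self, sender, created_at, time_limit_s=30.0):
        self.sender = sender
        self.deadline = created_at + time_limit_s
        self.state = "pending"

    def accept(self, at):
        if at > self.deadline:
            self.state = "declined"   # time limit expired: treat as declined
        elif self.state == "pending":
            self.state = "accepted"
        return self.state

    def decline(self):
        self.state = "declined"
        return self.state

transfer = PendingTransfer("first computer", created_at=0.0)
print(transfer.accept(at=5.0))    # accepted
late = PendingTransfer("first computer", created_at=0.0)
print(late.accept(at=31.0))       # declined
```

In a real device, the timestamps would come from the system clock; they are passed in explicitly here to keep the sketch deterministic.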
Next described with reference to steps s12 through s16 is an example method that may be performed by the second user 14 to accept the information from the first computer 2.
At step s12, the second user 14 may touch, e.g., with his finger or a stylus, a bezel of the second computer 16 (hereinafter referred to as “the further bezel”). Contact of the second user's finger with the further bezel may be detected by a bezel-gesture module of the second computer 16 (hereinafter referred to as “the further bezel-gesture module”).
At step s14, the second user 14 may slide his finger 26 from the further bezel 24 onto the further display 28 and across the further display 28 to some point on the further display 28.
Movement of the second user's finger 26 from the further bezel 24 and onto and across the further display 28 may be detected by the further bezel-gesture module and a display-gesture module of the second computer 16 (hereinafter referred to as “the further display-gesture module”).
At step s16, the second user 14 may move his finger 26 so that it no longer touches the second computer 16 (e.g., by moving his finger 26 away from the further display 28). The further bezel-gesture module and the further display-gesture module of the second computer 16 may recognize or identify this “drag and drop” type gesture (i.e., the gesture performed by the second user 14 during steps s12 through s16) as corresponding to a “receive information” operation, i.e., an operation that initiates the receiving (e.g., the downloading) of the information sent by the first computer 2 onto the second computer 16.
In a similar way to how a direction was indicated by the first user's gesture (performed at steps s2 through s6), a direction indicated by the second user's gesture (performed at steps s12 through s16), i.e., the direction in which the second user 14 swipes his finger 26 across the further bezel 24 and the further display 28, may select a device from which the content is to be received. For example, in the further scenario 102 of
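One possible way of matching the receiving user's swipe direction against devices with pending transfers may be sketched as follows. This sketch assumes the receive swipe points away from the sending device (so the sender lies in the direction the swipe came from); that assumption, and all names, are illustrative only:

```python
# Illustrative sketch: on the receiving device, pick the pending sender
# whose bearing lies opposite to the receive-swipe direction. The
# opposite-direction assumption and all names are hypothetical.

def angular_difference(a, b):
    """Smallest absolute difference between two angles, in degrees."""
    return abs((a - b + 180.0) % 360.0 - 180.0)

def select_sender(swipe_angle, pending_senders, tolerance=45.0):
    """pending_senders maps a sender name to its bearing, in degrees,
    as seen from the receiving device."""
    origin = (swipe_angle + 180.0) % 360.0   # direction the swipe came from
    for name, bearing in pending_senders.items():
        if angular_difference(origin, bearing) <= tolerance:
            return name
    return None   # no pending sender in the indicated direction

# Receiver swipes left-to-right (0 degrees); the sender is to its left (180).
print(select_sender(0.0, {"first": 180.0, "third": 90.0}))  # first
```

As with the sending gesture, a tolerance (45 degrees here) accounts for human imprecision and for the precision limits of the detection technology.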
At step s18, the information sent by the first computer 2 is received (e.g., downloaded) by the second computer 16 (e.g., by a transceiver of the second computer 16).
Thus, a process by which information may be sent from the first computer 2 to the second computer 16, and received at that second computer 16, is provided.
The above described method and apparatus utilize a “swipe,” “flick,” or “fling” type gesture that incorporates both a touch-screen display and a touch-sensitive bezel to share content between users. The gesture used by a user is advantageously intuitive and allows the first user to “push” content from his computer (the first computer) to the second user's computer (the second computer) by touching the content and dragging it across the screen and bezel of the first computer in the direction of the second computer. The utilization of both the touch-screen display and the touch-sensitive bezel advantageously facilitates differentiation (e.g., by the gesture modules) between “select and send” operations and conventional “drag and drop” operations.
Furthermore, the “fling” type gesture advantageously tends to ensure that only devices in the direction indicated by the gesture are identified as targets to which to send content. Thus, not all devices in the vicinity of the sending device are targeted or communicated with during the transmission process. This advantageously tends to allow a user to easily (and using an intuitive gesture) select content for transmission and specify a target device to which to send that content.
A computer that is to receive content may advantageously display an indicator to the user of that device. The indicator may be any appropriate type of indicator, e.g., a message or dialog box, or a symbolic icon representing the action to the user. The indicator may indicate, to the user of the receiving device, that content is being transferred (or is to be transferred, etc.) to the receiving device, and may provide further information to that user. For example, the indicator may give an indication of the direction of the sending device relative to the receiving device (i.e., an indication of the direction from which the content is being transferred). Also, the indicator may indicate the type of content or provide a representation of the specific content itself. Also, the indicator may indicate a time limit by which the user must accept (or decline) the content. If the user does not explicitly permit the content to be received or downloaded by the receiving computer within that time limit (e.g., by performing the gesture described above with reference to steps s12 through s16 of
The performance, by the second user, of an action (e.g., the gesture performed by the second user and described above with reference to steps s12 through s16 of
Advantageously, the content that is sent from the first computer to the second computer may be any appropriate type of content. For example, the content to be sent may be content that is stored on the first computer (e.g., pictures, video, documents, etc.). Also for example, the content to be sent may be “referenced content,” e.g., a uniform resource locator (URL) for an Internet resource. For example, the first user may be watching an online video (e.g., a YouTube™ video). The first user may send this video to the second user, e.g., by touching the video being played and dragging it across the display of the first computer to the bezel of the first computer in the direction of the second user (i.e., the first user may perform the above described steps s2 through s6 of
Advantageously, an intuitive and secure method for sending and receiving content between devices is provided. The disclosed method and apparatus are particularly useful for sending and receiving content between devices that are in relatively close proximity.
In the above embodiments, the gestures performed to send and receive information comprise touching (e.g., with a finger or a stylus) a touch-sensitive bezel. However, in other embodiments, one or both of these gestures may not include use of a touch-sensitive bezel. Instead, for example, the functionality provided by the bezel may be provided by a different system, apparatus, or module. For example, a portion of the display (e.g., a region of the display around the edge of the display) may replace the bezel in the above described embodiments. A user's directional gesture may, for example, comprise the user sliding his finger across the display and into contact with an edge region of the display. The user may continue to slide his finger off the display completely.
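The treatment of an edge region of the display as a “virtual bezel,” as described above, may be sketched as follows. The band width, coordinate convention, and names are assumptions for illustration:

```python
# Illustrative sketch: classify a touch on a display that has no
# touch-sensitive bezel, treating an edge band of the display as a
# stand-in for the bezel. The band width and names are hypothetical.

def touch_region(x, y, width, height, edge_band=20):
    """Classify a touch at (x, y) on a width-by-height display."""
    if (x < edge_band or x >= width - edge_band or
            y < edge_band or y >= height - edge_band):
        return "edge"      # edge region stands in for the bezel
    return "display"

print(touch_region(5, 100, 480, 800))     # edge
print(touch_region(240, 400, 480, 800))   # display
```

A gesture module could then substitute the “edge” classification wherever a bezel touch would otherwise be required, so the same directional gestures work on bezel-less devices.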
In the above embodiments, the apparatus (i.e., the computer) that detects the user's gesture (i.e., the directional gesture that the user uses to send or receive information) may comprise a touch-screen display and a touch-sensitive bezel. However, in other embodiments, a directional gesture of the user may be detected in a different way by one or more different modules. For example, in other embodiments the user may perform a gesture without touching a device at all. For example, whilst the user performs a directional gesture, the user's movements may be measured or detected (e.g., using one or more cameras or imaging systems). These measurements may then be used to determine a direction being specified by the user. Such systems and apparatus tend to be particularly useful in devices which do not comprise touch-screen displays, e.g., a set-top box operatively coupled to a television. In other embodiments, a set-top box and television (TV) may be operatively coupled to a camera system (or other gesture-recognition system). The TV may display, e.g., an icon. The user may point (e.g., with his finger) to that icon and move his hand in a dragging or sweeping gesture across the screen and then off the screen in the direction of the device that is to receive data associated with the icon. The camera system coupled to the set-top box and TV (or other gesture-recognition system) may detect the gesture.
In the above embodiments, the gesture described above with reference to
In view of the many possible embodiments to which the principles of the present invention may be applied, it should be recognized that the embodiments described herein with respect to the drawing figures are meant to be illustrative only and should not be taken as limiting the scope of the invention. Therefore, the invention as described herein contemplates all such embodiments as may come within the scope of the following claims and equivalents thereof.