This disclosure relates generally to electronic computing devices and more particularly relates to processing touch inputs received by touch screen computing devices.
Conventional touch screen computing devices have been configured to identify the positioning and/or movement of one or more fingers or other objects on or near touch surfaces of the devices. For example, touch screens associated with some touch computing devices have been configured to receive input via finger gestures and to perform one or more functions in response to those finger gestures. Certain touch screen computing devices can receive input from input devices such as stylus devices. A stylus is a writing, drawing, or pointing instrument or utensil that is generally configured to be hand held and, in the context of touch screen computing devices, used to interact with a touch surface. For example, touch screen computing devices have identified input based on one end of the stylus moving on or near the touch surface of the computing device. Styluses (or styli) have been used with personal digital assistant devices, tablet computing devices, smart phones, and other touch screen computing devices for handwriting, drawing, selecting icons, and providing other forms of input to such touch computing devices.
There are three general categories of stylus devices: active styli, pressure sensitive styli, and ‘dumb’ styli. Dumb styli have no internal electronic components, no batteries, and typically only have a capacitive rubber tip at an end of a pen-shaped body. Such styli are unable to detect amounts or levels of pressure applied via their tips onto a display of a touch computing device. Active styli are self-contained systems designed to work with specific, usually proprietary, touch computing devices. Active styli may include radios or other means to communicate with a particular touch device/platform and are typically limited to working with a proprietary touch screen interface of a closed, proprietary system. Such active styli are constrained to working with a given platform because other, third party touch computing platforms and devices will not recognize these closed-system styli as valid input devices.
In contrast to active styli, pressure sensitive styli are often designed to work with third party touch screens and touch computing devices not made by the manufacturer of such styli. Example pressure sensitive styli are described in more detail in U.S. patent application Ser. No. 13/572,231 entitled “Multifunctional Stylus”, filed Aug. 10, 2012, which is incorporated by reference herein in its entirety. The tips of pressure sensitive styli may include pressure-sensitive elements. Pressure sensitive styli seek to provide multiple levels of pressure sensitivity, which can be useful in drawing, graphics, and other touch-based applications. For example, pressure sensitive styli can be used to sketch a drawing and provide other touch inputs to applications such as Adobe® Ideas®, Adobe® Illustrator®, and Adobe® Photoshop® executing on various touch computing devices and platforms such as tablet computing devices and smart phones.
Styli that are capable of sensing or detecting levels of pressure can be used to provide more types of controls, data, gestures, and other contact inputs to touch computing devices and touch-based applications. Such pressure sensitivity can be achieved via use of pressure sensitive tips and sensors. Some prior touch-based applications have not taken full advantage of the array of inputs produced by pressure sensitive styli, particularly when the inputs are combined with, and/or augmented by, touch inputs using other means, such as fingers and palms. The limited amount of contact detection and input differentiation performed by such applications can decrease their ease of use, compatibility with other applications, and user efficiency.
Traditional techniques for detecting pressure levels as a component of touch inputs are limited in terms of contact detection and levels of pressure that can be detected. This limits the types of inputs and gestures that can be processed. These techniques are also unable to effectively and quickly differentiate input received from a stylus versus other means, such as fingers and palms. Some touch-based applications recognize and process application-specific touch inputs. Traditional touch-based applications and touch computing devices do not utilize input timing information to differentiate and distinguish inputs received via stylus contacts versus inputs received via finger touches and gestures.
As such, inputs including a combination of stylus contacts with a touch surface of a touch computing device and finger touches may not be recognized or definable in touch-based applications. For example, some touch based platforms and touch computing devices are limited to recognizing a single touch input means at a time. Such platforms and devices may toggle between accepting inputs from a stylus and fingers, but do not recognize simultaneous input from multiple input means. The lack of support for hybrid stylus-finger touch inputs decreases user productivity by requiring that some application workflows and operations include more and/or different steps in one touch computing device as compared to another touch computing device. Similarly, some touch-based applications do not support inputs or workflows that include stylus and touch inputs from other means such as fingers. The lack of cross-application support for touch inputs limits functionality, reduces user friendliness, and presents additional disadvantages.
Disclosed herein are methods and systems for differentiating contacts and other inputs received from pressure sensitive styli from touch inputs received from other means such as fingers and palms. Workflows for touch computing devices and touch applications based on libraries of input sequences, including inputs received from styli and other means, are disclosed. Methods for differentiating stylus inputs from other touch inputs based on pressure levels and timing information received from a stylus are disclosed.
According to one exemplary embodiment, a computer implemented method detects a touch input by receiving a physical contact made at a touch surface of a computing device and identifies whether the touch input was received from a stylus based on additional input received from the stylus. The method includes responding to the touch input with a response, wherein the response differs based on whether the touch input was received from the stylus. The detecting, identifying and responding are performed at the computing device.
According to another exemplary embodiment, a computer implemented method includes detecting a first touch input and a second touch input, the first touch input detected by receiving a physical contact at a touch surface of a computing device, the second touch input detected by receiving a second physical contact at the touch surface of the computing device. The method also includes associating the first touch input with a first type of touch input and the second touch input with a second type of touch input different from the first type and then responding to the first touch input and the second touch input with a response based on the first type and the second type, wherein the detecting, associating, and responding are performed by a computing device.
In another exemplary embodiment, a computer readable medium has instructions stored thereon, that, if executed by a processor of a computing device, cause the computing device to perform operations for differentiating input received at a touch surface of the computing device. The instructions include instructions for detecting a touch input by receiving a physical contact made at a touch surface of the computing device and instructions for identifying whether the touch input was received from a stylus based on additional input received from the stylus. The instructions also include instructions for responding to the touch input with a response, wherein the response differs based on whether the touch input was received from the stylus.
According to yet another exemplary embodiment, a stylus has a capacitive tip configured to interact with a touch surface of a computing device. The stylus includes a wireless transceiver configured to communicate with the computing device and a pressure sensor configured to determine a level of pressure received at the tip. The stylus also has a processor and a computer readable medium having logic encoded thereon that, if executed by the processor, causes the processor to perform operations. The operations comprise determining if a level of pressure measured by the pressure sensor has reached a predetermined threshold, suppressing capacitive output from the tip to the touch surface if the determining determines that the threshold has not been reached, and communicating a message to the computing device based on determining that the threshold has been reached.
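The stylus-side operations described above can be pictured with the following minimal sketch. It is illustrative only: the helper names read_pressure, set_tip_output, and send_message and the threshold value are assumptions made for explanation, standing in for the stylus's own sensor, tip, and wireless transceiver drivers.

```python
PRESSURE_THRESHOLD = 0.15  # assumed normalized threshold, tunable

def tip_control_step(read_pressure, set_tip_output, send_message):
    """One pass of the decision described above: suppress the capacitive tip
    until the pressure threshold is reached, then report the event wirelessly."""
    level = read_pressure()
    if level < PRESSURE_THRESHOLD:
        # Threshold not reached: keep capacitive output suppressed so the
        # touch surface does not register a contact from the tip.
        set_tip_output(False)
    else:
        # Threshold reached: allow capacitive output and notify the device.
        set_tip_output(True)
        send_message({"event": "pressure_threshold_reached", "level": level})
    return level
```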
These illustrative features are mentioned not to limit or define the disclosure, but to provide examples to aid understanding thereof. Additional embodiments and further description are provided in the Detailed Description. Advantages offered by one or more of the various embodiments may be further understood by examining this specification or by practicing one or more embodiments presented. The structure and operation of various embodiments are described in detail below with reference to the accompanying drawings. Such embodiments are presented herein for illustrative purposes only. Additional embodiments will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein.
Exemplary embodiments are best understood from the following detailed description when read in conjunction with the accompanying drawings. It is emphasized that, according to common practice, the various features of the drawings are not to scale. On the contrary, the dimensions of the various features may be arbitrarily expanded or reduced for clarity. Included in the drawings are the following figures:
Embodiments of the present invention will now be described with reference to the accompanying drawings. In the drawings, generally, common or like reference numbers indicate identical or functionally similar elements. Additionally, generally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.
Methods and systems are disclosed for a pressure sensitive stylus that functions as a device for interacting with one or more touch based applications executable on touch computing devices. The stylus may also function as a wireless transceiver for transmitting input and content between the stylus and the computing devices. In one embodiment, a user may pair the stylus with one or more touch computing devices, such as for example, a tablet computer, a smart phone with a touch screen interface, and/or any other touch computing device. The user may then cause one or more actions to be performed on the touch computing device by interacting with the touch computing device using the stylus. The actions can form part of one of a plurality of workflows defined in a workflow library. For example, the actions performed on the touch computing device may be specific to the application being executed by the touch computing device when the interaction occurs, and the application may access the library of workflows to process input received from the stylus.
In embodiments, a touch application is able to differentiate between input received via a stylus and finger or palm based on the capacitive difference between the stylus tip and the finger or palm. In alternative embodiments, in cases where capacitive difference information is not available to the touch application, i.e., due to platform features of the touch computing device the application is running on, such as, but not limited to, its operating system (OS) and characteristics of its touch screen, input differentiation is achieved by comparing timing between the computing device's touch information and pressure information reported by the stylus. The computing device itself, depending on the platform and/or OS, may not deliver any pressure information to the touch application. In these cases, embodiments use the stylus to provide pressure information to the touch application. In certain embodiments, the stylus functions as a pressure sensor that sends pressure information to the touch application via a wireless transceiver of the stylus 111 (i.e., via a Bluetooth transceiver). An exemplary stylus can do this very quickly (in near real time, in less than 40 milliseconds in one non-limiting embodiment) so that the timing of the stylus is closely aligned with that of the touch computing device. In an embodiment, this same timing information can then also be used to distinguish between contact from the stylus and contact from a finger or palm.
According to embodiments, a method detects physical contact made by a first input means at a touch surface of a touch computing device and determines a pressure level corresponding to the detected contact. The computing device receives first and second inputs from the first input means, and then associates, based at least in part on the second input, the first input means with a type of input means.
In an embodiment, a stylus can provide timing data, such as, for example, a timestamp corresponding to the first input so that the touch computing device will recognize the first and second inputs as coming from the stylus, instead of other means such as a finger, fingers, or a palm. In another embodiment, if the touch computing device determines that a received pressure level meets a predetermined, tunable pressure level threshold associated with a stylus, the touch computing device will recognize the first and second inputs as coming from the stylus.
According to embodiments, if the touch computing device receives a pressure level or a modulated pressure stream via a capacitive tip of a stylus, the touch computing device will recognize the first and second inputs as coming from the stylus.
In another embodiment, if the touch computing device determines that a pressure level has reached or exceeded a predefined, tunable threshold and also receives a timestamp from the means associated with the first and/or second inputs, the touch computing device will recognize the first and second inputs as coming from the stylus.
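A compact way to picture the recognition rules in the three preceding paragraphs is a predicate that accepts a touch as stylus input when a reported pressure level meets a tunable threshold, when a stylus timestamp lands close enough to the contact time, or when both conditions hold. The threshold and window values below are illustrative assumptions, not values taken from this disclosure.

```python
STYLUS_PRESSURE_THRESHOLD = 0.10  # assumed, tunable
PAIRING_WINDOW_S = 0.040          # assumed window between contact and stylus report

def attribute_to_stylus(contact_time_s, reported_pressure=None,
                        stylus_timestamp_s=None, require_both=False):
    """Return True when the contact should be treated as stylus input."""
    pressure_ok = (reported_pressure is not None
                   and reported_pressure >= STYLUS_PRESSURE_THRESHOLD)
    timing_ok = (stylus_timestamp_s is not None
                 and abs(contact_time_s - stylus_timestamp_s) <= PAIRING_WINDOW_S)
    # require_both=True corresponds to the combined threshold-plus-timestamp
    # variant; otherwise either signal on its own is accepted.
    return (pressure_ok and timing_ok) if require_both else (pressure_ok or timing_ok)
```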
In certain embodiments, the touch computing device and/or touch applications executing on the touch computing device are configured to recognize stylus or other touch inputs received on the touch computing device's touch surface. In embodiments, the inputs can include, but are not limited to, a single tap, a long press on the touch surface, a swipe in a cardinal direction, a flick, a double tap, a pinch, a two-finger press, a three-finger press, a draw selection, a paint selection, an erase selection, a button click, and an extended button press (i.e., a button press and hold). Embodiments include software libraries or other means for correlating, by the touch computing device, stylus and/or non-stylus touch inputs received at its touch surface with an input or step included in one of a plurality of workflows. In certain embodiments, the workflows can include one or more of an erasure, an undo operation, a redo or repeat operation, a brush size selection, a brush opacity selection, a selection of a constraint, such as, but not limited to, a line angle constraint, a menu navigation, a menu node or menu item selection, a copy operation for a selected electronic asset, a cut operation for a selected electronic asset, and a paste operation for a previously cut or copied electronic asset.
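One way such a software library can be pictured is as a mapping from recognized input sequences to workflow operations, as in the following sketch. The specific pairings of gestures to operations are invented for illustration and are not defined by this disclosure; an application would populate its own library.

```python
# Illustrative workflow library: recognized touch/stylus input sequences are
# mapped to workflow operations (names drawn loosely from the examples above).
WORKFLOW_LIBRARY = {
    ("double_tap",): "undo",
    ("double_tap", "double_tap"): "redo",
    ("long_press", "swipe"): "brush_size_selection",
    ("two_finger_press",): "line_angle_constraint",
    ("three_finger_press",): "menu_navigation",
    ("button_click", "single_tap"): "copy_selected_asset",
    ("extended_button_press", "single_tap"): "paste_copied_asset",
}

def correlate(input_sequence):
    """Correlate a recognized input sequence with a workflow step, or return
    None when the sequence is not part of any defined workflow."""
    return WORKFLOW_LIBRARY.get(tuple(input_sequence))
```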
In another embodiment, the touch computing device is configured to detect a second physical contact made by a second input means at the touch surface of the computing device and determine a pressure level corresponding to the second detected contact. The touch computing device can then receive first and second inputs from the second input means and then associate, based at least in part on the second input from the second input means, the second input means with a second type of input means, the second type of input means being different from the first type of input means. In this way, exemplary embodiments can perform input differentiation for a stylus versus a non-stylus touch input means. In embodiments, inputs and workflows can comprise hybrid inputs including stylus and non-stylus inputs.
In an additional embodiment, as described below with reference to
In embodiments, input differentiation software executes on a touch computing device that is running a touch application. The input differentiation software runs as the touch computing device is receiving touch input on its touch surface and the touch computing device is also receiving a signal or indication from the stylus that indicates a certain amount of pressure presently being applied on the stylus tip. In one embodiment, the indication and the touch input are received nearly simultaneously. Based on the touch computing device receiving these two pieces of information, the touch computing device (or the touch application invoking the input differentiation software) determines that the input is from the stylus and not from a finger or palm. In embodiments, a touch application receives input at the touch computing device and then determines whether or not that input should be associated with the stylus based on separate (i.e., non-touch and/or wireless) input received from the stylus relating to pressure. In other embodiments, a third type of input means, such as, for example, a non-capacitive touch stylus or a dumb stylus, may also be recognized and used for inputs and workflows in cases where the touch computing device has differentiated between a pressure sensitive stylus and a dumb stylus. In certain embodiments, this differentiation may be based on receiving a modulated pressure stream and/or a timestamp from the pressure sensitive stylus as compared to a dumb stylus that is unable to communicate a modulated pressure stream or a timestamp and a non-capacitive stylus that is unable to communicate a modulated pressure stream via its tip.
In embodiments, the exemplary inputs and workflows shown in
As used herein, the term “pressure” refers to the effect of a mechanical force applied to a surface. Pressure can be quantified as the amount of force acting per unit area, that is, the ratio of force to the area over which that force is distributed, applied in a direction perpendicular to the surface of an object. In the context of touch computing devices, pressure can be measured as force per unit area applied in a direction substantially perpendicular or tangential to a touch surface of a touch computing device. In the context of a stylus used with touch computing devices, pressure can be measured as force per unit area applied in a direction substantially perpendicular or tangential to the elongate body of the stylus. For example, a level of pressure can be measured in terms of force per unit area applied to a stylus tip by a touch screen in response to the tip coming into contact with and being pressed onto the touch screen.
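Expressed as a formula, with arbitrarily chosen example values that are not taken from this disclosure:

$$P = \frac{F}{A}, \qquad \text{for example } P = \frac{0.5\ \text{N}}{5 \times 10^{-6}\ \text{m}^2} = 1 \times 10^{5}\ \text{Pa} = 100\ \text{kPa}.$$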
One exemplary embodiment includes an input device such as a stylus. The stylus is configured to interact with one or more touch computing devices and includes a capacitive tip at one end of the stylus, the tip being configured to interact with a touch surface of a computing device. The stylus is capable of detecting levels of pressure being applied via physical contact between the tip and a touch surface of a touch computing device. The stylus can communicate the detected pressure on a near-real-time basis to the touch computing device.
In embodiments, a stylus can measure levels or amounts of pressure applied at its tip. The stylus can produce many types of touch inputs and workflows based in part on having a pressure sensor for detecting and measuring pressure applied when the stylus tip is being pressed onto a touch surface, such as, for example, a capacitive touch surface.
Non-limiting examples of a pressure sensitive stylus incorporating a pressure sensor are described in commonly-assigned U.S. patent application Ser. No. ______ (Attorney Docket No. 58083/863896 (3076US01)), entitled “Pressure Sensor for Touch Input Devices,” by Dowd et al., which is incorporated by reference herein in its entirety.
In embodiments, a current pressure level and pressure status are determined and indicated. A threshold can also be determined as shown in
In another embodiment, a stylus input device includes a computer readable storage medium having logic encoded thereon, that when executed by a processor, causes the processor to determine and indicate a number of pressure levels applied to a tip of the input device that is in contact with a surface, such as a touch surface of a touch computing device. In response to determining a pressure level, the logic can include instructions to indicate, via a wireless transceiver or other suitable communications means, a pressure level, and a pressure status such as, but not limited to, increasing pressure, decreasing pressure, static pressure, and quiescence (i.e., a lack of pressure on the tip as would be the case when the tip is not in contact with a touch surface).
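The status determination such logic might perform can be sketched as follows, assuming only a previous and a current pressure reading are available; the quiescence cutoff is an assumed value.

```python
QUIESCENT_LEVEL = 0.02  # assumed: readings at or below this mean the tip is not in contact

def pressure_status(previous_level, current_level):
    """Classify the pressure trend at the tip into the statuses named above."""
    if current_level <= QUIESCENT_LEVEL:
        return "quiescent"
    if current_level > previous_level:
        return "increasing"
    if current_level < previous_level:
        return "decreasing"
    return "static"
```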
The logic can be encoded into circuitry such as one or more integrated circuits (ICs) on one or more printed circuit boards (PCBs). For example, the logic can be encoded in an application-specific IC (ASIC). The logic is executable by a processor, such as a microprocessor chip included in the circuitry on a PCB. When executed, the logic determines a pressure level and/or a pressure status.
As used herein, the term “input device” refers to any device usable to interact with an interface of a computing device. An input device may be one or more of a keyboard, a microphone, or a pointing/drawing device such as a mouse or stylus. Input devices can be configured to interact with a touch-sensitive interface of a computing device, such as a touch surface or a touch-sensitive display. As used herein, a “stylus” refers to any writing, drawing, or pointing instrument or utensil that is generally configured to be hand held and, in the context of touch screen computing devices, used to interact with a computing device having a touch-sensitive interface or touch surface (i.e., a touch computing device). The terms “input device” and “stylus” are used interchangeably herein to refer broadly and inclusively to any type of input device capable of interacting with a touch computing device.
As used herein, the term “computing device” refers to any computing or other electronic equipment that executes instructions and includes any type of processor-based equipment that operates an operating system or otherwise executes instructions. A computing device will typically include a processor that executes program instructions and may include external or internal components such as a mouse, a CD-ROM, DVD, a keyboard, a display, or other input or output equipment. Examples of computing devices are personal computers, digital assistants, personal digital assistants, mobile phones, smart phones, pagers, tablet computers, laptop computers, Internet appliances, other processor-based devices, gaming devices, and television viewing devices. A computing device can be used as a special purpose computing device to provide specific functionality offered by its applications and by the interaction between those applications.
As used herein, the term “application” refers to any program instructions or other functional components that execute on a computing device. An application may reside in the memory of a device that executes the application. As is known to one of skill in the art, such applications may be resident in any suitable computer-readable medium and execute on any suitable processor. For example, as discussed below with reference to
These illustrative examples are given to introduce the reader to the general subject matter discussed here and are not intended to limit the scope of the disclosed concepts. The following sections describe various additional embodiments and examples with reference to the drawings in which like numerals indicate like elements. For brevity, only the differences occurring within the Figures, as compared to previous or subsequent ones of the figures, are described below.
Exemplary styli are described below with reference to
In cases where the stylus 111 is a stylus with an elongated body like the exemplary body 104, the body housing 102 will be an elongated housing configured to accept the body 104 and connect to the tip 109 through the nozzle housing 103 of the stylus.
In certain embodiments, the stylus 111 includes a wireless transceiver in the body 104. For example, a stylus 111 embodied as a multifunction stylus may include a Bluetooth transceiver (see, e.g., wireless transceiver 336 in
As the exemplary stylus 111 is a pressure sensitive stylus, the tip 109 can be embodied as a pressure sensitive tip. Such a pressure sensitive tip 109 may be manufactured from a smooth and/or gentle material that is not harmful to a touch screen of a touch computing device. The pressure sensitive tip 109 can also be manufactured from a material that deforms when force is applied thereto. For example, the tip 109 may be manufactured from a synthetic or natural rubber material. Additionally, included within the stylus 111 may be a memory, a wireless transceiver, a processing unit, and/or other components (see, e.g., battery 208, button circuitry 226A, and main circuitry 226B in
According to embodiments, the internal battery 208 supplies power to electrical components of the stylus 111, including the button circuitry 226A, a pressure sensor, the indicator light 119, and the main circuitry 226B.
Among other functionality, the button circuitry 226A is configured to provide the force levels measured by the pressure sensor. The button circuitry 226A may communicate or otherwise indicate measured levels of pressure via a wireless transceiver of the stylus 111. Alternatively, the button circuitry 226A can relay pressure levels, identifiers of assets to be copied and pasted, and other inputs via the main circuitry 226B, which in turn can communicate or convey the inputs.
In embodiments, the button circuitry 226A and/or the main circuitry 226B includes electronics and logic to indicate changes in pressure levels on the tip 109 to a touch application executing on a touch computing device. The indications can be communicated on a substantially real time basis. The button circuitry 226A and/or the main circuitry 226B can also include electronics and logic to communicate information uniquely identifying an electronic asset being copied or pasted using the stylus 111 to a touch application executing on a touch computing device. In one embodiment, logic is implemented as an integrated circuit (IC) within the button circuitry 226A. Changes in pressure applied to the tip 109 may be measured and quantified by the button circuitry 226A and in turn communicated to the touch computing device the stylus 111 is currently interacting with. Similarly, other data needed for the stylus 111 to complete the exemplary inputs and workflows discussed below with reference to
In accordance with embodiments, the button circuitry 226A and the main circuitry 226B include a computer readable storage medium with executable instructions or logic for indicating a pressure level, timing information, workflow information, menu selections, copy/paste operations, and other inputs. The timing information can be communicated wirelessly via a wireless transceiver of the input device (see, e.g., the wireless transceiver 336 in
The circuitry 226A, 226B can comprise a printed circuit board (PCB) having one or more ICs or ASICs with logic encoded on them. The logic is executable by a processor, such as a microprocessor chip included in the circuitry 226A, 226B as part of the PCB. When executed, the logic determines a status, such as a pressure threshold having been reached, a workflow having been initiated, or a workflow/input having been completed. Exemplary touch inputs and workflows are shown in
Like a pairing operation between a stylus 111 and a touch computing device, in embodiments, indications of pressure levels, timing information, a pressure status, workflow inputs, menu selections, copy/paste operations, and other inputs can be communicated by a combination of touch inputs and wireless communications.
Such data and information can be communicated via a wireless transceiver of the stylus 111. For example, a stylus 111 embodied as a multifunction stylus may include a wireless transceiver, such as a Bluetooth transceiver, a wireless network transceiver, and/or some other wireless transceiver for such communications.
The server system 302 includes a processor 305. The processor 305 may include a microprocessor, an application-specific integrated circuit (ASIC), a state machine, or other suitable processing device. The processor 305 can include any number of computer processing devices, including one. The processor 305 can be communicatively coupled to a computer-readable medium, such as a memory 308. The processor 305 can execute computer-executable program instructions and/or accesses information stored in the memory 308. The memory 308 can store instructions that, when executed by the processor 305, cause the processor to perform operations described herein.
A computer-readable medium may include, but is not limited to, an electronic, optical, magnetic, or other storage device capable of providing a processor (see, e.g., processors 318a, 318b, 305, and 330 in
The server system 302 may also include a number of external or internal devices, such as input or output devices. For example, the server system 302 is shown with an input/output (I/O) interface 312. A bus 310 can also be included in the server system 302. The bus 310 can communicatively couple one or more components of the server system 302. In the non-limiting example of
Each of the computing devices 304a, 304b includes respective processors 318a, 318b. Each of the processors 318a, 318b may include a microprocessor, an ASIC, a state machine, or other processor. In the non-limiting example of
The computing devices 304a, 304b may also comprise a number of external or internal devices such as a mouse, a CD-ROM, DVD, a keyboard, a display, audio speakers, one or more microphones, or any other input or output devices. For example, each of the computing devices 304a, 304b is respectively shown with I/O interfaces 324a, 324b and display devices 326a, 326b. A non-limiting example of a display device is a computer monitor or computer screen, such as a touch screen. In the non-limiting example of
Buses 322a, 322b can be respectively included in the computing devices 304a, 304b. Each of the buses 322a, 322b can communicatively couple one or more components of the computing devices 304a, 304b.
Each of the client applications 328a, 328b can include one or more software modules for establishing communication with a cloud application 316. Each of the client applications 328a, 328b can also include one or more software modules for performing functions in addition to establishing communication with the cloud application 316. For example, each of the client applications 328a, 328b can be an image manipulation application having a software module for communicating with the cloud application 316. In some embodiments, each of the client applications 328a, 328b can be a different type of application including different functionality. For example, a client application 328a can be Adobe® Ideas® and a client application 328b can be Adobe® Illustrator®. In some embodiments, the client applications 328a, 328b can be stand-alone applications. In other embodiments, the client applications 328a, 328b can be embedded in another application, such as an image manipulation application.
The server system 302 can include any suitable server or computing device for hosting the cloud application 316. In one embodiment, the server system 302 may be a single server, such as a web or application server. In another embodiment, the server system 302 may be presented as a virtual server implemented using a number of server systems connected in a grid or cloud computing topology.
The computing devices 304a, 304b can include any suitable computing device or system for communicating via a data network 306 and executing the client applications 328a, 328b. Non-limiting examples of a suitable computing device or system include a desktop computer, a tablet computer, a smart phone, or any other computing device or system suitable for using electronic content.
An input device (a stylus 111 in the example of
According to embodiments, timing information such as a timestamp and pressure level information can be transmitted from the stylus 111 to a computing device 304a or a client application 328a that the stylus is interacting with. In embodiments, this information can be used by the client application, e.g., 328a, executing on the computing device 304a, to identify touch inputs received via an associated touch display device 326a as being from the stylus 111 as opposed to other input means, such as fingers. In certain embodiments, the timing information can comprise two components or levels. The first level can be based upon a known time delay between sending a signal or data wirelessly from the stylus and it reaching the client application 328a. For example, after a first touch input by the stylus 111 tip 109 at the display device 326a, the client application 328a can identify that touch input as being from the stylus 111 based on receiving a second, wireless input from the stylus 111 within a threshold time span of a time that the touch input was received. Another level can be synchronizing clocks between the stylus 111 and the computing device 304a. Through clock synchronization, a degree of uncertainty or period of time in which a contact or input received at a touch display device 326a cannot be identified as being from a stylus 111 or another input means can be reduced to a matter of milliseconds. By way of example, after a first touch input by the stylus 111 at the display device 326a, the client application 328a can identify that touch input as being from the stylus 111 based on receiving a second input from the stylus 111 identifying synchronization between a clock of the computing device and a clock of the stylus.
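The two timing levels described above can be sketched roughly as follows. The first function models clock synchronization by estimating the stylus-to-device clock offset from a single wireless exchange, given an assumed radio delay; the second applies that offset so a contact can be attributed to the stylus within a few milliseconds. The delay and window values, and the helper names, are assumptions for illustration.

```python
ASSUMED_RADIO_DELAY_S = 0.015   # assumed typical stylus-to-application latency
SYNC_WINDOW_S = 0.005           # assumed post-synchronization uncertainty

def estimate_offset(stylus_timestamp_s, device_receive_time_s):
    """Estimate the (stylus clock -> device clock) offset from one exchange:
    the stylus stamped the event at stylus_timestamp_s, and the radio message
    arrived at device_receive_time_s after roughly ASSUMED_RADIO_DELAY_S."""
    return device_receive_time_s - ASSUMED_RADIO_DELAY_S - stylus_timestamp_s

def matches_stylus(touch_time_s, stylus_timestamp_s, clock_offset_s):
    """True if a touch at touch_time_s lines up with a stylus-reported event."""
    stylus_time_on_device_clock = stylus_timestamp_s + clock_offset_s
    return abs(touch_time_s - stylus_time_on_device_clock) <= SYNC_WINDOW_S
```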
In an additional or alternative embodiment, locality of a touch input on a touch surface such as the touch display device 326a can be used to differentiate between input from the stylus 111 and other input means. Such locality can be used by the client application 328a, or the computing device 304a the client application 328a is executing on, to process ambiguous touch inputs. For example, if a user selects one point in the touch display device 326a with their finger while selecting a second point with the stylus 111 and then, within a fraction of a second, puts the stylus 111 tip 109 down where the finger was and the finger where the tip 109 was, locality information and the contact area at the respective points can be used to ensure that the stylus 111 contacts/inputs and the finger contacts remain differentiated from each other and unambiguous. Combined with, or in addition to, pressure level and timing information, the location of a contact on the display device 326a can be used to distinguish inputs in cases where multiple inputs from multiple input means are intermixed at the same time, based on determining that the finger inputs are going to typically be on one part of the touch display device 326a and the stylus 111 tip 109 is going to be in another part of the display device 326a.
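A rough heuristic combining locality with contact area might look like the sketch below. It is offered only as an illustration of the idea; the region boundary and area threshold are assumed values, and a real application would derive them from where the stylus and fingers have recently been active.

```python
def classify_by_locality(contact_area_mm2, x_position, stylus_region_max_x,
                         stylus_area_max_mm2=20.0):
    """Treat a small contact that falls in the part of the screen where the
    stylus has recently been active as stylus input; treat larger contacts
    (e.g., palms) or contacts in the finger region as non-stylus input."""
    in_stylus_region = x_position <= stylus_region_max_x
    small_contact = contact_area_mm2 <= stylus_area_max_mm2
    return "stylus" if (in_stylus_region and small_contact) else "finger_or_palm"
```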
A non-limiting example of a stylus 111 is stylus 111 shown in
In some embodiments, the memory 332 and I/O module 334 can be implemented as firmware. As used herein, the term “firmware” is used to refer to one or more operating instructions for controlling one or more hardware components of a device. Firmware can include software embedded on a hardware device. A firmware module or program can communicate directly with a hardware component, such as the processor 330 of the stylus 111, without interacting with the hardware component via an operating system of the hardware device.
In embodiments, the touch computing device 304a and/or the client application 328a is able to perform input differentiation, based at least in part on pressure and contact area, in order to differentiate between touch inputs 400 by the stylus 111 and inputs using fingers, palms, or other input means.
In certain embodiments, the capacitive difference of a finger or palm input as compared to a stylus 111 input can be used. In alternative or additional embodiments, the timing of applied pressure is used to differentiate stylus 111 inputs 400 from the finger touch inputs they emulate. These embodiments can use a pressure sensor in the stylus 111, together with a timestamp or other timing data sent via the wireless transceiver 336 (i.e., a Bluetooth transceiver), to distinguish between contact from the stylus 111 versus fingers or a palm. Embodiments involve toggling between capacitive differentiation and timing-based differentiation to differentiate inputs from a tip 109 versus a finger or palm.
In another embodiment, the stylus button 113 may be depressed and/or clicked to indicate another form of input, such as initiation of a copy or paste operation for an electronic asset. For example, as shown in
As shown in
Lastly,
The modification inputs 510, menu inputs 520, and additional inputs/operations 530 shown in
Throughout
In embodiments, the exemplary inputs and workflows shown in
In embodiments, the display devices 326a and 326b used to display the user interfaces shown in
As shown in user interfaces and canvases 602 in
In embodiments, the client application 328a is able to differentiate between input received via the stylus 111 and the finger means 611 based on the capacitive difference between the stylus tip 109 and the finger means 611. In alternative embodiments, the input differentiation is additionally or alternatively achieved by comparing timing between the computing device's 304a touch information and pressure information reported by the stylus 111. This is useful in cases where capacitive difference information is not available to the client application 328a, i.e., due to platform features of the computing device 304a, such as, but not limited to, its operating system (OS) and characteristics of its touch screen 326a. The computing device 304a itself, depending on the platform and/or OS, may not deliver any pressure information to the client application 328a. In these cases, embodiments use the stylus 111 to provide that information. In certain embodiments, the stylus 111 functions as a pressure sensor that sends pressure information to the client application 328a via the wireless transceiver 336 of the stylus 111 (i.e., via a Bluetooth transceiver). The exemplary stylus 111 can do this very quickly (in near real time, in the range of 37 milliseconds in one embodiment) so that the timing of the stylus 111 is closely aligned with that of the computing device 304a. In an embodiment, this same timing information can then also be used to distinguish between contact from the stylus 111 and contact from finger input means 611. Additionally, as described below with reference to the example embodiment shown in
From the perspective of the computing device 304a running the application 328a, the computing device 304a is receiving touch input on the touch surface 326a and the computing device 304a is receiving a signal from the stylus 111 that indicates a certain amount of pressure presently being applied on the tip 109. Based on the computing device 304a receiving these two pieces of information, the computing device 304a (or the client application 328a) determines that input, such as the first input 620, is from the stylus and not from the finger input means 611. In embodiments, a client application 328a receives input at the touch device 304a and then determines whether or not that input should be associated with the stylus 111 based on separate (i.e., non-touch or wireless) input received from the stylus 111 relating to pressure and/or based on whether such separate input is received from the stylus at all.
A server-based clipboard can enable an image or other asset copied from a first device having a screen with a given size and/or dimensions, such as a smart phone, to be skewed, scaled, or otherwise modified for display on a second, destination device having a screen with a different size and/or dimensions, such as a tablet computer or desktop computer. In embodiments, such processing for the clipboard can be performed on the server system, thereby providing for faster performance than non-server based clipboard applications, such as local clipboard applications on touch computing devices. In an embodiment, a server-based clipboard can be provided by the server system 302 or its cloud application 316 shown in
As used herein, the term “electronic content” is used to refer to any type of media that can be rendered for display or use at a computing device such as a touch computing device or another electronic device. Electronic content can include text or multimedia files, such as images, video, audio, or any combination thereof. Electronic content can also include application software that is designed to perform one or more specific tasks at a computing device.
As used herein, the terms “asset” and “electronic asset” are used to refer to an item of electronic content included in a multimedia object, such as text, images, videos, or audio files.
As used herein, the term “clipboard” is used to refer to a location in a memory device accessible via multiple applications that provides short-term data storage and/or data transfer among documents or applications. Data transfer between documents and applications can be performed via interface commands for transferring data such as cutting, copying, and/or pasting operations. For example, a clipboard can provide a temporary data buffer that can be accessed from most or all programs within a runtime environment. In some embodiments, a clipboard can allow a single set of data to be copied and pasted between documents or applications. Each cut or copy operation performed on a given set of data may overwrite a previous set of data stored on the clipboard. In other embodiments, a clipboard can allow multiple sets of data to be copied to the clipboard without overwriting one another.
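A minimal sketch of the clipboard behaviours described above is given below; the class and its parameter are illustrative only. With max_entries=1 each cut or copy overwrites the previous entry, while a larger value lets multiple sets of data coexist without overwriting one another.

```python
class Clipboard:
    """Illustrative clipboard buffer accessible to multiple applications."""

    def __init__(self, max_entries=1):
        self.max_entries = max_entries
        self._entries = []

    def copy(self, data):
        self._entries.append(data)
        # Drop the oldest entries once the limit is exceeded (overwrite behaviour).
        self._entries = self._entries[-self.max_entries:]

    def paste(self):
        # Return the most recently copied data, or None if the clipboard is empty.
        return self._entries[-1] if self._entries else None
```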
As an example, the user may desire to copy the image presented in a user interface of a client application 328a. To this end, the user may provide input that corresponds with copying the image (such as pressing the button 113 and subsequently tapping an image or other electronic asset 1704 with the tip 109 of the stylus 111). In response, the application 328a may copy the selected electronic asset 1704 and provide a reference or unique identifier for the asset 1704 to the stylus 111. For example, the stylus 111 may transmit a request to receive the copied image to the touch computing device 304a via the wireless transceiver 336. In response, the touch computing device 304a may transmit the copied image to the stylus 111 over the data network 306. In one embodiment, the stylus 111 may store a reference to a storage location for the asset 1704 in the memory 332 of the stylus 111, where the reference is to a storage location in the server system 302. Additionally, the indicator light 119 of the stylus 111 may be lit in a certain color to indicate that data is being received by the stylus 111, that data is being stored by the stylus 111, and/or indicate some other status.
Next, input 1706 results in the asset 1704 being copied to a server-based clipboard available through, for example, the cloud application 316 or another application on the server system 302. In the example embodiment of
After the copy action has been performed as a result of inputs 1702-1706, a confirmation output 1708 can be provided to notify a user that the copied asset 1704 is available to be subsequently placed or pasted. The output 1708 can be displayed on the display device 326a by the client application 328a and/or by the stylus input device 111. The output 1708 confirms that the copied asset 1704 is available to be placed or pasted within the same client application 328a, a new destination application 328b, and/or another touch computing device 304b. In the example provided in
With continued reference to
Exemplary Pressure Sensing and Tip Connection Method
The method 1900 handles situations where the contact area made by a stylus on a touch surface of a touch computing device may exceed the touch computing device's minimum threshold for recognizing the stylus before the stylus (or a pressure sensor in the stylus) has met its minimum pressure threshold for determining that it is in contact with a touch surface. Similarly, by performing steps 1902-1908, the method 1900 handles situations where the stylus (or its pressure sensor) has met its minimum pressure threshold for determining that it is in contact with a touch surface before the contact area made by the stylus on the touch surface reaches the touch computing device's minimum threshold for recognizing the stylus.
In one embodiment, the result of performing method 1900 is that a touch computing device, such as the computing device 304a, will only detect a contact from the tip 109 when the stylus 111 is reporting pressure wirelessly to the computing device 304a. The reporting can be communicated via the wireless transceiver 336 (i.e., through a Bluetooth transceiver in the stylus 111).
Beginning with step 1902, an input device receives a mechanical force at its tip 109. According to an exemplary embodiment, the stylus 111 is configured to detect changes in pressure using the pressure sensor.
In step 1902, a pressure level corresponding to an applied (or released) force of a stylus tip 109 touching a touch screen, such as touch display device 326a, is determined. As shown in
Next, in step 1904, a determination is made as to whether the pressure detected in step 1902 exceeds a threshold. Step 1904 can comprise comparing the detected and determined pressure from step 1902 to a threshold so as to ensure that there is both sufficient pressure and a sufficient contact area for the touch computing device to recognize the stylus 111 and accept input via its tip 109. If it is determined that the pressure threshold has not yet been reached, control is passed to step 1906. Otherwise, if it is determined that the pressure threshold has been reached or exceeded, control is passed to step 1908.
In step 1906, the stylus tip 109 is disconnected (or remains disconnected). It is to be understood that performing step 1906 does not result in a physical disconnection of the tip 109 from the stylus 111. Rather, performing step 1906 results in the touch computing device (momentarily) not accepting input via the tip 109. Alternatively, or in addition, executing step 1906 can result in the stylus 111 not providing input to the touch surface (i.e., the touch display device 326a). This step results in input via the tip 109 not being recognized (or provided) as a result of insufficient pressure being applied.
After the tip is disconnected (or remains disconnected), control is passed back to step 1902.
In step 1908, the stylus tip 109 is connected (or remains connected). This step results in input via the tip 109 being recognized as a result of sufficient pressure being applied. Performing step 1908 results in the touch computing device accepting (or continuing to recognize) input via the tip 109.
After the tip is connected (or remains connected), control is passed back to step 1902.
As shown, steps 1902-1908 can be repeated as needed when a stylus tip 109 comes in and out of contact with a touch surface. That is, if a previously detected pressure from step 1902 led to a connection of tip 109 in step 1908, when the tip 109 of the stylus 111 is subsequently lifted from a touch surface and then brought back into contact with the touch surface, control is passed to step 1902 where the method 1900 is repeated.
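Read as a repeating loop, steps 1902-1908 might look like the following sketch. It is illustrative only: the helper names read_pressure, set_tip_connected, and still_running and the threshold value are assumptions, and "connecting" the tip stands for allowing the touch surface to register it, not a physical attachment.

```python
TIP_CONNECT_THRESHOLD = 0.15  # assumed value for the threshold tested in step 1904

def run_method_1900(read_pressure, set_tip_connected, still_running):
    """Repeat steps 1902-1908: determine pressure (1902), compare it to the
    threshold (1904), and disconnect (1906) or connect (1908) the tip so the
    touch surface only sees the tip while pressure is being reported."""
    connected = False
    while still_running():
        level = read_pressure()                        # step 1902
        should_connect = level >= TIP_CONNECT_THRESHOLD  # step 1904
        if should_connect != connected:
            set_tip_connected(should_connect)          # step 1906 or step 1908
            connected = should_connect
```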
Exemplary Computer System Implementation
Although exemplary embodiments have been described in terms of apparatuses, units, systems, and methods, it is contemplated that certain functionality described herein may be implemented in software on microprocessors, such as a microprocessor chip included in the input devices 111 shown in
Aspects of the present invention shown in
If programmable logic is used, such logic may execute on a commercially available processing platform or a special purpose device. One of ordinary skill in the art may appreciate that embodiments of the disclosed subject matter can be practiced with various computer system configurations, including multi-core multiprocessor systems, minicomputers, mainframe computers, computers linked or clustered with distributed functions, as well as pervasive or miniature computers that may be embedded into virtually any device.
For instance, at least one processor device and a memory may be used to implement the above-described embodiments. A processor device may be a single processor, a plurality of processors, or combinations thereof. Processor devices may have one or more processor “cores.”
Various embodiments of the invention are described in terms of this example computer system 2000. After reading this description, it will become apparent to a person skilled in the relevant art how to implement the invention using other computer systems and/or computer architectures. Although operations may be described as a sequential process, some of the operations may in fact be performed in parallel, concurrently, and/or in a distributed environment, and with program code stored locally or remotely for access by single or multi-processor machines. In addition, in some embodiments the order of operations may be rearranged without departing from the spirit of the disclosed subject matter.
Processor device 2004 may be a special purpose or a general purpose processor device. As will be appreciated by persons skilled in the relevant art, processor device 2004 may also be a single processor in a multi-core/multiprocessor system, such system operating alone, or in a cluster of computing devices operating in a cluster or server farm. Processor device 2004 is connected to a communication infrastructure 2006, for example, a bus, message queue, network, or multi-core message-passing scheme. In certain embodiments, one or more of the processors 305, 318a, 318b, and 330 described above with reference to the server system 302, computing device 304a, computing device 304b and input device 111 of
The computer system 2000 also includes a main memory 2008, for example, random access memory (RAM), and may also include a secondary memory 2010. Secondary memory 2010 may include, for example, a hard disk drive 2012 and/or a removable storage drive 2014. Removable storage drive 2014 may comprise a floppy disk drive, a magnetic tape drive, an optical disk drive, a flash memory, or the like. In non-limiting embodiments, one or more of the memories 308, 320a, and 320b described above with reference to the server system 302 and computing devices 304a, 304b of
The removable storage drive 2014 reads from and/or writes to a removable storage unit 2018 in a well known manner. Removable storage unit 2018 may comprise a floppy disk, magnetic tape, optical disk, etc. which is read by and written to by removable storage drive 2014. As will be appreciated by persons skilled in the relevant art, removable storage unit 2018 includes a non-transitory computer readable storage medium having stored therein computer software and/or data.
In alternative implementations, secondary memory 2010 may include other similar means for allowing computer programs or other instructions to be loaded into computer system 2000. Such means may include, for example, a removable storage unit 2022 and an interface 2020. Examples of such means may include a program cartridge and cartridge interface (such as that found in video game devices), a removable memory chip (such as an EPROM, or PROM) and associated socket, and other removable storage units 2022 and interfaces 2020 which allow software and data to be transferred from the removable storage unit 2022 to computer system 2000. In non-limiting embodiments, the memory 332 described above with reference to the input device 111 of
Computer system 2000 may also include a communications interface 2024. Communications interface 2024 allows software and data to be transferred between computer system 2000 and external devices. Communications interface 2024 may include a modem, a network interface (such as an Ethernet card), a communications port, a PCMCIA slot and card, or the like. Software and data transferred via communications interface 2024 may be in the form of signals, which may be electronic, electromagnetic, optical, or other signals capable of being received by communications interface 2024. These signals may be provided to communications interface 2024 via a communications path 2026. Communications path 2026 carries signals and may be implemented using wire or cable, fiber optics, a phone line, a cellular phone link, an RF link or other communications channels.
With reference to
Computer programs (also called computer control logic) are stored in main memory 2008 and/or secondary memory 2010. Computer programs may also be received via communications interface 2024. Such computer programs, when executed, enable computer system 2000 to implement the present invention as discussed herein. In particular, the computer programs, when executed, enable processor device 2004 to implement the processes of the present invention, such as the steps in the method 1900 illustrated by the flowchart of
In an embodiment, the display devices 326a, 326b used to display interfaces of client applications 328a and 328b, respectively, may be a computer display 2030 shown in
Embodiments of the invention also may be directed to computer program products comprising software stored on any computer useable medium. Such software, when executed in one or more data processing device, causes a data processing device(s) to operate as described herein. Embodiments of the invention employ any computer useable or readable medium. Examples of computer useable mediums include, but are not limited to, primary storage devices (e.g., any type of random access memory), secondary storage devices (e.g., hard drives, floppy disks, CD ROMS, ZIP disks, tapes, magnetic storage devices, and optical storage devices, MEMS, nanotechnological storage device, etc.), and communication mediums (e.g., wired and wireless communications networks, local area networks, wide area networks, intranets, etc.).
Numerous specific details are set forth herein to provide a thorough understanding of the claimed subject matter. However, those skilled in the art will understand that the claimed subject matter may be practiced without these specific details. In other instances, methods, apparatuses or systems that would be known by one of ordinary skill have not been described in detail so as not to obscure claimed subject matter.
Some portions are presented in terms of algorithms or symbolic representations of operations on data bits or binary digital signals stored within a computing system memory, such as a computer memory. These algorithmic descriptions or representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. An algorithm is a self-consistent sequence of operations or similar processing leading to a desired result. In this context, operations or processing involves physical manipulation of physical quantities. Typically, although not necessarily, such quantities may take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared or otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to such signals as bits, data, values, elements, symbols, characters, terms, numbers, numerals or the like. It should be understood, however, that all of these and similar terms are to be associated with appropriate physical quantities and are merely convenient labels. Unless specifically stated otherwise, it is appreciated that throughout this specification discussions utilizing terms such as “processing,” “computing,” “calculating,” “determining,” and “identifying” or the like refer to actions or processes of a computing device, such as one or more computers or a similar electronic computing device or devices, that manipulate or transform data represented as physical electronic or magnetic quantities within memories, registers, or other information storage devices, transmission devices, or display devices of the computing platform.
The system or systems discussed herein are not limited to any particular hardware architecture or configuration. A computing device can include any suitable arrangement of components that provide a result conditioned on one or more inputs. Suitable computing devices include multipurpose microprocessor-based computer systems accessing stored software that programs or configures the computing system from a general-purpose computing apparatus to a specialized computing apparatus implementing one or more embodiments of the present subject matter. Any suitable programming, scripting, or other type of language or combinations of languages may be used to implement the teachings contained herein in software to be used in programming or configuring a computing device.
Embodiments of the methods disclosed herein may be performed in the operation of such computing devices. The order of the blocks presented in the examples above can be varied—for example, blocks can be re-ordered, combined, and/or broken into sub-blocks. Certain blocks or processes can be performed in parallel.
The use of “adapted to” or “configured to” herein is meant as open and inclusive language that does not foreclose devices adapted to or configured to perform additional tasks or steps. Additionally, the use of “based on” is meant to be open and inclusive, in that a process, step, calculation, or other action “based on” one or more recited conditions or values may, in practice, be based on additional conditions or values beyond those recited. Headings, lists, and numbering included herein are for ease of explanation only and are not meant to be limiting.
While the present subject matter has been described in detail with respect to specific embodiments thereof, it will be appreciated that those skilled in the art, upon attaining an understanding of the foregoing may readily produce alterations to, variations of, and equivalents to such embodiments. Accordingly, it should be understood that the present disclosure has been presented for purposes of example rather than limitation, and does not preclude inclusion of such modifications, variations and/or additions to the present subject matter as would be readily apparent to one of ordinary skill in the art.