Capacitive touch screens typically rely on current from a body part (e.g., a finger) to receive user input. However, a finger generally lacks the precision required for drawing applications. More precise implements for drawing applications, such as a stylus or even a fingernail, typically cannot be used as input devices on capacitive touch screens because they do not provide the current that the screens sense.
The following detailed description refers to the accompanying drawings. The same reference numbers in different drawings may identify the same or similar elements. Also, the following detailed description does not limit the invention.
Systems and/or methods described herein may provide a drawing interface to aid in precision for drawing on capacitive touch screens. Upon activation of a drawing interface, sensing points may be used to determine a location, dimensions, and/or orientation of a touch (e.g., by a finger) on the touch screen. A drawing tool may be displayed extending beyond the touch location to provide a precise drawing tip based on the location of the touch. The drawing tool may generate graphics (e.g., a line, shape, or other graphic) and may move as an apparent extension of the user's finger as the touch is dragged along the surface of the touch screen.
Touch screen 110 may include devices and/or logic that can be used to display images to a user of drawing interface 100 and to receive user inputs in association with the displayed images. For example, drawing tool 120, toolbar 130, drawn objects 140, icons, virtual keys, or other graphical elements may be displayed via touch screen 110.
Drawing tool 120 may include a pointer, tip, brush, or other indicator associated with the location and/or orientation of a touch. Drawing tool 120 may be located on touch screen 110, for example, to appear as an extension of a finger. As described further herein, a touch on touch screen 110 may include multiple sensing points. The multiple sensing points may be analyzed to determine dimension(s), location, and orientation of the touch. Drawing tool 120 may then be displayed in a location associated with the touch but removed from the actual touch area so as to be visible to a user. As the touch is dragged along the surface of touch screen 110, drawing tool 120 may generate drawn objects (e.g., drawn object 140) that correspond to the location of drawing tool 120. In one implementation, removal of the touch from touch screen 110 may cause drawing tool 120 to be removed from view on touch screen 110.
Toolbar 130 may include a variety of menu items, icons, and/or other indicators (generically referred to herein as “tips”) that may represent multiple shapes for drawing tool 120. Tips may include, for example, multiple line thicknesses, spray paint simulations, brushes, polygons, text boxes, erasers, lines, and other graphics. A tip may be selected from toolbar 130 by a user (e.g., by touching a tip on toolbar 130). The selection of a particular tip from toolbar 130 may change the appearance and/or drawing properties of drawing tool 120.
Although FIG. 1 shows exemplary elements of drawing interface 100, in other implementations, drawing interface 100 may contain fewer, different, differently arranged, or additional elements than depicted in FIG. 1.
As illustrated in FIG. 2, device 200 may include a display 220, a touch panel 230, control buttons 240, a microphone 250, and/or a speaker 260.
Display 220 may provide visual information to the user. For example, display 220 may display text input into device 200; text, images, video, and/or graphics received from another device; and/or information regarding incoming or outgoing calls or text messages, emails, media, games, phone books, address books, the current time, etc. Display 220 may include, for example, a liquid crystal display (LCD), such as a thin film transistor (TFT) LCD.
As shown in FIG. 2, touch panel 230 may be integrated with and/or overlaid on display 220 to form a touch screen (e.g., touch screen 110) that may function as a user input interface.
Generally, touch panel 230 may include any kind of technology that provides the ability to identify multiple touches and/or a sequence of touches that are registered on the surface of touch panel 230. Touch panel 230 may also include the ability to identify movement of a body part or a pointing device as it moves on or near the surface of touch panel 230.
In one embodiment, touch panel 230 may include a capacitive touch overlay including multiple touch sensing points capable of sensing a touch. An object having capacitance (e.g., a user's finger) may be placed on or near touch panel 230 to form a capacitance between the object and one or more of the touch sensing points. The number and locations of the touch sensing points registering the touch may be used to determine touch coordinates (e.g., location and dimensions) of the touch. The touch coordinates may be associated with a portion of display 220 having corresponding coordinates.
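By way of illustration only, the following Python sketch shows one way touch coordinates and dimensions might be derived from the active sensing points of such an overlay. The grid representation, the node set, and the centroid/bounding-box approach are illustrative assumptions, not the described implementation.

```python
def touch_coordinates(active_nodes):
    """Return (center_x, center_y, width, height), in grid units, for a
    set of (x, y) sensing points that registered a touch."""
    if not active_nodes:
        return None
    xs = [x for x, _ in active_nodes]
    ys = [y for _, y in active_nodes]
    center_x = sum(xs) / len(xs)          # touch location (centroid)
    center_y = sum(ys) / len(ys)
    width = max(xs) - min(xs) + 1         # touch dimensions (node counts)
    height = max(ys) - min(ys) + 1
    return center_x, center_y, width, height

# Example: a fingertip covering a 3x2 patch of sensing points
print(touch_coordinates({(4, 7), (5, 7), (6, 7), (4, 8), (5, 8), (6, 8)}))
# -> (5.0, 7.5, 3, 2)
```

The grid coordinates could then be scaled to the corresponding portion of display 220.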
In another embodiment, touch panel 230 may include projection scanning technology, such as infra-red touch panels or surface acoustic wave panels that can identify, for example, dimensions of a human touch on the touch panel. For either infra-red or surface acoustic wave panels, the number of horizontal and vertical sensors (e.g., acoustic or light sensors) detecting the touch may be used to approximate the location of a touch.
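As a hedged illustration of projection scanning, the sketch below approximates a touch's location and extent from the indices of the horizontal and vertical sensors that detect the touch. The sensor indices shown are hypothetical.

```python
def approximate_touch(blocked_x, blocked_y):
    """Approximate touch center and extent from the indices of the
    vertical (blocked_x) and horizontal (blocked_y) sensors that
    detected the touch."""
    center = ((min(blocked_x) + max(blocked_x)) / 2,
              (min(blocked_y) + max(blocked_y)) / 2)
    extent = (len(blocked_x), len(blocked_y))  # sensors spanned in X, Y
    return center, extent

print(approximate_touch([12, 13, 14], [30, 31]))
# -> ((13.0, 30.5), (3, 2))
```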
Control buttons 240 may permit the user to interact with device 200 to cause device 200 to perform one or more operations. For example, control buttons 240 may be used to cause device 200 to transmit information and/or to activate drawing interface 100 on display 220.
Microphone 250 may receive audible information from the user. For example, microphone 250 may receive audio signals from the user and may output electrical signals corresponding to the received audio signals. Speaker 260 may provide audible information to a user of device 200. Speaker 260 may be located in an upper portion of device 200, and may function as an ear piece when a user is engaged in a communication session using device 200. Speaker 260 may also function as an output device for music and/or audio information associated with games and/or video images played on device 200.
Although FIG. 2 shows exemplary components of device 200, in other implementations, device 200 may contain fewer, different, differently arranged, or additional components than depicted in FIG. 2. As shown in FIG. 3, device 200 may include a processor 300, a memory 310, a user interface 320, a communication interface 330, and/or an antenna assembly 340.
Processor 300 may include one or more microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or the like. Processor 300 may control operation of device 200 and its components. In one implementation, processor 300 may control operation of components of device 200 in a manner described herein.
Memory 310 may include a random access memory (RAM), a read-only memory (ROM), and/or another type of memory to store data and instructions that may be used by processor 300. In one implementation, memory 310 may store data used to display a graphical user interface, such as drawing interface 100, on display 220.
User interface 320 may include mechanisms for inputting information to device 200 and/or for outputting information from device 200. Examples of input and output mechanisms might include buttons (e.g., control buttons 240, keys of a keypad, a joystick, etc.); a speaker (e.g., speaker 260) to receive electrical signals and output audio signals; a microphone (e.g., microphone 250) to receive audio signals and output electrical signals; a display (e.g., display 220) to receive touch input and/or to output visual information; a vibrator to cause device 200 to vibrate; and/or a camera to receive video and/or images.
Communication interface 330 may include, for example, a transmitter that may convert baseband signals from processor 300 to radio frequency (RF) signals and/or a receiver that may convert RF signals to baseband signals. Alternatively, communication interface 330 may include a transceiver to perform functions of both a transmitter and a receiver. Communication interface 330 may connect to antenna assembly 340 for transmission and/or reception of the RF signals.
Antenna assembly 340 may include one or more antennas to transmit and/or receive RF signals over the air. Antenna assembly 340 may, for example, receive RF signals from communication interface 330 and transmit them over the air, and receive RF signals over the air and provide them to communication interface 330. In one implementation, for example, communication interface 330 may communicate with a network and/or devices connected to a network.
As will be described in detail below, device 200 may perform certain operations described herein in response to processor 300 executing software instructions of an application contained in a computer-readable medium, such as memory 310. A computer-readable medium may be defined as a physical or logical memory device. A logical memory device may include a space within a single physical memory device or spread across multiple physical memory devices. The software instructions may be read into memory 310 from another computer-readable medium or from another device via communication interface 330. The software instructions contained in memory 310 may cause processor 300 to perform processes that will be described later. Alternatively, hardwired circuitry may be used in place of or in combination with software instructions to implement processes described herein. Thus, implementations described herein are not limited to any specific combination of hardware circuitry and software.
Although FIG. 3 shows exemplary components of device 200, in other implementations, device 200 may contain fewer, different, differently arranged, or additional components than depicted in FIG. 3. As shown in FIG. 4, exemplary functional components of device 200 may include a touch panel controller 400, a touch engine 410, graphical objects and data 420, and/or drawing logic 430.
Touch panel controller 400 may identify touch coordinates from touch panel 230. Coordinates from touch panel controller 400, including the identities of particular sensors in, for example, the X and Y dimensions, may be passed on to touch engine 410 to associate the touch coordinates with a location and/or object displayed on display 220. For example, touch panel controller 400 may identify which sensors indicate a touch on touch panel 230 and the locations of the sensors registering the touch. In one implementation, touch panel controller 400 may be included as part of processor 300.
Touch engine 410 may include hardware or a combination of hardware and software for processing signals that are received at touch panel controller 400. More specifically, touch engine 410 may use the signals received from touch panel controller 400 to detect touches on touch panel 230 and determine the dimensions, locations, and/or orientations of the touches. For example, touch engine 410 may use information from touch panel controller 400 to determine an approximate surface area of a touch. As described further herein, the touch dimensions, the touch location, and the touch orientation may be used to determine a location for a drawing object (e.g., drawing tool 120) associated with the touch. In one implementation, touch engine 410 may be included as part of processor 300.
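For illustration, the following sketch suggests how a touch engine might estimate a touch's surface area and orientation from the sensing points reported by a touch panel controller. The per-node area constant and the use of second-order moments (a principal-axis fit) are assumptions rather than the described implementation.

```python
import math

AREA_PER_NODE = 25.0  # assumed area (e.g., mm^2) covered by one sensing point

def touch_metrics(nodes):
    """Return (area, (cx, cy), angle) for a list of active (x, y) nodes.

    The orientation angle (radians) is the principal axis of the point
    distribution, computed from second-order central moments."""
    n = len(nodes)
    cx = sum(x for x, _ in nodes) / n
    cy = sum(y for _, y in nodes) / n
    mxx = sum((x - cx) ** 2 for x, _ in nodes) / n
    myy = sum((y - cy) ** 2 for _, y in nodes) / n
    mxy = sum((x - cx) * (y - cy) for x, y in nodes) / n
    angle = 0.5 * math.atan2(2 * mxy, mxx - myy)
    return n * AREA_PER_NODE, (cx, cy), angle

# An elongated touch pattern (finger approaching diagonally)
nodes = [(0, 0), (1, 1), (2, 2), (3, 3), (1, 0), (2, 1)]
print(touch_metrics(nodes))  # area, centroid, and ~45-degree orientation
```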
Graphical objects and data 420 may include, for example, user preferences, images, and/or templates. User preferences may include, for example, preferences for drawing settings and features, such as default drawing tip sizes/types, menu arrangements, shortcut commands, default directories, etc. Images may include, for example, definitions of stored images, such as tips for drawing tool 120, shapes, fill patterns, clip art, color palettes, and/or other drawing options that may be included on toolbar 130. Templates may include formats for drawing interface 100, such as flowcharts, maps, pictures, backgrounds, etc., which can be drawn over and/or revised on a display (e.g., display 220). Graphical objects and data 420 may be included, for example, in memory 310 (FIG. 3).
Drawing logic 430 may include hardware or a combination of hardware and software to display a drawing tool and drawing images based on signals from touch engine 410. For example, in response to signals that are received at touch panel controller 400, touch engine 410 may cause drawing logic 430 to display drawing tool 120 at a location associated with the location, dimensions, and/or orientation of the touch. Drawing logic 430 may also display an image (e.g., a line, a brush stroke, etc.) along the path of drawing tool 120 as a touch is moved along the surface of a capacitive display (e.g., touch screen 110). More particularly, in one implementation, drawing logic 430 may connect a series of registered coordinates for drawing tool 120 with a graphical image, such as a line.
Drawing logic 430 may connect successive points in the series of registered coordinates using substantially straight line segments. However, the use of straight lines may provide a rather coarse interpolation of the motion path of a touch as it is dragged along a touch screen. Thus, drawing logic 430 may also include smoothing logic to produce a smoother curve. Smoothing logic may include, for example, spline interpolation, polynomial interpolation, curve fitting, or other smoothing techniques, as illustrated in the sketch below. In another implementation, drawing logic 430 may provide different drawing-interface functions, such as selections, magnifications, placing/altering shapes, etc. Drawing logic 430 may be included as part of processor 300.
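The sketch below illustrates one such smoothing technique, a Catmull-Rom spline interpolated through the registered drawing-tool coordinates. The specific spline and sample density are assumptions; the description above does not mandate a particular method.

```python
def catmull_rom(points, samples_per_segment=8):
    """Return a smoothed polyline through `points` (a list of (x, y))."""
    if len(points) < 3:
        return list(points)
    # Duplicate the endpoints so the curve passes through the first and
    # last registered coordinates.
    pts = [points[0]] + list(points) + [points[-1]]
    out = []
    for i in range(len(pts) - 3):
        p0, p1, p2, p3 = pts[i], pts[i + 1], pts[i + 2], pts[i + 3]
        for s in range(samples_per_segment):
            t = s / samples_per_segment
            out.append(tuple(
                0.5 * ((2 * p1[k])
                       + (-p0[k] + p2[k]) * t
                       + (2 * p0[k] - 5 * p1[k] + 4 * p2[k] - p3[k]) * t ** 2
                       + (-p0[k] + 3 * p1[k] - 3 * p2[k] + p3[k]) * t ** 3)
                for k in (0, 1)))
    out.append(points[-1])
    return out

# A coarse drag path becomes a denser, smoother curve for rendering
curve = catmull_rom([(0, 0), (10, 2), (20, 12), (30, 13)])
print(len(curve), "points, e.g.", curve[:3])
```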
Although FIG. 4 shows exemplary functional components of device 200, in other implementations, device 200 may contain fewer, different, differently arranged, or additional functional components than depicted in FIG. 4.
Referring to FIG. 5, an exemplary surface of touch panel 230 may include an array of touch sensing nodes 502. When a user's finger touches the surface, the touch may register at multiple sensing nodes 502 within a finger position 510.
Still referring to FIG. 5, multiple sensing nodes 502 may register a single touch, such that finger position 510 encompasses a group of sensing nodes 502 rather than an individual node.
A drawing tool location 520 may be determined based on the sensing nodes 502 within finger position 510. In the example of FIG. 5, drawing tool location 520 may be positioned outside finger position 510 so that a drawing tool displayed at drawing tool location 520 remains visible to the user.
The area or approximated boundaries of finger position 510 may be calculated using the sensing nodes 502 within finger position 510. In one implementation, the locations of sensing nodes 502 within finger position 510 may be calculated to determine dimensions (e.g., X width and Y height dimensions) of the touch. In another implementation, device 200 may calculate a touch pattern to best fit sensing nodes 502 within finger position 510. For example, device 200 may calculate a best-fit ellipse to correspond to the sensing nodes 502 within finger position 510.
In an exemplary implementation, the number and location of sensing nodes 502 within finger position 510 may be calculated to determine an approach orientation of the touch that may be used to identify drawing tool location 520. For example, referring to FIG. 5, the elongated pattern of sensing nodes 502 within finger position 510 may indicate the direction from which the finger approaches touch panel 230, and drawing tool location 520 may be placed beyond the fingertip end of finger position 510, opposite the approach direction.
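As an illustrative sketch only, the following code offsets a drawing tool location beyond the fingertip end of a touch, given a touch center, principal-axis angle, and half-length (e.g., as estimated from the sensing nodes). The offset factor and the sign convention selecting the fingertip end are assumptions, not the described implementation.

```python
import math

OFFSET = 1.5  # assumed multiple of the touch's half-length

def drawing_tool_location(center, angle, half_length, approach_sign=+1):
    """Place the drawing tool outside the touch area, along the touch's
    major axis; `approach_sign` selects which end is the fingertip."""
    d = OFFSET * half_length * approach_sign
    return (center[0] + d * math.cos(angle),
            center[1] + d * math.sin(angle))

# Touch centered at (50, 80), tilted 45 degrees, half-length of 10 nodes
print(drawing_tool_location((50.0, 80.0), math.pi / 4, 10.0))
# -> roughly (60.6, 90.6): beyond the fingertip, clear of the touch area
```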
Although FIG. 5 shows an exemplary arrangement of sensing nodes 502, in other implementations, touch panels with different node arrangements, densities, and/or sensing technologies may be used.
A user may initiate a touch-based drawing mode to initiate process 600. As illustrated in FIG. 6, a touch may be detected on the touch panel (block 610), and the location, dimensions, and/or orientation of the touch may be determined (block 620). For example, device 200 (e.g., touch engine 410) may identify the sensing nodes registering the touch and may calculate the touch coordinates and approach orientation, as described above with respect to FIG. 5.
A drawing tip location may be calculated (block 630), and the drawing tip may be generated or moved (block 640). For example, based on the location and orientation of the touch, device 200 (e.g., touch engine 410) may calculate a drawing tip location associated with the location of the touch input, but somewhere outside the boundaries of the touch area. Device 200 (e.g., drawing logic 430) may then apply an image representing a drawing tip at the calculated drawing tip location. The drawing tip may be a default drawing tip or a particular drawing tip previously selected by a user (e.g., from toolbar 130). If an image representing a drawing tip is already being displayed, device 200 may move the image to the updated location.
A graphical image may be generated at coordinates associated with the drawing tip location (block 650). For example, device 200 (e.g., drawing logic 430) may apply a graphical image to join a previous drawing tip location to a current drawing tip location, thus forming a line between the two locations. The graphical image may be an image associated with the selected (or default) drawing tip. For example, one drawing tip may be associated with a small circular image (e.g., representing a sharp pencil), while another drawing tip may be associated with a larger circular image (e.g., representing a marker).
Smoothing logic may be applied (block 660). For example, device 200 (e.g., drawing logic 430) may apply smoothing logic to one or more segments of the graphical image. Smoothing logic may alter the connecting segments to provide a more visually pleasing result on the device display. In some implementations, application of smoothing logic may be optional.
It may be determined if there is a change to the location of the user input (block 670). For example, device 200 (e.g., touch panel controller 400) may detect a user's dragging the touch along the surface of the touch panel. Alternatively, the touch may be removed from the touch panel. If it is determined that there is a change to the location of the user input (block 670—YES), process 600 may return to block 620. If it is determined that the touch has been removed from the touch panel (block 670—NO), the drawing tip may be deactivated (block 690). For example, when device 200 (e.g., touch panel controller 400) detects that no touch sensors are active, device 200 (e.g., drawing logic 430) may remove the drawing tip from the display. The graphical image associated with the drawing tip may remain on the display.
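For illustration, process 600 might be organized as the following event loop. The callables (read_touch, tip_from_touch, draw_segment, smooth, render_tip, clear_tip) are hypothetical stand-ins for touch panel controller 400, touch engine 410, and drawing logic 430; the block references follow the description above.

```python
def run_drawing_mode(read_touch, tip_from_touch, draw_segment, smooth,
                     render_tip, clear_tip):
    prev_tip = None
    while True:
        touch = read_touch()             # blocks 610/620: detect and locate
        if touch is None:                # block 670 - NO: touch removed
            clear_tip()                  # block 690: deactivate drawing tip
            break                        # drawn image remains on the display
        tip = tip_from_touch(touch)      # block 630: calculate tip location
        render_tip(tip)                  # block 640: show or move the tip
        if prev_tip is not None:
            draw_segment(smooth(prev_tip, tip))  # blocks 650/660
        prev_tip = tip                   # block 670 - YES: loop to block 620

# Simulated drag across three points, then lift-off
touches = iter([(1, 1), (2, 2), (3, 3), None])
run_drawing_mode(
    read_touch=lambda: next(touches),
    tip_from_touch=lambda t: (t[0], t[1] - 1),   # tip offset above the touch
    draw_segment=lambda seg: print("draw", seg),
    smooth=lambda a, b: (a, b),                  # identity "smoothing"
    render_tip=lambda tip: print("tip at", tip),
    clear_tip=lambda: print("tip removed"),
)
```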
Touch-sensitive display 720 may include a display screen integrated with a touch-sensitive overlay. In an exemplary implementation, touch-sensitive display 720 may include a capacitive touch overlay. An object having capacitance (e.g., a user's finger) may be placed on or near display 720 to form a capacitance between the object and one or more of the touch sensing points. The touch sensing points may be used to determine touch coordinates (e.g., location), dimensions, and/or orientation of the touch. In other implementations, different touch screen technologies that accept a human touch input may be used.
Touch-sensitive display 720 may include the ability to identify movement of an object as the object moves on the surface of touch-sensitive display 720. As described above with respect to, for example, FIGS. 5 and 6, the identified movement may be used to position a drawing tool outside the area of the touch and to generate a graphical image along the path of the drawing tool.
Systems and/or methods described herein may include detecting a touch from a user's finger on the touch-sensitive display, the touch having a path of movement. A location, dimensions and/or orientation of the touch may be determined. A drawing tool may be displayed, on the touch-sensitive display, at a fixed distance outside an area of the touch, where the area of the touch may be determined based on the determined dimensions and/or orientation. The drawing tool may thus have a path of movement that is different than, but associated with, the path of movement of the touch. A fixed graphical image corresponding to the drawing tool path can be generated to provide a precise drawing interface.
The foregoing description of implementations provides illustration and description, but is not intended to be exhaustive or to limit the invention to the precise form disclosed. Modifications and variations are possible in light of the above teachings or may be acquired from practice of the invention.
For example, while implementations have been described primarily in the context of a touch-screen enabled mobile device (such as a radiotelephone, a PCS terminal, or a PDA), in other implementations the systems and/or methods described herein may be implemented on other touch-screen computing devices such as a laptop computer, a personal computer, a tablet computer, an ultra-mobile personal computer, or a home gaming system.
Also, while a series of blocks has been described with respect to FIG. 6, the order of the blocks may be modified in other implementations. Further, non-dependent blocks may be performed in parallel.
It will be apparent that aspects described herein may be implemented in many different forms of software, firmware, and hardware in the implementations illustrated in the figures. The actual software code or specialized control hardware used to implement these aspects is not limiting of the invention. Thus, the operation and behavior of these aspects were described without reference to the specific software code—it being understood that software and control hardware may be designed to implement these aspects based on the description herein.
Further, certain portions of the invention may be implemented as “logic” that performs one or more functions. This logic may include hardware, such as an application specific integrated circuit or a field programmable gate array, or a combination of hardware and software.
Even though particular combinations of features are recited in the claims and/or disclosed in the specification, these combinations are not intended to limit the disclosure of the invention. In fact, many of these features may be combined in ways not specifically recited in the claims and/or disclosed in the specification.
No element, act, or instruction used in the present application should be construed as critical or essential to the invention unless explicitly described as such. Also, as used herein, the article “a” is intended to include one or more items. Where only one item is intended, the term “one” or similar language is used. Further, the phrase “based on” is intended to mean “based, at least in part, on” unless explicitly stated otherwise.