The present invention generally relates to methods and systems for converting touch screen events into application formatted data.
Today, a wide variety of conventional touch screen systems are used in various applications. Examples of applications include retail sales, restaurants, point-of-sale terminals, kiosks, ATMs, medical systems, e-mail packages and the like. Touch screen systems typically include a display joined with a touch or proximity sensor mechanism. The sensor mechanism detects a user's finger or hand, or an instrument, when located proximate to the display. The display is controlled to present application-specific information to the user including, among other things, graphics, text, video and audio. Examples of application-specific information include virtual telephone pads, calculators, cash registers, keyboards, electronic documents and receipts, and windows. The application-specific graphics may represent toolbars, pop-up menus, scrollbars, text entry windows, icons, electronic writing or signature boxes and the like.
The sensor mechanism detects the presence of a finger or instrument and generates a touch screen event in response thereto. The touch screen event may represent a touch event, a release event, a streaming or drag event and the like. The touch screen event includes data or signals representative of the event type and identifying the position (or positions) at which the event occurred.
The display is controlled by the application running on a system computer. The application controls the display to present the application-specific information to the user. The display and touch screen function as a user interface, through which the user inputs data to the application. The user-entered data may represent dollar amounts, product information, patient/customer information, medical information, patient vitals, test results, internet addresses, website content, e-mail-related content and the like. The user may input the data by selecting a key, menu item or button, writing in a box, pressing virtual alphanumeric keys and the like.
However, in conventional touch screen systems, the application that drives the display also directly communicates with the sensor mechanism of the touch screen. When writing/modifying an application, the programmer defines the information to be displayed. In addition, due to the direct interaction between the application and the touch screen, the programmer is also required to incorporate, into the application, instructions defining the interface between the application and the touch screen. The interface instructions specify the characteristics of the touch screen events that may be entered at the touch screen by the user.
Generally, touch screens produce “raw” touch screen data, namely the event detected and the event position. The programmer is required to incorporate into the application functionality to a) validate and distinguish touch screen events, b) associate each event with the displayed information and c) act accordingly to control the related software application. Hence, the programmer needs a detailed understanding of the low-level format and operation of the touch screen sensor mechanism and the characteristics and content of the touch screen event. Further, numerous types of touch screens exist, each of which may utilize a different format for the touch screen events. Consequently, programmers are required to individualize each application to the corresponding type of touch screen.
A need exists for methods and systems that provide a generalized interface between application software and touch screen sensing mechanisms.
A method is provided for converting touch screen events into application-specific formatted data. The method includes detecting a touch screen event and identifying an active event zone associated with the touch screen, where the active event zone contains the touch screen event. The method further includes outputting application-specific formatted data based on the active event zone.
Optionally, the method may compare the touch screen event to a table of event zones and generate a list of potential event zones, from which the active event zone is then identified. Once the list of potential event zones is generated, the active event zone may be identified based on a priority ranking. When the touch screen event occurs inside of overlapping event zones, one event zone is identified as the active event zone based upon the priority ranking of the event zones. The touch screen event may comprise at least one of a touch event, a release event, or a drag event and comprise event position coordinates relative to a touch screen coordinate system. Each event zone may be assigned to at least one mode, such as a scroll mode, an electronic writing mode, a mouse functionality mode, a button mode and the like.
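By way of illustration only, the zone-matching and priority-resolution steps described above might be sketched in Python as follows. The sketch is not the claimed method; the names Zone, identify_active_zone and priority_rank are hypothetical placeholders for the structures described later.

    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class Zone:
        zone_id: int
        bounds: Tuple[int, int, int, int]   # (x_min, y_min, x_max, y_max)
        priority_rank: int = 0              # higher ranking wins on overlap

        def contains(self, x: int, y: int) -> bool:
            x0, y0, x1, y1 = self.bounds
            return x0 <= x <= x1 and y0 <= y <= y1

    def identify_active_zone(x: int, y: int, zones: List[Zone]) -> Optional[Zone]:
        # Compare the touch screen event against the table of event zones and
        # build the list of potential event zones containing the event position.
        potential = [z for z in zones if z.contains(x, y)]
        if not potential:
            return None
        # When the event falls inside overlapping zones, the priority ranking
        # determines which zone becomes the active event zone.
        return max(potential, key=lambda z: z.priority_rank)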
The term “touch screen” is used throughout in its broadest context. For example, the touch screen may represent an apparatus or device that presents graphical or image information, such as a liquid crystal display (LCD) with an integral or separable touch screen. The LCD may be touch sensitive. Alternatively, the touch screen may represent a physical device, such as a piece of glass, capable of sensing touch, where the physical device does not necessarily directly present graphical or image information. Instead, the touch sensitive physical device may be placed in front of a separate display screen. The term “touch screen” may refer to the touch sensitive physical device alone, as well as, more generally, to the display screen in combination with the touch sensitive physical device.
The information presented by, or in connection with, touch screen 10 includes a toolbar 12 comprising a plurality of button zones 14 (e.g., Button #1, Button #2, Button #3, etc.). A background zone 16 is denoted in the mid-portion of the touch screen 10 and has a pop-up menu 18 superimposed thereon. The pop-up menu 18 comprises a series of menu item zones 20-25, each of which is associated with an item function (e.g., Item #1, Item #2, etc.). By way of example only, the menu 18 may be generated when Button #1 is selected in button zone 14. A vertical scroll bar is presented to the user in a vertical scroll zone 26, while a horizontal scroll bar is presented to the user in a horizontal scroll zone 28. A signature box is presented in a writing zone 30. The zones 14-30 are associated with different event modes or characteristics as explained below in more detail.
The touch screen control module 50 includes a touch screen interface or driver 54 which transmits drive signals to the sensors within the touch screen 42. The touch screen control module 50 also includes an event type identifier module 56 and an event position identifier module 58 that process touch screen events received from the touch screen 42. The event type identifier module 56 identifies the event type, while the event position identifier module 58 identifies event position. Examples of event types include touch events, release events and drag or streaming events. The event position may be defined based upon the coordinate system of the touch screen 42 such as by a pixel location, a row and column designator or an X-Y coordinate combination.
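By way of illustration only, a touch screen event of the kind processed by modules 56 and 58 might be represented as follows; the Python names below (EventType, TouchScreenEvent) are hypothetical and are used only to make the event content concrete.

    from dataclasses import dataclass
    from enum import Enum

    class EventType(Enum):
        TOUCH = "touch"        # finger or instrument makes contact
        RELEASE = "release"    # contact is removed
        DRAG = "drag"          # streaming positions while contact is maintained

    @dataclass
    class TouchScreenEvent:
        event_type: EventType
        x: int                 # event position in the touch screen's own
        y: int                 # coordinate system (e.g., pixel or X-Y values)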
The touch screen control module 50 further includes a zone position table 60, a zone mode table 62, an application data set table 64 and an application interface 66.
The zone position table 60 contains a list of event zone records. Each event zone record is uniquely associated with an event zone. The list of event zone records in the zone position table 60 may contain all event zones utilized in connection with the touch screen 10 presented on the display 44. Alternatively, the zone position table 60 may store a complete list of event zone records associated with a plurality of touch screens 10 to be displayed on display 44 throughout operation of application 48. In the latter example, each event zone record would also include an “operational” field denoting event zones that are presently utilized in connection with a current touch screen 10.
Each event zone record may include, among other things, an event zone ID, coordinates defining the boundaries of the associated event zone, such as the diagonal corners of the event zone (e.g., Xn, Yn and xn, yn), the size of the event zone, the shape of the event zone, an overlap flag Foverlap, a preference ranking Prank and the like. Event zones may be rectangular, square, circular, elliptical, triangular, or any other bounded shape. The overlap flag Foverlap is utilized to indicate whether the event zone overlaps another event zone (e.g., pop-up windows). The preference or priority ranking Prank may be used to determine which event zone to activate when a touch screen event occurs within two or more overlapping event zones. An example may be when a pop-up menu overlaps another graphic, such as an icon, tool bar button and the like. The menu item zones in the pop-up window may be provided a higher priority or preference ranking than the event zone associated with the underlying graphic.
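By way of illustration only, an event zone record of the kind described above might be sketched as the following Python structure; the field names are hypothetical, and only the rectangular containment test is shown.

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass
    class EventZoneRecord:
        zone_id: int
        corner_a: Tuple[int, int]    # one diagonal corner of the zone
        corner_b: Tuple[int, int]    # the opposite diagonal corner
        shape: str = "rectangle"     # e.g., rectangle, circle, ellipse, triangle
        overlap_flag: bool = False   # Foverlap: the zone overlaps another zone
        priority_rank: int = 0       # Prank: resolves overlapping zones
        operational: bool = True     # zone applies to the current touch screen

        def contains(self, x: int, y: int) -> bool:
            # Containment test for the rectangular case; other shapes would
            # require their own geometric tests.
            (xa, ya), (xb, yb) = self.corner_a, self.corner_b
            return min(xa, xb) <= x <= max(xa, xb) and min(ya, yb) <= y <= max(ya, yb)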
The zone mode table 62 stores zone mode records containing an event zone ID and one or more event mode flags FmodeN. The event zone ID in the zone mode table 62 corresponds to the event zone ID in the zone position table 60 to afford a cross reference therebetween. The event mode flag FmodeN is used to correlate expected event types and/or sequences of events with application-specific responses which are output to the application 48 in the form of an application formatted data set. By way of example only, event modes may include Fmode1=“Touch Response in Event Zone”, Fmode2=“No Touch Response in Event Zone”, Fmode3=“Click on Touch”, Fmode4=“Click on Release”, Fmode5=“Drag on Touch”, Fmode6=“Double Click Left Button”, Fmode7=“Right Click Button” and the like.
In the above example, event mode Fmode1 indicates that, when a touch event is detected, the touch screen control module 50 should immediately output a touch response from the application interface 66 to the application 48. Event mode Fmode2 indicates that, when a touch event is detected, the touch screen control module 50 should not provide any output, but instead should ignore the touch event. Event mode Fmode3 indicates that, when a touch event is detected, the touch screen control module 50 should immediately output a command corresponding to the click of the left button on a computer mouse. Event mode Fmode4 indicates that the touch screen control module 50 should output a command corresponding to the click of the left button on a computer mouse only after detecting both a valid touch event and a valid release event. Event modes Fmode6 and Fmode7 indicate that the touch screen control module 50 should output commands corresponding to the double click of the left button and a single click of the right button, respectively, on a computer mouse after detecting a corresponding valid series of touch and release events within the associated event zone.
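By way of illustration only, the correlation of event mode flags with outputs might be sketched as follows; the mode constants, the example zone IDs and the simplified single-event handling are hypothetical, and a real module would track the complete touch/release sequences described above.

    TOUCH_RESPONSE    = 1   # Fmode1: respond immediately on touch
    NO_TOUCH_RESPONSE = 2   # Fmode2: ignore touch events in this zone
    CLICK_ON_TOUCH    = 3   # Fmode3: left click issued on the touch event
    CLICK_ON_RELEASE  = 4   # Fmode4: left click issued only after a valid release
    DRAG_ON_TOUCH     = 5   # Fmode5: stream positions while contact is maintained
    DOUBLE_CLICK_LEFT = 6   # Fmode6
    RIGHT_CLICK       = 7   # Fmode7

    zone_mode_table = {
        # event zone ID -> event mode flag (cross-referenced with the zone
        # position table by the event zone ID)
        1: CLICK_ON_RELEASE,   # e.g., a toolbar button zone
        2: DRAG_ON_TOUCH,      # e.g., a signature or writing zone
    }

    def mode_output(zone_id, event_type):
        mode = zone_mode_table.get(zone_id)
        if mode == NO_TOUCH_RESPONSE:
            return None                         # ignore the event entirely
        if mode == CLICK_ON_TOUCH and event_type == "touch":
            return "LEFT_CLICK"
        if mode == CLICK_ON_RELEASE and event_type == "release":
            return "LEFT_CLICK"
        if mode == DOUBLE_CLICK_LEFT and event_type == "release":
            return "LEFT_DOUBLE_CLICK"
        if mode == RIGHT_CLICK and event_type == "release":
            return "RIGHT_CLICK"
        return None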
The application data set table 64 stores data sets, each of which is formatted to the specific application. Each application formatted data set is defined by the application 48 and represents input values acceptable to the application 48. By way of example, an application formatted data set may represent a command associated with a single left button mouse click, a double left button mouse click, a right button mouse click, an ASCII character, an ASCII string of characters, a keyboard function such as an enter, a control, or an alt function, a function associated with a calculator, a series of coordinates such as those identifying a signature, or any system functional command that may be initiated by a data sequence from an input device. Optionally, the application formatted data sets may redefine or redirect the buttons or virtual keyboard keys, such as to reorder the key layout of the keyboard.
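By way of illustration only, the application data set table might be sketched as a simple lookup keyed by event zone; the entries and the helper below are hypothetical and merely show how zone-level events could be translated into data the application already accepts.

    # Hypothetical application data set table: zone ID -> application
    # formatted data set defined by the application.
    application_data_set_table = {
        1: {"command": "LEFT_CLICK"},   # e.g., a single left button mouse click
        2: {"keys": "ENTER"},           # e.g., a keyboard enter function
        3: {"text": "4"},               # e.g., an ASCII character from a keypad
    }

    def to_application_data(zone_id, extra=None):
        # Look up the data set the application defined for this zone and pass
        # it to the application through the application interface.
        data = dict(application_data_set_table.get(zone_id, {}))
        if extra:                       # e.g., a series of coordinates captured
            data.update(extra)          # in a signature or writing zone
        return data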
During initialization, the application 48 may load the zone position table 60, zone mode table 62, and application data set table 64 through the application interface 66. Optionally, the application may dynamically alter the zone position table 60, zone mode table 62, and application data set table 64 in real time.
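By way of illustration only, the loading and run-time alteration of the tables through the application interface might be sketched as follows; the class and method names are hypothetical.

    class TouchScreenControlModule:
        def __init__(self):
            self.zone_position_table = {}
            self.zone_mode_table = {}
            self.application_data_set_table = {}

        def load_tables(self, zones, modes, data_sets):
            # Called by the application during initialization.
            self.zone_position_table = dict(zones)
            self.zone_mode_table = dict(modes)
            self.application_data_set_table = dict(data_sets)

        def update_zone(self, zone_id, record=None, mode=None, data_set=None):
            # Called by the application in real time, e.g., when a pop-up menu
            # opens and its menu item zones must become operational.
            if record is not None:
                self.zone_position_table[zone_id] = record
            if mode is not None:
                self.zone_mode_table[zone_id] = mode
            if data_set is not None:
                self.application_data_set_table[zone_id] = data_set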
The application 48 and touch screen control module 50 may be implemented utilizing a single processor, parallel processors, separate dedicated processors and the like. The touch screen control module 50 may represent a separate entity from a host computer system running the application 48. Alternatively, the touch screen control module 50 may be implemented as part of the host computer system. Optionally, the functionality of the touch screen control module 50 and of the application 48 may be carried out in combination by host and separate computer systems, or as a distinct pair of separate and independent functional entities.
The operation of the touch screen control module 50 is explained below in more detail.
If a touch screen event position falls within the boundary of the event zone, at step 106 the event zone is added to a list of potential event zones. At step 108, it is determined a) whether the event zone analyzed at step 106 is the last event zone in the zone position table 60, b) whether the event zone is a background zone, and c) whether an overlap flag has been set in connection with the current event zone. The overlap flag is set when the current event zone overlaps another event zone on the display 44. If the decision at step 108 is yes, flow passes to step 110, at which processing moves to the next event zone record in the zone position table 60.
At step 112, it is determined whether the overlap flag is clear for the event zones on the potential event zone list. If yes, flow passes to step 118.
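By way of illustration only, one possible reading of the overlap handling at steps 112 and following is sketched below; the helper name and the exact branching are hypothetical and stand in for the flow described above.

    def resolve_active_zone(potential_zones):
        # If no zone contains the event, there is no active event zone.
        if not potential_zones:
            return None
        # If the overlap flags are clear for the listed zones, the potential
        # zone is taken as the active event zone (step 118).
        if all(not z.overlap_flag for z in potential_zones):
            return potential_zones[0]
        # Otherwise the priority ranking selects among the overlapping zones.
        return max(potential_zones, key=lambda z: z.priority_rank)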
Optionally, the application-based coordinate system may differ from the coordinate system of the touch screen 42. For example, the touch screen 42 may include a coordinate system having a first resolution (e.g., 4000×4000), while the application-based coordinate system has a lower resolution (e.g., 1024×1024). Alternatively, the touch screen 42 may operate based on a polar coordinate system, while the application-based coordinate system may be Cartesian coordinates (or vice versa). The touch screen control module 50 would perform a conversion between coordinate systems.
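By way of illustration only, such conversions might be sketched as follows; the resolutions shown are the example values above, and the function names are hypothetical.

    import math

    def to_application_coordinates(x_touch, y_touch,
                                   touch_res=(4000, 4000),
                                   app_res=(1024, 1024)):
        # Scale an event position from the touch screen resolution down to
        # the lower application-based resolution.
        x_app = x_touch * app_res[0] // touch_res[0]
        y_app = y_touch * app_res[1] // touch_res[1]
        return x_app, y_app

    def polar_to_cartesian(radius, angle_radians):
        # Conversion for a touch screen reporting polar coordinates while the
        # application-based coordinate system is Cartesian.
        return radius * math.cos(angle_radians), radius * math.sin(angle_radians)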
Optionally, the touch screen control module 50 may provide a “delayed drag” function such that, when a user drags a finger or instrument across the touch screen, the underlying graphical representation following the user's finger (e.g., the mouse cursor or a drawn line) would lag behind the user's finger. Alternatively, the touch screen control module 50 may provide an “extended touch” function proximate to the border of the touch screen such that, as the user's finger approaches the border of the touch screen, the event position information output to the application 48 is indexed closer to the border than the actual position of the user's finger. The extended touch function may be useful when an event zone is small and located close to the corner or side of the display 44, such as the maximize, minimize and close icons on a window.
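By way of illustration only, the extended touch function might be sketched as follows; the margin and push values, like the function name, are hypothetical.

    def extended_touch(x, y, width, height, margin=40, push=20):
        # As the finger approaches a border of the touch screen, the reported
        # position is indexed closer to that border than the actual touch,
        # making small zones near a corner or side easier to reach.
        if x < margin:
            x = max(0, x - push)
        elif x > width - margin:
            x = min(width - 1, x + push)
        if y < margin:
            y = max(0, y - push)
        elif y > height - margin:
            y = min(height - 1, y + push)
        return x, y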
While the invention has been described in terms of various specific embodiments, those skilled in the art will recognize that the invention can be practiced with modification within the spirit and scope of the claims.