The present disclosure relates to touch screens, and in particular to touch screen systems and methods that are based on touch location and touch force. All publications, articles, patents, published patent applications and the like cited herein are incorporated by reference herein in their entirety, including U.S. Provisional Patent Applications No. 61/564,003 and 61/564,024.
The market for displays and other devices (e.g., keyboards) having non-mechanical touch functionality is rapidly growing. As a result, touch-sensing techniques have been developed to enable displays and other devices to have touch functionality. Touch-sensing functionality is gaining wider use in mobile device applications, such as smart phones, e-book readers, laptop computers and tablet computers.
Touch-sensitive surfaces have become the preferred means by which users interact with portable electronic devices. To this end, touch systems in the form of touch screens have been developed that respond to a variety of types of touches, such as single touches, multiple touches, and swiping. Some of these systems rely on light scattering and/or light attenuation arising from optical contact with the touch-screen surface, which remains fixed relative to its support frame. An example of such a touch-screen system is described in U.S. Patent Application Publication No. 2011/0122091.
Commercial touch-based devices such as smart phones currently detect an interaction from the user as the presence of an object (e.g., a finger or stylus) on or near the display of the device. This is considered a user input and can be quantified by 1) determining whether an interaction has occurred, 2) calculating the X-Y location of the interaction, and 3) determining the duration of the interaction.
Touch screen devices are limited in that they can gather only location and timing data during user input. There is a need for additional intuitive inputs that allow for efficient operation and are not cumbersome for the user. By using touch events and input gestures, the user is not required to sort through tedious menus, which saves both time and battery life. Application programming interfaces (APIs) have been developed that characterize user inputs in the form of touches, swipes, and flicks as gestures that are then used to create an event object in software. The more user inputs that can be included in the API, the more robust the performance of the touch screen device.
The present disclosure is directed to a touch screen device that employs both location and force inputs from a user during a touch event. The force measurement is quantified by the deflection of a cover glass during the user interaction. The additional input parameter of force is thus available to the API for creating an event object in software. An object of the disclosure is the utilization of force information from a touch event with projected capacitive touch (PCT) data for the same touch event to generate software-based events in a human-controlled interface.
Force touch sensing can be accomplished using optical monitoring systems and methods, such as the systems and methods described in the following U.S. Provisional Patent Applications: 61/640,605; 61/651,136; and 61/744,831.
Many types of touch sensitive devices exist, such as analog resistive, projected capacitive, surface capacitive, surface acoustic wave (SAW), infrared, camera-based optical, and several others. The present disclosure is described in connection with a capacitive-based device such as a Projected Capacitive Touch (PCT) device, which has the advantage that it enables multiple touch detection and is very sensitive and durable. The combination of location sensing and force sensing in the touch screen system disclosed herein enables a user to supply unique force-related inputs (gestures). A gesture such as the pinch gesture can thus be replaced with pressing the touchscreen with different amounts of force.
There are numerous advantages to a touch screen device that utilizes a combination of force sensing and location sensing. The primary advantage of force monitoring is the intuitive interaction it provides for the user experience. It allows the user to press on a single location and modulate an object property (e.g., change a graphical image, change the volume of audio output, etc.). Previous approaches to one-finger events employ long-press gestures, such as swiping or prolonged contact with the touch screen. Using force data allows for faster response times that obviate long-press gestures. While a long-press gesture can operate using a predetermined equation for the response speed (i.e., a long-press gesture can cause a page to scroll at a set speed or at a rapidly increasing speed), force-based sensing allows the user to actively change the response in a real-time interaction. The user can thus vary the scroll speed, for instance, simply by varying the applied touching force. This provides a user experience that is more interactive and operationally more efficient, as illustrated by the sketch that follows.
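By way of a non-limiting illustration of such real-time force modulation, the following sketch (in C) maps a normalized force reading to a scroll rate; the function name and scaling constants are hypothetical and are not taken from the disclosed embodiments:

```c
/* Hypothetical sketch: map a normalized force reading (0-100) to a
 * scroll rate in pixels per frame. The bounds are illustrative only;
 * a harder press scrolls faster, a lighter press scrolls slower. */
float scroll_rate_px_per_frame(float normalized_force)
{
    const float min_rate = 1.0f;   /* gentle touch: slow scroll */
    const float max_rate = 40.0f;  /* hard press: fast scroll   */
    if (normalized_force < 0.0f)   normalized_force = 0.0f;
    if (normalized_force > 100.0f) normalized_force = 100.0f;
    return min_rate + (max_rate - min_rate) * (normalized_force / 100.0f);
}
```

Because the force reading is sampled continuously, the user changes the scroll rate in real time simply by pressing harder or softer, rather than waiting on a fixed long-press timing curve.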
Moreover, the use of force sensing combined with location sensing enables a wide variety of new touch-screen functions (APIs), as described below.
Additional features and advantages of the disclosure are set forth in the detailed description that follows, and in part will be readily apparent to those skilled in the art from that description or recognized by practicing the disclosure as described herein, including the detailed description that follows, the claims, and the appended drawings.
The claims as well as the Abstract are incorporated into and constitute part of the Detailed Description set forth below.
Cartesian coordinates are shown in certain of the Figures for the sake of reference and are not intended as limiting with respect to direction or orientation.
The present disclosure can be understood more readily by reference to the following detailed description, drawings, examples, and claims, and their previous and following description. However, before the present compositions, articles, devices, and methods are disclosed and described, it is to be understood that this disclosure is not limited to the specific compositions, articles, devices, and methods disclosed unless otherwise specified, as such can, of course, vary. It is also to be understood that the terminology used herein is for the purpose of describing particular aspects only and is not intended to be limiting.
The following description of the disclosure is provided as an enabling teaching of the disclosure in its currently known embodiments. To this end, those skilled in the relevant art will recognize and appreciate that many changes can be made to the various aspects of the disclosure described herein, while still obtaining the beneficial results of the present disclosure. It will also be apparent that some of the desired benefits of the present disclosure can be obtained by selecting some of the features of the present disclosure without utilizing other features. Accordingly, those who work in the art will recognize that many modifications and adaptations to the present disclosure are possible and can even be desirable in certain circumstances and are a part of the present disclosure. Thus, the following description is provided as illustrative of the principles of the present disclosure and not in limitation thereof.
Disclosed are materials, compounds, compositions, and components that can be used for, can be used in conjunction with, can be used in preparation for, or are embodiments of the disclosed method and compositions. These and other materials are disclosed herein, and it is understood that, when combinations, subsets, interactions, groups, etc. of these materials are disclosed, while specific reference to each of the various individual and collective combinations and permutations of these compounds may not be explicitly disclosed, each is specifically contemplated and described herein.
Thus, if a class of substituents A, B, and C are disclosed as well as a class of substituents D, E, and F, and an example of a combination embodiment, A-D is disclosed, then each is individually and collectively contemplated. Thus, in this example, each of the combinations A-E, A-F, B-D, B-E, B-F, C-D, C-E, and C-F are specifically contemplated and should be considered disclosed from disclosure of A, B, and/or C; D, E, and/or F; and the example combination A-D. Likewise, any subset or combination of these is also specifically contemplated and disclosed. Thus, for example, the sub-group of A-E, B-F, and C-E are specifically contemplated and should be considered disclosed from disclosure of A, B, and/or C; D, E, and/or F; and the example combination A-D. This concept applies to all aspects of this disclosure including, but not limited to any components of the compositions and steps in methods of making and using the disclosed compositions. Thus, if there are a variety of additional steps that can be performed it is understood that each of these additional steps can be performed with any specific embodiment or combination of embodiments of the disclosed methods, and that each such combination is specifically contemplated and should be considered disclosed.
Touch screen system 10 includes a conventional capacitive touch screen system 12, such as a PCT touch screen. Examples of capacitive touch screen system 12 are disclosed, for example, in the following U.S. Pat. Nos.: 4,686,443; 5,231,381; 5,650,597; 6,825,833; and 7,333,092. Touch screen system 10 also includes an optical force-sensing system 14 operably interfaced with or otherwise operably combined with capacitive touch screen system 12. Both capacitive touch screen system 12 and optical force-sensing system 14 are electrically connected to a microcontroller 16, which is configured to control the operation of touch screen system 10, as described below.
In an example, microcontroller 16 is provided along with the capacitive touch screen system 12 (i.e., constitutes part of the touch screen system) and is re-configured (e.g., re-programmed) to connect directly to force-sensing system 14 (e.g., via an I2C bus) and to receive and process force signals SF from optical force-sensing system 14. The microcontroller 16 may also be connected to a multiplexer (not shown) to allow for the attachment of multiple sensors.
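As a non-limiting sketch of such a direct connection, the following C fragment reads one proximity count over an I2C bus; the device address, register address, and the i2c_read_reg() primitive are hypothetical placeholders, not those of any specific sensor or platform:

```c
#include <stdint.h>

/* Hypothetical register map for a proximity sensor on the I2C bus;
 * the addresses below are placeholders, not those of any real part. */
#define PROX_SENSOR_ADDR  0x39
#define PROX_DATA_REG     0x9C

/* Platform-provided I2C primitive (assumed): reads one byte from a
 * device register and returns 0 on success. */
extern int i2c_read_reg(uint8_t dev_addr, uint8_t reg, uint8_t *out);

/* Read one proximity count (0-255) from a sensor head. */
int read_proximity_count(uint8_t *count)
{
    return i2c_read_reg(PROX_SENSOR_ADDR, PROX_DATA_REG, count);
}
```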
In an example, optical force-sensing system 14 is configured so that a conventional capacitive touch screen system 12 can be retrofitted to have both location-sensing and force-sensing functionality. In an example, optical force-sensing system 14 is configured as an adapter that is added onto capacitive touch-screen system 12. In an example, optical force-sensing system 14 optionally includes its own microcontroller 15 (shown in
Display system 11 also includes a flex circuit 50 that resides in frame interior 36 atop microcontroller 16 and batteries 18. Flex circuit 50 has a top surface 52 and ends 53. A plurality of proximity sensor heads 54H are operably mounted on the flex circuit top surface 52 near ends 53. With reference to
With reference again to
Display system 11 also includes a capacitive touch screen 70 adjacent display top surface 62 and spaced apart therefrom via spacers 66 to define an air gap 67. Capacitive touch screen 70 has top and bottom surfaces 72 and 74. Capacitive touch screen 70 is electrically connected to microcontroller 16 via electrical lines 76 (wiring), which in an example constitute a bus (e.g., an I2C bus). Electrical lines 76 carry the location signal SL generated by the capacitive touch screen.
Display system 11 also includes a transparent cover sheet 80 having top and bottom surfaces 82 and 84 and an outer edge 85. Transparent cover sheet 80 is supported by frame 30, with the bottom surface 84 of the transparent cover sheet contacting the top edge 33 of the frame at or near the outer edge 85. One or more light-deflecting elements 86 are supported on the bottom surface 84 of transparent cover sheet 80 adjacent and inboard of outer edge 85 so that they are optically aligned with a corresponding one or more proximity sensor heads 54H. In an example, light-deflecting elements 86 are planar mirrors. Light-deflecting elements 86 may be angled (e.g., wedge-shaped) to provide better directional optical communication between the light source 54L and the photodetector 54D of proximity sensor 54, as explained in greater detail below. In an example, the light-deflecting elements are curved. In another example, the light-deflecting elements comprise gratings or a scattering surface. Each proximity sensor head 54H, together with its corresponding light-deflecting element 86, defines a proximity sensor 54 that detects a displacement of transparent cover sheet 80 to ascertain an amount of touching force FT applied to the transparent cover sheet by a touch event TE.
In an example embodiment, transparent cover sheet 80 is disposed adjacent to and in intimate contact with capacitive touch screen 70, i.e., the bottom surface 84 of the transparent cover sheet 80 is in contact with the top surface 72 of capacitive touch screen 70. This contact may be facilitated by a thin layer of a transparent adhesive. Placing transparent cover sheet 80 and the capacitive touch screen 70 in contact allows them to flex together when subjected to touching force FT, as discussed below.
It is noted here that the optical force-sensing system 14 of
With continuing reference to
It is also noted that the deflection of transparent cover sheet 80 changes the distance between the light source 54L and photodetector 54D, and this change in distance can cause a change in the detected irradiance at the photodetector. Also, in an example, photodetector 54D can detect an irradiance distribution as well as changes to the irradiance distribution caused by a displacement of transparent cover sheet 80. The irradiance distribution can be, for example, a relatively small light spot that moves over the detector area, with the position of the light spot correlated to an amount of displacement and thus to an amount of touching force FT. In another example, the irradiance distribution has a pattern, such as one due to light scattering, and the scattering pattern changes as the transparent cover sheet is displaced.
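A non-limiting sketch of how a moving light spot could be reduced to a position value is given below in C; the centroid approach and the pixel-array interface are assumptions made for illustration, not a required implementation:

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical sketch: estimate the position of a light spot on a
 * linear photodetector array as an intensity-weighted centroid. The
 * centroid (in pixel units) can then be mapped to cover-sheet
 * displacement via a calibration curve. */
float spot_centroid(const uint16_t *pixels, size_t n)
{
    float sum = 0.0f, weighted = 0.0f;
    for (size_t i = 0; i < n; i++) {
        sum      += (float)pixels[i];
        weighted += (float)i * (float)pixels[i];
    }
    return (sum > 0.0f) ? (weighted / sum) : -1.0f; /* -1: no light */
}
```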
In an alternative embodiment illustrated in
In another example embodiment, transparent cover sheet 80, capacitive touch screen 70 and display 60 are adhered together. In this case, proximity sensor 54 can be operably arranged with respect to display 60, wherein either the proximity sensor head 54H or the light-deflecting element 86 is operably arranged on the top surface 62 of the display.
While the optical force-sensing system 14 of touch screen system 12 is described above in connection with a number of different examples of proximity sensor 54, other optical sensing means can be employed by modifying the proximity sensor. For example, proximity sensor 54 can be configured with reflective member 86 having a diffraction grating that diffracts rather than reflects light, with the diffracted light being detected by photodetector 54D.
Moreover, the light may have a spectral bandwidth such that different wavelengths of light within the spectral band can be detected and associated with a given amount of displacement of transparent cover sheet 80 (and thus an amount of touching force FT applied thereto). Light source 54L can also inject light into a waveguide that resides upon the bottom surface 84 of transparent cover sheet 80. The light-deflecting element 86 can be a waveguide grating configured to extract the guided light, with the extracted light traveling to photodetector 54D and being incident thereon in different amounts or at different positions, depending upon the displacement of the transparent cover sheet.
In another embodiment, proximity sensor 54 can be configured as a micro-interferometer by including a beamsplitter in the optical path that provides a reference wavefront to the photodetector. Using a coherent light source 54L, the reference wavefront and the wavefront reflected from light-deflecting element 86 can interfere at photodetector 54D. The changing fringe pattern (irradiance distribution) can then be used to establish the displacement of the transparent cover sheet due to touching force FT.
Also in an example, proximity sensor 54 can be configured to define a Fabry-Perot cavity, wherein the displacement of transparent cover sheet 80 causes a change in the finesse of the Fabry-Perot cavity that can be correlated to the amount of applied touching force FT used to cause the displacement. This can be accomplished, for example, by adding a second partially-reflective window (not shown) operably disposed relative to reflective member 86.
The proximity sensor heads 54H and their corresponding reflective members 86 are configured so that a change in the amount of touching force FT results in a change in the force signal SF by virtue of the displacement of transparent cover sheet 80. Meanwhile, capacitive touch screen 70 sends location signal SL to microcontroller 16 representative of the (x,y) touch location TL of touch event TE associated with touching force FT, as detected by known capacitive-sensing means. Microcontroller 16 thus receives both the force signal SF representative of the amount of force FT provided at the touch location TL and the location signal SL representative of the (x,y) position of the touch location. In an example, multiple force signals SF from different proximity sensors 54 are received and processed by microcontroller 16.
In an example, microcontroller 16 is calibrated so that a given value (e.g., voltage) for force signal SF corresponds to an amount of force. The microcontroller calibration can be performed by measuring the change in the force signal (due to a change in intensity or irradiance incident upon photodetector 54D) and associating it with a known amount of applied touching force FT at one or more touch locations TL. Thus, the relationship between the applied touching force FT and the force signal can be established empirically as part of a display system or touch screen system calibration process.
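As a non-limiting illustration of such an empirical calibration, the following C sketch performs a simple two-point linear mapping from a sensor reading to an estimated force; the linear response and the reference force f_ref are assumptions made for the example (a real system could use a multi-point lookup table instead):

```c
/* Illustrative two-point calibration: alpha is the sensor reading with
 * no applied force, beta the reading at a known reference force f_ref
 * (in newtons). A linear sensor response is assumed for this sketch. */
float estimate_force(float reading, float alpha, float beta, float f_ref)
{
    if (alpha == beta)
        return 0.0f;                       /* degenerate calibration */
    return f_ref * (alpha - reading) / (alpha - beta);
}
```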
Also in an example, the occurrence of a touch event TE can be used to zero the proximity sensors 54. This may be done in order to compensate the sensors for any temperature differences that may cause different proximity sensors 54 to perform differently.
Microcontroller 16 is configured to control the operation of touch screen system 10 and also to process the force signal(s) SF and the touch signal(s) SL to create a display function (e.g., for display system 11, an event object that has an associated action), as described below. In some embodiments, microcontroller 16 includes a processor 19a, a memory 19b, a device driver 19c and an interface circuit 19c (see
In an example, microcontroller 16 is configured or otherwise adapted to execute instructions stored in firmware and/or software (not shown). In an example, microcontroller 16 is programmable to perform the functions described herein, including the operation of touch screen system 10 and any signal processing that is required to measure, for example, relative amounts of pressure or force, and/or the displacement of the transparent cover sheet 80, as well as the touch location TL of a touch event TE. As used herein, the term microcontroller is not limited to just those integrated circuits referred to in the art as computers, but broadly refers to computers, processors, microcomputers, programmable logic controllers, application-specific integrated circuits, and other programmable circuits, as well as combinations thereof, and these terms can be used interchangeably.
In an example, microcontroller 16 includes software configured to implement or aid in performing the functions and operations of touch screen system 10 disclosed herein. The software may be operably installed in microcontroller 16 (e.g., in processor 19a). Software functionalities may involve programming, including executable code, and such functionalities may be used to implement the methods disclosed herein.
Such software code is executable by the microprocessor. In operation, the code and possibly the associated data records are stored within a general-purpose computer platform, within the processor unit, or in local memory. At other times, however, the software may be stored at other locations and/or transported for loading into the appropriate general-purpose computer systems. Hence, the embodiments discussed herein involve one or more software products in the form of one or more modules of code carried by at least one machine-readable medium. Execution of such code by a processor of the computer system or by the processor unit enables the platform to implement the functions described herein, in essentially the manner performed in the embodiments discussed and illustrated herein.
With reference again to
In an example embodiment of touch screen system 10, each force signal SF has a count value over a select range, e.g., from 0 to 255. In an example, a count value of 0 represents proximity sensor head 54H touching transparent cover sheet 80 (or the light-deflecting element 86 thereon), while a count value of 255 represents a situation where the light-deflecting element is too far away from the proximity sensor head. During calibration, a reading α from proximity sensor 54 with no force being applied to touch screen system 10 is recorded, along with the sensor reading β for a specified large amount of touching force FT.
The following equation shows how the data represented by force signal SF is normalized for a given proximity sensor 54; the same normalization is applied to each of the other proximity sensors. The normalization factor N is given by:
N = [(αA − A)/(αA − βA)]·100
where A is the proximity sensor data for force signal SF, αA is the reading from proximity sensor A with no force FT applied, and βA is the reading from proximity sensor A at maximum force FT; corresponding values of α and β are used for the other proximity sensors.
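A direct C implementation of this normalization might read as follows; the function name is illustrative, and the guard against a degenerate calibration (αA = βA) is an added assumption:

```c
/* Normalization from the equation above:
 * N = (alpha - A)/(alpha - beta) * 100, where A is the current
 * proximity count, alpha the no-force reading, and beta the
 * maximum-force reading for that sensor. Result: 0 (no force)
 * up to 100 (maximum calibrated force). */
float normalize_count(float A, float alpha, float beta)
{
    if (alpha == beta)
        return 0.0f;              /* avoid division by zero */
    return (alpha - A) / (alpha - beta) * 100.0f;
}
```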
The average of the data for all the normalized proximity sensors 54 is then taken. A further rolling-averaging step is used to smooth the data by taking an average of the three most recent averaged values. Table 1 below helps to illustrate this concept, wherein "AC#n" stands for "array column #n" and AVGR stands for the rolling average at different times T. At the initial time point, a blank three-column array is initialized in microcontroller 16 and contains no values. During the first time point, the first column (AC#1) is populated with the average of all normalized sensors (labeled P1). At the next time point, the data for P1 is moved to the second column (AC#2) and AC#1 is replaced with the average of all normalized sensors at the second time point (labeled P2).
This process continues for each time point. The average of the data in the three columns is taken as the final value, which is accessed by software for various applications. The rolling average from the array is ignored until all columns have been populated. The parameter P is given by:
P = normalized sensor data = normalized A + normalized B + normalized C + normalized D
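A non-limiting C sketch of the three-column array and rolling average AVGR described above follows; the structure and function names are illustrative, and an instance is assumed to be zero-initialized (e.g., rolling_avg_t ra = {0};):

```c
#define AVG_COLUMNS 3

/* Three-column array for the rolling average AVGR described above.
 * Each new per-time-point value P shifts the older values toward
 * column 3; AVGR is reported only once all columns are populated. */
typedef struct {
    float col[AVG_COLUMNS];       /* col[0..2] correspond to AC#1..AC#3 */
    int   filled;                 /* number of populated columns        */
} rolling_avg_t;

/* Push a new value P; returns 1 and writes AVGR when the array is
 * fully populated, else returns 0 (rolling average ignored). */
int rolling_avg_push(rolling_avg_t *ra, float p, float *avgr)
{
    ra->col[2] = ra->col[1];      /* AC#2 -> AC#3            */
    ra->col[1] = ra->col[0];      /* AC#1 -> AC#2            */
    ra->col[0] = p;               /* newest value into AC#1  */
    if (ra->filled < AVG_COLUMNS)
        ra->filled++;
    if (ra->filled < AVG_COLUMNS)
        return 0;                 /* not yet fully populated */
    *avgr = (ra->col[0] + ra->col[1] + ra->col[2]) / 3.0f;
    return 1;
}
```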
In one example, the value for AVGR is used in a custom drawing program in microcontroller 16 to modify the width of a display image in the form of a line when swiping. During the swipe, if a certain amount of force FT is applied, the width of the line increases; when less force is applied, the line width is reduced.
In example embodiments of the disclosure, an amount of touching pressure or touching force (pressure=FT/area) is applied at a touch location TL associated with a touch event TE. Aspects of the disclosure are directed to sensing the occurrence of a touch event TE, including relative amounts of applied force FT as a function of the displacement of transparent cover sheet 80. The time-evolution of the displacement (or multiple displacements over the course of time) and thus the time-evolution of the touching force FT can also be determined.
Thus, the amount as well as the time-evolution of the touching force FT is quantified by proximity sensors 54 and microcontroller 16 based on the amount of deflection of transparent cover sheet 80. Software algorithms in microcontroller 16 are used to smooth (e.g., filter) the force signal SF, eliminate noise, and normalize the force data. In this way, the applied force FT can be used in combination with the location information to manipulate the properties of graphics objects on a graphical user interface (GUI) of system 10, and also for control applications. Both one-finger and multiple-finger events can be monitored. The force information embodied in force signal SF can be used as a replacement for, or in conjunction with, other gesture-based controls, such as tap, pinch, rotation, swipe, pan, and long-press actions, among others, to cause system 10 to perform a variety of actions, such as selecting, highlighting, scrolling, zooming, rotating, and panning, as illustrated by the sketch below.
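By way of illustration only, a combined location-and-force sample such as might be handed to an API layer to build an event object could be represented as follows; the field names and layout are hypothetical, not an existing API:

```c
#include <stdint.h>

/* Hypothetical event object combining PCT location data (signal SL)
 * with the force channel (signal SF). Field names are illustrative. */
typedef struct {
    uint32_t timestamp_ms;   /* time of the touch sample           */
    float    x, y;           /* (x,y) touch location TL from SL    */
    float    force;          /* normalized force (0-100) from SF   */
    uint8_t  finger_count;   /* one- or multiple-finger event      */
} touch_force_event_t;
```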
For example, with reference to
In another example, system 10 replaces delay-based controls, such as long-press touches, to enable a faster response for an equivalent function. The touching force FT can be used to change an aspect of display image 200, for example, in a drawing application, to modify the width of a line or change the brush size during use (e.g., paint brush size, eraser size). For image-based applications, the force information from force signal(s) SF can be used to lighten or darken a photo or to adjust the contrast. In image applications or map programs, the force data can set the rate of image translation during panning, or the speed of image magnification during a zoom function, as discussed above.
Force-based touch data can be used in conjunction with another user gesture (e.g., pinch and zoom) to perform a certain action (e.g., lock, pin, crop). A hard press on the touch screen (i.e., a relatively large touching force FT) can be used to cause a display image (e.g., a graphic object) to flip (front to back) or to rotate by a select amount, e.g., 90 degrees. With reference to
In certain instances, there will be a maximum touching force FT that can be used. Rather than exceed the maximum touching force, in an example a pumping or pulsing action can be used, whereby an implement 20 presses with force multiple times in a given time period. This option can be useful for applications such as gaming or satellite imagery, where the user would like to zoom in or out at a much faster rate than the maximum applied force alone would allow.
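A non-limiting C sketch of detecting such a pumping or pulsing action follows; the force threshold, window length, and rising-edge detection scheme are illustrative assumptions (an instance is assumed to be zero-initialized):

```c
#include <stdint.h>

/* Hypothetical detector for the pumping action described above:
 * counts presses exceeding a force threshold within a time window.
 * Threshold and window length are illustrative placeholders. */
#define PULSE_THRESHOLD  80.0f    /* normalized force (0-100) */
#define PULSE_WINDOW_MS  1000u    /* observation window       */

typedef struct {
    uint32_t window_start_ms;
    int      pulse_count;
    int      was_pressed;         /* 1 while force is above threshold */
} pulse_detector_t;

/* Feed one force sample; returns the pulses seen in the current
 * window, which the application can map to a zoom rate multiplier. */
int pulse_detector_update(pulse_detector_t *pd, float force, uint32_t now_ms)
{
    if (now_ms - pd->window_start_ms > PULSE_WINDOW_MS) {
        pd->window_start_ms = now_ms;   /* start a new window      */
        pd->pulse_count = 0;
    }
    if (force > PULSE_THRESHOLD && !pd->was_pressed) {
        pd->pulse_count++;              /* rising edge = one pulse */
        pd->was_pressed = 1;
    } else if (force <= PULSE_THRESHOLD) {
        pd->was_pressed = 0;
    }
    return pd->pulse_count;
}
```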
This application claims the benefit of priority under 35 U.S.C. §119 of U.S. Provisional Application Ser. No. 61/738,047, filed on Dec. 17, 2012, the content of which is relied upon and incorporated herein by reference in its entirety.