1. Technical Field
The present application relates generally to touch devices, and more specifically to systems, methods, and devices for improving the accuracy of touch screens near an edge of the screen.
2. Description of the Related Art
Advances in technology have resulted in smaller and more powerful computing devices. For example, there currently exist a variety of portable computing devices, including wireless computing devices such as wireless telephones, personal digital assistants (PDAs), and tablet computers that are small, lightweight, and easily carried by users. In order to simplify user interfaces and to avoid pushbuttons and complex menu systems, such portable computing devices may use touch screen displays that detect user gestures on the touch screen and translate the detected gestures into commands to be performed by the device. Such gestures may be performed using one or more fingers or a stylus type pointing implement.
Processing overhead refers to the share of a device's central processing unit (CPU) capacity that is consumed by individual computing tasks, such as touch detection. In total, these tasks must require less than the processor's overall capacity. Simple touch gestures may typically be handled by a touchscreen controller, which is a separate processor associated with the touch screen, but more complex touch gestures require the use of a secondary processor, often the mobile device's CPU, to process large amounts of touch data. Typically, large amounts of touch data must be processed to determine the nature of the touch, sometimes only to conclude that the touch was a “false positive,” consuming significant CPU capacity and device power. The processing overhead required for complex touch recognition may consume a large percentage of the overall CPU capacity, impairing device performance.
The current generation of mobile processors is not well adapted to deal with increasing touch complexity and corresponding CPU overhead, especially in conjunction with the many other common high performance uses of mobile devices. Increasing the size of the mobile processor core or cache delivers performance increases only up to a certain level, beyond which heat dissipation issues make any further increase in core and cache size impractical. Overall processing capacity is further limited by the size of many mobile devices, which limits the number of processors that can be included in the device. Additionally, because mobile computing devices are generally battery-powered, high performance uses also shorten battery life.
Despite mobile processing limitations, many common mobile applications such as maps, games, email clients, web browsers, etc., are making increasingly complex use of touch recognition. Further, touch processing complexity increases in proportion to touch-node capacity, which in turn increases in proportion to display size. Therefore, because there is a trend in many portable computing devices toward increasing display size and touch complexity, touch processing is increasingly reducing device performance and threatening battery life. Further, user interaction with a device through touch events is highly sensitive to latency, and user experience can suffer when low-throughput interfaces between the touchscreen panel and the host processor cause processing delay, response lag, or incorrect touch position estimation for touch events near the screen edge.
The systems, methods, devices, and computer program products discussed herein each have several aspects, no single one of which is solely responsible for the desirable attributes disclosed herein. Without limiting the scope of this invention as expressed by the claims which follow, some features are discussed briefly below.
Embodiments and innovations described herein relate to systems and methods that may be run in a processor for an electronic device to correct the position of a touch input. Preferably, touch position correction methods have a wide range of controls and can be implemented in existing hardware or software. However, in some embodiments, specially designed hardware and software may improve speed or efficiencies of such processes.
One innovation of the disclosure provides a method of correcting the position of a touch input. The method includes identifying a bias model for touch positions on a touch screen, receiving a touch input from a touch screen, determining a position of a centroid corresponding to the touch input, determining a bias based on the position and the bias model, and adjusting the position based on the bias. In some aspects of the method receiving a touch input from the touch screen comprises receiving a plurality of input points, each input point including location information and an indication of the strength of the touch (for example, an x value, a y value, and an amplitude (or magnitude) value). Some aspects of the method include determining an average pointing object size, comparing a number of points corresponding to the average pointing object size and a number of points corresponding to the touch input, and determining the bias based on the comparison. In some aspects, determining an average pointing object size comprises averaging a number of touch input points present in a plurality of touch centroids.
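As a purely illustrative sketch (not the claimed implementation, and with all function and variable names hypothetical), the average pointing-object size described above might be estimated by averaging the number of input points contributing to previously observed touch centroids, and then compared against the point count of the current touch:

```python
def average_object_size(prior_centroids):
    """Estimate the average pointing-object size as the mean number of
    touch input points across previously observed touch centroids."""
    if not prior_centroids:
        return 0.0
    return sum(len(c) for c in prior_centroids) / len(prior_centroids)

def point_count_suggests_bias(touch_points, prior_centroids, tolerance=0.75):
    """Flag a touch whose point count falls well below the average
    object size, suggesting incomplete (e.g., edge-clipped) sensor data."""
    avg = average_object_size(prior_centroids)
    return avg > 0 and len(touch_points) < tolerance * avg
```

The `tolerance` factor is an assumed tuning parameter; an actual implementation could choose the bias itself based on how far the comparison falls short.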
Another innovation disclosed is an apparatus for correcting the position of a touch input. The apparatus includes a processor, a touch screen, and a memory operably connected to the processor and configured to store instructions that, when executed, cause the processor to identify a bias model for touch positions on a touch screen, receive a touch input from a touch screen, determine a position of a centroid corresponding to the touch input, determine a bias based on the position and the bias model, and adjust the position based on the bias.
In some innovations, the processor is further configured to receive a touch input from the touch screen by receiving a plurality of input points, each input point including an x value, a y value, and an amplitude. In some aspects, the memory stores processor instructions that further configure the processor to determine an average pointing object size, compare a number of points corresponding to the average pointing object size and a number of points corresponding to the touch input, and determine the bias based on the comparison. In some aspects of the apparatus determining an average pointing object size comprises averaging a number of touch input points present in a plurality of touch centroids.
Another innovation disclosed is a method of correcting the position of a touch input. The method includes receiving a touch input from a touch screen, determining a position of a centroid corresponding to the touch input, determining a bias based on the position and a bias model, and adjusting the position based on the bias. In some aspects, receiving a touch input from the touch screen comprises receiving a plurality of input points, each input point including an x value, a y value, and an amplitude. In some aspects, the method also includes determining an estimated pointing object size, determining a size of a bias region based on the estimated pointing object size, and determining a bias for the touch input based on the position of the centroid relative to the bias region.
Another innovation disclosed is an apparatus for correcting the position of a touch input. The apparatus includes a processor, a touch screen, and a memory operably connected to the processor and configured to store instructions that, when executed, cause the processor to: receive a touch input from a touch screen, determine a position of a centroid corresponding to the touch input, determine a bias based on the position and a bias model, and adjust the position based on the bias. In some aspects, the memory stores additional instructions that further configure the processor to receive a touch input from the touch screen by receiving a plurality of input points, each input point including an x value, a y value, and an amplitude. In some aspects, the memory stores processor instructions that further configure the processor to determine an estimated pointing object size, determine a size of a bias region based on the estimated pointing object size, and determine a bias for the touch input based on the position of the centroid relative to the bias region.
In one innovation, a method of adjusting the position of a touch input is disclosed. The method includes the steps of receiving a touch input, determining a centroid of the touch input, the centroid indicating an estimated touch position of the touch input on a touch panel, and determining whether to apply a bias to adjust the estimated touch position. In some aspects, receiving a touch input comprises receiving information from a plurality of touch sensors of the touch panel. In some aspects, the information from each of the plurality of touch sensors represents an x position value, a y position value, and an amplitude of the estimated touch position. In some aspects, the method further includes adjusting one or more of the x position value and the y position value of the estimated touch position based on the bias. The method may further include the steps of determining an estimated pointing object size, determining a size of a bias region based on the estimated pointing object size, and determining a bias based on the position of the centroid relative to the bias region. In some aspects, the method further includes determining a bias to apply and storing bias information in a device that comprises the touch panel. In some aspects, the bias is based on an expected size of an object making the touch input. In some aspects, determining whether to apply a bias to adjust the estimated touch position comprises comparing the touch position of the estimated position to a determined area of the touch panel, and applying the bias if the estimated touch position is within the determined area of the touch panel. In some aspects, the method further includes applying the bias to the estimated touch position to determine an adjusted estimated touch position of the touch input on the touch panel.
In another innovation, an apparatus for adjusting the position of a touch input includes a processor, a touch device, and a memory operably connected to the processor and configured to store instructions that, when executed, cause the processor to receive a touch input, determine a centroid of the touch input, the centroid indicating an estimated touch position of the touch input on a touch panel, and determine whether to apply a bias to adjust the estimated touch position. In some aspects, receiving a touch input comprises receiving information from a plurality of touch sensors of the touch panel. In some aspects, the information from each of the plurality of touch sensors represents an x position value, a y position value, and an amplitude of the estimated touch position. In some aspects, the processor is further configured to adjust one or more of the x position value and the y position value of the estimated touch position based on the bias. In some aspects, the memory stores processor instructions that further configure the processor to determine an estimated pointing object size, determine a size of a bias region based on the estimated pointing object size, and determine a bias based on the position of the centroid relative to the bias region. In some aspects, the memory stores processor instructions that further configure the processor to determine a bias to apply and store bias information in a device that comprises the touch panel. In some aspects, the bias is based on an expected size of an object making the touch input. In some aspects, determining whether to apply a bias to adjust the estimated touch position comprises comparing the touch position of the estimated position to a determined area of the touch panel, and applying the bias if the estimated touch position is within the determined area of the touch panel.
In some aspects, the memory is further configured to store processor instructions that configure the processor to apply the bias to the estimated touch position to determine an adjusted estimated touch position of the touch input on the touch panel.
Yet another innovation discloses a system for adjusting the position of a touch input. The system includes a control module configured to receive a touch input, determine a centroid of the touch input, the centroid indicating an estimated touch position of the touch input on a touch panel, and determine whether to apply a bias to adjust the estimated touch position. In some aspects, receiving a touch input comprises receiving information from a plurality of touch sensors of the touch panel. In some aspects, the information from each of the plurality of touch sensors represents an x position value, a y position value, and an amplitude. In some aspects, the control module is further configured to adjust one or more of the x position and the y position of the touch position based on the bias. In some aspects, the control module is further configured to determine an estimated pointing object size, determine a size of a bias region based on the estimated pointing object size, and determine a bias based on the position of the centroid relative to the bias region. In some aspects, the control module is further configured to determine a bias to apply and store bias information in a device that comprises the touch panel, apply the bias to the estimated touch position to determine an adjusted estimated touch position of the touch input on the touch panel, and use the adjusted estimate of the touch input on the touch panel as user input for a selection on a display touch panel. The bias is based on an expected size of an object making the touch input and determining whether to apply a bias to adjust the estimated touch position comprises comparing the touch position of the estimated position to a determined area of the touch panel and applying the bias if the estimated touch position is within the determined area of the touch panel.
In another innovation, a non-transitory computer-readable medium stores instructions that, when executed, cause at least one physical computer processor to perform a method of adjusting the position of a touch input. The method includes the steps of receiving a touch input, determining a centroid of the touch input, the centroid indicating an estimated touch position of the touch input on a touch panel, and determining whether to apply a bias to adjust the estimated touch position. In some aspects, receiving a touch input comprises receiving information from a plurality of touch sensors of the touch panel. In some aspects, the information from each of the plurality of touch sensors represents an x position value, a y position value, and an amplitude. In some aspects, the method further includes adjusting one or more of the x position and the y position of the touch position based on the bias. In some aspects, the method further includes determining an estimated pointing object size, determining a size of a bias region based on the estimated pointing object size, and determining a bias based on the position of the centroid relative to the bias region. In some aspects, the method further includes determining a bias to apply, applying the bias to the estimated touch position to determine an adjusted estimated touch position of the touch input on the touch panel, and storing bias information in a device that comprises the touch panel. The bias is based on an expected size of an object making the touch input and determining whether to apply a bias to adjust the estimated touch position comprises comparing the touch position of the estimated position to a determined area of the touch panel and applying the bias if the estimated touch position is within the determined area of the touch panel.
The disclosed aspects will hereinafter be described in conjunction with the appended drawings, provided to illustrate and not to limit the disclosed aspects, wherein like designations denote like elements.
Embodiments disclosed herein relate to touch panels that are input interfaces configured to receive a “touch input” from a user, for example, by a stylus or a user's finger(s). A touch input may also be referred to herein as a “touch event.” Many touch panels used on computers and mobile devices also include a display, allowing a user to interact with displayed information. Such computers and devices include, but are not limited to, cell phones, tablet computers, cameras, appliances, gas pumps, office equipment, communication equipment, banking equipment, automobiles, grocery and retail equipment, and a variety of other consumer and commercial devices, including both wireless and non-wireless devices.
A touch panel is configured with sensor technology to sense a location of the touch input. For example, a touch panel may include a number of sensors arranged in columns and rows across the touch panel. In most if not all touch panel implementations, a touch input generates information related to a “strength” and a “location” or “touch position” of the touch input, and the generated information can be further processed as user input. The information may be, for example, one or more signals representing the location of the touch input and the strength of the touch input. The signal(s) representing the location of the touch input indicates where on the touch panel the touch input occurred, and may be generally described as an (x,y) location on the touch panel. Because a stylus or a finger may be larger than a sensor on the touch panel, a single touch input may contact multiple sensors on the touch panel. The strength of the touch input may be determined in various ways, one example being the number of sensors that are contacted (or actuated) by the touch input. The number of actuated sensors may depend on the size of the stylus or finger making the touch input, where a finger pressing hard on the touch panel will generally actuate more touch sensors because the finger is flattened out. The number of actuated sensors may also depend on the size of the sensors and the configuration of the sensors on the touch panel. In another example, the strength may be determined by the length of time a touch input is made on the touch panel. In another example, the strength of the touch input may be determined based on the amount of physical deflection that occurs on the touch panel as a result of the touch. As one having ordinary skill in the art will appreciate, the particular information generated by the touch input relating to the location and strength of the touch input may be based on the technology of a particular touch panel.
The sensors of a touch panel are generally small so that when a touch input is made by a user with a finger or a stylus, multiple sensors may detect the touch input. Generally more sensors detect a touch input when a finger, rather than a stylus, is used due to the larger contact surface of a finger. To estimate the location a user intended to touch when multiple touch sensors are actuated by the touch input, a touch panel may process information received from the multiple touch sensors and determine a “center” of the touch input. In some embodiments, a centroid of the touch input is determined based on the information received from the multiple actuated touch sensors. The centroid (or geometric center) of the touch input region may be generally defined as the arithmetic mean position of all the sensors in the footprint of the touch input, that is, the mean position of all the sensors that are actuated. Because information from the touch sensors indicates a signal strength of the touch input for each sensor, the sensor position and the strength of each touch sensor may be used to determine a centroid of the touch input (for example, by weighting each actuated sensor by the strength of the touch on that sensor), and the location of the centroid is used as the intended touch point on the touch panel.
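The amplitude-weighted centroid described above can be sketched as follows; this is an illustrative example only, assuming each actuated sensor reports an (x, y, amplitude) tuple, and is not the claimed implementation:

```python
def centroid(points):
    """Compute the amplitude-weighted centroid of a touch input.

    `points` is a list of (x, y, amplitude) tuples, one per actuated
    touch sensor; the result is the estimated touch position, i.e. the
    mean sensor position weighted by per-sensor signal strength.
    """
    total = sum(a for _, _, a in points)
    if total == 0:
        raise ValueError("no touch signal in input points")
    cx = sum(x * a for x, _, a in points) / total
    cy = sum(y * a for _, y, a in points) / total
    return cx, cy
```

For instance, two equally strong sensors at x = 0 and x = 2 yield a centroid at x = 1, while a stronger signal on the second sensor pulls the centroid toward it.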
On many display touch panels, a touch input made near the edge of the touch panel may generate less information, and thus be less accurate, than a touch input made in the middle of the panel, because the touch panel may not have touch sensors disposed near its edges, even though it may appear to a user that a touch input can be made near the edge of the display touch panel. Additionally, a touch input received at, or near, the edge of a touch panel may be partially off the touch panel, resulting in inaccurate information being generated by the touch panel. For example, when a user makes a touch input on an icon displayed at the edge of a display touch panel, the user's finger, when it is in contact with the touch panel display, may extend past the edge of the touch panel display, resulting in inaccurately generated touch information. Additionally, depending on the technology of a touch panel, electronic noise and shadows (for example, caused by the stylus or finger) may lead to an inaccuracy in a touch input. Because of such inaccuracies, touch inputs made near the edge of a touch panel may need to be made more than once to correctly indicate a user's desired input. Problems relating to accuracy of a touch input may also occur anywhere on the touch panel. To address such issues, embodiments described herein may process information received from a touch input near the edge of a display to provide a more accurate determination of the location and strength of the touch input, resulting in a more accurate and more efficient input touch panel interface. For example, a calculated center position (for example, a centroid) may be adjusted to remove bias in its position that is a result of having incomplete touch sensor information.
In the following description, specific details are given to provide a thorough understanding of the examples. However, it will be understood by one of ordinary skill in the art that the examples may be practiced without these specific details. For example, electrical components/devices may be shown in block diagrams in order not to obscure the examples in unnecessary detail. In other instances, such components, other structures and techniques may be shown in detail to further explain the examples.
As shown in the embodiment illustrated in
The processor 104 is representative of a processing system that may include one or more processors. The one or more processors may be implemented with any combination of general-purpose microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate array (FPGAs), programmable logic devices (PLDs), controllers, state machines, gated logic, discrete hardware components, dedicated hardware finite state machines, or any other suitable entities that can perform calculations or other manipulations of information.
Such a processing system may also include machine-readable media for storing software. Software shall be construed broadly to mean any type of instructions, whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise. Instructions may include code (e.g., in source code format, binary code format, executable code format, or any other suitable format of code). The instructions, when executed by the one or more processors, cause the processing system to perform the various functions described herein.
The transmitter 110 may be configured to wirelessly transmit packets having different packet types or functions. For example, the transmitter 110 may be configured to transmit packets of different types generated by the processor 104. When the device 100 is implemented or used as an access point or station, the processor 104 may be configured to process packets of a plurality of different packet types. For example, the processor 104 may be configured to determine the type of packet and to process the packet and/or fields of the packet accordingly. The receiver 112 may be configured to wirelessly receive packets having different packet types. In some aspects, the receiver 112 may be configured to detect a type of a packet used and to process the packet accordingly.
The device 100 may also include a signal detector 118 that may be used in an effort to detect and quantify the level of signals received by the transceiver 114. The signal detector 118 may detect such signals as total energy, energy per subcarrier per symbol, power spectral density and other signals.
The device 100 may further comprise a user interface 122 that includes a touch panel 142. The user interface 122 may include any element or component that conveys information to a user of the device 100 and/or receives input from the user. Systems and methods for improving the accuracy of touch position estimates near the screen edge can be implemented in device 100.
As illustrated in the embodiment of
Although a number of separate components are illustrated in
The display 140 of the user interface 122 may include a touch panel 142. The touch panel 142 may be incorporated in a display 140. In various embodiments, the display 140 may include, for example, LED, LCD, or plasma technology to display information. The display 140 also may include a display component 144, which may be, in some embodiments, coupled to a user interface processor 160 or processor 104 for receiving information (for example, images, text, symbols or video) to display visually to a user.
The touch panel 142 may have implemented therein one or a combination of touch sensing technologies, for example, capacitive, resistive, surface acoustic wave, or optical touch sensing. In some embodiments, touch panel 142 may be positioned over (or overlay) display component 144 in a configuration such that visibility of the display component 144 is not impaired. In other embodiments, the touch panel 142 and display component 144 may be integrated into a single panel or surface. The touch panel 142 may be configured to operate with display component 144 such that a touch input on the touch panel 142 is associated with a portion of the content displayed on display component 144 corresponding to the location of the touch on touch panel 142. The display component 144 may also be configured to respond to a touch input on the touch panel 142 by displaying, for a limited time, a visual representation of the touch.
Still referring to the embodiment of
The processing module 154 may be configured to analyze touch events, including adjusting touch position estimates as described in further detail below to improve the accuracy of the touch position, and to communicate touch data to user interface processor 160. The processing module 154 may, in some embodiments, include instructions that when executed act as a touch screen controller (TSC). The specific type of TSC implemented can depend on the type of touch technology used in touch panel 142. The processing module 154 may be configured to start up when the touch detection module 152 indicates that a touch input has occurred on touch panel 142, and to power down after release of the touch. This feature may be useful for power conservation in battery-powered devices.
Processing module 154 may be configured to perform filtering on touch input information received from the touch detection module 152. For example, in an embodiment of the display 140 where the touch panel 142 is disposed on top of a display component 144 that includes an LCD screen, the LCD screen may contribute noise to the coordinate position measurement of the touch input. This noise may be a combination of impulse noise and Gaussian noise. The processing module 154 may be configured with median and averaging filters to reduce this noise. Instead of using only a single sample for a coordinate measurement of the touch input, the processing module 154 may be programmed to instruct the touch detection module 152 to provide more than one sample (e.g., two, four, eight, or 16 samples). These samples may then be sorted, median filtered, and averaged to give a lower-noise, more accurate result of the touch coordinates.
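A minimal sketch of the sort-filter-average step described above, assuming the processing module has collected several samples of one coordinate (the trimming strategy here is an illustrative choice, not the claimed filter):

```python
import statistics

def filtered_coordinate(samples):
    """Reduce noise in repeated coordinate samples of one touch input.

    Sort the samples and discard the extremes (a simple median-style
    trim that suppresses impulse noise), then average the remainder
    to suppress Gaussian noise.
    """
    if len(samples) < 3:
        return statistics.mean(samples)  # too few samples to trim
    trimmed = sorted(samples)[1:-1]      # drop min and max outliers
    return statistics.mean(trimmed)
```

With samples [10, 11, 12, 100], the impulse outlier 100 is discarded and the filtered coordinate is the mean of the central samples.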
In some embodiments, the processing module 154 can be a processor specifically configured for use with the touch screen subsystem 150, while user interface processor 160 may be configured to handle the general processing requirements of the user interface. The processing module 154 and the user interface processor 160 may be in communication with each other. In various embodiments, the processing described as being performed by user interface processor 160, processing module 154, and processor 104 may be performed by different processors or by a single processor.
As illustrated in
When a centroid of a touch input is determined to be outside or to the right of the bias region 605, there is complete sensor data for the touch input, as shown in
To correct the touch position centroid estimation when incomplete touch sensor information is available, the bias may be mitigated or removed in accordance with a bias model.
In some embodiments, as shown in
In block 805, a bias model is identified for a touch position on a touch panel. In some aspects, a bias model may be developed during research and development of a particular model of touch panel or determined as discussed in greater detail above. The bias model may be embedded within device 100 so that the model may be referenced during run-time. For example, software and/or firmware logic processing input from a touch panel may reference the model. In some embodiments, block 805 is not performed.
In block 810, touch input is received from a touch panel, such as touch panel 142. In some aspects, the touch input may include amplitude values received from a plurality of touch sensors, such as touch sensors 206. For example, amplitude values for touch sensors within a proximity of a touch spike, such as the maximum of the touch data 305 or data 505 may be received. In some aspects, at least a portion of the received touch input may correspond to input related to a finger or other object touching or coming within a proximity of a sensor 206 of a touch panel 205. The touch input may generate information from a plurality of touch sensors, with the information from each touch sensor including x and y coordinate values, and an amplitude value, as discussed above with respect to
In block 815, a position of a centroid corresponding to the touch input is determined. In some embodiments, the centroid may be determined via a weighted average of the input values received in block 810. For example, the x values for each of the plurality of touch sensor data points included in the touch input of block 810 may be weighted based on the data point's amplitude value. A weighted average of the x values may then be used to determine the centroid position. A similar calculation may be performed with respect to the y values of the touch sensor data points.
In block 820, a bias may be determined based on the position of the touch input and the bias model. In some aspects, a bias provided by the bias model may be based on the number of sensor data points included within a touch input. For example, some embodiments may include estimation of a finger (or other pointing object) size. In such embodiments, the size of the finger or pointing object may correspond to the number of touch sensor data points having an amplitude above a predetermined threshold when a touch event occurs.
Determination of the bias may be further based on the estimated pointing object size. For example, the size of the bias region 605 illustrated in
In block 825, the position of the centroid of the touch input is adjusted based on the bias and the estimated centroid position. For example, removal or mitigation of the bias may move the position of the centroid towards an edge of the touch panel 205 when the number of touch sensors included in the centroid calculation is less than the number of touch sensors for a touch input having complete sensor data.
The word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any embodiment described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments. Various aspects of the novel systems, apparatuses, and methods are described more fully hereinafter with reference to the accompanying drawings. This disclosure may, however, be embodied in many different forms and should not be construed as limited to any specific structure or function presented throughout this disclosure. Rather, these aspects are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. Based on the teachings herein one skilled in the art should appreciate that the scope of the disclosure is intended to cover any aspect of the novel systems, apparatuses, and methods disclosed herein, whether implemented independently of, or combined with, any other aspect of the invention. For example, an apparatus may be implemented or a method may be practiced using any number of the aspects set forth herein. In addition, the scope of the invention is intended to cover such an apparatus or method which is practiced using other structure, functionality, or structure and functionality in addition to or other than the various aspects of the invention set forth herein. It should be understood that any aspect disclosed herein may be embodied by one or more elements of a claim.
Although particular aspects are described herein, many variations and permutations of these aspects fall within the scope of the disclosure. Although some benefits and advantages of the preferred aspects are mentioned, the scope of the disclosure is not intended to be limited to particular benefits, uses, or objectives. Rather, aspects of the disclosure are intended to be broadly applicable to different wireless technologies, system configurations, networks, and transmission protocols, some of which are illustrated by way of example in the figures and in the following description of the preferred aspects. The detailed description and drawings are merely illustrative of the disclosure rather than limiting, the scope of the disclosure being defined by the appended claims and equivalents thereof.
It should be understood that any reference to an element herein using a designation such as “first,” “second,” and so forth does not generally limit the quantity or order of those elements. Rather, these designations may be used herein as a convenient method of distinguishing between two or more elements or instances of an element. Thus, a reference to first and second elements does not mean that only two elements may be employed there or that the first element must precede the second element in some manner. Also, unless stated otherwise a set of elements may include one or more elements.
A person having ordinary skill in the art would understand that information and signals may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
It is noted that examples may be described as a process, which is depicted as a flowchart, a flow diagram, a finite state diagram, a structure diagram, or a block diagram. Although a flowchart may describe operations as a sequential process, many of the operations can be performed in parallel or concurrently, and the process can be repeated. In addition, the order of operations may be re-arranged. A process may be deemed to be terminated when its operations are completed. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a software function, its termination may correspond to a return of the function to the calling function or the main function. A person having ordinary skill in the art would further appreciate that any of the various illustrative logical blocks, modules, processors, means, circuits, and algorithm steps described in connection with the aspects disclosed herein may be implemented as electronic hardware (e.g., a digital implementation, an analog implementation, or a combination of the two, which may be designed using source coding or some other technique), various forms of program or design code incorporating instructions (which may be referred to herein, for convenience, as “software” or a “software module”), or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
The various illustrative logical blocks, modules, and circuits described in connection with the aspects disclosed herein may be implemented within or performed by an integrated circuit (IC), an access terminal, or an access point. The IC may include a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, electrical components, optical components, mechanical components, or any combination thereof designed to perform the functions described herein, and may execute codes or instructions that reside within the IC, outside of the IC, or both. The logical blocks, modules, and circuits may include antennas and/or transceivers to communicate with various components within the network or within the device. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. The functionality of the modules may be implemented in some other manner as taught herein. The functionality described herein (e.g., with regard to one or more of the accompanying figures) may correspond in some aspects to similarly designated “means for” functionality in the appended claims.
If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. The steps of a method or algorithm disclosed herein may be implemented in a processor-executable software module which may reside on a computer-readable medium. Computer-readable media include both computer storage media and communication media, including any medium that can be enabled to transfer a computer program from one place to another. Storage media may be any available media that may be accessed by a computer. By way of example, and not limitation, such computer-readable media may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer. Also, any connection can be properly termed a computer-readable medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media. Additionally, the operations of a method or algorithm may reside as one or any combination or set of codes and instructions on a machine-readable medium and computer-readable medium, which may be incorporated into a computer program product.
It is understood that any specific order or hierarchy of steps in any disclosed process is an illustration of a sample approach. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged while remaining within the scope of the present disclosure. The accompanying method claims present elements of the various steps in a sample order, and are not meant to be limited to the specific order or hierarchy presented.
Various modifications to the implementations described in this disclosure may be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of this disclosure. Thus, the disclosure is not intended to be limited to the implementations shown herein, but is to be accorded the widest scope consistent with the claims, the principles and the novel features disclosed herein. The word “exemplary” is used exclusively herein to mean “serving as an example, instance, or illustration.” Any implementation described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other implementations.
Certain features that are described in this specification in the context of separate implementations also can be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation also can be implemented in multiple implementations separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products. Additionally, other implementations are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results.
This application claims the benefit of U.S. Provisional Patent Application No. 61/943,221, filed Feb. 21, 2014, titled “SYSTEMS AND METHODS FOR IMPROVED TOUCH SCREEN ACCURACY WITHIN PROXIMITY OF A SCREEN EDGE,” the disclosure of which is hereby incorporated herein by reference in its entirety and for all purposes.
Number | Date | Country
---|---|---
61943221 | Feb 2014 | US