This application relates generally to computing device user interfaces, and more particularly to user interfaces suitable for touchscreen-equipped mobile devices.
Mobile computing devices (e.g., cell phones, PDAs, laptops, gaming devices) increasingly rely on touchscreen user interfaces rather than traditional button-based user interfaces. Accordingly, many touchscreen-equipped mobile devices rely primarily on virtual keyboards for alphanumeric text entry. In addition to providing virtual buttons and virtual keyboards, touchscreen devices may provide for user input gestures. For example, in some mobile devices a user may drag a finger or stylus down the touchscreen to scroll a list.
The various embodiments provide a method of implementing a user interface function on a computing device equipped with a touchscreen display, which includes displaying a translucent gesture recognition overlay area on a portion of the touchscreen display, detecting a first touch event on the touchscreen within the gesture recognition overlay, determining an intended cursor movement within a displayed text string based on the detected first touch event, determining an alphanumeric character, punctuation mark, special character or string (e.g., “www” or “.com”) adjacent to a current position of a cursor within the displayed text string after the determined intended cursor movement, displaying on the touchscreen an enlarged portion of the alphanumeric or other character, and updating the cursor position in the displayed text string. In a further embodiment, displaying a translucent gesture recognition overlay area on a portion of the touchscreen display is accomplished in response to detecting a user input corresponding to activation of a gesture recognition overlay functionality. In a further embodiment, determining an intended cursor movement within a displayed text string based on the detected touch event includes determining from the first touch event whether a user has touched the gesture recognition overlay with a finger or stylus near an edge, and determining from motion of the first touch event whether the user has dragged the finger or stylus towards the opposite edge of the gesture recognition overlay along a centerline. In a further embodiment, displaying on the touchscreen an enlarged portion of the alphanumeric or other character includes displaying an enlarged portion of the alphanumeric character within the gesture recognition overlay, and periodically updating the display of the enlarged portion based on movement of the first touch event across the gesture recognition overlay. In another embodiment, the method further includes recognizing a character sketched in the gesture recognition overlay and adding the recognized character to the displayed text string at the updated cursor position. In another embodiment, the method further includes detecting a user input corresponding to activation of a text selection mode, and highlighting the alphanumeric or other character when the text selection mode is activated. In another embodiment, the method further includes detecting a user input corresponding to activation of a text selection mode, and highlighting a word in a direction of movement of the first touch event when the text selection mode is activated. In another embodiment, the method further includes determining a speed of movement of the first user input, wherein determining an intended cursor movement within a displayed text string based on the detected first touch event includes determining that a single character movement of the cursor is intended when the speed of movement of the first user input is slow, and determining that a multi-character movement of the cursor is intended when the speed of the movement of the first user input is fast. In another embodiment, the method further includes displaying a menu of functions including one or more menu icons within the gesture recognition overlay in response to detecting a user input corresponding to activation of a toolbar functionality, detecting a user touch on one or more menu icons, and implementing a function corresponding to the touched one or more menu icons.
In further embodiments, a computing device includes a touchscreen display coupled to a processor which is configured with processor-executable instructions to perform operations including displaying a translucent gesture recognition overlay area on a portion of the touchscreen display, detecting a first touch event on the touchscreen within the gesture recognition overlay, determining an intended cursor movement within a displayed text string based on the detected first touch event, determining an alphanumeric or other character adjacent to a current position of a cursor within the displayed text string after the determined intended cursor movement, displaying on the touchscreen an enlarged portion of the alphanumeric or other character, and updating the cursor position in the displayed text string. In a further embodiment, the processor may be configured such that displaying a translucent gesture recognition overlay area on a portion of the touchscreen display is accomplished in response to detecting a user input corresponding to activation of a gesture recognition overlay functionality. In a further embodiment, the processor may be configured such that determining an intended cursor movement within a displayed text string based on the detected touch event includes determining from the first touch event whether a user has touched the gesture recognition overlay with a finger or stylus near an edge, and determining from motion of the first touch event whether the user has dragged the finger or stylus towards the opposite edge of the gesture recognition overlay along a centerline. In a further embodiment, the processor may be configured such that displaying, on the touchscreen, an enlarged portion of the alphanumeric or other character includes displaying an enlarged portion of the alphanumeric or other character within the gesture recognition overlay, and periodically updating the display of the enlarged portion based on movement of the first touch event across the gesture recognition overlay. In another embodiment, the processor may be configured to perform operations further including recognizing a character sketched in the gesture recognition overlay, and adding the recognized character to the displayed text string at the updated cursor position. In another embodiment, the processor may be configured to perform operations further including detecting a user input corresponding to activation of a text selection mode, and highlighting the alphanumeric or other character when the text selection mode is activated. In another embodiment, the processor may be configured to perform operations further including detecting a user input corresponding to activation of a text selection mode, and highlighting a word in a direction of movement of the first touch event when the text selection mode is activated. In another embodiment, the processor may be configured to perform operations further including determining a speed of movement of the first user input, wherein determining an intended cursor movement within a displayed text string based on the detected first touch event includes determining that a single character movement of the cursor is intended when the speed of movement of the first user input is slow, and determining that a multi-character movement of the cursor is intended when the speed of the movement of the first user input is fast.
In another embodiment, the processor may be configured to perform operations further including displaying a menu of functions including one or more menu icons within the gesture recognition overlay in response to detecting a user input corresponding to activation of a toolbar functionality, detecting a user touch on one or more menu icons, and implementing a function corresponding to the touched one or more menu icons.
In further embodiments, a computing device includes means for displaying a translucent gesture recognition overlay area on a portion of a touchscreen display, means for detecting a first touch event on the touchscreen within the gesture recognition overlay, means for determining an intended cursor movement within a displayed text string based on the detected first touch event, means for determining an alphanumeric or other character adjacent to a current position of a cursor within the displayed text string after the determined intended cursor movement, means for displaying, on the touchscreen, an enlarged portion of the alphanumeric or other character, and means for updating the cursor position in the displayed text string. In a further embodiment, the means for displaying a translucent gesture recognition overlay area on a portion of the touchscreen display includes means for displaying a translucent gesture recognition overlay area on a portion of the touchscreen display in response to detecting a user input corresponding to activation of a gesture recognition overlay functionality. In a further embodiment, the means for determining an intended cursor movement within a displayed text string based on the detected touch event includes means for determining from the first touch event whether a user has touched the gesture recognition overlay with a finger or stylus near an edge, and means for determining from motion of the first touch event whether the user has dragged the finger or stylus towards the opposite edge of the gesture recognition overlay along a centerline. In a further embodiment, the means for displaying, on the touchscreen, an enlarged portion of the alphanumeric or other character includes means for displaying an enlarged portion of the alphanumeric or other character within the gesture recognition overlay, and means for periodically updating the display of the enlarged portion based on movement of the first touch event across the gesture recognition overlay. In another embodiment, the computing device further includes means for recognizing a character sketched in the gesture recognition overlay, and means for adding the recognized character to the displayed text string at the updated cursor position. In another embodiment, the computing device further includes means for detecting a user input corresponding to activation of a text selection mode, and means for highlighting the alphanumeric or other character when the text selection mode is activated. In another embodiment, the computing device further includes means for detecting a user input corresponding to activation of a text selection mode, and means for highlighting a word in a direction of movement of the first touch event when the text selection mode is activated. In another embodiment, the computing device further includes means for determining a speed of movement of the first user input, wherein the means for determining an intended cursor movement within a displayed text string based on the detected first touch event includes means for determining that a single character movement of the cursor is intended when the speed of movement of the first user input is slow, and means for determining that a multi-character movement of the cursor is intended when the speed of the movement of the first user input is fast.
In another embodiment, the computing device further includes means for displaying a menu of functions that includes one or more menu icons within the gesture recognition overlay in response to detecting a user input corresponding to activation of a toolbar functionality, means for detecting a user touch on one or more menu icons, and means for implementing a function corresponding to the touched one or more menu icons.
In further embodiments, a non-transitory computer-readable storage medium has stored thereon processor-executable instructions that are configured to cause a processor of a computing device equipped with a touchscreen display to perform operations including displaying a translucent gesture recognition overlay area on a portion of the touchscreen display, detecting a first touch event on the touchscreen within the gesture recognition overlay, determining an intended cursor movement within a displayed text string based on the detected first touch event, determining an alphanumeric or other character adjacent to a current position of a cursor within the displayed text string after the determined intended cursor movement, displaying on the touchscreen an enlarged portion of the alphanumeric or other character, and updating the cursor position in the displayed text string. In a further embodiment, the stored processor-executable instructions are configured such that displaying a translucent gesture recognition overlay area on a portion of the touchscreen display is accomplished in response to detecting a user input corresponding to activation of a gesture recognition overlay functionality. In a further embodiment, the stored processor-executable instructions are configured such that determining an intended cursor movement within a displayed text string based on the detected touch event includes determining from the first touch event whether a user has touched the gesture recognition overlay with a finger or stylus near an edge, and determining from motion of the first touch event whether the user has dragged the finger or stylus towards the opposite edge of the gesture recognition overlay along a centerline. In a further embodiment, the stored processor-executable instructions are configured such that displaying, on the touchscreen, an enlarged portion of the alphanumeric or other character includes displaying an enlarged portion of the alphanumeric or other character within the gesture recognition overlay, and periodically updating the display of the enlarged portion based on movement of the first touch event across the gesture recognition overlay. In another embodiment, the stored processor-executable instructions are configured to cause a computing device processor to perform operations further including recognizing a character sketched in the gesture recognition overlay, and adding the recognized character to the displayed text string at the updated cursor position. In another embodiment, the stored processor-executable instructions are configured to cause a computing device processor to perform operations further including detecting a user input corresponding to activation of a text selection mode, and highlighting the alphanumeric or other character when the text selection mode is activated. In another embodiment, the stored processor-executable instructions are configured to cause a computing device processor to perform operations further including detecting a user input corresponding to activation of a text selection mode, and highlighting a word in a direction of movement of the first touch event when the text selection mode is activated. 
In another embodiment, the stored processor-executable instructions are configured to cause a computing device processor to perform operations further including determining a speed of movement of the first user input, wherein determining an intended cursor movement within a displayed text string based on the detected first touch event includes determining that a single character movement of the cursor is intended when the speed of movement of the first user input is slow, and determining that a multi-character movement of the cursor is intended when the speed of the movement of the first user input is fast. In another embodiment, the stored processor-executable instructions are configured to cause a computing device processor to perform operations further including displaying a menu of functions including one or more menu icons within the gesture recognition overlay in response to detecting a user input corresponding to activation of a toolbar functionality, detecting a user touch on one or more menu icons, and implementing a function corresponding to the touched one or more menu icons.
The accompanying drawings, which are incorporated herein and constitute part of this specification, illustrate exemplary aspects of the invention. Together with the general description given above and the detailed description given below, the drawings serve to explain features of the invention.
The various aspects will be described in detail with reference to the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. References made to particular examples and implementations are for illustrative purposes and are not intended to limit the scope of the invention or the claims.
The word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any implementation described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other implementations.
The terms “computing device” and “mobile device” are used interchangeably herein to refer to any one or all of cellular telephones, personal digital assistants (PDAs), palm-top computers, wireless electronic mail receivers (e.g., the Blackberry® and Treo® devices), multimedia Internet enabled cellular telephones (e.g., the Blackberry Storm®), Global Positioning System (GPS) receivers, wireless gaming controllers, personal computers, and similar personal electronic devices which include a touchscreen user interface/display. While the various embodiments are particularly useful in mobile devices, such as cellular telephones, which have small displays, the embodiments may also be useful in any computing device that employs a touchscreen display or a touch surface user interface. Therefore, references to “mobile device” in the following embodiment descriptions are for illustration purposes only, and are not intended to exclude other forms of computing devices that feature a touchscreen display or to limit the scope of the claims.
The various embodiments provide a gesture recognition overlay for a mobile device that enables useful user interface functionality. A mobile device configured with such an overlay functionality may provide users with the ability to enter alphanumeric text and edit text by performing simple gestures on a gesture recognition overlay within the touchscreen display. By providing a larger area for accepting user input touch gestures as well as presenting menus, the various embodiments facilitate text entry and editing operations on the relatively small area of most mobile device touchscreen displays. While the various embodiments are illustrated in use with a text entry application, the embodiments are not limited to text applications or the manipulation of text fields, and may be implemented with a wide range of applications including multimedia editing and display, games, communication, etc.
The gesture recognition overlay 30 may be configured as an image presented on the touchscreen display that has functionality related to the zones of the display that are touched and the directions in which a touch is dragged. In some embodiments the gesture recognition overlay 30 may be presented as a semi-transparent screen or translucent overlay through a technique known as “alpha blending” so that underlying parts of the displayed image can be viewed. The embodiments include configuring the mobile device to recognize certain touch events within the gesture recognition overlay 30 as corresponding to defined user interface commands. Such recognized commands may be, for example, the entry of letters, numbers, punctuation marks, special characters (e.g., mood-icons) and common strings (e.g., “www.” and “.com”) that are traced within the area of the gesture recognition overlay 30. For ease of reference, letters, numbers, punctuation marks, special characters (e.g., mood-icons) and common strings (e.g., “www.” and “.com”) are referred to herein generally as characters. The recognized commands may also include actions associated with text entry, such as the backspace action, cursor movements, text highlighting, and text moving. The overlay may be configured with defined edges represented by a border of contrast-colored pixels, or simply by the boundary between the overlay 30 and the underlying application. The overlay 30 may include sub-portions defined by contrast-colored pixels along two centerlines (i.e., the horizontal and the vertical). The overlay 30 may further include contrast-colored pixels in a circular pattern around the center of the overlay, to define a center region. The overlay 30 may further be configured to display magnified portions of a display or text that is being addressed by user interactions on the overlay.
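By way of illustration only, the per-pixel alpha blending used to render such a translucent overlay might resemble the following sketch; the function name, the RGB tuple representation, and the alpha value of 0.4 are hypothetical choices rather than part of any described embodiment:

```python
def alpha_blend(overlay_rgb, background_rgb, alpha=0.4):
    """Blend one overlay pixel with the underlying application pixel.

    An alpha near 1.0 would make the overlay opaque; values around
    0.3-0.5 keep the underlying display visible through the overlay.
    """
    return tuple(
        int(alpha * o + (1.0 - alpha) * b)
        for o, b in zip(overlay_rgb, background_rgb)
    )

# A mid-gray overlay pixel blended over a white background pixel
print(alpha_blend((128, 128, 128), (255, 255, 255)))  # (204, 204, 204)
```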
In a simple example of interaction with a gesture recognition overlay 30, if the user makes a circular motion inside the overlay 30, the mobile device may recognize this gesture as an entry of the letter “O” as if typed on a key of a virtual keyboard. Similarly, a vertical swipe through the overlay 30 may be recognized as entry of a “1” or an “i”. The larger size and fixed boundaries of the gesture recognition overlay 30 may facilitate the entry and recognition of letters and numbers from lines and curves traced within the overlay area by the tip of a finger or stylus.
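A minimal sketch of how a recognizer might distinguish these two example gestures follows; the thresholds and the two-gesture vocabulary are purely illustrative, and a practical recognizer would apply far more robust pattern matching:

```python
import math

def classify_stroke(points):
    """Classify a stroke as 'O', 'i', or unrecognized (None).

    points: list of (x, y) touch samples in overlay coordinates.
    A closed, roughly round stroke is treated as the letter 'O';
    a tall, narrow stroke is treated as a vertical swipe ('i').
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    width = max(xs) - min(xs)
    height = max(ys) - min(ys)
    closure = math.hypot(points[-1][0] - points[0][0],
                         points[-1][1] - points[0][1])
    size = max(width, height, 1)

    if closure < 0.3 * size and 0.6 < width / max(height, 1) < 1.6:
        return "O"   # stroke ends near where it began and is roughly round
    if height > 3 * max(width, 1):
        return "i"   # predominantly vertical swipe
    return None
```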
In addition to providing gesture recognition for alphanumeric text entry, gesture recognition overlay 30 may provide for the recognition of action gestures, such as a gesture to backspace or move the cursor. Examples of such embodiments in operation are illustrated in
In some embodiments, the touchscreen overlay may assist the user in moving a cursor within a text field by providing a magnification of the text field within the overlay. Two examples of such cursor movement functionality according to an embodiment are shown in
An example method 100 that may be implemented in a processor of a mobile device for causing a cursor movement in response to a touch gesture within a gesture recognition overlay is illustrated in
At block 105, the processor may evaluate the received touch event to determine whether the user is entering a character or a cursor command. Entry of a character may be recognized based upon the starting point, direction, and curvilinear nature of the touch gesture. The gesture recognition overlay functionality may be configured with processor-executable instructions to recognize only certain touch events as being associated with cursor movements. In an embodiment, cursor movement commands are limited to beginning at a side of the overlay roughly adjacent to either the horizontal or vertical centerline. Thus, if a touch gesture begins at a location away from a side or top edge of the overlay adjacent to a horizontal or vertical centerline, the gesture recognition overlay functionality may be configured to recognize such a touch event as being associated with a character entry or some other functionality. Accordingly, character input patterns may be limited or defined so that such inputs do not begin near a centerline and/or along an edge of the overlay. If the processor executing the gesture recognition overlay functionality determines that the touch event is a character entry, the character recognized based on the pattern drawn on the overlay may be added to the text string at the current location of the cursor.
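By way of illustration, the test for whether a touch-down point could begin a cursor movement command might resemble the following sketch, in which the edge margin and centerline band fractions are hypothetical thresholds:

```python
def may_begin_cursor_command(x, y, width, height, margin=0.15, band=0.2):
    """Return True if a touch-down could start a cursor movement.

    Per the description above, such gestures are assumed to begin near
    an edge of the overlay, roughly on the horizontal or vertical
    centerline; anything else is treated as a possible character entry.
    """
    near_side = x < margin * width or x > (1 - margin) * width
    near_top_or_bottom = y < margin * height or y > (1 - margin) * height
    on_horizontal_centerline = abs(y - height / 2) < band * height
    on_vertical_centerline = abs(x - width / 2) < band * width
    return ((near_side and on_horizontal_centerline) or
            (near_top_or_bottom and on_vertical_centerline))
```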
If the processor executing the gesture recognition overlay functionality determines that the touch event is not associated with a cursor command, the functionality may process the locations of the touch event against character recognition patterns in order to recognize a character being input by the user. Methods of recognizing patterns traced on a user input device are well known and may be implemented with the gesture recognition overlay serving as the input area for tracing such patterns. In this regard, the gesture recognition overlay functionality enables entry of such character tracings without confusing the user interface since touch events occurring within the overlay are interpreted by the processor as being user inputs to the functionality. Thus, even though user input icons and images below the overlay may be viewable through its translucent display, touch events within the overlay will not be interpreted as activation of any of the underlying user input icons.
At block 106, the mobile device processor may determine the direction of the intended cursor movement based on movements of the touch coordinates received from the operating system. In some embodiments, if the user moves his or her finger or stylus left to right horizontally, this indicates an intention to move the cursor to the left. Alternatively, in some embodiments a left to right horizontal movement may be used to move the cursor to the right. In a further embodiment, the movement direction may be user-configurable. Similarly, vertical finger or stylus movement may be interpreted as a request to move the cursor up or down. At block 108, the processor may determine the cursor position after a one-character movement. For example, if the cursor is between character 37 and character 38 in a string buffer and the processor has determined that the user wishes to move the cursor to the right, then the position of the cursor after the movement may be between characters 38 and 39. Alternatively, if the user wishes to move the cursor upwards, the final cursor position may be between characters 26 and 27.
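A minimal sketch of this cursor arithmetic follows. Modeling the cursor as an index between characters in a flat string buffer, and a vertical move as a fixed jump of line_length characters, are illustrative simplifications (real text has variable line lengths); line_length=11 merely reproduces the 37/38 to 26/27 example above:

```python
def move_cursor(cursor, direction, line_length=11, text_length=1000):
    """Return the new cursor index after a one-step movement.

    cursor counts the characters to its left, so a cursor between
    characters 37 and 38 has index 37. The result is clamped to the
    bounds of the string buffer.
    """
    step = {"right": 1, "left": -1,
            "down": line_length, "up": -line_length}[direction]
    return max(0, min(text_length, cursor + step))

print(move_cursor(37, "right"))  # 38 -> between characters 38 and 39
print(move_cursor(37, "up"))     # 26 -> between characters 26 and 27
```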
At block 112, the processor may determine the alphanumeric, punctuation mark or other character adjacent to the cursor position after the cursor movement by accessing the string buffer. In some embodiments, the magnification aspect may simulate the viewpoint of the cursor by displaying the character over which the cursor is moving, as is illustrated in
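For instance, looking up the character the cursor is moving over might be as simple as the following sketch, which models the string buffer as a Python string and the cursor as an index between characters (names illustrative):

```python
def char_at_cursor(buffer, cursor, direction):
    """Return the character the cursor is moving over, or '' at a boundary."""
    if direction in ("right", "down"):
        return buffer[cursor] if cursor < len(buffer) else ""
    return buffer[cursor - 1] if cursor > 0 else ""

print(char_at_cursor("hello world", 5, "right"))  # ' ' (the space after 'hello')
```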
As described above with reference to
In some embodiments, the gesture recognition overlay functionality may enable jumping the cursor over several characters or spaces, similar to the jump scrolling or flick scrolling feature common on touchscreen user interfaces. In such embodiments, the processor may be configured to recognize a “flick” gesture, such as by detecting that the finger or stylus no longer touches the touchscreen and detecting that the finger or stylus was moving at the moment that the finger or stylus stopped touching the touchscreen. If the processor determines that the nature of the touch event movement indicates a flick of the finger or stylus tip from the edge of the overlay towards the center (i.e., determination block 118=“Flick”), the processor may determine the position of the cursor after the jump (i.e., the jump point) at block 130. The amount of text jumped in response to such a flick gesture may be determined based on the measured tangential velocity of the movement of the finger or stylus tip across the overlay at the moment that the finger or stylus stopped touching the touchscreen. Alternatively, the amount of text jumped in response to a flick gesture may be fixed at a certain number of pixels, millimeters, characters, spaces or lines. In another embodiment, a horizontal flick gesture (i.e., a rapid movement more or less parallel to the horizontal centerline of the overlay) may be construed as a command to move the cursor to the beginning or end of the line. Similarly, a vertical flick gesture (i.e., a rapid movement more or less parallel to the vertical centerline of the overlay) may be construed as a command to perform a page up or page down movement through a text document. Once the jump point is determined at block 130, the processor may display a scrolling magnification of the jumped text starting with the character already displayed in the overlay at block 134 so that the overlay shows a continuous stream and movement of magnified characters moving across the overlay during the flick movement. At the jump end point the processor may clear the overlay magnification and return the overlay to the base state in block 138. Alternatively, the mobile device may clear the overlay magnification and instead magnify characters adjacent to the cursor at the jump end point. Once the cursor movement has been completed, the gesture recognition overlay functionality may return to block 104 to receive the next touch event from the operating system.
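A sketch of such flick detection, assuming touch samples timestamped by the operating system, might look like the following; the speed threshold and the linear speed-to-characters scaling are purely illustrative:

```python
import math

def detect_flick(samples, min_speed=800.0):
    """Decide whether a touch sequence ended in a flick.

    samples: list of (t, x, y) tuples with t in seconds and x, y in
    pixels; the last sample is the lift-off point. A flick is assumed
    when the tip was still moving faster than min_speed at lift-off.
    """
    (t0, x0, y0), (t1, x1, y1) = samples[-2], samples[-1]
    dt = max(t1 - t0, 1e-6)
    speed = math.hypot(x1 - x0, y1 - y0) / dt
    return speed >= min_speed, speed

def jump_length(speed, chars_per_px_per_s=0.02):
    """Map lift-off speed to a character jump distance (illustrative scaling)."""
    return int(speed * chars_per_px_per_s)
```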
In some embodiments, the gesture recognition overlay functionality may provide an additional or alternative operating mode in which the overlay becomes or displays a toolbar with a plurality of virtual buttons (i.e., user input icons displayed within the overlay). An example of such an embodiment is illustrated in
The toolbar-mode overlay 30 may also provide a select button 32a function to enter a text selection mode, in which the overlay may be used to select portions of the text string 35. Examples of text selection functionality that may be included in such a text selection mode are illustrated in
An exemplary method 200 for providing text selection functionality in an overlay is illustrated in
At block 204, the processor may receive a touch event indicating that the user has dragged a finger or stylus along an overlay center line from one edge of the overlay to the other edge (i.e., a movement gesture). At block 206, the processor may determine the intended direction of the movement based upon the coordinates of the touch event received in block 204. In an embodiment, a right to left touch movement gesture may be correlated to a command causing the variable highlight buffer boundary to move to the left. Alternatively, a right to left movement may be correlated to a command causing the variable highlight buffer boundary to move to the right (i.e., inverted control). Both or either of the horizontal and vertical movement gestures may be inverted, but neither need be.
At determination block 212, the processor may determine whether the indicated movement of the variable boundary will grow or shrink the highlighted text stored in the highlight buffer. If the processor determines that the touch event movement corresponds to a command to grow the highlighted text by adding text to the highlight buffer (i.e., determination block 212=“Grow”), the processor may determine whether the cursor is visible in the display at determination block 220. In one embodiment, the cursor will be visible when the highlight buffer is empty. In another embodiment, no cursor may be displayed at any time; instead the user will see a highlight. The presence or absence of a cursor versus a highlight may make it obvious to the user whether the user interface is in a selection mode or in a cursor (regular) mode. Thus, in this embodiment a cursor may never appear during the selection mode, nor will a highlight ever disappear. The smallest unit that a highlight will cover is a “word,” which can be just a single character like “a” or “I”. If the cursor is visible (i.e., determination block 220=“Yes”), the processor may set the cursor to not be visible in block 224. The processor may access the next word (or character in an embodiment where the boundary moves one character at a time) and add it to the highlight buffer at block 226. If the processor determines that movement of the variable boundary in response to the moving touch gesture will shrink the highlighted text string stored in the highlight buffer (i.e., determination block 212=“Shrink”), the processor may access the previously added word (i.e., the word adjacent to the variable boundary), and remove it from the highlight buffer at block 234. Removing the selected word from the highlight buffer will cause the display of the word to shift to normal (i.e., non-highlighted) mode. Alternatively, the user may command an exit from the highlight mode by performing a predefined user touch gesture, such as tapping on the center circle of the overlay to display a toolbar and then selecting another function (e.g., unselect, cut, copy, paste, etc.), or double tapping on the highlight. At determination block 238, the processor may determine whether the highlight buffer is empty. If the highlight buffer is empty (i.e., determination block 238=“Yes”), the processor may set the cursor back to visible at block 242.
After adding or deleting a word from the highlight buffer, the processor may determine whether the text selection mode is ended at determination block 228. This determination may be based upon the defined actions that terminate the text selection mode, such as a period of time of inactivity, tapping on a menu icon, activating another functionality by tapping on the center circle and selecting another function, double tapping on the highlight, etc. For example, the text selection mode may terminate when the user taps the center of the overlay to bring up the overlay toolbar. As another example, the text selection mode may terminate when the user performs a “cut” command (e.g., performing a gesture which causes the processor to move the content of the highlight buffer to the clipboard buffer). If the processor determines that the text selection gesture is complete (i.e., determination block 228=“Yes”), the processor may return the overlay to the base state at block 232. If the processor determines that the text selection gesture is not complete (i.e., determination block 228=“No”), the mobile device may wait for another touch movement gesture input from the operating system in block 204.
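The word-granular grow and shrink behavior of the highlight buffer described above might be modeled as in the following sketch, which for simplicity grows the selection in a single direction from the variable boundary; all names are hypothetical:

```python
class WordHighlight:
    """Word-granular highlight buffer with a fixed and a variable boundary."""

    def __init__(self, words, boundary):
        self.words = words                 # text pre-split into words
        self.start = self.end = boundary   # selection starts empty
        self.cursor_visible = True

    def grow(self):
        """Add the next word to the highlight buffer (block 226)."""
        if self.end < len(self.words):
            self.cursor_visible = False    # hide the cursor while text is selected
            self.end += 1

    def shrink(self):
        """Remove the most recently added word (block 234)."""
        if self.end > self.start:
            self.end -= 1
        if self.end == self.start:
            self.cursor_visible = True     # empty buffer: cursor visible again

    @property
    def highlighted(self):
        return self.words[self.start:self.end]

sel = WordHighlight("the quick brown fox".split(), boundary=1)
sel.grow(); sel.grow()
print(sel.highlighted, sel.cursor_visible)  # ['quick', 'brown'] False
```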
An example method 300 for implementing a toolbar within the gesture recognition overlay is illustrated in
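By way of illustration, the hit-testing such a toolbar mode might perform when the user touches a menu icon could resemble the following sketch; the icon names, geometry, and handler mapping are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class MenuIcon:
    """A virtual button displayed within the overlay."""
    name: str
    x: int
    y: int
    w: int
    h: int

    def contains(self, tx, ty):
        return self.x <= tx < self.x + self.w and self.y <= ty < self.y + self.h

def dispatch_toolbar_touch(icons, tx, ty, handlers):
    """Invoke the handler of the touched icon, if any, and return its name."""
    for icon in icons:
        if icon.contains(tx, ty):
            handlers[icon.name]()
            return icon.name
    return None

icons = [MenuIcon("select", 10, 10, 48, 48), MenuIcon("cut", 70, 10, 48, 48)]
dispatch_toolbar_touch(icons, 20, 20,
                       {"select": lambda: print("enter text selection mode"),
                        "cut": lambda: print("move highlight buffer to clipboard")})
```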
The foregoing embodiment descriptions illustrate some of the user input functionality that may be implemented in a gesture recognition overlay. Other functionality may be implemented in a similar manner. For example, different functionality may be associated with diagonal motions through the overlay, motions limited to one of the four quadrants of the overlay, circular motions, and application dependent functionality. Such additional or alternative functionality may be implemented using methods and systems similar to those described herein.
Typical mobile devices 400 suitable for use with the various embodiments will have in common the components illustrated in
The processor 401 may be any programmable microprocessor, microcomputer or multiple processor chip or chips that can be configured by software instructions (applications) to perform a variety of functions, including the functions of the various embodiments described herein. In some mobile devices, multiple processors 401 may be provided, such as one processor dedicated to wireless communication functions and one processor dedicated to running other applications. Typically, software applications may be stored in the internal memory 402 before they are accessed and loaded into the processor 401. In some mobile devices, the processor 401 may include internal memory sufficient to store the application software instructions. In some mobile devices, the secure memory may be in a separate memory chip coupled to the processor 401. In many mobile devices 400 the internal memory 402 may be a volatile or nonvolatile memory, such as flash memory, or a mixture of both. For the purposes of this description, a general reference to memory refers to all memory accessible by the processor 401, including internal memory 402, removable memory plugged into the mobile device, and memory within the processor 401 itself.
The foregoing method descriptions and the process flow diagrams are provided merely as illustrative examples and are not intended to require or imply that the steps of the various embodiments must be performed in the order presented. As will be appreciated by one of skill in the art, the steps in the foregoing embodiments may be performed in any order. Words such as “thereafter,” “then,” “next,” etc. are not intended to limit the order of the steps; these words are simply used to guide the reader through the description of the methods. Further, any reference to claim elements in the singular, for example, using the articles “a,” “an” or “the” is not to be construed as limiting the element to the singular.
The various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The hardware used to implement the various illustrative logics, logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but, in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Alternatively, some steps or methods may be performed by circuitry that is specific to a given function.
In one or more exemplary embodiments, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. The steps of a method or algorithm disclosed herein may be embodied in a processor-executable software module, which may reside on a tangible non-transitory computer-readable medium. Tangible non-transitory computer-readable media may be any available non-transitory media that may be accessed by a computer. By way of example, and not limitation, such non-transitory computer-readable media may comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to carry or store desired program code in the form of instructions or data structures and that may be accessed by a computer. Disk and disc, as used herein, includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of non-transitory computer-readable media. Additionally, the operations of a method or algorithm may reside as one or any combination or set of codes and/or instructions on a non-transitory machine readable medium and/or non-transitory computer-readable medium, which may be incorporated into a computer program product.
The preceding description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the following claims and the principles and novel features disclosed herein.