The present disclosure relates generally to graphical user interfaces and, more particularly, to an interface tool that may be utilized to navigate through content displayed on electronic devices.
This section is intended to introduce the reader to various aspects of art that may be related to various aspects of the present disclosure, which are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.
Applications executed on electronic devices may include applications that enable users to input and edit text. Examples of such applications include word processing applications, presentation applications, spreadsheet applications, and note-taking applications. In some cases, a display of an electronic device may not display an entire document within an application. In other words, only a portion of a document within an application may be displayed. For example, in some cases, an electronic device may have a relatively small display and/or the document may be relatively large (e.g., wide, long, or both). Navigating through the document may prove challenging in such cases or with certain types of electronic devices.
A summary of certain embodiments disclosed herein is set forth below. It should be understood that these aspects are presented merely to provide the reader with a brief summary of these certain embodiments and that these aspects are not intended to limit the scope of this disclosure. Indeed, this disclosure may encompass a variety of aspects that may not be set forth below.
The present disclosure relates to a virtual joystick tool that may be utilized to navigate through a productivity document (e.g., a text document, spreadsheet document, presentation document, etc.). More specifically, the joystick tool may be presented on a display of an electronic device with the productivity document, and a user of the electronic device may interact with the joystick tool to navigate within the productivity document. For instance, the joystick tool may include a joystick that the user may interact with to cause a viewable portion of the productivity document that is displayed to transition to another viewable portion of the productivity document. Additionally, a user may select within a bounding area associated with the joystick tool to cause the viewable portion of the productivity document to jump to another viewable portion of the productivity document. Accordingly, a user may be able to navigate through the productivity document in a convenient and intuitive manner, especially on electronic devices with relatively small displays and/or electronic devices that do not utilize input structures typically associated with computers, such as keyboards, mice, and/or trackpads.
Various refinements of the features noted above may be made in relation to various aspects of the present disclosure. Further features may also be incorporated in these various aspects as well. These refinements and additional features may exist individually or in any combination. For instance, various features discussed below in relation to one or more of the illustrated embodiments may be incorporated into any of the above-described aspects of the present disclosure alone or in any combination. The brief summary presented above is intended only to familiarize the reader with certain aspects and contexts of embodiments of the present disclosure without limitation to the claimed subject matter.
Various aspects of this disclosure may be better understood upon reading the following detailed description and upon reference to the drawings in which:
One or more specific embodiments will be described below. In an effort to provide a concise description of these embodiments, not all features of an actual implementation are described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
When introducing elements of various embodiments of the present disclosure, the articles “a,” “an,” and “the” are intended to mean that there are one or more of the elements. The terms “including” and “having” are intended to be inclusive and mean that there may be additional elements other than the listed elements. Additionally, it should be understood that references to “some embodiments,” “embodiments,” “one embodiment,” or “an embodiment” of the present disclosure are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Furthermore, the phrase A “based on” B is intended to mean that A is at least partially based on B. Moreover, the term “or” is intended to be inclusive (e.g., logical OR) and not exclusive (e.g., logical XOR). In other words, the phrase A “or” B is intended to mean A, B, or both A and B.
The present disclosure relates to a joystick tool that may be utilized to navigate through a productivity document (e.g., a text document, spreadsheet document, presentation document, etc.). More specifically, the joystick tool may be presented on a display of an electronic device with the productivity document, and a user of the electronic device may interact with the joystick tool to navigate within the productivity document. For instance, the joystick tool may include a joystick that the user may interact with to cause a viewable portion of the productivity document that is displayed to transition to another viewable portion of the productivity document. Additionally, a user may select within a bounding area associated with the joystick tool to cause the viewable portion of the productivity document to jump to another viewable portion of the productivity document. Accordingly, a user may be able to navigate through the productivity document in a convenient and intuitive manner, especially on electronic devices with relatively small displays and/or electronic devices that do not utilize input structures typically associated with computers, such as keyboards, mice, and/or trackpads.
With this in mind, a block diagram of an electronic device 10 is shown in
The electronic device 10 shown in
The processor core complex 12 may carry out a variety of operations of the electronic device 10. The processor core complex 12 may include any suitable data processing circuitry to perform these operations, such as one or more microprocessors, one or more application-specific integrated circuits (ASICs), or one or more programmable logic devices (PLDs). In some cases, the processor core complex 12 may execute programs or instructions (e.g., an operating system or application program) stored on a suitable article of manufacture, such as the local memory 14 and/or the main memory storage device 16. For example, the processor core complex 12 may carry out instructions stored in the local memory 14 and/or the main memory storage device 16 to change a viewable portion of a document within an application based on user input. In addition to instructions for the processor core complex 12, the local memory 14 and/or the main memory storage device 16 may also store data to be processed by the processor core complex 12. By way of example, the local memory 14 may include random access memory (RAM) and the main memory storage device 16 may include read only memory (ROM), rewritable non-volatile memory such as flash memory, hard drives, optical discs, or the like.
The electronic display 18 may display image frames, such as a graphical user interface (GUI) for an operating system or an application program interface, still images, or video content. The processor core complex 12 may supply at least some of the image frames. For example, the processor core complex 12 may supply image frames that display an application and the joystick tool of this disclosure. The electronic display 18 may be a self-emissive display, such as an organic light-emitting diode (OLED) display, a micro-LED display, a micro-OLED type display, or a liquid crystal display (LCD) illuminated by a backlight. In some embodiments, the electronic display 18 may include a touch screen, which may allow users to interact with a user interface of the electronic device 10.
The input structures 22 of the electronic device 10 may enable a user to interact with the electronic device 10 (e.g., pressing a button to increase or decrease a volume level). The I/O interface 24 may enable the electronic device 10 to interface with various other electronic devices, as may the network interface 26. The network interface 26 may include, for example, interfaces for a personal area network (PAN), such as a Bluetooth network, for a local area network (LAN) or wireless local area network (WLAN), such as an 802.11x Wi-Fi network, and/or for a wide area network (WAN), such as a cellular network. The network interface 26 may also include interfaces for, for example, broadband fixed wireless access networks (WiMAX), mobile broadband wireless networks (mobile WiMAX), asynchronous digital subscriber lines (e.g., ADSL, VDSL), digital video broadcasting-terrestrial (DVB-T) and its extension DVB Handheld (DVB-H), ultra wideband (UWB), alternating current (AC) power lines, and so forth. The power source 28 may include any suitable source of power, such as a rechargeable lithium polymer (Li-poly) battery and/or an alternating current (AC) power converter.
In certain embodiments, the electronic device 10 may take the form of a computer, a portable electronic device, a wearable electronic device, or other type of electronic device. Such computers may include computers that are generally portable (such as laptop, notebook, and tablet computers) as well as computers that are generally used in one place (such as desktop computers, workstations and/or servers). In certain embodiments, the electronic device 10 in the form of a computer may be a model of a MacBook®, MacBook® Pro, MacBook Air®, iMac®, Mac® mini, or Mac Pro® available from Apple Inc. By way of example, the electronic device 10, taking the form of a notebook computer 10A, is illustrated in
User input structures 22, in combination with the electronic display 18, may allow a user to control the handheld device 10B. For example, the input structures 22 may activate or deactivate the handheld device 10B, navigate a user interface to a home screen or a user-configurable application program screen, and/or activate a voice-recognition feature of the handheld device 10B. Other input structures 22 may provide volume control or may toggle between vibrate and ring modes. The input structures 22 may also include a microphone that may obtain a user's voice for various voice-related features and a speaker that may enable audio playback and/or certain phone capabilities. The input structures 22 may also include a headphone input that may provide a connection to external speakers and/or headphones.
Turning to
Similarly,
As noted above, the present disclosure relates to a joystick tool that may be used to navigate within an application that may be displayed on a display of an electronic device, such as the electronic display 18 of the electronic device 10. For example, as discussed below, a user may interact with the joystick tool (e.g., via a touch screen display or the input structures 22), and a viewable portion of a document within an application may be changed based on the user's interaction with the joystick tool.
With the foregoing in mind,
At process block 62, the processor core complex 12 may display at least a portion of a productivity document. To help elaborate,
The productivity document 102 includes columns 110 and rows 112 of data. Additionally, the software application program 100 may include a column indicator 114 and a row indicator 116, which respectively indicate which column and row a particular datum (e.g., a cell within the spreadsheet) is included in. In some cases, the software application program 100 may also include tabs 118 that enable users to switch between different portions of the productivity document 102 within the software application program 100. For example, in the illustrated embodiment, the tabs 118 may be utilized to switch between two different spreadsheets within the productivity document 102. Additionally, the software application program 100 may include a new tab tool 120, which when selected by a user, may cause the processor core complex 12 to add a new tab (e.g., a new spreadsheet in a spreadsheet application) to the productivity document 102.
The productivity document 102 may have a bounding area, such as an area of the electronic display 18 or a portion of the electronic display 18. In some instances, the productivity document 102 provided by the software application program 100 may be larger than the display 18 of the electronic device 10 upon which the software application program 100 is displayed. In other words, a portion of the productivity document 102 may be displayed (e.g., a viewable portion) via the electronic display 18 while other portions of the productivity document 102 are not displayed. For example, in a spreadsheet document, there may be rows and/or columns of data that may not be displayed on the electronic display 18. In some cases, portions of the productivity document 102 may not be displayed due to a viewing perspective (e.g., zoom level), a size of the electronic display 18, the amount of data in the productivity document 102, or a combination thereof. For example, in embodiments in which the electronic device 10 is the handheld device 10B, the handheld device 10C, or the wearable electronic device 10E, the viewable portion of the productivity document 102 may be smaller than a viewable portion of the same document when displayed by an electronic device that may have a relatively larger display 18, such as the computer 10A or the computer 10D.
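By way of a non-limiting illustration, the relationship between a document and its viewable portion may be modeled as a viewport that is clamped to the document's edges, as in the following Swift sketch. The names and data model are hypothetical and are not prescribed by this disclosure.

```swift
import CoreGraphics

// A minimal model of a viewport into a larger document: only the part of
// the document inside `viewport` is displayed. Hypothetical names; one of
// many possible data models.
struct DocumentViewport {
    var documentSize: CGSize   // full extent of the productivity document
    var viewport: CGRect       // portion currently shown on the display

    // Keep the viewport from scrolling past the document's edges.
    mutating func clampToDocument() {
        viewport.origin.x = min(max(0, viewport.origin.x),
                                max(0, documentSize.width - viewport.width))
        viewport.origin.y = min(max(0, viewport.origin.y),
                                max(0, documentSize.height - viewport.height))
    }
}
```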
A user may navigate through the productivity document 102 to change which portion of the productivity document 102 is being displayed. For example, when the input structures 22 include a keyboard and/or mouse, a user may utilize the keyboard and/or mouse to navigate through the productivity document 102. Additionally, the input structures 22 may include the electronic display 18 in embodiments of the electronic device 10 in which the electronic display 18 is a touch screen. For instance, a user may drag a finger or stylus along the electronic display 18 to move the viewable portion of the productivity document 102 from one viewable portion to another.
As mentioned above, the software application program 100 may include a joystick tool that is defined by a bounding area 130. That is, the joystick tool may be a user interface feature provided within a portion of the bounding area of the productivity document 102. In other embodiments, the bounding area 130 may be larger than the bounding area associated with the productivity document 102 (e.g., when the productivity document 102 utilizes a relatively small portion of the electronic display 18). As will be discussed below, a user may interact with the joystick tool to navigate through the productivity document 102. It should be noted that when the joystick tool is not being displayed, the bounding area 130 may be transparent or not displayed. In other embodiments, the bounding area 130 may be slightly opaque, which may enable users to see where the bounding area 130 is within the software application program 100.
Returning to
At process block 66, the processor core complex 12 may display the joystick tool at a location within the productivity document 102 (and software application program 100) in response to receiving the user input to display the joystick tool. To help illustrate,
Additionally, it should be noted that in some embodiments, the joystick tool 140 may be provided without receiving user input to display the joystick tool 140. For example, the joystick tool 140 may be provided upon startup of the software application program 100 or loading or creation of the productivity document 102. As another example, the joystick tool 140 may be provided when a user navigates within the productivity document 102 via a manner other than utilizing the joystick tool 140.
Referring back to
In response to receiving user input within the bounding area of the joystick tool 140, at process block 70, the processor core complex 12 may determine a document navigational operation. The document navigational operation may be an operation that, when performed, causes a viewable portion of the productivity document 102 to change to another viewable portion of the productivity document 102. In particular, the processor core complex 12 may determine the document navigational operation based on the type of interaction the user has with the joystick tool 140. For example, when the user input is indicative of the user interacting with the joystick 142, the document navigational operation may be a vector that indicates a magnitude and direction that respectively correspond to a distance the user has moved the joystick 142 (e.g., a distance from a starting position of the joystick 142, such as the middle of the bounding area 130) and a direction the user has moved the joystick 142 (e.g., a direction relative to a starting position of the joystick 142). For example, the farther the user moves the joystick 142 from one portion of the bounding area 130 (e.g., a center point of the bounding area 130), the greater the magnitude.
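By way of a non-limiting illustration, such a vector may be derived from the joystick's displacement as in the following Swift sketch; the names are hypothetical.

```swift
import CoreGraphics
import Foundation

// A navigation vector derived from how far, and in which direction, the
// joystick has been moved from its rest position (e.g., the middle of the
// bounding area 130).
struct NavigationVector {
    var magnitude: Double   // distance from the rest position
    var angle: Double       // direction in radians from the positive x-axis
}

func navigationVector(joystick: CGPoint, restPosition: CGPoint) -> NavigationVector {
    let dx = Double(joystick.x - restPosition.x)
    let dy = Double(joystick.y - restPosition.y)
    return NavigationVector(magnitude: (dx * dx + dy * dy).squareRoot(),
                            angle: atan2(dy, dx))
}
```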
Regarding the direction indicated by the user input, in some embodiments, the processor core complex 12 may determine the user input as corresponding to one of several specific directions, such as up, down, left, or right, or a combination thereof (e.g., up and left, up and right, down and left, down and right). More specifically, when the direction indicated by the user input corresponds to a combination of directions, the direction determined by the processor core complex 12 may be similar to compass directions (e.g., up and right at a forty-five degree angle corresponding to northeast, up and right at a thirty degree angle corresponding to east-northeast, etc.). In other words, the processor core complex 12 may determine an angle relative to a position (e.g., a center point of the bounding area 130 or a previous position of the joystick 142), and the transition from one viewable portion of the productivity document 102 to another viewable portion of the productivity document 102 may correspond to a different but similar angle. For example, if the processor core complex 12 determines that the angle indicated by the user input is a first angle, the processor core complex 12 may determine a predefined angle that is most similar to the first angle, and the transition from one viewable portion to another viewable portion may correspond to the predefined angle that is similar to the first angle.
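As a non-limiting illustration, snapping a measured angle to the most similar predefined angle might be implemented as follows in Swift; the 22.5-degree step (sixteen compass-like headings) is an assumption, as the disclosure only requires choosing the predefined angle most similar to the measured one.

```swift
// Snap a free-form input angle to the nearest predefined heading. With a
// 22.5-degree step, the headings correspond to the sixteen compass points
// (north, north-northeast, northeast, and so on).
func snappedAngle(_ angle: Double, stepDegrees: Double = 22.5) -> Double {
    let stepRadians = stepDegrees * .pi / 180
    return (angle / stepRadians).rounded() * stepRadians
}
```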
As another example of determining a document navigational operation, when the user input is indicative of the user having selected a portion of the bounding area 130 other than the joystick 142, the processor core complex 12 may similarly determine a vector that indicates a magnitude and direction, or the processor core complex 12 may determine a jump position to which a “jump” will occur. For example, as discussed below, when a user interacts with some portions of the bounding area 130, the processor core complex 12 may cause a first viewable portion of the productivity document 102 to move to a second viewable portion of the productivity document 102 without displaying portions of the productivity document 102 that are between the first and second viewable portions.
At process block 72, the processor core complex 12 may adjust a viewable portion of the productivity document 102 based on the document navigational operation. To help illustrate,
Returning to the discussion of the process 60 in
In some embodiments, the visual indicator 150 may also indicate how quickly the viewable portion is changing (e.g., transitioning from one viewable portion of the productivity document 102 to another viewable portion). In other words, the visual indicator 150 may be an indicator of how far the user has moved the joystick 142 from the center of the bounding area 130 as well as the magnitude of the vector of the document navigational operation. For example, the size of the visual indicator 150 may increase the farther the joystick 142 has been moved from the center of the bounding area 130. As another example, the visual indicator 150 may be displayed with varying levels of transparency (or opacity) based on how far the joystick 142 has been moved from the center of the bounding area 130. For instance, the visual indicator 150 may become darker the farther the joystick 142 is moved from the center of the bounding area 130.
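As a non-limiting illustration, the joystick's displacement may be mapped onto an indicator's scale and opacity as in the following Swift sketch; the particular ranges are assumptions rather than values taken from this disclosure.

```swift
// Map the joystick's displacement onto the visual indicator's size and
// opacity, so the indicator grows and darkens the farther the joystick
// is moved from the center. The 0.25...1.0 opacity range and the 2x
// maximum scale are illustrative assumptions.
func indicatorAppearance(displacement: Double,
                         maxDisplacement: Double) -> (scale: Double, opacity: Double) {
    let t = min(max(displacement / maxDisplacement, 0), 1)  // normalize to 0...1
    return (scale: 1.0 + t,            // grows up to twice the base size
            opacity: 0.25 + 0.75 * t)  // darkens from faint to fully opaque
}
```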
Portions of the process 60 may be repeated while a user interacts with the software application program 100 or the productivity document 102. For example, the processor core complex 12 may receive multiple user inputs within the bounding area 130 during a user's experience with the productivity document 102. For instance, the user may move the joystick 142 from one position to another position. In response, the processor core complex 12 may determine a document navigational operation based on the input, adjust a viewable portion of the productivity document 102 based on the document navigational operation, and display the visual indicator 150 to indicate a direction of the viewable portion of the productivity document 102 being displayed relative to an original (or previous) viewable portion of the productivity document 102.
For example,
In other embodiments, the process 60 may include additional operations. For example, the processor core complex 12 may determine characteristics of the productivity document 102 such as the number of pages, columns 110, rows 112, and/or slides, which may depend on the type of the productivity document 102 (e.g., spreadsheet document, presentation document, text document). The characteristics may also relate to a portion of the productivity document 102 that is populated with user-created data (e.g., text data, image data, etc.). For instance, the processor core complex 12 may determine which portions of the productivity document 102 include user-created data. The processor core complex 12 may also determine settings of the software application program 100 such as a perspective or zoom level. Based on the characteristics of the productivity document 102 and settings of the software application program 100, the processor core complex 12 may determine one or more directions that a user can navigate within the productivity document 102 and indicate the directions with the joystick tool 140. For instance, the processor core complex 12 may determine that navigation may only occur leftwards or downwards within the productivity document 102 in some cases, while in other cases, the processor core complex 12 may determine that navigation can only occur upwards or downwards.
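By way of a non-limiting illustration, one way to determine the navigable directions is to compare the current viewport against the populated extent of the document, as in the following Swift sketch (hypothetical names and data model):

```swift
import CoreGraphics

// Decide which directions remain navigable given the populated extent of
// the document and the viewport currently on screen, so the joystick tool
// can indicate them to the user.
struct NavigableDirections {
    var up = false, down = false, left = false, right = false
}

func navigableDirections(documentBounds: CGRect, viewport: CGRect) -> NavigableDirections {
    var directions = NavigableDirections()
    directions.left  = viewport.minX > documentBounds.minX
    directions.right = viewport.maxX < documentBounds.maxX
    directions.up    = viewport.minY > documentBounds.minY
    directions.down  = viewport.maxY < documentBounds.maxY
    return directions
}
```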
To help illustrate,
Moreover, it should be noted that the region 160 may be displayed with a different opacity than the portions within the bounding area 130 that are not included within the region 160. For example, in some embodiments, the portions of the bounding area 130 that are not included in the region 160 may be relatively more transparent than the region 160. As another example, in some embodiments, the portions within the bounding area 130 that are located outside of the region 160 may be completely transparent or not displayed. In such embodiments, only the portions of the joystick tool 140 located within the region 160 may be displayed. In other words, the region 160 may be presented as, or instead of, the bounding area 130 of the joystick tool 140.
As described above, user inputs made with the joystick tool 140 may not utilize the joystick 142. For example, the joystick tool 140 may be utilized to jump within the productivity document 102 when a user interacts within the bounding area 130 but not with the joystick 142. With this in mind,
At process block 202, the processor core complex 12 may receive user input to jump to a portion of the productivity document 102. As discussed above, a user may select within the bounding area 130 (e.g., via the input structures 22 or the electronic display 18 when the electronic display 18 is a touch screen display) other than the joystick 142. For instance, a user may select a portion within the bounding area 130 that is relatively closer to the perimeter of the bounding area 130 relative to the joystick 142. Additionally, the specific user interaction with the joystick tool 140 may be in the form of a single tap, double tap, short press, long press, soft press, or hard press on the electronic display 18 in embodiments of the electronic device 10 in which the electronic display 18 is a touch screen display. Furthermore, the processor core complex 12 may discern any other input that is distinguishable from other types of inputs discussed above, such as moving the joystick 142 or making a dragging motion.
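As a non-limiting illustration, such interactions might be distinguished by tap count and press duration, as in the following Swift sketch; the thresholds are assumptions, and force-based inputs (soft and hard presses), which would additionally consult touch pressure, are omitted for brevity.

```swift
import Foundation

// Classify a touch interaction with the joystick tool by tap count and
// press duration. The 0.2 s and 0.5 s thresholds are illustrative only.
enum JoystickGesture {
    case singleTap, doubleTap, shortPress, longPress
}

func classify(tapCount: Int, pressDuration: TimeInterval) -> JoystickGesture {
    if tapCount >= 2 { return .doubleTap }
    if pressDuration < 0.2 { return .singleTap }
    return pressDuration < 0.5 ? .shortPress : .longPress
}
```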
For instance,
Returning to
Referring back to
As discussed above, the user interaction or input within the bounding area 130 may be in a special region (e.g., special region 224 or center region 236) or in the central region 232. With this in mind,
At process block 252, the processor core complex 12 may receive user input to jump to a portion of the productivity document 102. For instance, as discussed above, the user input may correspond to a user interaction with a portion of the joystick tool 140 other than the joystick 142. For instance, the user may select within the bounding area 130 at a position that does not include the joystick 142.
At decision block 254, the processor core complex 12 may determine whether the user input is indicative of a special region. For example, as described above with respect to
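By way of a non-limiting illustration, decision block 254 might hit-test the touch location against the regions of the bounding area 130, as in the following Swift sketch; the region geometry is an assumption.

```swift
import CoreGraphics

// Determine whether a touch falls in a special region near the perimeter
// of the bounding area 130 rather than in the central region. The inset
// that separates the two regions is illustrative only.
func isInSpecialRegion(touch: CGPoint, boundingArea: CGRect,
                       perimeterInset: CGFloat = 20) -> Bool {
    let centralRegion = boundingArea.insetBy(dx: perimeterInset, dy: perimeterInset)
    // Inside the bounding area, but outside the inset central region.
    return boundingArea.contains(touch) && !centralRegion.contains(touch)
}
```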
At process block 258, the processor core complex 12 may adjust the viewable portion of the productivity document 102 to the jump position. In other words, the processor core complex 12 may cause the portion of the productivity document 102 associated with the user input to be displayed. For example, as described above, the processor core complex 12 may cause the viewable portion of the productivity document 102 being displayed to change to a different viewable portion of the productivity document 102 without showing a transition between the two viewable portions.
If at decision block 254 the processor core complex 12 determines that the user input is not indicative of a special region, at process block 256, the processor core complex 12 may identify a jump position based on a vector. More specifically, the processor core complex 12 may determine a vector based on the user input and determine a jump position based on the vector. For example, when the user input indicates a portion of the central region 232, the processor core complex 12 may determine a vector (e.g., having a magnitude and direction). The vector may be determined similarly as discussed above. For instance, the magnitude of the vector may be determined based on how far from the center region 236 or a center point of the bounding area 130 the user input is, while a direction of the vector may be determined based on the location of the input relative to the center region 236 or center point of the bounding area 130 (e.g., left, right, up, down, or a combination thereof).
Based on the vector, the processor core complex 12 may determine a jump position. For example, the processor core complex 12 may determine a portion of the productivity document 102 to display based on the vector. More specifically, the vector may correspond to a movement from a currently displayed viewable portion of the productivity document 102, and the processor core complex 12 may display another viewable portion of the productivity document 102 based on a portion of the productivity document 102 indicated by the vector relative to the viewable portion of the productivity document 102 being displayed before the transition to the other viewable portion. For instance,
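As a non-limiting illustration, the following Swift sketch translates the touch's offset from the center of the bounding area 130 into a jump of the viewport; the scale factor and the clamping behavior are assumptions.

```swift
import CoreGraphics

// Jump the viewport by a vector derived from the touch's offset from the
// center of the bounding area. `jumpScale` converts on-screen offset into
// document distance and is illustrative only.
func jumpViewport(current: CGRect, touch: CGPoint, center: CGPoint,
                  documentBounds: CGRect, jumpScale: CGFloat = 10) -> CGRect {
    var next = current
    next.origin.x += (touch.x - center.x) * jumpScale
    next.origin.y += (touch.y - center.y) * jumpScale
    // Keep the jump target within the document.
    next.origin.x = min(max(documentBounds.minX, next.origin.x),
                        documentBounds.maxX - next.width)
    next.origin.y = min(max(documentBounds.minY, next.origin.y),
                        documentBounds.maxY - next.height)
    return next
}
```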
Additionally, at process block 258, the processor core complex 12 may adjust the viewable portion of the productivity document 102 to the jump position determined based on the vector. In other words, the processor core complex 12 may cause the portion of the productivity document 102 associated with the user input to be displayed. For example, as described above, the processor core complex 12 may cause the viewable portion of the productivity document 102 being displayed to change to a different viewable portion of the productivity document 102 without showing a transition between the two viewable portions. With this in mind,
In some embodiments, the processor core complex 12 may determine a type of user interaction within the bounding area 130 and utilize the type of interaction in determining a viewable portion of the productivity document 102 to present. For example, different interactions with the special regions may cause different viewable portions to be displayed. For instance, in one embodiment, a user may double-tap a special region, and a jump to a viewable portion of the productivity document 102 associated with the special region may be performed. However, when the user continuously selects (e.g., performs a long press on) a special region, the viewable portion may transition to another viewable portion similar to when the joystick 142 is utilized. For example, it may appear as though there is a sliding within the productivity document 102 in a direction corresponding to the special region indicated by the user input, and the visual indicator 150 and portion indicator 152 may be displayed. As another example, certain interactions with the center region 236 may cause a previously displayed viewable portion (e.g., a viewable portion displayed prior to the most recent movement made via the joystick 142) to be displayed. For instance, if a first viewable portion was displayed, then a second viewable portion was displayed, a user may interact with the center region 236 (e.g., double-tap, short-press, or another type of interaction) to cause the first viewable portion to be displayed. As yet another example, a user may interact with the central region 232 to cause a relatively short jump to occur.
In some embodiments, the user may interact with the joystick tool 140 to disable the joystick tool 140 and/or the joystick 142 or otherwise cause the joystick tool 140 and/or joystick 142 to no longer be displayed. For example, in one embodiment, a user could select the joystick 142 without moving the joystick 142 and/or swipe up past a boundary of the bounding area 130, and the processor core complex 12 may interpret the user input as a request to stop displaying the joystick tool 140 and/or the joystick 142. With the joystick 142 not displayed, the user may be able to better interact with the embodiment of the joystick tool 140 illustrated in
Moreover, the processor core complex 12 may cause the electronic device 10 to provide feedback, such as visual and/or haptic feedback, to alert a user that an input is acknowledged. For example, as discussed above, the visual indicator 150 may be displayed based on a user's interaction with the joystick 142. Additionally, the electronic device 10 may vibrate or otherwise provide haptic feedback in response to receiving user input. For example, in response to receiving a user input indicative of an interaction with or selection of a special region, the processor core complex 12 may cause the electronic device 10 to vibrate in addition to causing a new viewable portion of the productivity document 102 to be determined and displayed.
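By way of a non-limiting illustration, on an iOS-based embodiment such haptic feedback might be produced with UIKit's feedback generator, as in the following Swift sketch; the feedback style is an assumption.

```swift
import UIKit

// Acknowledge a selection (e.g., of a special region) with haptic feedback.
func acknowledgeSelection() {
    let generator = UIImpactFeedbackGenerator(style: .medium)
    generator.prepare()          // reduces latency before the impact fires
    generator.impactOccurred()   // plays the haptic tap
}
```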
Furthermore, while the joystick tool 140 is described as being provided by the software application program 100, in other embodiments, the joystick tool 140 may be included in a software development kit (SDK) associated with the electronic device 10 or software included on, or associated with, the electronic device 10. For example, the joystick tool 140 may be included as part of an SDK included with an operating system of the electronic device 10 or a software package or a software platform that may be included on, or accessible to, the electronic device 10. Accordingly, in some cases the joystick tool 140 may be utilized with software or applications other than the software application program 100.
The technical effects of the present disclosure include a joystick tool 140 that may be utilized to navigate within a productivity document 102 provided via a software application program 100. More specifically, the joystick tool 140 may be presented on a display 18 of an electronic device 10 with the productivity document 102, and a user of the electronic device 10 may interact with the joystick tool 140 to navigate within the productivity document 102. For instance, the joystick tool 140 may include a joystick 142 that the user may interact with to cause a viewable portion of the productivity document 102 that is displayed to transition to another viewable portion of the productivity document 102. Additionally, a user may select within a bounding area 130 associated with the joystick tool 140 to cause the viewable portion of the productivity document 102 to jump to another viewable portion of the productivity document 102. Accordingly, a user may be able to navigate through the productivity document 102 in a convenient and intuitive manner, especially on electronic devices with relatively small displays and/or electronic devices that do not utilize input structures typically associated with computers, such as keyboards, mice, and/or trackpads.
The specific embodiments described above have been shown by way of example, and it should be understood that these embodiments may be susceptible to various modifications and alternative forms. It should be further understood that the claims are not intended to be limited to the particular forms disclosed, but rather to cover all modifications, equivalents, and alternatives falling within the spirit and scope of this disclosure.
The techniques presented and claimed herein are referenced and applied to material objects and concrete examples of a practical nature that demonstrably improve the present technical field and, as such, are not abstract, intangible or purely theoretical. Further, if any claims appended to the end of this specification contain one or more elements designated as “means for [perform]ing [a function] . . . ” or “step for [perform]ing [a function] . . . ”, it is intended that such elements are to be interpreted under 35 U.S.C. 112(f). However, for any claims containing elements designated in any other manner, it is intended that such elements are not to be interpreted under 35 U.S.C. 112(f).