Workers (for example, public safety personnel, utility workers, and construction workers) responding to individual task requests (for example, incident reports, calls for service, and work orders) may use portable electronic devices to assist them during the performance of their duties. Some portable electronic devices, for example smart telephones, provide a suite of applications that interact with and consume data from computer systems that coordinate work and assign tasks to workers (for example, computer-aided dispatch systems and workflow ticketing systems). Such application suites offer workers access to many potentially relevant applications while responding to task requests.
The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments of concepts that include the claimed invention, and explain various principles and advantages of those embodiments.
Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of embodiments of the present invention.
The device and method components have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
Typically, switching to a different application while operating in another application on a portable electronic device requires opening and navigating a menu before selecting the desired application. This may be distracting or time consuming, particularly for emergency personnel who may need to use several applications during an emergency situation. In addition, emergency personnel may be able to put time spent switching between applications to better use in response activities. Accordingly, methods and systems are provided herein for application navigation on a portable device.
One example embodiment provides a portable device. The device includes a display and an electronic processor coupled to the display. The electronic processor is configured to associate a set of applications to each other within a folder stored on the portable device and assign each application of the set of applications a priority relative to the other applications of the set of applications. The electronic processor is further configured to receive, via an interface of the portable device, a first user input selecting the folder and in response to receiving the first user input, activate the set of applications in a background of an operating system of the portable device and present, via the display, a first indication of a first application of the set of applications based on the priority of the first application relative to the other applications of the set of applications. The electronic processor is also configured to receive, via the interface of the portable device, a second user input; and in response to receiving the second user input, navigate to a first indication of a second application of the set of applications based on the priority of the second application relative to the other applications of the set of applications and a navigation direction associated with the second user input.
Another example embodiment provides a method of application navigation on a portable device. The method includes associating a set of applications to each other within a folder stored on the portable device and assigning each application of the set of applications a priority relative to the other applications of the set of applications. The method also includes receiving, via an interface of the portable device, a first user input selecting the folder and in response to receiving the first user input, activating the set of applications in a background of an operating system of the portable device and presenting a first indication of a first application of the set of applications based on the priority of the first application relative to the other applications of the set of applications. The method also includes receiving, via the interface of the portable device, a second user input including a gesture and in response to receiving the second user input, navigating to a second indication of a second application of the set of applications based on the priority of the second application relative to the other applications of the set of applications and a navigation direction associated with the second user input.
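The folder behavior summarized above, in which a set of applications is associated within a folder, each application is assigned a relative priority, and a first user input selecting the folder activates the set in the background and indicates the highest-priority application, can be sketched as follows. This is an illustrative model only; the class and application names are hypothetical and do not appear in the disclosure.

```python
class AppFolder:
    """Hypothetical model of a folder of priority-ordered applications."""

    def __init__(self, name, apps_by_priority):
        # apps_by_priority: application names, highest priority first
        self.name = name
        self.apps = list(apps_by_priority)
        self.background_active = False
        self.index = 0  # currently indicated application

    def open(self):
        # First user input selecting the folder: activate every
        # application in the background and indicate the
        # highest-priority application of the set.
        self.background_active = True
        self.index = 0
        return self.apps[self.index]

# Illustrative usage with hypothetical application names.
folder = AppFolder("Incident Response", ["mapping", "records", "camera"])
indicated = folder.open()
assert indicated == "mapping"        # highest-priority application indicated first
assert folder.background_active      # entire set activated in the background
```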
For ease of description, some or all of the example systems presented herein are illustrated with a single exemplar of each of its component parts. Some examples may not describe or illustrate all components of the systems. Other example embodiments may include more or fewer of each of the illustrated components, may combine some components, or may include additional or alternative components.
The electronic processor 102 obtains and provides information (for example, from the memory 104 and/or the input and output interface 106), and processes the information by executing one or more software instructions or modules, capable of being stored, for example, in a random access memory (“RAM”) area of the memory 104 or a read only memory (“ROM”) of the memory 104 or another non-transitory computer readable medium (not shown). The software can include firmware, one or more applications, program data, filters, rules, one or more program modules, and other executable instructions.
The memory 104 can include one or more non-transitory computer-readable media, and includes a program storage area and a data storage area. The program storage area and the data storage area can include combinations of different types of memory, as described herein. In the embodiment illustrated, the memory 104 stores, among other things, an operating system 113 of the portable electronic device 100, a folder 114, a first application 116, and a second application 118 (described in detail below). The electronic processor 102 is configured to retrieve from the memory 104 and execute, among other things, software related to the control processes and methods described herein, for example, the operating system 113 and the first and second applications 116, 118.
The input and output interface 106 is configured to receive input and to provide output to peripherals. The input and output interface 106 obtains information and signals from, and provides information and signals to, (for example, over one or more wired and/or wireless connections) devices both internal and external to the portable electronic device 100.
The electronic processor 102 is configured to control the transceiver 108 to transmit and receive data to and from the portable electronic device 100. The electronic processor 102 encodes and decodes digital data sent and received by the transceiver 108. The transceiver 108 transmits and receives radio signals to and from various wireless communications networks using the antenna 110. The electronic processor 102 and the transceiver 108 may include various digital and analog components, which for brevity are not described herein and which may be implemented in hardware, software, or a combination of both. Some embodiments include separate transmitting and receiving components, for example, a transmitter and a receiver, instead of a combined transceiver 108.
The display 112 is a suitable display, for example, a liquid crystal display (LCD) touch screen, or an organic light-emitting diode (OLED) touch screen. The portable electronic device 100 implements a graphical user interface (GUI) (for example, generated by the electronic processor 102, from instructions and data stored in the memory 104, and presented on the display 112), that enables a user to interact with the portable electronic device 100. The graphical user interface presented herein allows interaction with the interface using gesture-based inputs. Embodiments presented herein are described in terms of gestures received by a touch screen interface. However, in other embodiments, gestures could be captured via a cursor-control device and through input actions such as mouse clicks. Thus, a touch screen is not necessary in all instances.
In some embodiments, the portable electronic device 100 is a smart phone. In other embodiments, the portable electronic device 100 may be a tablet computer, a smart watch, a portable radio, a combination of the foregoing, or another portable or mobile electronic device containing software and hardware enabling it to operate as described herein.
In some embodiments, the method 200 includes an initial step of creating one or more folders.
In some embodiments, the folder is able to be assigned or reassigned an identifier, for example a name. The identifier may be assigned by a user via the interface of the portable electronic device 100 or predetermined based on a configuration command from a remote system.
At block 206, the electronic processor 102 receives, via the interface of the portable electronic device 100, a first user input selecting the folder 114. In response to receiving the first user input, the set of applications within the folder 114 activates in the background of the operating system 113 of the portable electronic device 100 (block 208), and the electronic processor 102 presents, via the display, a first indication of the first application based on the priority of the first application relative to the other applications of the set of applications. For example, the indication of the application with the highest priority (in this case, the second application 118) of the set of applications is presented on the display 112 of the portable electronic device 100 (block 209). The indication may be an icon associated with the application, a graphical view of the application, or the application itself.
At block 210, the electronic processor 102 receives, via the interface of the portable electronic device 100, a second user input. In response to the second user input, the electronic processor 102 navigates to another indication of another application, for example, the second application 118, within the folder based on the priority of the application relative to the other applications within the set of applications and a navigation direction associated with the second user input (block 212). In some embodiments, the electronic processor 102 is configured to determine a gesture type of the second user input. The gesture type may be either a first gesture type or a second gesture type. Gesture types include a left slide or swipe, a right slide or swipe, a single tap, a double tap, a circular gesture, and a custom gesture, all of which may be performed with a single finger or two fingers. These gesture types should not be considered limiting. In other embodiments, gestures may be received using virtual or augmented reality systems, which detect, for example, the movement of the eyes, arms, hands, or fingers.
The gesture type corresponds to the navigation direction within the set of applications. The electronic processor 102 may be configured to associate the first gesture type with a first navigational direction of decreasing priority and the second gesture type with a second navigational direction of increasing priority. The electronic processor 102 determines whether the gesture is of the first gesture type or of the second gesture type. When it is determined that the gesture includes the first gesture type, the electronic processor 102 navigates to and displays on the display 112 an indication of the next application of the priority ordered set assigned a priority lower than the priority of the present application. Alternatively, when it is determined that the gesture includes the second gesture type, the electronic processor 102 navigates to and displays on the display 112 an indication of the next application of the priority ordered set assigned a priority higher than the priority of the present application.
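The mapping of gesture types to navigational directions described above can be sketched as follows. The gesture names and the clamping behavior at the ends of the set are illustrative assumptions; the disclosure does not bind particular gestures to particular directions.

```python
# Hypothetical gesture-type bindings: a first gesture type navigates
# toward decreasing priority, a second toward increasing priority.
FIRST_GESTURE = "left_swipe"    # decreasing priority
SECOND_GESTURE = "right_swipe"  # increasing priority

def navigate(apps, index, gesture):
    """Return the index of the next indicated application.

    apps is ordered highest priority first, so navigating toward
    decreasing priority moves to a higher list index. Indices are
    clamped at the ends of the priority ordered set here.
    """
    if gesture == FIRST_GESTURE:
        return min(index + 1, len(apps) - 1)
    if gesture == SECOND_GESTURE:
        return max(index - 1, 0)
    return index  # unrecognized gesture: no navigation

# Illustrative usage with hypothetical application names.
apps = ["mapping", "records", "camera"]  # highest priority first
i = navigate(apps, 0, FIRST_GESTURE)
assert apps[i] == "records"              # next-lower-priority application
```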
In some embodiments, the electronic processor 102 receives, via the interface of the portable electronic device 100 (in the example described, the display 112), a user selection of the gesture or gesture types to associate with the first navigational direction and the second navigational direction.
In some embodiments, the electronic processor 102 receives a user input selecting a scrolling type from a group of scrolling types. In one example, the group of scrolling types includes a circular (wrap-around) list and a first-to-last list. When the scrolling type is a first-to-last list and the electronic processor 102 receives a gesture navigating in a priority direction past the last indication of the priority ordered set, the last indication remains present on the display 112 unless the gesture corresponds to the opposite direction of priority. When the scrolling type is a circular list and the electronic processor 102 receives a gesture navigating in a priority direction past the last indication of the priority ordered set, the electronic processor 102 “circles back” to the first indication at the top of the priority ordered set. The scrolling between applications may be in a vertical direction or a horizontal direction. In some embodiments, the scrolling type is selected by a user of the portable electronic device 100, for example, by selecting a create folder option 306.
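The difference between the two scrolling types can be made concrete with a short sketch. The function name and parameter names are illustrative assumptions.

```python
def next_index(index, length, step, scrolling="first_to_last"):
    """Hypothetical index update for the two scrolling types.

    A circular (wrap-around) list circles past either end of the
    priority ordered set; a first-to-last list keeps the last
    indication in place when navigated past the end.
    """
    if scrolling == "circular":
        return (index + step) % length              # wrap past either end
    return max(0, min(index + step, length - 1))    # clamp at the ends

# With three applications (indices 0..2), navigating past the last one:
assert next_index(2, 3, +1, "first_to_last") == 2   # last indication remains
assert next_index(2, 3, +1, "circular") == 0        # circles back to the first
```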
In some embodiments, the electronic processor 102 is configured to implement the method 200 collaboratively across multiple portable devices used by groups of related users.
At block 704, the commanding device 602 receives a user input including a gesture. At block 706, the commanding device 602 then navigates from the first indication of the first application to the first indication of the second application based on the priority (as described above in regard to blocks 210 and 212).
In some embodiments, when another portable device 100 joins the group 600, the new portable device 100 is configured to send a notice message to the commanding device 602, either directly or through the server 601. The commanding device 602 (or the server 601) receives the notice message and adds the portable device 100 to the group 600. Likewise, one of the portable devices 100 within the group 600 may leave the group 600 by sending a stop synchronization message directly to the commanding device 602 or through the server 601. The commanding device 602 (or the server 601) receives the stop synchronization message, removes the portable device 100 from the group 600, and no longer sends preconfigured navigation synchronization commands to the portable device 100.
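The group membership maintenance described above can be sketched as follows. The message names follow the description; the class, method, and device names are illustrative assumptions, and real embodiments would carry these messages over the wireless transports described earlier.

```python
class CommandingDevice:
    """Hypothetical commanding-device bookkeeping for a synchronized group."""

    def __init__(self):
        self.group = set()  # device identifiers currently in the group

    def handle(self, message, device_id):
        if message == "notice":
            # Notice message: add the joining device to the group.
            self.group.add(device_id)
        elif message == "stop_synchronization":
            # Stop synchronization message: remove the device; it will
            # no longer receive navigation synchronization commands.
            self.group.discard(device_id)

    def send_navigation_command(self, command):
        # Only current group members receive the synchronization command.
        return {device: command for device in self.group}

# Illustrative usage with hypothetical device identifiers.
cmd = CommandingDevice()
cmd.handle("notice", "device_A")
cmd.handle("notice", "device_B")
cmd.handle("stop_synchronization", "device_B")
assert cmd.send_navigation_command("next") == {"device_A": "next"}
```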
Although the method 700 describes the commanding device 602 communicating with the other portable devices 100 through the server 601, it should be understood that in some embodiments, the commanding device 602 communicates with the other portable devices 100 directly (without the server 601).
In the foregoing specification, specific embodiments have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of present teachings.
The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all the claims. The invention is defined solely by the appended claims including any amendments made during the pendency of this application and all equivalents of those claims as issued.
Moreover, in this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” “has,” “having,” “includes,” “including,” “contains,” “containing” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, contains a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “comprises . . . a,” “has . . . a,” “includes . . . a,” or “contains . . . a” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, contains the element. The terms “a” and “an” are defined as one or more unless explicitly stated otherwise herein. The terms “substantially,” “essentially,” “approximately,” “about” or any other version thereof, are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1% and in another embodiment within 0.5%. The term “coupled” as used herein is defined as connected, although not necessarily directly and not necessarily mechanically. A device or structure that is “configured” in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
It will be appreciated that some embodiments may be comprised of one or more generic or specialized processors (or “processing devices”) for example microprocessors, digital signal processors, customized processors and field programmable gate arrays (FPGAs) and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the method and/or apparatus described herein. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of the two approaches could be used.
Moreover, an embodiment can be implemented as a computer-readable storage medium having computer readable code stored thereon for programming a computer (e.g., comprising a processor) to perform a method as described and claimed herein. Examples of such computer-readable storage mediums include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory) and a Flash memory. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.
The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.