Some modern computing devices can be unlocked with a touch gesture supplied by a user to a touchscreen. Once a device is unlocked, a user can launch an application by selecting it via the touchscreen.
Technologies are described herein that provide for the unlocking of a computing device and the launching of a particular application with a single gesture applied to a touchscreen. The single gesture can comprise a portion of an unlock gesture and an application selection gesture. For example, a user can unlock a device and launch a desired application by first sliding an icon from a starting location along a first track (a portion of an unlock gesture) and then sliding the icon toward an application icon located near the end of a second track (an application selection gesture). By being able to unlock a computing device and launch a specific application with a single gesture, a user is spared from having to apply multiple gestures to achieve the same result.
Reference is now made to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding thereof. It may be evident, however, that the novel embodiments can be practiced without these specific details. In other instances, known structures and devices are shown in block diagram form in order to facilitate a description thereof. The intention is to cover all modifications, equivalents, and alternatives within the scope of the claims.
In
A user can unlock the computing device 110 and launch a particular application by applying a single gesture to the touchscreen 105. The single gesture can comprise a portion of an unlock gesture and an application selection gesture. Applying the unlock gesture to the touchscreen 105 can unlock the device 110 without launching a user-selected application. In the user interface 101, the unlock gesture comprises sliding the icon 124 from the starting point 126 to the opposite end of the main track 115, toward the unlock icon 144. Thus, a portion of the unlock gesture comprises moving the icon 124 toward, but not all of the way to, the end of the main track 115. In the user interface 101, the application selection gesture comprises a user sliding the icon 124 along one of the spurs 116-122 from the point where the spur connects to the main track 115 to the end of the spur.
Accordingly, to unlock the computing device 110 and launch a messaging application with a single gesture, a user can first move the icon 124 horizontally from the starting position 126 to the point where the main track 115 and the spur 116 meet (a portion of the unlock gesture) and then upwards vertically along spur 116 to the end of spur 116 (an application selection gesture), as indicated by path 140. To unlock the device 110 and launch a camera application associated with the camera application icon 134, the user can first move the icon 124 horizontally from the starting position 126 to the point where the main track 115 meets the spur 119, and then downwards vertically along spur 119 to the end of spur 119 (an application selection gesture), as indicated by path 142.
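By way of non-limiting illustration, the following Kotlin sketch models one way a captured drag path could be evaluated against a track-and-spur layout of the kind described above. The names (e.g., TrackAndSpurLayout, resolveGesture), the coordinate values and the tolerance are assumptions made for the sketch and are not details taken from the embodiments.

    import kotlin.math.abs

    data class Point(val x: Float, val y: Float)

    data class Spur(val xOnMainTrack: Float, val endY: Float, val appId: String)

    class TrackAndSpurLayout(
        private val start: Point,            // icon starting position on the main track
        private val spurs: List<Spur>,       // spurs branching off the horizontal main track
        private val tolerance: Float = 24f   // how far a touch may stray from a track
    ) {
        // Returns the application to launch, or null if the drag never reaches a spur's end.
        fun resolveGesture(path: List<Point>): String? {
            if (path.isEmpty()) return null
            // Every sample must lie on the main track or on one of the spurs.
            for (p in path) {
                val onMainTrack = abs(p.y - start.y) <= tolerance
                val onSpur = spurs.any { abs(p.x - it.xOnMainTrack) <= tolerance }
                if (!onMainTrack && !onSpur) return null   // strayed off the tracks
            }
            // Launch only if the drag ends at the end of some spur.
            val last = path.last()
            return spurs.firstOrNull {
                abs(last.x - it.xOnMainTrack) <= tolerance && abs(last.y - it.endY) <= tolerance
            }?.appId
        }
    }

    fun main() {
        val layout = TrackAndSpurLayout(
            start = Point(40f, 300f),
            spurs = listOf(
                Spur(xOnMainTrack = 160f, endY = 120f, appId = "messaging"),
                Spur(xOnMainTrack = 280f, endY = 480f, appId = "camera")
            )
        )
        // Right along the main track, then up the first spur to its end.
        val path = listOf(Point(40f, 300f), Point(160f, 300f), Point(160f, 120f))
        println(layout.resolveGesture(path))   // prints "messaging"
    }

In this sketch, a drag that strays off every track or stops short of a spur's end simply leaves the device locked, mirroring the behavior described above.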
Other track and application icon arrangements in which an icon is moved in a first direction along a first track from a starting position and then in a second direction along a second track to unlock a device and select an application are possible. For example, it is not necessary that the tracks be straight lines. In some embodiments, one or more of the tracks can be curved. Moreover, it is not necessary that the tracks have a main track-spur configuration. In various embodiments, application icons for any combination of applications that can be executed on the device 110 can be included in an unlock-and-launch user interface. Furthermore, it is not necessary that an unlock icon be displayed in the user interface. In addition, some tracks in an unlock-and-launch user interface may not be associated with an application. For example, a user may have removed an application from being associated with a track, or may not yet have assigned an application to a track.
In some embodiments, spur length, the distance between spurs and/or the distance from the starting location of the icon to the nearest spur, as well as additional unlock-and-launch user interface characteristics, can be selected to reduce the likelihood that the icon could be unintentionally moved from the starting position to the end of one of the spurs. In some embodiments, the icon can automatically return to the starting position once the touching object (finger, stylus, etc.) that moved the icon away from the starting position is no longer in contact with the touchscreen.
In various embodiments, an unlock-and-launch user interface can include application indicators other than application icons to indicate the applications that can be launched from a locked device. Examples of other application indicators include thumbnails of application screenshots, application names, or track characteristics (e.g., track color, shape or length). For example, a yellow spur could be associated with an email application.
In
In
In
The computing device 200 can detect an unlock gesture while the touching object is in contact with the touchscreen in various manners. For example, the computing device can determine whether user input comprises an unlock gesture after the touching object has been substantially stationary for a specified period of time, once the area occupied by the user input exceeds a specified area threshold, after a distance traced by the touching object on the touchscreen has exceeded a specified distance, or after the touching object has changed direction more than a specified number of times.
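Purely as an illustrative Kotlin sketch of such detection logic, the heuristics listed above could be combined as follows. The threshold values, the class name UnlockInputClassifier and the method shouldEvaluate are assumptions made for the sketch, not requirements of the embodiments.

    import kotlin.math.hypot

    data class TouchSample(val x: Float, val y: Float, val timeMs: Long)

    class UnlockInputClassifier(
        private val stationaryMs: Long = 400,     // dwell time treated as "substantially stationary"
        private val minDistancePx: Float = 200f,  // traced-distance threshold
        private val maxDirectionChanges: Int = 2  // allowed horizontal direction reversals
    ) {
        // Decides whether the in-progress input should now be evaluated as an unlock gesture.
        fun shouldEvaluate(samples: List<TouchSample>, nowMs: Long): Boolean {
            if (samples.size < 2) return false
            var distance = 0f
            var directionChanges = 0
            var lastDx = 0f
            var lastMoveMs = samples.first().timeMs
            for (i in 1 until samples.size) {
                val dx = samples[i].x - samples[i - 1].x
                val dy = samples[i].y - samples[i - 1].y
                val step = hypot(dx, dy)
                distance += step
                if (step > 2f) lastMoveMs = samples[i].timeMs   // the touch is still moving
                if (lastDx != 0f && dx != 0f && (dx > 0) != (lastDx > 0)) directionChanges++
                if (dx != 0f) lastDx = dx
            }
            return (nowMs - lastMoveMs) >= stationaryMs ||   // stationary long enough
                distance >= minDistancePx ||                 // traced far enough
                directionChanges > maxDirectionChanges       // changed direction often enough
        }
    }

    fun main() {
        val classifier = UnlockInputClassifier()
        val samples = listOf(TouchSample(0f, 0f, 0), TouchSample(250f, 0f, 150))
        println(classifier.shouldEvaluate(samples, nowMs = 200))   // true: distance threshold met
    }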
The application indicators presented at a touchscreen as part of an unlock-and-launch user interface can be configurable. In some embodiments, a user can select the application indicators to be displayed in an unlock-and-launch user interface and their arrangement.
The applications that can be launched from an unlock-and-launch user interface can be selected in other manners. For example, the user can navigate to a settings menu of the computing device that allows the user to select which applications are to be included in an unlock-and-launch user interface.
In some embodiments, the applications that can be launched from an unlock-and-launch user interface can be automatically selected by a computing device based on application usage, such as frequency or recency of use. For example, an unlock-and-launch user interface can comprise applications most frequently used over a default or configurable time period (e.g., day, week, month, year, operational lifetime of the device), applications that have been used at least a certain number of times within a recent time period, or the most recently used applications within a recent time period. In some embodiments, application icons associated with more frequently or recently used applications are positioned closer to the icon starting point than application icons associated with less frequently or recently used applications.
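By way of non-limiting example, frequency-based selection could be implemented along the following lines in Kotlin. The one-week window, the four slots and the names UsageRecord and selectByFrequency are assumptions made for the sketch.

    import java.time.Instant
    import java.time.temporal.ChronoUnit

    data class UsageRecord(val appId: String, val launchedAt: Instant)

    // Returns up to `slots` application identifiers, most frequently used first, counting only
    // launches within the last `days` days. A caller could place earlier entries on tracks
    // nearer the icon starting point.
    fun selectByFrequency(history: List<UsageRecord>, days: Long = 7, slots: Int = 4): List<String> {
        val cutoff = Instant.now().minus(days, ChronoUnit.DAYS)
        return history.asSequence()
            .filter { it.launchedAt.isAfter(cutoff) }
            .groupingBy { it.appId }
            .eachCount()
            .entries
            .sortedByDescending { it.value }
            .take(slots)
            .map { it.key }
    }

    fun main() {
        val now = Instant.now()
        val history = listOf(
            UsageRecord("email", now.minus(1, ChronoUnit.DAYS)),
            UsageRecord("email", now.minus(2, ChronoUnit.DAYS)),
            UsageRecord("browser", now.minus(3, ChronoUnit.DAYS)),
            UsageRecord("camera", now.minus(30, ChronoUnit.DAYS))   // outside the one-week window
        )
        println(selectByFrequency(history))   // [email, browser]
    }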
In some embodiments, the applications that can be launched from an unlock-and-launch user interface can be selected based on an operating context of the computing device. For example, the applications included in an unlock-and-launch interface can depend on the time. For instance, during typical working hours (e.g., 8:00 AM-5:00 PM on weekdays), the applications included in an unlock-and-launch user interface can comprise work productivity applications, such as word processing and spreadsheet applications, and an email application with access to a work email account of the user. During typical non-working hours, such as weekends and weekday evenings, the applications that can be launched from an unlock-and-launch user interface can include recreational and leisure applications, such as gaming, social networking, personal finance or exercise applications.
Applications included in an unlock-and-launch interface can depend on device location as well, which can be determined by, for example, GPS, Wi-Fi positioning, cell tower triangulation or other methods. For example, work-related applications can be presented in an unlock-and-launch user interface when a device is determined to be located at a user's place of work, and non-work-related applications can be presented when the user is elsewhere. For example, an exercise application can be included if the user is at his or her gym; and gaming, media player or social network applications can be included when the user is at home.
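The following Kotlin sketch illustrates one possible time-and-location rule of this kind. The working hours, the place labels and the application names are assumptions chosen for illustration and are not prescribed by the embodiments.

    import java.time.DayOfWeek
    import java.time.LocalDateTime

    enum class Place { WORK, HOME, GYM, OTHER }

    // One possible context rule: work applications during working hours at the workplace,
    // leisure applications otherwise.
    fun selectByContext(now: LocalDateTime, place: Place): List<String> {
        val weekday = now.dayOfWeek !in setOf(DayOfWeek.SATURDAY, DayOfWeek.SUNDAY)
        val workingHours = weekday && now.hour in 8..16            // roughly 8:00 AM-5:00 PM
        return when {
            workingHours && place == Place.WORK ->
                listOf("word_processor", "spreadsheet", "work_email")
            place == Place.GYM -> listOf("exercise_tracker", "media_player")
            place == Place.HOME -> listOf("games", "social", "media_player")
            else -> listOf("social", "personal_finance", "games")
        }
    }

    fun main() {
        println(selectByContext(LocalDateTime.of(2024, 3, 12, 10, 0), Place.WORK))  // work applications
        println(selectByContext(LocalDateTime.of(2024, 3, 16, 19, 0), Place.HOME))  // leisure applications
    }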
In some embodiments, an unlock-and-launch user interface can comprise tracks associated with a user-specified application and tracks that are associated with an application depending on application usage and/or device context. For example, with reference to
The applications to be included in an unlock-and-launch user interface based on device context can be user-selected or selected automatically by the computing device. For example, a user can set up various context profiles based on the time, device location and/or other factors. A context profile can indicate applications that can be presented for selection in an unlock-and-launch user interface if conditions in the context profile are satisfied. Alternatively, the computing device can monitor if a user frequently uses a particular application while at a specific location or during a specific time range, and include the application in an unlock-and-launch interface when the user is next at that location or the next time the user is using the device during that time.
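As a non-limiting sketch of such a context profile in Kotlin, a profile could be represented and matched as follows. The field names, the first-match policy and the helper appsForContext are assumptions made for illustration.

    import java.time.LocalTime

    // A user-defined context profile; null fields mean "any time" or "any location".
    data class ContextProfile(
        val label: String,
        val startTime: LocalTime?,
        val endTime: LocalTime?,
        val locationLabel: String?,
        val appIds: List<String>
    ) {
        fun matches(time: LocalTime, location: String): Boolean {
            val timeOk = startTime == null || endTime == null || (time >= startTime && time <= endTime)
            val placeOk = locationLabel == null || locationLabel == location
            return timeOk && placeOk
        }
    }

    // The first matching profile supplies the applications offered by the unlock-and-launch UI.
    fun appsForContext(profiles: List<ContextProfile>, time: LocalTime, location: String): List<String> =
        profiles.firstOrNull { it.matches(time, location) }?.appIds ?: emptyList()

    fun main() {
        val profiles = listOf(
            ContextProfile("at work", LocalTime.of(8, 0), LocalTime.of(17, 0), "office",
                listOf("work_email", "calendar")),
            ContextProfile("default", null, null, null, listOf("browser", "messaging"))
        )
        println(appsForContext(profiles, LocalTime.of(9, 30), "office"))   // [work_email, calendar]
        println(appsForContext(profiles, LocalTime.of(21, 0), "home"))     // [browser, messaging]
    }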
In some embodiments, a computing device can be unlocked and a specific application launched with a single gesture based on the shape of the gesture. For example, a gesture comprising a letter, number or symbol traced on a touchscreen can cause the computing device to unlock and a particular application be launched. For instance, tracing the letter “W” on a touchscreen can unlock the device and launch a web browser, tracing the letter “E” can unlock the device and launch an email application, and tracing a “U” can cause the device to unlock without launching a specific application. The association between a gesture shape and an application can be set by default settings or be user-defined. In some embodiments, user-defined gestures (e.g., non-alphanumeric characters) can be associated with launching specific applications.
In various embodiments, the application associated with a particular gesture can be based on application usage. For example, tracing a “1” on a touchscreen can cause a most recently or frequently used application to be launched, tracing a “2” on the touchscreen can cause a second most recently or frequently used application to be launched, etc.
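By way of illustration only, a recognized shape could be resolved to an action as in the following Kotlin sketch. The recognizer that turns traced input into a character is assumed to exist and is outside the sketch, and names such as resolveShape and UnlockAndLaunch are invented for the example.

    sealed class UnlockAction
    object UnlockOnly : UnlockAction()
    data class UnlockAndLaunch(val appId: String) : UnlockAction()

    // Letter associations could be default or user-defined; digits map to the Nth most
    // recently (or frequently) used application. Returning null leaves the device locked.
    val letterTable = mapOf('W' to "browser", 'E' to "email")

    fun resolveShape(shape: Char, appsByRecency: List<String>): UnlockAction? = when {
        shape == 'U' -> UnlockOnly
        shape.isDigit() -> appsByRecency.getOrNull(shape.digitToInt() - 1)?.let { UnlockAndLaunch(it) }
        else -> letterTable[shape]?.let { UnlockAndLaunch(it) }
    }

    fun main() {
        val byRecency = listOf("email", "browser", "camera")
        println(resolveShape('1', byRecency))                // UnlockAndLaunch(appId=email)
        println(resolveShape('W', byRecency))                // UnlockAndLaunch(appId=browser)
        println(resolveShape('U', byRecency) == UnlockOnly)  // true: unlock without launching
        println(resolveShape('Q', byRecency))                // null: no association, stay locked
    }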
In some embodiments, where tracing a number launches an application based on application usage, the device can provide feedback to the user after the user has traced a number on the touchscreen to inform the user which application is associated with the traced number. This feedback can help the user avoid launching undesired applications. For example, consider the situation where a web browser is the most frequently used application and an email application is the second most-frequently used application. If the email application later becomes the most frequently used application and the web browser becomes the second most-frequently used application, the user may not be aware of this change. Thus, a user tracing a “1” on the touchscreen and expecting to launch a web browser may instead launch the email application.
If the user intended to launch the device's web browser application, thinking that the web browser application was the most frequently used application, the user can supply a second numeric gesture to the computing device 410, without removing the touching object from the touchscreen 400, to launch a different application. The device 410 can discard the previously supplied user input if, for example, the user keeps the touching object in contact with the touchscreen 400 for more than a specified amount of time, such as one-half second. Any subsequent user input provided at the touchscreen 400 can be analyzed as a new gesture. In
It is to be understood that
In some embodiments, the method 600 can include additional process acts. For example, consider a smartphone that has received an unlock gesture and the touching object that provided the unlock gesture is still in contact with the touchscreen. In such a situation, the method 600 can further comprise, in response to receiving the unlock gesture and while the touching object is still touching the touchscreen, presenting a plurality of application indicators at the touchscreen. For example, if a user applied an unlock gesture (e.g., the letter “Z” traced on the screen) to a smartphone with his or her finger, the smartphone can present a plurality of application icons at the touchscreen while the user's finger is still in contact with the touchscreen. The application selection gesture can comprise selecting one of the plurality of application indicators, the executed application being an application associated with the selected application indicator. In the example, the user selects a word processing application icon by dragging his or her finger to the region of the touchscreen occupied by the word processing application icon, and the device launches the corresponding word processing application.
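As a non-limiting Kotlin sketch of this two-phase flow, the behavior could be modeled as follows. The names UnlockAndLaunchController, Region and onDrag, as well as the coordinate values, are assumptions made for illustration.

    data class Region(val left: Float, val top: Float, val right: Float, val bottom: Float) {
        fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
    }

    // Two phases: the unlock gesture is recognized first, then, while the finger remains down,
    // a drag into an indicator's on-screen region selects that application.
    class UnlockAndLaunchController(private val indicatorRegions: Map<String, Region>) {
        private var unlocked = false

        fun onUnlockGestureRecognized() {
            unlocked = true   // the device would now draw the application indicators
        }

        // Called for touch samples after the unlock gesture; returns the application to launch,
        // or null while the finger has not yet reached any indicator.
        fun onDrag(x: Float, y: Float): String? {
            if (!unlocked) return null
            return indicatorRegions.entries.firstOrNull { it.value.contains(x, y) }?.key
        }
    }

    fun main() {
        val controller = UnlockAndLaunchController(
            mapOf(
                "word_processor" to Region(0f, 0f, 100f, 100f),
                "email" to Region(0f, 120f, 100f, 220f)
            )
        )
        controller.onUnlockGestureRecognized()
        println(controller.onDrag(50f, 50f))   // word_processor: the drag entered that region
    }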
One exemplary advantage of the technologies described herein is the ability of a user to unlock a computing device and select an application to be executed with a single gesture. This can relieve the user of having to make multiple gestures to unlock a device and launch an application, which can comprise the user having to scroll through multiple pages of applications to find the application the user desires to launch after the device has been unlocked. Additional advantages include the ability for the user to select the applications that can be launched from an unlock-and-launch user interface. Further, the single gesture typically comprises moving an icon in two different directions, making it less likely that a device is unlocked and an application launched inadvertently. Another advantage is that the technologies can incorporate known unlock gestures, thus making unlock-and-launch user interfaces more familiar to users. For example, the unlock gesture in the unlock-and-launch user interface 101 in
The technologies described herein can be performed by any of a variety of computing devices, including mobile devices (such as smartphones, handheld computers, tablet computers, laptop computers, media players, portable gaming consoles, cameras and video recorders), non-mobile devices (such as desktop computers, servers, stationary gaming consoles, smart televisions) and embedded devices (such as devices incorporated into a vehicle). The term “computing devices” includes computing systems and includes devices and systems comprising multiple discrete physical components.
As shown in
Processors 802 and 804 further comprise at least one shared cache memory 812 and 814, respectively. The shared caches 812 and 814 can store data (e.g., instructions) utilized by one or more components of the processor, such as the processor cores 808-809 and 810-811. The shared caches 812 and 814 can be part of a memory hierarchy for the device 800. For example, the shared cache 812 can locally store data that is also stored in a memory 816 to allow for faster access to the data by components of the processor 802. In some embodiments, the shared caches 812 and 814 can comprise multiple cache layers, such as level 1 (L1), level 2 (L2), level 3 (L3), level 4 (L4), and/or other caches or cache layers, such as a last level cache (LLC).
Although the device 800 is shown with two processors, the device 800 can comprise one processor or more than two processors. Further, a processor can comprise one or more processor cores. A processor can take various forms such as a central processing unit, a controller, a graphics processor, an accelerator (such as a graphics accelerator or digital signal processor (DSP)) or a field programmable gate array (FPGA). A processor in a device can be the same as or different from other processors in the device. In some embodiments, the device 800 can comprise one or more processors that are heterogeneous or asymmetric to a first processor, accelerator, FPGA, or any other processor. There can be a variety of differences between the processing elements in a system in terms of a spectrum of metrics of merit including architectural, microarchitectural, thermal, power consumption characteristics and the like. These differences can effectively manifest themselves as asymmetry and heterogeneity amongst the processors in a system. In some embodiments, the processors 802 and 804 reside in the same die package.
Processors 802 and 804 further comprise memory controller logic (MC) 820 and 822. As shown in
Processors 802 and 804 are coupled to an Input/Output (I/O) subsystem 830 via P-P interconnections 832 and 834. The point-to-point interconnection 832 connects a point-to-point interface 836 of the processor 802 with a point-to-point interface 838 of the I/O subsystem 830, and the point-to-point interconnection 834 connects a point-to-point interface 840 of the processor 804 with a point-to-point interface 842 of the I/O subsystem 830. Input/Output subsystem 830 further includes an interface 850 to couple I/O subsystem 830 to a graphics engine 852, which can be a high-performance graphics engine. The I/O subsystem 830 and the graphics engine 852 are coupled via a bus 854. Alternatively, the bus 854 could be a point-to-point interconnection.
Input/Output subsystem 830 is further coupled to a first bus 860 via an interface 862. The first bus 860 can be a Peripheral Component Interconnect (PCI) bus, a PCI Express bus, another third generation I/O interconnection bus or any other type of bus.
Various I/O devices 864 can be coupled to the first bus 860. A bus bridge 870 can couple the first bus 860 to a second bus 880. In some embodiments, the second bus 880 can be a low pin count (LPC) bus. Various devices can be coupled to the second bus 880 including, for example, a keyboard/mouse 882, audio I/O devices 888 and a storage device 890, such as a hard disk drive, solid-state drive or other storage device for storing computer-executable instructions (code) 892. The code 892 comprises computer-executable instructions for performing technologies described herein. Additional components that can be coupled to the second bus 880 include communication device(s) 884, which can provide for communication between the device 800 and one or more wired or wireless networks 886 (e.g. Wi-Fi, cellular or satellite networks) via one or more wired or wireless communication links (e.g., wire, cable, Ethernet connection, radio-frequency (RF) channel, infrared channel, Wi-Fi channel) using one or more communication standards (e.g., IEEE 802.11 standard and its supplements).
The device 800 can comprise removable memory such as flash memory cards (e.g., SD (Secure Digital) cards), memory sticks and Subscriber Identity Module (SIM) cards. The memory in device 800 (including caches 812 and 814, memories 816 and 818 and storage device 890) can store data and/or computer-executable instructions for executing an operating system 894 and application programs 896. Example data includes web pages, text messages, images, sound files, video data, biometric thresholds for particular users or other data sets to be sent to and/or received from one or more network servers or other devices by the device 800 via one or more wired or wireless networks, or for use by the device 800. The device 800 can also have access to external memory (not shown) such as external hard drives or cloud-based storage.
The operating system 894 can control the allocation and usage of the components illustrated in
The device 800 can support various input devices, such as a touchscreen, microphone, camera, physical keyboard, proximity sensor and trackball, and one or more output devices, such as a speaker and a display. Other possible input and output devices include piezoelectric and other haptic I/O devices. Any of the input or output devices can be internal to, external to or removably attachable with the device 800. External input and output devices can communicate with the device 800 via wired or wireless connections.
In addition, the computing device 800 can provide one or more natural user interfaces (NUIs). For example, the operating system 894 or applications 896 can comprise speech recognition logic as part of a voice user interface that allows a user to operate the device 800 via voice commands. Further, the device 800 can comprise input devices and logic that allows a user to interact with the device 800 via body, hand or face gestures. For example, a user's hand gestures can be detected and interpreted to provide input to a gaming application.
The device 800 can further comprise one or more wireless modems (which could comprise communication devices 884) coupled to one or more antennas to support communication between the device 800 and external devices. The wireless modems can support various wireless communication protocols and technologies such as Near Field Communication (NFC), Wi-Fi, Bluetooth, 4G Long Term Evolution (LTE), Code Division Multiple Access (CDMA), Universal Mobile Telecommunication System (UMTS) and Global System for Mobile Telecommunication (GSM). In addition, the wireless modems can support communication with one or more cellular networks for data and voice communications within a single cellular network, between cellular networks, or between the mobile computing device and a public switched telephone network (PSTN).
The device 800 can further include at least one input/output port (which can be, for example, a USB port, IEEE 1394 (FireWire) port, and/or RS-232 port) comprising physical connectors, a power supply, a satellite navigation system receiver such as a GPS receiver, a gyroscope, an accelerometer and a compass. A GPS receiver can be coupled to a GPS antenna. The device 800 can further include one or more additional antennas coupled to one or more additional receivers, transmitters and/or transceivers to enable additional functions.
It is to be understood that
The processor core comprises front-end logic 920 that receives instructions from the memory 910. An instruction can be processed by one or more decoders 930. The decoder 930 can generate as its output a micro operation, such as a fixed-width micro operation in a predefined format, or generate other instructions, microinstructions or control signals that reflect the original code instruction. The front-end logic 920 further comprises register renaming logic 935 and scheduling logic 940, which generally allocate resources and queue operations corresponding to converting an instruction for execution.
The processor core 900 further comprises execution logic 950, which comprises one or more execution units (EUs) 965-1 through 965-N. Some processor core embodiments can include a number of execution units dedicated to specific functions or sets of functions. Other embodiments can include only one execution unit, or one execution unit that can perform a particular function. The execution logic 950 performs the operations specified by code instructions. After completion of execution of the operations specified by the code instructions, back-end logic 970 retires instructions using retirement logic 975. In some embodiments, the processor core 900 allows out-of-order execution but requires in-order retirement of instructions. The retirement logic 975 can take a variety of forms as known to those of skill in the art (e.g., re-order buffers or the like).
The processor core 900 is transformed during execution of instructions, at least in terms of the output generated by the decoder 930, hardware registers and tables utilized by the register renaming logic 935, and any registers (not shown) modified by the execution logic 950. Although not illustrated in
Any of the disclosed methods can be implemented as computer-executable instructions or a computer program product. Such instructions can cause a computer to perform any of the disclosed methods. Generally, as used herein, the term “computer” refers to any computing device or system described or mentioned herein, or any other computing device. Thus, the term “computer-executable instruction” refers to instructions that can be executed by any computing device described or mentioned herein, or any other computing device.
The computer-executable instructions or computer program products as well as any data created and used during implementation of the disclosed technologies can be stored on one or more tangible computer-readable storage media, such as optical media discs (e.g., DVDs, CDs), volatile memory components (e.g., DRAM, SRAM), or non-volatile memory components (e.g., flash memory, disk drives). Computer-readable storage media can be contained in computer-readable storage devices such as solid-state drives, USB flash drives, and memory modules. Alternatively, the computer-executable instructions can be performed by specific hardware components that contain hardwired logic for performing all or a portion of disclosed methods, or by any combination of computer-readable storage media and hardware components.
The computer-executable instructions can be part of, for example, a dedicated software application or a software application that is accessed via a web browser or other software application (such as a remote computing application). Such software can be executed, for example, on a single computing device or in a network environment using one or more network computers. Further, it is to be understood that the disclosed technology is not limited to any specific computer language or program. For instance, the disclosed technologies can be implemented by software written in C++, Java, Perl, JavaScript, Adobe Flash, or any other suitable programming language. Likewise, the disclosed technologies are not limited to any particular computer or type of hardware. Certain details of suitable computers and hardware are known and need not be set forth in detail in this disclosure.
Furthermore, any of the software-based embodiments (comprising, for example, computer-executable instructions for causing a computer to perform any of the disclosed methods) can be uploaded, downloaded or remotely accessed through a suitable communication means. Such suitable communication means include, for example, the Internet, the World Wide Web, an intranet, cable (including fiber optic cable), magnetic communications, electromagnetic communications (including RF, microwave, and infrared communications), electronic communications, or other such communication means.
As used in this application and in the claims, a list of items joined by the term “and/or” can mean any combination of the listed items. For example, the phrase “A, B and/or C” can mean A; B; C; A and B; A and C; B and C; or A, B and C. As used in this application and in the claims, a list of items joined by the term “at least one of” can mean any combination of the listed terms. For example, the phrase “at least one of A, B or C” can mean A; B; C; A and B; A and C; B and C; or A, B and C.
The disclosed methods, apparatuses and systems are not to be construed as limiting in any way. Instead, the present disclosure is directed toward all novel and nonobvious features and aspects of the various disclosed embodiments, alone and in various combinations and subcombinations with one another. The disclosed methods, apparatuses and systems are not limited to any specific aspect or feature or combination thereof, nor do the disclosed embodiments require that any one or more specific advantages be present or problems be solved.
Theories of operation, scientific principles or other theoretical descriptions presented herein in reference to the apparatuses or methods of this disclosure have been provided for the purposes of better understanding and are not intended to be limiting in scope. The apparatuses and methods in the appended claims are not limited to those apparatuses and methods that function in the manner described by such theories of operation.
Although the operations of some of the disclosed methods are described in a particular, sequential order for convenient presentation, it is to be understood that this manner of description encompasses rearrangement, unless a particular ordering is required by specific language set forth herein. For example, operations described sequentially can in some cases be rearranged or performed concurrently. Moreover, for the sake of simplicity, the attached figures may not show the various ways in which the disclosed methods can be used in conjunction with other methods.
The following examples pertain to further embodiments.
Example 1. A method of launching an application on a computing device, comprising: receiving a gesture via a touchscreen of the computing device, the gesture comprising a portion of an unlock gesture and an application selection gesture; and executing an application selected with the application selection gesture.
Example 2. The method of Example 1, further comprising presenting a user interface at the touchscreen comprising a plurality of tracks along which a user can move an icon from a starting position to an end of one of the plurality of tracks, one or more applications being associated with the plurality of tracks.
Example 3. The method of Example 2, wherein the starting position is in a first track, the unlock gesture comprises moving the icon from the starting position to an end of the first track, and the application selection gesture comprises moving the icon along a second track of the plurality of tracks to a selected end of the plurality of tracks, the application selected with the application selection gesture being associated with the selected end.
Example 4. The method of Example 2, wherein the user interface further comprises a plurality of application icons displayed near the ends of the plurality of tracks.
Example 5. The method of Example 1, further comprising presenting a user interface comprising a plurality of application indicators associated with a plurality of applications that can be selected with application selection gestures.
Example 6. The method of Example 5, further comprising selecting an application indicator of the plurality of application indicators for presentation in the user interface based on a recency of use of an application associated with the application indicator.
Example 7. The method of Example 5, further comprising selecting an application indicator of the plurality of application indicators for presentation in the user interface based on a frequency of use of an application associated with the application indicator.
Example 8. The method of Example 5, further comprising selecting an application indicator of the plurality of application indicators for presentation in the user interface based on at least the location of the computing device and/or the time.
Example 9. The method of Example 1, wherein the gesture comprises the unlock gesture, the unlock gesture being received via a touching object in contact with the touchscreen, the method further comprising, in response to receiving the unlock gesture and while the touching object is still touching the touchscreen, presenting a plurality of application indicators at the touchscreen, the application selection gesture comprising selecting one of the plurality of application indicators, the executed application being an application associated with the selected application indicator.
Example 10. The method of Example 9, wherein the application selection gesture comprises moving the touching object from an ending location of the unlock gesture on the touchscreen to a region of the touchscreen occupied by the selected application indicator.
Example 11. One or more computer-readable storage media storing computer-executable instructions for causing a computing device to perform any one of the methods of Examples 1-10.
Example 12. At least one computing device programmed to perform any one of the methods of Examples 1-10.
Example 13. A method for launching an application, the method comprising: presenting a user interface at a touchscreen of a computing device, the user interface comprising a plurality of tracks along which a user can drag an icon from a starting position, one or more applications being associated with the plurality of tracks; receiving a gesture via the touchscreen, the gesture comprising moving the icon in a first direction along a first track of the plurality of tracks, and in a second direction along a second track of the plurality of tracks to an end of the second track; and executing an application associated with the second track.
Example 14. One or more computer-readable storage media storing computer-executable instructions for causing a computing device to perform the method of Example 13.
Example 15. At least one computing device programmed to perform the method of Example 13.
Example 16. A method for launching an application, the method comprising: receiving user input comprising a number traced on a touchscreen of a computing device while the computing device is locked; and executing an application associated with the number, the association between the application and the number being based at least in part on a usage of the application.
Example 17. The method of Example 16, wherein the association between the application and the number is based at least in part on a recency of use of the application and/or a frequency of use of the application.
Example 18. The method of Example 16, the method further comprising displaying an application indicator associated with the application associated with the number.
Example 19. One or more computer-readable storage media storing computer-executable instructions for causing a computing device to perform any one of the methods of Examples 16-18.
Example 20. At least one computing device programmed to perform any one of the methods of Examples 16-18.
Example 21. A method of launching an application, the method comprising: receiving first user input comprising a first number traced on a touchscreen of a computing device via a touching object; presenting a first application indicator on the touchscreen, the first application indicator being associated with a first application associated with the first number; receiving second user input comprising a second number traced on the touchscreen with the touching object; presenting a second application indicator on the touchscreen, the second application indicator being associated with a second application associated with the second number; and executing the second application; wherein the association between the first application indicator and the first number is based at least in part on a usage of the first application, and the association between the second application indicator and the second number is based at least in part on a usage of the second application.
Example 22. One or more computer-readable storage media storing computer-executable instructions for causing a computer to perform the method of Example 21.
Example 23. At least one computing device programmed to perform the method of Example 21.