Contextual search by a mobile communications device

Information

  • Patent Grant
    8,825,699
  • Date Filed
    Thursday, April 30, 2009
  • Date Issued
    Tuesday, September 2, 2014
Abstract
Contextual search by a mobile communications device is described. In an implementation, a search query is received and a context is detected of a user interface currently being displayed on a display device of a mobile communications device. One or more search results are displayed on a display device of a search performed in the detected context using the search query.
Description
BACKGROUND

Mobile communication devices (e.g., wireless phones) have become an integral part of everyday life. For example, a user traditionally used a mobile communications device to make telephone calls when the user was away from a fixed communications device, e.g., a house or office wired telephone. In some instances, the mobile communications device became the primary device via which the user communicated with other users as the user became accustomed to the convenience and functionality of the device.


Communication techniques that may be employed using a mobile communications device have also increased. For example, users were traditionally limited to telephone calls between mobile communications devices. Advances were then made to provide a variety of other communication techniques, e.g., text messaging and email. However, inclusion of these additional communication techniques on mobile communications devices having traditional form factors may cause these devices to become unwieldy and less suitable for mobile applications. For example, traditional input devices that were employed by these communication techniques may be less suitable when applied to traditional mobile communications devices.


SUMMARY

Contextual search by a mobile communications device is described. In an implementation, a search query is received and a context is detected of a user interface currently being displayed on a display device of a mobile communications device. One or more search results are displayed on a display device of a search performed in the detected context using the search query.


In an implementation, a mobile communications device includes a display device and one or more modules to cause display of one or more results on the display device of a search performed in a first context based on a search query. If a gesture is detected to switch from the first context to a second context, the modules are configured to cause display of one or more results of a search performed in the second context based on the search query.


In an implementation, a mobile communications device includes a display device and one or more modules to display search results on the display device of a first search performed in a first context based on a search query. If an input is received via a button of the mobile communications device that indicates that a scope of the first search is to be expanded, the modules are configured to perform a second search that includes a second context that was not part of the first search, the second search being performed without manual reentry of the search query.


This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.





BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.



FIG. 1 is an illustration of an example implementation of a mobile communications device in accordance with one or more embodiments of devices, features, and systems for mobile communications.



FIG. 2 illustrates an example implementation in which a gesture is utilized to switch a context used to perform a search by the mobile communications device of FIG. 1.



FIG. 3 illustrates an example implementation of a gesture received to scroll (e.g., pan up or down) through the user interface of FIG. 1.



FIG. 4 illustrates an example implementation of the mobile communications device of FIG. 1 as outputting search results in response to a search query.



FIG. 5 illustrates an example implementation of loading search results by the mobile communications device of FIG. 1.



FIG. 6 illustrates an example implementation of a location context search of the user interface of the mobile communications device of FIG. 1.



FIG. 7 is a flow diagram depicting a procedure in an example implementation in which a current context of a user interface is used to scope a search.



FIG. 8 is a flow diagram depicting a procedure in an example implementation in which a switch is performed between first and second contexts of a search using a gesture.



FIG. 9 is a flow diagram depicting a procedure in an example implementation in which a search is scoped out to include an additional context.



FIG. 10 illustrates various components of an example device that can be implemented in various embodiments as any type of a mobile device to implement embodiments of devices, features, and systems for mobile communications.





DETAILED DESCRIPTION

Overview


Functionality is continually added to mobile communications devices (e.g., mobile phones), such as to consume music, store contacts, communicate via messages (e.g., SMS, MMS, email), consume streamed content (e.g., music and videos), and so on. Because of this, it is becoming progressively harder for users to find desired content stored on the mobile communications device. Additionally, because mobile communications devices are typically connected to a network, a user may desire content that is not stored locally on the mobile communications device but rather is accessible via the network.


Techniques are described to provide contextual search on a mobile communications device. These techniques may be implemented in a variety of ways to provide a variety of features. For example, a physical search button may be included with a keyboard of the mobile communications device to invoke a search. In this way, the search may be invoked throughout a user interface of the mobile communications device (e.g., by different applications through an API) without consuming display area of the mobile communications device by display of a search input area when the search is not desired.


Contextual filtering of search results may also be performed based on context of a current output of a user interface by the mobile communications device. For example, a search may be performed for music when a music application is currently output in the user interface by the mobile communications device. Thus, in this example the music application provides the context to the search.


In an implementation, a context may be switched without reentering the search criteria, e.g., by performing a panning gesture, pressing a button of the mobile communications device, and so on. For example, a pan gesture may be detected via a display device to mimic switching from a phone context accessible via a particular column in a user interface to a web context accessible via an adjacent column in the user interface. Additionally, a search may be “scoped out” of a particular context, e.g., may increase a scope of the search from music, to an entirety of a mobile communications device, and even beyond the bounds of the mobile communications device to the Internet. Further discussion of contextual search may be found in relation to the following sections.
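
To make the retained-query behavior concrete, the following is a minimal sketch in Python (the patent does not prescribe code or an API); hypothetical per-context "provider" functions stand in for the contextual searches, and the query is retained so that switching contexts re-runs it without reentry.

    from typing import Callable, Dict, List

    # A provider takes a query string and returns matching result strings.
    SearchProvider = Callable[[str], List[str]]

    class ContextualSearch:
        def __init__(self, providers: Dict[str, SearchProvider]):
            self.providers = providers
            self.last_query = ""

        def search(self, query: str, context: str) -> List[str]:
            # Retain the query so a context switch can reuse it without reentry.
            self.last_query = query
            return self.providers[context](query)

        def switch_context(self, new_context: str) -> List[str]:
            # Re-run the retained query in the newly selected context.
            return self.providers[new_context](self.last_query)

    # Toy providers standing in for a local music search and a web search.
    music = lambda q: [t for t in ["Thunder Road", "Thunderstruck"] if q.lower() in t.lower()]
    web = lambda q: ["web result for '%s'" % q]

    cs = ContextualSearch({"music": music, "web": web})
    print(cs.search("thunder", "music"))   # ['Thunder Road', 'Thunderstruck']
    print(cs.switch_context("web"))        # same query, wider context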


In the following discussion, a variety of example implementations of a mobile communications device (e.g., a wireless phone) are described. Additionally, a variety of different functionality that may be employed by the mobile communications device is described for each example, which may be implemented in that example as well as in other described examples. Accordingly, example implementations are illustrated of a few of a variety of contemplated implementations. Further, although a mobile communications device having one or more modules that are configured to provide telephonic functionality is described, a variety of other mobile devices are also contemplated, such as personal digital assistants, mobile music players, dedicated messaging devices, portable game devices, netbooks, and so on.


Example Implementations



FIG. 1 is an illustration of an example implementation 100 of a mobile communications device 102 in accordance with one or more embodiments of devices, features, and systems for mobile communications. The mobile communications device 102 is operable to assume a plurality of configurations, examples of which include a configuration in which the mobile communications device 102 is “closed” and a configuration illustrated in FIG. 1 in which the mobile communications device 102 is “open.”


The mobile communications device 102 is further illustrated as including a first housing 104 and a second housing 106 that are connected via a slide 108 such that the first and second housings 104, 106 may move (e.g., slide) in relation to one another. Although sliding is described, it should be readily apparent that a variety of other movement techniques are also contemplated, e.g., a pivot, a hinge and so on.


The first housing 104 includes a display device 110 that may be used to output a variety of data, such as a caller identification (ID), information related to text messages as illustrated, email, multimedia messages, Internet browsing, game play, music, video and so on. In the illustrated implementation, the display device 110 is also configured to function as an input device by incorporating touchscreen functionality, e.g., through capacitive, surface acoustic wave, resistive, optical, strain gauge, dispersive signals, acoustic pulse, and other touchscreen functionality.


The second housing 106 is illustrated as including a keyboard 112 that may be used to provide inputs to the mobile communications device 102. Although the keyboard 112 is illustrated as a QWERTY keyboard, a variety of other examples are also contemplated, such as a keyboard that follows a traditional telephone keypad layout (e.g., a twelve key numeric pad found on basic telephones), keyboards configured for other languages (e.g., Cyrillic), and so on.


In the example shown in FIG. 1, the first and second housings 104, 106 of the mobile communications device 102 are approximately square. For example, a plane defined by an outer surface of the display device 110 may be parallel to a plane of the first housing 104 that approximates a square, which may be the same as or different from the plane defined by the display device 110. In other words, the ratio of width to height of the plane taken from the first housing 104 that is parallel to the outer surface of the display device 110 is approximately one-to-one. Likewise, the second housing 106 may be considered square along a plane that is parallel to and/or is the same as an outer surface of the keyboard 112 disposed within the second housing 106.


The mobile communications device 102 may assume a “closed configuration” such that the first housing 104 covers the second housing 106 by sliding the housings together using the slide 108. Consequently, the keyboard 112 disposed on the second housing 106 may be covered and made unavailable for interaction by a user of the mobile communications device 102. In an implementation, telephonic functionality is still available when the mobile communications device 102 is in the closed configuration, e.g., to receive a telephone call.


In the “open” configuration as illustrated in the example implementation 100 of FIG. 1, the first housing 104 is moved (e.g., slid) “away” from the second housing 106 using the slide 108. In this example configuration, at least a majority of the keys of the keyboard 112 (i.e., the physical keys) are exposed such that the exposed keys are available for use to provide inputs. The open configuration results in an extended form factor of the mobile communications device 102 as contrasted with the form factor of the mobile communications device 102 in the closed configuration. In an implementation, the planes of the first and second housings 104, 106 that are used to define the extended form factor are parallel to each other, although other implementations are also contemplated, such as a “clamshell” configuration, “brick” configuration, and so on.


The form factor employed by the mobile communications device 102 may be suitable to support a wide variety of features. For example, the keyboard 112 is illustrated as supporting a QWERTY configuration. This form factor may be particularly convenient for a user to utilize the previously described functionality of the mobile communications device 102, such as to compose texts, play games, check email, “surf” the Internet, provide status messages for a social network, and so on.


In the mobile communications device 102 of FIG. 1, a portion of the keys of the keyboard 112 is illustrated as sharing multiple functions. For example, a numeric keypad may be provided within physical keys of the QWERTY layout as illustrated by the physical keys “w”, “e”, “r”, “s”, “d”, “f”, “z”, “x”, “c”, and “.” as sharing numbers “1”, “2”, “3”, “4”, “5”, “6”, “7”, “8”, “9”, and “0”, respectively. The numbers may be accessed by pressing the “ALT” key of the keyboard 112. A variety of other examples are also contemplated, an example of which may be found in relation to the following figure.
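
As an illustration only, the shared-key behavior above can be modeled as a lookup table; the dictionary below simply restates the pairing from the figure, and the resolve_key helper is a hypothetical name.

    # Shared QWERTY keys and the numbers they produce while ALT is held.
    ALT_NUMBER_MAP = {
        "w": "1", "e": "2", "r": "3",
        "s": "4", "d": "5", "f": "6",
        "z": "7", "x": "8", "c": "9",
        ".": "0",
    }

    def resolve_key(key, alt_pressed):
        # With ALT held, a shared key produces its number; otherwise its letter.
        return ALT_NUMBER_MAP.get(key, key) if alt_pressed else key

    assert resolve_key("w", alt_pressed=True) == "1"
    assert resolve_key("w", alt_pressed=False) == "w"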


The mobile communications device 102 is also illustrated as including a communication module 114. The communication module 114 is representative of functionality of the mobile communications device 102 to communicate via a network 116. For example, the communication module 114 may include telephone functionality to make and receive telephone calls. The communication module 114 may also include a variety of other functionality, such as to form short message service (SMS) text messages, multimedia messaging service (MMS) messages, emails, status messages for a social network, and so on. A user, for instance, may form a status message for communication via the network 116 to a social network website. The social network website may then publish the status message to “friends” of the user, e.g., for receipt by the friends via a computer, respective mobile communications device, and so on. A variety of other examples are also contemplated, such as blogging, instant messaging, and so on.


The mobile communications device 102 is also illustrated as including a search module 118. The search module 118 is representative of functionality of the mobile communications device 102 to perform a search and generate a user interface 120. The user interface 120 is illustrated as including a search query input area 122 that is configured to receive a search query.


There are a variety of different ways of initiating a search. For example, an input may be received from the keyboard 112 that causes output of the user interface 120 and input of corresponding letters as a search query in the search query input area 122. In another example, a dedicated physical search button 124 of the keyboard 112 may be pressed to cause output of the user interface 120. A variety of other examples are also contemplated, such as by selecting a representation in a menu displayed on the display device 110 or upon receipt of an input in the search query input area 122 from the keyboard 112 (e.g., when a user starts typing without selecting a specific portion of the user interface 120).


Depending on where the search was initiated, the search module 118 may configure the user interface 120 in a variety of ways. As illustrated in FIG. 1, for instance, the user interface 120 may be output without being scoped to a particular context. In this example, the user interface 120 includes a plurality of portions that are selectable to specify a particular context; illustrated instances include representations of the web 126, phone 128, and location 130 contexts. In another example, the user interface 120 may be scoped to a particular context automatically and without user intervention based on a current context of an output by the mobile communications device 102, further discussion of which may be found in relation to FIG. 7.


In an implementation, if a user starts typing on the keyboard 112 without selecting a context, the phone 128 context is selected by default. This may be represented by an animation that gives an appearance of zooming-in to the phone 128 context icon displayed in the user interface 120.


In another implementation, a user may manually select a context. For example, a user may manually select the representation of the web 126 to cause a web search to be performed. In response to the selection (e.g., by pressing the display device 110 using touchscreen functionality), an animation may be output to animate into that context. As the user enters the search query using the keyboard 112 or other input device, the search module 118 performs the search in the selected context. In an implementation, auto-complete functionality may be used to increase efficiency in entering the search query. A user may also switch contexts (e.g., by panning) used to perform the search after the search query is entered, further discussion of which may be found in relation to the following figure.



FIG. 2 illustrates an example implementation in which a gesture is utilized to switch contexts used to perform a search by the mobile communications device 102 of FIG. 1. The mobile communications device 102 is illustrated as outputting the user interface 120 on the display device 110. The user interface 120 has three contexts in the illustrated example arranged in columns. A phone context 202 is illustrated as currently being output on the display device 110. A web context 204 and a location context 206 are illustrated as not currently output on the display device 110 (and accordingly are illustrated in phantom) and are arranged in columns to the left and right of the phone context 202, respectively, to indicate “where” the contexts are located in respect to the phone context 202.


The phone context 202 includes the search query input area 122 having a search query “529.” Search results that include telephone numbers from contacts stored in the mobile communications device 102 are displayed in a column of the phone context 202 below the search query input area 122. In an implementation, the search results are selectable to initiate a communication (e.g., a telephone call) using the contact information. Further, additional content may be searched from the phone 128 context, such as applications, data, and so on.


In an implementation, the user may change the context in a variety of ways. For example, a user may navigate between the columns of the user interface 120 using one or more physical buttons of the keyboard 112 of FIG. 1. In another example illustrated in FIG. 2, a gesture may be performed to pan between the columns. For example, a finger of a user's hand 208 may be placed on the display device 110 to “drag” the user interface 120 in the desired direction to switch from the phone context 202 to the web context 204 or the location context 206. In an implementation, the search query is automatically reused to perform a search in the respective context. In this way, a user may perform the search in different contexts without manually reentering the search query in the search query input area 122. A variety of other examples (e.g., gestures) are also contemplated to switch contexts.
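
A toy sketch of the column arrangement follows, assuming the left-to-right ordering web, phone, location from FIG. 2; the ContextPager class and its clamping behavior are illustrative assumptions, not the patent's implementation.

    CONTEXT_COLUMNS = ["web", "phone", "location"]  # left-to-right per FIG. 2

    class ContextPager:
        def __init__(self, query, start="phone"):
            self.query = query                        # retained across switches
            self.index = CONTEXT_COLUMNS.index(start)

        def pan(self, direction):
            # Dragging the interface left reveals the column to the right,
            # and vice versa; clamp at the outer columns rather than wrapping.
            step = 1 if direction == "left" else -1
            self.index = max(0, min(len(CONTEXT_COLUMNS) - 1, self.index + step))
            return CONTEXT_COLUMNS[self.index]

    pager = ContextPager("529")
    print(pager.pan("left"))   # phone -> location; "529" is searched again there
    print(pager.pan("right"))  # back to phone, still without retyping the query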


As illustrated in FIG. 2, the search query input area 122 includes a search button 210 that is selectable to initiate a search when in the web 204 and location 206 contexts, but not the phone 202 context. When in the phone 202 context, a dial button 212 is included in the search query input area 122 that is selectable to initiate a telephone call using numbers and/or letters input in the search query input area 122.


The telephone number may be displayed in the search query input area 122 according to the following logic. If the query starts with a number or a plus sign, the dial button 212 is displayed. The telephone number may then disappear when the user enters a lowercase letter that does not correspond to a number. For example, when the user enters a “q” the user interface may display contacts that include a “q.” The telephone number may also disappear when the user has entered more than the maximum number of digits for a locale before initiating dialing.


In an implementation, if the user has entered a capitalized letter, the letter is translated to a number according to the 12-key digit-letter telephone keypad layout unless doing so violates one or more of the conditions above. The conditions may be validated each time there is a change in the query string such that the telephone number field may appear or disappear as the query changes. The telephone number may be automatically formatted to include a plus sign, dashes, and parentheses as appropriate by the communication module 114.
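
The dial-button conditions above can be sketched as follows. The 12-key mapping is the standard digit-letter layout, the 15-digit maximum is an assumed locale value rather than one from the patent, and dialable_number is a hypothetical helper; as a simplification, any character other than a digit, plus sign, or capital letter hides the dial button.

    KEYPAD = {  # classic 12-key digit-letter layout
        "2": "ABC", "3": "DEF", "4": "GHI", "5": "JKL",
        "6": "MNO", "7": "PQRS", "8": "TUV", "9": "WXYZ",
    }
    LETTER_TO_DIGIT = {ch: d for d, letters in KEYPAD.items() for ch in letters}
    MAX_DIGITS = 15  # assumed locale maximum, not a value from the patent

    def dialable_number(query):
        """Return a dialable string, or None to hide the dial button 212."""
        if not query or (not query[0].isdigit() and query[0] != "+"):
            return None
        digits = ""
        for ch in query:
            if ch.isdigit() or ch == "+":
                digits += ch
            elif ch.isupper():
                digits += LETTER_TO_DIGIT.get(ch, "")  # capitals become digits
            else:
                return None  # e.g., a lowercase letter -> contact search instead
        if sum(c.isdigit() for c in digits) > MAX_DIGITS:
            return None
        return digits

    print(dialable_number("529"))        # '529' -> dial button shown
    print(dialable_number("5q9"))        # None -> treated as a contact query
    print(dialable_number("+1800FLOW"))  # '+18003569'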


If the context was automatically selected for the user (e.g., the phone 128 context of FIG. 1 selected by default), deleting each of the inputs in the search query input area 122 may cause the user interface 120 to “zoom out” of the selected context. Additionally, pressing the back button 132 may cause text entered into the search query input area 122 to be erased upon exit of the default context back to a main search screen as illustrated in FIG. 1. In another example, the back button 132 may cause the search to “scope out,” further discussion of which may be found in relation to FIG. 9. In an implementation, an input is retained upon exit of a search application (e.g., the search module 118) such that the input remains upon re-initiation of the search application until a new input is provided.


The search query input area 122 may also be configured by the mobile communications device 102 for efficient use of an available amount of display area of the display device 110; an example 300 of such an implementation is illustrated in FIG. 3. The example 300 shows first and second instances 302, 304 of the mobile communications device 102.


In the first instance 302, the search query input area 122 is displayed at a first size (e.g., “full” size) to increase legibility of the search query as the search query is entered. In the second instance 304, however, the search query input area 122 is reduced (e.g., shrunk by thirty percent) in response to an input received from the user's hand 208 to scroll through the search results. For example, as illustrated in FIG. 3, a gesture may be received to scroll (e.g., pan up or down) through the user interface 120. The search query input area 122 may return to the original size as illustrated in the first instance 302 when selected by a user. Thus, a user may scroll through search results vertically as shown in FIG. 3 and switch contexts horizontally as shown in FIG. 2, although it should be readily apparent that a variety of other implementations are also contemplated.
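
A trivial sketch of the resize behavior, with the thirty-percent reduction taken from the text and the pixel height assumed purely for illustration:

    FULL_HEIGHT = 60  # assumed full-size height in pixels

    def input_area_height(scrolling):
        # Shrink by thirty percent while the user scrolls results;
        # restore full size when the input area is selected again.
        return int(FULL_HEIGHT * 0.7) if scrolling else FULL_HEIGHT

    assert input_area_height(scrolling=True) == 42
    assert input_area_height(scrolling=False) == 60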



FIG. 4 illustrates an example implementation of the mobile communications device 102 of FIG. 1 as outputting search results in response to a search query. The user interface 120 is illustrated as organizing search results into a plurality of sub-categories, examples of which are illustrated as “contacts,” “messages,” and “calendar,” which may or may not correspond to different contexts of a search. In an implementation, each category may have a respective progress indicator to indicate a status of a search within that category, such as by a color change through the text representation of the respective categories that mimics a status bar.


As the search query is being entered in the search query input area 122, the categories may be displayed next to each other. The search results may then be displayed in each respective category as found in real time. A variety of different categories may be supported, examples of which are listed below in a hierarchical arrangement that may be accessed through selecting a “more” button as illustrated for each category.

    • Web
      • Instant answer
        • Weather
        • Stock
        • Movie times
        • Encyclopedia entry
      • Web pages
      • Image
      • News
    • Phone
      • Contacts
      • Call history
      • Messages
      • Favorites
      • Media (Music/Video)
      • Calendar
    • Location
      • Businesses


In an implementation, the “More” button appears when there are more results than fit on the first-level results page. When a search is completed, categories that do not contain search results may be removed from the user interface 120. If there are no results in any of the categories, a “No Results” message may be displayed in the user interface 120. Selection (e.g., “tapping”) of a search result may cause the search result to be opened, e.g., to open an appointment on the calendar, display a body of a selected message, and so on. In an implementation, a user may interact with the user interface 120 to specify which of the contexts will appear in the user interface. For example, the user may interact with a series of checkboxes to select one or more of the above contexts.
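
The first-level page assembly described above might be sketched as follows; the per-category page size and the category names in the usage example are assumptions for illustration.

    PAGE_SIZE = 4  # assumed number of results per category on the first-level page

    def first_level_page(results_by_category):
        page = {}
        for category, results in results_by_category.items():
            if not results:
                continue  # completed categories without hits are removed
            page[category] = {
                "items": results[:PAGE_SIZE],
                # The "More" button appears only when results overflow the page.
                "more": len(results) > PAGE_SIZE,
            }
        return page

    page = first_level_page({
        "Contacts": ["Ann", "Bob", "Cal", "Dee", "Eve"],  # overflows -> More
        "Messages": ["lunch?"],
        "Calendar": [],                                   # removed from the page
    })
    print(page if page else "No Results")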


The search module 118 may also support bookmark functionality for search results. For example, a favorites icon 402 (illustrated as a star in the user interface) may be displayed in the user interface 120 once one or more search results are found and displayed in the user interface 120. Selection of the favorites icon 402 may cause the search module 118 to add the search query to a home screen of the user interface 120 of the mobile communications device 102. Selection of the search query in the home screen may then be used to perform the search again and/or display a previous search result. In an implementation, if the search was performed in a specific context the search is repeated in that context, e.g., one of the columns of FIG. 2.
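
A small sketch of the favorites behavior, assuming a simple list models the home screen; pin_search and run_pinned are hypothetical names.

    home_screen = []  # hypothetical model of the home screen's pinned searches

    def pin_search(query, context):
        # Selecting the favorites icon 402 adds the query (and its context,
        # if the search was scoped) to the home screen.
        home_screen.append({"query": query, "context": context})

    def run_pinned(pin, search):
        # Selecting the pinned entry repeats the search in its original context.
        return search(pin["query"], pin["context"])

    pin_search("529", "phone")
    print(run_pinned(home_screen[0], lambda q, c: ["%s result for %s" % (c, q)]))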


To view additional search results, a user may perform a gesture to “pan down” through the search results, select a scroll icon 404 (e.g., to scroll down a single page of search results) that is illustrated in a bottom-right corner of the user interface 120, and so on. If there are no additional search results, the scroll icon 404 is not displayed.


In an implementation, a user can scroll through a list of search results. When in a particular context (whether manually or automatically selected), the user may scroll past the “end” of the search results for that context. Scrolling past the end of the context may cause a search to be initiated for at least one other context, e.g., for each local context supported by the mobile communications device 102, for local and remote contexts, and so on. If additional results are not available, the scroll icon 404 may be removed from the user interface 120. In another implementation, the “end” of the search results is indicated by cropping the search results at the bottom of the user interface 120 if there are additional items, to imply that the additional items are available to be viewed. If not, the search results at the end are displayed in full, i.e., an icon representing each search result is displayed in its entirety. The search results may also be configured to indicate a number of search results that are loading, further discussion of which may be found in relation to the following figure.
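
Scrolling past the end might widen the search roughly as sketched below; the fixed fallback ordering of contexts is an assumption, as are the provider functions.

    FALLBACK_ORDER = ["phone", "location", "web"]  # assumed ordering

    def on_scroll_past_end(current, query, providers):
        # Scrolling past the last result widens the search: the retained
        # query is run in each context that has not been searched yet.
        return {ctx: providers[ctx](query)
                for ctx in FALLBACK_ORDER if ctx != current}

    providers = {
        "phone": lambda q: [],                      # local results exhausted
        "location": lambda q: ["%s near you" % q],
        "web": lambda q: ["%s on the web" % q],
    }
    print(on_scroll_past_end("phone", "pizza", providers))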



FIG. 5 illustrates an example implementation 500 of loading search results by the mobile communications device 102 of FIG. 1. In this example, the context of the search relates to images. Consequently, images are displayed in the user interface 120 as the search query is entered.


As the search results (e.g., images) are located, for instance, representations of the images are displayed in the user interface, examples of which include a dog and a bolt of electricity. Outlines of other images are displayed to indicate a number of search results found. These outlines may then be “filled in” as the images are loaded into the user interface 120. A variety of other techniques are also contemplated to indicate a number of search results as found and loading of the search results.
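
The placeholder technique can be sketched in a few lines: the result count is known before the images arrive, so outlines render immediately and are replaced as loads complete. The render_grid helper is hypothetical.

    def render_grid(total_found, loaded):
        # Outlines stand in for results that are still loading; they are
        # "filled in" as each image arrives.
        return loaded + ["[outline]"] * (total_found - len(loaded))

    print(render_grid(5, ["dog.jpg", "lightning.jpg"]))
    # ['dog.jpg', 'lightning.jpg', '[outline]', '[outline]', '[outline]']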



FIG. 6 illustrates an example implementation of a location context search of the user interface 120 of the mobile communications device 102 of FIG. 1. The user interface 120 is shown as outputting a location context. As illustrated, the user interface 120 includes a search query input area 122 like the location context 206 of FIG. 2.


However, in this example a location portion 602 is also included for entering a location that is to be used as a basis for performing a search. For instance, a user may use the keyboard 112 to enter a location that is to be used as a basis for a search query entered in the search query input area 122. A user may also select a location icon 604 to use a current geographical location of the mobile communications device 102. Selection of the location icon 604 may cause the location portion 602 to be automatically populated with the current geographical location, which may be determined through GPS, triangulation using wireless transmitters, and so on. Selection of a search result (e.g., tapping) may cause the user interface to output a view of the result on a map, e.g., in a web browser.
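
A sketch of populating the location portion 602; current_fix is a hypothetical stand-in for a GPS or triangulation fix, and the coordinates are placeholders.

    def current_fix():
        # Stand-in for a GPS or wireless-triangulation fix.
        return "47.64, -122.13"

    def location_basis(typed_location, icon_selected):
        # Selecting the location icon 604 populates the location portion 602
        # with the device's current position instead of a typed location.
        return current_fix() if icon_selected else typed_location

    print(location_basis("Redmond, WA", icon_selected=False))
    print(location_basis("", icon_selected=True))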


Generally, any of the functions described herein can be implemented using software, firmware, hardware (e.g., fixed logic circuitry), manual processing, or a combination of these implementations. The terms “module,” “functionality,” and “logic” as used herein generally represent software, firmware, hardware, or a combination thereof. In the case of a software implementation, the module, functionality, or logic represents instructions (e.g., program code) that perform specified tasks when executed on a computing system formed by one or more computers having one or more processors (e.g., CPU or CPUs). The instructions may be stored in one or more tangible computer readable memory devices. The features of the contextual search techniques described below are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.


Example Procedures


The following discussion describes contextual search techniques that may be implemented utilizing the previously described systems and devices. Aspects of each of the procedures may be implemented in hardware, firmware, or software, or a combination thereof. The procedures are shown as a set of blocks that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. In portions of the following discussion, reference will be made to the environment 100 of FIG. 1 and the implementations 200-600 of FIGS. 2-6, respectively.



FIG. 7 depicts a procedure 700 in an example implementation in which a current context of a user interface is used to scope a search. A search query is received (block 702), such as through the keyboard 112, and displayed in the search query input area 122 of the user interface 120.


A context is detected of a user interface currently being displayed on a display device of a mobile communications device (block 704). For example, the display device 110 may be used to output the user interface 120 in a variety of different configurations for a variety of different applications, such as a music player application, contacts application, telephone application, web browser, location application, and so on. Accordingly, each of these different applications may provide a different context for the user interface 120, such as to display different types of content. Types of content may vary greatly, such as music, video, documents, contacts, and so on, and may be detected in a variety of ways, such as based on an extension that identifies the type. Therefore, the search module 118 may leverage the current context to increase a likelihood of finding a relevant search result.
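
Context detection might be sketched as below, with assumed application-to-context and extension-to-context tables; the patent describes the idea (detect from the current user interface or from a content type's extension) rather than these specific mappings.

    APP_CONTEXT = {
        "music_player": "music",
        "telephone": "phone",
        "web_browser": "web",
        "maps": "location",
    }
    EXTENSION_CONTEXT = {".mp3": "music", ".vcf": "phone", ".html": "web"}

    def detect_context(current_app, visible_item=""):
        # Prefer the application currently configuring the user interface;
        # fall back to the content type implied by a file extension.
        if current_app in APP_CONTEXT:
            return APP_CONTEXT[current_app]
        for ext, ctx in EXTENSION_CONTEXT.items():
            if visible_item.endswith(ext):
                return ctx
        return "unscoped"

    print(detect_context("telephone"))           # 'phone'
    print(detect_context("viewer", "song.mp3"))  # 'music'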


One or more search results are displayed on the display device of a search performed in the detected context using the search query (block 706). As shown in FIG. 2, for instance, search results may be displayed in a phone 202 context as a column. Contexts may also be switched by a user to perform additional searches, such as to switch from a first context to a second context as described in relation to FIG. 8 and/or “scope out” a search as described in relation to FIG. 9.



FIG. 8 depicts a procedure 800 in an example implementation in which a switch is performed between first and second contexts of a search using a gesture. One or more results are displayed on a display device of a search performed in a first context based on a search query (block 802). For example, the first context may be a phone 202 context based on a telephone application that is currently configuring the user interface 120. Accordingly, a search performed in this context may result in search results that include telephone numbers as illustrated in FIG. 2.


If a gesture is detected to switch from the first context to a second context, one or more results of a search performed in the second context based on the search query are displayed (block 804). Additionally, the search may be performed based on the search query in response to the detection of the gesture without manually reentering the search query (block 806). Continuing with the previous example, a user may wish to switch from the phone 202 context to the location 206 context. Therefore, a finger of the user's hand 208 may make a panning gesture across the display device 110. Touchscreen functionality of the display device 110 may be used to detect the gesture and therefore recognize that a switch is to be performed.


In response to this detection, the search module 118 may use the search query from the first search and perform another search in the relevant context, such as location 206 in this example. Although performance of first and second searches has been described, a variety of other examples are also contemplated. For example, the first and second searches in the respective first and second contexts may be performed concurrently (e.g., as a single search). Results of this search may then be separated based on context such that the user may navigate between the contexts to see different search results. Thus, the contexts may be searched in a variety of ways without manually reentering the search query.
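
The single-search variant can be sketched with a thread pool: one query fans out to every context concurrently, and the results stay keyed by context for later navigation. The provider functions are hypothetical.

    from concurrent.futures import ThreadPoolExecutor

    def search_all_contexts(query, providers):
        # Fan the one query out to every context at once, keeping the results
        # keyed by context so the user can pan between them afterwards.
        with ThreadPoolExecutor() as pool:
            futures = {ctx: pool.submit(fn, query) for ctx, fn in providers.items()}
            return {ctx: f.result() for ctx, f in futures.items()}

    providers = {
        "phone": lambda q: ["contact matching %s" % q],
        "location": lambda q: ["%s nearby" % q],
    }
    print(search_all_contexts("529", providers))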



FIG. 9 depicts a procedure 900 in an example implementation in which a search is scoped out to include an additional context. Search results are displayed of a first search performed in a first context based on a search query (block 902). If an input is received via a button of the mobile communications device that indicates that a scope of the first search is to be expanded, a second search is performed. The second search is performed in a second context that was not part of the first search without manual reentry of the search query (block 904).


For example, the dedicated search button 124 of the keyboard 112 of the mobile communications device 102 may be pressed to load the search query input area 122. When the search button 124 was pressed, the mobile communications device 102 may be in a phone 128 context and accordingly a search is performed in that context. However, a user may have mistakenly entered the search query while in the phone 128 context and desire another context, such as music. Accordingly, the user may press the back button 132 to scope out to search the mobile communications device 102 as a whole and not just phone numbers. Pressing the back button 132 again may cause the search to be expanded past local storage of the mobile communications device 102 to include content available remotely over the network 116. A variety of other examples are also contemplated.
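
The back-button progression can be sketched as a fixed list of widening scopes; the three levels below mirror the description (phone numbers, the device as a whole, then the network 116), while the class and method names are assumptions.

    SCOPES = ["phone numbers", "whole device", "network"]  # assumed levels

    class ScopedSearch:
        def __init__(self, query):
            self.query = query
            self.level = 0  # start scoped to the context that was active

        def on_back_pressed(self):
            # Each press of the back button widens the scope one level,
            # reusing the same query without manual reentry.
            self.level = min(self.level + 1, len(SCOPES) - 1)
            return "searching %s for '%s'" % (SCOPES[self.level], self.query)

    s = ScopedSearch("thunder")
    print(s.on_back_pressed())  # whole device, not just phone numbers
    print(s.on_back_pressed())  # content available remotely over the network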


Example Device



FIG. 10 illustrates various components of an example device 1000 that can be implemented in various embodiments as any type of a mobile device to implement embodiments of devices, features, and systems for mobile communications. For example, device 1000 can be implemented as any of the mobile communications devices 102 described with reference to respective FIGS. 1-6. Device 1000 can also be implemented to access a network-based service, such as a content service.


Device 1000 includes input(s) 1002 that may include Internet Protocol (IP) inputs as well as other input devices, such as the keyboard 112 of FIGS. 1-6. Device 1000 further includes communication interface(s) 1004 that can be implemented as any one or more of a wireless interface, any type of network interface, and as any other type of communication interface. A network interface provides a connection between device 1000 and a communication network by which other electronic and computing devices can communicate data with device 1000. A wireless interface enables device 1000 to operate as a mobile device for wireless communications.


Device 1000 also includes one or more processors 1006 (e.g., any of microprocessors, controllers, and the like) which process various computer-executable instructions to control the operation of device 1000 and to communicate with other electronic devices. Device 1000 can be implemented with computer-readable media 1008, such as one or more memory components, examples of which include random access memory (RAM) and non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.).


Computer-readable media 1008 provides data storage to store content and data 1010, as well as device applications and any other types of information and/or data related to operational aspects of device 1000. For example, an operating system 1012 can be maintained as a computer application within the computer-readable media 1008 and executed on processor(s) 1006. Device applications can also include a communication manager module 1014 (which may be used to provide telephonic functionality) and a media manager 1016.


Device 1000 also includes an audio and/or video output 1018 that provides audio and/or video data to an audio rendering and/or display system 1020. The audio rendering and/or display system 1020 can be implemented as integrated component(s) of the example device 1000, and can include any components that process, display, and/or otherwise render audio, video, and image data. Device 1000 can also be implemented to provide a user with tactile feedback, such as vibration and haptics.


The communication manager module 1014 is further illustrated as including a keyboard module 1022. The keyboard module 1022 is representative of functionality to employ one or more of the techniques previously described in relation to FIGS. 1-6.


Generally, the blocks may be representative of modules that are configured to provide represented functionality. Further, any of the functions described herein can be implemented using software, firmware, hardware (e.g., fixed logic circuitry), manual processing, or a combination of these implementations. The terms “module,” “functionality,” and “logic” as used herein generally represent software, firmware, hardware, or a combination thereof. In the case of a software implementation, the module, functionality, or logic represents program code that performs specified tasks when executed on a processor (e.g., CPU or CPUs). The program code can be stored in one or more computer readable memory devices. The features of the techniques described above are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.


Although the invention has been described in language specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed invention.

Claims
  • 1. A mobile communications device comprising a display device and one or more modules configured to: display search results on the display device of a first search performed in a first context that involves a single application based on a search query; reduce a search query input area on the display device in response to an input to scroll through the search results; during display of the search results of the first search and responsive to receiving an input via a pan gesture that indicates that a scope of the first search is to be expanded to another application, perform a second search that includes a second context that involves the other application, the second search being performed without manual reentry of the search query; and responsive to receiving the input via the pan gesture, cause the second search to expand from searching content in a local storage of the mobile communications device to searching content remote from the mobile communications device.
  • 2. A mobile communications device as described in claim 1, wherein the first search is for content that is local to the mobile communications device and the second search is for content accessible to the mobile communications device via a network connection.
  • 3. A mobile communications device as described in claim 1, wherein the first and second searches involve telephone numbers.
  • 4. A mobile communications device as described in claim 1, wherein the first context is user selectable through interaction with a user interface displayed on the display device.
  • 5. A mobile communications device as described in claim 1, the one or more modules further configured to output the search results of the first search in a plurality of categories.
  • 6. A mobile communications device as described in claim 1, the one or more modules further configured to display search results of the second search on the display device.
  • 7. A method comprising: displaying search results on a display device of a mobile communications device, the search results of a first search performed in a first context that involves a first application based on a search query; responsive to receiving a vertical pan gesture on the display device of the mobile communications device, scrolling through the search results of the first search and reducing a search query input area displayed in a user interface of the mobile communications device; and responsive to receiving a horizontal pan gesture on the display device of the mobile communications device, performing a second search that includes a second context that involves a second application which was not part of the first search, the second search being performed without manual reentry of the search query, the second application being determined based, at least in part, on the horizontal pan gesture.
  • 8. A method as described in claim 7, wherein the horizontal pan gesture causes the search to be expanded past local storage of the mobile communications device to include content available remotely over a network.
  • 9. A method as described in claim 7, wherein the first search is for content that is local to the mobile communications device and the second search is for content that is accessible to the mobile communications device via a network connection.
  • 10. A method as described in claim 7, wherein the first and second searches involve at least a portion of a telephone number.
  • 11. A method as described in claim 7, wherein the first context is user selectable through interaction with a user interface displayed on the display device.
  • 12. A method as described in claim 7, further comprising animating a visual representation on the display device of the mobile communications device to indicate the context of the first search.
  • 13. A method as described in claim 7, further comprising initiating another search in a different context responsive to scrolling past the end of the search results of the first search performed in the first context.
  • 14. A method as described in claim 7, wherein the first context and the second context each correspond to an individual application.
  • 15. A method as described in claim 7, further comprising associating content with the first context based on an extension type.
  • 16. One or more computer readable memory devices comprising instructions stored thereon that, responsive to execution by a mobile communications device, cause the mobile communications device to perform operations comprising: displaying search results on a display device of the mobile communications device, the search results of a first search performed in a first context that involves a single application based on a search query; reducing a search query input area on the display device in response to an input to scroll through the search results; if an input is received via a single press of a button of the mobile communications device that indicates that a scope of the first search is to be expanded beyond the first context that involves the single application, performing a second search that includes a second context that involves a plurality of applications stored on the mobile communications device; and if a subsequent input is received via another single press of the button of the mobile communications device, performing a third search that includes a third context, the third search being performed for content accessible via a network, the second search and the third search being performed without manual reentry of the search query.
  • 17. One or more computer readable memory devices as described in claim 16, wherein the button comprises a back button.
  • 18. One or more computer readable memory devices as described in claim 16, wherein the first search is for content that is local to the mobile communications device and the third search is for content that is accessible to the mobile communications device via a network connection.
  • 19. One or more computer readable memory devices as described in claim 16, wherein the first and second searches involve a location.
  • 20. One or more computer readable memory devices as described in claim 16, wherein the first context is user selectable through interaction with a user interface displayed on the display device.
RELATED APPLICATIONS

This application claims priority under 35 U.S.C. Section 119(e) to U.S. Provisional Patent Applications Nos. 61/107,945, 61/107,935, and 61/107,921, each of which was filed on Oct. 23, 2008, the entire disclosures of which are hereby incorporated by reference in their entirety.

1020080084156 Sep 2008 KR
1020080113913 Dec 2008 KR
1020090041635 Apr 2009 KR
201023026 Jun 2010 TW
WO-03062976 Jul 2003 WO
WO-2005026931 Mar 2005 WO
WO-2005027506 Mar 2005 WO
WO-2006019639 Feb 2006 WO
WO-2007030396 Mar 2007 WO
WO-2007121557 Nov 2007 WO
WO-2007134623 Nov 2007 WO
WO-2008030608 Mar 2008 WO
WO-2008030976 Mar 2008 WO
WO-2008031871 Mar 2008 WO
WO-2008035831 Mar 2008 WO
WO-2008104862 Sep 2008 WO
WO-2008146784 Dec 2008 WO
WO-2009000043 Dec 2008 WO
WO-2009049331 Apr 2009 WO
WO-2010048229 Apr 2010 WO
WO-2010048448 Apr 2010 WO
WO-2010048519 Apr 2010 WO
WO-2010117643 Oct 2010 WO
WO-2010117661 Oct 2010 WO
WO-2010135155 Nov 2010 WO
Non-Patent Literature Citations (191)
Entry
“International Search Report”, Application No. PCT/US2010/028553, Application Filing Date: Mar. 24, 2010,(Nov. 9, 2010),9 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2010/034772, (Dec. 29, 2010),12 pages.
“PCT Search Report and Written Opinion”, PCT Application No. PCT/US2010/038730, (Jan. 19, 2011),8 pages.
“PCT Search Report”, Application No. PCT/US2009/061864, (May 14, 2010),10 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2009/061382, (May 26, 2010),10 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2009/061735, (Jun. 7, 2010),11 pages.
“Kiosk Browser Chrome Customization Firefox 2.x”, Retrieved from: <http://stlouis-shopper.com/cgi-bin/mozdev-wiki/,pl?ChromeCustomization> Making a new chrome for the kiosk browser, Kiosk Project Kiosk Browser Chrome Customization Firefox-2.x,(Aug. 16, 2007),2 pages.
Harrison, Richard “Symbian OS C++ for Mobile Phones: vol. 3 (Symbian Press): 3 (Paperback)”, Retrieved from: <http://www.amazon.co.uk/Symbian-OS-Mobile-Phones-Press/dp/productdescription/0470066415>, (Jun. 16, 2003),4 pages.
“How do you dial 1-800-FLOWERS”, Retrieved from: <http://blogs.msdn.com/windowsmobile/archive/2007/02/06/how-do-you-dial-1-800-flowers.aspx>, (Feb. 6, 2007),24 pages.
“Blackberry office tools: Qwerty Convert”, Retrieved from: <http://blackberrysoftwarelist.net/blackberry/download-software/blackberry-office/qwerty_convert.aspx>, (Nov. 20, 2008),1 page.
Gade, Lisa “Samsung Alias u740”, Retrieved from: <http://www.mobiletechreview.com/phones/Samsung-U740.htm>, (Mar. 14, 2007),6 pages.
“Dial a number”, Retrieved from: <http://www.phonespell.org/ialhelp.html> on Nov. 20, 2008, 1 page.
“Apple IPhone—8GB AT&T”, Retrieved from: <http://nytimes.com.com/smartphones/apple-iphone-8gb-at/4515-6452_7-32309245.html>, (Jun. 29, 2007),11 pages.
“IntelliScreen—New iPhone App Shows Today Screen Type Info in Lock Screen”, Retrieved from: <http://justanotheriphoneblog.com/wordpress//2008/05/13/intelliscreen-new-iphone-app-shows-today-screen-type-info-on-lock-screen/>, (May 13, 2008),11 pages.
“Winterface Review”, Retrieved from: <http://www.mytodayscreen.com/winterface-review/>, (Jul. 9, 2008),42 pages.
Oliver, Sam “Potential iPhone Usability and Interface Improvements”, Retrieved from: <http://www.appleinsider.com/articles/08/09/18/potential_iphone_usability_and_interface_improvements.html>, (Sep. 18, 2008),4 pages.
“Google Android has Landed; T-Mobile, HTC Unveil G1”, Retrieved from: <http://www.crn.com/retail/210603348> on Nov. 26, 2008, (Sep. 23, 2008),5 Pages.
“Alltel Adds Dedicated Search Key to Phones”, Retrieved from: <http://www.phonescoop.com/news/item.php?n=2159> on Nov. 26, 2008, (Apr. 12, 2007),2 Pages.
Oryl, Michael “Review: Asus P527 Smartphone for North America”, Retrieved from: <http://www.mobileburn.com/review.jsp?Id=4257> on Dec. 17, 2008, (Mar. 5, 2008),1 Page.
“Nokia E61 Tips and Tricks for Keyboard Shortcuts”, Retrieved from: <http://www.mobiletopsoft.com/board/1810/nokia-e61-tips-and-tricks-for-keyboard-shortcuts.html> on Dec. 17, 2008, (Jan. 27, 2006),2 Pages.
Ha, Rick et al., “SIMKEYS: An Efficient Keypad Configuration for Mobile Communications”, Retrieved from: <http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=01362557>, (Nov. 2004),7 Pages.
“Remapping the Keyboard”, Retrieved from: <http://publib.boulder.ibm.com/infocenter/hodhelp/v9r0/index.jsp?topic=/com.ibm.hod9.doc/help/assignkey.html> on Dec. 11, 2008, (Jul. 15, 2005),5 Pages.
“Palm Treo 750 Cell Phone Review—Hardware”, Retrieved from: <http://www.wirelessinfo.com/content/palm-Treo-750-Cell-Phone-Review/Hardware.htm> on Dec. 11, 2008, (Mar. 17, 2007),4 Pages.
“Keyboard (5)”, Retrieved from: <http://landru.uwaterloo.ca/cgi-bin/man.cgi?section=5&topic=keyboard> on Dec. 11, 2008, (Aug. 11, 1997),8 Pages.
“Calc4M”, Retrieved from: <http://www.hellebo.com/Calc4M.html>, (Sep. 10, 2008),4 Pages.
“MIDTB Tip Sheet: Book Courier”, Retrieved from: <http://www.midtb.org/tipsbookcourier.htm> on Dec. 11, 2008, (Sep. 26, 2005),6 Pages.
“Freeware .mobi”, Retrieved from: <http://www.palmfreeware.mobi/download-palette.html>, (Oct. 9, 2001),2 pages.
“Palette Extender 1.0.2”, Retrieved from: <http://palette-extender.en.softonic.com/symbian>, (Jan. 21, 2003),2 pages.
Rice, Stephen V., et al., “A System for Searching Sound Palettes”, Retrieved from: <http://www.comparisonics.com/FindSoundsPalettePaper.pdf>, (Feb. 28-29, 2008),6 pages.
“Multi-touch”, Retrieved from <http://en.wikipedia.org/wiki/Multi-touch#Microsoft—Surface>, (Apr. 17, 2009),8 pages.
Wilson, Tracy V., “How the iPhone Works”, Retrieved from: <http://electronics.howstuffworks.com/iphone2.htm>, (Jan. 2007),9 pages.
“DuoSense™ Multi-Touch Gestures”, Retrieved from: <http://www.n-trig.com/Data/Uploads/Misc/DuoSenseMTG_final.pdf>, (Jul. 2008),4 pages.
Vallerio, Keith S., et al., “Energy-Efficient Graphical User Interface Design”, Retrieved from: <http://www.cc.gatech.edu/classes/AY2007/cs7470_fall/zhong-energy-efficient-user-interface.pdf>, (Jun. 10, 2004),13 Pages.
Nordgren, Peder “Development of a Touch Screen Interface for Scania Interactor”, Retrieved from: <http://www.cs.umu.se/education/examina/Rapporter/PederNordgren.pdf>, (Apr. 10, 2007),67 Pages.
“Elecont Quick Desktop 1.0.43”, Retrieved from: <http://handheld.softpedia.com/get/System-Utilities/Launcher-Applications/Elecont-Quick-Desktop-72131.shtml> on May 5, 2009, (Mar. 13, 2009),pp. 1-2.
“Symbian Applications”, Retrieved from: <http://symbianfullversion.blogspot.com/2008_12_01_archive.html> on May 5, 2009, (Jan. 2009),51 Pages.
Remond, Mickael “Mobile Marketing Magazine”, Retrieved from: <http://www.mobilemarketingmagazine.co.uk/mobile_social_networking/> on May 5, 2009, (Apr. 28, 2009),16 Pages.
“Womma”, Retrieved from: <http://www.womma.org/blog/links/wom-trends/> on May 5, 2009, (2007),70 Pages.
Dolcourt, Jessica “Webware”, Retrieved from: <http://news.cnet.com/webware/?categoryId=2010> on May 5, 2009, (Apr. 2009),13 Pages.
“HTC Shows HTC Snap with Snappy Email Feature”, Retrieved from: <http://www.wirelessandmobilenews.com/smartphones/> on May 5, 2009, (May 4, 2009),10 Pages.
“Ask Web Hosting”, Retrieved from: <http://www.askwebhosting.com/story/18501/HTC_FUZE_From_ATandampT_Fuses_Fun_and_Function_With_the_One-Touch_Power_of_TouchFLO_3D.html> on May 5, 2009, (Nov. 11, 2008),3 pages.
“Live Photo Gallery—Getting Started—from Camera to Panorama”, Retrieved from: <http://webdotwiz.spaces.live.com/blog/cns!2782760752B93233!1729.entry> on May 5, 2009, (Sep. 2008),7 Pages.
Yang, Seungji et al., “Semantic Photo Album Based on MPEG-4 Compatible Application Format”, Retrieved from: <http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=04146254>, (2007),2 Pages.
Mei, Tao et al., “Probabilistic Multimodality Fusion for Event Based Home Photo Clustering”, Retrieved from: <http://ieeexplore.ieee.org//stamp/stamp.jsp?tp=&arnumber=04036960>, (2006),4 Pages.
“Exclusive: Windows Mobile 7 to Focus on Touch and Motion Gestures”, Retrieved from: <http://anti-linux.blogspot.com/2008/08/exclusive-windows-mobile-7-to-focus-on.html> on May 6, 2009, (Aug. 1, 2008),pp. 1-14.
“Mobile/UI/Designs/TouchScreen”, Retrieved from: <https://wiki.mozilla.org/Mobile/UI/Designs/TouchScreen> on May 6, 2009, (Feb. 2009),15 Pages.
“Introduction to Windows Touch”, Retrieved from: <http://download.microsoft.com/download/a/d/f/adf1347d-08dc-41a4-9084-623b1194d4b2/Win7_touch.docx>, (Dec. 18, 2008),7 Pages.
“Touch Shell Free”, Retrieved from: <http://www.pocketpcfreeware.mobi/download-touch-shell-free.html> on May 5, 2009, (Feb. 23, 2009),2 Pages.
“Parallax Scrolling”, Retrieved from: <http://en.wikipedia.org/wiki/Parallax_scrolling> on May 5, 2009, (May 4, 2009),3 Pages.
Steinicke, Frank et al., “Multi-Touching 3D Data: Towards Direct Interaction in Stereoscopic Display Environments coupled with Mobile Devices”, Retrieved from: <http://viscg.uni-muenster.de/publications/2008/SHSK08/ppd-workshop.pdf>, (Jun. 15, 2008),4 Pages.
Mann, Richard et al., “Spectrum Analysis of Motion Parallax in a 3D Cluttered Scene and Application to Egomotion”, Retrieved from: <http://www.cs.uwaterloo.ca/˜mannr/snow/josa-mann-langer.pdf>, (Sep. 2005),15 Pages.
“Keyboard Shortcuts”, Retrieved from: <http://www.pctoday.com/editorial/article.asp?article=articles%2F2005%2Ft0311%2F26t11%2F26t11.asp> on Aug. 3, 2009, (Nov. 2005),5 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2010/028699, (Oct. 4, 2010),10 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2010/028555, (Oct. 12, 2010),10 pages.
Raghaven, Gopal et al., “Model Based Estimation and Verification of Mobile Device Performance”, Retrieved from http://alumni.cs.ucsb.edu/˜raimisl/emsoft04_12.pdf, (Sep. 27-29, 2004),10 Pages.
Reed, Brad “Microsoft Demos Windows Mobile 6.1 at CTIA”, Retrieved from: <http://www.networkworld.com/news/2008/040208-ctia-microsoft-windows-mobile.html> on Jul. 18, 2008, (Apr. 2, 2008),1 page.
Singh, Kundan et al., “CINEMA: Columbia InterNet Extensible Multimedia Architecture”, Retrieved from http://www1.cs.columbia.edu/˜library/TR-repository/reports/reports-2002/cucs-011-02.pdf, (Sep. 3, 2002),83 Pages.
Kcholi, Avi “Windows CE .net Interprocess Communication”, Retrieved from http://msdn.microsoft.com/en-us/library/ms836784.aspx on Jul. 17, 2008, (Jan. 2004),15 Pages.
Gao, Rui “A General Logging Service for Symbian based Mobile Phones”, Retrieved from: <http://www.nada.kth.se/utbildning/grukth/exjobb/rapportlistor/2007/rapporter07/gao_rui_07132.pdf> on Jul. 17, 2008, (Feb. 2007),pp. 1-42.
“Oracle8i Application Developer's Guide—Advanced Queuing Release 2 (8.1.6)”, Retrieved from: http://www.cs.otago.ac.nz/oradocs/appdev.817/a76938/adq01in5.htm on May 6, 2009, 8 pages.
Mao, Jeng “Comments of Verizon Wireless Messaging Services, LLC”, Retrieved from: http://www.ntia.doc.gov/osmhome/warnings/comments/verizon.htm on May 6, 2009, 5 Pages.
“Oracle8i Concepts Release 8.1.5”, Retrieved from: http://www.cs.umbc.edu/help/oracle8/server.815/a67781/c16queue.htm on May 6, 2009, 10 Pages.
“Oracle8i Application Developers Guide—Advanced Queuing”, Retrieved from: http://www.cs.umbc.edu/help/oracle8/server.815/a68005/03_adq1i.htm on May 6, 2009, 29 Pages.
“Content-Centric E-Mail Message Analysis in Litigation Document Reviews”, Retrieved from: <http://www.busmanagement.com/article/Issue-14/Data-Management/Content-Centric-E-Mail-Message-Analysis-in-Litigation-Document-Reviews/> on May 6, 2009,5 Pages.
“Final Office Action”, U.S. Appl. No. 12/433,667, (Sep. 13, 2011),17 pages.
“Internet Explorer Window Restrictions”, Retrieved from: http://technet.microsoft.com/en-us/library/cc759517(WS.10).aspx on Jun. 28, 2011, Microsoft TechNet,5 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/244,545, (Aug. 17, 2011),15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/413,977, (Jul. 19, 2011),17 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,382, (Jul. 26, 2011),9 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,434, (Aug. 2, 2011),6 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,455, (Aug. 29, 2011),8 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,458, (Jul. 6, 2011),8 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,476, (Aug. 3, 2011),21 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/433,667, (Jun. 7, 2011),15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/469,458, (Jul. 1, 2011),15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/484,799, (Aug. 11, 2011),12 pages.
“Notice of Allowance”, U.S. Appl. No. 12/414,458, (Aug. 10, 2011),6 pages.
“SecureMe-Anti-Theft Security Application for S60 3rd”, Retrieved from: <http://www.killermobile.com/newsite/mobile-software/s60-applications/secureme-%11-anti%11theft-security-application-for-s60-3rd.htm> on Jun. 28, 2011, (Dec. 15, 2008),3 pages.
Suror, “PocketShield-New Screenlock App for the HTC Diamond and Pro”, Retrieved from: <http://wmpoweruser.com/?tag=htc-touch-diamond> on Jun. 28, 2011, (Oct. 23, 2008),2 pages.
Terpstra, Brett “Beta Beat: Grape, a New Way to Manage Your Desktop Clutter”, Retrieved from: Beta Beat: Grape, a New Way to Manage Your Desktop Clutter on Jun. 28, 2011, (Apr. 14, 2009),4 pages.
“Final Office Action”, U.S. Appl. No. 12/244,545, (Dec. 7, 2011), 16 pages.
“Final Office Action”, U.S. Appl. No. 12/413,977, (Nov. 17, 2011), 16 pages.
“Final Office Action”, U.S. Appl. No. 12/414,476, (Dec. 1, 2011), 20 pages.
“Final Office Action”, U.S. Appl. No. 12/469,458, (Nov. 17, 2011), 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/469,419, (Nov. 9, 2011), 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/469,480, (Sep. 22, 2011), 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/470,558, (Nov. 22, 2011), 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/484,845, (Dec. 7, 2011), 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/560,081, (Dec. 7, 2011), 16 pages.
“Notice of Allowance”, U.S. Appl. No. 12/414,458, (Oct. 31, 2011), 2 pages.
“Notice of Allowance”, U.S. Appl. No. 12/414,458, (Nov. 29, 2011), 2 pages.
La, Nick “Parallax Gallery”, Available at <http://webdesignerwall.com/tutorials/parallax-gallery/comment-page-1>,(Apr. 25, 2008), 16 pages.
Roberts, Neil “Touching and Gesturing on the iPhone”, Available at <http://www.sitepen.com/blog/2008/07/10/touching-and-gesturing-on-the-iphone/comments-page-1>,(Jul. 10, 2008), 16 pages.
“Advisory Action”, U.S. Appl. No. 12/414,382, (Jan. 20, 2012),3 pages.
“Final Office Action”, U.S. Appl. No. 12/414,382, (Dec. 23, 2011),7 pages.
“Final Office Action”, U.S. Appl. No. 12/469,480, (Feb. 9, 2012),17 pages.
“Final Office Action”, U.S. Appl. No. 12/560,081, (Mar. 14, 2012),16 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,434, (Jan. 17, 2012),7 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/433,667, (Feb. 3, 2012),16 pages.
“Notice of Allowance”, U.S. Appl. No. 12/414,455, (Jan. 4, 2012),4 pages.
Bieber, Gerald et al., “Screen Coverage: A Pen-Interaction Problem for PDA's and Touch Screen Computers”, In Proceedings of ICWMC 2007,(Mar. 2007),6 pages.
Wyatt, Paul “/Flash/the art of parallax scrolling”, .net Magazine,(Aug. 1, 2007),pp. 74-76.
“Extended European Search Report”, European Patent Application No. 09818253.8, (Apr. 10, 2012),7 pages.
“Final Office Action”, U.S. Appl. No. 12/484,799, (Apr. 30, 2012),13 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/244,545, (Mar. 27, 2012),18 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,434, (May 31, 2012),7 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/469,419, (May 23, 2012),13 pages.
“Notice of Allowance”, U.S. Appl. No. 12/414,382, (Apr. 4, 2012),4 pages.
“Notice of Allowance”, U.S. Appl. No. 12/470,558, (Apr. 2, 2012),7 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/413,977, (Jul. 20, 2012), 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/480,969, (Aug. 7, 2012), 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/484,799, (Aug. 7, 2012), 13 pages.
“Notice of Allowance”, U.S. Appl. No. 12/414,434, (Aug. 17, 2012), 4 pages.
“Notice of Allowance”, U.S. Appl. No. 12/470,558, (Aug. 23, 2012), 2 pages.
“Notice of Allowance”, U.S. Appl. No. 12/484,845, (Mar. 16, 2012), 5 pages.
“Final Office Action”, U.S. Appl. No. 12/244,545, (Sep. 7, 2012), 23 pages.
“Final Office Action”, U.S. Appl. No. 12/480,969, (Nov. 23, 2012), 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,476, (Nov. 9, 2012), 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/469,458, (Sep. 21, 2012), 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/469,480, (Oct. 17, 2012), 16 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/492,495, (Sep. 17, 2012), 8 pages.
“Notice of Allowance”, U.S. Appl. No. 12/469,419, (Nov. 27, 2012),13 pages.
“Notice of Allowance”, U.S. Appl. No. 12/484,799, (Oct. 22, 2012), 10 pages.
“Extended European Search Report”, European Patent Application No. 09822736.6, (Dec. 18, 2012), 7 pages.
“Final Office Action”, U.S. Appl. No. 12/433,667, (Jan. 7, 2013), 17 pages.
“Final Office Action”, U.S. Appl. No. 12/469,458, (Feb. 1, 2013), 19 pages.
“Foreign Office Action”, Chinese Application No. 200980142632.9, (Jan. 29, 2013), 11 pages.
“Foreign Office Action”, Chinese Application No. 200980142661.5, (Jan. 21, 2013), 12 pages.
“Foreign Office Action”, Chinese Application No. 201080015728.1, (Dec. 26, 2012), 9 pages.
“Foreign Office Action”, Chinese Application No. 201080015788.3, (Dec. 24, 2012), 10 pages.
“Foreign Office Action”, Chinese Application No. 201080023212.1, (Dec. 5, 2012), 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/492,495, (Dec. 19, 2012), 6 pages.
Crouch, Dennis “Smartphone Wars: Micron's Slide-to-Unlock Patent”, (Jan. 30, 2013), 2 pages.
“Final Office Action”, U.S. Appl. No. 12/414,476, (Apr. 8, 2013), 25 pages.
“Final Office Action”, U.S. Appl. No. 12/469,480, (Apr. 10, 2013), 21 pages.
“Foreign Office Action”, Chinese Application No. 200980142644.1, (Apr. 3, 2013), 10 pages.
“Foreign Office Action”, Chinese Application No. 201080015728.1, (May 16, 2013), 10 pages.
“Foreign Office Action”, Chinese Application No. 201080015788.3, (Jun. 5, 2013), 12 Pages.
“Foreign Office Action”, Chinese Application No. 201080023212.1, (Jun. 5, 2013), 8 pages.
“Introducing Application Styling for Windows Forms”, Infragistics Software Manual, Version 7.3.20072.1043, (Nov. 2007), 95 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/469,458, (May 3, 2013), 21 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/480,969, (Apr. 4, 2013), 22 pages.
“Notice of Allowance”, U.S. Appl. No. 12/433,667, (Jun. 25, 2013),14 pages.
“Notice of Allowance”, U.S. Appl. No. 13/492,495, (Apr. 26, 2013), 5 pages.
“Foreign Office Action”, Chinese Application No. 200980142661.5, (Sep. 24, 2013), 8 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/418,884, (Sep. 30, 2013), 7 pages.
“Foreign Office Action”, Japanese Application No. 2011-530109, (Jul. 18, 2013), 4 Pages.
“Foreign Office Action”, Chinese Application No. 201080015802.X, (Sep. 29, 2013), 11 Pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 12/433,667, (Oct. 10, 2013), 2 pages.
“Final Office Action”, U.S. Appl. No. 12/469,458, (Oct. 11, 2013), 24 pages.
“Foreign Office Action”, Chinese Application No. 200980142632.9, (Jun. 14, 2013), 6 pages.
“Foreign Office Action”, Japanese Application No. 2012-503523, (Apr. 22, 2013), 5 Pages.
“EP Search Report”, European Application No. 10762112.0, (Aug. 2, 2013), 7 Pages.
“Final Office Action”, U.S. Appl. No. 12/480,969, (Jul. 24, 2013),19 pages.
“Foreign Office Action”, Chinese Application No. 200980142644.1, (Aug. 20, 2013), 9 Pages.
“Foreign Office Action”, Japanese Application No. 2011-533353, (Jul. 5, 2013), 9 Pages.
“Foreign Office Action”, Chilean Application No. 2379-2011, (Jul. 3, 2013), 8 pages.
“Foreign Office Action”, Chinese Application No. 200980139831.4, (Jul. 1, 2013), 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/469,480, (Aug. 27, 2013), 22 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 12/433,667, (Aug. 1, 2013), 2 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/414,476, Oct. 25, 2013, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/480,969, Oct. 29, 2013, 22 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/244,545, Dec. 19, 2013, 22 pages.
“Final Office Action”, U.S. Appl. No. 12/469,480, Dec. 5, 2013, 24 pages.
“Foreign Office Action”, MX Application No. Mx/a/2011/012279, Jul. 4, 2013, 3 Pages.
“Foreign Office Action”, JP Application No. 2011-533353, Nov. 26, 2013, 4 pages.
“Foreign Office Action”, JP Application No. 2012-503515, Nov. 18, 2013, 5 Pages.
“Foreign Office Action”, JP Application No. 2012-503514, Aug. 7, 2013, 5 pages.
“Foreign Office Action”, CN Application No. 201080015728.1, Oct. 29, 2013, 8 Pages.
“Final Office Action”, U.S. Appl. No. 13/418,884, Dec. 30, 2013, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/270,111, Oct. 21, 2013, 9 pages.
“Final Office Action”, U.S. Appl. No. 12/480,969, Feb. 21, 2014, 21 pages.
“Foreign Office Action”, JP Application No. 2011-533280, Nov. 26, 2013, 4 Pages.
“Foreign Notice of Allowance”, JP Application No. 2012-503523, Oct. 24, 2013, 4 pages.
“Foreign Office Action”, AU Application No. 2010234909, Mar. 17, 2014, 4 Pages.
“Foreign Office Action”, CN Application No. 200980139831.4, Mar. 24, 2014, 9 Pages.
“Foreign Office Action”, CN Application No. 200980142644.1, Mar. 5, 2014, 7 Pages.
“Foreign Office Action”, JP Application No. 2012-511905, Jan. 28, 2014, 6 Pages.
“Foreign Office Action”, RU Application No. 2011147058, Feb. 12, 2014, 6 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/418,884, Mar. 10, 2014, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/712,777, Mar. 20, 2014, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/270,111, Mar. 7, 2014, 6 pages.
“Final Office Action”, U.S. Appl. No. 12/244,545, May 6, 2014, 24 pages.
“Final Office Action”, U.S. Appl. No. 12/414,476, Apr. 24, 2014, 19 pages.
“Floating Layer”, Retrieved from <http://web.archive.org/web/20011025040032/http://www.echoecho.com/toolfloatinglayer.htm> on Apr. 15, 2014, Oct. 25, 2001, 9 pages.
“Foreign Office Action”, AU Application No. 2010260165, Mar. 25, 2014, 3 Pages.
“Foreign Office Action”, AU Application No. 2010260165, May 1, 2014, 3 Pages.
“Foreign Office Action”, CN Application No. 201080015802.X, May 19, 2014, 7 Pages.
“Foreign Office Action”, JP Application No. 2011-530109, May 2, 2014, 4 Pages.
“Foreign Office Action”, JP Application No. 2012-516218, Mar. 6, 2014, 6 Pages.
“Non-Final Office Action”, U.S. Appl. No. 12/560,081, Apr. 30, 2014, 25 pages.
Related Publications (1)
Number Date Country
20100105370 A1 Apr 2010 US
Provisional Applications (3)
Number Date Country
61107945 Oct 2008 US
61107935 Oct 2008 US
61107921 Oct 2008 US