Mobile communications device user interface

Information

  • Patent Grant
  • Patent Number
    8,385,952
  • Date Filed
    Monday, June 15, 2009
  • Date Issued
    Tuesday, February 26, 2013
Abstract
A mobile communications device user interface is described. In an implementation, a method is implemented by a mobile communications device that includes outputting a user interface having a portion that is configured to accept content. When an option is selected in relation to the portion to initiate a communication and the content includes one or more contacts, the communication is formed that includes the content and is automatically addressed to the one or more contacts.
Description
BACKGROUND

Mobile communication devices (e.g., wireless phones) have become an integral part of everyday life. However, the form factor employed by conventional mobile communications devices is typically limited to promote mobility of the mobile communications device.


For example, the mobile communications device may have a relatively limited amount of display area when compared to a conventional desktop computer, e.g., a PC. Therefore, conventional techniques used to interact with a desktop computer may be inefficient when employed by a mobile communications device. For example, it may be difficult to select multiple items of content using conventional techniques on a mobile communications device that has a limited amount of display area.


SUMMARY

A mobile communications device user interface is described. In an implementation, a method is implemented by a mobile communications device that includes outputting a user interface having a portion that is configured to accept content. When an option is selected in relation to the portion to initiate a communication and the content includes one or more contacts, the communication is formed that includes the content and is automatically addressed to the one or more contacts.


In an implementation, one or more computer-readable storage media includes instructions that are executable by a mobile communications device to configure a user interface to include a portion that is configured to accept a plurality of contacts via a drag-and-drop operation. The instructions are further configured to provide an option that is selectable to initiate a communication to each of the plurality of contacts.


In an implementation, a mobile communications device includes a display device, a processor, and memory configured to maintain a plurality of applications and an operating system that are executable on the processor. The operating system is configured to expose a feature to the plurality of applications to output in a user interface for display on the display device. The feature involves a portion that is selectable in the user interface to accept content via a drag-and-drop operation and output one or more indications of actions that are performable on each of the content accepted in the portion.


This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.





BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.



FIG. 1 is an illustration of an example implementation of a mobile communications device in accordance with one or more embodiments of devices, features, and systems for mobile communications.



FIG. 2 is an illustration of a system in an example implementation that shows a gesture that is used to collect content in the portion of the user interface of FIG. 1.



FIG. 3 is an illustration of a system in an example implementation in which content is shown that is accepted into a portion of FIG. 2.



FIG. 4 is a flow diagram depicting a procedure in an example implementation in which a portion is output in a user interface that is configured to accept content and then perform an action involving each of the content.



FIG. 5 is a flow diagram depicting a procedure in an example implementation in which a user interface is output having a portion that is configured to accept content and usable to automatically form a communication without further user interaction.



FIG. 6 illustrates various components of an example device that can be implemented in various embodiments as any type of a mobile device to implement embodiments of devices, features, and systems for mobile communications.





DETAILED DESCRIPTION
Overview

Mobile communications devices typically have a small form factor to aid mobility of the mobile communications device. For example, the mobile communications device (e.g., a mobile phone) may be configured with a relatively minimal amount of display area and limited input devices (e.g., a keypad) so that the device may be easily transported. Consequently, traditional techniques used to interact with a conventional computer (e.g., a desktop PC) may be frustrating when used in conjunction with a mobile communications device.


For instance, selection and sharing of content (e.g., photos, video, and so on) on a mobile communications device may be difficult using traditional techniques due to the limitations of the small form factor described above. Consequently, traditional mobile communications devices were configured to interact with a single item of content at a time.


A mobile communications device user interface is described. In an implementation, a portion of a user interface is configured as a “bucket” to allow a user to drag-and-drop multiple items of content into it. The portion may be persisted between applications by an operating system such that a user may navigate between applications and store items of content from the applications using the portion. Actions may then be applied to the content that has been accepted in the bucket as a whole, such as to form a communication, a playlist, and so on, further discussion of which may be found in relation to the following sections.
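
The “bucket” behavior described above may be pictured with the following minimal Kotlin sketch. It is only an illustration under assumed names (ContentItem, ContentBucket, and BucketAction are hypothetical and do not appear in this document): content dropped into the bucket persists as the user moves between applications, and the set of actions offered is derived from the collection as a whole.

    // A minimal, hypothetical sketch of the "bucket" described above.
    // Names (ContentItem, ContentBucket, BucketAction) are illustrative only.
    sealed class ContentItem {
        data class Image(val title: String) : ContentItem()
        data class Song(val title: String) : ContentItem()
        data class Contact(val name: String, val email: String?) : ContentItem()
    }

    enum class BucketAction { FORM_COMMUNICATION, FORM_PLAYLIST, FORM_SLIDESHOW }

    class ContentBucket {
        private val items = mutableListOf<ContentItem>()

        // Content dropped into the bucket is appended and persists here
        // regardless of which application the user navigates to next.
        fun accept(item: ContentItem) { items.add(item) }

        fun contents(): List<ContentItem> = items.toList()

        // Actions are offered against the collected content as a whole.
        fun availableActions(): Set<BucketAction> = buildSet {
            if (items.any { it is ContentItem.Song }) add(BucketAction.FORM_PLAYLIST)
            if (items.any { it is ContentItem.Image }) add(BucketAction.FORM_SLIDESHOW)
            if (items.any { it is ContentItem.Contact }) add(BucketAction.FORM_COMMUNICATION)
        }
    }

    fun main() {
        val bucket = ContentBucket()
        bucket.accept(ContentItem.Image("Dog"))
        bucket.accept(ContentItem.Contact("Ellie", "ellie@example.com"))
        println(bucket.availableActions()) // [FORM_SLIDESHOW, FORM_COMMUNICATION]
    }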


In the following discussion, a variety of example implementations of a mobile communications device (e.g., a wireless phone) are described. Additionally, a variety of different functionality that may be employed by the mobile communications device is described for each example, which may be implemented in that example as well as in other described examples. Accordingly, the illustrated example implementations represent but a few of a variety of contemplated implementations. Further, although a mobile communications device having one or more modules that are configured to provide telephonic functionality is described, a variety of other mobile devices are also contemplated, such as personal digital assistants, mobile music players, dedicated messaging devices, portable game devices, netbooks, and so on.


Example Implementations


FIG. 1 is an illustration of an example implementation 100 of a mobile communications device 102 in accordance with one or more embodiments of devices, features, and systems for mobile communications. The mobile communications device 102 is operable to assume a plurality of configurations, examples of which include a configuration as illustrated in FIG. 1 in which the mobile communications device 102 is “open” and a configuration in which the mobile communications device 102 is “closed” as illustrated in FIGS. 2-3.


The mobile communications device 102 is further illustrated as including a first housing 104 and a second housing 106 that are connected via a slide 108 such that the first and second housings 104, 106 may move (e.g., slide) in relation to one another. Although sliding is described, it should be readily apparent that a variety of other movement techniques are also contemplated, e.g., a pivot, a hinge and so on.


The first housing 104 includes a display device 110 that may be used to output a variety of content, such as a caller identification (ID), contacts, images (e.g., photos) as illustrated, email, multimedia messages, Internet browsing, game play, music, video and so on. In an implementation, the display device 110 is configured to function as an input device by incorporating touchscreen functionality, e.g., through capacitive, surface acoustic wave, resistive, optical, strain gauge, dispersive signals, acoustic pulse, and other touchscreen functionality. The touchscreen functionality (as well as other functionality such as track pads) may be used to detect gestures, further discussion of which may be found in relation to FIGS. 2 and 3.


The second housing 106 is illustrated as including a keyboard 112 that may also be used to provide inputs to the mobile communications device 102. Although the keyboard 112 is illustrated as a QWERTY keyboard, a variety of other examples are also contemplated, such as a keyboard that follows a traditional telephone keypad layout (e.g., a twelve key numeric pad found on basic telephones), keyboards configured for other languages (e.g., Cyrillic), and so on.


In the “open” configuration as illustrated in the example implementation 100 of FIG. 1, the first housing 104 is moved (e.g., slid) “away” from the second housing 106 using the slide 108. Other implementations are also contemplated, such as a “clamshell” configuration, “brick” configuration, and so on.


The form factor employed by the mobile communications device 102 may be suitable to support a wide variety of features. For example, the keyboard 112 is illustrated as supporting a QWERTY configuration. This form factor may be particularly convenient to a user to utilize the previously described functionality of the mobile communications device 102, such as to compose texts, play games, check email, “surf” the Internet, provide status messages for a social network, and so on.


The mobile communications device 102 is also illustrated as including a communication module 114. The communication module 114 is representative of functionality of the mobile communications device 102 to communicate via a network 116. For example, the communication module 114 may include telephone functionality to make and receive telephone calls. The communication module 114 may also include a variety of other functionality, such as to capture content, form short message service (SMS) text messages, multimedia messaging service (MMS) messages, emails, status updates to be communicated to a social network service, and so on. A user, for instance, may input a status update for communication via the network 116 to the social network service. The social network service may then publish the status update to “friends” of the user, e.g., for receipt by the friends via a computer, respective mobile communications devices, and so on. A variety of other examples are also contemplated, such as blogging, instant messaging, and so on.
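
As a rough illustration of the communication module's role, the following Kotlin sketch models a single module that forms several kinds of outgoing communications. The names (Communication, CommunicationModule) and the in-memory outbox are assumptions made only for illustration; an actual device would hand the communication to a radio or network stack.

    // Hypothetical sketch of a communication module that forms different
    // kinds of outgoing communications for delivery over a network.
    sealed class Communication {
        data class Sms(val number: String, val text: String) : Communication()
        data class Email(val address: String, val subject: String, val body: String) : Communication()
        data class StatusUpdate(val service: String, val text: String) : Communication()
    }

    class CommunicationModule {
        // In a real device this would be handed to a network stack; here it
        // is simply recorded for illustration.
        private val outbox = mutableListOf<Communication>()

        fun send(communication: Communication) { outbox.add(communication) }

        fun pending(): List<Communication> = outbox.toList()
    }

    fun main() {
        val module = CommunicationModule()
        module.send(Communication.StatusUpdate("social-network", "Out hiking with the dog"))
        module.send(Communication.Sms("555-0100", "See you at noon"))
        println(module.pending().size) // 2
    }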


The mobile communications device 102 is also illustrated as including a user interface module 118. The user interface module 118 is representative of functionality of the mobile communications device 102 to generate, manage, and/or output a user interface 120 for display on the display device 110. A variety of different techniques may be employed to generate the user interface.


For example, the user interface module 118 may configure the user interface 120 to include a portion 122 to collect a plurality of content, such as the images 124 in the user interface 120 of FIG. 1. The user interface module 118 may then expose a plurality of actions 126 that may be performed using each of the content collected in the portion 122. A variety of different actions 126 may be performed, such as to form a communication 128, form a playlist 130, form a slideshow 132, and so on. Thus, in this way content may be first collected using the portion 122 (the “bucket”) and then an object (e.g., the communication, playlist, or slideshow) may be formed. This differs from conventional techniques in which the object was first formed (e.g., a playlist, a communication, and so on) and then populated with content. A variety of different techniques may be used to collect content using the portion 122, an example of which is described in relation to the following figure.
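
The “collect first, then form the object” ordering may be made concrete with a short Kotlin sketch. The Item, Playlist, Slideshow, and Communication types and the form* functions below are hypothetical names chosen for illustration; the point is simply that the object is created from content that has already been gathered, rather than created empty and then populated.

    // Hypothetical sketch of "collect first, then form the object".
    data class Item(val kind: String, val name: String) // e.g. "song", "image", "contact"

    data class Playlist(val tracks: List<Item>)
    data class Slideshow(val slides: List<Item>)
    data class Communication(val recipients: List<Item>, val attachments: List<Item>)

    fun formPlaylist(collected: List<Item>) =
        Playlist(collected.filter { it.kind == "song" })

    fun formSlideshow(collected: List<Item>) =
        Slideshow(collected.filter { it.kind == "image" })

    fun formCommunication(collected: List<Item>) =
        Communication(
            recipients = collected.filter { it.kind == "contact" },
            attachments = collected.filter { it.kind != "contact" }
        )

    fun main() {
        val collected = listOf(
            Item("image", "Dog"),
            Item("song", "Track 1"),
            Item("contact", "Ellie")
        )
        // The communication is formed from the already-gathered content:
        // addressed to the contact, carrying the image and the song.
        println(formCommunication(collected))
    }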



FIG. 2 illustrates a system 200 in an example implementation in which a gesture is used to collect content in the portion 122 of the user interface 120. The illustrated system 200 shows a plurality of steps 202, 204, 206, 208 that are used to collect content in the portion 122. The mobile communications device 102 is illustrated in the first step 202 as outputting the user interface 120 having a plurality of content, which are images in this example. A user's finger 210 selects the image 212 of the dog by placing the finger 210 against the surface of the display device 110.


At the second step 204, touchscreen functionality of the mobile communications device 102 is used to detect the selection. In response, a thumbnail image 214 of the image 212 of the dog is created which follows the dragging of the user's finger 210 across the display device 110. For example, an animation may be displayed to give the appearance that the thumbnail image 214 “pops off” the image 212 of the dog. Additionally, the display of the image 212 may also be changed to indicate the selection, which is illustrated through the use of grayscale in the second step 204.


At the third step 206, the thumbnail image 214 has been dragged proximal to the portion 122 of the user interface 120 to follow the user's finger 210. In response, an animation is displayed that gives an appearance of the thumbnail image 214 being “dropped in the bucket.” For example, the animation may cause the thumbnail image 214 to be rotated and scaled (e.g., shrunk). In this way, the user is informed that the content (e.g., the image 212) is being input. A variety of other examples are also contemplated.


At the fourth step 208, the display of the image 212 returns to the original state, e.g., to match the original state in the first step 202. Additionally, a display of the portion 122 is changed to indicate that the portion contains the image. Thus, the user is informed in this instance that the portion 122 “contains” the image 212. This process may be repeated to include a wide variety of content in the portion 122 from a wide variety of applications. For example, the content included in the portion 122 may be heterogeneous (e.g., “mixed”) to include music, images, movies, contacts, documents, and so on obtained from a variety of different applications. Although a drag-and-drop operation has been described for a single item of content, multiple items may also be selected (e.g., sequentially or at one time) and then dragged together. A variety of different actions may then be performed based on what content is included in the portion 122, further discussion of which may be found in relation to the following figure.
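
The four steps above may be summarized as a small state model, sketched below in Kotlin. The class and field names, the coordinate representation, and the drop-radius test are assumptions used only to illustrate the sequence of press, drag, acceptance near the portion, and restoration of the source display.

    // Hypothetical sketch of the four drag-and-drop steps described above.
    // Coordinates, thresholds, and names are illustrative assumptions.
    data class Point(val x: Float, val y: Float)

    class DragAndDrop(private val portionCenter: Point, private val dropRadius: Float = 50f) {
        var thumbnailAt: Point? = null      // steps 1-2: thumbnail "pops off" and follows the finger
            private set
        var sourceGrayedOut = false         // step 2: source image shown in grayscale while selected
            private set
        var portionContainsContent = false  // step 4: portion display changes once content is accepted
            private set

        fun onPress(at: Point) {
            thumbnailAt = at
            sourceGrayedOut = true
        }

        fun onDrag(to: Point) {
            if (thumbnailAt != null) thumbnailAt = to
        }

        fun onRelease(at: Point) {
            val dx = at.x - portionCenter.x
            val dy = at.y - portionCenter.y
            // step 3: if released close enough to the portion, the content is accepted
            if (dx * dx + dy * dy <= dropRadius * dropRadius) portionContainsContent = true
            // step 4: thumbnail disappears and the source returns to its original state
            thumbnailAt = null
            sourceGrayedOut = false
        }
    }

    fun main() {
        val gesture = DragAndDrop(portionCenter = Point(40f, 460f))
        gesture.onPress(Point(160f, 200f))
        gesture.onDrag(Point(60f, 440f))
        gesture.onRelease(Point(45f, 455f))
        println(gesture.portionContainsContent) // true
    }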



FIG. 3 illustrates a system 300 in an example implementation in which content is shown that is accepted into the portion 122. A user's finger 210 is shown as selecting the portion 122, which causes content that has been placed “in” the portion 122 to be displayed in the user interface 120.


The user interface 120 may also include options that are selectable to perform represented actions, examples of which are illustrated as upload 302 and send 304. The upload 302 option is selectable to cause content included in the portion 122 to be uploaded to a network site, such as a social network service. For example, the user interface module 118 may determine that a contact (e.g., “Ellie” in the illustrated example) has been included in the portion 122. Accordingly, the user interface module 118 may upload content that is not the contact to a network location specified in the contact “Ellie.” A similar technique may also be employed to form a communication to one or multiple contacts, further discussion of which may be found in relation to the following procedures.
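
A minimal Kotlin sketch of this upload behavior follows, assuming hypothetical Contact, Uploadable, and UploadPlan types: a contact found in the portion supplies the network location, and the remaining content is what gets uploaded.

    // Hypothetical sketch of the "upload" action described above.
    data class Contact(val name: String, val uploadUrl: String)
    data class Uploadable(val name: String)

    data class UploadPlan(val destination: String, val files: List<Uploadable>)

    fun planUpload(portionContents: List<Any>): UploadPlan? {
        // A contact in the portion supplies the destination; everything that
        // is not a contact is the content to be uploaded.
        val contact = portionContents.filterIsInstance<Contact>().firstOrNull() ?: return null
        val files = portionContents.filterIsInstance<Uploadable>()
        return UploadPlan(destination = contact.uploadUrl, files = files)
    }

    fun main() {
        val plan = planUpload(
            listOf(Uploadable("dog.jpg"), Contact("Ellie", "https://example.com/ellie"))
        )
        // plan: upload dog.jpg to the network location taken from the contact "Ellie"
        println(plan)
    }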


Example Procedures

The following discussion describes user interface techniques that may be implemented utilizing the previously described systems and devices. Aspects of each of the procedures may be implemented in hardware, firmware, software, or a combination thereof. The procedures are shown as a set of blocks that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. In portions of the following discussion, reference will be made to the environment 100 and systems 200-300 of FIGS. 1-3, respectively.



FIG. 4 depicts a procedure 400 in an example implementation in which a portion is output in a user interface that is configured to accept content and then perform an action involving each of the content. A feature is exposed to a plurality of applications to output, in a user interface for display on a display device, a portion that is selectable in the user interface to accept content (block 402). For example, functionality of the user interface module 118 may be incorporated within an operating system that is executable on a processor of the mobile communications device 102 and is storable in memory or other computer-readable storage media of the mobile communications device 102. The operating system may expose this functionality to applications that are also executed on the mobile communications device 102 via an application programming interface (API). Further discussion of an operating system may be found in relation to FIG. 6.
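
One way to picture the exposure of this feature through an API is sketched below in Kotlin. The SharedPortion interface, OperatingSystemPortion object, and the two application classes are hypothetical; the sketch only illustrates that applications see a narrow interface while the operating system owns the backing store, so the portion persists across application switches.

    // Hypothetical sketch of an operating-system feature exposed to
    // applications through an API.
    interface SharedPortion {
        fun accept(item: String)
        fun contents(): List<String>
    }

    object OperatingSystemPortion : SharedPortion {
        private val items = mutableListOf<String>()
        override fun accept(item: String) { items.add(item) }
        override fun contents(): List<String> = items.toList()
    }

    class PhotosApp(private val portion: SharedPortion) {
        fun dropPhoto(name: String) = portion.accept("photo:$name")
    }

    class MusicApp(private val portion: SharedPortion) {
        fun dropSong(title: String) = portion.accept("song:$title")
    }

    fun main() {
        // Two different applications share the single OS-provided portion.
        PhotosApp(OperatingSystemPortion).dropPhoto("dog.jpg")
        MusicApp(OperatingSystemPortion).dropSong("Track 1")
        println(OperatingSystemPortion.contents()) // [photo:dog.jpg, song:Track 1]
    }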


One or more indications of actions are output that are performable on each of the content accepted in the portion (block 404). Thus, in this example, the content is gathered and then a variety of actions are performable on the content by selecting the action (block 406). For example, a playlist may be created in response to selection of a corresponding action (block 408) “create playlist” when the portion includes content configured as media, e.g., songs, videos, and so on. In another example, the content may be uploaded to a social network site in response to selection of a corresponding action (block 410) “upload.” For instance, the user interface module 118 may automatically (or in conjunction with manual interaction of a user) provide credentials to a social network website via the network 116 to upload content to a user's account upon selection of an “upload” action. In a further example, a communication is formed in response to selection of a corresponding action (block 412), such as an email, text message, and so on. Further discussion of communication formation may be found in relation to the following figure.
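
A compact Kotlin sketch of dispatching a selected action (blocks 406-412) follows; the Action enumeration and the returned strings are placeholders chosen only for illustration.

    // Hypothetical sketch: the indication chosen by the user is mapped onto
    // the corresponding operation over the gathered content.
    enum class Action { CREATE_PLAYLIST, UPLOAD, FORM_COMMUNICATION }

    fun perform(action: Action, gathered: List<String>): String = when (action) {
        Action.CREATE_PLAYLIST -> "playlist of ${gathered.size} items"
        Action.UPLOAD -> "uploading ${gathered.size} items to the user's account"
        Action.FORM_COMMUNICATION -> "communication carrying ${gathered.size} items"
    }

    fun main() {
        println(perform(Action.CREATE_PLAYLIST, listOf("Track 1", "Track 2"))) // playlist of 2 items
    }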



FIG. 5 depicts a procedure 500 in an example implementation in which a user interface is output having a portion that is configured to accept content and usable to automatically form a communication without further user interaction. A user interface is output that has a portion that is configured to accept content (block 502). For example, the portion 122 may accept content such as images, music, and contacts.


When an option is selected in relation to the portion to initiate a communication and the content includes one or more contacts, the communication is formed to include the content and is automatically addressed to the one or more contacts (block 504). For example, the user interface module 118 may identify which of the content accepted via the portion 122 includes relevant contact information, e.g., a telephone number, email address, and so on. Therefore, when an option is selected (e.g., a representation of an action “email”) the communication is formed and populated with the relevant contact information of each of the contacts accepted by the portion 122 automatically and without further user interaction. A variety of other examples are also contemplated, such as formation of an SMS, MMS, and so on.
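
The automatic addressing of block 504 may be sketched in Kotlin as follows, under the assumption of hypothetical Contact, Attachment, and Email types: contacts already present in the portion supply the addresses, so the communication is formed without further addressing by the user.

    // Hypothetical sketch of block 504: contacts in the portion supply the
    // addresses and the remaining content becomes the communication's payload.
    data class Contact(val name: String, val email: String)
    data class Attachment(val name: String)
    data class Email(val to: List<String>, val attachments: List<Attachment>)

    fun formEmail(portionContents: List<Any>): Email {
        val addresses = portionContents.filterIsInstance<Contact>().map { it.email }
        val attachments = portionContents.filterIsInstance<Attachment>()
        return Email(to = addresses, attachments = attachments)
    }

    fun main() {
        val email = formEmail(
            listOf(Contact("Ellie", "ellie@example.com"), Attachment("dog.jpg"))
        )
        println(email.to) // [ellie@example.com] - addressed automatically from the contact
    }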


Example Device


FIG. 6 illustrates various components of an example device 600 that can be implemented in various embodiments as any type of a mobile device to implement embodiments of devices, features, and systems for mobile communications. For example, device 600 can be implemented as any of the mobile communications devices 102 described with reference to respective FIGS. 1-3. Device 600 can also be implemented to access a network-based service, such as a social network service as previously described.


Device 600 includes input 602 that may include Internet Protocol (IP) inputs as well as other input devices, such as the keyboard 112 of FIG. 1. Device 600 further includes communication interface 604 that can be implemented as any one or more of a wireless interface, any type of network interface, and as any other type of communication interface. A network interface provides a connection between device 600 and a communication network by which other electronic and computing devices can communicate data with device 600. A wireless interface enables device 600 to operate as a mobile device for wireless communications.


Device 600 also includes one or more processors 606 (e.g., any of microprocessors, controllers, and the like) which process various computer-executable instructions to control the operation of device 600 and to communicate with other electronic devices. Device 600 can be implemented with computer-readable media 608, such as one or more memory components, examples of which include random access memory (RAM) and non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.).


Computer-readable media 608 provides data storage to store content and data 610, as well as device applications and any other types of information and/or data related to operational aspects of device 600. For example, an operating system 612 can be maintained as a computer application with the computer-readable media 608 and executed on processor 606. Device applications can also include a communication manager module 614 (which may be used to provide telephonic functionality) and a media manager 616.


Device 600 also includes an audio and/or video output 618 that provides audio and/or video data to an audio rendering and/or display system 620. The audio rendering and/or display system 620 can be implemented as integrated component(s) of the example device 600, and can include any components that process, display, and/or otherwise render audio, video, and image data. Device 600 can also be implemented to provide a user with tactile feedback, such as vibration and haptics.


Generally, the blocks may be representative of modules that are configured to provide represented functionality. Further, any of the functions described herein can be implemented using software, firmware, hardware (e.g., fixed logic circuitry), manual processing, or a combination of these implementations. The terms “module,” “functionality,” and “logic” as used herein generally represent software, firmware, hardware or a combination thereof. In the case of a software implementation, the module, functionality, or logic represents program code that performs specified tasks when executed on a processor (e.g., CPU or CPUs). The program code can be stored in one or more computer readable memory devices. The features of the techniques described above are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.


CONCLUSION

Although the invention has been described in language specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed invention.

Claims
  • 1. A method implemented by a device, the method comprising: outputting a user interface having a portion that is configured to accept content via a drag-and-drop operation involving a gesture that gives an appearance of a thumbnail image detected through interaction with a display device of the mobile communications device using touchscreen functionality, the content accepted by the portion remaining hidden from view; presenting an indication during the drag-and-drop operation indicating that the content is being input into the portion; in response to a selection of the portion, displaying the content accepted by the portion; and when an option is selected in relation to the portion to initiate a communication, and a first item of content includes one or more contacts and a second item of content does not include one or more contacts, forming the communication to be automatically addressed to the one or more contacts of the first item and that is configured to communicate the second item of content included in the portion over a network by the device.
  • 2. A method as described in claim 1, wherein the communication is an email and the one or more contacts include at least one email address that is used to automatically address the communication.
  • 3. A method as described in claim 1, wherein the communication is a multimedia messaging service (MMS) communication and the one or more contacts include at least one telephone number that is used to automatically address the communication.
  • 4. A method as described in claim 1, wherein the communication is an upload to a social network service and the one or more contacts include at least one network address that is used to automatically address the communication.
  • 5. A method as described in claim 1, wherein the portion is provided by an operating system and exposed to a plurality of applications such that the content accepted by the portion persists during navigation between the applications.
  • 6. A method as described in claim 1, wherein: the thumbnail image of the content is displayed in the user interface as following the gesture; and when the thumbnail image is displayed proximally to the portion, an animation is displayed in the user interface to show acceptance of the thumbnail image by the portion.
  • 7. A method as described in claim 6, wherein the animation involves rotation and scaling of the thumbnail image.
  • 8. A method as described in claim 1, wherein a display state of the portion changes when the content has been accepted by the portion to indicate the acceptance.
  • 9. One or more tangible computer-readable storage media devices comprising instructions that are executable by a mobile communications device to configure a user interface to: include a portion that is configured to accept a plurality of contacts via a drag-and-drop operation involving a gesture that gives an appearance of a thumbnail image detected through interaction with a display device of the mobile communications device using touchscreen functionality; present an indication during the drag-and-drop operation indicating that at least one of the plurality of contacts is being input into the portion; receive, by the portion, content that is not displayed in the portion; in response to a selection of the portion, display the received content and at least one of the accepted plurality of contacts; and provide an option that is selectable to initiate a communication to a network location specified by each of the displayed plurality of contacts, the communication including the received content.
  • 10. One or more tangible computer-readable media devices as described in claim 9, wherein the communication is a single email that is addressed to each of the plurality of contacts.
  • 11. One or more tangible computer-readable media devices as described in claim 9, wherein the communication is a single multimedia messaging service (MMS) communication that is addressed to each of the plurality of contacts using respective telephone numbers.
  • 12. One or more tangible computer-readable media devices as described in claim 9, wherein the communication is an upload to be performed to a plurality of members of one or more social network services.
  • 13. One or more tangible computer-readable media devices as described in claim 9, wherein the option is selectable to output a plurality of additional options, each of the plurality of additional options indicating a different communication technique.
  • 14. One or more tangible computer-readable media devices as described in claim 9, wherein the portion is provided by an operating system and exposed to a plurality of applications such that the content accepted by the portion persists during navigation between the applications.
  • 15. One or more tangible computer-readable media devices as described in claim 9, wherein: the thumbnail image of the content is displayed in the user interface as following the gesture; and when the thumbnail image is displayed proximally to the portion, an animation is displayed in the user interface to show acceptance of the thumbnail image by the portion.
  • 16. A mobile communications device comprising: a display device; a processor; and memory configured to maintain a plurality of applications and an operating system that are executable on the processor, wherein the operating system is configured to expose a feature to the plurality of applications to output in a user interface for display, on the display device, a portion that is selectable in the user interface to: accept content via a drag-and-drop operation involving a gesture that gives an appearance of a thumbnail image detected through interaction with the display device of the mobile communications device using touchscreen functionality; hide the accepted content from view on the display device until a selection of the portion is received; and output one or more indications of actions that are performable on each of the content accepted in the portion; and wherein the user interface is configured to present an indication during the drag-and-drop operation indicating that the content is being input into the portion.
  • 17. A mobile communications device as described in claim 16, wherein the actions include creation of a playlist and initiation of a slide show.
  • 18. A mobile communications device as described in claim 16, wherein the portion provided by the operating system and exposed to the plurality of applications is configured such that the content accepted by the portion persists during navigation between the applications.
  • 19. A mobile communications device as described in claim 16, wherein the content includes an image, video, or a contact.
  • 20. A mobile communications device as described in claim 16, wherein: the thumbnail image of the content is displayed in the user interface as following the gesture; and when the thumbnail image is displayed proximally to the portion, an animation is displayed in the user interface to show acceptance of the thumbnail image by the portion.
RELATED APPLICATIONS

This application claims priority under 35 U.S.C. Section 119(e) to U.S. Provisional Patent Application Nos. 61/107,945, 61/107,935, and 61/107,921, each of which was filed on Oct. 23, 2008, the entire disclosures of which are hereby incorporated by reference.

Related Publications (1)
Number Date Country
20100159966 A1 Jun 2010 US
Provisional Applications (3)
Number Date Country
61107945 Oct 2008 US
61107935 Oct 2008 US
61107921 Oct 2008 US