This disclosure generally relates to user interfaces.
A touchpad is an input device including a surface that detects touch-based inputs of users. A touch screen is an electronic visual display that detects the presence and location of user touch inputs. A proximity sensor is a sensor device that detects the presence of nearby objects without physical contact. A computing system (such as a mobile phone, a tablet computer, or a laptop computer) often incorporates such devices to facilitate user interactions with application programs running on the computing system.
A touchpad is an input device including a surface that detects touch-based inputs of users. Similarly, a touch screen is an electronic visual display surface that detects the presence and location of user touch inputs. So-called dual-touch or multi-touch displays or touchpads refer to devices that can identify the presence, location, and movement of more than one touch input, such as two- or three-finger touches. A system incorporating one or more touch-based input devices may monitor one or more touch-sensitive surfaces for touch or near-touch inputs from a user. When one or more such user inputs occur, the system may determine the distinct area(s) of contact, identify the nature of the touch or near-touch input(s) from geometric features and geometric arrangements (e.g., location, movement), and determine whether they correspond to various touch events or gestures (e.g., tap, drag, swipe, pinch).
Recognition of touch events by a system with one or more touch-based input devices—i.e., identifying one or more touch inputs by a user and determining corresponding touch event(s)—may be implemented by a combination of hardware, software, or firmware (or device drivers).
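For illustration only, the following minimal sketch shows one way such recognition logic might classify single- and two-finger contact tracks into the gestures named above. The TouchSample structure, thresholds, and function names are assumptions of this sketch, not part of this disclosure.

```python
import math
from dataclasses import dataclass

@dataclass
class TouchSample:
    x: float  # contact position in screen coordinates
    y: float
    t: float  # timestamp in seconds

def classify_touch(track: list[TouchSample]) -> str:
    """Classify a single-finger contact track as a tap, swipe, or drag.

    Hypothetical thresholds: a nearly stationary contact is a tap; a fast
    displaced contact is a swipe; a slow displaced contact is a drag.
    """
    if len(track) < 2:
        return "tap"
    distance = math.hypot(track[-1].x - track[0].x, track[-1].y - track[0].y)
    duration = track[-1].t - track[0].t
    if distance < 10.0:                               # pixels: essentially no movement
        return "tap"
    if duration > 0 and distance / duration > 500.0:  # px/s: fast flick
        return "swipe"
    return "drag"

def classify_two_finger(track_a: list[TouchSample], track_b: list[TouchSample]) -> str:
    """Classify two simultaneous tracks as a pinch by whether the fingers
    converge or diverge between the first and last samples."""
    start = math.hypot(track_a[0].x - track_b[0].x, track_a[0].y - track_b[0].y)
    end = math.hypot(track_a[-1].x - track_b[-1].x, track_a[-1].y - track_b[-1].y)
    return "pinch-in" if end < start else "pinch-out"
```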
In addition to detecting touch or near-touch inputs using one or more touch input devices (e.g., a touchpad or touch screen), a system may also detect the location and movement of an object at a distance from the system's surface by incorporating one or more sensor or input devices. For example, a proximity sensor may detect the presence of nearby objects without physical contact. As another example, a camera capturing substantially real-time video may determine a distance and angle of an object (relative to the camera) based on a focus distance and angle associated with the object. This disclosure contemplates any suitable sensors for detecting the location and movement of an object touching or at a distance from the system's surface. A system incorporating one or more touch input devices, proximity sensors, or cameras may determine the location and movement of such an object based on measurements of the object by those devices (e.g., by using triangulation techniques). By continuously monitoring the touch input devices, proximity sensors, or cameras, the system may determine a three-dimensional trajectory of a moving object. A user may thus provide inputs to the system by performing three-dimensional gestures. For example, a three-dimensional gesture may be the user's fingertip touching a front surface of the system and then pulling away from that surface. When detecting such a three-dimensional user input, the system may determine a three-dimensional trajectory (e.g., of the user's fingertip) and determine whether that trajectory corresponds to one or more three-dimensional gestures. The system may comprise a three-dimensional gesture library containing three-dimensional input modules or computer program code for calculating and interpreting three-dimensional input trajectories (detected by the touch input devices, proximity sensors, or cameras) as three-dimensional gestures. A program running on the system can detect and process three-dimensional gestures by subscribing as a listener to the three-dimensional input modules in the three-dimensional gesture library.
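A minimal sketch of how one such three-dimensional input module might recognize the touch-then-pull-away gesture just described, assuming a simple (x, y, z, t) sample format and illustrative distance thresholds (neither is specified by this disclosure):

```python
from dataclasses import dataclass

@dataclass
class Sample3D:
    x: float  # position across the device surface
    y: float
    z: float  # distance from the surface (0 = touching)
    t: float  # timestamp in seconds

def recognize_pull(trajectory: list[Sample3D],
                   touch_z: float = 1.0,
                   pull_z: float = 30.0) -> bool:
    """Return True if the trajectory begins in contact with the surface
    (z near zero) and ends pulled away beyond pull_z millimeters,
    moving away from the surface throughout (allowing sensor jitter)."""
    if len(trajectory) < 2:
        return False
    starts_touching = trajectory[0].z <= touch_z
    ends_away = trajectory[-1].z >= pull_z
    moving_away = all(b.z >= a.z - touch_z
                      for a, b in zip(trajectory, trajectory[1:]))
    return starts_touching and ends_away and moving_away
```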
Mobile device 200 may comprise one or more proximity sensors 202 disposed on the front side and the back side of the device's housing as illustrated in
Mobile device 200 may comprise one or more cameras 203 disposed on the front side and the back side of the device's housing as illustrated in
Mobile device 200 may comprise a touch gesture library containing touch event modules or computer code that can recognize touch inputs and determine one or more corresponding touch events or gestures (e.g., tap, drag, swipe, pinch). One or more applications hosted by mobile device 200 may be configured to detect and respond to one or more touch events or gestures by subscribing as listeners to touch event modules in the touch gesture library.
Mobile device 200 may comprise a three-dimensional gesture library containing three-dimensional input modules or computer program code that can recognize three-dimensional inputs, and determine one or more corresponding three-dimensional gestures. One or more applications hosted by mobile device 200 may be configured to detect and respond to one or more three-dimensional inputs by subscribing as listeners to three-dimensional input modules in the three-dimensional gesture library.
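The subscription model described in the preceding two paragraphs resembles a conventional observer pattern. A minimal sketch, with hypothetical class and method names (GestureLibrary, subscribe, dispatch):

```python
from collections import defaultdict
from typing import Callable

class GestureLibrary:
    """Dispatches recognized gestures to applications that subscribed as listeners."""

    def __init__(self):
        self._listeners: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, gesture: str, callback: Callable) -> None:
        """Register an application callback for a gesture type (e.g., 'pull')."""
        self._listeners[gesture].append(callback)

    def dispatch(self, gesture: str, **event) -> None:
        """Called by the recognition layer when a gesture is detected."""
        for callback in self._listeners[gesture]:
            callback(**event)

# A hosted application subscribing as a listener:
library = GestureLibrary()
library.subscribe("pull", lambda distance: print(f"pull gesture, {distance} mm"))
library.dispatch("pull", distance=25.0)
```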
Particular embodiments describe methods for providing user inputs with three-dimensional gestures. Particular embodiments may detect and recognize a three-dimensional gesture, determine a user input based on the three-dimensional gesture, and execute one or more actions based on the user input.
In particular embodiments, the application may determine a user input based on the three-dimensional gestures (STEP 302). In particular embodiments, the application may execute one or more actions based on the user input (STEP 303). For example, the application may be a music player application running on mobile device 200. The music player application may display in touch display 201 user interface icons such as speaker volume adjustment 405, play 406, pause 407, and stop 408, as illustrated in
In other embodiments, the pulling gesture illustrated by the arrow of
In one embodiment, the application may adjust a user-controllable parameter without selecting a user interface object. For example, an operating system of mobile device 200 may adjust screen brightness of touch display 201 based on a user input of the pulling gesture illustrated in
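A minimal sketch of how a detected pull-away distance might be mapped to a bounded user-controllable parameter such as screen brightness or speaker volume; the ranges and names below are illustrative assumptions, not the disclosure's implementation:

```python
def map_pull_to_parameter(pull_mm: float,
                          max_pull_mm: float = 50.0,
                          param_min: float = 0.0,
                          param_max: float = 1.0) -> float:
    """Linearly map a pull-away distance onto a parameter value, clamped
    to [param_min, param_max] (e.g., 0.0-1.0 screen brightness)."""
    fraction = max(0.0, min(1.0, pull_mm / max_pull_mm))
    return param_min + fraction * (param_max - param_min)

# Pulling a fingertip 25 mm from the surface sets brightness to 50%:
print(map_pull_to_parameter(25.0))  # 0.5
```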
Note that the pick-up-move-drop-down gesture illustrated in
This disclosure contemplates any suitable surface of mobile device 200 where a pick-up-move-drop-down gesture starts and ends. For example, a user may perform a pick-up-move-drop-down gesture by performing a pinching gesture at a first location on the back-side surface of mobile device 200, a pulling gesture away from the first location, a movement away from and across the back-side surface toward a second location on the back-side surface, and a dropping gesture on the back-side surface at the second location. An application (or an operating system) running on mobile device 200 may detect and recognize the pick-up-move-drop-down gesture by subscribing as a listener to the three-dimensional input modules as described earlier (STEP 301). The application may determine a drag-and-drop user input based on the pick-up-move-drop-down gesture (STEP 302). The application may identify a first point on touch screen 201 that is opposite the first location on the back-side surface, and a second point on touch screen 201 that is opposite the second location. The application may, based on the drag-and-drop user input, drag and drop a user interface object displayed in touch screen 201 from the first point to the second point (STEP 303).
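A minimal sketch of the coordinate mapping and dispatch such an application might perform, assuming the back-side surface's x axis mirrors the front touch screen's x axis; all names and conventions here are illustrative:

```python
def back_to_front(x_back: float, y_back: float, screen_width: float) -> tuple:
    """Map a back-surface contact location to the point directly opposite
    on the front touch screen, assuming both surfaces share a y axis and
    the x axis is mirrored (the back surface faces the other way)."""
    return (screen_width - x_back, y_back)

def handle_pick_up_move_drop(pick, drop, screen_width, move_object):
    """Translate a pick-up-move-drop-down gesture into a drag-and-drop input.

    pick, drop: (x, y) contact locations on the back-side surface.
    move_object: application callback that drags an on-screen object.
    """
    first_point = back_to_front(*pick, screen_width)   # interpret input (STEP 302)
    second_point = back_to_front(*drop, screen_width)
    move_object(first_point, second_point)             # execute action (STEP 303)

handle_pick_up_move_drop((100, 200), (300, 200), 720,
                         lambda a, b: print(f"drag {a} -> {b}"))
```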
Particular embodiments may repeat the steps of the method of
In particular embodiments, an application (or an operating system of the computing device) may store, in a local storage of the computing device, a user preference file comprising user-specific data for the feature of user inputs with three-dimensional gestures as illustrated by the example method of
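Such a user preference file might be persisted as a small JSON document in local storage. A minimal sketch, with purely illustrative keys and file location:

```python
import json
from pathlib import Path

PREFS_PATH = Path("user_prefs.json")  # hypothetical local-storage location

def save_prefs(enabled: bool, sensitivity: float) -> None:
    """Persist user-specific settings for the three-dimensional gesture feature."""
    PREFS_PATH.write_text(json.dumps({
        "3d_gestures_enabled": enabled,
        "pull_sensitivity": sensitivity,
    }))

def load_prefs() -> dict:
    """Load stored preferences, falling back to defaults on first run."""
    if PREFS_PATH.exists():
        return json.loads(PREFS_PATH.read_text())
    return {"3d_gestures_enabled": True, "pull_sensitivity": 1.0}
```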
In some embodiments, the feature of user inputs with three-dimensional gestures (as illustrated by the example method of
A social-networking system, such as a social-networking website, may enable its users to interact with it and with each other through it. The social-networking system may create and store a record (such as a user profile) associated with the user. The user profile may include demographic information on the user, communication-channel information for the user, and personal interests of the user. The social-networking system may also create and store a record of the user's relationships with other users in the social-networking system (e.g. a social graph), as well as provide social-networking services (e.g. wall-posts, photo-sharing, or instant-messaging) to facilitate social interaction between or among users in the social-networking system.
A social-networking system may store records of users and relationships between users in a social graph comprising a plurality of nodes and a plurality of edges connecting the nodes. The nodes may comprise a plurality of user nodes and a plurality of concept nodes. A user node of the social graph may correspond to a user of the social-networking system. A user may be an individual (human user), an entity (e.g., an enterprise, business, or third-party application), or a group (e.g., of individuals or entities). A user node corresponding to a user may comprise information provided by the user and information gathered by various systems, including the social-networking system. For example, the user may provide his name, profile picture, city of residence, contact information (e.g., a phone number, an email address), birth date, gender, marital status, family status, employment, education background, preferences, interests, and other demographic information to be included in the user node. Each user node of the social graph may correspond to a web page (typically known as a profile page). For example, in response to a request including a user name, the social-networking system can access a user node corresponding to the user name and construct a profile page including the name, a profile picture, and other information associated with the user. A concept node may correspond to a concept of the social-networking system. For example, a concept can represent a real-world entity, such as a movie, a song, a sports team, a celebrity, a restaurant, or a place or location. An administrative user of a concept node corresponding to a concept may create the concept node by providing information of the concept (e.g., by filling out an online form), causing the social-networking system to create a concept node comprising information associated with the concept. For example and without limitation, information associated with a concept can include a name or a title, one or more images (e.g., an image of the cover page of a book), a website (e.g., a URL), or contact information (e.g., a phone number, an email address). Each concept node of the social graph may correspond to a web page. For example, in response to a request including a name, the social-networking system can access a concept node corresponding to the name and construct a web page including the name and other information associated with the concept. An edge between a pair of nodes may represent a relationship between the pair of nodes. For example, an edge between two user nodes can represent a friendship between two users. For example, the social-networking system may construct a web page (or a structured document) of a concept node (e.g., a restaurant, a celebrity), incorporating one or more selectable buttons (e.g., “like”, “check in”) in the web page. A user can access the page using a web browser hosted by the user's client device and select a selectable button, causing the client device to transmit to the social-networking system a request to create an edge between a user node of the user and a concept node of the concept, indicating a relationship between the user and the concept (e.g., the user checks in at a restaurant, or the user likes a celebrity). In addition, the degree of separation between any two nodes is defined as the minimum number of hops required to traverse the social graph from one node to the other.
A degree of separation between two nodes can be considered a measure of relatedness between the users or the concepts represented by the two nodes in the social graph.
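Because the degree of separation is the minimum hop count, it can be computed with a breadth-first search over the social graph. A minimal sketch, assuming a simple adjacency-mapping representation of the graph:

```python
from collections import deque

def degree_of_separation(graph: dict, source: str, target: str) -> int:
    """Return the minimum number of edges between two nodes in a social
    graph given as an adjacency mapping, or -1 if no path exists."""
    if source == target:
        return 0
    visited = {source}
    queue = deque([(source, 0)])
    while queue:
        node, hops = queue.popleft()
        for neighbor in graph.get(node, ()):
            if neighbor == target:
                return hops + 1
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append((neighbor, hops + 1))
    return -1

# User nodes "A", "B" and concept node "Macy's":
graph = {"A": ["B"], "B": ["A", "Macy's"], "Macy's": ["B"]}
print(degree_of_separation(graph, "A", "Macy's"))  # 2
```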
A social-networking system may support a variety of applications, such as photo sharing, on-line calendars and events, instant messaging, and advertising. For example, the social-networking system may also include media sharing capabilities. For example, the social-networking system may allow users to post photographs and other multimedia files to a user's profile page (typically known as wall posts) or in a photo album, both of which may be accessible to other users of the social-networking system. The social-networking system may also allow users to configure events. For example, a first user may configure an event with attributes including time and date of the event, location of the event and other users invited to the event. The invited users may receive invitations to the event and respond (such as by accepting the invitation or declining it). Furthermore, the social-networking system may allow users to maintain a personal calendar. Similarly to events, the calendar entries may include times, dates, locations and identities of other users.
In particular embodiments, the social-networking system may comprise one or more computing devices (e.g., servers) hosting functionality directed to operation of the social-networking system. In particular embodiments, one or more of data stores 601 may be operably connected to the social-networking system's front end 620. A user of the social-networking system may access the social-networking system using a client device such as client device 622. In particular embodiments, front end 620 may interact with client device 622 through network cloud 621. For example, front end 620 may be implemented in software programs hosted by one or more computing devices of the social-networking system. Front end 620 may include web or HTTP server functionality, as well as other functionality, to allow users to access the social-networking system.
Client device 622 may be a desktop computer, laptop computer, tablet computer, personal digital assistant (PDA), in- or out-of-car navigation system, smart phone or other cellular or mobile phone, or mobile gaming device, among other suitable computing devices. Client device 622 may execute one or more client applications, such as a web browser (e.g., Microsoft Windows Internet Explorer, Mozilla Firefox, Apple Safari, Google Chrome, or Opera) or a special-purpose client application (e.g., Facebook for iPhone), to access and view content over a computer network.
Network cloud 621 may represent a network or collection of networks (such as the Internet, a corporate intranet, a virtual private network, a local area network, a wireless local area network, a cellular network, a wide area network, a metropolitan area network, or a combination of two or more such networks) over which client devices 622 may access the social-networking system.
In particular embodiments, the social-networking system may store in data stores 601 data associated with applications and services provided by the social-networking system. In particular embodiments, the social-networking system may store user event data in data stores 601. For example, a user may register a new event by accessing a client application to define an event name, a time and a location, and cause the newly created event to be stored (e.g., as a concept node) in data stores 601. For example, a user may register with an existing event by accessing a client application to confirm attendance at the event, and cause the confirmation to be stored in data stores 601. For example, the social-networking system may store the confirmation by creating an edge in a social graph between a user node corresponding to the user and a concept node corresponding to the event, and store the edge in data stores 601.
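Storing such a confirmation amounts to writing a typed edge between a user node and an event concept node. A minimal in-memory sketch standing in for data stores 601, with hypothetical record structures:

```python
from dataclasses import dataclass, field

@dataclass
class SocialGraph:
    """Toy in-memory stand-in for data stores 601: nodes plus typed edges."""
    nodes: set = field(default_factory=set)
    edges: list = field(default_factory=list)

    def add_node(self, node_id: str) -> None:
        self.nodes.add(node_id)

    def add_edge(self, a: str, b: str, kind: str) -> None:
        """Store a typed edge, e.g., an 'attending' edge between a user
        node and an event concept node."""
        self.edges.append((a, b, kind))

store = SocialGraph()
store.add_node("user:alice")
store.add_node("event:launch-party")  # the newly registered event
store.add_edge("user:alice", "event:launch-party", "attending")
```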
As described earlier, an edge between a pair of nodes may indicate a direct relationship between the pair of nodes. It is also desirable to determine the likelihood of a relationship or an interest between a pair of nodes that are two or more hops away. For example, the social-networking system may provide (e.g., via an email or a wall-post) a recommendation (e.g., an advertisement) for “Macy's” to user “B”, given the direct relationship represented by the edge between the user node “B” and the concept node “Macy's” as illustrated in
In particular embodiments, computer system 800 includes a processor 802, memory 804, storage 806, an input/output (I/O) interface 808, a communication interface 810, and a bus 812. In particular embodiments, processor 802 includes hardware for executing instructions, such as those making up a computer program. As an example and not by way of limitation, to execute instructions, processor 802 may retrieve (or fetch) the instructions from an internal register, an internal cache, memory 804, or storage 806; decode and execute them; and then write one or more results to an internal register, an internal cache, memory 804, or storage 806. In particular embodiments, processor 802 may include one or more internal caches for data, instructions, or addresses. In particular embodiments, memory 804 includes main memory for storing instructions for processor 802 to execute or data for processor 802 to operate on. As an example and not by way of limitation, computer system 800 may load instructions from storage 806 to memory 804. Processor 802 may then load the instructions from memory 804 to an internal register or internal cache. To execute the instructions, processor 802 may retrieve the instructions from the internal register or internal cache and decode them. During or after execution of the instructions, processor 802 may write one or more results (which may be intermediate or final results) to the internal register or internal cache. Processor 802 may then write one or more of those results to memory 804. One or more memory buses (which may each include an address bus and a data bus) may couple processor 802 to memory 804. Bus 812 may include one or more memory buses, as described below. In particular embodiments, one or more memory management units (MMUs) reside between processor 802 and memory 804 and facilitate accesses to memory 804 requested by processor 802. In particular embodiments, memory 804 includes random access memory (RAM). This RAM may be volatile memory, where appropriate. Where appropriate, this RAM may be dynamic RAM (DRAM) or static RAM (SRAM).
In particular embodiments, storage 806 includes mass storage for data or instructions. As an example and not by way of limitation, storage 806 may include an HDD, a floppy disk drive, flash memory, an optical disc, a magneto-optical disc, magnetic tape, or a Universal Serial Bus (USB) drive or a combination of two or more of these. Storage 806 may include removable or non-removable (or fixed) media, where appropriate. Storage 806 may be internal or external to computer system 800, where appropriate. In particular embodiments, storage 806 is non-volatile, solid-state memory. In particular embodiments, storage 806 includes read-only memory (ROM). Where appropriate, this ROM may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), or flash memory or a combination of two or more of these.
In particular embodiments, I/O interface 808 includes hardware, software, or both providing one or more interfaces for communication between computer system 800 and one or more I/O devices. Computer system 800 may include one or more of these I/O devices, where appropriate. One or more of these I/O devices may enable communication between a person and computer system 800. As an example and not by way of limitation, an I/O device may include a keyboard, microphone, display, touch screen, mouse, speaker, camera, another suitable I/O device or a combination of two or more of these. An I/O device may include one or more sensors. This disclosure contemplates any suitable I/O devices and any suitable I/O interfaces 808 for them. Where appropriate, I/O interface 808 may include one or more device or software drivers enabling processor 802 to drive one or more of these I/O devices. I/O interface 808 may include one or more I/O interfaces 808, where appropriate. Although this disclosure describes and illustrates a particular I/O interface, this disclosure contemplates any suitable I/O interface.
In particular embodiments, communication interface 810 includes hardware, software, or both providing one or more interfaces for communication (such as, for example, packet-based communication) between computer system 800 and one or more other computer systems 800 or one or more networks. As an example and not by way of limitation, communication interface 810 may include a network interface controller (NIC) for communicating with an Ethernet or other wire-based network or a wireless NIC (WNIC) for communicating with a wireless network, such as a WI-FI network. This disclosure contemplates any suitable network and any suitable communication interface 810 for it. As an example and not by way of limitation, computer system 800 may communicate with an ad hoc network, a personal area network (PAN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), or one or more portions of the Internet or a combination of two or more of these. One or more portions of one or more of these networks may be wired or wireless. As an example, computer system 800 may communicate with a wireless PAN (WPAN) (e.g., a BLUETOOTH WPAN), a WI-FI network (e.g., an 802.11a/b/g/n WI-FI network), a WI-MAX network, a cellular network (e.g., a Global System for Mobile Communications (GSM) network, a Long Term Evolution (LTE) network), or other suitable wireless network or a combination of two or more of these.
In particular embodiments, bus 812 includes hardware, software, or both coupling components of computer system 800 to each other. As an example and not by way of limitation, bus 812 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a front-side bus (FSB), a HYPERTRANSPORT (HT) interconnect, an INFINIBAND interconnect, a low-pin-count (LPC) bus, a memory bus, a Peripheral Component Interconnect Express or PCI-Express bus, a serial advanced technology attachment (SATA) bus, an Inter-Integrated Circuit (I2C) bus, a Secure Digital (SD) memory interface, a Secure Digital Input Output (SDIO) interface, a Universal Serial Bus (USB) bus, a General Purpose Input/Output (GPIO) bus, or another suitable bus or a combination of two or more of these. Bus 812 may include one or more buses 812, where appropriate.
Herein, a computer-readable non-transitory storage medium or media may include one or more semiconductor-based or other integrated circuits (ICs) (such as, for example, field-programmable gate arrays (FPGAs) or application-specific ICs (ASICs)), hard disk drives (HDDs), hybrid hard drives (HHDs), optical discs, optical disc drives (ODDs), magneto-optical discs, magneto-optical drives, floppy diskettes, floppy disk drives (FDDs), magnetic tapes, solid-state drives (SSDs), RAM-drives, SECURE DIGITAL cards or drives, any other suitable computer-readable non-transitory storage medium or media, or any suitable combination of two or more of these, where appropriate. A computer-readable non-transitory storage medium or media may be volatile, non-volatile, or a combination of volatile and non-volatile, where appropriate.
Herein, “or” is inclusive and not exclusive, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, “A or B” means “A, B, or both,” unless expressly indicated otherwise or indicated otherwise by context. Moreover, “and” is both joint and several, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, “A and B” means “A and B, jointly or severally,” unless expressly indicated otherwise or indicated otherwise by context.
This disclosure encompasses all changes, substitutions, variations, alterations, and modifications to the example embodiments herein that a person having ordinary skill in the art would comprehend. Moreover, although this disclosure describes and illustrates respective embodiments herein as including particular components, elements, functions, operations, or steps, any of these embodiments may include any combination or permutation of any of the components, elements, functions, operations, or steps described or illustrated anywhere herein that a person having ordinary skill in the art would comprehend. Furthermore, reference in the appended claims to an apparatus or system or a component of an apparatus or system being adapted to, arranged to, capable of, configured to, enabled to, operable to, or operative to perform a particular function encompasses that apparatus, system, component, whether or not it or that particular function is activated, turned on, or unlocked, as long as that apparatus, system, or component is so adapted, arranged, capable, configured, enabled, operable, or operative.
This application is a continuation under 35 U.S.C. § 120 of U.S. patent application Ser. No. 13/557,868, filed 25 Jul. 2012.
References Cited: U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
20110164029 | King | Jul 2011 | A1
20120071892 | Itkowitz | Mar 2012 | A1
Prior Publication Data

Number | Date | Country
---|---|---
20170060257 A1 | Mar 2017 | US
Related U.S. Application Data

Relation | Number | Date | Country
---|---|---|---
Parent | 13557868 | Jul 2012 | US
Child | 15347577 | | US