Real-time display of multiple images

Information

  • Patent Grant
  • Patent Number
    10,862,951
  • Date Filed
    Wednesday, October 30, 2019
  • Date Issued
    Tuesday, December 8, 2020
Abstract
A user can share (show) multimedia information while simultaneously communicating (telling) with one or more other users over a network. Multimedia information is received from at least one source. The multimedia information may be manually and/or automatically annotated and shared with other users. The multimedia information may be displayed in an integrated live view simultaneously with other modes of communication, such as video, voice, or text. A simultaneous sharing communication interface provides an immersive experience that lets a user communicate via text, voice, video, sounds, music, or the like, with one or more other users while also simultaneously sharing media such as photos, videos, movies, images, graphics, illustrations, animations, presentations, narratives, music, sounds, applications, files, and the like. The simultaneous sharing interface enables a user to experience a higher level of intimacy in their communication with others over a network.
Description
TECHNICAL FIELD

The present invention relates generally to communications and, more particularly, but not exclusively to enabling a user to simultaneously share multimedia information, while communicating with one or more other users.


BACKGROUND

Sharing of multimedia information has become prevalent on computing devices and has changed our everyday lives. Mobile devices, such as digital cameras, video recorders, PDAs, and cell-phones, have increasingly become enabled with wireless data connectivity. Users are able to send and receive multimedia information from these mobile devices more readily. However, users cannot easily identify relevant sources of, and recipients for, their multimedia information.


Tremendous changes have also been occurring in the Internet that influence our everyday lives. For example, online social networks have become the new meeting grounds. They have been called the new power lunch tables and new golf courses for business life in the U.S. Moreover, many people are using such online social networks to reconnect themselves to their friends, their neighborhood, their community, and the world.


The development of such online social networks touches countless aspects of our everyday lives, providing instant access to people of similar mindsets, and enabling us to form partnerships with more people in more ways than ever before.


One aspect of our everyday lives that may benefit from multimedia information sharing is improved communication between people in remote locations. In particular, users would like to feel a sense of intimacy or immediacy in their multimedia online communication. Therefore, it is with respect to these considerations and others that the present invention has been made.





BRIEF DESCRIPTION OF THE DRAWINGS

Non-limiting and non-exhaustive embodiments of the present invention are described with reference to the following drawings. In the drawings, like reference numerals refer to like parts throughout the various figures unless otherwise specified.


For a better understanding of the present invention, reference will be made to the following Detailed Description, which is to be read in association with the accompanying drawings, wherein:



FIG. 1 is a system diagram of one embodiment of an environment in which the invention may be practiced;



FIG. 2 shows one embodiment of a mobile device that may be included in a system implementing the invention;



FIG. 3 shows one embodiment of a network device that may be included in a system implementing the invention;



FIGS. 4-14 illustrate screenshots of the novel features for managing a multimedia communication; and



FIG. 15 illustrates one embodiment of a flow diagram of a process for managing a multimedia communication, in accordance with the invention.





DETAILED DESCRIPTION

The present invention now will be described more fully hereinafter with reference to the accompanying drawings, which form a part hereof, and which show, by way of illustration, specific embodiments by which the invention may be practiced. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Among other things, the present invention may be embodied as methods or devices. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. The following detailed description is, therefore, not to be taken in a limiting sense.


Throughout the specification and claims, the following terms take the meanings explicitly associated herein, unless the context clearly dictates otherwise. The phrase “in one embodiment” as used herein does not necessarily refer to the same embodiment, though it may. Furthermore, the phrase “in another embodiment” as used herein does not necessarily refer to a different embodiment, although it may. Thus, as described below, various embodiments of the invention may be readily combined, without departing from the scope or spirit of the invention.


In addition, as used herein, the term “or” is an inclusive “or” operator, and is equivalent to the term “and/or,” unless the context clearly dictates otherwise. The term “based on” is not exclusive and allows for being based on additional factors not described, unless the context clearly dictates otherwise. In addition, throughout the specification, the meaning of “a,” “an,” and “the” include plural references. The meaning of “in” includes “in” and “on.”


As used herein, the terms “social network” and “social community” refer to a concept that an individual's personal network of friends, family, colleagues, coworkers, and the subsequent connections within those networks, can be utilized to find more relevant connections for a variety of activities, including, but not limited to, dating, job networking, service referrals, content sharing, like-minded individuals, activity partners, or the like.


An online social network typically comprises a person's set of direct and/or indirect personal relationships, including real and virtual privileges and permissions that users may associate with these people. Direct personal relationships usually include relationships with people the user can communicate with directly, including family members, friends, colleagues, coworkers, and other people with whom the person has had some form of direct contact, such as contact in person, by telephone, by email, by instant message, by letter, or the like. These direct personal relationships are sometimes referred to as first-degree relationships. First-degree relationships can have varying degrees of closeness, trust, and other characteristics.


Indirect personal relationships typically include relationships through first-degree relationships to people with whom a person has not had direct contact, or has had only limited direct contact, such as being cc'd on an e-mail message, or the like. For example, a friend of a friend represents an indirect personal relationship. A more extended, indirect relationship might be a friend of a friend of a friend. These indirect relationships are sometimes characterized by a degree of separation between the people. For instance, a friend of a friend can be characterized as two degrees of separation or a second-degree relationship. Similarly, a friend of a friend of a friend can be characterized as three degrees of separation or a third-degree relationship.
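The degree-of-separation relationships described above can be sketched as a breadth-first search over a friendship graph. The graph, names, and function below are purely illustrative, not part of the disclosed system:

```python
from collections import deque

def degrees_of_separation(graph, start, target):
    """Breadth-first search over a friendship graph, returning the number
    of hops between two people (1 = direct friend / first-degree,
    2 = friend of a friend / second-degree, ...), or None if unconnected."""
    if start == target:
        return 0
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        person, depth = queue.popleft()
        for friend in graph.get(person, ()):
            if friend == target:
                return depth + 1
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, depth + 1))
    return None

# Hypothetical friendship graph: Alice - Bob - Carol - Dave
friends = {
    "alice": ["bob"],
    "bob": ["alice", "carol"],
    "carol": ["bob", "dave"],
    "dave": ["carol"],
}
```

Under this sketch, a direct friend is one degree of separation, a friend of a friend is two degrees, and a friend of a friend of a friend is three, matching the characterization above.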


As used herein, “live” or a “live view” refers to providing of real-time or approximately real-time data. It is recognized that due to a variety of reasons, transfer of data over a network may be delayed by some varying amount of time. The delay may vary based on conditions of the network, configurations of the network, configuration of the sending and/or receiving device, or the like. Thus, live or a live view may range between real-time data transfers to some varying amount of time delay.


Moreover, the term “social networking information” refers to both dynamic as well as less dynamic characteristics of a social network. Social networking information includes various profile information about a member, including, but not limited to, the member's avatar, contact information, the member's preferences, degrees of separation between the member and another member, a membership in an activity, group, or the like.


Social networking information further may include various information about communications between the member and other members in the social network, including, but not limited to, emails, SMS messages, IM messages, Multimedia Message Service (MMS) messages, alerts, audio messages, phone calls, either received or sent by the member, or the like.


Various “meta-data” may also be associated with the social networking information. Thus, for example, various permissions for access may be associated with at least some of the social networking information. Some access permissions (or sharing rules) may be selected, for example, based, in part, on an input by the member, while other access permissions may be defaulted based on other events, constraints, or the like.
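As a rough sketch, such per-item access permissions (sharing rules) might be modeled as follows. The field names, the degree-of-separation rule, and the defaulting behavior are assumptions for illustration, not the patent's design:

```python
def can_access(item, viewer, viewer_degree, default_max_degree=1):
    """Illustrative sharing rule: an item is visible if the viewer is
    explicitly allowed, or falls within the owner's maximum degree of
    separation. When the owner set no explicit rule, a default maximum
    degree applies (a hypothetical defaulting policy)."""
    if viewer in item.get("allowed", set()):
        return True
    max_degree = item.get("max_degree", default_max_degree)
    return viewer_degree <= max_degree

# Hypothetical item: explicitly shared with "bob", otherwise visible
# to anyone within two degrees of separation.
photo_permissions = {"allowed": {"bob"}, "max_degree": 2}
```

A member-supplied rule (the explicit `allowed` set or `max_degree`) takes precedence; the default applies only when no rule was selected, mirroring the selected-versus-defaulted permissions described above.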


The term “multimedia information” as used herein refers to information comprising visual and/or audio information. Multimedia information may include images, video, movies, presentations, animations, illustrations, narratives, music, sound effects, voice, or the like. As used herein, the term “annotation” refers to marking and/or modifying information with other information.


As used herein, the term “retrieving” refers to the process of sending a request for information, and receiving the information. The request may include a search query, an identifier for the information, or the like. As used herein, the term “searching” refers to providing a query based on at least one criteria. The criteria may include a search term, a name, a Uniform Resource Locator (URL), a date/timestamp, a part or combination thereof, or the like. As used herein, the term “sharing” refers to sending information between one user and at least one other user.


Briefly stated, the present invention is directed towards enabling a user to share multimedia information while simultaneously communicating (telling) with one or more other users. Multimedia information is received from at least one source. The multimedia information may be manually and/or automatically annotated and shared with other users. The multimedia information may be displayed in an integrated live view simultaneously with other modes of communication, such as video, voice, music, sounds, or text. In one embodiment, a simultaneous sharing communication interface provides an immersive experience that lets a user communicate via text, voice, video, sounds, music, or the like, with one or more other users while also simultaneously sharing media such as photos, videos, movies, images, graphics, illustrations, animations, presentations, narratives, music, sounds, applications, files, and the like. The simultaneous sharing interface enables a user to experience a higher level of intimacy in their communication with others over a network.


Illustrative Operating Environment



FIG. 1 shows components of one embodiment of an environment in which the invention may be practiced. Not all the components may be required to practice the invention, and variations in the arrangement and type of the components may be made without departing from the spirit or scope of the invention. As shown, system 100 of FIG. 1 includes local area networks (“LANs”)/wide area networks (“WANs”) (network) 105, wireless network 110, Multimedia Sharing Service (MSS) 106, mobile devices (client devices) 102-104, client device 101, and information services 107.


One embodiment of mobile devices 102-104 is described in more detail below in conjunction with FIG. 2. Generally, however, mobile devices 102-104 may include virtually any portable computing device capable of receiving and sending a message over a network, such as network 105, wireless network 110, or the like. Mobile devices 102-104 may also be described generally as client devices that are configured to be portable. Thus, mobile devices 102-104 may include virtually any portable computing device capable of connecting to another computing device and receiving information. Such devices include portable devices such as cellular telephones, smart phones, display pagers, radio frequency (RF) devices, infrared (IR) devices, Personal Digital Assistants (PDAs), handheld computers, laptop computers, wearable computers, tablet computers, integrated devices combining one or more of the preceding devices, and the like. As such, mobile devices 102-104 typically range widely in terms of capabilities and features. For example, a cell phone may have a numeric keypad and a few lines of monochrome LCD display on which only text may be displayed. In another example, a web-enabled mobile device may have a touch sensitive screen, a stylus, and several lines of color LCD display in which both text and graphics may be displayed.


A web-enabled mobile device may include a browser application that is configured to receive and to send web pages, web-based messages, and the like. The browser application may be configured to receive and display graphics, text, multimedia, and the like, employing virtually any web-based language, including Wireless Application Protocol (WAP) messages, and the like. In one embodiment, the browser application is enabled to employ Handheld Device Markup Language (HDML), Wireless Markup Language (WML), WMLScript, JavaScript, Standard Generalized Markup Language (SGML), HyperText Markup Language (HTML), eXtensible Markup Language (XML), and the like, to display and send a message.


Mobile devices 102-104 also may include at least one other client application that is configured to receive content from another computing device. The client application may include a capability to provide and receive textual content, multimedia information, and the like. The client application may further provide information that identifies itself, including a type, capability, name, and the like. In one embodiment, mobile devices 102-104 may uniquely identify themselves through any of a variety of mechanisms, including a phone number, Mobile Identification Number (MIN), an electronic serial number (ESN), or other mobile device identifier. The information may also indicate a content format that the mobile device is enabled to employ. Such information may be provided in a message, or the like, sent to MSS 106, client device 101, or other computing devices.
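A capability announcement of the kind described, carrying a device identifier and the content formats the device can render, might look like the following sketch. The message shape and field names are hypothetical:

```python
def build_capability_message(device_id, id_type, formats):
    """Illustrative capability announcement a mobile client might send
    in a message header: a device identifier (e.g., phone number, MIN,
    or ESN) plus the content formats the device is enabled to employ.
    The structure is an assumption for illustration only."""
    return {
        "device_id": device_id,
        "id_type": id_type,          # e.g. "phone", "MIN", "ESN"
        "accepts": sorted(formats),  # content formats the device supports
    }
```

Such a message could then be provided to MSS 106 or another device, letting the recipient tailor shared multimedia to what the client can display.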


Mobile devices 102-104 may also be configured to communicate a message, such as through Short Message Service (SMS), Multimedia Message Service (MMS), instant messaging (IM), internet relay chat (IRC), Mardam-Bey's IRC (mIRC), Jabber, and the like, with another computing device, such as MSS 106, client device 101, or the like. However, the present invention is not limited to these message protocols, and virtually any other message protocol may be employed.


Mobile devices 102-104 may be further configured to enable a user to participate in communications sessions, such as IM sessions. As such, mobile devices 102-104 may include a client application that is configured to manage various actions on behalf of the client device. For example, the client application may enable a user to interact with the browser application, email application, IM applications, SMS applications, and the like.


Mobile devices 102-104 may further be configured to include a client application that enables the end-user to log into an end-user account that may be managed by another computing device, such as MSS 106. Such end-user account, for example, may be configured to enable the end-user to receive emails, send/receive IM messages, SMS messages, access selected web pages, participate in a social networking activity, or the like. However, participation in various social networking activities may also be performed without logging into the end-user account.


In addition, mobile devices 102-104 may include another application that is configured to enable the mobile user to share and/or receive multimedia information, and to display integrated live views for providing the multimedia information. In one embodiment, each of mobile devices 102-104 may share with and/or receive the multimedia information from MSS 106 and/or from another one of mobile devices 102-104. In conjunction with sharing multimedia information, mobile devices 102-104 may enable an interaction with each other, through sharing various messages, and generally participating in a variety of integrated social experiences beyond merely voice communications or text messages (e.g., IM). In one embodiment, mobile devices 102-104 may enable the interaction with a user associated with received multimedia information. For example, a user of one of mobile devices 102-104 may send a comment on the multimedia information to another user of another one of mobile devices 102-104.


Mobile devices 102-104 may also communicate with non-mobile client devices, such as client device 101, or the like. In one embodiment, such communications may include participation in social networking activities, including sharing of multimedia information.


Client device 101 may include virtually any computing device capable of communicating over a network to send and receive information, including social networking information, or the like. The set of such devices may include devices that typically connect using a wired or wireless communications medium such as personal computers, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, or the like.


In one embodiment, client devices 101-104 may be configured to enable a communication between users over network 105. Client devices 101-104 may be configured to receive or otherwise retrieve multimedia information from, for example, a file system, a data store, MSS 106, information services 107 (e.g., via HTTP/HTTPS/FTP using a URL), or the like. Devices 101-104 may be further configured to annotate the multimedia information, and/or share the multimedia information simultaneous with sending a communication between the users, or the like.
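Retrieving multimedia information from either a URL or a local file system, as described above, can be sketched as a small dispatcher. This is a hedged illustration; an actual client could use any transport or source:

```python
from pathlib import Path
from urllib.parse import urlparse
from urllib.request import urlopen

def retrieve(source):
    """Illustrative retrieval dispatcher: fetch multimedia bytes from a
    URL over HTTP/HTTPS/FTP, or read them from a local file-system path.
    A real client might also pull from a data store or MSS 106."""
    scheme = urlparse(source).scheme
    if scheme in ("http", "https", "ftp"):
        with urlopen(source) as resp:    # remote fetch by URL
            return resp.read()
    return Path(source).read_bytes()     # local file system
```

The dispatcher keys off the URL scheme, so callers can pass either `"https://example.com/photo.jpg"` (a hypothetical URL) or a plain path and receive raw bytes in both cases.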


Wireless network 110 is configured to couple mobile devices 102-104 and their components with network 105. Wireless network 110 may include any of a variety of wireless sub-networks that may further overlay stand-alone ad-hoc networks, and the like, to provide an infrastructure-oriented connection for mobile devices 102-104. Such sub-networks may include mesh networks, Wireless LAN (WLAN) networks, cellular networks, and the like.


Wireless network 110 may further include an autonomous system of terminals, gateways, routers, and the like connected by wireless radio links, and the like. These connectors may be configured to move freely and randomly and organize themselves arbitrarily, such that the topology of wireless network 110 may change rapidly.


Wireless network 110 may further employ a plurality of access technologies including 2nd (2G) and 3rd (3G) generation radio access for cellular systems, WLAN, Wireless Router (WR) mesh, and the like. Access technologies such as 2G, 3G, and future access networks may enable wide area coverage for mobile devices, such as mobile devices 102-104, with various degrees of mobility. For example, wireless network 110 may enable a radio connection through a radio network access such as Global System for Mobile communication (GSM), General Packet Radio Services (GPRS), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (WCDMA), and the like. In essence, wireless network 110 may include virtually any wireless communication mechanism by which information may travel between mobile devices 102-104 and another computing device, network, and the like.


Network 105 is configured to couple MSS 106 and its components with other computing devices, including, mobile devices 102-104, client device 101, and through wireless network 110 to mobile devices 102-104. Network 105 is enabled to employ any form of computer readable media for communicating information from one electronic device to another. Also, network 105 can include the Internet in addition to local area networks (LANs), wide area networks (WANs), direct connections, such as through a universal serial bus (USB) port, other forms of computer-readable media, or any combination thereof. On an interconnected set of LANs, including those based on differing architectures and protocols, a router acts as a link between LANs, enabling messages to be sent from one to another. Also, communication links within LANs typically include twisted wire pair or coaxial cable, while communication links between networks may utilize analog telephone lines, full or fractional dedicated digital lines including T1, T2, T3, and T4, Integrated Services Digital Networks (ISDNs), Digital Subscriber Lines (DSLs), wireless links including satellite links, or other communications links known to those skilled in the art. Furthermore, remote computers and other related electronic devices could be remotely connected to either LANs or WANs via a modem and temporary telephone link. In essence, network 105 includes any communication method by which information may travel between MSS 106, client device 101, and other computing devices.


Additionally, communication media typically embodies computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave, data signal, or other transport mechanism and includes any information delivery media. The terms “modulated data signal” and “carrier-wave signal” include a signal that has one or more of its characteristics set or changed in such a manner as to encode information, instructions, data, and the like, in the signal. By way of example, communication media includes wired media such as twisted pair, coaxial cable, fiber optics, wave guides, and other wired media and wireless media such as acoustic, RF, infrared, and other wireless media.


One embodiment of MSS 106 is described in more detail below in conjunction with FIG. 3. Briefly, however, MSS 106 may include any computing device capable of connecting to network 105 to enable sharing of multimedia information based on user information and other social networking information. MSS 106 may receive from various participants in a social network, multimedia information, and social networking information, including information associated with activities, moods, events, messages, user information, communications, or the like. MSS 106 may also receive social networking information from a variety of other sources including, for example, information services 107. MSS 106 may store at least some of the received multimedia and/or social networking information for use by one or more social networking members. MSS 106 may also enable sharing and/or sending the multimedia information to another network device, including one of client devices 101-104 based on user information and/or other social networking information.


Devices that may operate as MSS 106 include personal computers, desktop computers, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, servers, and the like.


Although FIG. 1 illustrates MSS 106 as a single computing device, the invention is not so limited. For example, one or more functions of MSS 106 may be distributed across one or more distinct computing devices. For example, managing various social networking activities, including sharing of multimedia information, managing Instant Messaging (IM) sessions, SMS messages, email messages, sharing of contact information, aggregating and/or storing of social networking information, or the like, may be performed by a plurality of computing devices, without departing from the scope or spirit of the present invention.


Information services 107 represents a variety of service devices that may provide additional information for use in generating live views on mobile devices 102-104. Such services include, but are not limited to, web services, third-party services, audio services, video services, multimedia services, email services, IM services, SMS services, Video Conferencing services, VOIP services, calendaring services, multimedia information sharing services, or the like. Devices that may operate as information services 107 include personal computers, desktop computers, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, servers, mobile devices, and the like.


Illustrative Mobile Client Environment



FIG. 2 shows one embodiment of mobile device 200 that may be included in a system implementing the invention. Mobile device 200 may include many more or fewer components than those shown in FIG. 2. However, the components shown are sufficient to disclose an illustrative embodiment for practicing the present invention. Mobile device 200 may represent, for example, mobile devices 102-104 of FIG. 1.


As shown in the figure, mobile device 200 includes a processing unit (CPU) 222 in communication with a mass memory 230 via a bus 224. Mobile device 200 also includes a power supply 226, one or more network interfaces 250, an audio interface 252, video interface 259, a display 254, a keypad 256, an illuminator 258, an input/output interface 260, a haptic interface 262, and an optional global positioning systems (GPS) receiver 264. Power supply 226 provides power to mobile device 200. A rechargeable or non-rechargeable battery may be used to provide power. The power may also be provided by an external power source, such as an AC adapter or a powered docking cradle that supplements and/or recharges a battery.


Mobile device 200 may optionally communicate with a base station (not shown), or directly with another computing device. Network interface 250 includes circuitry for coupling mobile device 200 to one or more networks, and is constructed for use with one or more communication protocols and technologies including, but not limited to, global system for mobile communication (GSM), code division multiple access (CDMA), time division multiple access (TDMA), user datagram protocol (UDP), transmission control protocol/Internet protocol (TCP/IP), SMS, general packet radio service (GPRS), WAP, ultra wide band (UWB), IEEE 802.16 Worldwide Interoperability for Microwave Access (WiMax), SIP/RTP, or any of a variety of other wireless communication protocols. Network interface 250 is sometimes known as a transceiver, transceiving device, or network interface card (NIC).


Audio interface 252 is arranged to produce and receive audio signals such as the sound of a human voice. For example, audio interface 252 may be coupled to a speaker and microphone (not shown) to enable telecommunication with others and/or generate an audio acknowledgement for some action. Display 254 may be a liquid crystal display (LCD), gas plasma, light emitting diode (LED), or any other type of display used with a computing device. Display 254 may also include a touch sensitive screen arranged to receive input from an object such as a stylus or a digit from a human hand.


Video interface 259 is arranged to capture video images, such as a still photo, a video segment, an infrared video, or the like. For example, video interface 259 may be coupled to a digital video camera, a web-camera, or the like. Video interface 259 may comprise a lens, an image sensor, and other electronics. Image sensors may include a complementary metal-oxide-semiconductor (CMOS) integrated circuit, charge-coupled device (CCD), or any other integrated circuit for sensing light.


Keypad 256 may comprise any input device arranged to receive input from a user. For example, keypad 256 may include a push button numeric dial, or a keyboard. Keypad 256 may also include command buttons that are associated with selecting and sending images. Illuminator 258 may provide a status indication and/or provide light. Illuminator 258 may remain active for specific periods of time or in response to events. For example, when illuminator 258 is active, it may backlight the buttons on keypad 256 and stay on while the client device is powered. Also, illuminator 258 may backlight these buttons in various patterns when particular actions are performed, such as dialing another client device. Illuminator 258 may also cause light sources positioned within a transparent or translucent case of the client device to illuminate in response to actions.


Mobile device 200 also comprises input/output interface 260 for communicating with external devices, such as a headset, or other input or output devices not shown in FIG. 2. Input/output interface 260 can utilize one or more communication technologies, such as USB, infrared, Bluetooth™, or the like. Haptic interface 262 is arranged to provide tactile feedback to a user of the client device. For example, the haptic interface may be employed to vibrate mobile device 200 in a particular way when another user of a computing device is calling.


Optional GPS transceiver 264 can determine the physical coordinates of mobile device 200 on the surface of the Earth, typically output as latitude and longitude values. GPS transceiver 264 can also employ other geo-positioning mechanisms, including, but not limited to, triangulation, assisted GPS (AGPS), E-OTD, CI, SAI, ETA, BSS, or the like, to further determine the physical location of mobile device 200 on the surface of the Earth. It is understood that under different conditions, GPS transceiver 264 can determine a physical location within millimeters for mobile device 200; and in other cases, the determined physical location may be less precise, such as within a meter or significantly greater distances. In one embodiment, however, mobile device 200 may, through other components, provide other information that may be employed to determine a physical location of the device, including, for example, a MAC address, an IP address, or the like.


Mass memory 230 includes a RAM 232, a ROM 234, and other storage means. Mass memory 230 illustrates another example of computer storage media for storage of information such as computer readable instructions, data structures, program modules or other data. Mass memory 230 stores a basic input/output system (“BIOS”) 240 for controlling low-level operation of mobile device 200. The mass memory also stores an operating system 241 for controlling the operation of mobile device 200. It will be appreciated that this component may include a general purpose operating system such as a version of UNIX, or LINUX™, or a specialized client communication operating system such as Windows Mobile™, or the Symbian® operating system. The operating system may include, or interface with a Java virtual machine module that enables control of hardware components and/or operating system operations via Java application programs.


Memory 230 further includes one or more data storage 244, which can be utilized by mobile device 200 to store, among other things, applications 242 and/or other data. For example, data storage 244 may also be employed to store information that describes various capabilities of mobile device 200. The information may then be provided to another device based on any of a variety of events, including being sent as part of a header during a communication, sent upon request, or the like. Moreover, data storage 244 may also be employed to store multimedia information and/or social networking information including user information, or the like. At least a portion of the multimedia information and/or social networking information may also be stored on a disk drive or other storage medium (not shown) within mobile device 200.


Applications 242 may include computer executable instructions which, when executed by mobile device 200, transmit, receive, and/or otherwise process messages (e.g., SMS, MMS, IM, email, and/or other messages), multimedia information, and enable telecommunication with another user of another client device. Other examples of application programs include calendars, browsers, email clients, IM applications, SMS applications, Video Conferencing applications, VOIP applications, contact managers, task managers, transcoders, database programs, word processing programs, security applications, spreadsheet programs, games, search programs, and so forth. Applications 242 may further include multimedia sharing manager (MSM) 245, and communication application 243.


Communication application 243 includes any component for managing communication over network interface 250. Communication application 243 may be configured to transmit, receive, and/or otherwise process messages (e.g., Video Conferencing messages, VOIP, SMS, MMS, IM, email, and/or other messages), store such messages, translate between one type of message and another, or the like. Communication application 243 may provide a user interface, such as a conversation screen, video display, or the like.


MSM 245 includes any component configured to manage sharing of multimedia information over network interface 250. MSM 245 may display a window (e.g., a simultaneous sharing window/live view) on display 254, for example. MSM 245 may receive or otherwise retrieve the multimedia information and/or thumbnails for the multimedia information from a variety of applications in applications 242, from a third-party server over network interface 250, from a file/URL on a remote website, on a local file system, on data storage 244, or the like. MSM 245 may display the received/retrieved information in the window. MSM 245 may enable a communication between users over network interface 250. For example, the user of mobile device 200 may use a conversation window within the window to send a message over network interface 250. Simultaneous with sending the message, multimedia information stored, for example, in data storage 244, may be sent over network interface 250 to another user participating in the communication. Operations of MSM 245 may be performed as described in conjunction with FIGS. 4-14 and/or process 1500 of FIG. 15.
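By way of illustration only, the simultaneous send described above might be sketched as follows. The `Session` class and its method names are hypothetical stand-ins for the communication session managed by MSM 245 and communication application 243; they are not part of the disclosure.

```python
import threading

class Session:
    """Hypothetical stand-in for a communication session over a
    network interface; it simply records what was sent."""
    def __init__(self):
        self.sent = []
        self._lock = threading.Lock()

    def send_message(self, text):
        with self._lock:
            self.sent.append(("message", text))

    def send_media(self, payload):
        with self._lock:
            self.sent.append(("media", payload))

def send_simultaneously(session, text_message, media_payload):
    # Dispatch the chat message and the shared media concurrently,
    # so neither transfer blocks the other.
    threads = [
        threading.Thread(target=session.send_message, args=(text_message,)),
        threading.Thread(target=session.send_media, args=(media_payload,)),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return session.sent
```

The point of the sketch is only that the message and the media traverse the same session in parallel rather than one gating the other.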


Illustrative Server Environment



FIG. 3 shows one embodiment of a network device, according to one embodiment of the invention. Network device 300 may include many more components than those shown. The components shown, however, are sufficient to disclose an illustrative embodiment for practicing the invention. Network device 300 may represent, for example, MSS 106 of FIG. 1.


Network device 300 includes processing unit 312, video display adapter 314, and a mass memory, all in communication with each other via bus 322. The mass memory generally includes RAM 316, ROM 332, and one or more permanent mass storage devices, such as hard disk drive 328, tape drive, optical drive, and/or floppy disk drive. The mass memory stores operating system 320 for controlling the operation of network device 300. Any general-purpose operating system may be employed. Basic input/output system (“BIOS”) 318 is also provided for controlling the low-level operation of network device 300. As illustrated in FIG. 3, network device 300 also can communicate with the Internet, or some other communications network, via network interface unit 310, which is constructed for use with various communication protocols including the TCP/IP protocol. Network interface unit 310 is sometimes known as a transceiver, transceiving device, or network interface card (NIC).


The mass memory as described above illustrates another type of computer-readable media, namely computer storage media. Computer storage media may include volatile, nonvolatile, removable, and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computing device.


The mass memory also stores program code and data. One or more applications 350 are loaded into mass memory and run on operating system 320. Examples of application programs may include transcoders, schedulers, calendars, database programs, word processing programs, HTTP programs, customizable user interface programs, IPSec applications, encryption programs, security programs, VPN programs, SMS message servers, Video Conferencing servers, IM message servers, email servers, account management, and so forth. Communication application 354 and/or Multimedia Information manager (MIM) 355 may also be included as application programs within applications 350.


Communication application 354 includes any component for managing communication over network interface 310, between a plurality of clients. Communication application 354 may be configured to forward, transmit, receive, and/or otherwise process messages (e.g., Video Conferencing messages, VOIP, SMS, MMS, IM, email, and/or other messages), store such messages, translate between one type of message and another, or the like. Communication application 354 may provide a communication interface, such as a web interface (HTTP/HTML), an XML interface, or the like. Such a communication interface may be integrated into, for example, a simultaneous sharing window as described in conjunction with FIGS. 4-14.


MIM 355 includes any component configured to manage multimedia information over network interface 310. MIM 355 may receive a request for a communication between users, using for example, communication applications 354. In one embodiment, communication applications 354 may initiate a communication session between the users, including a Short Message Service (SMS), Multimedia Message Service (MMS), instant messaging (IM), internet relay chat (IRC), Voice Over Internet Protocol (VOIP) session, or the like.


In one embodiment, MIM 355 may receive shared multimedia information or a shared identifier associated with the shared multimedia information over network interface 310. MIM 355 may use Multimedia Information Store (MIS) 352 to manage the shared multimedia information/identifier. In one embodiment, MIS 352 may provide the multimedia information associated with the shared identifier. In one embodiment, MIS 352 may store the shared multimedia information for later processing. MIM 355 may also forward shared multimedia information between users over network interface 310. In one embodiment, the forwarding of the multimedia information may be simultaneous with forwarding a message within the communication session.


In one embodiment, MIS 352 may store a plurality of received multimedia information. In one embodiment, MIS 352 may be a database, a file structure, or the like. MIS 352 may store the multimedia information into a category structure, such as folders, albums, graphs, trees, or the like. In one embodiment, MIS 352 may search for multimedia information based on a query (e.g., received from MIM 355, over network interface 310, or the like). In one embodiment, based on the query, MIS 352 may provide the searched multimedia information (e.g., over network interface 310).
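The category-structured storage and query behavior of MIS 352 described above can be sketched minimally as follows. The class name, field names (`name`, `tags`), and matching rule are illustrative assumptions, not details from the disclosure.

```python
class MultimediaInformationStore:
    """Minimal sketch of a multimedia information store: items are
    stored under named categories (folders/albums) and can be
    searched by a query string against names and tags."""
    def __init__(self):
        self.categories = {}  # category name -> list of item dicts

    def store(self, category, item):
        # Place the item into a category, creating it on first use.
        self.categories.setdefault(category, []).append(item)

    def search(self, query):
        # Return items whose name or tags contain the query term.
        query = query.lower()
        hits = []
        for items in self.categories.values():
            for item in items:
                text = " ".join([item.get("name", "")] + item.get("tags", []))
                if query in text.lower():
                    hits.append(item)
        return hits
```

A caller (standing in for MIM 355) would `store()` shared items as they arrive and `search()` on behalf of a remote query.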


Illustrative Use Operations


The operation of certain aspects of the invention will now be described with respect to FIGS. 4-14, which provide examples of certain operations to further illustrate the invention. While the discussion below in conjunction with FIGS. 4-14 relates to photo sharing, any type of multimedia sharing may be similarly enabled without departing from the scope of the invention.



FIG. 4 illustrates a voice call with the simultaneous sharing communication interface 402 on the desktop provided by an operating system of a computing device. The desktop is shown with a conversation window 406 that shows two users simultaneously communicating via text messages and voice. Chat history 404 includes recent text messages communicated between users. Although not shown, the conversation window may also include a live video chat window.



FIG. 5 illustrates a user clicking on a “share” tab 502 in the simultaneous sharing communication interface 402 on the desktop provided by an operating system of a computing device. The desktop is shown with a conversation window 406 that shows two users simultaneously communicating via text messages and voice.



FIG. 6 illustrates a dialog box asking the user what to share. The user has selected “Share Photos” button 602 from the choice menu. Although only four options are shown, in other embodiments, more or fewer options may be presented in one or more views.



FIG. 7 illustrates photo sharing window 702 prior to the loading of photos for sharing. As shown, the photo choice 704 is selected, and voice calling and text messaging are ongoing. The user is waiting for photos to be dragged or populated into the window so that one or more can be selected for sharing with the other user. Options 708, such as “search your photos” and “find file,” are provided for a user to locate photos to share. These options may enable searching based on a user provided criteria, searching over a network (e.g., using a search engine), or the like.



FIG. 8 illustrates the most recent photos 802 that are available for sharing in the window. It is understood that the photos may be provided from one or more sources, including other entities, properties, and third parties (e.g., the website Flickr). As shown, a selected photo may be dragged-and-dropped into sharing strip 702.



FIG. 9 illustrates the most recent photos 902 that are located on a local personal computer that are available for sharing in the window. Other photos 802 from a third-party source may also be displayed in addition to (e.g., side-by-side with) photos from the local personal computer. It is understood that the photos may be provided from one or more applications on the personal computer. Also, a search facility can be employed to filter results based on size, timestamp, and the like.



FIGS. 10 and 11 illustrate the user dragging and dropping photos into a sharing strip 702 in the lower section of the window. For example, thumbnail 1002 is dragged into the sharing strip 702. Thumbnail 1102 is also dragged into sharing strip 702 and ordered higher than thumbnail 1002.



FIG. 12 illustrates the user done with choosing photos for the initial set of photos to share, i.e., dragging and dropping photos into a sharing strip in the lower section of the window. In one embodiment, the user may press done button 1202 to indicate that the photos associated with the thumbnails in sharing strip 702 are to be shared. In another embodiment, sharing may occur as a thumbnail is dragged-and-dropped into the sharing strip 702.



FIG. 13 illustrates the photo sharing window 702 being shared with the other users that are in simultaneous communication by both text and voice. Each user can see the previously selected photos in the sharing strip 702, and can add more photos to the share in the sharing strip 702. As shown, the photo for the selected thumbnail from sharing strip 702 is displayed as high(er) resolution image 1302. In one embodiment, high(er) resolution image 1302 may be displayed on each user's display at the same time. Although not shown, instead of or in addition to providing higher resolution image 1302, other multimedia information may be provided to each of the users (e.g., to network devices associated with the user). For example, music or video may be provided. In another embodiment, each of the users may select a different thumbnail to provide different multimedia information to each of the users. In other embodiments not shown, one or more videos can also be shared in the sharing strip 702 while video, voice, and/or text communication is simultaneously occurring between the users.



FIG. 14 illustrates full screen photo sharing window 702 being shared with the other users that are in simultaneous communication by both text and voice. Each user can still see the previously selected photos in the sharing strip 702, and can add more photos to the share in the sharing strip 702. In other embodiments not shown, one or more videos can be also shared in the sharing strip 702 while video, voice, and/or text communication is simultaneously occurring between the users.


Generalized Operation


The operation of certain aspects of the invention will now be described with respect to FIG. 15. FIG. 15 illustrates a logical flow diagram generally showing one embodiment of a process for managing a multimedia communication. Process 1500 of FIG. 15 might be implemented, for example, within MSS 106 of FIG. 1.


Process 1500 begins, after a start block, at block 1502, where a communication between users is enabled. In one embodiment, the users belong to a same social network. In one embodiment, the communication may be enabled if a user and another user are a degree of separation from each other (e.g., at least two degrees of separation from each other in a same social network). The communication may include a voice communication, a text communication, a video communication, or the like. In one embodiment, the communication may be enabled over a plurality of channels. The channels may include at least one of Short Message Service (SMS), Multimedia Message Service (MMS), instant messaging (IM), internet relay chat (IRC), Voice Over Internet Protocol (VOIP), or the like. Processing next continues to block 1504.


At block 1504, multimedia information is received. In one embodiment, receiving the multimedia information may comprise retrieving, searching, filtering, or otherwise processing the multimedia information, based on, for example, attributes of a file associated with the multimedia information. The attributes may include the file name, size, date modified, a rating, or the like. In one embodiment, the multimedia information may be retrieved or otherwise received from at least one source. For example, the multimedia information may be received from a network device over a network, from a directory on the computing device, from a plurality of applications of different types, or the like. In one embodiment, the multimedia information may be retrieved from a user's desktop, using an operating system interface (e.g., using an SQL query, a search engine interface, an information retrieval interface, a Windows Vista™ indexing service, a Windows XP™ indexing service, or the like). In one embodiment, the following search query may be sent to the operating system to search for files (e.g., for attributes and/or tags associated with the files) on a desktop, a file system, within a plurality of directories in a file system, at least one remote website, or the like:

    • SELECT TOP NUM_RESULT "System.FileName", "System.Size", "System.DateModified", "System.ItemUrl", "System.FileExtension", "System.MIMEType", "System.ThumbnailCacheID", "System.Keywords", "System.Image.HorizontalSize", "System.Image.VerticalSize" FROM SYSTEMINDEX..SCOPE() WHERE "System.MIMEType" LIKE 'image/%' AND "System.Size" > 102400 AND "System.FileName" LIKE '%SEARCH_STRING%' ORDER BY "System.DateModified" DESC


As shown, the search may be based on a plurality of attributes of files associated with the multimedia information. In one embodiment, SEARCH_STRING may be a search term (or may be empty and optional). In one embodiment, the search term may be a specified term. For example, the terms may be entered by the user in the live view, may be terms recently searched by the user, by a member of the social network, or by a member of the social network a degree of separation from the user, or may be an aggregate of popular terms searched for in the social network, or the like. NUM_RESULT may be the number of results predefined by a threshold, specified by the user, or capable of being displayed in a window (e.g., a simultaneous sharing window). However, the invention is not constrained to this example, and other formats, commands, search terms, and the like may also be employed.
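A minimal sketch of assembling the indexing-service query quoted above, with NUM_RESULT and SEARCH_STRING substituted in as parameters, might look as follows. The function name, parameter names, and default size threshold are illustrative assumptions; only the query shape comes from the example above.

```python
def build_index_query(num_results, search_string, min_size=102400):
    """Assemble a desktop indexing-service (SYSTEMINDEX) query like the
    one quoted above. An empty search_string matches every file name,
    making the term optional."""
    columns = ", ".join(
        '"{}"'.format(c) for c in (
            "System.FileName", "System.Size", "System.DateModified",
            "System.ItemUrl", "System.FileExtension", "System.MIMEType",
            "System.ThumbnailCacheID", "System.Keywords",
            "System.Image.HorizontalSize", "System.Image.VerticalSize",
        )
    )
    return (
        "SELECT TOP {} {} FROM SYSTEMINDEX..SCOPE() ".format(num_results, columns)
        + "WHERE \"System.MIMEType\" LIKE 'image/%' "
        + "AND \"System.Size\" > {} ".format(min_size)
        + "AND \"System.FileName\" LIKE '%{}%' ".format(search_string)
        + "ORDER BY \"System.DateModified\" DESC"
    )
```

The assembled string would then be submitted to the operating system's indexing interface, which is not modeled here.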


In another embodiment, the multimedia information may be retrieved from a third-party source (e.g., the website Flickr, a search engine, or the like). In one embodiment, a search term(s) may be sent to the third-party source, and a result list including at least one URL for a file associated with the multimedia information may be returned. In another embodiment, the third-party source may provide an API to retrieve the multimedia information. For example, a token for the user may be received from the third party. Using the token, a search may be initiated on the third party for photos associated with a user (e.g., the user, the social network, users a degree of separation from the user, or the like). A result set may be provided by the third party. In one embodiment, a list of photos, names and/or tags associated with the photos, a list of URLs to thumbnails of the photos, or the like may be provided. The result set may be filtered based on a search term, date, or the like. If the search term is empty, then all multimedia information in the result set is selected for further processing. In one embodiment, the search term may be used to filter search result sets from at least two of the plurality of sources of multimedia information. Processing next continues to block 1506.
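The result-set filtering just described, including the rule that an empty search term selects everything, can be sketched as follows. The item field names (`name`, `tags`, `date`) are assumed for illustration and are not part of any particular third-party API.

```python
def filter_result_set(result_set, search_term="", since=None):
    """Filter photo items returned by a third-party source by search
    term and optional date. An empty search term selects the entire
    result set, as described above."""
    selected = []
    for item in result_set:
        # Optional date filter (ISO-8601 strings compare correctly).
        if since is not None and item["date"] < since:
            continue
        # Match the term against the item's name and tags.
        haystack = " ".join([item["name"]] + item["tags"]).lower()
        if search_term == "" or search_term.lower() in haystack:
            selected.append(item)
    return selected
```

The same filter could be applied uniformly to result sets from two or more sources before they are merged into the live view.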


At block 1506, optionally, multimedia information may be annotated. Annotation may comprise associating other information such as an identifier, text, or the like, with the multimedia information. The other information may include a time/date, name, GPS position, tag, comment, rating, or the like. The multimedia information may be annotated by the user and/or automatically. For example, the user may annotate a selected photo/multimedia information with a name. In another example, a device may automatically annotate with a time of day/week/month and/or GPS position. As shown, block 1506 may be optional and may not be performed. Processing next continues to block 1508.
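The mix of manual and automatic annotation described above might be sketched as follows; the annotation field names are illustrative assumptions.

```python
import datetime

def annotate(item, name=None, gps=None, now=None):
    """Attach manual and automatic annotations to a multimedia item,
    returning a new annotated copy."""
    annotations = dict(item.get("annotations", {}))
    if name is not None:
        annotations["name"] = name            # manual annotation by the user
    annotations["time"] = (now or datetime.datetime.now()).isoformat()
    if gps is not None:
        annotations["gps"] = gps              # automatic device annotation
    annotated = dict(item)
    annotated["annotations"] = annotations
    return annotated
```

A device would supply `gps` from its GPS transceiver and the timestamp automatically, while `name` comes from the user.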


At block 1508, the multimedia information is displayed in a live view. In one embodiment, prior to displaying in the live view, the multimedia information may be filtered based on at least one of a size, a timestamp of the multimedia information, or the like. The multimedia information may be ordered or grouped by the source of the multimedia information. The multimedia information may also be sorted by the timestamp. Processing next continues to decision block 1510.


At decision block 1510, it is determined if multimedia information has been inserted into a sharing strip. In one embodiment, inserting may comprise appending or concatenating the multimedia information to the sharing strip, inserting the multimedia information in between other multimedia information in the sharing strip, or the like. In one embodiment, inserting may comprise detecting a drag-and-drop of the multimedia information into the sharing strip. If the multimedia information has been inserted into the sharing strip, processing continues to block 1512. Otherwise, processing loops back to decision block 1510 to wait for the multimedia information to be inserted.


At block 1512, the multimedia information is displayed in the sharing strip. In one embodiment, a thumbnail associated with the multimedia information is displayed. The sharing strip may include a plurality of thumbnails, with each thumbnail associated with a different multimedia information. The sharing strip may enable ordering/re-ordering of the plurality of thumbnails. In one embodiment, the multimedia information/thumbnail may be ordered based on, for example, the time at which the multimedia information was selected to be inserted into the sharing strip. Processing next continues to block 1514.
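The sharing strip's insertion-order display and re-ordering described above can be sketched as a simple ordered container. The class and method names are illustrative, not from the disclosure.

```python
class SharingStrip:
    """Sketch of a sharing strip: thumbnails kept in the order they
    were inserted (selection order), with support for re-ordering."""
    def __init__(self):
        self.thumbnails = []

    def insert(self, thumbnail, position=None):
        # Default: append in the order items were selected for sharing;
        # an explicit position inserts elsewhere in the strip.
        if position is None:
            self.thumbnails.append(thumbnail)
        else:
            self.thumbnails.insert(position, thumbnail)

    def move(self, old_index, new_index):
        # Re-order an existing thumbnail within the strip.
        self.thumbnails.insert(new_index, self.thumbnails.pop(old_index))
```

In a two-party share, each change to the strip (insert or move) would additionally be sent to the other participant's device so both strips stay synchronized.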


At block 1514, the multimedia information is shared simultaneously and independent of the enabled communication. In one embodiment, sharing may include sending the multimedia information between the users, sending an annotation (e.g., annotated at block 1506) associated with the multimedia information between the users, displaying the shared multimedia information in a live view, or the like. In one embodiment, simultaneously sharing may comprise sending the multimedia information associated with the selected thumbnail, if the selected thumbnail is detected as drag-and-dropped into a sharing strip. In another embodiment, simultaneously sharing may comprise sending the multimedia information associated with a selected thumbnail (e.g., in a sharing strip) while sending at least one of a Short Message Service (SMS), Multimedia Message Service (MMS), instant messaging (IM), internet relay chat (IRC), Voice Over Internet Protocol (VOIP) message, or the like. In another embodiment, a plurality of multimedia information may be sent in a send order based on an order of different selected thumbnails in the sharing strip, wherein each of the plurality of the multimedia information is associated with a different selected thumbnail in the sharing strip.


In one embodiment, at least two of the users participating in the enabled communication may each have a sharing strip displayed on different network devices. The same thumbnail associated with the multimedia information may be displayed in both sharing strips. Each of the users may add, move, or remove the thumbnails, order/re-order the thumbnails, and/or annotate the multimedia information associated with the thumbnails in their respective sharing strip. A change to one sharing strip may be sent between the different network devices.


In one embodiment, at least one of the users participating in the communication may select a selected multimedia information to be displayed or otherwise provided. For example, a user may select a thumbnail in the strip, and consequently, a video associated with the selected thumbnail may begin playing on the user's display and/or on another user's display, wherein the other user is also participating in the communication. Shared play-back may also enable at least one of the users to otherwise manipulate the provided multimedia information. For example, either user may stop the video, rewind the video, speed-up the video, or the like for the user's display and/or the other user's display.


In another embodiment, an automatic playback may be enabled based on the multimedia information associated with the sharing strip. For example, a slide-show may be provided based on the thumbnails in the sharing strip in the order of the thumbnails in the sharing strip. Processing next continues to decision block 1516.
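A minimal sketch of the automatic playback just described, presenting items strictly in sharing-strip order, might be as follows. `media_for` (resolving a thumbnail to its full multimedia item) and `on_display` (showing an item) are hypothetical callbacks supplied by the caller.

```python
def automatic_playback(sharing_strip, media_for, on_display):
    """Present each multimedia item in sharing-strip order, as an
    automatic slide-show would."""
    shown = []
    for thumb in sharing_strip:
        item = media_for(thumb)  # resolve thumbnail -> full item
        on_display(item)         # show it on the participants' displays
        shown.append(item)
    return shown
```

A real slide-show would additionally pace the items and honor per-user stop/rewind controls, which are omitted here.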


At decision block 1516, it is determined whether more multimedia information is to be shared. If more multimedia information is to be shared, then processing loops back to decision block 1510. Thereby, more multimedia information may be simultaneously shared, annotated, and displayed by a plurality of users, independent of the communication. For example, process 1500 may be performed multiple times for more than one of the users. Otherwise, processing returns to a calling process for further processing.


It will be understood that the illustrative uses in the figures can be implemented by computer program instructions. These program instructions may be provided to a processor to produce a machine, such that the instructions, which execute on the processor, create means for implementing the actions specified. The computer program instructions may be executed by a processor to cause a series of operational steps to be performed by the processor to produce a computer-implemented process, such that the instructions, which execute on the processor, provide steps for implementing the actions specified. The computer program instructions may also cause at least some of the operational steps to be performed in parallel. Moreover, some of the steps may also be performed across more than one processor, such as might arise in a multi-processor computer system. In addition, one or more combinations of uses in the drawings may also be performed concurrently with other uses, or even in a different sequence than illustrated without departing from the scope or spirit of the invention.


Accordingly, the illustrative uses and diagrams support combinations of means for performing the specified actions, combinations of steps for performing the specified actions and program instruction means for performing the specified actions. It will also be understood that each of the illustrated uses, and combinations of uses in the drawings, can be implemented by special purpose hardware-based systems which perform the specified actions or steps, or combinations of special purpose hardware and computer instructions.

Claims
  • 1. A method comprising: generating, by a network platform, a real-time slide show comprising user submitted image items appended in an order based on timestamps, the user submitted image items submitted to the real-time slide show by a plurality of user devices; receiving, from one of the plurality of user devices, an additional image item as a submission to the real-time slide show, the additional image item being generated by the one of the plurality of user devices, the additional image item submitted to the real-time slide show by the one of the plurality of user devices using a user interface element displayed on the one of the plurality of user devices; updating, by the network platform, the real-time slide show to display the additional image item appended after the user submitted image items based on the order of timestamps, the user submitted image items and the additional image item being automatically annotated in the real-time slide show with a user interface tag; and transmitting, by the network platform, the updated real-time slide show including the user submitted image items and the additional image item to one or more of the plurality of user devices.
  • 2. The method of claim 1, wherein the user interface element is a button configured to submit the additional image item to the real-time slide show upon the button being selected by the one of the plurality of user devices.
  • 3. The method of claim 1, wherein the user interface tag is a timestamp.
  • 4. The method of claim 3, wherein the network platform annotates the additional image item by overlaying the timestamp on the additional image item.
  • 5. The method of claim 1, wherein the real-time slide show is independently navigable by the plurality of user devices.
  • 6. The method of claim 1, wherein the user submitted image items comprises at least one of: an image or a video.
  • 7. The method of claim 1, wherein the network platform is a social network site.
  • 8. A system comprising: a memory comprising instructions; and one or more computer processors of a network platform; wherein the instructions, when executed by the one or more computer processors, cause the one or more computer processors to perform operations comprising: generating a real-time slide show comprising user submitted image items appended in an order based on timestamps, the user submitted image items submitted to the real-time slide show by a plurality of user devices; receiving, from one of the plurality of user devices, an additional image item as a submission to the real-time slide show, the additional image item being generated by the one of the plurality of user devices, the additional image item submitted to the real-time slide show by the one of the plurality of user devices using a user interface element displayed on the one of the plurality of user devices; updating the real-time slide show to display the additional image item appended after the user submitted image items based on the order of timestamps, the user submitted image items and the additional image item being automatically annotated in the real-time slide show with a user interface tag; and transmitting the updated real-time slide show including the user submitted image items and the additional image item to one or more of the plurality of user devices.
  • 9. The system of claim 8, wherein the user interface element is a button configured to submit the additional image item to the real-time slide show upon the button being selected by the one of the plurality of user devices.
  • 10. The system of claim 8, wherein the user interface tag is a timestamp.
  • 11. The system of claim 10, wherein the network platform annotates the additional image item by overlaying the timestamp on the additional image item.
  • 12. The system of claim 8, wherein the real-time slide show is independently navigable by the plurality of user devices.
  • 13. The system of claim 8, wherein the user submitted image items comprises at least one of: an image or a video.
  • 14. The system of claim 8, wherein the network platform is a social network site.
  • 15. A non-transitory machine-readable storage device including instructions that, when executed by one or more processors of a machine, cause the one or more processors to perform operations comprising: generating a real-time slide show comprising user submitted image items appended in an order based on timestamps, the user submitted image items submitted to the real-time slide show by a plurality of user devices; receiving, from one of the plurality of user devices, an additional image item as a submission to the real-time slide show, the additional image item being generated by the one of the plurality of user devices, the additional image item submitted to the real-time slide show by the one of the plurality of user devices using a user interface element displayed on the one of the plurality of user devices; updating the real-time slide show to display the additional image item appended after the user submitted image items based on the order of timestamps, the user submitted image items and the additional image item being automatically annotated in the real-time slide show with a user interface tag; and transmitting the updated real-time slide show including the user submitted image items and the additional image item to one or more of the plurality of user devices.
  • 16. The non-transitory machine-readable storage device of claim 15, wherein the user interface element is a button configured to submit the additional image item to the real-time slide show upon the button being selected by the one of the plurality of user devices.
  • 17. The non-transitory machine-readable storage device of claim 15, wherein the user interface tag is a timestamp.
  • 18. The non-transitory machine-readable storage device of claim 17, wherein the additional image item is annotated by overlaying the timestamp on the additional image item.
  • 19. The non-transitory machine-readable storage device of claim 15, wherein the real-time slide show is independently navigable by the plurality of user devices.
  • 20. The non-transitory machine-readable storage device of claim 15, wherein the user submitted image items comprises at least one of: an image or a video.
CLAIM OF PRIORITY

This application is a Continuation of and claims the benefit of U.S. patent application Ser. No. 15/396,133, filed Dec. 30, 2016, which is a Continuation of and claims the benefit of U.S. patent application Ser. No. 14/047,662, filed on Oct. 7, 2013, which is a Continuation Application and claims priority from U.S. patent application Ser. No. 11/750,211, now U.S. Pat. No. 8,554,868, filed on May 17, 2007, which claims priority from U.S. Provisional Patent Application No. 60/883,760, filed Jan. 5, 2007, and entitled “Show And Tell Communication Interface,” all of which are herein incorporated by reference in their entirety.

US Referenced Citations (613)
Number Name Date Kind
666223 Shedlock Jan 1901 A
4581634 Williams Apr 1986 A
4975690 Torres Dec 1990 A
5072412 Henderson, Jr. et al. Dec 1991 A
5493692 Theimer et al. Feb 1996 A
5713073 Warsta Jan 1998 A
5754939 Herz et al. May 1998 A
5855008 Goldhaber et al. Dec 1998 A
5883639 Walton et al. Mar 1999 A
5999932 Paul Dec 1999 A
6012098 Bayeh et al. Jan 2000 A
6014090 Rosen et al. Jan 2000 A
6029141 Bezos et al. Feb 2000 A
6038295 Mattes Mar 2000 A
6049711 Ben-Yehezkel et al. Apr 2000 A
6154764 Nitta et al. Nov 2000 A
6167435 Druckenmiller et al. Dec 2000 A
6204840 Petelycky et al. Mar 2001 B1
6205432 Gabbard et al. Mar 2001 B1
6216141 Straub et al. Apr 2001 B1
6272484 Martin et al. Aug 2001 B1
6285381 Sawano et al. Sep 2001 B1
6285987 Roth et al. Sep 2001 B1
6310694 Okimoto et al. Oct 2001 B1
6317789 Rakavy et al. Nov 2001 B1
6334149 Davis, Jr. et al. Dec 2001 B1
6349203 Asaoka et al. Feb 2002 B1
6353170 Eyzaguirre et al. Mar 2002 B1
6446004 Cao et al. Sep 2002 B1
6449657 Stanbach et al. Sep 2002 B2
6456852 Bar et al. Sep 2002 B2
6484196 Maurille Nov 2002 B1
6487601 Hubacher et al. Nov 2002 B1
6523008 Avrunin Feb 2003 B1
6542749 Tanaka et al. Apr 2003 B2
6549768 Fraccaroli Apr 2003 B1
6618593 Drutman et al. Sep 2003 B1
6622174 Ukita et al. Sep 2003 B1
6631463 Floyd et al. Oct 2003 B1
6636247 Hamzy et al. Oct 2003 B1
6636855 Holloway et al. Oct 2003 B2
6643684 Malkin et al. Nov 2003 B1
6658095 Yoakum et al. Dec 2003 B1
6665531 Soderbacka et al. Dec 2003 B1
6668173 Greene Dec 2003 B2
6684238 Dutta Jan 2004 B1
6684257 Camut et al. Jan 2004 B1
6687877 Sastry et al. Feb 2004 B1
6698020 Zigmond et al. Feb 2004 B1
6700506 Winkler Mar 2004 B1
6720860 Narayanaswami Apr 2004 B1
6724403 Santoro et al. Apr 2004 B1
6757713 Ogilvie et al. Jun 2004 B1
6832222 Zimowski Dec 2004 B1
6834195 Brandenberg et al. Dec 2004 B2
6836792 Chen Dec 2004 B1
6898626 Ohashi May 2005 B2
6959324 Kubik et al. Oct 2005 B1
6970088 Kovach Nov 2005 B2
6970907 Ullmann et al. Nov 2005 B1
6980909 Root et al. Dec 2005 B2
6981040 Konig et al. Dec 2005 B1
7020494 Spriestersbach et al. Mar 2006 B2
7027124 Foote et al. Apr 2006 B2
7072963 Anderson et al. Jul 2006 B2
7085571 Kalhan et al. Aug 2006 B2
7110744 Freeny, Jr. Sep 2006 B2
7124164 Chemtob Oct 2006 B1
7149893 Leonard et al. Dec 2006 B1
7173651 Knowles Feb 2007 B1
7188143 Szeto Mar 2007 B2
7203380 Chiu et al. Apr 2007 B2
7206568 Sudit Apr 2007 B2
7227937 Yoakum et al. Jun 2007 B1
7237002 Estrada et al. Jun 2007 B1
7240089 Boudreau Jul 2007 B2
7269426 Kokkonen et al. Sep 2007 B2
7280658 Amini et al. Oct 2007 B2
7315823 Brondrup Jan 2008 B2
7349768 Bruce et al. Mar 2008 B2
7356564 Hartselle et al. Apr 2008 B2
7394345 Ehlinger et al. Jul 2008 B1
7411493 Smith Aug 2008 B2
7423580 Markhovsky et al. Sep 2008 B2
7454442 Cobleigh et al. Nov 2008 B2
7508419 Toyama et al. Mar 2009 B2
7512649 Faybishenko et al. Mar 2009 B2
7519670 Hagale et al. Apr 2009 B2
7535890 Rojas May 2009 B2
7546554 Chiu et al. Jun 2009 B2
7589749 De Laurentis Sep 2009 B1
7607096 Oreizy et al. Oct 2009 B2
7639943 Kalajan Dec 2009 B1
7650231 Gadler Jan 2010 B2
7668537 DeVries Feb 2010 B2
7770137 Forbes et al. Aug 2010 B2
7778973 Choi Aug 2010 B2
7779444 Glad Aug 2010 B2
7787886 Markhovsky et al. Aug 2010 B2
7796946 Eisenbach Sep 2010 B2
7801954 Cadiz et al. Sep 2010 B2
7856360 Kramer et al. Dec 2010 B2
8001204 Burtner et al. Aug 2011 B2
8032586 Challenger et al. Oct 2011 B2
8082255 Carlson, Jr. et al. Dec 2011 B1
8090351 Klein Jan 2012 B2
8098904 Ioffe et al. Jan 2012 B2
8099109 Altman et al. Jan 2012 B2
8112716 Kobayashi Feb 2012 B2
8131597 Hudetz Mar 2012 B2
8135166 Rhoads Mar 2012 B2
8136028 Loeb et al. Mar 2012 B1
8146001 Reese Mar 2012 B1
8161115 Yamamoto Apr 2012 B2
8161417 Lee Apr 2012 B1
8195203 Tseng Jun 2012 B1
8199747 Rojas et al. Jun 2012 B2
8208943 Petersen Jun 2012 B2
8214443 Hamburg Jul 2012 B2
8234350 Gu et al. Jul 2012 B1
8276092 Narayanan et al. Sep 2012 B1
8279319 Date Oct 2012 B2
8280406 Ziskind et al. Oct 2012 B2
8285199 Hsu et al. Oct 2012 B2
8287380 Nguyen et al. Oct 2012 B2
8301159 Hamynen et al. Oct 2012 B2
8306922 Kunal et al. Nov 2012 B1
8312086 Velusamy et al. Nov 2012 B2
8312097 Siegel et al. Nov 2012 B1
8326315 Phillips et al. Dec 2012 B2
8326327 Hymel et al. Dec 2012 B2
8332475 Rosen et al. Dec 2012 B2
8352546 Dollard Jan 2013 B1
8379130 Forutanpour et al. Feb 2013 B2
8385950 Wagner et al. Feb 2013 B1
8402097 Szeto Mar 2013 B2
8405773 Hayashi et al. Mar 2013 B2
8418067 Cheng et al. Apr 2013 B2
8423409 Rao Apr 2013 B2
8471914 Sakiyama et al. Jun 2013 B2
8472935 Fujisaki Jun 2013 B1
8510383 Hurley et al. Aug 2013 B2
8527345 Rothschild et al. Sep 2013 B2
8554627 Svendsen et al. Oct 2013 B2
8554868 Skyrm et al. Oct 2013 B2
8560612 Kilmer et al. Oct 2013 B2
8594680 Ledlie et al. Nov 2013 B2
8613089 Holloway et al. Dec 2013 B1
8660358 Bergboer et al. Feb 2014 B1
8660369 Llano et al. Feb 2014 B2
8660793 Ngo et al. Feb 2014 B2
8682350 Altman et al. Mar 2014 B2
8718333 Wolf et al. May 2014 B2
8724622 Rojas May 2014 B2
8732168 Johnson May 2014 B2
8744523 Fan et al. Jun 2014 B2
8745132 Obradovich Jun 2014 B2
8761800 Kuwahara Jun 2014 B2
8768876 Shim et al. Jul 2014 B2
8775972 Spiegel Jul 2014 B2
8788680 Naik Jul 2014 B1
8790187 Walker et al. Jul 2014 B2
8797415 Arnold Aug 2014 B2
8798646 Wang et al. Aug 2014 B1
8856349 Jain et al. Oct 2014 B2
8874677 Rosen et al. Oct 2014 B2
8886227 Schmidt et al. Nov 2014 B2
8909679 Root et al. Dec 2014 B2
8909725 Sehn Dec 2014 B1
8972357 Shim et al. Mar 2015 B2
8995433 Rojas Mar 2015 B2
9015285 Ebsen et al. Apr 2015 B1
9020745 Johnston et al. Apr 2015 B2
9040574 Wang et al. May 2015 B2
9055416 Rosen et al. Jun 2015 B2
9094137 Sehn et al. Jul 2015 B1
9100806 Rosen et al. Aug 2015 B2
9100807 Rosen et al. Aug 2015 B2
9113301 Spiegel et al. Aug 2015 B1
9119027 Sharon et al. Aug 2015 B2
9123074 Jacobs et al. Sep 2015 B2
9143382 Bhogal et al. Sep 2015 B2
9143681 Ebsen et al. Sep 2015 B1
9152477 Campbell et al. Oct 2015 B1
9191776 Root et al. Nov 2015 B2
9204252 Root Dec 2015 B2
9225897 Sehn et al. Dec 2015 B1
9258459 Hartley Feb 2016 B2
9344606 Hartley et al. May 2016 B2
9385983 Sehn Jul 2016 B1
9396354 Murphy et al. Jul 2016 B1
9407712 Sehn Aug 2016 B1
9407816 Sehn Aug 2016 B1
9430783 Sehn Aug 2016 B1
9439041 Parvizi et al. Sep 2016 B2
9443227 Evans et al. Sep 2016 B2
9450907 Pridmore et al. Sep 2016 B2
9459778 Hogeg et al. Oct 2016 B2
9489661 Evans et al. Nov 2016 B2
9491134 Rosen et al. Nov 2016 B2
9532171 Allen et al. Dec 2016 B2
9537811 Allen et al. Jan 2017 B2
9608949 Skyrm et al. Mar 2017 B2
9628950 Noeth et al. Apr 2017 B1
9710821 Heath Jul 2017 B2
9854219 Sehn Dec 2017 B2
10135765 Skyrm et al. Nov 2018 B1
10491659 Skyrm et al. Nov 2019 B1
20010025309 Macleod Beck et al. Sep 2001 A1
20020047868 Miyazawa Apr 2002 A1
20020078456 Hudson et al. Jun 2002 A1
20020087631 Sharma Jul 2002 A1
20020097257 Miller et al. Jul 2002 A1
20020122659 Mcgrath et al. Sep 2002 A1
20020128047 Gates Sep 2002 A1
20020144154 Tomkow Oct 2002 A1
20020168621 Cook Nov 2002 A1
20030001846 Davis et al. Jan 2003 A1
20030016247 Lai et al. Jan 2003 A1
20030017823 Mager et al. Jan 2003 A1
20030020623 Cao et al. Jan 2003 A1
20030023874 Prokupets et al. Jan 2003 A1
20030037124 Yamaura et al. Feb 2003 A1
20030052925 Daimon et al. Mar 2003 A1
20030101230 Benschoter et al. May 2003 A1
20030110503 Perkes Jun 2003 A1
20030126215 Udell Jul 2003 A1
20030148773 Spriestersbach et al. Aug 2003 A1
20030164856 Prager et al. Sep 2003 A1
20030203731 King Oct 2003 A1
20030229607 Zellweger et al. Dec 2003 A1
20040027371 Jaeger Feb 2004 A1
20040064429 Hirstius et al. Apr 2004 A1
20040078367 Anderson et al. Apr 2004 A1
20040111467 Willis Jun 2004 A1
20040158739 Wakai et al. Aug 2004 A1
20040189465 Capobianco et al. Sep 2004 A1
20040203959 Coombes Oct 2004 A1
20040215625 Svendsen et al. Oct 2004 A1
20040243531 Dean Dec 2004 A1
20040243688 Wugofski Dec 2004 A1
20050021444 Bauer et al. Jan 2005 A1
20050021624 Herf et al. Jan 2005 A1
20050022211 Veselov et al. Jan 2005 A1
20050048989 Jung Mar 2005 A1
20050078804 Yomoda Apr 2005 A1
20050097176 Schatz et al. May 2005 A1
20050102381 Jiang et al. May 2005 A1
20050104976 Currans May 2005 A1
20050114783 Szeto May 2005 A1
20050119936 Buchanan et al. Jun 2005 A1
20050122405 Voss et al. Jun 2005 A1
20050132288 Kirn et al. Jun 2005 A1
20050193340 Amburgey et al. Sep 2005 A1
20050193345 Klassen et al. Sep 2005 A1
20050198128 Anderson Sep 2005 A1
20050223066 Buchheit et al. Oct 2005 A1
20050227678 Agrawal et al. Oct 2005 A1
20050235062 Lunt et al. Oct 2005 A1
20050288954 McCarthy et al. Dec 2005 A1
20060026067 Nicholas et al. Feb 2006 A1
20060031560 Warshavsky et al. Feb 2006 A1
20060107297 Toyama et al. May 2006 A1
20060114338 Rothschild Jun 2006 A1
20060119882 Harris et al. Jun 2006 A1
20060170958 Jung et al. Aug 2006 A1
20060242239 Morishima et al. Oct 2006 A1
20060252438 Ansamaa et al. Nov 2006 A1
20060265417 Amato et al. Nov 2006 A1
20060270419 Crowley et al. Nov 2006 A1
20060287878 Wadhwa et al. Dec 2006 A1
20060294469 Sareen et al. Dec 2006 A1
20070004426 Pfleging et al. Jan 2007 A1
20070038715 Collins et al. Feb 2007 A1
20070040931 Nishizawa Feb 2007 A1
20070073517 Panje Mar 2007 A1
20070073823 Cohen et al. Mar 2007 A1
20070075898 Markhovsky et al. Apr 2007 A1
20070082707 Flynt et al. Apr 2007 A1
20070124245 Sato et al. May 2007 A1
20070136228 Petersen Jun 2007 A1
20070157105 Owens Jul 2007 A1
20070192128 Celestini Aug 2007 A1
20070198340 Lucovsky et al. Aug 2007 A1
20070198495 Buron et al. Aug 2007 A1
20070208751 Cowan et al. Sep 2007 A1
20070210936 Nicholson Sep 2007 A1
20070214180 Crawford Sep 2007 A1
20070214216 Carrer et al. Sep 2007 A1
20070233556 Koningstein Oct 2007 A1
20070233801 Eren et al. Oct 2007 A1
20070233859 Zhao et al. Oct 2007 A1
20070243887 Bandhole et al. Oct 2007 A1
20070244750 Grannan et al. Oct 2007 A1
20070255456 Funayama Nov 2007 A1
20070281690 Altman et al. Dec 2007 A1
20080022329 Glad Jan 2008 A1
20080025701 Ikeda Jan 2008 A1
20080032703 Krumm et al. Feb 2008 A1
20080033930 Warren Feb 2008 A1
20080043041 Hedenstroem et al. Feb 2008 A2
20080049704 Witteman et al. Feb 2008 A1
20080062141 Chandhri Mar 2008 A1
20080076505 Nguyen et al. Mar 2008 A1
20080092233 Tian et al. Apr 2008 A1
20080094387 Chen Apr 2008 A1
20080104503 Beall et al. May 2008 A1
20080109844 Baldeschwieler et al. May 2008 A1
20080120409 Sun et al. May 2008 A1
20080147730 Lee et al. Jun 2008 A1
20080148150 Mall Jun 2008 A1
20080158230 Sharma et al. Jul 2008 A1
20080168033 Ott et al. Jul 2008 A1
20080168489 Schraga Jul 2008 A1
20080189177 Anderton et al. Aug 2008 A1
20080207176 Brackbill et al. Aug 2008 A1
20080208692 Garaventi et al. Aug 2008 A1
20080214210 Rasanen et al. Sep 2008 A1
20080222545 Lemay Sep 2008 A1
20080255976 Altberg et al. Oct 2008 A1
20080256446 Yamamoto Oct 2008 A1
20080256577 Funaki et al. Oct 2008 A1
20080266421 Takahata et al. Oct 2008 A1
20080270938 Carlson Oct 2008 A1
20080288338 Wiseman et al. Nov 2008 A1
20080306826 Kramer et al. Dec 2008 A1
20080313329 Wang et al. Dec 2008 A1
20080313346 Kujawa et al. Dec 2008 A1
20080318616 Chipalkatti et al. Dec 2008 A1
20090006191 Arankalle et al. Jan 2009 A1
20090006565 Velusamy et al. Jan 2009 A1
20090015703 Kim et al. Jan 2009 A1
20090024956 Kobayashi Jan 2009 A1
20090030774 Rothschild et al. Jan 2009 A1
20090030999 Gatzke et al. Jan 2009 A1
20090040324 Nonaka Feb 2009 A1
20090042588 Lottin et al. Feb 2009 A1
20090058822 Chaudhri Mar 2009 A1
20090079846 Chou Mar 2009 A1
20090089678 Sacco et al. Apr 2009 A1
20090089710 Wood et al. Apr 2009 A1
20090093261 Ziskind Apr 2009 A1
20090132341 Klinger May 2009 A1
20090132453 Hangartner et al. May 2009 A1
20090132665 Thomsen et al. May 2009 A1
20090148045 Lee et al. Jun 2009 A1
20090153492 Popp Jun 2009 A1
20090157450 Athsani et al. Jun 2009 A1
20090157752 Gonzalez Jun 2009 A1
20090160970 Fredlund et al. Jun 2009 A1
20090163182 Gatti et al. Jun 2009 A1
20090177299 Van De Sluis Jul 2009 A1
20090192900 Collison Jul 2009 A1
20090199242 Johnson et al. Aug 2009 A1
20090215469 Fisher et al. Aug 2009 A1
20090232354 Camp, Jr. et al. Sep 2009 A1
20090234815 Boerries et al. Sep 2009 A1
20090239552 Churchill et al. Sep 2009 A1
20090249222 Schmidt et al. Oct 2009 A1
20090249244 Robinson et al. Oct 2009 A1
20090265647 Martin et al. Oct 2009 A1
20090288022 Almstrand et al. Nov 2009 A1
20090291672 Treves et al. Nov 2009 A1
20090292608 Polachek Nov 2009 A1
20090319607 Belz et al. Dec 2009 A1
20090327073 Li Dec 2009 A1
20100062794 Han Mar 2010 A1
20100082427 Burgener et al. Apr 2010 A1
20100082693 Hugg et al. Apr 2010 A1
20100100568 Papin et al. Apr 2010 A1
20100113065 Narayan et al. May 2010 A1
20100130233 Parker May 2010 A1
20100131880 Lee et al. May 2010 A1
20100131895 Wohlert May 2010 A1
20100153144 Miller et al. Jun 2010 A1
20100159944 Pascal et al. Jun 2010 A1
20100161658 Hamynen et al. Jun 2010 A1
20100161831 Haas et al. Jun 2010 A1
20100162149 Sheleheda et al. Jun 2010 A1
20100183280 Beauregard et al. Jul 2010 A1
20100185552 Deluca et al. Jul 2010 A1
20100185665 Horn et al. Jul 2010 A1
20100191631 Weidmann Jul 2010 A1
20100197318 Petersen et al. Aug 2010 A1
20100197319 Petersen et al. Aug 2010 A1
20100198683 Aarabi Aug 2010 A1
20100198694 Muthukrishnan Aug 2010 A1
20100198826 Petersen et al. Aug 2010 A1
20100198828 Petersen et al. Aug 2010 A1
20100198862 Jennings et al. Aug 2010 A1
20100198870 Petersen et al. Aug 2010 A1
20100198917 Petersen et al. Aug 2010 A1
20100201482 Robertson et al. Aug 2010 A1
20100201536 Robertson et al. Aug 2010 A1
20100214436 Kim et al. Aug 2010 A1
20100223128 Dukellis et al. Sep 2010 A1
20100223343 Bosan et al. Sep 2010 A1
20100250109 Johnston et al. Sep 2010 A1
20100257196 Waters et al. Oct 2010 A1
20100259386 Holley et al. Oct 2010 A1
20100273509 Sweeney et al. Oct 2010 A1
20100281045 Dean Nov 2010 A1
20100306669 Della Pasqua Dec 2010 A1
20110004071 Faiola et al. Jan 2011 A1
20110010205 Richards Jan 2011 A1
20110029512 Folgner et al. Feb 2011 A1
20110040783 Uemichi et al. Feb 2011 A1
20110040804 Peirce et al. Feb 2011 A1
20110050909 Ellenby et al. Mar 2011 A1
20110050915 Wang et al. Mar 2011 A1
20110064388 Brown et al. Mar 2011 A1
20110066743 Hurley et al. Mar 2011 A1
20110083101 Sharon et al. Apr 2011 A1
20110102630 Rukes May 2011 A1
20110119133 Igelman et al. May 2011 A1
20110137881 Cheng et al. Jun 2011 A1
20110145564 Moshir et al. Jun 2011 A1
20110159890 Fortescue et al. Jun 2011 A1
20110164163 Bilbrey et al. Jul 2011 A1
20110197194 D'Angelo et al. Aug 2011 A1
20110202598 Evans et al. Aug 2011 A1
20110202968 Nurmi Aug 2011 A1
20110211534 Schmidt et al. Sep 2011 A1
20110213845 Logan et al. Sep 2011 A1
20110215966 Kim et al. Sep 2011 A1
20110225048 Nair Sep 2011 A1
20110238763 Shin et al. Sep 2011 A1
20110255736 Thompson et al. Oct 2011 A1
20110273575 Lee Nov 2011 A1
20110282799 Huston Nov 2011 A1
20110283188 Farrenkopf Nov 2011 A1
20110314419 Dunn et al. Dec 2011 A1
20110320373 Lee et al. Dec 2011 A1
20120028659 Whitney et al. Feb 2012 A1
20120033718 Kauffman et al. Feb 2012 A1
20120036015 Sheikh Feb 2012 A1
20120036443 Ohmori et al. Feb 2012 A1
20120054797 Skog et al. Mar 2012 A1
20120059722 Rao Mar 2012 A1
20120062805 Candelore Mar 2012 A1
20120084731 Filman et al. Apr 2012 A1
20120084835 Thomas et al. Apr 2012 A1
20120099800 Llano et al. Apr 2012 A1
20120108293 Law et al. May 2012 A1
20120110096 Smarr et al. May 2012 A1
20120113143 Adhikari et al. May 2012 A1
20120113272 Hata May 2012 A1
20120123830 Svendsen et al. May 2012 A1
20120123871 Svendsen et al. May 2012 A1
20120123875 Svendsen et al. May 2012 A1
20120124126 Alcazar et al. May 2012 A1
20120124176 Curtis et al. May 2012 A1
20120124458 Cruzada May 2012 A1
20120131507 Sparandara et al. May 2012 A1
20120131512 Takeuchi et al. May 2012 A1
20120143760 Abulafia et al. Jun 2012 A1
20120150978 Monaco Jun 2012 A1
20120165100 Lalancette et al. Jun 2012 A1
20120166971 Sachson et al. Jun 2012 A1
20120169855 Oh Jul 2012 A1
20120172062 Altman et al. Jul 2012 A1
20120173991 Roberts et al. Jul 2012 A1
20120176401 Hayward et al. Jul 2012 A1
20120184248 Speede Jul 2012 A1
20120197724 Kendall Aug 2012 A1
20120200743 Blanchflower et al. Aug 2012 A1
20120209924 Evans et al. Aug 2012 A1
20120210244 De Francisco Lopez et al. Aug 2012 A1
20120212632 Mate et al. Aug 2012 A1
20120220264 Kawabata Aug 2012 A1
20120226748 Bosworth et al. Sep 2012 A1
20120233000 Fisher et al. Sep 2012 A1
20120236162 Imamura Sep 2012 A1
20120239761 Linner et al. Sep 2012 A1
20120250951 Chen Oct 2012 A1
20120252418 Kandekar et al. Oct 2012 A1
20120254325 Majeti et al. Oct 2012 A1
20120278387 Garcia et al. Nov 2012 A1
20120278692 Shi Nov 2012 A1
20120290637 Perantatos et al. Nov 2012 A1
20120299954 Wada et al. Nov 2012 A1
20120304052 Tanaka et al. Nov 2012 A1
20120304080 Wormald et al. Nov 2012 A1
20120307096 Ford et al. Dec 2012 A1
20120307112 Kunishige et al. Dec 2012 A1
20120319904 Lee et al. Dec 2012 A1
20120323933 He et al. Dec 2012 A1
20120324018 Metcalf et al. Dec 2012 A1
20130006759 Srivastava et al. Jan 2013 A1
20130024757 Doll et al. Jan 2013 A1
20130036364 Johnson Feb 2013 A1
20130045753 Obermeyer et al. Feb 2013 A1
20130050260 Reitan Feb 2013 A1
20130055083 Fino Feb 2013 A1
20130057587 Leonard et al. Mar 2013 A1
20130059607 Herz et al. Mar 2013 A1
20130060690 Oskolkov et al. Mar 2013 A1
20130063369 Malhotra et al. Mar 2013 A1
20130067027 Song et al. Mar 2013 A1
20130071093 Hanks et al. Mar 2013 A1
20130080254 Thramann Mar 2013 A1
20130085790 Palmer et al. Apr 2013 A1
20130086072 Peng et al. Apr 2013 A1
20130090171 Holton et al. Apr 2013 A1
20130095857 Garcia et al. Apr 2013 A1
20130104053 Thornton et al. Apr 2013 A1
20130110885 Brundrett, III May 2013 A1
20130111514 Slavin et al. May 2013 A1
20130128059 Kristensson May 2013 A1
20130129252 Lauper May 2013 A1
20130132477 Bosworth et al. May 2013 A1
20130145286 Feng et al. Jun 2013 A1
20130159110 Rajaram et al. Jun 2013 A1
20130159919 Leydon Jun 2013 A1
20130169822 Zhu et al. Jul 2013 A1
20130173729 Starenky et al. Jul 2013 A1
20130182133 Tanabe Jul 2013 A1
20130185131 Sinha et al. Jul 2013 A1
20130191198 Carlson et al. Jul 2013 A1
20130194301 Robbins et al. Aug 2013 A1
20130198176 Kim Aug 2013 A1
20130218965 Abrol et al. Aug 2013 A1
20130218968 Mcevilly et al. Aug 2013 A1
20130222323 Mckenzie Aug 2013 A1
20130227476 Frey Aug 2013 A1
20130232194 Knapp et al. Sep 2013 A1
20130263031 Oshiro et al. Oct 2013 A1
20130265450 Barnes, Jr. Oct 2013 A1
20130267253 Case et al. Oct 2013 A1
20130275505 Gauglitz et al. Oct 2013 A1
20130290443 Collins et al. Oct 2013 A1
20130304646 De Geer Nov 2013 A1
20130311255 Cummins et al. Nov 2013 A1
20130325964 Berberat Dec 2013 A1
20130344896 Kirmse et al. Dec 2013 A1
20130346869 Asver et al. Dec 2013 A1
20130346877 Borovoy et al. Dec 2013 A1
20140006129 Heath Jan 2014 A1
20140011538 Mulcahy et al. Jan 2014 A1
20140019264 Wachman et al. Jan 2014 A1
20140032682 Prado et al. Jan 2014 A1
20140043204 Basnayake et al. Feb 2014 A1
20140045530 Gordon et al. Feb 2014 A1
20140047016 Rao Feb 2014 A1
20140047045 Baldwin et al. Feb 2014 A1
20140047335 Lewis et al. Feb 2014 A1
20140049652 Moon et al. Feb 2014 A1
20140052485 Shidfar Feb 2014 A1
20140052633 Gandhi Feb 2014 A1
20140057660 Wager Feb 2014 A1
20140082651 Sharifi Mar 2014 A1
20140092130 Anderson et al. Apr 2014 A1
20140096029 Schultz Apr 2014 A1
20140101568 Skyrm et al. Apr 2014 A1
20140114565 Aziz et al. Apr 2014 A1
20140122658 Haeger et al. May 2014 A1
20140122787 Shalvi et al. May 2014 A1
20140129953 Spiegel May 2014 A1
20140143143 Fasoli et al. May 2014 A1
20140149519 Redfern et al. May 2014 A1
20140155102 Cooper et al. Jun 2014 A1
20140173424 Hogeg et al. Jun 2014 A1
20140173457 Wang et al. Jun 2014 A1
20140189592 Benchenaa et al. Jul 2014 A1
20140207679 Cho Jul 2014 A1
20140214471 Schreiner, III Jul 2014 A1
20140222564 Kranendonk et al. Aug 2014 A1
20140258405 Perkin Sep 2014 A1
20140265359 Cheng et al. Sep 2014 A1
20140266703 Dalley, Jr. et al. Sep 2014 A1
20140279061 Elimeliah et al. Sep 2014 A1
20140279436 Dorsey et al. Sep 2014 A1
20140279540 Jackson Sep 2014 A1
20140280537 Pridmore et al. Sep 2014 A1
20140282096 Rubinstein et al. Sep 2014 A1
20140287779 O'keefe et al. Sep 2014 A1
20140289833 Briceno Sep 2014 A1
20140306986 Gottesman et al. Oct 2014 A1
20140317302 Naik Oct 2014 A1
20140324627 Haver et al. Oct 2014 A1
20140324629 Jacobs Oct 2014 A1
20140325383 Brown et al. Oct 2014 A1
20150020086 Chen et al. Jan 2015 A1
20150046278 Pei et al. Feb 2015 A1
20150071619 Brough Mar 2015 A1
20150087263 Branscomb et al. Mar 2015 A1
20150088622 Ganschow et al. Mar 2015 A1
20150095020 Leydon Apr 2015 A1
20150096042 Mizrachi Apr 2015 A1
20150116529 Wu et al. Apr 2015 A1
20150169827 Laborde Jun 2015 A1
20150172534 Miyakawa et al. Jun 2015 A1
20150178260 Brunson Jun 2015 A1
20150222814 Li et al. Aug 2015 A1
20150261917 Smith Sep 2015 A1
20150312184 Langholz et al. Oct 2015 A1
20150350136 Flynn, III et al. Dec 2015 A1
20150365795 Allen et al. Dec 2015 A1
20150378502 Hu et al. Dec 2015 A1
20160006927 Sehn Jan 2016 A1
20160014063 Hogeg et al. Jan 2016 A1
20160085773 Chang et al. Mar 2016 A1
20160085863 Allen et al. Mar 2016 A1
20160099901 Allen et al. Apr 2016 A1
20160180887 Sehn Jun 2016 A1
20160182422 Sehn et al. Jun 2016 A1
20160182875 Sehn Jun 2016 A1
20160239248 Sehn Aug 2016 A1
20160277419 Allen et al. Sep 2016 A1
20160321708 Sehn Nov 2016 A1
20170006094 Abou Mahmoud et al. Jan 2017 A1
20170061308 Chen et al. Mar 2017 A1
20170287006 Azmoodeh et al. Oct 2017 A1
Foreign Referenced Citations (32)
Number Date Country
2887596 Jul 2015 CA
1146753 Oct 2001 EP
2051480 Apr 2009 EP
2151797 Feb 2010 EP
2399928 Sep 2004 GB
19990073076 Oct 1999 KR
20010078417 Aug 2001 KR
WO-1996024213 Aug 1996 WO
WO-1999063453 Dec 1999 WO
WO-2000058882 Oct 2000 WO
WO-2001029642 Apr 2001 WO
WO-2001050703 Jul 2001 WO
WO-2006118755 Nov 2006 WO
WO-2007092668 Aug 2007 WO
WO-2009043020 Apr 2009 WO
WO-2011040821 Apr 2011 WO
WO-2011119407 Sep 2011 WO
WO-2013008238 Jan 2013 WO
WO-2013045753 Apr 2013 WO
WO-2014006129 Jan 2014 WO
WO-2014068573 May 2014 WO
WO-2014115136 Jul 2014 WO
WO-2014194262 Dec 2014 WO
WO-2015192026 Dec 2015 WO
WO-2016044424 Mar 2016 WO
WO-2016054562 Apr 2016 WO
WO-2016065131 Apr 2016 WO
WO-2016100318 Jun 2016 WO
WO-2016100318 Jun 2016 WO
WO-2016100342 Jun 2016 WO
WO-2016149594 Sep 2016 WO
WO-2016179166 Nov 2016 WO
Non-Patent Literature Citations (71)
Entry
“U.S. Appl. No. 14/047,662, Final Office Action dated Sep. 24, 2015”, 10 pgs.
“U.S. Appl. No. 14/047,662, Non Final Office Action dated Jun. 11, 2015”, 11 pgs.
“U.S. Appl. No. 14/047,662, Notice of Allowance dated Nov. 17, 2016”, 10 pgs.
“U.S. Appl. No. 14/047,662, Preliminary Amendment filed Oct. 9, 2013”, 11 pgs.
“U.S. Appl. No. 14/047,662, Response filed May 13, 2016 to Final Office Action dated Sep. 24, 2015”, 2 pgs.
“U.S. Appl. No. 14/047,662, Response filed Sep. 11, 2015 to Non Final Office Action dated Jun. 11, 2015”, 12 pgs.
“U.S. Appl. No. 15/396,117, Final Office Action dated Jul. 14, 2017”, 27 pgs.
“U.S. Appl. No. 15/396,117, Non Final Office Action dated Nov. 17, 2017”, 29 pgs.
“U.S. Appl. No. 15/396,117, Notice of Allowance dated Jun. 28, 2018”, 10 pgs.
“U.S. Appl. No. 15/396,117, Response filed Feb. 19, 2018 to Non Final Office Action dated Nov. 17, 2017”, 20 pgs.
“U.S. Appl. No. 15/396,117, Response filed Apr. 24, 2017 to Non Final Office Action dated Feb. 28, 2017”, 22 pgs.
“U.S. Appl. No. 15/396,133, Non Final Office Action dated Mar. 21, 2017”, 17 pgs.
“U.S. Appl. No. 15/396,133, Examiner Interview Summary dated Jun. 27, 2017”, 5 pgs.
“U.S. Appl. No. 15/396,133, Examiner Interview Summary dated Aug. 15, 2018”, 3 pgs.
“U.S. Appl. No. 15/396,133, Final Office Action dated May 15, 2018”, 19 pgs.
“U.S. Appl. No. 15/396,133, Final Office Action dated Jul. 31, 2017”, 21 pgs.
“U.S. Appl. No. 15/396,133, Non Final Office Action dated Nov. 2, 2018”, 12 pgs.
“U.S. Appl. No. 15/396,133, Non Final Office Action dated Dec. 22, 2017”, 19 pgs.
“U.S. Appl. No. 15/396,133, Notice of Allowance dated Jul. 23, 2019”, 12 pgs.
“U.S. Appl. No. 15/396,133, Response filed Jun. 21, 2017 to Non Final Office Action dated Mar. 21, 2017”, 12 pgs.
“U.S. Appl. No. 15/396,133, Response Filed Aug. 15, 2018 to Final Office Action dated May 15, 2018”, 16 pgs.
“U.S. Appl. No. 15/396,133, Response filed Oct. 31, 2017 to Final Office Action dated Jul. 31, 2017”, 13 pgs.
“U.S. Appl. No. 15/396,133, Response filed May 1, 2019 to Non Final Office Action dated Nov. 2, 2018”, 14 pgs.
“U.S. Appl. No. 15/396,133, Response filed Mar. 22, 2018 to Non Final Office Action dated Dec. 22, 2017”, 14 pgs.
“Flickr Hacks”, Bausch. O'Reilly Media Inc., Chapters 1, 6 and 12, (2006), 32 pgs.
“Geotagging with Zonetag and Bluetooth GPS”, [Online] Retrieved from the Internet: <http://plasticbag.org/archives/2006/geotagging_with_zonet>, (2006), 7 pgs.
Leyden, John, “This SMS will self-destruct in 40 seconds”, [Online] Retrieved from the Internet: <URL: http://www.theregister.co.uk/2005/12/12/stealthtext/>, (Dec. 12, 2005), 1 pg.
Mieszkowski, Katherine, “The Friendster of photo sites”, [Online] Retrieved from the Internet: <http://www.salon.com/2004/12/20/flickr/>, (2004), 12 pgs.
“A Whole New Story”, Snap, Inc., [Online] Retrieved from the Internet: <URL: https://www.snap.com/en-US/news/>, (2017), 13 pgs.
“Adding photos to your listing”, eBay, [Online] Retrieved from the Internet: <URL: http://pages.ebay.com/help/sell/pictures.html>, (accessed May 24, 2017), 4 pgs.
“U.S. Appl. No. 11/750,211, 312 Amendment filed Jun. 25, 2013”, 10 pgs.
“U.S. Appl. No. 11/750,211, Advisory Action dated Jan. 11, 2010”, 4 pgs.
“U.S. Appl. No. 11/750,211, Advisory Action dated May 31, 2012”, 3 pgs.
“U.S. Appl. No. 11/750,211, Final Office Action dated Mar. 13, 2012”, 14 pgs.
“U.S. Appl. No. 11/750,211, Final Office Action dated Oct. 23, 2009”, 13 pgs.
“U.S. Appl. No. 11/750,211, Final Office Action dated Oct. 28, 2010”, 16 pgs.
“U.S. Appl. No. 11/750,211, Non Final Office Action dated May 7, 2009”, 10 pgs.
“U.S. Appl. No. 11/750,211, Non Final Office Action dated May 12, 2010”, 15 pgs.
“U.S. Appl. No. 11/750,211, Non Final Office Action dated Oct. 6, 2011”, 16 pgs.
“U.S. Appl. No. 11/750,211, Notice of Allowance dated Jun. 7, 2013”, 18 pgs.
“U.S. Appl. No. 11/750,211, Response filed Jan. 4, 2012 to Non Final Office Action dated Oct. 6, 2011”, 12 pgs.
“U.S. Appl. No. 11/750,211, Response filed Mar. 28, 2011 to Final Office Action dated Oct. 28, 2010”, 10 pgs.
“U.S. Appl. No. 11/750,211, Response filed May 14, 2012 to Final Office Action dated Mar. 13, 2012”, 12 pgs.
“U.S. Appl. No. 11/750,211, Response filed Aug. 7, 2009 to Non Final Office Action dated May 7, 2009”, 11 pgs.
“U.S. Appl. No. 11/750,211, Response filed Aug. 12, 2010 to Non Final Office Action dated May 12, 2010”, 10 pgs.
“U.S. Appl. No. 11/750,211, Response filed Dec. 23, 2009 to Final Office Action dated Oct. 23, 2009”, 11 pgs.
“U.S. Appl. No. 15/396,117, Non Final Office Action dated Feb. 28, 2017”, 21 pgs.
“BlogStomp”, StompSoftware, [Online] Retrieved from the Internet: <URL: http://stompsoftware.com/blogstomp>, (accessed May 24, 2017), 12 pgs.
“Cup Magic Starbucks Holiday Red Cups come to life with AR app”, Blast Radius, [Online] Retrieved from the Internet: <URL: https://web.archive.org/web/20160711202454/http://www.blastradius.com/work/cup-magic>, (2016), 7 pgs.
“Daily App: InstaPlace (iOS/Android): Give Pictures a Sense of Place”, TechPP, [Online] Retrieved from the Internet: <URL: http://techpp.com/2013/02/15/instaplace-app-review>, (2013), 13 pgs.
“InstaPlace Photo App Tell the Whole Story”, [Online] Retrieved from the Internet: <URL: https://youtu.be/uF_gFkg1hBM>, (Nov. 8, 2013), 113 pgs., 1:02 min.
“International Application Serial No. PCT/US2015/037251, International Search Report dated Sep. 29, 2015”, 2 pgs.
“Introducing Snapchat Stories”, [Online] Retrieved from the Internet: <URL: https://web.archive.org/web/20131026084921/https://www.youtube.com/watch?v=88Cu3yN-LIM>, (Oct. 3, 2013), 92 pgs.; 00:47 min.
“Macy's Believe-o-Magic”, [Online] Retrieved from the Internet: <URL: https://web.archive.org/web/20190422101854/https://www.youtube.com/watch?v=xvzRXy3J0Z0&feature=youtu.be>, (Nov. 7, 2011), 102 pgs.; 00:51 min.
“Macy's Introduces Augmented Reality Experience in Stores across Country as Part of Its 2011 Believe Campaign”, Business Wire, [Online] Retrieved from the Internet: <URL: https://www.businesswire.com/news/home/20111102006759/en/Macys-Introduces-Augmented-Reality-Experience-Stores-Country>, (Nov. 2, 2011), 6 pgs.
“Starbucks Cup Magic”, [Online] Retrieved from the Internet: <URL: https://www.youtube.com/watch?v=RWwQXi9RG0w>, (Nov. 8, 2011), 87 pgs.; 00:47 min.
“Starbucks Cup Magic for Valentine's Day”, [Online] Retrieved from the Internet: <URL: https://www.youtube.com/watch?v=8nvqOzjq10w>, (Feb. 6, 2012), 88 pgs.; 00:45 min.
“Starbucks Holiday Red Cups Come to Life, Signaling the Return of the Merriest Season”, Business Wire, [Online] Retrieved from the Internet: <URL: http://www.businesswire.com/news/home/20111115005744/en/2479513/Starbucks-Holiday-Red-Cups-Life-Signaling-Return>, (Nov. 15, 2011), 5 pgs.
Carthy, Roi, “Dear All Photo Apps: Mobli Just Won Filters”, TechCrunch, [Online] Retrieved from the Internet: <URL: https://techcrunch.com/2011/09/08/mobli-filters>, (Sep. 8, 2011), 10 pgs.
Janthong, Isaranu, “Instaplace ready on Android Google Play store”, Android App Review Thailand, [Online] Retrieved from the Internet: <URL: http://www.android-free-app-review.com/2013/01/instaplace-android-google-play-store.html>, (Jan. 23, 2013), 9 pgs.
Macleod, Duncan, “Macys Believe-o-Magic App”, [Online] Retrieved from the Internet: <URL: http://theinspirationroom.com/daily/2011/macys-believe-o-magic-app>, (Nov. 14, 2011), 10 pgs.
Macleod, Duncan, “Starbucks Cup Magic Lets Merry”, [Online] Retrieved from the Internet: <URL: http://theinspirationroom.com/daily/2011/starbucks-cup-magic>, (Nov. 12, 2011), 8 pgs.
Notopoulos, Katie, “A Guide to the New Snapchat Filters and Big Fonts”, [Online] Retrieved from the Internet: <URL: https://www.buzzfeed.com/katienotopoulos/a-guide-to-the-new-snapchat-filters-and-big-fonts?utm_term=.bkQ9qVZWe#.nv58YXpkV>, (Dec. 22, 2013), 13 pgs.
Panzarino, Matthew, “Snapchat Adds Filters, A Replay Function and for Whatever Reason, Time, Temperature and Speed Overlays”, TechCrunch, [Online] Retrieved form the Internet: <URL: https://techcrunch.com/2013/12/20/snapchat-adds-filters-new-font-and-for-some-reason-time-temperature-and-speed-overlays/>, (Dec. 20, 2013), 12 pgs.
Roccetti, Marco, “Networking Issues in Multimedia Entertainment”, (2004), 3 pgs.
Tripathi, Rohit, “Watermark Images in PHP and Save File on Server”, [Online] Retrieved from the Internet: <URL: http://code.rohitink.com/2012/12/28/watermark-images-in-php-and-save-file-on-server>, (Dec. 28, 2012), 4 pgs.
U.S. Appl. No. 11/750,211/U.S. Pat. No. 8,554,868, filed May 17, 2007, Simultaneous Sharing Communication Interface.
U.S. Appl. No. 14/047,662/U.S. Pat. No. 9,608,949, filed Oct. 7, 2013, Simultaneous Sharing Communication Interface.
U.S. Appl. No. 15/396,117/U.S. Pat. No. 10,135,765, filed Dec. 30, 2016, Real-Time Display of Multiple Annotated Images.
U.S. Appl. No. 15/396,133/U.S. Pat. No. 10,491,659, filed Dec. 30, 2016, Real-Time Display of Multiple Images (as amended).
Provisional Applications (1)
Number Date Country
60883760 Jan 2007 US
Continuations (3)
Number Date Country
Parent 15396133 Dec 2016 US
Child 16669130 US
Parent 14047662 Oct 2013 US
Child 15396133 US
Parent 11750211 May 2007 US
Child 14047662 US