User interface for proximity-based teleconference transfer

Abstract
A method for transferring a teleconference between a mobile device and a conference center. The method includes identifying, by a mobile device, a candidate meeting center system for transfer of a teleconference conducted on the mobile device, and in response to identifying the candidate meeting center system, generating a user interface (UI) to provide one or more user selectable icons, the user selectable icons configured to facilitate transfer of the teleconference from the mobile device to the candidate meeting center system.
Description
BACKGROUND
1. Technical Field

The disclosed technology relates to methods and systems for providing a user interface to facilitate transfer of a teleconference between a meeting center system and a mobile device.


2. Introduction

With the increasing ubiquity of network connectivity, as well as improvements in data speeds, IP-based teleconferencing has become very popular. Due to the multiuser nature of teleconference calls, it is not uncommon for one or more users to leave and/or join an ongoing teleconference. In conventional meeting center systems, in order to transfer a call from a mobile device to the meeting center system, a joining user needs to end any ongoing calls on his/her device and dial into the teleconference using the meeting system hardware. Similarly, a user departing a teleconference conducted at the meeting center system would need to separately dial in to the teleconference using his/her device in order to maintain connectivity.





BRIEF DESCRIPTION OF THE DRAWINGS

Certain features of the subject technology are set forth in the appended claims. However, the accompanying drawings, which are included to provide further understanding, illustrate disclosed aspects and together with the description serve to explain the principles of the subject technology. In the drawings:



FIGS. 1A and 1B illustrate an example of a graphical user interface (UI) used to provide user selectable options for initiating a teleconference transfer, according to some aspects of the technology.



FIGS. 2A-2C illustrate examples of various phases of a UI display provided during the process of initiating a teleconference transfer.



FIGS. 3A-3C illustrate examples of various phases of a UI display provided during the process of performing a teleconference transfer, including display of a call destination icon representing the destination teleconference meeting center.



FIGS. 3D and 3E illustrate an example of a UI display provided during the completion of a teleconference transfer process.



FIGS. 4A and 4B illustrate examples of a UI display after the completion of a teleconference transfer, e.g., from a mobile device to a conference center system.



FIGS. 4C and 4D illustrate examples of various UI displays that can be provided to facilitate control of the transferred teleconference on the meeting center system.



FIGS. 5A and 5B illustrate additional examples of UI displays to provide control options for an ongoing teleconference, including options for transferring the teleconference back to the mobile device.



FIGS. 5C-5E illustrate examples of a UI at various stages of a teleconference transfer from a conference center system to a mobile device.



FIGS. 6A and 6B illustrate examples of a UI provided at the completion of a teleconference transfer from a conference center system to a mobile device.



FIGS. 7A-7C illustrate examples of a UI display provided after a teleconference has been successfully transferred from a conference center system to a mobile device.



FIG. 8 illustrates steps of an example method for transferring a teleconference between a conference system and a mobile device.



FIG. 9 illustrates an example of an electronic system with which some aspects of the subject technology can be implemented.



FIGS. 10A and 10B illustrate example system embodiments.





DESCRIPTION OF EXAMPLE EMBODIMENTS

The detailed description set forth below is intended as a description of various configurations of the subject technology and is not intended to represent the only configurations in which the subject technology can be practiced. The appended drawings are incorporated herein and constitute a part of the detailed description. The detailed description includes specific details for the purpose of providing a more thorough understanding of the subject technology. However, it will be clear and apparent that the subject technology is not limited to the specific details set forth herein and may be practiced without these details. In some instances, structures and components are shown in block diagram form in order to avoid obscuring the concepts of the subject technology.


Overview

Aspects of the disclosed technology relate to systems and methods for transferring a teleconference between a mobile device and a conference center (e.g., a meeting center system). Steps performed to implement some methods of the technology can include operations for identifying, by a mobile device, a candidate meeting center system for transfer of a teleconference conducted on the mobile device, and in response to identifying the candidate meeting center system, generating a user interface (UI) to provide one or more user selectable icons, the user selectable icons configured to facilitate transfer of the teleconference from the mobile device to the candidate meeting center system. Systems and computer readable media are also provided.


Description

Aspects of the disclosed technology address various limitations of conventional meeting center systems by providing a user interface (UI) for conveniently transitioning teleconference connectivity between devices, such as a meeting center system and a mobile device. As used herein, “meeting center,” “meeting center system” and “conference center” can refer to one or more hardware and/or software systems implemented for conducting a teleconference in a particular meeting location, such as an office, conference room, or classroom, etc. Additionally, as used herein, a user's “mobile device” can refer to any of a variety of portable electronic devices that can be configured for transferring teleconference operation to and/or from the meeting center system. By way of non-limiting example, a mobile device can include any one of: a smart phone, a personal digital assistant (PDA), a tablet computing device, a smart-watch device, or the like.


In one aspect, a UI is provided that automatically and intuitively directs a user to merge/transfer a telephone call (or video conference), depending on whether the user is arriving at, or departing from, a meeting center system location.


In some implementations, the UI can provide one or more prompts to the user in response to detecting that the user's device is located in close proximity to the meeting center system. Although determinations of proximity (e.g., between a mobile user device and the meeting center system) can be implementation specific, in some aspects, determinations can be made using a sound signal, such as a sonar signal. For example, a sonar signal emitted by the meeting center can be detected at the mobile device, verifying the mobile device's proximity. In this manner, the meeting center system and/or the user's mobile device can infer that the user is near the meeting center system and therefore likely to desire the transfer of teleconference management between devices. It is understood that other methods for determining device proximity can be implemented, without departing from the scope of the technology.
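
Although the disclosure does not prescribe a particular detection algorithm, a single-frequency sound beacon of this kind is commonly detected with the Goertzel algorithm. The following minimal Kotlin sketch illustrates the idea under stated assumptions: a hypothetical 19 kHz beacon, 16-bit PCM captured at 44.1 kHz, and a purely illustrative energy threshold.

import kotlin.math.PI
import kotlin.math.cos

// Goertzel filter: measures signal energy at one target frequency.
// Assumptions (not from this disclosure): the meeting center emits a
// steady 19 kHz beacon, and the device captures 16-bit PCM at 44.1 kHz.
fun goertzelPower(samples: ShortArray, sampleRateHz: Double, targetHz: Double): Double {
    val k = Math.round(samples.size * targetHz / sampleRateHz).toInt()
    val coeff = 2.0 * cos(2.0 * PI * k / samples.size)
    var sPrev = 0.0
    var sPrev2 = 0.0
    for (sample in samples) {
        val s = sample + coeff * sPrev - sPrev2
        sPrev2 = sPrev
        sPrev = s
    }
    return sPrev * sPrev + sPrev2 * sPrev2 - coeff * sPrev * sPrev2
}

// The device could infer proximity when beacon energy exceeds a
// calibrated threshold; the threshold below is purely illustrative.
fun beaconDetected(pcmWindow: ShortArray): Boolean =
    goertzelPower(pcmWindow, sampleRateHz = 44_100.0, targetHz = 19_000.0) > 1.0e10

In practice, the threshold would be calibrated against microphone sensitivity and ambient noise, and detection would typically be debounced over several consecutive audio windows before the UI prompt is shown.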


As discussed in further detail below, a detected proximity between the user's device and the meeting center system can trigger the display of various user selectable buttons and/or messages to intuitively guide the user through the teleconference transfer process. For example, upon identifying a proximately located meeting center system, a UI can be provided on a screen of the user's device, e.g., to indicate actions that can be performed to conduct the teleconference transfer. Once the required actions are performed (e.g., through user interaction with touch-screen selectable options), the UI can display the conference destination, i.e., an image of the destination meeting center system. As discussed in further detail below, other user controls can also be displayed, for example, that provide options for managing the teleconference on the meeting system (e.g., from the mobile device). Such options may include, but are not limited to, graphical icons configured to terminate or mute the teleconference.


In another aspect, the technology can provide a UI configured to facilitate the moving of a teleconference from a meeting center system onto (or back to) a user's mobile device. User prompts for transitioning the call to the mobile device can be provided in response to detected changes in mobile device proximity to the meeting center system, for example, when it is determined that the mobile device is moving away from a location of the meeting center.


In some aspects, user prompts can be provided to continue (e.g., copy) the teleconference to the user's mobile device. Additionally, graphical prompts can be provided that enable the user to terminate the call at the meeting center location, i.e., to move the call to the user's mobile device. Various aspects of a graphical user interface (GUI) will now be described with respect to FIGS. 1-10, discussed below.



FIGS. 1A and 1B illustrate examples of graphical UI displays (e.g., UI display 100A and UI display 100B) that can be used to provide user selectable options for managing a teleconference transfer, e.g., from the mobile device to a meeting center system. FIG. 1A illustrates an example UI display 100A that is provided during an ongoing teleconference, for example, in response to the detection of a proximately located meeting center system. As discussed in further detail below, determinations of proximity between a mobile device and a meeting center system (e.g., which is a candidate for receiving a teleconference transfer) can be based on signaling provided between the meeting center system and the mobile device, such as the receipt of a sonar signal at the mobile device.


It is understood that various UI displays, such as in the examples provided by UI display 100A, and UI display 100B, can be provided on various types of mobile devices. Such devices can include, but are not limited to, one or more of: smart phone devices, tablet computers, smart watch devices, notebook computers, and/or game consoles, etc. Additionally, the UI displays discussed herein can be provided on a variety of display screen types, such as, capacitive touchscreens, organic light-emitting diode (OLED) displays, and the like.


Additionally, in the examples provided by FIG. 1A, and FIG. 1B, UI displays 100A, and 100B, each provide an image of a teleconference user, e.g., with whom a user of the mobile device is corresponding. However, it is understood that the UI displays can provide other types of images and/or video graphics concurrent with the teleconference session, without departing from the scope of the technology. For example, an image, icon or emoji associated with the corresponding teleconference participant may be displayed by a UI on the mobile device for the duration of the teleconference.


In FIG. 1A, UI display 100A includes a call move option icon 102A that is provided at a bottom portion of UI display 100A. In this example, display 100A is provided on a touchscreen, such as a capacitive touchscreen of a smartphone device, configured to receive user input via a user's touch-engagement with the touchscreen surface. As such, call move option icon 102A is selectable through user engagement (touch) with an area of the capacitive touchscreen surface corresponding with the display of call move option icon 102A.


Call move option icon 102A can include information, such as text, that can be useful in providing user guidance for interaction with the call move option icon. In the example of FIG. 1A, call move option icon 102A contains a user engagement instruction, including text for directing the user to “drag up to move call.” It is understood that call move option icon 102A can include additional and/or different user engagement instructions, depending on the desired implementation. Additionally, as discussed in further examples below, user engagement instructions can be provided independently from a call move option icon, and located in other areas within the UI display.
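
As a concrete illustration of the "drag up to move call" interaction, the following framework-agnostic Kotlin sketch recognizes an upward drag exceeding a minimum travel distance; the TouchEvent type, the 80-pixel threshold, and the callback name are illustrative assumptions rather than elements of this disclosure.

// Framework-agnostic sketch of the "drag up to move call" interaction.
// TouchEvent and the 80 px threshold are illustrative, not from the patent.
data class TouchEvent(val action: Action, val x: Float, val y: Float) {
    enum class Action { DOWN, MOVE, UP }
}

class DragUpRecognizer(
    private val thresholdPx: Float = 80f,          // minimum upward travel
    private val onMoveCall: () -> Unit             // starts the transfer flow
) {
    private var startY: Float? = null

    fun onTouch(event: TouchEvent) {
        when (event.action) {
            TouchEvent.Action.DOWN -> startY = event.y
            TouchEvent.Action.MOVE -> Unit         // could animate the icon here
            TouchEvent.Action.UP -> {
                val origin = startY ?: return
                // Screen coordinates grow downward, so an upward drag
                // ends at a smaller y value than where it started.
                if (origin - event.y >= thresholdPx) onMoveCall()
                startY = null
            }
        }
    }
}

A host application would forward its platform touch events into onTouch and, from the onMoveCall callback, present the next stage of the transfer UI (e.g., the call destination icon discussed below).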



FIG. 1B illustrates another example of a UI display 100B that includes call move option icon 102B. In the example provided by UI display 100B, call move option icon 102B includes a user engagement image, e.g., that provides useful hints as to how a user can select the displayed call move option icon 102B. In this example, the user engagement image comprises an up-arrow, indicating that user engagement with the call move option icon in an upward direction, with respect to the UI display orientation, is needed to initiate the teleconference transfer.


In some approaches, the displayed UI can alternate between different display states. For example, during an ongoing teleconference and upon detection of a proximately located candidate meeting center system, the mobile device can cause the UI display to alternate between UI display 100A and UI display 100B. That is, the call move option icon can alternate between call move option icon 102A (providing textual user instructions), and call move option icon 102B (providing graphical user instructions). The dynamic nature of the UI display can serve to further instruct the user about how to transfer the teleconference.
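
One possible realization of this alternating behavior is a simple timer that toggles between the two hint states; the two-second cadence and the names below are assumptions made for illustration.

// Alternates the call move hint between a textual instruction (as in
// UI display 100A) and a graphical arrow (as in UI display 100B).
enum class HintState { TEXT_INSTRUCTION, ARROW_IMAGE }

class AlternatingHint(private val render: (HintState) -> Unit) {
    @Volatile private var running = false

    fun start(periodMillis: Long = 2_000) {
        running = true
        Thread {
            var state = HintState.TEXT_INSTRUCTION
            while (running) {
                render(state)                     // repaint the icon; a real UI
                                                  // would post to its main thread
                state = if (state == HintState.TEXT_INSTRUCTION)
                    HintState.ARROW_IMAGE else HintState.TEXT_INSTRUCTION
                Thread.sleep(periodMillis)
            }
        }.start()
    }

    fun stop() { running = false }
}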



FIGS. 2A-2C illustrate various phases of an example UI display provided during the process of initiating a teleconference transfer. In particular, FIG. 2A illustrates an example UI display 200A that includes user engagement icon 202A at a bottom portion of UI display 200A. UI display 200A also provides a user engagement instruction 201 (e.g., the text “drag up to move call”) in a top portion of the display, as well as a video window 203A, for example, that provides a video or image representation of the party with whom the teleconference is being conducted. As discussed above, user engagement instructions can be provided at various locations within the user interface, including as part of, or within, a user engagement icon.



FIGS. 2B and 2C illustrate example UI displays (200B, 200C) that represent the progression of display changes after user engagement with UI display 200A has begun. As illustrated by the various UI displays (200A-C), the graphics provided by the user interface (including user engagement instruction 201 (not illustrated), user engagement icon 202B, and video window 203B) can begin to move in a manner consistent with the direction of a user's engagement with a touchscreen on which the displays are provided. In the illustrated examples, the video window (203B, 203C), user engagement icon (202B, 202C), and the user engagement instruction move in an upward direction concurrent with the user's progressive engagement with user engagement icons 202A, 202B, and 202C.



FIGS. 3A-3C illustrate examples of phases of UI displays (300A, 300B, and 300C, respectively) provided during the process of performing a teleconference transfer. Displays 300A-C variously include call move option icons (305A, 305B, 305C) and call destination icons (307A-C). UI displays 300A and 300B also include partial displays of video windows (309A and 309B, respectively), which provide a video feed for the teleconference being conducted on the mobile device before the transfer is completed.



FIGS. 3B and 3C illustrate UI displays (300B, 300C) depicting the progression of the call transfer process, e.g., to a conference center destination entitled “Batcave.” As illustrated by UI display 300B, the teleconference transfer process is performed as the user slides call move option icon 305B toward call destination icon 307B. UI display 300C of FIG. 3C provides an example of the graphical depiction provided when the teleconference transfer is completed. For example, UI display 300C includes user engagement instruction 311, which delivers user instructions for how to complete the teleconference transfer, i.e., “release to move call.”
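
The completion condition can be modeled as a plain hit test between the dragged icon's bounds and the destination icon's bounds, as in the following geometry-only Kotlin sketch (the Rect type and function names are illustrative).

// Geometry-only sketch: the transfer can complete when the dragged call
// move icon overlaps the call destination icon. Names are illustrative.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun intersects(other: Rect): Boolean =
        left < other.right && other.left < right &&
            top < other.bottom && other.top < bottom
}

fun shouldCompleteTransfer(moveIcon: Rect, destinationIcon: Rect): Boolean =
    moveIcon.intersects(destinationIcon)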



FIGS. 3D and 3E illustrate examples of UI displays (300D, 300E) provided during the completion of a teleconference transfer. FIG. 3D illustrates UI display 300D that provides a graphic indicating a status (e.g., “moving call”) of the teleconference transfer, as well as an indication of the new teleconference location (e.g., “Batcave”). FIG. 3E illustrates an example of the graphical depiction of the teleconference transfer completion process that includes the fading in of a user image or picture, indicating a user (or group of users) with whom the teleconference is being conducted at the transferee location.



FIGS. 4A and 4B illustrate examples of UI displays (400A, 400B) provided after completion of a teleconference transfer. That is, FIGS. 4A/B illustrate graphical displays 400A/B provided on the mobile device that display information relating to the teleconference that has been transferred to the meeting center system. In the example of FIGS. 4A/B, UI displays 400A/B indicate a name 402A/B and graphical icon 404A/B associated with the party with whom the transferred teleconference is conducted (e.g., “Eva Fredriksen”), as well as an image or picture relating to Eva Fredriksen. Additionally, FIG. 4B illustrates a graphical display 400B similar to that of FIG. 4A, with the addition of call duration indicator 406 (e.g., 00:58) indicating a time duration for the transferred teleconference.



FIGS. 4C and 4D variously provide examples of UI displays (400C, 400D) provided on a mobile device that offer additional control options for a transferred teleconference that is actively conducted on a meeting center system (not illustrated). In particular, FIGS. 4C/D illustrate UI displays 400C/D that include a few of the display options discussed above, e.g., a name 402C/D of the teleconference party, a graphical icon of the teleconference party 404C/D, and a teleconference duration indicator 406C/D. However, UI displays 400C/D additionally include a call move option icon 403C/D that provides a user selectable option for transferring the teleconference from the meeting center system back to the mobile device. Similar to the examples discussed above with respect to FIGS. 1-3, call move option icons 403C/D can include a user engagement instruction (e.g., 403C), or a user engagement image (e.g., 403D) that provides user information to guide the user in completing the teleconference transfer.


As further illustrated, UI displays 400C/D also include teleconference controls 405C/D that provide various user selectable options for controlling the teleconference being conducted on the conference center system. In the provided example, teleconference controls 405C/D include options for terminating the teleconference or muting the teleconference. However, it is understood that additional or different teleconference controls can be provided, without departing from the scope of the technology.
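
The disclosure does not specify a control protocol between the mobile device and the meeting center system. Purely as an illustration, the sketch below assumes a hypothetical HTTP endpoint on the meeting center that accepts simple JSON commands; the URL path, payload shape, and command names are invented for this example.

import java.net.HttpURLConnection
import java.net.URL

// Hypothetical control channel: the meeting center is assumed to expose
// an HTTP endpoint accepting JSON commands. Everything here is illustrative.
enum class ConferenceCommand { MUTE, UNMUTE, TERMINATE }

fun sendControl(meetingCenterHost: String, command: ConferenceCommand) {
    val url = URL("http://$meetingCenterHost/conference/control")
    val conn = url.openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.doOutput = true
    conn.setRequestProperty("Content-Type", "application/json")
    conn.outputStream.use { it.write("""{"command":"${command.name}"}""".toByteArray()) }
    val status = conn.responseCode
    conn.disconnect()
    check(status in 200..299) { "control command failed: HTTP $status" }
}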


Additionally, UI displays 400C/D include user selectable management options 407C/D for facilitating management of the teleconference control interface. In particular, management options 407C/D provide options for exiting the teleconference management system, and also provide information identifying the location where the transferred teleconference is actively conducted (i.e., “Batcave”).



FIGS. 5A-5E variously illustrate examples of UI displays (500A-E, respectively) provided throughout the process of transferring a teleconference from a meeting center system to the mobile device. In the example of FIG. 5A, UI display 500A includes a name 502A (e.g., indicating the party with whom the teleconference is conducted), a call transfer option icon 503A (e.g., for moving the teleconference back to the mobile device), a graphical icon 504A (e.g., graphically indicating a party with whom the teleconference is conducted), as well as call control options 505A and management options 507A.


Once the teleconference transfer process has been initiated, for example, through user engagement with call transfer option icon 503A, some of the graphical displays associated with the teleconference are no longer displayed (e.g., graphical icons and call control options). Turning to FIG. 5B, UI display 500B provides management options 507B, as well as destination icon 510B representing a destination location available for the teleconference transfer, i.e., the mobile device. UI display 500B of FIG. 5B illustrates the display changes that occur once a user begins to initiate the teleconference transfer back to the mobile device.



FIGS. 5C-5E show the graphical progression of the transfer process. For example, UI display 500C illustrates call transfer option icon 503C and destination icon 510C. As the user moves the call transfer option icon 503D/E toward destination icon 510D/E, the transfer of the teleconference back to the mobile device is completed.



FIGS. 6A and 6B illustrate examples of a UI display provided at the completion of a teleconference transfer from a conference center system to a mobile device. Similar to the example provided above with respect to FIGS. 3C-E, UI display 600A can include a user engagement instruction (e.g., “Release to copy call”) that instructs the user how to complete the teleconference transfer.



FIG. 6B illustrates an example of a UI display 600B, which provides a status indicator as the conference center to mobile device transfer is completed. That is, UI display 600B includes a status instruction to indicate the transfer progress by displaying: “Moving call.”



FIGS. 7A-7C illustrate examples of UI displays (700A-C) provided after a teleconference has been successfully transferred from a conference center system to a mobile device. As illustrated in the examples of FIGS. 7A-C, each of UI displays 700A-C provides a video window (e.g., 703A-C) with a video feed displaying a party with whom the newly transferred teleconference is conducted. Additionally, in the example of FIG. 7C, UI display 700C provides a set of user selectable teleconference control options 705C, for example, that provide user selectable options for managing various aspects of the teleconference at the previous location, i.e., at the meeting center system. In the illustrated example, teleconference control options 705C include options for terminating the teleconference on the meeting center system.



FIG. 8 illustrates steps of an example method 800 for transferring a teleconference e.g., between a conference system and a mobile device, according to some aspects of the technology. Method 800 begins with step 802 in which a teleconference is actively conducted on a user's mobile device. Depending on implementation, the same teleconference can also be concurrently conducted on the meeting center system (e.g., before the teleconference transfer is initiated).


In step 804, a proximately located meeting center system is identified. Proximity to a candidate meeting center system (i.e., one capable of and authorized to receive the teleconference transfer) can be determined using various methods. As discussed above, proximity between the meeting center system and the mobile device can be determined at the mobile device through receipt of an inaudible sound signal, such as a sonar signal. In other aspects, location information of the meeting center system and the mobile device can be used to identify proximity between devices. For example, geolocation information, e.g., that is obtained using a global positioning system (GPS), can be used to determine device proximity.
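
For the geolocation alternative, proximity can be reduced to a great-circle distance check between the two reported GPS fixes. The haversine formula below is a standard way to compute that distance; the 25-meter radius is an illustrative assumption.

import kotlin.math.asin
import kotlin.math.cos
import kotlin.math.pow
import kotlin.math.sin
import kotlin.math.sqrt

const val EARTH_RADIUS_M = 6_371_000.0

// Great-circle distance between two latitude/longitude fixes, in meters.
fun haversineMeters(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val dLat = Math.toRadians(lat2 - lat1)
    val dLon = Math.toRadians(lon2 - lon1)
    val a = sin(dLat / 2).pow(2) +
        cos(Math.toRadians(lat1)) * cos(Math.toRadians(lat2)) * sin(dLon / 2).pow(2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))
}

// Treat the meeting center as proximate inside an illustrative 25 m radius.
fun isProximate(deviceLat: Double, deviceLon: Double,
                centerLat: Double, centerLon: Double,
                radiusMeters: Double = 25.0): Boolean =
    haversineMeters(deviceLat, deviceLon, centerLat, centerLon) <= radiusMeters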


In step 806, in response to identifying the proximately located meeting center system, a user interface (UI) is provided by the mobile device to provide one or more transfer options to enable a user to transfer (e.g., move or copy) an ongoing teleconference from the mobile device to the meeting center system.


In some aspects, methods of the subject technology also provide ways to facilitate the transfer of a teleconference conducted on the meeting center system to a user's mobile device. Similar to the methods described above, it can be determined (e.g., at the mobile device and/or meeting center system) that the mobile device is leaving a location proximate to the meeting center system. By way of example, determinations that the mobile device is leaving can be made when the mobile device loses contact with a sound signal, such as a sonar signal, emitted by the meeting center system. Further to the above examples, geolocation information (e.g., determined using one or more GPS systems) can be used to identify a departing mobile device.
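
Loss of contact with the beacon can be implemented as a simple watchdog: if no beacon has been heard for some timeout, the device is treated as departing and the return-transfer prompt is shown. The class name and the five-second timeout below are illustrative assumptions.

// Watchdog sketch for detecting departure from the meeting center location.
class DepartureWatchdog(
    private val timeoutMillis: Long = 5_000,       // illustrative timeout
    private val onDeparture: () -> Unit            // shows the transfer prompt
) {
    @Volatile private var lastHeard = System.currentTimeMillis()

    // Call whenever the beacon is detected (e.g., each positive audio window).
    fun beaconHeard() { lastHeard = System.currentTimeMillis() }

    // Call on a periodic timer; fires the departure callback on timeout.
    fun poll() {
        if (System.currentTimeMillis() - lastHeard > timeoutMillis) onDeparture()
    }
}

A host application might call beaconHeard() whenever a detector such as the beaconDetected() sketch above returns true, and call poll() from a periodic timer.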


In response to detecting the departure of a mobile device from the meeting center location, aspects of the subject technology can be used to provide a user interface display (e.g., a UI display), similar to those embodiments discussed above with respect to FIGS. 4C, 4D, and 5A-5E.



FIG. 9 illustrates an example of an electronic system with which some aspects of the subject technology can be implemented. Specifically, FIG. 9 illustrates an example network device 910, which could include, but is not limited to, a mobile device, such as a smart phone, a notebook computer, or a tablet computing device.


Network device 910 includes a master central processing unit (CPU) 962, interfaces 968, and a bus 915 (e.g., a PCI bus). When acting under the control of appropriate software or firmware, the CPU 962 is responsible for executing packet management, error detection, and/or routing functions. The CPU 962 preferably accomplishes all these functions under the control of software including an operating system and any appropriate applications software. CPU 962 can include one or more processors 963, such as a processor from the Motorola family of microprocessors or the MIPS family of microprocessors. In an alternative embodiment, processor 963 is specially designed hardware for controlling the operations of network device 910. In a specific embodiment, a memory 961 (such as non-volatile RAM and/or ROM) also forms part of CPU 962. However, there are many different ways in which memory could be coupled to the system.


The interfaces 968 can be provided as interface cards (sometimes referred to as “line cards”). Generally, they control the sending and receiving of data packets over the network and sometimes support other peripherals used with a router. Among the interfaces that can be provided are Ethernet interfaces, frame relay interfaces, cable interfaces, DSL interfaces, token ring interfaces, and the like. In addition, various very high-speed interfaces can be provided such as fast token ring interfaces, wireless interfaces, Ethernet interfaces, Gigabit Ethernet interfaces, ATM interfaces, HSSI interfaces, POS interfaces, FDDI interfaces and the like. Generally, these interfaces may include ports appropriate for communication with the appropriate media. In some cases, they may also include an independent processor and, in some instances, volatile RAM. The independent processors may control such communications intensive tasks as packet switching, media control and management. By providing separate processors for the communications intensive tasks, these interfaces allow the master microprocessor 962 to efficiently perform routing computations, network diagnostics, security functions, etc.


Although the system shown in FIG. 9 is one specific network device of the present invention, it is by no means the only network device architecture on which the present invention can be implemented. For example, an architecture having a single processor that handles communications as well as routing computations, etc. is often used. Further, other types of interfaces and media could also be used with the router.


Regardless of the network device's configuration, it may employ one or more memories or memory modules (including memory 961) configured to store program instructions for the general-purpose network operations and mechanisms for roaming, route optimization and routing functions described herein. The program instructions may control the operation of an operating system and/or one or more applications, for example. The memory or memories may also be configured to store tables such as mobility binding, registration, and association tables, etc.



FIG. 10A and FIG. 10B illustrate example system embodiments. Those of ordinary skill in the art will also readily appreciate that other system embodiments are possible.



FIG. 10A illustrates a system bus computing system architecture 1000 wherein the components of the system are in electrical communication with each other using a bus 1005. Exemplary system 1000 includes a processing unit (CPU or processor) 1010 and a system bus 1005 that couples various system components including the system memory 1015, such as read only memory (ROM) 1020 and random access memory (RAM) 1025, to the processor 1010. System 1000 can include a cache of high-speed memory connected directly with, in close proximity to, or integrated as part of the processor 1010. The system 1000 can copy data from the memory 1015 and/or the storage device 1030 to the cache 1012 for quick access by the processor 1010. In this way, the cache can provide a performance boost that avoids processor 1010 delays while waiting for data. These and other modules can control or be configured to control the processor 1010 to perform various actions. Other system memory 1015 can be available for use as well. Memory 1015 can include multiple different types of memory with different performance characteristics. The processor 1010 can include any general purpose processor and a hardware module or software module, such as module 1 (1032), module 2 (1034), and module 3 (1036) stored in storage device 1030, configured to control the processor 1010, as well as a special-purpose processor where software instructions are incorporated into the actual processor design. The processor 1010 may essentially be a completely self-contained computing system, containing multiple cores or processors, a bus, memory controller, cache, etc. A multi-core processor can be symmetric or asymmetric.


To enable user interaction with the computing device 1000, an input device 1045 can represent any number of input mechanisms, such as a microphone for speech, a touch-sensitive screen for gesture or graphical input, keyboard, mouse, motion input, speech and so forth. An output device 1035 can also be one or more of a number of output mechanisms known to those of skill in the art. In some instances, multimodal systems can enable a user to provide multiple types of input to communicate with the computing device 1000. The communications interface 1040 can generally govern and manage the user input and system output. There is no restriction on operating on any particular hardware arrangement and therefore the basic features here may easily be substituted for improved hardware or firmware arrangements as they are developed.


Storage device 1030 is a non-volatile memory and can be a hard disk or other types of computer readable media which can store data that are accessible by a computer, such as magnetic cassettes, flash memory cards, solid state memory devices, digital versatile disks, cartridges, random access memories (RAMs) 1025, read only memory (ROM) 1020, and hybrids thereof.


The storage device 1030 can include software modules 1032, 1034, 1036 for controlling the processor 1010. Other hardware or software modules are contemplated. The storage device 1030 can be connected to the system bus 1005. In one aspect, a hardware module that performs a particular function can include the software component stored in a computer-readable medium in connection with the necessary hardware components, such as the processor 1010, bus 1005, display 1035, and so forth, to carry out the function.



FIG. 10B illustrates an example computer system 1050 having a chipset architecture that can be used in executing the described method and generating and displaying a graphical user interface (GUI). Computer system 1050 is an example of computer hardware, software, and firmware that can be used to implement the disclosed technology. System 1050 can include a processor 1055, representative of any number of physically and/or logically distinct resources capable of executing software, firmware, and hardware configured to perform identified computations. Processor 1055 can communicate with a chipset 1060 that can control input to and output from processor 1055. In this example, chipset 1060 outputs information to output device 1065, such as a display, and can read and write information to storage device 1070, which can include magnetic media, and solid state media, for example. Chipset 1060 can also read data from and write data to RAM 1075. A bridge 1080 for interfacing with a variety of user interface components 1085 can be provided for interfacing with chipset 1060. Such user interface components 1085 can include a keyboard, a microphone, touch detection and processing circuitry, a pointing device, such as a mouse, and so on. In general, inputs to system 1050 can come from any of a variety of sources, machine generated and/or human generated.


Chipset 1060 can also interface with one or more communication interfaces 1090 that can have different physical interfaces. Such communication interfaces can include interfaces for wired and wireless local area networks, for broadband wireless networks, as well as personal area networks. Some applications of the methods for generating, displaying, and using the GUI disclosed herein can include receiving ordered datasets over the physical interface or be generated by the machine itself by processor 1055 analyzing data stored in storage 1070 or 1075. Further, the machine can receive inputs from a user via user interface components 1085 and execute appropriate functions, such as browsing functions by interpreting these inputs using processor 1055.


It can be appreciated that example systems 1000 and 1050 can have more than one processor 1010 or be part of a group or cluster of computing devices networked together to provide greater processing capability.


For clarity of explanation, in some instances the present technology may be presented as including individual functional blocks including functional blocks comprising devices, device components, steps or routines in a method embodied in software, or combinations of hardware and software.


In some embodiments the computer-readable storage devices, mediums, and memories can include a cable or wireless signal containing a bit stream and the like. However, when mentioned, non-transitory computer-readable storage media expressly exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.


Methods according to the above-described examples can be implemented using computer-executable instructions that are stored or otherwise available from computer readable media. Such instructions can comprise, for example, instructions and data which cause or otherwise configure a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. Portions of computer resources used can be accessible over a network. The computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, firmware, or source code. Examples of computer-readable media that may be used to store instructions, information used, and/or information created during methods according to described examples include magnetic or optical disks, flash memory, USB devices provided with non-volatile memory, networked storage devices, and so on.


Devices implementing methods according to these disclosures can comprise hardware, firmware and/or software, and can take any of a variety of form factors. Typical examples of such form factors include laptops, smart phones, small form factor personal computers, personal digital assistants, rackmount devices, standalone devices, and so on. Functionality described herein also can be embodied in peripherals or add-in cards. Such functionality can also be implemented on a circuit board among different chips or different processes executing in a single device, by way of further example.


The instructions, media for conveying such instructions, computing resources for executing them, and other structures for supporting such computing resources are means for providing the functions described in these disclosures.


Although a variety of examples and other information was used to explain aspects within the scope of the appended claims, no limitation of the claims should be implied based on particular features or arrangements in such examples, as one of ordinary skill would be able to use these examples to derive a wide variety of implementations. Further and although some subject matter may have been described in language specific to examples of structural features and/or method steps, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to these described features or acts. For example, such functionality can be distributed differently or performed in components other than those identified herein. Rather, the described features and steps are disclosed as examples of components of systems and methods within the scope of the appended claims. Moreover, claim language reciting “at least one of” a set indicates that one member of the set or multiple members of the set satisfy the claim.


It should be understood that features or configurations herein with reference to one embodiment or example can be implemented in, or combined with, other embodiments or examples herein. That is, terms such as “embodiment”, “variation”, “aspect”, “example”, “configuration”, “implementation”, “case”, and any other terms which may connote an embodiment, as used herein to describe specific features or configurations, are not intended to limit any of the associated features or configurations to a specific or separate embodiment or embodiments, and should not be interpreted to suggest that such features or configurations cannot be combined with features or configurations described with reference to other embodiments, variations, aspects, examples, configurations, implementations, cases, and so forth. In other words, features described herein with reference to a specific example (e.g., embodiment, variation, aspect, configuration, implementation, case, etc.) can be combined with features described with reference to another example. Precisely, one of ordinary skill in the art will readily recognize that the various embodiments or examples described herein, and their associated features, can be combined with each other.


A phrase such as an “aspect” does not imply that such aspect is essential to the subject technology or that such aspect applies to all configurations of the subject technology. A disclosure relating to an aspect may apply to all configurations, or one or more configurations. A phrase such as an aspect may refer to one or more aspects and vice versa. A phrase such as a “configuration” does not imply that such configuration is essential to the subject technology or that such configuration applies to all configurations of the subject technology. A disclosure relating to a configuration may apply to all configurations, or one or more configurations. A phrase such as a configuration may refer to one or more configurations and vice versa. The word “exemplary” is used herein to mean “serving as an example or illustration.” Any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs.

Claims
  • 1. A computer-implemented method comprising: identifying, by a mobile device, a candidate meeting center system for transfer of a teleconference conducted on the mobile device; in response to identifying the candidate meeting center system, generating a user interface (UI) to provide one or more user selectable icons, the user selectable icons configured to facilitate transfer of the teleconference from the mobile device to the candidate meeting center system; and in response to transfer of the teleconference from the mobile device to the candidate meeting center system, generating a user interface (UI) to provide one or more user selectable icons, the user selectable icons configured to provide remote control options at the mobile device to control aspects of the teleconference hosted on the candidate meeting center system; wherein the remote control options include at least volume control of the teleconference, termination of the teleconference, and return transfer of the teleconference from the candidate meeting center system to the mobile device; wherein the user selectable icons comprise a move call option and a call destination icon displayed on a touch screen of the mobile device, and wherein the UI is configured to initiate a transfer of the teleconference from the mobile device to the candidate meeting center system in response to a user's movement of the move call option in a direction of the call destination icon.
  • 2. The computer-implemented method of claim 1, wherein identifying the candidate meeting center system further comprises: receiving a proximity signal from the meeting center system, wherein the proximity signal is configured to provide confirmation that the meeting center system is proximately located to the mobile device.
  • 3. The computer-implemented method of claim 2, wherein the proximity signal comprises an audible signal.
  • 4. The computer-implemented method of claim 1, wherein the user selectable icons comprise a call move option displayed on a touch screen of the mobile device, and wherein the call move option is configured for selection via a user's engagement with the call move option in an upward direction with respect to the touch screen of the mobile device.
  • 5. The computer-implemented method of claim 1, wherein the user selectable icons comprise a call move option displayed on a touch screen of the mobile device, and wherein the call move option comprises a user engagement instruction that indicates a user interaction with the touch screen to initiate the transfer of the teleconference from the mobile device to the candidate meeting center system.
  • 6. The computer-implemented method of claim 1, wherein the user selectable icons comprise a call move option displayed on a touch screen of the mobile device, and wherein the call move option comprises a user engagement image that indicates a user interaction with the touch screen to initiate the transfer of the teleconference from the mobile device to the candidate meeting center system.
  • 7. A system for facilitating transfer of a teleconference from a mobile device to a candidate meeting center system, the system comprising: one or more processors; and a computer-readable medium comprising instructions stored therein, which when executed by the one or more processors, cause the one or more processors to perform operations comprising: identifying, by a mobile device, a candidate meeting center system for transfer of a teleconference conducted on the mobile device; in response to identifying the candidate meeting center system, generating a user interface (UI) to provide one or more user selectable icons, the user selectable icons configured to facilitate transfer of the teleconference from the mobile device to the candidate meeting center system; and in response to transfer of the teleconference from the mobile device to the candidate meeting center system, generating a user interface (UI) to provide one or more user selectable icons, the user selectable icons configured to provide remote control options at the mobile device to control aspects of the teleconference hosted on the candidate meeting center system; wherein the remote control options include at least volume control of the teleconference, termination of the teleconference, and return transfer of the teleconference from the candidate meeting center system to the mobile device; wherein the user selectable icons comprise a move call option and a call destination icon displayed on a touch screen of the mobile device, and wherein the UI is configured to initiate a transfer of the teleconference from the mobile device to the candidate meeting center system in response to a user's movement of the move call option in a direction of the call destination icon.
  • 8. The system of claim 7, wherein identifying the candidate meeting center system further comprises operations for: receiving a proximity signal from the meeting center system, wherein the proximity signal is configured to provide confirmation that the meeting center system is proximately located to the mobile device.
  • 9. The system of claim 8, wherein the proximity signal comprises an audible signal.
  • 10. The system of claim 7, wherein the user selectable icons comprise a call move option displayed on a touch screen of the mobile device, and wherein the call move option is configured for selection via a user's engagement with the call move option in an upward direction with respect to the touch screen of the mobile device.
  • 11. The system of claim 7, wherein the user selectable icons comprise a call move option displayed on a touch screen of the mobile device, and wherein the call move option comprises a user engagement instruction that indicates a user interaction with the touch screen to initiate the transfer of the teleconference from the mobile device to the candidate meeting center system.
  • 12. The system of claim 7, wherein the user selectable icons comprise a call move option displayed on a touch screen of the mobile device, and wherein the call move option comprises a user engagement image that indicates a user interaction with the touch screen to initiate the transfer of the teleconference from the mobile device to the candidate meeting center system.
  • 13. A non-transitory computer-readable storage medium comprising instructions stored therein, which when executed by one or more processors, cause the one or more processors to perform operations comprising: identifying, by a mobile device, a candidate meeting center system for transfer of a teleconference conducted on the mobile device; in response to identifying the candidate meeting center system, generating a user interface (UI) to provide one or more user selectable icons, the user selectable icons configured to facilitate transfer of the teleconference from the mobile device to the candidate meeting center system; and in response to transfer of the teleconference from the mobile device to the candidate meeting center system, generating a user interface (UI) to provide one or more user selectable icons, the user selectable icons configured to provide remote control options at the mobile device to control aspects of the teleconference hosted on the candidate meeting center system; wherein the remote control options include at least volume control of the teleconference, termination of the teleconference, and return transfer of the teleconference from the candidate meeting center system to the mobile device; wherein the user selectable icons comprise a move call option and a call destination icon displayed on a touch screen of the mobile device, and wherein the UI is configured to initiate a transfer of the teleconference from the mobile device to the candidate meeting center system in response to a user's movement of the move call option in a direction of the call destination icon.
  • 14. The non-transitory computer-readable storage medium of claim 13, wherein identifying the candidate meeting center system further comprises operations for: receiving a proximity signal from the meeting center system, wherein the proximity signal is configured to provide confirmation that the meeting center system is proximately located to the mobile device.
  • 15. The non-transitory computer-readable storage medium of claim 14, wherein the proximity signal comprises an audible signal.
  • 16. The non-transitory computer-readable storage medium of claim 13, wherein the user selectable icons comprise a call move option displayed on a touch screen of the mobile device, and wherein the call move option is configured for selection via a user's engagement with the call move option in an upward direction with respect to the touch screen of the mobile device.
  • 17. The non-transitory computer-readable storage medium of claim 13, wherein the user selectable icons comprise a call move option displayed on a touch screen of the mobile device, and wherein the call move option comprises a user engagement instruction that indicates a user interaction with the touch screen to initiate the transfer of the teleconference from the mobile device to the candidate meeting center system.
20130002801 Mock Jan 2013 A1
20130027425 Yuan Jan 2013 A1
20130029648 Soundrapandian Jan 2013 A1
20130038675 Malik Feb 2013 A1
20130047093 Reuschel et al. Feb 2013 A1
20130050398 Krans et al. Feb 2013 A1
20130055112 Joseph et al. Feb 2013 A1
20130061054 Niccolai Mar 2013 A1
20130063542 Bhat et al. Mar 2013 A1
20130086633 Schultz Apr 2013 A1
20130090065 Fisunenko et al. Apr 2013 A1
20130091205 Kotler et al. Apr 2013 A1
20130091440 Kotler et al. Apr 2013 A1
20130094647 Mauro et al. Apr 2013 A1
20130106976 Chu et al. May 2013 A1
20130106977 Chu May 2013 A1
20130113602 Gilbertson et al. May 2013 A1
20130113827 Forutanpour et al. May 2013 A1
20130120522 Lian et al. May 2013 A1
20130124551 Foo May 2013 A1
20130129252 Lauper et al. May 2013 A1
20130135837 Kemppinen May 2013 A1
20130141371 Hallford et al. Jun 2013 A1
20130148789 Hillier et al. Jun 2013 A1
20130157636 Ryan Jun 2013 A1
20130182063 Jaiswal et al. Jul 2013 A1
20130185672 McCormick et al. Jul 2013 A1
20130198629 Tandon et al. Aug 2013 A1
20130210496 Zakarias et al. Aug 2013 A1
20130211826 Mannby Aug 2013 A1
20130212202 Lee Aug 2013 A1
20130212287 Chappelle Aug 2013 A1
20130215215 Gage et al. Aug 2013 A1
20130219278 Rosenberg Aug 2013 A1
20130222246 Booms et al. Aug 2013 A1
20130225080 Doss et al. Aug 2013 A1
20130227433 Doray et al. Aug 2013 A1
20130235866 Tian et al. Sep 2013 A1
20130242030 Kato et al. Sep 2013 A1
20130243213 Moquin Sep 2013 A1
20130252669 Nhiayi Sep 2013 A1
20130263020 Heiferman et al. Oct 2013 A1
20130290421 Benson et al. Oct 2013 A1
20130297704 Alberth, Jr. et al. Nov 2013 A1
20130300637 Smits et al. Nov 2013 A1
20130325970 Roberts et al. Dec 2013 A1
20130329865 Ristock et al. Dec 2013 A1
20130335507 Aarrestad et al. Dec 2013 A1
20130342637 Felkai Dec 2013 A1
20140012990 Ko Jan 2014 A1
20140028781 MacDonald Jan 2014 A1
20140040404 Pujare et al. Feb 2014 A1
20140040819 Duffy Feb 2014 A1
20140063174 Junuzovic et al. Mar 2014 A1
20140068452 Joseph et al. Mar 2014 A1
20140068670 Timmermann et al. Mar 2014 A1
20140078182 Utsunomiya Mar 2014 A1
20140108486 Borzycki et al. Apr 2014 A1
20140111597 Anderson et al. Apr 2014 A1
20140136630 Siegel et al. May 2014 A1
20140157338 Pearce Jun 2014 A1
20140161243 Contreras et al. Jun 2014 A1
20140195557 Oztaskent et al. Jul 2014 A1
20140198175 Shaffer et al. Jul 2014 A1
20140237371 Klemm et al. Aug 2014 A1
20140253671 Bentley et al. Sep 2014 A1
20140280595 Mani et al. Sep 2014 A1
20140282213 Musa et al. Sep 2014 A1
20140282888 Brooksby et al. Sep 2014 A1
20140296112 O'Driscoll et al. Oct 2014 A1
20140298210 Park et al. Oct 2014 A1
20140317561 Robinson et al. Oct 2014 A1
20140337840 Hyde et al. Nov 2014 A1
20140358264 Long et al. Dec 2014 A1
20140372908 Kashi et al. Dec 2014 A1
20150004571 Ironside et al. Jan 2015 A1
20150009278 Modai et al. Jan 2015 A1
20150029301 Nakatomi et al. Jan 2015 A1
20150067552 Leorin et al. Mar 2015 A1
20150070835 Mclean Mar 2015 A1
20150074189 Cox et al. Mar 2015 A1
20150081885 Thomas et al. Mar 2015 A1
20150082350 Ogasawara et al. Mar 2015 A1
20150085060 Fish et al. Mar 2015 A1
20150088575 Asli et al. Mar 2015 A1
20150089393 Zhang et al. Mar 2015 A1
20150089394 Chen et al. Mar 2015 A1
20150109399 Kuscher Apr 2015 A1
20150113050 Stahl Apr 2015 A1
20150113369 Chan et al. Apr 2015 A1
20150128068 Kim May 2015 A1
20150163455 Brady Jun 2015 A1
20150172120 Dwarampudi et al. Jun 2015 A1
20150178626 Pielot et al. Jun 2015 A1
20150215365 Shaffer et al. Jul 2015 A1
20150254760 Pepper Sep 2015 A1
20150288774 Larabie-Belanger Oct 2015 A1
20150301691 Qin Oct 2015 A1
20150304120 Xiao et al. Oct 2015 A1
20150304366 Bader-Natal et al. Oct 2015 A1
20150319113 Gunderson et al. Nov 2015 A1
20150350126 Xue Dec 2015 A1
20150350267 Cutler et al. Dec 2015 A1
20150350448 Coffman Dec 2015 A1
20150373063 Vashishtha et al. Dec 2015 A1
20150373414 Kinoshita Dec 2015 A1
20160037304 Dunkin et al. Feb 2016 A1
20160043986 Ronkainen Feb 2016 A1
20160044159 Wolff et al. Feb 2016 A1
20160044380 Barrett Feb 2016 A1
20160050079 Martin De Nicolas et al. Feb 2016 A1
20160050160 Li et al. Feb 2016 A1
20160050175 Chaudhry et al. Feb 2016 A1
20160070758 Thomson et al. Mar 2016 A1
20160071056 Ellison et al. Mar 2016 A1
20160072862 Bader-Natal et al. Mar 2016 A1
20160094593 Priya Mar 2016 A1
20160105345 Kim et al. Apr 2016 A1
20160110056 Hong et al. Apr 2016 A1
20160165056 Bargetzi et al. Jun 2016 A1
20160173537 Kumar et al. Jun 2016 A1
20160182580 Nayak Jun 2016 A1
20160266609 McCracken Sep 2016 A1
20160269411 Malachi Sep 2016 A1
20160277461 Sun et al. Sep 2016 A1
20160283909 Adiga Sep 2016 A1
20160307165 Grodum et al. Oct 2016 A1
20160309037 Rosenberg et al. Oct 2016 A1
20160321347 Zhou et al. Nov 2016 A1
20170006162 Bargetzi et al. Jan 2017 A1
20170006446 Harris et al. Jan 2017 A1
20170070706 Ursin et al. Mar 2017 A1
20170093874 Uthe Mar 2017 A1
20170104961 Pan et al. Apr 2017 A1
20170171260 Jerrard-Dunne et al. Jun 2017 A1
20170324850 Snyder et al. Nov 2017 A1
Foreign Referenced Citations (15)
Number Date Country
101055561 Oct 2007 CN
101076060 Nov 2007 CN
102572370 Jul 2012 CN
102655583 Sep 2012 CN
101729528 Nov 2012 CN
102938834 Feb 2013 CN
103141086 Jun 2013 CN
204331453 May 2015 CN
3843033 Sep 1991 DE
959585 Nov 1999 EP
2773131 Sep 2014 EP
WO 9855903 Dec 1998 WO
WO 2008139269 Nov 2008 WO
WO 2012167262 Dec 2012 WO
WO 2014118736 Aug 2014 WO
Non-Patent Literature Citations (38)
Author Unknown, “A Primer on the H.323 Series Standard,” Version 2.0, available at http://www.packetizer.com/voip/h323/papers/primer/, retrieved on Dec. 20, 2006, 17 pages.
Author Unknown, “Active screen follows mouse and dual monitors,” KDE Community Forums, Apr. 13, 2010, 3 pages.
Author Unknown, “Implementing Media Gateway Control Protocols,” A RADVision White Paper, Jan. 27, 2002, 16 pages.
Averusa, “Interactive Video Conferencing K-12 applications,” copyright 2012, http://www.averusa.com/education/downloads/hvc brochure goved.pdf (last accessed Oct. 11, 2013).
Cisco Systems, Inc., “Cisco WebEx Meetings Server System Requirements Release 1.5,” Aug. 14, 2013, 30 pages.
Cisco White Paper, “Web Conferencing: Unleash the Power of Secure, Real-Time Collaboration,” pp. 1-8, 2014.
Clarke, Brant, “Polycom Announces RealPresence Group Series,” Oct. 8, 2012, available at http://www.323.tv/news/polycom-realpresence-group-series (last accessed Oct. 11, 2013).
Clauser, Grant, et al., “Is the Google Home the voice-controlled speaker for you?,” The Wire Cutter, Nov. 22, 2016, pp. 1-15.
Cole, Camille, et al., “Videoconferencing for K-12 Classrooms,” Second Edition (excerpt), http://www.iste.org/docs/excerpts/VIDCO2-excerpt.pdf (last accessed Oct. 11, 2013), 2009.
Epson, “BrightLink Pro Projector,” http://www.epson.com/cgi-bin/Store/jsp/Landing/brightlink-pro-interactive-projectors.do?ref=van brightlink-pro, dated 2013 (last accessed Oct. 11, 2013).
InFocus, “Mondopad,” http://www.infocus.com/sites/default/files/InFocus-Mondopad-INF5520a-INF7021-Datasheet-EN.pdf, 2013 (last accessed Oct. 11, 2013).
MacCormick, John, “Video Chat with Multiple Cameras,” CSCW '13, Proceedings of the 2013 conference on Computer supported cooperative work companion, pp. 195-198, ACM, New York, NY, USA, 2013.
Microsoft, “Positioning Objects on Multiple Display Monitors,” Aug. 12, 2012, 2 pages.
Mullins, Robert, “Polycom Adds Tablet Videoconferencing,” available at http://www.informationweek.com/telecom/unified-communications/polycom-adds-tablet-videoconferencing/231900630, dated Oct. 12, 2011 (last accessed Oct. 11, 2013).
Nu-Star Technologies, “Interactive Whiteboard Conferencing,” http://www.nu-star.com/interactive-conf.php, dated 2013 (last accessed Oct. 11, 2013).
Polycom, “Polycom RealPresence Mobile: Mobile Telepresence & Video Conferencing,” http://www.polycom.com/products-services/hd-telepresence-video-conferencing/realpresence-mobile.html#stab1 (last accessed Oct. 11, 2013), 2013.
Polycom, “Polycom Turns Video Display Screens into Virtual Whiteboards with First Integrated Whiteboard Solution for Video Collaboration,” http://www.polycom.com/company/news/press-releases/2011/20111027 2.html, dated Oct. 27, 2011.
Polycom, “Polycom UC Board, Transforming ordinary surfaces into virtual whiteboards,” 2012, Polycom, Inc., San Jose, CA, http://www.uatg.com/pdf/polycom/polycom-uc-board-datasheet.pdf (last accessed Oct. 11, 2013).
Stodle, Daniel, et al., “Gesture-Based, Touch-Free Multi-User Gaming on Wall-Sized, High-Resolution Tiled Displays,” 2008, 13 pages.
Thompson, Phil, et al., “Agent Based Ontology Driven Virtual Meeting Assistant,” Future Generation Information Technology, Springer Berlin Heidelberg, 2010, 4 pages.
TNO, “Multi-Touch Interaction Overview,” Dec. 1, 2009, 12 pages.
Toga, James, et al., “Demystifying Multimedia Conferencing Over the Internet Using the H.323 Set of Standards,” Intel Technology Journal Q2, 1998, 11 pages.
Ubuntu, “Force Unity to open new window on the screen where the cursor is?” Sep. 16, 2013, 1 page.
VB Forums, “Pointapi,” Aug. 8, 2001, 3 pages.
Vidyo, “VidyoPanorama,” http://www.vidyo.com/products/vidyopanorama/, dated 2013 (last accessed Oct. 11, 2013).
Choi, Jae Young, et al., “Towards an Automatic Face Indexing System for Actor-based Video Services in an IPTV Environment,” IEEE Transactions on Consumer Electronics 56, No. 1 (2010): 147-155.
Cisco Systems, Inc., “Cisco WebEx: WebEx Meeting Center User Guide for Hosts, Presenters, and Participants,” © 1997-2013, pp. 1-394 plus table of contents.
Cisco Systems, Inc., “Cisco Webex Meetings for iPad and iPhone Release Notes,” Version 5.0, Oct. 2013, 5 pages.
Cisco Systems, Inc., “Cisco Unified Personal Communicator 8.5,” 2011, 9 pages.
Eichen, Elliot, et al., “Smartphone Docking Stations and Strongly Converged VoIP Clients for Fixed-Mobile Convergence,” IEEE Wireless Communications and Networking Conference: Services, Applications and Business, 2012, pp. 3140-3144.
Grothaus, Michael, “How Interactive Product Placements Could Save Television,” Jul. 25, 2013, 4 pages.
Hannigan, Nancy Kruse, et al., “The IBM Lotus Sametime V8 Family: Extending the IBM Unified Communications and Collaboration Strategy,” 2007, available at http://www.ibm.com/developerworks/lotus/library/sametime8-new/, 10 pages.
Hirschmann, Kenny, “TWIDDLA: Smarter Than the Average Whiteboard,” Apr. 17, 2014, 2 pages.
Nyamgondalu, Nagendra, “Lotus Notes Calendar and Scheduling Explained!” IBM, Oct. 18, 2004, 10 pages.
Schreiber, Danny, “The Missing Guide for Google Hangout Video Calls,” Jun. 5, 2014, 6 pages.
Shervington, Martin, “Complete Guide to Google Hangouts for Businesses and Individuals,” Mar. 20, 2014, 15 pages.
Shi, Saiqi, et al., “Notification That a Mobile Meeting Attendee is Driving,” May 20, 2013, 13 pages.
International Search Report and Written Opinion from the International Searching Authority, dated Jul. 19, 2018, 14 pages, for corresponding International Patent Application No. PCT/US18/27087.
Related Publications (1)
Number Date Country
20180292972 A1 Oct 2018 US