This application is related to U.S. patent application Ser. No. 12/770,991, filed Apr. 30, 2010, entitled “Method and Apparatus for Two-Way Multimedia Communications,” which is incorporated by reference herein in its entirety.
This specification relates generally to two-way multimedia communications, and more particularly to methods and apparatus for enabling an individual to participate in a meeting from a remote location.
Modern telecommunications technologies enable people to conduct meetings without being physically present at the same location. It has become commonplace for individuals at different locations to use telephone conferencing and/or video communications technologies to conduct business meetings, conference calls, and other forms of interaction. However, existing communication systems used to conduct such meetings typically employ only a speakerphone and perhaps one or more computer-based audio/video platforms. Existing systems do not provide meeting participants with a simulated experience of being in the presence of the other participants.
In accordance with an embodiment, a method for displaying image data is provided. First image data, representing a first view of a second location captured by a camera system, is displayed at a first location. The first view is associated with a first orientation relative to the second location. A first angular displacement associated with a motion of a chair disposed at the first location is determined. Information representing the first angular displacement is transmitted to the camera system. Second image data representing a second view of the second location associated with a second orientation is displayed at the first location, the second orientation having a relationship to the first orientation based on the first angular displacement.
In one embodiment, motion data representing a motion of a chair is received from a sensor attached to the chair. In another embodiment, a motion of the chair is detected by a sensor mounted on the chair. The sensor may comprise one of a magnetometer and a compass sensor.
In another embodiment, information representing the first angular displacement is transmitted, by a device located at the first location, to the camera system. The device may be one of: a personal computer, a laptop computer, a cell phone, a wireless device, a personal digital assistant, and a television. The camera system may comprise a video camera.
In another embodiment, the device causes the video camera to turn from the first orientation to the second orientation, based on the determined first angular displacement. The step of displaying, at a first location, first image data representing a first view may further include generating, by a camera system disposed at the second location, first image data representing the first view of the second location, transmitting the first image data from the second location to the first location, and displaying the first image data on the device located at the first location.
These and other advantages of the present disclosure will be apparent to those of ordinary skill in the art by reference to the following Detailed Description and the accompanying drawings.
In accordance with an embodiment, a communication device (referred to herein as a “surrogate head device”) functions as a surrogate for an individual, enabling the individual to attend a meeting from a remote location. The surrogate head device is placed at a first location where a meeting is being conducted. The surrogate head device comprises a camera and microphones which capture images and sounds from the conference room; the images and sounds are transmitted to the remote location for viewing by the remote participant. The surrogate head device also comprises a display device which displays video images of the remote participant, and one or more speakers which convey voice signals received from the remote participant. Two-way communications are therefore conducted through the exchange of images and sounds between the first location and the remote participant.
The surrogate head device is supported by a support structure that allows the device to rotate to the right and to the left about a substantially vertical axis, enabling the viewing area of the camera to pan to the right or to the left, and to tilt up and down about a substantially horizontal axis, enabling the viewing area of the camera to tilt up or down.
The remote participant utilizes a remote control device to control the surrogate head device. The remote control device may be linked to the surrogate head device via a network, such as the Internet. The remote control device includes a camera to capture video images of the remote participant, and one or more microphones to record his or her voice. The video images and voice signals are transmitted to the surrogate head device. The remote control device also comprises a display screen that enables the remote participant to view images of the meeting captured by the camera on the surrogate head device, and one or more audio speakers that enable the remote participant to hear voices and other sounds detected by the microphones on the surrogate head device. The audio speakers may be two speakers in a set of headphones worn by the remote participant, for example. The remote control device also includes one or more control devices, such as a computer mouse and/or a keypad, with which the remote participant controls the movement of the surrogate head device remotely. For example, the remote participant may cause the surrogate head device to rotate to the right or left, or to tilt up or down, by rolling a computer mouse to the right or to the left, or forward or backward. The remote participant's ability to rotate the surrogate head device to the right or left, or to tilt the device up and down, enables the remote participant to achieve and maintain eye contact with a person present at the meeting.
In one embodiment, the surrogate head device comprises two microphones situated in a manner to approximate the perception of sounds by a human. The sounds detected by the two microphones are mapped to two speakers used by the remote participant, generating for the remote participant a simulation of being present at the meeting. For example, when a person seated at the meeting to the right of the surrogate head device speaks, the sounds detected by the two microphones are mapped to the remote participant's two headphone speakers and cause the remote participant to perceive a voice coming from his or her right side.
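As a purely illustrative sketch (not part of the disclosed apparatus), the following Python fragment shows one way the two microphone channels could be routed to the remote participant's two headphone channels so that left/right cues are preserved; the function name and array layout are assumptions made for illustration.

```python
import numpy as np

def map_microphones_to_headphones(left_mic_samples, right_mic_samples):
    """Interleave the surrogate head's two microphone signals into a
    stereo buffer for the remote participant's headphones.

    Both inputs are 1-D arrays of audio samples covering the same time
    window.
    """
    # The left-side microphone feeds the left headphone channel and the
    # right-side microphone feeds the right channel, so a voice to the
    # right of the surrogate head is heard on the listener's right.
    return np.stack([left_mic_samples, right_mic_samples], axis=1)
```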
The remote participant may control the movement of the surrogate head device based on the sounds generated by the two speakers in the headphones. For example, when the remote participant perceives a voice coming from his or her right side, the remote participant may cause the surrogate head device to rotate to the right in order to view the speaker at the meeting.
Base portion 174 supports head portion 172 and comprises a platform 190, a pan base 155, and a tilt base 150. In particular, head portion 172 is supported by tilt base 150, which comprises two vertical portions 150-A and 150-B disposed on pan base 155, and two horizontal support rods 126 attached to head portion 172. Support rods 126 define a horizontal axis 106 between vertical portions 150, and are configured to rotate about horizontal axis 106, causing head portion 172 to rotate about horizontal axis 106. Pan base 155 is disposed on platform 190 and is configured to rotate about a substantially vertical axis 108, causing head portion 172 to rotate about vertical axis 108. The capability of tilt base 150 and pan base 155 to rotate about two axes enables head portion 172 to rotate in order to face in a desired direction.
Display device 110 may comprise a liquid crystal display (“LCD”). In other embodiments, display device 110 may comprise another type of display device. Audio speakers 120-A and 120-B may comprise any type of audio device capable of reproducing voice signals and other sounds. Camera 130 may comprise any type of camera capable of capturing images and generating corresponding image data for transmission to a remote participant.
Microphones 140-A and 140-B may comprise any type of device capable of detecting sounds and generating corresponding audio signals for transmission to a remote participant. In one embodiment of the invention, two microphones 140-A and 140-B are situated on surrogate head device 100 at a distance that approximates the distance between the ears on a human's head, in order to receive audio signals in a manner substantially consistent with the reception of audio signals by a human's ears. Because microphones 140-A and 140-B are attached to head portion 172 of surrogate head device 100, the remote participant may maintain an accurate sense of audio direction because the microphones are always at the same position relative to camera 130. In other embodiments, surrogate head device 100 may be configured differently than as shown in
Network 205 may comprise one or more of a number of different types of networks, such as, for example, an intranet, a local area network (LAN), a wide area network (WAN), an internet, a Fibre Channel-based storage area network (SAN), or Ethernet. Other networks may be used. Alternatively, network 205 may comprise a combination of different types of networks. In some embodiments, surrogate head device 100 may be linked to remote control device 230 via a direct connection.
Remote control device 230 is operated by an individual at a location remote from conference room 215. Remote control device 230 conveys, to the remote participant, audio and video signals received from surrogate head device 100, and transmits audio and video signals to surrogate head device 100. Remote control device 230 also transmits to surrogate head device 100 control signals received from the remote participant. In this manner, the remote participant may employ remote control device 230 to control surrogate head device 100 remotely.
By selective placement within conference room 215, surrogate head device 100 may enable the remote participant to receive audio and video signals from conference room 215 in a manner that simulates the sensation of being physically present in conference room 215.
While the exemplary embodiment discussed herein describes a meeting held in a conference room, the systems, apparatus and methods described herein may be used to enable an individual to attend other types of meetings held in other places, from a remote location.
Surrogate head device 100 also comprises a processor 462, an interface 464, and a memory 466. Processor 462 controls various operations of surrogate head device 100 by executing computer program instructions which define such operations. The computer program instructions may be stored in a non-transitory computer readable medium such as a random access memory (RAM), one or more disk drives, one or more optical disks, one or more tape drives, etc. Processor 462 may comprise hardware, software, or a combination of hardware and software. For example, in one embodiment, processor 462 comprises operating system software controlled by hardware, such as a central processing unit (CPU).
Interface 464 provides a communication gateway through which data may be transmitted between components of surrogate head device 100 and network 205. For example, interface 464 transmits to remote control device 230, via network 205, audio signals received by microphones 140-A and 140-B and video signals received by camera 130. Interface 464 receives audio signals and video signals from remote control device 230, via network 205, and transmits the audio and video signals to speakers 120-A and 120-B, and to display device 110, respectively. Interface 464 also receives control signals from remote control device 230 and transmits the control signals to control module 457. In various embodiments, interface 464 may be implemented using a number of different mechanisms, such as one or more enterprise systems connection cards, modems, or network interfaces. Other types of interfaces may be used.
Memory 466 is accessed by processor 462 and/or other components of surrogate head device 100 to store various types of information. Memory 466 may comprise any one or more of a variety of different types of non-transitory computer readable media, such as random access memory (RAM), one or more disk drives, one or more optical disks, one or more tape drives, etc. Other types of memory devices may be used.
In one embodiment, pan base 155 may comprise one or more electromechanical components such as servos, motors, control circuitry, gears, etc., configured to enable pan base 155 to move in response to control signals. Pan base 155 may also comprise one or more microprocessors and memory devices to facilitate its operation. In other embodiments, other mechanisms may be used to control the movements of pan base 155.
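By way of a hedged example only, the sketch below shows how a commanded pan angle might be converted to a servo pulse width; the 1.0-2.0 ms pulse range and plus-or-minus 90 degree travel are assumed, typical hobby-servo values rather than parameters of pan base 155.

```python
def pan_angle_to_pulse_width_us(angle_degrees, min_us=1000.0, max_us=2000.0,
                                min_angle=-90.0, max_angle=90.0):
    """Convert a commanded pan angle into a servo pulse width (microseconds).

    Assumes a hobby-style servo whose full travel corresponds to a
    1.0-2.0 ms pulse; real hardware may use different endpoints.
    """
    # Clamp the request to the servo's mechanical range.
    angle = max(min_angle, min(max_angle, angle_degrees))
    # Linearly interpolate between the pulse-width endpoints.
    fraction = (angle - min_angle) / (max_angle - min_angle)
    return min_us + fraction * (max_us - min_us)
```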
In one embodiment, tilt base 150 may comprise one or more electromechanical components such as servos, motors, control circuitry, gears, etc., configured to enable tilt base 150 to move in response to control signals. Tilt base 150 may also comprise one or more microprocessors and memory devices to facilitate its operation. In other embodiments, other mechanisms may be used to control the movements of tilt base 150.
Surrogate head device 100 also comprises a control module 457. Control module 457 receives control signals from remote control device 230 (shown in
Control module 457 may comprise a software program that includes multiple modules or subroutines providing respective services or functions, for example. In other embodiments, control module 457 may comprise multiple software programs. In alternative embodiments, control module 457 may comprise hardware, or a combination of hardware and software. Control module 457 may comprise a non-transitory computer readable medium, such as a magnetic disk, magnetic tape, or optical disk, that includes instructions in the form of computer code operable to perform various functions. In some embodiments, some or all of control module 457 may comprise instructions in the form of computer code that are stored in memory 466.
In other embodiments, surrogate head device 100 may comprise other components (software or hardware) in addition to those discussed herein.
Referring again to
Display device 568 may comprise a liquid crystal display (“LCD”). In other embodiments, display device 568 may comprise another type of display device. Audio speakers 566 may comprise any type of audio device capable of reproducing voice signals and other audio signals that may be received from surrogate head device 100. Camera 562 may comprise any type of camera capable of capturing images and generating corresponding video data for transmission to surrogate head device 100. Microphone 564 may comprise any type of device capable of detecting sounds and generating corresponding audio data for transmission to surrogate head device 100.
Remote control device 230 also comprises a processor 610, an interface 620, and a memory 630. Processor 610 controls various operations of remote control device 230 by executing computer program instructions which define such operations. The computer program instructions may be stored in a non-transitory computer readable medium such as a random access memory (RAM), one or more disk drives, one or more optical disks, one or more tape drives, etc. Processor 610 may comprise hardware, software, or a combination of hardware and software. For example, in one embodiment, processor 610 comprises operating system software controlled by hardware, such as a central processing unit (CPU).
Interface 620 provides a communication gateway through which data may be transmitted between components of remote control device 230 and network 205. Interface 620 transmits to surrogate head device 100, via network 205, audio signals received by microphone 564 and video signals received by camera 562. Interface 620 receives audio signals and video signals from surrogate head device 100, via network 205, and transmits such signals to speakers 566 and to display device 568, respectively. Interface 620 receives control signals from remote control module 640 and transmits the control signals to surrogate head device 100. In some embodiments, interface 620 may receive control signals directly from mouse device 576 and from keyboard 574, and transmit the control signals to surrogate head device 100. In various embodiments, interface 620 may be implemented using a number of different mechanisms, such as one or more enterprise systems connection cards, modems, or network interfaces. Other types of interfaces may be used.
Memory 630 is accessed by processor 610 and/or other components of remote control device 230 to store various types of information. Memory 630 may comprise any one or more of a variety of different types of non-transitory computer readable media, such as random access memory (RAM), one or more disk drives, one or more optical disks, one or more tape drives, etc. Other types of memory devices may be used.
Remote control device 230 also comprises a remote control module 640. Remote control module 640 receives signals from mouse device 576 and from keyboard 574, and converts such signals into corresponding control signals for controlling surrogate head device 100. For example, movements of mouse device 576, or selections of keys on keyboard 574, may be detected and converted into appropriate control signals for controlling the movement of surrogate head device 100. Remote control module 640 transmits such control signals to surrogate head device 100 via interface 620. In another embodiment, a speech recognition system may be used to detect voice commands spoken by the remote participant, and generate corresponding control signals. In other embodiments, a gesture control system, and/or a facial recognition system may be used to detect facial movements and/or gestures made by the remote participant, and generate corresponding control signals.
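As an illustrative sketch only (the disclosure does not specify the conversion), mouse movements might be converted to pan/tilt commands as follows; the scaling factor and field names are assumptions.

```python
def mouse_delta_to_control_signal(dx, dy, degrees_per_count=0.1):
    """Convert a mouse movement (dx, dy in counts) into pan/tilt commands.

    Rolling the mouse right or left pans the surrogate head right or
    left; rolling it forward or backward tilts the head up or down.
    The scaling factor is an arbitrary illustrative value.
    """
    return {
        "pan_degrees": dx * degrees_per_count,    # positive = pan right
        "tilt_degrees": -dy * degrees_per_count,  # mouse forward (dy < 0) = tilt up
    }
```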
Remote control module 640 may comprise a software program that includes multiple modules or subroutines providing respective services or functions, for example. In other embodiments, remote control module 640 may comprise multiple software programs. In alternative embodiments, remote control module 640 may comprise hardware, or a combination of hardware and software. Remote control module 640 may comprise a non-transitory computer readable medium, such as a magnetic disk, magnetic tape, or optical disk, that includes instructions in the form of computer code operable to perform various functions. In some embodiments, some or all of remote control module 640 may comprise instructions in the form of computer code that are stored in memory 630.
In other embodiments, remote control device 230 may comprise other components (software or hardware) in addition to those discussed herein.
In one embodiment, sounds detected by microphones 140-A and 140-B on surrogate head device 100 are selectively mapped to speakers 566-A and 566-B of remote control device 230, generating for remote participant 585 a simulation of being present in conference room 215. For example, when an individual seated in conference room 215 to the right of surrogate head device 100 speaks, the sounds detected by microphone 140-A are mapped to the remote participant's headphone speaker 566-A, and the sounds detected by microphone 140-B are mapped to the remote participant's headphone speaker 566-B, causing the remote participant to perceive a voice coming from his or her right side. In the exemplary embodiment, control module 457 (shown in
In some embodiments, including the embodiment described above, a remote participant operating remote control device 230 controls surrogate head device 100 to achieve and maintain eye contact with an individual in conference room 215. For example, appropriate rotation of surrogate head device 100 by the remote participant toward an individual who is speaking in conference room 215 may enable the remote participant and the speaker to see each other's faces and expressions in real time, enabling eye-to-eye contact to be achieved and maintained.
At step 720, two respective audio signals are detected at two microphones located on the device at the first location. As discussed above, surrogate head device 100 detects two audio signals at microphones 140-A and 140-B. The audio signals may contain voice signals, for example. At step 730, the two audio signals are mapped respectively to two channels associated with two speakers used by an operator at the second location. In the exemplary embodiment, surrogate head device 100 maps the two audio signals to two transmission channels (channels A and B, discussed above) and transmits the signals to remote control device 230. The two transmission channels are associated with two speakers in the remote operator's headphones 566.
At step 740, at least a portion of the device moves about at least one axis in response to control signals received from the operator at the second location. As discussed above, surrogate head device 100 receives control signals from remote control device 230, and in response, head portion 172 is rotated around a vertical axis by pan base 155 and/or about a horizontal axis by tilt base 150.
In some embodiments, the method steps described in
In another embodiment, a user at a first location employs a remote control device and a motion sensor to control a camera system located at a second location. The remote control device receives data representing a motion, and generates control signals based on the motion data. For example, a sensor attached to the user's chair or body may detect when the user turns to the left, and generate corresponding motion signals. The remote control device receives the motion signals and transmits corresponding control signals to the remotely located camera system, causing the camera system (or a component of the camera system) to pan to the left. In this manner, the user may control the orientation of the camera system and obtain different views of the camera's surroundings. For example, the camera system may comprise surrogate head device 100. Alternatively, the camera system may comprise one or more surveillance cameras, for example.
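A minimal control-loop sketch is shown below, assuming a sensor object that reports a compass heading and a link object that sends pan commands to the camera system; both interfaces, the dead band, and the polling rate are illustrative assumptions, not part of the disclosure.

```python
import time

def run_pan_control_loop(sensor, camera_link, deadband_degrees=2.0,
                         poll_seconds=0.05):
    """Poll a motion sensor and forward pan commands to a remote camera
    system.

    `sensor.read_heading()` (compass heading in degrees) and
    `camera_link.send_pan(degrees)` are placeholder interfaces standing
    in for whatever sensor and transport are actually used.
    """
    reference_heading = sensor.read_heading()
    while True:
        heading = sensor.read_heading()
        # Signed displacement since the last command, wrapped to [-180, 180).
        displacement = ((heading - reference_heading + 180.0) % 360.0) - 180.0
        # Ignore small jitters so the camera does not chatter.
        if abs(displacement) >= deadband_degrees:
            camera_link.send_pan(displacement)
            reference_heading = heading
        time.sleep(poll_seconds)
```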
In the exemplary embodiment of
Camera system 840 may comprise any type of imaging system capable of capturing image data from different orientations. For example, camera system 840 may comprise a surrogate head device such as surrogate head device 100 shown in
Sensor 875 comprises a motion sensor capable of generating data representing a motion experienced by the sensor. Sensor 875 transmits to remote control device 860 data representing the detected motion. In the illustrative embodiment, sensor 875 communicates with remote control device 860 wirelessly. In other embodiments, sensor 875 may communicate with remote control device 860 via a direct link, via a network, or in another manner.
In one embodiment, sensor 875 is attached to a rotatable chair.
In one embodiment, sensor 875 is a compass sensor having a 0.5 degree heading resolution and 1 degree repeatability. Sensor 875 may be battery-powered and communicate wirelessly. For example, sensor 875 may be interfaced to a microcontroller board and use a wireless network standard such as Zigbee to communicate with remote control device 860. Alternatively, sensor 875 may be powered by a USB connection from remote control device 860, and use the USB connection (and/or Wi-Fi) for networking.
In one embodiment, sensor 875 is battery-powered using a Li-polymer rechargeable battery. Sensor 875 comprises a microcontroller board and communicates wirelessly with remote control device 860.
In accordance with an embodiment, a user may employ sensor 875 and remote control device 860 to control camera system 840.
In one embodiment, camera system 840 comprises surrogate head device 100 (shown in
In another embodiment, surrogate head device 100 may receive control signals indicating an angular displacement of chair 900 and, in response, cause head portion 172 to rotate around a vertical axis by a number of degrees that is different from, but determined based on, the angular displacement of chair 900. Surrogate head device 100 may store and consult a mapping that maps various angular displacement inputs to respective angular displacement output values. For example, an angular displacement input of 50 degrees (representing the angular displacement of chair 900) may be mapped to an angular displacement value of 40 degrees. In such case, when the user swivels 50 degrees in chair 900, surrogate head device 100 causes head portion 172 to turn 40 degrees. In another example, surrogate head device 100 may be configured to rotate about a vertical axis in response to an angular displacement of chair 900, but only up to a predetermined limit, for example, a thirty degree displacement to the left and to the right of a selected orientation; any displacement of chair 900 beyond thirty degrees from a corresponding orientation would cause no additional rotation of surrogate head device 100. Other configurations are possible.
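The mapping and limiting behaviors just described can be illustrated with a short sketch; the scale factor, limit, and function name below are assumed values chosen only to mirror the 50-to-40 degree and thirty-degree examples above.

```python
def map_chair_displacement_to_head_rotation(chair_degrees, scale=0.8,
                                            limit_degrees=None):
    """Map the chair's angular displacement (degrees) to a commanded
    rotation of the head portion.

    `scale` stands in for the stored input-to-output mapping; if
    `limit_degrees` is given, no additional rotation is commanded
    beyond that limit.
    """
    commanded = chair_degrees * scale
    if limit_degrees is not None:
        # Clamp so displacements beyond the limit cause no further rotation.
        commanded = max(-limit_degrees, min(limit_degrees, commanded))
    return commanded
```

Under these assumed values, map_chair_displacement_to_head_rotation(50) returns 40, while map_chair_displacement_to_head_rotation(50, scale=1.0, limit_degrees=30) returns 30.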
Suppose, then, that user 1175 wishes to participate remotely in a conference being held in conference room 215 (shown in
A motion of a chair disposed at the first location is detected. Supposing that user 1175 wishes to view participant 322, user 1175 swivels in chair 900 approximately 90 degrees to the right. Sensor 875 detects the rotational motion of chair 900 and transmits (via antenna 1091, for example) to remote control device 860 motion data representing the chair's motion. Remote control device 860 receives the motion data (via antenna 1028, for example).
At step 1230, a first angular displacement associated with the motion of a chair disposed at the first location is determined. Remote control device 860 determines, based on the motion data, that chair 900 has experienced (approximately) a 90 degree rotation to the right (clockwise).
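If sensor 875 reports absolute compass headings, the angular displacement can be computed as the wrapped difference between two headings. The following sketch is an assumption about that computation, not language from the disclosure; it returns a signed value in which positive denotes clockwise rotation.

```python
def angular_displacement(start_heading, end_heading):
    """Return the signed rotation in degrees between two compass
    headings, wrapped into (-180, 180]; positive means clockwise.
    """
    delta = (end_heading - start_heading) % 360.0
    if delta > 180.0:
        delta -= 360.0
    return delta
```

With these conventions, a swivel from a heading of 10 degrees to a heading of 100 degrees yields angular_displacement(10, 100) == 90, i.e., the approximately 90 degree clockwise rotation described above.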
At step 1235, information representing the first angular displacement is transmitted to the camera system. Remote control device 860 transmits to surrogate head device 100 angular displacement information defining the first angular displacement. Surrogate head device 100 receives the angular displacement information, and in response, causes head portion 172 to rotate clockwise around a vertical axis by 90 degrees, or by approximately 90 degrees. Surrogate head device 100 now captures second image data of a second view of conference room 215, including a view of participant 322. Surrogate head device 100 transmits the second image data to remote control device 860. The second view corresponds to a second orientation within conference room 215 that is displaced from the first orientation by approximately 90 degrees.
At step 1240, second image data representing a second view of the room associated with a second orientation, the second orientation having a relationship to the first orientation based on the first angular displacement, is displayed at the first location. Remote control device 860 displays the second image data, enabling user 1175 to see the second view, including participant 322. As discussed above, the second view corresponds to the second orientation within conference room 215; the angular displacement between the second orientation and the first orientation is approximately 90 degrees.
In another embodiment, sensor 875 detects that chair 900 tilts forward (rather than rotates), for example, when the user leans forward, causing the chair's seat to tilt forward. Sensor 875 transmits to remote control device 860 motion data representing the chair's tilting motion. The motion data may comprise an angular displacement, for example. Remote control device 860 transmits to surrogate head device 100 angular displacement information defining the chair's tilting motion. Surrogate head device 100 receives the angular displacement information, and in response, causes head portion 172 to rotate around a horizontal axis by a corresponding angular displacement. Surrogate head device 100 now captures image data of a different view of conference room 215, such as a view of a document placed on the table, or a view of the floor. Surrogate head device 100 transmits image data to remote control device 860. Remote control device 860 displays the image data to the user (allowing the user to view a document on the table, for example).
In another embodiment, sensor 875 detects that chair 900 tilts back (rather than rotates), for example, when the user leans back, causing the chair's seat to tilt backward. Sensor 875 transmits to remote control device 860 motion data representing the chair's tilting motion. The motion data may comprise an angular displacement about a horizontal axis, for example. Remote control device 860 transmits to surrogate head device 100 displacement information defining the chair's tilting motion. Surrogate head device 100 receives the displacement information, and in response, causes head portion 172 to rotate around a horizontal axis by a corresponding angular displacement. Surrogate head device 100 now captures image data of a different view of conference room 215, such as a view of a person standing in the conference room, or a view of the ceiling of the conference room. Surrogate head device 100 transmits image data to remote control device 860. Remote control device 860 displays the image data to the user (allowing the user to view a person standing in the conference room, for example).
In another embodiment, sensor 875 detects that chair 900 tilts forward (rather than rotates), for example, when the user leans forward, causing the chair's seat to tilt forward. Sensor 875 transmits to remote control device 860 motion data representing the chair's tilting motion. The motion data may comprise an angular displacement, for example. Remote control device 860 transmits to surrogate head device 100 angular displacement information defining the chair's tilting motion. Surrogate head device 100 receives the displacement information, and in response, causes camera 130 (on head portion 172) to zoom by an amount determined based on the angular displacement information. Surrogate head device 100 captures “zoomed” image data of conference room 215. Surrogate head device 100 transmits “zoomed” image data to remote control device 860. Remote control device 860 displays the “zoomed” image data to the user.
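One possible mapping from the chair's forward tilt to a zoom amount is sketched below; the maximum tilt and maximum zoom factor are illustrative assumptions rather than values specified in the disclosure.

```python
def tilt_to_zoom_factor(tilt_degrees, max_tilt=15.0, max_zoom=4.0):
    """Map a forward chair-tilt angle (degrees) to a camera zoom factor.

    Zero tilt corresponds to 1x (no zoom); tilting forward up to
    `max_tilt` degrees ramps linearly toward `max_zoom`.
    """
    tilt = max(0.0, min(max_tilt, tilt_degrees))  # only forward tilt zooms in
    return 1.0 + (max_zoom - 1.0) * (tilt / max_tilt)
```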
In another embodiment, sensor 875 is attached to the user's body or clothing (instead of being attached to chair 900). For example, sensor 875 may be attached to a tag attached to the user's pocket, to a wristband, etc. In another embodiment, sensor 875 may be attached to or disposed within remote control device 860. When the user swivels in the chair, leans forward, leans back, etc., sensor 875 detects the motion and transmits motion data to control signal generator 1020 (within remote control device 860). Remote control device 860 controls surrogate head device 100 based on the motion data, in the manner described above.
In one embodiment, sensor 875 is attached to chair 900 and is employed in the manner described above to control rotational movements of a remote camera system such as surrogate head device 100. A second motion sensor is attached to the user's body and is used by the user to control a zoom function of the remote camera system. When the user moves forward, the second sensor detects the user's movement, and transmits to remote control device 860 motion data representing the user's motion. The motion data may comprise an angular displacement, for example. Remote control device 860 transmits to surrogate head device 100 angular displacement information defining how far the user has leaned forward. Surrogate head device 100 receives the angular displacement information, and in response, causes camera 130 (on head portion 172) to zoom by an amount determined based on the angular displacement information. Surrogate head device 100 captures “zoomed” image data of conference room 215. Surrogate head device 100 transmits “zoomed” image data to remote control device 860. Remote control device 860 displays the “zoomed” image data to the user.
In another embodiment, the zoom function may be controlled based on a distance between the user's head and remote control device 860. For example, remote control device 860 may determine when the user leans his or her head toward the screen of remote control device 860 and cause the camera to zoom in response to the movement of the user's head. For example, distance measurements (between remote control device 860 and the user's head) may be determined based on images captured by a camera on remote control device 860. Remote control device 860 may analyze such images using image processing techniques and/or face detection techniques. Alternatively, distance measurements (between remote control device 860 and the user's head) may be determined based on data obtained by a custom sensor disposed within remote control device 860, such as an ultrasonic ranging sensor.
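As a hedged sketch of the distance-based approach, the fragment below estimates head distance from the apparent width of a detected face using a pinhole-camera approximation and maps that distance to a zoom factor; the focal length, real face width, and distance thresholds are assumed calibration values, not parameters from the disclosure.

```python
def estimate_head_distance_cm(face_width_px, focal_length_px=600.0,
                              real_face_width_cm=15.0):
    """Estimate the user's head-to-device distance from the width (in
    pixels) of a detected face, using a pinhole-camera approximation.
    """
    return focal_length_px * real_face_width_cm / face_width_px

def zoom_from_head_distance(distance_cm, near_cm=30.0, far_cm=60.0,
                            max_zoom=4.0):
    """Command more zoom as the user leans closer to the device."""
    d = max(near_cm, min(far_cm, distance_cm))
    fraction = (far_cm - d) / (far_cm - near_cm)  # 0 when far, 1 when near
    return 1.0 + (max_zoom - 1.0) * fraction
```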
In other embodiments, a camera system comprises one or more video cameras (e.g., surveillance cameras) disposed in a selected location.
In one embodiment, a user sitting in chair 900 (with attached sensor 875) uses sensor 875 and remote control device 860 to control camera system 1340. For example, user 1175 of
In another embodiment, remote control device 860 controls camera device 1310 directly. For example, remote control device 860 may transmit instructions directly to camera device 1310, causing camera device 1310 to turn a specified number of degrees in a specified direction.
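The disclosure does not specify a wire format for such instructions; purely for illustration, a pan command might be encoded as a small JSON message such as the following, in which the field names are assumptions.

```python
import json

def make_pan_command(degrees, direction):
    """Build a minimal message instructing a camera device to turn a
    specified number of degrees in a specified direction.
    """
    if direction not in ("left", "right"):
        raise ValueError("direction must be 'left' or 'right'")
    return json.dumps({"command": "pan", "degrees": degrees,
                       "direction": direction})
```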
In various embodiments, the method steps described herein, including the method steps described in
Systems, apparatus, and methods described herein may be implemented using digital circuitry, or using one or more computers using well-known computer processors, memory units, storage devices, computer software, and other components. Typically, a computer includes a processor for executing instructions and one or more memories for storing instructions and data. A computer may also include, or be coupled to, one or more mass storage devices, such as one or more magnetic disks, internal hard disks and removable disks, magneto-optical disks, optical disks, etc.
Systems, apparatus, and methods described herein may be implemented using a computer program product tangibly embodied in an information carrier, e.g., in a non-transitory machine-readable storage device, for execution by a programmable processor; and the method steps described herein, including one or more of the steps of
A high-level block diagram of an exemplary computer that may be used to implement systems, apparatus and methods described herein is illustrated in
Processor 1401 may include both general and special purpose microprocessors, and may be the sole processor or one of multiple processors of computer 1400. Processor 1401 may include one or more central processing units (CPUs), for example. Processor 1401, data storage device 1402, and/or memory 1403 may include, be supplemented by, or incorporated in, one or more application-specific integrated circuits (ASICs) and/or one or more field programmable gate arrays (FPGAs).
Data storage device 1402 and memory 1403 each include a tangible non-transitory computer readable storage medium. Data storage device 1402, and memory 1403, may each include high-speed random access memory, such as dynamic random access memory (DRAM), static random access memory (SRAM), double data rate synchronous dynamic random access memory (DDR RAM), or other random access solid state memory devices, and may include non-volatile memory, such as one or more magnetic disk storage devices such as internal hard disks and removable disks, magneto-optical disk storage devices, optical disk storage devices, flash memory devices, semiconductor memory devices, such as erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM), digital versatile disc read-only memory (DVD-ROM) disks, or other non-volatile solid state storage devices.
Input/output devices 1405 may include peripherals, such as a printer, scanner, display screen, etc. For example, input/output devices 1405 may include a display device such as a cathode ray tube (CRT) or liquid crystal display (LCD) monitor for displaying information to the user, a keyboard, and a pointing device such as a mouse or a trackball by which the user can provide input to computer 1400.
Any or all of the systems and apparatus discussed herein, including remote control device 230, remote control device 860, camera system 840, and components thereof, may be implemented using a computer such as computer 1400.
One skilled in the art will recognize that an implementation of an actual computer or computer system may have other structures and may contain other components as well, and that
The foregoing Detailed Description is to be understood as being in every respect illustrative and exemplary, but not restrictive, and the scope of the invention disclosed herein is not to be determined from the Detailed Description, but rather from the claims as interpreted according to the full breadth permitted by the patent laws. It is to be understood that the embodiments shown and described herein are only illustrative of the principles of the present invention and that various modifications may be implemented by those skilled in the art without departing from the scope and spirit of the invention. Those skilled in the art could implement various other feature combinations without departing from the scope and spirit of the invention.