This disclosure relates to personal robots, and in particular to telepresence robots that provide human-to-human video and audio conferencing, otherwise known as telepresence.
The use of video conferencing has in recent years become an important tool for many businesses. However, a typical video conferencing setup has many limitations. Specifically, the person joining the conference by video typically has little or no control over whom they are looking at. The limitations of video conferencing, together with other issues related to working with a team from a remote location, have given rise to a new type of robot often referred to as a telepresence personal robot. A telepresence robot typically combines freedom of mobility with video conferencing. This combination provides an improved experience over video conferencing alone and strengthens people's connection with clients, employees and teammates.
Some telepresence robots are available; however, they have several limitations that restrict their functionality. Of those telepresence robots currently on the market, many lack the ability to rotate or tilt their head (pan/tilt). This is a severe design limitation, as the ability to look around is an important feature for the user to fully participate in a meeting. Furthermore, of those designs which do incorporate the flexibility of pan/tilt, none meet three specific requirements that would allow full participation of the user during video teleconferencing. First, the video image should be stable, even during travel. Second, there should be a safety system in place that limits damage to internal parts should a user attempt to manually move the head. Finally, the system should have longevity; that is, over time the system should not degrade to the point where it no longer meets the first two requirements.
Some telepresence robots provide head pan-tilt motion using coreless motors to directly drive the shafts of the rotating axes. However, there are some limitations. An example is iRobot Company's product, called RP-Vita™. The display monitor hangs down from a flat “neck”—a hollow supporting structure for the monitor. The two motion axes are separated: the pan components are located in the “shoulder” area—the main body of the robot, and the tilt components are directly attached to one end of the “neck” through a joint between the neck and the display. Even though RP-Vita uses a coreless motor, which is self-supporting and has a high power-to-volume ratio, the single direct connection between the display and the motor is too weak to hold the display stably in the tilt direction. The inherent limitation of such a pan-tilt arrangement is therefore that the mechanical structure in the tilt direction can hold the display of the robot in only a limited range of positions. Because the range of motion of the RP-Vita head is limited, it can only be used in specific situations.
Accordingly, it would be advantageous to provide a telepresence robot that has a head with a wide range of motion. Alternatively, it would be advantageous to provide a telepresence robot that provides a user interface that gives the user a wide range of options for operating the telepresence robot.
A telepresence robot for use in association with a remote computing device includes a base, a body, a neck, a head and a tilt mechanism. The base has a drive system in communication with the remote computing device. The body is operably attached to the base. The neck is operably attached to the body. The head is operably attached to the neck. The head has a screen and the screen is in communication with the remote computing device. The tilt mechanism connects the head to the neck. The tilt mechanism includes a chain and sprocket assembly whereby the head tilts at least 10 degrees forwardly from a generally upright position and at least 10 degrees backwardly from the generally upright position.
Optionally, the tilt mechanism may tilt the head further to at least 40 degrees forwardly from the generally upright position and at least 90 degrees backwardly from the generally upright position.
The chain and sprocket assembly of the tilt mechanism may include a main shaft assembly and a spring operably connected to the main shaft assembly whereby the spring counterbalances the head.
The tilt mechanism further may include a clutch operably attached to the chain and sprocket assembly and the main shaft assembly whereby responsive to a force on the head over a predetermined value the clutch operably disengages the chain and sprocket assembly from the main shaft assembly.
The telepresence robot may further include a pan assembly operably connected to the head. The pan assembly may rotate the head from −160 degrees to +160 degrees relative to a neutral position.
The telepresence robot may include a front head camera. The telepresence robot may include a navigation camera. The navigation camera may be a fish eye camera pointing generally forwardly.
The telepresence robot may include a back-head camera.
The remote computing device may be a computer having an image user interface. The image user interface may display a plurality of images. The plurality of images may include images from the front head camera, the back-head camera and the navigation camera. The plurality of images may include an image from a camera operably attached to the remote computing device. One of the plurality of images may be a larger image. The larger image may be swapped between the images from the front head camera, the back-head camera, the navigation camera and the remote computing device camera. The image user interface may include a mobile control interface. The image user interface may include a pan and tilt control interface. The image user interface may include a single image from the front head camera. The mobile control interface may be controlled by clicking on predetermined portions on the mobile control interface.
The remote computing device may include a keyboard having arrow keys and wherein the mobile control interface may be controlled by using the arrow keys.
The pan and tilt may be controlled by clicking on predetermined portions of the pan and tilt control interface images.
The telepresence robot may be operably connectable to a docking station.
The telepresence robot may include rechargeable batteries and when the telepresence robot is operably connected to the docking station the rechargeable batteries may be recharged.
Further features will be described or will become apparent in the course of the following detailed description.
The embodiments will now be described by way of example only, with reference to the accompanying drawings, in which:
Referring to
Referring to
Neck 14 includes a tilt mechanism 30 which is shown in detail in
The main shaft assembly 32 is shown in detail in
The shaft sprocket 58 of the main shaft assembly 32 is operably connected to the motor 34 with chain 38. The shaft sprocket 58 is operably connected to the clutch side main shaft 54 by the clutch 60. Thus, the clutch 60 is operably connected between the chain and sprocket assembly 31 and the main shaft assembly 32. In the event a force over a predetermined amount is applied to the head 12, the clutch operably disengages the chain and sprocket assembly 31 from the main shaft assembly 32. More specifically, in the event a force over a predetermined value is applied to the head, the plungers 66 will slip and disengage from the clutch sprocket interface 62. Therefore, in the event that a user or other external force attempts to manually force the head 12 to tilt, the clutch 60 reduces the likelihood of damaging the internal systems. Similarly, should the head 12 begin tilting responsive to a command and encounter an obstacle, a predetermined amount of force will be applied before the clutch 60 disengages the chain and sprocket assembly 31.
The chain and sprocket assembly 31 is used to tilt the head 12. The force exerted by the chain and sprocket assembly 31 will vary depending on the angle of tilt of the head 12. However, typically the chain and sprocket assembly 31 will exert at least some force on the head 12. The motor 34 is attached to the front panel 44 of the neck 14 and is thus stationary within the neck. The chain 38 is used to transfer the force from the motor 34 to the head 12. A chain tensioning bracket 90 is attached to the front panel 44 of the neck 14. The tilt motor mount 42 has elongate holes 92 for receiving screws therein. The tilt motor mount 42 is attached to the chain tensioning bracket 90 with a nut 94 and bolt 96 such that the screws attaching the tilt motor mount 42 to the front panel 44 can move in the elongate holes 92 until the chain is tightened, and then the tilt motor mount 42 is secured in place. As can be seen in 11 and 12 when the back panel 45 (shown in
The spring 64 counterbalances the weight of the head 12 as the head's center of gravity moves away from the point above the pivot. This provides for smooth motion as well as allowing the use of a smaller, more efficient motor 34 to drive the tilt. As the head 12 rotates further forward, the center of gravity follows and moves away from the pivot point. Because of this shift in the center of gravity, additional strain is placed on the tilt motor 34 when it attempts to lift the head back up. Because the torsion spring 64 applies a torsional force in the opposite direction, it effectively decreases, and may eliminate, the strain on the tilt motor 34 caused by gravity. This helps to provide smooth and consistent motion of the head.
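The counterbalancing effect can be expressed as a simple torque balance. In the sketch below, m, g, d, θ and k are illustrative symbols (head mass, gravitational acceleration, distance from the pivot to the head's center of gravity, tilt angle from upright, and torsion spring rate); none of these values are specified in the disclosure:

```latex
% Gravity torque about the tilt pivot grows as the head tilts forward:
\tau_{\text{gravity}} = m\,g\,d\,\sin\theta
% The torsion spring winds up with the tilt angle and opposes gravity:
\tau_{\text{spring}} = k\,\theta
% The tilt motor need only supply the difference:
\tau_{\text{motor}} = \tau_{\text{gravity}} - \tau_{\text{spring}}
                    = m\,g\,d\,\sin\theta - k\,\theta
```

With a suitably chosen spring rate k, the net motor torque stays small across the tilt range, which is consistent with the use of a smaller, more efficient tilt motor 34.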
The telepresence robot 10 also includes a pan assembly 100 which is housed in the body 16. The pan assembly 100 is in remote communication with the computing device 200. Referring to
The base 18 has a relatively small footprint with a relatively few parts. In the embodiment shown in
In addition to the front head camera 26 and the back-head camera 28, the telepresence robot 10 also has a navigation camera 108 as shown in
In addition, the base 18 may have an IR receiver 132 attached to it with a mount 134. The telepresence robot 10 may be charged with a docking station 20 shown in
Docking station 20 has two modes: 1) standby mode and 2) docking mode. The docking station 20 has two LED (light emitting diode) indicators including a red LED 158 and a green LED 160. When the docking station 20 is in the standby mode, the green LED 160 is on and the red LED 158 is off. In the standby mode the docking station is plugged in but the transmitter/emitter is not activated. In the docking mode the power is on and the IR emitter is on. In addition, both LEDs switch to a flashing status to indicate that the telepresence robot is approaching the docking station. Then the red LED switches to solid to indicate that charging is in progress. Then the green LED switches to solid to indicate that charging is complete.
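The LED indications described above can be summarized as a small state table. This is a sketch only; the state names, and the behaviour of the inactive LED during the charging and charged states, are assumptions rather than details from the disclosure:

```python
from enum import Enum, auto

class DockState(Enum):
    STANDBY = auto()      # plugged in, IR emitter off
    APPROACHING = auto()  # robot detected and driving toward the station
    CHARGING = auto()     # charging in progress
    CHARGED = auto()      # charging complete

# (green LED, red LED) indication for each state; the "off" entries for
# CHARGING and CHARGED are assumed, as the text does not state them.
LED_TABLE = {
    DockState.STANDBY:     ("on", "off"),
    DockState.APPROACHING: ("flashing", "flashing"),
    DockState.CHARGING:    ("off", "solid"),
    DockState.CHARGED:     ("solid", "off"),
}
```

Encoding the sequence as a table makes it straightforward to drive the two LEDs from a single state variable as the docking sequence progresses.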
By way of example only, the following chart shows the detailed sequence for charging the telepresence robot 10.
The description for the 5 scenarios is shown below:
Similarly the telepresence robot 10 has two modes: 1) operation mode and 2) docking mode. While the operation mode is engaged, the collision avoidance system on telepresence robot 10 is enabled and it functions to detect obstacles and avoid collisions. While the docking mode is enabled, the collision avoidance system is disabled.
The IR sensor 130 on the telepresence robot 10 toggles on and off a predetermined number of times each second. By way of example, the IR sensor 130 toggles off five times each second. The override function activates only if the IR sensor 130 detects a value greater than a predetermined target threshold for a period longer than the 200 milliseconds during which the telepresence robot 10's own IR sensor 130 is active. This reduces the possibility of a reflection triggering the override.
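The reflection-rejection logic above can be sketched as follows. The function name and the sampled-readings representation are hypothetical; the only detail taken from the text is that a run above the threshold must outlast the 200 millisecond active window before the override fires:

```python
def override_triggered(readings, threshold, window_ms=200):
    """Return True if the IR value stays above `threshold` for a
    continuous run longer than `window_ms`.

    `readings` is a list of (timestamp_ms, value) pairs in time order.
    A reflection of the robot's own 200 ms IR pulse cannot persist
    beyond that window, so a longer run must come from an external
    emitter such as the docking station.
    """
    run_start = None
    for t, v in readings:
        if v > threshold:
            if run_start is None:
                run_start = t          # a new above-threshold run begins
            elif t - run_start > window_ms:
                return True            # run outlasted the active window
        else:
            run_start = None           # signal dropped: reset the run
    return False

# A signal persisting 250 ms exceeds the window and triggers the override:
assert override_triggered([(0, 5), (100, 5), (250, 5)], threshold=3)
# A 150 ms run (consistent with a reflection) does not:
assert not override_triggered([(0, 5), (100, 0), (150, 5), (300, 5)], threshold=3)
```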
The IR transmitter 156 can provide a constant IR emission shown at 157 in
In use, when the telepresence robot 10 has a low battery level, the user who is controlling it will receive a notification and will remotely drive the telepresence robot 10 near the front of the docking station 20. By way of example this may be about 1.5 m from the front of the docking station 20. The telepresence robot 10 has an IR receiver 132 at the front of the base 18, and the docking station has an IR transmitter 156. Once the telepresence robot 10 is located near the docking station, the IR transmitter 156 will emit a signal and the IR receiver 132 will receive it. The docking station 20 then switches from the standby mode to the docking mode and the telepresence robot 10 switches from the operation mode to the docking mode. Then, the telepresence robot 10 will automatically drive to the docking station 20 and dock itself for recharging.
Alternatively, the telepresence robot 10 could be manually connected directly to an AC outlet through the 12V cigar port 138.
The telepresence robot 10 is battery powered. By way of example it is powered by a set of batteries connected in parallel. In the example herein the set of batteries is three 12 Volt, 12 Amp rechargeable batteries. In the example herein, the screen 22 has a built-in battery and is operably connected to a central USB hub. The USB hub is connected to the USB port 140. The USB port 140 is connected to the batteries.
The telepresence robot 10 may be controlled remotely with a computer or other computing device 200. By way of example the telepresence robot 10 could also be controlled by a tablet. The computing device 200 is in communication with the screen 22 of the head 12 of the telepresence robot 10. The image user interface or remote operator user interface 202 on the user's computing device 200 may include a plurality of images 204 from cameras on the telepresence robot 10, an image from a user camera operably connected to the user's computing device 200, a mobile control interface 206 and a pan and tilt control interface 208. The plurality of images may include a larger image 210 and a plurality of smaller images 212 as shown in
By way of example only, these screen swapping operations have been implemented to swap between cameras. The user can drag and drop one video stream on the webpage; the JavaScript in the webpage calls XMLHttpRequest to communicate with the web server, the web server then sends a swap command to the virtual camera with UDP (User Datagram Protocol), and finally the virtual camera swaps the video streams. The techniques used in this case include JavaScript, AJAX, WinSock, overlapped I/O and Windows event objects. The algorithm has been carefully designed to ensure smooth swapping with near-zero overhead. As used herein, overhead means any combination of excess or indirect computation time, memory, bandwidth, or other resources that are required to attain a particular goal.
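The server-side hop of this pipeline, forwarding a swap request to the virtual camera over UDP, might look like the following minimal Python sketch (the original uses WinSock). The wire format ("SWAP a b"), the stream indices and the port number are invented for illustration; the disclosure does not specify them:

```python
import socket

# Hypothetical address where the virtual camera listens for commands.
VIRTUAL_CAMERA_ADDR = ("127.0.0.1", 9000)

def encode_swap(stream_a: int, stream_b: int) -> bytes:
    """Encode a request to swap two video streams as a small datagram.

    The "SWAP a b" format is an assumption made for this sketch.
    """
    return f"SWAP {stream_a} {stream_b}".encode()

def send_swap(stream_a: int, stream_b: int, addr=VIRTUAL_CAMERA_ADDR) -> None:
    """Fire-and-forget the swap command over UDP; no connection setup or
    acknowledgement keeps the per-swap overhead near zero."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(encode_swap(stream_a, stream_b), addr)
```

Using a connectionless datagram here matches the stated goal of near-zero overhead: the web server never blocks waiting for the virtual camera.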
The remote control system is an innovative graphical interface providing the user with control of the telepresence robot 10 in a non-touch-screen-based working environment. Following is a detailed description of the remote control system and the operational graphical interface for use therewith.
Referring to
The transition from a mouse click to commands given to each wheel is a multi-step process. The first step is to record the location of the user's mouse click. This value is the x, y coordinate on the screen where the click occurred, relative to the top left corner of the screen. To make this position useful, it needs to be relative to where the image is. To accomplish this, the coordinates of the image being displayed on the screen are subtracted from the location of the mouse click. This results in a Cartesian coordinate relative to the image. The Cartesian coordinates are checked to make sure they are within the bounds of an acceptable command. That is to say they must be within pre-defined limits. As shown in
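The first steps of this process, translating a screen-relative click into an image-relative Cartesian coordinate and checking it against the image bounds, can be sketched as follows; the function name and parameter names are hypothetical:

```python
def click_to_image_coords(click_x, click_y,
                          image_left, image_top,
                          image_width, image_height):
    """Convert a screen click (measured from the top-left corner of the
    screen) into coordinates relative to the control image, and report
    whether the click falls within the image bounds."""
    # Subtract the image's on-screen position from the click position.
    x = click_x - image_left
    y = click_y - image_top
    # The command is acceptable only if it lands inside the image.
    in_bounds = 0 <= x <= image_width and 0 <= y <= image_height
    return x, y, in_bounds

# A click at screen (150, 120) on an image placed at (100, 100):
assert click_to_image_coords(150, 120, 100, 100, 200, 200) == (50, 20, True)
# A click left of and above the image is rejected:
assert click_to_image_coords(50, 50, 100, 100, 200, 200) == (-50, -50, False)
```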
The mouse cursor is “placed” on the screen and has one coordinate in the main Cartesian system and another coordinate in the control Cartesian system. There is a relative value between these two coordinates, and this value is sent to the back end for calculation to control the movement of the telepresence robot.
By way of example, if the mouse cursor is inside the control Cartesian system, the mouse can control the telepresence robot's movement; however, if the mouse cursor moves outside the “boundary” of the control window, the mouse cannot control the telepresence robot's movement. The mouse cursor's coordinate is always checked against the origin O of the entire tablet to ensure the cursor is within the range of the control window.
The next step is to check if the user's click was within the circle located in the center of the image. This is accomplished by using the distance value from the polar form of the coordinate and comparing it to the radius of the circle. If the click's distance value is greater than the radius of the circle, the click was clearly outside the circle.
By way of example, in use, if the user's click was within the circle or stop zone 220, the telepresence robot 10 stops. If the click was outside the circle, the telepresence robot 10 will move. To give the proper speed to each wheel, the Cartesian coordinates relative to the center of the image or stop zone 220 are used and wheel speeds are proportionally assigned. Thus, if the mouse cursor is closer to the center of the stop zone 220, a lower speed is applied to the telepresence robot 10; if the mouse cursor is further from the center of the stop zone 220, a higher speed is applied.
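The stop-zone test and the proportional speed assignment can be sketched together as one mapping from click position to wheel speeds. The differential mixing law (forward component plus steering bias) is an assumption of this sketch; the disclosure states only that speeds are assigned proportionally to distance from the stop zone:

```python
import math

def wheel_speeds(x, y, stop_radius, image_radius, max_speed=1.0):
    """Map a click at (x, y), relative to the stop-zone center, to
    (left, right) wheel speeds in [-max_speed, max_speed]."""
    r = math.hypot(x, y)             # polar-form distance of the click
    if r <= stop_radius:
        return 0.0, 0.0              # click inside the stop zone: halt
    # Speed grows with distance beyond the stop zone, capped at max_speed.
    magnitude = min((r - stop_radius) / (image_radius - stop_radius), 1.0)
    magnitude *= max_speed
    forward = -y / r                 # screen y grows downward, so up = forward
    turn = x / r                     # clicks right of center steer right
    return magnitude * (forward + turn), magnitude * (forward - turn)

# Click inside the stop zone: both wheels stop.
assert wheel_speeds(0, 0, 20, 100) == (0.0, 0.0)
# Click straight up at the image edge: both wheels at full forward speed.
assert wheel_speeds(0, -100, 20, 100) == (1.0, 1.0)
# Click straight right: wheels driven in opposite directions (spin right).
assert wheel_speeds(100, 0, 20, 100) == (1.0, -1.0)
```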
Alternatively, the user may use the arrow keys located on a keyboard to control the direction of the telepresence robot 10. This is particularly useful when fine control is not needed. The up key moves the telepresence robot 10 forward, the down key moves it backward, the left key turns it left and the right key turns it right.
The operation of the telepresence robot 10 is via remote control and may be via the Internet. The operation could include a remote log-in. The user control may include dual-direction media communication via the Internet and media recording. The pan and tilt control interface 208 shown in
Referring to
The telepresence robot 10 includes control and processing hardware 300.
It is to be understood that the example system shown in
The computing device 200 may take on a variety of forms.
The data storage 376 can be utilized by computing device 200 to store, among other things, applications 378 and/or other data. For example, data storage 376 may also be employed to store information that describes various capabilities of computing device 200. The information may then be provided to another device based on any of a variety of events, including being sent as part of a header during a communication, sent upon request, or the like. Moreover, data storage 376 may also be employed to store personal information including but not limited to address lists, contact lists, personal preferences, or the like. In one embodiment, data storage 376 may be configured to store information, including, but not limited to user account information, vendor information, social network information, or the like. At least a portion of the information may also be stored on a disk drive or other storage medium within computing device 200, such as hard disk drive 380, external storage device 382, or the like. In one embodiment, a portion of the information may also be located remote to computing device 200 which controls the telepresence robot 10. The applications 378 may include an internet browser 384, messaging application 386 or other specific applications 388. Internet browser 384 may be configured to receive and to send web pages, forms, web-based messages, and the like. Internet browser 384 may, for example, receive and display (and/or play) graphics, text, multimedia, audio data, and the like, employing virtually any web based language, including, but not limited to Standard Generalized Markup Language (SGML), such as HyperText Markup Language (HTML), a wireless application protocol (WAP), a Handheld Device Markup Language (HDML), such as Wireless Markup Language (WML), WMLScript, JavaScript, and the like.
Messaging application 386 may be configured to initiate and manage a messaging session using any of a variety of messaging communications including, but not limited to email, Short Message Service (SMS), Instant Message (IM), Multimedia Message Service (MMS), internet relay chat (IRC), mIRC, and the like. For example, in one embodiment, messaging application 386 may be configured as an IM application, such as AOL Instant Messenger, Yahoo! Messenger, .NET Messenger Server, ICQ, or the like. In one embodiment messenger may be configured to include a mail user agent (MUA) such as Elm, Pine, MH, Outlook, Eudora, Mac Mail, Mozilla Thunderbird, or the like. In another embodiment, messaging application 386 may be an application that is configured to integrate and employ a variety of messaging protocols. In one embodiment, messaging application 386 may employ various message boxes to manage and/or store messages.
Generally speaking, the systems described herein are directed to telepresence robots. Various embodiments and aspects of the disclosure are described with reference to the detailed discussion below. The description and drawings are illustrative of the disclosure and are not to be construed as limiting the disclosure. Numerous specific details are described to provide a thorough understanding of various embodiments of the present disclosure. However, in certain instances, well-known or conventional details are not described in order to provide a concise discussion of embodiments of the present disclosure.
As used herein, the terms, “comprises” and “comprising” are to be construed as being inclusive and open ended, and not exclusive. Specifically, when used in the specification and claims, the terms, “comprises” and “comprising” and variations thereof mean the specified features, steps or components are included. These terms are not to be interpreted to exclude the presence of other features, steps or components.
As used herein, the term “by way of example” means serving as an example, instance, or illustration and should not be construed as preferred or advantageous over other configurations disclosed herein.
As used herein, “operably connected” or “operably attached” means that the two elements are connected or attached either directly or indirectly. Accordingly, the items need not be directly connected or attached but may have other items connected or attached therebetween.