Remotely controlling a self-propelled device in a virtualized environment

Abstract
A computing device operating as a controller can obtain image data from a camera component. The computing device can determine a location of a self-propelled device relative to the camera based on the image data. Virtual content may be generated on the computing device based at least in part on the location of the self-propelled device.
Description
TECHNICAL FIELD

Examples described herein relate to remote control of devices, and more specifically, to remotely controlling devices in a virtualized environment.


BACKGROUND

Various kinds of remotely controllable devices exist. For example, hobbyists often operate “RC” devices in the form of cars, trucks, airplanes and helicopters. Such devices typically receive commands from a controller device, and alter movement (e.g., direction or velocity) based on the input. Some devices use software-based controllers, which can be implemented in the form of an application running on a device such as a smart phone or a tablet.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates a system for remotely controlling a self-propelled device in the context of a virtual environment, according to an embodiment.



FIG. 2 illustrates a method for controlling a self-propelled device for use in a virtual environment, according to an embodiment.



FIG. 3 illustrates a method for implementing a virtual environment for use with a remotely controlled self-propelled device, according to an embodiment.



FIG. 4 illustrates an example of a virtual environment on a device that controls movement of another device.



FIG. 5 illustrates an example hardware diagram of a computing device on which a controller for a self-propelled device can be implemented.



FIG. 6A and FIG. 6B illustrate examples of self-propelled devices, under some embodiments.





DETAILED DESCRIPTION

According to some embodiments, a computing device is operated to process image data in order to track a movement or position of a self-propelled device.


Still further, in some embodiments, a self-propelled device is tracked, and position information determined from tracking the self-propelled device is integrated with a virtual environment. The virtual environment can include facets that are based on the position and/or state of the object, as well as on other aspects of the real-world environment as determined from tracking and/or communicating with the self-propelled device.


In an embodiment, image data is generated by a camera component. The camera component can be provided as part of, or alternatively coupled to, a computing device that acts as a controller for a self-propelled device. From the image data, a location of the self-propelled device can be determined (e.g., relative to the computing device). Content can be generated based at least in part on the location of the self-propelled device as the self-propelled device is moved or otherwise operated through the controller.


In some variations, sensor input is received from the self-propelled device. In particular, the sensor input can be indicative of the position of the self-propelled device. By way of example, the self-propelled device can include a gyroscope, an inertial mass unit (IMU), a GPS, an accelerometer, a light sensor, and/or proximity sensor. The sensor information communicated from the self-propelled device can include readings or output from the sensors. Additionally, the sensor information communicated from the self-propelled device can be in raw form, or in processed form (e.g., numerical values determined from a combination of sensors).
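
As a minimal sketch, assuming a simple JSON payload over the wireless link, sensor input of this kind could be represented on the controller as follows; the field names and the SensorReading structure are illustrative assumptions, not part of the disclosure.

import json
from dataclasses import dataclass

@dataclass
class SensorReading:
    gyro: tuple          # angular-rate readings (x, y, z), raw form
    accel: tuple         # accelerometer readings (x, y, z), raw form
    heading_deg: float   # processed value derived from a combination of sensors

def parse_sensor_input(payload: bytes) -> SensorReading:
    # Decode a sensor payload communicated from the self-propelled device.
    data = json.loads(payload)
    return SensorReading(gyro=tuple(data["gyro"]),
                         accel=tuple(data["accel"]),
                         heading_deg=float(data["heading_deg"]))

reading = parse_sensor_input(
    b'{"gyro": [0.1, 0.0, 2.3], "accel": [0.0, 0.0, 9.8], "heading_deg": 45.0}')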


According to another embodiment, a computing device operating as a controller can obtain image data from a camera component. The computing device can determine a location of the self-propelled device relative to the camera based on the image data. Virtual content may be generated on the computing device based at least in part on the location of the self-propelled device.


As used herein, the term “substantially” means at least almost entirely. In quantitative terms, “substantially” means at least 80% of a stated reference (e.g., quantity or shape).


In similar regard, “spherical” or “sphere” means “substantially spherical.” An object is spherical if it appears rounded, contoured or elliptical to a user. Objects that include, for example, half-domes, quarter-domes, or elliptical shapes (with the dimension of one axis larger than another) can be considered spherical as used herein.




FIG. 1 illustrates a system for remotely controlling a self-propelled device in the context of a virtual environment, according to an embodiment. In FIG. 1, system 50 includes a controller 100 and a self-propelled device (“SPD”) 10. The controller 100 can correspond to, for example, a mobile computing device, such as a voice/telephony device for cellular communications, or a roaming device that can use local wireless or similar communications to communicate with network nodes and/or other devices. By way of example, controller 100 can correspond to a smart phone, tablet, netbook, or other mobile computing device that utilizes application layer logic (e.g., a downloaded application or “app”) or other programming to control the SPD 10. In variations, the controller 100 can be implemented as a dedicated or specialized device that controls self-propelled devices.


As described by various embodiments, SPD 10 can be operated to move under control of another device, such as controller 100. In some embodiments, SPD 10 is configured with resources that enable one or more of the following: (i) maintain self-awareness of orientation and/or position relative to an initial reference frame after the device initiates movement; (ii) process control input programmatically, so as to enable a diverse range of program-specific responses to different control inputs; (iii) enable another device to control its movement using software or programming logic that is communicative with programming logic on the self-propelled device; and/or (iv) generate an output response for its movement and state that is software interpretable by the control device.


With further reference to FIG. 1, controller 100 includes a user interface 110, an object detector 120, and an SPD interface 130. As described in greater detail, the user interface 110 can be operated to generate content that is based partially on the SPD 10, and/or the location of the SPD 10 relative to the controller 100. In particular, the user interface 110 can be used to generate content that provides a virtual (or augmented reality) context for the SPD 10. The user interface 110 can also include a command interface 112 that translates user input and/or other events into commands 119 for controlling the SPD 10 via the SPD interface 130.


In operation, controller 100 may include a camera interface 134, which obtains image data 113 depicting a real-time state of the SPD 10 in a scene that is captured by the camera of the controller 100. In one embodiment, the camera interface 134 receives image data that represents a series of images (or a video) of the scene in which the SPD 10 is present (e.g., moving under control of the controller 100). The object detector 120 can include image processing logic to analyze the image data 113, and to identify objects of interest from the image data 113. In particular, the object detector 120 can process the image data 113 in order to identify a depiction of the SPD 10, and optionally, a depiction of objects or landmarks in the scene that are deemed to be of interest.


In some implementations, the object detector 120 includes image analysis logic for analyzing the image data 113 for the presence of spheres, half-spheres, or portions thereof (e.g., a quadrant of a sphere). Thus, object detector 120 can include image processing logic for locating a shape that is indicative of SPD 10 (e.g., detect a circle or portion thereof that is indicative of a spherical aspect of the SPD 10). Other characteristics, such as illumination (e.g., LED illumination) or structural features of the SPD 10, can also be used to detect the SPD 10, or to determine an orientation or other aspect of the SPD 10 in its environment. The object detector 120 can also include image processing resources that are configured to detect specific kinds of objects other than spheres, such as objects typically encountered in the real-world environment of SPD 10. These can include, for example, wall structures, table legs, or surfaces (e.g., carpet, grass, etc.).
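
As a minimal sketch of such image analysis, assuming the OpenCV library is available on the controller, circle detection of the kind described could be implemented as follows; the function name and parameter values are illustrative assumptions.

import cv2
import numpy as np

def detect_sphere(frame_bgr):
    # Return (x, y, radius) in pixels for the most prominent detected circle, or None.
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)  # reduce noise before circle detection
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=50,
                               param1=100, param2=30, minRadius=5, maxRadius=200)
    if circles is None:
        return None
    x, y, r = max(np.round(circles[0]).astype(int), key=lambda c: c[2])
    return int(x), int(y), int(r)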


In one implementation, object detector 120 uses dimensional analysis to determine, from the image data 113, a relative position or distance of the SPD 10 from controller 100. In particular, one embodiment provides for the SPD 10 to be spherical. The object detector 120 can use dimensional analysis by comparing a dimension of the depiction for the SPD 10 (e.g., circle or portion thereof) with a reference dimension for the spherical housing of the self-propelled device.
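
A minimal sketch of the dimensional analysis, assuming a pinhole-camera model, is shown below; the focal length and sphere diameter used here are illustrative assumptions rather than values from the disclosure.

def estimate_distance_m(pixel_diameter: float,
                        real_diameter_m: float = 0.074,
                        focal_length_px: float = 1200.0) -> float:
    # Approximate the camera-to-sphere distance from the detected circle size.
    if pixel_diameter <= 0:
        raise ValueError("pixel_diameter must be positive")
    return focal_length_px * real_diameter_m / pixel_diameter

# A 74 mm sphere imaged at 100 pixels across is roughly 0.89 m from the camera.
print(round(estimate_distance_m(100.0), 2))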


In one variation, the controller 100 can also use local sensor information 109, provided from sensor devices resident on the computing device of controller 100. For example, local sensor information 109 can be provided by an accelerometer, gyroscope, magnetometer, or Global Positioning System (GPS) device that is resident on the computing device of the controller 100. As an addition or alternative, some implementations provide for the controller 100 to use the relative height of the controller 100 (e.g., distance from ground), and/or the orientation of the controller 100 with respect to a horizontal plane, in order to determine position information for the SPD 10.


As an addition or alternative, SPD 10 can communicate sensor input 11 to the controller 100. The sensor input 11 can correspond to, for example, information determined from an inertial mass unit (“IMU”), gyroscope, accelerometer, magnetometer, or GPS. The sensor input 11 can be either raw data or data processed on the SPD 10 before being communicated to the controller 100. In variations, the controller 100 includes sensor logic 128, either with the object detector 120, or as a separate logical component (e.g., plug-in), to handle device sensor input 111 (corresponding to sensor input 11 provided from the SPD 10). The sensor logic 128 can determine sensor location information 115, corresponding to information that is indicative, probative or otherwise relevant to a position, relative location, or physical state of the SPD 10. Sensor location information 115 can be used by the object detector 120, in combination with image data 113, to determine a relative position for the SPD 10. The information determined for the SPD 10 can include, for example, a distance of the SPD 10 from the controller 100 or a reference point in one, two, or three dimensions, or a coordinate of the SPD 10 within a particular reference frame. The information determined from the SPD 10 can also be indicative of a physical state of the device (e.g., an LED on the device is on, the device is going downhill or has hit an obstruction, etc.).
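
As a minimal sketch, one way the sensor location information 115 could be combined with the image-based estimate is a simple weighted blend, shown below; the weighting scheme is an assumption for illustration, not the method of the disclosure.

def fuse_position(image_estimate_m: float,
                  sensor_estimate_m: float,
                  image_weight: float = 0.7) -> float:
    # Blend two distance estimates, weighting the image-derived value more heavily.
    image_weight = min(max(image_weight, 0.0), 1.0)
    return image_weight * image_estimate_m + (1.0 - image_weight) * sensor_estimate_m

fused = fuse_position(image_estimate_m=0.89, sensor_estimate_m=0.95)  # -> 0.908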


In other variations, the position information for the SPD 10 can be communicated by the SPD to the controller 100 via a wireless link (e.g., Bluetooth). For example, the SPD 10 may be self-aware by way of its own geo-aware resources. In particular, the SPD 10 can include sensors and devices such as an accelerometer, a Global Positioning System (GPS) unit, a gyroscope and/or a magnetometer.


The object detector 120 can communicate raw position information 117 to one or more content output components of the controller 100. In one implementation, a coordinate mapping component 122 maps the raw position information 117 to coordinate positions 121 of an alternative reference frame that is specific to a particular virtual environment. The alternative reference frame can be generated and maintained through, for example, the content generation component 124, as part of a virtual environment. The content generation component 124 may independently use content input from, for example, a content library 125 in order to generate aspects of a virtual environment.


In some variations, the content generation component 124 can obtain virtual environment parameters 129 for use in creating specific virtual environments. For example, specific games or gaming scenarios may carry alternative virtual environment parameters 129. The virtual environment parameters 129 can, for example, (i) map raw position information 117 to virtual coordinates, (ii) convert detected objects into graphical representations (e.g., transfer a real-world object into an anime gaming feature), and (iii) provide rules for a physics engine (e.g., application of Newtonian physics using the virtual reference frame). Using the virtual environment parameters 129 and the content library 125, content generation component 124 can update the virtual environment using the raw position information 117 determined from tracking the position of the SPD 10. For example, a graphic representation of the SPD 10 can be reflected in the virtual environment, based on the position information and the alternative coordinate system provided by the coordinate mapping component 122.
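
A minimal sketch of such a mapping, assuming the virtual environment parameters 129 reduce to a scale factor and an origin offset, is shown below; the parameter names and values are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class VirtualEnvironmentParams:
    scale: float          # real-world meters to virtual-world units
    origin_offset: tuple  # where the real-world origin lands in virtual space

def to_virtual_coords(raw_xy_m, params: VirtualEnvironmentParams):
    # Map a real-world (x, y) position in meters into virtual coordinates.
    x, y = raw_xy_m
    ox, oy = params.origin_offset
    return (x * params.scale + ox, y * params.scale + oy)

params = VirtualEnvironmentParams(scale=40.0, origin_offset=(320.0, 240.0))
virtual_pos = to_virtual_coords((0.5, -0.25), params)  # -> (340.0, 230.0)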


As an addition or alternative, the content generation component 124 can include logic to (i) flag certain coordinates (e.g., determined from, or corresponding to raw position information 117) as landmarks or points of interest for future use in a virtual environment, and/or (ii) access a data store of historical content (“historical content store 127”) that is based on prior landmarks or points of interest.


The content generation component 124 can provide content output 123 for user interface 110. The user interface 110 can create a presentation that depicts the content output 123. The user interface 110 can also include or modify the content output 123 to allow for input from the user that can affect operation of the SPD 10, as well as to permit manipulation of the content presented. For example, the user interface 110 can include a framework of graphic features that the user can interact with in order to (i) alter a virtual aspect provided from content generation component 124, and/or (ii) adjust performance or operation (e.g., speed up, change direction, stop, execute sequence, spin, etc.) of the SPD 10.


In more detail, the user interface 110 can include the command interface 112 to enable the user to control the SPD 10. For example, the command interface 112 can correspond to one or more features or aspects of the virtual environment that allow the user to enter input for purpose of controlling movement, operations and other aspects of the SPD 10. As described elsewhere, subsequent control of SPD 10 (e.g., movement) can also affect the virtual environment.
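
As a minimal sketch, assuming a touch-screen joystick style of input, the command interface 112 could translate a drag vector into a drive command as follows; the command fields and the 100-pixel scaling are illustrative assumptions.

import math

def drag_to_command(dx: float, dy: float, max_speed: float = 1.0) -> dict:
    # Convert a drag vector on screen into a heading/speed drive command.
    heading_deg = (math.degrees(math.atan2(dx, -dy)) + 360.0) % 360.0
    speed = min(math.hypot(dx, dy) / 100.0, max_speed)  # 100 px of drag = full speed
    return {"cmd": "drive", "heading_deg": heading_deg, "speed": speed}

command = drag_to_command(dx=30.0, dy=-40.0)  # drive up and to the right at half speed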


Methodology



FIG. 2 illustrates a method for controlling a self-propelled device for use in a virtual environment, according to an embodiment. FIG. 3 illustrates a method for implementing a virtual environment for use with a remotely controlled self-propelled device, according to an embodiment. Examples such as described with FIG. 2 and FIG. 3 can be implemented using components of the system such as described with FIG. 1. Accordingly, reference may be made to elements of FIG. 1 for purpose of illustrating suitable components or elements for performing a step or sub-step being described.


With reference to FIG. 2, image data can be generated for a particular scene in real time, with respect to the operation or movement of a self-propelled device (210). For example, controller 100 can be operated on a computing device that includes a camera. A user can point the camera to capture the SPD 10 (e.g., operate the camera in video mode) in order to receive image data 113.


A location of a self-propelled device is detected from the scene (220). According to some embodiments, the image data is used to detect a location of the self-propelled device within a given scene. For example, in order to locate the SPD 10, the image processing logic can include object-type specific detectors to locate (i) shapes, such as circles, semi-circles, or portions thereof, in implementations in which the SPD 10 is a sphere, (ii) surface color (e.g., white), (iii) surface ornamentation or marking (e.g., visual code), or (iv) specific characteristic structural features. The image processing logic can also include object-type specific detectors to identify, from the image data, objects that are likely to be found in the environment of the SPD 10. For example, object detector 120 can implement image processes to detect walls, rugs, specific kinds of floorings, pet dishes, steps, or lighting conditions that may be present in the environment of the SPD 10.


As an addition or alternative, controller 100 can use sensor data communicated from the SPD 10 (e.g., sensor input 11) in order to determine the position of the SPD 10 in a given scene. For example, SPD 10 can include resources for being aware of its position relative to, for example, controller 100. U.S. patent application Ser. No. 13/342,853, which is incorporated by reference herein, describes various sensor inputs and logic for enabling a device such as SPD 10 to be self-aware of its location.


Content is generated based on the location of the SPD 10 (230). In particular, various kinds of virtual content can be generated that are based on movement, physical presence, or other aspects of SPD 10 in the real-world environment. In one implementation, the content provided includes a map or other depiction of physical space (232). For example, the SPD 10 can be operated in a room, and a graphic map of the room, or of an alternative environment (e.g., a gaming environment, such as a race track), can be generated that is based on the position of the SPD 10.


As an addition or alternative, the generated content can correspond to a virtual environment (234), such as, for example, an environment in which the SPD 10 has a virtual representation, and the surrounding environment is based partly on the real-world environment of the SPD 10. A set of virtual parameters can be determined as part of generating content. For example, a virtual map can be structured, and/or used as a framework or basis for conducting other events and aspects of a virtual environment. The virtual map can be determined from, for example, a path of the SPD 10, identified landmarks, and other events, then translated into an alternative reference frame for the specific virtual environment.


With respect to FIG. 3, the location of a self-propelled device is determined from image data (310). In one implementation, the self-propelled device can have a geometry or shape that is detectable from the surrounding environment when viewed through image data. Dimensional analysis can be performed on the detected shape within the image data in order to determine a relative position (e.g., distance) of the self-propelled device with respect to a particular reference, such as the location where the image is captured (e.g., controller 100). As an example, the self-propelled device can be implemented as a sphere, and image processing logic of the controller can be configured to detect circles or ellipses (or portions thereof) that can correspond to the sphere. Based on the dimensions of the circle or ellipse, the relative position of the self-propelled device can be determined. The relative position can be determined as a coordinate (e.g., in 2- or 3-D), or as a single linear dimension, such as distance from the point where the image is captured.


In an embodiment, the location of the self-propelled device is recorded as the object moves about its scene (320). Thus, the self-propelled device can be tracked as it moves about. A path can be determined for the self-propelled device based on its past locations in a given timeframe.


The locations of the self-propelled device can be recorded over a given duration (330). For example, the device can be operated in a given time frame, and the movement of the device can be sampled by way of image processing. The self-propelled device can also be tracked so that the device's path can be determined for the given time period. The path or recorded locations of the self-propelled device can then be integrated into a virtual environment. More generally, the position of the self-propelled device can be integrated with content that is otherwise independent of the real-world environment of the self-propelled device.
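
A minimal sketch of recording locations over a given duration and extracting the recent path is shown below; the sampling interface and the time-window default are illustrative assumptions.

import time

class PathRecorder:
    def __init__(self):
        self.samples = []  # list of (timestamp, (x, y)) tuples

    def record(self, position_xy):
        # Store a sampled location together with the time it was observed.
        self.samples.append((time.time(), position_xy))

    def path(self, since_seconds: float = 60.0):
        # Return the positions observed within the most recent time window.
        cutoff = time.time() - since_seconds
        return [pos for ts, pos in self.samples if ts >= cutoff]

recorder = PathRecorder()
recorder.record((0.0, 0.0))
recorder.record((0.2, 0.1))
recent_path = recorder.path()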


In particular, some embodiments provide for recording landmarks that the self-propelled device experiences (332). Landmarks can include, for example, a starting point, an obstruction such as a wall or table leg, variations of the underlying surface on which the self-propelled device operates (e.g., type of flooring), variations in lighting (e.g., a well-lit place versus a dark place), and variations in the gradient of the underlying surface.


As another addition or variation, the recorded path of the self-propelled device can be used to generate a map (334). The map can define geographic parameters that are translated into a virtual environment. For example, the walls of the room can be translated into a track. The path of the object can also be translated into a virtual path. The map can also include the landmarks, as determined in (332).


EXAMPLE


FIG. 4 illustrates an example of a virtual environment on a device that controls movement of another device. In the example of FIG. 4, a mobile device 400 operates as a controller for a self-propelled device 410. In particular, the mobile device 400 can execute software (e.g., a downloaded application) which creates a virtual environment that is based on the movements of the self-propelled device 410. For example, mobile device 400 can execute different applications, each of which generates a separate virtual environment, and each virtual environment can have corresponding content, a theme, simulated physics, and user involvement (e.g., how and what the user can respond to, input features, etc.).


A camera component 404 of the device 400 can monitor, within its field of view, the region 411 for movement or changes to the physical state of the self-propelled device 410. The region 411 can also extend beyond the field of view. For example, the region 411 can encompass multiple rooms or an extended path traversed by the self-propelled device.


In the example of FIG. 4, the mobile device 400 renders a virtual environment in which the self-propelled device is reflected as a vehicle moving about obstacles. Thus, the self-propelled device is depicted in alternative, graphic form. The mobile device 400 may execute an application that coordinates a virtual environment with the real-world events related to the movement of the self-propelled device 410 in a given area. In the example of FIG. 4, the self-propelled device 410 is depicted as an automobile 412 on the mobile device 400. An application, running on the mobile device 400, renders the self-propelled device 410 as a vehicle. A remainder of the virtual environment can reflect hazards or other facets that tie into what the self-propelled device 410 experiences.


According to some embodiments, the self-propelled device can be moved in the real world amongst obstacles, landmarks, and points of interest. The region 411 in which the self-propelled device 410 is moved in the real world can also be mapped into the virtual environment, and some landmarks or points of interest within the region 411 can be reflected in alternative forms in the virtual environment. For example, real-world obstructions such as walls 415 can be reflected as an obstacle 422 (e.g., one that is smaller in size) in the virtual environment. As another example, a charging dock 418 can be used as a landmark for the self-propelled device in maintaining a point of reference for future use. For example, the location of the charging dock 418 can be recorded and mapped to a virtual garage 428.
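
A minimal sketch of translating recorded landmarks into virtual counterparts is shown below; the landmark types and the virtual assets they map to are illustrative assumptions.

LANDMARK_TO_VIRTUAL = {
    "wall": "obstacle",
    "charging_dock": "garage",
    "table_leg": "obstacle",
}

def landmarks_to_virtual(landmarks):
    # Map (type, position) landmark tuples into virtual-environment objects.
    virtual_objects = []
    for kind, position in landmarks:
        asset = LANDMARK_TO_VIRTUAL.get(kind, "marker")  # fallback placeholder
        virtual_objects.append({"asset": asset, "position": position})
    return virtual_objects

objects = landmarks_to_virtual([("wall", (1.0, 0.0)), ("charging_dock", (0.0, 2.0))])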


In some embodiments, mobile device 400 uses a camera to track the object in a given real-world region. The position of the object in the given region can be reflected in the corresponding virtual environment. Each virtual environment in use on the mobile device 400 can map to the real world based on a set of transformations. Thus, for example, mobile device 400 can use its camera to track the self-propelled device 410, and the position of the self-propelled device (as well as other physical information) can be used to coordinate the position of a virtual object of interest in a corresponding virtual environment.


Additionally, the self-propelled device 410 can be shaped so that it is readily detectable from image data. Still further, the self-propelled device 410 can be shaped so that the relative depth or distance from the mobile device 400 can be determined based on dimensional analysis of the detected shape versus a reference. For example, as shown by an example of FIG. 4, the self-propelled device 410 can be spherical, so that the detected size of the object in the image data can be correlated to a physical distance measure.


Optionally, the mobile device 400 can use sensor input from the self-propelled device, as an alternative or addition to using image data. The self-propelled device can communicate, for example, information determined from the IMU of the device. This information can enable the self-propelled device to determine its own location relative to a particular reference frame. The self-propelled device 410 can signal the information to the mobile device 400 using a wireless link.


Controller Hardware Diagram



FIG. 5 illustrates an example hardware diagram of a computing device on which a controller for a self-propelled device can be implemented. In particular, a computing device 500 can be configured to implement functionality described with, for example, controller 100 as described by some embodiments of FIG. 1. In some implementations, the computing device 500 is a mobile device, such as a cellular telephony/messaging device (e.g., an IPHONE model device manufactured by APPLE INC.) or a tablet (e.g., an IPAD model device manufactured by APPLE INC.), which executes one or more applications for controlling movement and other operations of a self-propelled device. For example, in the context of FIG. 1, system 50 may be implemented by memory and processor resources as described in FIG. 5.


In one embodiment, the computing device 500 includes one or more processors 510, memory resources 520, a display device 530, one or more communication sub-systems 540 (including wireless communication sub-systems), and one or more sensors 560. According to different implementations, the communication sub-systems 540 enable the computing device 500 to exchange wireless communications with a self-propelled device using wireless communication mediums and protocols (e.g., WI-FI, BLUETOOTH, Infrared). In some embodiments, the computing device 500 can also include one or more input mechanisms 550 (e.g., a button, a switch, a touch-sensitive input device).


The processor 510 can be configured with software and/or other logic to perform one or more processes, steps and functions described with the various examples described herein, such as described by FIG. 1 through FIG. 3. Accordingly, the processor 510 can be configured, with instructions and data stored in the memory resources 520, to implement functionality described with controller 100 (as described with FIG. 1). The memory resources 520 may store instructions (e.g., applications) used by the processor 510 in implementing the functionality of the controller 100. For example, the memory resources 520 may store instructions for enabling the processor to (i) determine position information for the self-propelled device, (ii) implement a virtual reality where a graphic representing the self-propelled device is presented, and (iii) communicate and command the self-propelled device.


In performing operations of the controller, the processor 510 may utilize various forms of input. In particular, processor 510 can receive user input via the input mechanisms (e.g., a touch sensor integrated with the display, buttons, voice input, etc.). Furthermore, the processor 510 can also receive sensor input from one or more sensors 560 that are included with the computing device 500. Examples of sensors 560 include accelerometers, proximity sensors, capacitive sensors, light sensors, magnetometers, inertial mass units (IMUs), or gyroscopes. As described with an example of FIG. 1, processor 510 may also receive image data from an image capture device 570, such as a charge-coupled device (CCD) camera that is able to capture video and images.


In one embodiment, the processor 510 can control the display device 530 to provide virtual or “augmented reality” content. As described with other examples, such content may include alternative graphic representations of a self-propelled device, as well as virtual elements or facets which are affected by real-world events, particularly relating to the position, movement and state of the self-propelled device that is under control of the computing device 500.


Self-Propelled Device



FIG. 6A and FIG. 6B illustrate examples of self-propelled devices, under some embodiments. With reference to FIG. 6A, a self-propelled device 600 can be spherical in shape, and include internal components 622 such as a drive system, sensors (e.g., gyroscope, inertial mass unit (IMU), GPS, accelerometer, light sensor, proximity sensor, etc.), processing resources, and a communication interface for communicating with a controller using a wireless communication medium (e.g., Bluetooth, Wi-Fi, etc.). Numerous examples for the construction and operation of the self-propelled device, in accordance with an example such as provided in FIG. 6A and elsewhere in this application, are disclosed in, for example, U.S. patent application Ser. No. 13/342,863 and U.S. patent application Ser. No. 13/261,647; both of which are hereby incorporated by reference in their entirety.


In some examples, the self-propelled device 600 includes a housing 610 that is substantially spherical. In variations, the housing can be elliptical, semi-spherical, or include spherical portions. Other suitable geometric shapes may also be used. However, one advantage provided by using a spherical shape for the housing 610 is that the shape of the device remains symmetrical when viewed from various angles. This facilitates using image analysis for the purpose of locating the object by relative position or distance.


With reference to FIG. 6B, a combination device 650 includes a second device 660 and a spherical housing element 670. The spherical housing element 670 may be modularized so as to be assembled onto the second device 660 as a self-contained unit, either during manufacturing or after-sale. For example, the second device 660 can correspond to a radio-frequency controlled device (e.g., car) that is augmented with the spherical housing element 670 to facilitate detection/position determination with use of image processing. In particular, the exposed spherical aspect of the housing element 670 can be imaged and subjected to dimensional or geometric analysis in order to approximate position information (e.g., distance from camera). As with other examples, the combined device can also include a variety of sensors 672, such as gyroscope, inertial mass unit (IMU), GPS, accelerometer, light sensor, or proximity sensor. For example, the housing element 670 can use wireless communications to signal additional sensor data to a controller device. The sensor data communicated from the sensors can also be indicative or determinative of the position of the housing element 670.


It is contemplated for embodiments described herein to extend to individual elements and concepts described, independently of other concepts, ideas or system, as well as to combinations of elements recited anywhere in this application. Although numerous examples are described in detail with reference to the accompanying drawings, it is to be understood that the embodiments extend beyond the described examples. Furthermore, it is contemplated that a particular feature described either individually or as part of an embodiment can be combined with other individually described features, or parts of other embodiments, even if the other features and embodiments make no mention of the particular feature.

Claims
  • 1. A computer-implemented method for operating a computing device, the method comprising: generating image data by a camera component of the computing device; programmatically detecting, by the computing device, from the image data, a location of a self-propelled device relative to the computing device; receiving, at the computing device, input from the self-propelled device, the input including sensor information that is obtained on the self-propelled device; and generating, on the computing device, content based on the location of the self-propelled device as the self-propelled device moves, wherein the self-propelled device is remote from the computing device.
  • 2. The computer-implemented method of claim 1, wherein programmatically detecting the location of the self-propelled device comprises image processing the image data.
  • 3. The computer-implemented method of claim 2, wherein image processing the image data comprises utilizing an object-type specific detector to locate at least one of a predetermined shape, a predetermined color, a predetermined visual code, and a predetermined structural feature.
  • 4. The computer-implemented method of claim 1, further comprising programmatically detecting, from the image data generated by the camera component of the computing device, a location of an object relative to at least one of the computing device and the self-propelled device.
  • 5. The computer-implemented method of claim 1, wherein the sensor information comprises information obtained from at least one of a gyroscope, an inertial mass unit, a GPS, an accelerometer, a light sensor, and a proximity sensor located on the self-propelled device.
  • 6. The computer-implemented method of claim 1, further comprising displaying the generated content on the computing device.
  • 7. The computer-implemented method of claim 6, wherein the generated content comprises a map.
  • 8. The computer-implemented method of claim 6, wherein the generated content comprises a depiction of a physical space in which the self-propelled device is located.
  • 9. The computer-implemented method of claim 6, wherein the generated content corresponds to a virtual environment.
  • 10. The computer-implemented method of claim 9, wherein the generated content comprises a graphic form of the self-propelled device.
  • 11. The computer-implemented method of claim 1, further comprising: identifying a landmark from the image data generated by the camera component of the computing device; and translating the landmark into an object, wherein the generated content comprises a virtual environment and the object.
  • 12. The computer-implemented method of claim 11, further comprising displaying the generated content on the computing device.
RELATED APPLICATIONS

This application is a continuation of patent application Ser. No. 13/766,455, entitled “REMOTELY CONTROLLING A SELF-PROPELLED DEVICE IN A VIRTUALIZED ENVIRONMENT”, filed Feb. 13, 2013, which is a continuation-in-part of U.S. patent application Ser. No. 13/342,853, entitled “SELF-PROPELLED DEVICE WITH ACTIVELY ENGAGED DRIVE SYSTEM”, filed Jan. 3, 2012, now U.S. Pat. No. 8,571,781, issued Oct. 29, 2013, which claims the benefit of U.S. Provisional Application No. 61/430,023, entitled “METHOD AND SYSTEM FOR CONTROLLING A ROBOTIC DEVICE”, filed Jan. 5, 2011; U.S. Provisional Application No. 61/430,083, entitled “SYSTEM AND METHOD FOR ESTABLISHING 2-WAY COMMUNICATION FOR CONTROLLING A ROBOTIC DEVICE”, filed Jan. 5, 2011; and U.S. Provisional Application No. 61/553,923, entitled “A SELF-PROPELLED DEVICE AND SYSTEM FOR CONTROLLING SAME”, filed Oct. 31, 2011; all of the aforementioned applications being hereby incorporated by reference in their respective entirety for all purposes.

US Referenced Citations (415)
Number Name Date Kind
90546 Huntington May 1869 A
933623 Cecil Sep 1909 A
1263262 McFaul Apr 1918 A
2769601 Hagopian Nov 1956 A
2949696 Easterling Aug 1960 A
2977714 Gibson Apr 1961 A
3313365 Jackson Apr 1967 A
3667156 Tomiyama Jun 1972 A
3683216 Post Aug 1972 A
3821995 Aghnides Jul 1974 A
4310987 Chieffo Jan 1982 A
4519466 Shiraishi May 1985 A
4541814 Martin Sep 1985 A
4601675 Robinson Jul 1986 A
4893182 Gautraud Jan 1990 A
4897070 Wagstaff Jan 1990 A
4996468 Field et al. Feb 1991 A
5087000 Suto Feb 1992 A
5213176 Oroku et al. May 1993 A
5297951 Asai Mar 1994 A
5297981 Maxim et al. Mar 1994 A
5342051 Rankin et al. Aug 1994 A
5413345 Nauck May 1995 A
5439408 Wilkinson Aug 1995 A
5489099 Rankin et al. Feb 1996 A
5513854 Daver Mar 1996 A
5595121 Elliot Jan 1997 A
5628232 Bakholdin et al. May 1997 A
5644139 Allen et al. Jul 1997 A
5676582 Lin Oct 1997 A
5739657 Takayama et al. Apr 1998 A
5759083 Polumbaum et al. Jun 1998 A
5780826 Hareyama et al. Jul 1998 A
5793142 Richard Sep 1998 A
5871386 Bart et al. Feb 1999 A
5952796 Colgate et al. Sep 1999 A
5953056 Tucker Sep 1999 A
6017272 Rieder Jan 2000 A
6021222 Yamagata Feb 2000 A
6144128 Rosen Nov 2000 A
6227933 Michaud et al. May 2001 B1
6246927 Dratman Jun 2001 B1
6267673 Miyamoto et al. Jul 2001 B1
6315667 Steinhart Nov 2001 B1
6320352 Terazoe Nov 2001 B2
6390213 Bleicher May 2002 B1
6439956 Ho Jul 2002 B1
6430471 Kintou Aug 2002 B1
6449010 Tucker Sep 2002 B1
6456938 Bernard Sep 2002 B1
6458008 Hyneman Oct 2002 B1
6459955 Bartsch et al. Oct 2002 B1
6502657 Kerrebrock et al. Jan 2003 B2
6535793 Allard Mar 2003 B2
6573883 Bartlett Jun 2003 B1
6584376 Van Kommer Jun 2003 B1
6604181 Moriya Aug 2003 B1
6615109 Matsuoka et al. Sep 2003 B1
6764373 Osawa et al. Jul 2004 B1
6785590 Kasuga et al. Aug 2004 B2
6786795 Mullaney et al. Sep 2004 B1
6789768 Kalisch Sep 2004 B1
6856696 Ajioka Feb 2005 B1
6859555 Fang Feb 2005 B1
6901110 Tsougarakis et al. May 2005 B1
6902464 Lee Jun 2005 B1
6945843 Motosko Sep 2005 B1
6980956 Takagi et al. Dec 2005 B1
7058205 Jepson et al. Jun 2006 B2
7069113 Matsuoka et al. Jun 2006 B2
7130741 Bodin et al. Oct 2006 B2
7170047 Pal Jan 2007 B2
7173604 Marvit et al. Feb 2007 B2
7258591 Xu et al. Aug 2007 B2
7283647 McNitt Oct 2007 B2
7292711 Kiraly et al. Nov 2007 B2
7298869 Abernathy Nov 2007 B1
7324663 Kiraly et al. Jan 2008 B2
7328671 Kates Feb 2008 B2
7340077 Gokturk et al. Mar 2008 B2
7340344 Chappell Mar 2008 B2
7344430 Hasty et al. Mar 2008 B2
7409924 Kates Aug 2008 B2
7424867 Kates Sep 2008 B2
7432718 Ishihara et al. Oct 2008 B2
7463001 Tsurukawa Dec 2008 B2
7499077 Li Mar 2009 B2
7501780 Yamamoto Mar 2009 B2
7526362 Kim et al. Apr 2009 B2
7538764 Salomie May 2009 B2
7639874 Bushell et al. Dec 2009 B2
7699683 Caspi Apr 2010 B2
7702131 Chinen et al. Apr 2010 B2
7714880 Johnson May 2010 B2
7714895 Pretlove et al. May 2010 B2
7726422 Sun et al. Jun 2010 B2
7729537 Grady Jun 2010 B2
7755660 Nejikovsky et al. Jul 2010 B2
7773773 Abercrombie Aug 2010 B2
7822507 Ishihara et al. Oct 2010 B2
7847504 Hollis Dec 2010 B2
7853357 Sawada et al. Dec 2010 B2
7889226 Pescatore et al. Feb 2011 B2
7957837 Ziegler et al. Jun 2011 B2
7979162 Niemela et al. Jul 2011 B2
8025551 Torres et al. Sep 2011 B2
8038504 Wong Oct 2011 B1
8077914 Kaplan Dec 2011 B1
8099189 Kaznov et al. Jan 2012 B2
8128450 Imai Mar 2012 B2
8128500 Borst et al. Mar 2012 B1
8142287 Podoloff Mar 2012 B2
8144118 Hildreith Mar 2012 B2
8180436 Boyden et al. May 2012 B2
8190295 Garretson May 2012 B1
8195333 Ziegler et al. Jun 2012 B2
8197298 Willett Jun 2012 B2
8210289 Lu et al. Jul 2012 B1
8258917 Cai et al. Sep 2012 B2
8269447 Smoot et al. Sep 2012 B2
8274406 Karlsson et al. Sep 2012 B2
8275544 Wells et al. Sep 2012 B1
8326469 Phillips et al. Dec 2012 B2
8330639 Wong et al. Dec 2012 B2
8352643 Birnbaum et al. Jan 2013 B2
8355818 Nielsen et al. Jan 2013 B2
8364136 Hoffberg et al. Jan 2013 B2
8376756 Robb Feb 2013 B2
8392065 Tolstedt et al. Mar 2013 B2
8396611 Phillips et al. Mar 2013 B2
8400619 Bernstein et al. Mar 2013 B1
8417384 Togawa et al. Apr 2013 B2
8430192 Gillett Apr 2013 B2
8442661 Blackwell et al. May 2013 B1
8459383 Burget Jun 2013 B1
8522902 Gomi et al. Sep 2013 B2
8523846 Makino Sep 2013 B2
8540038 Ullman Sep 2013 B1
8571781 Bernstein et al. Oct 2013 B2
8577595 Zhao et al. Nov 2013 B2
8600600 Jung Dec 2013 B2
8670889 Kaznov Mar 2014 B2
8672062 Schroll et al. Mar 2014 B2
8751063 Bernstein et al. Jun 2014 B2
8766983 Marks et al. Jul 2014 B2
8788130 Tran et al. Jul 2014 B1
8805947 Kuzkin Aug 2014 B1
8811675 Chadranshekar Aug 2014 B2
8838273 Hvass et al. Sep 2014 B2
8854392 Child Oct 2014 B2
8862301 Araki et al. Oct 2014 B2
8882559 Fessenmaier Nov 2014 B2
8885882 Yin et al. Nov 2014 B1
9008860 Waldock Apr 2015 B2
9011197 Smoot et al. Apr 2015 B2
9014848 Farlow et al. Apr 2015 B2
9041622 McCulloch May 2015 B2
9090214 Bernstein et al. Jul 2015 B2
9114838 Bernstein et al. Aug 2015 B2
9150263 Bernstein et al. Oct 2015 B2
9171211 Keat Oct 2015 B2
9193404 Bernstein et al. Nov 2015 B2
9211920 Bernstein et al. Dec 2015 B1
9218316 Bernstein et al. Dec 2015 B2
9280717 Polo et al. Mar 2016 B2
9290220 Bernstein et al. Mar 2016 B2
9292758 Polo et al. Mar 2016 B2
9389612 Bernstein et al. Jul 2016 B2
9394016 Bernstein et al. Jul 2016 B2
9395725 Berstein et al. Jul 2016 B2
9429940 Bernstein et al. Aug 2016 B2
9457730 Berstein et al. Oct 2016 B2
9481410 Bernstein et al. Nov 2016 B2
9483876 Polo et al. Nov 2016 B2
9558612 Lyons Jan 2017 B2
20020011368 Berg Jan 2002 A1
20020036104 Kerrebrock et al. Mar 2002 A1
20020142701 Rosenberg Oct 2002 A1
20030093182 Yokoyama May 2003 A1
20030118217 Kondo et al. Jun 2003 A1
20030179176 Waterston Sep 2003 A1
20030216834 Allard Nov 2003 A1
20030216835 Wakui Nov 2003 A1
20040002843 Robarts et al. Jan 2004 A1
20040013295 Sabe Jan 2004 A1
20040015266 Skoog Jan 2004 A1
20040158357 Lee Aug 2004 A1
20040168837 Michaud et al. Sep 2004 A1
20040182614 Yoshiaki Sep 2004 A1
20040186623 Dooley et al. Sep 2004 A1
20040192163 Siegel Sep 2004 A1
20040198159 Xu et al. Oct 2004 A1
20050004723 Duggan et al. Jan 2005 A1
20050041839 Saitou Feb 2005 A1
20050091684 Kawabata Apr 2005 A1
20050186884 Evans Aug 2005 A1
20050216186 Dorfman Sep 2005 A1
20050226192 Red Oct 2005 A1
20050264472 Rast Dec 2005 A1
20060080802 Tani Apr 2006 A1
20060095158 Lee et al. May 2006 A1
20060101465 Kato et al. May 2006 A1
20060132318 Shimizu Jun 2006 A1
20060164261 Stiffler Jul 2006 A1
20060241812 Jung Oct 2006 A1
20060271251 Hopkins Nov 2006 A1
20070034734 Yoeli Feb 2007 A1
20070078004 Suzuki Apr 2007 A1
20070085706 Feyereisen et al. Apr 2007 A1
20070112462 Kim et al. May 2007 A1
20070150103 Im Jun 2007 A1
20070162862 Ogasawara Jul 2007 A1
20070192910 Vu Aug 2007 A1
20070215394 Sun Sep 2007 A1
20070249422 Podoloff Oct 2007 A1
20070259592 Imai et al. Nov 2007 A1
20070282484 Chung et al. Dec 2007 A1
20080009965 Bruemmer et al. Jan 2008 A1
20080012518 Yamamoto Jan 2008 A1
20080033641 Medalia Feb 2008 A1
20080077284 Swope Mar 2008 A1
20080082208 Hong Apr 2008 A1
20080086236 Saito Apr 2008 A1
20080086241 Phillips et al. Apr 2008 A1
20080121097 Rudakevych et al. May 2008 A1
20080174268 Koo et al. Jul 2008 A1
20080174448 Hudson Jul 2008 A1
20080182479 Elliott et al. Jul 2008 A1
20080240507 Niwa et al. Oct 2008 A1
20080263628 Norman et al. Oct 2008 A1
20080267450 Sugimoto et al. Oct 2008 A1
20080269949 Norman et al. Oct 2008 A1
20090016583 Wolf Jan 2009 A1
20090018712 Duncan Jan 2009 A1
20090028439 Elangovan et al. Jan 2009 A1
20090033623 Lin Feb 2009 A1
20090055019 Stiehl et al. Feb 2009 A1
20090057238 Garti Mar 2009 A1
20090069084 Reece Mar 2009 A1
20090073034 Lin Mar 2009 A1
20090078484 Kocijan Mar 2009 A1
20090081923 Dooley et al. Mar 2009 A1
20090118020 Koivisto May 2009 A1
20090133467 Mori et al. May 2009 A1
20090138232 Fuwa May 2009 A1
20090153349 Lin Jun 2009 A1
20090157221 Sip Jun 2009 A1
20090161983 Ciurea Jun 2009 A1
20090164638 Jang Jun 2009 A1
20090171516 Reich Jul 2009 A1
20090187299 Fregene Jul 2009 A1
20090198371 Emanuel et al. Aug 2009 A1
20090204261 Strand et al. Aug 2009 A1
20090222148 Knotts et al. Sep 2009 A1
20090226035 Iihoshi et al. Sep 2009 A1
20090245656 Hu Oct 2009 A1
20090256822 Amireh et al. Oct 2009 A1
20090257741 Greb Oct 2009 A1
20090262074 Nasiri et al. Oct 2009 A1
20090265671 Sachs et al. Oct 2009 A1
20090278932 Yi Nov 2009 A1
20090284553 Seydoux Nov 2009 A1
20090316012 Matos Dec 2009 A1
20100002909 Lefevre et al. Jan 2010 A1
20100004798 Bodin et al. Jan 2010 A1
20100010669 Lee et al. Jan 2010 A1
20100010672 Wang et al. Jan 2010 A1
20100032224 Liu Feb 2010 A1
20100057059 Makino Mar 2010 A1
20100063652 Anderson Mar 2010 A1
20100066676 Kramer et al. Mar 2010 A1
20100084513 Gariepy et al. Apr 2010 A1
20100090661 Chen et al. Apr 2010 A1
20100106344 Edwards et al. Apr 2010 A1
20100145236 Greenberg et al. Jun 2010 A1
20100169098 Patch Jul 2010 A1
20100172287 Krieter Jul 2010 A1
20100178982 Ehrman Jul 2010 A1
20100183195 Sharma Jul 2010 A1
20100234993 Seelinger et al. Sep 2010 A1
20100241289 Sandberg Sep 2010 A1
20100261526 Anderson et al. Oct 2010 A1
20100264756 Lee et al. Oct 2010 A1
20100283988 Mosier et al. Nov 2010 A1
20100302247 Perez et al. Dec 2010 A1
20100302359 Adams Dec 2010 A1
20100305778 Dorneich et al. Dec 2010 A1
20100305781 Felix Dec 2010 A1
20100312917 Allport Dec 2010 A1
20100324753 Okumatsu Dec 2010 A1
20110003640 Ehrman Jan 2011 A9
20110018731 Linsky et al. Jan 2011 A1
20110018794 Linsky et al. Jan 2011 A1
20110022196 Linsky et al. Jan 2011 A1
20110035054 Gal et al. Feb 2011 A1
20110050940 Lanz et al. Mar 2011 A1
20110060492 Kaznov Mar 2011 A1
20110065488 Okamura et al. Mar 2011 A1
20110071652 Brown et al. Mar 2011 A1
20110071702 Wang et al. Mar 2011 A1
20110082566 Herr et al. Apr 2011 A1
20110087371 Sandberg et al. Apr 2011 A1
20110138416 Kang et al. Jun 2011 A1
20110153885 Mak et al. Jun 2011 A1
20110156943 Wong et al. Jun 2011 A1
20110174565 Rochat Jul 2011 A1
20110183732 Block et al. Jul 2011 A1
20110184590 Duggan et al. Jul 2011 A1
20110201362 Bregman-Amitai et al. Aug 2011 A1
20110132671 Lee et al. Sep 2011 A1
20110213278 Horak et al. Sep 2011 A1
20110231013 Smoot et al. Sep 2011 A1
20110234488 Ge et al. Sep 2011 A1
20110237324 Clavin et al. Sep 2011 A1
20110246904 Pinto Oct 2011 A1
20110249869 Stoeffler Oct 2011 A1
20110250967 Kulas Oct 2011 A1
20110249074 Cranfill Nov 2011 A1
20110273379 Chen et al. Nov 2011 A1
20110283223 Vaittinen et al. Nov 2011 A1
20110285349 Widmer et al. Nov 2011 A1
20110286631 Wagner et al. Nov 2011 A1
20110291926 Gokturk et al. Dec 2011 A1
20110294397 Tsai Dec 2011 A1
20110301901 Panagas Dec 2011 A1
20110304633 Beardsley Dec 2011 A1
20110308873 Kim et al. Dec 2011 A1
20110313568 Blackwell et al. Dec 2011 A1
20110320153 Lightcap Dec 2011 A1
20110320830 Ito Dec 2011 A1
20120009845 Schmelzer Jan 2012 A1
20120035799 Ehrmann Feb 2012 A1
20120043149 Kim et al. Feb 2012 A1
20120043172 Ichikawa Feb 2012 A1
20120059520 Kossett Mar 2012 A1
20120065747 Brown et al. Mar 2012 A1
20120072023 Ota Mar 2012 A1
20120083945 Oakley et al. Apr 2012 A1
20120083962 Sato et al. Apr 2012 A1
20120099756 Sherman et al. Apr 2012 A1
20120100915 Margalit et al. Apr 2012 A1
20120106783 Chang et al. May 2012 A1
20120112553 Stoner May 2012 A1
20120129605 Livet May 2012 A1
20120143482 Goossen et al. Jun 2012 A1
20120146775 Kudo et al. Jun 2012 A1
20120149359 Huang Jun 2012 A1
20120155724 Kitamura Jun 2012 A1
20120167014 Joo et al. Jun 2012 A1
20120168240 Wilson Jul 2012 A1
20120173018 Allen et al. Jul 2012 A1
20120173049 Bernstein et al. Jul 2012 A1
20120173050 Berstein et al. Jul 2012 A1
20120185115 Dean Jul 2012 A1
20120193154 Wellborn et al. Aug 2012 A1
20120197439 Wang et al. Aug 2012 A1
20120200380 Kocijan Aug 2012 A1
20120215355 Bewley et al. Aug 2012 A1
20120229647 Calman et al. Sep 2012 A1
20120232977 Calman et al. Sep 2012 A1
20120233015 Calman et al. Sep 2012 A1
20120240077 Vaittinen et al. Sep 2012 A1
20120244969 Binder Sep 2012 A1
20120258645 Cheng Oct 2012 A1
20120262002 Widmer et al. Oct 2012 A1
20120293548 Perez et al. Nov 2012 A1
20120298049 Cook et al. Nov 2012 A1
20120298430 Schroll et al. Nov 2012 A1
20120302129 Persaud Nov 2012 A1
20120306850 Balan et al. Dec 2012 A1
20120307001 Osako et al. Dec 2012 A1
20120309261 Boman et al. Dec 2012 A1
20120311810 Gilbert et al. Dec 2012 A1
20130022274 Lawrence Jan 2013 A1
20130040533 Miller Feb 2013 A1
20130050069 Ota Feb 2013 A1
20130065482 Trickett Mar 2013 A1
20130105239 Fung May 2013 A1
20130109272 Rindlishbacher May 2013 A1
20130113307 Kim et al. May 2013 A1
20130143482 Regier Jun 2013 A1
20130178257 Lengseth Jul 2013 A1
20130200207 Pongratz Aug 2013 A1
20130259386 Chadranshekar Oct 2013 A1
20130265225 Nasiri et al. Oct 2013 A1
20130293584 Anderson et al. Nov 2013 A1
20130307875 Anderson et al. Nov 2013 A1
20130335301 Wong et al. Dec 2013 A1
20140008496 Ye Jan 2014 A1
20140015493 Wirz et al. Jan 2014 A1
20140051513 Polo et al. Feb 2014 A1
20140120887 Huang May 2014 A1
20140176487 Kikuchi Jun 2014 A1
20140207280 Duffley Jul 2014 A1
20140238762 Berberian et al. Aug 2014 A1
20140249697 Fredriksson Sep 2014 A1
20140371954 Lee et al. Dec 2014 A1
20150091697 Takayasu Apr 2015 A1
20150175202 MacGregor Jun 2015 A1
20150209664 Haseltine Jul 2015 A1
20150268666 Wang Sep 2015 A1
20160033967 Bernstein et al. Feb 2016 A1
20160090133 Bernstein et al. Mar 2016 A1
20160148367 Polo et al. May 2016 A1
20160202696 Bernstein et al. Jul 2016 A1
20160246299 Berberian et al. Aug 2016 A1
20160282871 Berstein et al. Sep 2016 A1
20160291591 Bernstein et al. Oct 2016 A1
20160291595 Halloran Oct 2016 A1
20160349748 Bernstein et al. Dec 2016 A1
20170080352 Bernstein et al. Mar 2017 A1
20170092009 Polo et al. Mar 2017 A1
20180224845 Bernstein et al. Aug 2018 A1
20180296911 Polo et al. Oct 2018 A1
20180364699 Bernstein et al. Dec 2018 A1
Foreign Referenced Citations (47)
Number Date Country
1302717 Jul 2001 CN
1765595 May 2006 CN
101154110 Apr 2008 CN
201147642 Nov 2008 CN
201220111 Apr 2009 CN
101426664 May 2009 CN
102060060 May 2011 CN
102421629 Apr 2012 CN
19809168 Sep 1999 DE
101 46 862 May 2002 DE
102011108689 Apr 2012 DE
371149 Jun 1990 EP
1944573 Jul 2008 EP
102010042395 Apr 2012 EP
3727 Jan 1898 GB
2309650 Aug 1997 GB
2319756 Jun 1998 GB
03182290 Aug 1991 JP
H07-308462 Nov 1995 JP
09254838 Sep 1997 JP
2000218578 Aug 2000 JP
2001153650 Jun 2001 JP
2002126373 May 2002 JP
2002345706 Dec 2002 JP
2004042246 Feb 2004 JP
2004148439 May 2004 JP
2004260917 Sep 2004 JP
2005165692 Jun 2005 JP
2007072802 Mar 2007 JP
2007213353 Aug 2007 JP
2008-040725 Feb 2008 JP
2011530756 Dec 2011 JP
2012022457 Feb 2012 JP
4893862 Mar 2012 JP
10-2008-040725 Aug 2008 KR
10-2008-0073626 Aug 2008 KR
10-2008-0073626 Aug 2008 KR
10-2009-0000013 Jan 2009 KR
20100001408 Jan 2010 KR
10-2008-0092595 Jul 2010 KR
10-0969873 Jul 2010 KR
20105393 Apr 2010 TW
WO-9725239 Jul 1997 WO
WO-2006049559 May 2006 WO
2008008847 Jan 2008 WO
WO-2012094349 Jul 2012 WO
2012103525 Aug 2012 WO
Non-Patent Literature Citations (268)
Entry
US 9,342,073 B2, 05/2016, Berstein et al. (withdrawn)
U.S. Appl. No. 13/342,908, Office Action dated Dec. 20, 2013, 26 pages.
U.S. Appl. No. 13/342,908, Office Action dated Jun. 5, 2014, 21 pages.
U.S. Appl. No. 13/342,908, Supplemental Amendment and Response filed Apr. 17, 2015, 10 pages.
U.S. Appl. No. 13/342,914, Advisory Action dated Feb. 13, 2014, 3 pages.
U.S. Appl. No. 13/342,914, Amendment and Response filed Sep. 3, 2013, 24 pages.
U.S. Appl. No. 13/342,914, Amendment and Response filed Feb. 3, 2014, 12 pages.
U.S. Appl. No. 13/342,914, Appeal Brief filed Jul. 3, 2014, 27 pages.
U.S. Appl. No. 13/342,914, Office Action dated Jun. 3, 2013, 30 pages.
U.S. Appl. No. 13/342,914, Office Action dated Nov. 13, 2013, 28 pages.
U.S. Appl. No. 13/342,914, Response to Appeal Brief dated Jul. 29, 2014, 10 pages.
U.S. Appl. No. 13/549,097, Amendment and Response filed Mar. 24, 2015, 14 pages.
U.S. Appl. No. 13/549,097, Amendment and Response filed Jan. 22, 2016, 16 pages.
U.S. Appl. No. 13/549,097, Office Action dated Dec. 26, 2014, 20 pages.
U.S. Appl. No. 13/549,097, Office Action dated Oct. 22, 2015, 20 pages.
U.S. Appl. No. 13/549,097, Office Action dated Oct. 4, 2016, 22 pages.
U.S. Appl. No. 13/766,455, Amendment and Response filed Jul. 15, 2015, 11 pages.
U.S. Appl. No. 13/766,455, Notice of Allowance dated Aug. 20, 2015, 15 pages.
U.S. Appl. No. 13/766,455, Office Action dated Apr. 15, 2015, 9 pages.
U.S. Appl. No. 13/894,247, Amendment and Response filed Aug. 13, 2015, 9 pages.
U.S. Appl. No. 13/894,247, Notice of Allowance dated Oct. 29, 2015, 7 pages.
U.S. Appl. No. 13/894,247, Office Action dated Jun. 12, 2015, 14 pages.
U.S. Appl. No. 14/035,841 Amendment and Response filed Sep. 14, 2015, 12 pages.
U.S. Appl. No. 14/035,841, Notice of Allowance dated Sep. 25, 2015, 5 pages.
U.S. Appl. No. 14/035,841, Notice of Allowance dated Oct. 7, 2016, 2 pages.
U.S. Appl. No. 14/035,841, Notice of Allowance dated Oct. 16, 2016, 2 pages.
U.S. Appl. No. 14/035,841, Office Action dated May 13, 2015, 12 pages.
U.S. Appl. No. 14/054,636, Amendment and Response filed Mar. 17, 2016, 13 pages.
U.S. Appl. No. 14/054,636, Amendment and Response filed Sep. 23, 2016, 14 pages.
U.S. Appl. No. 14/054,636, Notice of Allowance dated Dec. 21, 2016, 8 pages.
U.S. Appl. No. 14/054,636, Office Action dated Jan. 20, 2016, 14 pages.
U.S. Appl. No. 14/054,636, Office Action dated Jun. 24, 2016, 23 pages.
U.S. Appl. No. 14/137,954, Amendment and Response filed Aug. 3, 2015, 14 pages.
U.S. Appl. No. 14/137,954, Amendment and Response filed Feb. 5, 2016, 11 pages.
U.S. Appl. No. 14/137,954, Amendment and Response filed Jun. 6, 2016, 12 pages.
U.S. Appl. No. 14/137,954, Notice of Allowance dated Sep. 26, 2016, 8 pages.
U.S. Appl. No. 14/137,954, Office Action dated May 4, 2015, 26 pages.
U.S. Appl. No. 14/137,954, Office Action dated Nov. 5, 2015, 31 pages.
U.S. Appl. No. 14/137,954, Office Action dated Apr. 12, 2016, 27 pages.
U.S. Appl. No. 14/148,541, Amendment and Response filed Sep. 4, 2015, 14 pages.
U.S. Appl. No. 14/148,541, Notice of Allowance dated Nov. 18, 2015, 11 pages.
U.S. Appl. No. 14/148,541, Office Action dated Jun. 4, 2015, 18 pages.
U.S. Appl. No. 14/261,288, Amendment and Response filed Nov. 5, 2015, 12 pages.
U.S. Appl. No. 14/261,288, Notice of Allowance dated Nov. 23, 2015, 10 pages.
U.S. Appl. No. 14/261,288, Office Action dated Jul. 7, 2015, 13 pages.
U.S. Appl. No. 14/271,203, Advisory Action dated Mar. 2, 2016, 3 pages.
U.S. Appl. No. 14/271,203, Amendment and Response filed Oct. 26, 2015, 10 pages.
U.S. Appl. No. 14/271,203, Amendment and Response filed Feb. 23, 2016, 9 pages.
U.S. Appl. No. 14/271,203, Amendment and Response filed Mar. 11, 2016, 9 pages.
U.S. Appl. No. 14/271,203, Amendment and Response filed Jun. 6, 2016, 9 pages.
U.S. Appl. No. 14/271,203, Office Action dated Jul. 27, 2015, 11 pages.
U.S. Appl. No. 61/362,005, filed Jul. 7, 2010, Schmelzer, Richard.
International Search Report and the Written Opinion dated Dec. 3, 2012, for related PCT/US2012/020115 11 pages.
Koshiyama et al., Machine Translation for JP 2000-218578, Aug. 8, 2000, 11 Pages.
GearBox Ball Prototype Jun. 29, 2010, Pictures from Video [online]. Orbotix, Inc., Jun. 30, 2010, 91 pages. Retrieved from the internet:<URL: http://www.youtube.com/watch?v=qRBM7bAaXpU>.
International Search Report and the Written Opinion dated Aug. 28, 2013, for related PCT/US2013/041023, 11 pages.
Liu, Dalian et al., “Motion Control of a Spherical Mobile Robot by Feedback Linearization,” 7th WC on IC&A, Jun. 27, 2008, Chongqing, China, pp. 965-970. 6 pages.
Shu, Guanghui et al., “Motion Control of Spherical Robot Based on Conservation of Angular Momentum,” IEEE Intl Conf on Mechatronics & Automation, Aug. 9, 2012, Changchun, China, pp. 599-604. 6 pages.
Joshi, Vrunda et al., “Design, modeling and controllability of a spherical mobile robot”, 13th Natl Conf on Mechanisms & Machines (NaCoMM07) IlSc, Bangalore, India, Dec. 13, 2007, pp. 1-6.
Harmo, Panu et al., “Moving Eye—Interactive Telepresence over Internet with a Ball Shaped Mobile Robot,” Automation Tech Lab, Finland, Oct. 2, 2001. 6 pages. http://automation.tkk.fi/files/tervetaas/MovingEye4.pdf.
Halme, Aarne, et al., “Motion Control of a Spherical Mobile Robot”, Helsinki, IEEE AMC '1996, pp. 259-264. 6 pages.
European Search Report and European Search Opinion dated Nov. 6, 2014, for related EP 12731945.7, 7 pages.
International Search Report and Written Opinion in related PCT/US2014/059973 dated Dec. 17, 2014 13 pages.
“Roll, Pitch, and Yaw 1/ How Things Fly”, How Things Fly website, date unknown, retrieved from https://howthingsfly.si.edu/flight- dynamics/roll-pitch-and-yaw.
Korean Office Action in Application 10-2014-7034020, dated Dec. 23, 2016, 11 pages.
U.S. Appl. No. 14/884,632, Office Action dated Jan. 25, 2017, 7 pages.
U.S. Appl. No. 14/271,203, Amendment and Response filed Feb. 1, 2017, 12 pages.
U.S. Appl. No. 13/342,914, Decision on Appeal dated Feb. 1, 2017, 8 pages.
U.S. Appl. No. 14/271,203, Office Action dated Dec. 21, 2015, 10 pages.
U.S. Appl. No. 14/271,203, Office Action dated Apr. 4, 2016, 10 pages.
U.S. Appl. No. 14/271,203, Office Action dated Aug. 1, 2016, 17 pages.
U.S. Appl. No. 14/459,235, Notice of Allowance dated Mar. 6, 2015, 9 pages.
U.S. Appl. No. 14/459,235, Notice of Allowance dated Jun. 25, 2015, 7 pages.
U.S. Appl. No. 14/663,446, Notice of Allowance dated Sep. 25, 2015, 9 pages.
U.S. Appl. No. 14/691,349, Amendment and Response filed Aug. 28, 2015, 11 pages.
U.S. Appl. No. 14/691,349, Amendment and Response filed Jan. 26, 2016, 6 pages.
U.S. Appl. No. 14/691,349, Notice of Allowance dated Mar. 4, 2016, 5 pages.
U.S. Appl. No. 14/691,349, Notice of Allowance dated Jun. 6, 2016, 5 pages.
U.S. Appl. No. 14/691,349, Office Action dated Jul. 17, 2015, 9 pages.
U.S. Appl. No. 14/832,801, Amendment and Response filed Feb. 5, 2016, 10 pages.
U.S. Appl. No. 14/832,801, Amendment and Response filed Feb. 12, 2016, 8 pages.
U.S. Appl. No. 14/832,801, Notice of Allowance dated Mar. 22, 2016, 10 pages.
U.S. Appl. No. 14/832,801, Notice of Allowance dated May 11, 2016, 5 pages.
U.S. Appl. No. 14/832,801, Office Action dated Nov. 6, 2015, 6 pages.
U.S. Appl. No. 14/839,610, Amendment and Response filed Feb. 18, 2016, 11 pages.
U.S. Appl. No. 14/839,610, Notice of Allowance dated Mar. 23, 2016, 16 pages.
U.S. Appl. No. 14/839,610, Office Action dated Nov. 18, 2015, 7 pages.
U.S. Appl. No. 14/850,910, Amendment and Response filed Feb. 18, 2016, 7 pages.
U.S. Appl. No. 14/850,910, Notice of Allowance dated Mar. 17, 2016, 11 pages.
U.S. Appl. No. 14/850,910, Office Action dated Nov. 25, 2015, 8 pages.
U.S. Appl. No. 14/968,594, Amendment and Response filed Apr. 5, 2016, 7 pages.
U.S. Appl. No. 14/968,594, Notice of Allowance dated Jul. 19, 2016, 6 pages.
U.S. Appl. No. 14/968,594, Office Action dated Feb. 3, 2016, 5 pages.
U.S. Appl. No. 14/975,510, Amendment and Response filed May 12, 2016, 8 pages.
U.S. Appl. No. 14/975,510, Notice of Allowance dated Jul. 7, 2016, 5 pages.
U.S. Appl. No. 14/975,510, Office Action dated Feb. 12, 2016, 6 pages.
U.S. Appl. No. 15/017,211, Notice of Allowance dated Jul. 5, 2016, 10 pages.
U.S. Appl. No. 15/017,211, Notice of Allowance dated Aug. 8, 2016, 4 pages.
U.S. Appl. No. 15/232,490, Office Action dated Sep. 23, 2016, 5 pages.
European Search Report in Application 13790911.5, dated Oct. 14, 2016, 10 pages.
Loy et al., “Fast Radial Symmetry for Detecting Points of Interest”, IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE Computer Society, USA, vol. 25, No. 8, Aug. 1, 2003, 15 pages.
European Search Report in Application 14795148.7, dated Dec. 7, 2016, 7 pages.
Airioiu, “Force Feedback Stabilization for Remote Control of an Assistive Mobile Robot”, AACC Publication, 2011, pp. 4898-4903.
Chinese Office Action in Application 201380036857.2, dated Jun. 29, 2016, 10 pages.
Chinese Office Action in Application 201620300686, dated Sep. 9, 2016, 3 pages.
Diolaiti et al., “Tele-operation of a Mobile Robot Through Haptic Feedback”, IEEE, 2002, p. 1-6.
European Search Report in Application 13817382.2, dated Mar. 11, 2016, 8 pages.
Hashimoto et al., “TouchMe: An Augmented Reality Based Remote Robot Manipulation”, Nov. 2011, pp. 61-66.
Korean Office Action in Application 10-2015-7003642, dated Nov. 28, 2016, 13 pages.
Osorio et al., “Mobile Robots Design and Implementation: From Virtual Simulation to Real Robots”, IDME Publication, 2010, 6 pages.
PCT International Search Report in PCT/US2013/050327, dated Oct. 15, 2013, 11 pages.
PCT International Search Report in PCT/US2014/037013, dated Aug. 26, 2014, 8 pages.
PCT International Search Report in PCT/US2014/068606, dated Mar. 2, 2015, 7 pages.
PCT International Search Report in PCT/US2015/030877, dated Aug. 13, 2015, 5 pages.
PCT International Search Report in PCT/US2015/044885, dated Oct. 29, 2015, 7 pages.
Simsarian et al., “Achieving Virtual Presence with a Semi-autonomous Robot through a Multi-reality and speech control interface”, 1996, pp. 50-63.
U.S. Appl. No. 13/342,853, Amendment and Response filed Feb. 19, 2013, 7 pages.
U.S. Appl. No. 13/342,853, Notice of Allowance dated Apr. 19, 2013, 6 pages.
U.S. Appl. No. 13/342,853, Notice of Allowance dated Jun. 20, 2013, 6 pages.
U.S. Appl. No. 13/342,853, Office Action dated Oct. 16, 2012, 10 pages.
U.S. Appl. No. 13/342,874, Amendment and Response filed Sep. 13, 2013, 21 pages.
U.S. Appl. No. 13/342,874, Amendment and Response filed Jan. 21, 2014, 13 pages.
U.S. Appl. No. 13/342,874, Amendment and Response filed Jul. 14, 2014, 13 pages.
U.S. Appl. No. 13/342,874, Amendment and Response filed Mar. 5, 2015, 11 pages.
U.S. Appl. No. 13/342,874, Amendment and Response filed Jul. 7, 2015, 9 pages.
U.S. Appl. No. 13/342,874, Notice of Allowance dated Jul. 24, 2015, 18 pages.
U.S. Appl. No. 13/342,874, Notice of Allowance dated Aug. 11, 2015, 3 pages.
U.S. Appl. No. 13/342,874, Office Action dated Apr. 29, 2013, 16 pages.
U.S. Appl. No. 13/342,874, Office Action dated May 13, 2013, 17 pages.
U.S. Appl. No. 13/342,874, Office Action dated Nov. 18, 2013, 17 pages.
U.S. Appl. No. 13/342,874, Office Action dated Sep. 4, 2014, 16 pages.
U.S. Appl. No. 13/342,874, Office Action dated Apr. 7, 2015, 8 pages.
U.S. Appl. No. 13/342,884, Amendment and Response filed Sep. 16, 2013, 32 pages.
U.S. Appl. No. 13/342,884, Amendment and Response filed Jan. 21, 2014, 11 pages.
U.S. Appl. No. 13/342,884, Notice of Allowance dated Feb. 19, 2014, 14 pages.
U.S. Appl. No. 13/342,884, Office Action dated Apr. 16, 2013, 13 pages.
U.S. Appl. No. 13/342,884, Office Action dated Nov. 18, 2013, 15 pages.
U.S. Appl. No. 13/342,892, Amendment and Response filed Sep. 9, 2013, 27 pages.
U.S. Appl. No. 13/342,892, Amendment and Response filed Feb. 18, 2014, 12 pages.
U.S. Appl. No. 13/342,892, Appeal Brief filed Jul. 17, 2014, 30 pages.
U.S. Appl. No. 13/342,892, Office Action dated Apr. 9, 2013, 19 pages.
U.S. Appl. No. 13/342,892, Office Action dated Nov. 15, 2013, 18 pages.
U.S. Appl. No. 13/342,892, Response to Appeal Brief dated Aug. 6, 2014, 16 pages.
U.S. Appl. No. 13/342,908, Advisory Action dated Aug. 11, 2014, 3 pages.
U.S. Appl. No. 13/342,908, Advisory Action dated Sep. 18, 2014, 4 pages.
U.S. Appl. No. 13/342,908, Amendment and Response filed Oct. 15, 2013, 32 pages.
U.S. Appl. No. 13/342,908, Amendment and Response filed Mar. 20, 2014, 21 pages.
U.S. Appl. No. 13/342,908, Amendment and Response filed Aug. 4, 2014, 13 pages.
U.S. Appl. No. 13/342,908, Amendment and Response filed Sep. 5, 2014, 18 pages.
U.S. Appl. No. 13/342,908, Amendment and Response filed Apr. 6, 2015, 12 pages.
U.S. Appl. No. 13/342,908, Notice of Allowance dated Apr. 29, 2015, 12 pages.
U.S. Appl. No. 13/342,908, Office Action dated Jun. 13, 2013, 34 pages.
Chinese Office Action in Application 201620300686.0, dated Feb. 3, 2016, 5 pages.
Chinese Office Action in Application 201702030180700, dated Feb. 7, 2017, 8 pages.
Japanese Office Action in Application 2015-512768, dated Dec. 6, 2016, 9 pages.
PCT International Preliminary Report on Patentability in PCT/US2015/030877, dated Feb. 23, 2017, 5 pages.
PCT International Preliminary Report on Patentability in PCT/US2015/044885, dated Feb. 23, 2017, 5 pages.
U.S. Appl. No. 14/054,636, Notice of Allowance dated Mar. 1, 2017, 7 pages.
U.S. Appl. No. 14/271,203, Office Action dated Feb. 21, 2017, 12 pages.
U.S. Appl. No. 15/232,490, Amendment and Response filed Feb. 22, 2017, 3 pages.
Airplane Flying Handbook (FAA-H-8083-3B), Chapter 10, Figure 10-2, https://www.faa.gov/regulations_policies/handbooks_manuals/aviation/airplane_handbook/media/12_afh_ch10.pdf, 2004, 10 pages.
Xialing Lv and Minglu Zhang, Robot Control Based on Voice Command, IEEE International Conference on Automation and Logistics 2490, 2008, 5 pages.
Curriculum of Dr. Jason Janet cited in IPR2017-01272, filed Apr. 20, 2017, 6 pages.
Declaration of Dr. Jason Janet cited in IPR2017-01272, filed Apr. 20, 2017, 79 pages.
Randall Munroe, New Pet, http://xkcd.com/413/, Retrieved from Internet Archive (http://web.archive.org/web/20080701080435/http://xkcd.com/413/) (2008), Retrieved on Apr. 13, 2017, 3 pages.
Gene F. Franklin, J. David Powell, Abbas Emami-Naeini, Feedback Control of Dynamic Systems, Fourth Edition, Prentice Hall, 2002, 28 pages.
Hashem Ghariblu and Hadi Mohammadi, Structure and Dynamic Modeling of a Spherical Robot, 8th International Symposium on Mechatronics and its Applications, 2012, 5 pages.
Hiroyuki Fujita, A Decade of MEMS and its Future, Proceedings of the IEEE Tenth Annual International Workshop on Micro Electro Mechanical Systems, 1997, 8 pages.
How a Small Robotics Startup Helped Disney Bring BB-8 to Life, US Chamber of Commerce (https://www.uschamber.com/above-thefold/how-small-robotics-startup-helped-disney-bring-bb-8-life), Retrieved on Mar. 31, 2017, 6 pages.
Qiang Zhan, Yao Cai, and Caixia Yan, Design, Analysis and Experiments of an Omni-Directional Spherical Robot, IEEE International Conference on Robotics and Automation 4921, 2011, 6 pages.
Martyn Williams, Sony unwraps high-tech ‘healing’ ball, CNN.com, published Mar. 28, 2002, http://edition.cnn.com/2002/TECH/ptech/03/28/robodex.healing.ball.idg/?related, retrieved on Apr. 4, 2017, 1 page.
Masato Ishikawa, Ryohei Kitayoshi, and Toshiharu Sugie, Dynamic rolling locomotion by spherical mobile robots considering its generalized momentum, Proceedings of SICE Annual Conference 2010 2311 (2010), 6 pages.
Meet BB-8: The New Droid in the Lives of Star Wars Buffs, Wharton School of the University of Pennsylvania (Nov. 13, 2015) (http://knowledge.wharton.upenn.edu/article/meet-bb-8-the-newdroid-in-the-lives-of-star-wars-buffs/), Retrieved on Mar. 31, 2017, 3 pages.
Petition for Inter Partes Review of U.S. Pat. No. 9,211,920, filed Apr. 20, 2017, 75 pages.
U.S. Appl. No. 15/232,490, Office Action dated Mar. 17, 2017, 4 pages.
U.S. Appl. No. 15/040,331, Office Action dated Apr. 13, 2017, 10 pages.
U.S. Appl. No. 13/549,097, Amendment and Response filed Mar. 14, 2017, 13 pages.
U.S. Appl. No. 14/137,954, Notice of Allowance dated Mar. 8, 2017, 8 pages.
U.S. Appl. No. 14/884,632, Amendment and Response filed Apr. 19, 2017, 3 pages.
U.S. Appl. No. 15/281,478, Office Action dated May 5, 2017, 5 pages.
U.S. Appl. No. 15/232,490, Amendment and Response filed Jul. 10, 2017, 3 pages.
U.S. Appl. No. 15/146,631, Office Action dated May 16, 2017, 11 pages.
U.S. Appl. No. 15/040,331, Amendment and Response filed Jul. 10, 2017, 10 pages.
U.S. Appl. No. 14/884,632, Supplemental Notice of Allowance dated Jun. 1, 2017, 2 pages.
U.S. Appl. No. 14/884,632, Notice of Allowance dated May 15, 2017, 8 pages.
U.S. Appl. No. 13/342,892, Board Decision dated May 5, 2017, 8 pages.
U.S. Appl. No. 13/342,892, Notice of Allowance dated Jun. 7, 2017, 7 pages.
U.S. Appl. No. 13/342,892, Supplemental Notice of Allowance dated Jun. 29, 2017, 2 pages.
U.S. Appl. No. 13/549,097, Office Action dated Jun. 26, 2017, 30 pages.
U.S. Appl. No. 14/054,636, Notice of Allowance dated Jul. 7, 2017, 7 pages.
U.S. Appl. No. 14/137,954, Notice of Allowance dated Jun. 29, 2017, 8 pages.
European Extended Search Report in Application 14795148.7, dated Apr. 5, 2017, 12 pages.
Chinese Office Action in Application 201380036857.2, dated Mar. 22, 2017, 11 pages.
Japanese Office Action in Application 2015-521853, dated Feb. 14, 2017, 6 pages.
U.S. Appl. No. 15/232,490, Notice of Allowance dated Sep. 21, 2017, 7 pages.
U.S. Appl. No. 15/146,631, Office Action dated Sep. 21, 2017, 14 pages.
U.S. Appl. No. 13/549,097, Advisory Action dated Sep. 22, 2017, 2 pages.
U.S. Appl. No. 13/342,892, Supplemental Notice of Allowance dated Jul. 26, 2017, 2 pages.
U.S. Appl. No. 13/549,097, Amendment and Response filed Aug. 25, 2017, 11 pages.
U.S. Appl. No. 14/054,636, Supplemental Notice of Allowance dated Aug. 2, 2017, 4 pages.
U.S. Appl. No. 14/137,954, Supplemental Notice of Allowance dated Jul. 27, 2017, 2 pages.
U.S. Appl. No. 14/271,203, Amendment and Response filed Aug. 18, 2017, 11 pages.
U.S. Appl. No. 14/884,632, Supplemental Notice of Allowance dated Jul. 28, 2017, 2 pages.
U.S. Appl. No. 15/040,331, Notice of Allowance dated Aug. 1, 2017, 9 pages.
U.S. Appl. No. 15/146,631, Amendment and Response filed Aug. 18, 2017, 10 pages.
U.S. Appl. No. 15/177,809, Office Action dated Aug. 16, 2017, 6 pages.
U.S. Appl. No. 15/180,485, Office Action dated Aug. 17, 2017, 9 pages.
U.S. Appl. No. 15/232,490, Notice of Allowance dated Aug. 10, 2017, 5 pages.
U.S. Appl. No. 15/281,478, Amendment and Response filed Sep. 5, 2017, 8 pages.
Wright Brothers Propulsion System, Smithsonian National Air and Space Museum, retrieved Aug. 17, 2017, https://airandspace.si.edu/exhibitions/wright-brothers/online/fly/1903/propulsion.cfm, 5 pages.
Chinese Notice of Allowance in Application 201380036857.2, dated Aug. 1, 2017, 4 pages.
Chinese Office Action in Application 201510463336.6, dated Jul. 17, 2017, 5 pages. (No English Translation).
Korean Notice of Allowance in Application 10-2015-7003642, dated Jul. 25, 2017, 4 pages.
Chinese Office Action in Application 201480029695.4, dated May 27, 2017, 22 pages.
Chinese Office Action in Application 201510463007.1, dated May 31, 2017, 8 pages.
Chinese Office Action in Application 201620300686, dated May 2, 2017, 2 pages. (No English Translation).
European Extended Search Report in Application 14853882.0, dated Jun. 22, 2017, 6 pages.
European Office Action in Application 13817383.8, dated Apr. 20, 2017, 6 pages.
Korean Office Action in Application 10-2014-7034020, dated Jun. 30, 2017, 11 pages.
U.S. Appl. No. 15/281,409, Office Action dated Jul. 6, 2018, 19 pages.
U.S. Appl. No. 15/180,485, Notice of Allowance dated Jun. 4, 2018, 2 pages.
U.S. Appl. No. 15/010,337, Amendment and Response filed May 22, 2018, 10 pages.
U.S. Appl. No. 15/146,631, Notice of Allowance dated Aug. 15, 2018, 5 pages.
European Extended Search Report in Application 15831882.4, dated Jun. 13, 2018, 13 pages.
European Office Action in Application 13817382.8, dated Aug. 3, 2018, 4 pages.
A. Milella et al., “Model-Based Relative Localization for Cooperative Robots Using Stereo Vision”, Dec. 3, 2005, https://infoscience.epfi.ch/record/97591/files/Model-Based_Relative_Localization_MILELLA05.pdf.
European Office Action in Application 13790911.5, dated Jan. 26, 2018, 7 pages.
U.S. Appl. No. 15/146,631, Office Action dated Feb. 2, 2018, 12 pages.
U.S. Appl. No. 15/146,631, Advisory Action dated Apr. 23, 2018, 2 pages.
U.S. Appl. No. 14/271,203, Amendment and Response filed Dec. 22, 2017, 12 pages.
U.S. Appl. No. 15/010,337, Office Action dated Dec. 22, 2017, 12 pages.
U.S. Appl. No. 15/146,631, Amendment and Response filed Dec. 18, 2017, 9 pages.
U.S. Appl. No. 15/281,478, Amendment and Response filed Jan. 29, 2018, 8 pages.
U.S. Appl. No. 15/281,478, Office Action dated Dec. 15, 2017, 6 pages.
U.S. Appl. No. 15/177,809, Notice of Allowance dated Dec. 12, 2017, 8 pages.
U.S. Appl. No. 15/180,485, Amendment and Response filed Dec. 22, 2017, 8 pages.
U.S. Appl. No. 15/180,485, Notice of Allowance dated Jan. 26, 2018, 10 pages.
Chinese Notice of Allowance in Application 201510463336.6, dated Nov. 17, 2017, 4 pages.
European Office Action in Application 12731845.7, dated Oct. 25, 2017, 6 pages.
European Office Action in Application 13817382.8, dated Nov. 14, 2017, 5 pages.
Japanese Office Action in Application 2015-512768, dated Sep. 26, 2017, 10 pages.
Japanese Office Action in Application 2015-521853, dated Oct. 31, 2017, 6 pages.
U.S. Appl. No. 13/549,097, Amendment and Response filed Oct. 24, 2017, 11 pages.
U.S. Appl. No. 14/271,203, Office Action dated Oct. 18, 2017, 13 pages.
U.S. Appl. No. 15/177,809, Amendment and Response filed Nov. 17, 2017, 7 pages.
U.S. Appl. No. 15/180,485, Amendment and Response filed Nov. 17, 2017, 11 pages.
U.S. Appl. No. 15/180,485, Office Action dated Dec. 7, 2017, 9 pages.
U.S. Appl. No. 15/281,478, Notice of Allowance dated Feb. 22, 2018, 8 pages.
Chinese Notice of Allowance in Application 201510463007.1, dated Mar. 5, 2018, 6 pages.
Chinese Office Action in Application 201480029695.4, dated Feb. 23, 2018, 14 pages.
European Search Report in Application 15831882.4, dated Mar. 1, 2018, 16 pages.
U.S. Appl. No. 15/177,809, Supplemental Notice of Allowance dated Mar. 15, 2018, 2 pages.
U.S. Appl. No. 15/177,809, Supplemental Notice of Allowance dated Mar. 21, 2018, 2 pages.
U.S. Appl. No. 15/180,485 Supplemental Notice of Allowance dated Mar. 15, 2018, 2 pages.
U.S. Appl. No. 14/271,203, Office Action dated Apr. 6, 2018, 13 pages.
U.S. Appl. No. 13/549,097, Notice of Allowance dated Apr. 18, 2018, 12 pages.
U.S. Appl. No. 14/933,827, Office Action dated May 10, 2018, 7 pages.
European Office Action in Application 14795148.7, dated Oct. 4, 2018, 7 pages.
U.S. Appl. No. 14/271,203, Amendment and Response filed Sep. 5, 2018, 7 pages.
U.S. Appl. No. 15/010,337, Notice of Allowance dated Sep. 11, 2018, 17 pages.
U.S. Appl. No. 15/822,676, Office Action dated Nov. 30, 2018, 27 pages.
U.S. Appl. No. 15/888,354, Office Action dated Oct. 5, 2018, 13 pages.
U.S. Appl. No. 15/146,631, Notice of Allowance dated Oct. 11, 2018, 2 pages.
Chinese Notice of Allowance in Application 201480029695.4, dated Jan. 15, 2019, 4 pages.
Chinese Office Action in Application 201580055348.3, dated Dec. 5, 2018, 17 pages.
U.S. Appl. No. 15/281,409, Amendment and Response filed Jan. 7, 2019, 16 pages.
U.S. Appl. No. 15/888,354, Amendment and Response filed Jan. 4, 2019, 6 pages.
U.S. Appl. No. 14/271,203, Notice of Allowance dated Dec. 18, 2018, 7 pages.
Related Publications (1)
Number Date Country
20160054734 A1 Feb 2016 US
Provisional Applications (3)
Number Date Country
61430023 Jan 2011 US
61430083 Jan 2011 US
61553923 Oct 2011 US
Continuations (1)
Number Date Country
Parent 13766455 Feb 2013 US
Child 14933827 US
Continuation in Parts (1)
Number Date Country
Parent 13342853 Jan 2012 US
Child 13766455 US