The present teachings relate generally to utility services. For example, the present teachings can relate to assisted delivery of goods originating at distributed establishments and destined for customers located in the vicinity of the distributed establishments. What is needed is a system that can accommodate trips of various lengths, and can solve the problem of short-distance assistance to customers. What is further needed is a system that can accommodate semi-autonomous and autonomous operation, and can deliver utility services economically.
The utility system of the present teachings solves the problems stated herein and other problems by one or a combination of the features stated herein.
The system of the present teachings can be part of a fleet network of similar systems. The fleet network can also include trucks, planes, cars such as self-driving cars, and business establishments. All members of the fleet network can communicate seamlessly to share, for example, but not limited to, navigation data, dynamic objects, alternate routing, and utility requirements including utility characteristics, customer location, and destination. The system of the present teachings can interface with existing truck systems so that the fleet is seamlessly connected. Piloted utility vehicles can include technology disclosed in U.S. patent application Ser. No. 15/600,703 filed on May 20, 2017, entitled Mobility Device.
The utility robot of the present teachings can operate in an autonomous or semi-autonomous mode. The autonomous utility robot can, in conjunction with the network, control its movement without the assistance of an operator. The semi-autonomous utility robot can include technology that can receive and process input from the operator of the semi-autonomous utility robot. The input can, for example, but not limited to, override autonomous control of the utility robot, or be considered in controlling the utility robot, or be ignored. The utility robot can include a set of sensors appropriate for the location of the utility robot. For example, when the utility robot is deployed in an environment that includes many other members of the fleet network, the utility robot can include a first number of sensors. In some configurations, for example, in an environment that includes a relatively small number of members of the fleet network, the utility robot can include a second number of sensors. The sensors can operate in conjunction with sensors that are associated with other members of the fleet network. In some configurations, the utility robot can include enough physical storage space to accommodate delivery items from typical distributed sources such as pharmaceuticals, food, meals, and documents. The utility robot can operate on city sidewalks, and near and within buildings, among other places. The utility robot can include the capability to determine a current location and situation of the utility robot (localization), through the use of, for example, but not limited to, fiducials, sensors, external application data, operator input, beacons, and physical orientation of the utility robot. The utility robot can plan a route to reach a desired destination, detect obstacles along the route, and dynamically determine specific actions that the utility robot is to take based on the route, current location, and obstacles. Obstacles can include, but are not limited to including, dynamic (mobile) obstacles, such as, for example, but not limited to, pedestrians, vehicles, animals, and static obstacles such as, for example, but not limited to, trashcans, sidewalks, trees, buildings, and potholes. The utility robot can accommodate map matching including locating obstacles visually and matching them to other data such as, for example, satellite data. The utility robot can determine preferred routes and routes to be avoided. In some configurations, the utility robot can climb curbs. In some configurations, the utility robot can climb stairs. The utility robot can achieve stabilized operation while on four wheels, including while climbing stairs. The utility robot can maintain a pre-selected distance, which could vary along the route, from an obstacle such as, for example, but not limited to, a building. The utility robot of the present teachings can be driven by an operator who is seated upon a seating feature of the utility robot. In some configurations, the utility robot can take the form of a wheelchair, and can thus legally traverse sidewalks in all jurisdictions. The utility robot can accommodate disabled operators, and can include carrying capacity for, for example, but not limited to, pizzas and pharmaceuticals. In some configurations, the utility robot can follow rules of the road to maintain the safety of the utility robot, the operator of the utility robot (when present), and the people and obstacles encountered by the utility robot. 
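By way of non-limiting illustration, the route planning described above can be sketched as a search over an occupancy grid. The following Python sketch assumes a 4-connected grid, unit step costs, and a Manhattan-distance heuristic; these are illustrative assumptions rather than the claimed implementation.

    # Illustrative grid-based route planning around static obstacles (assumed
    # 4-connected occupancy grid; not the claimed implementation).
    import heapq

    def plan_route(grid, start, goal):
        """A* search over a 2-D occupancy grid; grid[r][c] == 1 marks an obstacle."""
        rows, cols = len(grid), len(grid[0])
        def h(p):  # Manhattan-distance heuristic to the goal
            return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
        frontier = [(h(start), start)]
        came_from = {start: None}
        cost = {start: 0}
        while frontier:
            _, node = heapq.heappop(frontier)
            if node == goal:
                path = []
                while node is not None:
                    path.append(node)
                    node = came_from[node]
                return path[::-1]  # start-to-goal sequence of grid cells
            r, c = node
            for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                nr, nc = nxt
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                    new_cost = cost[node] + 1
                    if new_cost < cost.get(nxt, float("inf")):
                        cost[nxt] = new_cost
                        came_from[nxt] = node
                        heapq.heappush(frontier, (new_cost + h(nxt), nxt))
        return None  # no obstacle-free route exists

    # Example: route around a single blocked cell.
    grid = [[0, 0, 0],
            [0, 1, 0],
            [0, 0, 0]]
    print(plan_route(grid, (0, 0), (2, 2)))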
The rules can include, for example, but not limited to, what to do when encountering an obstacle and what to do when crossing a road. For example, the rules can include prohibitions on rolling over someone or something, and traveling into unsafe places. The rules can also include prohibitions on stopping in unsafe locations, for example, the middle of an intersection. In general, safety protocols can be established and learned by the utility robot of the present teachings.
The utility robot of the present teachings can serve many purposes. The utility robot of the present teachings can be summoned to assist an individual in carrying heavy things, for example, to a bus stop. In some configurations, the utility robot of the present teachings can watch for threats and odd occurrences, and can be summoned to escort individuals from place to place. In some configurations, the utility robot of the present teachings can be summoned by a mobile device, to a location that can change between the summons and the rendezvous of the utility robot and the mobile device. The utility robot can transport items from one location to another, for example, from a pharmacy to the residence of the person ordering the pharmaceuticals. The utility robot can communicate with pedestrians and vehicles, for example, by gesturing and providing awareness feedback.
In some configurations, the utility robot of the present teachings can travel at least fifteen miles at sixteen miles/hour on a single battery charge. The utility robot of the present teachings can use GPS, road signs, stereo cameras, cell phone repeaters, smart beacons with steerable RF beams that can direct the utility robot along a desired route, IMU data between beacons, and other beacon data to help the utility robot recognize and traverse the desired route. In some configurations, at least one autonomous utility robot of the present teachings can be coupled, for example, electronically, with at least one semi-autonomous utility robot. The batteries can be quick-change/quick-charge batteries. In some configurations, batteries can be protected from theft. The batteries can be locked down, for example, or they can include an identification number that is required to enable the batteries.
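For illustration, the stated fifteen-mile single-charge range suggests a simple trip-feasibility check before departure. The Python sketch below assumes that range scales linearly with state of charge and holds back an assumed ten-percent reserve; both are illustrative assumptions.

    # Illustrative trip-feasibility check based on the stated fifteen-mile range;
    # the linear energy model and the reserve are assumptions, not measured values.
    RATED_RANGE_MILES = 15.0   # single-charge range stated above
    RESERVE_FRACTION = 0.10    # assumed safety reserve held back for retrieval

    def trip_is_feasible(trip_miles, charge_fraction):
        """True if the remaining charge covers the trip plus the reserve,
        assuming range scales linearly with state of charge."""
        usable_miles = RATED_RANGE_MILES * max(charge_fraction - RESERVE_FRACTION, 0.0)
        return trip_miles <= usable_miles

    print(trip_is_feasible(5.0, 0.50))  # True: about six usable miles remain
    print(trip_is_feasible(5.0, 0.35))  # False: summon a battery swap instead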
The utility robot of the present teachings can accommodate such numbers and types of sensors as are necessary for the function of the utility robot. For example, the utility robot, when operating in an urban area, can expect to receive real time data relevant to its travel path from other members of the fleet network such as, for example, but not limited to, beacons and fiducials. Thus, the utility robot, when operating in an urban area, can include a sensor package appropriate for its environment. The same utility robot, when operating in an area that includes fewer fleet members, can include a sensor package appropriate for its environment, and possibly different from the urban-area sensor package. Sensors can be integrated with the utility robot of the present teachings. The sensors can access and/or collect street/building/curb data, and can include, for example, but not limited to, visual sensors, LIDAR, RADAR, ultrasonic sensors, and audio sensors, and data from GPS, wifi, and cell towers, commercial beacons, and painted curbs. The visual sensors can include stereoscopic visual sensors that can enable object classification and stop light classification, for example. In some configurations, visual sensors can detect curbs. Detection of curbs can be simplified by painting the curbs with substances that can include, but are not limited to including, reflective materials and colors. Curbs can also be painted with conductive materials that can trigger detection by appropriate sensors mounted on a fleet member such as the utility robot. LIDAR can enable the creation of a point cloud representation of the environment of the utility robot, and can be used for obstacle avoidance, object classification, and mapping/localization. Maps can contain static objects in the environment. Localization provides information about the locations of static objects, which can be useful in recognizing dynamic objects. Audio and/or ultrasonic sensors can be used to detect the presence of, for example, but not limited to, vehicles, pedestrians, crosswalk signals, and animals, and can enable collision avoidance and semi-autonomous driving. Ultrasonic sensors can enable calculation of the distance between the utility robot and the closest object. In some configurations, the utility robot can accommodate repositioning of the sensors upon the utility robot. For example, sensors can be positioned to accommodate the variable placement of storage containers on the utility robot.
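As a concrete example of the ultrasonic ranging mentioned above, the distance to the closest object is half the echo's round-trip time of flight multiplied by the speed of sound; the sketch below assumes roughly 343 m/s (air at about 20 degrees C).

    # Sketch of ultrasonic time-of-flight ranging; the speed of sound is assumed.
    SPEED_OF_SOUND_M_S = 343.0

    def ultrasonic_distance_m(echo_round_trip_s):
        """Distance to the closest object from a single echo measurement."""
        return SPEED_OF_SOUND_M_S * echo_round_trip_s / 2.0

    print(ultrasonic_distance_m(0.01))  # 0.01 s round trip -> about 1.7 m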
In some configurations, vehicles, such as, for example, but not limited to, trucks and self-driving vehicles, can transport the utility robots of the present teachings closer to their starting locations and destinations, and can retrieve the utility robots to remove them to storage, charging, and service areas, for example. With respect to trucks, in some configurations, as the utility robots enter the trucks, their batteries can be removed and replaced with fully charged batteries so that the utility robots can continue their services. The truck can include the capability to swap out batteries and charge them. In some configurations, empty storage compartments can also be filled on the delivery truck, and the utility robot can be sent from the truck to perform further deliveries. The utility robots and trucks can locate each other wirelessly. A dispatching mechanism can couple trucks with services and batteries with utility robots that need them. The trucks can include at least one ramp to receive and discharge the utility robots of the present teachings.
In some configurations, the movement of trucks and utility robots of the present teachings can be coordinated to minimize one or more of service costs, service times, and occurrences of stranded utility robots. Service costs may include fuel for the trucks, battery costs for the utility robots, and maintenance/replacement costs of the trucks and utility robots. The trucks can include on- and off-ramps that can accommodate rolling retrieval and discharge of the utility robots. The trucks can be parked in convenient places and the utility robots of the present teachings can perform services in conjunction with the trucks. In some configurations, the trucks and utility robots can be dynamically routed to meet at a location, where the location can be chosen based at least on, for example, but not limited to, the amount of time it would take for the fleet members to reach the location, availability of parking at the location, and routing efficiency. In some configurations, the utility robots of the present teachings can be moved from place to place, depending upon where they are needed the most, by, for example, the trucks. Daily schedules can control where the utility robots of the present teachings are transported. For example, a truck can pick up the utility robot of the present teachings when the utility robot has completed its services and/or when its batteries need to be charged, and/or when it needs service. The utility robot can automatically remain in the location of its final service until a truck arrives to retrieve it. A truck can be used to transport the utility robot of the present teachings from a station such as a store where goods and services have been purchased to a retirement home, for example, where the goods and services are to be delivered. The utility robot of the present teachings can be dropped off at, for example, the retirement home at which time the utility robot can deliver the goods and services. In some configurations, a first of the utility robots of the present teachings can deliver parcels to the truck, and those parcels can be removed from the first of the utility robots to the truck. The parcels can be picked up by a second of the utility robots of the present teachings that is heading towards the delivery destination of the parcel. The utility robots of the present teachings can be deployed from moving trucks or other moving vehicles.
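By way of a hedged illustration, the meet-location selection described above can be expressed as scoring candidate locations. The Python sketch below blends the listed factors (time for each fleet member to reach the location, parking availability, and routing efficiency) into a single score; the weights and field names are assumptions.

    # Hypothetical meet-point scoring for a truck/robot rendezvous; weights and
    # field names are illustrative assumptions.
    def score_meet_point(truck_eta_min, robot_eta_min, parking_available,
                         route_detour_min, w_wait=1.0, w_parking=10.0, w_detour=0.5):
        """Lower is better: penalize waiting, missing parking, and detours."""
        wait = abs(truck_eta_min - robot_eta_min)  # one party idles this long
        parking_penalty = 0.0 if parking_available else w_parking
        return w_wait * wait + parking_penalty + w_detour * route_detour_min

    candidates = [
        {"name": "corner A", "truck_eta_min": 8, "robot_eta_min": 6,
         "parking_available": True, "route_detour_min": 2},
        {"name": "corner B", "truck_eta_min": 5, "robot_eta_min": 9,
         "parking_available": False, "route_detour_min": 1},
    ]
    best = min(candidates, key=lambda c: score_meet_point(
        c["truck_eta_min"], c["robot_eta_min"],
        c["parking_available"], c["route_detour_min"]))
    print(best["name"])  # "corner A" under these assumed weights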
In some configurations, self-driving vehicles can be fitted with controls and hardware that can accommodate the utility robot of the present teachings. Self-driving vehicles can be more ubiquitous in and adaptable to urban settings than trucks. For example, a utility robot of the present teachings can receive goods to be delivered, summon a nearby self-driving vehicle, move to meet the vehicle, enter the vehicle, and become docked in the vehicle. The battery of the utility robot of the present teachings can be charged during the delivery trip by the self-driving vehicle. The self-driving vehicle, as part of the fleet, can access the service information for the utility robot from which the summons came, and can move the utility robot of the present teachings to the service destination(s).
In some configurations, at least one semi-autonomous utility robot can be associated with at least one autonomous utility robot. The semi-autonomous utility robot and the autonomous utility robot can wirelessly communicate with each other to maintain synchronous behavior when desired. In some configurations, the group of utility robots can form a secure ad hoc network whose participants can change as autonomous utility robots enter and leave association with the semi-autonomous utility robot. The ad hoc network can communicate with the fleet network. In some configurations, the utility robots can communicate, for example, by wifi or through standard electronic means such as text, email, and phone. In some configurations, each of the utility robots can share features of the route upon which the group travels by individually measuring wheel rotations and inertial values and sharing those data. The group of utility robots of the present teachings can arrange to meet a truck. The arrangement can be made by a cellular telephone call to a dispatcher, for example. A dispatcher, which may be automatic or semi-automatic, can locate the truck that is nearest the group of utility robots of the present teachings and can route the truck to the location of the group. In some configurations, a meetup request can be generated by one or more utility robots of the group, and can be electronically transmitted to trucks that come within wifi and/or ad hoc network range of the group of utility robots. In some configurations, the group of utility robots can be in continuous electronic communication with the fleet of trucks, can monitor their whereabouts, and can summon the nearest truck and/or the truck with the appropriate specifications such as, for example, size and on/off ramps. In some configurations, summoning one or more of the utility robots of the group can automatically involve summoning a utility robot with the correctly-sized storage compartment(s) for the parcel(s), and the utility robot that is geographically closest to the pickup point for the parcel(s).
In some configurations, the utility robot can include storage for items to be delivered, and can track the sizes of storage containers on each utility robot, as well as the sizes of the contents of the storage containers. The utility robot can receive the size of the package and can determine if the package can fit in any available storage in the fleet of utility robots of the present teachings. The storage can be compartmentalized for security and safety of the contents of the delivered goods. Each of the compartments can be separately secured, and the sizes of the compartments can vary according to the sizes of the parcels. Each of the compartments can include, for example, a sensor that can read the address on the parcel and ensure that the parcel is sized correctly for the storage container and the utility robot. For example, a drug store might require several small compartments to house prescription orders, while a restaurant might require pizza-sized compartments. In some configurations, the utility robot can include operator seating, and the storage compartments can be located behind, above, beside, in front of, and/or under the operator, for example. The storage containers can be sized according to the current parcel load. For example, the storage containers can include interlockable features that can enable increasing or decreasing the interior size of the storage containers. The storage containers can also include exterior features that can enable flexible mounting of the storage containers upon the chassis of the utility robot of the present teachings.
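For illustration, the fit determination and robot selection described above can be sketched as follows; the axis-aligned dimension test and the data layout are simplifying assumptions.

    # Sketch of package-fit and robot-selection logic; data layout is assumed.
    def fits(package_dims, compartment_dims):
        """True if each sorted package dimension fits the compartment."""
        return all(p <= c for p, c in zip(sorted(package_dims), sorted(compartment_dims)))

    def pick_robot(fleet, package_dims, pickup_xy):
        """Choose the closest robot having a free compartment that fits."""
        def dist(robot):
            dx = robot["xy"][0] - pickup_xy[0]
            dy = robot["xy"][1] - pickup_xy[1]
            return (dx * dx + dy * dy) ** 0.5
        eligible = [r for r in fleet
                    if any(fits(package_dims, c) for c in r["free_compartments"])]
        return min(eligible, key=dist) if eligible else None

    fleet = [
        {"id": "UR-1", "xy": (0, 0), "free_compartments": [(30, 30, 10)]},
        {"id": "UR-2", "xy": (5, 5), "free_compartments": [(10, 10, 10)]},
    ]
    print(pick_robot(fleet, (28, 28, 5), (1, 1))["id"])  # UR-1: the only fit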
In some configurations, the utility robot can include storage compartments and can accommodate long-term storage, for example, overnight storage, that can be advantageously provided when the utility robot is securely located within an enclosure in proximity to a charging station. The storage compartments can actively or passively self-identify, and can include tamper and content status information. The storage compartments can automatically interface with the system controller to provide information such as, for example, but not limited to, the tamper information and the content status information. In some configurations, the storage compartments can include information that can be used by the controller to command the utility robot. In some configurations, when contents within the storage compartments are destination-tagged, the storage compartment can sense the place where the contents are to be delivered and can direct the controller to drive the utility robot to the destination. In some configurations, the storage compartment can transmit destination information to other members of the delivery fleet. In some configurations, contents within the storage compartment can protrude from the storage compartment. Sensors can detect the orientation of the storage compartment and can maintain the storage compartment at a pre-selected angle with respect to the ground.
In some configurations, storage compartments can include temperature/humidity control that can accommodate extended storage, for example, but not limited to, overnight storage, of goods for delivery. In some configurations, storage of food and pharmaceuticals, for example, can be accommodated by temperature and/or humidity control within the storage compartments of the present teachings. In some configurations, the storage compartments can include insulation and cold packs of ice, dry ice, or other commercially available cold packs such as model S-12762 available from ULINE® in Pleasant Prairie, Wis. In some configurations, storage compartments can include electrically powered refrigerators and/or heaters. In some configurations, the electrically powered heater or cooler may be powered by mains AC. In some configurations, the power can be provided by the batteries of the utility robot.
The storage compartments can include sensors mounted exteriorly and interiorly. The storage compartment sensors can detect when the storage compartments have been touched or moved, and can provide that information to a controller executing in the utility robot. In some configurations, storage compartment sensors can monitor environmental factors, such as, for example, but not limited to, temperature and humidity as well as shock and vibration loads. In some configurations, storage compartment sensors can detect the size and weight of a package and can read information embedded in or on the package. The information can, for example, be embedded in an RFID tag or encoded into a barcode or QR code. The utility robot can compare the information embedded in or on the package to a manifest associated with the delivery, and can raise an alert and/or alarm if the information does not match the manifest.
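A minimal sketch of the manifest comparison described above follows, assuming each package carries a scannable identifier and a declared weight; the field names and tolerance are illustrative.

    # Sketch of the manifest check; an alert would be raised when ok is False.
    MANIFEST = {"PKG-001": {"weight_kg": 1.2, "destination": "12 Elm St"}}
    WEIGHT_TOLERANCE_KG = 0.1  # assumed scale tolerance

    def verify_package(scanned_id, measured_weight_kg):
        """Compare a scanned package against the delivery manifest."""
        entry = MANIFEST.get(scanned_id)
        if entry is None:
            return False, "package not on manifest"
        if abs(measured_weight_kg - entry["weight_kg"]) > WEIGHT_TOLERANCE_KG:
            return False, "weight does not match manifest"
        return True, "ok"

    print(verify_package("PKG-001", 1.25))  # (True, 'ok')
    print(verify_package("PKG-002", 0.50))  # (False, 'package not on manifest')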
In some configurations, one or more of the storage compartments can ride above the operator of the utility robot of the present teachings. In some configurations, the above-operator storage compartment(s) can ride on a telescoping device, and can be raised up and down to enable convenient access to the contents of the storage compartment(s), while at the same time enabling convenient entry and exit of the operator onto the utility robot of the present teachings. The telescoping device can include articulation. The storage compartments can ride on positioning rails, and can be positioned backwards, forwards, up, down, and from side to side, for example. The storage compartments can be maintained in a particular orientation automatically by the controller.
In some configurations, the storage containers can be positioned in various orientations and at various locations with respect to each other and the chassis of the utility robot. The storage compartments can accommodate weather barriers to protect the operator of the utility robot from inclement weather. In some configurations, curtains attached to an elevated storage container can protect an operator and possibly storage containers from inclement weather. Parts of the storage container can be articulated to accommodate storing and removing items, and to accommodate secure placement of the storage container. In some configurations, the utility robot can include active control of the storage container, for example, to maintain a particular orientation of the storage container. If the contents of the storage container must remain in a particular orientation to prevent destruction of the contents, active control of the orientation of the contents within the storage container can be enabled. In some configurations, each face of the contents of the storage container can be identified to enable proper orientation of the contents.
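As an illustrative sketch of the active orientation control described above, a proportional controller can drive a compartment pivot so the compartment stays level as the chassis pitches; the gain and actuator interface are assumptions.

    # Sketch of an orientation hold for a storage compartment; gain is assumed.
    def compartment_pivot_rate(chassis_pitch_rad, pivot_angle_rad,
                               target_ground_angle_rad=0.0, kp=2.0):
        """Command a pivot rate that nulls the compartment's angle to the ground."""
        ground_angle = chassis_pitch_rad + pivot_angle_rad  # angle vs. the ground
        error = target_ground_angle_rad - ground_angle
        return kp * error  # rad/s command to the pivot actuator

    # Chassis pitched 0.2 rad nose-up (e.g., climbing a curb), pivot still at zero:
    print(compartment_pivot_rate(0.2, 0.0))  # -0.4 rad/s: rotate back toward level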
In some configurations, sensors can be mounted in various locations on/in the storage container, for example, to notify the utility robot when the storage container could be subject to an undesired collision. In some configurations, the storage container and/or the manifest can inform the utility robot to adjust accelerations according to a pre-selected threshold. The utility robot, which can determine the current rate of acceleration of the utility robot based on data collected from the utility robot's wheel counter and IMU, can limit commands to the drive wheels and/or brakes to adjust accelerations according to the pre-selected threshold.
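For illustration, the acceleration limiting described above can be sketched as clamping each velocity command against the pre-selected threshold; the control interface and sample period are assumptions.

    # Sketch of acceleration clamping against a manifest-supplied threshold.
    def limit_velocity_command(v_current, v_requested, a_max, dt):
        """Clamp the next velocity command so |acceleration| stays within a_max."""
        max_delta = a_max * dt
        delta = max(-max_delta, min(max_delta, v_requested - v_current))
        return v_current + delta

    # Fragile cargo: at most 0.5 m/s^2, with an assumed 0.1 s control period.
    print(limit_velocity_command(1.0, 2.0, a_max=0.5, dt=0.1))  # 1.05 m/s, not 2.0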
In some configurations, one of the storage containers can be mounted behind the operator, and can be greater than or equal to about two feet tall. The storage containers can include snap-on features that can allow placement of the storage containers onto the chassis in various configurations. The storage containers can receive and process information from an electronic application, for example, open and close commands from a wireless device. In some configurations, when a parcel is loaded into a storage container, the utility robot can identify, for example by taking a photograph, the individual who loads the parcel and associate the parcel with the identification. In some configurations, the storage container of the present teachings can measure 30-40 inches by two feet. In some configurations, the utility robot can automatically poll the parcels it carries and automatically summon any needed assistance to deliver the parcels in a timely manner. The mounted storage containers can be interchangeable with storage containers of sizes suitable for the particular delivery and can be secured to the utility robot.
The utility robot of the present teachings can be docked proximal to where package delivery can originate. In some configurations, docking stations can include openings in the building where the packages are located. Packages can be deposited at stations within the buildings and near the openings, and can be automatically sorted. The sorted packages can be automatically loaded onto a utility robot of the present teachings through one of the openings. Sensors and/or transponders can detect the contents of the packages.
The utility robots of the present teachings can include technology to collect payment for services and retain payment records. The utility robot can notify the service target that the service has been completed, for example, by a cell phone notification or a text. The service target can move towards the utility robot to avoid challenging terrain such as, for example, stairs. In some configurations in which the service provided is a delivery service, storage compartments can include embedded RFID circuitry that can be broken when the delivery storage is opened. An RFID scanner could be used to reveal that the storage container has been opened. To maintain privacy, the contents of the storage container can be moved to a secure location before opening. The utility robot can receive information about the service target such as, for example, biometric information, to identify that the service is being delivered to the correct target. For example, the utility robot can secure the storage container until the target is recognized by, for example, facial recognition technology. The utility robot can receive personal information such as credit card and cell phone information, to, for example, unlock a storage container. In some configurations, the utility robot can include biometric sensors, for example, facial sensors and/or fingerprint sensors, that can, for example, detect if the contents of a storage container are associated with the person attempting to collect the contents. In some configurations, the utility robot can combine correct location information with correct code entry or other forms of identification to unlock the storage container.
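A minimal sketch of the combined unlock check described above follows: the compartment opens only when the utility robot is at the delivery location and the recipient's code verifies. The distance tolerance and the use of a hashed code are illustrative assumptions.

    # Sketch of location-plus-identification unlocking; tolerance is assumed.
    import hashlib

    def at_destination(robot_xy, dest_xy, tol_m=5.0):
        """True when the robot is within the assumed tolerance of the destination."""
        dx, dy = robot_xy[0] - dest_xy[0], robot_xy[1] - dest_xy[1]
        return (dx * dx + dy * dy) ** 0.5 <= tol_m

    def may_unlock(robot_xy, dest_xy, entered_code, stored_code_hash):
        """Unlock only when location and code verification both succeed."""
        code_ok = hashlib.sha256(entered_code.encode()).hexdigest() == stored_code_hash
        return at_destination(robot_xy, dest_xy) and code_ok

    stored = hashlib.sha256(b"4913").hexdigest()
    print(may_unlock((100.0, 50.0), (102.0, 51.0), "4913", stored))  # True
    print(may_unlock((0.0, 0.0), (102.0, 51.0), "4913", stored))     # False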
The utility robots of the present teachings can detect tampering with the utility robot, and thus unsafe and dangerous conditions. In some configurations, the utility robot can detect a change in the center of mass that can indicate tampering. Adding or subtracting weight from the utility robot can change the center of mass. The utility robot can include an IMU, and can measure the location of center of mass based on the response of the vehicle to accelerations and changes in the attitude of the utility robot. The change of mass can indicate that the utility robot might be compromised. In some configurations in which packages are being transported, the utility robot can detect packages that do not include identification sufficient to couple the package with the delivery target. For example, the utility robot can detect an unapproved package because a loading authorization code does not match the expected code, or the RFID code is incorrect or missing, or there is a mismatch between the actual weight of the package and the weight listed on the manifest. The utility robot can generate an alert, the type of which can depend upon the probable cause of the suspected tampering. Some alerts can be directed to the state authorities, while others can be directed to an electronic record that can be accessed by the utility robot of the present teachings, the trucks, the smart beacons, and other possible participants in the provided service, possibly through the fleet network. Following an error condition, the utility robot can automatically or semi-automatically steer itself to a safe location such as a charging station. In some configurations, the contents of storage containers can be secured.
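By way of illustration, the center-of-mass tamper check described above can be sketched as comparing an estimated center of mass against a baseline; the estimation itself (from the IMU response to accelerations) is abstracted away, and the tolerance is an assumption.

    # Sketch of tamper detection from a center-of-mass shift; tolerance assumed.
    def center_of_mass_shifted(baseline_com, estimated_com, tol_m=0.03):
        """Flag possible tampering when the estimate moves beyond tolerance."""
        return any(abs(b - e) > tol_m for b, e in zip(baseline_com, estimated_com))

    baseline = (0.00, 0.00, 0.40)  # x, y, z in meters, measured when loaded
    print(center_of_mass_shifted(baseline, (0.01, 0.00, 0.41)))  # False: noise
    print(center_of_mass_shifted(baseline, (0.10, 0.02, 0.43)))  # True: alert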
Beacons can communicate with the utility robot, and the status of the utility robot and its current activities can be provided to the beacons and thus to the fleet network. In some configurations where the utility robot is delivering goods, beacons can communicate with the contents of the storage containers, and a list and status of the contents of the storage containers can be made available to other members of the delivery fleet through the fleet network. All of the members of the fleet can be recognized by each other. If a utility robot of the present teachings detects that it has been compromised, it can initiate a safety procedure in which its secure electronic information can be backed up and destroyed, and the contents of its storage containers can be safely locked down.
To facilitate mapping of the route traveled by the utility robot between the starting and ending points, whether the starting point is at a fixed location, such as a pickup station associated with a brick-and-mortar source, or whether the starting point is at a mobile location, such as a truck or a pedestrian, the utility robot can begin with a static map. In some configurations, the static map can be derived from an open source map. In some configurations, the fleet system can include at least one server that can manage static map activity. In some configurations, the utility robot can maintain a local version of the static map from which it can operate between updates from the version maintained by the server. In some configurations, the utility robot can augment the static map with, for example, but not limited to, indications of congested areas based on information from, for example, but not limited to, other fleet vehicles, cell phone applications, obstacles such as trees and trash cans, pedestrians, heat map data, and wifi signals. The static map can be used, in conjunction with utility robot sensor data and fleet data, to deduce the location of dynamic objects. The utility robot can collect navigation data while en route to a target and can avoid the congested areas. The utility robot can, for example, detect fiducials and beacons installed at various places along the path, for example, but not limited to, street corners and street signs at street corners. The fiducials and beacons can be members of the fleet network and thus share data with and possibly receive information from members of the fleet network. The fiducials and beacons can be installed and maintained by any entity including, but not limited to, the item's source entity, the company managing the deliveries, and the city in which the deliveries are taking place. The utility robots can receive information from fiducials and beacons installed at street intersections and, in some configurations, can send information to the fiducials and beacons that are configured to receive information. The utility robot can also sense safety features such as traffic lights and walk/no-walk indicators that can generate alerts audibly, visually, by another type/frequency of signal, and/or by a combination of these methods. The utility robot can process traffic light data and follow the pre-established road rules that it has learned. For example, the utility robot can be taught to stop when the traffic light is red. Vehicles in an intersection can be detected. Route issues such as closures can be detected. The utility robot can update the fleet network's database with information such as, but not limited to, traffic light information, that can enrich the mapping data available to the fleet network. In some configurations, the utility robot can make use of information collected by a body camera worn by the operator of a member of the fleet network.
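For illustration, the deduction of dynamic objects described above can be sketched as subtracting the static map from current detections; the cell-based matching is a simplifying assumption.

    # Sketch of using the static map to deduce dynamic objects: any detection
    # the map does not explain is treated as potentially dynamic.
    def classify_detections(detections, static_map_cells):
        """Split detections into static (on the map) and dynamic (unexplained)."""
        static, dynamic = [], []
        for d in detections:
            (static if d["cell"] in static_map_cells else dynamic).append(d)
        return static, dynamic

    static_map_cells = {(10, 4), (10, 5)}  # e.g., a tree and a trash can
    detections = [{"cell": (10, 4)}, {"cell": (7, 2)}]
    fixed, moving = classify_detections(detections, static_map_cells)
    print(len(fixed), len(moving))  # 1 1: the unexplained detection may be moving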
Semi-autonomous utility robots of the present teachings can receive input from operators during each trip and can use that input to record locations of obstacles such as, for example, but not limited to, stairs, crosswalks, doors, ramps, escalators, and elevators. From these data and real-time and/or semi-real-time data, maps and dynamic navigation routes can be created and updated. Autonomous utility robots can use the maps for current and future deliveries. For each step in the dynamic navigation route, the utility robot of the present teachings can determine the obstacles in the navigation route, the amount of time required to complete a desired motion that the utility robot will have to accomplish to follow the navigation path, the space that will be occupied by the static and dynamic obstacles in the path at that time, and the space required to complete the desired motion. With respect to the obstacles, the utility robot can determine if there is an obstacle in the path, how big the obstacle is, whether or not the obstacle is moving, and how fast and in what direction the obstacle is moving and accelerating. The dynamic navigation path can be updated during navigation. The path with the fewest obstacles can be chosen, and dynamic route modifications can be made if a selected route becomes less optimal while the utility robot is in transit. For example, if a group of pedestrians moves to a position in the chosen route, the route can be modified to avoid the group of pedestrians. Likewise, if repairs begin on a sidewalk, for example, the route can be modified to avoid the construction zone. Stereo cameras and point cloud data can be used to locate and avoid obstacles. The distance from various obstacles can be determined by real-time sensing technology such as, for example, but not limited to, planar LIDAR, ultrasonic sensor arrays, RADAR, stereoscopic imaging, monocular imaging, and Velodyne LIDAR. In some configurations, processing of sensor data by the utility robot can allow the utility robot to determine, for example, whether the utility robot is within an allowed envelope of the planned path, and whether the obstacles in the navigation path are behaving as predicted in the dynamic navigation path. The utility robot can accommodate trips of various lengths, solving the problem of short-distance delivery of services.
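A minimal sketch of the per-step determination described above follows: predict where a tracked obstacle will be when the utility robot reaches a path segment, and test for overlap. Constant-velocity prediction and circular footprints are illustrative assumptions.

    # Sketch of a time-space clearance check along the dynamic navigation path.
    def predict(obstacle, t_s):
        """Constant-velocity prediction of an obstacle's position at time t_s."""
        x, y = obstacle["xy"]
        vx, vy = obstacle["vel"]
        return (x + vx * t_s, y + vy * t_s)

    def step_is_clear(segment_xy, t_arrive_s, obstacles, robot_radius_m=0.6):
        """True if no predicted obstacle overlaps the segment at arrival time."""
        for ob in obstacles:
            ox, oy = predict(ob, t_arrive_s)
            dx, dy = segment_xy[0] - ox, segment_xy[1] - oy
            if (dx * dx + dy * dy) ** 0.5 < robot_radius_m + ob["radius_m"]:
                return False  # replan or wait: predicted overlap
        return True

    pedestrian = {"xy": (4.0, 0.0), "vel": (-1.0, 0.0), "radius_m": 0.4}
    print(step_is_clear((2.0, 0.0), 2.0, [pedestrian]))  # False: paths cross
    print(step_is_clear((2.0, 0.0), 0.5, [pedestrian]))  # True: robot passes first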
Information can be derived from commercially-available navigation tools that provide online mapping for pedestrians, for example. Commercially-available navigation tools such as, for example, but not limited to, GOOGLE® maps, BING® maps, and MAPQUEST® maps, can provide pedestrian map data that can be combined with obstacle data to generate a clear path from source to destination as the utility robot travels from one place to another. Crowd-sourced data can augment both navigational and obstacle data. Operators who travel in the vicinity of the source of the goods and the target service area can be invited to wear cameras and upload data to the utility robot, and/or to upload an application that can, for example, but not limited to, track location, speed of movement, congestion, and/or user comments. Operators can perform the job of smart sensors, providing, for example, but not limited to, situational awareness and preferred speed to the utility robot. In some configurations, operator-driven systems of the present teachings can generate training data for interactions with people including, but not limited to, acceptable approach distances, following distances, and passing distances. Cellular phone-type data, such as, for example, but not limited to, obstacles and their speed and local conditions, can be made available to the fleet's database to enable detailed and accurate navigation maps. The utility robot can include technology that can determine areas in which the GPS signal falls below a desired threshold so that other technologies can be used to maintain communications. Sidewalks can be painted with various substances, such as, for example, but not limited to, photoluminescent substances, that can be detected by sensors on the utility robot. The utility robot can use the data gathered from sensing the substances to create and augment navigation maps.
Wheel rotation and inertial measurement data can be combined to determine dead reckoning positions when creating the maps. Sensor data, such as data from visual sensors, can be used to determine dead reckoning positions. The utility robots of the present teachings can receive information about their routes from information collected by trucks, and members of the fleet can be used to create/improve pedestrian maps. The trucks can include portable utility robots, and the operators of the trucks can collect further data by use of body cameras and location sensors to map walking deliveries. Visual, audible, and thermal sensing mechanisms can be used on the trucks and in conjunction with the operator's movements. The utility robot can make use of optimized and/or preferred route information collected by trucks and operators. The utility robot can include a pedestrian route on the desired navigation map.
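For illustration, the combination of wheel rotation and inertial data described above can be sketched as simple dead reckoning: the odometer supplies distance and the IMU supplies heading; noise handling and sensor fusion are omitted.

    # Minimal dead-reckoning sketch from wheel ticks and an IMU heading.
    import math

    def dead_reckon(pose, wheel_ticks, ticks_per_meter, heading_rad):
        """Advance (x, y) by the odometer distance along the IMU heading."""
        x, y = pose
        d = wheel_ticks / ticks_per_meter
        return (x + d * math.cos(heading_rad), y + d * math.sin(heading_rad))

    pose = (0.0, 0.0)
    pose = dead_reckon(pose, wheel_ticks=500, ticks_per_meter=100.0, heading_rad=0.0)
    pose = dead_reckon(pose, wheel_ticks=500, ticks_per_meter=100.0,
                       heading_rad=math.pi / 2)
    print(pose)  # about (5.0, 5.0) after two 5 m legs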
In some configurations, the utility robot can learn navigation paths independently and can share the navigation information with other members of the fleet network. In some configurations, the operator can select at least one optimum navigation route. The utility robot can also include cameras that can be used to augment navigation maps. Features located inside buildings, such as, for example, but not limited to, doors, stairs, and elevators, and routes limited to pedestrians, can be candidates for body camera data collection. In subsequent journeys to the same location, the doors, stairs, and elevators may be navigable by the utility robot, and the utility robot can bypass pedestrian-only paths, for example. The utility robot can follow a planned route. The utility robot can receive commands from the operator, and/or can self-command based on the desired route. Steering and location assistance can be provided by navigation tools combined with obstacle avoidance tools. The utility robot can accommodate ADA access rules, including, but not limited to, space requirements with respect to the utility robot's egress and ingress.
In some configurations, the dynamic navigation path can be updated by the utility robot when the utility robot determines if an obstacle can be surmounted and/or avoided. For example, the utility robot can determine if the obstacle can be driven over, such as a curb, a rock, or a pothole, or can be driven around. The utility robot can determine if the obstacle can be expected to move out of the navigation path, and if there is a way that the utility robot can make progress along the planned navigation path. In some configurations, the utility robot of the present teachings can accommodate crossing roads with and without traffic signals, curbs, dynamic obstacles, and complete path obstruction. The utility robot can include routing technology that can avoid congested areas based on, for example, but not limited to, current congestion information from other utility robots of the present teachings, crowd-sourced congestion information, and historical congestion information from other utility robots of the present teachings and trucks. Historical congestion information can include, but is not limited to including, day and time of congestion from past traverses in the same area by utility robots of the present teachings, and day and time of congestion derived from delivery truck speeds. Dynamic navigation paths can be created based on current path data and the maps. The utility robot can include training technology in which data from operators traveling a route can inform the utility robot of the present teachings how to interact with moving obstacles and how to behave in an environment having moving obstacles. In some configurations, data from fleet drivers traveling the route can be used as training data for machine learning on how to interact with moving people or in an environment of moving people. In some configurations, a heat map of pedestrian traffic can be used to update pedestrian density data. In some configurations, route planning can take into account the desired transit time, the estimated transit time, how much space obstacles are occupying on the planned route, and how much space the utility robot requires. The utility robot can determine its status with respect to the planned route, and can track what movement the obstacles in the planned route are making.
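By way of illustration, the congestion-aware routing described above can be sketched as blending the current, crowd-sourced, and historical congestion signals into a single edge cost; the weights are assumptions.

    # Illustrative congestion-weighted traversal cost; weights are assumed.
    def edge_cost(base_time_min, current_cong, crowd_cong, historical_cong,
                  w_cur=0.5, w_crowd=0.3, w_hist=0.2):
        """Scale traversal time by a blended congestion factor in [0, 1]."""
        congestion = (w_cur * current_cong + w_crowd * crowd_cong
                      + w_hist * historical_cong)
        return base_time_min * (1.0 + congestion)

    print(edge_cost(10.0, current_cong=0.8, crowd_cong=0.6,
                    historical_cong=0.4))  # 16.6 min for a nominal 10 min edge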
Each form of sensor data can provide a unique view of its surroundings, and fusing the various types of sensor data can help to specifically identify obstacles, including dynamic objects. Using these data, dynamic objects can be classified by methods including, but not limited to, semantic segmentation. Predicting the future position of a dynamic object, after it has been identified, can be accomplished by semantic scene segmentation, which can color-code a scene based on object type. The future position of a dynamic object can also be predicted by creating behavioral models of dynamic objects that can be processed by the utility robots of the present teachings. Neural networks, Kalman filters, and other machine learning techniques can also be used to train the utility robot of the present teachings to understand and react to its surroundings. If the utility robot encounters an obstacle with which it can interact, for example, a pedestrian, the utility robot can be trained to stop before encountering the pedestrian, greet the pedestrian, and avoid hitting the pedestrian, for example. In some configurations, planar LIDAR, visual sensors, and ultrasonic sensors can be used to detect pedestrians. A critical distance around a pedestrian can be defined based on the distance needed to stop given sensor delays, and on social norms, for example. The socially-acceptable interactions between the utility robot and humans may be defined by data from user-driven systems interacting with humans. In some configurations, the data collected by the user-driven systems can be used to train a neural network in the autonomous systems that can control the utility robot's interaction with humans. In some configurations, to avoid obstacles such as humans and vehicles when crossing a street, RADAR and/or LIDAR, combined with stereo cameras, can be used for long distance viewing and to reliably identify the obstacles and create a crossing strategy. In some configurations, the utility robot of the present teachings can communicate wirelessly with available electronic sources such as elevators and pedestrian crosswalks. Smart beacons can be used for this purpose. When obstacles such as construction zones are encountered, the utility robot of the present teachings can purposefully navigate the construction zone, and can inform other fleet members of the extent of the obstacle, giving the other fleet members an opportunity to avoid the obstacle. A neural network executing in the utility robot can train the utility robot to recognize crossing signals, for example, and to cross when safe.
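For illustration, the critical distance described above can be computed as the distance traveled during the sensor delay, plus the kinematic braking distance, plus a social buffer; the delay, deceleration, and buffer values below are assumptions.

    # Sketch of the critical-distance computation around a pedestrian.
    def critical_distance_m(speed_m_s, sensor_delay_s=0.2,
                            decel_m_s2=2.0, social_buffer_m=1.0):
        """Reaction distance plus braking distance plus a social-norm buffer."""
        reaction = speed_m_s * sensor_delay_s          # travel before braking
        braking = speed_m_s ** 2 / (2.0 * decel_m_s2)  # kinematic stopping
        return reaction + braking + social_buffer_m

    print(round(critical_distance_m(3.0), 2))  # 3.85 m at a brisk walking pace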
The utility robot can receive information from smart beacons placed strategically along travel paths. In some configurations, information from the smart beacons can be encrypted, and/or information exchanged between the utility robot of the present teachings and the smart beacon can be encrypted to protect the utility robot from malicious hacking. In some configurations, the smart beacons can include cameras, RADAR, and/or LIDAR that can be used to map the local area. In some configurations, smart beacons can vary in complexity and specialization. For example, smart beacons that can manage network communications can be placed in areas where it is likely that network members will need communication services. Smart beacons that include mapping cameras can be placed in locations where mapping is required, and can be moved from place to place depending on current needs. In some configurations, smart beacons can include data transfer hot spot capability, or other networking capability to enable the fleet network of the present teachings to communicate among fleet members. In some configurations, smart beacons can recognize the travel path and be aware of the next navigation step required for the utility robot to reach its desired destination. Smart beacons can receive at least part of the utility robot's path and/or destination from a server. The smart beacons can identify the utility robots of the present teachings, possibly through the secure wireless exchange of identifying information, possibly through visual and/or audible identification techniques, or other means. Secure exchange of messages can include encryption, for example, and other forms of protection against in-flight message modification, man-in-the-middle threats such as eavesdropping and denial of service, third party application threats, and malicious/erroneous application threats. The utility robot can receive navigation information from the smart beacon, including homing, triangulation, and aiming signals. The utility robot can receive current mapping information including, but not limited to, congestion areas and path closures, from the smart beacon, and the utility robot can send the information it has collected to the smart beacon. The utility robot can make beacon information available to other utility robot fleet members at any time, for example, but not limited to, during a parcel delivery and/or pickup. The utility robot can receive information from the smart beacon that can be used to correct the utility robot's IMU dead reckoning and wheel rotation navigation. In some configurations, the utility robot can navigate entirely through information received from the smart beacon. For example, in a congested area, it is possible that some of the sensors located on the utility robot of the present teachings could be blocked. Sensors, for example, LIDAR sensors, on the smart beacon can provide navigation information to the utility robot of the present teachings that the utility robot could not itself have obtained with its on-board sensors. Sensors located on any of the utility robots of the present teachings, the trucks, and/or the smart beacons can provide current congestion information from cameras and/or thermal imaging to form heat maps. The utility robot can receive instructions from a steerable RF or laser beacon that can be controlled by another member of the fleet, a central control location, or by the utility robot itself.
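A minimal sketch of beacon-based position fixing that could correct dead reckoning, as described above, follows; it assumes exact ranges to three non-collinear beacons (trilateration).

    # Sketch of 2-D trilateration from three known beacon positions.
    def trilaterate(b1, r1, b2, r2, b3, r3):
        """Solve for (x, y) from ranges to three non-collinear beacons."""
        (x1, y1), (x2, y2), (x3, y3) = b1, b2, b3
        a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
        a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
        c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
        c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
        det = a11 * a22 - a12 * a21  # nonzero when beacons are not collinear
        return ((c1 * a22 - c2 * a12) / det, (a11 * c2 - a21 * c1) / det)

    # Robot actually at (3, 4); ranges follow from the beacon geometry.
    print(trilaterate((0, 0), 5.0,
                      (10, 0), (49 + 16) ** 0.5,
                      (0, 10), (9 + 36) ** 0.5))  # approximately (3.0, 4.0)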
In some configurations, the utility robot can be configured with a minimum number of sensors if data are planned to be collected by other fleet members. The utility robot can receive these sensor data, for example, the heat maps, and recognize the location of groups of obstacles, possibly dynamic obstacles, within potential travel routes. In areas without various types of beacons, exploring utility robots with partial or full complements of sensors can retrieve navigation and congestion data and make the data accessible to utility robots of the present teachings that are traveling the explored routes to deliver goods and services. The exploring systems can provide their sensor data and analyses to a central service, a cloud-based storage area, a smart beacon, and/or another exploring system, utility robot, and/or truck or other member of the delivery fleet, for example. Beacons can be used to facilitate data communications among the fleet members, and can be used to improve localization accuracy. In some configurations, beacons can include wireless access points generating signals, such as, for example, wifi and RF signals, that can be used to help navigate the utility robot in areas in which global positioning techniques are inadequate.
The present teachings will be more readily understood by reference to the following description, taken with the accompanying drawings.
The utility system of the present teachings is discussed in detail herein in relation to commercial services. However, various types of applications may take advantage of the features of the present teachings.
While the present teachings have been described in terms of specific configurations, it is to be understood that they are not limited to these disclosed configurations. Many modifications and other configurations will come to mind to those skilled in the art to which this pertains, and which are intended to be and are covered by both this disclosure and the appended claims. It is intended that the scope of the present teachings should be determined by proper interpretation and construction of the appended claims and their legal equivalents, as understood by those of skill in the art relying upon the disclosure in this specification and the attached drawings.
This application claims the benefit of U.S. Provisional Application Ser. No. 62/682,129, filed Jun. 7, 2018, entitled SYSTEM AND METHOD FOR DISTRIBUTED UTILITY SERVICE EXECUTION, which is incorporated herein by reference in its entirety.
6416272 | Suehiro | Jul 2002 | B1 |
6435535 | Field | Aug 2002 | B1 |
6435538 | Ellis | Aug 2002 | B2 |
6443250 | Kamen et al. | Sep 2002 | B1 |
6443251 | Morrell et al. | Sep 2002 | B1 |
6446320 | Kilgore | Sep 2002 | B2 |
6463369 | Sadano | Oct 2002 | B2 |
D466122 | Moody | Nov 2002 | S |
6484829 | Cox | Nov 2002 | B1 |
D466516 | Peiker | Dec 2002 | S |
6502011 | Haag | Dec 2002 | B2 |
6508319 | Langenfeld et al. | Jan 2003 | B1 |
6538411 | Field et al. | Mar 2003 | B1 |
6543564 | Kamen | Apr 2003 | B1 |
6543848 | Suga et al. | Apr 2003 | B1 |
6543858 | Melton | Apr 2003 | B1 |
6547026 | Kamen et al. | Apr 2003 | B2 |
6553271 | Morrell | Apr 2003 | B1 |
6554250 | Alves et al. | Apr 2003 | B2 |
6556909 | Matsumoto | Apr 2003 | B2 |
6561294 | Kamen | May 2003 | B1 |
6562511 | Daroux | May 2003 | B2 |
6571176 | Shinmura | May 2003 | B1 |
6571892 | Kamen | Jun 2003 | B2 |
6575539 | Reich | Jun 2003 | B2 |
6581714 | Kamen | Jun 2003 | B1 |
6582181 | Suehiro et al. | Jun 2003 | B2 |
6586901 | Singer et al. | Jul 2003 | B1 |
6593849 | Chubb | Jul 2003 | B2 |
6598941 | Field et al. | Jul 2003 | B2 |
6614343 | Fennel | Sep 2003 | B1 |
6615938 | Morrell et al. | Sep 2003 | B2 |
6634451 | Sakakiyama | Oct 2003 | B2 |
6643451 | Tokura et al. | Nov 2003 | B1 |
6647248 | Ortscheid et al. | Nov 2003 | B1 |
6651763 | Kamen et al. | Nov 2003 | B1 |
6651766 | Kamen | Nov 2003 | B2 |
6654674 | Lu | Nov 2003 | B2 |
6654675 | Pedersen et al. | Nov 2003 | B2 |
6659211 | Esposito | Dec 2003 | B2 |
6659570 | Nakamura | Dec 2003 | B2 |
D485279 | DeCombe | Jan 2004 | S |
6694225 | Aga | Feb 2004 | B2 |
6704622 | Tinskey | Mar 2004 | B2 |
6713693 | Sadowski et al. | Mar 2004 | B1 |
D489027 | Waters | Apr 2004 | S |
D489029 | Waters | Apr 2004 | S |
D489300 | Chang | May 2004 | S |
6752231 | Hume | Jun 2004 | B2 |
D493127 | Waters | Jul 2004 | S |
D493128 | Waters | Jul 2004 | S |
D493801 | Byun | Aug 2004 | S |
D494099 | Maurer | Aug 2004 | S |
6779621 | Kamen et al. | Aug 2004 | B2 |
6781960 | Charas | Aug 2004 | B1 |
6789640 | Arling | Sep 2004 | B1 |
6793258 | Gray | Sep 2004 | B2 |
6796396 | Kamen | Sep 2004 | B2 |
6799649 | Kamen et al. | Oct 2004 | B2 |
6827163 | Amsbury et al. | Dec 2004 | B2 |
6856326 | Zhai | Feb 2005 | B1 |
D503402 | Su et al. | Mar 2005 | S |
6866107 | Heinzmann et al. | Mar 2005 | B2 |
6868931 | Morrell | Mar 2005 | B2 |
D503928 | Obata | Apr 2005 | S |
6874591 | Morrell et al. | Apr 2005 | B2 |
6889784 | Troll | May 2005 | B2 |
6907949 | Wang | Jun 2005 | B1 |
D507206 | Wang | Jul 2005 | S |
6920947 | Kamen et al. | Jul 2005 | B2 |
6938923 | Mulhern et al. | Sep 2005 | B2 |
6962383 | Takenoshita et al. | Nov 2005 | B2 |
6965206 | Kamen et al. | Nov 2005 | B2 |
6969079 | Kamen et al. | Nov 2005 | B2 |
7000933 | Arling et al. | Feb 2006 | B2 |
7004271 | Kamen et al. | Feb 2006 | B1 |
7006901 | Wang | Feb 2006 | B2 |
D517086 | Siebel | Mar 2006 | S |
7017686 | Kamen et al. | Mar 2006 | B2 |
D521017 | Jewitt | May 2006 | S |
7040713 | Rudolf | May 2006 | B2 |
D524315 | Reusing | Jul 2006 | S |
7090040 | Kamen et al. | Aug 2006 | B2 |
D528468 | Arling | Sep 2006 | S |
D529005 | Hong | Sep 2006 | S |
7102328 | Long et al. | Sep 2006 | B2 |
7130702 | Morrell | Oct 2006 | B2 |
7174976 | Kamen et al. | Feb 2007 | B2 |
7178611 | Zupanick | Feb 2007 | B2 |
7178614 | Ishii | Feb 2007 | B2 |
7182166 | Gray et al. | Feb 2007 | B2 |
D539810 | Cummins | Apr 2007 | S |
7198223 | Phelps, III et al. | Apr 2007 | B2 |
7210544 | Kamen et al. | May 2007 | B2 |
7219912 | Meyer | May 2007 | B2 |
D544486 | Hussaini | Jun 2007 | S |
7234779 | Bedford et al. | Jun 2007 | B2 |
D546782 | Poulet et al. | Jul 2007 | S |
D549721 | Ito | Aug 2007 | S |
D549722 | Ito et al. | Aug 2007 | S |
D551592 | Chang et al. | Sep 2007 | S |
D551722 | Chang et al. | Sep 2007 | S |
7272681 | Davies | Sep 2007 | B2 |
7273116 | Kamen et al. | Sep 2007 | B2 |
D552609 | Kornblum | Oct 2007 | S |
7275607 | Kamen et al. | Oct 2007 | B2 |
D556149 | Kaufhold et al. | Nov 2007 | S |
D557220 | Ewringmann | Dec 2007 | S |
D557221 | Ewringmann | Dec 2007 | S |
7303032 | Kahlert et al. | Dec 2007 | B2 |
7316441 | Iwatani et al. | Jan 2008 | B2 |
D564033 | Itskov et al. | Mar 2008 | S |
7363993 | Ishii | Apr 2008 | B2 |
7370713 | Kamen | May 2008 | B1 |
7399035 | Kusanagi et al. | Jul 2008 | B2 |
7481291 | Nishikawa | Jan 2009 | B2 |
D585906 | Berg et al. | Feb 2009 | S |
D587660 | Lin | Mar 2009 | S |
7539557 | Yamauchi | May 2009 | B2 |
7546889 | Kamen et al. | Jun 2009 | B2 |
D598927 | Hirsch | Aug 2009 | S |
7589643 | Dagci | Sep 2009 | B2 |
7592900 | Kamen et al. | Sep 2009 | B2 |
D601922 | Imai et al. | Oct 2009 | S |
7640086 | Nakashima et al. | Dec 2009 | B2 |
7688191 | Lu | Mar 2010 | B2 |
7690447 | Kamen et al. | Apr 2010 | B2 |
7690452 | Kamen et al. | Apr 2010 | B2 |
7703568 | Ishii | Apr 2010 | B2 |
D614998 | Fujita | May 2010 | S |
7740099 | Field et al. | Jun 2010 | B2 |
D619945 | Sadanowicz et al. | Jul 2010 | S |
7757794 | Heinzmann et al. | Jul 2010 | B2 |
7789174 | Kamen | Sep 2010 | B2 |
7823676 | Yamada et al. | Nov 2010 | B2 |
7856248 | Fujisaki | Dec 2010 | B1 |
7857088 | Field | Dec 2010 | B2 |
D632229 | Kruse | Feb 2011 | S |
7896440 | Tsai | Mar 2011 | B2 |
7900725 | Heinzmann et al. | Mar 2011 | B2 |
7917097 | Hawkins et al. | Mar 2011 | B2 |
7938207 | Kamen et al. | May 2011 | B2 |
7958956 | Kakinuma et al. | Jun 2011 | B2 |
D644654 | Maitlen et al. | Sep 2011 | S |
8011459 | Serai | Sep 2011 | B2 |
8014923 | Ishii | Sep 2011 | B2 |
8025325 | Carrier et al. | Sep 2011 | B1 |
8028777 | Kakinuma | Oct 2011 | B2 |
8050820 | Yanaka | Nov 2011 | B2 |
8050837 | Yamada | Nov 2011 | B2 |
8074388 | Trainer | Dec 2011 | B2 |
8091672 | Gutsch | Jan 2012 | B2 |
8113244 | Kamen et al. | Feb 2012 | B2 |
8151912 | Koide et al. | Apr 2012 | B2 |
8155828 | Fuwa et al. | Apr 2012 | B2 |
8160794 | Fuwa | Apr 2012 | B2 |
8162089 | Shaw | Apr 2012 | B2 |
8170780 | Field | May 2012 | B2 |
8170781 | Fuwa | May 2012 | B2 |
8172016 | Goertzen et al. | May 2012 | B2 |
8186462 | Kamen | May 2012 | B2 |
8224524 | Nakashima | Jul 2012 | B2 |
8225891 | Takenaka | Jul 2012 | B2 |
8239992 | Schnittman | Aug 2012 | B2 |
8248222 | Kamen | Aug 2012 | B2 |
8249773 | Kawada | Aug 2012 | B2 |
8255105 | Weissert | Aug 2012 | B2 |
8265774 | Senba | Sep 2012 | B2 |
8269130 | Mangan et al. | Sep 2012 | B2 |
8285474 | Doi | Oct 2012 | B2 |
8312017 | Martin et al. | Nov 2012 | B2 |
8326469 | Phillips | Dec 2012 | B2 |
8346441 | Miki et al. | Jan 2013 | B2 |
8371410 | Fuwa | Feb 2013 | B2 |
D678217 | Helm | Mar 2013 | S |
D678320 | Kanalakis, Jr. | Mar 2013 | S |
8396611 | Phillips | Mar 2013 | B2 |
8417404 | Yen | Apr 2013 | B2 |
8418705 | Ota et al. | Apr 2013 | B2 |
8453768 | Kamen | Jun 2013 | B2 |
8467941 | Field | Jun 2013 | B2 |
D686200 | Huang et al. | Jul 2013 | S |
8490723 | Heinzmann | Jul 2013 | B2 |
8504248 | Taira | Aug 2013 | B2 |
8564444 | Ota et al. | Oct 2013 | B2 |
8572822 | Hasegawa | Nov 2013 | B2 |
8584782 | Chen | Nov 2013 | B2 |
8587583 | Newcombe | Nov 2013 | B2 |
8621684 | Okumatsu | Jan 2014 | B2 |
8636451 | Yamashita et al. | Jan 2014 | B2 |
8639416 | Jones | Jan 2014 | B2 |
8640807 | Takenaka | Feb 2014 | B2 |
8672339 | Raike, III | Mar 2014 | B2 |
8672356 | Inaguma | Mar 2014 | B2 |
8684123 | Chen | Apr 2014 | B2 |
8690265 | Noblanc | Apr 2014 | B2 |
D704621 | Taylor | May 2014 | S |
D705799 | Funabashi et al. | May 2014 | S |
8738238 | Rekow | May 2014 | B2 |
8738278 | Chen | May 2014 | B2 |
D706807 | Harre | Jun 2014 | S |
D707701 | d'Amore | Jun 2014 | S |
8744720 | Fujisaki | Jun 2014 | B1 |
8753208 | Jaouen et al. | Jun 2014 | B2 |
D708203 | Johnson | Jul 2014 | S |
8775001 | Phillips | Jul 2014 | B2 |
8807250 | Chen | Aug 2014 | B2 |
8830048 | Kamen et al. | Sep 2014 | B2 |
8832875 | Odashima et al. | Sep 2014 | B2 |
8843244 | Phillips | Sep 2014 | B2 |
D716325 | Brudnicki | Oct 2014 | S |
8860551 | Carraher | Oct 2014 | B2 |
D716818 | Alegiani | Nov 2014 | S |
8925563 | Ota et al. | Jan 2015 | B2 |
8958976 | Kajima | Feb 2015 | B2 |
D723558 | Downs | Mar 2015 | S |
8978791 | Ha | Mar 2015 | B2 |
9002535 | Powers | Apr 2015 | B2 |
9016410 | Trowell et al. | Apr 2015 | B2 |
D729270 | Clare | May 2015 | S |
D729833 | Clare | May 2015 | S |
9038212 | Yamaguchi et al. | May 2015 | B2 |
D732062 | Kwon | Jun 2015 | S |
9045190 | Chen | Jun 2015 | B2 |
9056629 | Kamo | Jun 2015 | B2 |
9079039 | Carlson | Jul 2015 | B2 |
9096281 | Li | Aug 2015 | B1 |
D738907 | Cabrera-Cordon et al. | Sep 2015 | S |
D738913 | Cabrera-Cordon et al. | Sep 2015 | S |
9126497 | Heinzmann | Sep 2015 | B2 |
9156516 | Kahlert | Oct 2015 | B2 |
D742300 | Fontaeus | Nov 2015 | S |
D742407 | Park | Nov 2015 | S |
D742795 | Siao | Nov 2015 | S |
9187071 | Vinck et al. | Nov 2015 | B2 |
9193066 | Ohm | Nov 2015 | B2 |
9218003 | Fong | Dec 2015 | B2 |
D747352 | Lee et al. | Jan 2016 | S |
D750179 | Foulkes et al. | Feb 2016 | S |
D752572 | Kohler et al. | Mar 2016 | S |
9278036 | Lee | Mar 2016 | B2 |
9309692 | Westwinkel | Apr 2016 | B2 |
D755785 | Sirotich | May 2016 | S |
D757732 | Galanti | May 2016 | S |
D758284 | Ringer et al. | Jun 2016 | S |
D762179 | Wong | Jul 2016 | S |
9400044 | Wadhva et al. | Jul 2016 | B2 |
D763359 | Kwong | Aug 2016 | S |
D764520 | Lee et al. | Aug 2016 | S |
9403566 | Jacobsen | Aug 2016 | B2 |
9404756 | Fong | Aug 2016 | B2 |
D765718 | Vinna | Sep 2016 | S |
D766312 | Hedges | Sep 2016 | S |
9455104 | Leusenkamp et al. | Sep 2016 | B1 |
D769314 | Piroddi | Oct 2016 | S |
D770514 | Bae et al. | Nov 2016 | S |
D772255 | Taylor et al. | Nov 2016 | S |
D772924 | Begin et al. | Nov 2016 | S |
D772930 | Vazquez et al. | Nov 2016 | S |
D775148 | Anzures | Dec 2016 | S |
9527213 | Luo | Dec 2016 | B2 |
D778312 | Goodwin et al. | Feb 2017 | S |
9567021 | Mailey | Feb 2017 | B2 |
D784405 | Kim et al. | Apr 2017 | S |
D786278 | Motamedi | May 2017 | S |
D786770 | Smallhorn | May 2017 | S |
D787420 | Smallhorn | May 2017 | S |
D787996 | Rode et al. | May 2017 | S |
9636265 | Furuta | May 2017 | B2 |
9656704 | Couture | May 2017 | B2 |
9662438 | Kamen et al. | May 2017 | B2 |
D791174 | Hart et al. | Jul 2017 | S |
D792444 | Cho et al. | Jul 2017 | S |
D794674 | Brush | Aug 2017 | S |
9730029 | Choudhury | Aug 2017 | B2 |
9744879 | Drako | Aug 2017 | B2 |
D797772 | Mizono et al. | Sep 2017 | S |
D798318 | Ferguson | Sep 2017 | S |
9750896 | Kamen et al. | Sep 2017 | B2 |
9770825 | Goldenberg | Sep 2017 | B2 |
D801996 | Yang | Nov 2017 | S |
D802002 | Howard et al. | Nov 2017 | S |
D804393 | Yoo et al. | Dec 2017 | S |
D805972 | Lee et al. | Dec 2017 | S |
D805973 | Mullaney | Dec 2017 | S |
D807235 | Collins | Jan 2018 | S |
D807236 | Collins | Jan 2018 | S |
D807277 | Lee et al. | Jan 2018 | S |
D812533 | Lee et al. | Mar 2018 | S |
D814370 | Kim et al. | Apr 2018 | S |
D816090 | Stonecipher et al. | Apr 2018 | S |
9974467 | Blahnik et al. | May 2018 | B2 |
D821410 | Vinna et al. | Jun 2018 | S |
9989970 | Morey | Jun 2018 | B1 |
9996157 | Chaudhri et al. | Jun 2018 | B2 |
10007391 | Sabatelli et al. | Jun 2018 | B2 |
10025472 | Sabatelli | Jul 2018 | B2 |
D825437 | Hilton et al. | Aug 2018 | S |
D825493 | Chen | Aug 2018 | S |
D826244 | Yampolskaya | Aug 2018 | S |
D826255 | Andrizzi et al. | Aug 2018 | S |
10055108 | Bates | Aug 2018 | B2 |
10055184 | Ferrell et al. | Aug 2018 | B1 |
D829740 | Lepine et al. | Oct 2018 | S |
D830384 | Lepine et al. | Oct 2018 | S |
D830385 | Lepine et al. | Oct 2018 | S |
D830386 | Lepine et al. | Oct 2018 | S |
D831046 | Hashimoto et al. | Oct 2018 | S |
D832289 | Chen et al. | Oct 2018 | S |
10088993 | Hall | Oct 2018 | B2 |
10127250 | Dingman et al. | Nov 2018 | B2 |
10130534 | Mattes | Nov 2018 | B2 |
D835118 | Lee et al. | Dec 2018 | S |
D835139 | Li | Dec 2018 | S |
D835141 | Li et al. | Dec 2018 | S |
D835632 | Liu et al. | Dec 2018 | S |
10149589 | Lindhe | Dec 2018 | B2 |
D838731 | Pillalamarri et al. | Jan 2019 | S |
10172752 | Goffer | Jan 2019 | B2 |
D840413 | Leach et al. | Feb 2019 | S |
D841021 | Klar et al. | Feb 2019 | S |
D841022 | Klar et al. | Feb 2019 | S |
D841676 | Zhang | Feb 2019 | S |
D841687 | Muller et al. | Feb 2019 | S |
10203211 | Mishra | Feb 2019 | B1 |
10216188 | Brady et al. | Feb 2019 | B2 |
D842897 | Kumar | Mar 2019 | S |
10220843 | Coulter | Mar 2019 | B2 |
10222798 | Brady et al. | Mar 2019 | B1 |
10229245 | Laurance | Mar 2019 | B2 |
10230538 | Killian et al. | Mar 2019 | B2 |
10233021 | Brady | Mar 2019 | B1 |
10235014 | Yang | Mar 2019 | B2 |
10241516 | Brady et al. | Mar 2019 | B1 |
D847161 | Chaudhri | Apr 2019 | S |
10266097 | Takahata | Apr 2019 | B2 |
10272294 | Williams et al. | Apr 2019 | B2 |
D847836 | Thoreson | May 2019 | S |
10296167 | Liu | May 2019 | B2 |
10296194 | McLean | May 2019 | B2 |
10308430 | Brady et al. | Jun 2019 | B1 |
10310499 | Brady et al. | Jun 2019 | B1 |
10318589 | Sharp | Jun 2019 | B2 |
10338776 | Andersson | Jul 2019 | B2 |
D855634 | Kim | Aug 2019 | S |
10372304 | Jaramillo, III | Aug 2019 | B2 |
10379695 | Carlos | Aug 2019 | B2 |
10386942 | Kim | Aug 2019 | B2 |
10423283 | Ikeda | Sep 2019 | B2 |
10474737 | Girsova et al. | Nov 2019 | B1 |
10532885 | Brady et al. | Jan 2020 | B1 |
10628790 | Aggarwal | Apr 2020 | B1 |
10901418 | Brady et al. | Jan 2021 | B2 |
20010006125 | Richey | Jul 2001 | A1 |
20010037163 | Allard | Nov 2001 | A1 |
20020007239 | Matsumoto | Jan 2002 | A1 |
20020011361 | Richey | Jan 2002 | A1 |
20020056582 | Chubb | May 2002 | A1 |
20020063006 | Kamen | May 2002 | A1 |
20020074189 | Hester | Jun 2002 | A1 |
20020082749 | Meyers | Jun 2002 | A1 |
20020121394 | Kamen | Sep 2002 | A1 |
20020121572 | Jacobson | Sep 2002 | A1 |
20020189870 | Kamen | Dec 2002 | A1 |
20030014167 | Pedersen | Jan 2003 | A1 |
20030128840 | Luginbill | Jul 2003 | A1 |
20030226698 | Kamen | Dec 2003 | A1 |
20040005958 | Kamen | Jan 2004 | A1 |
20040007121 | Graves | Jan 2004 | A1 |
20040007399 | Heinzmann et al. | Jan 2004 | A1 |
20040007644 | Phelps, III et al. | Jan 2004 | A1 |
20040055796 | Kamen | Mar 2004 | A1 |
20040069543 | Kamen | Apr 2004 | A1 |
20040124655 | Takenoshita et al. | Jul 2004 | A1 |
20040135434 | Honda | Jul 2004 | A1 |
20040201271 | Kakinuma | Oct 2004 | A1 |
20040256886 | Wu | Dec 2004 | A1 |
20040262871 | Schreuder | Dec 2004 | A1 |
20050029023 | Takami | Feb 2005 | A1 |
20050121866 | Kamen | Jun 2005 | A1 |
20050134014 | Xie | Jun 2005 | A1 |
20050211477 | Gray | Sep 2005 | A1 |
20050236208 | Runkles | Oct 2005 | A1 |
20050236894 | Lu et al. | Oct 2005 | A1 |
20050251292 | Casey | Nov 2005 | A1 |
20060108956 | Clark | May 2006 | A1 |
20060163437 | Lin | Jul 2006 | A1 |
20060187646 | Belson et al. | Aug 2006 | A1 |
20060202439 | Kahlert | Sep 2006 | A1 |
20060231313 | Ishii | Oct 2006 | A1 |
20060279554 | Shin | Dec 2006 | A1 |
20060293850 | Ahn | Dec 2006 | A1 |
20070001830 | Dagci | Jan 2007 | A1 |
20070055424 | Peters et al. | Mar 2007 | A1 |
20070085300 | Loewenthal | Apr 2007 | A1 |
20070100511 | Koerlin | May 2007 | A1 |
20070156286 | Yamauchi | Jul 2007 | A1 |
20070198175 | Williams | Aug 2007 | A1 |
20070208483 | Rabin | Sep 2007 | A1 |
20070213900 | Raab | Sep 2007 | A1 |
20070216205 | Davis | Sep 2007 | A1 |
20070221423 | Chang | Sep 2007 | A1 |
20070296170 | Field | Dec 2007 | A1 |
20080029985 | Chen | Feb 2008 | A1 |
20080042379 | Amran | Feb 2008 | A1 |
20080066974 | Pearlman | Mar 2008 | A1 |
20080086241 | Phillips | Apr 2008 | A1 |
20080147281 | Ishii | Jun 2008 | A1 |
20080149798 | Tinoco | Jun 2008 | A1 |
20080174415 | Tanida | Jul 2008 | A1 |
20080197599 | Comstock | Aug 2008 | A1 |
20080238005 | James | Oct 2008 | A1 |
20080294288 | Yamauchi | Nov 2008 | A1 |
20080302938 | Goodwin et al. | Dec 2008 | A1 |
20090009984 | Mangiardi | Jan 2009 | A1 |
20090032323 | Kakinuma | Feb 2009 | A1 |
20090037033 | Phillips | Feb 2009 | A1 |
20090045025 | Bassett | Feb 2009 | A1 |
20090078485 | Gutsch | Mar 2009 | A1 |
20090105908 | Casey | Apr 2009 | A1 |
20090115149 | Wallis | May 2009 | A1 |
20090224524 | Rathsack | Sep 2009 | A1 |
20100025139 | Kosaka | Feb 2010 | A1 |
20100107076 | Laurance | Apr 2010 | A1 |
20100114468 | Field | May 2010 | A1 |
20100121538 | Ishii | May 2010 | A1 |
20100126787 | Kawada | May 2010 | A1 |
20100138128 | Strothmann | Jun 2010 | A1 |
20100222994 | Field | Sep 2010 | A1 |
20100230919 | Kawada | Sep 2010 | A1 |
20100235028 | Ishii | Sep 2010 | A1 |
20100237645 | Trainer | Sep 2010 | A1 |
20100250040 | Yamano | Sep 2010 | A1 |
20110035101 | Kawada et al. | Feb 2011 | A1 |
20110054717 | Yamauchi | Mar 2011 | A1 |
20110106339 | Phillips et al. | May 2011 | A1 |
20110123286 | Van Roosmalen | May 2011 | A1 |
20110175329 | Gingras | Jul 2011 | A1 |
20110209929 | Heinzmann | Sep 2011 | A1 |
20110215540 | Hunziker et al. | Sep 2011 | A1 |
20110220427 | Chen | Sep 2011 | A1 |
20110221160 | Shaw | Sep 2011 | A1 |
20110225417 | Maharajh et al. | Sep 2011 | A1 |
20110238247 | Yen | Sep 2011 | A1 |
20110285195 | Ratgen | Nov 2011 | A1 |
20120019554 | Narimatu et al. | Jan 2012 | A1 |
20120046821 | Pettersson | Feb 2012 | A1 |
20120072052 | Powers | Mar 2012 | A1 |
20120168240 | Wilson | Jul 2012 | A1 |
20120174037 | Relyea et al. | Jul 2012 | A1 |
20120185091 | Field | Jul 2012 | A1 |
20120185094 | Rosenstein | Jul 2012 | A1 |
20120197470 | Inui | Aug 2012 | A1 |
20120205176 | Ha | Aug 2012 | A1 |
20120215355 | Bewley | Aug 2012 | A1 |
20120219395 | Inaguma et al. | Aug 2012 | A1 |
20120239284 | Field | Sep 2012 | A1 |
20120290162 | Stevens | Nov 2012 | A1 |
20120313335 | Zanderlehn | Dec 2012 | A1 |
20130032422 | Chen | Feb 2013 | A1 |
20130032423 | Chen | Feb 2013 | A1 |
20130080015 | Strothmann | Mar 2013 | A1 |
20130081885 | Connor | Apr 2013 | A1 |
20130105239 | Fung | May 2013 | A1 |
20130146409 | Boyle | Jun 2013 | A1 |
20130188809 | Jones | Jul 2013 | A1 |
20130218380 | Phillips et al. | Aug 2013 | A1 |
20130228385 | Chen | Sep 2013 | A1 |
20130231814 | Sarokhan | Sep 2013 | A1 |
20130253769 | Kamo et al. | Sep 2013 | A1 |
20130332064 | Funk | Dec 2013 | A1 |
20140005933 | Fong | Jan 2014 | A1 |
20140018994 | Panzarella | Jan 2014 | A1 |
20140034400 | Underwood | Feb 2014 | A1 |
20140058600 | Hoffmann | Feb 2014 | A1 |
20140083225 | Downs | Mar 2014 | A1 |
20140088761 | Shamlian | Mar 2014 | A1 |
20140187237 | Li | Jul 2014 | A1 |
20140202777 | Lee | Jul 2014 | A1 |
20140246257 | Jacobsen | Sep 2014 | A1 |
20140246258 | Wyrobek | Sep 2014 | A1 |
20140277888 | Dastoor et al. | Sep 2014 | A1 |
20140371979 | Drew | Dec 2014 | A1 |
20150006005 | Yu et al. | Jan 2015 | A1 |
20150060162 | Goffer | Mar 2015 | A1 |
20150112264 | Kamen et al. | Apr 2015 | A1 |
20150119289 | Chen | Apr 2015 | A1 |
20150123453 | Benoit, Jr. | May 2015 | A1 |
20150012057 | Carlson et al. | Jul 2015 | A1 |
20150197247 | Ichinokawa | Jul 2015 | A1 |
20150198440 | Pearlman et al. | Jul 2015 | A1 |
20150231891 | Yashiro et al. | Aug 2015 | A1 |
20150245962 | Furuta | Sep 2015 | A1 |
20150246703 | Oishi et al. | Sep 2015 | A1 |
20150289653 | Hector et al. | Oct 2015 | A1 |
20150342517 | Rabischong | Dec 2015 | A1 |
20160014252 | Biderman et al. | Jan 2016 | A1 |
20160031497 | Luo | Feb 2016 | A1 |
20160035161 | Friedli et al. | Feb 2016 | A1 |
20160069691 | Fong | Mar 2016 | A1 |
20160075535 | Ooms | Mar 2016 | A1 |
20160101685 | Darpino et al. | Apr 2016 | A1 |
20160144505 | Fong | May 2016 | A1 |
20160170411 | Wei | Jun 2016 | A1 |
20160264019 | Drako | Sep 2016 | A1 |
20160291848 | Hall | Oct 2016 | A1 |
20160362147 | Mailey | Dec 2016 | A1 |
20170052033 | Fong | Feb 2017 | A1 |
20170080967 | Atkins | Mar 2017 | A1 |
20170176188 | Georgy et al. | Jun 2017 | A1 |
20170225321 | Deyle | Aug 2017 | A1 |
20170240169 | Coulter et al. | Aug 2017 | A1 |
20170243365 | Nuijten | Aug 2017 | A1 |
20170259811 | Coulter et al. | Sep 2017 | A1 |
20170300058 | Peret et al. | Oct 2017 | A1 |
20180024553 | Kong et al. | Jan 2018 | A1 |
20180056985 | Coulter | Mar 2018 | A1 |
20180102227 | Poon | Apr 2018 | A1 |
20180143801 | Stucker et al. | May 2018 | A1 |
20180146757 | Singh Johar | May 2018 | A1 |
20180164829 | Oshima et al. | Jun 2018 | A1 |
20180185212 | Lucas | Jul 2018 | A1 |
20180203522 | Stucki et al. | Jul 2018 | A1 |
20180253220 | Tuhami | Sep 2018 | A1 |
20180329418 | Baalke | Nov 2018 | A1 |
20190025853 | Julian | Jan 2019 | A1 |
20190033868 | Ferguson | Jan 2019 | A1 |
20190038487 | Cherny | Feb 2019 | A1 |
20190041219 | Schubert | Feb 2019 | A1 |
20190046373 | Coulter | Feb 2019 | A1 |
20190087778 | Evans, Jr. | Mar 2019 | A1 |
20190114564 | Ferguson | Apr 2019 | A1 |
20190224057 | Jordan | Jul 2019 | A1 |
20190231617 | Cazali | Aug 2019 | A1 |
20190269567 | Kao | Sep 2019 | A1 |
20210141377 | Brady et al. | May 2021 | A1 |
Number | Date | Country |
---|---|---|
2822729 | Mar 2006 | CA |
2897221 | Mar 2006 | CA |
2897542 | Jan 2016 | CA |
101056680 | Oct 2007 | CN |
104071275 | Oct 2014 | CN |
2048593 | May 1971 | DE |
3103961 | Sep 1982 | DE |
3128112 | Feb 1983 | DE |
3242880 | Jun 1983 | DE |
3411489 | Oct 1984 | DE |
4110905 | Oct 1991 | DE |
4404594 | Aug 1995 | DE |
19625498 | Nov 1997 | DE |
29808091 | Aug 1998 | DE |
29808096 | Aug 1998 | DE |
10209093 | Sep 2003 | DE |
0109927 | May 1984 | EP |
0193473 | Sep 1986 | EP |
0537698 | Apr 1993 | EP |
0551986 | Jul 1993 | EP |
0663313 | Jul 1995 | EP |
0746089 | Dec 1996 | EP |
0958978 | Nov 1999 | EP |
1063530 | Dec 2000 | EP |
1791609 | Sep 2005 | EP |
1791609 | Mar 2006 | EP |
1759973 | Mar 2007 | EP |
1805071 | Jul 2007 | EP |
980237 | May 1951 | FR |
2502090 | Sep 1982 | FR |
152664 | Jan 1922 | GB |
1213930 | Nov 1970 | GB |
2139576 | Nov 1984 | GB |
2388579 | Nov 2003 | GB |
52-44933 | Apr 1977 | JP |
57-87766 | Jan 1982 | JP |
57-110569 | Jul 1982 | JP |
59-73372 | Apr 1984 | JP |
60-255580 | Dec 1985 | JP |
62-12810 | Jan 1987 | JP |
63-305082 | Dec 1988 | JP |
H01-316810 | Dec 1989 | JP |
2-190277 | Jul 1990 | JP |
4-201793 | Jul 1992 | JP |
5-213240 | Aug 1993 | JP |
6-171562 | Dec 1994 | JP |
61-05415 | Dec 1994 | JP |
7255780 | Oct 1995 | JP |
09-010375 | Jan 1997 | JP |
9-248320 | Sep 1997 | JP |
10-023613 | Jan 1998 | JP |
2000-070308 | Jul 2000 | JP |
2000-288032 | Oct 2000 | JP |
2005-022631 | Jan 2005 | JP |
4572594 | Jan 2006 | JP |
2007-069688 | Mar 2007 | JP |
D1314974 | Nov 2007 | JP |
D1323922 | Mar 2008 | JP |
4687784 | Jul 2010 | JP |
2010-240011 | Oct 2010 | JP |
2010-274759 | Dec 2010 | JP |
2011-246124 | Dec 2011 | JP |
5243795 | Jul 2013 | JP |
2014-019212 | Feb 2014 | JP |
2014-174275 | Sep 2014 | JP |
2014-195403 | Oct 2014 | JP |
2014-204544 | Oct 2014 | JP |
2014-218247 | Nov 2014 | JP |
2015-070897 | Apr 2015 | JP |
2015-171895 | Oct 2015 | JP |
2016-084135 | May 2016 | JP |
2018-062344 | Apr 2018 | JP |
D124943 | Jun 2006 | TW |
WO 198605752 | Oct 1986 | WO |
WO 198906117 | Jul 1989 | WO |
WO 199623478 | Aug 1996 | WO |
WO 199846474 | Oct 1998 | WO |
WO 199911488 | Mar 1999 | WO |
WO 2000023315 | Apr 2000 | WO |
WO 2000054719 | Sep 2000 | WO |
WO 2000054721 | Sep 2000 | WO |
WO 2000075001 | Dec 2000 | WO |
WO 2001002920 | Jan 2001 | WO |
WO 2002030730 | Apr 2002 | WO
WO 2002072383 | Sep 2002 | WO |
WO 2003068342 | Aug 2003 | WO |
WO 2003103559 | Dec 2003 | WO
WO 2003106250 | Dec 2003 | WO |
WO 2004007264 | Jan 2004 | WO |
WO 2004078603 | Sep 2004 | WO
WO 2006031917 | Mar 2006 | WO |
WO 2006042302 | Apr 2006 | WO |
WO 2009052471 | Apr 2009 | WO |
WO 2010084421 | Jul 2010 | WO |
WO 2012090248 | Jul 2012 | WO |
WO 2013096789 | Jun 2013 | WO |
WO 2015167411 | Nov 2015 | WO |
WO 2017147347 | Aug 2017 | WO |
WO 2017156586 | Sep 2017 | WO |
WO 2017180868 | Oct 2017 | WO |
WO 2017201513 | Nov 2017 | WO |
Entry |
---|
U.S. Appl. No. 15/600,703 (U22), B1-B100, B102-B103. |
U.S. Appl. No. 15/600,703 (U22), C8-C40, C42-C50, C52-C65, C67-C74. |
U.S. Appl. No. 15/787,613 (W10), C1-C7, C41, C51, C66. |
Derry et al., Automated Doorway Detection for Assistive Shared-Control Wheelchairs, 2013 IEEE International Conference on Robotics and Automation, May 6-10, 2013, https://cpb-us-e1.wpmucdn.com/sites.northwestern.edu/dist/5/1812/files/2016/05/13icra_derry.pdf. |
U.S. Appl. No. 16/035,205, filed Jul. 13, 2018. |
U.S. Appl. No. 15/600,703, filed May 20, 2017. |
PCT/US17/33705, May 20, 2017. |
PCT/US2018/042114, Jul. 13, 2018. |
U.S. Appl. No. 15/486,980, filed Apr. 13, 2017. |
PCT/US17/27410, Apr. 13, 2017. |
U.S. Appl. No. 15/441,190, filed Feb. 23, 2017. |
PCT/US17/19214, Feb. 23, 2017. |
U.S. Appl. No. 16/200,088, filed Nov. 26, 2018. |
Adhikari, B., A Single Subject Participatory Action Design Method for Powered Wheelchairs Providing Automated Back-in Parking Assistance to Cognitively Impaired Older Adults: A pilot study, Department of Computer Science, The University of British Columbia, Vancouver, Canada, Jan. 5, 2015, slide deck. |
Adhikari, B., A Single Subject Participatory Action Design Method for Powered Wheelchairs Providing Automated Back-in Parking Assistance to Cognitively Impaired Older Adults: A pilot study, Master's Thesis, Department of Computer Science, The University of British Columbia, Vancouver, Canada, Dec. 2014. |
Brown, Jr. et al., “A Single-Wheel, Gyroscopically Stabilized Robot,” IEEE Robotics & Automation Magazine, Sep. 1997. |
“BTCR9 FanSync Bluetooth . . . ” Fanimation, published Feb. 4, 2017 (Retrieved from the Internet Sep. 27, 2019). Internet URL: https://web.archive.org/web/20170204193258/https://www.fanimation.com/products/index.php/controls-remotes/fansync-bluetooth-receiver-transmitter-downlight.html (Year: 2017). |
Cho et al., Sloped Terrain Segmentation for Autonomous Drive Using Sparse 3D Point Cloud, The Scientific World Journal, 2014, https://www.hindawi.com/journals/tswj/2014/582753/. |
Cooper, Rory A., “Intelligent Control of Power Wheelchairs”, IEEE Engineering in Medicine and Biology Magazine, IEEE Service Center, Piscataway, NJ, US, vol. 14, No. 4, Jul. 1, 1995, pp. 423-431, XP11084628. |
Dejun Yin and Yoichi Hori, “A Novel Traction Control for Electric Vehicle without Chassis Velocity”, Motion Control, Federico Casolo (Ed.), InTech, DOI: 10.5772/6962. Available from: https://mts.intechopen.com/books/motion-control/a-novel-traction-control-for-electric-vehicle-without-chassis-velocity, 2010. |
Elnagar, A., “Prediction of Moving Objects in Dynamic Environments Using Kalman Filters,” Proceedings of 2001 IEEE International Symposium on Computational Intelligence in Robotics and Automation, Jul. 29-Aug. 1, 2001. |
Fresk, et al., “Full Quaternion Based Attitude Control for a Quadrotor”, 2013 European Control Conference (ECC), Jul. 17-19, 2013, Zurich, Switzerland, pp. 3864-3869. |
Grasser, F. et al., “JOE: A Mobile, Inverted Pendulum,” IEEE Transactions on Industrial Electronics, vol. 49, No. 1, Feb. 2002. |
Ha, et al. “Trajectory Tracking Control for Navigation of Self-Contained Mobile Inverse Pendulum” Intelligent Robots and Systems '94. ‘Advanced Robotic Systems and the Real World’, IROS '94. Proceedings of the IEEE/RSJ/GI International Conference on, vol. 3, no., pp. 1875-1882, Sep. 12-16, 1994. |
Ha, et al., “Trajectory Tracking Control for Navigation of the Inverse Pendulum Type Self-Contained Mobile Robot” Robotics and Autonomous Systems 17, 65-80 (1996). |
Helgesson, L., “Pitch and roll estimating Kalman filter for stabilizing quadrocopters”, http://helge.se/2012/04/pitch-and-roll-estimating-kalman-filter-for-stabilizing-quadrocopters/, Oct. 15, 2012. |
How et al., “Clinical Evaluation of the Intelligent Wheelchair System”, Proceedings of Festival of international Conference on Caregiving, Disability, Aging and Technology, Toronto, Canada, 2011. |
I-Real, Personal Mobility Device, https://www.youtube.com/watch?v=WAGpxIUpdWw, Published on Jan. 15, 2013, appeared first in Apr. 2012, D1 Grand Prix event, Odaiba, JP. |
Ishida and Miyamoto, “Collision-Detecting Device for Omnidirectional Electric Wheelchair”, Research Article, ISRN Robotics, vol. 2013, Article ID 672826, Nov. 1, 2012. |
I-swing, Single Person Vehicle, https://www.youtube.com/watch?feature=player_embedded&v=1QSybf7sLtg, Published on Sep. 14, 2006, Featured on Hacked Gadgets, http://hackedgadgets.com. |
I-Unit, Wheelchair, https://www.youtube.com/watch?v=RbrrIrh3GBE, Published on Jun. 6, 2006, Filmed at the Megaweb Center in Tokyo. |
Johnson, R.C., “Unicycles and Bifurcations”, American J. of Physics, vol. 66, No. 7, 589-92 (Oct. 22, 2002). |
Kanoh, “Applied Control of Inverted Pendulum”, Computrol, vol. 2, (1983), pp. 69-75. |
Kawaji, S., “Stabilization of Unicycle Using Spinning Motion”, Denki Gakkai Ronbushi, D, vol. 107, Issue 1, Japan (1987), pp. 21-28. |
Koyanagi et al., “A Wheeled Inverse Pendulum Type Self-Contained Mobile Robot”, The Society of Instrument and Control Engineers, Special issue of the 31st SICE Annual Conference, Japan 1992, pp. 51-56. |
Koyanagi et al., “A Wheeled Inverse Pendulum Type Self-Contained Mobile Robot and its Two Dimensional Trajectory Control”, Proceeding of the Second International Symposium on Measurement and Control in Robotics, Japan 1992, pp. 891-897. |
Koyanagi et al., “A Wheeled Inverse Pendulum Type Self-Contained Mobile Robot and its Posture Control and Vehicle Control”, The Society of Instrument and Control Engineers, Special issue of the 31st SICE Annual Conference, Japan, 1992, pp. 13-16. |
Lam, H. K. et al., “Fuzzy Model Reference Control of Wheeled Mobile Robots,” The 27th Annual Conference of the IEEE Industrial Electronics Society (2001). |
Liu, H.S. et al., “Accelerometer for Mobile Robot Positioning,” IEEE Transactions on Industry Applications, vol. No. 3, Oct. 1999. |
Meeussen et al., Autonomous Door Opening and Plugging In with a Personal Robot, Willow Garage, USA, IEEE International Conference on Robotics and Automation, May 3-7, 2010, http://www.willowgarage.com/sites/default/files/m2.pdf. |
Momoi & Yamafuji, “Motion Control of the Parallel Bicycle-Type Mobile Robot Composed of a Triple Inverted Pendulum”, Paper Read at Meeting of Japan Society of Mechanical Engineering (Series C), vol. 57, No. 541, (Sep. 1991), pp. 154-159. |
Montella, C., et al., “To the Bookstore! Autonomous Wheelchair Navigation in an Urban Environment”, Lehigh University, published in FSR, 2012, Part of the Springer Tracts in Advanced Robotics book series (STAR, vol. 92), first online Dec. 31, 2013. |
News article, “Amazing Wheelchair Goes Up and Down Stairs”. |
Oishi et al., “Building A Smart Wheelchair On A Flexible Software Platform”, RESNA International Conference on Technology and Aging, 2011. |
Osaka et al., “Stabilization of unicycle”, Systems and Control, vol. 25, No. 3, Japan Mar. 1981, pp. 159-166. |
PCT/US2017/019214, Written Opinion of the International Searching Authority, dated Aug. 31, 2017. |
PCT/US2017/027410, Written Opinion of the International Searching Authority, dated Dec. 4, 2017. |
PCT/US2017/033705, Written Opinion of the International Searching Authority, dated Nov. 23, 2017. |
PCT/US2017/033705, Invitation to pay additional fees and partial search report, Int. App. #PCT/US2017/033705, Intl. filing date May 20, 2017. |
Roy et al., “Five-Wheel Unicycle System”, Medical & Biological Engineering & Computing, vol. 23, No. 6, United Kingdom Nov. 1985, pp. 593-596. Entire document can be purchased via: https://link.springer.com/article/10.1007%2FBF02455316. |
Sabatini, A., “Quaternion-based Extended Kalman Filter for Determining Orientation by Inertial and Magnetic Sensing”, IEEE Transactions on Biomedical Engineering, vol. 53:7, Jul. 2006, pp. 1346-1356. |
Schoonwinkel, A., “Design and Test of a Computer-Stabilized Unicycle”, Stanford University (1988), UMI Dissertation Services, Dissertation Abstracts International, vol. 49/03-B, Stanford University 1987, pp. 890-1294. |
Bob_Schor. “Re: Cannot get latch mechanical action on Boolean button . . . ” NI Community, published Jun. 2, 2018 (Retrieved from the Internet Sep. 26, 2019). Internet URL: https://forums.ni.com/t5/LabVIEW/Cannot-get-latch-mechanical-action-on-boolean-button-inside-a/td-p/3799821?profile.language=en (Year: 2018). |
Sheng et al., “Postural Stability of a Human Riding a Unicycle and Its Emulation by a Robot,” IEEE Transactions on Robotics and Automation, vol. 13:5, Oct. 1997. |
Sheng, Zaiquan; Yamafuji, Kazuo: “Realization of a Human Riding a Unicycle by a Robot”. Proceedings of the 1995 IEEE International Conference on Robotics and Automation, vol. 2, 1995, pp. 1319-1326. |
Stew's Hovercraft Page, http://www.stewcam.com/hover-craft.html. |
Takahashi et al., “Back and Forward Moving Scheme of Front Wheel Raising for Inverse Pendulum Control Wheel Chair Robot”, Proceedings of the 2001 IEEE International Conference of Robotics & Automation, Seoul, Korea, May 21-26, 2001, pp. 3189-3194. |
Takahashi et al., “Front Wheel Raising and Inverse Pendulum Control of Power Assist Wheel Chair Robot”, IEEE, 1999, pp. 668-673. |
Tanaka et al., “A Mobile Robot for Service Use: Behaviour Simulation System and Intelligent Control,” Proceedings of the 1997 IEEE/RSJ International Conference on Intelligent Robots and Systems, 1997. |
TECKNICO'S Home Page, “Those Amazing Flying Machines”, http://www.swiftsite.com/technico, May 24, 1999. |
Ulyanov et al., “Fuzzy Intelligent Emotion and Instinct Control of a Robotic Unicycle,” Proceedings of the 1996 4th International Workshop on Advanced Motion Control, Mar. 18-21, 1996. |
Ulyanov et al., “Soft computing for the intelligent robust control of a robotic unicycle with a new physical measure for mechanical controllability”. Soft Computing vol. 2:2, Jun. 1998, pp. 73-88. |
Umpad, Leomar. “How Do I Use My Samsung Galaxy Device as a TV Remote Control?” Tech Recipes, published Nov. 27, 2014 (Retrieved from the Internet Sep. 27, 2019). Internet URL: <https://www.tech-recipes.com/rx/51556/how-do-i-use-my-samsung-galaxy-device-as-a-tv-remote-control/> (Year: 2014). |
Viswanathan et al., “Navigation Assistance for Intelligent Wheelchairs”, 3rd International Conference on Technology and Aging/RESNA, Toronto, 2011. |
Vos et al., “Dynamics and Nonlinear Adaptive Control of an Autonomous Unicycle—Theory and Experiment”, American Institute of Aeronautics and Astronautics, A90-26772 10-39, Washington, D.C. 1990, Abstract only. |
Vos, D., Dynamics and Nonlinear Adaptive Control of an Autonomous Unicycle, Massachusetts Institute of Technology, Jun. 7, 1989. |
Vos, D., “Nonlinear Control of an Autonomous Unicycle Robot: Practical Issues”, Massachusetts Institute of Technology, Jun. 5, 1992. |
Wang et al., “Real-time Model-based Electrical Powered Wheelchair Control”, Med Eng Phys. Dec. 2009: 31(10): 1244-1254. |
Watson Industries, Inc., “Single Axis Vertical Reference System Owner's Manual ADS-C132-1A”, Apr. 20, 2015, pp. 3-4. |
Welch et al., “An Introduction to the Kalman Filter,” SIGGRAPH 2001, Department of Computer Science, University of North Carolina at Chapel Hill, http://www.cs.unc.edu/~welch, 2001. |
WO 2000/073101, IPER of the International Searching Authority, filing date Mar. 14, 2000. |
WO 2000/075001, IPER of the International Searching Authority, filing date Jun. 1, 2000. |
WO 2002/030730, IPER of the International Searching Authority, filing date Oct. 11, 2001. |
WO2004/007264, Initial Publication with ISR, International Publication Date Jan. 22, 2004. |
WO 2017/147347, Written Opinion of the International Searching Authority, Int. App. #PCT/US2017/019214, priority date Feb. 23, 2016. |
WO 2017/201513, Invitation to pay additional fees and partial search report, Int. App. #PCT/US2017/033705, Intl. filing date May 20, 2017. |
WO 2017/201513, Written Opinion of the International Searching Authority, Int. App. #PCT/US2017/033705, Intl. filing date May 20, 2017. |
Wolstenholme, Kevin. “Updating Glide - The Full Breakdown.” RisingHigh Academy, published Aug. 26, 2017 (Retrieved from the Internet Sep. 26, 2019). Internet URL: https://risinghighacademy.com/category/games/ (Year: 2017). |
Yamafuji & Kawamura, “Study on the Postural and Driving Control of Coaxial Bicycle”, Paper Read at Meeting of Japan Society of Mechanical Engineering (Series C), vol. 54, No. 501, (May 1988), pp. 1114-1121, Abstract in English. |
Yamafuji & Kawamura, “Study of Postural and Driving Control of Coaxial Bicycle”, Papers Read at Meeting of Japan Society of Mechanical Engineering (vol. C), vol. 54, No. 501 (May 1988), Paper No. 87-0901A. |
Yamafuji et al., “Synchronization and Steering Control of Parallel Bicycle”, Paper Read at Meeting of Japan Society of Mechanical Engineering (Series C), vol. 55, No. 513, (May 1989), pp. 1229-1234. |
Yamafuji, “A Proposal for Modular-Structured Mobile Robots for Work that Principally Involve a Vehicle with Two Parallel Wheels”, Automation Technology, vol. 20, pp. 113-118 (1988). |
Yun et al., “Implementation and Experimental Results of a Quaternion-Based Kalman Filter for Human Body Motion Tracking”, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, Apr. 2005, pp. 317-322. |
Yun et al., “Design, Implementation and Experimental Results of a Quaternion-Based Kalman Filter for Human Body Motion Tracking”, IEEE Transactions on Robotics, vol. 22, No. 6, Dec. 2006, pp. 1216-1227. |
Zenkov, DV, AM Bloch, and JE Marsden [2001] “The Lyapunov-Malkin Theorem and Stabilization of the Unicycle with Rider”. Systems and Control Letters, vol. 45, No. 4, Apr. 5, 2002, pp. 293-302(10). |
Zenkov, DV, AM Bloch, NE Leonard and JE Marsden, “Matching and Stabilization of Low-Dimensional Nonholonomic Systems”. Proc. CDC, 39, (2000), 1289-1295. |
Number | Date | Country | |
---|---|---|---|
20190377349 A1 | Dec 2019 | US |
Number | Date | Country | |
---|---|---|---|
62682129 | Jun 2018 | US |