The present inventive concepts relate to the field of vehicles, and more particularly, although not exclusively, to self-driving vehicles suitable as personal-use vehicles.
In accordance with one aspect of the inventive concepts, provided is a method of modeling behavior for use by a self-driving vehicle. The self-driving vehicle may take the form of a personal use or companion vehicle that cooperatively engages with at least one human user. As a companion vehicle, the self-driving vehicle may take the form of a “follower” vehicle, which can be a self-driving vehicle structured and arranged to cooperatively operate with a human user by implementing one or more behavior models in response to sensed conditions from the environment, stored data, and/or actions of the human user.
Generally, the follower vehicle is responsive to the leader and the leader's actions. The leader can be a human in some embodiments, but in other embodiments the leader could be another vehicle. The follower vehicle need not always physically lag behind a leader, but it may do so a predominant amount of the time. In some embodiments, the follower vehicle may precede a leader in a specific instance in accordance with an applicable behavior model, for example when encountering a specific structural element accounted for in a behavior model.
The behavior models may take the form of a set of stored computer instructions executable by at least one processor configured to control aspects of the self-driving follower vehicle. That is, executing the behavior model may control the drive, acceleration, deceleration, turning, pausing, and/or stopping equipment and functions of the self-driving follower vehicle. Other functionality of the follower vehicle may be controlled by execution of the behavior model. Execution of the behavior models therefore improves the overall operation of the follower vehicle, e.g., making the follower vehicle more efficient, safe, responsive, and/or user-friendly, while also making the follower vehicle less of a distraction and obstacle to human users and others.
In various embodiments, the method includes monitoring, tracking, and measuring leader interactions and actions with at least one other entity to determine a behavioral model of appropriate and cooperative behavior of a follower vehicle, where the other entity can be at least one structural element, human, and/or other vehicle. The method can include electronically storing the behavioral model, and can also include executing the behavioral model to cause the self-driving vehicle to cooperatively navigate an environment as a follower vehicle.
In accordance with one aspect of the inventive concepts, provided is a method of modeling behavior for a self-driving vehicle. The method includes tracking and measuring leader-follower interactions and actions with at least one structural element of an environment, including leader behaviors and follower behaviors, representing the leader behaviors and the follower behaviors in a behavior model, and electronically storing the behavioral model. The behavioral model is executable by the self-driving vehicle to cooperatively navigate the at least one structural element as a follower vehicle.
In various embodiments, the method includes establishing a measurable interaction environment model for the at least one structural element.
In various embodiments, the method includes tracking person, vehicle, and object movements with respect to the at least one structural element.
In various embodiments, the method includes providing an electronic format for describing stops, starts, pauses, movements, and behaviors of the self-driving vehicle/follower with respect to the at least one structural element and the leader.
In various embodiments, the method includes measuring and modeling the at least one structural element, follower vehicle, and/or human movement using one or more of film industry motion capture tools, stop action photography, filming and measuring vehicle and/or human movements, and/or motion sensors on vehicles and people.
In various embodiments, the at least one structural element comprises at least one right swing-in door.
In various embodiments, the at least one structural element comprises at least one left swing-in door.
In various embodiments, the at least one structural element comprises at least one right swing-out door.
In various embodiments, the at least one structural element comprises at least one left swing-out door.
In various embodiments, the at least one structural element comprises at least one sliding door.
In various embodiments, the at least one structural element comprises a plurality of right swing-in doors, a plurality of left swing-in doors, a plurality of right swing-out doors, a plurality of left swing-out doors, or a plurality of sliding doors.
In various embodiments, the vehicle is a mobile carrier comprising at least one storage compartment.
In various embodiments, the at least one structural element includes a door and the behavior model comprises: the vehicle following the leader, in response to the leader pausing at the door, the vehicle pausing and waiting for the door to open, in response to the door opening, the vehicle proceeding through the door and waiting for the leader, in response to the leader proceeding through the door, the vehicle resuming tracking of the leader, and the vehicle resuming following the leader.
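Purely by way of a non-limiting illustration, a behavior model of this kind could be encoded in software as an ordered sequence of leader-event/follower-action pairs. The sketch below uses hypothetical identifiers and labels (e.g., DOOR_MODEL and the event/action strings) that are not part of any particular embodiment:

```python
# Hypothetical encoding of the single-door behavior model described above, as an
# ordered list of (leader event, follower vehicle action) pairs.
DOOR_MODEL = [
    ("leader_pauses_at_door",        "pause_and_wait_for_door_to_open"),
    ("door_opens",                   "proceed_through_door_and_wait_for_leader"),
    ("leader_proceeds_through_door", "resume_tracking_leader"),
    ("leader_continues_on_path",     "resume_following_leader"),
]
```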
In various embodiments, the vehicle proceeding through the door and waiting for the leader includes the vehicle moving aside and off a centerline of the door to wait for the leader.
In various embodiments, the at least one structural element includes a first door and a second door and the behavior model comprises: the vehicle following the leader, in response to the leader pausing at the first door, the vehicle pausing and waiting for the first door to open, in response to the first door opening, the vehicle proceeding through the first door and waiting for the leader, in response to the second door opening, the vehicle proceeding through the second door and waiting for the leader, in response to the leader proceeding through the second door, the vehicle resuming tracking of the leader, and the vehicle resuming following the leader.
In various embodiments, the vehicle proceeding through the first and/or the second door and waiting for the leader includes the vehicle moving aside and off a centerline of the first and/or second door to wait for the leader.
In various embodiments, the at least one structural element includes an elevator and the behavior model comprises: the vehicle following the leader, in response to the leader pausing at an elevator door, the vehicle pausing and waiting for the door to open, in response to the door opening, the vehicle proceeding through the door and moving to the back of the elevator, in response to the leader proceeding through the door into the elevator, the vehicle resuming tracking of the leader, and the vehicle resuming following the leader out of the elevator.
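As with the door example above, the elevator interaction could be sketched, again purely illustratively and with hypothetical labels, as another stored step sequence of the same form:

```python
# Hypothetical step sequence for the elevator behavior model described above.
ELEVATOR_MODEL = [
    ("leader_pauses_at_elevator_door", "pause_and_wait_for_door_to_open"),
    ("elevator_door_opens",            "proceed_through_door_and_move_to_back_of_elevator"),
    ("leader_enters_elevator",         "resume_tracking_leader"),
    ("leader_exits_elevator",          "resume_following_leader_out_of_elevator"),
]
```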
In accordance with another aspect of the inventive concepts, provided is a follower vehicle. The follower vehicle includes a body, a drive system configured to navigate the body to follow a leader, a computer processor and a computer storage device, and a behavior model executable by the processor to cooperatively navigate at least one structural element with the leader.
In various embodiments, the at least one structural element comprises at least one right swing-in door.
In various embodiments, the at least one structural element comprises at least one left swing-in door.
In various embodiments, the at least one structural element comprises at least one right swing-out door.
In various embodiments, the at least one structural element comprises at least one left swing-out door.
In various embodiments, the at least one structural element comprises at least one sliding door.
In various embodiments, the at least one structural element comprises: a plurality of right swing-in doors, a plurality of left swing-in doors, a plurality of right swing-out doors, a plurality of left swing-out doors, or a plurality of sliding doors.
In various embodiments, the vehicle is a mobile carrier comprising at least one storage compartment.
In various embodiments, the at least one structural element includes a door and the behavior model is executable to cause the vehicle to: follow the leader, detect the leader paused at the door and in response pause and wait for the door to open, in response to the door opening, proceed through the door and wait for the leader, in response to the leader proceeding through the door, resume tracking the leader, and resume following the leader.
In various embodiments, the behavior model is executable by the processor to cause the vehicle to move aside and off a centerline of the door to wait for the leader.
In various embodiments, the at least one structural element includes a first door and a second door and the behavior model is executable to cause the vehicle to: follow the leader, detect the leader paused at the first door and in response pause and wait for the first door to open, in response to the first door opening, proceed through the first door and wait for the leader, in response to the second door opening, proceed through the second door and wait for the leader, in response to the leader proceeding through the second door, resume tracking the leader, and resume following the leader.
In various embodiments, the behavior model is executable by the processor to cause the vehicle to move aside and off a centerline of the first and/or second door to wait for the leader.
In various embodiments, the at least one structural element includes an elevator and the behavior model is executable to cause the vehicle to: follow the leader, detect the leader paused at an elevator door and in response pause and wait for the door to open, in response to the door opening, proceed through the door and move to the back of the elevator, in response to the leader proceeding through the door into the elevator, resume tracking the leader, and resume following the leader out of the elevator.
In accordance with another aspect of the inventive concepts, provided is a method of navigating a follower vehicle through at least one structural element. The method includes, using at least one processor, executing a behavior model to cause the follower vehicle to navigate the at least one structural element with a leader. The behavior model embodies leader behaviors and follower behaviors defining actions of the follower vehicle in response to actions of the leader related to the at least one structural element.
In various embodiments, the at least one structural element includes a door and navigating the follower vehicle by executing the behavior model comprises: the vehicle following the leader, in response to the leader pausing at the door, the vehicle pausing and waiting for the door to open, in response to the door opening, the vehicle proceeding through the door and waiting for the leader, in response to the leader proceeding through the door, the vehicle resuming tracking of the leader, and the vehicle resuming following the leader.
In various embodiments, the vehicle proceeding through the door and waiting for the leader includes the vehicle moving aside and off a centerline of the door to wait for the leader.
In various embodiments, the at least one structural element includes a first door and a second door and navigating the follower vehicle by executing the behavior model comprises: the vehicle following the leader, in response to the leader pausing at the first door, the vehicle pausing and waiting for the first door to open, in response to the first door opening, the vehicle proceeding through the first door and waiting for the leader, in response to the second door opening, the vehicle proceeding through the second door and waiting for the leader, in response to the leader proceeding through the second door, the vehicle resuming tracking of the leader, and the vehicle resuming following the leader.
In various embodiments, the vehicle proceeding through the first and/or the second door and waiting for the leader includes the vehicle moving aside and off a centerline of the first and/or second door to wait for the leader.
In various embodiments, the at least one structural element includes an elevator and navigating the follower vehicle by executing the behavior model comprises: the vehicle following the leader, in response to the leader pausing at an elevator door, the vehicle pausing and waiting for the door to open, in response to the door opening, the vehicle proceeding through the door and moving to the back of the elevator, in response to the leader proceeding through the door into the elevator, the vehicle resuming tracking of the leader, and the vehicle resuming following the leader out of the elevator.
In accordance with another aspect of the inventive concepts, provided is a method of generating a behavior model for navigating a follower vehicle through at least one structural element, as shown and described.
In accordance with another aspect of the inventive concepts, provided is a follower vehicle configured to use a behavior model to navigate through at least one structural element, as shown and described.
In accordance with another aspect of the inventive concepts, provided is a method of navigating a follower vehicle through at least one structural element using a behavior model, as shown and described.
The present invention will become more apparent in view of the attached drawings and accompanying detailed description. The embodiments depicted therein are provided by way of example, not by way of limitation, wherein like reference numerals refer to the same or similar elements. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating aspects of the invention. In the drawings:
Various aspects of the inventive concepts will be described more fully hereinafter with reference to the accompanying drawings, in which some exemplary embodiments are shown. The present inventive concept may, however, be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are used to distinguish one element from another, but not to imply a required sequence of elements. For example, a first element can be termed a second element, and, similarly, a second element can be termed a first element, without departing from the scope of the present invention. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
It will be understood that when an element is referred to as being “on” or “connected” or “coupled” to another element, it can be directly on or connected or coupled to the other element or intervening elements can be present. In contrast, when an element is referred to as being “directly on” or “directly connected” or “directly coupled” to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., “between” versus “directly between,” “adjacent” versus “directly adjacent,” etc.).
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes” and/or “including,” when used herein, specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.
Spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper” and the like may be used to describe an element and/or feature's relationship to another element(s) and/or feature(s) as, for example, illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use and/or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” and/or “beneath” other elements or features would then be oriented “above” the other elements or features. The device may be otherwise oriented (e.g., rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
To the extent that functional features, operations, and/or steps are described herein, or otherwise understood to be included within various embodiments of the inventive concept, such functional features, operations, and/or steps can be embodied in functional blocks, units, modules, operations and/or methods. And to the extent that such functional blocks, units, modules, operations and/or methods include computer program code, such computer program code can be stored in a computer readable medium, e.g., such as non-transitory memory and media, that is executable by at least one computer processor.
In accordance with one aspect of the inventive concepts, provided is a method of modeling behavior for use by a self-driving vehicle. The self-driving vehicle may take the form of a personal use or companion vehicle that cooperatively engages with at least one human user. As a companion vehicle, the self-driving vehicle may take the form of a “follower” vehicle, which can be a self-driving vehicle structured and arranged to cooperatively operate with a human user by implementing one or more behavior models in response to sensed conditions from the environment, stored data, and/or actions of the human user.
Generally, the follower vehicle is responsive to the leader and the leader's actions. The leader can be a human in some embodiments, but in other embodiments the leader could be another vehicle. The follower vehicle need not always physically lag behind a leader, but it may for a predominant amount of its operation. In some embodiments, the follower vehicle may precede a leader in a specific instance in accordance with an applicable behavior model, for example when encountering a specific structural element accounted for in a behavior model.
The behavior models may take the form of a set of stored computer instructions and/or code executable by at least one processor configured to control aspects of the self-driving follower vehicle. All or part of the computer instructions and/or code can be stored locally on the vehicle or remotely. Executing the behavior model may control the drive, acceleration, deceleration, turning, pausing, and/or stopping equipment and functions of the self-driving follower vehicle. Other functionality of the follower vehicle may also be controlled by execution of the behavior model. Execution of the behavior models therefore improves the overall operation of the follower vehicle, e.g., making the follower vehicle more efficient, safe, responsive, and/or user-friendly, while also making the follower vehicle less of a distraction and obstacle to human users and others.
As a follower vehicle, the self-driving vehicle can be configured to follow a leader, such as a human leader. In various embodiments, the follower vehicle may also be a mobile carrier vehicle (or “mobile carrier”) configured as a companion to a human. In various embodiments, a mobile carrier vehicle is a vehicle that includes structural and functional elements that define at least one volume useful for carrying goods. In various embodiments, the one or more volumes can be configured to receive functional systems or subsystems that can interface with power and/or control ports of the self-driving vehicle, for use by the leader and/or the self-driving vehicle. However, a self-driving follower vehicle is not limited to mobile carriers, and could, for example, take other forms of personal use and/or companion vehicles.
As a follower vehicle, the mobile carrier is generally configured to follow a human leader. As will be apparent from the various embodiments described herein, however, the follower vehicle can precede the human in some instances, at least for a portion of a path taken by the human. Such instances can include, but are not limited to, passage through various types of doorways, vestibules, passageways, and/or other structural elements, such as elevators. The follower vehicle does not mimic human behavior, but rather is responsive to human behavior and structural elements to implement a behavior that is different from, but cooperative with, the human's behavior to safely and efficiently navigate an encountered structural element with the human leader.
The structural elements can form part of an environment, where the environment can be indoor, outdoor, a transition from indoor to outdoor, and/or a transition from outdoor to indoor. The environment can be or include a building or an outdoor area, as examples.
Traditional robotic vehicles are designed either to follow pre-defined routes designed as part of the infrastructure or, in the case of self-driving navigation, to follow the most efficient, or shortest, route for that segment of the activity. This requires any nearby pedestrians to cede the path to the vehicle and results in unnatural behaviors by the pedestrians. Such traditional robotic vehicles are not follower vehicles, and so have no particular need to exhibit behaviors that are cooperative with a human, as a companion to the human.
In various embodiments, a self-driving companion vehicle, such as a follower vehicle, is configured to execute certain behaviors in cooperation with a human—where the follower vehicle is a companion to the human. In various embodiments, the follower vehicle behaviors are implemented as execution of a set of instructions based on one or more behavior models. The behavior models define follower vehicle stops, starts, pauses, and movements based on the structural element encountered and the human leader's stops, starts, pauses, and movements. As a result, the follower vehicle is configured to be responsive to the structural element encountered and the human leader's actions, such as stops, starts, pauses, and movements, to navigate the structural element in cooperation with the human leader. Therefore, preferably, the follower vehicle is configured to recognize the structural element, as well as the human's actions.
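One possible, non-limiting way to picture this arrangement is a registry that keys a stored behavior model to the recognized structural element; the identifiers below (BEHAVIOR_MODELS, select_behavior_model) are hypothetical and shown only to sketch the selection logic:

```python
# Hypothetical registry mapping a recognized structural element to the name of a
# stored behavior model (e.g., a step sequence such as those sketched above).
BEHAVIOR_MODELS = {
    "right_swing_in_door": "door_right_swing_in",
    "left_swing_out_door": "door_left_swing_out",
    "vestibule":           "vestibule_double_door",
    "elevator":            "elevator",
}

def select_behavior_model(structural_element: str) -> str:
    """Return the stored model keyed by the sensed structural element."""
    # Fall back to plain leader-following when no element-specific model exists.
    return BEHAVIOR_MODELS.get(structural_element, "default_follow")
```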
The behavioral models can be obtained and developed in a variety of manners. In some embodiments, a system and method are provided that monitor and track pedestrian motions, movements, and activities during specific environment interactions, along with motions and movements performed by trained followers, to determine the appropriate stops, starts, routes, paths, and movements to be taken by a follower vehicle to make the interaction between a pedestrian/human leader and a follower vehicle effective with minimal or no obstruction to the leader and others. The interactions of the human leaders and followers when encountering structural elements are recorded and represented in computer or electronic behavior models that can be embodied in electronic instructions, such as computer program code executable by a processor of a self-driving companion vehicle, such as a follower vehicle.
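A minimal sketch of this derivation step is shown below; the record format and helper names (InteractionRecord, derive_steps) are assumptions used only to illustrate how time-stamped leader and trained-follower events might be reduced to the paired steps of a behavior model:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class InteractionRecord:
    """One time-stamped event from a recorded leader/trained-follower session."""
    t: float     # seconds from the start of the recording
    actor: str   # "leader" or "follower"
    event: str   # e.g., "pause_at_door", "open_door", "pass_through"

def derive_steps(records: List[InteractionRecord]) -> List[Tuple[str, str]]:
    """Pair each observed leader event with the next trained-follower response."""
    steps: List[Tuple[str, str]] = []
    pending_leader_event = None
    for rec in sorted(records, key=lambda r: r.t):
        if rec.actor == "leader":
            pending_leader_event = rec.event
        elif rec.actor == "follower" and pending_leader_event is not None:
            steps.append((pending_leader_event, rec.event))
            pending_leader_event = None
    return steps
```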
To operate as a follower vehicle, the self-driving vehicle preferably acquires the leader, e.g., by sensing the presence of the leader. Acquiring the leader may include recognition of the leader by one or more of various sensing and/or input mechanisms, such as various types of biometric sensors and/or input devices. The behavior models implemented by the self-driving vehicle, as a follower vehicle, take into consideration that the self-driving vehicle is a follower, and so must take certain actions to wait for (pause) and reacquire the leader.
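The pause-and-reacquire behavior could be sketched, for illustration only, as a simple polling loop; sense_leader() below is a hypothetical stand-in for whatever biometric or other sensing/input mechanism the vehicle actually uses:

```python
import time

def reacquire_leader(sense_leader, timeout_s: float = 30.0, poll_s: float = 0.2):
    """Pause until the leader is sensed again, or give up after timeout_s seconds.

    sense_leader() returns a leader track (any non-None object) when the leader
    is recognized, or None otherwise.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        leader = sense_leader()
        if leader is not None:
            return leader      # leader reacquired; following can resume
        time.sleep(poll_s)     # remain paused while waiting for the leader
    return None                # leader not found within the timeout
```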
In accordance with aspects of the inventive concepts, robotic self-driving vehicles are configured to maximize efficiency by implementing behavior models at least partially, if not completely, defining how the self-driving vehicle responds to structural elements and a human leader's stops, starts, pauses, and movements. For example, in one embodiment, a self-driving vehicle coming to a closed doorway would stop at the entrance to the doorway, wait for the door to open, and then proceed directly through the door, wait for the human leader, re-acquire the human leader, and then continue with the human leader to a destination.
The prior art has had no need to determine what actions a self-driving companion vehicle would take to pass and “re-acquire” a human leader. The present invention makes for a “seamless” and natural interaction between the self-driving vehicle and a human leader, particularly when navigating various structural elements of an environment. Modeling these behavioral interactions requires studying and considering leader-follower actions, not just analyzing path geometries and then designing path segments into the self-driving vehicle; that alone would not provide a self-driving companion vehicle capable of more natural interaction with a leader encountering various types of structural elements.
In one embodiment, a method of determining and modeling desired follower vehicle behavior and movement includes: establishing a measurable interaction environment model for the structural elements of interest; tracking person, vehicle, and object movements with respect to those structural elements; providing an electronic format for describing the stops, starts, pauses, movements, and behaviors of the follower vehicle with respect to the structural elements and the leader; and measuring and modeling the structural elements and vehicle and/or human movement.
Measuring and modeling the structural elements and vehicle and/or human (e.g., leader) movement can include one or more of: film industry motion capture tools, stop action photography, filming and measuring vehicle and/or human movements, and/or motion sensors on vehicles and people.
Data acquired from the above methods can be organized and stored in electronic form, and then processed into self-driving companion vehicle behavior models in an electronic format. The behavior model can comprise a set of rules implemented by a follower vehicle in combination with sensed inputs to cause the follower vehicle to function in new ways. A different behavior model can be defined for specific structures or situations encountered. Each behavior model can include a set of leader steps and a corresponding set of follower vehicle steps. In order to proceed from one step to the next, the follower vehicle preferably senses or otherwise determines that the leader has accomplished its step. The leader and the follower work through a series of steps until the two negotiate the encountered structural element. The various approaches to sensing and measuring discussed above, and the data obtained therefrom, are computer processed to generate the steps of the behavior models.
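The step-by-step negotiation described above can be pictured, again as a non-limiting sketch with hypothetical helper names, as a loop that refuses to advance the follower vehicle to its next step until the leader's corresponding step has been sensed as complete:

```python
def execute_model(model, leader_step_complete, perform):
    """Walk a behavior model given as ordered (leader_event, follower_action) pairs.

    leader_step_complete(event) -> bool   # sensing that the leader finished the step
    perform(action)                       # issue the follower vehicle's responsive action
    Both callables are stand-ins for the vehicle's sensing and drive interfaces.
    """
    for leader_event, follower_action in model:
        # Do not advance until the leader's step is sensed as accomplished.
        while not leader_step_complete(leader_event):
            pass  # a real controller would sleep, yield, or handle timeouts here
        perform(follower_action)
```

In practice such a loop would also incorporate the timeout handling and leader re-acquisition discussed above.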
In various embodiments, a self-driving companion vehicle in accordance with aspects of the inventive concept can take the form of the GITA™ mobile-carrier vehicle by Piaggio Fast Forward of Boston, Mass. (GITA™ is a trademark of Piaggio Fast Forward of Boston, Mass.). Such a mobile carrier vehicle can be used to develop and use the behavior models described herein.
The follower vehicle can include a body, drive system, at least one processor, at least one memory, and at least one sensor that cooperatively enable the follower vehicle to sense and follow a leader, e.g., a human leader. The at least one sensor can also be configured to sense structural elements, such as doors, walls, and other objects in an environment. The processor can control the drive system to cause the follower vehicle to stop, accelerate, decelerate, pause, start, and turn as it follows and interacts with a leader. The processor can control behavior of the follower vehicle based, at least in part, on one or more of the behavior models, e.g., comprising a plurality of stored rules, and sensor inputs. The body can define at least one volume, cavity, or compartment configured to store a variety of types of items. The compartment can include one or more of thermal, vibration, and shock insulation. The compartment can be air and/or water tight, in some embodiments.
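Purely as a structural sketch, and with class and attribute names that are assumptions rather than part of the disclosure, these components might be organized as follows:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

@dataclass
class FollowerVehicle:
    """Minimal sketch of the follower vehicle components named above."""
    drive: Callable[[str], None]                                        # drive-system command interface
    sensors: List[Callable[[], dict]] = field(default_factory=list)     # leader/structure sensing inputs
    behavior_models: Dict[str, List[Tuple[str, str]]] = field(default_factory=dict)
    compartment_contents: List[str] = field(default_factory=list)       # storage compartment items

    def model_for(self, structural_element: str) -> List[Tuple[str, str]]:
        """Look up the stored behavior model for a sensed structural element."""
        return self.behavior_models.get(structural_element, [])
```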
It will be understood by those skilled in the art that the specific distance measurements and movements of the human, e.g., “1.2 m”, “turns 90°”, “Pulls on door handle with right hand”, and so on, are not essential human movements, but rather modeled human behaviors useful for determining behavioral models to be implemented by the follower vehicle. That is, variations in specific details of the human behavior are permissible without altering the basic functionality of the follower vehicle. The specific distances and movements of the human are representative of more general human behavior.
Table 1 below represents the behavioral model implemented by the follower vehicle, and includes follower vehicle stops, starts, pauses and movements in relation to modeled human leader stops, starts, pauses and movements in view of the encountered structural element. Therefore, Table 1 shows the series of steps taken by the follower vehicle in response to its interpretation of human behavior and environmental structural elements, which can be sensed by the one or more sensors. In this embodiment, the structural element is a right swing-in door. In Table 1, the Person approaches the door, in step 1, and the follower vehicle approaches following the Person, in step 1, and eventually stops at a distance away from the door. Table 1 goes on to outline steps taken by the human and then responsive steps taken by the follower vehicle, until the human and follower vehicle are both through the door and the follower vehicle has resumed following the human leader. Responsive steps taken by the follower vehicle can be embodied within logic executed by the follower vehicle, such as within program code or other computer instructions.
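Because the table itself is referenced rather than reproduced here, the sketch below only illustrates, with placeholder wording, how rows of a table such as Table 1 might be captured electronically as paired person/follower-vehicle steps:

```python
# Hypothetical encoding of the kind of rows Table 1 contains: each entry pairs a
# modeled person (leader) step with the follower vehicle's responsive step.  The
# wording is an illustrative placeholder, not the actual table content.
TABLE_1_RIGHT_SWING_IN_DOOR = [
    {"step": 1, "person": "approaches the door",          "vehicle": "follows, then stops a distance from the door"},
    {"step": 2, "person": "pulls the door open",          "vehicle": "pauses and waits for the door to open"},
    {"step": 3, "person": "holds the open door",          "vehicle": "proceeds through the door, moves off the centerline"},
    {"step": 4, "person": "proceeds through the door",    "vehicle": "reacquires and resumes tracking the person"},
    {"step": 5, "person": "continues along the path",     "vehicle": "resumes following the person"},
]
```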
Table 2 below represents the behavioral model implemented by the follower vehicle, and includes follower vehicle stops, starts, pauses and movements in relation to modeled human leader stops, starts, pauses and movements in view of the encountered structural element. Therefore, Table 2 shows the series of steps taken by the follower vehicle in response to its interpretation of human behavior and environmental structural elements, which can be sensed by the one or more sensors. In this embodiment, the structural element is a left swing-in door. In Table 2, the Person approaches the door, in step 1, and the follower vehicle approaches following the Person, in step 1, and eventually stops at a distance away from the door. Table 2 goes on to outline steps taken by the human and then responsive steps taken by the follower vehicle, until the human and follower vehicle are both through the door and the follower vehicle has resumed following the human leader. Responsive steps taken by the follower vehicle can be embodied within logic executed by the follower vehicle, such as within program code or other computer instructions.
Table 3 below represents the behavioral model implemented by the follower vehicle, and includes follower vehicle stops, starts, pauses and movements in relation to modeled human leader stops, starts, pauses and movements in view of the encountered structural element. Therefore, Table 3 shows the series of steps taken by the follower vehicle in response to its interpretation of human behavior and environmental structural elements, which can be sensed by the one or more sensors. In this embodiment, the structural element is a right swing-out door. In Table 3, the Person approaches the door, in step 1, and the follower vehicle approaches following the Person, in step 1, and eventually stops at a distance away from the door. Table 3 goes on to outline steps taken by the human and then responsive steps taken by the follower vehicle, until the human and follower vehicle are both through the door and the follower vehicle has resumed following the human leader. Responsive steps taken by the follower vehicle can be embodied within logic executed by the follower vehicle, such as within program code or other computer instructions.
Table 4 below represents the behavioral model implemented by the follower vehicle, and includes follower vehicle stops, starts, pauses and movements in relation to modeled human leader stops, starts, pauses and movements in view of the encountered structural element. Therefore, Table 4 shows the series of steps taken by the follower vehicle in response to its interpretation of human behavior and environmental structural elements, which can be sensed by the one or more sensors. In this embodiment, the structural element is a left swing-out door. In Table 4, the Person approaches the door, in step 1, and the follower vehicle approaches following the Person, in step 1, and eventually stops at a distance away from the door. Table 4 goes on to outline steps taken by the human and then responsive steps taken by the follower vehicle, until the human and follower vehicle are both through the door and the follower vehicle has resumed following the human leader. Responsive steps taken by the follower vehicle can be embodied within logic executed by the follower vehicle, such as within program code or other computer instructions.
Table 5 below represents the behavioral model implemented by the follower vehicle, and includes follower vehicle stops, starts, pauses and movements in relation to modeled human leader stops, starts, pauses and movements in view of the encountered structural element. Therefore, Table 5 shows the series of steps taken by the follower vehicle in response to its interpretation of human behavior and environmental structural elements, which can be sensed by the one or more sensors. In this embodiment, the structural element is a vestibule having a plurality of right swing-in doors. In Table 5, the Person approaches the first door, in step 1, and the follower vehicle approaches following the Person, in step 1, and eventually stops at a distance away from the door. Table 5 goes on to outline steps taken by the human and then responsive steps taken by the follower vehicle, until the human and follower vehicle are both through the vestibule and the follower vehicle has resumed following the human leader. Responsive steps taken by the follower vehicle can be embodied within logic executed by the follower vehicle, such as within program code or other computer instructions.
Table 6 below represents the behavioral model implemented by the follower vehicle, and includes follower vehicle stops, starts, pauses and movements in relation to modeled human leader stops, starts, pauses and movements in view of the encountered structural element. Therefore, Table 6 shows the series of steps taken by the follower vehicle in response to its interpretation of human behavior and environmental structural elements, which can be sensed by the one or more sensors. In this embodiment, the structural element is a vestibule having a plurality of left swing-in doors. In Table 6, the Person approaches the first door, in step 1, and the follower vehicle approaches following the Person, in step 1, and eventually stops at a distance away from the door. Table 6 goes on to outline steps taken by the human and then responsive steps taken by the follower vehicle, until the human and follower vehicle are both through the vestibule and the follower vehicle has resumed following the human leader. Responsive steps taken by the follower vehicle can be embodied within logic executed by the follower vehicle, such as within program code or other computer instructions.
Table 7 below represents the behavioral model implemented by the follower vehicle, and includes follower vehicle stops, starts, pauses and movements in relation to modeled human leader stops, starts, pauses and movements in view of the encountered structural element. Therefore, Table 7 shows the series of steps taken by the follower vehicle in response to its interpretation of human behavior and environmental structural elements, which can be sensed by the one or more sensors. In this embodiment, the structural element is a vestibule having a plurality of right swing-out doors. In Table 7, the Person approaches the first door, in step 1, and the follower vehicle approaches following the Person, in step 1, and eventually stops at a distance away from the door. Table 7 goes on to outline steps taken by the human and then responsive steps taken by the follower vehicle, until the human and follower vehicle are both through the vestibule and the follower vehicle has resumed following the human leader. Responsive steps taken by the follower vehicle can be embodied within logic executed by the follower vehicle, such as within program code or other computer instructions.
Table 8 below represents the behavioral model implemented by the follower vehicle, and includes follower vehicle stops, starts, pauses and movements in relation to modeled human leader stops, starts, pauses and movements in view of the encountered structural element. Therefore, Table 8 shows the series of steps taken by the follower vehicle in response to its interpretation of human behavior and environmental structural elements, which can be sensed by the one or more sensors. In this embodiment, the structural element is a vestibule having a plurality of left swing-out doors. In Table 8, the Person approaches the first door, in step 1, and the follower vehicle approaches following the Person, in step 1, and eventually stops at a distance away from the door. Table 8 goes on to outline steps taken by the human and then responsive steps taken by the follower vehicle, until the human and follower vehicle are both through the vestibule and the follower vehicle has resumed following the human leader. Responsive steps taken by the follower vehicle can be embodied within logic executed by the follower vehicle, such as within program code or other computer instructions.
Table 9 below represents the behavioral model implemented by the follower vehicle, and includes follower vehicle stops, starts, pauses and movements in relation to modeled human leader stops, starts, pauses and movements in view of the encountered structural element. Therefore, Table 9 shows the series of steps taken by the follower vehicle in response to its interpretation of human behavior and environmental structural elements, which can be sensed by the one or more sensors. In this embodiment, the structural element is an elevator having at least one sliding door that opens on one side of the elevator. In Table 9, the Person approaches the elevator door, in step 1, and the follower vehicle approaches following the Person, in step 1, and eventually stops at a distance away from the door. Table 9 goes on to outline steps taken by the human and then responsive steps taken by the follower vehicle, until the human and follower vehicle are both in and out of the elevator and the follower vehicle has resumed following the human leader. Responsive steps taken by the follower vehicle can be embodied within logic executed by the follower vehicle, such as within program code or other computer instructions.
While the foregoing has described what are considered to be the best mode and/or other preferred embodiments, it is understood that various modifications can be made therein and that the invention or inventions may be implemented in various forms and embodiments, and that they may be applied in numerous applications, only some of which have been described herein. It is intended by the following claims to claim that which is literally described and all equivalents thereto, including all modifications and variations that fall within the scope of each claim.
It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention which are, for brevity, described in the context of a single embodiment may also be provided separately or in any suitable sub-combination.
For example, it will be appreciated that all of the features set out in any of the claims (whether independent or dependent) can be combined in any given way.
The present application is a continuation application of U.S. patent application Ser. No. 17/049,141, filed Oct. 20, 2020, entitled METHOD FOR DETERMINING SELF-DRIVING VEHICLE BEHAVIOR MODELS, A SELF-DRIVING VEHICLE, AND A METHOD OF NAVIGATING A SELF-DRIVING VEHICLE, which is a 371 national stage application of Patent Cooperation Treaty Application No. PCT/US2019/030208 filed May 1, 2019, which in turn claims priority under 35 USC 119(e) to U.S. Provisional Patent Application No. 62/665,183 filed May 1, 2018, the contents of which are incorporated by reference. The present application may be related to U.S. patent application Ser. No. 15/296,884 filed Oct. 18, 2016, entitled VEHICLE HAVING STABILIZATION SYSTEM, and U.S. patent application Ser. No. 29/592,340 filed Jan. 30, 2017, entitled BELT, each of which is incorporated herein by reference.