The various embodiments of the present disclosure relate generally to extended reality (XR) collaborative systems.
Manufacturing operations in natural spaces can be challenging and can require significant amounts of human capital to execute. The value of human capital in such operations lies in workers' ability to accommodate the natural variability of the raw material of interest, which is especially pronounced in natural spaces. Traditionally, workers performing manufacturing operation roles are unable to work remotely and must be physically present in manufacturing facilities to perform work-related tasks. Prior attempts to provide remote work capabilities to workers in manufacturing operations were unfavorable because the systems proved challenging to program and implement. Accordingly, there is a need for a collaborative environment between people and autonomous robotic devices to address the aforementioned challenges in performing manufacturing operations in natural spaces.
An exemplary embodiment of the present disclosure provides an extended reality (XR) system comprising an autonomous robotic device and a user interface. The autonomous robotic device can be located in a physical environment. The user interface can be configured to display an XR environment corresponding to at least a portion of the physical environment and receive an input from the user based on the user's perception in the XR environment. The autonomous robotic device can be configured to perform an autonomous action based at least in part on an input received from the user.
In any of the embodiments disclosed herein, the autonomous robotic device can be further configured to use a machine learning algorithm to perform autonomous actions.
In any of the embodiments disclosed herein, the machine learning algorithm can be trained using data points representative of the physical environment and inputs based on the user's perception in the XR environment.
In any of the embodiments disclosed herein, the machine learning algorithm can be further trained using data points indicative of a success score of the autonomous action.
In any of the embodiments disclosed herein, the autonomous robotic device can be configured to request the user of the XR system to provide the input.
In any of the embodiments disclosed herein, the autonomous robotic device can be configured to request the user of the XR system to provide the input when the robotic device is unable to use a machine learning algorithm to perform the autonomous action without the user's input.
In any of the embodiments disclosed herein, the user interface can be configured to receive the input from the user via a network interface.
In any of the embodiments disclosed herein, the XR system can further comprise one or more sensors configured to monitor at least one discrete data value in the physical environment and the user interface can be further configured to display the XR environment based at least in part on the at least one discrete data value.
In any of the embodiments disclosed herein, the XR system can further comprise user equipment that can be configured to allow the user to interact with the user interface.
In any of the embodiments disclosed herein, the user equipment can comprise a head mounted display (HMD) that can be configured to display the XR environment to the user.
In any of the embodiments disclosed herein, the user equipment can comprise a controller that can be configured to allow the user to provide the input based on the user's perception in the XR environment.
In any of the embodiments disclosed herein, the user interface can be further configured to monitor movement of the controller by the user and alter a display of the XR environment based on said movement.
Another embodiment of the present disclosure provides a method of using an extended reality (XR) system to manipulate an autonomous robotic device located in a physical environment. The method can comprise: displaying, in a user interface, an XR environment corresponding to at least a portion of the physical environment; receiving an input from a user based on the user's perception in the XR environment; and performing an autonomous action with the autonomous robotic device based, at least in part, on the input received from the user.
In any of the embodiments disclosed herein, the method can further comprise using a machine learning algorithm to perform autonomous actions with the autonomous robotic device.
In any of the embodiments disclosed herein, the method can further comprise training the machine learning algorithm using data points representative of the physical environment and inputs received from the user based on the user's perception in the XR environment.
In any of the embodiments disclosed herein, the method can further comprise further training the machine learning algorithm using data points indicative of a success score of the autonomous action performed by the autonomous robotic device.
In any of the embodiments disclosed herein, the method can further comprise requesting the user of the XR system to provide the input.
In any of the embodiments disclosed herein, the method can further comprise requesting the user of the XR system to provide the input when the autonomous robotic device is unable to use a machine learning algorithm to perform the autonomous action without the user's input.
In any of the embodiments disclosed herein, receiving the input from a user can occur via a network interface.
In any of the embodiments disclosed herein, the method can further comprise interacting, by one or more additional users, with the XR environment to monitor the input provided by the user.
In any of the embodiments disclosed herein, the method can further comprise interacting, by the user using user equipment, with the user interface.
In any of the embodiments disclosed herein, the user equipment can comprise a head mounted display (HMD), and the method can further comprise displaying the XR environment to the user on the HMD.
In any of the embodiments disclosed herein, the user equipment can comprise a controller, and the method can further comprise generating, by the user with the controller, the input based on the user's perception in the XR environment.
In any of the embodiments disclosed herein, the method can further comprise monitoring movement of the controller by the user and altering a display of the XR environment based on said movement of the controller.
These and other aspects of the present disclosure are described in the Detailed Description below and the accompanying drawings. Other aspects and features of embodiments will become apparent to those of ordinary skill in the art upon reviewing the following description of specific, exemplary embodiments in concert with the drawings. While features of the present disclosure may be discussed relative to certain embodiments and figures, all embodiments of the present disclosure can include one or more of the features discussed herein. Further, while one or more embodiments may be discussed as having certain advantageous features, one or more of such features may also be used with the various embodiments discussed herein. In similar fashion, while exemplary embodiments may be discussed below as device, system, or method embodiments, it is to be understood that such exemplary embodiments can be implemented in various devices, systems, and methods of the present disclosure.
The following detailed description of specific embodiments of the disclosure will be better understood when read in conjunction with the appended drawings. For the purpose of illustrating the disclosure, specific embodiments are shown in the drawings. It should be understood, however, that the disclosure is not limited to the precise arrangements and instrumentalities of the embodiments shown in the drawings.
To facilitate an understanding of the principles and features of the present disclosure, various illustrative embodiments are explained below. The components, steps, and materials described hereinafter as making up various elements of the embodiments disclosed herein are intended to be illustrative and not restrictive. Many suitable components, steps, and materials that would perform the same or similar functions as the components, steps, and materials described herein are intended to be embraced within the scope of the disclosure. Such other components, steps, and materials not described herein can include, but are not limited to, similar components or steps that are developed after development of the embodiments disclosed herein.
It must also be noted that, as used in the specification and the appended claims, the singular forms “a,” “an” and “the” include plural references unless the context clearly dictates otherwise. For example, reference to a component is intended also to include a composition of a plurality of components. Reference to a composition containing “a” constituent is intended to include other constituents in addition to the one named.
Also, in describing the exemplary embodiments, terminology will be resorted to for the sake of clarity. It is intended that each term contemplates its broadest meaning as understood by those skilled in the art and includes all technical equivalents which operate in a similar manner to accomplish a similar purpose.
By “comprising” or “containing” or “including” is meant that at least the named compound, element, particle, or method step is present in the composition or article or method, but does not exclude the presence of other compounds, elements, particles, or method steps, even if such other compounds, elements, particles, or method steps have the same function as what is named.
It is also to be understood that the mention of one or more method steps does not preclude the presence of additional method steps or intervening method steps between those steps expressly identified. Similarly, it is also to be understood that the mention of one or more components in a composition does not preclude the presence of additional components than those expressly identified.
The materials described as making up the various elements of the invention are intended to be illustrative and not restrictive. Many suitable materials that would perform the same or a similar function as the materials described herein are intended to be embraced within the scope of the invention. Such other materials not described herein can include, but are not limited to, for example, materials that are developed after the time of the development of the invention.
There is no such thing as a self-reliant robot, especially in the biological world. Therefore, tools that can provide easy and seamless collaboration between people and robotic devices to support manufacturing operations, especially within natural spaces, are needed. The tools described herein provide advantages such as remote operation of machinery within manufacturing facilities to execute tasks and improve productivity, in contrast to the substantial costs posed by increasing human capital.
The collaborative extended reality (XR) system (100) can include the following elements: an autonomous robotic device (300) configured to perform autonomous actions; a user interface (600) configured to display an XR environment; user equipment (400) configured to display the user interface (600) and allow a user (200) to interact with said user interface (600); and one or more sensors (500) configured to monitor at least one discrete data value within a physical environment.
For the purposes of explanation, the XR system (100) is discussed in the context of being applied to the poultry production industry. The disclosure, however, is not so limited. Rather, as those skilled in the art would appreciate, the XR system (100) disclosed herein can find use in various applications where it may be desirable to provide user input to assist in task completion. Within the poultry production industry, second and further processing operations require significant participation of human workers. Typically, tasks can be classified as either gross operations, which can include moving whole products or sections thereof from machine to machine, or fine operations, which can include cutting or properly layering raw material in packaging and which can require more anatomical knowledge or dexterity to execute. Using the XR system (100) described herein, a user (200) can provide an input to an autonomous robotic device (300), via the user interface (600), to perform an autonomous action corresponding to the gross or fine operations in a poultry manufacturing facility.
As one who is skilled in the art can appreciate, an autonomous robotic device (300) is a class of devices that is different from a telerobotic device. Specifically, an autonomous robotic device (300) differs from a telerobotic device in that an autonomous robotic device (300) does not require the user's input to control each facet of the operation to be performed; rather, telerobotic devices are directly controlled by users. Similarly, an autonomous action, performed by an autonomous robotic device (300), is an action that considers, but is not identical to, the instruction/input received from the user (200). In a poultry production application, for example, an autonomous action performed by an autonomous robotic device (300) could be loading raw natural material onto a cone moving through an assembly line. Although the user's input could designate a point where the autonomous robotic device (300) should grasp the raw natural material, the autonomous robotic device (300) can subsequently determine a path to move the raw natural material from its current location to the cone independent of the user's input. In other words, the user (200) provides an input used by the autonomous robotic device (300) to determine where to grasp the raw natural material, but the robot autonomously makes additional decisions in order to move the raw natural material to the cone.
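By way of illustration only, the following Python sketch shows one possible division of labor consistent with the example above: the user supplies a grasp point through the XR interface, and the device plans the remaining motion on its own. The class and function names (GraspInput, AutonomousLoader, plan_linear_path) and the straight-line path planner are hypothetical assumptions introduced for this sketch and are not part of the disclosure.

```python
# Illustrative sketch only; names and the simple path planner are hypothetical.
from dataclasses import dataclass
from typing import List, Tuple

Point3D = Tuple[float, float, float]


@dataclass
class GraspInput:
    """User-designated grasp point received from the XR user interface."""
    grasp_point: Point3D


class AutonomousLoader:
    """Toy stand-in for an autonomous robotic device loading material onto a cone."""

    def __init__(self, cone_position: Point3D):
        self.cone_position = cone_position

    def plan_linear_path(self, start: Point3D, goal: Point3D, steps: int = 10) -> List[Point3D]:
        # The device plans its own trajectory; the user supplies only the grasp
        # point, not the motion commands (unlike a telerobotic device).
        return [
            tuple(s + (g - s) * i / steps for s, g in zip(start, goal))
            for i in range(steps + 1)
        ]

    def perform_autonomous_action(self, user_input: GraspInput) -> List[Point3D]:
        # Grasp where the user indicated, then autonomously move to the cone.
        return self.plan_linear_path(user_input.grasp_point, self.cone_position)


if __name__ == "__main__":
    loader = AutonomousLoader(cone_position=(1.2, 0.0, 0.8))
    path = loader.perform_autonomous_action(GraspInput(grasp_point=(0.3, 0.5, 0.1)))
    print(f"Planned {len(path)} waypoints from grasp point to cone")
```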
The user (200) of the XR system (100) can provide an input to the autonomous robotic device (300) to perform the autonomous action by using user equipment (400). The user equipment (400) can include many different components known in the art. For example, in some embodiments, the user equipment (400) can include a controller (420) and/or a head mounted display (HMD) (410) to allow the user (200) to interact with the user interface (600). In some embodiments, for example, the HMD (410) could include, but is not limited to, an immersive display helmet, a brain implant to visualize the transmitted display, and the like.
Within the XR system (100), the user interface (600) aggregates discrete data sets from the one or more sensors (500). As one who is skilled in the art will appreciate, there are many different types of sensors that can be configured to monitor discrete data values within a physical environment. Examples of different types of sensors can include, but are not limited to, temperature sensors, photo sensors, vibration sensors, motion sensors, color sensors, and the like. Within the XR system (100), the one or more sensors (500) can monitor discrete data values within the physical environment; the user interface (600) can aggregate those values and construct the XR environment displayed to the user (200), which can be based at least in part on said discrete data values and can correspond at least in part to the physical environment.
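As a non-limiting illustration of the aggregation described above, the following Python sketch collects discrete sensor readings into a single scene state from which an XR environment could be rendered. The SensorReading and XRScene names and fields are hypothetical assumptions made for this sketch only.

```python
# Minimal sketch of aggregating discrete sensor values into one XR scene state.
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class SensorReading:
    sensor_id: str
    sensor_type: str   # e.g., "temperature", "photo", "vibration", "motion", "color"
    value: float


@dataclass
class XRScene:
    """Aggregated state from which the XR environment can be rendered to the user."""
    readings: Dict[str, SensorReading] = field(default_factory=dict)

    def update(self, reading: SensorReading) -> None:
        # Each discrete data value overwrites the previous value for that sensor,
        # so the displayed XR environment tracks the physical environment.
        self.readings[reading.sensor_id] = reading


scene = XRScene()
scene.update(SensorReading("temp-01", "temperature", 4.2))
scene.update(SensorReading("cam-03", "color", 0.87))
print(f"XR scene built from {len(scene.readings)} sensor streams")
```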
Using the user equipment (400), the user (200) can interact with the user interface (600), which displays the constructed XR environment based in part on the discrete data sets monitored by the one or more sensors (500). The user interface (600) can be displayed to the user (200) through the HMD (410) of the user equipment (400). As one who is skilled in the art will appreciate, using the HMD (410) to display the user interface (600), and therein the XR environment, can assist the perception of the user (200) when interacting with the user interface (600). Additionally, through use of the HMD (410), the user (200) can determine input points that can be provided to the user interface (600) via the controller (420). The input provided by the user (200) can be received by the user interface (600) and transmitted to the autonomous robotic device (300) via a network interface. As one who is skilled in the art will appreciate, a network interface can be a medium of interconnectivity between two devices separated by large physical distances. Examples of such a medium of interconnectivity relating to the preferred application can include, but are not limited to, cloud-based networks, wired networks, wireless (Wi-Fi) networks, Bluetooth networks, and the like.
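The following minimal Python sketch illustrates, under assumed message formats and an assumed host and port, one way the user's input could be serialized and transmitted from the user interface to the autonomous robotic device over a network interface; it is not a definitive implementation of the disclosed network interface.

```python
# Hedged sketch: message format, host, and port are illustrative assumptions.
import json
import socket


def send_user_input(grasp_point, host="robot.local", port=5005):
    """Serialize the user's XR input and transmit it to the robotic device."""
    message = json.dumps({"type": "grasp_point", "xyz": grasp_point}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.connect((host, port))
        sock.sendall(message)


# Example call (requires a listening robot-side service at the assumed address):
# send_user_input((0.3, 0.5, 0.1))
```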
The autonomous robotic device (300) of the XR system (100) can also be configured to use a machine learning algorithm to carry out autonomous actions without an input from the user (200). This configuration can be desirable as it can increase productivity within manufacturing operations, specifically enabling the autonomous robotic device (300) to perform repetitive tasks at a high rate of efficiency while accounting for the natural variability of the raw natural material. The natural variability described previously, in the preferred application, could include, but is not limited to, the positioning of the raw natural material to be grasped for a gross operation or the varying anatomical presentation of the raw material for a fine operation. As one who is skilled in the art will appreciate, machine learning is a subfield of artificial intelligence (AI) that enables computer systems and other related devices to learn how to perform tasks and to improve their performance over time. Examples of types of machine learning algorithms that can be used can include, but are not limited to, supervised learning algorithms, unsupervised learning algorithms, semi-supervised learning algorithms, reinforcement learning algorithms, and the like. In addition to the aforementioned examples, other algorithms that are not based on machine learning or AI, such as deterministic algorithms, statistical algorithms, and the like, can also be utilized with the autonomous robotic device (300) to perform autonomous actions. In some embodiments, if the machine learning algorithm cannot be used to complete the autonomous action due to natural variability of the raw material, the XR system (100) can request that the user (200) provide an input to the user interface (600), which can be transmitted to the autonomous robotic device (300) via the network interface to perform the autonomous action. This collaboration, in which the user (200) is requested to provide an input so the autonomous robotic device (300) can complete the autonomous action, can also be advantageous because it allows the user (200) to further train the autonomous robotic device (300) beyond performing the immediate intended autonomous action. The input provided by the user (200) to the autonomous robotic device (300) can also be used to support the development of specific autonomous action applications to be used by the autonomous robotic device (300).
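As a non-limiting sketch of this collaboration, the Python example below shows a hypothetical learned grasp policy that acts autonomously when its confidence is sufficient, requests the user's input when natural variability defeats the model, and records environment data, the chosen grasp, and a success score for later retraining. The GraspPolicy class, the confidence threshold, and the placeholder predictor are assumptions made for illustration only and do not represent the disclosed algorithm.

```python
# Illustrative sketch: learned policy with a user-input fallback and
# logging of success scores for retraining. All names are hypothetical.
from typing import Callable, List, Optional, Tuple

Point3D = Tuple[float, float, float]


class GraspPolicy:
    def __init__(self, confidence_threshold: float = 0.8):
        self.confidence_threshold = confidence_threshold
        self.training_data: List[dict] = []  # grows as users provide inputs

    def predict(self, observation: dict) -> Tuple[Optional[Point3D], float]:
        # Placeholder for a learned model (e.g., supervised or reinforcement
        # learning); here it simply declines when the material presentation
        # is flagged as unusual.
        if observation.get("unusual_presentation"):
            return None, 0.0
        return observation["nominal_grasp"], 0.95

    def act(self, observation: dict, request_user_input: Callable[[], Point3D]) -> Point3D:
        grasp, confidence = self.predict(observation)
        if grasp is None or confidence < self.confidence_threshold:
            # Natural variability defeated the learned model: ask the user via the XR interface.
            grasp = request_user_input()
        return grasp

    def record_outcome(self, observation: dict, grasp: Point3D, success_score: float) -> None:
        # Store environment data, the user or model grasp, and the success
        # score so the model can later be retrained on these examples.
        self.training_data.append(
            {"observation": observation, "grasp": grasp, "success": success_score}
        )


if __name__ == "__main__":
    policy = GraspPolicy()
    obs = {"unusual_presentation": True, "nominal_grasp": (0.3, 0.5, 0.1)}
    chosen = policy.act(obs, request_user_input=lambda: (0.28, 0.52, 0.12))
    policy.record_outcome(obs, chosen, success_score=1.0)
    print(f"Grasp executed at {chosen}; {len(policy.training_data)} examples stored")
```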
It is to be understood that the embodiments and claims disclosed herein are not limited in their application to the details of construction and arrangement of the components set forth in the description and illustrated in the drawings. Rather, the description and the drawings provide examples of the embodiments envisioned. The embodiments and claims disclosed herein are further capable of other embodiments and of being practiced and carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein are for the purposes of description and should not be regarded as limiting the claims.
Accordingly, those skilled in the art will appreciate that the conception upon which the application and claims are based may be readily utilized as a basis for the design of other structures, methods, and systems for carrying out the several purposes of the embodiments and claims presented in this application. It is important, therefore, that the claims be regarded as including such equivalent constructions.
Furthermore, the purpose of the foregoing Abstract is to enable the United States Patent and Trademark Office and the public generally, and especially including the practitioners in the art who are not familiar with patent and legal terms or phraseology, to determine quickly from a cursory inspection the nature and essence of the technical disclosure of the application. The Abstract is neither intended to define the claims of the application, nor is it intended to be limiting to the scope of the claims in any way.
This application claims the benefit of U.S. Provisional Application Ser. No. 63/234,452, filed on 18 Aug. 2021, which is incorporated herein by reference in its entirety as if fully set forth below.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2022/075070 | 8/17/2022 | WO |

Number | Date | Country
---|---|---
63/234,452 | Aug. 2021 | US