Over the years, industrial applications that involve manual labor have increasingly taken advantage of robotic aids. On the assembly line and in various other types of manufacturing, both manual labor and robotic arms or other robotic tools are utilized in attaining a finished article of manufacture.
Of course, with the use of robotic aids in conjunction with manual labor, certain precautions are necessary. For example, as a matter of safety in a manufacturing facility, robotic tasks may be performed in an isolated area or caged off from tasks that involve direct human manipulation. Thus, human laborers may be kept safely away from a swinging robotic arm or other heavy, load-bearing implement as robotic tasks are performed.
Keeping the robotic equipment and human laborer away from one another may be sufficient to ensure safety in the workplace. Unfortunately, this also renders the manufacturing process more time-consuming and inefficient, with the laborer unable to begin work on the article of manufacture until the robotic task has been completed, or vice versa.
In an effort to make the manufacturing process more efficient, collaborative robotic aids have been developed that may allow the laborer to perform tasks on the article of manufacture simultaneous with the robotic aid performing tasks on the same article. However, in order for this simultaneous human-robot work to be undertaken, other precautions are taken. For example, the robot may be limited in speed, payload, and the types of tasks allowed. These precautions not only require limiting robotic tasks but also fail to fully eliminate the hazards involved. That is, conventional collaborative robotic aids remain prone to unintentionally harming the laborer through accidental contact. It is just that such accidental bumping is less likely to result in serious long-term injury or death to the laborer.
Alternatively, safety across a manufacturing facility could be improved through means apart from utilizing a collaborative robot that is limited in terms of payloads, speeds and so forth. For example, a plurality of stereo cameras, LIDAR equipment, acquisition systems capable of capturing 3D depth data, or other means of real-time data tracking and acquisition could be combined with high-performance industrial computing. However, when considering that the conventional manufacturing facility may involve ten or more laborers, multiple robots and 5,000-10,000 square feet of floorspace to monitor, this type of collaborative system may cost in excess of $100,000 in equipment alone. Additional costs in the form of system installation, maintenance and upgrades over time can render such a system impractical.
With these issues in mind, operators are generally left with the option of installing a truly cost-prohibitive system for the sake of safety and efficiency, or of simply utilizing a limited robot system. As a result, operators often opt for a robotic aid in the form of a limited collaborative robot operating at minimal speeds and loads, or one that is truly segregated from human laborers within the facility. The option of an affordable, efficient and truly collaborative robotic system that may be utilized simultaneous with laborers without limitation remains unavailable.
A robotic assembly is disclosed herein for collaborative operation with a human at a worksite. The assembly includes at least one camera for the acquisition of visual data at the site and a movable implement to facilitate a first task at the site. The first task proceeds according to a first motion plan which is different from a second task facilitated by the human at the site. Further, a mid-tier consumer grade processor is provided to direct the implement according to the first motion plan and to attain the visual data in real-time. Thus, the processor is configured to develop and adjust an adoptable second motion plan for the first task during the directing and the attaining based on human positioning during the tasks.
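The summary above lends itself to a brief sketch. The following minimal Python example illustrates the notion of a first motion plan being checked for clearance against a tracked human position, with an adjusted second plan held ready for adoption. All names, the planar (x, y) waypoint model, and the 0.5 m safety radius are illustrative assumptions and not part of the disclosure.

```python
import math
from dataclasses import dataclass

@dataclass
class MotionPlan:
    """An ordered list of (x, y) waypoints for the robotic implement."""
    waypoints: list

def plan_is_collision_free(plan, human_xy, safety_radius=0.5):
    """Check every waypoint against the tracked human position.

    A plan is considered valid only if all waypoints keep at least
    `safety_radius` metres of clearance from the human.
    """
    hx, hy = human_xy
    return all(math.hypot(x - hx, y - hy) >= safety_radius
               for x, y in plan.waypoints)

# The first plan passes directly by the human's position; the adjusted
# second plan detours around it.
first = MotionPlan([(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)])
second = MotionPlan([(0.0, 0.0), (1.0, 1.0), (2.0, 0.0)])
human = (1.0, 0.1)

print(plan_is_collision_free(first, human))   # False: a waypoint is too close
print(plan_is_collision_free(second, human))  # True: the detour clears the human
```

In practice the clearance test would run against the implement's full swept volume rather than discrete waypoints, but the adopt-or-pause decision rests on the same boolean validity signal.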
In the following description, numerous details are set forth to provide an understanding of the present disclosure. However, it will be understood by those skilled in the art that the embodiments described may be practiced without these particular details. Further, numerous variations or modifications may be employed which remain contemplated by the embodiments as specifically described.
Embodiments are described with reference to a particular workspace utilizing a collaborative robotic assembly in cooperation with one or more human laborers focused on manufacturing small articles. For example, mobile device components and devices may be assembled at the worksite through embodiments of collaborative robotic assemblies detailed herein. However, other types of collaborative operations may take advantage of embodiments described herein. For example, larger car frame assemblies or pick and place systems may utilize such systems. So long as the collaborative robotic assembly is capable of performing tasks through adjustable motion plans implemented through limited data processed through a mid-tier consumer grade processor, appreciable benefit may be realized.
In the embodiment shown, the robotic aids 160, 165 are depicted in an immobile configuration at a dedicated location. However, embodiments of the system herein are such that the aids 160, 165 may instead be mobile and not necessarily fixed at a given location of the facility floor. Either way, truly collaborative robotic assistance may be available to the laborers 130, 140 without undue concern over a robotic implement 167 or arm striking a laborer 130, 140. No fencing off of the robotic aids 160, 165 for the sake of safety is required. This remains true even for circumstances where a robotic aid 160, 165 and a laborer 130, 140 manipulate the same article 190 at the same point in time (again, as illustrated in the drawings).
For the described example of smartphone component assembly, the risk to the laborer 130 may seem less than significant. However, when considering the precision required of the multiple tasks needed to complete such an article, the benefit of avoiding jarring accidental contact may be quite significant.
Referring now to the flowchart of the drawings, an embodiment of a method of collaborative robotic operation is summarized.
With the second motion plan developed, the first motion plan is continuously evaluated for validity in terms of collision-free movement (see 565). If the plan is valid as indicated at 580, continued execution will take place as indicated at 590. However, if the first plan is no longer valid, a new valid plan will be executed as indicated at 595 if available. Of course, if a second motion plan is not available, a pause will occur as another second motion plan is developed (see 530). At some point, the robotic aid may reach its target or task position to perform the task at hand (see 570). Thus, once complete, a new target or task will be in store (see 550).
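As a rough sketch of this flow, the toy example below models a two-lane floor grid and steps through the loop just described: validity evaluation (565/580), continued execution (590), adoption of the second plan (595), pausing when no valid plan exists (530), and reaching the target (570). The grid model, the two toy planners, and all names are hypothetical simplifications, not the disclosed implementation.

```python
def straight_plan(pose, target):
    """First motion plan: waypoints along lane 0 toward the target x (toy)."""
    _, x = pose
    tx = target[1]
    step = 1 if tx >= x else -1
    return [(0, i) for i in range(x + step, tx + step, step)]

def detour_plan(pose, target):
    """Second motion plan: same progression via lane 1, merging at the target."""
    plan = straight_plan(pose, target)
    return [(1, i) for (_, i) in plan[:-1]] + [plan[-1]]

def is_valid(plan, human):
    """A plan remains valid while the human occupies none of its waypoints."""
    return human not in plan

def collaborative_loop(pose, target, human_track):
    """One tracker frame per cycle: validate, execute, adopt, or pause."""
    trace = []
    for human in human_track:
        if pose == target:                      # 570: task position reached
            break
        plan = straight_plan(pose, target)      # first motion plan
        fallback = detour_plan(pose, target)    # adoptable second plan
        if is_valid(plan, human):               # 565/580: still collision-free?
            pose = plan[0]; trace.append(("exec", pose))       # 590: continue
        elif is_valid(fallback, human):
            pose = fallback[0]; trace.append(("adopt", pose))  # 595: new plan
        else:
            trace.append(("pause", pose))                      # 530: replan
    return pose, trace

# The human blocks lane 0 at x=2 for two frames, then moves clear.
pose, trace = collaborative_loop((0, 0), (0, 3), [(0, 2), (0, 2), (9, 9), (9, 9)])
print(pose)   # (0, 3)
print(trace)  # [('adopt', (1, 1)), ('adopt', (1, 2)), ('exec', (0, 3))]
```

Note how the fallback is recomputed every cycle rather than on demand, mirroring the text's point that the second plan is developed continuously during execution so that adoption at 595 incurs no planning delay.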
Such a system and method may be feasible with voluminous point cloud data and industrial-speed processors. However, it is worth noting that for embodiments described herein, a non-point cloud data acquisition camera and a mid-tier consumer grade workstation may be more than sufficient to achieve this level of collaborative, collision-free co-working between robotic aid and human laborer.
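One way such a camera may suffice can be sketched as follows: with a fixed overhead camera whose pixel-to-floor mapping is known, the centre of an ordinary 2-D detection bounding box yields a usable floor-plane position for the human with no depth or point cloud data at all. The calibration model here (uniform scale, axis-aligned camera) and every name are simplifying assumptions for illustration only.

```python
def pixel_to_floor(px, py, scale_m_per_px=0.01, origin=(0.0, 0.0)):
    """Map an image pixel to floor coordinates for a fixed overhead
    camera (assumed calibration: uniform scale, axis-aligned)."""
    ox, oy = origin
    return (ox + px * scale_m_per_px, oy + py * scale_m_per_px)

def human_floor_position(bbox, **kw):
    """Use the centre of a 2-D detection bounding box (x0, y0, x1, y1)
    as the human's footprint estimate -- no point cloud required."""
    x0, y0, x1, y1 = bbox
    return pixel_to_floor((x0 + x1) / 2, (y0 + y1) / 2, **kw)

# A 200x200-pixel detection centred at pixel (200, 300) maps to a
# floor position two metres along x and three metres along y.
print(human_floor_position((100, 200, 300, 400)))  # (2.0, 3.0)
```

A real installation with an angled camera would replace the uniform scale with a planar homography fitted at calibration time, but the per-frame cost remains a handful of arithmetic operations, well within reach of a consumer grade workstation.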
Embodiments described above provide a cost-effective manner of implementing a truly collaborative manufacturing system that employs simultaneous robotic and human laborer interactions. This is achieved without requiring point cloud or depth cameras with high data-acquisition rates, or industrial grade processors. As a result, neither inefficient, segregated caging off of robotic aids nor avoidance of simultaneous article processing by laborer and robotic aid is required. Thus, a cost-effective, truly collaborative system is provided.
The preceding description has been presented with reference to presently preferred embodiments. Persons skilled in the art and technology to which these embodiments pertain will appreciate that alterations and changes in the described structures and methods of operation may be practiced without meaningfully departing from the principle and scope of these embodiments. Furthermore, the foregoing description should not be read as pertaining only to the precise structures described and shown in the accompanying drawings, but rather should be read as consistent with and as support for the following claims, which are to have their fullest and fairest scope.
Number | Name | Date | Kind |
---|---|---|---|
10675766 | Niemeyer | Jun 2020 | B1 |
10899017 | De Sapio | Jan 2021 | B1 |
11919173 | Denenberg | Mar 2024 | B2 |
20090118863 | Dariush | May 2009 | A1 |
20160023352 | Kennedy | Jan 2016 | A1 |
20170173795 | Tan | Jun 2017 | A1 |
20180099408 | Shibata | Apr 2018 | A1 |
20190134819 | Okahara | May 2019 | A1 |
20200086487 | Johnson | Mar 2020 | A1 |
20210206003 | Zhou | Jul 2021 | A1 |
20210309264 | Felip Leon | Oct 2021 | A1 |
20220241974 | Mishima | Aug 2022 | A1 |
20220315355 | Sun | Oct 2022 | A1 |
20220379474 | Vu | Dec 2022 | A1 |
20230233105 | Chen | Jul 2023 | A1 |
20230294277 | Yang | Sep 2023 | A1 |
20230298257 | Gautron | Sep 2023 | A1 |
20230347522 | Wollstadt | Nov 2023 | A1 |
20230364792 | Takano | Nov 2023 | A1 |
20230373092 | Zoghlami | Nov 2023 | A1 |
20230381959 | Thomaz | Nov 2023 | A1 |
20230410430 | Kutsyy | Dec 2023 | A1 |
20240083027 | Spampinato | Mar 2024 | A1 |
20240083031 | Falco | Mar 2024 | A1 |
20240308071 | Zhang | Sep 2024 | A1 |
20240424678 | Vu | Dec 2024 | A1 |
Entry |
---|
Bruce, Real-Time Motion Planning and Safe Navigation in Dynamic Multi-Robot Environments, School of Computer Science Carnegie Mellon University, Dec. 15, 2006, pp. 1-204 (Year: 2006). |
Matheson et al., Human-Robot Collaboration in Manufacturing Applications: A Review, Robotics Dec. 6, 2019, pp. 1-25 (Year: 2019). |
Cherubini et al., Collaborative manufacturing with physical human-robot interaction, Robotics and Computer-Integrated Manufacturing, 2016, pp. 1-13 (Year: 2016). |
Chen et al., Review of Low Frame Rate Effects on Human Performance, IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, vol. 37, No. 6, Nov. 2007, pp. 1063-1076 (Year: 2007). |
Fan et al., Vision-based holistic scene understanding towards proactive human-robot collaboration, Jan. 10, 2022, Robotics and Computer-Integrated Manufacturing 75, pp. 1-17 (Year: 2022). |
Number | Date | Country | |
---|---|---|---|
20230390932 A1 | Dec 2023 | US |