The present invention relates to wearable devices. More particularly, but not exclusively, the present invention relates to the use of wearable devices to improve robotic safety.
Robotic automation can increase overall productivity and efficiency. However, in many manufacturing contexts robots and workers operate alongside each other in human-robot collaboration.
For example, consider automotive manufacturing, where industrial robots and human workers may work together on a production line. Industrial robots such as industrial robotic arms may be used to perform various assembly tasks, while other tasks remain human tasks. One of the problems associated with such environments is safety. Various robotic manufacturing equipment can exert tremendous force, which has the potential to harm human workers. Therefore, what is needed are methods and systems which enhance safety during robot-human collaborations, especially in manufacturing facilities.
Therefore, it is a primary object, feature, or advantage of the present invention to improve over the state of the art.
It is a further object, feature, or advantage of the present invention to improve safety of human workers in manufacturing environments where humans and robots collaborate.
It is a still further object, feature, or advantage of the present invention to allow robots to locate human collaborators.
Another object, feature, or advantage is to stop, attenuate, or reverse motion of a robot in order to protect a human worker.
One or more of these and/or other objects, features, or advantages of the present invention will become apparent from the specification and claims that follow. No single embodiment need provide each and every object, feature, or advantage. Different embodiments may have different objects, features, or advantages. Therefore, the present invention is not to be limited to or by any objects, features, or advantages stated herein.
According to one aspect, a system for increasing safety during robot-human collaborations in a manufacturing environment is provided. The system includes at least one wearable device for use by a human worker and an industrial robot in operative communication with the at least one wearable device. The industrial robot is equipped to detect the location of the human worker using the at least one wearable device. The at least one wearable device may include an earpiece. The at least one wearable device may include a set of earpieces including a left wearable earpiece and a right wearable earpiece. The system may include a wearable housing, a processor disposed within the wearable housing, a transceiver disposed within the wearable housing and operatively connected to the processor, and one or more sensors operatively connected to the processor. The one or more sensors include at least one inertial sensor. The processor may be configured to track changes in movement of the human worker using the at least one inertial sensor and to communicate the position or changes in movement of the human worker to the industrial robot using the transceiver. The industrial robot is configured to avoid contact with the human worker by stopping. The industrial robot may be configured to avoid contact with the human worker by changing direction of motion. The industrial robot may be configured to reduce the force of contact with the human worker. The industrial robot may include a robotic arm. The at least one wearable device may include at least one sensor for detecting the position of the industrial robot.
According to another aspect, a method for increasing safety during robot-human collaborations in a manufacturing environment is provided. The method may include providing a wearable device, providing an industrial robot, determining the relative position between the wearable device and the industrial robot, and altering operation of the industrial robot when the relative position between the wearable device and the industrial robot is less than a threshold. Altering the operation of the industrial robot may include stopping the industrial robot, changing the direction of motion of the industrial robot, or changing the amount of force produced by the industrial robot. The method may also provide for producing an audible alert at the wearable device when the relative position between the wearable device and the industrial robot is less than the threshold. The method may further include producing a sound at the wearable device wherein the sound is shaped to represent a spatial location of the industrial robot relative to the wearable device.
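By way of illustration only, the following is a minimal sketch of the threshold check described above. The distance threshold, the position arguments, and the robot interface (stop, change_direction, limit_force) are assumptions made for illustration and are not part of the specification.

```python
# Minimal illustrative sketch of the threshold-based safety check.
# SAFETY_THRESHOLD_M and the robot methods are assumed names, not
# part of the specification.
import math

SAFETY_THRESHOLD_M = 1.5  # assumed minimum separation, in meters

def relative_distance(wearable_position, robot_position):
    """Euclidean distance between the wearable device and the industrial robot."""
    return math.dist(wearable_position, robot_position)

def enforce_safety(wearable_position, robot_position, robot):
    """Alter robot operation when the worker is closer than the threshold."""
    if relative_distance(wearable_position, robot_position) < SAFETY_THRESHOLD_M:
        robot.stop()  # alternatively robot.change_direction() or robot.limit_force()
        return True   # operation was altered
    return False
```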
According to another aspect, a system for increasing safety during robot-human collaborations in a manufacturing environment is provided. The system includes at least one wearable device for use by a human worker and an industrial robot in operative communication with the at least one wearable device. The at least one wearable device is configured to detect the location of the industrial robot. The at least one wearable device may be an earpiece. The earpiece may include an earpiece housing, a processor disposed within the earpiece housing, and at least one sensor operatively connected to the processor. The at least one sensor may be configured to detect the location of the industrial robot by emitting a field and detecting when the industrial robot enters the field. The earpiece may further include a speaker, and the processor may be configured to determine where within the field the industrial robot is located and to produce a sound at the speaker to alert the human worker of the position of the industrial robot.
Robots used in manufacturing can pose a significant risk to humans who work near them, particularly in industrial settings such as the manufacturing of vehicles or other products. Although generally described in the context of industrial manufacturing, it is to be understood that the present invention may be used in other contexts as well.
As will be explained in further detail with respect to various examples, the wearable device(s) 10 may interact with the robot control system 40 in any number of different ways. For example, the wearable device(s) 10 may provide sensor data to the robot control system. Based on this information, the robot 2 may take any number of actions, such as stopping movement of the robot, changing direction of movement of the robot, decreasing the amount of force exerted by the robot, or other types of actions.
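Purely as an illustrative sketch, one way a robot control system might map a reported worker position to the graduated responses described above is shown below. The zone limits and the robot interface are assumptions for illustration, not the required implementation.

```python
# Illustrative sketch: graduated response by a robot control system to a
# worker position reported by the wearable device. Zone limits and the
# robot methods (stop, limit_force, move_away_from) are assumed names.
class RobotControlSystem:
    STOP_ZONE_M = 0.5   # halt all motion inside this distance
    SLOW_ZONE_M = 1.5   # reduce force and move away inside this distance

    def __init__(self, robot):
        self.robot = robot

    def on_worker_update(self, distance_m, approach_vector):
        """Handle a position or movement update received from the wearable."""
        if distance_m < self.STOP_ZONE_M:
            self.robot.stop()
        elif distance_m < self.SLOW_ZONE_M:
            self.robot.limit_force(0.2)                  # cap exerted force
            self.robot.move_away_from(approach_vector)   # change direction of motion
        # otherwise, continue normal operation
```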
In addition, the wearable device may communicate information to the worker. For example, where the wearable device is an earpiece or where there is a set of earpieces, an audible alert may be communicated to the worker to alert the worker 4 of the proximity of the robot 2. The closer the robot 2, the louder the sound may be. In addition, the sounds produced by the earpiece may be three-dimensionally shaped so that the worker perceives the sound as coming from a particular location associated with the actual location of the robot.
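The following is a minimal sketch, under assumed audio parameters, of scaling alert loudness with robot proximity and panning the alert toward the robot's bearing. The play_alert() call and its parameters are hypothetical and are used only to illustrate the idea.

```python
# Illustrative sketch: alert gain grows as the robot approaches, and a simple
# stereo balance places the sound toward the robot's bearing. play_alert() is
# a hypothetical earpiece audio call.
import math

def alert_parameters(distance_m, bearing_rad, max_distance_m=5.0):
    """Return (gain, balance) for the audible alert: louder when closer,
    panned left/right according to the robot's bearing from the worker."""
    gain = max(0.0, 1.0 - distance_m / max_distance_m)  # 1.0 near, 0.0 far
    balance = math.sin(bearing_rad)                     # -1.0 left ... +1.0 right
    return gain, balance

gain, balance = alert_parameters(distance_m=1.2, bearing_rad=math.radians(45))
# play_alert(gain=gain, balance=balance)  # hypothetical earpiece audio call
```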
Thus, a robot may track movement of a human worker who is wearing one or more wearable devices. This may be accomplished by calibrating the position of the human worker relative to the robot and then tracking changes in movement of the human worker by examining sensor data associated with one or more of the inertial sensors. Thus, as the person moves and inertial sensor data is reported, the robot and/or its control system may update the location of the person relative to the robot.
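As an illustrative sketch of this calibrate-then-track approach, the following dead-reckoning example integrates inertial samples from a calibrated starting position. A practical implementation would also require orientation tracking and drift correction; the class and its interface are assumed for illustration.

```python
# Illustrative dead-reckoning sketch: the worker's position is calibrated
# relative to the robot, then updated from inertial sensor samples.
class WorkerTracker:
    def __init__(self, calibrated_position):
        self.position = list(calibrated_position)  # meters, in the robot's frame
        self.velocity = [0.0, 0.0, 0.0]            # meters per second

    def on_inertial_sample(self, accel, dt):
        """Integrate one gravity-compensated accelerometer sample over dt seconds."""
        for axis in range(3):
            self.velocity[axis] += accel[axis] * dt
            self.position[axis] += self.velocity[axis] * dt
        return tuple(self.position)  # reported to the robot control system
```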
The sensor(s) 32 may also include one or more contact sensors 72, one or more bone conduction microphones 71, one or more air conduction microphones 70, one or more chemical sensors 79, a pulse oximeter 76, a temperature sensor 80, or other physiological or biological sensor(s). Further examples of physiological or biological sensors include an alcohol sensor 83, glucose sensor 85, or bilirubin sensor 87. Other examples of physiological or biological sensors may also be included in the device. These may include a blood pressure sensor 82, an electroencephalogram (EEG) 84, an Adenosine Triphosphate (ATP) sensor, a lactic acid sensor 88, a hemoglobin sensor 90, a hematocrit sensor 92 or other biological or chemical sensor. The various sensors shown may be used to collect information regarding worker health to further improve worker safety by alerting the worker or others when a health issue is determined.
A spectrometer 16 is also shown. The spectrometer 16 may be an infrared (IR) through ultraviolet (UV) spectrometer although it is contemplated that any number of wavelengths in the infrared, visible, or ultraviolet spectrums may be detected. The spectrometer 16 is preferably adapted to measure environmental wavelengths for analysis and recommendations and thus preferably is located on or at the external facing side of the device.
A gesture control interface 36 is also operatively connected to or integrated into the intelligent control system 30. The gesture control interface 36 may include one or more emitters 82 and one or more detectors 84 for sensing user gestures. The emitters may be of any number of types, including infrared LEDs. The device may include a transceiver 35 which may allow for induction transmissions such as through near field magnetic induction. A short range transceiver 34 using Bluetooth, BLE, UWB, or other means of radio communication may also be present. The short range transceiver 34 may be used to communicate with the robot control system 40. In operation, the intelligent control system 30 may be configured to convey different information using one or more of the LED(s) 20 based on context or mode of operation of the device. The various sensors 32, the intelligent control system 30, and other electronic components may be located on the printed circuit board of the device. One or more speakers 73 may also be operatively connected to the intelligent control system 30.
It is also to be understood that the same sensors or types of sensors used for the gesture control interface 36 may be used to create a field surrounding a wearable device and to detect intrusions into the field, such as from a robot. Thus, LED, ultrasound, capacitive, or other fields may be created which extend outwardly from a wearable device associated with a worker to detect the presence of a robot.
A field emitter and detector 37 may also be operatively connected to the intelligent control system 30 to generate an electromagnetic field or other type of field surrounding a user which a robot would interfere with if the robot were too close to the user. The field may be emitted at the field emitter/detector 37, and disruptions in the field may be detected and then communicated to and interpreted by the intelligent control system 30. For purposes of determining changes in a field, it is contemplated that other types of fields may be used, such as capacitive fields, ultrasonic fields, or other types of fields which may be disrupted by the presence of a robot nearby.
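As a simplified illustration of detecting a disruption in an emitted field, the following sketch compares the current detector reading against a baseline reading taken when no intrusion is present. The baseline, the fractional threshold, and the class interface are assumptions for illustration.

```python
# Simplified sketch of field-disruption detection. A reading that deviates
# sufficiently from the no-intrusion baseline is treated as a nearby robot.
class FieldMonitor:
    def __init__(self, baseline_level, threshold=0.15):
        self.baseline = baseline_level  # detector reading with no intrusion present
        self.threshold = threshold      # fractional change treated as an intrusion

    def is_disrupted(self, current_level):
        """True when the detector reading deviates sufficiently from the baseline."""
        change = abs(current_level - self.baseline) / self.baseline
        return change > self.threshold
```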
In some embodiments, there are multiple ways to track the relative positions of the person wearing the wearable device and the manufacturing robot. For example, the earpiece may use inertial sensor measurements to keep track of position, which may be communicated to the industrial robot. In addition, the earpiece may monitor changes in a field associated with the user in order to determine the position of the industrial robot. This provides additional safeguards to assist in preventing accidents and injuries and thus is a further advantage. Also, where a person wears multiple wearable devices (such as two earpieces), additional tracking may be performed independently for each wearable device.
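Purely for illustration, the two safeguards may be combined as in the following sketch, where an alert condition is raised if either the inertial position estimate or the field monitor indicates that the robot is too close. All names and the threshold are assumed.

```python
# Illustrative redundancy check combining the inertial distance estimate
# with the field-disruption signal; either one can trigger a response.
def robot_too_close(inertial_distance_m, field_disrupted, threshold_m=1.0):
    """True if either independent safeguard indicates unsafe proximity."""
    return inertial_distance_m < threshold_m or field_disrupted
```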
The earpieces shown have additional utility in a manufacturing environment. For example, where there are loud noises it may be beneficial to wear the earpieces to protect a worker from the loud noises. Here, the earpieces may be configured to capture and reproduce ambient sounds to the operator. This may be accomplished by using one or more microphones on the earpieces to detect ambient sound and then to re-create the ambient sound at one or more speakers of the earpiece. Thus, even though the operator is wearing earpieces there is audio transparency. In addition, as previously explained, because the earpieces may be inserted into the external auditory canal, speakers within the earpiece may be used to allow sound to be shaped so that the sounds are perceived three-dimensionally.
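As a minimal sketch of the audio transparency described above, the following example copies microphone samples directly to the speaker output using the sounddevice library, a choice made here only for illustration; the specification does not name a particular audio library or API.

```python
# Minimal audio-transparency sketch: ambient sound captured by the earpiece
# microphone is reproduced at the earpiece speaker.
import sounddevice as sd

def passthrough(indata, outdata, frames, time, status):
    outdata[:] = indata  # copy captured ambient sound to the speaker output

# Example usage (requires audio hardware):
# with sd.Stream(channels=1, callback=passthrough):
#     sd.sleep(10_000)  # run pass-through for ten seconds
```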
Therefore various apparatus, methods, and systems have been shown and described for improving worker safety, particularly when humans are working collaboratively with robots. It should be appreciated, however, that various apparatus, methods, and systems may be used in other applications and other environments.
This application claims priority to U.S. Provisional Patent Application 62/261,779, filed on Dec. 1, 2015, and entitled "Robotic Safety Using Wearables," which is hereby incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
3934100 | Harada | Jan 1976 | A |
4150262 | Ono | Apr 1979 | A |
4334315 | Ono et al. | Jun 1982 | A |
4375016 | Harada | Feb 1983 | A |
4588867 | Konomi | May 1986 | A |
4654883 | Iwata | Mar 1987 | A |
4682180 | Gans | Jul 1987 | A |
4791673 | Schreiber | Dec 1988 | A |
4865044 | Wallace et al. | Sep 1989 | A |
5191602 | Regen et al. | Mar 1993 | A |
5201007 | Ward et al. | Apr 1993 | A |
5280524 | Norris | Jan 1994 | A |
5295193 | Ono | Mar 1994 | A |
5298692 | Ikeda et al. | Mar 1994 | A |
5343532 | Shugart | Aug 1994 | A |
5363444 | Norris | Nov 1994 | A |
5497339 | Bernard | Mar 1996 | A |
5606621 | Reiter et al. | Feb 1997 | A |
5613222 | Guenther | Mar 1997 | A |
5692059 | Kruger | Nov 1997 | A |
5721783 | Anderson | Feb 1998 | A |
5749072 | Mazurkiewicz et al. | May 1998 | A |
5771438 | Palermo et al. | Jun 1998 | A |
5802167 | Hong | Sep 1998 | A |
5929774 | Charlton | Jul 1999 | A |
5933506 | Aoki et al. | Aug 1999 | A |
5949896 | Nageno et al. | Sep 1999 | A |
5987146 | Pluvinage et al. | Nov 1999 | A |
6021207 | Puthuff et al. | Feb 2000 | A |
6054989 | Robertson et al. | Apr 2000 | A |
6081724 | Wilson | Jun 2000 | A |
6094492 | Boesen | Jul 2000 | A |
6111569 | Brusky et al. | Aug 2000 | A |
6112103 | Puthuff | Aug 2000 | A |
6157727 | Rueda | Dec 2000 | A |
6167039 | Karlsson et al. | Dec 2000 | A |
6181801 | Puthuff et al. | Jan 2001 | B1 |
6208372 | Barraclough | Mar 2001 | B1 |
6275789 | Moser et al. | Aug 2001 | B1 |
6339754 | Flanagan et al. | Jan 2002 | B1 |
6408081 | Boesen | Jun 2002 | B1 |
D464039 | Boesen | Oct 2002 | S |
6470893 | Boesen | Oct 2002 | B1 |
D468299 | Boesen | Jan 2003 | S |
D468300 | Boesen | Jan 2003 | S |
6542721 | Boesen | Apr 2003 | B2 |
6560468 | Boesen | May 2003 | B1 |
6654721 | Handelman | Nov 2003 | B2 |
6664713 | Boesen | Dec 2003 | B2 |
6694180 | Boesen | Feb 2004 | B1 |
6718043 | Boesen | Apr 2004 | B1 |
6738485 | Boesen | May 2004 | B1 |
6748095 | Goss | Jun 2004 | B1 |
6754358 | Boesen et al. | Jun 2004 | B1 |
6784873 | Boesen et al. | Aug 2004 | B1 |
6823195 | Boesen | Nov 2004 | B1 |
6852084 | Boesen | Feb 2005 | B1 |
6879698 | Boesen | Apr 2005 | B2 |
6892082 | Boesen | May 2005 | B2 |
6920229 | Boesen | Jul 2005 | B2 |
6952483 | Boesen et al. | Oct 2005 | B2 |
6987986 | Boesen | Jan 2006 | B2 |
7136282 | Rebeske | Nov 2006 | B1 |
7203331 | Boesen | Apr 2007 | B2 |
7209569 | Boesen | Apr 2007 | B2 |
7215790 | Boesen et al. | May 2007 | B2 |
7463902 | Boesen | Dec 2008 | B2 |
7508411 | Boesen | Mar 2009 | B2 |
7983628 | Boesen | Jul 2011 | B2 |
8140357 | Boesen | Mar 2012 | B1 |
20010005197 | Mishra et al. | Jun 2001 | A1 |
20010027121 | Boesen | Oct 2001 | A1 |
20010056350 | Calderone et al. | Dec 2001 | A1 |
20020002413 | Tokue | Jan 2002 | A1 |
20020007510 | Mann | Jan 2002 | A1 |
20020010590 | Lee | Jan 2002 | A1 |
20020030637 | Mann | Mar 2002 | A1 |
20020046035 | Kitahara et al. | Apr 2002 | A1 |
20020057810 | Boesen | May 2002 | A1 |
20020076073 | Taenzer et al. | Jun 2002 | A1 |
20020118852 | Boesen | Aug 2002 | A1 |
20030065504 | Kraemer et al. | Apr 2003 | A1 |
20030100331 | Dress et al. | May 2003 | A1 |
20030104806 | Ruef et al. | Jun 2003 | A1 |
20030115068 | Boesen | Jun 2003 | A1 |
20030125096 | Boesen | Jul 2003 | A1 |
20030218064 | Conner et al. | Nov 2003 | A1 |
20040070564 | Dawson et al. | Apr 2004 | A1 |
20040160511 | Boesen | Aug 2004 | A1 |
20050043056 | Boesen | Feb 2005 | A1 |
20050125320 | Boesen | Jun 2005 | A1 |
20050148883 | Boesen | Jul 2005 | A1 |
20050165663 | Razumov | Jul 2005 | A1 |
20050196009 | Boesen | Sep 2005 | A1 |
20050251455 | Boesen | Nov 2005 | A1 |
20050266876 | Boesen | Dec 2005 | A1 |
20060029246 | Boesen | Feb 2006 | A1 |
20060074671 | Farmaner et al. | Apr 2006 | A1 |
20060074808 | Boesen | Apr 2006 | A1 |
20080254780 | Kuhl et al. | Oct 2008 | A1 |
20090017881 | Madrigal | Jan 2009 | A1 |
20090072631 | Iida | Mar 2009 | A1 |
20100320961 | Castillo et al. | Dec 2010 | A1 |
20110286615 | Olodort et al. | Nov 2011 | A1 |
20140270227 | Swanson | Sep 2014 | A1 |
20140277726 | Nakamura | Sep 2014 | A1 |
20160062345 | Stubbs | Mar 2016 | A1 |
20160274586 | Stubbs | Sep 2016 | A1 |
Number | Date | Country |
---|---|---|
202014100411 | May 2015 | DE |
1017252 | Jul 2000 | EP |
1435737 | Jul 2004 | EP |
2903186 | Aug 2015 | EP |
2783811 | Apr 2016 | EP |
2074817 | Apr 1981 | GB |
S61173886 | Aug 1986 | JP |
06292195 | Oct 1998 | JP |
2006043862 | Feb 2006 | JP |
20050006702 | Jan 2005 | KR |
2011085441 | Jul 2011 | WO |
2014043179 | Mar 2014 | WO |
2015110577 | Jul 2015 | WO |
2015110587 | Jul 2015 | WO |
Entry |
---|
Announcing the $3,333,333 Stretch Goal (Feb. 24, 2014). |
BRAGI is on Facebook (2014). |
BRAGI Update—Arrival of Prototype Chassis Parts—More People—Awesomeness (May 13, 2014). |
BRAGI Update—Chinese New Year, Design Verification, Charging Case, More People, Timeline (Mar. 6, 2015). |
BRAGI Update—First Sleeves From Prototype Tool—Software Development Kit (Jun. 5, 2014). |
BRAGI Update—Let's Get Ready to Rumble, A Lot to Be Done Over Christmas (Dec. 22, 2014). |
BRAGI Update—Memories From April—Update on Progress (Sep. 16, 2014). |
BRAGI Update—Memories from May—Update on Progress—Sweet (Oct. 13, 2014). |
BRAGI Update—Memories From One Month Before Kickstarter—Update on Progress (Jul. 10, 2014). |
BRAGI Update—Memories From the First Month of Kickstarter—Update on Progress (Aug. 1, 2014). |
BRAGI Update—Memories From the Second Month of Kickstarter—Update on Progress (Aug. 22, 2014). |
BRAGI Update—New People @BRAGI—Prototypes (Jun. 26, 2014). |
BRAGI Update—Office Tour, Tour to China, Tour to CES (Dec. 11, 2014). |
BRAGI Update—Status on Wireless, Bits and Pieces, Testing—Oh Yeah, Timeline (Apr. 24, 2015). |
BRAGI Update—The App Preview, The Charger, The SDK, BRAGI Funding and Chinese New Year (Feb. 11, 2015). |
BRAGI Update—What We Did Over Christmas, Las Vegas & CES (Jan. 19, 2014). |
BRAGI Update—Years of Development, Moments of Utter Joy and Finishing What We Started (Jun. 5, 2015). |
BRAGI Update—Alpha 5 and Back to China, Backer Day, on Track (May 16, 2015). |
BRAGI Update—Beta2 Production and Factory Line (Aug. 20, 2015). |
BRAGI Update—Certifications, Production, Ramping Up. |
BRAGI Update—Developer Units Shipping and Status (Oct. 5, 2015). |
BRAGI Update—Developer Units Started Shipping and Status (Oct. 19, 2015). |
BRAGI Update—Developer Units, Investment, Story and Status (Nov. 2, 2015). |
BRAGI Update—Getting Close (Aug. 6, 2014). |
BRAGI Update—On Track, Design Verification, How It Works and What's Next (Jul. 15, 2015). |
BRAGI Update—On Track, on Track and Gems Overview. |
BRAGI Update—Status on Wireless, Supply, Timeline and Open House @BRAGI (Apr. 1, 2015). |
BRAGI Update—Unpacking Video, Reviews on Audio Perform and Boy Are We Getting Close (Sep. 10, 2015). |
Last Push Before The Kickstarter Campaign Ends on Monday 4pm CET (Mar. 28, 2014). |
Staab, Wayne J., et al., "A One-Size Disposable Hearing Aid is Introduced", The Hearing Journal 53(4):36-41, Apr. 2000. |
Stretchgoal—It's Your Dash (Feb. 14, 2014). |
Stretchgoal—The Carrying Case for the Dash (Feb. 12, 2014). |
Stretchgoal—Windows Phone Support (Feb. 17, 2014). |
The Dash + The Charging Case & The BRAGI News (Feb. 21, 2014). |
The Dash—A Word From Our Software, Mechanical and Acoustics Team + An Update (Mar. 11, 2014). |
Update From BRAGI—$3,000,000—Yipee (Mar. 22, 2014). |
International Search Report & Written Opinion, PCT/EP16/79226, (dated Jun. 29, 2017), 27 Pages. |
Number | Date | Country | |
---|---|---|---|
20170151668 A1 | Jun 2017 | US |
Number | Date | Country | |
---|---|---|---|
62261779 | Dec 2015 | US |