Negotiation-based human-robot collaboration via augmented reality

Information

  • Patent Grant
  • Patent Number
    11,958,183
  • Date Filed
    Friday, September 18, 2020
  • Date Issued
    Tuesday, April 16, 2024
  • Inventors
    • Zhang; Shiqi (Vestal, NY, US)
    • Chandan; Kishan Dhananjay (Binghamton, NY, US)
  • Original Assignees
  • Examiners
    • Sample; Jonathan L
  • Agents
    • Hoffberg & Associates
    • Hoffberg; Steven M.
Abstract
Effective human-robot collaboration (HRC) requires extensive communication among the human and robot teammates, because their actions can potentially produce conflicts, synergies, or both. An augmented reality-driven, negotiation-based (ARN) framework is provided for HRC, where ARN supports planning-phase negotiations within human-robot teams. Experiments were conducted in an office environment in which multiple mobile robots work on delivery tasks; the robots could not complete the tasks on their own and sometimes needed help from their human teammate, making human-robot collaboration necessary. In comparison to a non-AR baseline, ARN significantly improved the human users' work efficiency and reduced their cognitive load, while also reducing the overall task completion time of the whole team.
Description
FIELD OF THE INVENTION

The present application relates to the field of robotics, and more particularly to a system and method for controlling a robot using an augmented reality user interface.


BACKGROUND OF THE INVENTION

Robots are increasingly ubiquitous in everyday environments, but few of them collaborate or even communicate with people during their work time. For instance, the work zones for Amazon's warehouse robots and people are completely separated in their fulfillment centers, and there is no direct human-robot communication at runtime except for object handovers or people wearing a “Tech Vest” (Wurman, D'Andrea, and Mountz 2008).


Another notable example is the Relay robots from Savioke that have completed more than 300,000 deliveries in hotels, hospitals, and logistics facilities (Ivanov, Webster, and Berezina 2017). Their robots work in human presence, but the human-robot interaction does not go beyond avoiding each other as obstacles until the moment of delivery. Despite the significant achievements in multi-agent systems (Wooldridge 2009), human-robot collaboration (HRC), as a kind of multi-agent system, is still rare in practice.


Augmented Reality (AR) focuses on overlaying information in an augmented layer over the real environment to make objects interactive (Azuma et al. 2001). On the one hand, AR has promising applications in robotics, and people can visualize the state of the robot in a visually enhanced form while giving feedback at runtime (Green et al. 2007). On the other hand, there are a number of collaboration algorithms developed for multiagent systems (MAS) (Wooldridge 2009; Stone and Veloso 2000), where a human-robot team is a kind of MAS. Despite the existing research on AR in robotics and multiagent systems, few have leveraged AR for HRC.


See, US 20190254754; 20190254753; 20190236399; 20190235492; 20190231436; 20190088162; 20180304461; 20180285697; 20180085616; 20180081439; 20180061137; 20180055326; 20180055312; 20180048876; 20180009059; 20170232615; 20170165841; 20170075116; 20160158937; 20160140729; 20160055677; 20160004306; 20150369864; 20150323990; 20150290803; 20150290802; 20140333668; 20140293014; 20140277737; 20140275760; 20140241617; 20140241612; 20140081459; 20130346348; 20130345870; 20130343640; 20120194644; 20110306986; 20110164116; 20110164030; 20110128300; 20030179308; U.S. Pat. Nos. 10,413,994; 10,373,389; 10,368,784; 10,366,510; 10,052,765; 9,977,496; 9,964,765; 9,940,553; 9,916,006; 9,794,541; 9,767,608; 9,751,015; 9,701,015; 9,669,544; 9,434,072; 9,092,698; 8,885,022; 8,817,078; 8,803,951; 8,711,206.


When humans and robots work in a shared environment, it is vital that they communicate with each other to avoid conflicts, leverage complementary capabilities, and facilitate the smooth accomplishment of tasks. However, humans and robots prefer different communication modalities. While humans employ natural language, body language, gestures, written communication, etc., robots need information in a digital form, e.g., text-based commands. Researchers have developed algorithms to bridge the human-robot communication gap using natural language (Tellex et al. 2011; Chai et al. 2014; Thomason et al. 2015; Matuszek et al. 2013) and vision (Waldherr, Romero, and Thrun 2000; Nickel and Stiefelhagen 2007; Yang, Park, and Lee 2007). Despite those successes, AR has unique advantages in improving coordination by communicating spatial information, e.g., through which door a robot is coming into a room and how (i.e., the planned trajectory), when people and robots share a physical environment (Azuma 1997).


One way of delivering spatial information related to the local environment is by projecting the robot's state and motion intent to humans using visual cues (Park and Kim 2009; Watanabe et al. 2015; Reinhart, Vogl, and Kresse 2007). For instance, researchers used an LED projector attached to the robot to show its planned motion trajectory, allowing the human partner to respond to the robot's plan to avoid possible collisions (Chadalavada et al. 2015). While such systems facilitate human-robot communication about spatial information, they require the human to be in close proximity to the robot. Also, bidirectional communication is difficult in projection-based systems.


Early research on AR for human-robot interaction (HRI) produced a system called ARGOS that allows a human operator to interactively plan and optimize robot trajectories (Milgram et al. 1993). More recently, researchers have developed frameworks to help human operators visualize the motion-level intentions of unmanned aerial vehicles (UAVs) using AR (Walker et al. 2018; Hedayati, Walker, and Szafir 2018). In another line of research, people used an AR interface to help humans visualize a robot arm's planned actions in car assembly tasks (Amor et al. 2018). However, the communication of those systems is unidirectional, i.e., their methods only convey robot intention to the human and do not support communication in the other direction.


See, U.S. Pat. Nos. 4,163,183; 4,260,940; 4,260,941; 4,754,402; 4,789,940; 4,940,925; 4,982,329; 5,046,022; 5,280,431; 5,342,283; 5,375,059; 5,390,125; 5,438,517; 5,548,516; 5,555,503; 5,579,444; 5,610,815; 5,612,883; 5,615,116; 5,629,855; 5,640,323; 5,646,843; 5,646,845; 5,648,901; 5,657,226; 5,680,306; 5,680,313; 5,684,696; 5,838,562; 5,956,250; 5,969,973; 5,983,161; 5,985,214; 6,042,555; 6,099,457; 6,122,572; 6,169,981; 6,233,545; 6,252,544; 6,275,773; 6,292,830; 6,341,243; 6,341,372; 6,360,193; 6,413,229; 6,429,812; 6,468,800; 6,472,218; 6,487,500; 6,507,767; 6,574,628; 6,580,967; 6,581,048; 6,666,811; 6,678,577; 6,685,884; 6,748,325; 6,791,472; 6,842,674; 6,842,692; 6,845,294; 6,890,485; 6,898,484; 6,904,335; 6,965,816; 6,988,026; 7,033,781; 7,047,861; 7,054,718; 7,072,764; 7,099,747; 7,103,460; 7,168,748; 7,236,861; 7,268,700; 7,269,479; 7,271,737; 7,298,289; 7,330,844; 7,343,222; 7,383,107; 7,386,163; 7,415,321; 7,421,321; 7,528,835; 7,558,156; 7,590,589; 7,649,331; 7,662,128; 7,706,918; 7,720,777; 7,742,845; 7,751,928; 7,756,615; 7,765,029; 7,765,038; 7,774,243; 7,835,778; 7,865,267; 7,873,471; 7,881,824; 7,904,182; 7,949,428; 7,970,476; 7,991,576; 8,010,180; 8,052,857; 8,112,176; 8,121,618; 8,126,642; 8,139,109; 8,145,295; 8,157,205; 8,160,680; 8,180,436; 8,195,343; 8,195,358; 8,195,599; 8,200,428; 8,213,261; 8,221,322; 8,229,163; 8,229,618; 8,237,775; 8,244,327; 8,244,469; 8,255,092; 8,280,623; 8,306,650; 8,340,823; 8,373,582; 8,374,721; 8,392,065; 8,412,449; 8,414,356; 8,419,804; 8,422,994; 8,447,440; 8,447,524; 8,457,830; 8,467,779; 8,467,928; 8,478,493; 8,483,876; 8,485,861; 8,489,115; 8,512,219; 8,512,415; 8,518,031; 8,521,257; 8,538,673; 8,568,363; 8,577,126; 8,577,538; 8,583,286; 8,583,313; 8,600,830; 8,612,052; 8,629,789; 8,630,763; 8,660,642; 8,666,587; 8,682,309; 8,682,726; 8,694,092; 8,706,185; 8,706,186; 8,706,298; 8,706,394; 8,712,686; 8,725,292; 8,727,987; 8,737,986; 8,761,931; 8,781,629; 8,784,384; 8,784,385; 8,798,932; 8,798,933; 8,818,567; 8,822,924; 8,831,863; 8,834,488; 8,842,176; 8,849,441; 8,858,912; 8,864,846; 8,874,162; 8,874,477; 8,900,325; 8,911,499; 8,914,182; 8,918,209; 8,920,332; 8,930,019; 8,935,119; 8,936,629; 8,939,056; 8,945,017; 8,947,531; 8,948,832; 8,956,303; 8,965,677; 8,965,688; 8,965,730; 8,968,332; 8,972,177; 8,989,972; 8,998,815; 9,002,426; 9,005,129; 9,008,962; 9,014,790; 9,014,848; 9,020,617; 9,021,024; 9,040,087; 9,044,149; 9,044,171; 9,046,373; 9,046,892; 9,056,676; 9,060,678; 9,066,211; 9,075,146; 9,079,060; 9,089,968; 9,098,079; 9,113,794; 9,113,795; 9,129,532; 9,131,529; 9,139,310; 9,151,633; 9,168,392; 9,168,419; 9,177,476; 9,183,560; 9,188,980; 9,198,563; 9,198,604; 9,199,725; 9,211,201; 9,220,086; 9,220,917; 9,221,177; 9,228,859; 9,234,744; 9,237,855; 9,248,982; 9,251,393; 9,253,753; 9,261,376; 9,282,902; 9,292,936; 9,300,423; 9,302,122; 9,302,783; 9,307,917; 9,311,670; 9,333,042; 9,345,387; 9,345,592; 9,346,560; 9,351,106; 9,351,856; 9,358,975; 9,361,797; 9,383,752; 9,389,085; 9,392,920; 9,400,503; 9,402,552; 9,408,530; 9,410,979; 9,412,278; 9,413,852; 9,420,203; 9,420,432; 9,429,657; 9,429,661; 9,431,006; 9,440,545; 9,443,192; 9,445,711; 9,448,072; 9,456,787; 9,457,915; 9,459,273; 9,459,626; 9,465,390; 9,467,834; 9,468,349; 9,470,529; 9,470,702; 9,477,230; 9,480,534; 9,486,921; 9,488,987; 9,489,655; 9,491,589; 9,494,432; 9,497,380; 9,498,649; 9,504,408; 9,507,346; 9,511,799; 9,513,132; 9,517,668; 9,517,767; 9,519,882; 9,520,638; 9,527,586; 9,527,587; 9,538,892; 9,539,117; 9,540,043; 9,540,102; 9,541,383; 9,542,600; 9,543,636; 
9,551,582; 9,554,922; 9,557,742; 9,561,794; 9,561,941; 9,567,074; 9,568,492; 9,572,533; 9,574,883; 9,582,720; 9,586,316; 9,588,195; 9,592,851; 9,593,957; 9,597,014; 9,599,632; 9,599,990; 9,600,138; 9,605,952; 9,606,539; 9,609,107; 9,612,123; 9,612,403; 9,616,252; 9,623,562; 9,630,619; 9,632,502; 9,638,829; 9,641,239; 9,643,316; 9,645,159; 9,645,577; 9,646,614; 9,649,767; 9,651,368; 9,655,548; 9,658,239; 9,661,827; 9,662,053; 9,671,418; 9,671,566; 9,682,481; 9,683,858; 9,687,377; 9,701,239; 9,703,295; 9,714,954; 9,720,415; 9,721,471; 9,726,686; 9,731,853; 9,734,220; 9,734,367; 9,734,455; 9,734,632; 9,736,655; 9,744,672; 9,746,330; 9,747,809; 9,750,977; 9,754,226; 9,754,419; 9,754,490; 9,760,093; 9,769,365; 9,775,681; 9,776,326; 9,776,716; 9,785,911; 9,786,202; 9,791,866; 9,792,613; 9,795,445; 9,798,329; 9,801,527; 9,802,661; 9,804,599; 9,805,372; 9,805,607; 9,818,136; 9,827,683; 9,830,485; 9,832,749; 9,835,637; 9,839,552; 9,840,003; 9,857,170; 9,860,129; 9,861,075; 9,869,484; 9,870,566; 9,873,196; 9,878,664; 9,880,561; 9,881,497; 9,888,105; 9,901,408; 9,902,069; 9,907,721; 9,910,441; 9,911,020; 9,916,002; 9,916,010; 9,916,703; 9,919,360; 9,927,807; 9,928,649; 9,931,573; 9,931,697; 9,937,621; 9,939,817; 9,940,604; 9,945,677; 9,947,230; 9,952,042; 9,952,591; 9,958,864; 9,958,875; 9,965,730; 9,968,280; 9,972,137; 9,975,249; 9,978,013; 9,980,630; 9,983,584; 9,984,339; 10001499; 10,001,780; 10,002,471; 10,002,537; 10,011,012; 10,012,996; 10,013,610; 10,015,466; 10,022,867; 10,023,393; 10,025,886; 10,030,988; 10,034,066; 10,048,683; 10,051,411; 10,059,467; 10,061,325; 10,061,328; 10,065,309; 10,065,317; 10,068,470; 10,070,974; 10,078,136; 10,080,672; 10,081,104; 10,082,397; 10,083,406; 10,099,391; 10,105,244; 10,106,283; 10,108,194; 10,122,995; 10,123,181; 10,126,136; 10,126,757; 10,127,816; 10,133,990; 10,137,011; 10,144,591; 10,147,069; 10,153,537; 10,159,218; 10,162,353; 10,162,355; 10,168,704; 10,172,409; 10,178,445; 10,178,973; 10,188,472; 10,191,495; 10,192,113; 10,194,836; 10,194,858; 10,203,762; 10,207,404; 10,209,365; 10,212,876; 10,216,195; 10,218,433; 10,222,215; 10,225,025; 10,228,242; 10,230,745; 10,231,790; 10,239,208; 10,239,740; 10,243,379; 10,248,119; 10,251,805; 10,252,335; 10,254,499; 10,254,536; 10,255,719; 10,255,723; 10,259,514; 10,260,890; 10,262,213; 10,264,586; 10,265,859; 10,265,871; 10,267,970; 10,269,108; 10,270,789; 10,277,885; 10,279,906; 10,283,110; 10,285,828; 10,288,419; 10,291,334; 10,293,485; 10,295,338; 10,296,012; 10,296,995; 10,300,601; 10,300,603; 10,303,166; 10,303,174; 10,307,199; 10,307,272; 10,309,792; 10,310,517; 10,311,731; 10,320,610; 10,326,689; 10,327,674; 10,328,575; 10,328,578; 10,330,440; 10,331,129; 10,334,050; 10,334,164; 10,334,906; 10,335,004; 10,336,543; 10,338,391; 10,338,594; 10,352,693; 10,353,388; 10,353,532; 10,354,441; 10,358,057; 10,359,783; 10,360,799; 10,362,057; 10,363,657; 10,363,826; 10,365,656; 10,365,716; 10,366,289; 10,366,508; 10,368,249; 10,369,974; 10,372,132; 10,372,721; 10,375,289; 10,375,517; 10,377,040; 10,377,375; 10,379,007; 10,379,539; 10,382,889; 10,382,975; 10,384,351; 10,386,857; 10,389,037; 10,391,632; 10,394,246; 10,395,117; 10,395,434; 10,397,802; 10,398,520; 10,399,443; 10,401,852; 10,401,864; 10,402,731; 10,406,687; 10,408,613; 10,409,284; 10,410,328; 10,411,356; 10,414,052; 10,414,395; 10,416,668; 20020005614; 20020012611; 20020016647; 20020022927; 20020073101; 20020184236; 20020198623; 20020198697; 20030093187; 20030199944; 20030208302; 20040006422; 20040006566; 20040013295; 20040019402; 20040030448; 
20040030449; 20040030450; 20040030451; 20040030570; 20040030571; 20040030741; 20040068351; 20040068415; 20040068416; 20040077090; 20040103740; 20040107021; 20040128028; 20040130442; 20040133168; 20040134336; 20040134337; 20040162638; 20040242953; 20040267442; 20050005266; 20050065649; 20050071043; 20050125099; 20050131581; 20050149251; 20050183569; 20050187677; 20050191670; 20050192727; 20050215764; 20050237188; 20050240253; 20050240307; 20050249667; 20050251291; 20050273218; 20060064202; 20060095171; 20060097683; 20060142657; 20060161218; 20060161405; 20060167784; 20060184272; 20060229801; 20060241368; 20060241718; 20070007384; 20070010898; 20070018890; 20070027612; 20070039831; 20070055662; 20070063875; 20070070072; 20070087756; 20070100780; 20070124024; 20070150565; 20070159924; 20070219933; 20070220637; 20070239315; 20070250119; 20070262860; 20080009772; 20080027591; 20080059015; 20080072139; 20080167771; 20080226498; 20080228239; 20080270097; 20080300777; 20080312561; 20090000626; 20090000627; 20090012531; 20090012532; 20090037033; 20090043504; 20090073034; 20090087029; 20090148035; 20090152391; 20090178597; 20090306741; 20090312808; 20090312817; 20090326604; 20100017046; 20100042258; 20100056900; 20100076631; 20100076737; 20100106356; 20100114633; 20100149917; 20100161232; 20100235285; 20100268382; 20100286791; 20100286824; 20100312387; 20100312388; 20100317420; 20110002194; 20110004513; 20110022230; 20110077775; 20110082717; 20110118855; 20110130114; 20110231016; 20110231050; 20110288684; 20120069131; 20120072023; 20120075072; 20120084839; 20120101680; 20120109150; 20120130632; 20120149353; 20120166024; 20120173018; 20120182392; 20120185094; 20120209432; 20120215354; 20120274775; 20120290152; 20120310112; 20120316725; 20130079693; 20130131985; 20130165070; 20130166195; 20130166202; 20130166387; 20130196300; 20130212420; 20130238183; 20130252586; 20130274986; 20130279392; 20130279393; 20130279491; 20130303847; 20130320212; 20130335273; 20140039298; 20140067188; 20140100693; 20140100697; 20140155087; 20140155098; 20140156806; 20140163664; 20140187913; 20140193040; 20140214259; 20140244035; 20140263989; 20140264047; 20140266939; 20140268601; 20140273858; 20140275850; 20140275852; 20140275854; 20140276119; 20140278220; 20140278229; 20140278634; 20140288390; 20140288391; 20140288392; 20140288435; 20140288436; 20140288438; 20140297217; 20140297218; 20140303486; 20140305204; 20140316305; 20140316570; 20140347265; 20140356817; 20140358012; 20140374480; 20140378999; 20150010437; 20150016777; 20150019013; 20150019124; 20150025393; 20150025394; 20150032164; 20150032252; 20150037437; 20150051519; 20150073646; 20150081156; 20150081444; 20150094096; 20150106427; 20150118756; 20150122018; 20150127141; 20150158182; 20150168939; 20150173631; 20150192682; 20150196256; 20150201853; 20150201854; 20150202770; 20150223708; 20150230735; 20150234477; 20150235088; 20150235370; 20150235441; 20150235447; 20150241458; 20150241705; 20150241959; 20150242575; 20150242943; 20150243100; 20150243105; 20150243106; 20150247723; 20150247975; 20150247976; 20150248169; 20150248170; 20150248787; 20150248788; 20150248789; 20150248791; 20150248792; 20150248793; 20150256401; 20150268355; 20150273242; 20150273691; 20150276775; 20150286221; 20150290453; 20150290454; 20150301072; 20150308839; 20150309263; 20150309264; 20150314166; 20150332213; 20150343238; 20150353206; 20150355207; 20150355211; 20150360057; 20160018816; 20160023762; 20160025500; 20160025502; 20160026253; 20160036118; 20160039540; 20160039553; 20160039621; 
20160042151; 20160045841; 20160051169; 20160054135; 20160066844; 20160070436; 20160081575; 20160082597; 20160084869; 20160086108; 20160109954; 20160129592; 20160132059; 20160136284; 20160143500; 20160148433; 20160167582; 20160170414; 20160171884; 20160171893; 20160183818; 20160187166; 20160201933; 20160201934; 20160212402; 20160216117; 20160236582; 20160253844; 20160260322; 20160282126; 20160283774; 20160288905; 20160291593; 20160292403; 20160292696; 20160292869; 20160292872; 20160297429; 20160299506; 20160302706; 20160313739; 20160325143; 20160332748; 20160375592; 20160375779; 20160377381; 20160377508; 20160378111; 20160378117; 20160378861; 20160379074; 20170011210; 20170021497; 20170021502; 20170027523; 20170031369; 20170038579; 20170039764; 20170039859; 20170045893; 20170045894; 20170069214; 20170084983; 20170086698; 20170087301; 20170087381; 20170090478; 20170097506; 20170097507; 20170100837; 20170102711; 20170105592; 20170108871; 20170111223; 20170112392; 20170112407; 20170113352; 20170116477; 20170127652; 20170129603; 20170132931; 20170148213; 20170160398; 20170169713; 20170182657; 20170182664; 20170188864; 20170188893; 20170203446; 20170212511; 20170215086; 20170215381; 20170223034; 20170223037; 20170223046; 20170225321; 20170225332; 20170225334; 20170225336; 20170227965; 20170235316; 20170239719; 20170239720; 20170239721; 20170239752; 20170239891; 20170239892; 20170248966; 20170257162; 20170257778; 20170270361; 20170277193; 20170277194; 20170277195; 20170287337; 20170291093; 20170300540; 20170305015; 20170309069; 20170318360; 20170318477; 20170323563; 20170329347; 20170337826; 20170341231; 20170343695; 20170352192; 20170357270; 20170366751; 20170368684; 20170372618; 20180001476; 20180004213; 20180015347; 20180021097; 20180032949; 20180039287; 20180041907; 20180042526; 20180043547; 20180052276; 20180052277; 20180052320; 20180052451; 20180052501; 20180059297; 20180059304; 20180059672; 20180060764; 20180060765; 20180061243; 20180068255; 20180068358; 20180068567; 20180071949; 20180075649; 20180077902; 20180081439; 20180082308; 20180084242; 20180104829; 20180109223; 20180113468; 20180116898; 20180119534; 20180120856; 20180123291; 20180126460; 20180126461; 20180126462; 20180126649; 20180126650; 20180133801; 20180139364; 20180144558; 20180153084; 20180157336; 20180158236; 20180165974; 20180170392; 20180172450; 20180173242; 20180174357; 20180178376; 20180178382; 20180178663; 20180180421; 20180186067; 20180186080; 20180186081; 20180186082; 20180204111; 20180207791; 20180211263; 20180211441; 20180215039; 20180218619; 20180225597; 20180231972; 20180232668; 20180233054; 20180233856; 20180238164; 20180249343; 20180251135; 20180251234; 20180252535; 20180255465; 20180259976; 20180261023; 20180263170; 20180273296; 20180273297; 20180273298; 20180276891; 20180278920; 20180281191; 20180282065; 20180282066; 20180284735; 20180284736; 20180284737; 20180284741; 20180284742; 20180284743; 20180284744; 20180284745; 20180284746; 20180284747; 20180284749; 20180284752; 20180284753; 20180284754; 20180284755; 20180284756; 20180284757; 20180284758; 20180284786; 20180288303; 20180288586; 20180293536; 20180299878; 20180299882; 20180300835; 20180304468; 20180306587; 20180306589; 20180306591; 20180307241; 20180307941; 20180311822; 20180312824; 20180321666; 20180321667; 20180321672; 20180322197; 20180322779; 20180329425; 20180330293; 20180348761; 20180348764; 20180361586; 20180362158; 20180362190; 20180364724; 20180365603; 20180370046; 20180373320; 20180374266; 20190001492; 20190011921; 20190011932; 20190015167; 
20190020530; 20190023438; 20190025805; 20190025806; 20190025812; 20190025813; 20190033845; 20190033846; 20190033847; 20190033848; 20190033849; 20190033888; 20190034728; 20190034729; 20190034730; 20190034839; 20190041223; 20190041835; 20190041836; 20190041840; 20190041841; 20190041842; 20190041843; 20190041844; 20190041845; 20190041846; 20190041852; 20190049968; 20190051178; 20190051198; 20190056693; 20190064791; 20190064792; 20190073760; 20190077510; 20190079509; 20190079523; 20190079524; 20190079526; 20190079528; 20190080266; 20190080515; 20190080516; 20190082985; 20190086919; 20190086925; 20190086930; 20190086932; 20190086934; 20190086938; 20190088133; 20190092179; 20190092183; 20190092184; 20190094870; 20190094981; 20190097443; 20190100196; 20190104919; 20190105200; 20190107845; 20190113351; 20190113918; 20190113919; 20190113920; 20190113927; 20190116758; 20190120639; 20190120967; 20190121333; 20190121338; 20190121339; 20190121340; 20190121341; 20190121342; 20190121343; 20190121344; 20190121345; 20190121346; 20190121347; 20190121348; 20190121349; 20190121350; 20190121365; 20190125126; 20190125361; 20190125454; 20190125455; 20190125456; 20190125457; 20190125458; 20190125459; 20190128390; 20190129404; 20190129405; 20190129406; 20190129407; 20190129408; 20190129409; 20190129410; 20190130182; 20190135296; 20190137985; 20190137986; 20190137987; 20190137988; 20190137989; 20190143412; 20190145239; 20190145765; 20190145784; 20190146451; 20190146472; 20190146473; 20190146474; 20190146475; 20190146476; 20190146477; 20190146478; 20190146479; 20190146480; 20190146481; 20190146482; 20190146515; 20190147253; 20190147254; 20190147255; 20190147260; 20190149725; 20190155263; 20190155272; 20190155295; 20190155296; 20190156128; 20190159848; 20190160675; 20190161274; 20190163206; 20190170521; 20190171187; 20190171912; 20190176328; 20190176329; 20190178638; 20190179290; 20190179300; 20190179301; 20190179329; 20190179976; 20190180499; 20190187680; 20190187681; 20190187682; 20190187683; 20190187684; 20190187685; 20190187686; 20190187687; 20190187688; 20190187689; 20190187690; 20190187703; 20190187715; 20190188632; 20190188895; 20190193276; 20190193629; 20190196472; 20190196480; 20190196485; 20190200844; 20190200977; 20190201037; 20190201038; 20190201040; 20190201042; 20190201046; 20190201127; 20190201136; 20190204201; 20190206045; 20190206562; 20190206565; 20190212106; 20190213212; 20190213390; 20190213418; 20190213421; 20190213441; 20190213523; 20190213534; 20190213535; 20190213545; 20190213546; 20190213752; 20190215424; 20190219409; 20190219995; 20190219996; 20190220012; 20190220863; 20190222986; 20190227536; 20190227537; 20190228266; 20190228495; 20190228573; 20190229802; 20190231162; 20190232498; 20190232992; 20190235486; 20190235498; 20190235505; 20190235512; 20190235516; 20190235531; 20190236531; 20190238638; 20190240840; 20190243370; 20190247050; 20190247122; 20190248002; 20190248007; 20190248013; 20190248014; 20190248016; 20190250000; 20190254753; 20190254754; 20190258251; 20190258878; 20190261565; 20190261566; 20190265366; 20190265705; 20190266418; 20190269353; 20190270197; 20190274716; 20190277869; 20190277962; 20190278276; 20190278284; and 20190278290.


All patents and other publications; including literature references, issued patents, and published patent applications; cited throughout this application are expressly incorporated herein by reference for all purposes, including, but not limited to, describing and disclosing, for example, the methodologies described in such publications that might be used in connection with the technology described herein. Citation or identification of any reference herein, in any section of this application, shall not be construed as an admission that such reference is available as prior art to the present application, and shall be treated as if the entirety thereof forms a part of the disclosure hereof. Such references are provided for their disclosure of technologies to enable practice of the present invention, to provide a written description of the invention, to provide basis for claim language, and to make clear applicant's possession of the invention with respect to the various aggregates, combinations, and subcombinations of the respective disclosures or portions thereof (within a particular reference or across multiple references). The citation of references is intended to be part of the disclosure of the invention, and not merely supplementary background information. The incorporation by reference does not extend to teachings which are inconsistent with the invention as expressly described herein, and is evidence of a proper interpretation by persons of ordinary skill in the art of the terms, phrases, and concepts discussed herein, without being limiting as the sole interpretation available.


The technology described herein is further illustrated by the following examples which in no way should be construed as being further limiting. However, the scope of the invention is to be interpreted in any case as including technological implementation beyond mere mental or manual steps, abstract ideas, and algorithms.


SUMMARY OF THE INVENTION

An augmented reality-driven, negotiation-based (ARN) framework is provided for human-robot collaboration (HRC) problems, where ARN for the first time enables spatially distant human-robot teammates to iteratively communicate preferences and constraints toward effective collaboration. An AR-driven interface is provided for HRC, where the human can directly visualize and interact with the robots' planned actions.


As encompassed herein, a robot is an automated system, either physically or virtually implemented according to physical constraints, that is controlled to perform a useful task.


The AR-based framework inherits the benefits of spatial information from projection-based systems while alleviating the proximity requirement and enabling bi-directional communication. The ARN framework supports bi-directional communication between man and machine toward effective collaboration.


The system typically supports a human user in visualizing the robots' sensory information and planned trajectories, while allowing the robots to present information as well as ask questions through an AR interface (Muhammad et al. 2019; Cheli et al. 2018). In comparison to features available from the work of Muhammad and Cheli, the ARN framework supports human multi-robot collaboration, where the robots collaborate with both robot and human teammates or coworkers. Note that the robots may also be competitive with other robots or humans on other teams or performing other tasks, and therefore all robots and humans need not be seeking to achieve the same goals.


The present system preferably provides robots equipped with the task (re)planning capability, which enables the robots to respond to human feedback by adjusting their task completion strategy. The robots' task planning capability enables negotiation and collaboration behaviors within human-robot teams.


The AR interface of ARN enables the human teammate to visualize the robots' current status (e.g., their current locations) as well as the planned motion trajectories. For instance, a human user might “see through” a heavy door (via AR) and find a robot waiting for him/her to open the door. Moreover, ARN also supports people giving feedback to the robots' current plans. For instance, if the user is too busy to help on the door, he/she can indicate “I cannot open the door for you in three minutes” using ARN. Accordingly, the robots may incorporate such human feedback for re-planning, and see if it makes sense to work on something else and come back after three minutes. The AR interface is particularly useful in environments with challenging visibility, such as the indoor environments of offices, warehouses, and homes, because the human might frequently find it impossible to directly observe the robots' status due to occlusions.
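
As a rough illustration of how a robot might act on feedback such as “I cannot open the door for you in three minutes,” the following sketch compares completion-time estimates for waiting at the door versus working on another task first and returning later. This is our own simplified model, not part of the patented embodiment; all durations (in seconds) are hypothetical.

```python
# Illustrative sketch only: comparing rough completion-time estimates for
# "wait at the door" versus "defer the blocked task" when the human reports
# being unavailable for a while. All durations (seconds) are hypothetical.

def total_time_wait_first(human_delay, door_task, alt_task):
    # Idle at the door, finish the blocked delivery, then do the other task.
    return human_delay + door_task + alt_task

def total_time_defer(human_delay, door_task, alt_task):
    # Do the other task first; the blocked task starts once both the robot
    # and the human are ready.
    return max(alt_task, human_delay) + door_task

def choose_strategy(human_delay, door_task, alt_task):
    wait = total_time_wait_first(human_delay, door_task, alt_task)
    defer = total_time_defer(human_delay, door_task, alt_task)
    return ("defer", defer) if defer < wait else ("wait", wait)

print(choose_strategy(human_delay=180, door_task=60, alt_task=150))
# -> ('defer', 240), versus 390 s when waiting at the door first
```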


Thus, a feature of a preferred embodiment is automated processing, associated with the robot, of future planning and scheduling to coordinate with other resources, constraints, or requirements. Further, where multiple robots are available, the planning may be collaborative to achieve group or task efficiency rather than seeking to maximize individual efficiency. Indeed, in HRC, it is often the human efficiency that is a paramount concern, and therefore one or more robots may sacrifice their own efficiency in favor of the human.


On the other hand, in a competitive environment, there is typically a goal, and planning is centered around competing for the highest reward in view of the goal. In this case, human efficiency may be less important than achieving the goal. In a competition, there are often rules or constraints to render the competition “fair” or legal, and the rules or constraints will set bounds on the plan as well. Depending on the nature of penalties or sanctions for violation of rules or constraints, the robot may consider plans that violate rules or constraints, even while accepting plans that consistently fall within the rules or constraints. For example, the planning may involve a statistical process for precision, accuracy, or reliability, with an accepted risk that the precision, accuracy, or reliability constraint will not always be met. Further, with acceptance of statistical or other risks of non-compliance, fuzzy logic or other imperfect heuristics may be employed. For example, the HRC may face a combinatorial optimization problem (NP-complete or NP-hard), which cannot feasibly be computed in real time. Therefore, the automated control may implement a simplified process to estimate a reasonable plan within the time and resources available, accepting the risk that the estimate and its reasonableness may fail.
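
The following sketch, under our own assumptions, shows one way such a “reasonable plan within the time available” might be estimated: an exhaustive search over delivery orderings runs under a wall-clock budget and falls back to a nearest-neighbor heuristic when the budget is exhausted. Location IDs, the distance table, and the budget value are hypothetical.

```python
# Illustrative time-bounded planner: exact ordering search with a wall-clock
# budget, greedy nearest-neighbor fallback when the budget runs out.
import itertools
import time

def route_cost(order, dist, depot=0):
    cost, here = 0.0, depot
    for loc in order:
        cost += dist[here][loc]
        here = loc
    return cost

def plan_with_budget(locations, dist, budget_s=0.05, depot=0):
    deadline = time.monotonic() + budget_s
    best, best_cost = None, float("inf")
    for order in itertools.permutations(locations):
        if time.monotonic() > deadline:
            # Budget exhausted: accept a greedy nearest-neighbor estimate.
            remaining, here, greedy = set(locations), depot, []
            while remaining:
                nxt = min(remaining, key=lambda loc: dist[here][loc])
                greedy.append(nxt)
                remaining.remove(nxt)
                here = nxt
            return greedy, "greedy"
        cost = route_cost(order, dist, depot)
        if cost < best_cost:
            best, best_cost = list(order), cost
    return best, "exact"

dist = {0: {1: 4, 2: 9}, 1: {0: 4, 2: 3}, 2: {0: 9, 1: 3}}
print(plan_with_budget([1, 2], dist))   # -> ([1, 2], 'exact')
```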


ARN has been implemented on and evaluated with a human-robot collaborative delivery task in an indoor office environment. Both human participants and robots are assigned non-transferable tasks. Experimental results suggest that ARN significantly reduced people's task completion time and cognitive load, while increasing the efficiency of human-robot collaboration in overall task completion time.


In the context of the delivery task, competition might take the form of multiple HRC teams competing for the same doorway. In such a scenario, while the HRC teams are competing at a low level, in the real world, cooperation (“coopetition”) may be the best strategy, so that the competing HRC teams may collaborate to coordinate tasks, even if one team does not directly benefit from the interaction.


Multi-agent systems require the agents, including humans and robots, to extensively communicate with each other, because of the inter-dependency among the agents' actions. The inter-dependency can be in the form of state constraints, action synergies, or both. The augmented reality-driven, negotiation-based (ARN) framework is introduced to enable bidirectional communication between human and robot teammates, and iterative “negotiation” toward the most effective joint plans.


There are a variety of robot platforms that have been developed for very different purposes. Examples include, among others, warehouse robots, indoor service robots, military robots, and rescue robots. Despite their capabilities of accomplishing complex tasks in the real world, these robots rarely collaborate with people. One of the main challenges in human-robot collaboration is the significant communication cost. More precisely, effective human-robot collaboration requires extensive communication of intentions in the form of planned future actions. Natural language is the most common communication channel among humans, but it is less effective for human-robot interaction due to its limited bandwidth and ubiquitous ambiguity. Therefore, there is a need to create novel technologies for efficient, accurate human-robot communication toward effective collaborations within human-robot teams. In repetitive tasks or stable HRC, a task or user profile may be developed, as a rule base, statistical compilation, or artificial intelligence/machine learning paradigm (e.g., neural networks or deep neural networks), so that explicit communication of plans may be minimized and expectations of the plan used instead, with feedback employed to correct the expectation. Note that the feedback may be from human to robot, or robot to human, or both.


This technology may significantly reduce the communication cost within human-robot teams, and is generally applicable to the above-mentioned robot application domains.


Effective human-robot collaboration requires communication within human-robot teams with at least the following properties: high bandwidth, low ambiguity, low latency, and minimal training. Current technologies, such as vision-based, language-based, and motion-based approaches, do not meet these communication requirements.


The present technology may be used in such markets as industrial robots (e.g., those on assembly lines), military robots (e.g., drones), and warehouse robots. For instance, Amazon warehouses have separate areas for robots and people, and robot zones are completely forbidden to people, except for those who wear special vests. The “special vests” force robots in close proximity to stop. With the present technology, people can easily perceive robot intention and potentially achieve effective human-robot collaborations.


An aspect of various embodiments of this technology enables people to use an Augmented Reality (AR) interface to visualize robots' planned actions. To achieve this functionality, the framework includes the following components:

    • 1) an augmented reality (AR) interface,
    • 2) a task planner, and
    • 3) an action restrictor.


1) The AR interface is for visualizing the planned actions of a robot team. Trajectories and movements of robot avatars are employed to show the robots' planned actions. Further, the AR interface supports the visualization of action effects, such as a tool being taken by a robot in an augmented layer. The other functionality of this AR interface is to allow people to give feedback on the robots' planned actions. For instance, a person seeing a robot going to take a tool (say, a screwdriver) can “tell” the robot not to take this tool by “selecting” it through the AR interface.


2) The team of robots needs the capability of multi-robot task planning for computing sequences of actions, one for each robot. An approach based on Answer Set Programming (ASP) may be employed.
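
The following is a minimal sketch, under our own assumptions, of how a per-robot ASP planning query might be assembled as text before being handed to an off-the-shelf ASP solver such as clingo. The at/3 predicate, the horizon constant, and the goal encoding are our own simplifications, and a complete program would also need choice rules that generate candidate actions.

```python
# Illustrative assembly of a per-robot ASP planning query as text (not the
# patent's implementation). Rule syntax follows the opendoor/load/unload
# rules given in the detailed description.

DOMAIN_RULES = """
open(D,I+1)    :- opendoor(D,I).
loaded(O,I+1)  :- load(O,I).
-loaded(O,I+1) :- unload(O,I).
"""

def build_program(robot_id, initial_facts, goal_atom, horizon=10):
    """Compose the ASP program text for one robot's planning query."""
    header = f"% planning query for {robot_id}\n#const n = {horizon}.\nstep(0..n).\n"
    facts = "\n".join(f"{fact}." for fact in initial_facts)
    goal = f"\n:- not {goal_atom}.   % reject answer sets that miss the goal\n"
    return header + DOMAIN_RULES + facts + goal

print(build_program("robot1",
                    ["at(robot1,r1,0)", "facing(d1,0)"],
                    "at(robot1,r2,n)"))
```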


3) The action restrictor is for converting human feedback into a form that can be directly processed by the multi-robot task planner. Given that ASP is used for multi-robot task planning, the human feedback (provided through the AR interface) may be converted into a set of ASP constraints, and the constraints are then directly incorporated into the ASP program to account for human feedback in re-planning for the robot team.
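
A hypothetical sketch of this conversion step is shown below: AR feedback events are mapped to ASP constraints (rules with empty heads) that can be appended to the planning program before re-planning. The event schema and field names are our own, not the patent's.

```python
# Hypothetical action restrictor: turn AR feedback events into ASP constraints.

def feedback_to_constraints(feedback):
    constraints = []
    if feedback.get("type") == "forbid_object":
        # The user "selected" an object (e.g., a screwdriver) the robot
        # should not take.
        constraints.append(f":- load({feedback['object']},I).")
    elif feedback.get("type") == "human_unavailable":
        # "I cannot help with this door for the next K planning steps."
        constraints.append(
            f":- opendoor({feedback['door']},I), I < {feedback['until_step']}.")
    return constraints

print(feedback_to_constraints({"type": "forbid_object", "object": "screwdriver"}))
# -> [':- load(screwdriver,I).']
```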


The technology may be used to support human-robot teams, and may be implemented as a system, including multiple autonomous robots, AR devices (glasses, tablets, etc.), and the software that enables people to communicate with their robot teammates. The technology also provides methods of controlling the robots, providing an AR interface for the users, and various architectures for integrating the system.


In comparison to other channels for human-robot communication, the AR-based technology can provide high bandwidth, low latency, and low ambiguity. Moreover, this technology may be intuitive, and the training effort (for new human users) is minimal in comparison to traditional Human-Computer Interaction (HCI) technologies.


There is a cost to set up the whole system for a new domain, although this is a one-time process and may in some cases be automated. The implementation may employ existing AR devices, such as AR glasses and tablets, as well as AR software libraries, such as ARCore (Google) and ARKit (Apple).


See U.S. Pat. Nos. 10,417,829; 10,391,408; 10,339,721; 10,332,245; 10,325,411; 10,304,254; 10,296,080; 10,290,049; 10,262,437; 10,217,488; 10,078,921; 10,062,177; 10,026,209; 10,008,045; 10,002,442; 9,985,786; 20190272348; 20190255434; 20190253835; 20190253580; 20190236844; 20190228854; 20190223958; 20190213755; 20190212901; 20190208979; 20190197788; 20190191125; 20190189160; 20190188917; 20190188913; 20190188788; 20190188766; 20190164346; 20190162575; 20190130637; 20190118104; 20190101394; 20190089760; 20190060741; 20190051054; 20180335502; 20180296916; 20180096495; 20180091869; 20180091791; and 20180075302.


The AR interface allows communication through radio frequency links, local and wide area communication networks, and the Internet. Therefore, there is no effective spatial limitation on the human and robot. The AR interface may be used, for example, to support viewing the status of robots that are a few minutes of travel away from people. It can potentially even be applied to warehouse domains and human-multi-UAV collaboration. While the preferred user interface comprises augmented reality (AR), the interface is not so limited, and may be modified reality (MR), or artificial (e.g., virtual reality, VR). The user interface need not be visual, and may be presented through other senses, such as auditory or tactile, for example for use by blind persons.


The present system preferably comprises a task planning system that supports computing a sequence of actions for each robot within a robot team, e.g., using declarative knowledge.


It is therefore an object to provide a method for human-automaton collaboration, comprising: generating a symbolic plan for the automaton, comprising a sequence of actions to be performed; presenting the symbolic plan as a visualizable trajectory through a visual user interface for a human user; receiving feedback from the human user to the automaton; and altering the symbolic plan in dependence on the received feedback.
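
A minimal sketch of this generate-present-feedback-alter cycle is given below, using duck-typed components whose interfaces are our own assumptions rather than the patent's API.

```python
# Illustrative cycle: generate a symbolic plan, present it, collect feedback,
# and re-plan under the resulting constraints.

def collaborate(planner, visualizer, user_input, restrictor, goals):
    constraints = []
    plan = planner.plan(goals, constraints)          # generate a symbolic plan
    while True:
        visualizer.show(plan)                        # present as a trajectory
        feedback = user_input.poll()                 # receive human feedback
        if feedback is None:                         # no objection: keep plan
            return plan
        constraints += restrictor.to_constraints(feedback)
        plan = planner.plan(goals, constraints)      # alter the plan
```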


The trajectory may be a static representation, or include both position and time, for example.


It is also an object to provide a method for human-automaton collaboration, comprising: generating a plan for the automaton, comprising a sequence of actions to be performed; presenting the plan as through a user interface for a human user; receiving feedback from the human user to the automaton; and altering the plan in dependence on the received feedback.


It is also an object to provide an automaton control system, comprising: a planner configured to generate a symbolic plan for the automaton, comprising a sequence of actions to be performed; a visualizer, configured to present the symbolic plan as a visualizable trajectory through a visual user interface; an input, configured to receive feedback from the human user; and a restrictor configured to process the received feedback and present it to the planner as a constraint to alter the symbolic plan.


A further object provides an automaton control system, comprising: a planner configured to generate a plan for the automaton, comprising a sequence of actions to be performed; a user interface, configured to present the plan; an input, configured to receive feedback from the human user; and a restrictor configured to process the received feedback and present it to the planner as a constraint to alter the plan.


Another object provides a method for human-automaton collaboration, comprising: automatically generating a proposed plan for the automaton, comprising a proposed sequence of actions to be performed by the automaton; presenting the proposed plan through a human computer interface for a human user; receiving feedback from the human user through the human computer interface relating to the proposed plan; and automatically altering the proposed plan to produce a plan comprising a sequence of actions to be performed by the automaton, in dependence on the received feedback.


The automaton may be a robot. The method may further comprise receiving the feedback from the human user while the robot is performing actions as a predicate to the proposed sequence of actions. The sequence of actions may be performed by the automaton in the real world; and the proposed plan may be overlayed as a visualizable time-space trajectory on a representation of the real world in an augmented reality interface for the human user. The human computer user interface may be a 3D visual user interface. The plan may comprise a time sequence of physical movement. The plan may comprise a series of tasks, wherein alteration of the proposed plan may comprise rescheduling the series of tasks to synchronize the automaton with a planned human activity. The proposed plan may comprise a coordinated set of physical activities and interactions of a plurality of automatons.


The method may further comprise automatically coordinating tasks involving a contested resource between the plurality of automatons and presenting the automatically coordinated tasks involving the contested resource to the human user through the human computer interface comprising an augmented reality interface.
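
As an illustration only, one way to coordinate a contested resource (such as a single doorway) among several robots is to serialize access into non-overlapping time slots before the plans are presented to the human in AR. Robot names, arrival times, and the crossing duration below are hypothetical.

```python
# Illustrative contested-resource coordination: assign non-overlapping
# doorway slots in order of earliest arrival.

def schedule_shared_resource(arrivals, crossing_time):
    """arrivals: robot -> earliest arrival time (s); returns robot -> (start, end)."""
    slots, free_at = {}, 0.0
    for robot, arrival in sorted(arrivals.items(), key=lambda kv: kv[1]):
        start = max(arrival, free_at)       # wait if the doorway is occupied
        slots[robot] = (start, start + crossing_time)
        free_at = start + crossing_time
    return slots

print(schedule_shared_resource({"red": 10.0, "blue": 12.0, "green": 11.0}, 8.0))
# -> {'red': (10.0, 18.0), 'green': (18.0, 26.0), 'blue': (26.0, 34.0)}
```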


The method may further comprise automatically coordinating tasks involving the plurality of automatons interacting with at least one human, and presenting the automatically coordinated tasks to the human user through the human computer interface comprising an augmented reality interface.


The method may further comprise automatically coordinating tasks involving the plurality of automatons with a distributed automated control system.


The automaton may comprise a plurality of automatons configured to operate as independent agents, further comprising automatically negotiating between the plurality of automatons configured to operate as independent agents to optimize efficiency of a human-involved task, and including a result of the automatic negotiation in the proposed plan.


It is another object to provide a control planning system for an automaton, comprising: an automated planner configured to automatically generate a plan for the automaton, comprising a sequence of actions to be performed by the automaton to perform a task limited by a set of constraints; a user interface, configured to present the plan to a human user; an input, configured to receive feedback from the human user relating to the presented plan; and a restrictor configured to automatically process the received feedback and present it to the automated planner as a constraint to update the set of constraints, wherein the automated planner is further configured to alter the plan for the automaton selectively dependent on the updated set of constraints.


The automaton may be a robot and the feedback may be received from the human while the robot is performing actions as a predicate to the proposed sequence of actions and as part of the task.


The sequence of actions may be performed in the real world; and the user interface may be configured to overlay the plan as a visualizable time-space trajectory on a representation of the real world in an augmented reality interface for the human user. The user interface may be a 2D or 3D visual user interface.


As used herein, an augmented reality interface is one in which existing physical objects or the environment are presented to the user together with projections of computer-generated images or objects that are linked to the existing physical objects or environment.


See, en.wikipedia.org/wiki/Augmented_reality; en.wikipedia.org/wiki/Artificial_Reality


Augmented reality (AR) is an interactive experience of a real-world environment where the objects that reside in the real world are enhanced by computer-generated perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. A heads-up display is often employed. See en.wikipedia.org/wiki/Head-mounted_display. AR can be defined as a system that fulfills three basic features: a combination of real and virtual worlds, real-time interaction, and 3D registration of virtual and real objects. In this application, real-time interaction is not required, since the interface provides a plan, i.e., future action, and not necessarily real-time control. However, where the human is involved in a task and uses the AR interface, integration of both real-time information and future planning is preferred. In virtual reality (VR), the user's perception of reality is completely based on virtual information. In augmented reality (AR), the user is provided with additional computer-generated information that enhances their perception of reality.


Note that the plan may be simple, such as a time for a robot to traverse a linear unobstructed and available trajectory, or a complex optimization with multiple interactions. Such simple and complex planning are well known in the art. See, en.wikipedia.org/wiki/Markov_decision_process; en.wikipedia.org/wiki/Motion_planning; en.wikipedia.org/wiki/Robot_navigation; en.wikipedia.org/wiki/Optimal_control; Kehoe, Ben, Sachin Patil, Pieter Abbeel, and Ken Goldberg. “A survey of research on cloud robotics and automation.” IEEE Transactions on automation science and engineering 12, no. 2 (2015): 398-409; Patil, Sachin, Gregory Kahn, Michael Laskey, John Schulman, Ken Goldberg, and Pieter Abbeel. “Scaling up gaussian belief space planning through covariance-free trajectory optimization and automatic differentiation.” In Algorithmic foundations of robotics XI, pp. 515-533. Springer, Cham, 2015; Bien, Zeungnam, and Jihong Lee. “A minimum-time trajectory planning method for two robots.” IEEE Transactions on Robotics and Automation 8, no. 3 (1992): 414-418; Jouaneh, Musa K., Zhixiao Wang, and David A. Dornfeld. “Trajectory planning for coordinated motion of a robot and a positioning table. I. Path specification.” IEEE Transactions on Robotics and Automation 6, no. 6 (1990): 735-745; Hwangbo, Jemin, Joonho Lee, Alexey Dosovitskiy, Dario Bellicoso, Vassilios Tsounis, Vladlen Koltun, and Marco Hutter. “Learning agile and dynamic motor skills for legged robots.” Science Robotics 4, no. 26 (2019); Hirose, Noriaki, Fei Xia, Roberto Martín-Martín, Amir Sadeghian, and Silvio Savarese. “Deep visual MPC-policy learning for navigation.” IEEE Robotics and Automation Letters 4, no. 4 (2019): 3184-3191; Moreno, Francisco-Angel, Javier Monroy, Jose-Raul Ruiz-Sarmiento, Cipriano Galindo, and Javier Gonzalez-Jimenez. “Automatic Waypoint Generation to Improve Robot Navigation Through Narrow Spaces.” Sensors 20, no. 1 (2020): 240; and Kamali, Kaveh, Ilian A. Bonev, and Christian Desrosiers. “Real-time Motion Planning for Robotic Teleoperation Using Dynamic-goal Deep Reinforcement Learning.” In 2020 17th Conference on Computer and Robot Vision (CRV), pp. 182-189. IEEE, 2020.


The plan may comprise a series of physical tasks, and the plan may be altered to reschedule the series of physical tasks to synchronize the automaton with a planned human activity based on the received feedback.


The automaton may comprise a plurality of collaborative automatons, each collaborative automaton being configured to automatically negotiate with another collaborative automaton to coordinate aspects of the task within the set of constraints.


The automaton may comprise a plurality of automatons, and at least one of the planner and the restrictor may be configured to automatically coordinate tasks involving a contested resource between the plurality of automatons within the plan before receipt of the human input.


The automaton may comprise a plurality of automatons, each representing independent agents, and at least one of the planner and the restrictor may be further configured to employ a result of an automatic negotiation between the plurality of automatons representing independent agents to optimize efficiency of a human-involved task.


Another object provides a non-transitory computer readable medium, storing instructions for controlling an automaton, comprising: instructions for generating a plan for the automaton, comprising a sequence of actions to be performed by the automaton; instructions for presenting the plan through a human computer user interface for a human user; instructions for receiving feedback from the human user relating to the plan comprising sequence of actions to be performed by the automaton; and instructions for altering the plan in dependence on the received feedback from the human user.


A still further object provides a non-transitory computer readable medium, storing instructions for controlling an automaton, comprising: instructions for generating a plan for the automaton, comprising a sequence of actions to be performed; instructions for presenting the plan through a user interface for a human user; instructions for receiving feedback from the human user to the automaton; and instructions for altering the plan in dependence on the received feedback.


The user interface is a visual user interface, an augmented reality user interface, or an augmented reality visual user interface, for example.


The plan may be presented symbolically in an augmented reality user interface, as a visualizable trajectory in a visual user interface, as a visualizable time-space trajectory in a visual user interface, or as a visualizable time-space trajectory in an augmented reality user interface, for example.


The plan may be presented within a virtual reality user interface. The visual user interface may be a 2D or 3D interface.


The automaton may be a robot. Alternately, the automaton may lack physical movement actuators, and serve an automated control or layout function, for example.


The sequence of actions may be performed in the real world; and the symbolic plan overlayed as a visualizable trajectory on a representation of the real world in an augmented reality interface for the human user.


The visualizable trajectory may comprise a time sequence of movement.


The symbolic plan may comprise a series of tasks, wherein said altering the symbolic plan comprises rescheduling the series of tasks to synchronize the automaton with a planned human activity.


The automaton comprises a plurality of automatons or robots. Tasks may be automatically coordinated involving a contested resource between the plurality of automatons. Tasks involving the human may be automatically coordinated between the plurality of automatons to resolve competition for limited resources before the human involvement. Tasks involving the plurality of automatons may be automatically coordinated according to a distributed control system.


The automaton may comprise a plurality of automatons acting as independent agents, further comprising automatically negotiating between the plurality of automatons to optimize efficiency of a human-involved task.


The symbolic plan may comprise a series of tasks, which is altered to reschedule the series of tasks to synchronize the automaton with a planned human activity based on the received feedback.


The automaton may comprise a plurality of collaborative automatons or collaborative robots. The planner or restrictor may be configured to automatically coordinate tasks involving a contested resource between the plurality of automatons. The planner or restrictor may be configured to automatically coordinate tasks involving the human between the plurality of automatons to resolve competition for limited resources before the human involvement. The planner or restrictor may be further configured to automatically coordinate tasks involving the plurality of automatons according to a distributed control algorithm.


The automaton may comprise a plurality of automatons acting as independent agents, and the planner or restrictor may be further configured to automatically negotiate between the plurality of automatons to optimize efficiency of a human-involved task.


It is a further object to provide a non-transitory computer readable medium, storing instructions for controlling an automaton, comprising: instructions for generating a symbolic plan for the automaton, comprising a sequence of actions to be performed; instructions for presenting the symbolic plan as a visualizable trajectory through a visual user interface for a human user; instructions for receiving feedback from the human user to the automaton; and instructions for altering the symbolic plan in dependence on the received feedback.


It is also an object to provide a method for human-automaton collaboration, comprising: automatically generating a symbolic plan, comprising a sequence of actions to be performed by the automaton; presenting the symbolic plan as a visualizable time-space trajectory through a visual user interface for a human user; receiving feedback from the human user which limits the symbolic plan; and altering the symbolic plan in dependence on the received feedback.


It is a further object to provide a method for human-automaton collaboration, comprising: automatically generating a plan, comprising a sequence of actions to be performed by the automaton; presenting the plan through a user interface for a human user; receiving feedback from the human user which limits the plan; and altering the plan in dependence on the received feedback.


The visual user interface may provide bidirectional communication between the user and the automaton. The visual user interface may also provide real-time bidirectional communication between the user and the automaton. The visual user interface may further provide communication of sensory information from the automaton to the user. The visual user interface may present at least one query from the automaton to the user, or from the user to the automaton. The visual user interface may present a concurrent real-world image not otherwise visible to the user.


The user interface may be a visual user interface, an augmented reality visual user interface, or a virtual reality interface.


The plan may be presented symbolically in an augmented reality user interface. The plan may be presented as a visualizable time-space trajectory in a visual user interface or in an augmented reality user interface. The plan may be presented within a virtual reality user interface.


The planner may support swarm planning. The symbolic plan alteration may comprise negotiating with the user and/or task rescheduling.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows the components of the ARN framework.



FIGS. 2A, 2B and 2C show (2A) a human participant working on a Jigsaw puzzle task on the left, while keeping an eye on the status of the robots through a mobile AR interface on the right; (2B) the status of three mobile robots, where two are waiting outside for the human to open the door and one is arriving; and (2C) a screenshot of the AR interface showing the three robots waiting to come inside after the human opened the door, with their planned trajectories in different colors, a marker showing the robots' access point to the door, and two buttons for human feedback.



FIG. 3 shows an AR interface on a mobile device.



FIG. 4 shows the turtlebot-2 platforms used for the experiments.



FIG. 5 shows a map showing locations L1, L2 and L3, and B as the base station.



FIG. 6 shows a top view of the human looking at the AR device while solving the Jigsaw puzzle.



FIGS. 7A, 7B and 7C show (7A) one robot waiting, with the red robot arriving at the door first; (7B) two robots waiting, with the blue robot arriving second and waiting near the red robot; and (7C) three robots waiting, as the green robot arrives and joins the queue.



FIG. 8 shows a plot of human vs. robot completion time.



FIG. 9 shows results from a participant survey that consisted of five statements with Likert-scale responses.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS


FIG. 1 shows an overview of ARN for multi-turn bidirectional communication toward effective HRC. The core system consists of the following components:


Planner generates a symbolic plan for each robot. This symbolic plan includes a sequence of actions to be performed by the robots. The symbolic plan, in the form of an action sequence, is used by the robots to generate a sequence of motion trajectories. The set of all these generated motion plans for N robots is passed on to a Visualizer component.
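
A simplified sketch of this grounding step is given below: symbolic actions are mapped to immediate 2D navigation goals. The landmark coordinates and action strings are our own placeholders for the robots' map and the Planner's output.

```python
# Illustrative grounding of a symbolic action sequence into 2D goals.

LANDMARKS = {"d1": (4.0, 2.5), "l1": (9.0, 1.0), "l2": (9.0, 5.0), "base": (0.0, 0.0)}

def action_to_goal(action):
    """Map an action such as 'approach(d1)' to a 2D goal, or None if no motion."""
    name, _, arg = action.partition("(")
    arg = arg.rstrip(")")
    if name in ("approach", "gothrough"):
        return LANDMARKS.get(arg)       # navigate toward the named landmark
    return None                         # opendoor/load/unload: in-place actions

plan = ["approach(d1)", "opendoor(d1)", "gothrough(d1)", "approach(l1)", "unload(o1)"]
print([g for g in map(action_to_goal, plan) if g is not None])
# -> [(4.0, 2.5), (4.0, 2.5), (9.0, 1.0)]
```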


Visualizer converts these robot plans into visualizable trajectories that are overlaid on the real world. The human can visualize these trajectories with the help of the AR interface. The AR interface also allows the human to share feedback. This feedback shared by the human counterpart is initially passed on to a Restrictor component.


Restrictor processes human feedback and passes it as constraints to a Planner component. The constraints (the symbolic form of human feedback) are then used for computing plans for the robots. For example, if the robots require human help, the constraint here is the availability of the human teammate for door-opening actions. Changes in the human's or robots' intentions are constantly communicated between the human and robots through the AR interface.


Negotiation: The term "negotiation" is used to refer to the process of the agents iteratively considering other agents' plans and adjusting their own plans accordingly. For example, consider that the robots have a computed plan (P) at time step t1, which comprises the set of actions the robots need to take to accomplish their goals. P is communicated to the human, and the human visualizes the robot plans with the help of the AR interface. The human shares his/her feedback (H) with the robot teammates at time step t2. The feedback from the human may lead to re-computation of plans for some of the robot teammates. In the case of re-computation, the human is updated with the newly recomputed plans. This re-computation takes place based on the state of all the robots and the human feedback. All the communication that takes place between the human and robots is part of the overall negotiation, which results in a recomputed plan of better quality.
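For purposes of illustration only, this negotiation loop may be sketched in Python as follows, where compute_plans and get_human_feedback are hypothetical placeholders for the Planner and the AR interface, respectively, and not part of the described implementation:

    def negotiate(robots, compute_plans, get_human_feedback):
        """Iteratively exchange plans and feedback until the human raises no objection."""
        constraints = set()
        plans = compute_plans(robots, constraints)      # P at time step t1
        while True:
            feedback = get_human_feedback(plans)        # H at time step t2; empty means accepted
            if not feedback:
                return plans                            # negotiation has converged
            constraints |= set(feedback)                # human feedback becomes constraints
            plans = compute_plans(robots, constraints)  # re-computation based on H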


Planner


An Answer Set Programming (ASP) based planner is used to generate plans for the team of robots (Gelfond and Kahl 2014; Lifschitz 2008). ASP is a popular general knowledge representation and reasoning (KRR) language that has gained recent attention for a variety of planning problems (Lifschitz 2002; Yang et al. 2014; Erdem, Gelfond, and Leone 2016), including robotics (Erdem and Patoglu 2018).


There are five actions in the domain: approach, opendoor, gothrough, load, and unload. The following shows some of their definitions.

    • opendoor
    • open(D,I+1):- opendoor(D,I).


The above rule means that executing the action opendoor(D) at time step I causes the door to be open at time step I+1.

    • load
    • loaded(O,I+1):- load(O,I).


The above rule means that executing the action load(O) at time step I causes the object O to be loaded at time step I+1.

    • unload
    • -loaded(O,I+1):- unload(O,I).


The above rule means that executing the action unload(O) at time step I causes the object O to be unloaded (in ASP terms, not loaded) at time step I+1.


The ASP planner also takes in a set of constraints together with the set of human-defined tasks as input, and produces a plan (P), in the form of a sequence of actions (A) to be performed to complete those tasks. For ASP-based planners, a constraint is a logical rule with the empty head. For instance:

    • :- opendoor(D,I), not facing(D,I).


The above constraint means that it is impossible to execute the opendoor(D) action if there is no evidence that the robot is facing door D at step I.


Planner generates an action sequence for the team of robots. These actions are produced at the semantic level, and hence each action is converted into an immediate goal in 2D space. The robot continues to follow the sequence of actions unless it receives a modified action sequence from the Planner. Consider an example scenario where a robot is in room R1 and wants to go to room R2, given that rooms R1 and R2 are connected by a door D1. Here the Planner generates the following symbolic plan for the robot:

    • approach(D1,0).
    • opendoor(D1,1).
    • gothrough(D1,2).


The above symbolic plan indicates that the robot should approach the door D1 at time step I=0. At time step I=1, the robot should open the door D1; since, as mentioned above, the opendoor action opens the door at time step I+1, the robot can go through the door D1 at time step I=2.
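For illustration, the conversion from such a symbolic plan to immediate goals in 2D space may be sketched in Python as follows; the landmark coordinates are hypothetical, and in practice the robot's motion planner would produce the actual trajectories from the pre-generated map:

    # Hypothetical 2D coordinates of landmarks on the pre-generated map.
    LANDMARKS = {"D1": (3.0, 1.5)}

    symbolic_plan = [("approach", "D1", 0), ("opendoor", "D1", 1), ("gothrough", "D1", 2)]

    def to_2d_goals(plan):
        """Convert semantic-level actions into immediate navigation goals."""
        goals = []
        for action, target, step in plan:
            if action in ("approach", "gothrough"):
                goals.append((step, LANDMARKS[target]))  # navigate to the landmark's position
            # opendoor produces no navigation goal; the robot waits for the door state to change
        return goals

    print(to_2d_goals(symbolic_plan))  # [(0, (3.0, 1.5)), (2, (3.0, 1.5))]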


Visualizer


The human can use a phone or a tablet to access the AR capabilities of the system. The AR interface constantly casts the information being passed from the robot to the human, and also the other way around. Casting here refers to the conversion of information into a form readable by the counterpart (human or robot).


Visualizer receives a set of motion plans, along with the live locations of the individual robots, as input. These plans contain lists of 2D coordinates produced by the robots' motion planners. The live locations of the robots are their x and y coordinates specifying their current positions in the environment. These motion plans and live locations are not directly useful to the human unless they are converted into something that the human counterpart can perceive. Visualizer converts these plans and live locations into spatially visualizable objects in the augmented environment.
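A minimal Python sketch of this conversion step is shown below; the Overlay structure and the color assignment are illustrative assumptions rather than the actual AR rendering code:

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class Overlay:
        """One AR primitive per robot: an avatar at the live location plus trajectory markers."""
        robot_id: int
        color: str
        avatar_xy: Tuple[float, float]
        trajectory: List[Tuple[float, float]]

    COLORS = ["red", "green", "blue"]

    def build_overlays(motion_plans, live_locations):
        """Pair each robot's live pose and planned 2D path with a distinct color."""
        return [Overlay(i, COLORS[i % len(COLORS)], live_locations[i], motion_plans[i])
                for i in range(len(motion_plans))]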



FIG. 2 shows the live locations of the three robots from a third person view. Visualizer augments these live locations as robot avatars. There are three robot avatars with the colors blue, green and red, which show the live locations of the three robots. FIG. 3 shows the trajectory markers that represent the motion plans of the three different robots. The AR interface distinguishes the trajectories of different robots using different colors to help people keep track of different robots.


Restrictor


Restrictor allows humans to share their feedback using the AR interface. FIG. 3 shows the buttons (2-min and 4-min) which can be used by humans to communicate their feedback to the robots. Consider an example where robots and humans share a common resource, e.g., a screwdriver, which is required by both the human and robot counterparts for accomplishing their tasks. If the human wants to hold a common resource for a specific time, the human can convey this information to the robots, and the Planner can then use this information to re-plan based on the human feedback. The Restrictor converts the human feedback, which in its original form cannot be interpreted by the Planner, into a format that can be processed by the Planner.


Two categories of tasks are considered, long-duration tasks (ΔL) and short-duration tasks (ΔS), and each task falls into one of the categories. This duration is assigned to each task beforehand, and hence whenever a new task is assigned, it is categorized as either ΔL or ΔS. It should be noted that robot tasks in ΔL and ΔS require help from the human counterpart and hence depend largely on human availability. If the human clicks the 2-min button to specify that he or she expects to be available in two minutes, then the tasks from ΔL are eliminated from the goal specification. Similarly, if the 4-min button is clicked, the tasks from ΔS are eliminated from the goal specification.


Consider two tasks of picking objects O1 and O2, where picking O1 is a long-duration task and picking O2 is a short-duration task. If the goal of the robot is to pick up both objects and store them at the base_station, then the goal specification for the Planner can be given as follows:

    • :- not located(O1,base_station,n−1).
    • :- not located(O2,base_station,n−1).


The Planner will generate a symbolic plan based on the goal specification. If the human clicks the 4-min button, the Restrictor passes on a new goal specification with just the tasks in ΔL, as follows:

    • :- not located(O1,base_station,n−1).


Since the human expects to be available in four minutes, the tasks from ΔS would require the robot to wait for the human counterpart. Such waiting reduces the efficiency of the overall human-robot team. Hence the elimination of tasks ensures that the robot plans do not involve tasks that contradict the human feedback. The tasks in ΔS are only added back to the goal specification once the time specified by the human feedback has elapsed, which in this case is four minutes.
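For illustration, the following Python sketch shows one way the Restrictor may translate the 2-min/4-min feedback into a restricted goal specification; the task catalogue and the emitted ASP fragments are hypothetical:

    LONG, SHORT = "long", "short"
    TASKS = {"O1": LONG, "O2": SHORT}   # hypothetical catalogue of pickup tasks and durations

    def restrict_goals(feedback_minutes):
        """Drop tasks whose duration category contradicts the human's availability feedback."""
        if feedback_minutes == 2:
            keep = [o for o, d in TASKS.items() if d == SHORT]   # eliminate long-duration tasks
        elif feedback_minutes == 4:
            keep = [o for o, d in TASKS.items() if d == LONG]    # eliminate short-duration tasks
        else:
            keep = list(TASKS)
        # Emit one ASP goal constraint per remaining task.
        return [f":- not located({o},base_station,n-1)." for o in keep]

    print(restrict_goals(4))   # [':- not located(O1,base_station,n-1).']

Re-adding the eliminated ΔS tasks once the stated delay has elapsed could be handled by invoking such a routine again after the expiry time.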


Algorithms of ARN


The ARN framework generates plans for all the robots based on human feedback and ensures that all the robots have optimal plans at any given instant of time. Multithreading is used to ensure that the robots execute their plans in parallel. Table 1 lists the global variables that are used in the algorithms.












Algorithm 1 ARN Framework

Input: S, a set of N states, and G, a set of N goals (N > 1)
Output: P : [p1, p2, ..., pN]

1: Initialize human feedback H as Ø
2: F = M(H), where F is a global array that stores the constrained resources interpreted from H
3: for each i ∈ {0, 1, ..., N−1} do
4:   P[i] = p, where si −p→ gi, P is a global array, and P[i] stores the plan for the (i+1)th robot
5: end for
6: Thread ConstraintChecker = checkResourceConstraints( )
7: for each i ∈ {0, 1, ..., N−1} do
8:   Thread Ti = executePlans(pi)
9: end for










Algorithm 1 considers the current states (S) and goal states (G) of all the robots as input. The output of Algorithm 1 is a list of symbolic plans stored in P, where pi corresponds to the plan of the ith robot and i∈{1, 2, ..., N}. First, the human feedback (H) is initialized as Ø. H is then used to populate F, which is a global array that stores the constrained resources interpreted from H. It should be noted that the operation M at Line 2 takes H as input and outputs a set of constrained resources that are stored in F. Then a for-loop is entered that runs from 0 to N−1, where N corresponds to the number of robots. This for-loop is responsible for generating initial plans for the N robots. The initial plans are generated by considering S and G of all the robots. In Line 6, a thread termed ConstraintChecker is started, which executes the procedure of Algorithm 2. A second for-loop then runs from 0 to N−1. This for-loop is responsible for starting N threads that run the plan execution of the N robots in parallel. Every thread runs an instance of the executePlans procedure (Algorithm 3).
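For purposes of illustration only, the threading structure of Algorithm 1 may be sketched in Python as follows; plan_for, check_resource_constraints, and execute_plans are hypothetical callables standing in for the Planner, Algorithm 2, and Algorithm 3, respectively:

    import threading

    def run_arn(states, goals, plan_for, check_resource_constraints, execute_plans):
        """Sketch of Algorithm 1: plan for N robots, then run the constraint checker
        and one plan-execution thread per robot in parallel."""
        n = len(states)
        P = [plan_for(states[i], goals[i]) for i in range(n)]          # Lines 3-5: initial plans
        checker = threading.Thread(target=check_resource_constraints)  # Line 6: ConstraintChecker
        checker.daemon = True
        checker.start()
        executors = [threading.Thread(target=execute_plans, args=(P[i],)) for i in range(n)]
        for t in executors:                                            # Lines 7-9: one thread per robot
            t.start()
        for t in executors:
            t.join()
        return P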












Algorithm 2 Procedure checkResourceConstraints( )

1: while robots still have pending tasks do
2:   Check if new human feedback (H) is obtained
3:   if H is not NULL then
4:     F = M(H), where F stores the constrained resources interpreted from H
5:   end if
6: end while
















TABLE 1
Global variables used in ARN for multi-threading

Variable name   Description
P               An array that stores the plans of N robots.
F               An array to store the constrained resources.










Algorithm 2 runs in the background, processing human feedback as it is received, until all the robots have completed their tasks. When the human provides feedback through the AR interface, ConstraintChecker uses the operation M with input H to update F.


Algorithm 3 is executed on separate threads, one for each robot. This algorithm runs a for-loop that iterates over the set of constrained resources F. For each constrained resource Fj∈F, a plan p̂i is generated as a result of the C operation. The C operation takes the current plan of the robot pi, the plans of the other robot teammates (P), and Fj as input, and generates an optimal plan p̂i. It should be noted that the argmin operation requires a symbolic task planner for computing a sequence of actions while minimizing the overall plan cost. The algorithm constantly checks whether the plan being executed by the robots is optimal. To understand how the plans are kept optimal, a closer look at Lines 3-7 of Algorithm 3 is in order. In Line 3, the algorithm checks whether p̂i is equal to pi, where p̂i and pi are the newly generated plan and the original plan, respectively, for the ith robot. If p̂i is not equal to pi, then pi is replaced with p̂i in Line 4. Otherwise, if p̂i is equal to pi, the constrained resources have no effect on the current plan, and the robot carries out the actions from the original plan pi, as per Line 6 of Algorithm 3.
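For illustration, the per-robot replanning check of Lines 1-7 may be sketched in Python as follows; the replan argument is a hypothetical stand-in for the argmin/C operation performed by the symbolic task planner:

    def execute_plans(i, P, F, replan):
        """Sketch of Algorithm 3 for the ith robot: for every constrained resource,
        recompute a candidate plan and adopt it only if it differs from the current one."""
        for resource in F:
            candidate = replan(P[i], P, resource)   # p̂i = argmin of plan cost under the constraint
            if candidate != P[i]:
                P[i] = candidate                    # Line 4: replace pi with p̂i
            else:
                pass                                # Line 6: constraint has no effect; keep executing pi
        return P[i]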


This shows how the system employs re-planning to keep the robot plans updated based on human feedback. This re-planning capability allows human-robot teams to adjust their plans and achieve higher levels of efficiency. All of the above algorithms are part of the overall negotiation strategy, which enables quick convergence to efficient plans for a team of robots in human-robot shared environments.












Algorithm 3 Procedure executePlans(pi)

1: for each constrained resource Fj ∈ F, where j ∈ {0, 1, ..., J} and F is the global array that stores constrained resources, do
2:   p̂i = argminpi(C(pi, P, Fj)), where si −p̂i→ gi and P is the global array that stores the plans of all robot teammates
3:   if p̂i != pi then
4:     Replace pi with p̂i
5:   else
6:     Robot carries out actions from plan pi
7:   end if
8: end for










Experimental Setup


Experiments have been conducted with ten human participants using a team of three Turtlebot-2 platforms (FIG. 4) in an office environment. The Turtlebots have a Kinect RGB-D camera for navigation and sensing, and a laptop that runs the Robot Operating System (ROS). FIG. 5 shows the pre-generated map of the floor where the experiments were conducted. L1, L2, and L3 are the three different locations that the robots have to visit as part of their task. B is the base station.


The objective of the experiments was to evaluate two hypotheses:

    • 1. Using the AR-driven system can significantly improve the collaboration efficiency in human-robot systems compared to the baseline of audio signals from the robots; and
    • 2. Users prefer an AR-driven system compared to the baseline.


Ten participants of ages 20-30 volunteered to participate in an experiment where they worked in collaboration with robots to complete their task in the least possible time. The participants used both the baseline system and the AR-based system in two different trials. The allocation of the system (AR or baseline) for each trial was random. In each trial, the goal of the human participant was to complete a Jigsaw puzzle (FIG. 6) in the least possible time while helping the robots. The robots worked on delivery tasks, where each robot had to move three objects from three different locations (L1, L2, and L3) to a specified room. After picking up each object, the robot had to visit the room where the human was located (the base station) and then move on to pick up the next object, until the robot had picked up all three objects. Here the robots required the human counterpart to help them by opening the door to the base station.


Human Subject Study: At the end of the experiment, participants were required to fill out a survey form indicating their qualitative opinion including the following items. The response choices were 1 (Strongly disagree), 2 (Somewhat disagree), 3 (Neutral), 4 (Somewhat agree), and 5 (Strongly agree). The questions include: 1, The tasks were easy to understand; 2, It is easy to keep track of robot status; 3, I can focus on my task with minimal distraction from robot; 4, The task was not mentally demanding (e.g., remembering, deciding, thinking, etc.); 5, I enjoyed working with the robot and would like to use such a system in the future.


To evaluate Hypothesis-I, the evaluation metrics consist of the human task completion time (TH) and the individual robots' task completion times (TR1, TR2, TR3). These completion times are compared against the baseline, and the observations from the experimental results are used to support the hypothesis. For the evaluation of Hypothesis-II, the answers from the survey form are used for both systems (AR and baseline).


In the ARN experimental trials, the participants were given a ten-inch tablet as an AR device to observe the current status of all the robots. The AR interface allowed the participant to observe the live locations of all three robots in a spatial representation, with visualization of the robots' real-time trajectories. Through the device, the participant could track whether the robots had arrived at the door. After visiting each location, the robot(s) wait outside the door for the human to open the door (FIG. 5). Multiple robots could wait for the participant to open the door. This action synergy allowed the robots to execute a coordinated plan, in turn reducing the number of required door-opening actions. Also, since a mobile device was used for the AR interface, the participants had the flexibility of viewing the current status by moving the device around. A simple device holder was used to keep the mobile device on the desk where the human participant solved the Jigsaw puzzle (FIG. 6).


Baseline: The goal of ARN is to enable efficient collaboration between humans and robots by facilitating seamless communication based on visual (non-verbal) cues. Therefore, ARN is evaluated against a baseline of verbal communication using audio notifications. An audio notification was sent to the participant whenever any of the robots arrived at the door. For instance, when Robot 2 arrived at the door, the participant would receive the audio notification, "Robot 2 arrived at the door". The notifications helped the participant track the robots waiting for the participant to open the door. This baseline is used because it lets the human know the status of the robots waiting at the door without having to know the status of other robots still moving outside. No other notifications were sent to the human, to avoid cognitive overhead from unnecessary notifications.


Illustrative Trial


ARN is illustrated using an example of delivery tasks for three robots. Robots were randomly assigned to deliver objects from three different locations, L1, L2, and L3 (FIG. 5). The participant was seated at a desk at the base station with a laptop to solve the Jigsaw puzzle, which was chosen to mimic an assembly task. The tablet was also placed on the desk for the participant to check the status of the robots using AR (FIG. 6).


The robots start to navigate to their designated object locations to pick the objects. The human starts solving the Jigsaw at the same moment as the robots start to navigate to the object locations. At this instant, the timer is started for the robots as well as the human. The robots pick up the objects and return to the base station to drop the objects.


The robots arrive one after the other at the base station. The robot that arrives first takes the lead and waits at the door, in this case the red robot, as can be seen in FIG. 7. After some time, the blue robot arrives and waits near the red robot in the queue to enter the base station. Finally, the green robot arrives at the door and joins the queue behind the blue robot. At this moment the participant is still solving the Jigsaw puzzle. As soon as the participant notices on the AR device that three robots are waiting outside the door, the participant gets up to open the door for the robots.


The leading red robot constantly monitors whether the door is open. Once the door is opened, the red robot enters the base station and signals the blue robot, which follows it into the base station. Similarly, the green robot enters the base station. The robots wait for a designated time at the base station and then start navigating to their next object locations.


The above illustration depicts how the trial looked for one delivery task of the robots and how the human and robots collaborated using the ARN framework. (See, bit.ly/2yK8YsSARRobotics, a video of this trial).


Experimental Results


A first hypothesis is that ARN improves the collaboration efficiency of human-robot teams. Accordingly, the experiments described above focus on evaluating the efficiency of ARN compared to the baseline. The metric used here is the overall time needed for task completion by the robots and the participant.



FIG. 8 shows the overall performance of ARN compared to the baseline. Human task completion time (TH) is plotted along the x-axis, and the sum of the three robots' task completion times (TR1+TR2+TR3) is plotted along the y-axis. The lowest completion times were observed in the trials that used ARN, while the baseline produced some of the highest completion times. Most of the ARN points lie near the origin, with two points farther away (trials which took more time to complete). The baseline points are scattered far from the axes, reflecting the high completion times of those trials.


The total time of each trial was calculated as Tall=TH+TR1+TR2+TR3. The overall time required for the completion of tasks using ARN was less than with the baseline. The average Tall was 60 minutes for ARN and 65 minutes and 6 seconds for the baseline. All of the above support Hypothesis-I.


A p-value of 0.02 was observed for the human task completion times (TH). This shows that ARN performs significantly better than the baseline in human task completion time. For the three robots, the average completion time was also lower with ARN than with the baseline, although the improvement was not statistically significant.
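The statistical test used to obtain these p-values is not specified herein; as one possibility, the comparison may be sketched with an independent-samples t-test using SciPy, as follows.

    from scipy import stats

    def compare_completion_times(arn_times, baseline_times):
        """Return the p-value comparing two sets of task completion times
        (assuming an independent-samples t-test; the actual test used is not specified)."""
        return stats.ttest_ind(arn_times, baseline_times).pvalue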









TABLE 2
Results of human participant experiment

           Q1    Q2    Q3    Q4    Q5
Baseline   4.4   2.0   2.6   2.0   2.7
ARN        4.8   4.6   4.3   4.2   4.6

FIG. 9 shows the average scores from the human participant survey. The scores given by participants were between 1 and 5. The scores are higher for the ARN framework than for the baseline on all questions (Table 2). Q1 is a general question aimed at confirming that the participants understood the tasks, and the difference between ARN and the baseline is not large there. In addition to the average scores for the individual questions, p-values were also calculated. Apart from Q1, the p-values of all the other questions are discussed below.


For Q2, a significant difference is seen compared to the baseline. The interpretation of Q2 is that it was significantly easier to keep track of robot status using ARN. This was one of the objectives of enabling effective bi-directional communication between the human and the robot. Statistical significance, with a p-value of 1.5474e-07, shows that ARN proved very effective in helping the participant keep track of the robots' status. Similar to Q2, significant improvements are observed for Q3-Q5.


CONCLUSION

An augmented reality-driven, negotiation-based framework, called ARN, for efficient human-robot collaboration is provided. The human and robot teammates work on non-transferable tasks, while the robots have limited capabilities and need human help at certain phases for task completion. ARN enables human-robot negotiations through visualizing robots' current and planned actions while incorporating human feedback into robot replanning. Experiments with human participants show that ARN increases the overall efficiency of human-robot teams in task completion, while significantly reducing the cognitive load of human participants, in comparison to a baseline that supports only verbal cues. AR is applied to negotiation-based human-robot collaboration, where the negotiation is realized through the human visualizing robots' (current and planned) actions, human providing feedback to the robot teammates, and robots replanning accordingly.


The description of embodiments of the disclosure is not intended to be exhaustive or to limit the disclosure to the precise form disclosed. While specific embodiments of, and examples for, the disclosure are described herein for illustrative purposes, various equivalent modifications are possible within the scope of the disclosure, as those skilled in the relevant art will recognize. For example, while method steps or functions are presented in a given order, alternative embodiments may perform functions in a different order, or functions may be performed substantially concurrently. The teachings of the disclosure provided herein can be applied to other procedures or methods as appropriate. The various embodiments described herein can be combined to provide further embodiments. Aspects of the disclosure can be modified, if necessary, to employ the compositions, functions and concepts of the above references and application to provide yet further embodiments of the disclosure. These and other changes can be made to the disclosure in light of the detailed description. All such modifications are intended to be included within the scope of the appended claims.


Specific elements of any of the foregoing embodiments can be combined or substituted for elements in other embodiments. Furthermore, while advantages associated with certain embodiments of the disclosure have been described in the context of these embodiments, other embodiments may also exhibit such advantages, and not all embodiments need necessarily exhibit such advantages to fall within the scope of the disclosure. Each reference cited herein is expressly incorporated herein in its entirety. Such references provide examples representing aspects of the invention, uses of the invention, disclosure of the context of the invention and its use and application. The various aspects disclosed herein, including subject matter incorporated herein by reference, may be employed, in combination or subcombination and in various permutations, consistent with the claims.


REFERENCES (EACH OF WHICH IS EXPRESSLY INCORPORATED HEREIN BY REFERENCE)





    • Amor, H. B.; Ganesan, R. K.; Rathore, Y.; and Ross, H. 2018. Intention projection for human-robot collaboration with mixed reality cues. In Proceedings of the 1st International Workshop on Virtual, Augmented, and Mixed Reality for HRI (VAM-HRI).

    • Azuma, R.; Baillot, Y.; Behringer, R.; Feiner, S.; Julier, S.; and MacIntyre, B. 2001. Recent advances in augmented reality. Technical report, Naval Research Laboratory, Washington, DC.

    • Azuma, R. T. 1997. A survey of augmented reality. Presence: Teleoperators & Virtual Environments 6(4):355-385.

    • Chadalavada, R. T.; Andreasson, H.; Krug, R.; and Lilienthal, A. J. 2015. That's on my mind! robot to human intention communication through on-board projection on shared floor space. In Mobile Robots (ECMR), 2015 European Conference on, 1-6. IEEE.

    • Chai, J. Y.; She, L.; Fang, R.; Ottarson, S.; Littley, C.; Liu, C.; and Hanson, K. 2014. Collaborative effort towards common ground in situated human-robot dialogue. In Proceedings of the 2014 ACM/IEEE international conference on Human-robot interaction, 33-40. ACM.

    • Cheli, M.; Sinapov, J.; Danahy, E.; and Rogers, C. 2018. Towards an augmented reality framework for k-12 robotics education. In 1st International Workshop on Virtual, Augmented and Mixed Reality for Human-Robot Interaction (VAMHRI).

    • Erdem, E., and Patoglu, V. 2018. Applications of asp in robotics. KI-Künstliche Intelligenz 32(2-3):143-149.

    • Erdem, E.; Gelfond, M.; and Leone, N. 2016. Applications of answer set programming. AI Magazine 37(3):53-68.

    • Gelfond, M., and Kahl, Y. 2014. Knowledge representation, reasoning, and the design of intelligent agents: The answer-set programming approach. Cambridge University Press.

    • Green, S. A.; Billinghurst, M.; Chen, X.; and Chase, J. G. 2007. Augmented reality for human-robot collaboration. In Human Robot Interaction.

    • Hedayati, H.; Walker, M.; and Szafir, D. 2018. Improving collocated robot teleoperation with augmented reality. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, 78-86.

    • Ivanov, S. H.; Webster, C.; and Berezina, K. 2017. Adoption of robots and service automation by tourism and hospitality companies.

    • Lifschitz, V. 2002. Answer set programming and plan generation. Artificial Intelligence 138(1-2):39-54.

    • Lifschitz, V. 2008. What is answer set programming?. In AAAI, volume 8, 1594-1597.

    • Matuszek, C.; Herbst, E.; Zettlemoyer, L.; and Fox, D. 2013. Learning to parse natural language commands to a robot control system. In Experimental Robotics, 403-415. Springer.

    • Milgram, P.; Zhai, S.; Drascic, D.; and Grodski, J. 1993. Applications of augmented reality for human-robot communication. In Proceedings of 1993 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '93), volume 3, 1467-1472 vol. 3.

    • Muhammad, F.; Hassan, A.; Cleaver, A.; and Sinapov, J. 2019. Creating a shared reality with robots. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction.

    • Nickel, K., and Stiefelhagen, R. 2007. Visual recognition of pointing gestures for human-robot interaction. Image and vision computing 25(12):1875-1884.

    • Park, J., and Kim, G. J. 2009. Robots with projectors: An alternative to anthropomorphic hri. In Proceedings of the 4th ACM/IEEE International Conference on Human Robot Interaction.

    • Reinhart, G.; Vogl, W.; and Kresse, I. 2007. A projection-based user interface for industrial robots. In 2007 IEEE Symposium on Virtual Environments, Human-Computer Interfaces and Measurement Systems, 67-71.

    • Stone, P., and Veloso, M. 2000. Multiagent systems: A survey from a machine learning perspective. Autonomous Robots 8(3):345-383.

    • Tellex, S.; Kollar, T.; Dickerson, S.; Walter, M. R.; Banerjee, A. G.; Teller, S.; and Roy, N. 2011. Understanding natural language commands for robotic navigation and mobile manipulation. In Twenty-Fifth AAAI Conference on Artificial Intelligence.

    • Thomason, J.; Zhang, S.; Mooney, R. J.; and Stone, P. 2015. Learning to interpret natural language commands through human-robot dialog. In Twenty-Fourth International Joint Conference on Artificial Intelligence.

    • Waldherr, S.; Romero, R.; and Thrun, S. 2000. A gesture based interface for human-robot interaction. Autonomous Robots 9(2):151-173.

    • Walker, M.; Hedayati, H.; Lee, J.; and Szafir, D. 2018. Communicating robot motion intent with augmented reality. In Proceedings of the International Conference on Human-Robot Interaction.

    • Watanabe, A.; Ikeda, T.; Morales, Y.; Shinozawa, K.; Miyashita, T.; and Hagita, N. 2015. Communicating robotic navigational intentions. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).

    • Wooldridge, M. 2009. An introduction to multiagent systems. John Wiley & Sons.

    • Wurman, P. R.; D'Andrea, R.; and Mountz, M. 2008. Coordinating hundreds of cooperative, autonomous vehicles in warehouses. AI magazine 29(1):9.

    • Yang, F.; Khandelwal, P.; Leonetti, M.; and Stone, P. H. 2014. Planning in answer set programming while learning action costs for mobile robots. In 2014 AAAI Spring Symposium Series.

    • Yang, H.-D.; Park, A.-Y.; and Lee, S.-W. 2007. Gesture spotting and recognition for human-robot interaction. IEEE Transactions on robotics 23(2):256-270.




Claims
  • 1. A method for human-automaton collaboration, comprising: automatically generating a proposed motion plan for a respective automaton, comprising a proposed sequence of actions and motion trajectory to be performed by the respective automaton; presenting the proposed plan comprising the proposed sequence of actions and motion trajectory through a human computer interface for a human user; receiving feedback from the human user through the human computer interface relating to the proposed plan; and automatically altering the proposed plan to produce a modified plan comprising a rescheduled sequence of actions and motion trajectory to be performed by the respective automaton, in dependence on the received feedback and at least one predicted waiting period.
  • 2. The method according to claim 1, wherein the respective automaton is a robot, further comprising performing at least one alternate action while waiting for at least one action by the human user while the robot is performing the modified sequence of actions.
  • 3. The method according to claim 1, wherein the sequence of actions to be performed by the respective automaton is to be performed in the real world; and the proposed plan is overlayed as a visualizable time-space trajectory on a representation of the real world in an augmented reality interface for the human user.
  • 4. The method according to claim 1, wherein the human computer user interface is a 3D visual user interface.
  • 5. The method according to claim 1, wherein the plan comprises a time sequence of physical movement of the respective automaton, subject to waiting for a predicate action to occur.
  • 6. The method according to claim 1, wherein the plan comprises a series of tasks, wherein said automatically altering the proposed plan comprises rescheduling the series of tasks to synchronize the respective automaton with a planned human activity while optimizing a predicted efficiency of the automaton.
  • 7. The method according to claim 1, wherein the proposed plan comprises a coordinated set of physical activities and interactions of a plurality of automatons comprising the respective automaton, and said presenting comprises distinctly presenting the proposed motion plan for each of the plurality of automatons.
  • 8. The method according to claim 7, further comprising automatically coordinating tasks involving a contested resource between the plurality of automatons and presenting the automatically coordinated tasks involving the contested resource to the human user through the human computer interface comprising an augmented reality interface.
  • 9. The method according to claim 7, further comprising automatically coordinating tasks involving the plurality of automatons interacting with at least one human, and presenting the automatically coordinated tasks to the human user through the human computer interface comprising an augmented reality interface.
  • 10. The method according to claim 7, further comprising automatically coordinating tasks involving the plurality of automatons with a distributed automated control system.
  • 11. The method according to claim 1, wherein the plurality of automatons are configured to operate as independent agents, further comprising: automatically negotiating between the plurality of automatons comprising the respective automaton to optimize efficiency of a human-involved task; and including a result of the automatic negotiation in the proposed plan.
  • 12. A control planning system for an automaton, comprising: an automated planner configured to: automatically generate a plan for the automaton, comprising a sequence of actions and motion trajectory to be performed by the automaton to perform a task limited by a set of constraints; and automatically reschedule the sequence of actions and motion trajectory in dependence on a predicted waiting period to increase efficiency; a user interface, configured to present the plan to a human user; an input, configured to receive feedback from the human user relating to the presented plan; and a restrictor configured to automatically process the received feedback and present it to the automated planner as a constraint to update the set of constraints, wherein the automated planner is further configured to alter the plan for the automaton selectively dependent on the updated set of constraints.
  • 13. The control planning system for an automaton according to claim 12, wherein the automaton is a robot and the automated planner is further configured to perform an alternate action while waiting for at least one action by the human while the robot is performing the prior sequence of actions and motion trajectory as part of the task.
  • 14. The control planning system for an automaton according to claim 12, wherein the sequence of actions and motion trajectory are to be performed in the real world; and the user interface is configured to overlay the plan as a visualizable time-space trajectory on a representation of the real world in an augmented reality interface for the human user.
  • 15. The control planning system for an automaton according to claim 12, wherein the user interface is a 3D visual user interface.
  • 16. The control planning system for an automaton according to claim 12, wherein the plan comprises a series of physical tasks, and the automated planner is further configured to alter the plan to reschedule the series of physical tasks to synchronize the automaton with a planned human activity and increase efficiency, based on the received feedback.
  • 17. The control planning system for an automaton according to claim 12, wherein the automaton is configured to interact with a plurality of collaborative automatons, each collaborative automaton being configured to automatically negotiate with another collaborative automaton and the automaton to coordinate aspects of the task within the set of constraints.
  • 18. The control planning system for an automaton according to claim 12, wherein the automaton is configured to interact with a plurality of automatons, and at least one of the automated planner and the restrictor is configured to automatically coordinate tasks involving a contested resource between the plurality of automatons and the automaton within the plan before receipt of the human input.
  • 19. The control planning system for an automaton according to claim 12, wherein the automaton is an independent agent configured to interact with a plurality of automatons each representing independent agents, and at least one of the automated planner and the restrictor is further configured to employ a result of an automatic negotiation between the plurality of automatons and the automaton to optimize efficiency of a human-involved task.
  • 20. A non-transitory computer readable medium, storing instructions for controlling an automaton, comprising: instructions for generating an efficient time-based plan for the automaton dependent on a set of constraints, comprising a sequence of actions and motion trajectory to be performed by the automaton; instructions for presenting the plan comprising the sequence of actions and motion trajectory through a human computer user interface for a human user; instructions for receiving feedback from the human user relating to the plan comprising the sequence of actions and motion trajectory to be performed by the automaton; and instructions for altering the set of constraints on the plan in dependence on the received feedback from the human user and at least one predicted waiting period for the automaton to perform a respective action or motion trajectory.
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a non-provisional of, and claims benefit of priority under 35 U.S.C. § 119(e) from, U.S. Provisional Patent Application No. 62/902,830, filed Sep. 19, 2019, the entirety of which is expressly incorporated herein by reference.

20120274775 Reiffel Nov 2012 A1
20120290152 Cheung et al. Nov 2012 A1
20120310112 Fichtinger et al. Dec 2012 A1
20120316725 Trepagnier et al. Dec 2012 A1
20130079693 Ranky et al. Mar 2013 A1
20130131985 Weiland et al. May 2013 A1
20130165070 Hoffberg Jun 2013 A1
20130166195 Bandyopadhyay et al. Jun 2013 A1
20130166202 Bandyopadhyay et al. Jun 2013 A1
20130166387 Hoffberg Jun 2013 A1
20130196300 Huang et al. Aug 2013 A1
20130212420 Lawson et al. Aug 2013 A1
20130238183 Goulding Sep 2013 A1
20130252586 Helfrich Sep 2013 A1
20130274986 Trepagnier et al. Oct 2013 A1
20130279392 Rubin et al. Oct 2013 A1
20130279393 Rubin et al. Oct 2013 A1
20130279491 Rubin et al. Oct 2013 A1
20130303847 Sitti et al. Nov 2013 A1
20130320212 Valentino et al. Dec 2013 A1
20130335273 Pakzad et al. Dec 2013 A1
20130343640 Buehler et al. Dec 2013 A1
20130345870 Buehler Dec 2013 A1
20130346348 Buehler Dec 2013 A1
20140039298 Whitcomb et al. Feb 2014 A1
20140067188 Mian Mar 2014 A1
20140081459 Dubois et al. Mar 2014 A1
20140100693 Fong et al. Apr 2014 A1
20140100697 Goulding Apr 2014 A1
20140155087 Helfrich Jun 2014 A1
20140155098 Markham et al. Jun 2014 A1
20140156806 Karpistsenko et al. Jun 2014 A1
20140163664 Goldsmith Jun 2014 A1
20140187913 Fichtinger et al. Jul 2014 A1
20140193040 Bronshtein Jul 2014 A1
20140214259 Trepagnier et al. Jul 2014 A1
20140241612 Rhemann et al. Aug 2014 A1
20140241617 Shotton et al. Aug 2014 A1
20140244035 Perlin et al. Aug 2014 A1
20140263989 Valentino et al. Sep 2014 A1
20140264047 Valentino et al. Sep 2014 A1
20140266939 Baringer et al. Sep 2014 A1
20140268601 Valentino et al. Sep 2014 A1
20140273858 Panther et al. Sep 2014 A1
20140275760 Lee et al. Sep 2014 A1
20140275850 Venkatraman et al. Sep 2014 A1
20140275852 Hong et al. Sep 2014 A1
20140275854 Venkatraman et al. Sep 2014 A1
20140276119 Venkatraman et al. Sep 2014 A1
20140277737 Sekiyama et al. Sep 2014 A1
20140278220 Yuen Sep 2014 A1
20140278229 Hong et al. Sep 2014 A1
20140278634 Horvitz et al. Sep 2014 A1
20140288390 Hong et al. Sep 2014 A1
20140288391 Hong et al. Sep 2014 A1
20140288392 Hong et al. Sep 2014 A1
20140288435 Richards et al. Sep 2014 A1
20140288436 Venkatraman et al. Sep 2014 A1
20140288438 Venkatraman et al. Sep 2014 A1
20140293014 Gay et al. Oct 2014 A1
20140297217 Yuen Oct 2014 A1
20140297218 Yuen Oct 2014 A1
20140303486 Baumgartner et al. Oct 2014 A1
20140305204 Hong et al. Oct 2014 A1
20140316305 Venkatraman et al. Oct 2014 A1
20140316570 Sun et al. Oct 2014 A1
20140333668 Gay et al. Nov 2014 A1
20140347265 Aimone et al. Nov 2014 A1
20140356817 Brooks et al. Dec 2014 A1
20140358012 Richards et al. Dec 2014 A1
20140374480 Pollack Dec 2014 A1
20140378999 Crawford et al. Dec 2014 A1
20150010437 Mellars et al. Jan 2015 A1
20150016777 Abovitz et al. Jan 2015 A1
20150019013 Rose et al. Jan 2015 A1
20150019124 Bandyopadhyay et al. Jan 2015 A1
20150025393 Hong et al. Jan 2015 A1
20150025394 Hong et al. Jan 2015 A1
20150032164 Crawford et al. Jan 2015 A1
20150032252 Galluzzo et al. Jan 2015 A1
20150037437 Gill et al. Feb 2015 A1
20150051519 Morbi et al. Feb 2015 A1
20150073646 Rosenstein et al. Mar 2015 A1
20150081156 Trepagnier et al. Mar 2015 A1
20150081444 Hoffberg Mar 2015 A1
20150094096 Tang et al. Apr 2015 A1
20150106427 Tang et al. Apr 2015 A1
20150118756 Pollack et al. Apr 2015 A1
20150122018 Yuen May 2015 A1
20150127141 Kawada et al. May 2015 A1
20150158182 Farlow et al. Jun 2015 A1
20150168939 Masoud Jun 2015 A1
20150173631 Richards et al. Jun 2015 A1
20150192682 Valentino et al. Jul 2015 A1
20150196256 Venkatraman et al. Jul 2015 A1
20150201853 Hong et al. Jul 2015 A1
20150201854 Hong et al. Jul 2015 A1
20150202770 Patron et al. Jul 2015 A1
20150223708 Richards et al. Aug 2015 A1
20150230735 Venkatraman et al. Aug 2015 A1
20150234477 Abovitz et al. Aug 2015 A1
20150235088 Abovitz et al. Aug 2015 A1
20150235370 Abovitz et al. Aug 2015 A1
20150235441 Abovitz et al. Aug 2015 A1
20150235447 Abovitz et al. Aug 2015 A1
20150241458 Pollack Aug 2015 A1
20150241705 Abovitz et al. Aug 2015 A1
20150241959 Abovitz et al. Aug 2015 A1
20150242575 Abovitz et al. Aug 2015 A1
20150242943 Abovitz et al. Aug 2015 A1
20150243100 Abovitz et al. Aug 2015 A1
20150243105 Abovitz et al. Aug 2015 A1
20150243106 Abovitz et al. Aug 2015 A1
20150247723 Abovitz et al. Sep 2015 A1
20150247975 Abovitz et al. Sep 2015 A1
20150247976 Abovitz et al. Sep 2015 A1
20150248169 Abovitz et al. Sep 2015 A1
20150248170 Abovitz et al. Sep 2015 A1
20150248787 Abovitz et al. Sep 2015 A1
20150248788 Abovitz et al. Sep 2015 A1
20150248789 Abovitz et al. Sep 2015 A1
20150248791 Abovitz et al. Sep 2015 A1
20150248792 Abovitz et al. Sep 2015 A1
20150248793 Abovitz et al. Sep 2015 A1
20150256401 Zinger et al. Sep 2015 A1
20150268355 Valentino et al. Sep 2015 A1
20150273242 Balakin Oct 2015 A1
20150273691 Pollack Oct 2015 A1
20150276775 Mellars et al. Oct 2015 A1
20150286221 Goulding Oct 2015 A1
20150290453 Tyler et al. Oct 2015 A1
20150290454 Tyler et al. Oct 2015 A1
20150290802 Buehler et al. Oct 2015 A1
20150290803 Buehler et al. Oct 2015 A1
20150301072 Gelbman Oct 2015 A1
20150308839 Jiang et al. Oct 2015 A1
20150309263 Abovitz et al. Oct 2015 A2
20150309264 Abovitz et al. Oct 2015 A1
20150314166 Hong et al. Nov 2015 A1
20150323990 Maltz Nov 2015 A1
20150332213 Galluzzo et al. Nov 2015 A1
20150343238 Balakin Dec 2015 A1
20150353206 Wang Dec 2015 A1
20150355207 Pollack et al. Dec 2015 A1
20150355211 Mellars et al. Dec 2015 A1
20150360057 Balakin Dec 2015 A1
20150369864 Marlow et al. Dec 2015 A1
20160004306 Maltz Jan 2016 A1
20160018816 Hosek et al. Jan 2016 A1
20160023762 Wang Jan 2016 A1
20160025500 Hoffberg Jan 2016 A1
20160025502 Lacaze et al. Jan 2016 A1
20160026253 Bradski et al. Jan 2016 A1
20160036118 Baringer et al. Feb 2016 A1
20160039540 Wang Feb 2016 A1
20160039553 Akdogan et al. Feb 2016 A1
20160039621 Akdogan et al. Feb 2016 A1
20160042151 Akdogan et al. Feb 2016 A1
20160045841 Kaplan et al. Feb 2016 A1
20160051169 Hong et al. Feb 2016 A1
20160054135 Fowe Feb 2016 A1
20160055677 Kuffner Feb 2016 A1
20160066844 Venkatraman et al. Mar 2016 A1
20160070436 Thomas et al. Mar 2016 A1
20160081575 Wu Mar 2016 A1
20160082597 Gorshechnikov et al. Mar 2016 A1
20160084869 Yuen et al. Mar 2016 A1
20160086108 Abelow Mar 2016 A1
20160109954 Harris et al. Apr 2016 A1
20160129592 Saboo et al. May 2016 A1
20160132059 Mason et al. May 2016 A1
20160136284 Gill et al. May 2016 A1
20160140729 Soatto et al. May 2016 A1
20160143500 Fong et al. May 2016 A1
20160148433 Petrovskaya et al. May 2016 A1
20160158937 Kamoi et al. Jun 2016 A1
20160167582 Chen et al. Jun 2016 A1
20160170414 Chen et al. Jun 2016 A1
20160171884 Chen et al. Jun 2016 A1
20160171893 Chen et al. Jun 2016 A1
20160183818 Richards et al. Jun 2016 A1
20160187166 Ranky et al. Jun 2016 A1
20160201933 Hester et al. Jul 2016 A1
20160201934 Hester et al. Jul 2016 A1
20160212402 Maruyama et al. Jul 2016 A1
20160216117 Bandyopadhyay et al. Jul 2016 A9
20160236582 Wang Aug 2016 A1
20160253844 Petrovskaya et al. Sep 2016 A1
20160260322 Chen et al. Sep 2016 A1
20160282126 Watts et al. Sep 2016 A1
20160283774 Buchanan et al. Sep 2016 A1
20160288905 Gong et al. Oct 2016 A1
20160291593 Hammond et al. Oct 2016 A1
20160292403 Gong et al. Oct 2016 A1
20160292696 Gong et al. Oct 2016 A1
20160292869 Hammond et al. Oct 2016 A1
20160292872 Hammond et al. Oct 2016 A1
20160297429 Watts Oct 2016 A1
20160299506 Bruggeman et al. Oct 2016 A1
20160302706 Richards et al. Oct 2016 A1
20160311116 Hill Oct 2016 A1
20160313739 Mian Oct 2016 A1
20160325143 Yuen et al. Nov 2016 A1
20160332748 Wang Nov 2016 A1
20160375592 Szatmary et al. Dec 2016 A1
20160375779 Wang Dec 2016 A1
20160377381 Lyren Dec 2016 A1
20160377508 Perrone et al. Dec 2016 A1
20160378111 Lenser et al. Dec 2016 A1
20160378117 Szatmary et al. Dec 2016 A1
20160378861 Eledath et al. Dec 2016 A1
20160379074 Nielsen et al. Dec 2016 A1
20170011210 Cheong et al. Jan 2017 A1
20170021497 Tseng et al. Jan 2017 A1
20170021502 Nusser et al. Jan 2017 A1
20170027523 Venkatraman et al. Feb 2017 A1
20170031369 Liu et al. Feb 2017 A1
20170038579 Yeoh et al. Feb 2017 A1
20170039764 Hu et al. Feb 2017 A1
20170039859 Hu et al. Feb 2017 A1
20170045893 Goulding Feb 2017 A1
20170045894 Canoy et al. Feb 2017 A1
20170069214 Dupray et al. Mar 2017 A1
20170075116 Gardiner Mar 2017 A1
20170084983 Baringer et al. Mar 2017 A1
20170086698 Wu Mar 2017 A1
20170087301 Wu Mar 2017 A1
20170087381 Balakin Mar 2017 A1
20170090478 Blayvas et al. Mar 2017 A1
20170097506 Schowengerdt et al. Apr 2017 A1
20170097507 Yeoh et al. Apr 2017 A1
20170100837 Zevenbergen et al. Apr 2017 A1
20170102711 Watts Apr 2017 A1
20170105592 Fong et al. Apr 2017 A1
20170108871 Watts Apr 2017 A1
20170111223 Matni et al. Apr 2017 A1
20170112392 Wu Apr 2017 A1
20170112407 Wu Apr 2017 A1
20170113352 Lutz et al. Apr 2017 A1
20170116477 Chen et al. Apr 2017 A1
20170127652 Shen et al. May 2017 A1
20170129603 Raptopoulos et al. May 2017 A1
20170132931 Hoffberg May 2017 A1
20170148213 Thomas et al. May 2017 A1
20170160398 Venkatraman et al. Jun 2017 A1
20170165841 Kamoi Jun 2017 A1
20170169713 Gong et al. Jun 2017 A1
20170182657 Rose et al. Jun 2017 A1
20170182664 Watts Jun 2017 A1
20170188864 Drury Jul 2017 A1
20170188893 Venkatraman et al. Jul 2017 A1
20170203446 Dooley et al. Jul 2017 A1
20170212511 Paiva Ferreira et al. Jul 2017 A1
20170215086 Priest Jul 2017 A1
20170215381 Shen et al. Aug 2017 A1
20170223034 Singh et al. Aug 2017 A1
20170223037 Singh et al. Aug 2017 A1
20170223046 Singh Aug 2017 A1
20170225321 Deyle et al. Aug 2017 A1
20170225332 Deyle et al. Aug 2017 A1
20170225334 Deyle et al. Aug 2017 A1
20170225336 Deyle et al. Aug 2017 A1
20170227965 Decenzo et al. Aug 2017 A1
20170232615 Hammock Aug 2017 A1
20170235316 Shattil Aug 2017 A1
20170239719 Buller et al. Aug 2017 A1
20170239720 Levin et al. Aug 2017 A1
20170239721 Buller et al. Aug 2017 A1
20170239752 Buller et al. Aug 2017 A1
20170239891 Buller et al. Aug 2017 A1
20170239892 Buller et al. Aug 2017 A1
20170248966 Lutz et al. Aug 2017 A1
20170257162 Panther et al. Sep 2017 A1
20170257778 Priest Sep 2017 A1
20170270361 Puttagunta et al. Sep 2017 A1
20170277193 Frazzoli et al. Sep 2017 A1
20170277194 Frazzoli et al. Sep 2017 A1
20170277195 Frazzoli et al. Sep 2017 A1
20170287337 Chen et al. Oct 2017 A1
20170291093 Janssen Oct 2017 A1
20170300540 Karpistsenko et al. Oct 2017 A1
20170305015 Krasny et al. Oct 2017 A1
20170309069 Thomas et al. Oct 2017 A1
20170318360 Tran et al. Nov 2017 A1
20170318477 Priest Nov 2017 A1
20170323563 Pundurs Nov 2017 A1
20170329347 Passot et al. Nov 2017 A1
20170337826 Moran et al. Nov 2017 A1
20170341231 Tan et al. Nov 2017 A1
20170343695 Stetson et al. Nov 2017 A1
20170352192 Petrovskaya et al. Dec 2017 A1
20170357270 Russell Dec 2017 A1
20170366751 Terry et al. Dec 2017 A1
20170368684 Zevenbergen et al. Dec 2017 A1
20170372618 Xu et al. Dec 2017 A1
20180001476 Tan et al. Jan 2018 A1
20180004213 Absmeier et al. Jan 2018 A1
20180009059 Aoki Jan 2018 A1
20180015347 Janssen Jan 2018 A1
20180021097 Quaid et al. Jan 2018 A1
20180032949 Galluzzo et al. Feb 2018 A1
20180039287 Shattil Feb 2018 A1
20180041907 Terry et al. Feb 2018 A1
20180042526 Hong et al. Feb 2018 A1
20180043547 Hance et al. Feb 2018 A1
20180048876 Gay et al. Feb 2018 A1
20180052276 Klienman et al. Feb 2018 A1
20180052277 Schowengerdt et al. Feb 2018 A1
20180052320 Curtis et al. Feb 2018 A1
20180052451 Billi-Duran et al. Feb 2018 A1
20180052501 Jones, Jr. et al. Feb 2018 A1
20180055312 Jung Mar 2018 A1
20180055326 Jung Mar 2018 A1
20180059297 Peroz et al. Mar 2018 A1
20180059304 Bhargava et al. Mar 2018 A1
20180059672 Li et al. Mar 2018 A1
20180060764 Hance et al. Mar 2018 A1
20180060765 Hance et al. Mar 2018 A1
20180061137 Jung Mar 2018 A1
20180061243 Shloosh Mar 2018 A1
20180068255 Hance et al. Mar 2018 A1
20180068358 Hoffberg Mar 2018 A1
20180068567 Gong et al. Mar 2018 A1
20180071949 Giles Mar 2018 A1
20180075302 Udell et al. Mar 2018 A1
20180075649 Godwin et al. Mar 2018 A1
20180077902 Shen et al. Mar 2018 A1
20180081439 Daniels Mar 2018 A1
20180082308 Gong et al. Mar 2018 A1
20180084242 Rublee et al. Mar 2018 A1
20180085616 Makiyama et al. Mar 2018 A1
20180085914 Kuffner, Jr. Mar 2018 A1
20180091791 Jiang et al. Mar 2018 A1
20180091869 Krishna et al. Mar 2018 A1
20180096495 Jiang et al. Apr 2018 A1
20180104829 Altman et al. Apr 2018 A1
20180109223 Panas et al. Apr 2018 A1
20180113468 Russell Apr 2018 A1
20180116898 Morbi et al. May 2018 A1
20180119534 Jamison et al. May 2018 A1
20180120856 Gabardos et al. May 2018 A1
20180123291 Brandwijk May 2018 A1
20180126460 Murphree et al. May 2018 A1
20180126461 Buller et al. May 2018 A1
20180126462 Murphree et al. May 2018 A1
20180126649 Romano et al. May 2018 A1
20180126650 Murphree et al. May 2018 A1
20180133801 Buller et al. May 2018 A1
20180139364 Jannard May 2018 A1
20180144558 Priest May 2018 A1
20180153084 Calleija et al. Jun 2018 A1
20180157336 Harris et al. Jun 2018 A1
20180158236 Priest Jun 2018 A1
20180165974 Bonkoski et al. Jun 2018 A1
20180170392 Yang et al. Jun 2018 A1
20180172450 Lalonde et al. Jun 2018 A1
20180173242 Lalonde et al. Jun 2018 A1
20180174357 Priest et al. Jun 2018 A1
20180178376 Lalonde et al. Jun 2018 A1
20180178382 Lalonde et al. Jun 2018 A1
20180178663 Wang Jun 2018 A9
20180180421 Holz Jun 2018 A1
20180186067 Buller et al. Jul 2018 A1
20180186080 Milshtein et al. Jul 2018 A1
20180186081 Milshtein et al. Jul 2018 A1
20180186082 Randhawa Jul 2018 A1
20180196404 Stilwell Jul 2018 A1
20180204111 Zadeh et al. Jul 2018 A1
20180207791 Szatmary et al. Jul 2018 A1
20180211263 Gong et al. Jul 2018 A1
20180211441 Priest et al. Jul 2018 A1
20180215039 Sinyavskiy et al. Aug 2018 A1
20180218619 Brown et al. Aug 2018 A1
20180225597 Hance et al. Aug 2018 A1
20180231972 Woon et al. Aug 2018 A1
20180232668 Hance et al. Aug 2018 A1
20180233054 Woon et al. Aug 2018 A1
20180233856 Brandwijk Aug 2018 A1
20180238164 Jamison et al. Aug 2018 A1
20180249343 Priest et al. Aug 2018 A1
20180251135 Luo et al. Sep 2018 A1
20180251234 Wang Sep 2018 A1
20180252535 Bhimavarapu et al. Sep 2018 A1
20180255465 Priest et al. Sep 2018 A1
20180259976 Williams et al. Sep 2018 A1
20180261023 Blayvas et al. Sep 2018 A1
20180263170 Aghai et al. Sep 2018 A1
20180273296 Wagner et al. Sep 2018 A1
20180273297 Wagner et al. Sep 2018 A1
20180273298 Wagner et al. Sep 2018 A1
20180276891 Craner Sep 2018 A1
20180278920 Stefan Sep 2018 A1
20180281191 Sinyavskiy et al. Oct 2018 A1
20180282065 Wagner et al. Oct 2018 A1
20180282066 Wagner et al. Oct 2018 A1
20180284735 Cella et al. Oct 2018 A1
20180284736 Cella et al. Oct 2018 A1
20180284737 Cella et al. Oct 2018 A1
20180284741 Cella et al. Oct 2018 A1
20180284742 Cella et al. Oct 2018 A1
20180284743 Cella et al. Oct 2018 A1
20180284744 Cella et al. Oct 2018 A1
20180284745 Cella et al. Oct 2018 A1
20180284746 Cella et al. Oct 2018 A1
20180284747 Cella et al. Oct 2018 A1
20180284749 Cella et al. Oct 2018 A1
20180284752 Cella et al. Oct 2018 A1
20180284753 Cella et al. Oct 2018 A1
20180284754 Cella et al. Oct 2018 A1
20180284755 Cella et al. Oct 2018 A1
20180284756 Cella et al. Oct 2018 A1
20180284757 Cella et al. Oct 2018 A1
20180284758 Cella et al. Oct 2018 A1
20180284786 Moshkina-Martinson et al. Oct 2018 A1
20180285697 Shotton et al. Oct 2018 A1
20180288303 Wang et al. Oct 2018 A1
20180288586 Tran et al. Oct 2018 A1
20180293536 Galluzzo et al. Oct 2018 A1
20180296916 Chung et al. Oct 2018 A1
20180299878 Cella et al. Oct 2018 A1
20180299882 Kichkaylo Oct 2018 A1
20180300835 Saboo et al. Oct 2018 A1
20180304461 Shaw Oct 2018 A1
20180304468 Holz Oct 2018 A1
20180306587 Holz Oct 2018 A1
20180306589 Holz Oct 2018 A1
20180306591 Jose et al. Oct 2018 A1
20180307241 Holz Oct 2018 A1
20180307941 Holz et al. Oct 2018 A1
20180311822 Kaminka et al. Nov 2018 A1
20180312824 Zhang et al. Nov 2018 A1
20180321666 Cella et al. Nov 2018 A1
20180321667 Cella et al. Nov 2018 A1
20180321672 Cella et al. Nov 2018 A1
20180322197 Hesterman Nov 2018 A1
20180322779 Pundurs Nov 2018 A1
20180329425 Watts Nov 2018 A1
20180330293 Kulkarni et al. Nov 2018 A1
20180335502 Lowe et al. Nov 2018 A1
20180348761 Zhu et al. Dec 2018 A1
20180348764 Zhang et al. Dec 2018 A1
20180361586 Tan et al. Dec 2018 A1
20180362158 Zhang et al. Dec 2018 A1
20180362190 Chambers et al. Dec 2018 A1
20180364724 Ibarz Gabardos et al. Dec 2018 A1
20180365603 Hance et al. Dec 2018 A1
20180370046 Hance et al. Dec 2018 A1
20180373320 Petrovskaya et al. Dec 2018 A1
20180374266 Schowengerdt et al. Dec 2018 A1
20190001492 Rose et al. Jan 2019 A1
20190011921 Wang et al. Jan 2019 A1
20190011932 McGrath Jan 2019 A1
20190015167 Draelos et al. Jan 2019 A1
20190020530 Au et al. Jan 2019 A1
20190023438 Akdogan et al. Jan 2019 A1
20190025805 Cella et al. Jan 2019 A1
20190025806 Cella et al. Jan 2019 A1
20190025812 Cella et al. Jan 2019 A1
20190025813 Cella et al. Jan 2019 A1
20190033845 Cella et al. Jan 2019 A1
20190033846 Cella et al. Jan 2019 A1
20190033847 Cella et al. Jan 2019 A1
20190033848 Cella et al. Jan 2019 A1
20190033849 Cella et al. Jan 2019 A1
20190033888 Bosworth et al. Jan 2019 A1
20190034728 Puttagunta et al. Jan 2019 A1
20190034729 Puttagunta et al. Jan 2019 A1
20190034730 Puttagunta et al. Jan 2019 A1
20190034839 Hance et al. Jan 2019 A1
20190041223 Yang et al. Feb 2019 A1
20190041835 Cella et al. Feb 2019 A1
20190041836 Cella et al. Feb 2019 A1
20190041840 Cella et al. Feb 2019 A1
20190041841 Cella et al. Feb 2019 A1
20190041842 Cella et al. Feb 2019 A1
20190041843 Cella et al. Feb 2019 A1
20190041844 Cella et al. Feb 2019 A1
20190041845 Cella et al. Feb 2019 A1
20190041846 Cella et al. Feb 2019 A1
20190041852 Schubert et al. Feb 2019 A1
20190049968 Dean et al. Feb 2019 A1
20190051054 Jovanovic et al. Feb 2019 A1
20190051178 Priev Feb 2019 A1
20190051198 Nimmagadda et al. Feb 2019 A1
20190056693 Gelman et al. Feb 2019 A1
20190060741 Contreras Feb 2019 A1
20190064791 Cella et al. Feb 2019 A1
20190064792 Cella et al. Feb 2019 A1
20190073760 Wang et al. Mar 2019 A1
20190077510 Panas et al. Mar 2019 A1
20190079509 Bosworth Mar 2019 A1
20190079523 Zhu et al. Mar 2019 A1
20190079524 Zhu et al. Mar 2019 A1
20190079526 Vallespi-Gonzalez et al. Mar 2019 A1
20190079528 Zhu et al. Mar 2019 A1
20190080266 Zhu et al. Mar 2019 A1
20190080515 Geri et al. Mar 2019 A1
20190080516 Petrovskaya et al. Mar 2019 A1
20190082985 Hong et al. Mar 2019 A1
20190086919 Zhang et al. Mar 2019 A1
20190086925 Fan et al. Mar 2019 A1
20190086930 Fan et al. Mar 2019 A1
20190086932 Fan et al. Mar 2019 A1
20190086934 Canoso et al. Mar 2019 A1
20190086938 Shattil Mar 2019 A1
20190088133 Alieiev et al. Mar 2019 A1
20190088162 Meglan Mar 2019 A1
20190089760 Zhang et al. Mar 2019 A1
20190092179 Kwa et al. Mar 2019 A1
20190092183 Sussman et al. Mar 2019 A1
20190092184 Sussman et al. Mar 2019 A1
20190094870 Afrouzi et al. Mar 2019 A1
20190094981 Bradski et al. Mar 2019 A1
20190097443 Kwa et al. Mar 2019 A1
20190100196 Ueda et al. Apr 2019 A1
20190101394 van der Meijden et al. Apr 2019 A1
20190104919 Shelton et al. Apr 2019 A1
20190105200 Hipsley Apr 2019 A1
20190107845 Kaine Apr 2019 A1
20190113351 Antony Apr 2019 A1
20190113918 Englard et al. Apr 2019 A1
20190113919 Englard et al. Apr 2019 A1
20190113920 Englard et al. Apr 2019 A1
20190113927 Englard et al. Apr 2019 A1
20190116758 Shen et al. Apr 2019 A1
20190118104 Su Apr 2019 A1
20190120639 Song et al. Apr 2019 A1
20190120967 Smits Apr 2019 A1
20190121333 Cella et al. Apr 2019 A1
20190121338 Cella et al. Apr 2019 A1
20190121339 Cella et al. Apr 2019 A1
20190121340 Cella et al. Apr 2019 A1
20190121341 Cella et al. Apr 2019 A1
20190121342 Cella et al. Apr 2019 A1
20190121343 Cella et al. Apr 2019 A1
20190121344 Cella et al. Apr 2019 A1
20190121345 Cella et al. Apr 2019 A1
20190121346 Cella et al. Apr 2019 A1
20190121347 Cella et al. Apr 2019 A1
20190121348 Cella et al. Apr 2019 A1
20190121349 Cella et al. Apr 2019 A1
20190121350 Cella et al. Apr 2019 A1
20190121365 Passot et al. Apr 2019 A1
20190125126 Cohen May 2019 A1
20190125361 Shelton et al. May 2019 A1
20190125454 Stokes et al. May 2019 A1
20190125455 Shelton et al. May 2019 A1
20190125456 Shelton et al. May 2019 A1
20190125457 Parihar et al. May 2019 A1
20190125458 Shelton et al. May 2019 A1
20190125459 Shelton et al. May 2019 A1
20190128390 Williams May 2019 A1
20190129404 Cella et al. May 2019 A1
20190129405 Cella et al. May 2019 A1
20190129406 Cella et al. May 2019 A1
20190129407 Cella et al. May 2019 A1
20190129408 Cella et al. May 2019 A1
20190129409 Cella et al. May 2019 A1
20190129410 Cella et al. May 2019 A1
20190130182 Zang et al. May 2019 A1
20190130637 Parmar et al. May 2019 A1
20190135296 Hummelshoj May 2019 A1
20190137985 Cella et al. May 2019 A1
20190137986 Cella et al. May 2019 A1
20190137987 Cella et al. May 2019 A1
20190137988 Cella et al. May 2019 A1
20190137989 Cella et al. May 2019 A1
20190143412 Buller et al. May 2019 A1
20190145239 Yu et al. May 2019 A1
20190145765 Luo et al. May 2019 A1
20190145784 Ma et al. May 2019 A1
20190146451 Bhatt et al. May 2019 A1
20190146472 Cella et al. May 2019 A1
20190146473 Cella et al. May 2019 A1
20190146474 Cella et al. May 2019 A1
20190146475 Cella et al. May 2019 A1
20190146476 Cella et al. May 2019 A1
20190146477 Cella et al. May 2019 A1
20190146478 Cella et al. May 2019 A1
20190146479 Cella et al. May 2019 A1
20190146480 Cella et al. May 2019 A1
20190146481 Cella et al. May 2019 A1
20190146482 Cella et al. May 2019 A1
20190146515 De Salvo et al. May 2019 A1
20190147253 Bai et al. May 2019 A1
20190147254 Bai et al. May 2019 A1
20190147255 Homayounfar et al. May 2019 A1
20190147260 May May 2019 A1
20190149725 Adato May 2019 A1
20190155263 Cella et al. May 2019 A1
20190155272 Cella et al. May 2019 A1
20190155295 Moore et al. May 2019 A1
20190155296 Moore et al. May 2019 A1
20190156128 Zhang et al. May 2019 A1
20190159848 Quaid et al. May 2019 A1
20190160675 Paschall, II et al. May 2019 A1
20190161274 Paschall et al. May 2019 A1
20190162575 Cozzens et al. May 2019 A1
20190163206 Zhu et al. May 2019 A1
20190164346 Kim et al. May 2019 A1
20190170521 Elhoushi et al. Jun 2019 A1
20190171187 Cella et al. Jun 2019 A1
20190171912 Vallespi-Gonzalez et al. Jun 2019 A1
20190176328 Kichkaylo et al. Jun 2019 A1
20190176329 Swilling Jun 2019 A1
20190178638 Abovitz et al. Jun 2019 A1
20190179290 Yoshida et al. Jun 2019 A1
20190179300 Cella et al. Jun 2019 A1
20190179301 Cella et al. Jun 2019 A1
20190179329 Keivan et al. Jun 2019 A1
20190179976 Stroila et al. Jun 2019 A1
20190180499 Caulfield et al. Jun 2019 A1
20190187680 Cella et al. Jun 2019 A1
20190187681 Cella et al. Jun 2019 A1
20190187682 Cella et al. Jun 2019 A1
20190187683 Cella et al. Jun 2019 A1
20190187684 Cella et al. Jun 2019 A1
20190187685 Cella et al. Jun 2019 A1
20190187686 Cella et al. Jun 2019 A1
20190187687 Cella et al. Jun 2019 A1
20190187688 Cella et al. Jun 2019 A1
20190187689 Cella et al. Jun 2019 A1
20190187690 Cella et al. Jun 2019 A1
20190187703 Millard et al. Jun 2019 A1
20190187715 Zhang et al. Jun 2019 A1
20190188632 Galluzzo et al. Jun 2019 A1
20190188766 Cho et al. Jun 2019 A1
20190188788 Baker et al. Jun 2019 A1
20190188895 Marshall et al. Jun 2019 A1
20190188913 Cho et al. Jun 2019 A1
20190188917 Cho et al. Jun 2019 A1
20190189160 Huang Jun 2019 A1
20190191125 Fink et al. Jun 2019 A1
20190193276 Deyle et al. Jun 2019 A1
20190193629 Zevenbergen et al. Jun 2019 A1
20190196472 Korner et al. Jun 2019 A1
20190196480 Taylor Jun 2019 A1
20190196485 Li et al. Jun 2019 A1
20190197788 Forbes et al. Jun 2019 A1
20190200844 Shelton et al. Jul 2019 A1
20190200977 Shelton et al. Jul 2019 A1
20190201037 Houser et al. Jul 2019 A1
20190201038 Yates et al. Jul 2019 A1
20190201040 Messerly et al. Jul 2019 A1
20190201042 Nott et al. Jul 2019 A1
20190201046 Shelton et al. Jul 2019 A1
20190201127 Shelton et al. Jul 2019 A1
20190201136 Shelton et al. Jul 2019 A1
20190204201 Shelton et al. Jul 2019 A1
20190206045 Wang et al. Jul 2019 A1
20190206562 Shelton et al. Jul 2019 A1
20190206565 Shelton Jul 2019 A1
20190208979 Bassa Jul 2019 A1
20190212106 Bortz et al. Jul 2019 A1
20190212901 Garrison et al. Jul 2019 A1
20190213212 Adato et al. Jul 2019 A1
20190213390 Adato et al. Jul 2019 A1
20190213418 Adato et al. Jul 2019 A1
20190213421 Adato et al. Jul 2019 A1
20190213441 Adato et al. Jul 2019 A1
20190213523 Adato et al. Jul 2019 A1
20190213534 Adato et al. Jul 2019 A1
20190213535 Adato et al. Jul 2019 A1
20190213545 Adato et al. Jul 2019 A1
20190213546 Adato et al. Jul 2019 A1
20190213752 Adato et al. Jul 2019 A1
20190213755 Bassa et al. Jul 2019 A1
20190215424 Adato et al. Jul 2019 A1
20190219409 Tan et al. Jul 2019 A1
20190219995 Cella et al. Jul 2019 A1
20190219996 Cella et al. Jul 2019 A1
20190220012 Zhang et al. Jul 2019 A1
20190220863 Novick et al. Jul 2019 A1
20190222986 Aitken et al. Jul 2019 A1
20190223958 Kohli et al. Jul 2019 A1
20190227536 Cella et al. Jul 2019 A1
20190227537 Cella et al. Jul 2019 A1
20190228266 Habibian et al. Jul 2019 A1
20190228495 Tremblay et al. Jul 2019 A1
20190228573 Sen et al. Jul 2019 A1
20190228854 Jain et al. Jul 2019 A1
20190229802 Panther et al. Jul 2019 A1
20190231162 Lu et al. Aug 2019 A1
20190231436 Panse et al. Aug 2019 A1
20190232498 Tan et al. Aug 2019 A1
20190232992 Bondaryk et al. Aug 2019 A1
20190235486 Way et al. Aug 2019 A1
20190235492 Kueny et al. Aug 2019 A1
20190235498 Li et al. Aug 2019 A1
20190235505 Li et al. Aug 2019 A1
20190235512 Sinyavskiy et al. Aug 2019 A1
20190235516 Zhang et al. Aug 2019 A1
20190235531 Liu et al. Aug 2019 A1
20190236399 Soatto et al. Aug 2019 A1
20190236531 Adato et al. Aug 2019 A1
20190236844 Balasian et al. Aug 2019 A1
20190238638 Way et al. Aug 2019 A1
20190240840 Gorshechnikov et al. Aug 2019 A1
20190243370 Li et al. Aug 2019 A1
20190247050 Goldsmith Aug 2019 A1
20190247122 D'Amelio et al. Aug 2019 A1
20190248002 Deyle et al. Aug 2019 A1
20190248007 Duffy et al. Aug 2019 A1
20190248013 Deyle et al. Aug 2019 A1
20190248014 Deyle et al. Aug 2019 A1
20190248016 Deyle et al. Aug 2019 A1
20190250000 Zhang et al. Aug 2019 A1
20190253580 Kodimer Aug 2019 A1
20190253835 Jones Aug 2019 A1
20190254753 Johnson et al. Aug 2019 A1
20190254754 Johnson et al. Aug 2019 A1
20190255434 Wilson Aug 2019 A1
20190258251 Ditty et al. Aug 2019 A1
20190258878 Koivisto et al. Aug 2019 A1
20190261565 Robertson et al. Aug 2019 A1
20190261566 Robertson et al. Aug 2019 A1
20190265366 Venkatraman et al. Aug 2019 A1
20190265705 Zhang et al. Aug 2019 A1
20190266418 Xu et al. Aug 2019 A1
20190269353 Venkatraman et al. Sep 2019 A1
20190270197 Wagner et al. Sep 2019 A1
20190272348 Booij et al. Sep 2019 A1
20190274716 Nott et al. Sep 2019 A1
20190277869 Stein et al. Sep 2019 A1
20190277962 Ingram et al. Sep 2019 A1
20190278276 Zhang et al. Sep 2019 A1
20190278284 Zhang et al. Sep 2019 A1
20190278290 Zhang et al. Sep 2019 A1
20190369641 Gillett Dec 2019 A1
Foreign Referenced Citations (3)
Number Date Country
2982575 Jun 2018 CA
3071332 Jan 2019 CA
WO-2019173396 Sep 2019 WO
Non-Patent Literature Citations (183)
Entry
U.S. Appl. No. 10/001,499, filed Jun. 19, 2018, Mellars et al.
U.S. Appl. No. 10/001,780, filed Jun. 19, 2018, Gabardos et al.
U.S. Appl. No. 10/002,442, filed Jun. 19, 2018, Dagley et al.
U.S. Appl. No. 10/002,471, filed Jun. 19, 2018, Blayvas et al.
U.S. Appl. No. 10/002,537, filed Jun. 19, 2018, Chen et al.
U.S. Appl. No. 10/008,045, filed Jun. 26, 2018, Dagley et al.
U.S. Appl. No. 10/011,012, filed Jul. 3, 2018, Krasny et al.
U.S. Appl. No. 10/012,996, filed Jul. 3, 2018, Canoso et al.
U.S. Appl. No. 10/013,610, filed Jul. 3, 2018, Chen et al.
U.S. Appl. No. 10/015,466, filed Jul. 3, 2018, Maruyama et al.
U.S. Appl. No. 10/022,867, filed Jul. 17, 2018, Saboo et al.
U.S. Appl. No. 10/023,393, filed Jul. 17, 2018, Brazeau et al.
U.S. Appl. No. 10/025,886, filed Jul. 17, 2018, Rublee et al.
U.S. Appl. No. 10/026,209, filed Jul. 17, 2018, Dagley et al.
U.S. Appl. No. 10/030,988, filed Jul. 24, 2018, Brush et al.
U.S. Appl. No. 10/034,066, filed Jul. 24, 2018, Tran et al.
U.S. Appl. No. 10/048,683, filed Aug. 14, 2018, Levinson et al.
U.S. Appl. No. 10/051,411, filed Aug. 14, 2018, Breed.
U.S. Appl. No. 10/052,765, filed Aug. 21, 2018, Kamoi et al.
U.S. Appl. No. 10/059,467, filed Aug. 28, 2018, Wang.
U.S. Appl. No. 10/061,325, filed Aug. 28, 2018, Watts.
U.S. Appl. No. 10/061,328, filed Aug. 28, 2018, Canoy et al.
U.S. Appl. No. 10/062,177, filed Aug. 28, 2018, Dagley et al.
U.S. Appl. No. 10/065,309, filed Sep. 4, 2018, Rose et al.
U.S. Appl. No. 10/065,317, filed Sep. 4, 2018, Tan et al.
U.S. Appl. No. 10/068,470, filed Sep. 4, 2018, Pundurs.
U.S. Appl. No. 10/070,974, filed Sep. 11, 2018, Herr et al.
U.S. Appl. No. 10/078,136, filed Sep. 18, 2018, Kimchi et al.
U.S. Appl. No. 10/078,921, filed Sep. 18, 2018, Dagley et al.
U.S. Appl. No. 10/080,672, filed Sep. 25, 2018, Casler et al.
U.S. Appl. No. 10/081,104, filed Sep. 25, 2018, Swilling.
U.S. Appl. No. 10/082,397, filed Sep. 25, 2018, Sidhu et al.
U.S. Appl. No. 10/083,406, filed Sep. 25, 2018, Hance et al.
U.S. Appl. No. 10/099,391, filed Oct. 16, 2018, Hance et al.
U.S. Appl. No. 10/105,244, filed Oct. 23, 2018, Herr et al.
U.S. Appl. No. 10/106,283, filed Oct. 23, 2018, Akdogan et al.
U.S. Appl. No. 10/108,194, filed Oct. 23, 2018, Russell.
U.S. Appl. No. 10/122,995, filed Nov. 6, 2018, Rublee et al.
U.S. Appl. No. 10/123,181, filed Nov. 6, 2018, Huang et al.
U.S. Appl. No. 10/126,136, filed Nov. 13, 2018, Iagnemma.
U.S. Appl. No. 10/126,757, filed Nov. 13, 2018, Goulding.
U.S. Appl. No. 10/127,816, filed Nov. 13, 2018, Hoffberg.
U.S. Appl. No. 10/133,990, filed Nov. 20, 2018, Hance et al.
U.S. Appl. No. 10/137,011, filed Nov. 27, 2018, Herr et al.
U.S. Appl. No. 10/144,591, filed Dec. 4, 2018, Brazeau et al.
U.S. Appl. No. 10/147,069, filed Dec. 4, 2018, Galluzzo et al.
U.S. Appl. No. 10/153,537, filed Dec. 11, 2018, Baringer et al.
U.S. Appl. No. 10/159,218, filed Dec. 25, 2018, Shen et al.
U.S. Appl. No. 10/162,353, filed Dec. 25, 2018, Hammond et al.
U.S. Appl. No. 10/162,355, filed Dec. 25, 2018, Hayon et al.
U.S. Appl. No. 10/168,704, filed Jan. 1, 2019, Zhang et al.
U.S. Appl. No. 10/172,409, filed Jan. 8, 2019, Andon.
U.S. Appl. No. 10/178,445, filed Jan. 8, 2019, Lubranski et al.
U.S. Appl. No. 10/178,973, filed Jan. 15, 2019, Venkatraman et al.
U.S. Appl. No. 10/188,472, filed Jan. 29, 2019, Diolaiti et al.
U.S. Appl. No. 10/191,495, filed Jan. 29, 2019, Bobda.
U.S. Appl. No. 10/192,113, filed Jan. 29, 2019, Liu et al.
U.S. Appl. No. 10/194,836, filed Feb. 5, 2019, Venkatraman et al.
U.S. Appl. No. 10/194,858, filed Feb. 5, 2019, Marquez Chin et al.
U.S. Appl. No. 10/203,762, filed Feb. 12, 2019, Bradski et al.
U.S. Appl. No. 10/207,404, filed Feb. 19, 2019, Khansari Zadeh.
U.S. Appl. No. 10/209,365, filed Feb. 19, 2019, Venkatraman et al.
U.S. Appl. No. 10/212,876, filed Feb. 26, 2019, Aghai et al.
U.S. Appl. No. 10/216,195, filed Feb. 26, 2019, Switkes et al.
U.S. Appl. No. 10/217,488, filed Feb. 26, 2019, Huang.
U.S. Appl. No. 10/218,433, filed Feb. 26, 2019, Panther et al.
U.S. Appl. No. 10/222,215, filed Mar. 5, 2019, Holz.
U.S. Appl. No. 10/225,025, filed Mar. 5, 2019, Henry et al.
U.S. Appl. No. 10/228,242, filed Mar. 12, 2019, Abovitz et al.
U.S. Appl. No. 10/230,745, filed Mar. 12, 2019, Singh et al.
U.S. Appl. No. 10/231,790, filed Mar. 19, 2019, Quaid et al.
U.S. Appl. No. 10/239,208, filed Mar. 26, 2019, Swilling.
U.S. Appl. No. 10/239,740, filed Mar. 26, 2019, McHale et al.
U.S. Appl. No. 10/243,379, filed Mar. 26, 2019, Kwa et al.
U.S. Appl. No. 10/248,119, filed Apr. 2, 2019, Kentley-Klay et al.
U.S. Appl. No. 10/251,805, filed Apr. 9, 2019, Morbi et al.
U.S. Appl. No. 10/252,335, filed Apr. 9, 2019, Buller et al.
U.S. Appl. No. 10/254,499, filed Apr. 9, 2019, Cohen et al.
U.S. Appl. No. 10/254,536, filed Apr. 9, 2019, Yeoh et al.
U.S. Appl. No. 10/255,719, filed Apr. 9, 2019, Priest.
U.S. Appl. No. 10/255,723, filed Apr. 9, 2019, Thomas et al.
U.S. Appl. No. 10/259,514, filed Apr. 16, 2019, Kentley-Klay.
U.S. Appl. No. 10/260,890, filed Apr. 16, 2019, Jose et al.
U.S. Appl. No. 10/262,213, filed Apr. 16, 2019, Chen et al.
U.S. Appl. No. 10/262,437, filed Apr. 16, 2019, Ter Beest, III.
U.S. Appl. No. 10/264,586, filed Apr. 16, 2019, Beattie, Jr. et al.
U.S. Appl. No. 10/265,859, filed Apr. 23, 2019, Deyle et al.
U.S. Appl. No. 10/265,871, filed Apr. 23, 2019, Hance et al.
U.S. Appl. No. 10/267,970, filed Apr. 23, 2019, Jones, Jr. et al.
U.S. Appl. No. 10/269,108, filed Apr. 23, 2019, Wang et al.
U.S. Appl. No. 10/270,789, filed Apr. 23, 2019, Singh.
U.S. Appl. No. 10/277,885, filed Apr. 30, 2019, Jannard et al.
U.S. Appl. No. 10/279,906, filed May 7, 2019, Levien et al.
U.S. Appl. No. 10/283,110, filed May 7, 2019, Bellegarda.
U.S. Appl. No. 10/285,828, filed May 14, 2019, Herr et al.
U.S. Appl. No. 10/288,419, filed May 14, 2019, Abovitz et al.
U.S. Appl. No. 10/290,049, filed May 14, 2019, Xu et al.
U.S. Appl. No. 10/291,334, filed May 14, 2019, Henry et al.
U.S. Appl. No. 10/293,485, filed May 21, 2019, Sinyavskiy et al.
U.S. Appl. No. 10/295,338, filed May 21, 2019, Abovitz et al.
U.S. Appl. No. 10/296,012, filed May 21, 2019, Lalonde et al.
U.S. Appl. No. 10/296,080, filed May 21, 2019, Ord et al.
U.S. Appl. No. 10/296,995, filed May 21, 2019, Saboo et al.
U.S. Appl. No. 10/300,601, filed May 28, 2019, Tan et al.
U.S. Appl. No. 10/300,603, filed May 28, 2019, Gorshechnikov et al.
U.S. Appl. No. 10/303,166, filed May 28, 2019, Iagnemma.
U.S. Appl. No. 10/303,174, filed May 28, 2019, Kentley-Klay et al.
U.S. Appl. No. 10/304,254, filed May 28, 2019, Jovanovic et al.
U.S. Appl. No. 10/307,199, filed Jun. 4, 2019, Farritor et al.
U.S. Appl. No. 10/307,272, filed Jun. 4, 2019, Herr et al.
U.S. Appl. No. 10/309,792, filed Jun. 4, 2019, Iagnemma.
U.S. Appl. No. 10/310,517, filed Jun. 4, 2019, Paduano et al.
U.S. Appl. No. 10/311,731, filed Jun. 4, 2019, Li et al.
U.S. Appl. No. 10/320,610, filed Jun. 11, 2019, Matni et al.
U.S. Appl. No. 10/325,411, filed Jun. 18, 2019, Laney et al.
U.S. Appl. No. 10/326,689, filed Jun. 18, 2019, Liu et al.
U.S. Appl. No. 10/327,674, filed Jun. 25, 2019, Hong et al.
U.S. Appl. No. 10/328,575, filed Jun. 25, 2019, Garcia et al.
U.S. Appl. No. 10/328,578, filed Jun. 25, 2019, Holz.
U.S. Appl. No. 10/330,440, filed Jun. 25, 2019, Lyren.
U.S. Appl. No. 10/331,129, filed Jun. 25, 2019, Iagnemma et al.
U.S. Appl. No. 10/332,245, filed Jun. 25, 2019, Price et al.
U.S. Appl. No. 10/334,050, filed Jun. 25, 2019, Kentley-Klay et al.
U.S. Appl. No. 10/334,164, filed Jun. 25, 2019, Terry et al.
U.S. Appl. No. 10/334,906, filed Jul. 2, 2019, Andon et al.
U.S. Appl. No. 10/335,004, filed Jul. 2, 2019, Fong et al.
U.S. Appl. No. 10/336,543, filed Jul. 2, 2019, Sills et al.
U.S. Appl. No. 10/338,391, filed Jul. 2, 2019, Yeoh et al.
U.S. Appl. No. 10/338,594, filed Jul. 2, 2019, Long.
U.S. Appl. No. 10/339,721, filed Jul. 2, 2019, Dascola et al.
U.S. Appl. No. 10/352,693, filed Jul. 16, 2019, Abovitz et al.
U.S. Appl. No. 10/353,388, filed Jul. 16, 2019, Schubert et al.
U.S. Appl. No. 10/353,532, filed Jul. 16, 2019, Holz et al.
U.S. Appl. No. 10/354,441, filed Jul. 16, 2019, Godwin et al.
U.S. Appl. No. 10/358,057, filed Jul. 23, 2019, Breed.
U.S. Appl. No. 10/359,783, filed Jul. 23, 2019, Williams et al.
U.S. Appl. No. 10/360,799, filed Jul. 23, 2019, Priev.
U.S. Appl. No. 10/362,057, filed Jul. 23, 2019, Wu.
U.S. Appl. No. 10/363,657, filed Jul. 30, 2019, Lalonde et al.
U.S. Appl. No. 10/363,826, filed Jul. 30, 2019, Wang.
U.S. Appl. No. 10/365,656, filed Jul. 30, 2019, Moore et al.
U.S. Appl. No. 10/365,716, filed Jul. 30, 2019, Aimone et al.
U.S. Appl. No. 10/366,289, filed Jul. 30, 2019, Puttagunta et al.
U.S. Appl. No. 10/366,508, filed Jul. 30, 2019, Liu et al.
U.S. Appl. No. 10/366,510, filed Jul. 30, 2019, Malti et al.
U.S. Appl. No. 10/368,249, filed Jul. 30, 2019, Priest.
U.S. Appl. No. 10/368,784, filed Aug. 6, 2019, Marlow et al.
U.S. Appl. No. 10/369,974, filed Aug. 6, 2019, Carlson et al.
U.S. Appl. No. 10/372,132, filed Aug. 6, 2019, Herz et al.
U.S. Appl. No. 10/372,721, filed Aug. 6, 2019, Karpistsenko et al.
U.S. Appl. No. 10/373,389, filed Aug. 6, 2019, Jung.
U.S. Appl. No. 10/375,289, filed Aug. 6, 2019, Wang et al.
U.S. Appl. No. 10/375,517, filed Aug. 6, 2019, Shen et al.
U.S. Appl. No. 10/377,040, filed Aug. 13, 2019, Sinyavskiy et al.
U.S. Appl. No. 10/377,375, filed Aug. 13, 2019, Jones et al.
U.S. Appl. No. 10/379,007, filed Aug. 13, 2019, Perrone et al.
U.S. Appl. No. 10/379,539, filed Aug. 13, 2019, Ibarz Gabardos et al.
U.S. Appl. No. 10/382,889, filed Aug. 13, 2019, Ajmeri et al.
U.S. Appl. No. 10/382,975, filed Aug. 13, 2019, Priest.
U.S. Appl. No. 10/384,351, filed Aug. 20, 2019, Deyle et al.
U.S. Appl. No. 10/386,857, filed Aug. 20, 2019, McGrath.
U.S. Appl. No. 10/389,037, filed Aug. 20, 2019, Johnson et al.
U.S. Appl. No. 10/391,408, filed Aug. 27, 2019, Ord et al.
U.S. Appl. No. 10/391,632, filed Aug. 27, 2019, Khansari Zadeh.
U.S. Appl. No. 10/394,246, filed Aug. 27, 2019, Moshkina-Martinson et al.
U.S. Appl. No. 10/395,117, filed Aug. 27, 2019, Zhang et al.
U.S. Appl. No. 10/395,434, filed Aug. 27, 2019, Priest.
U.S. Appl. No. 10/397,802, filed Aug. 27, 2019, Priest.
U.S. Appl. No. 10/398,520, filed Sep. 3, 2019, Larkin et al.
U.S. Appl. No. 10/399,443, filed Sep. 3, 2019, Kwa et al.
U.S. Appl. No. 10/401,852, filed Sep. 3, 2019, Levinson et al.
U.S. Appl. No. 10/401,864, filed Sep. 3, 2019, Sussman et al.
U.S. Appl. No. 10/402,731, filed Sep. 3, 2019, Cosic.
U.S. Appl. No. 10/406,687, filed Sep. 10, 2019, Lalonde et al.
U.S. Appl. No. 10/408,613, filed Sep. 10, 2019, Abovitz et al.
U.S. Appl. No. 10/409,284, filed Sep. 10, 2019, Kentley-Klay et al.
U.S. Appl. No. 10/410,328, filed Sep. 10, 2019, Liu et al.
U.S. Appl. No. 10/411,356, filed Sep. 10, 2019, Johnson et al.
U.S. Appl. No. 10/413,994, filed Sep. 17, 2019, Aoki.
U.S. Appl. No. 10/414,052, filed Sep. 17, 2019, Deyle et al.
U.S. Appl. No. 10/414,395, filed Sep. 17, 2019, Sapp et al.
U.S. Appl. No. 10/416,668, filed Sep. 17, 2019, Hammond et al.
U.S. Appl. No. 10/417,829, filed Sep. 17, 2019, Kim et al.
Related Publications (1)
Number Date Country
20210086370 A1 Mar 2021 US
Provisional Applications (1)
Number Date Country
62902830 Sep 2019 US