Lighting apparatus for remote controlled device

Information

  • Patent Grant
  • Patent Number
    11,987,383
  • Date Filed
    Tuesday, August 23, 2022
  • Date Issued
    Tuesday, May 21, 2024
Abstract
There is a remote control device or drone, which has software and a combination of lights or LEDs on a lighting ring or apparatus that can move independently of the drone; the drone can be programmed or be reactive to sound or other stimuli to create the effect of writing shapes or words in the air, typically at nighttime against a dark sky.
Description
BACKGROUND OF THE INVENTION
1. Field of the Invention

This invention relates to using Light Emitting Diodes (LED) lights with remote controlled devices.


2. Description of Related Art

Prior to the present invention, remote controlled devices or drones employed lights and cameras; however, none used a combination of lights (typically LED-type lights) in a cycling motion together with software reactive to sound to create the effect of writing words in the air, typically at nighttime against a dark sky. Prior devices simply turned lights on or off to mimic independent movement.


From the preceding descriptions, it is apparent that the devices currently being used have significant disadvantages. Thus, important aspects of the technology used in the field of invention remain amenable to useful refinement.


SUMMARY OF THE INVENTION

There is a remote control flying device or drone, which has software algorithms and a combination of lights or LEDs on a lighting ring or apparatus that can move independently of the drone; the drone can be programmed or be reactive to sound or other stimuli (light, motion, temperature) to create the effect of writing shapes or words in the air, typically at nighttime against a dark sky.


An apparatus for presenting LED lighting on a drone, said drone having a drone frame, at least one drone motor and at least one rotating blade; a battery; an LED microcontroller; a wireless receiver; an electronic speed controller; a first flight controller; said apparatus for presenting LED lighting comprising:

    • the drone frame having a first drone frame arm and a second drone frame arm;
    • the first drone frame arm having an LED ring motor;
    • the second drone frame arm having a bearing;
    • an LED housing having at least one LED light, which engages the drone frame arms via the bearing and the LED ring motor, whereby the bearing allows the LED ring motor to move the LED housing around an axis of the drone frame and independently of movement of the drone, and wherein, in low light conditions and when the LED housing is moving independently around the drone, the at least one LED light creates a persistence of vision such that the drone is invisible in relation to the at least one LED light.


The drone frame can have multiple arms; each arm can have at least one drone motor, the at least one rotating blade, a servo or a bearing. The LED housing can be translucent and circular, square, rectangular or triangular in shape. The drone is wirelessly connected to a second flight controller or a ground control computing device, which is a computing device with wireless communication and audio and visual inputs and can direct the LED lights to activate, the LED ring motor to activate to move the LED housing, and the first flight controller of the drone to operate. The additional flight controller can be a smartphone, tablet or laptop computer; the audio input is a microphone; the visual input can be a light or thermal heat sensor.


A method of creating a persistence of vision display using a drone, an apparatus for presenting LED lighting and a ground station computing device with a wireless communication system; said drone having a drone frame, at least one drone motor and at least one rotating blade; a battery; an LED microcontroller; a wireless receiver; an electronic speed controller; a first flight controller; said apparatus for presenting LED lighting comprising:

    • the drone frame having a first drone frame arm and a second drone frame arm;
    • the first drone frame arm having an LED ring motor;
    • the second drone frame arm having a bearing;
    • an LED housing having at least one LED light, which engages the drone frame arms via the bearing and the LED ring motor, whereby the bearing allows the LED ring motor to move the LED housing around an axis of the drone frame and independently of movement of the drone, comprising the following steps:
      • a. Activating the drone;
      • b. Signaling for the ground station computing device;
      • c. If the ground station computing device communicates with the drone and the LED microcontroller, transmitting wireless instructions to the drone and the LED microcontroller to activate and adjust the at least one LED light, the LED housing and the position of the drone;
      • d. If the ground station computing device does not communicate with the drone and the LED microcontroller, commencing a pre-programmed light pattern and drone movement on the drone, whereby the at least one LED light and the LED housing create the persistence of vision display to a viewer.
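The connected/fallback branch in steps b through d can be sketched as follows. This is a minimal illustration only; the function and argument names are hypothetical and not taken from the patent.

```python
def run_display(ground_station_connected, send_instructions, run_preprogrammed):
    """Select the control mode for the drone and LED microcontroller.

    ground_station_connected -- True when the ground station computing
    device is communicating with the drone (step c), False otherwise
    (step d). The two callables stand in for the wireless-instruction
    path and the on-board pre-programmed pattern; both are illustrative.
    """
    if ground_station_connected:
        # Step c: the ground station transmits wireless instructions to
        # activate and adjust the LED lights, LED housing and drone position.
        return send_instructions()
    # Step d: no link, so fall back to the pre-programmed light pattern
    # and drone movement stored on the drone itself.
    return run_preprogrammed()
```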


The present invention introduces such refinements. In its preferred embodiments, the present invention has several aspects or facets that can be used independently, although they are preferably employed together to optimize their benefits. All of the foregoing operational principles and advantages of the present invention will be more fully appreciated upon consideration of the following detailed description, with reference to the appended drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIGS. 1-4 show various views of one embodiment of the invention, giving an overview of the rotating light apparatus on the drone.



FIGS. 5a and 5b show flowcharts of one embodiment of the algorithm for the “Hoop Drone” or “Persistence of Vision Communication” system.



FIGS. 6a and 6b show flowcharts of one embodiment of the algorithm for the “Hoop Drone” or “Persistence of Vision Communication” system, namely the Ground Station algorithm or software.



FIG. 7 shows a diagram of a system suitable for computing programs and the Internet or other wireless communication systems.





PARTS LIST






    • 10 Apparatus
    • 15 Frame for Drone
    • 20 Outer Ring for LED; LED housing
    • 25 Flight Controller
    • 30 ESC
    • 35 Drone Motor
    • 40 Propeller
    • 45 LED Ring Motor
    • 50 Bearing or Servo
    • 55 Battery
    • 60 Microcontroller; LED microcontroller computing device
    • 65 Wireless Receiver
    • 70 LED light or light





DESCRIPTION OF THE PREFERRED EMBODIMENTS

The Basic Drone Apparatus includes without limitation: a drone frame or housing; at least one drone motor and at least one rotating blade or fan; a battery; a computing device, computing control device or flight controller; an antenna; electronic speed controllers or sensors; stabilizers; a gyroscope; an altimeter; an accelerometer and magnetometer; and a wireless receiver. The drone frame can have one or multiple arms, each of which can have a drone motor and propeller; the drone frame and/or arms can also have lights or LED lights. Some of the electronic components or sensors can be combined into a computing device on the drone itself or be placed on different parts of the apparatus (LED housing, drone arms or drone frame). All of the electronics, LED lights and batteries on the drone or LED housing can be connected with wiring.


Lighting Apparatus Embodiment

One preferred embodiment of the invention presents a circular or ring-shaped light mounting structure on the drone apparatus, which moves independently and separately from the drone itself. There is a moving frame or ring of LED lights; the LED lights can be programmed to react to an external stimulus (sound, light, etc.) or a programmed stimulus (music, a light pattern). Separate motors, servos and bearings allow the light mounting apparatus on the drone apparatus to spin or to move independently from the drone itself. This invention employs software programs and algorithms to activate said lights and the drone apparatus. As used herein, RF means radio frequency or any other wireless communication format; laptop refers to any computing device, including a smartphone, laptop, tablet, notebook or any other mobile or desktop computing device.


Lighting or LED Housing


The drone frame can have multiple drone arms, including a first drone frame arm and a second drone frame arm. The first and second drone arms can be connected to the lighting or LED housing. The first or second drone arms can have one or more LED ring motors, servos or bearings.


The LED housing has at least one LED light and engages the drone frame arms via the bearing and the LED ring motor, whereby the bearing allows the LED Ring Motor to move the LED housing around an axis of the drone frame and independently of movement of the drone. LED lights can be various colors: white, red, blue, etc.


The housing can be made of any lightweight plastic material, including clear, transparent or opaque colors; the housing can be hollow, or a rail or track (upon which the LED lights are disposed). The LED housing can be circular, square, rectangular, triangular or any variable shape.


Wireless Connection and Control:


In one preferred embodiment, the applicants employ a wireless control of not only the drone's flying motors and flight system, but also the LED lights, including without limitation use of a wireless Arduino LED Control system.


In low light conditions and when the LED housing is moving independently around the drone, the at least one LED light on the rotating LED housing creates a persistence of vision such that the drone is generally shadowed or invisible in relation to the at least one LED light. The display of a rapidly rotating LED light housing around a stationary or moving drone creates a visually stimulating and pleasing sight.
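As an illustration of how a persistence of vision display of this kind is typically driven (a generic sketch, not the patent's own implementation), the LED microcontroller can map the ring's instantaneous rotation angle to a column of the image being "written" in the air, so that over one revolution every column is shown in turn:

```python
def column_for_angle(angle_deg, n_columns):
    """Map the LED ring's current rotation angle (in degrees) to the
    index of the image column whose on/off pattern the LEDs should show
    at that instant. Over one full revolution, all n_columns columns
    are displayed in sequence."""
    return int((angle_deg % 360.0) * n_columns / 360.0)
```

For example, with a 36-column image, an angle of 180 degrees selects the middle column (index 18), and angles past 360 degrees wrap around to the start of the image.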


The drone can be wirelessly connected to a second flight controller, which is a computing device with audio and visual inputs and can direct the LED lights to activate, the LED ring motor to activate to move the LED housing, and the first flight controller of the drone to operate. Typical wireless communication is RF but can also include, without limitation, other wireless protocols: Bluetooth, WiFi, satellite and cellular network communications.


The second flight controller can also be a Smartphone, tablet or laptop computer; the audio input can be a microphone; the visual input can be a light sensor or another type of electronic eye; other potential sensors could include heat and thermal sensors and GPS or position sensors.


a. Programmed to React to Stimuli (Music or Light, Etc.)


In FIGS. 5-7, when the drone is wirelessly connected to a controlling computing device, a wirelessly connected ground station (with a computing device, memory and an input such as a microphone or other input sensor) can direct the lights to turn on/off, the light apparatus to move and rotate, and finally the drone to react to this stimulus or input. The movement of the drone and the lights can be independent of one another.


When not connected to a controlling computing device, hardware on the drone can have pre-built or pre-programmed patterns for movement of the drone, activation/deactivation of the LED lights, and rotation or spinning of the LED housing around the drone frame.


b. Non-Programmed Stimulus:


In FIGS. 5-7, a controlling computing device (smartphone, tablet or laptop computer) or ground station can use its input sensor (including without limitation a microphone or light sensor) to react to the external or non-programmed stimulus; this wirelessly connected ground station (with a computing device, memory and an input such as a microphone or other input sensor) can direct the lights to turn on/off, the light apparatus to move and rotate, and/or the drone to react to this stimulus or input.


In one preferred embodiment, the controlling computing device uses its microphone to listen to external music or another stimulus; this will turn the LEDs "on" or "off", make the LED light apparatus move, or make the drone change its position.
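One simple way to realize such a sound-reactive trigger (a minimal sketch under assumed conventions; the function name, the normalized sample range and the threshold value are all illustrative, not from the patent) is to gate the LEDs on the average loudness of a buffer of microphone samples:

```python
def led_on_from_audio(samples, threshold=0.2):
    """Return True (LEDs 'on') when the mean absolute amplitude of a
    buffer of normalized audio samples (each in -1.0..1.0) exceeds the
    threshold, else False. The 0.2 threshold is an arbitrary example
    value that would be tuned for the microphone and environment."""
    if not samples:
        return False  # no audio captured: leave the LEDs off
    level = sum(abs(s) for s in samples) / len(samples)
    return level > threshold
```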


The apparatus also allows the human operator to change the LED light activation, sequence of activation, LED light apparatus movement and the drone movement.


c. Persistence of Vision:


In FIGS. 5-7, the movement of the drone and the lights can be independent of one another; the drone can remain stationary while the lights and light structures move and are actuated. Since the light mounting ring or LED housing is at the furthest periphery of the drone, the light mounting structure can block or cloak the drone apparatus from view.


This invention relates to a remote control device or drone, which has a combination of lights (typically LED-type lights) in a cycling motion and software that is reactive to sound to create the effect of writing words in the air, typically at nighttime against a dark sky.


Persistence of Vision can be defined as the retention of a visual image for a short period of time after the removal of the stimulus that produced it; the phenomenon that produces the illusion of movement when viewing motion pictures.
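A back-of-the-envelope consequence of this definition: the LED housing must complete at least one full revolution within the eye's retention interval for the ring to appear as a continuous image rather than a moving point of light. The commonly cited retention interval of roughly 1/15 of a second used below is an assumption for illustration, not a figure from the patent.

```python
def min_rotation_rpm(persistence_s=1.0 / 15.0):
    """Minimum LED-housing rotation speed, in RPM, such that every
    angular position is re-lit within the assumed retention interval:
    at least one full revolution per persistence period."""
    revs_per_second = 1.0 / persistence_s
    return revs_per_second * 60.0
```

Under that assumption, the ring would need to spin at roughly 900 RPM; a shorter retention interval demands a proportionally faster rotation.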


This invention refers to computing programs, applications or software, which are all synonymous and are used interchangeably. This invention can be applied to any computing device that can be connected to a communication network or the Internet via wire or wireless connection. The embodiments of the invention may be implemented by a processor-based computer system. The system includes a database for receiving and storing information from users and application software for users, among other things, determining or updating usage, lifestyle characteristics, values and a user's profile, and displaying feedback information. In accordance with the present invention, computer system operates to execute the functionality for server component. A computer system includes: a processor, a memory and disk storage. Memory stores computer program instructions and data. Processor executes the program instructions or software, and processes the data stored in memory. Disk storage stores data to be transferred to and from memory; disk storage can be used to store data that is typically stored in the database.


All these elements are interconnected by one or more buses, which allow data to be intercommunicated between the elements. Memory can be accessible by processor over a bus and includes an operating system, a program partition and a data partition. The program partition stores and allows execution by processor of program instructions that implement the functions of each respective system described herein. The data partition is accessible by processor and stores data used during the execution of program instructions.


For purposes of this application, memory and disk are machine readable mediums and could include any medium capable of storing instructions adapted to be executed by a processor. Some examples of such media include, but are not limited to, read-only memory (ROM), random-access memory (RAM), programmable ROM, erasable programmable ROM, electronically erasable programmable ROM, dynamic RAM, magnetic disk (e.g., floppy disk and hard drive), optical disk (e.g., CD-ROM), optical fiber, electrical signals, light wave signals, radio-frequency (RF) signals and any other device or signal that can store digital information. In one embodiment, the instructions are stored on the medium in a compressed and/or encrypted format.


As required, detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of the invention, which can be embodied in various forms. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present invention in virtually any appropriately detailed structure. Further, the terms and phrases used herein are not intended to be limiting; but rather, to provide an understandable description of the invention. The terms “a” or “an”, as used herein, are defined as: “one” or “more than one.” The term plurality, as used herein, is defined as: “two” or “more than two.” The term another, as used herein, is defined as at least a second or more. The terms including and/or having, as used herein, are defined as comprising (i.e., open language). The term coupled, as used herein, is defined as connected, although not necessarily directly, and not necessarily mechanically.


Any section or paragraph headings are for the organization of the application and are not intended to be limiting. Any element in a claim that does not explicitly state "means for" performing a specific function, or "step for" performing a specific function, is not to be interpreted as a "means" or "step" clause as specified in 35 U.S.C. Sec. 112, Paragraph 6. In particular, the use of "step of" in the claims herein is not intended to invoke the provisions of 35 U.S.C. Sec. 112, Paragraph 6.


Incorporation by Reference: All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference: U.S. Pat. No. 8,667,533; US 2005/0005025; U.S. Pat. Nos. 9,162,753; 8,903,568; 7,302,316; 7,195,200; 9,134,534; 9,129,295; 9,128,281; 9,097,891; 9,097,890; 9,061,102; 9,055,226; 9,017,123; 9,014,661; 9,010,261; 8,989,053; 8,964,298; 8,857,191; 8,854,594; 8,814,691; 8,596,036; 8,549,833; 8,488,246; 8,482,589; 8,477,425; 8,472,120; 8,468,244; 8,467,133; 8,291,716; 8,109,073; 8,099,944; 7,973,658; 7,773,204; 7,750,802; 6,949,003; 6,333,726.

Claims
  • 1. A method comprising: launching an aerial drone; via a display mechanism mounted on a frame of the drone, providing an airborne display, the display mechanism comprising: a dynamic display member which is, during provision of the airborne display, drivingly rotated relative to the frame of the drone such that the frame is enveloped by the dynamic display member; and a light emitting arrangement carried by the dynamic display member; and using an electronic controller configured to enable control of drone movement and of the display mechanism, performing operations comprising: receiving a pre-defined trigger stimulus; and responsive to the trigger stimulus, autonomously causing the drone to produce a visual effect of writing semantic content against a visual backdrop.
  • 2. The method of claim 1, wherein provision of the airborne display comprises providing a persistence of vision display.
  • 3. The method of claim 2, wherein the light emitting arrangement of the display mechanism comprises an array of light emitters mounted on the dynamic display member for movement therewith.
  • 4. The method of claim 1, wherein the visual backdrop is optically dark relative to the semantic content produced via the display mechanism.
  • 5. The method of claim 4, wherein the visual backdrop is the night sky.
  • 6. The method of claim 1, wherein the semantic content comprises one or more words.
  • 7. The method of claim 6, wherein the trigger stimulus for producing the visual effect comprises an ambient environmental stimulus.
  • 8. The method of claim 7, wherein the ambient environmental stimulus comprises ambient light.
  • 9. The method of claim 7, wherein the ambient environmental stimulus comprises ambient sound.
  • 10. The method of claim 7, wherein the ambient environmental stimulus comprises ambient motion.
  • 11. The method of claim 7, wherein the ambient environmental stimulus comprises ambient temperature.
  • 12. The method of claim 6, wherein the trigger stimulus for producing the visual effect comprises a programmed light pattern.
  • 13. The method of claim 6, wherein the trigger stimulus for producing the visual effect comprises reproduction of predefined music content.
  • 14. The method of claim 1, wherein the trigger stimulus is received at a remote controlling device in wireless communication with the drone, instructions for producing the visual effect responsive to the trigger stimulus being communicated to the drone from the remote controlling device.
  • 15. The method of claim 1, wherein production of the visual effect is caused at least in part by controlling drone movement.
  • 16. The method of claim 1, wherein production of the visual effect is caused by controlled operation of the display mechanism, being independent of drone movement.
  • 17. A drone display system comprising: an aerial drone comprising: a frame; flight systems mounted on the frame to enable controlled flight of the drone; and a display mechanism carried by the frame and configured to produce a persistence of vision display, the display mechanism comprising: a rotatable display member mounted on the frame for rotation relative to the frame, the display member being shaped and positioned such that, during rotation thereof, the frame and the flight systems are located within a volume bounded by the rotating display member; a drive mechanism configured to drivingly rotate the display member during provision of the persistence of vision display; and a light emitting arrangement mounted on the display member; and an electronic controller configured to control the persistence of vision display, wherein the electronic controller is configured to perform operations comprising: identifying reception of a pre-defined trigger stimulus; and responsive to the trigger stimulus, autonomously causing the drone to produce a visual effect of writing semantic content against a visual backdrop.
  • 18. The drone display system of claim 17, wherein the light emitting arrangement of the display mechanism comprises an array of light emitters mounted on the rotatable display member for movement therewith.
  • 19. The drone display system of claim 18, wherein the electronic controller comprises a mobile electronic device physically separate from the drone and configured for wireless communication with the drone to transmit instructions for producing the visual effect.
PRIORITY CLAIM

This application is a continuation of and claims the benefit of U.S. patent application Ser. No. 16/452,026, filed Jun. 25, 2019, which is a continuation of and claims the benefit of U.S. patent application Ser. No. 15/339,810, filed Oct. 31, 2016, which claims the benefit of U.S. Provisional Patent Appl. No. 62/249,252, filed on Oct. 31, 2015, which are incorporated by reference in their entirety.

US Referenced Citations (89)
Number Name Date Kind
6038295 Mattes Mar 2000 A
6819982 Doane Nov 2004 B2
6980909 Root et al. Dec 2005 B2
7173651 Knowles Feb 2007 B1
7411493 Smith Aug 2008 B2
7535890 Rojas May 2009 B2
7542073 Li et al. Jun 2009 B2
8131597 Hudetz Mar 2012 B2
8174562 Hartman May 2012 B2
8199747 Rojas et al. Jun 2012 B2
8274550 Steuart, III Sep 2012 B2
8332475 Rosen et al. Dec 2012 B2
8646720 Shaw Feb 2014 B2
8718333 Wolf et al. May 2014 B2
8724622 Rojas May 2014 B2
8874677 Rosen et al. Oct 2014 B2
8909679 Root et al. Dec 2014 B2
8995433 Rojas Mar 2015 B2
9040574 Wang et al. May 2015 B2
9055416 Rosen et al. Jun 2015 B2
9100806 Rosen et al. Aug 2015 B2
9100807 Rosen et al. Aug 2015 B2
9191776 Root et al. Nov 2015 B2
9204252 Root Dec 2015 B2
9344642 Niemi et al. May 2016 B2
9345711 Friedhoff May 2016 B2
9443227 Evans et al. Sep 2016 B2
9471059 Wilkins Oct 2016 B1
9489661 Evans et al. Nov 2016 B2
9489937 Beard et al. Nov 2016 B1
9491134 Rosen et al. Nov 2016 B2
9576369 Venkataraman et al. Feb 2017 B2
9589448 Schneider et al. Mar 2017 B1
9681046 Adsumilli et al. Jun 2017 B2
9723272 Lu et al. Aug 2017 B2
9747901 Gentry Aug 2017 B1
9922659 Bradlow et al. Mar 2018 B2
9989965 Cuban et al. Jun 2018 B2
10061328 Canoy et al. Aug 2018 B2
10109224 Ratti Oct 2018 B1
10140987 Erickson et al. Nov 2018 B2
10168700 Gordon et al. Jan 2019 B2
10370118 Nielsen et al. Aug 2019 B1
10501180 Yu Dec 2019 B2
10768639 Meisenholder et al. Sep 2020 B1
11126206 Meisenholder et al. Sep 2021 B2
11427349 Nielsen et al. Aug 2022 B1
20070250526 Hanna Oct 2007 A1
20080255842 Simhi Oct 2008 A1
20090122133 Hartman May 2009 A1
20110202598 Evans et al. Aug 2011 A1
20120194420 Osterhout et al. Aug 2012 A1
20120209924 Evans et al. Aug 2012 A1
20120281885 Syrdal et al. Nov 2012 A1
20120287274 Bevirt Nov 2012 A1
20130056581 Sparks Mar 2013 A1
20130238168 Reyes Sep 2013 A1
20140254896 Zhou et al. Sep 2014 A1
20150022432 Stewart et al. Jan 2015 A1
20150070272 Kim et al. Mar 2015 A1
20150175263 Reyes Jun 2015 A1
20150199022 Gottesman et al. Jul 2015 A1
20150287246 Huston et al. Oct 2015 A1
20150331490 Yamada Nov 2015 A1
20150362917 Wang et al. Dec 2015 A1
20160063987 Xu et al. Mar 2016 A1
20160161946 Wuth Sepulveda et al. Jun 2016 A1
20160179096 Bradlow et al. Jun 2016 A1
20160292886 Erad et al. Oct 2016 A1
20160307573 Wobrock Oct 2016 A1
20160336020 Bradlow et al. Nov 2016 A1
20170031369 Liu et al. Feb 2017 A1
20170094259 Kouperman et al. Mar 2017 A1
20170099424 Jones Apr 2017 A1
20170102699 Anderson Apr 2017 A1
20170137125 Kales May 2017 A1
20170177925 Volkart Jun 2017 A1
20170225796 Sun et al. Aug 2017 A1
20170228690 Kohli Aug 2017 A1
20170244937 Meier et al. Aug 2017 A1
20170320564 Kuzikov Nov 2017 A1
20170337791 Gordon-carroll Nov 2017 A1
20170371353 Millinger, III Dec 2017 A1
20180082682 Erickson et al. Mar 2018 A1
20180246529 Hu et al. Aug 2018 A1
20190011921 Wang et al. Jan 2019 A1
20200241575 Meisenholder et al. Jul 2020 A1
20210362848 Spencer Nov 2021 A1
20210382503 Meisenholder et al. Dec 2021 A1
Foreign Referenced Citations (1)
Number Date Country
2887596 Jul 2015 CA
Non-Patent Literature Citations (21)
Entry
US 10,656,660 B1, 05/2020, Meisenholder et al. (withdrawn)
“U.S. Appl. No. 15/339,810, Non Final Office Action dated Sep. 7, 2018”, 5 pgs.
“U.S. Appl. No. 15/339,810, Notice of Allowance dated Mar. 21, 2019”, 7 pgs.
“U.S. Appl. No. 15/339,810, Response filed Feb. 7, 2019 to Non Final Office Action dated Sep. 7, 2018”, 7 pgs.
“U.S. Appl. No. 15/339,810, Response filed Jul. 17, 2018 to Restriction Requirement dated May 16, 2018”, 7 pgs.
“U.S. Appl. No. 15/339,810, Restriction Requirement dated May 16, 2018”, 5 pgs.
“U.S. Appl. No. 16/452,026, 312 Amendment filed Jul. 15, 2022”, 6 pgs.
“U.S. Appl. No. 16/452,026, Final Office Action dated Apr. 21, 2021”, 14 pgs.
“U.S. Appl. No. 16/452,026, Non Final Office Action dated Sep. 13, 2021”, 12 pgs.
“U.S. Appl. No. 16/452,026, Non Final Office Action dated Nov. 13, 2020”, 16 pgs.
“U.S. Appl. No. 16/452,026, Notice of Allowance dated Apr. 15, 2022”, 8 pgs.
“U.S. Appl. No. 16/452,026, Preliminary Amendment filed Jan. 2, 2020”, 7 pgs.
“U.S. Appl. No. 16/452,026, PTO Response to Rule 312 Communication dated Jul. 27, 2022”, 2 pgs.
“U.S. Appl. No. 16/452,026, Response filed Feb. 28, 2022 to Non Final Office Action dated Sep. 13, 2021”, 10 pgs.
“U.S. Appl. No. 16/452,026, Response filed Apr. 13, 2021 to Non Final Office Action dated Nov. 13, 2020”, 8 pgs.
“U.S. Appl. No. 16/452,026, Response filed Aug. 23, 2021 to Final Office Action dated Apr. 21, 2021”, 11 pages.
Laput, Gierad, et al., “PixelTone: A Multimodal Interface for Image Editing”, ACM, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Paris, FR, (2013), 10 pgs.
Leyden, John, “This SMS will self-destruct in 40 seconds”, [Online] Retrieved from the Internet: <URL: http://www.theregister.co.uk/2005/12/12/stealthtext/>, (Dec. 12, 2005), 1 pg.
Meisenholder, David, et al., “Remoteless Control of Drone Behavior”, U.S. Appl. No. 15/640,143, filed Jun. 30, 2017, 108 pgs.
Pourmehr, Shokoofeh, et al., “You two! Take off!: Creating, Modifying, and Commanding Groups of Robots Using Face Engagement and Indirect Speech in Voice Commands”, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Tokyo, JP, (2013), 137-142.
Yamada, Wataru, et al., “iSphere: Self-Luminous Spherical Drone Display”, Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology (UIST), Quebec City, CA, (Oct. 22-25, 2017), 635-343.
Related Publications (1)
Number Date Country
20230059272 A1 Feb 2023 US
Provisional Applications (1)
Number Date Country
62249252 Oct 2015 US
Continuations (2)
Number Date Country
Parent 16452026 Jun 2019 US
Child 17821776 US
Parent 15339810 Oct 2016 US
Child 16452026 US