Systems and methods for using augmented reality to control a connected home system

Information

  • Patent Grant
  • Patent Number
    11,163,434
  • Date Filed
    Thursday, January 24, 2019
  • Date Issued
    Tuesday, November 2, 2021
Abstract
Systems and methods for using augmented reality to control a connected home system are provided. Some methods can include receiving a video data stream from an IoT video device monitoring a region in which an IoT automation device is located within a field of view of the IoT video device, displaying the video data stream on a user interface device, overlaying a controlling graphic on top of a depiction of the IoT automation device in the video data stream displayed on the user interface device, receiving first user input identifying the controlling graphic via the video data stream displayed on the user interface device, and responsive to the first user input, initiating a change of state of the IoT automation device in the region.
Description
FIELD

The present invention relates generally to connected home systems. More particularly, the present invention relates to systems and methods for using augmented reality to control connected home systems.


BACKGROUND

Systems and methods to control Internet-of-Things (IoT) automation devices in a connected home system, such as lights, switches, locks, and thermostats, are known. For example, such systems and methods can include a device control page in a mobile or web application displaying identifications of the IoT automation devices to a user in a list consecutively or in groups based on types or locations of the IoT automation devices, and the device control page receiving user input to control one of the IoT automation devices.


Systems and methods to view a video data stream captured by an IoT video device in the connected home system are also known. For example, such systems and methods can include a video page in the mobile or web application displaying the video data stream to the user.


However, if the user wishes to confirm that the one of the IoT automation devices changed state pursuant to the user input entered into the device control page outside of feedback provided by the device control page, then the user must navigate to the video page to view the video data stream and the one of the IoT automation devices captured therein. That is, the user must switch from the device control page to the video page, thereby creating a less than desirable user experience.


In view of the above, there is a continuing, ongoing need for improved systems and methods.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of a connected system in accordance with disclosed embodiments.





DETAILED DESCRIPTION

While this invention is susceptible of an embodiment in many different forms, there are shown in the drawings and will be described herein in detail specific embodiments thereof with the understanding that the present disclosure is to be considered as an exemplification of the principles of the invention. It is not intended to limit the invention to the specific illustrated embodiments.


Embodiments disclosed herein can include systems and methods for using augmented reality to control a connected home system, thereby enhancing a user experience when interacting with the connected home system. For example, the connected home system can include an IoT video device, such as a camera, and an IoT automation device (or a plurality of automation devices), such as a light, a switch, a lock, or a thermostat. The IoT video device can monitor a region in which the connected home system is installed, and the IoT automation device can be located within a field of view of the IoT video device so that a depiction of the IoT automation device can be displayed in a video data stream captured by the IoT video device.


In accordance with disclosed embodiments, systems and methods disclosed herein can overlay controlling graphics on top of the video data stream when displaying the video data stream on a user interface device, such as in a mobile or web application. For example, in some embodiments, systems and methods disclosed herein can display the controlling graphics on top of the video data stream responsive to first user input, such as a user touching the user interface device displaying the video data stream for a predetermined period of time.
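
As a rough illustration of that first user input, the following is a minimal Python sketch of long-press handling that reveals the controlling graphics after a touch held for a predetermined period. The event names, the threshold value, and the visibility flag are assumptions for illustration only, not details taken from the disclosure.

    import time

    LONG_PRESS_SECONDS = 0.5  # hypothetical "predetermined period of time"

    class OverlayController:
        """Shows controlling graphics over the video view (sketch)."""

        def __init__(self):
            self._pressed_at = None
            self.graphics_visible = False

        def on_touch_down(self):
            # User touches the user interface device displaying the stream.
            self._pressed_at = time.monotonic()

        def on_touch_up(self):
            if self._pressed_at is None:
                return
            held = time.monotonic() - self._pressed_at
            self._pressed_at = None
            # First user input: a touch held for the predetermined period
            # causes the controlling graphics to be overlaid on the stream.
            if held >= LONG_PRESS_SECONDS:
                self.graphics_visible = True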


Then, systems and methods disclosed herein can receive second user input via a portion of the controlling graphics overlaid on the depiction of the IoT automation device displayed in the video data stream, initiate a change of state of the IoT automation device in the region responsive to the second user input, and display the depiction of the IoT automation device with the state changed in the video data stream. Accordingly, systems and methods disclosed herein can both receive the second user input to change the state of the IoT automation device and provide visual confirmation displaying the IoT automation device with the state changed via a single page or screen of the user interface device and without navigating to multiple pages or screens of the user interface device. In some embodiments, the second user input can include the user touching the portion of the user interface device displaying the controlling graphics over the depiction of the IoT automation device for the predetermined period of time.


In some embodiments, responsive to the second user input and prior to initiating the change of state of the IoT automation device, systems and methods disclosed herein can display details for the portion of the controlling graphics receiving the second user input. For example, the portion of the controlling graphics receiving the second user input can include an identifier of a thermostat in the region. In these embodiments, responsive to the second user input, systems and methods disclosed herein can display a temperature of the region and up and down arrows to control the thermostat overlaid on a depiction of the thermostat displayed in the video data stream, receive third user input via the up and down arrows to adjust the thermostat up or down, and initiate adjusting the thermostat responsive to the third user input. Alternatively, the portion of the controlling graphics receiving the second user input can include an identifier of a light with a dimmer in the region. In these embodiments, responsive to the second user input, systems and methods disclosed herein can display a slide bar to control a brightness of the dimmer overlaid on a depiction of the light displayed in the video data stream, receive third user input via the slide bar to adjust the brightness of the dimmer, and initiate adjusting the dimmer responsive to the third user input. Accordingly, systems and methods disclosed herein can both receive the third user input to change the state of the IoT automation device and provide visual confirmation displaying the IoT automation device with the state changed via the single page or screen of the user interface device and without navigating to multiple pages or screens of the user interface device.
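
The device-specific details described above lend themselves to a simple capability-to-widget mapping. The sketch below shows one hypothetical way to select the detail controls (up and down arrows for a thermostat, a slide bar for a dimmer); the class, field names, and value ranges are illustrative assumptions.

    from dataclasses import dataclass

    @dataclass
    class ControlDetail:
        widget: str    # "arrows" for a thermostat, "slider" for a dimmer
        label: str
        minimum: float
        maximum: float

    # Hypothetical mapping from a device capability to the detail controls
    # overlaid on the depiction after the second user input.
    DETAILS_BY_CAPABILITY = {
        "thermostat": ControlDetail("arrows", "Setpoint", 50.0, 90.0),
        "dimmer": ControlDetail("slider", "Brightness", 0.0, 100.0),
    }

    def details_for(capability):
        # Third user input (arrow taps or slider drags) then adjusts the
        # device within the returned bounds.
        return DETAILS_BY_CAPABILITY[capability]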


In some embodiments, the IoT automation device can include an emitter, such as an infrared LED. To enroll the IoT automation device with the IoT video device for control via the video data stream, systems and methods disclosed herein can instruct the IoT automation device to transmit a visual or non-visual signal to the IoT video device. In some embodiments, the visual or non-visual signal can include a unique signature therein that can identify the IoT automation device and capabilities of the IoT automation device.
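
One way to realize such a signature is to blink the infrared LED in a fixed-width frame pattern encoding a device identifier and a capability code. The sketch below assumes a 32-bit layout (preamble, id, capability) purely for illustration; the disclosure does not specify an encoding.

    def signature_frames(device_id, capability_code):
        """Encode a hypothetical enrollment signature as on/off states for
        the IR LED: an 8-bit preamble, a 16-bit device id, and an 8-bit
        capability code (assumed layout)."""
        word = (0xA5 << 24) | ((device_id & 0xFFFF) << 8) | (capability_code & 0xFF)
        return [(word >> bit) & 1 for bit in range(31, -1, -1)]

    # Example: device 42 advertising a dimmer capability (assumed code 0x02).
    frames = signature_frames(42, 0x02)  # 32 on/off emitter states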


The IoT video device can capture the visual or non-visual signal, and responsive thereto, systems and methods disclosed herein can identify and save a location of the IoT automation device within the field of view of the IoT video device. Then, systems and methods disclosed herein can match the location of the IoT automation device within the field of view of the IoT video device with a location for the portion of the controlling graphics to be overlaid on the depiction of the IoT automation device displayed in the video data stream. When the user interface device receives the second user input, systems and methods disclosed herein can correlate a touch point of the user interface device receiving the second user input with the location of the portion of the controlling graphics overlaying the depiction of the IoT automation device displayed in the video data stream, the location of the IoT automation device within the field of view of the IoT video device, and/or with the IoT automation device itself. Responsive thereto, systems and methods herein can initiate the change of state of the IoT automation device in the region.
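
In practice, this matching can be as simple as storing the device's position in normalized field-of-view coordinates at enrollment, scaling that position to the pixels of whatever display later renders the stream, and hit-testing touches against the scaled point. The sketch below assumes normalized coordinates and a fixed hit radius; neither is specified in the disclosure.

    def fov_to_screen(norm_x, norm_y, screen_w, screen_h):
        """Scale a device location saved in normalized field-of-view
        coordinates (0..1) to pixel coordinates on the displayed stream,
        giving the location for the controlling graphic."""
        return norm_x * screen_w, norm_y * screen_h

    def correlates(touch_x, touch_y, graphic_x, graphic_y, radius=40):
        """Correlate a touch point with an overlaid controlling graphic
        using a simple hit-test radius (an assumed tolerance)."""
        dx, dy = touch_x - graphic_x, touch_y - graphic_y
        return dx * dx + dy * dy <= radius * radius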



FIG. 1 is a block diagram of a connected home system 100 in accordance with disclosed embodiments. As seen in FIG. 1, the connected home system 100 can include an IoT video device 102 monitoring a region R in which an IoT automation device 106 is installed, and the IoT automation device 106 can be located within a field of view 104 of the IoT video device 102. The IoT video device 102 can capture a video data stream 112 of the region R within the field of view 104, and the video data stream 112 can include a depiction 114 of the IoT automation device 106.


The IoT video device 102 or another device of the connected home system 100, such as a control panel, a gateway device, or the like, can wirelessly transmit the video data stream 112 to a remote server or device 120 that is in wireless communication with the connected home system 100, and the remote server or device 120 can receive the video data stream 112 via a transceiver device 122. As seen in FIG. 1, the remote server or device 120 can also include a memory device 126, and each of the transceiver device 122 and the memory device 126 can be in communication with control circuitry 124, a programmable processor 124a, and executable control software 124b as would be understood by one of ordinary skill in the art. The executable control software 124b can be stored on a transitory or non-transitory computer readable medium, including, but not limited to local computer memory, RAM, optical storage media, magnetic storage media, flash memory, and the like. In some embodiments, some or all of the control circuitry 124, the programmable processor 124a, and the executable control software 124b can execute and control at least some of the methods disclosed and described above and herein.


As seen in FIG. 1, the IoT automation device 106 can include an emitter 108. To enroll the IoT automation device 106 with the IoT video device 102 for control via the video data stream 112, the IoT video device 102 and/or the remote server or device 120 can instruct the IoT automation device 106 to transmit via the emitter 108 a visual or non-visual signal to the IoT video device 102 that identifies the IoT automation device 106. The IoT video device 102 can capture the visual or non-visual signal and transmit the visual or non-visual signal to the remote server or device 120, and the remote server or device 120 can receive the visual or non-visual signal or a representation thereof via the transceiver device 122. Responsive thereto, the control circuitry 124, the programmable processor 124a, and the executable control software 124b can identify and save in the memory device 126 a location of the IoT automation device 106 within the field of view 104 of the IoT video device 102 and match the location of the IoT automation device 106 within the field of view 104 of the IoT video device 102 with a location for a controlling graphic 116 to be overlaid on a depiction 114 of the IoT automation device 106 displayed in the video data stream 112.
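
A minimal sketch of that enrollment record, assuming a dictionary keyed by device id and a hypothetical graphic_for helper that pairs a capability with a controlling graphic, could look like this:

    def enroll(registry, device_id, capability, fov_location, graphic_for):
        """Save the device's field-of-view location decoded from the
        emitter's signal and pair it with the controlling graphic to be
        overlaid on the device's depiction (names are illustrative)."""
        registry[device_id] = {
            "capability": capability,
            "fov_location": fov_location,        # normalized (x, y) in the FOV
            "graphic": graphic_for(capability),  # e.g. "dimmer_slider"
        }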


The remote server or device 120 can also be in wireless communication with a user interface device 110. Accordingly, the transceiver device 122 can transmit the video data stream 112 and the controlling graphic 116 to the user interface device 110 with a video instruction signal for the user interface device 110 to overlay the controlling graphic 116 on top of the depiction 114 of the IoT automation device 106 in the video data stream 112 when displaying the video data stream 112 thereon. Responsive thereto, and as seen in FIG. 1, the user interface device 110 can display the video data stream 112 with the depiction 114 of the IoT automation device 106 therein and the controlling graphic 116 overlaid thereon.


In some embodiments, the control circuitry 124, the programmable processor 124a, and the executable control software 124b can transmit the video data stream 112 to the user interface device 110 separately from the controlling graphic 116 and the video instruction signal. For example, in these embodiments, the control circuitry 124, the programmable processor 124a, and the executable control software 124b can transmit the controlling graphic 116 and the video instruction signal to the user interface device 110 responsive to the user interface device 110 displaying the video data stream 112, receiving first user input, and transmitting a first request signal to the remote server or device 120 requesting the controlling graphic 116.
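
A server-side handler for that first request signal might simply look up what enrollment stored and answer with the controlling graphic and its overlay location. The payload fields below are assumptions for illustration, not a format defined by the disclosure.

    import json

    def handle_first_request(registry, device_id):
        """Answer a first request signal with a hypothetical video
        instruction payload telling the user interface device where to
        overlay the controlling graphic."""
        entry = registry[device_id]  # saved during enrollment
        return json.dumps({
            "instruction": "overlay_graphic",
            "device_id": device_id,
            "graphic": entry["graphic"],
            "overlay_location": entry["fov_location"],  # normalized (x, y)
        })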


In any embodiment, the control circuitry 124, the programmable processor 124a, and the executable control software 124b can retrieve the controlling graphic 116 associated with the IoT automation device 106 and the location for the controlling graphic 116 to be overlaid on the depiction 114 of the IoT automation device 106 displayed in the video data stream 112 from the memory device 126.


When the user interface device 110 is displaying the video data stream 112 with the controlling graphic 116 overlaid on the depiction 114 of the IoT automation device 106, the user interface device 110 can receive second user input identifying the controlling graphic 116 and, responsive thereto, transmit a second request signal to the remote server or device 120. The transceiver device 122 can receive the second request signal, and responsive thereto, the control circuitry 124, the programmable processor 124a, and the executable control software 124b can correlate a touch point of the user interface device 110 receiving the second user input with the location of the controlling graphic 116 overlaying the depiction 114 of the IoT automation device 106 displayed in the video data stream 112, with the location of the IoT automation device 106 within the field of view 104 of the IoT video device 102, and/or with the IoT automation device 106 itself. Then, the control circuitry 124, the programmable processor 124a, and the executable control software 124b can initiate a change of state of the IoT automation device 106, for example, by transmitting an automation instruction signal to the IoT automation device 106 or the another device of the connected home system 100 for the IoT automation device 106 to change its state pursuant to the second user input. Because the IoT video device 102 can capture the video data stream 112 of the region R within the field of view 104, including the IoT automation device 106, the video data stream 112 can capture the IoT automation device 106 with the state changed for display on the user interface device 110, thereby providing visual confirmation for a user regarding the state of the IoT automation device 106 being changed.
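
Putting the pieces together, a hedged sketch of the second-request handling follows: the server walks the enrolled devices, correlates the reported touch point with each device's overlay location, and, on a match, transmits an automation instruction. The send_instruction callable stands in for the transceiver path and, like the registry layout, is an assumption.

    def handle_second_request(registry, touch_x, touch_y, send_instruction, radius=40):
        """Correlate the touch point with an enrolled device's overlay
        location and initiate the change of state on a match (sketch)."""
        for device_id, entry in registry.items():
            gx, gy = entry["screen_location"]  # pixels, per the earlier mapping sketch
            if (touch_x - gx) ** 2 + (touch_y - gy) ** 2 <= radius ** 2:
                # Automation instruction signal for the device to change state.
                send_instruction(device_id, "change_state")
                return device_id
        return None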


Although a few embodiments have been described in detail above, other modifications are possible. For example, the logic flows described above do not require the particular order described or sequential order to achieve desirable results. Other steps may be provided, steps may be eliminated from the described flows, and other components may be added to or removed from the described systems. Other embodiments may be within the scope of the invention.


From the foregoing, it will be observed that numerous variations and modifications may be effected without departing from the spirit and scope of the invention. It is to be understood that no limitation with respect to the specific system or method described herein is intended or should be inferred. It is, of course, intended to cover all such modifications as fall within the spirit and scope of the invention.

Claims
  • 1. A method comprising: instructing an IoT automation device to transmit a visual signal from an emitter of the IoT automation device; receiving, at an IoT video device, the visual signal from the emitter of the IoT automation device, the visual signal identifying the IoT automation device; responsive to receiving the visual signal, identifying and saving an actual first location of the IoT automation device within a field of view of the IoT video device based on a location at which the visual signal is emitted from the emitter of the IoT automation device; receiving a video data stream from the IoT video device monitoring a region in which the IoT automation device is located within the field of view of the IoT video device; displaying the video data stream on a display of a user interface device; matching the actual location of the IoT automation device within the field of view of the IoT video device with a first location on the display of the user interface device; overlaying, for a predetermined period of time, a controlling graphic at the first location on the display of the user interface device that is on top of a depiction of the IoT automation device in the video data stream displayed on the display of the user interface device; receiving first user input at a touch point on the video data stream displayed on the display of the user interface device; determining whether the touch point on the video data stream is correlated with the first location of the controlling graphic overlaying the video data stream on the display of the user interface device; responsive to the touch point being correlated with the first location on the display of the user interface device, initiating a change of state of the IoT automation device in the region; and responsive to initiating the change of state of the IoT automation device, removing the controlling graphic from the display of the user interface device and displaying the video data stream on the display of the user interface device with the depiction of the IoT automation device showing the state changed in the video data stream.
  • 2. The method of claim 1 further comprising: receiving second user input requesting the controlling graphic via the video data stream displayed on the display of the user interface device; and responsive to the second user input, overlaying the controlling graphic at the first location on the display of the user interface device.
  • 3. The method of claim 1 further comprising: responsive to receiving second user input at the first location on the display of the user interface device, displaying details for a controlling graphic at the first location on the display of the user interface device; receiving the first user input at the touch point on the display of the user interface device while the details for the controlling graphic are displayed, the first user input adjusting one of the details for the controlling graphic; and responsive to the first user input, initiating the change of the state of the IoT automation device in the region by initiating an adjustment of the IoT automation device in the region that corresponds to the one of the details as adjusted.
  • 4. A system comprising: an IoT video device monitoring a region; an IoT automation device located in the region within a field of view of the IoT video device, the IoT automation device including an emitter; a remote server or device in wireless communication with the IoT video device and the IoT automation device; and a user interface device in communication with the remote server or device, wherein the remote server or user interface device instructs the IoT automation device to transmit a visual signal from the emitter of the IoT automation device, wherein the visual signal identifies the IoT automation device, wherein the IoT video device receives the visual signal and transmits the visual signal to the remote server or user interface device, wherein, in response to receiving the visual signal, the remote server or user interface device identifies and saves an actual first location of the IoT automation device within the field of view of the IoT video device based on a location at which the visual signal is emitted from the emitter of the IoT automation device, wherein the IoT video device captures a video data stream of the field of view and transmits the video data stream to the remote server or device, wherein the remote server or device transmits the video data stream to the user interface device, wherein the user interface device displays the video data stream on a display of the user interface device, wherein the remote server or device matches the actual location of the IoT automation device within the field of view of the IoT video device with a first location on the display of the user interface device, wherein the remote server or device overlays, for a predetermined period of time, a controlling graphic at the first location on the display of the user interface device that is on top of a depiction of the IoT automation device in the video data stream displayed on the display of the user interface device, wherein the user interface device receives first user input at a touch point on the video data stream displayed on the display of the user interface device, wherein the user interface device determines whether the touch point on the video data stream is correlated with the first location of the controlling graphic overlaying the video data stream on the display of the user interface device, wherein, responsive to the touch point being correlated with the first location on the display of the user interface device, the remote server or device initiates a change of state of the IoT automation device in the region, and wherein, responsive to initiating the change of state of the IoT automation device in the region, the remote server or device removes the controlling graphic from the display of the user interface device and displays the video data stream on the display of the user interface device with the depiction of the IoT automation device showing the state changed in the video data stream.
  • 5. The system of claim 4 wherein the user interface device receives second user input requesting the controlling graphic via the video data stream displayed on the display of the user interface device, and wherein, responsive to the second user input, the remote server or device overlays the controlling graphic on top of the depiction of the automation device in the video data stream displayed on the display of the user interface device.
  • 6. The system of claim 4 wherein, responsive to receiving first user input at the first location on the display of the user interface device, the remote server or device displays details for a controlling graphic at the first location on the display of the user interface device, wherein the user interface device receives the first user input at the touch point on the display of the user interface device and second user input adjusting one of the details for the controlling graphic, and wherein, responsive to the second user input, the remote server or device initiates the change of the state of the IoT automation device in the region by initiating an adjustment of the IoT automation device in the region that corresponds to the one of the details as adjusted.
Related Publications (1)
Number Date Country
20200241736 A1 Jul 2020 US