Security protocol for motion tracking systems

Information

  • Patent Grant
  • 10989803
  • Patent Number
    10,989,803
  • Date Filed
    Tuesday, August 21, 2018
  • Date Issued
    Tuesday, April 27, 2021
Abstract
An approach to authentication, provisioning, and/or access control makes use of a user performing prescribed motion of their body and/or particular limbs (e.g., a prescribed number of steps in one or more directions), for example, as part of a challenge-response protocol. The motion may be detected using radio frequency reflection techniques. Applications of this approach may include provisioning of equipment that monitors motion within an environment. In some examples, the environment is defined as part of the user performing the prescribed motion (e.g., the user walks around a boundary of their house).
Description
BACKGROUND

This invention relates to security protocols for motion tracking systems.


Recent years have seen new technologies that can track human motion without requiring the monitored person to carry or wear a sensor on his or her body. Such devices therefore do not require the consent of the monitored person, or even that person's awareness of the monitoring. For example, U.S. Pat. No. 9,753,131 describes a system that monitors a person's 3D location using RF reflections off the subject's body, and does so through walls and obstacles. U.S. Pat. Pub. 2017/0042432 describes a system that can track a person's breathing and heartbeat without body contact and through walls and obstacles.


Because such devices can operate even when the monitored person is oblivious to their presence, there is a need to ensure privacy and/or authority to perform such monitoring.


SUMMARY

In one aspect, in general, an approach to authentication, provisioning, and/or access control makes use of a user performing prescribed motion of their body and/or particular limbs (e.g., a prescribed number of steps in one or more directions), for example, as part of a challenge-response protocol. The motion may be detected using radio frequency reflection techniques. Applications of this approach may include provisioning of equipment that monitors motion within an environment. In some examples, the environment is defined as part of the user performing the prescribed motion (e.g., the user walks around a boundary of their house).


Other features and advantages of the invention are apparent from the following description, and from the claims.





DESCRIPTION OF DRAWINGS


FIG. 1 is a system diagram.





DESCRIPTION

Referring to FIG. 1, a motion-based system 100 includes a tracking system 110, which is able to track motion of a subject 101 in an environment (e.g., in a room in which the system is located, or in which the system can track motion through one or more walls, etc.). The tracking system provides motion-based data to an application system 190, which processes the motion-based data and performs various actions or analyses based on that data. In some examples, the application system is tightly coupled to the tracking system 110, for example, being integrated in a single device, or in communication over a local area network. In some examples, the application system may be remote from the tracking system, for example, executing in a dedicated or shared server remote from the tracking system 110 and in communication over the public Internet.


Continuing to refer to FIG. 1, the tracking system 110 is able to sense motion of the subject 101 using a non-contact technology. In the system illustrated in FIG. 1, a radio-frequency approach is used in which antennas 122 of an antenna subsystem 120 emit radio frequency signals and receive corresponding reflections of the emitted signals from the subject. Very generally, the tracker 130 processes the relationship between the emitted signals and the received reflections to infer motion of the subject. In various examples of such a system, the motion may include one or more of (a) gross motion, such as walking in a particular pattern in a room, (b) limb or gesture motion that relates to absolute or relative motion of particular limbs or body parts, and (c) fine motion, such as motion that encodes heart rate or breathing rate through, for example, periodic motion of the subject's chest.
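

By way of illustration only, the following sketch (in Python) shows one simplified way a tracker of this general type could convert an emitted pulse and its received reflection into a range estimate, with gross motion inferred from frame-to-frame changes in range. The sample rate, pulse time-of-flight model, and function names are assumptions for the example and are not taken from this description.

    import numpy as np

    C = 3e8             # speed of light, m/s
    SAMPLE_RATE = 1e9   # assumed sample rate of 1 GS/s

    def estimate_range(emitted: np.ndarray, received: np.ndarray) -> float:
        """Distance (meters) of the strongest reflector, from round-trip delay."""
        corr = np.correlate(received, emitted, mode="full")
        lag = int(np.argmax(np.abs(corr))) - (len(emitted) - 1)  # delay in samples
        delay_s = max(lag, 0) / SAMPLE_RATE
        return C * delay_s / 2.0

    def track_ranges(frames):
        """Per-frame range estimates; successive differences indicate gross motion."""
        return [estimate_range(tx, rx) for tx, rx in frames]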


Notwithstanding the ability of the tracker 130 to process signals (e.g., RF transmission and reflection signals) to infer the motion of the subject, for one or more reasons the tracking system 110 as a whole limits whether such information is provided, for example, to one or more application systems 190, based on enrollment/authentication data 145. For example, the data may represent one or more of (a) a spatial region in which the tracking system 110 is permitted to provide motion data, (b) a time period during which such motion data may be provided, and (c) a type of motion data that may be provided.
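

The following sketch (Python) illustrates one possible shape for the enrollment/authentication data 145 and the kind of check the tracker 130 could apply before passing motion data to an application system. The field names and the rectangular region representation are assumptions for the example, not part of this description.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class EnrollmentRecord:
        # Permitted spatial region as an axis-aligned box: (x_min, y_min, x_max, y_max), meters.
        region: tuple
        valid_from: datetime        # start of permitted time period
        valid_until: datetime       # end of permitted time period
        permitted_types: frozenset  # e.g. frozenset({"gross", "gesture"}); "vital" excluded

        def allows(self, x, y, when, motion_type):
            """True if motion data of this type, at this place and time, may be provided."""
            x_min, y_min, x_max, y_max = self.region
            return (x_min <= x <= x_max and y_min <= y <= y_max
                    and self.valid_from <= when <= self.valid_until
                    and motion_type in self.permitted_types)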


The enrollment/authentication data 145 may be configured in a variety of ways. One way this data can be configured relates to soliciting explicit authorization from the subject, for example, to maintain the privacy of the subject as required by agreed-upon privacy policies governing operation of the system. One example of soliciting authorization is emitting a prompt or other signal that would be perceived (e.g., heard or seen) by the subject as well as by other individuals in the vicinity. For example, the prompt may be generated by a challenge generator 150 (e.g., based on recorded or computer-synthesized speech) and emitted via an audio speaker 155. For example, the prompt might be "To authorize motion tracking in this room, please <specification of authentication motion>", where the specification of authentication motion may be a reference to a pre-defined (possibly secret) motion pattern, such as "take two steps to the left and then two steps back" or "wave with your right hand" or "hold your breath for 5 seconds, breathe normally, and then hold your breath for 5 seconds again". Optionally, the system may announce the initiation of motion tracking, for example, "Motion tracking is being initiated. If you do not wish your motion to be tracked, please leave this vicinity." As indicated above, prompts are not necessarily audio; in other versions, the prompt may instead or in addition be presented via a visual display, such as a computer or television screen.
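

As a non-limiting illustration, a challenge generator of this kind could be as simple as the following sketch, which selects a pre-defined motion pattern and renders the prompt text to be spoken via the speaker 155 or shown on a display. The particular challenge set and the names used are assumptions for the example.

    import random

    CHALLENGES = {
        "two_left_two_back": "take two steps to the left and then two steps back",
        "wave_right_hand": "wave with your right hand",
        "timed_breath_hold": "hold your breath for 5 seconds, breathe normally, "
                             "and then hold your breath for 5 seconds again",
    }

    def generate_challenge(rng=random.Random()):
        """Return (challenge_id, prompt_text) for the speaker or a visual display."""
        challenge_id = rng.choice(sorted(CHALLENGES))
        prompt = ("To authorize motion tracking in this room, please "
                  + CHALLENGES[challenge_id] + ".")
        return challenge_id, prompt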


Continuing to refer to FIG. 1, the enrollment/authentication process is controlled by an enrollment processor 140 of the tracking system. For example, when the tracking system is turned on, or when tracking is requested by the application system 190, the enrollment processor may cause the challenge generator 150 to emit a challenge of a type described above. The enrollment processor 140 then receives tracking data from the tracker 130 and determines whether this tracking data forms a suitable response to the challenge. If it does, the enrollment processor saves corresponding enrollment/authentication data 145, which is used by the tracker 130 to permit the passing of motion data to the application system 190. For example, the enrollment processor may use one of a variety of pattern matchers to determine whether the response was suitable, for example, a motion template matcher or a neural network pattern matcher.
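

For example, a motion template matcher could be as simple as the following sketch, which compares the per-step displacements reported by the tracker against a stored template within a tolerance. The tolerance value and the step representation are assumptions for the example; an actual matcher (or a neural network pattern matcher) may be considerably more elaborate.

    def matches_template(observed, template, tolerance_m=0.3):
        """True if each observed (dx, dy) step is within tolerance of the template step."""
        if len(observed) != len(template):
            return False
        return all(abs(ox - tx) <= tolerance_m and abs(oy - ty) <= tolerance_m
                   for (ox, oy), (tx, ty) in zip(observed, template))

    # "Take two steps to the left and then two steps back", as per-step displacements in meters.
    TWO_LEFT_TWO_BACK = [(-0.7, 0.0), (-0.7, 0.0), (0.0, -0.7), (0.0, -0.7)]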


In some examples, the motion-based system 100 or the tracking system 110 is portable, and therefore it may be possible for authentication or enrollment to be performed in one location or in a particular orientation, and for the system then to be moved to another location or orientation in which it should not be authorized to monitor motion. In some examples of the system, an optional motion sensor 160 (e.g., using inertially-based sensing, radio positioning, etc.) provides signals to the enrollment processor that indicate the change. The enrollment processor may then cancel the authorization by writing appropriate information to the enrollment/authentication data 145, or may initiate a new challenge requiring a response from the subject.
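

One simple way to act on the motion sensor 160 is sketched below: if the reported translation or rotation of the device exceeds a threshold, the stored authorization is cleared so that a new challenge is required. The threshold values and the enrollment_store interface are hypothetical, introduced only for this illustration.

    MOVE_THRESHOLD_M = 0.05      # assumed: any translation beyond 5 cm invalidates enrollment
    ROTATE_THRESHOLD_DEG = 5.0   # assumed: any rotation beyond 5 degrees invalidates enrollment

    def device_was_relocated(translation_m, rotation_deg):
        return translation_m > MOVE_THRESHOLD_M or rotation_deg > ROTATE_THRESHOLD_DEG

    def on_device_motion(translation_m, rotation_deg, enrollment_store):
        """Cancel the authorization when the device itself has moved; a new challenge follows."""
        if device_was_relocated(translation_m, rotation_deg):
            enrollment_store.clear()  # hypothetical store holding enrollment/authentication data 145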


There are other situations in which the enrollment processor may terminate the authorization to monitor motion in the environment, including one or more of (a) an elapsed time, for example, a predetermined duration such as 30 minutes; (b) an elapsed time after a motion-based event, such as 10 minutes after the subject leaves the sensing vicinity of the system, or immediately if a second subject enters the vicinity; or (c) an explicit gesture or other motion input from the subject that indicates that the system is to stop sensing.
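

These termination conditions can be combined as in the following sketch. The 30-minute and 10-minute durations are the examples given above; the way the conditions are combined into a single check is an assumption for the example.

    from datetime import datetime, timedelta

    MAX_SESSION = timedelta(minutes=30)   # example duration from the text
    MAX_ABSENCE = timedelta(minutes=10)   # example duration from the text

    def should_revoke(now, authorized_at, last_seen,
                      second_subject_present, stop_gesture_seen):
        """True if the authorization to monitor motion should be terminated."""
        return (now - authorized_at > MAX_SESSION
                or now - last_seen > MAX_ABSENCE
                or second_subject_present
                or stop_gesture_seen)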


As introduced above, the tracker 130 may use radio frequency reflections as the basis of the motion tracking. In some embodiments, the approach used makes use of techniques described in one or more of the following applications, each of which is incorporated herein by reference:

    • U.S. Pat. No. 9,753,131 titled “Motion Tracking Via Body Radio Reflections” issued on Sep. 5, 2017 (also corresponding to International Application No. PCT/US2014/059856, titled “Motion Tracking Via Body Radio Reflections” and filed Oct. 9, 2014, and U.S. Pat. Pub. 2017/0082741 published Mar. 23, 2017);
    • U.S. Pat. Pub. 2017/0074980, titled “Object Tracking Via Radio Reflections”, published Mar. 16, 2017 (also corresponding to International Application No. PCT/US2015/017239, filed Feb. 24, 2015); and
    • U.S. Pat. Pub. 2017/0042432, titled “Vital Signs Monitoring Via Radio Reflections,” published Feb. 16, 2017 (corresponding to International Application No. PCT/US2015/027945, filed Apr. 28, 2015).


      and the following publications, each of which is also incorporated herein by reference:
    • 3D Tracking via Body Radio Reflections, Fadel Adib, Zach Kabelac, Dina Katabi, Robert C. Miller, Usenix NSDI'14, Seattle, Wash., April 2014
    • Multi-Person Localization via RF Body Reflections, Fadel Adib, Zach Kabelac, Dina Katabi, Usenix NSDI'15, Oakland, Calif., May 2015
    • Demo: Real-time Breath Monitoring Using Wireless Signals, Fadel Adib, Zach Kabelac, Hongzi Mao, Dina Katabi, Robert C. Miller, ACM MobiCom'14, Maui, Hi., September 2014
    • Smart Homes That Monitor Breathing and Heart Rate, Fadel Adib, Hongzi Mao, Zach Kabelac, Dina Katabi, Robert C. Miller, CHI'15, Seoul, South Korea, April 2015
    • Capturing the Human Figure Through a Wall, Fadel Adib, Chen-Yu Hsu, Hongzi Mao, Dina Katabi, Fredo Durand, ACM SIGGRAPH Asia'15, Kobe, Japan, November 2015


Although the radio-frequency approaches described above provide advantages, for example, being able to sense through some physical barriers and being of relatively low cost, other sensing modes may be used. For example, rather than using reflected radio frequency signals, acoustic (e.g., ultrasound) signals may be used to track motion. Also, optical techniques may be used, with a camera or camera array (e.g., for depth sensing) sensing natural or synthetic illumination of the subject, from which the subject's motion is determined by an alternative version of the tracker 130 configured for such signals. Of course, yet other embodiments may use two or more sensing modes together.


Without limiting the approach, the following are examples of the approaches described above:


Example 1: The user installs the device in the desired location. When the user turns the device on, she is asked to stand at a specific distance (or location) in front of the device; say she is asked to stand 2 meters in front of the device. The user is then asked to perform a sequence of movements. For example, she may be asked to first take two steps to the right, next take two steps to the left, next take two steps back, etc. The device monitors the user as she performs the sequence and checks that it matches the desired movements. If not, the user is asked to repeat the sequence, and is eventually denied access if she cannot pass the response phase. Since the device can track people, the main point of this exercise is to ascertain that the person the device is monitoring is actually aware of the monitoring process. Once this check is performed, the device can monitor this blob/person until it loses track of it.
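

The flow of Example 1 might be organized as in the following sketch, in which the prompt, tracking, and matching functions are placeholders for the mechanisms described earlier; the retry limit is an assumption for the example.

    MAX_ATTEMPTS = 3   # assumed retry limit

    def run_challenge(prompt_fn, observe_fn, expected_sequence, match_fn):
        """Prompt, observe, compare; retry a limited number of times, then deny access."""
        prompt_fn("Please stand 2 meters in front of the device.")
        for _ in range(MAX_ATTEMPTS):
            prompt_fn("Please perform the announced sequence of movements.")
            observed = observe_fn()                    # per-step displacements from the tracker
            if match_fn(observed, expected_sequence):
                return True                            # device may now follow this person
            prompt_fn("That did not match. Please try again.")
        return False                                   # access denied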


Example 2: In our home security example, the user wants the device to track people in her home, not just track her. So the next step in the protocol is to prove to the device that the user has access to the monitored space. Once the user performs the above step, she walks through the area she wants to monitor. The device follows the user's motion, marks those locations, and stores the marking for future use. The device may fill in small areas in the middle of where the user walked; for example, the user may walk around a table, and the table will still be considered part of the home. These marked locations are considered accessible to this user, and the device will allow her to use her credential (e.g., a password) for future access to the data from that space.
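

One way to turn the walked path into a marked, monitored region is sketched below: cells of a grid that the user walked through are marked, and any gap fully enclosed by the walk (such as the table) is filled in by excluding only the cells reachable from outside the walked boundary. The grid representation and fill rule are assumptions for the example.

    from collections import deque

    def mark_monitored_area(path_cells, width, height):
        """Return the set of grid cells on or inside the user's walked path."""
        outside = set()
        queue = deque((x, y) for x in range(width) for y in range(height)
                      if (x in (0, width - 1) or y in (0, height - 1))
                      and (x, y) not in path_cells)
        outside.update(queue)
        while queue:                                  # flood fill inward from the grid border
            x, y = queue.popleft()
            for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if (0 <= nx < width and 0 <= ny < height
                        and (nx, ny) not in outside and (nx, ny) not in path_cells):
                    outside.add((nx, ny))
                    queue.append((nx, ny))
        return {(x, y) for x in range(width) for y in range(height)
                if (x, y) not in outside}             # walked cells plus enclosed cells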


Example 3: What if the user performs the above protocol, showing the device her ability to access a particular area, then changes the device's position and points it toward a space that she has no access to—e.g., the neighbor's home. Since the device localizes with respect to itself, in the absence of additional information, the device may confuse the marked space with some other space that the user should not have access to. To prevent this, the device should include a sensor (or mechanism) to detect its own motion (e.g., an accelerometer or infrared sensor). If the device detects that it was moved since the last time the privacy protocol was performed, it will require the user to repeat the protocol (both steps 1 and 2) before giving her access again.


Example 4: A similar approach can be used for monitoring breathing and heart signals (another type of device-free monitoring). The challenge-response in this case could be a body motion as above, or a breathing motion; e.g., the device asks the user to take a breath and then release her breath in a particular timed manner.
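

For Example 4, the response check could compare the breath-hold intervals derived from chest motion against the commanded timing, as in the following sketch; the tolerance value is an assumption for the example.

    # e.g., "hold your breath for 5 seconds ... and then hold your breath for 5 seconds again"
    COMMANDED_HOLDS_S = [5.0, 5.0]

    def breathing_matches(observed_holds_s, commanded_holds_s, tolerance_s=1.5):
        """True if each observed breath-hold duration is within tolerance of the command."""
        if len(observed_holds_s) != len(commanded_holds_s):
            return False
        return all(abs(o - c) <= tolerance_s
                   for o, c in zip(observed_holds_s, commanded_holds_s))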


As discussed above, the motion-based system 100, or at least the tracking system 110, may be incorporated into a device that is installed or placed in the room or other region in which motion is to be sensed. For example, the tracking system 110 may be integrated into a personal assistant device, which may also be configured to respond to voice commands. The motion-based authentication may be used to ensure privacy of motion in the sensing range of the assistant device, and may also be used to ensure privacy of other modes of sensing, such as audio and/or video sensing. As another example, the tracker may be integrated in a hospital room monitoring system (e.g., with a central application system being used to monitor multiple beds in the hospital).


Implementations of the system may be computer controlled or computer implemented, with one or more processors of the tracking system 110 being configured with software instructions stored on a machine-readable medium of the system. These instructions then cause one or more physical or logical processors of the system to perform the steps of the approaches described above. For example, the instructions may be machine-level instructions, programming language statements (whether interpreted or compiled), or instructions at an intermediate level (e.g., object code, code libraries, etc.). Alternative implementations may implement one or more of the components in hardware, for example, using Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), and the like, and a combination of software- and hardware-based implementations may be used.


It is to be understood that the foregoing description is intended to illustrate and not to limit the scope of the invention, which is defined by the scope of the appended claims. Other embodiments are within the scope of the following claims.

Claims
  • 1. A method of authentication and/or access control comprising: prompting a user to perform a commanded motion-based action; determining an observed motion-based action of the user using radio-frequency based tracking; determining if the observed motion-based action matches the commanded motion-based action; if the observed motion-based action matches the commanded motion-based action, performing an authentication and/or access control operation including authorizing a user to use a motion tracking system or providing a user access to the motion tracking system for observation of subsequent motion-based action of the user; and deauthenticating the user or revoking the user's access to the motion tracking system upon detecting that the motion tracking system has been moved from one location to another location.
  • 2. The method of claim 1 wherein the commanded motion-based action comprises a limb-based motion.
  • 3. The method of claim 2 wherein the commanded motion-based action comprises whole-body motion.
  • 4. The method of claim 3 wherein the whole-body motion comprises a movement to a prescribed location.
  • 5. The method of claim 3 wherein the whole-body motion comprises a sequence of prescribed relative movements.
  • 6. The method of claim 1 further comprising provisioning the motion tracking system, including: prompting a user to perform a commanded motion-based action; determining an observed motion-based action of the user using radio-frequency based tracking; and provisioning the motion tracking system according to the observed motion-based action.
  • 7. The method of claim 6 wherein the observed motion-based action comprises movement to a plurality of locations in an environment, and provisioning the system is performed according to said plurality of locations.
  • 8. The method of claim 1 wherein determining the observed motion-based action of the user using radio-frequency based tracking comprises: emitting first radio signals from one or more transmitting antennas; acquiring second radio signals at one or more receiving antennas, the second radio signals comprising reflections of the first radio signals from the user; and processing the second radio signals to determine the motion-based action of the user.
  • 9. The method of claim 1 further comprising deauthenticating the user or revoking the user's access to the motion tracking system upon determining that a predetermined amount of time has elapsed.
  • 10. The method of claim 1 further comprising deauthenticating the user or revoking the user's access to the motion tracking system upon determining that a predetermined amount of time has elapsed since the motion tracking system last detected movement of the user.
  • 11. A system comprising: a motion-tracking system comprising a radio transmitter; a plurality of antennas, each antenna being configured for transmission and/or receiving of radio signals; a processor configured to cause emission of first radio signals via one or more of the antennas and to process signals received at one or more of the antennas to track motion; and a controller configured to: prompt a user to perform a commanded motion-based action; determine an observed motion-based action of the user using radio-frequency based tracking; determine if the observed motion-based action matches the commanded motion-based action; if the observed motion-based action matches the commanded motion-based action, perform an authentication and/or access control operation including authorizing a user to use a motion tracking system or providing a user access to the motion tracking system for observation of subsequent motion-based action of the user; and deauthenticate the user or revoke the user's access to the motion tracking system upon detecting that the motion tracking system has been moved from one location to another location.
  • 12. A method of authentication and/or access control comprising: prompting a user to perform a commanded motion-based action; determining an observed motion-based action of the user using radio-frequency based tracking; determining if the observed motion-based action matches the commanded motion-based action; if the observed motion-based action matches the commanded motion-based action, performing an authentication and/or access control operation including authorizing a user to use a motion tracking system or providing a user access to the motion tracking system for observation of subsequent motion-based action of the user; and deauthenticating the user or revoking the user's access to the motion tracking system upon detecting a gesture indicating that the motion tracking system should enter a privacy mode of operation.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Application No. 62/548,058, filed on Aug. 21, 2017, which is incorporated herein by reference.

US Referenced Citations (89)
Number Name Date Kind
5331953 Andersson et al. Jul 1994 A
7010445 Battenberg et al. Mar 2006 B2
7215698 Darby et al. May 2007 B2
7636855 Applebaum Dec 2009 B2
7698002 Music Apr 2010 B2
7753849 Morgan et al. Jul 2010 B2
7811234 McGrath Oct 2010 B2
7916066 Osterweil Mar 2011 B1
7973702 Rofougaran et al. Jul 2011 B2
8059700 Lopez-Risueno et al. Nov 2011 B2
8174931 Vartanian et al. May 2012 B2
8179816 Vaidyanathan et al. May 2012 B1
8232866 McGrath et al. Jul 2012 B2
8516561 White Aug 2013 B2
8539550 Terres Sep 2013 B1
8721554 Lin et al. May 2014 B2
8884809 Hyde et al. Nov 2014 B2
8905928 Hayes et al. Dec 2014 B2
8925070 Sriraghavan Dec 2014 B2
8954758 Leoutsarakos Feb 2015 B2
8988792 Fujimoto Mar 2015 B2
9186079 Mase et al. Nov 2015 B2
9202027 Odessky Dec 2015 B2
9229102 Wright et al. Jan 2016 B1
9239917 Kapinos Jan 2016 B2
9253592 Moscovich Feb 2016 B1
9274210 Enge et al. Mar 2016 B2
9443419 Wilson et al. Sep 2016 B2
9477812 Lin et al. Oct 2016 B2
9492099 Gamble et al. Nov 2016 B2
9584503 Zhang Feb 2017 B2
9626498 Gay Apr 2017 B2
9811164 Poupyrev Nov 2017 B2
10135667 Greer Nov 2018 B1
10139916 Poupyrev Nov 2018 B2
10218407 Trotta Feb 2019 B2
10310620 Lien Jun 2019 B2
20030209077 Battenberg et al. Nov 2003 A1
20040130442 Breed Jul 2004 A1
20070139216 Breed Jun 2007 A1
20080074307 Boric-Lubecke et al. Mar 2008 A1
20080077015 Boric-Lubecke et al. Mar 2008 A1
20080119716 Boric-Lubecke et al. May 2008 A1
20080234568 Ouchi Sep 2008 A1
20080275328 Jones Nov 2008 A1
20080275337 Fossan et al. Nov 2008 A1
20080316085 Rofougaran et al. Dec 2008 A1
20090209850 Tao et al. Aug 2009 A1
20090271004 Zecchin Oct 2009 A1
20090278728 Morgan et al. Nov 2009 A1
20090303108 Hilsebecher et al. Dec 2009 A1
20100241009 Petkie Sep 2010 A1
20110181509 Rautiainen Jul 2011 A1
20110260856 Rossmann et al. Oct 2011 A1
20110279366 Lohbihler Nov 2011 A1
20120087212 Vartanian Apr 2012 A1
20120116186 Shrivastav et al. May 2012 A1
20120116187 Hayes et al. May 2012 A1
20120182617 Fujimoto Jul 2012 A1
20120185096 Rosenstein Jul 2012 A1
20120192617 Walton et al. Aug 2012 A1
20120242501 Tran Sep 2012 A1
20120250546 Hamida Oct 2012 A1
20120274498 Hyde et al. Nov 2012 A1
20120304284 Johnson Nov 2012 A1
20130069818 Shirakawa et al. Mar 2013 A1
20130113647 Sentelle et al. May 2013 A1
20130182539 Barkin Jul 2013 A1
20130300573 Brown et al. Nov 2013 A1
20140058256 De Jong Feb 2014 A1
20140247148 Proud Sep 2014 A1
20140266914 Schwartz et al. Sep 2014 A1
20140310764 Tippett Oct 2014 A1
20150058973 Haberman Feb 2015 A1
20150164379 Lin et al. Jun 2015 A1
20150193938 Freedman Jul 2015 A1
20150269825 Tran Sep 2015 A1
20150301167 Sentelle et al. Oct 2015 A1
20160022204 Mostov Jan 2016 A1
20160030006 Okuya et al. Feb 2016 A1
20160054804 Gollakata Feb 2016 A1
20160200276 Diewald Jul 2016 A1
20160311388 Diewald Oct 2016 A1
20170082741 Adib Mar 2017 A1
20170150891 Tsuchimoto et al. Jun 2017 A1
20170262996 Jain et al. Sep 2017 A1
20170300650 Margon Oct 2017 A1
20180054262 Noethlings et al. Feb 2018 A1
20180348343 Achour et al. Dec 2018 A1
Foreign Referenced Citations (21)
Number Date Country
1342968 Apr 2002 CN
101313231 Nov 2008 CN
2492709 Aug 2012 EP
2349759 Nov 2000 GB
H11271430 Oct 1999 JP
2000338231 Dec 2000 JP
2008-282092 Nov 2008 JP
2009541725 Nov 2009 JP
2010263953 Nov 2010 JP
2011033498 Feb 2011 JP
2013068433 Apr 2013 JP
2014039666 Mar 2014 JP
2014517253 Jul 2014 JP
2009047546 Apr 2009 WO
2012116187 Aug 2012 WO
2013034716 Mar 2013 WO
2013177431 Nov 2013 WO
2014172668 Oct 2014 WO
2015006701 Jan 2015 WO
2015017239 Feb 2015 WO
2016176600 Nov 2016 WO
Non-Patent Literature Citations (7)
Entry
M. Agnes et al., “Webster's New World College Dictionary,” fourth edition; Wiley Publishing, Inc.; Cleveland, Ohio, USA; copyright in the year 2007; ISBN 0-02-863119-6; entry for the verb, “provision.” (Year: 2007).
Sungchang Choi, Yonggyu Kim and Joongsoo Ma, “Adaptive localization algorithm for movement of mobile node,” 2012 14th International Conference on Advanced Communication Technology (ICACT), PyeongChang, 2012, pp. 184-188.
Anitori, Laura et al.; “FMCW radar for life-sign detection,” In 2009 IEEE Radar Conference, pp. 1-6. IEEE (2009).
Nguyen, Van et al.; “Harmonic Path (HAPA) algorithm for non-contact vital signs monitoring with IR-UWB radar,” In 2013 IEEE Biomedical Circuits and Systems Conference (BioCAS), pp. 146-149, IEEE (2013).
Wang, Yazhou et al.; “Simultaneous Localization and Respiration Detection of Multiple People Using Low Cost UWB Biometric Pulse Doppler Radar Sensor,” IEEE, 978-1-4673-1088 (2012).
Zhang, Dan et al.; “FMCW Radar for Small Displacement Detection of Vital Signal Using Projection Matrix Method,” International Journal of Antennas and Propagation, vol. 2013, Article ID 571986, 5 pages (2013).
Mingmin Zhao et al. “Through-Wall Human Pose Estimation Using Radio Signals,” 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, IEEE, Jun. 18, 2018 (Jun. 18, 2018), pp. 7356-7365, XP033473655, DOI: 10.1109/CVPR.2018.00768.
Provisional Applications (1)
Number Date Country
62548058 Aug 2017 US