The present disclosure is generally related to targeted advertising, smart televisions, and passive motion detection systems.
Targeted advertising is a form of advertising where online advertisers can use sophisticated methods to target the most receptive portions of an audience based on certain specified traits of interest, which may be identified as indicative of being receptive to the advertised product or service. Through the emergence of new online channels, the need for targeted advertising is increasing because companies aim to minimize wasted advertising by means of information technology. Most targeted new media advertising currently uses second-order proxies for targeting, such as tracking online or mobile web activities of consumers, associating historical web page consumer demographics with new consumer web page access, using a search word as the basis for implied interest, or contextual advertising.
It is desirable to have an integrated means of knowing how many individuals are watching media and advertising content, while also determining if, and to what level, each of those individuals is engaged with the content, without connecting to any devices associated with the viewers.
Systems and methods for selecting advertisements based on party size and/or engagement are provided. An Internet-connected media display device may be detected as displaying media. An up-to-date party size in a monitored space may be determined based on frequency response data collected by a wireless access point located in proximity to the space. An up-to-date engagement level may also be determined for each individual in the space based on the frequency response data collected by the wireless access point. An advertisement database may then be filtered for a qualifying advertisement based on the identified party size and engagement levels.
Disclosed herein are systems, methods, and computer-readable storage media for selecting advertisements on a media display based on party size and/or party engagement levels. In some aspects, an exemplary method can include: detecting that an Internet-connected media display device is displaying media; determining, from a party size calculator module, an up-to-date party size of individuals in a monitored space based on an examination of frequency response data, associated with the monitored space, by a wireless access point located in proximity to the monitored space; determining, from an engagement calculator module, an up-to-date engagement level with respect to each of the individuals in the monitored space based on the examination of the frequency response data by the wireless access point; and filtering an advertisement database for a qualifying advertisement, for a suitable advertisement slot, that matches the up-to-date party size and an average of the up-to-date engagement levels of the individuals in the monitored space. In some examples, if two or more advertisements qualify, the filtering takes into account a flag count of each of the two or more advertisements. The qualifying advertisement may have the highest flag count, as determined by whether a first engagement level changes with respect to a second engagement level of a previous timestamp during playtime of an associated advertisement.
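The filtering step described above can be sketched as a simple selection function. This is an illustrative sketch, not the claimed implementation: the field names (`min_party_size`, `min_engagement`, `flag_count`) and the dictionary-based advertisement records are assumptions, and the flag-count tiebreak follows the rule stated above.

```python
# Illustrative sketch of the qualifying-advertisement filter. Field names and
# record shape are assumptions, not the actual advertisement database schema.

def select_advertisement(ads, party_size, avg_engagement):
    """Return the ad whose requirements match the current party size and
    average engagement level; ties are broken by highest flag count."""
    qualifying = [
        ad for ad in ads
        if ad["min_party_size"] <= party_size
        and ad["min_engagement"] <= avg_engagement
    ]
    if not qualifying:
        return None
    # When two or more advertisements qualify, prefer the highest flag count.
    return max(qualifying, key=lambda ad: ad["flag_count"])

ads = [
    {"name": "ad_a", "min_party_size": 1, "min_engagement": 5, "flag_count": 2},
    {"name": "ad_b", "min_party_size": 1, "min_engagement": 5, "flag_count": 7},
    {"name": "ad_c", "min_party_size": 4, "min_engagement": 8, "flag_count": 9},
]
chosen = select_advertisement(ads, party_size=2, avg_engagement=6.5)
```

Here `ad_c` is excluded by the party-size requirement, and `ad_b` wins the flag-count tiebreak over `ad_a`.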
Methods may further include updating the advertisement database with respect to flag counts: determining that a first party size is different from a second party size of a previous timestamp; decreasing an associated flag count for an associated advertisement when the first party size is less than the second party size; and increasing the associated flag count for the associated advertisement when the first party size is greater than the second party size. Further embodiments may include updating the advertisement database with respect to flag counts by determining that a first engaged party size is different from a second engaged party size of the previous timestamp; decreasing an associated flag count for an associated advertisement when the first engaged party size is less than the second engaged party size; and increasing the associated flag count for the associated advertisement when the first engaged party size is greater than or equal to the second engaged party size.
Embodiments may further include updating the advertisement database with respect to flag counts: determining that a first average engagement level of the first party is different from a second average engagement level of the second party of the previous timestamp; decreasing an associated flag count for an associated advertisement when the first average engagement level is less than the second average engagement level; and increasing the associated flag count for the associated advertisement when the first average engagement level is greater than the second average engagement level.
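The flag-count updates above share one comparison pattern across timestamps; the following is a minimal sketch of that pattern. The helper name and parameters are assumptions; note that the engaged-party-size rule above increments on greater-than-or-equal, while the party-size and average-engagement rules increment only on a strict increase, which the `count_equal_as_increase` flag captures.

```python
# Illustrative sketch of the timestamp-to-timestamp flag-count updates.
# Function and parameter names are assumptions, not the claimed modules.

def update_flag(flag_count, current, previous, count_equal_as_increase=False):
    """Adjust a flag count by comparing a metric across two timestamps."""
    if current > previous or (count_equal_as_increase and current == previous):
        return flag_count + 1
    if current < previous:
        return flag_count - 1
    return flag_count  # unchanged when equal and equality is not rewarded

flag = 0
flag = update_flag(flag, current=3, previous=2)    # party size grew: +1
flag = update_flag(flag, current=2, previous=2,
                   count_equal_as_increase=True)   # engaged size held: +1
flag = update_flag(flag, current=4.0, previous=6.0)  # average engagement fell: -1
```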
In some examples, the first and second average engagement levels may be retrieved from an engagement database including engagement levels of each party member that was monitored at interval timestamps in the monitored space. The frequency response data may be associated with a radio map using Wi-Fi localization to translate Wi-Fi signal strengths into locations. The radio map may further include metadata including frequency data of a channel, phase response data of the channel, and impulse response data of the channel that describe a wireless communication link between paired devices used to compare with a signal scan. In some examples, one or more cloud databases may include the engagement calculator module, party size calculator module, and advertisement database.
In some aspects, an exemplary system can include one or more processors and a non-transitory computer-readable storage medium having stored therein instructions which, when executed by the one or more processors, cause the one or more processors to perform the steps described herein.
The technologies herein can provide advertisement engagement measurements based upon a device-free indoor positioning technology that can locate and measure engagement of individuals in a monitored space based on passively observing changes in the environment. Such changes would be determined based on comparisons against locations of interest for which an ensemble of fingerprints was built during a training phase. During a testing phase, fingerprints generated from new data would be compared with those from the training phase to determine a location of an individual. The differences between the fingerprints generated from the new data and those from the training phase would result in data indicating position and engagement of movement that would reflect where and how much an individual is engaged with respect to watching a media device, such as a smart TV, in the monitored space. The number of individuals and the level of engagement of the individuals in the monitored space are used to determine which advertisements should be played based on certain requirements associated with the advertisements, such as a minimum average engagement score or party size.
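The train-then-match fingerprinting flow above can be sketched as a nearest-neighbor lookup. This is a simplifying assumption for illustration only: real CSI fingerprints carry far richer features than the short amplitude vectors used here, and the Euclidean distance metric is not specified by the disclosure.

```python
# Minimal sketch of fingerprint matching: compare a new scan against
# fingerprints built during a training phase. Features and the distance
# metric are illustrative assumptions.
import math

def nearest_location(training_fingerprints, new_fingerprint):
    """Return the trained location whose fingerprint is closest to the scan."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(training_fingerprints,
               key=lambda loc: distance(training_fingerprints[loc], new_fingerprint))

training = {
    "sofa":    [0.9, 0.2, 0.4],   # per-subcarrier amplitude features (illustrative)
    "kitchen": [0.1, 0.8, 0.7],
    "doorway": [0.5, 0.5, 0.1],
}
print(nearest_location(training, [0.85, 0.25, 0.35]))  # prints "sofa"
```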
The approaches herein can provide systems, methods, and computer-readable storage media for determining party size with respect to the number of individuals in the monitored space, the respective level of engagement of those individuals, and which advertisement to play given the party size and level of engagement, where the party size and level of engagement are determined by passive indoor positioning technology using channel state information (CSI). The disclosure begins with an initial discussion of systems and technologies for determining party size and level of engagement through the passive indoor positioning technology using CSI, as generally exemplified in
Wi-Fi AP 102 may be configured to record channel state information (CSI). In Wi-Fi communications, CSI refers to known channel properties of a radio frequency (RF) communication link that describe how a signal propagates from a transmitter to a receiver and represent the combined effect of various properties such as channel frequency response, channel phase response, and/or channel impulse response. Wi-Fi AP 102 may include a central processing unit (CPU) 104, a graphics processing unit (GPU) 106, a digital signal processor (DSP) 108, an application program interface (API) 110, and a radio 112.
CPU 104 may carry out instructions for the Wi-Fi AP 102 to perform. GPU 106 may be a specialized electronic circuit designed to rapidly manipulate and alter memory to accelerate the creation of images in a frame buffer intended for output to a display device. Digital signal processor (DSP) 108 may be a specialized microprocessor, or a system-in-a-package (SiP) component, with its architecture optimized for the operational needs of digital signal processing. The DSP 108 may be configured to measure, filter, or compress continuous real-world analog signals. API 110 may be a set of routines, protocols, and tools for building software applications, programming any graphical user interface (GUI) components, and specifying how software components interact. The API 110 may provide metadata related to the CSI to an agent 114.
The Wi-Fi AP 102 may further include a radio 112 that may be compliant with IEEE 802.11 standards such as 802.11b or 802.11g, using an omnidirectional antenna, and may have a range of 100 m (0.062 mi). With an external semi-parabolic antenna (15 dB gain) and a similarly equipped receiver at the far end, the radio 112 might have a range of over 20 miles. The Wi-Fi AP 102 may be equipped with a network interface card (NIC) that connects the Wi-Fi AP 102 to a computer network. The radio 112 may be a transceiver, a transmitter, or a receiver.
The agent 114 may collect CSI-related metadata from the Wi-Fi AP 102, filter the CSI-related metadata, and send the filtered CSI-related metadata to one or more cloud databases 120 for activity identification. The activity identification can be accomplished at the edge, at an agent level, in the cloud, or by some combination of the three. The agent 114 may include a local profile database 116 that is utilized when at least a portion of the activity identification is done at the edge. This could be a simple motion versus no-motion determination profile database or a more extensive profile database used for identifying activities, objects, individuals, biometrics, etc.
The agent 114 may also include an activity identification module 118 that distinguishes between activities, such as between walking and in-place activities. In general, a walking activity may cause significant pattern changes in the amplitude over time of the channel impulse response, since it involves significant body movements and location changes. In contrast, an in-place activity (such as watching TV on a sofa) may involve only relatively smaller body movements and may not cause significant pattern changes in the amplitude over time of the channel impulse response. An in-place activity is instead reflected by certain repetitive patterns within the channel impulse response.
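The walking versus in-place distinction above can be sketched as a variance test over a window of channel-response amplitudes: large swings suggest walking, small repetitive changes suggest an in-place activity. The threshold value and the variance feature are illustrative assumptions, not the claimed classifier.

```python
# Sketch of walking vs. in-place activity classification from amplitude
# variation over time. The variance threshold is an illustrative assumption.

def classify_activity(amplitudes, variance_threshold=0.05):
    """Label a window of channel-response amplitudes as walking or in-place."""
    mean = sum(amplitudes) / len(amplitudes)
    variance = sum((a - mean) ** 2 for a in amplitudes) / len(amplitudes)
    return "walking" if variance > variance_threshold else "in-place"

print(classify_activity([0.2, 0.9, 0.1, 0.8, 0.3]))       # large swings -> walking
print(classify_activity([0.50, 0.52, 0.49, 0.51, 0.50]))  # small changes -> in-place
```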
Cloud server 120 may include or otherwise have access to a profile database 122, device database 124, profile module 126, advertisement selection module 128, engagement calculator module 130, party size calculator module 132, engagement database 134, and advertisement database 136. The databases and modules 122-136 associated with cloud server 120 may be used to create, modify, analyze, and store profiles describing various activities by users in a monitored space.
Profile module 126 may monitor the received CSI-related metadata from a continuous monitoring of a monitored space, identify multiple similar patterns of a set of CSI-related metadata that do not have a matching profile in a profile database 122, combine the set of CSI-related metadata with user feedback to label the resulting clusters, and define a new profile that may then be added to the profile database 122. The profiles in the profile database 122 may be profiles characterizing simple motion versus no-motion determinations, as well as more extensive descriptive profiles that may be used to detect and characterize different types of activities, objects, individuals, biometrics, etc. Device database 124 may store device IDs of all connected wireless APs.
The cloud server 120 may further include an advertisement selection module 128, which utilizes engagement scores provided by an engagement calculator module 130 to determine if one or more individuals in the monitored space are engaged with a smart TV 138 displaying media content or advertising content. Advertisement selection module 128 may determine which advertisement content to display in an advertisement slot. When there is a marked drop in engagement related to specific advertising content, similar advertising content may be marked with a negative flag count as an indication of lesser interest in the advertising content. Therefore, the advertisement selection module 128 can learn a pattern of interest to maximize engagement with advertising content and ensure advertisers receive maximum return on investment. An exemplary process of the advertisement selection module 128 is further described in
The engagement calculator module 130 measures an engagement level of the individuals in the monitored space as a function of how much motion is detected. Each individual's movement level may be quantified in various ways, for example, from 0 to 1, with 0 being asleep and 1 being an athletic activity. A viewer sitting awake and watching the TV may have a movement score of 0.1, which may translate to an engagement score of 10. The localization information related to the motion data can, in some examples, be used to help quantify the level of engagement.
For example, motion far from the smart TV 138 may indicate that an individual is not as engaged as an individual closer to the smart TV 138. The individual abandoning the monitored space could be an indication that the individual is no longer engaged. Motion within the monitored space may have a lower movement score than movement toward the exit of the room. Additionally, motion could be viewed cumulatively, so that an individual moving from one seat in the living room to another while watching the smart TV 138 may be viewed as more engaged than an individual moving back and forth more consistently, which indicates they are engaged in another activity.
For example, when an individual's motion level is greater than 0.5, they are considered too active to be engaged in the advertising content or media content displayed on the smart TV 138. The engagement calculator module 130 monitors for changes in engagement level, and when those changes coincide with advertising content, positive or negative flags are applied to that advertisement content or advertisement type in an advertising database 136 of the one or more cloud databases 120, for the purposes of targeting viewers with advertisements that have the highest positive flag count. An exemplary process of the engagement calculator module 130 is described in
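The movement-to-engagement mapping described above can be sketched as follows, assuming the 0-to-1 movement scale, the 0.5 "too active" cutoff, and the example mapping of a 0.1 movement score to an engagement score of 10. The exact scaling function is an assumption; the disclosure only gives the endpoints and the example point.

```python
# Sketch of engagement scoring from a movement level. The reciprocal scaling
# and the treatment of out-of-range values are illustrative assumptions.

def engagement_score(movement_level, active_threshold=0.5):
    """Map a 0-1 movement level to a 0-10 engagement score; 0 if too active."""
    if movement_level <= 0 or movement_level > active_threshold:
        return 0.0  # asleep, or too active to be engaged with the display
    return min(10.0, 1.0 / movement_level)

print(engagement_score(0.1))  # seated viewer watching TV
print(engagement_score(0.8))  # too active to be engaged
```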
For example, a room with two individuals but only one engaged individual would be shown advertisements for a single audience and not those for a small collection of people, or groups of four or more. A second filter may be for a minimum average engagement level of viewers that the advertiser paid for. A third filter, applied when there are still multiple advertisements that meet the first two filters, is the positive flag count. Each time engagement or party size decreases, the engagement calculator module 130 and party size calculator module 132 may apply a negative flag to the advertisement content or advertisement type.
The one or more cloud databases 120 may further include a party size calculator module 132 that quantifies the number of individuals in the room and applies positive and negative flags to an advertisement content or advertisement type in the advertising database 136 based on increases or decreases in the number of individuals in the room while the advertisement content is being displayed. The one or more cloud databases 120 may further include an engagement database 134 that stores, with every interval timestamp, an associated party size and engagement level in the room being monitored, as calculated by the party size calculator module 132 and the engagement calculator module 130. The engagement database 134 is further described below along with an exemplary engagement database table shown in
The advertisement database 136 contains video files, links to video files, or file names associated with the video files of each advertisement to be delivered to the smart TV 138; the relevant locations, such as kitchen; the rating, corresponding to the amount paid; and the user engagement level, also based on the amount paid. The smart TV 138 is delivered the chosen advertising content to be displayed based on instructions from the one or more cloud databases 120. The advertisement database 136 is further described below along with an exemplary advertisement database table shown in
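One possible shape for an advertisement database record with the fields just listed is sketched below. The class and field names are illustrative assumptions; the disclosure does not specify a schema.

```python
# An illustrative record shape for the advertisement database; the exact
# schema and field names are assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class AdvertisementRecord:
    video_file: str        # video file, link, or associated file name
    locations: List[str]   # relevant locations, e.g. ["kitchen"]
    rating: int            # corresponds to the amount paid
    min_engagement: float  # required user engagement level, also tied to payment
    flag_count: int = 0    # running positive/negative viewer-response tally

record = AdvertisementRecord(
    video_file="cereal_spot.mp4", locations=["kitchen"], rating=3,
    min_engagement=5.0)
```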
If the engagement level remains the same (indicating that the advertising content did not cause an engaged viewer to disengage) or increases, or the party size increases, the engagement calculator module 130 and party size calculator module 132 each apply a positive flag with respect to the respective advertising content. The total of the flags represents how positively or negatively the viewers have responded to the advertisement content or advertisement types. Once all filters have been applied to the engagement database 134, the highest rated remaining advertisement content is played at step 208. The advertisement content may be displayed at step 209. It may be determined whether the media consumption on the smart TV 138 has concluded in step 210, indicating the user is no longer watching. If the media consumption has concluded, the program ends in step 212; if not, the method may loop back to calculating the party size at step 222.
The engagement score for each individual in the room may be calculated in step 306. The engagement score may be 1 over the movement level, which may result in a scale for the engagement level of each individual ranging from 0 to 10, for example, with 0 representing a viewer who is asleep, 1 representing a user involved in an athletic activity, and 10 representing a user who is seated and watching the smart TV 138. It may be determined whether any of the individuals observed are above a movement level threshold used to determine that a viewer is not engaged with the media or advertisement content in step 308.
In an example, the party size calculator module 132 determined there were two individuals in the room with the smart TV 138. Individual one may have an engagement score of 10 and individual two may have an engagement score of 4, and an engagement score threshold of 5 may be chosen as the line below which a viewer is considered too active to be engaged in the media or advertisement content. The total number of individuals who are below the movement level threshold may be written to the engaged party size field of the engagement database 134 at step 310. It may be determined whether the content being displayed on the smart TV 138 is advertisement content or media content at step 312. If it is media content, the module jumps to step 326, as steps 314-324 are for the rating of advertisement content based on the change.
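The worked example above (scores of 10 and 4 against a threshold of 5) reduces to counting individuals at or above the threshold. The helper name is an assumption used for illustration.

```python
# Sketch of deriving the engaged party size from per-individual engagement
# scores and a threshold. The helper name is an illustrative assumption.

def engaged_party_size(scores, engagement_threshold=5):
    """Count individuals whose engagement score meets the threshold."""
    return sum(1 for s in scores if s >= engagement_threshold)

print(engaged_party_size([10, 4]))  # prints 1: only individual one is engaged
```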
It may be determined whether the engaged party size is greater than, less than, or equal to the engaged party size at the previous timestamp at step 314. If the engaged party size is equal to that at the previous timestamp, no action is taken, and the advertisement selection module 128 moves to step 320. If the engaged party size is greater than that at the previous timestamp, the flag column corresponding to the advertising content being played is increased by one at step 316. This provides feedback to advertisers that the party sizes viewing their advertisements increased during the viewing period. If the engaged party size is less than that at the previous timestamp, the flag column corresponding to the advertisement being played is decreased by 1 at step 318. A viewer leaving the room may be interpreted as negative feedback on the advertisement being displayed.
The engagement level for each individual may be compared to the engagement level for the same individual at the previous timestamp at step 320. For each individual whose engagement level is greater than or equal to their engagement level at the previous timestamp, the flag count for the advertisement being displayed is increased by 1. An increase in engagement is clearly positive feedback about the advertisement or advertisement type and would be used to inform targeted advertising choices. In some examples, an engaged viewer showing no decrease in their engagement level when shown advertisement content, as opposed to media content they chose to watch, is interpreted as positive feedback on the advertisement content or advertisement type at step 322. For each individual whose engagement level is less than their engagement level at the previous timestamp, the flag count may be decreased for the advertisement content being displayed by 1 at step 324. This balances positive and negative feedback on advertisement types, and the combination of all of these flags gives an overall positive or negative ranking for an advertisement content or advertisement type. The engaged party size may then be sent to the advertisement selection module 128 at step 326.
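The per-individual comparison in steps 320-324 can be sketched as a single pass over current and previous engagement levels. The function name and list-based inputs are illustrative assumptions; the rule itself (no decrease counts as +1, a decrease as -1) follows the steps above.

```python
# Sketch of the per-individual flag updates in steps 320-324. Names and
# input shapes are illustrative assumptions.

def update_flag_from_individuals(flag_count, current_levels, previous_levels):
    """Adjust an advertisement's flag count from per-individual changes."""
    for now, before in zip(current_levels, previous_levels):
        if now >= before:
            flag_count += 1  # engagement held or rose: positive feedback
        else:
            flag_count -= 1  # engagement fell: negative feedback
    return flag_count

print(update_flag_from_individuals(0, current_levels=[10, 3],
                                   previous_levels=[8, 6]))  # prints 0
```

One viewer's rise and the other's fall cancel, leaving the flag count unchanged.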
For clarity of explanation, in some instances the present technology may be presented as including individual functional blocks comprising devices, device components, steps or routines in a method embodied in software, or combinations of hardware and software. In some embodiments the computer-readable storage devices, mediums, and memories can include a cable or wireless signal containing a bit stream and the like. However, when mentioned, non-transitory computer-readable storage media expressly exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.
Methods according to the above-described examples can be implemented using computer-executable instructions that are stored or otherwise available from computer readable media. Such instructions can include, for example, instructions and data which cause or otherwise configure a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. Portions of computer resources used can be accessible over a network. The computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, firmware, or source code. Examples of computer-readable media that may be used to store instructions, information used, and/or information created during methods according to described examples include magnetic or optical disks, flash memory, USB devices provided with non-volatile memory, networked storage devices, and so on.
Devices implementing methods according to these disclosures can include hardware, firmware and/or software, and can take any of a variety of form factors. Typical examples of such form factors include laptops, smart phones, small form factor personal computers, personal digital assistants, rackmount devices, standalone devices, and so on. Functionality described herein also can be embodied in peripherals or add-in cards. Such functionality can also be implemented on a circuit board among different chips or different processes executing in a single device, by way of further example. The instructions, media for conveying such instructions, computing resources for executing them, and other structures for supporting such computing resources are means for providing the functions described in these disclosures.
Although a variety of examples and other information was used to explain aspects within the scope of the appended claims, no limitation of the claims should be implied based on particular features or arrangements in such examples, as one of ordinary skill would be able to use these examples to derive a wide variety of implementations. Further and although some subject matter may have been described in language specific to examples of structural features and/or method steps, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to these described features or acts. For example, such functionality can be distributed differently or performed in components other than those identified herein. Rather, the described features and steps are disclosed as examples of components of systems and methods within the scope of the appended claims.
The present application claims the priority benefit of U.S. Provisional Patent Application No. 62/809,011, filed on Feb. 22, 2019 and entitled “Knowledge-Based Feature Fusion for Robust Device-Free Localization and Activity Monitoring” and U.S. Provisional Patent Application No. 62/809,336, filed on Feb. 22, 2019 and entitled “Advertisement Engagement Measurement,” the contents of which are incorporated herein by reference in their entirety.
Number | Date | Country | |
---|---|---|---|
20200302478 A1 | Sep 2020 | US |
Number | Date | Country | |
---|---|---|---|
62809011 | Feb 2019 | US | |
62809336 | Feb 2019 | US |