Interactive systems and methods

Information

  • Patent Grant
  • Patent Number
    10,614,271
  • Date Filed
    Monday, January 29, 2018
  • Date Issued
    Tuesday, April 7, 2020
Abstract
In one embodiment, a system includes a radio-frequency identification (RFID) reader configured to read data stored on an RFID tag associated with a user and to generate a first signal indicative of the data and indicative of a location of the RFID tag, a sensor system configured to detect a user interaction with an interactive element and to generate a second signal indicative of the user interaction, and a processor configured to match the user to the user interaction based on the first signal and the second signal and to update a user database to reflect that the user is matched to the user interaction.
Description
FIELD OF DISCLOSURE

The present disclosure relates generally to amusement parks. More specifically, embodiments of the present disclosure relate to interactive systems and methods for use in amusement parks.


BACKGROUND

Amusement parks and/or theme parks may include various entertainment attractions. Some existing attractions may provide guests with an immersive or interactive experience. For example, guests may visit areas having various features, such as audio, video, and special effects features. With the increasing sophistication and complexity of modern attractions, and the corresponding increase in expectations among amusement park and/or theme park guests, improved and more creative attractions are needed, including attractions that provide a more interactive and personalized experience.


SUMMARY

In one embodiment, a system includes a radio-frequency identification (RFID) reader configured to read data stored on an RFID tag associated with a user and to generate a first signal indicative of the data and indicative of a location of the RFID tag, a sensor system configured to detect a user interaction with an interactive element and to generate a second signal indicative of the user interaction, and a processor configured to match the user to the user interaction based on the first signal and the second signal and to update a user database to reflect that the user is matched to the user interaction.


In one embodiment, a system includes a radio-frequency identification (RFID) reader configured to read data stored on RFID tags associated with respective users, a sensor system configured to detect a first user interaction at a first portion of a display and a second user interaction at a second portion of the display, and a processor configured to receive RFID signals indicative of the data from the RFID reader and to receive sensor signals indicative of the first user interaction and the second user interaction from the sensor system. The processor is configured to render a first image for display at the first portion of the display and to render a second image for display at the second portion of the display based on the RFID signals and the sensor signals.


In one embodiment, a method includes receiving, at a processor, a first signal from an RFID reader and receiving, at the processor, a second signal from a sensor system. The first signal is indicative of data stored on an RFID tag associated with a user and read by the RFID reader and indicative of a location of the RFID tag, and the second signal is indicative of a user interaction with an interactive element. The method also includes matching, using the processor, the user to the user interaction based on the first signal and the second signal, and updating, using the processor, a user database to reflect that the user is matched to the user interaction.





BRIEF DESCRIPTION OF DRAWINGS

These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:



FIG. 1 is a schematic diagram of an interactive system, in accordance with an aspect of the present disclosure;



FIG. 2 is a schematic diagram of a radio-frequency identification (RFID) system that may be used in the interactive system of FIG. 1, in accordance with an aspect of the present disclosure;



FIG. 3 is a schematic diagram of multiple users interacting with the interactive system of FIG. 1, in accordance with an aspect of the present disclosure;



FIG. 4 is a schematic diagram of a sensor system that may be used in the interactive system of FIG. 1, in accordance with an aspect of the present disclosure;



FIG. 5 is a schematic diagram of a sensor system that may be used with the interactive system of FIG. 1 to detect a depth of a user's interaction, in accordance with an aspect of the present disclosure; and



FIG. 6 is a flow diagram of a method of operating the interactive system of FIG. 1, in accordance with an aspect of the present disclosure.





DETAILED DESCRIPTION

One or more specific embodiments of the present disclosure will be described below. In an effort to provide a concise description of these embodiments, all features of an actual implementation may not be described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.


Amusement parks feature a wide variety of entertainment, such as amusement park rides, performance shows, and games. The different types of entertainment may include features that enhance a guest's experience at the amusement park. However, many of the forms of entertainment do not vary based upon a guest's previous experiences or actions. For example, a game may include the same rules, elements, and gameplay for each guest. Some guests may prefer a more interactive form of entertainment that is different for each guest and/or different during each interaction. As such, there may be a need to create an interactive system that detects a guest's interaction with an interactive element and/or updates the interactive element based at least in part on the guest's interaction to provide a unique experience.


The present disclosure relates to an interactive system that uses radio-frequency identification (RFID). Users of the interactive system may be guests of an amusement park. A user may wear or carry a device that supports an RFID tag. The interactive system may read the RFID tag and detect the user's interaction with an interactive element. In one embodiment, the interactive system may display different elements depending on the user's interactions with the system. Each user may have had different, respective interactions with the system, and thus, the system may display different interactive elements from user to user. Further, the interactive system may activate a different feature based on an achievement or performance of the user. In this manner, each user's experience with the interactive system may be different and tailored to more suitably fit the corresponding user.


Turning to the drawings, FIG. 1 is a schematic view of an embodiment of an interactive system 8. In one embodiment, the interactive system 8 may be located in a room or area of an attraction. As illustrated in FIG. 1, the interactive system 8 includes RFID readers 10, a sensor system 12, and a computing system 14 (e.g., a cloud-based computing system). Each RFID reader 10 may read an RFID tag 50 supported by a wearable device 16 (e.g., wearable or portable device, such as a bracelet, necklace, charm, pin, or toy), which may be worn or carried by a user 18. In one embodiment, the RFID reader 10 may be a transceiver that may be capable of sending information to the RFID tag 50 and/or to other devices. For example, the RFID reader 10 may send a signal indicative of information received from the RFID tag 50 to the computing system 14, which determines an approximate location of the user 18 based on the signal. In operation, the user 18 may interact with an interactive element 20 (e.g., an object or image) and the interaction may be detected by the sensor system 12. In one embodiment, the interactive element 20 may be an image (e.g., visual or graphical element) displayed on a display 21 (e.g., screen or wall). When the user 18 interacts with the interactive element 20, the sensor system 12 may detect the action performed and send a signal indicative of the action to the computing system 14. In response to the received signals, the computing system 14 may update information regarding the user 18 (e.g., assign points, update a level, mark tasks as complete) and/or change the interactive element 20. The computing system 14 may include one or more databases 22, a memory 24 that contains instructions regarding updating the interactive elements 20 and the one or more databases 22, and one or more processors 26 that execute the instructions. The memory 24 includes a non-transitory, computer-readable medium that may store the instructions.



FIG. 2 illustrates a schematic of an RFID system 28 that may be used in the interactive system 8. The RFID system 28 contains the RFID reader 10 and the RFID tag 50. The RFID reader 10 may obtain information by reading the RFID tag 50 of the wearable device 16 associated with the user 18. The RFID tag 50 may include a microchip 52, an integrated circuit 54 to power the microchip 52, a memory 56 storing information, and an antenna 58 that may transmit and receive signals. The RFID reader 10 continuously sends out signals in the form of electromagnetic waves. In one embodiment, the RFID reader 10 and the RFID tag 50 use ultra-high frequency (UHF) waves, which may range from about 300 MHz to 3 GHz. It should be appreciated that the RFID reader 10 may emit waves having any suitable frequency. Furthermore, in some embodiments, the RFID tag 50 may additionally or alternatively use near-field communication (NFC) (e.g., the RFID tag 50 may be a dual-frequency RFID tag) and may be read by an NFC reader. When the wearable device 16 is within a certain distance of the RFID reader 10 (e.g., about 10 or 20 feet), the antenna 58 of the RFID tag 50 captures the emitted electromagnetic waves as energy. The integrated circuit 54 uses the energy to provide power to the microchip 52, which generates backscatter. The backscatter is a signal containing information stored in the memory 56 of the RFID tag 50. The RFID tag 50 transmits the backscatter to the RFID reader 10 via the antenna 58, and the RFID reader 10 interprets the backscatter to obtain the information. The RFID reader 10 may send a signal to the computing system 14 indicative of the backscatter obtained from the RFID tag 50. The computing system 14 may use the information to change the interactive elements 20 of the interactive system 8. Furthermore, the RFID reader 10 may be able to detect the strength of the backscatter sent by the RFID tag 50, and as a result, the computing system 14 may determine the approximate location of the RFID tag 50 and thus, the user 18. In one embodiment, multiple RFID readers 10 may be used in the interactive system 8 to determine (e.g., via triangulation) the location of the user 18 more efficiently and/or accurately based on the respective strength of the received backscatters.
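
By way of illustration only, the short Python sketch below shows one possible way that backscatter strength readings from several RFID readers 10 could be converted into distance estimates and combined into an approximate tag location. The reader coordinates, path-loss constants, and function names are assumptions made for the example, not details taken from the disclosure.

```python
# Illustrative reader positions in meters (assumed layout, not from the disclosure).
READERS = {
    "reader_a": (0.0, 0.0),
    "reader_b": (6.0, 0.0),
    "reader_c": (3.0, 5.0),
}

def backscatter_to_distance(rssi_dbm, reference_power_dbm=-30.0, path_loss_exp=2.0):
    """Estimate reader-to-tag distance from backscatter strength.

    Uses a simple log-distance path-loss model; the reference power and
    exponent are assumed values chosen only for illustration.
    """
    return 10 ** ((reference_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def estimate_tag_location(rssi_by_reader):
    """Approximate the tag position as a weighted centroid of reader positions,
    weighting each reader by the inverse of its estimated distance to the tag."""
    weights, xs, ys = [], [], []
    for reader_id, rssi in rssi_by_reader.items():
        x, y = READERS[reader_id]
        distance = backscatter_to_distance(rssi)
        weight = 1.0 / max(distance, 0.1)   # avoid division by zero for very close tags
        weights.append(weight)
        xs.append(weight * x)
        ys.append(weight * y)
    total = sum(weights)
    return (sum(xs) / total, sum(ys) / total)

# Example: strongest backscatter at reader_a places the estimate nearest reader_a.
print(estimate_tag_location({"reader_a": -45.0, "reader_b": -60.0, "reader_c": -63.0}))
```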


The wearable device 16 may be compatible with multiple RFID readers 10 disposed in multiple interactive systems 8 and may be reused for several visits to the park. In operation, when the RFID reader 10 receives the signal from the RFID tag 50, the RFID reader 10 may identify the wearable device 16 and the computing system 14 may access corresponding information from the one or more databases 22. The corresponding information in the one or more databases 22 may include the user's identity, preferences, level in the game, completed tasks or actions, team affiliation, or any of a variety of other information related to the user 18, the wearable device 16, and/or past interactions with one or more interactive systems 8. In this manner, the user 18 may build upon prior performances and experiences. The one or more databases 22 may also include stored media (e.g., in a model or media database) that may be rendered, such as on the display 21. In one embodiment, data may be transferred between the memory 56 and a remote computing system (e.g., a gaming console). For example, data indicative of achievements on the remote computing system may be written to and stored in the memory 56. The RFID reader 10 may then read the data in the memory 56, and the computing system 14 may adjust the gameplay based on the data (e.g., generating new interactive elements 20 on the display 21).
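
As a rough illustration of the kind of per-tag record the one or more databases 22 might hold, the following sketch defines a hypothetical user record and a helper that credits a completed interaction. All field names and the leveling rule are assumptions chosen for the example.

```python
from dataclasses import dataclass, field

@dataclass
class UserRecord:
    """Hypothetical per-RFID-tag record, as might be stored in the databases 22."""
    tag_id: str
    name: str = ""
    team: str = ""
    level: int = 1
    points: int = 0
    completed_interactions: list = field(default_factory=list)

def record_interaction(record: UserRecord, interaction_id: str, points: int) -> None:
    """Mark an interaction complete and credit points to the associated user."""
    record.completed_interactions.append(interaction_id)
    record.points += points
    # Illustrative leveling rule: advance one level per 100 points.
    record.level = 1 + record.points // 100

user = UserRecord(tag_id="TAG-0001", name="Guest", team="blue")
record_interaction(user, "touch_target_corner", points=25)
print(user)
```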


Turning back to FIG. 1, the sensor system 12 and the RFID system 28 (FIG. 2) may work in conjunction to determine both an occurrence of an interaction and who caused the interaction. For example, the user 18 may interact with the interactive element 20 (e.g., touch the interactive element 20, which may be an image on the display 21). The sensor system 12 may detect that the interaction with the interactive element 20 has occurred and send a signal to the computing system 14. The signal generated by the sensor system 12 may indicate or provide information related to the occurrence of the interaction and/or the location of the interaction relative to the sensor system 12 and relative to the interactive element 20. However, the sensor system 12 alone may not detect the identity of the user 18 or determine which user 18 interacted with the interactive element 20. As noted above, the RFID system 28 may determine the identification of the user 18 based on received backscatter from the RFID tag 50 of the user 18. The RFID system 28 may also determine an approximate location of the user 18 based on the strength of the received backscatter. The RFID system 28 may send information related to the identification and the approximate location to the computing system 14. The computing system 14 can then use information regarding the occurrence of the interaction, the location of the interaction, the identification of the user 18, and the location of the user 18 to determine which user 18 interacted with the interactive element 20. As a result of determining that the user 18 interacted with the interactive element 20, the computing system 14 may update information regarding the user 18 (e.g., assign points) to reflect the interaction. Thus, together the sensor system 12 and the RFID system 28 provide a system that efficiently tracks the interactions of each user 18 within an attraction.


In one embodiment, the processor 26 of the computing system 14 may determine the identification of the RFID tag 50 supported in the wearable device 16 worn or carried by the user 18. In one embodiment, the databases 22 may contain corresponding information for the RFID tag 50, which may be associated with the user 18. For example, when the user 18 obtains the wearable device 16, the user 18 may register the wearable device 16 to associate the wearable device 16 with the user 18 and/or to input (e.g., via a user's computing device, such as a mobile phone, that is communicatively coupled to the computing system 14) any user information and/or preferences, such as the user's name, age, physical attributes and/or limits, preferred difficulty level, preferred characters, preferred types of games, preferred team affiliation, preferred theme, or the like, which may be stored as the corresponding information for the RFID tag 50 within the databases 22. As the user 18 interacts with the interactive system 8, the corresponding information for the RFID tag 50 stored within the databases 22 may be updated to include completed interactions, achievements, level in the game, or the like.


It should be appreciated that the databases 22 may contain corresponding information for multiple RFID tags 50, which may each be supported in a respective wearable device 16 that is worn or carried by a respective user 18. In one embodiment, upon determining the identification of the RFID tag 50, the processor 26 may access the corresponding information (e.g., corresponding to or associated with the RFID tag 50 of the wearable device 16) in the database 22. As noted above, the databases 22 may be in communication with multiple computing systems 14, which may be coupled to separate interactive systems 8.


As noted above, the signal received at the computing system 14 from the RFID reader 10 may provide information regarding the location of the user 18. For example, the signal from the RFID reader 10 may indicate detection of the backscatter from the RFID tag 50 and/or the strength of the backscatter received from the RFID tag 50, which in turn may indicate the approximate location of the wearable device 16, and thus, the corresponding user 18. For example, the RFID reader 10 may only read RFID tags 50 that are within a certain range or distance of the RFID reader 10. Accordingly, detection of the backscatter by the RFID reader 10 indicates that the RFID tag 50 of the wearable device 16 is within the certain range or distance of the RFID reader 10.


In operation, the computing system 14 may also receive a signal from the sensor system 12. The signal may indicate the occurrence of an interaction with the interactive element 20. In the illustrated embodiment, the interactive element 20 is an image presented on the display 21. In one embodiment, the processor 26 may process the signal from the sensor system 12 to detect the interaction with the interactive element 20. For example, in some embodiments, the processor 26 may process the signal from the sensor system 12 to determine a location of the interaction relative to the interactive element 20 to determine if an interaction with the interactive element 20 occurred. For example, the interactive element 20 may be an image located in a corner of the display 21. When the sensor system 12 senses an action (e.g., a touch, swipe, movement) performed in the corner of the display 21 at or proximate to the location of the interactive element 20, the processor 26 may determine that an interaction with the interactive element 20 has occurred. Therefore, the processor 26 may process the signal received from the sensor system 12 to determine the occurrence of an interaction and/or the location of the interaction.
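
The location comparison described above amounts to a hit test. The following illustrative sketch (with assumed names, pixel coordinates, and tolerance) attributes a sensed action to an interactive element when the sensed point falls at or near the element's region on the display.

```python
from dataclasses import dataclass

@dataclass
class InteractiveElement:
    """An image region on the display, described by its bounding box in pixels."""
    element_id: str
    x: float
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float, tolerance: float = 10.0) -> bool:
        """Return True if a sensed point is at or proximate to this element."""
        return (self.x - tolerance <= px <= self.x + self.width + tolerance
                and self.y - tolerance <= py <= self.y + self.height + tolerance)

def find_touched_element(elements, px, py):
    """Return the first interactive element whose region contains the sensed point."""
    for element in elements:
        if element.contains(px, py):
            return element
    return None

corner_target = InteractiveElement("corner_target", x=0, y=0, width=120, height=120)
print(find_touched_element([corner_target], px=35, py=80))    # hit in the corner region
print(find_touched_element([corner_target], px=640, py=480))  # None: no interaction
```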


In one embodiment, the processor 26 may also receive and process the signal from the RFID reader 10 to determine the identity of the RFID tag 50 associated with the user 18 who performed the interaction with the interactive element 20. It should be appreciated that the signal from the RFID reader 10 may indicate the strength of the backscatter and/or multiple RFID readers 10 may be utilized to facilitate determination of the location of the RFID tag 50 associated with the user 18. Determining the location of the RFID tag 50 may facilitate matching the user 18 to the interaction with the interactive element 20, particularly where multiple users 18, each having a respective wearable device 16 with a respective RFID tag 50, are proximate to one another or within the range of the RFID reader 10.


In response to the determination that the user 18 completed the interaction with the interactive element 20, the processor 26 may send a signal to update one or more interactive elements 20. For example, in response to the processor 26 determining that the user 18 touched the interactive element 20, the processor 26 may transmit a signal that causes the interactive element 20 to disappear and be replaced by a new interactive element 20 (e.g., the processor 26 may render an image based on relevant information from one or more databases 22 for display on the display 21). In one embodiment, the computing system 14 may render and/or instruct display of the new interactive element 20 based on the identity of the RFID tag 50 and/or the corresponding information accessed from the databases 22. For example, the computing system 14 may access the corresponding information in the databases 22 that indicates that the RFID tag 50 is affiliated with a particular team, and then the computing system 14 may render and/or instruct display of an animated character that is associated with the particular team. Additionally or alternatively, the computing system 14 may render and/or play audio. Thus, the computing system 14 may render different types of media (e.g., audio, images) in response to detection of an interaction with the interactive element 20. Moreover, in one embodiment, the processor 26 may update the databases 22 to reflect the interaction with the interactive element 20. For example, in response to the processor 26 determining that the user 18 interacted with the interactive element 20, the processor 26 may transmit a signal to the databases 22 to update the corresponding information (e.g., points earned by the user 18 from interacting with interactive element 20, level in the game, interactions completed) for the RFID tag 50 supported in the wearable device 16 that is worn or carried by the user 18. In this manner, the processor 26 may process the combination of the signals sent by the sensor system 12 and the RFID reader 10 to update the interactive elements 20 and/or the databases 22.
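
One simple way to picture the user-specific update is a lookup keyed on the corresponding information stored for the RFID tag, such as team affiliation. The sketch below is a minimal example under assumed team names, asset identifiers, and point values; it is not the rendering pipeline described above.

```python
# Illustrative mapping from team affiliation to a replacement image asset.
# The team names and asset identifiers are assumptions made for the example.
TEAM_CHARACTERS = {
    "blue": "blue_team_character.png",
    "red": "red_team_character.png",
}
DEFAULT_CHARACTER = "neutral_character.png"

def next_element_for_user(user_record: dict) -> str:
    """Choose the replacement interactive element based on the user's stored team."""
    return TEAM_CHARACTERS.get(user_record.get("team", ""), DEFAULT_CHARACTER)

def handle_confirmed_interaction(database: dict, tag_id: str, element_id: str, points: int = 10) -> str:
    """Update the user's record for the completed interaction and return the asset to render next."""
    record = database[tag_id]
    record["points"] = record.get("points", 0) + points
    record.setdefault("completed", []).append(element_id)
    return next_element_for_user(record)

db = {"TAG-0001": {"team": "blue"}}
print(handle_confirmed_interaction(db, "TAG-0001", "corner_target"))  # blue_team_character.png
print(db)
```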


In one embodiment, the RFID system 28 may update the information of multiple users 18 simultaneously. For example, the RFID reader 10 may determine that multiple users 18 are located within a zone (e.g., a predetermined zone) proximate to the interactive element 20. Then, when one of the users 18 interacts with the interactive element 20, the computing system 14 may update information of all of the users 18 that are located within the zone proximate to the interactive element 20.
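
A minimal sketch of this zone-based update might look as follows, assuming estimated tag locations and a circular zone around the interactive element; the zone shape, radius, and point value are illustrative assumptions.

```python
def users_in_zone(tag_locations, zone_center, zone_radius):
    """Return the tag IDs whose estimated locations fall inside a circular zone."""
    cx, cy = zone_center
    return [tag for tag, (x, y) in tag_locations.items()
            if (x - cx) ** 2 + (y - cy) ** 2 <= zone_radius ** 2]

def credit_zone(database, tag_locations, zone_center, zone_radius, points):
    """Credit every user currently in the zone when any one of them interacts."""
    for tag in users_in_zone(tag_locations, zone_center, zone_radius):
        record = database.setdefault(tag, {})
        record["points"] = record.get("points", 0) + points

db = {}
locations = {"TAG-A": (1.0, 1.0), "TAG-B": (1.5, 0.5), "TAG-C": (9.0, 9.0)}
credit_zone(db, locations, zone_center=(1.0, 1.0), zone_radius=2.0, points=5)
print(db)  # TAG-A and TAG-B are credited; TAG-C is outside the zone
```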


When the user 18 reaches a certain achievement (e.g., accumulating enough points from interactions or completing certain interactions), the computing system 14 may process the achievement to enable the user 18 to obtain special items or experiences. In one embodiment, the interactive system 8 may include an isolated room or area that is only accessible to the user 18 after the user 18 has reached a certain achievement. For example, a door of the isolated room or area may be communicatively coupled with the computing system 14, and the computing system 14 may provide a signal to open the door only after the user 18 has reached a certain achievement. In one embodiment, the computing system 14 may provide coupons (e.g., via a printing device or via a mobile device of the user 18) for exclusive merchandise as a result of reaching a certain achievement. It should be appreciated that the computing system 14 may receive and analyze data from the database 22 to determine that a certain achievement has been reached and to perform such actions.
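
The achievement processing could be pictured as a threshold check over the user's stored record, as in the hedged sketch below; the point thresholds and action names are assumptions made for illustration.

```python
# Illustrative achievement thresholds; the specific values are assumptions.
COUPON_POINTS = 250
DOOR_UNLOCK_POINTS = 500

def achievement_actions(record: dict) -> list:
    """Return the actions the computing system might trigger once the user's
    stored point total crosses certain achievement thresholds."""
    actions = []
    if record.get("points", 0) >= COUPON_POINTS:
        actions.append("issue_merchandise_coupon")
    if record.get("points", 0) >= DOOR_UNLOCK_POINTS:
        actions.append("unlock_isolated_room_door")
    return actions

print(achievement_actions({"points": 300}))  # ['issue_merchandise_coupon']
print(achievement_actions({"points": 650}))  # both actions
```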



FIG. 3 illustrates a schematic of an embodiment of the interactive system 8 shown in FIG. 1. FIG. 3 shows the RFID reader 10, the sensor system 12, and the computing system 14. Furthermore, FIG. 3 shows multiple users 18A, 18B, and 18C each with respective wearable devices 16A, 16B, and 16C that support respective RFID tags 50A, 50B, 50C. In FIG. 3, the users 18 are located within the interactive system 8 such that the wearable device 16A is most proximate to the RFID reader 10, the wearable device 16C is most distant from the RFID reader 10, and the wearable device 16B is at an intermediate distance. As discussed above, the respective RFID tags 50 located on the respective wearable devices 16 send respective backscatter to the RFID reader 10, and the detection of and/or the strength of the backscatter is based on the location of the RFID tags 50 with respect to the RFID reader 10. For example, in FIG. 3, the RFID reader 10 will pick up stronger backscatter from the RFID tag 50A as compared to the RFID tag 50B. In turn, the RFID reader 10 will transmit a signal that contains information indicative of the strength of the respective backscatters to the computing system 14. As a result, the computing system 14 will determine that the user 18A is located more proximate to the RFID reader 10 than the user 18B. Further, the RFID tag 50C may be outside of the receiving range of the RFID reader 10, and thus, the RFID reader 10 may not detect the user 18C or provide an indication of the presence of the user 18C to the computing system 14. Other RFID readers 10 may be used in conjunction to more specifically identify location information, such as via triangulation.


The sensor system 12 may detect and generate a signal indicative of each interaction with the interactive elements 20. For example, with reference to FIG. 3, when the user 18A interacts with a first interactive element 20A, the sensor system 12 detects and generates a signal indicative of the interaction with the first interactive element 20A. Likewise, when the user 18B interacts with a second interactive element 20B, the sensor system 12 detects and generates a signal indicative of the interaction with the second interactive element 20B. In one embodiment, the first and second interactive elements 20A, 20B may be located on a touchscreen communicatively coupled to the sensor system 12. The users 18A, 18B may touch the respective first and second interactive elements 20A, 20B, and the touchscreen may detect the interactions. As a result, the sensor system 12 sends signals to the computing system 14 indicative of the respective interactions. The computing system 14 receives and processes the signals indicative of the respective interactions with the first and second interactive elements 20A, 20B from the sensor system 12, as well as the signals indicative of the locations of the users 18A, 18B from the RFID reader 10 to match (e.g., assign) the interaction with the first interactive element 20A to the user 18A and the interaction with the second interactive element 20B to the user 18B. For example, the computing system 14 may determine that the user 18A was located most proximate to the first interactive element 20A at the time of the interaction with the first interactive element 20A based on the signals, and thus, the computing system 14 matches the user 18A to the interaction with the first interactive element 20A. In one embodiment, the computing system 14 may update the corresponding information for the users 18A, 18B in the database 22 based on the received signals and the matched interactions.
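
The matching step in this two-user example can be sketched as a nearest-tag assignment: each sensed interaction is attributed to the user whose estimated tag location was closest at the time of the interaction. The coordinates and identifiers below are illustrative assumptions rather than values from the disclosure.

```python
import math

def nearest_tag(interaction_xy, tag_locations):
    """Return the tag ID whose estimated location is closest to the interaction point."""
    ix, iy = interaction_xy
    return min(tag_locations,
               key=lambda tag: math.hypot(tag_locations[tag][0] - ix,
                                          tag_locations[tag][1] - iy))

def match_interactions(interactions, tag_locations):
    """Assign each sensed interaction to the user whose tag was nearest at that time."""
    return {element_id: nearest_tag(xy, tag_locations)
            for element_id, xy in interactions.items()}

# Users 18A and 18B stand in front of the portions of the display they touched.
tags = {"TAG-18A": (0.5, 1.0), "TAG-18B": (3.5, 1.0)}
touches = {"element_20A": (0.6, 1.2), "element_20B": (3.4, 1.1)}
print(match_interactions(touches, tags))
# {'element_20A': 'TAG-18A', 'element_20B': 'TAG-18B'}
```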


In one embodiment, the computing system 14 may transmit a signal to update the interactive elements 20 based on the received signals. For example, the computing system 14 may render one new image from a model database of the one or more databases 22 for display on the display 21 in the vicinity of the user 18A, and the computing system 14 may render a different new image from the model database for display on the display 21 in the vicinity of the user 18B. The new image and the different new image may be rendered based at least in part on the corresponding information in the databases 22 for the RFID tags 50A, 50B associated with the users 18A, 18B. In this manner, the interactive system 8 may track performance of each user 18 and/or provide a personalized experience for each user 18. Additionally or alternatively, the computing system 14 may render a different new image from the model database locally and independently of the information of the users 18. For example, the image may be a target, and when the target is interacted with, regardless of which user completes the interaction, the target may disappear and/or be replaced by another particular image (e.g., the same image). In this case, the interaction may or may not affect or change the information related to the users 18.



FIG. 4 is a front view of an embodiment of the sensor system 12. FIG. 4 illustrates multiple sensor elements that may detect an interaction with the interactive elements 20. In one embodiment, the sensor system 12 includes one or more optical emitters 100 (e.g., light emitting diodes [LEDs], lasers) that may emit light, and corresponding optical detectors 102 (e.g., photodetectors). As shown, in one embodiment, the optical emitters 100 and optical detectors 102 may be positioned around a perimeter of the display 21. In one embodiment, each optical emitter 100 is opposed to (e.g., aligned directly across from) a corresponding optical detector 102, such that the optical emitter 100 emits light towards the corresponding optical detector 102. In one embodiment, the optical detector 102 detects the light emitted by the optical emitter 100 as long as there is no object positioned between the optical detector 102 and the optical emitter 100. However, when an object interferes with or blocks the light emitted from the optical emitter 100 from reaching the optical detector 102 (e.g., the user 18 reaches a hand to interact with the interactive element 20 on the display 21), the optical detector 102 does not detect the emitted light and sends, to the computing system 14, a signal indicating that the optical detector 102 did not detect the emitted light. With the optical emitters 100 and the optical detectors 102 positioned about the perimeter of the display 21, as shown, the computing system 14 may process signals from the optical detectors 102 to determine the location of the interaction relative to the interactive elements 20 and/or the display 21 (e.g., along an X axis and a Y axis).
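
One possible way to turn blocked emitter-detector beams into an interaction location is to average the indices of the interrupted horizontal and vertical beams, as in the sketch below; the beam pitch and index convention are assumed for the example.

```python
def interaction_location(blocked_columns, blocked_rows, pitch_mm=25.0):
    """Estimate the X/Y location of an interaction on the display from the indices
    of vertical and horizontal beams whose light was not detected.

    Each beam index is converted to a coordinate using an assumed beam pitch,
    and the blocked beams are averaged to find the center of the obstruction.
    """
    if not blocked_columns or not blocked_rows:
        return None  # no interaction: every detector still sees its emitter
    x = pitch_mm * sum(blocked_columns) / len(blocked_columns)
    y = pitch_mm * sum(blocked_rows) / len(blocked_rows)
    return (x, y)

# A hand blocks vertical beams 4-6 and horizontal beams 10-11.
print(interaction_location(blocked_columns=[4, 5, 6], blocked_rows=[10, 11]))
# (125.0, 262.5)
```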


In one embodiment, the optical emitter 100 and the optical detector 102 may be located within a common housing or positioned to enable the optical detector 102 to detect reflected light. In some such cases, light emitted by the optical emitter 100 may be reflected by an object (e.g., a hand of the user 18) toward the optical detector 102, which may then detect the reflected light and provide a signal, indicating that the optical detector 102 detected the emitted light, to the computing system 14. In this manner, the sensor system 12 may provide the signals indicative of the occurrence of interactions with the interactive elements 20.


Additionally or alternatively, the sensor system 12 may include multiple RFID readers 10, such as NFC readers 104 or any of a variety of other readers (e.g., UHF readers), to detect interactions with the interactive elements 20. In one embodiment, the NFC readers 104 may be located on the rearward side of the display 21 (i.e., the side opposite the forward side of the display 21 that is visible to the user 18). In one embodiment, the NFC readers 104 are arranged in a grid (e.g., a series of rows and columns). As such, when the user 18 reaches forward to interact with the interactive element 20, the NFC readers 104 may read an NFC RFID tag located on the wearable device 16. As discussed above with respect to FIGS. 1-3, the RFID tag 50 may include the NFC RFID tag. In one embodiment, the NFC readers 104 may receive identity information from the NFC RFID tag, which may be provided to the computing system 14 to facilitate matching the interaction to the user 18.
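
A grid of readers behind the display can localize an interaction to the cell of the reader that detected the tag. The sketch below assumes a row-major reader layout, an 8-column grid, and a 100 mm cell size purely for illustration.

```python
def nfc_grid_location(reads, columns=8, cell_size_mm=100.0):
    """Convert the index of the reader that detected a tag (readers arranged in
    row-major grid order behind the display) into an approximate X/Y location
    at the center of that reader's cell, along with the tag identity."""
    if not reads:
        return None
    reader_index, tag_id = reads[0]          # first (or strongest) read wins
    row, col = divmod(reader_index, columns)
    x = (col + 0.5) * cell_size_mm
    y = (row + 0.5) * cell_size_mm
    return tag_id, (x, y)

# The tag worn by the user is read by reader 18 in an 8-column grid.
print(nfc_grid_location([(18, "TAG-0001")]))
# ('TAG-0001', (250.0, 250.0))
```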


Additionally or alternatively, the sensor system 12 may include a light detection and ranging (LIDAR) system. The LIDAR system may contain at least one LIDAR sensor 106 disposed proximate to the display 21. In operation, the LIDAR sensor 106 continuously sends light (e.g., laser light), such that the LIDAR sensor 106 covers the display 21 in a layer of light. When the emitted light hits an object in its path, the light reflects back to the LIDAR sensor 106, and the LIDAR sensor 106 may determine how far away the object is. For example, when the user 18 reaches out to interact with the interactive element 20 on the display 21, the user's hand interferes with the light emitted by the LIDAR sensor 106, and the LIDAR sensor 106 will detect the interference and determine the location of interference. The LIDAR sensor 106 may then transmit a signal that indicates the location of the interference to the computing system 14 for further processing.
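
As a simplified picture of this interference detection, the sketch below compares measured ranges against an empty-scene baseline and converts any shortened return into an X/Y location on the plane of light. The angle convention, baseline values, and tolerance are assumptions made for the example.

```python
import math

def hit_to_xy(angle_deg, range_m, sensor_xy=(0.0, 0.0)):
    """Convert a single return (beam angle and measured range) into an X/Y
    location on the plane of light in front of the display."""
    sx, sy = sensor_xy
    theta = math.radians(angle_deg)
    return (sx + range_m * math.cos(theta), sy + range_m * math.sin(theta))

def detect_interference(baseline_ranges, measured_ranges, angles_deg, tolerance_m=0.05):
    """Report locations where the measured range is shorter than the empty-scene
    baseline, indicating that an object (e.g., a hand) interrupted the light."""
    hits = []
    for base, measured, angle in zip(baseline_ranges, measured_ranges, angles_deg):
        if measured < base - tolerance_m:
            hits.append(hit_to_xy(angle, measured))
    return hits

baseline = [2.0, 2.0, 2.0]
measured = [2.0, 0.8, 2.0]   # the middle beam is interrupted at 0.8 m
print(detect_interference(baseline, measured, angles_deg=[30, 45, 60]))
```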


Additionally or alternatively, the sensor system 12 may include a touchscreen, such as a resistive touchscreen panel, a capacitive touchscreen panel, a surface acoustic wave panel, or any combination thereof. The touchscreen may sense an interaction performed on the display 21 (e.g., with contact force or electrical charge), and the location of the interaction may be determined based on the position of the sensed interaction. The touchscreen may then send a signal that indicates the location of the sensed interaction to the computing system 14. It should be appreciated that the display 21 may be or include one or more touchscreen displays that may display the interactive elements 20 and detect the interactions with the interactive elements 20.


It should be appreciated that the sensor system 12 may additionally or alternatively include other elements that may detect an interaction, such as pressure pads, switches, or cameras. It should also be appreciated that the sensor system 12 may include any combination of the aforementioned elements, or any other suitable element not already mentioned, to use for sensing an interaction and its location.



FIG. 5 illustrates an embodiment of the sensor system 12 that may detect additional characteristics (e.g., depth along a Z axis) of an interaction. The illustrated sensor system 12 of FIG. 5 includes a layered LIDAR system. However, it should be appreciated that the sensor system 12 may include multiple layers (e.g., along the Z axis) of optical emitters 100 and optical detectors 102 arranged about the display 21 in a manner similar to that of FIG. 4. In FIG. 5, there are two LIDAR sensors 106A and 106B. As shown, the LIDAR sensor 106A is located more forward (i.e., toward the user 18) than the LIDAR sensor 106B, such that the LIDAR sensor 106A creates a light layer 150A closer to the user 18 than a light layer 150B created by the LIDAR sensor 106B in front of the display 21. In this manner, the LIDAR sensor 106A may detect an interfering object (e.g., the user's hand) without the LIDAR sensor 106B detecting the interfering object. The computing system 14 may process the signals sent by the LIDAR sensors 106A, 106B to determine the approximate depth of an interaction. For example, if both the LIDAR sensors 106A, 106B detect the object, the computing system 14 may determine that the user 18 fully or successfully interacted (e.g., contacted or touched) with the interactive element 20 on the display 21. However, if only one LIDAR sensor 106A detects the object, the computing system 14 may determine that the user 18 did not fully interact with the interactive element 20. Furthermore, while FIG. 5 depicts two LIDAR sensors 106A, 106B, there may be any suitable number of LIDAR systems and corresponding light layers to enable sensing a depth of the interaction.
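
The depth determination from the two light layers reduces to a small classification over which layers were interrupted, as sketched below; the outcome labels are assumptions chosen for the example.

```python
def classify_interaction_depth(outer_layer_hit: bool, inner_layer_hit: bool) -> str:
    """Classify the depth of a reach toward the display based on which light layers
    detect the object: the outer layer (closer to the user, e.g., from sensor 106A)
    and the inner layer (closer to the display, e.g., from sensor 106B)."""
    if outer_layer_hit and inner_layer_hit:
        return "full interaction"   # hand reached all the way to the display
    if outer_layer_hit:
        return "partial reach"      # hand broke only the outer layer
    return "no interaction"

print(classify_interaction_depth(outer_layer_hit=True, inner_layer_hit=True))
print(classify_interaction_depth(outer_layer_hit=True, inner_layer_hit=False))
```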


In one embodiment, the sensor system 12 may be used to detect a position of the user 18. For example, additional LIDAR sensors 106 may be placed to detect the location of the user 18 in the interactive system 8 (e.g., location within a room or area). That is, the presence of the user 18 may interfere with the emitted light of one or more of the LIDAR sensors 106. The one or more LIDAR sensors 106 may then transmit information regarding the interference to the computing system 14 to determine the position of the user 18. The signals received from the LIDAR sensors 106 may then be combined with signals received from the RFID reader 10 to determine the identity of the user 18.


Furthermore, in one embodiment, the sensor system 12 may be configured to detect skeletal movement and gestures of the user 18. For example, the sensor system 12 may include multiple imaging devices (e.g., cameras), and images obtained by the imaging devices may be processed (e.g., by the computing system 14 using image processing algorithms) to determine skeletal movements and gestures (e.g., hand waving, jumping, dancing, reaching toward the interactive element 20) of the user 18. In one embodiment, the imaging devices may be positioned relative to the display 21 to enable the cameras to detect the user 18 reaching toward the interactive element 20, thereby enabling the determination of the occurrence of the interaction and/or the depth of the interaction. If images obtained by the imaging devices indicate that the user 18 interacted with the interactive element 20 at a certain depth (e.g., extended their arm to a region substantially proximate to the display 21), then the computing system 14 may determine that an interaction occurred, and in response, update the profile of the user 18 and/or the interactive element 20. However, if images obtained by the imaging devices indicate that the user 18 did not adequately interact with the interactive element 20 (e.g., the interaction is not at the appropriate depth), the computing system 14 may not update the profile of the user 18 and/or the interactive element 20. Thus, the sensor system 12 may include any of a variety of elements (e.g., optical emitters 100, optical detectors 102, LIDAR sensors 106, cameras) configured to detect an interaction, a depth of the interaction, a position of a user, and/or gestures made by the user, for example.


It should be appreciated that the various features of the interactive system 8, such as the sensor system 12, may be wirelessly coupled to the computing system 14. For example, the sensor system 12 may not be confined to or fixed within a room or area. In one embodiment, the sensor system 12 may be placed on a moving vehicle (e.g., an amusement park ride). The sensor system 12 may detect whether the user 18 is on the ride, detect interactions completed during the ride (e.g., as a ride vehicle travels along a path), and update the database 22 and/or the interactive elements 20 accordingly. In one embodiment, the sensor system 12 may be placed on a person or an animated character and may detect interactions with the person or the animated character. In this manner, the sensor system 12 may be placed anywhere in the park and is not fixed to a stationary object.



FIG. 6 illustrates a flow chart of a method 138 that may be carried out by the processor 26 to track performance of the users 18, update data in the databases 22, and/or update the interactive elements 20 to provide a unique personalized experience for the users 18. The method 138 disclosed herein includes various steps represented by blocks. It should be noted that at least some steps of the method 138 may be performed as an automated procedure by a system, such as the interactive system 8. Although the flow chart illustrates the steps in a certain sequence, it should be understood that the steps may be performed in any suitable order and certain steps may be carried out simultaneously, where appropriate. Additionally, steps may be added to or omitted from the method 138. Further, certain steps or portions of the method 138 may be performed by separate devices. For example, a first portion of the method 138 may be performed by the processor 26 of the computing system 14, while a second portion of the method 138 may be performed by a separate processing device. In addition, insofar as steps of the method 138 disclosed herein are applied to received signals, it should be understood that the received signals may be raw signals or processed signals. That is, the method 138 may be applied to an output of the received signals.


In block 140, the processor 26 receives a signal from the RFID reader 10. The signal provides information read from the RFID tag 50 corresponding to the user 18, which the processor 26 uses to determine the identification of the RFID tag 50 and/or to access the corresponding data from the databases 22. The processor 26 may also determine the location of the RFID tag 50 relative to the RFID reader 10, the interactive element 20, and/or the display 21 using the signal received from the RFID reader 10.


In block 142, the processor 26 receives a signal from the sensor system 12 indicative of the occurrence of an interaction with the interactive element 20. The signal may indicate the location of the interaction relative to the interactive element 20. Based at least in part on the location of the interaction and the location of the RFID tag 50, the processor 26 may match the interaction to the corresponding RFID tag 50, and thus, to its corresponding user 18, in block 144.


In block 146, the processor 26 updates the databases 22 based on the interaction by the user 18. For example, the processor 26 may update the databases 22 to include additional points, record the interaction as complete, or the like. In block 148, the processor 26 may update the interactive element 20 as a result of matching the interaction to the corresponding RFID tag 50, and thus, to its associated user 18. The interactive element 20 may be updated in a user-specific manner, such that the next interactive element 20 is selected and presented to the user 18 based on the corresponding information that is specific to the RFID tag 50 and the user 18, for example. The method 138 may be performed in response to any number of interactions by any number of users 18. Further, if multiple users 18 interact with multiple interactive elements 20 simultaneously, the processor 26 may perform the method 138 multiple times simultaneously.
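
Pulling blocks 140 through 148 together, the following end-to-end sketch illustrates one possible flow; the signal shapes, matching threshold, and point values are assumptions made for the example rather than details of the method 138 itself.

```python
def method_138(rfid_signal, sensor_signal, database, render):
    """Illustrative end-to-end sketch of blocks 140-148: identify the tag and its
    location (block 140), receive the interaction and its location (block 142),
    match them (block 144), update the database (block 146), and update the
    interactive element (block 148). All data shapes are assumed."""
    tag_id, tag_xy = rfid_signal["tag_id"], rfid_signal["location"]                 # block 140
    element_id, touch_xy = sensor_signal["element_id"], sensor_signal["location"]   # block 142

    # Block 144: match only when the tag and the interaction are close together.
    dx, dy = tag_xy[0] - touch_xy[0], tag_xy[1] - touch_xy[1]
    if (dx * dx + dy * dy) ** 0.5 > 1.0:   # assumed 1 m matching threshold
        return None

    record = database.setdefault(tag_id, {"points": 0, "completed": []})
    record["points"] += 10                 # block 146: credit the interaction
    record["completed"].append(element_id)

    render(element_id, record)             # block 148: user-specific element update
    return tag_id

db = {}
method_138({"tag_id": "TAG-0001", "location": (1.0, 0.5)},
           {"element_id": "element_20A", "location": (1.1, 0.6)},
           db,
           render=lambda element, record: print("render next element for", element, record))
print(db)
```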


As set forth above, embodiments of the present disclosure describe an interactive system that uses an RFID reader, a sensor system, and a computing system to determine location and identification of an RFID tag. Further, the system detects an interaction performed by a user associated with the RFID tag. The interactive system stores information related to the RFID tag in one or more databases, which dynamically changes based at least in part on the user's interactions, preferences, performances, or any combination thereof. The computing system updates elements of the interactive system to reflect the information and enhance each user's experience.


While only certain features of the disclosure have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the disclosure.

Claims
  • 1. A system, comprising: a radio-frequency identification (RFID) reader configured to read data stored on an RFID tag associated with a user and to generate a first signal indicative of the data and indicative of a first location of the RFID tag;a sensor system configured to detect a user interaction with an interactive element and to generate a second signal indicative of the user interaction and a second location related to the interactive element; anda processor configured to: match the user to the user interaction based on a correspondence between the first location indicated by the first signal and the second location indicated by the second signal; andupdate a user database to reflect that the user is matched to the user interaction.
  • 2. The system of claim 1, wherein the interactive element comprises a display configured to display an image, and the processor is configured to render another image for presentation on the display in response to matching the user to the user interaction.
  • 3. The system of claim 1, wherein the RFID reader is an ultra-high frequency RFID reader.
  • 4. The system of claim 1, wherein the RFID reader is configured to generate a third signal indicative of a strength of an RFID signal transmitted from the RFID tag to the RFID reader, and the processor is configured to match the user to the user interaction based on the third signal.
  • 5. The system of claim 1, wherein the sensor system comprises one or more optical detectors.
  • 6. The system of claim 1, wherein the sensor system comprises multiple optical detectors positioned about a perimeter of a display that is configured to display the interactive element.
  • 7. The system of claim 1, wherein the sensor system comprises multiple near-field communication readers.
  • 8. The system of claim 1, wherein the sensor system comprises a light detection and ranging (LIDAR) system.
  • 9. The system of claim 1, wherein the sensor system comprises a resistive touchscreen panel, a capacitive touchscreen panel, a surface acoustic wave panel, or any combination thereof.
  • 10. The system of claim 1, wherein the RFID reader is configured to read respective data stored on an additional RFID tag associated with an additional user and to generate a third signal indicative of the respective data and indicative of a third location of the additional RFID tag, the sensor system is configured to detect an additional user interaction with an additional interactive element and to generate a fourth signal indicative of the additional user interaction and a fourth location related to the additional interactive element, and the processor is configured to: match the additional user to the additional user interaction based on a correspondence between the third location indicated by the third signal and the fourth location indicated by the fourth signal; andupdate the user database to reflect that the additional user is matched to the additional user interaction.
  • 11. The system of claim 10, wherein the sensor system is configured to match the user to the user interaction and to match the additional user to the additional user interaction when the user interaction and the additional user interaction occur substantially simultaneously.
  • 12. The system of claim 1, comprising one or more additional RFID readers configured to read data stored on the RFID tag and to generate one or more additional signals indicative of a respective strength of respective RFID signals transmitted from the RFID tag to the one or more additional RFID readers, wherein the processor is configured to determine a location of the RFID tag relative to the interactive element based on the first signal and the one or more additional signals.
  • 13. A system, comprising: a radio-frequency identification device (RFID) reader configured to read data stored on one or more RFID tags associated with one or more respective users;a sensor system configured to detect a first user interaction at a first location on a display and a second user interaction at a second location on the display; anda processor configured to receive RFID signals indicative of the data from the RFID reader and to receive sensor signals indicative of the first user interaction and the second user interaction from the sensor system, wherein the processor is configured to render a first image for display at the first location on the display based on the sensor signals and the RFID reader reading data stored on a first RFID tag of the one or more RFID tags, and wherein the processor is configured to render a second image for display at the second location on the display based on the sensor signals and the RFID reader reading data stored on a second RFID tag of the one or more RFID tags.
  • 14. The system of claim 13, wherein the processor is configured to match a first user of the one or more respective users to the first user interaction based on the RFID signals and the sensor signals and to update a user database to reflect that the first user of the one or more respective users is matched to the first user interaction.
  • 15. The system of claim 13, wherein the RFID signals are indicative of a respective location of each of the one or more RFID tags.
  • 16. A method, comprising: receiving, at a processor, a first signal from a radio-frequency identification device (RFID) reader, wherein the first signal is indicative of data stored on an RFID tag and indicative of a first location of the RFID tag associated with a user and read by the RFID reader;receiving, at the processor, a second signal from a sensor system, wherein the second signal is indicative of a user interaction with an interactive element and a second location associated with the interactive element;matching, using the processor, the user to the user interaction based on the first location indicated by the first signal and the second location indicated by the second signal; andupdating, using the processor, a user database to reflect that the user is matched to the user interaction.
  • 17. The method of claim 16, comprising rendering, using the processor, an image for display on a display in response to the matching of the user to the user interaction.
  • 18. The method of claim 16, comprising: detecting, using the sensor system, an interruption in light emitted by a light emitting element; andgenerating, using the sensor system, the second signal in response to the interruption in light.
  • 19. The method of claim 16, comprising: receiving, at the processor, a third signal from the RFID reader, wherein the third signal is indicative of respective data stored on an additional RFID tag and indicative of a third location of the additional RFID tag associated with an additional user and read by the RFID reader;receiving, at the processor, a fourth signal from the sensor system, wherein the fourth signal is indicative of an additional user interaction with an additional interactive element and a fourth location associated with the additional interactive element;matching, using the processor, the additional user to the additional user interaction based on the third location indicated by the third signal and the fourth location indicated by the fourth signal; andupdating, using the processor, the user database to reflect that the additional user is matched to the additional user interaction.
  • 20. The method of claim 16, comprising: receiving, at the processor, one or more additional signals from one or more additional RFID readers configured to read data stored on the RFID tag; anddetermining, using the processor, a location of the RFID tag relative to the interactive element based on the first signal and the one or more additional signals to facilitate matching the user to the user interaction.
CROSS-REFERENCE TO RELATED APPLICATION

This application claims priority from and the benefit of U.S. Provisional Application No. 62/617,531, entitled “INTERACTIVE SYSTEMS AND METHODS,” filed Jan. 15, 2018, which is hereby incorporated by reference in its entirety for all purposes.

Related Publications (1)
Number Date Country
20190220634 A1 Jul 2019 US
Provisional Applications (1)
Number Date Country
62617531 Jan 2018 US