Information is distributed wirelessly in many forms, such as radio broadcasts, cellular signals, WiFi signals, and Bluetooth signals. However, information distributed using these wireless protocols is either broadcasted to all devices with a suitable receiver or transmitted to a particular device. To transmit to a particular device, both the transmitter and the receiver must use an encoding or “pairing” scheme (e.g., cellular signals, password-protected WiFi signals, and Bluetooth signals). This encoding scheme must be known by both the transmitter and the receiver before the transmission to a particular device can be accomplished. Additionally, to transmit information that is dependent on the location of the receiving device, substantially complicated location mechanisms are employed, for example, global positioning system (GPS) and/or triangulation. Such locating methods may not be able to locate devices to the accuracy that may be desired, for example, within a few feet. In some situations, it may be useful to transmit information to a particular device based on the particular location of the device, which may be substantially proximal to another device, without the use of an encoding or “pairing” scheme. This invention provides such a new and useful system and method.
The following description of the preferred embodiments of the invention is not intended to limit the invention to these preferred embodiments, but rather to enable any person skilled in the art to make and use this invention.
As shown in FIGS. 1 and 5-7, the system 100 for transmitting information within a space preferably includes a plurality of sensory modules 110 that each include a data signal receiver 112 and an output element 114 and that are arranged substantially within the space, a first signal emitter 120 that emits a data signal of a first broadcast breadth, and a second signal emitter 130 that emits a data signal of a second broadcast breadth substantially larger than the first broadcast breadth. The first signal emitter 120 preferably includes a direction modifier 122 that directs (1) a data signal including a first information set to a first portion of the space to be received by a first sensory module substantially within the first portion (information set A1) and (2) a data signal including a second information set to a second portion of the space to be received by a second sensory module substantially within the second portion (information set A2). The second signal emitter 130 preferably directs a data signal including a third information set (information set B) to both the first portion and the second portion to be received by both the first and second sensory modules. The first and second information sets preferably include instructions on the operation of the first and second sensory modules, respectively, and the third information set preferably includes instructions on the implementation of the first and second information sets. The first and second signal emitters 120 and 130 preferably cooperate to transmit location specific information to the plurality of sensory modules 110, in particular, information regarding the operation of the sensory modules 110 based on the location of the sensory module 110 within the space. The space in which the system 100 of the preferred embodiments is arranged may be an event space as shown in
The system 100 of the preferred embodiments allows for selective information to be communicated to selective sensory modules 110 based on the location of the sensory modules 110 without the need for complex systems and/or methods for device location, device identification, and/or pairing with the device. The first and second signal emitters 120 and 130 may alternatively be thought of as providing a first level of information and a second level of information, respectively, to the sensory modules 110, where the combination of the first and second levels of information provides a substantially complete set of location based operation instructions to the sensory modules 110. The first level of information is preferably specific to the location of the sensory module 110 (for example, instructions on the operation of a particular sensory module 110 in a particular location in the space) and the second level of information is nonspecific to the location of the sensory module 110 (for example, the timing instructions to implement the location specific instructions provided in the first level of information). However, any other suitable type of information set may be provided.
The sensory module 110 functions to receive information from the first and second emitters 120 and 130 and to provide sensory stimulation to a user, for example, vision, touch, hearing, smell, and/or taste. As shown in
The sensory module 110 may be any suitable type of form factor, as shown in
The sensory module 110 may also include a storage device 118 that functions to store information sets, as shown in
The sensory module 110 may also include a module processor that functions to interpret the data signal received from the first and second signal emitters 120 and 130 and to implement the interpreted data signal through the output element 114. Alternatively, the data emitted by the first and second signal emitters 120 and 130 may already be processed to be implementable by the output element 114 without further processing by the sensory module 110. However, any other suitable arrangement of the processing of the data signal into information implementable by the output element 114 may be used.
The first emitter 120, as described above, functions to provide a first information set to a first sensory module 110 located substantially within a first portion of the space and a second information set to a second sensory module 110 located substantially within a second portion of the space. The first and second information sets are preferably stored in the respective sensory modules 110 and are preferably later referenced by the third information set of the second signal emitter 130. As described above, the first and second information sets preferably include information on the operation of the output element 114 of the first and second sensory modules 110, respectively. The information set may include one of a variety of different types of instructions. In a first variation, the first and second information sets may include a timing sequence for the actuation of the output element 114, for example, the first information set may include a first timing sequence of turning on and off an output element 114 that includes a light emitter and the second information set may include a second timing sequence of turning on and off the light emitter. In this variation, each of the sensory modules 110 preferably includes a timing module that is substantially synchronized with the timing modules of other sensory modules 110. In a second variation, the first and second information sets may include instructions on the operation mode to actuate, for example, the first information set may include instructions to actuate the green operation mode of a colored light emitter and the second information set may include instructions to actuate the red operation mode of a colored light emitter. In a third variation, the first and second information sets may include instructions for a sequence of images to display on an output element 114 that is a display.
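The first variation above can be sketched in code. The following is an illustrative sketch, not the actual encoding used by the system: an information set is modeled as a sorted list of (time offset, state) events, and each synchronized sensory module evaluates the state of its light emitter at the current time. All names and the data structure are assumptions made for illustration.

```python
# Hypothetical sketch of the first variation: an information set that
# encodes a timing sequence of on/off states for a light-emitting
# output element. The structure is an illustrative assumption.

def build_timing_sequence(events):
    """events: list of (offset_seconds, state) pairs, e.g. (0.0, 'on')."""
    return sorted(events, key=lambda e: e[0])

def state_at(sequence, t):
    """Return the output state at time t, given a sorted timing sequence."""
    state = 'off'
    for offset, new_state in sequence:
        if offset <= t:
            state = new_state
        else:
            break
    return state

# A first information set: blink on at t=0, off at t=1, on again at t=2.
info_set_a1 = build_timing_sequence([(0.0, 'on'), (1.0, 'off'), (2.0, 'on')])
```

A second information set with a different event list would produce a different blink pattern, while the shared, synchronized clock keeps the two patterns aligned in time.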
The first and second information sets may alternatively be a combination of the variations described above, for example, the first and second information sets may include a time sequence for the actuation of a combination of operation modes of the output element 114, for example, a time sequence for the actuation of various color modes in a colored light emitter. However, the first and second information sets may include any other suitable type of instruction to operate the output element 114.
The first signal emitter 120 is preferably of an emitter type that emits detectable data signals that are preferably invisible to the human eye. The data signal emitted by the first emitter 120 is preferably directional; in particular, the data signal preferably maintains a particular direction when directed at that particular direction, such as the signals emitted by television remotes. The data signal is preferably an electromagnetic wave such as an infrared signal, but may alternatively be an ultrasonic signal, a laser, or any other suitable type of signal. Because certain directional signals (in particular, electromagnetic wave signals and ultrasonic signals) may bounce off of surfaces and reflect in a direction other than the originally desired direction, the receiver 112 of the sensory module 110 is preferably able to discern between a direct signal and an unintentional reflected signal. In a first example, because reflected signals are generally of lower power than direct signals, the receiver 112 may only acknowledge and/or detect signals above a particular strength threshold. In a second example, because reflected signals take a longer path to reach a particular receiver 112, the receiver 112 may be configured to acknowledge the earliest arriving signal and ignore a later, possibly reflected, signal. In a third example, the first signal emitter 120 (and second signal emitter 130) may emit a signal that is polarized in a particular direction. Because reflection generally changes the polarity of a signal, the receivers 112 may be configured to only receive signals of a particular polarization. In a fourth example, the emitters may function to emit the signal towards the first and second portions of the space in varying patterns to change the possible reflection patterns and decrease the number of unintended receivers 112 that receive the reflected signal.
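The first and second examples above can be combined into a simple receiver-side filter. The following sketch is illustrative only (the threshold value, tuple layout, and function names are assumptions, not receiver firmware): arrivals below the strength threshold are rejected, and for each signal only the earliest arriving copy is kept, later copies being treated as probable reflections.

```python
# Illustrative sketch of the first two discrimination examples:
# reject signals below a strength threshold, and keep only the
# earliest arrival of each signal, treating later copies as reflections.

STRENGTH_THRESHOLD = 0.5  # assumed normalized units

def filter_direct_signals(arrivals, threshold=STRENGTH_THRESHOLD):
    """arrivals: list of (signal_id, arrival_time, strength) tuples.
    Returns {signal_id: arrival_time}, keeping only sufficiently strong,
    earliest-arriving copies of each signal."""
    accepted = {}
    for signal_id, t, strength in sorted(arrivals, key=lambda a: a[1]):
        if strength < threshold:
            continue  # too weak: likely a reflection, ignore
        if signal_id not in accepted:
            accepted[signal_id] = t  # first (earliest) arrival wins
    return accepted

arrivals = [
    ('A1', 0.010, 0.9),  # direct signal
    ('A1', 0.013, 0.6),  # later copy of the same signal: ignored
    ('A2', 0.011, 0.2),  # too weak to accept
]
accepted = filter_direct_signals(arrivals)
```

A real receiver would apply these tests in hardware or firmware against raw signal measurements; the sketch only shows the selection logic.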
In this example, the emitters 120 and 130 may function to scan the space to detect receivers that unintentionally received the signal and may determine an alternative path to emit the next signal that would substantially prevent the signal from reaching the detected unintended receivers. In the variation where the output element 114 is a light, the first and/or second emitters 120 and 130 may visually detect the light output of the sensory module 110 (for example, by taking an image of the light output of the sensory modules 110 within the space and analyzing the light output based on the location of each sensory module 110), or the sensory module 110 may include a signal reflector that reflects the received signal back at the emitters. The fourth example may also include a feedback method that uses the scanned information to change the manner in which the emitters emit a signal to the sensory modules 110 in the space. The scan may also be used to differentiate between signals from multiple emitters (for example, in an example where there is more than one first signal emitter 120). In a fifth example, the first and/or second signal emitters 120 and/or 130 may include signal receivers and each sensory module 110 may include a signal emitter that communicates with the first and/or second signal emitters 120 and/or 130. In this example, the sensory modules 110 may communicate back to the first and/or second signal emitters 120 and/or 130 to confirm that each sensory module 110 received the correct information sets. For example, when a sensory module 110 receives an information set, the same information set (or a truncated version) may be sent back to the first and/or second signal emitter 120 and/or 130 to be matched with what was originally sent to that particular sensory module 110. However, any other suitable method to discern between a direct signal and an unintentional reflected signal may be used.
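The echo-back confirmation of the fifth example might look like the following sketch. It is a hedged illustration under the assumption that the "truncated version" of the information set is a short digest; the hash choice, length, and function names are not specified by the system.

```python
# Illustrative sketch of the fifth example: a sensory module echoes back
# a truncated version (here, a short hash) of the information set it
# received, and the emitter matches it against what was sent.
import hashlib

def truncated_echo(information_set: bytes, length: int = 8) -> str:
    """Truncated digest a module might send back instead of the full set."""
    return hashlib.sha256(information_set).hexdigest()[:length]

def confirm_receipt(sent: bytes, echoed: str) -> bool:
    """Emitter-side check that the echoed digest matches what was sent."""
    return truncated_echo(sent) == echoed

sent = b'sequence-A: green, 120ms blink'
echo = truncated_echo(sent)                  # what the module transmits back
ok = confirm_receipt(sent, echo)             # module got the right set
bad = confirm_receipt(b'sequence-B', echo)   # mismatch is detected
```

If the echoed digest does not match, the emitter could re-aim and retransmit the information set to that module, closing the feedback loop described above.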
The first emitter 120, as described above, preferably includes a direction modifier 122 that functions to direct the signal to a desired location within the space. The first emitter 120 preferably also includes a signal output head 121, for example, an infrared emitting LED bulb that provides the output of the first signal emitter 120. The signal output head 121 may also include lenses that focus the signal to a particular location and/or influence the magnitude of the signal breadth. In particular, the lenses may function to decrease the magnitude of the signal breadth to decrease the number of sensory modules 110 that receive a particular data signal broadcasted to a particular portion of the space to substantially one. Alternatively, the lenses may function to vary the magnitude of the signal breadth to reach a desired number of sensory modules 110. The variation of the lenses may be dynamic such that, during use, the first signal emitter 120 may change the breadth of the signal depending on the desired breadth. However, any other suitable lens arrangement may be used. The direction modifier 122 is preferably an actuator that functions to move the signal output head 121 of the first signal emitter 120. The direction modifier 122 preferably allows for substantially high articulation of the signal output head 121 to increase the number of directions towards which to emit the data signal and to increase the number of locations within the space that may be reached by a direct signal. Alternatively, reflected signals may be used to reflect the signal to reach particular locations that are otherwise unreachable by direct signal. In this variation, the sensory module 110 preferably detects these desired reflected signals. However, any other suitable system and/or method to direct the signal emitted by the first signal emitter 120 may be used.
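The relationship between signal breadth and the number of modules reached can be illustrated with simple beam geometry. This is an assumed model, not part of the specification: the footprint of a conical beam of half-angle theta at distance d is a circle of radius d·tan(theta), so narrowing the beam shrinks the footprint until substantially one sensory module receives the signal.

```python
# Assumed geometric model of dynamic signal breadth: a conical beam's
# footprint radius at distance d is d * tan(half_angle). Narrowing the
# half-angle reduces how many modules fall inside the footprint.
import math

def footprint_radius(distance, half_angle_deg):
    return distance * math.tan(math.radians(half_angle_deg))

def modules_in_footprint(module_offsets, distance, half_angle_deg):
    """module_offsets: lateral distances of modules from the beam axis."""
    r = footprint_radius(distance, half_angle_deg)
    return [off for off in module_offsets if off <= r]

# Modules 0.3 m, 1.0 m, and 2.5 m off-axis, emitter 10 m away.
offsets = [0.3, 1.0, 2.5]
wide = modules_in_footprint(offsets, 10.0, 20.0)   # broad beam: all three
narrow = modules_in_footprint(offsets, 10.0, 2.0)  # narrowed beam: one
```

Under this model, a dynamic lens that varies the half-angle during use directly controls the number of sensory modules 110 that receive a given data signal.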
The second signal emitter 130, as described above, is substantially similar to the first signal emitter 120. The second signal emitter 130 functions to provide a third information set to sensory modules 110 located substantially within the first and second portions of the space. The third information set preferably includes an implementation information set that instructs the sensory modules 110 that receive the signal on the implementation of the instructions received from the first signal emitter 120, regardless of whether the sensory module 110 received the first or the second information set. The implementation information set preferably commonly references the instructions that were received from the first signal emitter 120 across all of the sensory modules 110 that receive the third information set. In other words, a first sensory module 110 that received a first information set from the first signal emitter 120 and a second sensory module 110 that received a second information set from the first signal emitter 120 will both recognize the third information set, received from the second signal emitter 130, as implementation instructions for the information set received from the first signal emitter 120. In this way, the first and second sensory modules 110 may operate in a synchronized manner, but each may execute a different set of instructions. The second signal emitter 130 may alternatively be thought of as a “synchronizer” that synchronizes the implementation of instructions received by each of the sensory modules 110. As described above, common referencing may be achieved by storing the information received from the first signal emitter 120 in a “to be executed” memory that is then referenced by the third information set.
Alternatively, the first information set may be stored in a first sensory module as “Sequence A” and the second information set, different from the first information set, may be stored in a second sensory module 110 also as “Sequence A.” The third information set may then instruct the implementation of “Sequence A.” However, any other suitable common referencing method may be used.
The implementation information set in the third information set from the second signal emitter 130 may be one of a variety of types of instructions. In a first variation, the implementation information set may include instructions to implement the information set received from the first signal emitter 120 substantially immediately; in other words, the implementation information set functions as a “trigger” to implement the information set received from the first signal emitter 120. In a second variation, the implementation information set may include instructions on when to implement the received information set, the sequence in which to implement the received information set, and/or any other suitable time delayed instruction. In this second variation, the sensory module 110 preferably functions to store the time information received from the third information set and preferably operates the output element 114 based on the timing provided. As described above, the sensory modules 110 that receive a timing based instruction set preferably include a timing module that is substantially synchronized with the other sensory modules 110 to allow substantially synchronized implementation of the desired information sets. However, any other suitable type of information may be provided in the third information set from the second signal emitter 130.
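The common-referencing scheme and the two implementation variations can be sketched together. In this illustrative sketch (class and method names are assumptions), each module stores its own, possibly different, instructions under the shared label "Sequence A"; a single broadcast third information set then implements that label either immediately (the trigger variation) or at a scheduled time (the time-delayed variation).

```python
# Sketch of common referencing: different instructions stored under the
# same label on each module, implemented by one broadcast third set.

class SensoryModule:
    def __init__(self):
        self.stored = {}      # label -> instructions, e.g. 'Sequence A'
        self.executed = []    # (time, instructions) implementation log

    def receive_first_level(self, label, instructions):
        """From the first signal emitter: location-specific instructions."""
        self.stored[label] = instructions

    def receive_third_set(self, label, now, execute_at=None):
        """From the second signal emitter: implement the commonly
        referenced label, immediately or at a scheduled time."""
        run_time = now if execute_at is None else execute_at
        self.executed.append((run_time, self.stored[label]))

mod1, mod2 = SensoryModule(), SensoryModule()
mod1.receive_first_level('Sequence A', 'emit green')  # first information set
mod2.receive_first_level('Sequence A', 'emit red')    # second information set

# One broadcast third information set synchronizes both modules,
# though each executes different stored instructions.
for m in (mod1, mod2):
    m.receive_third_set('Sequence A', now=10.0)
```

Both modules act at the same moment while executing different instructions, which is the synchronized-but-different behavior the “synchronizer” description above captures.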
The second signal emitter 130 is preferably of an emitter type that is substantially similar to that of the first signal emitter 120, for example, the second signal emitter 130 preferably emits a signal of a type substantially similar to that of the first signal emitter 120 to allow the receiver 112 of the sensory module 110 to interpret the signal in a substantially similar way. Because the signal from the second signal emitter 130 is preferably broadcasted to all of the sensory modules 110 that are to receive the signal at one time, as described above, the data signal emitted by the second signal emitter 130 preferably has a signal breadth that is substantially larger than the signal emitted by the first signal emitter 120 to allow the signal to be received by sensory modules 110 within both the first and second portions of the space. The signal breadth is preferably large enough to reach all of the sensory modules 110 substantially within the space, such as a floodlight type breadth, as shown in
The first and second signal emitters 120 and 130 may be separate units, as shown in
The system 100 of the preferred embodiments may also include a processor 128, as shown in
The space may be an event space and may include a control system that controls the user experience provided by the space. In this variation, the central control 200 of the system 100 is preferably integrated into the existing control system of the user experience. This may allow for a centralized control system for the user experience provided by both the space and the system 100. For example, the space may include a light control panel that controls the lighting within the space when providing a user experience, such as spot lights or wall lights in a theater. In this example, the central control 200 is preferably integrated into the existing light control panel. However, the central control 200 may be integrated into any other suitable type of control system. In the version of the central control 200 that includes a user interface 214, the user interface 214 is preferably integrated into the existing control system for the space. The user interface 214 is preferably similar to the existing user interface of the control system for the user experience, for example, if the existing control system for the user experience includes sliders for light intensity, the user interface 214 of the central control 200 preferably also includes sliders for intensity of the output of the output element 114. However, any other suitable arrangement between the control system of the space and the central control 200 may be used. Alternatively, the central control 200 may be a separate unit from the control system of the space. However, any other suitable arrangement between the central control 200 and the control system of the space may be used.
The control system of the user experience and the central control 200 preferably communicate with each other to determine the first, second, and/or third instruction sets to the sensory modules 110. In the above example where the control system of the space includes control for the lighting of the space, the control system may include programming to control the sensory modules 110 along with the lighting sources in the environment and the control system may communicate instructions to the central control 200. In a second example, the central control 200 may communicate the state of the sensory modules 110 to the control system. The control system may then take the information provided by the central control 200 to manipulate the other lighting sources in the environment. However, any other type of information and/or instructions may be communicated between the central control 200 and the control system.
In the variation where the central control 200 and the control system of the user experience are separate units, the central control 200 may include a plug that interfaces with a plug on the control system to allow communication between the central control 200 and the control system of the space. For example, the control system may be a computer that includes a Universal Serial Bus (USB) interface and the central control 200 may include a plug that interfaces with the USB and communicates directions from the central control 200 to the control system and vice versa. Alternatively, the central control 200 may communicate with the control system of the user experience through a wireless protocol, for example, WiFi or Bluetooth. However, any other suitable communication between the central control 200 and the control system of the space may be used.
Each of the sensory modules 110 may also include a sensor 117 that produces a signal based upon the environment of the sensory module 110 that may be used to adjust the output of the output element 114 of a particular sensory module 110. For example, the sensor 117 may be an accelerometer. When an increased acceleration of the sensory module 110 is detected, the processor 128 may instruct the output element 114 to emit a higher intensity of output (for example, light). This provides two layers of control of the sensory module 110: The first, second, and/or third instruction set may instruct a sensory module 110 to emit light and, while the sensory module 110 is emitting light, an increased acceleration detected by the sensor 117 may instruct the sensory module 110 to emit a stronger intensity light, thus providing to the user the feeling that their actions change their environment, which may enhance their experience. The sensor 117 may also function to detect the state of a neighboring sensory module 110. For example, the sensor 117 may detect the light intensity and/or pattern of the neighboring sensory module 110 and emulate the light intensity and/or pattern. However, any other suitable sensor or sensor response may be used.
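The accelerometer example above amounts to a two-layer intensity rule, sketched below. The base intensity stands in for the first, second, and/or third instruction set; the threshold, units, and linear boost are illustrative assumptions, not values from the system.

```python
# Hedged sketch of two-layer control: the instruction set supplies a base
# light intensity, and acceleration above a threshold boosts the output.

BASE_INTENSITY = 0.4
ACCEL_THRESHOLD = 2.0   # assumed units
MAX_INTENSITY = 1.0

def output_intensity(base, acceleration):
    """Scale output with acceleration, clamped to the output range."""
    if acceleration <= ACCEL_THRESHOLD:
        return base
    boost = 0.1 * (acceleration - ACCEL_THRESHOLD)  # assumed linear rule
    return min(base + boost, MAX_INTENSITY)

steady = output_intensity(BASE_INTENSITY, 1.0)   # below threshold: base only
shaken = output_intensity(BASE_INTENSITY, 5.0)   # shaking boosts intensity
```

Shaking the module vigorously thus brightens its light, giving the user the feeling that their actions change their environment.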
As described above, each sensory module 110 may also include a signal emitter. In this variation, the signal emitter of the sensory module 110 may transmit the state of the sensory module 110 as detected by the sensor 117 to the first and/or second emitters 120 and 130, the processor 128, and/or the central control 200. In this variation, the detected state of the sensory module 110 may be used to change the first, second, and/or third instruction sets. For example, if the acceleration of a particular sensory module 110 is detected to be high, the first, second, and/or third instruction sets may be adjusted to instruct that particular sensory module 110 to emit a different type of light. However, any other suitable adjustment to the first, second, and/or third instruction sets may be used.
The system 100 and method S100 of the preferred embodiments may be used in any suitable usage scenario for the transmission of information within a space. In a first exemplary usage scenario, as shown in
Within an audience stadium or auditorium space, there may be a multitude of uses for the information transmitted by the system 100 and method S100. For example, an audience member may be selected to be a winner of a contest based on the location of the seat, and the information set that is broadcasted to that particular seat location will contain a win-indication sequence while an audience member in the next seat over is broadcasted a no-win-indication sequence by the first signal emitter 120. Alternatively, in the variation where the sensory module 110 includes a sensor 117 to detect the state of the sensory module 110, the winner may be selected based on the state of the sensory module 110, for example, the winner may be selected if the acceleration of the sensory module 110 is substantially high, indicating that the user is shaking the sensory module 110 substantially vigorously. However, any other suitable selection for a winner may be used. The second signal emitter 130 may then broadcast a “reveal winner” implementation instruction to all of the sensory modules 110 that instructs each sensory module 110 to implement the indication sequence received from the first signal emitter 120. The audience member that received the win-indication sequence will be notified of his or her win. In another example, the sensory modules 110 may be colored light emitters. The first signal emitter 120 may broadcast a series of information sets that include instructions on the color and/or timing of different colors to emit from each sensory module 110. The second signal emitter 130 may then broadcast a “begin sequence” implementation instruction, and the sensory modules 110 will begin to emit light based on the instructions received from the first signal emitter. In this example, the sensory modules 110 may function as pixels in a large screen that is formed by all of the audience members with the transmitted sensory modules 110.
By broadcasting information based on the location of the device, such a large and synchronized screen may be formed regardless of the audience members switching seats or moving around the auditorium. When broadcasting information based on the identity of the device itself rather than the actual location of the device, an audience member carrying the device may move from one side of the auditorium to another, resulting in the “pixels” of the large screen being scattered in an unknown arrangement. The system 100 of this first exemplary usage scenario may also cooperate with an augmented reality device to provide an additional experience to a user viewing the plurality of sensory modules 110. For example, the light output of a particular sensory module 110 may be used as a fiducial marker for an augmented reality device. While a user without an augmented reality device may see a composite image from the light output of a plurality of sensory modules 110, a user watching the plurality of sensory modules 110 through an augmented reality device may see an image that is overlaid over the plurality of sensory modules 110 and is positioned based on the location of the light output of the particular sensory module 110. In this example, a user may experience a more detailed and/or embellished image through an augmented reality device based on the location of the fiducial marker relative to the user and/or the other sensory modules 110. The augmented reality device may also detect the state of the particular sensory module 110 and adjust the provided augmented reality experience based on the state of the particular sensory module 110. However, any other suitable use of the system 100 and method S100 of the preferred embodiments within an auditorium or stadium space may be used.
In a second exemplary usage scenario, as shown in
In a third exemplary usage scenario, as shown in
In a fourth exemplary usage scenario, as seen in
In a fifth exemplary usage scenario, upon check-in into a medical facility, the user may be distributed a sensory module 110 that has been loaded with an instruction set based on the destination of the user within the medical facility, for example, the radiology department. The sensory module 110 exhibits a light attribute that indicates to medical facility workers the destination of the patient, thus indicating to the medical facility workers whether the user is heading towards the correct location within the medical facility. For example, if the user misunderstood directions and, instead of going to the radiology department, mistakenly entered the maternity ward, a passing medical facility worker will quickly notice this mistake by looking at the sensory module 110 and can quickly direct the user to the correct location. Alternatively, the sensory module 110 may change light attributes based upon the path taken by the user. For example, second signal emitters 130 located throughout the medical facility may communicate with the sensory module 110 to determine the destination of the user and may instruct the sensory module 110 to emit a “wrong direction” indication if the user is on the wrong path. For example, if the patient is to go to the radiology department and is taking the correct path towards the radiology department, the sensory module 110 may emit a green light. If the patient takes a wrong turn or goes to the wrong floor of the building, the sensory module 110 may emit a red light, indicating to the user that the wrong path was taken and that he or she may backtrack or ask for assistance.
In a sixth exemplary usage scenario, because the first and second emitters 120 and 130 preferably provide a direction based signal that is received by the sensory modules 110, a sensory module 110 may detect attributes of the received signals from the first and/or second emitters 120 and 130 to determine where the sensory module 110 is located within the space. For example, if the detected signal from the first and/or second emitter 120 and 130 is of a lower magnitude and/or wavelength, the sensory module 110 may determine that the first and/or second emitter 120 and 130 is substantially far away. Similarly, if the sensory module 110 detects only a reflected signal with no direct signal, it may determine that the first and/or second emitter 120 and 130 is located where there is no direct line of sight to the sensory module 110. The receiver 112 of the sensory module 110 may also detect the direction from which the data signal from the first and/or second emitter 120 and 130 is received to determine the location of the first and/or second emitter 120 and 130 relative to the sensory module 110. For example, the receiver 112 may include a plurality of receiver modules: a first receiver module that receives signals from 0 to 30 degrees relative to the sensory module 110, a second receiver module that receives signals from 30 to 60 degrees relative to the sensory module 110, and so on, for a total of twelve receiver modules to allow receipt of signals at all 360 degrees relative to the sensory module 110. However, any other suitable arrangement of the receiver 112 that allows the sensory module 110 to determine where it is located within the space may be used.
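The sectored-receiver example maps directly to simple index arithmetic, sketched below under the assumption of twelve equal 30-degree sectors: the index of the receiver module that detects the signal gives the approximate bearing of the emitter relative to the sensory module.

```python
# Illustrative sketch of the sectored receiver 112: twelve receiver
# modules, each covering a 30-degree arc, so the detecting module's
# index yields the approximate bearing of the emitter.

SECTOR_WIDTH = 30                   # degrees per receiver module
NUM_SECTORS = 360 // SECTOR_WIDTH   # twelve receiver modules

def sector_for_bearing(bearing_deg):
    """Which receiver module (0-11) detects a signal from this bearing."""
    return int(bearing_deg % 360) // SECTOR_WIDTH

def bearing_range(sector_index):
    """Approximate bearing range covered by one receiver module."""
    start = sector_index * SECTOR_WIDTH
    return (start, start + SECTOR_WIDTH)

sector = sector_for_bearing(45)   # second module: covers 30-60 degrees
approx = bearing_range(sector)
```

Combining this coarse bearing with the magnitude-based distance estimate described above would let the sensory module 110 roughly place the emitter, and hence itself, within the space.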
The system 100 and method S100 for transmitting information within a space is preferably one of the variations described above, but may alternatively be a combination of the variations described above or any other suitable variation. The system 100 and method S100 may also be used to transmit any other suitable types of information in any other suitable types of usage scenarios.
As a person skilled in the art will recognize from the previous detailed description and from the figures and claims, modifications and changes can be made to the preferred embodiments of the invention without departing from the scope of this invention defined in the following claims.
This application claims the benefit of U.S. Provisional Application No. 61/332,665, filed on 7 May 2010 and titled “System and Method for Enhancing User Experience” and U.S. Provisional Application No. 61/422,818, filed on 14 Dec. 2010 and titled “System and Method for Transmitting Information,” which are both incorporated in their entirety by this reference.