This specification relates generally to controlling a device wirelessly.
Various systems are known for remotely controlling electronic devices. These include the transmission of infra-red or radio frequency signals, voice or other audio control, and even motion detection.
In one embodiment, a method comprises: determining a direction of gaze of a user; determining an orientation of a first device with respect to a second device based on at least one radio frequency packet passed wirelessly between the first and second devices using an array of antennas forming part of one of the devices; and determining if the direction of gaze and the orientation of the first device with respect to the second device adopt a predetermined relationship, for controlling a given operation.
The given operation may comprise an operation of the first device, and control signals may be sent for controlling the first device for performance of the given operation upon determination that the direction of gaze and the orientation of the first device with respect to the second device have adopted the predetermined relationship.
The predetermined relationship between the direction of gaze and the orientation of the first device with respect to the second device may include when the direction of gaze and the orientation of the first device with respect to the second device are in alignment, although other relationships may be used.
The determining of whether the direction of gaze and the orientation of the first device with respect to the second device adopt a predetermined relationship may be performed by means of a processor that may be included in the second device.
A gaze direction detector, such as a retina movement detector in eye tracking glasses, may be used to determine the direction of gaze of a user; the eye tracking glasses may comprise the second device.
An orientation detector located in the second device may be used to determine the orientation of the first device with respect to the second device.
Control signals for controlling operation of the first device may be transmitted in response to determining that the direction of gaze detected by the gaze direction detector and the orientation of the first device with respect to the second device determined by the orientation detector have adopted said predetermined relationship, for example are in alignment.
The method may include detecting a predetermined gesture made by a user, for causing control signals to be transmitted for the first device.
The second device may include said array of antennas to receive at least one radio frequency packet passed wirelessly thereto from the first device, and the method may include comparing signals received by the antennas of the array in response to said at least one radio frequency packet to determine the orientation of the first device with respect to the second device.
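By way of illustration, the comparison of signals received by the antennas of the array can be sketched as an angle-of-arrival estimate. This is a minimal sketch only, assuming a uniform linear array with half-wavelength element spacing and idealised per-antenna carrier phase measurements; the function name and parameters are illustrative and not part of the described embodiments.

```python
import numpy as np

def estimate_aoa(phases_rad, spacing_wavelengths=0.5):
    """Estimate the angle of arrival (degrees) from per-antenna carrier phases.

    For a uniform linear array, the phase difference between adjacent
    antennas is 2*pi*d*sin(theta), where d is the element spacing in
    wavelengths and theta is the angle of arrival.
    """
    # Mean phase step between adjacent antennas (unwrap to avoid 2*pi jumps)
    deltas = np.diff(np.unwrap(phases_rad))
    mean_delta = deltas.mean()
    sin_theta = mean_delta / (2 * np.pi * spacing_wavelengths)
    return float(np.degrees(np.arcsin(np.clip(sin_theta, -1.0, 1.0))))
```

In practice such an estimate would be formed while the direction estimation portion of a positioning packet is being received, and filtered over many packets.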
An embodiment of apparatus described herein comprises: at least one processor to receive gaze direction signals corresponding to a direction of gaze of a user, from a gaze direction detector; and orientation signals from an orientation detector operable to determine, based on at least one radio frequency packet passed wirelessly between first and second devices using an array of antennas forming part of one of the devices, an orientation of the first device with respect to the second device; the processor being operable in response to the gaze direction signals and the orientation signals, to determine if the direction of gaze detected by the gaze direction detector and the orientation of the first device with respect to the second device determined by the orientation detector adopt a given relationship, for controlling operation of the first device.
The processor may be included in the second device, which may also include the gaze direction detector. The second device may comprise eye tracking glasses including a detector for detecting retina movement, which may also include the orientation detector.
A transmitter may be provided coupled to the processor to transmit control signals for use in controlling the first device in response to the processor determining that the direction of gaze detected by the gaze direction detector and the orientation of the first device with respect to the second device determined by the orientation detector have adopted said given relationship, such as alignment thereof.
Also, the processor may be responsive to the gaze direction signals and/or the orientation signals to detect a predetermined gesture made by a user, for causing the transmitter to transmit control signals for the first device.
The second device may include the array of antennas to receive at least one radio frequency packet passed wirelessly thereto from the first device, and a comparator to compare signals received by the antennas of the array in response to said at least one radio frequency packet to determine the orientation of the first device with respect to the second device.
An embodiment may include at least one non-transitory computer readable memory medium having computer readable code stored therein, the computer readable code being configured to cause a processor to: determine a direction of gaze of a user; determine, based on at least one radio frequency packet passed wirelessly between first and second devices using an array of antennas forming part of one of the devices, an orientation of the first device with respect to the second device; and determine if the direction of gaze and the orientation of the first device with respect to the second device adopt a predetermined relationship, for controlling operation of the first device.
Also, an embodiment may include apparatus, comprising: means for receiving gaze direction signals corresponding to a direction of gaze of a user, from a gaze direction detector; means for receiving orientation signals from an orientation detector operable to determine, based on at least one radio frequency packet passed wirelessly between first and second devices using an array of antennas forming part of one of the devices, an orientation of the first device with respect to the second device; and means responsive to the gaze direction signals and the orientation signals, for determining if the direction of gaze detected by the gaze direction detector and the orientation of the first device with respect to the second device determined by the orientation detector adopt a given relationship, for controlling operation of the first device.
For a more complete understanding of example embodiments of the present invention, reference is now made to the following description taken in connection with the accompanying drawings in which:
Referring to
Referring to
The eye tracking glasses 5 include retina detectors 17, 18 which detect the user's eye movement. Also, the frame 12 of the glasses includes an array of antennas 19-1, 19-2, 19-3, 19-4 that detect signals transmitted by the device tags 7, 8, 9. The tag 7 is illustrated schematically by way of example in
The tag 7 may be configured to operate using any suitable type of wireless transmission/reception technology. Suitable types of technology include, but are not limited to, Bluetooth Basic Rate/Enhanced Data Rate (BR/EDR) and Bluetooth Low Energy (BTLE). Bluetooth Low Energy (BLE) is a wireless communication technology published by the Bluetooth SIG as a component of Bluetooth Core Specification Version 4.0. BLE is a lower power, lower complexity, and lower cost wireless communication protocol, designed for applications requiring lower data rates and shorter duty cycles. Inheriting the protocol stack and star topology of classical Bluetooth, BLE redefines the physical layer specification and introduces many new features such as a very-low-power idle mode, simple device discovery, and short data packets. Other types of suitable technology include WLAN and ZigBee. The use of BTLE may be particularly useful due to its relatively low energy consumption and because most mobile phones and other portable electronic devices are capable of communicating using BTLE technology.
The signals transmitted by the device tag 7 may be according to the Nokia High Accuracy Indoor Positioning (HAIP) solution for example as described at http://www.in-location-alliance.com.
The positioning packet 22 may also include a reference binary bit pattern field 24 which indicates a repeating bit pattern which, in this example, is “11110000” and which is transmitted in a direction estimation data field 25. The positioning packet 22 may also include a data and length field 26 that includes data such as the coding and length of the direction estimation field 25, together with other factors useful in enabling the controller 5 to determine the orientation of the tag 7. It will be understood that the pattern 24 of the signal can be used as an identity signal to individually identify each tag, such as tag 7.
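The fields just described can be represented schematically as follows. The exact wire format of the positioning packet is not specified here, so this Python sketch merely names the fields 24, 25 and 26; the class and attribute names are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class PositioningPacket:
    """Schematic view of the positioning packet fields described above.

    The wire layout is hypothetical; only the field roles are taken
    from the description (fields 24, 25 and 26).
    """
    reference_bit_pattern: str   # field 24, e.g. "11110000"
    direction_estimation: str    # field 25: the pattern repeated for AoA sampling
    length: int                  # from field 26: length of the estimation field

    def tag_identity(self) -> str:
        # The reference pattern doubles as an identity for the transmitting tag
        return self.reference_bit_pattern
```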
Referring again to
Also, referring to
Signals corresponding to the angle θ computed by the AoA estimator 28 together with gaze angle signals computed by the estimator 27 are fed to a processor 30 which has an associated memory 30a that stores computer program instructions for operating the device, including comparing the gaze angle α of the user with the angle of orientation θ for the device tag 7. The computer program instructions may provide the logic and routines that enable the device to perform the functionality described herein. The computer program instructions may be pre-programmed or they may arrive at the device via an electromagnetic carrier signal or be copied from a physical entity such as a computer program product, a non-volatile electronic memory device (e.g. flash memory) or a record medium such as a CD-ROM or DVD. They may for instance be downloaded to the device from a server.
The processor 30 may be configured to determine when the detected angle of orientation θ adopts a predetermined relationship with the gaze angle α, and in response provide control signals to allow one of the devices 2, 3, 4 to be controlled by the user.
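A minimal sketch of this predetermined-relationship test, assuming the relationship is alignment of the two angles to within a tolerance (the tolerance value is illustrative, not taken from the description):

```python
def gaze_aligned(gaze_deg, orientation_deg, tolerance_deg=10.0):
    """Return True when the gaze angle and the tag orientation angle
    agree to within tolerance_deg, handling wrap-around at 360 degrees.
    Exact alignment is not required in practice."""
    # Signed smallest difference between the two angles, in (-180, 180]
    diff = (gaze_deg - orientation_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= tolerance_deg
```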
In the example shown in
The wireless control can be carried out directly with individual devices as illustrated schematically in
Each of the devices 2, 3, 4 shown in
A schematic block diagram of major circuit components of mobile phone 32 is illustrated in
The controller 5 may be used to control the individual devices 2, 3, 4 directly over a Bluetooth link by transmitting command signals from Bluetooth transmitter 31 directly to the tags, or through the intermediary of the mobile phone 32. Various examples will now be described by way of illustration.
Considering the printer device 4 shown in
Bluetooth transceiver 31 of the controller 5. The process is illustrated schematically in
At step S7.1 the AoA signal from tag 9 is detected at the antenna array 19 of controller 5 and the angle θ of orientation is computed by the AoA estimator 28 as previously described.
Also, the retina detectors 17, 18 provide signals to gaze angle estimator 29, which computes the gaze angle α.
Processor 30 determines at step S7.2 whether the gaze angle α and orientation θ are in alignment i.e. whether the user 1 is both gazing at the printer and has his/her head pointing at the printer. The alignment of the gaze angle α and orientation θ is deemed to indicate that the printer 4 should be instructed to start printing and in response, the processor 30 sends a command signal to Bluetooth transmitter/receiver 31 which is communicated wirelessly over a Bluetooth link to the printer tag 9 to be received by the Bluetooth transmitter/receiver 33 and processor 35, which in turn commands the printer 4 to start printing, as shown at step S7.3.
Movement of the user's gaze away from the printer can be used as a command to stop the printer 4. As indicated at step S7.4, when the processor 30 detects that the gaze angle α and orientation θ move out of alignment, a stop print command is sent to Bluetooth transmitter 31, to be received by receiver 33, so that the processor 35 commands the printer to stop printing, as illustrated at step S7.5.
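The edge-triggered start/stop behaviour of steps S7.2 to S7.5 can be sketched as follows. This is a simplified illustration: the generator stands in for the decision loop of processor 30, and the yielded strings stand in for the Bluetooth commands sent to the printer tag.

```python
def printer_commands(alignment_samples):
    """Yield 'start' when gaze alignment with the printer begins (S7.2/S7.3)
    and 'stop' when it ends (S7.4/S7.5), given a stream of booleans
    indicating whether gaze angle and orientation are currently aligned."""
    aligned = False
    for now_aligned in alignment_samples:
        if now_aligned and not aligned:
            yield "start"       # alignment just began: command printing
        elif aligned and not now_aligned:
            yield "stop"        # gaze moved away: stop printing
        aligned = now_aligned
```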
In another example, the TV 3 shown in
At step S8.2, processor 30 determines whether the detected orientation θ is aligned with the gaze angle α computed by the gaze angle estimator 29. If so, the processor sends a start TV command to Bluetooth transmitter/receiver 31, which is wirelessly transmitted to tag 8 at step S8.3. This is received by the Bluetooth transmitter/receiver 33 of tag 8 and in response, the processor 35 commands the TV 3 to switch on.
Also, the user of controller 5 may use gestures such as head movement or gaze angle movement to perform additional commands for the TV 3 such as changing channel, increasing or decreasing volume and switching off. At step S8.4, the processor 30 detects a predetermined transitory change in relationship between the gaze angle α and orientation θ so as to detect the gesture. Additionally, the controller 5 may include a solid state gyro device 43 which may provide additional orientation signals to the processor 30 to assist in identifying the occurrence of a gesture.
When a gesture is detected at step S8.4, a further command is sent by processor 30 to the Bluetooth transmitter 31 to be received by receiver 33, so that the processor 35 can instruct the device 3 to carry out the additional command such as changing channel/volume/switching off, as illustrated at step S8.5.
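One simple way to detect such a predetermined transitory change in the relationship between the gaze angle α and orientation θ is sketched below. The swing threshold, return threshold and window length are illustrative assumptions, not values from the description.

```python
def detect_transitory_gesture(diff_history, swing_deg=20.0,
                              return_deg=5.0, max_window=6):
    """Detect a transitory excursion in a history of (alpha - theta)
    samples: the difference swings beyond swing_deg and returns to
    within return_deg of alignment within max_window samples."""
    for i, d in enumerate(diff_history):
        if abs(d) >= swing_deg:
            # Look for a return to alignment shortly after the swing
            for j in range(i + 1, min(i + 1 + max_window, len(diff_history))):
                if abs(diff_history[j]) < return_deg:
                    return True
    return False
```

Signals from the solid state gyro device 43 could be fused with this difference history to make gesture recognition more robust.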
In the foregoing examples, commands are wirelessly transmitted directly over a wireless link such as BTLE from the controller 5 to the controlled device. However, the commands may be transmitted through the intermediary of another device such as the mobile phone 32. For example, the controller 5 may cooperate with the mobile phone 32 to open and close a door lock 2 with a tag 7, such as a car or automobile door lock as illustrated in
The tag 7 may be positioned on the car so that the BTLE signals transmitted to and from the transmitter/receiver 33 are not screened significantly by the generally metallic body 43 of the car. For example, the tag 7 may be mounted in the side mirror 44, in or on the window frame 45, or in the door handle 46 of the car. Alternatively, the tag 7 may be situated inside the car further away from the lock 2, in which case the transmission power of the transmitter/receiver 33 is configured to be sufficiently high that the attenuation caused by the metal shield of the car does not degrade remote wireless operation of the lock. If the tag 7 is situated significantly away from the lock, the direction detection process performed by the processor 30 should take into account that the applicable angle towards the lock may be relatively wider when the user is close to the car than when the user is more distant from it.
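The widening of the applicable angle at close range can be illustrated geometrically: if the tag is offset from the lock by some distance, the angular offset as seen by the user is roughly atan(offset/distance), which grows as the user approaches. A sketch with illustrative parameter values:

```python
import math

def lock_angle_tolerance_deg(distance_m, offset_m=1.0, base_deg=5.0):
    """Illustrative acceptance angle for the lock when the tag sits
    offset_m away from it: a base tolerance plus the angle the offset
    subtends at the user's distance. All values are assumptions."""
    return base_deg + math.degrees(math.atan2(offset_m, distance_m))
```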
At step S10.1, signal 22 from lock 2 is detected by the controller 5. When the user 1 wishes to open the car door lock 2, he/she gazes at the door lock so that at step S10.2, the processor 30 detects that the orientation angle θ computed from the AoA signal from device tag 7, is in alignment with the gaze angle α. In response, at step S10.3 the processor 30 sends a command signal to Bluetooth transmitter/receiver 31, addressed to the Bluetooth transceiver 37 of mobile phone 32. The processor 39 of the mobile phone then provides to the user interface 42 an indication for user 1 that the lock is in a condition to be opened, and provides the user an opportunity to command the lock to be opened.
As illustrated at step S10.4, the user operates the user interface of phone 32, which sends an instruction to processor 39 that, in turn transmits a Bluetooth signal from transmitter 37 to the tag 7, commanding the door lock to be opened.
In a preparation step, not shown in
At step S10.6, the processor 39 of the phone 32 determines whether the phone 32 has been authenticated to command operation of the lock 2, for example by the Bluetooth pairing as just described, or using additional authentication in an initial set up procedure requiring additional authentication and/or encryption initialisation. If it is determined that the phone 32 is authorised to command operation of the lock 2, a command is sent from the phone 32 over the Bluetooth link established with the car lock 2 to open the lock as shown at step S10.8. If, however, the phone 32 is found at step S10.6 not to be authenticated to operate the lock 2, an error message is displayed on the phone's user interface 42 as shown at step S10.7.
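The authentication branch of steps S10.6 to S10.8 can be sketched as follows; the callables stand in for the phone's Bluetooth transmitter and user interface and are hypothetical placeholders.

```python
def handle_unlock_request(phone_authenticated, send_unlock, show_error):
    """Step S10.6: only a phone authenticated (e.g. via Bluetooth pairing)
    may command the lock.  send_unlock and show_error are placeholders
    for the phone's transmitter and user interface, respectively."""
    if phone_authenticated:
        send_unlock()   # S10.8: command sent over the Bluetooth link to the lock
        return "unlock_sent"
    # S10.7: authentication failed, report on the user interface
    show_error("Phone not authorised to operate this lock")
    return "error"
```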
It will be appreciated that a similar process can be used to lock the car door. The phone 32 may provide enhanced encryption and other security controls for the transmissions to the tag 7 to ensure that only authorised persons may operate the lock 2 via the intermediary of the phone 32.
Many modifications and variations of the described systems are possible. For example, the lenses 10, 11 of the glasses 5 may form part of an augmented reality (AR) display and, referring to
Also, the detection of the AoA/AoD signals from respective device tags need not necessarily be performed at the glasses which comprise the controller 5 but could be carried out at a different location, for example at the mobile phone 32. In some embodiments, the antenna array 19 may be provided at the mobile phone 32 along with the processing circuitry 26, 27, 28, although in one embodiment, the antenna array is provided on the glasses as shown in
In another embodiment, the remote device such as phone 32 provides command signals to the controller 5, for example to control the AR source and display 44. For example in the process shown in
Also, in the described examples, the detected predetermined relationship between the orientation angle θ and the gaze angle α occurs when they are in alignment. However, this need not mean exact alignment; the predetermined relationship may include a range of angles around exact alignment, suitable for indicating that the user is both oriented and gazing in generally the same direction. Also, the system may be configured to determine when a selected misalignment of the orientation angle θ and the gaze angle α occurs.
In the foregoing, it will be understood that the processors 30, 35, 39 may be any type of processing circuitry. For example, the processing circuitry may be a programmable processor that interprets computer program instructions and processes data. The processing circuitry may include plural programmable processors. Alternatively, the processing circuitry may be, for example, programmable hardware with embedded firmware. The or each processing circuitry or processor may be termed processing means.
The term ‘memory’ when used in this specification is intended to relate primarily to memory comprising both non-volatile memory and volatile memory unless the context implies otherwise, although the term may also cover one or more volatile memories only, one or more non-volatile memories only, or one or more volatile memories and one or more non-volatile memories. Examples of volatile memory include RAM, DRAM, SDRAM etc. Examples of non-volatile memory include ROM, PROM, EEPROM, flash memory, optical storage, magnetic storage, etc.
Reference to “computer-readable storage medium”, “computer program product”, “tangibly embodied computer program” etc., or a “processor” or “processing circuit” etc. should be understood to encompass not only computers having differing architectures such as single/multi processor architectures and sequencers/parallel architectures, but also specialised circuits such as field programmable gate arrays (FPGA), application specific integrated circuits (ASIC), signal processing devices and other devices. References to computer program, instructions, code etc. should be understood to encompass software for a programmable processor, or firmware such as the programmable content of a hardware device, whether instructions for a processor or configuration settings for a fixed function device, gate array, programmable logic device, etc.
It should be realised that the foregoing embodiments are not to be construed as limiting and that other variations and modifications will be evident to those skilled in the art. Moreover, the disclosure of the present application should be understood to include any novel features or any novel combination of features either explicitly or implicitly disclosed herein or in any generalisation thereof and during prosecution of the present application or of any application derived therefrom, new claims may be formulated to cover any such features and/or combination of such features.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/FI2014/050567 | 7/9/2014 | WO | 00 |