Mobile devices such as cell phones can provide a user with an augmented reality experience. A camera in the cell phone can provide a view of the physical environment on a display device, such as an LCD screen, and augmented reality supplements that view with information about the physical environment. For example, textual descriptions can be overlaid on the view of the physical environment in order to provide the user with more information about the physical environment, such as retail establishments within the vicinity of the user. Augmented reality is thus helpful for providing users with more information via their mobile devices. However, augmented reality tends to provide only static information. Accordingly, a need exists for using augmented reality as a control interface through which a user can interact with the physical world.
A mobile device having a user interface for use in controlling lighting, consistent with the present invention, includes a camera, a display device, and a component for transmitting a control signal. A processor within the mobile device is configured to detect via the camera a light module and display an identification of the light module on the display device. The processor is further configured to receive a command relating to control of the light module and transmit a signal to the light module via the component. The signal provides the command to the light module for controlling operation of the light module.
A method for controlling lighting via a user interface on a mobile device having a camera and a display device, consistent with the present invention, includes detecting via the camera a light module and displaying an identification of the light module on the display device. The method further includes receiving a command relating to control of the light module and transmitting a signal to the light module, where the signal provides the command to the light module for controlling operation of the light module.
The accompanying drawings are incorporated in and constitute a part of this specification and, together with the description, explain the advantages and principles of the invention. In the drawings,
Embodiments of the present invention include an augmented reality (AR) graphical user interface, designed for touch screens on mobile devices equipped with cameras, specific to the control of network accessible light modules. The user interface graphics can be static or animated highlights that identify the light modules within the camera frame. Each highlight is interactive and may be activated to display information or controls for the light module. A user can enter commands via the AR user interface to control the operation of the light modules. In response, the mobile device wirelessly transmits control signals for the commands to the light module.
The WiFi method includes mobile device 44 sending a message to all the light modules (e.g., modules 22, 26, 28) or a subset of them to blink (step 64). In this WiFi method, light modules can be discovered and registered with the network at installation. Light module addresses are dynamically generated within a range of addresses. Discovery of the light modules can be accomplished by a server sending multicast messages, such as user datagram protocol (UDP) messages, which should be received by all of the light modules in the network. Each light module sends a response back to the server that includes its identification, typically a name. The server has then identified the names of all of the modules on its network.
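The multicast discovery step above can be sketched as follows. The multicast group, port, message contents, and reply format used here are illustrative assumptions; the specification does not fix particular values, and an actual deployment would define its own.

```python
import socket

# Hypothetical multicast group and port for light-module discovery;
# actual values are deployment-specific assumptions.
DISCOVERY_GROUP = "239.255.0.1"
DISCOVERY_PORT = 9999

def parse_announcement(data: bytes) -> str:
    """A module's discovery reply is assumed to carry its network name as UTF-8 text."""
    return data.decode("utf-8").strip()

def discover_light_modules(timeout: float = 2.0) -> dict:
    """Multicast a UDP discovery message and collect {address: module name} replies
    until no further responses arrive within the timeout."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.sendto(b"DISCOVER", (DISCOVERY_GROUP, DISCOVERY_PORT))
        modules = {}
        while True:
            try:
                data, addr = sock.recvfrom(1024)
            except socket.timeout:
                break  # no more replies; discovery round is complete
            modules[addr[0]] = parse_announcement(data)
        return modules
    finally:
        sock.close()
```

After this round completes, the server holds a mapping from each module's network address to its name, which the later blink-identification steps can reference.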
Mobile device 44 determines whether it can confirm a communications channel with the light modules (step 66). In particular, when the augmented reality application is launched on mobile device 44, a message is sent to the light modules in the network to start blinking. The blink is represented as a binary sequence encoded to correspond to the name of the associated light module on the network. Due to the relatively slow frame rate of a mobile device camera, it is preferred to use a fourth LED channel to send non-visible data from the light modules to the camera via infrared (IR) light. This method will provide a visible light communication (VLC) channel to the camera but will be invisible to people. If a mobile device has an IR filter, blinking a single visible color is an alternative. Blinking with a color in the visible spectrum will allow light modules to be identified and data to be transferred using both color and binary encoding to increase data rates. At this step, either all light modules start blinking (simultaneously) or each module signals one at a time (sequentially) until the mobile device confirms the communication channel.
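The mapping between a module's network name and its blink sequence can be sketched as a straightforward binary encoding. The specification does not fix the exact scheme, so the following assumes one bit per blink interval (1 = LED on, 0 = LED off) and 8-bit ASCII characters:

```python
def name_to_blink_bits(name: str) -> list:
    """Encode a module name as a binary blink sequence, most significant bit first.
    Each bit drives one blink interval: 1 = LED on, 0 = LED off."""
    bits = []
    for byte in name.encode("ascii"):
        for i in range(7, -1, -1):
            bits.append((byte >> i) & 1)
    return bits

def blink_bits_to_name(bits: list) -> str:
    """Decode a captured blink sequence back into the module's name."""
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return out.decode("ascii")
```

With a visible color or a second IR channel available, each blink interval could carry more than one bit (e.g., one of four colors encoding two bits), which is the data-rate increase the passage above refers to.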
The light communications method includes using the camera view mode of mobile device 44 to find a light module to be controlled (step 68). A camera flash on mobile device 44 is used to send binary encoded signals to the light module (e.g., module 14) in view (step 70). The flash on the camera is enabled by the augmented reality application to send binary encoded signals to the light module, which are detectable by the light module ambient sensor implemented as an input device 38. The ambient sensor is used to determine the relative distance of each light module, since modules that are farther away will receive less light, thereby determining which light module should respond.
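The distance-based selection described above can be sketched as a comparison of ambient-sensor readings: the module that registered the strongest flash is assumed to be the one the user is nearest to and aiming at. The reading scale and the `select_responder` helper are illustrative assumptions, not part of the specification.

```python
def select_responder(readings: dict) -> str:
    """Given {module name: ambient-sensor reading of the flash} (higher = more
    light received), pick the module that should respond. Modules farther away
    receive less of the flash, so the maximum reading marks the targeted module."""
    return max(readings, key=readings.get)
```

For example, `select_responder({"module-14": 0.9, "module-22": 0.2})` selects `"module-14"` as the module in view. A practical implementation would also apply a minimum threshold so that modules entirely out of view do not respond at all.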
Mobile device 44 determines if the light module received the information (step 72) by detecting an acknowledgement reply from the module (step 74). The light module responds by sending an acknowledgement via VLC or IR back to mobile device 44, followed by an identifier or data. This blinking can be accomplished as in the WiFi method using binary encoded signals.
Once a light module is identified, mobile device 44 performs image processing to decode the blink pattern from the module (step 76). Through the use of an image processing algorithm to decode the blink sequences, the mobile device can interpret which light modules are in its view. With multiple light modules in-frame, the image processing is capable of dividing the image on a frame by frame basis to track and maintain the data stream from each light module.
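The frame-by-frame tracking described above can be sketched as accumulating one bit per tracked image region per camera frame. The brightness scale (0-255) and the one-bit-per-frame sampling are simplifying assumptions; a real decoder would also handle clock recovery between the module's blink rate and the camera's frame rate.

```python
def update_streams(streams: dict, frame_regions: dict, threshold: int = 128) -> dict:
    """Append one bit to each region's data stream for the current camera frame.
    frame_regions maps a region id (one per in-frame light module) to its
    average brightness (0-255); brightness above the threshold reads as a 1."""
    for region_id, brightness in frame_regions.items():
        streams.setdefault(region_id, []).append(1 if brightness > threshold else 0)
    return streams
```

Calling this once per frame maintains an independent bit stream for each light module in view, which can then be decoded against the known blink encodings.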
An exemplary image processing algorithm for step 76 involves the use of color thresholding in which the image captured by the camera in mobile device 44 is divided into smaller groups or clusters of pixels. These groups of pixels can correspond with the light sources the user is trying to control through the AR application. Each group of pixels in the image is assigned an overall color, for example an average or dominant color of the group. The assigned color for the group is then compared to a color map in order to correlate the assigned color with a known color. Segmenting the image according to color in this manner essentially enhances the contrast of the scene in order for the image processing algorithm to more easily differentiate features in the scene. Other image processing algorithms for implementing step 76 are possible.
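The color-thresholding steps above (group pixels, assign each group an overall color, compare against a color map) can be sketched as follows. The palette in `COLOR_MAP` and the use of squared Euclidean distance in RGB space are illustrative assumptions; a real color map would match the LED colors actually in use.

```python
# Illustrative reference palette; an assumption, not part of the specification.
COLOR_MAP = {
    "red": (255, 0, 0),
    "green": (0, 255, 0),
    "blue": (0, 0, 255),
    "white": (255, 255, 255),
}

def average_color(pixel_group):
    """Assign a group of (R, G, B) pixels its overall color as a channel-wise average."""
    n = len(pixel_group)
    return tuple(sum(p[c] for p in pixel_group) / n for c in range(3))

def nearest_known_color(rgb):
    """Correlate an assigned color with the closest entry in the color map,
    using squared Euclidean distance in RGB space."""
    return min(COLOR_MAP,
               key=lambda name: sum((a - b) ** 2 for a, b in zip(rgb, COLOR_MAP[name])))
```

For example, a pixel group averaging to a near-red value maps to `"red"`, so the decoder can treat every reddish cluster in the frame uniformly, which is the contrast-enhancement effect the passage describes.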
On the user interface in display device 46 of mobile device 44 the scene displayed is augmented with touch enabled control for the light modules (step 78).
Augmentation of the scene can involve adding touch-enabled controls over each light module icon or other identifier on the user interface. The mobile device also commands the light module to stop blinking at this point, in the case where visible spectrum colors are used for the blinking. If IR light is used, the blinking patterns may continue to provide a continuous means of data transfer and tracking in the augmented reality frame.
If mobile device 44 receives a command selected by a user (step 80), mobile device 44 sends the command to the selected light module for use in controlling operation of the light module (step 82). The command can be sent using a binary encoded data signal with visible spectrum colors or IR light from mobile device 44. The light module can optionally confirm receipt of the command. If no commands are selected (step 80), mobile device 44 exits the AR application (step 84).
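The binary encoded command transmission in step 82 can be sketched as a small framed encoding with a checksum, which also gives the light module a basis for the optional receipt confirmation. The command set, start byte, and XOR checksum here are illustrative assumptions; the specification does not define a frame format.

```python
# Hypothetical command codes; the actual command set is device-specific.
COMMANDS = {"on": 0x01, "off": 0x02, "dim": 0x03, "brighten": 0x04}

def encode_command(command: str) -> list:
    """Build a bit sequence for the flash or IR emitter: a start byte (0x7E),
    the command code, and an XOR checksum, sent most significant bit first."""
    code = COMMANDS[command]
    frame = [0x7E, code, 0x7E ^ code]
    bits = []
    for byte in frame:
        for i in range(7, -1, -1):
            bits.append((byte >> i) & 1)
    return bits

def decode_command(bits: list) -> str:
    """Reassemble the frame on the light-module side and verify the checksum
    before acting on (or acknowledging) the command."""
    frame = []
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        frame.append(byte)
    start, code, checksum = frame
    assert start == 0x7E and checksum == (0x7E ^ code), "corrupt frame"
    return {v: k for k, v in COMMANDS.items()}[code]
```

A module that decodes a valid frame can blink back an acknowledgement as in the earlier steps; a corrupt frame is simply ignored, prompting the mobile device to retransmit.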
The user interfaces shown in