This invention relates to visible light communication systems and, more particularly, to using remote control devices with cameras to communicate with controlled devices using visible light.
Remote controllers have become ubiquitous for controlling all sorts of electronic components, including TVs, sound systems, ceiling fans, projectors, computers, thermostats, lighting systems, etc. However, each remote controller is typically unique to its associated electronic device. Universal remote controllers have had some success combining functions in one handheld device; however, they are expensive, cumbersome, and subject to being lost.
Remote control can be useful for a number of different reasons. The most common advantage is being able to change channels on a TV set without having to move from the comfort of a chair or couch. Another advantage is being able to control a device, such as a light or ceiling fan, that may be attached to a high ceiling and out of reach. Another advantage is that having a remote controller can reduce the size and complexity of the device being controlled. For instance, the remote controller can have multiple buttons and indicators, or even a touch screen and computer, for controlling the device, so that the controlled device need only be configured to send and receive wireless data.
Remote controllers can communicate using a variety of wireless communication protocols, but typically use infrared light or radio frequency (RF) electromagnetic radiation based physical layers. For example, many commercial and residential lighting systems have remote controllers that enable lights to be controlled locally. However, these remote controllers typically use infrared physical layers, which require the light fixtures to have infrared transceivers. This inevitably increases the cost of the lighting control system.
Some lighting control systems have been introduced that use visible light communication (VLC) to communicate optical data to light fixtures or other devices. Lighting control systems that use VLC have many advantages over conventional lighting control systems that use infrared remote controllers and transceivers or employ other types of wired and wireless communication protocols. One big advantage is that the visible light spectrum is currently globally unregulated and does not suffer from the congestion and interference common in RF-based communication systems. Another advantage is cost savings.
For example, light fixtures controlled by visible light do not use infrared transceivers for communicating with infrared remote controllers, and thus, do not incur the costs associated with conventional lighting control systems. In addition, a lighting control system employing visible light communication may transmit optical data using the same light fixture used to provide illumination, e.g., in a room of a building. In the case that the light fixture comprises one or more LEDs, the LEDs can also be used as light detectors for receiving optically transmitted data. Finally, no dedicated wires are needed in a system that transmits optical data using visible light traveling through free space. This is especially important for installation of lighting control systems in existing buildings. These advantages, and possibly others, enable the visible light communication protocol to be implemented within a lighting control system for very little cost.
A limitation of the visible light communication protocol, and specifically, a protocol that transmits optical data using visible light traveling through free space, is that it is typically limited in communication range and generally restricted to line of sight. In other words, optical data transmitted through free space typically cannot be communicated around corners or through walls between various rooms in a building, or between light fixtures and other devices that are too far away and outside of the optical communication range of the remote control device. Therefore, a need exists for a system and method that can extend the communication range in a visible light communication system.
A system, remote control device, and method are provided herein for communicating with and controlling various devices using visible light communication (VLC). According to one embodiment, the system and method described herein may use a smart phone, or other device, as a remote control device to communicate with and control various other devices using optically-modulated data transmitted through free space using visible light.
As used herein, the term “visible light” refers to a portion of the electromagnetic spectrum that is visible to (can be detected by) the human eye, and typically includes wavelengths from about 390 nm to about 700 nm. Electromagnetic radiation in this range of wavelengths is called visible light or simply light. In the system described herein, only visible light sources are used to transmit optical data.
As used herein, the term “free space” refers to communication through open space rather than through a confined medium such as an optical fiber. Data is transferred optically, but it is not constrained within an optical fiber or any other type of waveguide device; instead, the light is free to travel in any non-obstructed direction.
As used herein, the term “smart phone” refers to any mobile phone that uses RF transceivers to communicate bi-directionally according to well-known cellular protocols. In addition to RF transceivers, a “smart phone” as used herein may generally include a memory for storing one or more software applications (typically referred to as smart phone applications or “apps”), a processor for executing the one or more software applications, and a display screen for displaying a graphical user interface associated with the software applications. The display screen may be a touch screen display comprising a backlight for illuminating the touch screen display and a touch sensitive surface for receiving user input. In some cases, the smart phone may also include a camera and a camera flash.
In some embodiments, a smart phone configured for visible light communication may use the camera flash as a light source and optical transmitter, and the camera image sensor as an optical receiver. However, the remote control device described herein is not limited to a smart phone, and the light source does not have to be a flash. In alternative embodiments, a laptop computer, a desktop computer, a hand-held device, or a wall-mounted unit could operate as a VLC remote control device, for example, by modulating the backlight of a display screen (or other light source) for the purpose of transmitting optical data to one or more controlled devices.
In some embodiments, the remote control device and the controlled devices may each include one or more light-emitting diodes (LEDs). LEDs are desirable as they can be alternately configured to receive light and to emit light. In addition, LEDs may be configured for transmitting optical data by modulating the drive current supplied to an LED to produce light modulated with data.
In one exemplary application, the remote control device described herein may be used to communicate with light fixtures and lighting systems for controlling such lights, and to enable the lighting system to provide a communication channel between the remote control device and various remotely controlled devices. Although described here in the context of a lighting control system, the system, remote control device, and method provided herein are not limited to such and may be configured for controlling a wide variety of controlled devices. For example, the devices being controlled may be a light fixture, a thermostat, a security system component, a TV, a ceiling fan, a home appliance, or other type of remotely controlled electronic device.
According to one embodiment, a remote control device (e.g., a smart phone) may transmit data to a controlled device by modulating the light produced by the LED flash of a camera included within the remote control device. The device being controlled (e.g., a light fixture, a thermostat, or other type of remotely controlled device) may configure its LED as a photo-detector to receive the data sent optically through free space by the LED flash on the remote control device. Depending on the message, the controlled device can respond by driving its LED with high current modulated with data, producing data-modulated light. The remote control device may then detect this light, and recover the modulated data, using an image sensor in the camera, for example.
In order to receive optically-transmitted data from the remote control device, the LED(s) of the controlled device(s) may be periodically and momentarily turned off to measure incoming light from the remote control device. This may be done at a relatively high rate (e.g., 360 Hz) that cannot be perceived by the human eye and is preferably higher than the modulation rate of the light produced by the camera flash when transmitting data. In some embodiments, the LED lights of the controlled devices may be synchronized to a common source (e.g., an AC mains), and thus to each other, so that they all turn off at the same time, and consequently do not interfere with data communication between each other, the remote control device, or any other remotely controlled devices. The times when the LEDs are not producing light are referred to herein as communication gaps.
According to one embodiment, a controlled device can measure the incoming light level during each communication gap, preferably using the same LEDs used for emission, and produce a stream of data proportional to the incoming light level. From such data streams, the controlled device may identify light level changes, which correspond to modulated data transitions. Circuitry in the controlled device may decode the data to determine the message being sent by the remote control device. Depending on the message, each controlled device can respond to the remote control device by modulating the brightness and/or color of the light being produced by its LED. Image processing software stored within and executed by the remote control device may then be used to determine the location of each controlled device and the data being communicated.
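For illustration purposes only, the following sketch shows one way such gap-by-gap light measurements could be reduced to a bit stream and a list of data transitions in software. The sample values, the adaptive threshold, and the function names are assumptions and are not drawn from the specification, which leaves the decoding circuitry open.

```python
# A minimal sketch, not the specification's circuitry, of turning per-gap
# light measurements into a bit stream and a list of transitions.

def samples_to_bits(gap_samples):
    """Convert one light-level sample per communication gap into bits."""
    if not gap_samples:
        return []
    lo, hi = min(gap_samples), max(gap_samples)
    threshold = (lo + hi) / 2.0          # decision level from observed swing
    return [1 if s > threshold else 0 for s in gap_samples]

def find_transitions(bits):
    """Indices where the received level changes, i.e., modulated data edges."""
    return [i for i in range(1, len(bits)) if bits[i] != bits[i - 1]]

# Example: light measured during eight consecutive communication gaps.
samples = [0.12, 0.11, 0.83, 0.85, 0.14, 0.80, 0.82, 0.13]
bits = samples_to_bits(samples)   # -> [0, 0, 1, 1, 0, 1, 1, 0]
edges = find_transitions(bits)    # -> [2, 4, 5, 7]
```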
After transmitting a message using the LED camera flash, the remote control device may turn on the camera's image sensor and record a video for an amount of time anticipated for a response from the device or devices being controlled. Software within the remote control device may then analyze the video to determine the location of the device or devices responding with light and the data being communicated. For example, the software may detect optically-transmitted data by scanning one or more frames of video data for the brightness and potentially color of the light emitted from the LED of a controlled device. By analyzing successive video frames, the software can determine when the brightness and/or color of the light produced by a particular controlled device changes, indicating transitions in the modulated data being communicated from the controlled device to the remote control device. To ensure accurate detection, the transitions in the modulated data should occur at a rate slower than the frame rate of the camera.
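The following is a minimal sketch of such frame-by-frame analysis, assuming the video frames are available as two-dimensional numpy arrays of pixel brightness and that each controlled device's LED remains at roughly the same pixel location across frames; the helper names are hypothetical.

```python
import numpy as np

def brightness_at(frames, row, col, size=3):
    """Mean brightness of a small window around (row, col) in each frame."""
    half = size // 2
    return [float(f[row - half:row + half + 1,
                    col - half:col + half + 1].mean()) for f in frames]

def frames_to_bits(frames, row, col):
    """Recover the slow on/off modulation of one light from successive frames.

    The modulation rate is assumed to be slower than the camera frame rate,
    so each transmitted bit spans at least one whole video frame.
    """
    levels = brightness_at(frames, row, col)
    threshold = (min(levels) + max(levels)) / 2.0
    return [1 if lvl > threshold else 0 for lvl in levels]
```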
Although an LED or photodiode could be used for such purpose, the remote control device described herein preferably uses a camera to receive light emitted from the controlled devices. This is because cameras have two-dimensional image sensors, which typically have millions of pixels that can be used to detect light of different wavelengths that fall within the camera's field of view. By utilizing an image sensor, instead of a discrete LED or photodiode, the image processing software within the remote control device can identify the location of many light sources within the camera's field of view, and track the changes in brightness and/or color over time. This enables the remote control device to receive data or messages from many controlled devices simultaneously.
The method described so far enables a remote control device to broadcast messages to all controlled devices located within a given communication range, and for all such devices within the field of view of the camera to communicate back to the remote control device. As set forth in more detail below, the system and method described herein also enables a remote control device to communicate bi-directionally with an individually addressed device.
According to one embodiment, a system and method are provided herein for establishing a bi-directional communication link between a remote control device and a selected one of the controlled devices. In some cases, the remote control device may initiate communication by broadcasting a message to all controlled devices located within a communication range of the remote control device, wherein such broadcast message includes a request to respond with a random number or unique pre-programmed ID. The application software on the remote control device can then produce an image on the display screen showing all the devices detected within the camera's field of view, and identify each device that responded to the broadcast message. The user of the remote control device can then touch one of the devices displayed in the image to select that device for subsequent communication. A bi-directional communication link between the remote control device and the selected device is thereafter established by using the random number or unique ID of the selected device as an address in subsequent communication messages.
As an example, suppose there are 100 lights (controlled devices) in the high ceiling of an auditorium. A user may position the camera of the remote control device to take a picture of the 100 lights, or some portion thereof, and push a button provided by the graphical user interface (GUI) on the touch screen of the remote control device. The remote control device may then modulate light from the camera flash, for example, to send a broadcast message to all lights that can receive the message requesting those lights to respond with a random number or unique ID. The remote control device may then start recording a video. All lights that received the broadcast message may respond with the random number or unique ID by modulating each light's output brightness (or color) with data. The application software within the remote control device may then analyze the video, identify the location (set of pixels) of each light within each video frame, determine the brightness (and/or color) of each light as a function of time, and determine the data being sent from each light from changes in the brightness (and/or color) of each light over time.
From the data that the remote control device decodes from each light, the application software can make sure that all random numbers or IDs received are unique, and can display one frame of the video on the touch screen display designating each light with a unique number and a circle, for instance. In some cases, the graphical user interface may allow the user to zoom in on a region of the image containing the light to be adjusted, and select that light with a double touch, for instance. A menu could then pop up that provides buttons, sliders, etc., that enable the user to configure or monitor various parameters, functions, etc., of the selected light. Such parameters could include color and brightness, or information such as diagnostics, power consumption, or status could be read. For example, the camera on the remote control device could be used to measure the color point of the light, and to provide feedback to the light to adjust the output color to a particular color point or correlated color temperature (CCT).
Once the user selects something from the graphical user interface by selecting a touch screen button, for instance, a message may be sent to the selected light using the random number or unique ID in the address field of the message transmitted by the camera flash. All lights within range will receive the message; however, only the selected light with that random number or ID will respond. As such, bi-directional communication with individual devices can be achieved.
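A minimal sketch of this addressed filtering, as it might run on each controlled device, is shown below; the dictionary-style message fields are an illustrative assumption, and the 0xFF broadcast code simply mirrors the address-field example given later in this description.

```python
BROADCAST = 0xFF

def should_respond(message, my_id):
    """Return True if this controlled device is a target of the message."""
    addr = message["address"]
    return addr == BROADCAST or addr == my_id

msg = {"address": 0x2A, "command": "SET_BRIGHTNESS", "data": 128}
should_respond(msg, my_id=0x2A)   # True  -> the selected light responds
should_respond(msg, my_id=0x3B)   # False -> every other light stays silent
```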
According to another embodiment, a system and method are provided herein for increasing the optical power and extending the communication range of optical data transmitted in a visible light communication system. This embodiment uses the controlled devices themselves (e.g., the multitude of lights in an auditorium) to amplify the optical power of the messages transmitted by the remote control device, or messages transmitted between controlled devices, to extend the communication range of the visible light communication system.
As used herein, a “communication range” refers to an optical communication range, and specifically, a range of distances within which a receiving device can receive optical data transmitted from a sending device through free space using visible light. The communication range extends from the sending device up to a maximum distance at which a receiving device can receive optical data from the sending device. A receiving device located beyond the maximum distance is said to be located outside of the communication range of the sending device.
In a visible light communication system, the maximum distance may be generally determined by the brightness and directionality of the visible light emitted by the sending device, as well as the light detection sensitivity of the receiving device. In some cases, however, the maximum distance may be affected by obstacles in the communication path (such as walls or other optically-dense structures in a room), or deviations from a straight-line communication path (such as when light is to be transmitted around corners). The embodiment described herein increases the communication range of a visible light communication system by using one or more of the controlled devices to retransmit the communication messages received by the controlled devices. Controlled devices may receive communication messages from the remote control device, and/or retransmitted communication messages from other controlled devices. In addition to amplifying the optical power of the communication messages transmitted by the remote control device, the retransmission of communication messages may enable controlled devices outside of the communication range of the remote control device to receive retransmitted messages from controlled devices located within range of the remote control device.
In some cases, the communication range of the visible light communication system may be extended by using one or more controlled devices to amplify a sequence of communication messages transmitted by a remote control device. In general, the remote control device may be configured to send two (or more) communication messages sequentially with a fixed timing between each message. In one example, a first communication message and a second communication message may be sent by the remote control device, wherein the second communication message is substantially identical to the first communication message.
The first message could be broadcast to all controlled devices within range of the remote control device, groupcast to a set of controlled devices, or unicast to an individual controlled device. The first message may include a plurality of data fields, one of which contains information that the first message should be amplified or repeated by the controlled devices that receive the first message. The controlled devices that properly receive the first message may adjust their bit timing to that of the received first message and retransmit the first message in synchronization with a second message being sent from the remote control device. Since the second message is substantially identical to the first message, retransmitting the first message in synchronization with the second message effectively increases the optical power of the second message sent by the remote control device. By utilizing the optical power of the controlled devices to amplify the second message sent by the remote control device, the optical power and communication range of the second message can be orders of magnitude larger than simply relying on the optical power of the remote control device alone.
In other cases, the communication range may be extended by using one or more controlled devices to retransmit a communication message sent from the remote control device, wherein the communication message is retransmitted by the controlled devices a specified number (N) of times. As in the previous case, the communication message may include a plurality of data fields, one of which contains information that the communication message should be retransmitted by the controlled devices that receive the communication message. In this example, the communication message may contain a repeat field that specifies a number (N) of times the communication message should be retransmitted by the controlled devices. In general, the number “N” may be substantially any number greater than or equal to one. Controlled devices that receive the communication message from the remote control device decrement the number (e.g., N−1) in the repeat field of the received message, and retransmit the communication message with the decremented number in the repeat field of the retransmitted message.
In this case, the remote control device may be configured for sending only one communication message (i.e., a first communication message) to nearby controlled devices within range of the remote control device. After decrementing the number (N) in the repeat field of the received message, the nearby controlled devices may retransmit the communication message to controlled devices outside of the communication range of the remote control device. In this manner, the communication range may be extended by using the controlled devices to relay messages to other devices.
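A minimal sketch of the decrement-and-retransmit rule is shown below, under the assumption of a dictionary-style message and a generic transmit routine; neither reflects the actual circuitry or firmware of a controlled device.

```python
def handle_repeat(message, transmit):
    """Retransmit `message` with a decremented repeat field, if requested.

    `transmit` stands in for whatever routine modulates the device's LED
    with the message bits; it is passed in here purely for illustration.
    """
    n = message.get("repeat", 0)
    if n >= 1:
        relayed = dict(message)
        relayed["repeat"] = n - 1
        transmit(relayed)

# Example: a message requesting two retransmissions ripples outward,
# reaching devices beyond the sender's own optical range.
handle_repeat({"repeat": 2, "address": 0xFF, "command": "PING"},
              transmit=lambda m: print("retransmit:", m))
```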
Unlike the previous case, the controlled devices do not synchronize retransmissions to the bit timing of the received first message. Instead, the controlled devices may be synchronized to a common timing reference, and thus, may communicate in synchronization with each other. By synchronizing the timing of the controlled devices to a common timing reference, the controlled devices can receive and retransmit communication messages to other controlled devices in synchronization with each other.
In one embodiment, the controlled devices may be coupled to a common power source, such as the AC mains of a building, and may be synchronized to a common timing reference generated from the AC power source. For example, the controlled devices may each comprise a phase-locked loop (PLL) configured to generate the common timing reference by locking onto an AC signal provided by the AC power source.
Regardless of how the retransmitted messages are synchronized, the retransmit command may be used to communicate with a device to be controlled when that device is very far away, but within line of sight of the remote control device. In this case, the light source (e.g., the camera flash) on the remote control device may not have sufficient optical power to send messages to a distant controlled device. In order to reach the distant controlled device, the remote control device may send one or more communication messages comprising retransmit commands to nearby controlled devices, possibly at the user's discretion. The nearby controlled devices receive the first communication message and retransmit the first communication message to other controlled devices. In some cases, the first communication message may be retransmitted in synchronization with a second message transmitted from the remote control device to amplify the total transmitted optical power of the second message. In other cases, the first communication message may be retransmitted in synchronization with a common timing reference generated within the controlled devices. By retransmitting or relaying communication messages in such a manner, the distant controlled device may be able to receive the communication message and respond accordingly.
In some cases, a retransmit command may only be necessary when transmitting messages from the remote control device to a controlled device, which is far away but located within the field of view of the camera. This is because a camera is much more sensitive to incoming light than a simple LED connected to an amplifier. For example, a camera has a lens that focuses light onto an image sensor having millions of detectors, while a simple LED measures the total light power coming in from all directions. Therefore, if the LED of the controlled device falls within the field of view of the camera, communication from the controlled device to the camera can occur over relatively long distances.
The retransmit commands described herein are not limited to messages sent from the remote control device to the controlled devices and can also be sent bi-directionally and between controlled devices, themselves. This further extends the control and communication range of the system by enabling the remote control device to communicate with controlled devices that are not within the field of view or line of sight of the remote control device. For instance, a remote control device (or even a device with just an LED for transmitting and receiving light instead of a camera) could send a communication message containing a retransmit command to nearby devices, which would receive and retransmit the communication message in different directions. Not only would this increase the transmit optical power of the message, but a controlled device could receive a message from a controlling device located around a corner, for instance.
A controlled device that is not in the line of sight of a remote control device, or any controlling device with or without a camera, can also communicate back to the controlling device using the retransmit commands described herein. Thus, bi-directional communication is made possible between the controlling and the controlled devices by using the retransmit commands to relay communication messages around corners.
In some cases, the retransmit commands can be used to retransmit messages many times. For instance, a sending device (e.g., either a remote control device, a controlled device, or any device that can communicate using visible light) could specify that a communication message be retransmitted any number of times by including the number in a repeat field of the communication message. Upon receiving the communication message, a controlled device may decrement the number specified in the repeat field of the received message and retransmit the communication message with the decremented number in the repeat field of the retransmitted message. Controlled devices that receive the retransmitted message will retransmit the received message in accordance with the decremented number in the repeat field.
As an example, a sending device could specify that the communication message be retransmitted twice by storing a corresponding bit value in the repeat field of a communication message. Nearby controlled devices within range of the sending device will receive the communication message and retransmit the communication message twice, each time with the bit value in the repeat field decremented accordingly. Controlled devices that are outside the range of the sending device, but within range of the nearby controlled devices, will receive the first retransmitted communication message and retransmit the final communication message in synchronization with the controlled devices within range of the sending device. Any number of retransmissions is possible, which can provide virtually unlimited communication range.
According to another embodiment, a system and method are provided herein for using a preamble and a Cyclic Redundancy Check (CRC) checksum in messages sent by a remote control device or other device using light to minimize the probability of such devices responding to incorrect data. For example, the preamble may include a unique sequence that does not exist within the rest of the message. Only after the unique pattern is detected does a receiver begin decoding a message. A CRC checksum calculated at the receiving device is compared to the CRC checksum generated at the transmitting device and sent with the message. Only if both checksums match does a receiving device accept a message.
In addition to providing error protection, the CRC checksum enables a device manufacturer to provide remote control device applications that can communicate only with that manufacturer's devices. By programming a unique manufacturer ID into each controlled device and using such ID as a seed for the CRC checksum, only messages sent by applications that use the same ID as the CRC checksum seed will be accepted.
Of course, the remote control device described herein is not necessarily limited to communicating with devices provided by only one manufacturer. In some embodiments, the controlled devices can perform two CRC checks on all received messages using two different seed values. One seed value can be manufacturer specific, while the second seed can be common to all manufacturers. In this way, applications can be written to control only a specific manufacturer's devices, or all devices from all manufacturers.
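For illustration, a dual-seed check of this kind might look like the following sketch. The CRC width, polynomial, and seed values are assumptions; as noted later in this description, CRC codes can be implemented with a variety of polynomials, and the specification does not fix a particular one.

```python
def crc8(data, seed, poly=0x07):
    """Bitwise CRC-8 over `data` (an iterable of byte values), seeded with `seed`."""
    crc = seed & 0xFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 0x80:
                crc = ((crc << 1) ^ poly) & 0xFF
            else:
                crc = (crc << 1) & 0xFF
    return crc

COMMON_SEED = 0x00          # assumed seed shared by all manufacturers
MANUFACTURER_SEED = 0x5A    # assumed manufacturer-specific seed

def accept_message(payload, received_crc):
    """Accept the message if either seed reproduces the received checksum."""
    return received_crc in (crc8(payload, MANUFACTURER_SEED),
                            crc8(payload, COMMON_SEED))

payload = bytes([0x01, 0xFF, 0x02, 0x80])                  # repeat, address, command, data
accept_message(payload, crc8(payload, MANUFACTURER_SEED))  # True
accept_message(payload, crc8(payload, COMMON_SEED))        # True
```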
According to one embodiment, a memory medium is provided herein containing program instructions, which are executable on a processor of a remote control device for communicating with and controlling one or more controlled devices. In some cases, the program instructions may comprise first program instructions, second program instructions, and third program instructions. The first program instructions may be executable for providing a user interface on a display screen of the remote control device. In general, the user interface may be configured to receive user input for controlling the one or more controlled devices, and in some cases, may be a graphical user interface (GUI). The second program instructions may be executable for generating a communication message based on the user input received by the user interface. As described herein, the generated communication message may include a plurality of data fields. In some cases, the plurality of data fields includes a repeat field that specifies the number of times the communication message should be retransmitted by controlled devices that receive the optically-modulated data. The third program instructions may be executable for modulating a visible light source of the remote control device with data contained within the plurality of data fields to produce optically-modulated data, which is transmitted from the remote control device through free space to the one or more controlled devices.
In some cases, the program instructions may further comprise fourth program instructions, which are executable for configuring a light detector of the remote control device to receive optically-modulated data transmitted from the one or more controlled devices. Like the optically-modulated data transmitted from the remote control device, the optically-modulated data transmitted from the one or more controlled devices may be transmitted through free space using visible light.
In some cases, the fourth program instructions may configure an image sensor of the remote control device to capture a sequence of images of the controlled devices, and the program instructions may further comprise fifth, sixth, seventh, and eighth program instructions. The fifth program instructions may be executable for analyzing the sequence of images to determine a location of the controlled devices, and to determine the optically-modulated data sent from the one or more controlled devices by detecting a change in light output from the controlled devices over time. The sixth program instructions may be executable for displaying one of the images of the controlled devices on the display screen of the remote control device. The seventh program instructions may be executable for enabling a user to select a particular controlled device by touching a portion of the display screen corresponding to the location of the particular controlled device in the displayed image. The eighth program instructions may be executable for generating and directing subsequent communication messages to only the particular controlled device. Additional and/or alternative program instructions may also be included.
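The following hypothetical sketch strings the first through eighth program instructions together into one control cycle; every object and method name stands in for platform APIs (GUI, camera flash, image sensor) that the specification intentionally leaves open.

```python
def run_remote_control_cycle(ui, flash, camera, build_message, analyze_frames):
    """One control cycle: GUI input -> optical transmit -> optical receive."""
    user_input = ui.get_user_input()                # first: user interface
    message = build_message(user_input)             # second: message with data fields
    flash.transmit_modulated(message)               # third: modulate the light source
    frames = camera.record(duration_s=1.0)          # fourth: capture optical responses
    devices = analyze_frames(frames)                # fifth: locate devices, decode data
    ui.show_image(frames[0], highlight=devices)     # sixth: display a captured image
    selected = ui.wait_for_selection(devices)       # seventh: user touches one device
    follow_up = build_message(user_input, address=selected)
    flash.transmit_modulated(follow_up)             # eighth: message to that device only
    return selected
```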
The visible light communication systems, methods, and memory mediums described herein provide fundamental advantages that are practically impossible with RF wireless communication protocols. Since the wavelength of RF electromagnetic radiation is orders of magnitude longer than that of visible light, an RF receiver cannot spatially discriminate between multiple sources transmitting simultaneously. With visible light communication, the remote control device, or any device with a camera, can receive light from thousands of sources simultaneously. Further, since RF communication (e.g., Bluetooth) involves frequency and phase modulation of carrier frequencies that are generated locally and out of synchronization with other devices, message amplification is not possible; multiple devices transmitting simultaneously will interfere with, rather than amplify, each other. As such, the visible light communication systems and methods described herein have significant practical advantages over today's state-of-the-art communication and control systems.
The present invention may be better understood, and its numerous objects, features, and advantages made apparent to those skilled in the art by referencing the accompanying drawings.
The use of the same reference symbols in different drawings indicates similar or identical items. While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that the drawings and detailed description thereto are not intended to limit the invention to the forms disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present invention as defined by the appended claims.
Smart phones with touch screen displays and cameras with LED flash are becoming commonplace, and typically include radio frequency (RF) transceivers for communicating bi-directionally, using the RF spectrum, according to cellular phone protocols for long-range communication and Bluetooth protocols for short-range communication. Smart phones are not used to control lighting fixtures in conventional infrared-based lighting control systems, because they do not contain the infrared transceivers necessary for interfacing with these systems.
Recently, smart phones have been used in lighting control systems to communicate with lighting fixtures, or with specialized appliances attached to the fixtures, using radio frequency signals transmitted, e.g., from a Bluetooth radio. However, these lighting control systems require the controlled devices (i.e., the lighting fixtures or the specialized appliances) to also include relatively expensive and limited-range Bluetooth radios. A need remains for a smart phone, or other similar device, that can be used as a remote control device for communicating with and controlling devices that do not include Bluetooth radios (or other RF receivers). In addition to lighting control systems, it is contemplated that such a remote control device may be used for controlling a wide variety of controlled devices using visible light.
Smart phone devices typically have substantial processing power and run software applications that can be downloaded off the Internet and executed on the smart phone for many different purposes. Some of these applications are used to control the smart phone camera flash for functions unrelated to the camera, such as providing strobe lighting or a flashlight feature. While useful for seeing in the dark, these applications cannot be used to transmit optical data from the smart phone.
The remote control device described herein utilizes various software applications that function to control a light source (e.g., a camera flash or display screen backlight) of a smart phone, or other device, for the purpose of sending and/or receiving optical data to/from a variety of controlled devices using visible light. Different software applications can be used to control different devices. The software applications described herein can be downloaded (e.g., from the Internet) or otherwise stored onto a memory medium of the remote control device, and generally consist of program instructions that may be executed by a processor of the remote control device to effectuate visible light communication between the remote control device and one or more controlled devices.
Turning now to the drawings,
In the exemplary embodiment of
The remote control system 10 is not limited to the embodiment specifically illustrated in
Although visible light communication is preferred, smart phone 11 could also communicate using Wi-Fi, Bluetooth, or any other communication protocol with a controlled device comprising such a communication protocol interface and a light emitter, such as an LED. In this embodiment, a controlled device may convert Wi-Fi messages, for example, to optically-modulated signals, which are transmitted to and detected by other controlled devices in the system 10.
Color button 32 could be a set of buttons, a slider, or even a two-dimensional region to adjust the color of a lamp 12. A set of buttons could allow preset color points or Correlated Color Temperatures (CCTs) to be selected, such as 2700 K, 3000 K, 3500 K, and 5000 K. A slider could allow a user to adjust the CCT of lamp 12 anywhere within a certain range. A two-dimensional region could enable a user to select any color within the gamut provided by lamp 12.
Ambient button 33 could also be a set of buttons or a slider, for instance, which could be used to adjust the relationship between brightness and ambient light level. In one example, ambient button or slider 33 could be used to adjust the ambient light level above which lamp 12 would dim or turn off. The ambient light level could be a limit or threshold above which lamp 12 is fully off, and below which lamp 12 is fully on. Alternatively, lamp 12 could gradually adjust its brightness as the ambient light level changes. Lamp 12 may implement hysteresis in the case that lamp 12 turns fully on and off with ambient light level.
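A minimal sketch of such threshold-and-hysteresis behavior is given below, assuming arbitrary lux values for the on and off thresholds; the function name and values are illustrative only.

```python
def update_lamp_state(ambient_lux, lamp_is_on, off_above=300.0, on_below=250.0):
    """Turn off above `off_above` lux, back on only below `on_below` lux."""
    if lamp_is_on and ambient_lux > off_above:
        return False
    if not lamp_is_on and ambient_lux < on_below:
        return True
    return lamp_is_on      # inside the hysteresis band: keep the current state
```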
Timer 34 could be a set of buttons, a slider, or even a link to a more extensive sub-menu to control when lamp 12 turns on and off. For example, timer 34 could enable a user to configure lamp 12 to turn on at a certain time of day and turn off at another time. As another example, timer 34 could enable a user to adjust the amount of time that lamp 12 stays on after being turned on. As such, timer 34 represents a wide range of functionality associated with any timers in lamp 12.
Send button 35 can initiate the message transmission from smart phone 11 to adjust any of the properties or functions provided by buttons 31, 32, 33, and 34. For example, a user could first push a button within group 30 to select a particular group of lamps, adjust the position of brightness slider 31, and finally touch the send button 35 to adjust the brightness of the lamps within the selected group. However, send button 35 illustrates just one way to initiate the transmission of a message from smart phone 11 to lamp 12. As another example, any time a button or slider is adjusted, a message can be sent to a controlled device (such as lamp 12). As another example, a message can be sent to a controlled device in response to a voice communication or any other type of input to smart phone 11. In another example, a software program running on the smart phone, or on another device connected to the smart phone, may automatically send messages to one or more of the controlled devices based on sensor output (e.g., ambient light detection), scheduling, or some other factor. In general, any input can result in the smart phone 11, or other electronic devices, transmitting a message optically to lamp 12 or device 13, for instance.
The functionality illustrated in
In addition to running applications 21 and 22, processor 58 receives user input through the touch screen display 55 and passes data to be transmitted to the camera flash controller 59. In some cases, the data sent from processor 58 to camera flash controller 59 may include a communication message, which is to be transmitted optically via controller 59 and LED 60 to one or more controlled devices. In general, the communication message may include a plurality of data fields as shown, e.g., in
In the example shown in
Since smart phone 11 uses camera 50 to receive data, the smart phone 11 can receive data from many controlled devices at the same time. In some cases, smart phone 11 can display a still image from the video recording that identifies the location of the controlled devices (e.g., lamp 12 and thermostat 13) from which smart phone 11 received valid responses. A user can then touch an image 56 of lamp 12 or an image 57 of thermostat 13 to select that device for further communication.
Smart phone 11 could also communicate, for example, using Wi-Fi or another communication protocol to a controlled device having the same protocol interface and an optical emitter. The controlled device could convert received Wi-Fi messages to optical messages, and smart phone 11 could use camera 50 to receive responses optically from the controlled devices.
PLI 65 typically comprises an LED driver circuit that supplies substantially DC current to produce illumination from LEDs 66 and modulated (AC) current to transmit optical data from LEDs 66. The AC and DC currents provided by the LED driver circuit can be combined in many ways to produce illumination and transmit data using the same light source. For example, periodic time slots can be produced in synchronization with the AC mains 61, during which the DC current is turned off and the AC current is turned on to transmit optical data within the periodic time slots, or communication gaps.
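For illustration, the gap scheduling could be modeled as in the following sketch, which assumes a 60 Hz mains frequency (giving the 360 Hz gap rate mentioned earlier), an arbitrary gap width, and no phase offset; the actual driver circuitry is not limited to these values.

```python
MAINS_HZ = 60.0
GAPS_PER_CYCLE = 6            # 6 gaps per mains cycle -> 360 gaps per second
GAP_DURATION_S = 200e-6       # assumed momentary gap width

def gap_times(cycle_start_s):
    """Start times of the communication gaps within one AC mains cycle."""
    period = 1.0 / MAINS_HZ
    return [cycle_start_s + k * period / GAPS_PER_CYCLE
            for k in range(GAPS_PER_CYCLE)]

def in_gap(t, cycle_start_s):
    """True while the DC illumination current is off and data may be sent."""
    return any(start <= t < start + GAP_DURATION_S
               for start in gap_times(cycle_start_s))
```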
PLI 65 also typically comprises a receiver circuit that detects current induced in LEDs 66 when the LEDs 66 receive optical data transmitted using visible light through free space. The receiver circuit included within PLI 65 converts the photo-current induced in LEDs 66 to voltage, which is then compared to a reference voltage to determine a sequence of ones and zeros sent by the transmitting device.
VLC network controller 64 interfaces with PLI 65 and memory 67 to receive optical data transmitted from a transmitting device using visible light through free space, to implement the functionality of lamp 12, and in some cases, to re-transmit the received optical data during communication gap times. The optical data received by LEDs 66 can be interpreted by VLC network controller 64, stored in memory 67, and/or further processed. For instance, the brightness or color of LEDs 66 can be adjusted by adjusting the substantially DC current applied to LEDs 66 using the LED driver circuit included within PLI 65. Optical data intended for other or additional electronic devices (such as thermostat 13) can be stored in memory 67 and re-transmitted by PLI 65 and LEDs 66 in various ways.
The block diagram illustrated in
As shown in
As described in more detail below, communication messages 16 and 17 may each comprise a repeat field (81).
In addition to a repeat field, communication messages 16 and 17 may further comprise an address field (82).
As set forth below, communication messages 16 and 17 may further comprise a command field (83).
Alternative remote control systems 10 having possibly many different types and/or numbers of controlled devices may operate in accordance with substantially different communication timing diagrams. For instance, communication messages may not be repeated at all, or may be repeated many times. In some cases, communication messages may target individual devices in which only one controlled device may produce a response. In some cases, no responses may be provided. In other cases, responses from controlled devices may be repeated by other controlled devices to extend the communication range of responses sent from the controlled devices back to the smart phone 11. Further, smart phone 11, or any other type of device that initiates optical communication, may not have, or make use of, a camera 50, and consequently, may not record video 70.
The preamble field 80 identifies the start of communication message 16, 17 or response 18, 19 and comprises a unique data sequence or pattern that does not exist within the rest of the message or response. For instance, preamble 80 may include a coding violation (e.g., as used in bi-phase coding) or a control symbol (as used, e.g., in 4b5b or 8b10b coding). Receiving devices, such as lamp 12 and thermostat 13, identify the start of a message 16, 17 when the unique pattern in preamble 80 is detected. In some embodiments, receiving devices may also synchronize their internal timing to the bit timing of preamble 80, which enables devices without accurate timing references (e.g., PLLs) to communicate effectively.
The repeat field 81 instructs a receiving device to retransmit a received communication message 16 by specifying the number of times the message 16 is to be repeated by the receiving devices. In the example of
The address field 82 specifies the controlled device or devices targeted by the communication message 16. In some cases, address field 82 may include one particular code, for instance 0xFF, to indicate a broadcast message to be transmitted to all devices. In other cases, address field 82 may include a range of codes that identify a group of devices to target. For instance, the four most significant bits (8 through 11) being high can indicate a groupcast message with the four least significant bits (12 through 15) indicating one of sixteen different groups of devices. In this example, all remaining codes could identify unicast messages to individual devices. Further, a certain range of unicast codes could be allocated for random number addressing as described previously, with another range of unicast codes allocated for pre-programmed addresses.
The command field 83 specifies the action to be taken by the target device or devices. Such commands and associated actions can be different for different types of devices. For instance, lamp 12 may interpret the command field 83 to perform the functions shown in application 21, while thermostat 13 may interpret the command field 83 to perform the functions shown in application 22. Some codes within the command field 83 can be reserved for system 10 management and interpreted the same by all devices. For instance, the code to respond with a random number that can be used for subsequent addressing could be the same for all devices independent of function. Other system management codes may be included to support a variety of functions.
The data field 84 may contain information associated with each command field 83 code. For instance, the code to adjust the brightness of a lamp may include a data code value within data field 84 to indicate what the brightness should be. As another example, the code to adjust the color temperature of a lamp may include a data code value within data field 84 to indicate the desired CCT. Some command field 83 codes may have no data field 84 code associated therewith. For instance, the code to turn off a lamp may not need any data information, and consequently, the data field 84 may not be included with such command codes.
The CRC checksum field 85 contains a code that is determined by the sequence of data bits between the preamble 80 and the CRC checksum field 85 and is typically used by a receiving device for error checking. CRC codes are well known in the industry and can be implemented with a variety of polynomials. Both transmitting and receiving devices generate the CRC code using the same polynomial and the same seed value. A transmitting device sends the CRC code in the CRC checksum field 85 of a message. When the message is received, a receiving device generates its own CRC code and compares the result to the code in the CRC checksum field 85 of the received message. If the codes match, the message was received properly. If the codes do not match, an error occurred, and the message can be ignored.
In general, the seed value used to generate CRC codes can be substantially any value, provided that the same seed value is used in both the transmitting and receiving devices. To allow devices from different manufacturers to be independently addressed, different manufacturers can program unique seed values into their devices, which in turn can be stored within smart phone 11 for use in communications with those devices. The use of unique seed values enables the smart phone 11 applications to selectively communicate with only a specific manufacturer's devices and to ignore messages that may be transmitted from a different manufacturer's device. For example, since the seed values are different for different manufacturers, a CRC code generated in a transmitting device from one manufacturer will not match the CRC code generated in a receiving device from another manufacturer. Consequently, all messages received by the receiving device from the transmitting device will produce errors and be ignored by the receiving device.
In order to allow some applications 21, 22, and 23 to communicate with all manufacturers' devices, all devices used in the system 10, including those made by different manufacturers, can be configured to generate two different CRC codes using two different seed values when receiving messages. One seed value can be manufacturer specific, and the other seed value can be the same for all manufacturers. If either of the generated CRC codes matches the value in the CRC checksum field 85 in the received message, an error is not generated, and the message is properly processed by the receiving device.
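For illustration, packing the fields of a communication message into a 40-bit sequence might look like the following sketch. The field widths are inferred from the bit positions used in the examples (address bits 8 through 15, CRC bits 32 through 39), and the preamble pattern shown is only a placeholder for the unique sequence or coding violation described above.

```python
PREAMBLE = "1011"   # assumed start pattern; in practice a unique pattern or coding violation is used

def to_bits(value, width):
    return format(value & ((1 << width) - 1), f"0{width}b")

def frame_message(repeat, address, command, data, crc):
    """Concatenate preamble 80, repeat 81, address 82, command 83, data 84, CRC 85."""
    return (PREAMBLE +
            to_bits(repeat, 4) +
            to_bits(address, 8) +
            to_bits(command, 8) +
            to_bits(data, 8) +
            to_bits(crc, 8))

bits = frame_message(repeat=1, address=0xFF, command=0x02, data=0x80, crc=0x3C)
len(bits)   # 40 bits in total
```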
In some cases, the message structure of responses 18, 19 may be like the message structure of communication messages 16, 17. As shown in
In the example of
Bi-phase coding is just one of many possible coding schemes that may be used for communicating communication messages 16, 17 and responses 18, 19. For example, commonly known 4b5b or 8b10b coding schemes could be used instead of bi-phase coding, or data 91 could be communicated without any encoding. Further, LED 60 could be modulated faster or slower than 60 Hz, which increases or decreases data 91 throughput. As such,
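A minimal sketch of one bi-phase (Manchester-type) convention is shown below; the mapping of ones and zeros to half-bit light levels is an assumption, since the specification permits many coding schemes.

```python
def biphase_encode(bits):
    """Map each bit to a pair of half-bit light levels (1 = light on)."""
    out = []
    for b in bits:
        out.extend([1, 0] if b else [0, 1])   # assumed convention: 1 -> high-low, 0 -> low-high
    return out

def biphase_decode(levels):
    """Recover bits from pairs of half-bit levels (inverse of the encoder)."""
    return [1 if levels[i] == 1 else 0 for i in range(0, len(levels), 2)]

symbols = biphase_encode([1, 0, 1, 1])   # -> [1, 0, 0, 1, 1, 0, 1, 0]
biphase_decode(symbols)                  # -> [1, 0, 1, 1]
```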
In receiving device 111, encoded data is recovered from the received light and applied to AND gates 117 and 118. Message 16 bits 4 through 31, which comprise the repeat 81, address 82, command 83, and data 84 fields, pass through AND gate 117 to memory 121 and CRC generator 119. Message bits 32 through 39 pass through AND gate 118 to comparator 120. Seed value 115 is also applied to CRC generator 119, which produces a checksum that is compared by comparator 120 to the CRC code contained within the CRC field 85 of the received message. If the generated and received codes match, no error is detected, and the received message is accepted. If the codes do not match, an error is detected, and the message can be ignored.
To support both manufacturer-specific seed 115 values and general seed 115 values, CRC generator 119 can produce two different checksums using both seed 115 values. If either resulting checksum from CRC generator 119 matches the CRC code contained within the CRC field 85 of the received message, no error is detected, and the message is accepted.
During communication gaps 133, for example, the LIGHT IN 131 into the lamp 12 is measured and passed to PLI 65 (
The rate of DATA OUT 140 is illustrated in
During video frames 164 and 166, LIGHT IN 161 is high and low, respectively, for the entire frame. In this case, the pixel voltage integrates LIGHT IN 161 over the entire 60 Hz cycle and produces a maximum difference between light and dark. During frames 163 and 165, however, LIGHT IN 161 is on for part of the 60 Hz cycle and off for the remainder, which results in the pixel voltage, or image brightness, being at some intermediate level. With such timing, image processing software in image processor 53 or processor 58 of smart phone 11 can detect the modulation of light from a lamp 12 and properly receive a response 18, 19. Additional image processing may also be used to compensate for motion of the camera 50 or transmitting devices.
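The intermediate-level effect can be illustrated numerically with the following sketch, in which a pixel value is modeled as the average of the light level over one frame; the duty-cycle figures are arbitrary.

```python
def integrated_brightness(light_levels, full_scale=255):
    """Average of per-sample light levels (0 or 1) over one frame, scaled to a
    pixel value; the camera resolves only the average, not faster changes."""
    return full_scale * sum(light_levels) / len(light_levels)

integrated_brightness([1] * 8)                    # 255.0: lit for the whole frame
integrated_brightness([0] * 8)                    # 0.0:   dark for the whole frame
integrated_brightness([1, 1, 1, 0, 0, 0, 0, 0])   # ~95.6: light switched mid-frame
```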
While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown and described by way of example. It should be understood, however, that the drawings and detailed description thereto are not intended to limit the invention to the forms disclosed.
This application is a continuation of application Ser. No. 17/102,369, filed Nov. 23, 2020; which is a continuation of application Ser. No. 15/953,202, filed Apr. 13, 2018, now U.S. Pat. No. 10,847,026, issued Nov. 24, 2020; which is a continuation of application Ser. No. 13/773,322, filed Feb. 21, 2013, now U.S. Pat. No. 10,210,750, issued Feb. 19, 2019; which claims priority to Provisional Application No. 61/601,153, filed Feb. 21, 2012. Application Ser. No. 13/773,322 is further a continuation-in-part of application Ser. No. 13/231,077, filed Sep. 13, 2011, now abandoned. All these applications are hereby incorporated by reference in their entireties.
20020033981 | Keller et al. | Mar 2002 | A1 |
20020047624 | Stam et al. | Apr 2002 | A1 |
20020049933 | Nyu | Apr 2002 | A1 |
20020134908 | Johnson | Sep 2002 | A1 |
20020138850 | Basil et al. | Sep 2002 | A1 |
20020171608 | Kanai et al. | Nov 2002 | A1 |
20030103413 | Jacobi et al. | Jun 2003 | A1 |
20030122749 | Booth et al. | Jul 2003 | A1 |
20030133491 | Shih | Jul 2003 | A1 |
20030179721 | Shurmantine et al. | Sep 2003 | A1 |
20030234342 | Gaines et al. | Dec 2003 | A1 |
20040044709 | Cabrera et al. | Mar 2004 | A1 |
20040052076 | Mueller et al. | Mar 2004 | A1 |
20040052299 | Jay et al. | Mar 2004 | A1 |
20040101312 | Cabrera | May 2004 | A1 |
20040136682 | Watanabe | Jul 2004 | A1 |
20040201793 | Anandan et al. | Oct 2004 | A1 |
20040208632 | Dietz et al. | Oct 2004 | A1 |
20040220922 | Lovison et al. | Nov 2004 | A1 |
20040257311 | Kanai et al. | Dec 2004 | A1 |
20040263802 | Seki et al. | Dec 2004 | A1 |
20050004727 | Remboski et al. | Jan 2005 | A1 |
20050030203 | Sharp et al. | Feb 2005 | A1 |
20050030267 | Tanghe et al. | Feb 2005 | A1 |
20050053378 | Stanchfield et al. | Mar 2005 | A1 |
20050077838 | Blümel | Apr 2005 | A1 |
20050110777 | Geaghan et al. | May 2005 | A1 |
20050117190 | Iwauchi et al. | Jun 2005 | A1 |
20050169643 | Franklin | Aug 2005 | A1 |
20050200292 | Naugler et al. | Sep 2005 | A1 |
20050207157 | Tani | Sep 2005 | A1 |
20050242742 | Cheang et al. | Nov 2005 | A1 |
20050259439 | Cull et al. | Nov 2005 | A1 |
20050265731 | Keum et al. | Dec 2005 | A1 |
20060012986 | Mazzochette et al. | Jan 2006 | A1 |
20060056855 | Nakagawa | Mar 2006 | A1 |
20060115386 | Michaels et al. | Jun 2006 | A1 |
20060145887 | Mcmahon | Jul 2006 | A1 |
20060164291 | Gunnarsson | Jul 2006 | A1 |
20060198463 | Godin | Sep 2006 | A1 |
20060220990 | Coushaine et al. | Oct 2006 | A1 |
20060227085 | Boldt et al. | Oct 2006 | A1 |
20070007898 | Bruning | Jan 2007 | A1 |
20070040512 | Jungwirth et al. | Feb 2007 | A1 |
20070109239 | den Boer et al. | May 2007 | A1 |
20070132592 | Stewart et al. | Jun 2007 | A1 |
20070139957 | Haim et al. | Jun 2007 | A1 |
20070147843 | Fujiwara | Jun 2007 | A1 |
20070230322 | Morita | Oct 2007 | A1 |
20070248180 | Bowman et al. | Oct 2007 | A1 |
20070254694 | Nakagawa et al. | Nov 2007 | A1 |
20070279346 | den Boer et al. | Dec 2007 | A1 |
20070291197 | Furukawa et al. | Dec 2007 | A1 |
20080061717 | Bogner et al. | Mar 2008 | A1 |
20080067942 | Watanabe et al. | Mar 2008 | A1 |
20080107029 | Hall et al. | May 2008 | A1 |
20080120559 | Yee | May 2008 | A1 |
20080136334 | Robinson et al. | Jun 2008 | A1 |
20080136770 | Ferentz et al. | Jun 2008 | A1 |
20080136771 | Chen et al. | Jun 2008 | A1 |
20080150864 | Bergquist | Jun 2008 | A1 |
20080174530 | Booth et al. | Jul 2008 | A1 |
20080186898 | Petite | Aug 2008 | A1 |
20080222367 | Co | Sep 2008 | A1 |
20080235418 | Werthen et al. | Sep 2008 | A1 |
20080253766 | Yu et al. | Oct 2008 | A1 |
20080265799 | Sibert | Oct 2008 | A1 |
20080297066 | Meijer et al. | Dec 2008 | A1 |
20080297070 | Kuenzler et al. | Dec 2008 | A1 |
20080304833 | Zheng | Dec 2008 | A1 |
20080309255 | Myers et al. | Dec 2008 | A1 |
20080317475 | Pederson et al. | Dec 2008 | A1 |
20090026978 | Robinson | Jan 2009 | A1 |
20090040154 | Scheibe | Feb 2009 | A1 |
20090049295 | Erickson et al. | Feb 2009 | A1 |
20090051496 | Pahlavan et al. | Feb 2009 | A1 |
20090121238 | Peck | May 2009 | A1 |
20090171571 | Son et al. | Jul 2009 | A1 |
20090189776 | Cheron et al. | Jul 2009 | A1 |
20090196282 | Fellman et al. | Aug 2009 | A1 |
20090245101 | Kwon et al. | Oct 2009 | A1 |
20090278789 | Declercq et al. | Nov 2009 | A1 |
20090284511 | Takasugi et al. | Nov 2009 | A1 |
20090303972 | Flammer, III et al. | Dec 2009 | A1 |
20100005533 | Shamir | Jan 2010 | A1 |
20100052542 | Siemiet et al. | Mar 2010 | A1 |
20100054748 | Sato | Mar 2010 | A1 |
20100061734 | Knapp | Mar 2010 | A1 |
20100096447 | Kwon et al. | Apr 2010 | A1 |
20100117543 | Van Der Veen et al. | May 2010 | A1 |
20100134021 | Ayres | Jun 2010 | A1 |
20100134024 | Brandes | Jun 2010 | A1 |
20100141159 | Shiu et al. | Jun 2010 | A1 |
20100182294 | Roshan et al. | Jul 2010 | A1 |
20100188443 | Lewis et al. | Jul 2010 | A1 |
20100188972 | Knapp | Jul 2010 | A1 |
20100194299 | Ye et al. | Aug 2010 | A1 |
20100201274 | Deixler | Aug 2010 | A1 |
20100213856 | Mizusako | Aug 2010 | A1 |
20100272437 | Yoon et al. | Oct 2010 | A1 |
20100301777 | Kraemer | Dec 2010 | A1 |
20100327764 | Knapp | Dec 2010 | A1 |
20110031894 | Van De Ven | Feb 2011 | A1 |
20110044343 | Sethuram et al. | Feb 2011 | A1 |
20110052214 | Shimada et al. | Mar 2011 | A1 |
20110062874 | Knapp | Mar 2011 | A1 |
20110063214 | Knapp | Mar 2011 | A1 |
20110063268 | Knapp | Mar 2011 | A1 |
20110068699 | Knapp | Mar 2011 | A1 |
20110069094 | Knapp | Mar 2011 | A1 |
20110069960 | Knapp et al. | Mar 2011 | A1 |
20110076024 | Damink | Mar 2011 | A1 |
20110133654 | Mckenzie et al. | Jun 2011 | A1 |
20110148315 | Van Der Veen et al. | Jun 2011 | A1 |
20110150028 | Nguyen et al. | Jun 2011 | A1 |
20110248640 | Welten | Oct 2011 | A1 |
20110253915 | Knapp | Oct 2011 | A1 |
20110299854 | Jonsson et al. | Dec 2011 | A1 |
20110309754 | Ashdown et al. | Dec 2011 | A1 |
20120001567 | Knapp et al. | Jan 2012 | A1 |
20120056545 | Radermacher et al. | Mar 2012 | A1 |
20120153839 | Farley et al. | Jun 2012 | A1 |
20120229032 | Van De Ven et al. | Sep 2012 | A1 |
20120299481 | Stevens | Nov 2012 | A1 |
20120306370 | Van De Ven et al. | Dec 2012 | A1 |
20130016978 | Son et al. | Jan 2013 | A1 |
20130088522 | Gettemy et al. | Apr 2013 | A1 |
20130108275 | Knutson | May 2013 | A1 |
20130183042 | Knapp et al. | Jul 2013 | A1 |
20130201690 | Vissenberg et al. | Aug 2013 | A1 |
20130257314 | Alvord et al. | Oct 2013 | A1 |
20130293147 | Rogers et al. | Nov 2013 | A1 |
20140028377 | Rosik et al. | Jan 2014 | A1 |
20150022110 | Sisto | Jan 2015 | A1 |
20180233030 | Knapp et al. | Aug 2018 | A1 |
Number | Date | Country |
---|---|---|
1291282 | Apr 2001 | CN |
1396616 | Feb 2003 | CN |
1573881 | Feb 2005 | CN |
1650673 | Aug 2005 | CN |
1849707 | Oct 2006 | CN |
101083866 | Dec 2007 | CN |
101150904 | Mar 2008 | CN |
101322441 | Dec 2008 | CN |
101331798 | Dec 2008 | CN |
101458067 | Jun 2009 | CN |
0196347 | Oct 1986 | EP |
0456462 | Nov 1991 | EP |
2273851 | Jan 2011 | EP |
2307577 | May 1997 | GB |
H06302384 | Oct 1994 | JP |
H08201472 | Aug 1996 | JP |
H1125822 | Jan 1999 | JP |
2000286067 | Oct 2000 | JP |
2001514432 | Sep 2001 | JP |
2002353900 | Dec 2002 | JP |
2004325643 | Nov 2004 | JP |
2005539247 | Dec 2005 | JP |
2006093450 | Apr 2006 | JP |
2006260927 | Sep 2006 | JP |
2007047352 | Feb 2007 | JP |
2007266974 | Oct 2007 | JP |
2007267037 | Oct 2007 | JP |
2008500583 | Jan 2008 | JP |
2008507150 | Mar 2008 | JP |
2008300152 | Dec 2008 | JP |
2009134877 | Jun 2009 | JP |
2010525567 | Jul 2010 | JP |
20080073172 | Aug 2008 | KR |
20080077353 | Aug 2008 | KR |
9910867 | Mar 1999 | WO |
0037904 | Jun 2000 | WO |
03075617 | Sep 2003 | WO |
2005024898 | Mar 2005 | WO |
2007069149 | Jun 2007 | WO |
2008065607 | Jun 2008 | WO |
2008129453 | Oct 2008 | WO |
2008152922 | Dec 2008 | WO |
2010124315 | Nov 2010 | WO |
2012005771 | Jan 2012 | WO |
2012042429 | Apr 2012 | WO |
2013142437 | Sep 2013 | WO |
Entry |
---|
“Color Management of a Red, Green, and Blue LED Combinational Light Source”, Avago Technologies, Mar. 2010, 2 pages. |
“Decision to Grant a Patent, JP Application 2012-520587, dated Oct. 21, 2015”, 2 pages. |
“European Search Report and European Search Opinion, EPC Application 10806752.1, dated Jun. 9, 2017”, 12 pages. |
“European Search Report and Search Opinion, EPC Application 10800143.9, dated Feb. 24, 2017”, 10 pages. |
“Final Office Action for U.S. Appl. No. 12/803,805 dated Jun. 23, 2015”, 75 pages. |
“Final Office Action for U.S. Appl. No. 13/773,322, dated Sep. 2, 2015”. |
“Final Office Action dated Jan. 28, 2015, for U.S. Appl. No. 12/806,117”, 23 pages. |
“Final Office Action dated Jul. 9, 2013, for U.S. Appl. No. 12/806,118”, 30 pages. |
“Final Office Action dated Jun. 14, 2013, for U.S. Appl. No. 12/806,117”, 23 pages. |
“Final Office Action dated Jun. 18, 2014, for U.S. Appl. No. 13/231,077”, 47 pages. |
“Final Office Action dated Nov. 28, 2011, for U.S. Appl. No. 12/360,467”, 17 pages. |
“Final Office Action dated Oct. 11, 2012, for U.S. Appl. No. 12/806,121”, 24 pages. |
“Final Office Action dated Sep. 12, 2012, for U.S. Appl. No. 12/584,143”, 16 pages. |
“International Preliminary Report on Patentability and Written Opinion for PCT/US2009/004953 dated Mar. 8, 2011”. |
“International Preliminary Report on Patentability, Int'l Application PCT/US2010/001919, dated Jan. 26, 2012”, 7 pages. |
“International Preliminary Report on Patentability, Int'l Application PCT/US2010/002171, dated Feb. 16, 2012”, 6 pages. |
“International Search Report & Written Opinion for PCT/US2010/000219 dated Oct. 12, 2010”. |
“International Search Report & Written Opinion for PCT/US2012/052774 dated Feb. 4, 2013”. |
“International Search Report & Written Opinion dated Sep. 19, 2012, for PCT/US2012/045392”. |
“International Search Report & Written Opinion, PCT/US2010/001919, dated Feb. 24, 2011”. |
“International Search Report & Written Opinion, PCT/US2010/002171, dated Nov. 24, 2010”. |
“International Search Report & Written Opinion, PCT/US2010/004953, dated Mar. 22, 2010”. |
“International Search Report & Written Opinion, PCT/US2013/027157, dated May 16, 2013”. |
“International Search Report and Written Opinion for PCT/US2014/068556 dated Jun. 22, 2015”. |
“International Search Report and Written Opinion for PCT/US2015/037660 dated Oct. 28, 2015”. |
“LED Fundamentals, How to Read a Datasheet (Part 2 of 2) Characteristic Curves, Dimensions and Packaging”, OSRAM Opto Semiconductors, Aug. 19, 2011, 17 pages. |
“Notice of Allowance for U.S. Appl. No. 12/806,117 dated Nov. 18, 2015”, 18 pages. |
“Notice of Allowance for U.S. Appl. No. 13/970,944 dated Sep. 11, 2015”, 10 pages. |
“Notice of Allowance for U.S. Appl. No. 14/097,355 dated Mar. 30, 2015”, 9 pages. |
“Notice of Allowance for U.S. Appl. No. 14/510,212 dated May 22, 2015”, 12 pages. |
“Notice of Allowance for U.S. Appl. No. 14/510,243 dated Nov. 6, 2015”, 9 pages. |
“Notice of Allowance for U.S. Appl. No. 14/604,881 dated Oct. 9, 2015”, 8 pages. |
“Notice of Allowance for U.S. Appl. No. 14/604,886 dated Sep. 25, 2015”, 8 pages. |
“Notice of Allowance dated Aug. 21, 2014, for U.S. Appl. No. 12/584,143”, 5 pages. |
“Notice of Allowance dated Feb. 21, 2014, for U.S. Appl. No. 12/806,118”, 9 pages. |
“Notice of Allowance dated Feb. 25, 2013, for U.S. Appl. No. 12/806,121”, 11 pages. |
“Notice of Allowance dated Feb. 4, 2013, for U.S. Appl. No. 12/806,113”, 9 pages. |
“Notice of Allowance dated Jan. 20, 2012, for U.S. Appl. No. 12/360,467”, 5 pages. |
“Notice of Allowance dated Jan. 28, 2014, for U.S. Appl. No. 13/178,686”, 10 pages. |
“Notice of Allowance dated May 3, 2013, for U.S. Appl. No. 12/806,126”, 6 pages. |
“Notice of Allowance dated Oct. 15, 2012, for U.S. Appl. No. 12/806,113”, 8 pages. |
“Notice of Allowance dated Oct. 31, 2013, for U.S. Appl. No. 12/924,628”, 10 pages. |
“Notice of Final Rejection, KR Application 10-2012-7003792, dated Jun. 22, 2016”, 1 page. |
“Notice of Final Rejection, KR Application 10-2012-7005884, dated Jul. 20, 2016”, 1 page. |
“Notification of Reason for Refusal, KR Application 10-2012-7003792, dated Mar. 7, 2016”, 7 pages. |
“Notification of Reason for Refusal, KR Application 10-2012-7005884, dated Apr. 28, 2016”, 3 pages. |
“Notification to Grant Patent Right for Invention, CN Application 2010-80032373.7, dated Aug. 6, 2014”, 2 pages. |
“Notification to Grant Patent Right for Invention, CN Application 2010-80035731.X, dated Aug. 6, 2015”, 2 pages. |
“Office Action for JP Application 2012-523605, dated Mar. 11, 2014”. |
“Office Action for JP Application 2012-523605, dated Sep. 24, 2014”. |
“Office Action for U.S. Appl. No. 12/806,117 dated May 27, 2015”, 20 pages. |
“Office Action for U.S. Appl. No. 13/970,964 dated Jun. 29, 2015”, 17 pages. |
“Office Action for U.S. Appl. No. 13/970,990 dated Aug. 20, 2015”, 8 pages. |
“Office Action for U.S. Appl. No. 14/305,456 dated Apr. 8, 2015”, 9 pages. |
“Office Action for U.S. Appl. No. 14/305,472 dated Mar. 25, 2015”, 12 pages. |
“Office Action for U.S. Appl. No. 14/510,243 dated Jul. 28, 2015”, 8 pages. |
“Office Action for U.S. Appl. No. 14/510,266 dated Jul. 31, 2015”, 10 pages. |
“Office Action for U.S. Appl. No. 14/510,283 dated Jul. 29, 2015”, 9 pages. |
“Office Action for U.S. Appl. No. 14/573,207 dated Nov. 4, 2015”, 23 pages. |
“Office Action dated Apr. 22, 2014, for U.S. Appl. No. 12/806,114”, 16 pages. |
“Office Action dated Aug. 2, 2012, for U.S. Appl. No. 12/806,114”, 14 pages. |
“Office Action dated Dec. 17, 2012, for U.S. Appl. No. 12/806,118”, 29 pages. |
“Office Action dated Dec. 4, 2013, for U.S. Appl. No. 12/803,805”, 19 pages. |
“Office Action dated Feb. 1, 2012, for U.S. Appl. No. 12/584,143”, 12 pages. |
“Office Action dated Feb. 17, 2015, for JP Application 2012-520587”. |
“Office Action dated Feb. 2, 2015, for CN Application 201080035731.X”. |
“Office Action dated Jul. 1, 2014, for JP Application 2012-520587”. |
“Office Action dated Jul. 10, 2012, for U.S. Appl. No. 12/806,113”, 11 pages. |
“Office Action dated Jul. 11, 2012, for U.S. Appl. No. 12/806,121”, 23 pages. |
“Office Action dated Jun. 10, 2013, for U.S. Appl. No. 12/924,628”, 9 pages. |
“Office Action dated Jun. 23, 2014, for U.S. Appl. No. 12/806,117”. |
“Office Action dated Jun. 27, 2013, for U.S. Appl. No. 13/178,686”. |
“Office Action dated Mar. 6, 2015, for U.S. Appl. No. 13/773,322”, 30 pages. |
“Office Action dated May 12, 2011, for U.S. Appl. No. 12/360,467”, 19 pages. |
“Office Action dated Nov. 4, 2013, for CN Application No. 201080032373.7”. |
“Office Action dated Nov. 12, 2013, for U.S. Appl. No. 13/231,077”, 31 pages. |
“Office Action dated Oct. 2, 2012 for U.S. Appl. No. 12/806,117”, 22 pages. |
“Office Action dated Oct. 24, 2013, for U.S. Appl. No. 12/806,117”, 19 pages. |
“Office Action dated Oct. 9, 2012, for U.S. Appl. No. 12/806,126”, 6 pages. |
“Office Action dated Sep. 10, 2014, for U.S. Appl. No. 12/803,805”, 28 pages. |
“Partial International Search Report for PCT/US2014/068556 dated Mar. 27, 2015”. |
“Partial International Search Report for PCT/US2015/037660 dated Aug. 21, 2015”. |
“Partial International Search Report for PCT/US2015/045252 dated Nov. 18, 2015”. |
“Partial International Search Report dated Nov. 16, 2012, for PCT/US2012/052774”. |
“Search Report by Registered Search Organization, JP Application 2012-520587, dated May 30, 2014”, 10 pages. |
“Search Report by Registered Search Organization, JP Application 2012-523605, dated Feb. 25, 2014”, 15 pages. |
“Supplementary Partial European Search Report, EPC Application 10806752.1, dated Mar. 8, 2017”, 6 pages. |
“The First Office Action, CN Application 2010-80035731.X, dated Jun. 5, 2014”, 5 pages. |
“The Second Office Action, CN Application 2010-80032373.7, dated Apr. 15, 2014”, 5 pages. |
“U.S. Appl. No. 12/924,628, filed Sep. 30, 2010”, 34 pages. |
Bouchet, et al., “Visible-Light Communication System Enabling 73 Mb/s Data Streaming”, IEEE Globecom Workshop on Optical Wireless Communications, 2010, pp. 1042-1046. |
Chonko, “Use Forward Voltage Drop to Measure Junction Temperature”, 2013 Penton Media, Inc., 24 pages. |
Hall, et al., “Jet Engine Control using Ethernet with a BRAIN (Postprint)”, AIAA/ASME/SAE/ASEE, Joint Propulsion Conference and Exhibition, Jul. 2008, pp. 1-18. |
Johnson, “Visible Light Communication: Tutorial”, Project IEEE P802.15 Working Group for Wireless Personal Area Networks (WPANs), Mar. 2008, 78 pages. |
Johnson, “Visible Light Communications”, CTC Tech Brief, Nov. 2009, 2 pages. |
Kebemou, “A Partitioning-Centric Approach for the Modeling and the Methodical Design of Automotive Embedded System Architectures”, Dissertation of Technical University of Berlin, 2008, 180 pages. |
O'Brien, et al., “Visible Light Communications and Other Developments in Optical Wireless”, Wireless World Research Forum, 2006, 26 pages. |
Zalewski, et al., “Safety Issues in Avionics and Automotive Databuses”, IFAC World Congress, Jul. 2005, 6 pages. |
Number | Date | Country |
---|---|---|
20220114884 A1 | Apr 2022 | US |
Number | Date | Country |
---|---|---|
61601153 | Feb 2012 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 17102369 | Nov 2020 | US |
Child | 17558678 |  | US |
Parent | 15953202 | Apr 2018 | US |
Child | 17102369 | Nov 2020 | US |
Parent | 13773322 | Feb 2013 | US |
Child | 15953202 | Apr 2018 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 13231077 | Sep 2011 | US |
Child | 13773322 | Feb 2013 | US |