The present invention relates generally to the field of outdoor lighting fixtures.
Observation cameras (e.g., security cameras, traffic cameras, etc.) are conventionally mounted to a high pole or side of a building and are either wired or wirelessly connected to a base station dedicated to the observation camera. It has conventionally been challenging to provide proper light, power, and data communications facilities for observation cameras.
One embodiment of the invention relates to an outdoor lighting fixture that includes a ballast for controlling the amount of current provided to a lamp. The lighting fixture also includes a fixture housing at least partially surrounding the ballast and the lamp and a mounting system for holding the fixture housing to at least one of a wall and a pole. The lighting fixture yet further includes a camera coupled to the housing and a control circuit wired to the camera. The lighting fixture also includes a radio frequency transceiver wired to the control circuit. The control circuit is configured to cause information from the camera to be wirelessly transmitted by the radio frequency transceiver.
Another embodiment of the invention relates to a kit for installing on an outdoor lighting fixture pole. The kit includes an outdoor lighting fixture configured for mounting to the outdoor lighting fixture pole and having a ballast and at least one lamp. The kit further includes a radio frequency transceiver for wirelessly communicating lighting commands and lighting information to a remote source. The kit also includes a camera for mounting to at least one of the outdoor lighting fixture and the outdoor lighting fixture pole. The kit yet further includes a control circuit wired to the camera and the radio frequency transceiver and configured to cause video information from the camera to be transmitted by the radio frequency transceiver.
Another embodiment of the invention relates to a device for use with an outdoor lighting fixture having a radio frequency transceiver for communicating data information to a remote source. The device includes a camera and a mount for holding the camera to at least one of the outdoor lighting fixture or a pole for the outdoor lighting fixture. The device further includes a control circuit wired to the camera and including memory for storing video from the camera. The device also includes an interface for wiring the control circuit to the radio frequency transceiver of the outdoor lighting fixture. The control circuit is configured to receive video information from the camera and to provide the video information to the radio frequency transceiver via the interface and for communication to the remote source.
Another embodiment of the invention relates to a device for an outdoor lighting fixture. The lighting fixture has a radio frequency transceiver for wirelessly communicating information. The device includes a camera for capturing images, video, or images and video and a mount for holding the camera to at least one of the outdoor lighting fixture or a pole. The device further includes a control circuit having a wired interface to the camera and including memory for storing the captured images, video, or images and video received from the camera via the wired interface. The device also includes a radio frequency transceiver wired to the control circuit. The control circuit is configured to cause the stored images, video, or images and video to be wirelessly transmitted by the radio frequency transceiver.
Alternative exemplary embodiments relate to other features and combinations of features as may be generally recited in the claims.
The disclosure will become more fully understood from the following detailed description, taken in conjunction with the accompanying figures, wherein like reference numerals refer to like elements, in which:
Referring generally to the Figures, a camera is coupled to an outdoor lighting fixture configured for mounting to a building or high pole. The camera uses power from the power source for the outdoor lighting fixture and a communications interface associated with the outdoor lighting fixture to transmit video information back to a remote source for observation or analysis. The camera may be positioned to look down at an area illuminated by the outdoor lighting fixture.
Referring now to
In
Mounting system 32 is shown to include a mount 34 and a compression sleeve 36. Compression sleeve 36 is configured to receive the pole and to tighten around the pole (e.g., when a clamp is closed, when a bolt is tightened, etc.). Compression sleeve 36 may be sized and shaped for attachment to existing outdoor poles such as street light poles, sidewalk poles, parking lot poles, and the like. As is provided by mounting system 32, the coupling mechanism may be mechanically adaptable to different poles or masts. For example, compression sleeve 36 may include a taper or a tapered cut so that compression sleeve 36 need not match the exact diameter of the pole or mast to which it will be coupled. While lighting fixture 102 shown in
According to an exemplary embodiment, fixture 102 and housing 30 are elongated and mount 34 extends along the length of housing 30. Mount 34 is preferably secured to housing 30 in at least one location beyond a lengthwise center point and at least one location before the lengthwise center point. In other exemplary embodiments, the axis of compression sleeve 36 also extends along the length of housing 30. In the embodiment shown in
Housing 30 is shown to include a fixture pan 50 and a door frame 52 that mates with fixture pan 50. In the embodiments shown in the Figures, door frame 52 is mounted to fixture pan 50 via hinges 54 and latches 56. When latches 56 are released, door frame 52 swings away from fixture pan 50 to allow access to fluorescent lamps 12 within housing 30. Latches 56 are shown as compression-type latches, although many alternative locking or latching mechanisms may be alternatively or additionally provided to secure the different sections of the housing. In some embodiments the latches may be similar to those found on “NEMA 4” type junction boxes or other closures. Further, many different hinge mechanisms may be used. Yet further, in some embodiments door frame 52 and fixture pan 50 may not be joined by a hinge and may be secured together via latches 56 on all sides, any number of screws, bolts or other fasteners that do not allow hinging, or the like. In an exemplary embodiment, fixture pan 50 and door frame 52 are configured to sandwich a rubber gasket that provides some sealing of the interior of housing 30 from the outside environment. In some embodiments the entirety of the interior of the lighting fixture is sealed such that rain and other environmental moisture does not easily enter housing 30. Housing 30 and its component pieces may be galvanized steel but may be any other metal (e.g., aluminum), plastic, and/or composite material. Housing 30, mounting system 32 and/or the other metal structures of lighting fixture 102 may be powder coated or otherwise treated for durability of the metal. According to an exemplary embodiment housing 30 is powder coated on the interior and exterior surfaces to provide a hard, relatively abrasion resistant, and tough surface finish.
Housing 30, mounting system 32, compression sleeve 36, and the entirety of lighting fixture 102 are preferably extremely robust and able to withstand the environmental abuses typical of outdoor lighting fixtures. The shape of housing 30 and mounting system 32 is preferably such that the effective projected area (EPA) relative to strong horizontal winds is minimized, which correspondingly minimizes the wind loading parameters of the lighting fixture.
Ballasts, structures for holding lamps, and the lamps themselves may be installed to the interior of fixture pan 50. Further, a reflector may be installed between the lamp and the interior metal of fixture pan 50. The reflector may be of a defined geometry and coated with a white reflective thermosetting powder coating applied to the light reflecting side of the body (i.e., a side of the reflector body that faces toward a fluorescent light bulb). The white reflective coating may have reflective properties which, in combination with the defined geometry of the reflector, provide high reflectivity. The reflective coating may be as described in U.S. Prov. Pat. App. No. 61/165,397, filed Mar. 31, 2009. In other exemplary embodiments, different reflector geometries may be used and the reflector may be uncoated or coated with other coating materials. In yet other embodiments, the reflector may be a "MIRO 4" type reflector manufactured and sold by Alanod GmbH & Co KG.
The shape and orientation of housing 30 relative to the reflector and/or the lamps is configured to provide a near full cut off such that light does not project above the plane of fixture pan 50. The lighting fixtures described herein are preferably “dark-sky” compliant or friendly.
To provide further resistance to environmental variables such as moisture, housing 30 may include one or more vents configured to allow moisture and air to escape housing 30 while not allowing moisture to enter housing 30. Moisture may enter enclosed lighting fixtures due to vacuums that can form during hot/cold cycling of the lamps. According to an exemplary embodiment, the vents include, are covered by, or are in front of one or more pieces of material that provide oleophobic and hydrophobic protection from water, washing products, dirt, dust and other air contaminants. According to an exemplary embodiment the vents may include GORE membrane sold and manufactured by W.L. Gore & Associates, Inc. The vent may include a hole in the body of housing 30 that is plugged with a snap-fit (or otherwise fit) plug including an expanded polytetrafluoroethylene (ePTFE) membrane with a polyester non-woven backing material.
While various Figures of the present disclosure, including
The lighting fixture system includes controller 16. Controller 16 is connected to lighting fixture 102 via wire 14. Controller 16 is configured to control the switching between different states of lighting fixture 102 (e.g., all lamps on, all lamps off, some lamps on, etc.). While controller 16 is shown as having a housing that is exterior to housing 30 of lighting fixture 102, it should be appreciated that controller 16 may be physically integrated with housing 30. For example, one or more circuit boards or circuit elements of controller 16 may be housed within, on top of, or otherwise secured to housing 30. Further, in other exemplary embodiments, controller 16 (including its housing) may be coupled directly to housing 30. For example, controller 16's housing may be latched, bolted, clipped, or otherwise coupled to the interior or exterior of housing 30. Controller 16's housing may generally be shaped as a rectangle (as shown), may include one or more non-right angles or curves, or may be otherwise configured. In an exemplary embodiment, controller 16's housing is made of plastic and housing 30 of lighting fixture 102 is made from metal. In other embodiments, other suitable materials may be used.
According to various embodiments, controller 16 is further configured to log usage information for lighting fixture 102 in a memory device local to controller 16. Controller 16 may further be configured to use the logged usage information to affect control logic of controller 16. Controller 16 may also or alternatively be configured to provide the logged usage information to another device for processing, storage, or display. Controller 16 is shown to include a sensor 13 coupled to controller 16 (e.g., controller 16's exterior housing). Controller 16 may be configured to use signals received from sensor 13 to affect control logic of controller 16. Further, controller 16 may be configured to provide information relating to sensor 13 to another device.
Referring further to
In
In the illustration of
Outdoor lighting fixture 102 additionally includes a sensor 13 (shown in
Client device 112 may be used to view the camera data or to provide camera 40 or the control circuit of outdoor lighting fixture 102 with commands. For example, client device 112 may provide a display of camera data (e.g., a slideshow of pictures, a near real-time view of streaming video from camera 40, motion information relating to vehicle 101 as detected or calculated by motion sensor 13, camera 40, and the control circuit, etc.). Client device 112 may further provide a user interface for allowing a user to provide control instructions or commands to the control circuit associated with sensor 13 or camera 40. For example, client device 112 (via server 110, data communications network 108, and outdoor lighting fixture 106) may be configured to control outdoor lighting fixture 102, including camera 40. A user may view the data for the camera on client device 112 and provide client device 112 with user input to create camera instructions (e.g., an instruction for the camera to take various photos of the area, an instruction to follow vehicle 101 for as long as possible, an instruction for the camera to stay focused on a specific area for a specific time period, etc.), lighting fixture instructions (e.g., an instruction for a lighting fixture to stay in an illuminated state for a fixed or variable length of time based on the presence of vehicle 101, an instruction for a lighting fixture to turn off, etc.), or other outdoor lighting fixture system 100 instructions. Camera instructions may further include changing the zoom of camera 40 (e.g., zooming in or out on vehicle 101), panning camera 40 across a specific area (e.g., the area surrounding vehicle 101), tilting camera 40 (e.g., such that camera 40 shows a different angle of vehicle 101), or otherwise changing the position or configuration of camera 40.
Outdoor lighting fixture instructions may also include instructions to provide lighting (e.g., by a secondary ballast of outdoor lighting fixture 102, by outdoor lighting fixture 106, etc.) such that camera 40 may better record an event or object, instructions to change lighting fixture status between an on state, an off state, and a dimmed state, etc.
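The camera and fixture instructions described above could be carried as simple command messages from the client device to the fixture's control circuit. A minimal sketch follows; the message fields, action names, and helper are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass, field

# Hypothetical command message for a pole-mounted camera/fixture system.
# Field names and action strings are assumptions for illustration only.
@dataclass
class FixtureCommand:
    target: str                      # e.g., "camera" or "fixture"
    action: str                      # e.g., "zoom", "pan", "tilt", "set_state"
    params: dict = field(default_factory=dict)

def build_follow_command(object_id: str) -> FixtureCommand:
    """Instruct the camera to track a detected object (e.g., a vehicle)."""
    return FixtureCommand(target="camera", action="follow",
                          params={"object_id": object_id})

cmd = build_follow_command("vehicle-101")
```

A message like this could be relayed through the server, data communications network, and intermediate fixtures before reaching the control circuit, which would then route it to the camera.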
Referring further to
Referring still to
Control circuit 210 is coupled to ballasts 244, 246 and is configured to provide control signals to ballasts 244, 246. Control circuit 210 may operate by controllably switching the relay from providing power to ballasts 244, 246 to restricting power to ballasts 244, 246 and vice versa. Control circuit 210 is further shown to include radio frequency transceiver 206 communicably connected to control circuit 210. According to an exemplary embodiment, the system shown in
In an exemplary embodiment radio frequency transceiver 206 is a ZigBee transceiver configured for wireless meshed networking. In other embodiments radio frequency transceiver 206 operates according to a WiFi protocol, a Bluetooth protocol, or any other suitable protocol for short or long range wireless data transmission. Outdoor lighting fixture 200 is further shown to include a wired uplink interface 211. Wired uplink interface 211 may be or include a wire terminal, hardware for interpreting analog or digital signals received at the wire terminal, or one or more jacks, connectors, plugs, filters, or other hardware (or software) for receiving and interpreting signals received via the wire 212 from a remote source. Radio frequency transceiver 206 may include an encoder, a modulator, an amplifier, a demodulator, a decoder, an antenna, one or more filters, one or more buffers, one or more logic modules for interpreting received transmissions, and/or one or more logic modules for appropriately formatting transmissions. Control circuit 210 shown in
Camera 270 is shown coupled to the bottom side of housing 260 and may be connected to control circuit 210 via either a wireless or a wired connection. Camera 270 may alternatively be coupled elsewhere on housing 260 or elsewhere on lighting fixture 200. Camera 270 may provide control circuit 210 with video and/or still photos for transmission to other lighting fixtures 230, to a master controller 202 via a master transceiver 204, to a data communications network 250 via interface 211, or to other devices 232 wirelessly connected to lighting fixture 200.
Referring now to
Controller 300 is shown to include a camera circuit 330 for receiving camera data and video information from camera 309 and processing the camera data and video information. The video information or camera data may then be provided to circuit 350 for transmission via RF transceiver 306 to a remote source or another lighting fixture. Circuit 350 may further receive the camera data and perform additional processing or analysis of the camera data. For example, circuit 350 may use the video information or camera data to determine whether to change a lighting fixture status (turning the lighting fixture on or off, activating an extra ballast or lamp, etc.), to determine whether to change a schedule of the lighting fixture, or to make other control determinations.
Camera circuit 330 includes a camera interface 338 for communicating with a camera 309 connected (either via a wired connection or wirelessly) to controller 300. Camera interface 338 receives video information or camera data such as camera settings data, the current tilt or zoom of the camera, or the like. Camera interface 338 may be a wired interface such as an Ethernet interface, a digital video jack, an optical video connection, a USB interface, or another suitable interface for receiving video information from camera 309. In alternative embodiments, camera interface 338 is a wireless interface for receiving data from the camera via a wireless connection. In yet other embodiments camera 309 is a part of camera circuit 330 (e.g., rigidly coupled to the circuit board of circuit 330).
Camera circuit 330 further includes modules (e.g., integrated circuits, computer code modules in a memory device and for execution by a processor, etc.) for processing the camera data received by camera interface 338. Camera circuit 330 includes processor 332 for executing computer code of the various modules of camera circuit 330, processing video information received from camera 309, or completing other activities described herein. For example, processor 332 may remove noise from the video signal (e.g., denoising), increase or decrease the brightness or contrast of the video signal or images (e.g., to improve the view provided by the video signal), resize or rescale the video signal or images (e.g., increasing the size such that a particular object in the video signal is more easily seen, interpolating the image, etc.), or perform other processing techniques on the video signal and images (e.g., deinterlacing, deflickering, deblocking, color grading, etc.). Processor 332 may then provide the processed video signal or images to circuit 350 for transmission to a remote source via radio frequency transceiver 306, may provide the video information to video logic 336 for video analysis, may store the video in memory 334 for later use, or may conduct another activity described herein using the processed video information. Memory 334 may be configured to store all video information or camera data received by camera circuit 330, some of the video information or camera data received by camera circuit 330, relevant video information or camera data selected by video logic 336, all video information or camera data for a given time frame, all video information or camera data associated with a particular object within the video, or otherwise. For example, memory 334 may be configured to store all camera data that has a timestamp within the past hour, past 24 hours, past week, or within any other time frame.
In another example, video logic 336 may retain all video information or camera data associated with a particular vehicle recorded by the camera, retain all camera data with a specific timestamp range (e.g., all data with a timestamp within a period of time in which sensor 318 detected motion), etc.
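The time-window retention described for memory 334 can be sketched as a simple pruning pass over time-stamped records. This is a minimal illustration; the record layout and the one-hour window are assumptions, not requirements of the disclosure:

```python
import time

# Sketch of a time-window retention policy: keep only camera records whose
# timestamp falls within the last RETENTION_SECONDS (here, one hour).
RETENTION_SECONDS = 3600

def prune_old_records(records, now=None):
    """Return only the records time-stamped within the retention window."""
    now = time.time() if now is None else now
    return [r for r in records if now - r["timestamp"] <= RETENTION_SECONDS]

records = [{"id": 1, "timestamp": 1000.0},   # older than one hour at now=5000
           {"id": 2, "timestamp": 4800.0}]   # within the window
recent = prune_old_records(records, now=5000.0)
```

An object- or event-based policy (e.g., retain all records overlapping a motion-detection window) would use the same pattern with a different predicate.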
Video logic 336 receives the video information or camera data from camera interface 338 or from processor 332 and analyzes the data. The analysis of the video information may include the detection of an object within the video (either stationary or moving) or the detection of an event occurring in the area captured by the video. For example, video logic 336 may be used to identify a vehicle or license plate, and may provide circuit 350 with data regarding the vehicle (e.g., how fast the vehicle was appearing to move, the direction in which the vehicle was traveling, etc.) or the license plate. Video logic 336 may include logic for determining which portions of a video signal and/or which images best represent a tracked object.
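The kind of motion detection video logic 336 might perform can be illustrated with simple frame differencing: flag motion when enough pixels change brightness between consecutive frames. Frames here are plain 2-D brightness grids, and the threshold values are illustrative assumptions:

```python
# Minimal frame-differencing sketch of motion detection. Real video logic
# would work on decoded frames and use tuned thresholds; these are assumed.
def detect_motion(prev_frame, curr_frame, pixel_threshold=10, count_threshold=3):
    """Flag motion when enough pixels change brightness between frames."""
    changed = sum(
        1
        for prev_row, curr_row in zip(prev_frame, curr_frame)
        for p, c in zip(prev_row, curr_row)
        if abs(p - c) > pixel_threshold
    )
    return changed >= count_threshold

frame_a = [[0, 0, 0], [0, 0, 0]]
frame_b = [[50, 50, 50], [50, 0, 0]]  # four pixels changed brightness
moved = detect_motion(frame_a, frame_b)
```

Detected motion could then be reported to circuit 350 along with derived data such as an object's apparent speed or direction of travel.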
Camera circuit 330 further includes remote control module 340. Remote control module 340 is configured to allow for remote control of camera 309. Remote control of camera 309 may include adjusting the positioning, tilt, or zoom of the camera, adjusting when a camera records video, adjusting a camera resolution, stopping recording, starting recording, or initiating or changing any other camera activity. Remote control module 340 may be configured to serve or otherwise provide user interface controls or user interface options to a remote source for adjusting the camera settings. Remote control module 340 may receive an input from the user at the user interface controls or options and interpret the input (e.g., determine an adjustment to be made to camera 309). Remote control module 340 may then cause camera circuit 330 and camera interface 338 to adjust camera 309 or remote control module 340 can cause changes to be made via other modules of camera circuit 330 such as camera settings module 346.
Camera circuit 330 further includes video streamer 342 configured to process the video information from camera 309 and to provide a stream of the video to a remote source communicating with controller 300 (e.g., communicating wirelessly). Video streamer 342 may process or otherwise prepare the stream of video information for streaming to the remote source. For example, video streamer 342 may compress the video for streaming, packetize the video for streaming, and wrap the packetized video according to a video streaming protocol compatible with the remote source. Video streamer 342 may further be configured to negotiate and maintain a data streaming connection with the remote source.
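The packetizing step video streamer 342 performs can be sketched as splitting an encoded payload into fixed-size, sequence-numbered chunks so the remote source can reassemble them in order. The chunk size and (sequence, chunk) layout are assumptions for illustration:

```python
# Sketch of packetizing an encoded video payload for streaming. A real
# streamer would also compress the payload and wrap each packet in a
# streaming protocol header; only the chunking step is shown here.
def packetize(payload: bytes, chunk_size: int = 4):
    """Return (sequence_number, chunk) pairs for streaming."""
    return [
        (seq, payload[offset:offset + chunk_size])
        for seq, offset in enumerate(range(0, len(payload), chunk_size))
    ]

packets = packetize(b"0123456789")
# The remote source can reorder by sequence number and reassemble:
reassembled = b"".join(chunk for _, chunk in sorted(packets))
```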
Camera circuit 330 further includes server module 344 for serving video information and/or related user interfaces to a remote source. Server module 344 may be, for example, a web server or web service configured to respond to requests for video information or user interfaces using one or more World Wide Web communications protocols. For example, server module 344 may respond to HTTP requests by providing HTTP-formatted responses. Server module 344 may be used to establish the streaming connection or streaming service provided by video streamer 342.
Camera circuit 330 is further shown to include camera settings module 346. Camera settings module 346 is configured to receive commands provided to controller 300 by a remote source and relating to camera settings. Camera settings module 346 can update stored camera settings or change the "live" behavior of the camera in response to the received commands. For example, radio frequency transceiver 306 can receive a command for the camera to change the default pan, tilt, and zoom settings of the camera from a remote source. Radio frequency transceiver 306 and wireless controller 305 can provide the command to the control circuit 350, which may route the command to camera circuit 330 and more particularly camera settings module 346. Camera settings module 346 can parse the command and set the pan, tilt, and zoom parameters for the camera by updating variables stored in memory 334 and/or providing the new parameters to camera 309 via camera interface 338. Other adjustable camera settings may include a timeframe under which the camera should record video, video settings such as the resolution of the video, the desired frames per second (FPS) of the video, the brightness, contrast, or color setting of the video, and/or a default position, tilt, and zoom set for the camera. Camera settings module 346 can also automatically update settings for the camera in response to received user commands regarding other settings. For example, if the zoom of camera 309 is changed via user command, camera settings module 346 can include logic for determining that, for example, the brightness of the video at the new zoom setting should be adjusted. Camera settings module 346 may be further used to adjust photo settings for the camera. Photo settings may include a size or resolution of the photos, the brightness, contrast, or color settings of the photos, etc. Photo settings further include rules or logic for when to take photos or "stills" of video information.
For example, photos may be taken by the camera on a scheduled interval, at specific pre-determined times, or when an object is detected and is in the view of the camera. Such settings can be set, changed, and maintained by camera settings module 346.
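The settings-update path described for camera settings module 346 can be sketched as merging recognized pan/tilt/zoom fields from a received command into a stored settings dictionary. The command format and settings store are illustrative assumptions:

```python
# Sketch of applying a remote PTZ command to a stored settings dictionary.
# Keys and defaults are assumptions; a real module would also validate
# ranges and push the new parameters to the camera over its interface.
DEFAULT_SETTINGS = {"pan": 0, "tilt": 0, "zoom": 1}

def apply_command(settings: dict, command: dict) -> dict:
    """Merge recognized PTZ fields from a command into the settings store."""
    updated = dict(settings)
    for key in ("pan", "tilt", "zoom"):
        if key in command:
            updated[key] = command[key]
    return updated

new_settings = apply_command(DEFAULT_SETTINGS, {"pan": 45, "zoom": 3})
```

Unrecognized fields are simply ignored here; dependent adjustments (such as re-tuning brightness after a zoom change) could be layered on after the merge.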
Circuit 350 is further shown to include a command and control module 356, logging module 358, an end of life module 360, a scheduling module 362, a timer 364, an environment processing module 366, and fixture data 368. Using signals received from communications electronics of the lighting fixture and/or signals received from one or more sensors (e.g., photocells, occupancy sensors, etc.), command and control module 356 is configured to control the ballasts and lamps of the lighting fixture. Command and control module 356 may include the primary control algorithm/loop for operating the fixture and may call, initiate, pass values to, receive values from, or otherwise use the other modules of the circuit. For example, command and control module 356 may primarily operate the fixture using a schedule as described below with respect to scheduling module 362, but may allow upstream or peer control (e.g., "override control") to allow a remote source to cause the ballast/lamps to turn on or off. Command and control module 356 may be used to control 2-way communication using communications electronics of the lighting fixture.
Command and control module 356 may further receive data from camera circuit 330 or from a user of a remote source connecting to controller 300 and may adjust the control of the ballasts and lamps (e.g., if camera data or a user command indicates a desire to turn on the lamps of the lighting fixture for the benefit of a camera recording video). For example, if camera data and/or sensor 318 indicate there is a vehicle approaching the lighting fixture, command and control module 356 may provide a command to change the lighting fixture state to a dimmed state or an “on” state. Command and control module 356 may further change the lighting fixture state based on other camera data and/or sensor 318 data (e.g., other detected motion, an ambient light level, etc.).
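The override decision described above, where motion or ambient-light data takes precedence over the schedule, can be sketched as a small state-selection function. The state names and decision rules are assumptions for illustration:

```python
# Sketch of command-and-control state selection: normally follow the
# schedule, but brighten the fixture when motion is detected so the
# camera can record. States and priority order are assumed, not specified.
def next_fixture_state(scheduled_state: str, motion_detected: bool,
                       ambient_light_low: bool) -> str:
    """Pick the fixture state, letting motion data override the schedule."""
    if motion_detected and ambient_light_low:
        return "on"        # full illumination for camera recording
    if motion_detected:
        return "dimmed"    # some light, ambient light already adequate
    return scheduled_state

state = next_fixture_state("off", motion_detected=True, ambient_light_low=True)
```

With no motion detected, the function falls through to whatever state the scheduling module has selected.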
Logging module 358 is configured to identify and store fixture event information. For example, logging module 358 may be configured to identify (e.g., by receiving a signal from another component of the circuit) when the lamps of the fixture are being or have been turned off or turned on. These events may be recorded by logging module 358 with a date/time stamp and with any other data. For example, logging module 358 may record each event as a row in a two dimensional table (e.g., implemented as a part of a relational database, implemented as a flat file stored in memory, etc.) with fields such as event name, event date/time, event cause, and event source. One module that may utilize such information is end of life module 360. End of life module 360 may be configured to compile a time of use total by querying or otherwise aggregating the data stored by logging module 358. Events logged by the system may be transmitted using the communications interfaces or other electronics to a remote source via a wired or wireless connection. Messages transmitting logged events or data may include an identifier unique to the lighting fixture (e.g., the lighting fixture's communication hardware) that identifies the fixture specifically. In addition to the activities of end of life module 360, command and control module 356 may be configured to cause communications electronics of the fixture to transmit messages from the log or other messages upon identifying a failure (e.g., a power supply failure, a control system failure, a ballast failure, a lamp failure, etc.). While logging module 358 may be primarily used to log on/off events, logging module 358 (or another module of the control system) may log energy draw (or some value derived from energy draw such as a carbon equivalent amount) by the lighting fixture. In an exemplary embodiment, logging module 358 logs information relating to camera circuit 330.
For example, logging module 358 can log times when video logic 336 determined that motion was present in a captured scene, log the times when camera 309 was caused to be active based on motion detected using sensor 318, or log other activities relating to camera circuit 330 or camera 309.
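The event rows described for logging module 358 can be sketched with the fields named above (event name, date/time, cause, source). An in-memory list stands in for the table or flat file; the field values are illustrative assumptions:

```python
import time

# Sketch of logging module event rows. Each event is one row with the
# fields named in the description; the list stands in for a database
# table or flat file.
event_log = []

def log_event(name: str, cause: str, source: str, timestamp=None):
    """Record one fixture event as a row with a date/time stamp."""
    event_log.append({
        "event_name": name,
        "event_time": time.time() if timestamp is None else timestamp,
        "event_cause": cause,
        "event_source": source,
    })

log_event("lamps_on", cause="motion_detected", source="sensor_318",
          timestamp=1_700_000_000.0)
```

An end-of-life calculation could then aggregate rows like these (e.g., pairing on/off events) to compile a time-of-use total.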
In an exemplary embodiment, controller 300 (e.g., via RF transceiver 306) is configured to transmit the logged usage information to remote devices such as master controller 202 of
Controller 300 is shown to include power relays 302 configured to controllably switch on or off high voltage power outputs that may be provided to first ballast 244 and second ballast 246 of
Referring still to
Referring still to
Sensor interface 312 may be configured to receive signals from environment sensor 318. Sensor interface 312 may include any number of jacks, terminals, solder points or other connectors for receiving a wire or lead from environment sensor 318. Sensor interface 312 may also or alternatively be a radio frequency transceiver or receiver for receiving signals from wireless sensors. For example, sensor interface 312 may be a Bluetooth protocol compatible transceiver, a ZigBee transceiver, or any other standard or proprietary transceiver. Regardless of the communication medium used, sensor interface 312 may include filters, analog to digital converters, buffers, or other components configured to handle signals received from environment sensor 318. Sensor interface 312 may be configured to provide the result of any signal transformation (or the raw signal) to circuit 350 for further processing.
Referring further to
Referring now to
Camera 372 is configured to capture images and video and provide the images and video to control circuit 374. Control circuit 374 stores the images and video in memory 376. Control circuit 374 further provides the images and video to RF transceiver 378. RF transceiver 378 is wired to control circuit 374 and wirelessly transmits the images and video to RF transceiver 396 of lighting fixture 390. Control circuit 392 of lighting fixture 390 may then receive and process the images and video or continue transmitting the video information to a remote source.
Referring now to
Referring now to
Referring now to
Process 420 further includes providing a user interface to the remote source (step 428). The user interface may be used to provide a display for a user of the remote source to view the video. Process 420 further includes receiving a selection of video information from the remote source (step 430). The selection of video information may include a request to view a specific video, specific portions of a video, meta information (e.g., a timestamp or timeframe) of the selected video, or other video-related requests. The selected video information is streamed to the remote source (step 432) in response to the selection. Step 432 may include various pre-processing tasks. For example, pre-processing tasks may include compressing the video for streaming, packetizing the video for streaming, and wrapping the packetized video according to a video streaming protocol compatible with the remote source.
Process 420 further includes receiving setting information from the remote source (step 434). Setting information may include various camera settings (e.g., video recording settings such as a resolution of the video, brightness or color settings, instructions for recording an object in the view of the camera, etc.). In response to the received setting information, settings in the camera are updated (step 436). Process 420 further includes receiving PTZ (pan, tilt, zoom) commands from the remote source (step 438) and adjusting PTZ parameters of the camera based on the received commands (step 440). PTZ commands may include an adjustment of the panning of the camera, the tilt of the camera, or the zoom level of the camera.
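Steps 434-440 above can be sketched as a camera state object that merges setting updates and applies PTZ commands. The field names and clamping ranges below are assumptions for illustration only.

```python
# Hypothetical sketch of steps 434-440: merge received setting
# information into the camera's settings (step 436) and apply PTZ
# commands to pan/tilt/zoom parameters (step 440), clamping each to
# an assumed valid range.

class CameraState:
    def __init__(self):
        self.settings = {"resolution": "720p", "brightness": 50}
        self.pan, self.tilt, self.zoom = 0.0, 0.0, 1.0

    def update_settings(self, new_settings):
        """Step 436: merge setting information from the remote source."""
        self.settings.update(new_settings)

    def apply_ptz(self, pan=None, tilt=None, zoom=None):
        """Step 440: adjust PTZ parameters from a received command."""
        if pan is not None:
            self.pan = max(-180.0, min(180.0, pan))
        if tilt is not None:
            self.tilt = max(-90.0, min(90.0, tilt))
        if zoom is not None:
            self.zoom = max(1.0, min(10.0, zoom))

cam = CameraState()
cam.update_settings({"resolution": "1080p"})
cam.apply_ptz(pan=45.0, zoom=3.0)
print(cam.settings["resolution"], cam.pan, cam.zoom)  # 1080p 45.0 3.0
```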
The user interface of process 420 may include various controls for a user to provide a selection. For example, the user interface may provide buttons that a user may click to change the tilt or zoom of the camera, may show multiple camera views such that a user can select a specific camera view, etc.
Referring now to
Referring further to
Touch screen display 530 and, more particularly, user interface module 508 are configured to allow and facilitate user interaction (e.g., input and output) with master controller 202. It should be appreciated that in alternative embodiments of master controller 202, the display associated with master controller 202 may not be a touch screen, may be separate from the casing housing the control computer, and/or may be remote from the control computer and connected via a network connection (e.g., Internet connection, LAN connection, WAN connection, etc.). Further, it should be appreciated that master controller 202 may be connected to a mouse, keyboard, or any other input device or devices for providing user input to master controller 202. The control computer is shown to include a communications interface 532 configured to connect to a wire associated with master transceiver 204.
Communications interface 532 may be a proprietary circuit for communicating with master transceiver 204 via a proprietary communications protocol. In other embodiments, communications interface 532 may be configured to communicate with master transceiver 204 via a standard communications protocol. For example, communications interface 532 may include Ethernet communications electronics (e.g., an Ethernet card) and an appropriate port (e.g., an RJ45 port configured for CAT5 cabling) to which an Ethernet cable is run from master controller 202 to master transceiver 204. Master transceiver 204 may be as described in U.S. application Ser. Nos. 12/240,805, 12/057,217, or 11/771,317, which are each incorporated herein by reference. Communications interface 532 and, more generally, master transceiver 204 are controlled by logic of wireless interface module 512. Wireless interface module 512 may include drivers, control software, configuration software, or other logic configured to facilitate communications activities of master controller 202 with lighting fixture controllers. For example, wireless interface module 512 may package, address, format, or otherwise prepare messages for transmission to and reception by particular controllers or zones. Wireless interface module 512 may also interpret, route, decode, or otherwise handle communications received at master transceiver 204 and communications interface 532.
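The packaging and addressing role described for wireless interface module 512 can be sketched as below. The envelope format (a JSON object with `to` and `cmd` fields) and the `"zone:"`/`"ctrl:"` target convention are invented for illustration; the disclosure does not specify a message format.

```python
# Illustrative sketch of wireless interface module 512: wrap a command
# so the master transceiver can route it to a particular controller
# ("ctrl:<id>") or a whole zone ("zone:<id>"), and interpret messages
# received back at the transceiver.

import json

def package_message(target, command, payload=None):
    """Prepare a message for transmission by the master transceiver."""
    return json.dumps({
        "to": target,
        "cmd": command,
        "payload": payload or {},
    }).encode("utf-8")

def handle_received(raw):
    """Interpret a message received at the master transceiver."""
    msg = json.loads(raw.decode("utf-8"))
    return msg["to"], msg["cmd"]

wire = package_message("zone:parking-lot", "off")
print(handle_received(wire))  # ('zone:parking-lot', 'off')
```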
Referring still to
Control logic module 514 may be the primary logic module for master controller 202 and may be the main routine that calls, for example, modules 508, 510, etc. Control logic module 514 may generally be configured to provide lighting control, energy savings calculations, demand/response-based control, load shedding, load submetering, HVAC control, building automation control, workstation control, advertisement control, power strip control, "sleep mode" control, or any other types of control. In an exemplary embodiment, control logic module 514 operates based on information stored in one or more databases of master controller 202 and stored in memory 504 or another memory device in communication with master controller 202. The database may be populated with information based on user input received at graphical user interfaces, and control logic module 514 may continuously draw on the database information to make control decisions. For example, a user may establish any number of zones, set schedules for each zone, create ambient lighting parameters for each zone or fixture, etc. This information is stored in the database, related (e.g., via a relational database scheme, XML sets for zones or fixtures, or otherwise), and recalled by control logic module 514 as control logic module 514 proceeds through its various control algorithms.
Control logic module 514 may include any number of functions or sub-processes. For example, a scheduling sub-process of control logic module 514 may check at regular intervals to determine if an event is scheduled to take place. When events are determined to take place, the scheduling sub-process or another routine of control logic module 514 may call or otherwise use another module or routine to initiate the event. For example, if the schedule indicates that a zone should be turned off at 5:00 pm, then when 5:00 pm arrives the scheduling sub-process may call a routine (e.g., of wireless interface module) that causes an “off” signal to be transmitted by master transceiver 204. Control logic module 514 may also be configured to conduct or facilitate the completion of any other process, sub-process, or process steps conducted by master controller 202 described herein.
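The scheduling sub-process described above can be sketched as a periodic scan over a schedule table. The schedule structure and the transmit hook are assumptions; the example mirrors the "off at 5:00 pm" scenario in the text.

```python
# Minimal sketch of the scheduling sub-process: at each check
# interval, scan the schedule and fire any event that is due, e.g.
# calling a routine that causes an "off" signal to be transmitted by
# the master transceiver for the scheduled zone.

import datetime

def run_due_events(schedule, now, transmit):
    """Scan the schedule and fire every event due at `now`."""
    fired = []
    for event_time, zone, command in schedule:
        if event_time <= now:
            transmit(zone, command)  # e.g. causes an "off" RF signal
            fired.append((zone, command))
    return fired

schedule = [(datetime.time(17, 0), "zone-1", "off")]
sent = []
run_due_events(schedule, datetime.time(17, 0),
               lambda zone, cmd: sent.append((zone, cmd)))
print(sent)  # [('zone-1', 'off')]
```

A production scheduler would also mark fired events as handled (or reschedule recurring ones) so they are not re-triggered on the next scan; that bookkeeping is omitted here for brevity.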
Referring further to
Fieldbus interfaces 516, 520 and device interface module 510 may also be used in concert with user interface module 508 and control logic module 514 to provide control to the monitored devices 518, 522. For example, monitored devices 518, 522 may be mechanical devices configured to operate a motor, one or more electronic valves, one or more workstations, machinery stations, a solenoid or valve, or otherwise. Such devices may be assigned to zones similar to the lighting fixtures described above and below or controlled independently. User interface module 508 may allow schedules and conditions to be established for each of devices 518, 522 so that master controller 202 may be used as a comprehensive energy management system for a facility. For example, a motor that controls the movement of a spinning advertisement may be coupled to the power output or relays of a controller similar to controller 300 of
Referring further to
Referring further to
Master controller 202 further includes mass video processor 562. Mass video processor 562 processes video or video information provided by the cameras wirelessly communicating with master controller 202. Such processing may include preparing the video for playback on a user interface, formatting the video for display (e.g., a display provided by touch screen display 530), or otherwise processing the video for provision to a device or user wirelessly communicating with master controller 202.
Master controller 202 further includes video storage 564. Video storage 564 stores various camera data (e.g., video or photos) received by master controller 202 or camera data to be transmitted wirelessly to cameras communicating with master controller 202. Video storage 564 may include storage of videos, photos, camera configuration information, a history of usage of the cameras, etc.
Master controller 202 further includes camera system configuration information 566. Camera system configuration information 566 provides configurations for the various cameras that wirelessly communicate with master controller 202. Configuration information may include camera positioning (e.g., adjusting the tilt or zoom of a PTZ camera), resolution or other video quality properties, or other configuration information as described in the present disclosure.
Master controller 202 further includes camera system command module 568. Camera system command module 568 is configured to provide commands to various cameras that may wirelessly communicate with master controller 202. Commands provided to the cameras may include instructions for the camera to record an event, instructions relating to the time and duration of the recording, or other camera instructions as described in the present disclosure.
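The command-building role of camera system command module 568 can be sketched as below. The command fields (camera identifier, action, start time, duration) are illustrative assumptions; the disclosure does not define a command schema.

```python
# Hypothetical sketch of camera system command module 568: compose a
# recording instruction, including the time and duration of the
# recording, for a particular camera.

def build_record_command(camera_id, start, duration_s):
    """Compose an instruction for a camera to record an event."""
    return {
        "camera": camera_id,
        "action": "record",
        "start": start,          # e.g. an ISO-8601 timestamp string
        "duration_s": duration_s,
    }

cmd = build_record_command("cam-07", "2011-08-31T17:00:00", 120)
print(cmd["action"], cmd["duration_s"])  # record 120
```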
Referring now to
Master controller 202 is preferably configured to provide a graphical user interface to a local or remote electronic display screen for allowing a user to adjust control parameters, turn lighting fixtures on or off, or to otherwise affect the operation of lighting fixtures in a facility. For example, master controller 202 includes touch screen display 530 for displaying such a graphical user interface and for allowing user interaction (e.g., input and output) with master controller 202. Touch screen display 530 is configured to provide a user with a display for viewing and managing lighting fixture and camera settings. For example, referring also to
It should be noted that while master controller 202 is shown in
Referring further to
According to an exemplary embodiment, different camera and lighting fixture settings may be provided to zones 610, 612. For example, one set of camera and lighting fixture settings may be provided to zone 610 in response to a vehicle traveling through zone 610 (e.g., instructions for recording vehicle movement and providing light for the vehicle) while a second set of camera settings may be provided to zone 612 (e.g., instructions for turning lighting fixtures 606, 608 on to a dimmed state while positioning cameras to detect and pick up the vehicle if the vehicle enters zone 612). According to various exemplary embodiments, master controller 202 may provide the same camera and lighting fixture settings to each lighting fixture and camera in a zone, may provide different camera settings for different cameras and lighting fixtures of the zone, or otherwise.
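The zone behavior described above can be sketched as a simple dispatch: the zone where the vehicle is detected receives recording and full-light settings, while adjacent zones are primed to pick the vehicle up if it crosses over. Zone names and setting values are assumptions for illustration.

```python
# Illustrative sketch of the two-zone scenario: compute the camera and
# lighting settings each zone should receive when a vehicle is
# detected in one zone.

def on_vehicle_detected(detected_zone, all_zones):
    """Return the camera/lighting settings each zone should receive."""
    plan = {}
    for zone in all_zones:
        if zone == detected_zone:
            plan[zone] = {"lights": "full", "camera": "record-vehicle"}
        else:
            # neighboring zones dim their fixtures and watch the boundary
            plan[zone] = {"lights": "dimmed", "camera": "watch-boundary"}
    return plan

plan = on_vehicle_detected("zone-610", ["zone-610", "zone-612"])
print(plan["zone-610"]["lights"], plan["zone-612"]["lights"])  # full dimmed
```

As the text notes, a master controller could equally send identical settings to every fixture and camera in a zone, or vary them per device; this sketch shows only the per-zone case.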
The construction and arrangement of the systems and methods as shown in the various exemplary embodiments are illustrative only. Although only a few embodiments have been described in detail in this disclosure, many modifications are possible (e.g., variations in sizes, dimensions, structures, shapes and proportions of the various elements, values of parameters, mounting arrangements, use of materials, colors, orientations, etc.). For example, the position of elements may be reversed or otherwise varied and the nature or number of discrete elements or positions may be altered or varied. Accordingly, all such modifications are intended to be included within the scope of the present disclosure. The order or sequence of any process or method steps may be varied or re-sequenced according to alternative embodiments. Other substitutions, modifications, changes, and omissions may be made in the design, operating conditions and arrangement of the exemplary embodiments without departing from the scope of the present disclosure. In alternative exemplary embodiments, the lighting fixtures shown and described throughout this application may be configured or modified for indoor use. For example, rather than including a mounting system for coupling the lighting fixture to a street pole, the lighting fixtures in alternative embodiments may include a mounting system for coupling the lighting fixture to an indoor ceiling mount or an indoor wall mount. Such camera-integrated indoor lighting fixtures may be used in warehouses, manufacturing facilities, sporting arenas, airports, or other environments.
The present disclosure contemplates methods, systems and program products on any machine-readable media for accomplishing various operations. The embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Although the figures may show a specific order of method steps, the order of the steps may differ from what is depicted. Also, two or more steps may be performed concurrently or with partial concurrence. Such variation will depend on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations could be accomplished with standard programming techniques with rule-based logic and other logic to accomplish the various connection steps, processing steps, comparison steps and decision steps.
This application is a continuation of U.S. Nonprovisional application Ser. No. 13/223,135, filed on Aug. 31, 2011, which claims the benefit of U.S. Provisional Application No. 61/380,128, filed on Sep. 3, 2010. The application Ser. No. 13/223,135 also claims the benefit of priority as a Continuation-In-Part of U.S. application Ser. No. 12/875,930, filed on Sep. 3, 2010, which claims the benefit of priority of U.S. Application No. 61/275,985, filed on Sep. 4, 2009. The application Ser. No. 13/223,135 also claims the benefit of priority as a Continuation-In-Part of U.S. application Ser. No. 12/550,270, filed on Aug. 28, 2009, which is a Continuation-In-Part of application Ser. No. 11/771,317, filed Jun. 29, 2007, and is also a Continuation-In-Part of U.S. Ser. No. 12/240,805, filed on Sep. 29, 2008, which is a Continuation-In-Part of U.S. application Ser. No. 12/057,217, filed Mar. 27, 2008. The subject matter of application Ser. Nos. 13/223,135, 61/380,128, 61/275,985, 12/875,930, 12/550,270, 12/240,805, 12/057,217, and 11/771,317 is hereby incorporated herein by reference in its entirety.
Number | Date | Country | |
---|---|---|---|
20140078308 A1 | Mar 2014 | US |
Number | Date | Country | |
---|---|---|---|
61380128 | Sep 2010 | US | |
61275985 | Sep 2009 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13223135 | Aug 2011 | US |
Child | 14083299 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 12875930 | Sep 2010 | US |
Child | 13223135 | US | |
Parent | 12550270 | Aug 2009 | US |
Child | 12875930 | US | |
Parent | 11771317 | Jun 2007 | US |
Child | 12550270 | US | |
Parent | 12240805 | Sep 2008 | US |
Child | 11771317 | US | |
Parent | 12057217 | Mar 2008 | US |
Child | 12240805 | US |