Displays are an important component of most advertising portfolios. As with other real estate, location is key. Most displays are configured to catch viewers' eyes, which can enhance the brands they show. In crowded display areas, making a display distinctive is important to optimizing its impact.
In accordance with certain aspects of the present disclosure, a display device includes a housing with a grid assembly received therein. A plurality of light assemblies is coupled to the grid assembly, for example in a grid pattern, to create a dynamic display area. Each of the light assemblies has an actuator assembly, with each of the actuator assemblies being individually controllable to move the corresponding light assembly between a retracted state and a plurality of extended states. A controller is coupled to each of the plurality of light assemblies and programmed to control the actuator assemblies to move the light assemblies between the retracted state and the plurality of extended states. In some examples, the housing is situated on a wheeled cart and includes a removable stationary display area.
The examples described herein are related to display devices used for advertising.
In some examples, the display devices incorporate lighting and movement. The lighting and movement are configured to catch a viewer's attention. This can enhance the impact of the brand shown on the display device.
The housing 110 includes side panels 116 with handles 118 attached thereto, and top and bottom panels extending between the side panels. As shown in the rear view of
In some embodiments, the stationary display area 104 is removable, comprising a lift-off header 108 that is situated on the top panel of the housing 110 and may be removed from the housing 110. The stationary display area 104 may be decorated with graphic decals, have a back-lit display, and/or have a supplemental video display. Further, in some examples, the header 108 houses electronic components of the display device 100, such as an audio amplifier (not shown), audio speaker(s) 109, lighting (not shown) for the stationary display area 104, etc.
Each of the light assemblies 230 includes an actuator assembly 232 configured to move the light assembly in and out of the dynamic area 106. More specifically, each of the actuator assemblies 232 includes a moving cube 314 movably mounted to a stationary core 312 that is mounted to the grid assembly 140, such that the cube 314 is laterally movable relative to the grid assembly 140. The moving cube 314 includes two L-shaped aluminum panels 314a, 314b attached to a bracket 266 and positioned about a cable tray 264 fastened on top of a rail 272. The rail 272 is attached to a back plate 250 at one end, with the opposite end connected to an attachment bracket 270, and an end plate 268 connects an LED module 316 to the attachment bracket 270. The panels 314a, 314b may also attach to the attachment bracket 270.
The back plate 250 has a bearing block 252 attached thereto with a radial ball bearing 254 supporting a threaded rod 256. One end of the threaded rod 256 is connected to the panels 314a, 314b via a dampener bushing 276 and vibration isolator spacer 274, and the other end is actuated by an actuator 262 configured to extend and retract the threaded rod 256 to selectively position the LED module 316 laterally. The actuator 262 may be, for example, a stepper motor or servo motor. The actuator 262 is mounted to the back plate 250 via vibration isolator spacers 258. A circuit board assembly 260 is further mounted to the back plate 250 and provides electrical components and controller circuitry for operating the actuator 262. The actuator 262 is electrically controlled and moves the moving cube 314 via the rod 256 to any of a plurality of extended positions. The movement can be precisely controlled, so that the position of the moving cube 314 is known. For example, in one embodiment, control is as precise as 0.0079 inches, with a position range of 1 to 29,000.
In the illustrated example, when the moving cube 314 is fully extended, the LED module 316 is moved about 7 inches from the fully retracted position. However, other lengths could be used depending on the amount of movement required.
In some examples, the actuator 262 includes a model F12-BC linear actuator made by W-Robit of Taiwan, which can drive up to 44 pounds with a maximum drive speed of 40 inches per second. In another example, a PAC-UGT040D actuator made by PBC Linear of Roscoe, Ill., is used. Some examples include a model BCH U04 manufactured by Schneider Electric of Palatine, Ill., with an LXM23A servo driver system and Modicon M258 logic controller, also manufactured by Schneider Electric. In still other examples, the actuator 262 includes a model SM23165DT motor made by Moog Animatics of Santa Clara, Calif.
The LED modules 316 mounted to the moving cubes 314 each include a plurality of LEDs, such as the NSSM032T made by Nichia Corporation of Japan, which is a 3-in-1 SMD LED; other types can be used. In this example, the LED module 316 is about five inches in height and width. The LED module 316 is configured to provide a plurality of colors, and each LED module 316 is individually controllable, as described below.
In example embodiments, the LED modules 316 can be configured to display one or a plurality of colors. For example, the LED modules 316 can be configured to display text, pictures, or other effects. In this example, the light assemblies 230 have a 4-millimeter LED pitch. By grouping the LED modules 316, a larger effect, such as a larger picture or text, can be created on the display area 102. In yet another alternative, the brightness of the lights in the LED modules 316 is configurable to create different appearances. For example, the lights can be dimmed or otherwise dulled to create depth and other visual effects, particularly around the edges of the display device 100.
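The following is a minimal sketch, for illustration only, of the edge-dimming idea described above: a per-module brightness factor that fades toward the edges of the grid is applied to a color before it is sent to the LED modules. The grid dimensions, falloff width, and helper names are assumptions, not taken from the disclosure.

```python
# Minimal sketch: dim LED modules near the edges of the grid to suggest depth.
# Grid size and falloff width are illustrative assumptions.

def edge_brightness(row, col, rows, cols, falloff=2):
    """Return a 0.0-1.0 brightness factor that fades toward the grid edges."""
    d = min(row, col, rows - 1 - row, cols - 1 - col)  # distance to nearest edge
    return min(1.0, (d + 1) / (falloff + 1))

def dim_color(rgb, factor):
    """Scale an (R, G, B) tuple of 0-255 values by a brightness factor."""
    return tuple(int(c * factor) for c in rgb)

rows, cols = 9, 16          # assumed grid of LED modules 316
base_color = (255, 64, 0)   # example color for the whole display

frame = [[dim_color(base_color, edge_brightness(r, c, rows, cols))
          for c in range(cols)] for r in range(rows)]
```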
Referring now to
In the illustrated example, control signals for controlling movement of the actuator assemblies 232 are output by the computing device 502 via a USB output, and converted to DMX signals by an appropriate converter 508 which, in turn, communicates with the display device 100 through a DMX splitter 510. Other configurations are possible.
In one example, the actuators 262 of the actuator assemblies 232 are controlled by the computing device 502 according to a percentage of extension for the moving cube 314. For example, the computing device 502 defines a percentage, such as 0 percent, 10 percent, 25 percent, 50 percent, 75 percent, and/or 100 percent for the moving cube 314 at a given point in time. The percentage is translated to instructions transmitted to the appropriate actuator 262 to extend or retract the moving cube 314 the desired amount. By defining a changing percentage over time, the movement of the moving cube 314 can be choreographed, as desired.
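The following is a minimal sketch, for illustration only, of the percentage-based control described above: an extension percentage is mapped linearly to a position count in the stated 1 to 29,000 range (and to an 8-bit DMX level), and a simple schedule of (seconds, percent) pairs choreographs one moving cube over time. The linear mapping, the helper names, and the schedule are assumptions.

```python
# Minimal sketch: translate an extension percentage into a commanded position
# count and a DMX channel level, then play back a simple choreography.

POSITION_MIN, POSITION_MAX = 1, 29_000   # position range stated above

def percent_to_position(percent):
    """Map 0-100 percent extension to an integer position count (assumed linear)."""
    percent = max(0.0, min(100.0, percent))
    return round(POSITION_MIN + (POSITION_MAX - POSITION_MIN) * percent / 100.0)

def percent_to_dmx(percent):
    """Map 0-100 percent extension to an 8-bit DMX channel value (assumed linear)."""
    return round(max(0.0, min(100.0, percent)) * 255 / 100.0)

# A simple choreography: retracted, half out, fully out, back in.
schedule = [(0.0, 0), (1.0, 50), (2.0, 100), (3.0, 0)]

for t, pct in schedule:
    print(f"t={t:>4}s  percent={pct:>3}  "
          f"position={percent_to_position(pct):>6}  dmx={percent_to_dmx(pct):>3}")
```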
In addition, the computing device 502 can define colors to be displayed by the LED module carried by the moving cube 314. The colors of the LEDs on the LED modules 316 can be changed to create the desired effect. LED control signals are output by the computing device 502 to an LED display driver 512, which communicates with the LED modules 316 of the light assemblies 230.
Because each of the actuator assemblies 232 and each of the LED modules 316 can be controlled individually, the movement and color of each light assembly 230 can be controlled to create patterns or other visual effects for the display device 100.
For example, the actuator assemblies 232 in a certain area of the display can be extended and retracted in coordination to give the appearance of movement of the display device 100. In one such example, the actuator assemblies 232 are controlled to provide a wave-like effect across the display device 100. In another example, the control is randomized, so that the actuator assemblies 232 move in random patterns. Other configurations and patterns are possible.
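As a minimal sketch of the wave-like effect, the snippet below computes a traveling sine wave across the columns of the grid, producing an extension percentage for each light assembly at each frame. The grid size, wavelength, speed, and frame rate are illustrative assumptions; each frame could then be translated into actuator instructions as in the sketch above.

```python
import math

# Minimal sketch: a sine wave travels across the columns of the grid, and each
# module's extension percentage follows the wave.

ROWS, COLS = 9, 16        # assumed layout of light assemblies 230
WAVELENGTH = 8.0          # columns per full wave (assumed)
SPEED = 4.0               # columns per second (assumed)

def wave_frame(t):
    """Return a ROWS x COLS grid of extension percentages at time t (seconds)."""
    frame = []
    for r in range(ROWS):
        row = []
        for c in range(COLS):
            phase = 2 * math.pi * (c - SPEED * t) / WAVELENGTH
            row.append(50.0 * (1.0 + math.sin(phase)))   # 0-100 percent
        frame.append(row)
    return frame

frames = [wave_frame(t / 30.0) for t in range(90)]   # 3 seconds at an assumed 30 fps
```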
By controlling the display device 100 in this manner, the overall visual impact of the display device 100 is increased. Specified patterns can be used to further enhance the visual effect of the display device 100, thereby catching the eye of a viewer.
In some examples, the patterns are configured to make certain shapes and depictions. For example, as shown in
For example, the computing device 502 can be programmed to create various shapes on the display device 100 depending on the time of day, as well as control the sequence of those shapes. The sequence can be choreographed or randomized, as desired. For example, in one embodiment, the computing device 502 can control the sign to depict fluid flowing out of a bottle. Many other examples are possible.
Referring now to
At operation 610, a video creation application, such as three-dimensional visualization software, is used to author content for the display device 100. The software, which is executed by the computing device 502 (or any other computing device, not necessarily connected to the display device 100), allows for the creation and/or manipulation of video content that will be used to control the display device 100. The software optionally includes an emulator that depicts the display device 100, allowing a user to author different content for the display device. One example of such content is an advertisement featuring a bottle. The advertisement can define the shape, motion, and color of the bottle to be depicted on the display device 100.
Next, at operation 620, the content is edited into video (i.e., color) and motion components. This is accomplished by extracting the video and motion components so that dual synchronized video files are formed. The first video file is for controlling the light display (the LED modules), and the second video file is for controlling the motion (i.e., the moving cubes).
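The following is a minimal sketch, for illustration only, of this split into dual synchronized files: each authored frame is assumed to carry, per module, an (R, G, B) color and an extension percentage; the colors form the video (LED) sequence and the extensions are encoded as 0-255 values in the motion sequence. The data layout and function name are assumptions.

```python
# Minimal sketch: split authored per-module content into a color sequence and
# a synchronized motion sequence of equal length.

def split_content(authored_frames):
    """authored_frames: list of grids of ((r, g, b), percent) cells per module."""
    video_frames, motion_frames = [], []
    for frame in authored_frames:
        video_frames.append([[cell[0] for cell in row] for row in frame])
        motion_frames.append([[round(cell[1] * 255 / 100) for cell in row]
                              for row in frame])
    return video_frames, motion_frames   # two synchronized sequences
```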
The first video file is transferred to operations 630, 640, whereat the LED modules of the display device 100 are controlled. This includes controlling which of the LED modules are active and any content displayed on the LED modules. In this example, the LED modules are controlled using the GigE protocol.
The second video file is transferred to operation 650, whereat the motion file is interpreted and translated into the DMX protocol. This protocol is, in turn, used at operation 660 to control movement of the moving cubes of the actuator assemblies by the servo motor.
By synchronizing the first and second video files, the visual and motion components of the display device 100 are synchronized to create the desired effects as defined by the author.
In these examples, the computing device 502 includes one or more processing units and computer readable media. Computer readable media includes physical memory such as volatile memory (e.g., RAM), non-volatile memory (e.g., ROM, flash memory, etc.), or some combination thereof. Additionally, the computing device can include mass storage (removable and/or non-removable), such as magnetic or optical disks or tape. An operating system, such as Linux or Windows, and one or more application programs can be stored on the mass storage device. The computing device can include input devices (such as a keyboard and mouse) and output devices (such as a monitor and printer).
The computing device also includes network connections to other devices, computers, networks, servers, etc. In example embodiments, the computing device communicates with other components through one or more networks, such as a local area network (LAN), a wide area network (WAN), the Internet, or a combination thereof. Communications can be implemented using wired and/or wireless technologies.
An “Agency Preview Tool” (APT) 700 is provided in some disclosed implementations. The APT 700 allows agencies preparing content for the display device 100 to preview the content as it will appear on the display device 100. Additionally, embodiments of the APT 700 correctly format video content that is to be exported for display on the device 100 to ensure its compatibility with the various components of the system 500. Among other things, this allows creative preview and experimentation while creating new content, as well as visual verification of correct synchronization between the video content and the movement content that drives the movement of the actuator assemblies 232. In some examples, the APT 700 further checks to ensure technical compliance of the created content with the physical capabilities and limits of the display device 100. For instance, the APT 700 may verify that the content to be displayed does not require the actuator assemblies 232 to move faster than they are capable of moving. Content to be displayed is exported in a format ready for integration, including files with content suitable for display on the display device 100, encoded module movement content, a sign preview, and metadata containing information such as the estimated power consumption of the content.
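The following is a minimal sketch, for illustration only, of the speed-compliance check mentioned above. It assumes the motion is encoded as 0-255 greyscale values per module (as described later for the MSC), uses the 7-inch full travel from the earlier example, and assumes a frame rate and a speed limit; the function name is also an assumption.

```python
# Minimal sketch: flag frames whose per-module change implies an actuator
# speed above an assumed limit.

TRAVEL_IN = 7.0        # full extension, per the illustrated example above
FPS = 30               # assumed frame rate of the motion video
MAX_SPEED_IN_S = 40.0  # assumed actuator speed limit

def check_motion_speed(motion_frames):
    """motion_frames: list of grids of 0-255 values. Returns a list of violations."""
    violations = []
    for i in range(1, len(motion_frames)):
        prev, cur = motion_frames[i - 1], motion_frames[i]
        for r, (prow, crow) in enumerate(zip(prev, cur)):
            for c, (p, v) in enumerate(zip(prow, crow)):
                speed = abs(v - p) / 255.0 * TRAVEL_IN * FPS   # inches per second
                if speed > MAX_SPEED_IN_S:
                    violations.append((i, r, c, speed))
    return violations
```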
Embodiments of the APT 700 receive as inputs a display video intended to be shown on the LED modules 316, and a movement video which is an encoded representation of the LED module movement. As shown in
Examples of the APT 700 further provide the ability to export the content once the user has completed creating and previewing it. With the system illustrated in
Valid export data 724 include, for example, a file with video correctly formatted for display by the LED controllers 370, and a file with video correctly formatted for interpretation into movement of the actuator assemblies 232.
As noted above, some embodiments of the APT 700 provide the interface for including actuator assembly 232 movement along with the displayed video content. End users may either create movement to go along with their display videos using a video editing application 702 of choice, or they may select default movement files provided within the APT. For example, the APT 700 may include a library of pre-generated movement videos that define predetermined movement patterns available for users of the APT 700.
Embodiments of the APT 700 are configured to verify that the video and motion files are the same length. If the files are not the same length, various solutions may be employed. For example, if the content video is longer than the motion video, an error message is presented to the user informing them that, if they continue, the motion content will be looped. If the motion video is longer than the content video, an error message is presented to the user informing them that, if they continue, the motion content will be truncated.
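The following is a minimal sketch, for illustration only, of this length check. It assumes the video and motion content are available as lists of frames; the warnings stand in for the error messages described above, and the function name is an assumption.

```python
import itertools
import warnings

# Minimal sketch: loop a short motion sequence or truncate a long one so that
# it matches the content video length, warning the user in either case.

def reconcile_lengths(video_frames, motion_frames):
    nv, nm = len(video_frames), len(motion_frames)
    if nm < nv:
        warnings.warn("Motion content is shorter than video; it will be looped.")
        motion_frames = list(itertools.islice(itertools.cycle(motion_frames), nv))
    elif nm > nv:
        warnings.warn("Motion content is longer than video; it will be truncated.")
        motion_frames = motion_frames[:nv]
    return video_frames, motion_frames
```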
To combine the content and movement video files to simulate the video and motion together, both a content video file and a corresponding movement file are loaded into the APT 700 from the editing application 702. For the content video file, the APT 700 checks for the appropriate file type, length, etc. in the validation process 722. Each video frame is read in sequence and converted to an image for manipulation by a three-dimensional simulator. As noted above, the disclosed example display device 100 includes a grid having movable LED modules 316. The content video file is thus split into a corresponding grid for display on the individual LED modules 316 of each module 222. The movement file is the same size as the content video file and is also split into a corresponding grid.
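As a minimal sketch of the grid split described above, the snippet below cuts a decoded frame (a two-dimensional array of pixel values) into equally sized tiles, one per LED module position. It assumes the frame dimensions divide evenly by the grid size; the function name and arguments are assumptions.

```python
# Minimal sketch: split one decoded frame into a rows x cols grid of tiles.

def split_into_grid(frame, rows, cols):
    """frame: list of pixel rows. Returns a rows x cols grid of tiles."""
    h, w = len(frame), len(frame[0])
    th, tw = h // rows, w // cols          # tile height and width in pixels
    grid = []
    for r in range(rows):
        grid_row = []
        for c in range(cols):
            tile = [line[c * tw:(c + 1) * tw]
                    for line in frame[r * th:(r + 1) * th]]
            grid_row.append(tile)
        grid.append(grid_row)
    return grid
```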
As noted above in conjunction with
In certain implementations, the MSC 800 is installed at the location of the display device 100 and provides operational functionality for the movement of the actuator assemblies 232. In some embodiments, the DMX protocol (DMX512) is used for communicating with the actuator assemblies 232. The signals output by the MSC are thus converted to DMX instructions suitable for controlling the actuator assemblies 232. In some embodiments, the LightFactory control system from dreamsolutions of Auckland, New Zealand is used to convert the greyscale video signal data into DMX512 instructions.
A conversion process 808 converts the motion data to visual data, and the MSC 800 displays the motion data as a visual output (the greyscale data is displayed on the MSC monitor 810). Each frame of movement data is converted to a greyscale red, green, and blue value. This greyscale value is drawn to the screen 810 as 28-pixel-wide by 28-pixel-high squares arranged in a grid (exactly like the movement video file exported from the APT 700). The visual motion data is converted to an internal representation of motion. The greyscale value for each module is converted into a numeric value between 0 and 255 (0 being completely black and 255 being fully white). The greyscale numeric value is then converted to DMX512 instructions such that the numeric values correspond to the extension of the actuator assemblies 232, as described above.
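The following is a minimal sketch, for illustration only, of this greyscale-to-DMX conversion: each module's 28 x 28 square is averaged to a single 0-255 value, which becomes a DMX channel level corresponding to the module's extension. Treating 0 as fully retracted and 255 as fully extended, and the per-module channel numbering, are assumptions.

```python
# Minimal sketch: average each module's 28 x 28 block to a 0-255 greyscale
# value and map modules to DMX channel levels.

SQUARE = 28  # pixels per module square, as drawn to the MSC monitor

def module_greyscale(frame, row, col):
    """Average the 28 x 28 block for one module into a 0-255 value."""
    total = 0
    for y in range(row * SQUARE, (row + 1) * SQUARE):
        for x in range(col * SQUARE, (col + 1) * SQUARE):
            r, g, b = frame[y][x]
            total += (r + g + b) // 3
    return total // (SQUARE * SQUARE)

def frame_to_dmx(frame, rows, cols):
    """Return {channel: level} with one DMX channel per module (assumed numbering)."""
    return {row * cols + col + 1: module_greyscale(frame, row, col)
            for row in range(rows) for col in range(cols)}
```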
In some implementations, the MSC 800 further includes a power usage detection process that monitors power consumption of the display device 100. For example, a power consumption threshold parameter may be determined and used as an input to the MSC 800. Power usage is monitored for module movement, LEDs, and other ancillary components. If power usage exceeds the threshold parameter, a warning or message is sent to an event log.
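The following is a minimal sketch, for illustration only, of this power check: readings for module movement, LEDs, and ancillary components are summed and compared against a threshold parameter, and an exceedance is written to an event log. The log file name, threshold value, and reading source are assumptions.

```python
import logging

# Minimal sketch: compare total measured power against an assumed threshold
# and record an exceedance in an event log.

logging.basicConfig(filename="display_events.log", level=logging.INFO)

POWER_THRESHOLD_W = 1500.0   # assumed threshold parameter

def check_power(readings_w):
    """readings_w: dict of component name -> watts measured for that component."""
    total = sum(readings_w.values())
    if total > POWER_THRESHOLD_W:
        logging.warning("Power usage %.1f W exceeds threshold %.1f W (%s)",
                        total, POWER_THRESHOLD_W, readings_w)
    return total

check_power({"movement": 900.0, "leds": 650.0, "ancillary": 120.0})
```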
The various embodiments described above are provided by way of illustration only and should not be construed as limiting. Those skilled in the art will readily recognize various modifications and changes that may be made to the embodiments described above without departing from the true spirit and scope of the disclosure or the following claims.
This application is a National Stage application of PCT International Patent Application No. PCT/US2017/053290, filed on Sep. 25, 2017, which claims benefit of priority to U.S. Provisional patent application Ser. No. 62/399,767, filed Sep. 26, 2016, which applications are incorporated herein by reference. To the extent appropriate, a claim of priority is made to each of the above disclosed applications.
| Filing Document | Filing Date | Country | Kind |
| --- | --- | --- | --- |
| PCT/US2017/053290 | 9/25/2017 | WO | 00 |

| Publishing Document | Publishing Date | Country | Kind |
| --- | --- | --- | --- |
| WO2018/058055 | 3/29/2018 | WO | A |
| Number | Date | Country |
| --- | --- | --- |
| 20200219424 A1 | Jul 2020 | US |

| Number | Date | Country |
| --- | --- | --- |
| 62399767 | Sep 2016 | US |