1. Field
This disclosure generally relates to the field of data transference.
2. General Background
Media products are typically placed in packaging on store shelves that allows consumers to read information about the content of the media products. Examples of such media products are Blu-ray discs or DVDs with movies, television shows, video games, or the like. The packaging may include colorful artwork, sound effects, and/or the like that may grab the attention of potential consumers. The packaging itself may be of such interest to some consumers that they keep the packaging after purchasing the corresponding product.
In contrast, consumers may purchase many media products online without any packaging. For example, consumers may purchase a movie online and download the movie almost instantly. The consumers are able to obtain the content more quickly than if they had to travel to a store to buy a product within a package.
Although online consumers are able to purchase media products more quickly than in-store customers, online consumers do not receive the corresponding packaging for the media content unless they wait for the package to be mailed to them. Because current packaging configurations have largely remained stagnant, many consumers now purchase more of their content through online websites rather than through the in-store experience. As a result, stores selling media products face new challenges in retaining the customer base that previously purchased media products from those stores.
A process detects, at a first proximity-based device within a first object, presence of a second proximity-based device within a second object. The presence is within a proximity. Further, the process modifies a first display associated with the first object based upon a second display associated with the second object.
Further, an apparatus comprises a first proximity-based device corresponding to a first object. The first proximity-based device detects a presence of a second proximity-based device corresponding to a second object, receives a communication from the second proximity-based device that is based upon a proximity-based communication protocol, and modifies a first display associated with the first object based upon the communication, the presence being within a proximity.
Another process detects, at a first proximity-based device within a first product, presence of a second proximity-based device within a second product. The presence is within a proximity. Further, the process modifies a first display associated with a first object corresponding to the first product based upon a second display associated with a second object corresponding to the second product.
Yet another process detects, at a first proximity-based device corresponding to a first object, presence of a second proximity-based device corresponding to a second object. The presence is within a proximity. Further, the process receives, at the first proximity-based device, a communication from the second proximity-based device that is based upon a proximity-based communication protocol. In addition, the process modifies a first display associated with the first object based upon the communication.
The above-mentioned features of the present disclosure will become more apparent with reference to the following description and accompanying drawings, wherein like reference numerals denote like elements.
A dynamic transformation configuration transforms display media based upon a proximity-based communication protocol. Display media may be product packages, product covers, product binders, or the like. When the display media are placed within proximity to each other, the proximity-based communication protocol provides a mode of communication for conveying a change in appearance, sound, smell, or the like. For example, two movie products for different movies in a movie series may be placed within corresponding product packages. Those product packages may then be placed within proximity to each other. As a result, the packaging features of either or both packages may be changed. Packaging features may include graphics, e-ink displays, color, smell, and/or the like. For instance, the color of both product packages may be modified as a result of both packages being placed within a predefined proximity of each other.
Accordingly, the display media, e.g., product packages, are more meaningful than previous display media that were static. Users may have more interest in the in-store purchase experience as the display media of the dynamic transformation configuration provide an entertainment experience that is based upon the product packaging. Further, the appearance of a display medium may be modified differently based upon being within proximity to different display media. For example, a product package may turn a particular color when in proximity to a product package in the same movie series and a different color when in proximity to a product package in a different movie series. Therefore, the dynamic transformation configuration provides users with the ability to adapt their entertainment experience with different media packaging.
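By way of illustration, the following Python sketch (the disclosure does not prescribe any particular language) shows one hypothetical shape such a proximity-based communication could take: a small message naming the nearby product, its series, and the packaging feature to be changed. The field names, feature types, and values are assumptions made purely for illustration.

```python
# Hypothetical sketch of a message two display media might exchange once they
# are within proximity; field names and feature types are illustrative only.
from dataclasses import dataclass
from enum import Enum


class PackagingFeature(Enum):
    GRAPHIC = "graphic"
    COLOR = "color"
    SOUND = "sound"
    SMELL = "smell"


@dataclass
class TransformationMessage:
    product_id: str            # identifies the nearby product
    series_id: str             # identifies the movie series it belongs to
    feature: PackagingFeature  # which packaging feature to change
    value: str                 # e.g., a color name or an audio clip identifier


# Example: a package asks its neighbor in the same series to turn blue.
msg = TransformationMessage("series-x-vol-2", "series-x", PackagingFeature.COLOR, "blue")
print(msg)
```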
In one configuration, the packages only change their appearance, sound, smell, or the like after the user purchases the corresponding products. The store checkout device provides a code to a user device, e.g., a mobile device, or to the proximity-based devices within the product and/or product package, such that the code is utilized to perform validation prior to the modification of the appearance, sound, smell, or the like of the packages. Accordingly, in this configuration the user is able to modify features of product packages only after purchase.
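A minimal sketch of this purchase-validation idea follows, assuming the checkout device issues a code derived from the receipt and the package's proximity-based device checks that code before permitting any transformation. The hashing scheme, secret handling, and function names are assumptions and not part of the disclosure.

```python
# Hypothetical purchase-validation sketch: the checkout device issues a code,
# and the package's proximity-based device validates it before transforming.
import hashlib
import hmac

STORE_SECRET = b"store-secret-key"  # assumed shared secret for illustration


def issue_purchase_code(receipt_id: str) -> str:
    """Run by the checkout device at purchase time."""
    return hmac.new(STORE_SECRET, receipt_id.encode(), hashlib.sha256).hexdigest()


def transformation_allowed(receipt_id: str, presented_code: str) -> bool:
    """Run by the proximity-based device before modifying the packaging."""
    expected = issue_purchase_code(receipt_id)
    return hmac.compare_digest(expected, presented_code)


code = issue_purchase_code("receipt-0001")
print(transformation_allowed("receipt-0001", code))  # True: packages may transform
print(transformation_allowed("receipt-0002", code))  # False: transformation blocked
```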
In an alternative configuration, the user is able to modify package features prior to purchase. For example, a user may want to preview the appearance of two particular product packages in the store prior to purchase. The user may then peruse various possible product package combinations in the store prior to purchase. As a result, the user has an incentive to travel to the store to peruse a variety of entertainment experiences provided by the transformative packaging.
The product corresponding to the product packaging may be a media product, e.g., a Blu-ray disc, DVD, video game, or the like. Alternatively, the product may be an entertainment product that is not a media product, e.g., an action figure, a doll, or the like.
In one configuration, the first display medium 102 detects the presence of the second display medium 104 within a proximity. In another configuration, the second display medium 104 detects the presence of the first display medium 102 within the proximity. In yet another configuration, both display media each detect the presence of each other within the proximity.
In one aspect, the proximity is predefined. For example, the predefined distance may be established such that a tap of the two display media has to be performed for presence detection. Alternatively, the predefined distance may be large enough that a tap is not required.
A variety of wireless technologies may be utilized for the first display medium 102 to detect the presence of the second display medium 104. Various transmitters, receivers, and/or transceivers may be utilized, e.g., an RFID chip, an RFID tag, a Near Field Communication (“NFC”) chip, an NFC tag, Bluetooth, or the like. Radio or other communication may be established between the first display medium 102 and the second display medium 104 based upon one or more sets of standards, e.g., NFC standards. After communication is established, a dynamic transformation protocol may be utilized between the display media to trigger or communicate transformations to the displays, e.g., the first display 106 and the second display 108.
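The sketch below shows, under stated assumptions, how presence detection followed by a dynamic transformation exchange might be structured: a polling loop reads nearby tag identifiers and hands any detection to a handler. The read_nearby_tags() stand-in and the identifiers are hypothetical; a real device would use the NFC, RFID, or Bluetooth stack of its hardware.

```python
# Hypothetical detection loop; read_nearby_tags() stands in for an NFC/RFID read.
import time


def read_nearby_tags():
    """Stand-in for a radio read; returns identifiers of tags currently in range."""
    return ["pkg-series-x-vol-2"]  # assumed tag payload for illustration


def detection_loop(own_id: str, on_detect) -> None:
    """Poll for nearby display media and pass the first detection to a callback."""
    while True:
        for tag_id in read_nearby_tags():
            if tag_id != own_id:
                on_detect(tag_id)
                return
        time.sleep(0.5)


def start_transformation(tag_id: str) -> None:
    # After detection, the devices would exchange transformation data
    # (e.g., series identifiers) and trigger display changes.
    print(f"detected {tag_id}; negotiating transformation")


detection_loop("pkg-series-x-vol-1", start_transformation)
```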
Although two display media are illustrated in
Further, the second display medium 104 has a second proximity-based sensor and/or transceiver 206 that is in operable communication with a second servo motor 208. The second servo motor 208 provides a transformation command to the rear of the second display 108 based upon data received from the second proximity-based sensor and/or transceiver 206. Alternatively, the servo motor 208 may be configured to provide the transformation command to the front or side of the second display 108.
In one configuration, the first display medium 102 detects the presence of the second display medium 104 within the proximity when the first proximity-based sensor and/or transceiver 202 detects the presence of the second proximity-based sensor and/or transceiver 206 within the proximity. In one aspect, a predetermined dynamic transformation protocol is established such that the servo motors 204 and 208 are programmed to perform certain actions based upon the detection of particular display media within the proximity. For example, the dynamic transformation protocol may provide that media product package displays 106, 108 corresponding to different media products in the same movie series should change to the color blue when the media product packages are within proximity to each other. As another example, the dynamic transformation protocol may provide that media product package displays 106, 108 corresponding to media products in different movie series should change to the color green when the media product packages are within proximity to each other.
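A minimal sketch of such a preprogrammed rule, mirroring the two examples above, might map the detected package's series to a color as follows; the series identifiers and color values are illustrative assumptions.

```python
# Hypothetical preprogrammed rule: same series -> blue, different series -> green.
OWN_SERIES = "series-x"  # assumed identifier for this package's movie series


def color_for_detected_package(detected_series: str) -> str:
    """Pick the color the preprogrammed servo motor should apply to the display."""
    return "blue" if detected_series == OWN_SERIES else "green"


print(color_for_detected_package("series-x"))  # blue: same movie series
print(color_for_detected_package("series-y"))  # green: different movie series
```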
The servo motors 204 and 208 perform transformative actions upon receiving data from the corresponding proximity-based sensors and/or transceivers indicating that the presence of other proximity-based sensors and/or transceivers has been detected. The preprogrammed servo motors then perform the transformative actions accordingly.
In another configuration, the servo motors 204, 208 are not preprogrammed. The proximity-based sensors and/or transceivers 202, 206 may provide actions to be performed to the servo motors 204, 208.
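In that alternative, the decision logic sits in the transceiver rather than in the servo motor. A brief sketch under an assumed command vocabulary:

```python
# Hypothetical push model: the transceiver sends an explicit command to a
# servo motor that has no preprogrammed rules of its own.
class ServoMotor:
    def execute(self, command: dict) -> None:
        print("performing", command)


def on_presence_detected(detected_series: str, own_series: str, servo: ServoMotor) -> None:
    # The transceiver chooses the action and instructs the servo motor.
    color = "blue" if detected_series == own_series else "green"
    servo.execute({"action": "change_color", "color": color})


on_presence_detected("series-x", "series-x", ServoMotor())
```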
In one aspect, the actions performed by the servo motors 204 and 208 are directly performed by the servo motors 204 and 208 rather than through indirect commands. For instance, the servo motors 204 and 208 may release chemicals onto the displays 106 and 108 to change the appearance, e.g., color, of the displays 106 and 108.
In another aspect, devices other than servo motors 204 and 208 are utilized. For example, processors in display media 102 and 104 may be utilized to perform actions based upon detecting or being notified of detection of another display medium in proximity.
As another example of the features A and B, the first feature A may include a graphic representation of a character corresponding to content associated with the first display 106, whereas the second feature B may include a graphic representation of a character corresponding to content associated with the second display 108. As yet another example, the first feature A may include text corresponding to content associated with the first display 106, whereas the second feature B may include text corresponding to content associated with the second display 108.
Utilizing the example where the first feature A illustrated in
As yet another example, the content associated with the display media 102 and 104 may be different movies for a particular movie series with the same characters. The first feature A and the second feature B may each have graphic artwork, but the feature C may provide common graphic artwork for the series that may be obtained when the display media 102 and 104 are in proximity to each other. For example, fans of a particular series may only be able to obtain a particular cover for a series by purchasing the entire series and placing each product package in proximity to each other. Therefore, fans have an incentive to purchase products with the product packages.
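A short sketch of this complete-the-series behavior follows, assuming each package knows the full set of volumes in its series and shows the shared artwork (feature C) only when every other volume has been detected nearby; the series membership data and feature names are hypothetical.

```python
# Hypothetical "complete the series" check for unlocking shared artwork.
SERIES_MEMBERS = {"vol-1", "vol-2", "vol-3"}  # assumed complete series


def feature_to_display(own_volume: str, detected_volumes: set) -> str:
    """Return the shared series artwork only when the whole series is present;
    otherwise keep the package's individual artwork."""
    if SERIES_MEMBERS <= (detected_volumes | {own_volume}):
        return "feature-C-series-artwork"
    return f"feature-individual-{own_volume}"


print(feature_to_display("vol-1", {"vol-2"}))           # individual artwork
print(feature_to_display("vol-1", {"vol-2", "vol-3"}))  # shared series artwork
```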
As another example, various audio experiences may be provided. For example, a series theme song may be utilized in place of individual movie theme songs when the display media 102 and 104 are within proximity to each other.
The processes described herein may be implemented in a general, multi-purpose or single purpose processor. Such a processor will execute instructions, either at the assembly, compiled or machine-level, to perform the processes. Those instructions can be written by one of ordinary skill in the art following the description of the figures corresponding to the processes and stored or transmitted on a computer readable medium. The instructions may also be created using source code or any other known computer-aided design tool. A computer readable medium may be any medium capable of carrying those instructions and include a CD-ROM, DVD, magnetic or other optical disc, tape, silicon memory (e.g., removable, non-removable, volatile or non-volatile), packetized or non-packetized data through wireline or wireless transmissions locally or remotely through a network.
It is understood that the apparatuses, systems, computer program products, and processes described herein may also be applied in other types of apparatuses, systems, computer program products, and processes. Those skilled in the art will appreciate that the various adaptations and modifications of the aspects of the apparatuses, systems, computer program products, and processes described herein may be configured without departing from the scope and spirit of the present apparatuses, systems, computer program products, and processes. Therefore, it is to be understood that, within the scope of the appended claims, the present apparatuses, systems, computer program products, and processes may be practiced other than as specifically described herein.
This application claims priority to U.S. Provisional Patent Application Ser. No. 61/766,065, filed on Feb. 18, 2013, entitled TRANSFERENCE OF DATA TO PROVIDE CONTENT, which is hereby incorporated by reference in its entirety.