Traditionally, product placement is a form of advertising that is incorporated during the creation of the original Multi-Media Program to present “advertising” to the recipient without interrupting the program for a formal, traditional commercial. The prominent placement of a product as part of the Multi-Media Program generates brand recognition among the program recipients in a manner that is far more subtle and unobtrusive than traditional commercials.
The multi-media object management system controls the retrieval of Object data that comprises an Object Representation and Object Characteristics and the integration of this Object data into a corresponding selected one of the predetermined Multi-Media Object Locations which are components of the Multi-Media Program. This enables advertisers to precisely control product placement on a customized basis thereby to dynamically modify the content of the Multi-Media Program on a centralized basis, a regional basis, and/or as it is delivered to the individual recipient. The delivery can also be based on demographic, psychographic, or socio-graphic groupings, which may or may not be geographically proximate.
In the multi-media object management system, the process of creating the Multi-Media Program takes “Master Program” content and typically defines a plurality of Multi-Media Object Locations (although a single Multi-Media Object Location is the minimal case) together with Object Management Data, which is collectively termed herein “Object Ready Content”. These Multi-Media Object Locations are sites in the Master Program that can receive animation, audio, moving Objects, stationary Objects, and any other dynamic data, whether uni-dimensional, two-dimensional, three-dimensional, or multi-dimensional. The Object Ready Content is then ready to receive selected Objects.
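For illustration purposes only, the structures described above can be modeled as a brief sketch. The following Python is a minimal, hypothetical data model; the class names and fields are assumptions introduced here for clarity and are not part of the system's specification:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ObjectManagementData:
    """Object-centric data attached to a Multi-Media Object Location."""
    object_type: str            # e.g., "beverage_container"
    position: Tuple[int, int]   # where in the frame the Location sits
    start_time: float           # when the Object first appears (seconds)
    lifetime: float             # how long the Object "lives" (seconds)
    dimensions: List[str]       # e.g., ["video", "audio"]

@dataclass
class MultiMediaObjectLocation:
    """A site in the Master Program that can receive an Object."""
    location_id: int
    management_data: ObjectManagementData
    inserted_object_id: Optional[int] = None  # filled in by the Merge step

@dataclass
class ObjectReadyContent:
    """The processed Master Program plus its Object Locations and data."""
    master_program_id: str
    locations: List[MultiMediaObjectLocation] = field(default_factory=list)
```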
The Object selection process for a given Multi-Media Object Location having spatial and temporal attributes is finalized by reconciling Object Characteristic data with Object Management Data, together with Master Program Rule Set information and Recipient Data (not always necessary or available; in particular, if the Object insertion is done in the central architecture, there is no Recipient Data). In addition, the Object Location Brokerage can have bi-directional connections to the reconcile process, as needed. This reconcile process ensures that the purchase process has not resulted in the placement of inappropriate Objects or the selection of an Object that cannot be used to populate the selected Multi-Media Object Location.
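As a rough illustration of this reconcile step, the sketch below checks a candidate Object against the Object Management Data, a set of Master Program rules, and (when available) Recipient Data. The dictionary fields and the rules-as-predicates formulation are illustrative assumptions, not the system's defined interfaces:

```python
from typing import Callable, List, Optional

def reconcile(obj: dict,
              management_data: dict,
              rule_set: List[Callable[[dict], bool]],
              recipient_data: Optional[dict] = None) -> bool:
    """Return True if the candidate Object may populate the Location."""
    # The Object's type must match the type the Location was defined for.
    if obj.get("object_type") != management_data.get("object_type"):
        return False
    # The Object must supply every dimension the Location requires.
    if not set(management_data.get("dimensions", [])) <= set(obj.get("dimensions", [])):
        return False
    # Master Program Rule Set entries are modeled here as predicates.
    if not all(rule(obj) for rule in rule_set):
        return False
    # Recipient Data is optional; with central insertion it is absent.
    if recipient_data is not None:
        if obj.get("brand") in recipient_data.get("excluded_brands", []):
            return False
    return True
```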
In addition, the present multi-media object highlighting system produces a representation of the object that highlights the object in the scenes in which it appears. The highlighting can be any human sensible characteristic, such as, but not limited to: flashing, changes in brightness, movement, change in representation, and the like. The highlighting can also include the use of an anomaly, such as a color representation in a black and white multi-media program or a black and white representation in a color multi-media program, or an out-of-context object, such as an object inappropriate for the time frame of the program content. Object highlighting could be multi-dimensional, wherein the object takes on the appearance of a three-dimensional shape in the context of a two-dimensional visual program (the converse could also be true; that is, the object could be two-dimensional and the program content three-dimensional). This juxtaposition of dimensions makes an object “stand out” with respect to the program content. In addition, the highlighting may occur in a sensory form other than visual. The output of this complex process is the Multi-Media Program.
In order to ensure a proper understanding of the present multi-media object highlighting system, the following definitions are provided to clarify the terminology used herein.
The Master Program 11 and its associated Master Program Rule Set 12 are received by the multi-media object management system 1 and then processed to identify Multi-Media Object Locations 21 contained in the Master Program 11 that are to be used for Object placement in conjunction with Object Management Data 22. The Objects 32 can be identified uniformly throughout the Master Program 11 (every instance of an Object 32) or can be selectively targeted. The multi-media object management system 1 creates Multi-Media Object Locations 21, which are sites in the Master Program 11 that can receive animation, audio, moving Objects, stationary Objects, and any other dynamic data, whether uni-dimensional, two-dimensional, three-dimensional, or multi-dimensional. Each of these Multi-Media Object Locations 21 has associated therewith Object Management Data 22, which is Object-centric information associated with the Multi-Media Object Location 21, such as the Object type, the Object location, the time and place or extent in the Multi-Media Program 42 where an Object 32 occurs, the number of dimensions that a given Object 32 has (video and audio or just video, for example), and how long an Object 32 “lives”. Once the processing of the Master Program 11 is completed, the resultant product is termed Object Ready Content 23 and consists of a copy of the Master Program 11 processed to contain the Multi-Media Object Locations 21 and the associated Object Management Data 22.
The Object Ready Content 23 comprises the processed Master Program 11 and Object Management Data 22 and is described below as being transported directly or via a distribution network 120 from the Content Source 101 to the Object Insertion Processor 110 in order to provide the content stream that can be populated with selected Objects 32. However, the Object Ready Content 23 that is stored in Content Source 101 can be written to removable media for physical distribution to locations where the Object Insertion Processor 110 resides. Thus, conceptually, the distribution network 120 can comprise a physical media delivery operation. The Object Ready Content 23 produced by the Content Source 101 itself becomes a product that can be sold to recipients for use in their personal media players (such as a DVD or High Definition DVD or some future technology such as a 3-D media disk and player). The personal media player, when connected to a communications network or using its own memory which is populated with Objects, can retrieve the Object Ready Content 23 from the removable media and access the Object Source 102 to retrieve the selected Objects 32 and populate the Multi-Media Object Locations 21 in the Object Ready Content 23 to produce the Multi-Media Program for display to the recipient on the personal media player. A further example of this capability is where the recipient purchases the Multi-Media Program at a retail outlet, but also presents removable media that contains Objects written thereon for insertion into the Multi-Media Program to personalize it. As an example, the recipient's media can contain Objects that comprise likenesses of the recipient and/or various acquaintances, which likenesses are to be merged into the Multi-Media Program, appearing for example as extras or bit players in a movie, or providing the recipient's favorite products in the Multi-Media Program (or a video game, including multi-player video games inter-connected via the Internet).
In addition, there is a processing operation that takes place to create Objects 32, which are product representations, each of which has associated therewith Object Characteristics 31 consisting of the set of data that defines the content of an Object 32, and associated data including the class of the Object, identification of the owner of the Object, and limitations (if any) on the use of the Object. Therefore, Objects 32 consist of the elements that are used to populate the Multi-Media Object Locations 21 that have been created within the Object Ready Content 23.
Once the Object Ready Content 23 stream is scheduled to be delivered to recipients, a Merged Program Stream 41 is created, which consists of a combination of the Object Ready Content 23 with a full set or a subset of the Multi-Media Object Locations 21 populated. The Multi-Media Object Locations 21 are populated on a centralized, regional, and/or localized basis (or by demographic, psychographic, or socio-graphic groupings, which may or may not be geographically proximate) by a Merge function 51, and the final product is the Multi-Media Program 42, which consists of the Object Ready Content 23 with all of the Multi-Media Object Locations 21 populated and ready for delivery to a recipient. As part of the Merge function 51, the multi-media object highlighting system produces a representation of the Object that highlights the Object in the scenes in which it appears. The highlighting can be any human sensible characteristic, such as, but not limited to: flashing, changes in brightness, movement, change in representation, and the like. The highlighting can also include the use of an anomaly, such as a color representation in a black and white multi-media program or a black and white representation in a color multi-media program, or an out-of-context Object, such as an Object inappropriate for the time frame of the program content. Object highlighting could be multi-dimensional, wherein the Object takes on the appearance of a three-dimensional shape in the context of a two-dimensional visual program (the converse could also be true; that is, the Object could be two-dimensional and the program content three-dimensional). This juxtaposition of dimensions makes an Object “stand out” with respect to the program content. In addition, the highlighting may occur in a sensory form other than visual.
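A compact sketch of the Merge function, under the same hypothetical data model sketched earlier, follows. The mapping of location identifiers to Objects and the optional highlighting hook are illustrative assumptions:

```python
def merge(object_ready_content, objects_by_location, highlight_fn=None):
    """Populate each Multi-Media Object Location with its selected Object.

    objects_by_location maps location_id -> Object dict. highlight_fn, if
    provided, transforms the Object's representation to add a highlighting
    effect (flashing, brightness change, and so on).
    """
    for loc in object_ready_content.locations:
        obj = objects_by_location.get(loc.location_id)
        if obj is None:
            continue  # a subset of Locations may remain unpopulated
        if highlight_fn is not None:
            obj = dict(obj, representation=highlight_fn(obj["representation"]))
        loc.inserted_object_id = obj["object_id"]
    return object_ready_content  # the Merged Program Stream 41
```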
The population of the Multi-Media Object Locations 21 with Objects 32 is controlled not only by the appropriateness of the Object 32 in the Master Program 11 as identified by the Master Program Rule Set 12 and the Object Characteristic Data 31, but also by the purchasing of the Multi-Media Object Locations 21 by advertisers to have their products displayed in the Multi-Media Program 42 as identified in the Object Location Brokerage 1010 and the recipient-specific characteristics as identified in Recipient Database 33. There are numerous procedures that can be used to effect the purchase and management of the Multi-Media Object Locations 21, and these result in the creation of a set of attribution data that defines the particular Object 32 that is to be used to populate a selected Multi-Media Object Location 21, subject to the Master Program Rule Set 12, the Object Characteristic Data 31, and the Object Management Data 22 confirming the selection (and optionally the Recipient Database 33). The management of the Multi-Media Object Locations 21 is performed in the Reconcile Processor 52 to ensure that the proper Object 32 is populated into the proper Multi-Media Object Location 21.
The Object 32 is inserted into the Multi-Media Program 42 at the Centralized Object Insertion Site 100 before delivery of the Multi-Media Program 42 across a distribution network 120 where all recipients 130-1 to 130-N observe or experience the same inserted Object 32. With centralized insertion, the object management technology resides at a central location, Centralized Object Insertion Site 100, with Objects 32 stored in an Object Source 102 and Object Ready Content 23 stored as data files in a Content Source 101. The Object Ready Content 23 that is stored in Content Source 101 can be generated in its entirety at the Centralized Object Insertion Site 100, or produced by manipulating Master Program 11 that is received directly from Master Program Source 111-1 or received via distribution network 120 from Master Program Source 111-M.
The content stored in the Content Source 101 contains graphical, visual, and aural information plus Object-centric information, such as the Object type, the Object location, the time and place or extent in the Multi-Media Program 42 where an Object 32 occurs, the number of dimensions that a given Object 32 has (video and audio or just video, for example), and how long an Object 32 “lives”. This is described below in more detail with respect to the Content Source description.
In the Regional architecture, the Object insertion function resides at regional sites rather than at the Centralized Object Insertion Site 100, so that different regions can receive Multi-Media Programs populated with region-specific Objects 32.
The Content Source algorithm contains a number of key building blocks which create Object Ready Content 23. Master Program 11 is content that is not Object ready. It becomes Object Ready Content 23 after all Multi-Media Object Locations 21 have been created in the Master Program 11, together with the corresponding Object Management Data 22, which comprises Object-centric information such as the Object type, the Object location, the time and place or extent in the Multi-Media Program where an Object occurs, the number of dimensions that a given Object has (video and audio or just video, for example), and how long an Object “lives”.
At step 400, the Master Program 11 is retrieved for processing, and the Multi-Media Object Locations 21 are created therein; the resultant Processed Master Program 308 is stored in memory at step 404.
Along a parallel algorithmic path, the Object Management Process 305 uses the retrieved Master Program 11 and identifies at step 405 the Object types, the Object location, the time and place or extent where an Object 32 occurs, the number of dimensions that a given Object 32 has (video and audio or just video, for example), and how long an Object 32 “lives”. For example, a movie that is broadcast in 2008 and then again in 2010 quite likely has different Objects 32 being used. The Object Management Process 305 at step 406 stores this Multi-Media Object Location-related information as Object Management Data 22 in memory 306. The Object Management Data 22 contains all of the aforementioned Object attributes and is used to convey this information downstream to the Object Insertion Processor 110.
The Data Combiner Process 307 combines the Processed Master Program 308 with the associated Object Management Data 22 at step 407 to create the Object Ready Content 23 which is stored in Object Ready Content Memory 309 at step 408.
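The two parallel paths and the combining step can be summarized in a short sketch; the step numbers follow the algorithm described above, while the function bodies are stubbed placeholders introduced here for illustration:

```python
def create_object_locations(master_program):
    """Steps 400-404: mark the Multi-Media Object Locations and produce the
    Processed Master Program (stubbed for illustration)."""
    return {"program": master_program, "locations": [1, 2]}

def derive_management_data(master_program):
    """Steps 405-406: record Object type, position, dimensions, and lifetime
    for each Location (stubbed for illustration)."""
    return {1: {"object_type": "beverage"}, 2: {"object_type": "vehicle"}}

def content_source_process(master_program, object_ready_content_memory):
    """Step 407: the Data Combiner merges the two parallel paths; step 408
    stores the result in Object Ready Content Memory."""
    processed = create_object_locations(master_program)
    management = derive_management_data(master_program)
    object_ready_content = {"program": processed, "management_data": management}
    object_ready_content_memory.append(object_ready_content)
    return object_ready_content

memory = []
content_source_process("master_program_frames", memory)
```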
The above-mentioned steps 404, 406 of storing file data may be unnecessary if the Data Combiner Process 307 processes the generated data in real time, and writes the resultant Object Ready Content 23 to the Object Ready Content Memory 309. Likewise, ultra-fast processing and delivery methods may not require Object Ready Content Memory—in this case, the Processed Master Program could be streamed “live” to the Object Insertion Processor, wherever it is located; this architecture modification is likely for a “live” content program such as a sporting event.
Each Object 32 has a plurality of characteristics that define the owner of the Object 32, the representation of the Object 32 in a program (static, adaptable, dynamic), the content of the Object 32 (product identification and limitations on its use), as well as other data that are appropriate for the management of the Object 32 in the Multi-Media Program 42 context. Object Characteristics Data 31 comprise the set of data that defines the content of an associated Object 32: the class of the Object, identification of the owner of the Object, and limitations (if any) on the use of the Object. The characteristics or attributes of an Object can be uni-dimensional or multi-dimensional and can include, but are not limited to: video (moving images), still images, audio, audio that is matched with a given Object, other senses such as feel, smell, and taste, and the like. An Object such as a cup of coffee could have a brand logo, an image, and an aroma. A typical Object Characteristic would be two-dimensional, having an image and an associated sound clip.
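The coffee-cup example above might be represented as follows; the attribute vocabulary, field names, and brand name are hypothetical assumptions for illustration:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ObjectCharacteristics:
    """Data defining an Object's content, ownership, and usage limits."""
    object_class: str                 # e.g., "coffee_cup"
    owner: str                        # the company owning the Object
    representation: str               # "static", "adaptable", or "dynamic"
    usage_limits: List[str] = field(default_factory=list)
    attributes: Dict[str, str] = field(default_factory=dict)  # per-dimension data

coffee_cup = ObjectCharacteristics(
    object_class="coffee_cup",
    owner="ExampleBrand",             # hypothetical brand name
    representation="static",
    usage_limits=["no_childrens_programming"],
    attributes={"image": "cup.png", "logo": "brand.svg", "aroma": "coffee"},
)
```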
Like the Object 32, the Multi-Media Object Location 21 has an owner associated with it, albeit different from the Object 32 owner. However, when comparing the ownership of the Object 32 versus the Multi-Media Object Location 21, the Object 32 is often a branded or trademarked product or service owned by a given company, where the company has absolute ownership of all rights associated with its Object 32, while the “ownership” of the Multi-Media Object Location 21 is most often retained by the owner of the Multi-Media Program 42. From the advertiser's perspective, the use of a Multi-Media Object Location 21 is generally transient and takes the form of a lease (although it is possible for a company to purchase Multi-Media Object Location 21 rights in perpetuity, such perpetual rights being substantially more expensive than the transient rights). The transient lease rights of a Multi-Media Object Location 21 can be one-time-only, multiple play, just-in-time (rights auctioned just before real-time delivery to the Recipient), and so on.
In the case where a selected Object 32 is identical in its “footprint” with the Multi-Media Object Location 21 defined in the Multi-Media Program 42, the Object insertion process is a simple substitution. Thus, a standard size soda can is fungible, and the only delimiting factor is the label applied to the standard size soda can to identify the contents and the company that has produced this product. The selected Object must also be reviewed to determine whether the content of the Object is appropriate for the selected placement in the program. Thus, a can of motor oil would be an inappropriate selection to be displayed on the kitchen counter of a cooking show in place of a can of tomatoes.
In the case where a selected Object 32 is not identical in its “footprint” with the Multi-Media Object Location 21 defined in the Multi-Media Program 42, the Object insertion process is more complex than a simple Object 32 substitution. In this case, the background layer of multimedia content juxtaposed to the Multi-Media Object Location 21 (and the foreground, if necessary) optimally morphs, modifies, or adjusts its shape to match the new shape, size, and motion of the Multi-Media Object Location 21 so that the new Object 32 is contiguous in its placement into the Master Program 11. It is also possible to morph, modify, or adjust the shape and size of the Object 32, but this is disadvantageous since most Objects 32 have identifiable shapes, colors, sizes, etc., that confer brand recognition; thus, morphing the Object 32 could impair the value of the dynamically placed in situ Object 32 (product placement). This is particularly true for an Object 32 in motion (likewise for a Multi-Media Object Location 21 that is in motion). The preferred embodiment is to morph, modify, or adjust the background (or foreground) in synchronization with the Multi-Media Object Location 21 rather than performing a likewise process on the Object 32. It is most desirable to match the new Object 32 with a new Multi-Media Object Location 21 so that these two elements are identical in shape (if a visual representation), with only the background (foreground) changing. Finally, if an Object 32 has two dimensions, video and audio, the Object's audio would be mixed with the Master Program audio to create a seamless audio stream for the life of the Object 32.
In the case where the selected Object 32 is not identical in its “footprint” and also either interacts with surrounding visualizations or must be interfaced with surrounding subjects in the program, the Object insertion process requires manipulation of the selected Multi-Media Object Location 21 and the Master Program 11 background juxtaposed to the Multi-Media Object Location 21, to ensure the nature of the selected Object 32 is not changed and the juxtaposed surroundings are naturally morphed, modified, or adjusted so that the interface between the selected Object 32 and the juxtaposed multimedia background or interrelated visualizations is harmonious and seamless. Thus, where a hand is holding a beverage container and the selected Object 32 provides a representation of a beverage container of different shape, the hand must be modified so the hand with the beverage container of the selected Object 32 appears natural. This can be done by electronically inserting a “new hand with the proper finger locations”, or it could be done by shooting a short new scene clip and then digitally inserting that new scene when the new Object 32 with a beverage container handle is used. Thus, the director and producer of the Master Program, including the writers or authors of the Master Program, could anticipate in advance the likely set of possible Object 32 shapes that would be used in the finished Multi-Media Program 42 and, where necessary, create additional movie segments (video and audio) that accommodate all the likely Object 32 shapes and motions.
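The three insertion cases described above can be summarized as a dispatch sketch. The footprint comparison and the string outcomes are hypothetical placeholders standing in for the actual morphing and compositing operations:

```python
def choose_insertion_strategy(obj_footprint, location_footprint, interacts_with_scene):
    """Choose an insertion strategy for an Object and a Location."""
    if obj_footprint == location_footprint and not interacts_with_scene:
        return "simple_substitution"        # e.g., relabeling a standard soda can
    if not interacts_with_scene:
        # Prefer morphing the background/foreground rather than the Object,
        # since the Object's shape and color carry brand recognition.
        return "morph_background_to_object"
    # The Object touches surrounding subjects (e.g., a hand holding the
    # container): the juxtaposed surroundings must be adjusted, or a
    # pre-shot alternate scene segment substituted, so the interface
    # appears natural.
    return "adjust_surroundings_or_substitute_scene"

print(choose_insertion_strategy((10, 20), (10, 20), False))  # simple_substitution
```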
Emerging video or television architectures that use IPTV (Internet Protocol Television) are also a form of local delivery and could be delivered to a personal computer or to an IPTV set-top box. One advantage that IPTV has is that the Recipient Database is readily available to the service provider, since IPTV delivery is inherently a two-way, subscriber-specific connection.
If the device is a mobile one, such as a cell phone enabled for video reception in some manner, the GPS location is known, and subscriber data is stored in database registers such as HLRs (Home Location Registers) and VLRs (Visitor Location Registers). Thus, in the mobile context, Recipient Database 33 information is inherently and automatically available, enabling optimal Object selection and insertion. In this mobile example, the Recipient Database 160-1 can be populated from these mobile network registers.
The localized recipient object insertion architecture truly matches Objects 32 with the Recipient's interests, needs, and desires contained in Recipient Database 33. In this context, the advertiser has made an optimal connection with the recipient for a given product or service which is embedded into the content stream. Break and Make advertising is no longer required, and a 30-minute Multi-Media Program is truly 30 minutes of entertainment. In the era of e-books or e-readers, the Recipient downloads a magazine and has electronic advertising that is directly paired with that Recipient's interests. Object 32 definition could even include, for example, the favorite color of the Recipient (say, for an advertised car the Recipient is interested in). For all of these architectures, but in particular for the Local Insertion which is highly customized, a third database may be used to further refine the pairing of Objects 32 with the individual Recipient.
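A toy sketch of recipient-aware Object selection in this localized architecture follows; the scoring scheme and the profile field names are assumptions made for illustration:

```python
def select_object(candidates, recipient):
    """Pick the candidate Object best matching the Recipient's profile."""
    def score(obj):
        s = 0
        if obj.get("category") in recipient.get("interests", []):
            s += 2                      # an interest match dominates
        if obj.get("color") == recipient.get("favorite_color"):
            s += 1                      # e.g., the advertised car's color
        return s
    return max(candidates, key=score)

recipient = {"interests": ["automotive"], "favorite_color": "blue"}
candidates = [
    {"object_id": 1, "category": "automotive", "color": "blue"},
    {"object_id": 2, "category": "beverage", "color": "red"},
]
print(select_object(candidates, recipient))   # picks the blue car Object
```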
The Object Insertion Processor 1000 receives the Object Ready Content 23 and the selected Objects 32 and performs the basic task of populating the Multi-Media Object Locations 21 with those Objects 32.
Object Insertion Processor 1000 performs additional tasks such as high reliability and high availability communications at devices 1004 and 1008, the input and output nodes, respectively, of Object Insertion Processor 1000. The Object Insertion Processor 1000 has Memory 1005 and Storage 1006 to manage data flow and processing capability in 1007. In addition, the Highlighting Programs, as described below, are stored in Memory 1005 for use in generating Highlighted Objects.
Among its more complex tasks, the Object Insertion Processor 1000 performs operations at 1007 such as morphing a given video frame so that the inserted Object fits fully into the “content hole” (also termed the Multi-Media Object Location 21). This process is essential since inserted Object 1 through inserted Object N in the matrix of possible Objects available for insertion may not have the same exact shape (i.e., a Heineken® bottle has a different shape than a Coors® bottle). This morphing process continues for every frame until the Object insertion timeframe is completed; a given frame could have 1 to Y Objects being inserted concurrently or simultaneously, with any given frame having its own defined set of Objects being inserted.
For a video data file, the Objects contained therein are generally two-dimensional: an image and an associated sound clip (to be merged into the composite audio stream). However, there is no limitation on Objects being in only two dimensions. Objects can be multi-dimensional (including visual effects that create a 3-D perspective from the Recipient's viewpoint) and necessarily have attributes associated with those dimensions. Attributes such as feel, smell, taste, and others are readily possible. The Highlighting of an Object can also take on multiple dimensions, such as visual, aural, smell, and touch, in any and all combinations. Objects can also have Highlighted spatial attributes such as 2-D or 3-D. Highlighted Objects can be highlighted in a manner that draws attention to the Object through a variety of methods, including some form of juxtaposition with respect to the Master Program.
The Object Insertion Processor Algorithm starts at step 1100 with the receipt of the Objects 1111 and the Object Ready Content data 1101. The Object Ready Content data 1101 is further separated at step 1102 into the Object Management Data 1103 and the processed Master Program 1104. The Objects 1111 are multi-dimensional, and the Object Database of Objects 1111 can contain exactly the number of needed Objects, or it could contain the entire universe of available Objects 1111 (from which it has to make a selection based on the Recipient Profile Processor 1130 using the Recipient Database 33). The Object is inserted into the content “hole” (or Multi-Media Object Location) at step 1131, as a function of the purchase of the Multi-Media Object Location as identified by the Object Location Brokerage 1010, in a continuous fashion, where step 1132 processes a frame or field of a composite video stream (for example) until the content stream is complete as determined at step 1133. The Object Insertion Processor Algorithm can be run in advance, near real time, real time, or just-in-time. The timing of when an Object (32, 1111) is inserted affects the market value of the Object.
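The per-frame insertion loop (steps 1131 through 1133) might look like the following sketch; the frame representation and the morph operation are placeholder assumptions:

```python
def morph_frame(frame, obj, location):
    """Fit the Object into the Location's 'content hole' (stubbed)."""
    frame = dict(frame)
    frame.setdefault("inserted", []).append((obj["object_id"], location))
    return frame

def object_insertion_loop(frames, purchased, objects):
    """Insert purchased Objects frame by frame until the stream is complete.

    purchased maps frame_index -> list of (location, object_id) pairs, as
    resolved by the Object Location Brokerage.
    """
    output = []
    for i, frame in enumerate(frames):               # step 1132: next frame/field
        for location, object_id in purchased.get(i, []):
            frame = morph_frame(frame, objects[object_id], location)  # step 1131
        output.append(frame)
    return output                                    # step 1133: stream complete
```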
The population of the Multi-Media Object Locations 21 with Objects 32 is controlled not only by the appropriateness of the Object 32 in the Master Program 11 but also by the purchasing of the Multi-Media Object Locations 21 by advertisers to have their products displayed in the Multi-Media Program 42. Likewise, a purchased Multi-Media Object Location could involve Highlighting the Object, where such Highlighting may be considered a “premium” service to the advertiser and would have a corresponding additional cost. Objects can be Highlighted at the central, regional, local, or recipient level. This highlighting may occur at all levels depending on the “highlighting buy decision” of a given advertiser; nothing prevents an advertiser from highlighting a given Object at the national level and then re-inserting a new highlight for that Object for a specific household. Here too, the additional premium for Highlighting depends on where the Highlighting occurs. There are numerous procedures that can be used to effect the purchase and management of the Multi-Media Object Locations 21 and Highlighting, and these result in the creation of a set of attribution data that defines the particular Object 32 that is to be used to populate a selected Multi-Media Object Location 21, subject to the Master Program Rule Set 12, the Object Characteristic Data 31, and the Object Management Data 22 confirming the selection.
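One way the level-dependent Highlighting premium could be modeled is sketched below; the rate values are entirely hypothetical, as actual pricing would be set through the Object Location Brokerage 1010:

```python
# Hypothetical per-level premiums, for illustration only.
HIGHLIGHT_PREMIUM = {"central": 4.0, "regional": 2.0, "local": 1.0, "recipient": 0.25}

def placement_cost(base_price, highlight_levels):
    """Base Location price plus a premium for each level at which the
    advertiser buys Highlighting; levels may be combined, e.g., a national
    highlight plus a re-inserted household-specific highlight."""
    return base_price + sum(HIGHLIGHT_PREMIUM[level] for level in highlight_levels)

print(placement_cost(10.0, ["central", "recipient"]))  # 14.25
```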
The Object Insertion Processor (for example, 110 in the Central Architecture of FIG. 3A) must select an appropriate Object 32 for insertion into a selected Multi-Media Object Location 21 based upon certain parameters that are defined in the Object Management Data 22 and the Object Characteristic Data 31. In addition, the purchasing of a selected Multi-Media Object Location 21 by advertisers is a consideration and must be reconciled with the parameters that are defined in the Object Management Data 22 and the Object Characteristic Data 31; this reconciliation is performed by the Reconcile Processor 52, as described below.
If an Object 32 is determined to violate one of the rules in the Master Program Rule Set 12 or the Object Management Data 22, or there is a failure to match the Object 32 with the selected Multi-Media Object Location 21 due to the Object Characteristic Data 31 failing to match the Object Management Data 22, the Reconcile Processor 52 includes a process to terminate the Object insertion into the selected Multi-Media Object Location 21. The Reconcile Processor 52 can then generate an error indication to a system operator or can autonomously locate a substitute Object for insertion into the selected Multi-Media Object Location 21 by retrieving a default Object in this class of Object, the Object that was next highest in the bidding process for this Multi-Media Object Location, or some other Object owned by the same purchaser that is appropriate for the selected Multi-Media Object Location. There are numerous options that can be envisioned for managing this situation, and those mentioned above represent typical responses.
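The substitution logic could be sketched as follows; the candidate ordering mirrors the options listed above, while the data shapes and the operator-alert callback are illustrative assumptions:

```python
def substitute_object(failed_obj, object_source, bids, operator_alert):
    """Find a replacement when the selected Object fails reconciliation."""
    # Option 1: a default Object in the same class as the failed Object.
    default = object_source.get(("default", failed_obj["object_class"]))
    if default is not None:
        return default
    # Option 2: the next-highest bidder for this Multi-Media Object Location.
    for _, obj in sorted(bids, key=lambda b: b[0], reverse=True):
        if obj["object_id"] != failed_obj["object_id"]:
            return obj
    # Option 3: another appropriate Object owned by the same purchaser.
    for obj in object_source.values():
        if isinstance(obj, dict) and obj.get("owner") == failed_obj.get("owner"):
            return obj
    operator_alert("no substitute found; Object insertion terminated")
    return None
```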
Any number of Objects can be selected to populate this Multi-Media Object Location.
The highlighting can be any of a number of representation effects that produce a human sensible visualization. These human sensible characteristics can be multi-dimensional and may include, but are not limited to: visual only, visual and aural, aural only, and 3-D representation, where the recipient possibly wears special glasses so that only the selected Highlighted Object is present in 3-D form (or has some other unique attribute which is “enabled” by wearing special glasses). The visual effects can include: flashing, changes in brightness, movement, change in representation, and the like. The highlighting can switch between selected highlighting effects, or the selected highlighting effects may be concurrently operational, such as flashing and movement. The highlighting can also include the use of an anomaly, such as a color representation in a black and white multi-media program or a black and white representation in a color multi-media program, or even an out-of-context object, such as an object inappropriate for the time frame or context of the program content. With these characteristics, the time required to produce a human sensible effect must be determined in order to make the resultant highlighting effective yet not unduly intrusive to the program content. This is particularly true for senses that take time to “develop” and then “clear”, such as aromas. Other senses, such as touch, can be more immediate in their implementation, such as a motion-enabled seat back; that said, not all touch senses are this immediate. For aural Highlighting, the sense of hearing is immediate, and generally there is no lag or wait after the aural Highlighting is removed.
In order to highlight the selected Object, one or more of the object highlighting paradigms can be activated to draw the recipient's attention to the selected Highlighted Object. For example, the brightness of the Object as displayed can be varied to “blink” the Object as the sequence of frames is displayed. Thus, the first frame 1230 can display the Object at full brightness, with subsequent frames displaying the Object at reduced brightness, so that the repeating sequence of frames produces a blinking effect that draws the recipient's attention to the Object.
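A minimal sketch of this brightness “blink” effect over a frame sequence follows; the frame representation (luminance grids), bounding-box region, and blink period are assumptions made for illustration:

```python
def blink_object(frames, object_region, period=2, dim_factor=0.4):
    """Alternate the Object's brightness across frames to make it 'blink'.

    frames is a list of 2-D luminance grids (lists of lists); object_region
    is (row0, row1, col0, col1), the Object's bounding box.
    """
    r0, r1, c0, c1 = object_region
    for i, frame in enumerate(frames):
        if (i // period) % 2:                 # every other period, dim the Object
            for r in range(r0, r1):
                for c in range(c0, c1):
                    frame[r][c] = frame[r][c] * dim_factor
    return frames

# Example: 4 frames of a 4x4 image; the Object occupies the top-left 2x2 block.
frames = [[[100] * 4 for _ in range(4)] for _ in range(4)]
blink_object(frames, (0, 2, 0, 2))
```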
Thus, as can be seen from the foregoing, the highlighting draws the recipient's attention to the selected Object while the Object remains an integral part of the Multi-Media Program.
The present multi-media object highlighting system controls the retrieval of Object data that comprises an object representation (such as a product) and the integration of this Object data into a corresponding selected one of the predetermined Multi-Media Object Locations which are components of the Multi-Media Program. This enables advertisers to precisely control product placement on a customized basis thereby to dynamically modify the content of the Multi-Media Program as it is delivered to the individual recipient. The present multi-media object highlighting system takes the Master Program and creates the Multi-Media Object Locations with their associated Object Management Data thereby to enable the system to populate these Multi-Media Object Locations with appropriate Objects which are selected on the basis of purchaser interest, appropriateness for the selected Multi-Media Object Location, as well as the interests of the Recipients. In addition, the multi-media object highlighting system produces a representation of the Object that highlights the Object in the scenes in which it appears. The highlighting can be any human sensible characteristic, such as, but not limited to: flashing, changes in brightness, movement, change in representation, and the like. The highlighting can also include the use of an anomaly, such as a color representation in a black and white multi-media program or vice versa, or an out-of-context object. Thus, the present multi-media object highlighting system provides an adaptable yet dynamic service for the placement of objects into a Multi-Media Program, with the end product containing Object representations that are integral to the Multi-Media Program.
This application is a continuation-in-part of application Ser. No. 11/486,923 filed Jul. 14, 2006 and titled “System For Dynamic Personalized Object Placement In A Multi-Media Program”; application Ser. No. 11/487,024 filed Jul. 14, 2006 and titled “System For Managing The Purchasing Of Dynamic Personalized Object Placement In A Multi-Media Program”; application Ser. No. 11/487,070 filed Jul. 14, 2006 and titled “Network Architecture For Dynamic Personalized Object Placement In A Multi-Media Program”; application Ser. No. 11/486,900 filed Jul. 14, 2006 and titled “System For Dynamic Recipient-Specific Object Placement In A Multi-Media Program”; application Ser. No. 11/486,922 filed Jul. 14, 2006 and titled “System For Product Placement Rendering In A Multi-Media Program”; application Ser. No. 11/486,862 filed Jul. 14, 2006 and titled “System For Dynamic Logical Control Of Personalized Object Placement In A Multi-Media Program”; application Ser. No. 11/486,683 filed Jul. 14, 2006 and titled “System For Creating Dynamic Personalized Object Placement Media”; and application Ser. No. 11/487,065 filed Jul. 14, 2006 and titled “Digital Rights Management In Dynamic Personalized Object Placement In A Multi-Media Program”.
Relation | Number | Date | Country
---|---|---|---
Parent | 11/486,923 | Jul. 14, 2006 | US
Child | 11/581,632 | | US
Parent | 11/487,024 | Jul. 14, 2006 | US
Child | 11/486,923 | | US
Parent | 11/487,070 | Jul. 14, 2006 | US
Child | 11/487,024 | | US
Parent | 11/486,900 | Jul. 14, 2006 | US
Child | 11/487,070 | | US
Parent | 11/486,922 | Jul. 14, 2006 | US
Child | 11/486,900 | | US
Parent | 11/486,862 | Jul. 14, 2006 | US
Child | 11/486,922 | | US
Parent | 11/486,683 | Jul. 14, 2006 | US
Child | 11/486,862 | | US
Parent | 11/487,065 | Jul. 14, 2006 | US
Child | 11/486,683 | | US