Digital video content providers such as movie producers or television broadcasters commonly provide digital video content that has been modified relative to the original digital video content. This can be done by superimposing one or more digital images in a video frame of a digital video data stream comprising moving picture video data, at the origin of the digital video data stream. By way of example, a sports telecaster may superimpose or overlay first-down markers on video frames for a football game. The sports telecaster typically broadcasts the moving picture video data modified to include the first-down markers to its local affiliates for subsequent viewing by individual viewers. In this example, changes or modifications to the original moving picture video data are done at the origin of the moving picture video data, and either the original moving picture video data or the modified moving picture video data is distributed to the viewing audience.
As another example, a sports telecaster may have different broadcasts for the same game, depending upon whether the viewing audience is local (“home game”) or non-local (“away game”). The local viewing audience may receive an unmodified broadcast of the game, while non-local audiences may receive a broadcast where one or more images in video frames have been replaced with one or more other images, such as replacing or overlaying the image of the actual billboard containing local advertising, with the image of a billboard containing other advertising. For example, the actual billboard may include an advertisement for a local restaurant, which is what local viewers see. But non-local viewers may see a billboard containing advertising for a nationally-distributed product or service, such as a chain restaurant or a beverage. Thus, for example, a viewer in Los Angeles viewing an LA Lakers basketball game being played in Los Angeles might see a billboard containing advertising local to Los Angeles, while viewers in New York and Chicago viewing the same game might see different advertising on the same billboard. Still, viewers in New York and Chicago would see the same non-local advertising. Again, in this example, changes or modifications to the original moving picture video data are done at the origin of the moving picture video data, and either the original moving picture video data or the modified moving picture video data is distributed to the viewing audience.
Additionally, digital video recording devices, such as those manufactured by TiVo Inc., of Alviso, Calif., may be used to “fast forward” through or skip commercial advertisements in previously recorded digital video content, such as digital video broadcasts and DVDs. This process, also known as “time-shifting”, results in decreased viewing of the commercial advertisements, and thus decreased advertising revenues for digital video content providers.
Accordingly, a need exists in the art for an improved solution that enables locally-pertinent content to be provided to particular demographics or regions. A further need exists for such a solution that lessens the effect of time-shifting to avoid advertisements.
Distributed synchronous program superimposition may be achieved by a first entity receiving a digital video data stream comprising time-stamped moving picture video data, determining superimposition data for use in superimposing a first one or more digital images on a second one or more digital images in the stream, and sending the stream and the superimposition data for remote superimposing of the first one or more digital images on the second one or more digital images in the stream. A second entity remote from the first entity receives the stream, the superimposition data, and the first one or more digital images, and superimposes the first one or more digital images on the second one or more digital images in the stream to create a superimposed image stream, where the superimposing is based at least in part on the superimposition data.
The accompanying drawings, which are incorporated into and constitute a part of this specification, illustrate one or more embodiments of the present invention and, together with the detailed description, serve to explain the principles and implementations of the invention.
In the drawings:
Embodiments of the present invention are described herein in the context of a system and method for distributed synchronous program superimposition. Those of ordinary skill in the art will realize that the following detailed description of the present invention is illustrative only and is not intended to be in any way limiting. Other embodiments of the present invention will readily suggest themselves to such skilled persons having the benefit of this disclosure. Reference will now be made in detail to implementations of the present invention as illustrated in the accompanying drawings. The same reference indicators will be used throughout the drawings and the following detailed description to refer to the same or like parts.
In the interest of clarity, not all of the routine features of the implementations described herein are shown and described. It will, of course, be appreciated that in the development of any such actual implementation, numerous implementation-specific decisions must be made in order to achieve the developer's specific goals, such as compliance with application- and business-related constraints, and that these specific goals will vary from one implementation to another and from one developer to another. Moreover, it will be appreciated that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking of engineering for those of ordinary skill in the art having the benefit of this disclosure.
In accordance with one embodiment of the present invention, the components, process steps, and/or data structures may be implemented using various types of operating systems (OS), computing platforms, firmware, computer programs, computer languages, and/or general-purpose machines. The method can be run as a programmed process running on processing circuitry. The processing circuitry can take the form of numerous combinations of processors and operating systems, or a stand-alone device. The process can be implemented as instructions executed by such hardware, hardware alone, or any combination thereof. The software may be stored on a program storage device readable by a machine.
In addition, those of ordinary skill in the art will recognize that devices of a less general purpose nature, such as hardwired devices, field programmable logic devices (FPLDs), comprising field programmable gate arrays (FPGAs) and complex programmable logic devices (CPLDs), application specific integrated circuits (ASICs), or the like, may also be used without departing from the scope and spirit of the inventive concepts disclosed herein.
In accordance with one embodiment of the present invention, the method may be implemented on a data processing computer such as a personal computer, workstation computer, mainframe computer, or high performance server running an OS such as Solaris® available from Sun Microsystems, Inc. of Santa Clara, Calif., Microsoft® Windows® XP and Windows® 2000, available from Microsoft Corporation of Redmond, Wash., or various versions of the Unix operating system such as Linux available from a number of vendors. The method may also be implemented on a mobile device running an OS such as Windows® CE, available from Microsoft Corporation of Redmond, Wash., Symbian OS™, available from Symbian Ltd of London, UK, Palm OS®, available from PalmSource, Inc. of Sunnyvale, Calif., and various embedded Linux operating systems. Embedded Linux operating systems are available from vendors including MontaVista Software, Inc. of Sunnyvale, Calif., and FSMLabs, Inc. of Socorro, N. Mex. The method may also be implemented on a multiple-processor system, or in a computing environment comprising various peripherals such as input devices, output devices, displays, pointing devices, memories, storage devices, media interfaces for transferring data to and from the processor(s), and the like. In addition, such a computer system or computing environment may be networked locally, or over the Internet.
In the context of the present invention, the term “network” comprises local area networks, wide area networks, the Internet, cable television systems, telephone systems, wireless telecommunications systems, fiber optic networks, ATM networks, frame relay networks, satellite communications systems, and the like. Such networks are well known in the art and consequently are not further described here.
In the context of the present invention, the term “identifier” describes one or more numbers, characters, symbols, or the like. More generally, an “identifier” describes any entity that can be represented by one or more bits.
In the context of the present invention, the term “digital image” describes an image represented by one or more bits, regardless of whether the image was originally represented as an analog image.
Many other devices or subsystems (not shown) may be connected in a similar manner. Also, it is not necessary for all of the devices shown in
Turning now to
The one or more image processors 315 are further adapted to determine superimposition data 330 for use in superimposing a first one or more digital images on a second one or more digital images in the digital video data stream 320, and to send both the scene image stream 335 and the superimposition data 330 to one or more superimposers 340 for remote superimposing of the first one or more digital images 345 on the second one or more digital images in the digital video data stream, based at least in part on the superimposition data 330.
According to one embodiment of the present invention, the first one or more digital images 345 are received from a remote location. According to another embodiment of the present invention, the first one or more digital images 345 are created or stored locally.
The superimposition data 330 comprises information regarding the second one or more digital images such as, by way of example, the orientation, lighting, shading, opacity, aspect ratio, and origination of the second one or more digital images. The superimposition data 330 may comprise information received from the one or more sensors at the scene 305, information derived from the one or more sensors at the scene 305, or both.
The orientation information may be used, for example, to put the first one or more digital images in a similar orientation as the second one or more digital images before the first one or more digital images are superimposed. Thus, for example, if the image being superimposed is a straight-on view of a beverage can, and if the corresponding second one or more digital images are offset, the image of the beverage can is processed to be in a similar offset orientation before being superimposed. Any 3-D model known in the art may be used as part of the superimposition. By way of example, the superimposition may utilize one or more 3D wireframe models, one or more 3D surface models, one or more 3D solid models, or a combination thereof. Additionally, information sensed by the one or more sensors at the scene 305 may be sensed in 2D, 3D, or both.
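By way of a non-limiting illustration, the orientation adjustment described above may be sketched as follows, assuming the inverse coordinate map from output pixels back to overlay pixels has already been derived (e.g. from sensor data or a 3-D model). The function name and image representation are hypothetical and not part of any described embodiment.

```python
def orient_overlay(overlay, inv_map, out_w, out_h, fill=None):
    """Resample a 2-D overlay image through an inverse affine map so it
    matches the orientation of the region it will be superimposed on.

    overlay : list of rows of pixel values
    inv_map : (a, b, c, d, e, f) such that output pixel (x, y) samples
              the overlay at (a*x + b*y + c, d*x + e*y + f)
    """
    src_h, src_w = len(overlay), len(overlay[0])
    a, b, c, d, e, f = inv_map
    out = []
    for y in range(out_h):
        row = []
        for x in range(out_w):
            # Nearest-neighbour sample; pixels mapping outside the
            # overlay receive the fill value.
            sx = int(round(a * x + b * y + c))
            sy = int(round(d * x + e * y + f))
            inside = 0 <= sx < src_w and 0 <= sy < src_h
            row.append(overlay[sy][sx] if inside else fill)
        out.append(row)
    return out
```

An identity map leaves the overlay unchanged, while a translation or rotation expressed as an inverse map re-orients the overlay before it is superimposed.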
Likewise, the lighting information may be used, for example, to apply similar lighting characteristics to the first one or more digital images as the lighting characteristics of the second one or more digital images before the first one or more digital images are superimposed. Likewise, the shading information may be used, for example, to apply similar shading characteristics to the first one or more digital images as the shading characteristics of the second one or more digital images before the first one or more digital images are superimposed. Likewise, the opacity information may be used, for example, to apply similar opacity characteristics to the first one or more digital images as the opacity characteristics of the second one or more digital images before the first one or more digital images are superimposed. Likewise, the aspect ratio information may be used, for example, to apply a similar aspect ratio to the first one or more digital images as the aspect ratio of the second one or more digital images before the first one or more digital images are superimposed. Likewise, the origination information may be used, for example, to apply similar origination characteristics to the first one or more digital images as the origination characteristics of the second one or more digital images before the first one or more digital images are superimposed.
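By way of a non-limiting illustration, matching the lighting or shading of the first one or more digital images to that of the second may be sketched as normalising pixel intensities toward the mean and contrast measured in the region being replaced. The function name and the choice of statistics are illustrative assumptions.

```python
def match_lighting(overlay, target_mean, target_std):
    """Rescale overlay pixel intensities so their mean and standard
    deviation match those measured in the region being replaced."""
    flat = [p for row in overlay for p in row]
    mean = sum(flat) / len(flat)
    var = sum((p - mean) ** 2 for p in flat) / len(flat)
    std = var ** 0.5 or 1.0  # guard against a flat (zero-contrast) overlay
    return [[(p - mean) / std * target_std + target_mean for p in row]
            for row in overlay]
```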
According to one embodiment of the present invention, superimposition of the first one or more digital images comprises complete replacement of the second one or more digital images. According to another embodiment of the present invention, superimposition of the first one or more digital images comprises partial replacement or blending of the second one or more digital images. The partial replacement or blending may be based at least in part on the opacity of the first one or more images, the opacity of the second one or more digital images, or both.
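By way of a non-limiting illustration, partial replacement or blending based on opacity may be sketched as a per-pixel alpha blend. A single scalar opacity is assumed here for simplicity, though the opacity could equally be supplied per pixel.

```python
def blend(fg, bg, alpha):
    """Blend a foreground image over a background image of the same size.
    alpha = 1 fully replaces the background; alpha = 0 leaves it unchanged."""
    return [[alpha * f + (1 - alpha) * b for f, b in zip(f_row, b_row)]
            for f_row, b_row in zip(fg, bg)]
```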
According to one embodiment of the present invention, the first one or more digital images comprise one or more static images. According to another embodiment of the present invention, the first one or more images comprise time-stamped moving picture video data.
The one or more superimposers 340 are operatively coupled to the one or more image processors 315, e.g. via a network, a dedicated link, or other communications means. The one or more superimposers comprise one or more memories and at least one processor adapted to receive the scene image stream 335 comprising time-stamped moving picture video data obtained from a remote source, receive superimposition data 330 for the digital video data stream, receive a first one or more digital images 345 to superimpose on a second one or more digital images, and superimpose the first one or more digital images 345 on the second one or more digital images in the digital video data stream 335, based at least in part on the superimposition data 330. Synchronization between the scene image stream 335, the superimposition data 330, and the one or more superimposable images 345 may be based at least in part on time stamp information in the scene image stream 335 and the superimposition data 330.
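By way of a non-limiting illustration, synchronization based on time stamp information may be sketched as selecting, for each video frame, the superimposition record in effect at that frame's time stamp. The names and record layout are hypothetical.

```python
import bisect

def record_in_effect(frame_ts, record_timestamps, records):
    """Return the superimposition record with the latest timestamp at or
    before frame_ts, or None if the frame precedes every record.
    record_timestamps must be sorted ascending and parallel to records."""
    i = bisect.bisect_right(record_timestamps, frame_ts)
    return records[i - 1] if i else None
```

A superimposer can then process each incoming frame against exactly one record, keeping the overlay synchronous with the scene image stream even when the two arrive over different channels or at different times.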
Superimposed image stream 350 is received and displayed by a display device 355 of user 360. As shown in
According to one embodiment of the present invention, the one or more image processors 315 are co-located with the one or more cameras 325 and scene 305. According to another embodiment of the present invention, at least part of the one or more image processors 315 are not co-located with the one or more cameras 325, scene 305, or both.
According to one embodiment of the present invention, superimposition data 330 and scene image stream 335 comprise separate data streams having time-stamped data. The two data streams may be communicated using the same communication medium; alternatively the two data streams may be communicated using different communication mediums. The two data streams may also be communicated using the same communication protocol; alternatively the two data streams may be communicated using different communication protocols. The two data streams may also be communicated at different times.
According to another embodiment of the present invention, superimposition data 330 and scene image stream 335 comprise a single multiplexed data stream.
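By way of a non-limiting illustration, combining the superimposition data and the scene image stream into a single multiplexed data stream may be sketched as a timestamp-ordered merge of tagged records. The `(timestamp, kind, payload)` record format is an illustrative assumption.

```python
import heapq

def multiplex(video_stream, superimposition_stream):
    """Merge two timestamp-sorted streams of (ts, kind, payload) records
    into one stream ordered by timestamp, suitable for a single channel."""
    return list(heapq.merge(video_stream, superimposition_stream,
                            key=lambda rec: rec[0]))

def demultiplex(muxed, kind):
    """Recover one constituent stream from the multiplex by its tag."""
    return [rec for rec in muxed if rec[1] == kind]
```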
According to one embodiment of the present invention, at least part of the data communicated between the one or more image processors 315 and the one or more superimposers 340 are communicated in a “user data” data field specified by an MPEG (Motion Pictures Experts Group) standard. Exemplary MPEG standards include, by way of example, MPEG-1, MPEG-2, and MPEG-4. According to another embodiment of the present invention, at least part of the data communicated between the one or more image processors 315 and the one or more superimposers 340 are communicated using one or more picture header extension codes specified by an MPEG standard. According to another embodiment of the present invention, at least part of the data communicated between the one or more image processors 315 and the one or more superimposers 340 are communicated using a separate data PES (Packetized Elementary Stream) specified by an MPEG standard.
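By way of a non-limiting illustration, carrying superimposition data in an MPEG "user data" field may be sketched as wrapping the payload behind the MPEG-2 user_data start code (0x000001B2). A real encoder must additionally ensure the payload never emulates a start code; that escaping step is omitted from this sketch.

```python
USER_DATA_START = b"\x00\x00\x01\xb2"  # MPEG-2 user_data start code

def pack_user_data(payload):
    """Wrap ancillary superimposition bytes as a user_data unit.
    NOTE: start-code emulation prevention is deliberately omitted."""
    return USER_DATA_START + payload

def unpack_user_data(unit):
    """Recover the payload from a user_data unit built by pack_user_data."""
    assert unit.startswith(USER_DATA_START)
    return unit[len(USER_DATA_START):]
```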
According to one embodiment of the present invention, the rate at which the one or more superimposers 340 update frames within the scene image stream 335 is based at least in part on the update rate of the original content at the image source 300. According to another embodiment of the present invention, the rate at which the one or more superimposers 340 update frames within the scene image stream 335 is based at least in part on the refresh rate of the display device 355.
According to one embodiment of the present invention, the one or more superimposable images 345 are provided by a global server (not shown in
Turning now to
Turning now to
Turning now to
The one or more image processors 615 are further adapted to determine superimposition data 630 for use in superimposing a first one or more digital images 645 on a second one or more digital images in the digital video data stream 620, and to send the scene image stream 635, the superimposition data 630, and the first one or more digital images 645 to one or more superimposers 640 for remote superimposing of the first one or more digital images 645 on the second one or more digital images in the digital video data stream, based at least in part on the superimposition data 630.
The one or more superimposers 640 are operatively coupled to the one or more image processors 615, e.g. via a network, a dedicated link, or other communications means. The one or more superimposers 640 comprise one or more memories and at least one processor adapted to receive the scene image stream 635 comprising time-stamped moving picture video data obtained from a remote source, receive superimposition data 630 for the digital video data stream, receive a first one or more digital images 645 to superimpose on a second one or more digital images, and superimpose the first one or more digital images 645 on the second one or more digital images in the digital video data stream 635, based at least in part on the superimposition data 630. Synchronization between the scene image stream 635, the superimposition data 630, and the one or more superimposable images 645 may be based at least in part on time stamp information in the scene image stream 635 and the superimposition data 630. Superimposed image stream 650 is received and displayed by a display device 655 of user 660.
According to one embodiment of the present invention, the one or more image processors 615 are co-located with the one or more cameras 625 and scene 605. According to another embodiment of the present invention, at least part of the one or more image processors 615 are not co-located with the one or more cameras 625, scene 605, or both.
According to one embodiment of the present invention, superimposition data 630, scene image stream 635, and the one or more superimposable images 645 comprise separate data streams having time-stamped data. The three data streams may be communicated using the same communication medium; alternatively the three data streams may be communicated using different communication mediums. The three data streams may also be communicated using the same communication protocol; alternatively the three data streams may be communicated using different communication protocols. The three data streams may also be communicated at different times.
According to another embodiment of the present invention, superimposition data 630, scene image stream 635, and the one or more superimposable images 645 comprise a single multiplexed data stream.
According to another embodiment of the present invention, two of the superimposition data 630, scene image stream 635, and the one or more superimposable images 645 comprise a single multiplexed data stream, and the third comprises a second data stream.
According to one embodiment of the present invention, at least part of the data communicated between the one or more image processors 615 and the one or more superimposers 640 are communicated in a “user data” data field specified by an MPEG standard. Exemplary MPEG standards include, by way of example, MPEG-1, MPEG-2, and MPEG-4. According to another embodiment of the present invention, at least part of the data communicated between the one or more image processors 615 and the one or more superimposers 640 are communicated using one or more picture header extension codes specified by an MPEG standard. According to another embodiment of the present invention, at least part of the data communicated between the one or more image processors 615 and the one or more superimposers 640 are communicated using a separate data PES (Packetized Elementary Stream) specified by an MPEG standard.
According to one embodiment of the present invention, the rate at which the one or more superimposers 640 update frames within the scene image stream 635 is based at least in part on the update rate of the original content at the image source 600. According to another embodiment of the present invention, the rate at which the one or more superimposers 640 update frames within the scene image stream 635 is based at least in part on the refresh rate of the display device 655.
According to one embodiment of the present invention, the one or more superimposable images 645 are provided by a global server (not shown in
Turning now to
Turning now to
Turning now to
The one or more image processors 815 are further adapted to determine superimposition data for use in superimposing a first one or more digital images on a second one or more digital images in the digital video data stream, and to send both the superimposition data and the first one or more digital images to superimpose to one or more superimposers for remote superimposing of the first one or more digital images on the second one or more digital images in the scene image stream, based at least in part on the superimposition data.
The one or more superimposers 840 are operatively coupled to the one or more image processors 815, e.g. via a network, a dedicated link, or other communications means. The one or more superimposers 840 comprise one or more memories and at least one processor adapted to receive the scene image stream 835 comprising time-stamped moving picture video data obtained from a remote source, receive superimposition data for the digital video data stream, receive a first one or more digital images to superimpose on a second one or more digital images, and superimpose the first one or more digital images on the second one or more digital images in the digital video data stream, based at least in part on the superimposition data. Synchronization between the streams may be based at least in part on time stamp information in the streams. Superimposed image stream 850 is received and displayed by a display device 855 of user 860.
According to one embodiment of the present invention, the one or more superimposable images and the superimposition data are communicated between the one or more image processors 815 and the one or more superimposers 840 in separate data streams having time-stamped data. The two data streams may be communicated using the same communication medium; alternatively the two data streams may be communicated using different communication mediums. The two data streams may also be communicated using the same communication protocol; alternatively the two data streams may be communicated using different communication protocols. The two data streams may also be communicated at different times.
According to another embodiment of the present invention, the one or more superimposable images and the superimposition data are multiplexed into a single data stream for communication between the one or more image processors 815 and the one or more superimposers 840.
According to one embodiment of the present invention, at least part of the data communicated between the one or more image processors 815 and the one or more superimposers 840 are communicated in a “user data” data field specified by an MPEG standard. Exemplary MPEG standards include, by way of example, MPEG-1, MPEG-2, and MPEG-4. According to another embodiment of the present invention, at least part of the data communicated between the one or more image processors 815 and the one or more superimposers 840 are communicated using one or more picture header extension codes specified by an MPEG standard. According to another embodiment of the present invention, at least part of the data communicated between the one or more image processors 815 and the one or more superimposers 840 are communicated using a separate data PES (Packetized Elementary Stream) specified by an MPEG standard.
According to one embodiment of the present invention, the rate at which the one or more superimposers 840 update frames within the scene image stream 835 is based at least in part on the update rate of the original content at the image source 800. According to another embodiment of the present invention, the rate at which the one or more superimposers 840 update frames within the scene image stream 835 is based at least in part on the refresh rate of the display device 855.
According to one embodiment of the present invention, the one or more superimposable images are provided by a global server (not shown in
Turning now to
Turning now to
Turning now to
The one or more image processors 1015 are further adapted to determine superimposition data (1075, 1070) for use in superimposing a first one or more digital images (1045, 1096) on a second one or more digital images in the digital video data stream (1035, 1065), and send the digital video data stream (1035, 1065) and superimposition data (1075, 1070) to one or more superimposers (1098, 1040) for remote superimposing of the first one or more digital images (1045, 1096) on the second one or more digital images in the digital video data stream (1035, 1065), based at least in part on the superimposition data (1075, 1070).
A first one or more superimposers 1098 are operatively coupled to the one or more image processors 1015, e.g. via a network, a dedicated link, or other communications means. The first one or more superimposers 1098 comprise one or more memories and at least one processor adapted to receive the scene image stream 1035 comprising time-stamped moving picture video data obtained from a remote source, receive superimposition data 1075 for the digital video data stream, receive a first one or more digital images 1045 to superimpose on a second one or more digital images, and superimpose the first one or more digital images 1045 on the second one or more digital images in the digital video data stream 1035, based at least in part on the superimposition data 1075. Synchronization between the scene image stream 1035, the superimposition data 1075, and the first one or more superimposable images 1045 may be based at least in part on time stamp information in the scene image stream 1035 and the superimposition data 1075.
A second one or more superimposers 1040 are operatively coupled to the first one or more superimposers 1098, the one or more image processors 1015, or both, e.g. via a network, a dedicated link, or other communications means. The second one or more superimposers 1040 comprise one or more memories and at least one processor adapted to receive a scene image stream (1065, 1080) comprising time-stamped moving picture video data obtained from a remote source, receive superimposition data 1070 for the digital video data stream (1065, 1080), receive a third one or more digital images 1096 to superimpose on the second one or more digital images in the digital video data stream (1065, 1080), and superimpose the third one or more digital images 1096 on the second one or more digital images in the digital video data stream (1065, 1080), based at least in part on the superimposition data 1070. Synchronization between the streams may be based at least in part on time stamp information in the streams. The second superimposed image stream 1050 is received and displayed by a display device 1055 of user 1060.
According to one embodiment of the present invention, the one or more image processors 1015 are co-located with the one or more cameras 1025 and scene 1005. According to another embodiment of the present invention, at least part of the one or more image processors 1015 are not co-located with the one or more cameras 1025, scene 1005, or both.
According to one embodiment of the present invention, superimposition data 1075 and scene image stream 1035 comprise separate data streams having time-stamped data for communication between the one or more image processors 1015 and the first one or more superimposers 1098. The two data streams may be communicated using the same communication medium; alternatively the two data streams may be communicated using different communication mediums. The two data streams may also be communicated using the same communication protocol; alternatively the two data streams may be communicated using different communication protocols. The two data streams may also be communicated at different times.
According to another embodiment of the present invention, superimposition data 1070 and scene image stream 1065 comprise separate data streams having time-stamped data for communication between the one or more image processors 1015 and the second one or more superimposers 1040. The two data streams may be communicated using the same communication medium; alternatively the two data streams may be communicated using different communication mediums. The two data streams may also be communicated using the same communication protocol; alternatively the two data streams may be communicated using different communication protocols. The two data streams may also be communicated at different times.
According to another embodiment of the present invention, superimposition data 1075 and scene image stream 1035 comprise a single multiplexed data stream for communication between the one or more image processors 1015 and the first one or more superimposers 1098.
According to another embodiment of the present invention, superimposition data 1070 and scene image stream 1065 comprise a single multiplexed data stream for communication between the one or more image processors 1015 and the second one or more superimposers 1040.
According to one embodiment of the present invention, at least part of the data communicated between the one or more image processors 1015 and the first one or more superimposers 1098, or between the one or more image processors 1015 and the second one or more superimposers 1040, are communicated in a “user data” data field specified by an MPEG standard. Exemplary MPEG standards include, by way of example, MPEG-1, MPEG-2, and MPEG-4. According to another embodiment of the present invention, at least part of the data communicated between the one or more image processors 1015 and the first one or more superimposers 1098, or between the one or more image processors 1015 and the second one or more superimposers 1040 are communicated using one or more picture header extension codes specified by an MPEG standard. According to another embodiment of the present invention, at least part of the data communicated between the one or more image processors 1015 and the first one or more superimposers 1098, or between the one or more image processors 1015 and the second one or more superimposers 1040, are communicated using a separate data PES (Packetized Elementary Stream) specified by an MPEG standard.
According to one embodiment of the present invention, the rate at which the first one or more superimposers 1098 and the second one or more superimposers 1040 update frames within the scene image stream 1035 is based at least in part on the update rate of the original content at the image source 1000. According to another embodiment of the present invention, the rate at which the first one or more superimposers 1098 and the second one or more superimposers 1040 update frames within the scene image stream (1035, 1065) is based at least in part on the refresh rate of the display device 1055.
According to one embodiment of the present invention, the one or more superimposable images 1045 are provided by a global server (not shown in
According to another embodiment of the present invention, the second one or more superimposers 1040 receive the first superimposed image stream 1080 from the first one or more superimposers 1098. According to another embodiment of the present invention, the second one or more superimposers 1040 receive superimposition data 1075 from the first one or more superimposers 1098. According to another embodiment of the present invention, the second one or more superimposers 1040 receive the second one or more superimposable images 1096 from the first one or more superimposers 1098.
Turning now to
Turning now to
Turning now to
A program or programs may be provided having instructions adapted to cause a processing unit or a network of data processing units to realize elements of the above embodiments and to carry out at least one of the above-described methods. Furthermore, a computer-readable medium may be provided, in which a program is embodied, where the program causes a computer to execute the above-described method.
Also, a computer-readable medium may be provided having a program embodied thereon, where the program causes a device to execute functions or operations of the features and elements of the above-described examples. A computer-readable medium can be a magnetic, optical, or other tangible medium on which a program is recorded, but can also be a signal, e.g. analog or digital, electronic, magnetic, or optical, in which the program is embodied for transmission. Furthermore, a data structure or a data stream may be provided comprising instructions to cause data processing means to carry out the above operations. The data stream or the data structure may constitute the computer-readable medium. Additionally, a computer program product may be provided comprising the computer-readable medium.
Although embodiments of the present invention have been illustrated with respect to the superimposition of digital video data, the invention may also be applied to digital audio or digital audio/video data. By way of example, a first one or more digital audio tracks could be superimposed on a second one or more digital audio tracks in a distributed and synchronous manner.
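By way of a non-limiting illustration, superimposing one digital audio track on another may be sketched as a sample-wise weighted mix of 16-bit PCM samples. The gains and the clamping behaviour are illustrative assumptions.

```python
def mix_tracks(track_a, track_b, gain_a=0.5, gain_b=0.5):
    """Sample-wise superimposition of two equal-length 16-bit PCM tracks,
    with the result clamped to the signed 16-bit range."""
    out = []
    for a, b in zip(track_a, track_b):
        sample = int(gain_a * a + gain_b * b)
        out.append(max(-32768, min(32767, sample)))
    return out
```

Time-stamped audio tracks can then be aligned and mixed at a remote superimposer in the same manner as the video embodiments described above.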
While embodiments and applications of this invention have been shown and described, it would be apparent to those skilled in the art having the benefit of this disclosure that many more modifications than mentioned above are possible without departing from the inventive concepts herein. The invention, therefore, is not to be restricted except in the spirit of the appended claims.