The invention relates to processing recorded video and audio, and more specifically, to techniques for determining where in a movie to insert an ad.
Linear media editing systems used for analog audio, video tape and photographic film are manual, time consuming and cumbersome, making it difficult to reduce content into a final form and distribute it. More recently, computer systems have enabled time-efficient non-linear video editing. Current non-linear editing on computer-oriented systems involves capturing media content, storing it on a storage device, such as a magnetic disk drive or digital versatile disk (DVD), and permitting rapid access to the media content at any point in the linear sequence so that portions can be moved into any order.
The average person currently has a small set of alternatives for editing content from media capture devices such as camcorders, camera phones, audio recorders, and other media capture devices without incurring the costs of a computer system and software for editing. In addition, non-linear editing systems are complex and very difficult to use.
People capture various random and personally interesting events, such as work, travel and entertainment events, using their camcorders or camera phones. To edit this content, people require easy-to-use non-linear editing systems that facilitate editing without a high degree of computer or editing skill.
Media content storage technologies, for example the DVD format, provide for storing great amounts of interactive multimedia. Unfortunately, the DVD Specification for authoring is very complex, as are the computer systems that attempt to embody it. A further disadvantage of conventional DVD authoring systems is that they provide a DVD author with only minimal control and flexibility.
The process of authoring a DVD involves a number of complex steps and equipment. Accordingly, there is a need for authoring systems and methods that reduce the time, cost and complexity of authoring and distributing a DVD.
Separate and distinct systems for computer-based non-linear editing, DVD authoring, and distribution are known. However, no system exists that enables an ad to be flexibly placed in different areas of a movie according to an automatic procedure.
In accordance with an aspect of this invention, there is provided a method of placing an ad in a movie. At least one of the following is performed: analyzing inherent characteristics of the movie, analyzing viewed characteristics of the movie, analyzing viewer characteristics of a viewer of the movie, and obtaining advertiser preferences for placement of the ad in the movie. Costs of placing the ad in the movie are determined based on the inherent characteristics of the movie, the viewed characteristics of the movie, the viewer characteristics and the advertiser preferences. The ad is placed in accordance with the inherent characteristics of the movie, the viewed characteristics of the movie, the viewer characteristics, the advertiser preferences and the determined costs. The movie is delivered to the viewer. Statistics are collected regarding the viewing of the movie by the viewer. At least one report is generated regarding the movie viewing statistics. It is not intended that the invention be summarized here in its entirety. Rather, further features, aspects and advantages of the invention are set forth in or are apparent from the following description and drawings.
As used herein and in the claims, the term “movie” refers to video and/or audio data intended for display to a human. In some cases, the video and audio data are separate but associated data, while in other cases, the video and audio are combined. In still other cases, video exists without audio, and vice-versa. Video encompasses still image data and moving image data.
The disclosures of the following patents are hereby incorporated by reference in their entirety:
There are several environments in which a user might want to use a movie. Turning to
Movie system 10 includes videobase 20, database 30, transcoder 40, server 50, DVD burner 60 and internal network 70. In some embodiments, videobase 20 and database 30 are combined. Elements 20, 30, 40, 50 are computers programmed according to the invention, and include suitable processors, memory, storage and communication interfaces; and each element may be embodied in one or many physical units depending on factors such as expected processing volume, redundant hardware design and so on.
Videobase 20 serves to store movies uploaded by users, in their uploaded format, and to store transcoded versions of these movies. Videobase 20 also stores advertisement movies, referred to herein as “ads”, intended for inclusion in the transcoded movies.
Database 30 serves to store data for movie system 10, including data relating to users of movie system 10, movies processed by movie system 10, and suitable administrative data such as usage, throughput and audit trail information. In some embodiments, users use movie system 10 for free and the suppliers of ads pay upon ad viewing. In other embodiments, users pay based on usage or a flat rate.
Transcoder 40 serves to receive uploaded movies and process them to generate transcoded movies, as described in detail below with regard to
Server 50 receives requests from users via network 100 and responds thereto. In cases where responding to a request requires the services of transcoder 40, server 50 passes appropriate messages between transcoder 40 and network 100. Server 50 also functions as a firewall to protect network 70 from improper usage.
Server 50 executes upload manager 55, a software program that works with uploader 112, described below, to upload a movie to server 50.
DVD burner 60 is responsive to commands from transcoder 40 to create a digital video disk, which is then shipped via conventional physical shipping services.
Billing program 65 examines usage data created by transcoder 40 and server 50. The usage data is part of the administrative data in database 30. Billing program 65 then generates invoices and applies authorized payments. For example, some users may have preauthorized charges to their credit cards, telephone bills or bank accounts for their usage of movie system 10. As another example, transcoded movies created by users may include advertising, for which the advertisers agree to pay based on number of views, and if so, billing program 65 arranges payments to users based on usage of the transcoded movies with advertising.
There is a cost to store and distribute movies. To offset this cost, and to reward users, movie system 10 enables movie creators to include ads, either manually or automatically, in their movies. Movie system 10 enables flexible ad placement, including at the start or end of a movie, within selected frames of a movie, and at a selected location and size within the selected frames. Advertisers generally pay for placement of their ads based on number of times their ad is viewed, and possibly in accordance with the popularity of the place in the movie where the ad is inserted, how much time the ad is inserted for, and the size of the ad relative to a movie frame.
Internal network 70 serves to carry communication traffic between the elements of movie system 10. Internal network 70 may be a local area network at a single premises, or may span multiple premises.
Server 80 can be a computer coupled to storage 90. Server 80 responds to requests from communication network 100 by providing movies stored in storage 90. By providing the address of server 80 to movie system 10, one of the movies stored in storage 90 can be used as an input for transcoder 40.
PC 110 can be a personal computer coupled to camcorder 120. Camcorder 120 enables a user to record a movie and transfer the movie to PC 110.
PC 110 executes uploader 112 and player 114. Uploader 112 can be a software program that enables a movie to be uploaded from PC 110 to server 50. Player 114 can be a software program that enables PC 110 to view and edit movies, in conjunction with transcoder 40. When PC 110 registers with server 50, server 50 downloads uploader 112 and player 114 to PC 110.
Uploader 112 functions to locate movie files stored in PC 110, and to manage transmission of the movie files to upload manager 55 of server 50 using a suitable protocol such as the secure file transfer protocol (sftp). In embodiments having a peer-to-peer network for downloading, such as networks using the bittorrent protocol, the peer-to-peer network is also used for uploading. Since movie files are large, the file uploading may be interrupted; uploader 112 enables the uploading to resume at its interruption point. In some embodiments, uploader 112 converts a very large file, such as a 36 Mb file in DV format, to a smaller file of comparable visual quality, such as a 3 Mb file in MPEG format. Uploader 112 enables the user of PC 110 to select a file for uploading; to monitor the status of the upload, such as percent completed and speed of uploading; to pause and resume the uploading; and to cancel the uploading.
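The resume-at-interruption behavior of uploader 112 can be illustrated with a minimal sketch. The chunk size, the send_chunk stub and the offset file below are hypothetical assumptions, not part of the described system; the sketch only shows the bookkeeping that lets an interrupted transfer continue from where it stopped.

```python
import os

CHUNK_SIZE = 64 * 1024  # hypothetical chunk size


def send_chunk(data, offset):
    """Stub for the network transfer (e.g. over sftp); may raise if interrupted."""
    pass


def resumable_upload(path, offset_file):
    """Upload `path`, remembering progress so an interrupted upload can resume."""
    # Read the last confirmed offset, if a previous attempt was interrupted.
    offset = 0
    if os.path.exists(offset_file):
        with open(offset_file) as f:
            offset = int(f.read() or 0)

    with open(path, "rb") as movie:
        movie.seek(offset)
        while True:
            chunk = movie.read(CHUNK_SIZE)
            if not chunk:
                break
            send_chunk(chunk, offset)
            offset += len(chunk)
            # Persist progress after every confirmed chunk.
            with open(offset_file, "w") as f:
                f.write(str(offset))
    return offset
```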
Player 114 can be a client application for a PC browser. In some embodiments, player 114 resides on a portable device such as a mobile phone or network-attached digital video camera without a browser, and in these embodiments, player 114 can be a network enabled client application.
Player 114 enables a user to view a movie, including forward seek and rewind functions; to seek the compressed video on a remote server using random seek points; to request a movie download from the random seek point, for example, in accordance with U.S. Pat. No. 6,157,771, the disclosure of which is hereby incorporated by reference in its entirety; and to use the functions described below with regard to
Phone 130 can be a wireless communication device executing versions of uploader 112 and player 114 adapted for the device capabilities of phone 130. Phone 130 can be coupled to camera 135, which serves to capture images and provide the captured images to phone 130 as a movie signal. In some embodiments, phone 130 uses the multimedia messaging service (MMS) protocol to transmit and/or receive movies. In other embodiments (not shown), phone 130 communicates with a local network using a protocol such as WiFi, and the WiFi network in turn communicates with communication network 100.
PC 140 can be a personal computer that is able to view transcoded movies by obtaining an address for the movie, or a segment thereof, and providing the address to server 50. As an example, a user of PC 110 or a user of phone 130 may upload a movie to movie system 10, edit the uploaded movie, and provide to the user of PC 140, via email, an address of an edited segment that the user of PC 140 is permitted to view.
Camcorder 145 can be a network enabled movie capture device configured to upload its recordings to movie system 10. In some embodiments, there is at least one predefined user group able to immediately edit information uploaded from camcorder 145. This configuration is useful in a security application, for example.
The user of PC 110 or phone 130 serves as an editor. PC 110, phone 130, PC 140 and camcorder 145 are each at respective locations that are remote from the location of movie system 10.
Turning to
At step 320, transcoder 40 stores the transcoded movie in videobase 20. As described below, user1 can grant permission to other users to view the whole of the transcoded movie, or to view segments of the transcoded movie, by providing suitable addresses to the authorized users.
Turning to
At step 340, server 50 requests the selected movie from videobase 20, indicating the format for the movie.
At step 350, if the requested format happens to match the stored format, then the movie is provided directly to server 50. Otherwise, videobase 20 is operative to convert the format from the stored format to the requested format. The movie is provided in the requested format, also referred to as a third format, a downloaded format, or an output format. Server 50 then sends the provided movie to user2 via a suitable distribution method such as streamed video or podcast, or presentation on a web page, blog, wiki, really simple syndication (RSS) or other technique. In some embodiments, videobase 20 sends the stored movie to transcoder 40, for conversion to the requested format, and then transcoder 40 provides the movie in the requested format to server 50.
At step 2005, transcoder 40 analyzes the movie characteristics. As used herein and in the claims, inherent movie characteristics means information in the movie itself, without reference to usage or viewing. Examples of inherent movie characteristics include motion, scene changes, face presence, audio track loudness, and so on. In this document, inherent movie characteristics are sometimes referred to as movie characteristics.
At step 2010, transcoder 40 determines whether the movie has been previously viewed. If not, processing proceeds to step 2020. If the movie has been previously viewed, then at step 2015, transcoder 40 analyzes the viewed characteristics of the movie, also referred to herein as the popularity of the movie. Viewed characteristics include number of times that the movie has been requested, number of times that a link or deep tag to the movie has been sent from one viewer to another potential viewer, number of deep tags in the movie, number of times that a particular movie segment was replayed, number of times that viewers paused at a particular movie position, and so on. In some embodiments, the viewed characteristics are associated with the demographics of the viewer, such as gender, age, location, income, interests and so on.
At step 2020, transcoder 40 determines if the ad placement is occurring in real time.
As explained below, movie system 10 can operate in one or more of the following modes:
Non-real time ad placement mode enables movie system 10 to operate in essentially a broadcast fashion, for ads. Real-time ad placement mode enables movie system 10 to operate in a narrowcast fashion, wherein the characteristics of the viewer determine what ads are sent to the viewer. If movie system 10 is operating in non-real time ad placement mode, then processing proceeds to step 2030. If movie system 10 is operating in real-time ad placement mode, or hybrid ad placement mode, then at step 2025, transcoder 40 analyzes the viewer's characteristics. In this context, characteristics of the viewer include demographic information as well as activity of the viewer, such as entering a keyword, the material previously viewed by the viewer and so on.
At step 2030, transcoder 40 retrieves the advertiser's preferences for ad placement. Advertiser preferences can specify movie characteristics, movie popularity characteristics, viewer characteristics, and how much the advertiser is willing to pay depending on the combination of features delivered, which may include whether the ad is viewed or whether the viewer takes an action relating to the ad. For example, some ads include hyperlinks that a viewer can click on, and the advertiser may be willing to pay a first rate if the ad is merely presented to the viewer, and a second rate if the viewer clicks on the hyperlink.
At step 2035, transcoder 40 determines prices for positioning ads at various places in the movie. The price determination procedure can be a function of the movie characteristics, the viewed characteristics, the viewer characteristics, advertiser demand, and the price setting mechanism.
At step 2040, transcoder 40 places the at least one ad in the movie. It will be appreciated that an advertiser's rules may specify manual placement, in which case transcoder 40 notifies a person associated with the advertiser of the prices for placing the ad at various positions, and a human manually selects ad placement.
At step 2045, the movie is delivered to the viewer that requested the movie. When movie system 10 operates in non-real time ad placement mode, there is a step (not shown) of storing the movie plus ads in videobase 20 prior to delivering the movie to the requesting viewer.
At step 2050, statistics are collected about movie viewing, such as by server 50 receiving play trails (discussed below), notices of deep tag transmissions and so on, and storing them in database 30.
At step 2055, appropriate viewing reports are generated by billing program 65, server 50 and transcoder 40.
A prior art search engine enables advertisers to bid on keywords. During a set-up phase, the advertiser selects the desired keyword(s), and their maximum bid price. In operation, when a searcher enters the desired keyword, advertiser ads are presented in an order corresponding to their bid amounts, that is, the ad with the highest bid is listed first, followed by ads with sequentially lower bid amounts. When the searcher clicks on an ad, the search engine bills the advertiser, that is, if the ad is merely presented with no response from the searcher, the advertiser does not pay. Advertisers can then view reports on the “click-through” activities for their advertisements.
In an embodiment wherein the price of placing an ad in a movie varies, advertisers can determine the price at least in part by bidding. For example, movie system 10 can determine a starting price for ad placement in portions of a movie based on popularity, level of motion, length of ad, whether the ad is full-motion video or a static image, and so on. Advertisers can then (i) submit specific bids for specific movies, e.g., $100 for placement in movie xyz for two weeks, (ii) define bidding rules, e.g., placement in any travel movies viewed at least 100 times per day, or placement in any movie segment having a deep tag with the text label “airplane” and where the movie is one of the most popular 1,000 movies for the day, or (iii) define results and how much they are willing to pay, e.g., at least 100 views per hour by girls of age 10-14 located in California at a cost of up to $20 per hour.
The value placed upon media inventory across the timeline of a movie can vary dramatically and dynamically based on continuously updated statistics from both behavioral and contextual analysis of the video stream; for example, the most prized video segments may be those that are most popular and contain the deep tags for “cars”, “hotels”, “swimming”, etc.
As an overview, transcoder 40 receives an uploaded movie, creates a representation for easy editing, and adds user-supplied editing data (steps 400-440) to create a transcoded movie. Then, at the user's option, some, all or none of the following functions can be performed, in any desired sequence, and in as many editing sessions as desired:
Including an advertising movie in a transcoded movie ensures that even if the viewer of the transcoded movie has a filter for blocking advertising pop-ups and the like, the included advertising movie is viewed, since the filter considers the advertising movie to be part of requested content.
At step 400 of
At step 410, transcoder 40 builds a texture strip representing the movie. Specifically, transcoder 40 applies a function to each frame to generate texture data, and saves the texture data as a video image. For example, the function might extract the center 8×8 pixels of each frame and realign them into a column 64 pixels high, so that the texture strip is the sequence of 64-pixel columns. The texture strip may be saved as a .jpg file. The texture strip serves to represent the entire movie in a convenient information bar, and is sometimes referred to as a navigation bar. The texture strip is an intuitive way of determining the temporal position of a frame relative to the entirety of a movie. The texture strip often is useful in detecting scene changes, which is important when deciding which frames to group together as a segment.
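The texture-strip construction just described can be sketched in a few lines. The frame source and dimensions below are hypothetical assumptions; the sketch assumes decoded frames are available as height×width×RGB arrays and simply stacks the center 8×8 block of each frame into a 64-pixel-high column, one column per frame.

```python
import numpy as np


def texture_strip(frames):
    """Build a texture strip: one 64-pixel column per frame.

    frames -- iterable of H x W x 3 uint8 arrays (hypothetical decoder output).
    Returns a 64 x N x 3 array, where N is the number of frames.
    """
    columns = []
    for frame in frames:
        h, w, _ = frame.shape
        cy, cx = h // 2, w // 2
        # Center 8x8 block of the frame (64 pixels).
        block = frame[cy - 4:cy + 4, cx - 4:cx + 4, :]
        # Realign the 8x8 block into a single 64-pixel-high column.
        columns.append(block.reshape(64, 1, 3))
    return np.concatenate(columns, axis=1)


# Synthetic frames stand in for a decoded movie; a real system would save the
# result as a .jpg navigation bar.
frames = [np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8) for _ in range(100)]
print(texture_strip(frames).shape)  # (64, 100, 3)
```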
At step 420, transcoder 40 creates a source proxy for the uploaded movie. Generally, a source proxy is a representation of the frames of the movie in a particular format that is easy to convert to other formats and to distribute via public communication network 100. For example, the Flash video format, according to the H.263 standard, can be used for the source proxy.
Using a source proxy reduces the format conversion issue. Specifically, if there are n movie formats, a general transcoder should be able to convert from any input to any output format, which, by brute force, would require n² different format converters. However, using a source proxy means that only 2n format converters are needed (n converters to the source proxy format, and another n converters from the source proxy format to the output format). Additionally, as new movie formats become available, supporting them requires creating only 2 converters per format (one to the source proxy format, and one from the source proxy format), rather than 2n with the brute force approach. It is recognized that, sometimes, the source proxy format may be the desired output format.
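The 2n-versus-n² point can be made concrete with a small sketch. The format names and the string-building converter functions below are hypothetical placeholders; the point is that only one converter into the proxy format and one converter out of it are registered per format, and every conversion is routed through the proxy.

```python
# Hypothetical converter registry: one converter TO the proxy format and one
# FROM it per supported format (2n total), instead of one converter for every
# ordered pair of formats (n^2 total).
to_proxy = {
    "dv":   lambda data: f"proxy({data})",
    "mpeg": lambda data: f"proxy({data})",
    "3gp":  lambda data: f"proxy({data})",
}
from_proxy = {
    "dv":   lambda proxy: f"dv({proxy})",
    "mpeg": lambda proxy: f"mpeg({proxy})",
    "3gp":  lambda proxy: f"3gp({proxy})",
}


def convert(data, src_format, dst_format):
    """Convert between any two formats by way of the source proxy."""
    proxy = to_proxy[src_format](data)        # one of the n converters in
    if dst_format == "proxy":
        return proxy                          # the proxy may itself be the output
    return from_proxy[dst_format](proxy)      # one of the n converters out


print(convert("raw-bytes", "dv", "mpeg"))
```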
Editing of the proxy format, also referred to as proxy editing, may occur in several ways.
In one embodiment of proxy editing, the edits are applied directly to the proxy frames.
In another embodiment of proxy editing, the proxy frames are maintained as generated, and an edit list is created, comprising edits to be sequentially applied to the proxy frames. Each time the edited movie is provided, the edits are applied anew to the proxy frames. This embodiment is particularly useful when edits need to be undone, or when many users are editing one movie to create separate edited movies.
In a further embodiment of proxy editing, a hybrid approach is used, wherein during an edit session, an edit list is created, and only at the termination of the edit session are the edits applied directly to the proxy frames.
At step 430, transcoder 40 generates a thumbnail as a visual representation of the entire movie. Typically, the user selects a frame, and transcoder 40 reduces it to a thumbnail size, such as 177×144 pixels. A user having many stored movies can conveniently view their thumbnails, rather than or in addition to text descriptions and/or filename descriptions.
At step 440, transcoder 40 accepts metadata from the user. Movie metadata may include a filename for the transcoded movie, subject matter keywords associated with the movie, a short text description to be associated with the thumbnail, any deep tags the user cares to define, address information such as a hyperlink of information to be associated with the transcoded movie, and an associated movie such as an audio file describing the contents of the movie.
A deep tag is a video bookmark, indicating a sequential group of frames that are to be treated as a separately addressable segment; the deep tag metadata includes the movie filename, the user filename, date of creation of the deep tag, date of most recent modification of the deep tag, a deep tag filename, the start frame, the end frame, the duration of the segment, and a short text description of the segment. A deep tag is understood to be a convenient way of identifying a segment.
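A deep tag as described above is essentially a small metadata record. The field names below simply mirror the list in the preceding paragraph; the class itself is a hypothetical illustration of such a record, not a defined storage format.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class DeepTag:
    """Video bookmark marking a separately addressable segment of a movie."""
    movie_filename: str
    user_filename: str
    created: date             # date the deep tag was created
    modified: date            # date of most recent modification
    deep_tag_filename: str
    start_frame: int
    end_frame: int
    description: str          # short text description of the segment

    @property
    def duration_frames(self) -> int:
        # Duration of the segment, derivable from the start and end frames.
        return self.end_frame - self.start_frame + 1


tag = DeepTag("vacation.flv", "user1", date(2006, 5, 23), date(2006, 5, 23),
              "beach.tag", start_frame=120, end_frame=360,
              description="kids at the beach")
print(tag.duration_frames)  # 241
```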
Video display 710 shows the current frame of video. When the editor's device, such as PC 110 or phone 130, permits, the video frame is displayed in its proxy format. However, if the editor's device cannot support the proxy format, transcoder 40 converts edited frames to an output format suitable for the editor's device prior to sending the edited frames to the editor for display.
Thumbnail 715 is a small image representing the entire movie.
Texture strip 720 comprises sequential frame representations 725 and subsequent information; each frame representation 725 is the result of the function used to create the texture strip, such as a vertical column of 64 pixels, and represents a single frame. Subsequent information indicates special effects applied to the frames and any advertising inserted in the frames.
Positioner 730 indicates where the frame display in video display 710 is located relative to the entirety of the movie. Positioner 730 enables the editor to use texture strip 720 to seek frames in the movie in a random access manner.
Deep tag marker 735 has a left edge that can be adjusted by a user, and also has a right edge that can be adjusted by the user; after the user has adjusted the left and right edges of deep tag marker 735, the user indicates that these settings should be saved as a deep tag, such as by clicking deep tag button 740, and providing a text description corresponding to the movie segment indicated by the deep tag. Deep tag marker 735 enables the editor to use texture strip 720 to select a segment of the movie.
Deep tag bar 750 is a visual representation of deep tags that have already been created for the movie. In the example of
Function buttons 760, 771, 772, 773, 775, 780 and 790 enable the user to edit the movie. Data button 760 enables the user to view and edit metadata associated with the movie. Playback product buttons 771, 772 and 773 take the user to specialized editing interfaces, discussed below. Effects button 775 enables the user to add and edit special effects. Ad button 780 enables the user to include advertising in the movie. Sharing button 790 enables the user to grant permission to other users or user groups to view selected segments of the movie.
At step 450 of
Turning to
A mash-up is a sequential display of selected segments. A viewer of a mash-up playback product can only navigate forward or backward in the product.
A tree is a set of segments and a hierarchical, linear control structure for displaying the segments. Generally, a viewer of a tree playback product clicks on selections to navigate the product, in addition to forward and backward.
A link-up is a set of segments and a non-linear control structure for displaying the segments. Generally, a viewer of a link-up playback product navigates via one or more of: forward and back movement, clicking on selections, and/or providing alphanumeric input. A tree and a mash-up are constrained forms of a link-up.
If a mash-up is selected, at step 655, the user selects the sequence of segments to be included via a mash-up editor. Then, at step 685, the user selects whether the mash-up is to be published or burned.
Publishing means transferring the mash-up to videobase 20 or to user PC 110. If publishing is selected, at step 690, the user selects a thumbnail to represent the mash-up, and provides metadata if desired such as a mash-up filename. At step 695, the user indicates a destination for the mash-up file, such as videobase 20, or PC 110. Transcoder 40 responds by transferring the mash-up in accordance with the user's selections.
Burning means writing the mash-up to a removable storage medium, such as a DVD or memory chip, and sending the removable storage medium to the user. If burning is selected, at step 698, transcoder 40 transfers the mash-up file to the removable storage medium type designated by the user. In the case of a DVD, transcoder 40 sends the mash-up file to DVD burner 60, which creates a DVD having the mash-up.
Slots area 836 comprises placeholders into which the editor, also referred to as the user, drags and drops thumbnails to indicate that the thumbnails are part of the mash-up being created. Slots area 836 includes slots 837, 838, 839, 840.
Texture strip 830 represents the mash-up being created. Phantom start and end frames 831, 832 enable the user to add thumbnails before or after the selected thumbnails. Frame selector 835 has start and end portions that can be adjusted by the user. After the user is satisfied with the thumbnails dragged into slots 837-840, the user clicks insert button 833 to insert these thumbnails into the mash-up. In response, transcoder 40 creates a frame representation of each thumbnail, puts the thumbnail frame representation in the appropriate frame of texture strip 830, and clears slots 837-840. To insert subsequent files into the mash-up, the user moves frame selector 835 to a position after the inserted thumbnails. To insert preceding files into the mash-up, the user moves frame selector 835 to include phantom frame 831. To delete files from the mash-up, the user positions frame selector 835 on the frame representations of the thumbnails of the files to be deleted, and clicks cut button 834.
At any time, the user can click preview button 896 to see what the mash-up will look like. In response to preview button 896, transcoder 40 creates preview window 802, shown in
Publish button 897 enables the user to indicate to transcoder 40 that publication of the mash-up is desired. Clicking publish button 897 causes transcoder 40 to pop-up a window (not shown) that enables the user to select a thumbnail and destination.
Burn button 898 enables the user to indicate to transcoder 40 that burning of the mash-up is desired. Clicking burn button 898 causes transcoder 40 to pop-up a window (not shown) that enables the user to select a media for burning and provide delivery directions for the burned media.
Back button 899 enables the user to return to edit window 700 in
Tree structure window 855 shows a graph of the control structure of the tree playback product being created. Initially, the window is blank, since nothing has been specified.
Each segment of a tree playback product comprises a background, a foreground and a link. In one embodiment, a background is a still image, a foreground is a video or audio segment, and the link is indicated by a graphic image. When the viewer of the playback product clicks on the link, the viewer is taken to the next segment, that is, a link indicates one segment. Each segment can have 0, 1 or multiple links.
A tree segment can also include 0, 1 or multiple commands. Typically, a command is indicated by a graphic image. When the viewer of the playback product clicks on the command, the command is sent to the source of the playback product, such as server 50, for execution.
Returning to
If a tree playback product segment is created with at least one link, transcoder 40 creates an empty segment as the destination of each link, and displays the empty segment in tree structure window 855. The editor clicks on the empty segment in tree structure window 855 and inserts thumbnails into at least one of the background, foreground and link slots.
If the editor wishes to delete a segment, the editor selects the segment in tree structure window 855, then clicks cut button 856 to remove the segment. Removing a segment automatically removes the link leading to the segment from the preceding segment.
To create a command in a segment, the editor clicks add command button 865. Transcoder 40 provides a pop-up window with a command editor (not shown) that enables the editor to drag and drop a thumbnail indicating the command, select a command from a menu of commands or type the command directly into a command line window (if the editor knows how to write commands in the command language, such as Javascript), and, when appropriate, provide parameters for the command. Examples of commands are: (i) a hyperlink to a webpage, (ii) provide the email address of the viewer of the playback product to the owner of the playback product, (iii) provide the email address of the viewer to a third party, (iv) download a program and execute it, and so on.
It will be understood that a thumbnail can be dropped into multiple slots during creation of a tree playback product.
Clicking preview button 896 causes transcoder 40 to create a window similar to that shown in
If a link-up playback product is selected, at step 670, the user selects the sequence of segments to be included via a link-up editor. At step 675, the user defines the link-up structure. At step 680, the user defines the link-up navigation questions and answers. Then, at step 685, the user selects whether the link-up is to be published or burned.
Link-up structure window 875 shows a graph of the control structure of the link-up playback product being created. Initially, the window is blank, since nothing has been specified.
Returning to
A Q&A dialog (not shown) consists of an optional header and a set of choices associated with the respective links. One instance of a Q&A is,
If the editor wishes to delete a segment, the editor selects the segment in link-up structure window 875, then clicks cut button 876 to remove the segment. Removing a segment automatically removes the portion of the Q&A leading to the segment from the preceding segments.
Clicking preview button 896 causes transcoder 40 to create a window similar to that shown in
Returning to
Examples of special effects include: watermark, mosaic, barndoor, noise, dissolve, spiral, fade in, fade out, increase contrast, decrease contrast, soften perimeter, cut frame, overlay, and so on.
A movie watermark for a movie including video is one of (a) a permanent graphic such as a logo inserted in each frame of the movie, (b) an overlay graphic such as a logo inserted in each frame of the movie, or (c) hidden information such as a logo, also referred to as steganography. A movie watermark for an audio only movie is (i) a sound file inserted at a predetermined point in the movie, such as its start or end, or (ii) hidden information such as text, also referred to as steganography.
An effect can also include an executable command, as described above with respect to the tree playback product editor.
Texture strip 1030 indicates the frame representations of the movie selected for effects, such as frame representation 1031. Slider 1035 has left and right edges that can be adjusted by the editor to indicate frames to receive an effect. After the editor selects an effect in effects area 1010 and adjusts slider 1035, the editor clicks apply button 1033 to apply the effect. After the editor has finished applying effects, he or she clicks preview button 896 to preview. If an editor wishes to cancel an effect, she positions slider 1035 on the appropriate frame representations and clicks cancel button 1034. Publish button 897, burn button 898, and back button 899 function as described above.
Returning to
If the user wants to include an ad movie in the transcoded movie, at step 510, transcoder 40 determines whether the user wishes to select the ad movie, or to accept an ad movie selected by movie system 10. If the user wishes to select the ad movie, at step 520, transcoder 40 provides the user with a menu of ad movies that are consistent with the characteristics of the user's transcoded movie, and the user selects one or more ad movies for inclusion in their transcoded movie. If the user does not wish to select the ad movies, at step 530, transcoder 40 selects one or more ad movies based on an ad movie selection procedure and its own determination of how many ad movies are appropriate for the transcoded movie. In some embodiments, the ad movie selection procedure is based on maximizing revenue for the user and following a least-recently-used ad movie selection procedure. In some embodiments, the determination of how many ad movies are appropriate is based on at least one of: the length of the transcoded movie, keywords in the metadata, how many segments are deep tagged, the length of the deep tagged segments, and so on.
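One way to read the least-recently-used selection procedure mentioned above is sketched below. The ad records and the revenue field are hypothetical; the sketch simply prefers ads that have not been inserted recently, breaking ties by expected revenue for the user.

```python
def select_ads(ads, count):
    """Pick `count` ads, least recently used first, higher revenue as tie-break.

    ads -- list of dicts with hypothetical keys:
           'id', 'last_used' (timestamp, or None if never used), 'revenue_per_view'.
    """
    never_used = float("-inf")
    ranked = sorted(
        ads,
        key=lambda ad: (ad["last_used"] if ad["last_used"] is not None else never_used,
                        -ad["revenue_per_view"]),
    )
    return [ad["id"] for ad in ranked[:count]]


ads = [
    {"id": "ad1", "last_used": 1700000000, "revenue_per_view": 0.02},
    {"id": "ad2", "last_used": None,       "revenue_per_view": 0.01},
    {"id": "ad3", "last_used": 1690000000, "revenue_per_view": 0.03},
]
print(select_ads(ads, 2))  # ['ad2', 'ad3']
```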
An ad used with movie system 10 can be static or dynamic. A static ad is inserted in the transcoded movie prior to its storage in videobase 20. For a dynamic ad, the transcoded movie is stored in videobase 20 with a placeholder, and when the transcoded movie is presented to a user, the actual ad is inserted, the ad being chosen based on a characteristic of the user such as the user's location, referred to as a “geo-aware” ad, the characteristics of the display device, referred to as a “device aware” ad, or other suitable characteristic.
At step 540, transcoder 40 determines whether the user wishes to control the position of the ad movie(s) within a frame of the transcoded movie, or to accept positioning determined by movie system 10. If the user wishes to control the positioning of the ad movie, at step 550, transcoder 40 provides the user with a graphical interface for controlling ad movie positioning, and a menu of how the ad-movie should be inserted, such as a picture-in-picture at the top, left, right or bottom of the transcoded movie, the top being referred to as a “banner” ad, or as a stand-alone segment in the transcoded movie. In some embodiments, transcoder 40 also provides the user with popularity statistics for portions of the movie, discussed in detail below, so that the user can position the ad in accordance with the parts of the movie that viewers like. In some embodiments, the user selects the size of the space in each frame that the ad may occupy. If the user does not wish to control the positioning of the ad movie, at step 560, transcoder 40 decides where the movie ad should be placed, typically by looking at the default position in the metadata associated with the ad movie.
At step 570, transcoder 40 determines whether the user wishes to control the frames of the transcoded movie where the ad will be placed, or to accept positioning determined by movie system 10. If the user wishes to control the positioning of the ad movie, at step 580, transcoder 40 provides the user with a graphical interface for controlling ad movie positioning, such as a texture strip and slider, and in some cases, popularity statistics (discussed below) for portions of the movie. If the user does not wish to control the positioning of the ad movie, at step 590, transcoder 40 estimates where the movie ad should be placed. Any suitable estimation procedure may be employed. As an example, if the transcoded movie is a completely new movie with no deep tags, the estimation procedure specifies that the first scene having a length of at least the length of the ad movie is the selected position, and that the ad movie should be inserted as a picture-in-picture in the lower right of the transcoded movie. However, if the transcoded movie is associated with at least one other user, then a popularity procedure is used to select the position for the ad movie.
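The default estimation procedure for a never-viewed movie can be sketched as follows. The scene list and the placement string are hypothetical assumptions (a real system would derive scenes from, e.g., the texture strip); the sketch just returns the first scene at least as long as the ad, as described above.

```python
def place_ad_in_new_movie(scenes, ad_duration):
    """Pick the first scene long enough to hold the ad.

    scenes -- list of (start_second, end_second) tuples in playback order
              (hypothetical scene-change detector output).
    Returns (start_second, placement) or None if no scene is long enough.
    """
    for start, end in scenes:
        if end - start >= ad_duration:
            # Insert as a picture-in-picture in the lower right of the frame.
            return start, "picture-in-picture lower-right"
    return None


scenes = [(0, 8), (8, 25), (25, 60)]
print(place_ad_in_new_movie(scenes, ad_duration=15))  # (8, 'picture-in-picture lower-right')
```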
Collection and processing of user statistics will now be discussed. These statistics are employed in the popularity procedure used to automatically position the ad movie.
At step 1300, a viewer, such as a user of PC 140, who is understood to also or alternatively be a listener, requests a movie, such as by clicking on a link to the movie, the link being embedded in an e-mail previously sent to the viewer. At step 1325, server 50 receives the movie request, and retrieves the requested movie from videobase 20. At step 1330, server 50 sends the movie to the requesting viewer, and at step 1335, server 50 updates its download statistics for the movie.
At step 1305, the viewer receives the movie, and at step 1310 plays the movie. Playing the movie includes viewing the movie in its intended temporal sequence, pausing, replaying a section of the movie, fast-forwarding, adding one or more deep tags and so on. Meanwhile, at step 1315, a play trail is being recorded. As used herein and in the claims, a “play trail” means a timestamped sequence of actions that a viewer takes when playing a movie. The timestamp is typically measured in tenths of a second since the start of the movie, but may be measured in clock time or by frame number of the movie, or any other convenient metric. Each action in the sequence carries appropriate information to reconstruct what the viewer did, for example, “pause” indicates how long the movie was paused. When the viewer finishes viewing the movie, at step 1320, the play trail is sent to server 50.
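A play trail as defined above can be represented as a simple timestamped action log. The action names and the tenth-of-a-second clock come from the description; the recorder class itself is a hypothetical sketch of what player 114 might accumulate before sending the trail to server 50.

```python
import json


class PlayTrail:
    """Timestamped sequence of player actions for one movie viewing."""

    def __init__(self, movie_id):
        self.movie_id = movie_id
        self.actions = []   # each entry: time in tenths of a second, action, detail

    def record(self, tenths_since_start, action, **detail):
        # e.g. record(125, "pause", seconds=30) or record(430, "replay", start=200, end=280)
        self.actions.append({"t": tenths_since_start, "action": action, **detail})

    def to_json(self):
        # Serialized form sent to the server when the viewer finishes playing.
        return json.dumps({"movie_id": self.movie_id, "actions": self.actions})


trail = PlayTrail("movie-123")
trail.record(0, "play")
trail.record(125, "pause", seconds=30)
trail.record(430, "replay", start=200, end=280)
trail.record(900, "deep_tag", start=600, end=750)
print(trail.to_json())
```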
At step 1340, server 50 receives the play trail. At step 1345, server 50 stores the play trail along with information about when the play trail was received, and the identity of the viewer, if known, or the computer address from which the play trail was sent. At step 1350, server 50 updates basic statistics for the movie. Basic statistics include, but are not limited to: number of deep tags created for a portion of a movie, access (listening and/or viewing) frequency for portions of a movie, replay frequency for portions of a movie, pause frequency for portions of a movie, and community score for portions of a movie.
As described above, a play trail is created at the user's location, then sent to server 50. In other embodiments, the play trail is stored on the user's PC in a so-called cookie, a locally stored data file that tracks an authenticated user's actions, so that the play trail data can persist over time even when the user's browser session has completed or has closed prematurely. The cookie can be extensive in terms of data structures and may include demographic data as well.
Alternatives to cookies include the use of HTTP authentication, client-side persistence, locally stored Flash objects, JavaScript window.name variables, and so on; each of these methods provides a mechanism to store and track user play trail data. The advantage of a cookie relative to a local file is that a cookie uses a standardized Internet mechanism to track user data in a well-defined location on a user's hard drive. The data is tracked across user browser sessions and is persistent across those sessions. Cookies protect user privacy by tracking user behavior without tracking identity.
Play trails are a form of user behavior tracking. Other user behavior that is tracked to determine popularity of a movie or movie segment includes creating a deep tag and/or sending a deep tag, such as via email, instant message (IM), message service such as SMS or MMS, or podcasting. Deep tags indicate what users consider interesting.
An email or IM version of a deep tag typically comprises a URL providing the address of the deep tag, such as www.server.com/deeptag/movieID/StartTime/EndTime, or www.server.com/movieID/deeptag/deeptagID. Additionally, the email may include text associated with the deep tag, a thumbnail image for the deep tag, and an attachment comprising the portion of the movie indicated by the deep tag. SMS, MMS and podcast versions of a transmitted deep tag are similar.
Replay frequency measures how many times a listener/viewer replays a particular portion, as opposed to its initial playing.
Pause frequency measures how many seconds a listener/viewer pauses at a particular point in a movie.
Community score is based on ratings provided by listeners/viewers, explained below in connection with
While
In
Histogram 960, also referred to as popularity density function 960, indicates the popularity metric, herein the number of times that a portion of the movie has appeared in a deep tag, which serves as an indication of the popularity of the various parts of the movie, plotted against the sequence in which the frames of the movie are displayed.
Color graph 970 uses different colors to represent different levels of popularity. In one scheme, the most popular portions of a movie are indicated by red areas, while the least popular portions of a movie are indicated by blue areas. In another scheme, different shades of gray are used to indicate popularity.
Color graph 970 can be generated as follows. Let the metric being plotted be the Total Interest in a specific part of the movie. The Total Interest metric is an indicator of the level of popularity of a section of a movie; it is a weighted combination of metrics, such as the amount of time paused, the number of playbacks, or the number of deep tags created in a specific section of the movie. First, the metric is normalized to a value between 0 and 100. Next, the normalized metric is mapped to the 0-255 code for each of the display colors of blue, green, and red.
As a specific example, let the metric plotted be the number of playback events per unit time for a movie, and assume the maximum number of playback events for any segment of a video is 25, so that 25 playback events is normalized to a value of 100. The color code is calculated using the following formula:
Red = MAX((Normalized_Metric − 50) * 255/50, 0)
Blue = MAX((50 − Normalized_Metric) * 255/50, 0)
For Normalized_Metric <= 50: Green = Normalized_Metric * 255/50
Else: Green = (100 − Normalized_Metric) * 255/50
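A direct reading of these formulas, together with the normalization step in the example above (25 playback events mapping to 100), is sketched below; the per-segment playback counts are hypothetical sample data.

```python
def normalized_metric(value, max_value):
    """Normalize a raw popularity metric to the 0-100 range."""
    return 100.0 * value / max_value


def metric_to_rgb(n):
    """Map a normalized metric (0-100) to 0-255 red, green, blue intensities."""
    red = max((n - 50) * 255 / 50, 0)
    blue = max((50 - n) * 255 / 50, 0)
    green = n * 255 / 50 if n <= 50 else (100 - n) * 255 / 50
    return round(red), round(green), round(blue)


# Hypothetical playback counts per segment; 25 is the maximum, so it maps to 100.
playbacks = [2, 10, 25, 13, 5]
for count in playbacks:
    n = normalized_metric(count, max_value=25)
    print(count, metric_to_rgb(n))
# Low-popularity segments come out blue, mid-range green, and the most popular red.
```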
Sample usage statistics for a particular movie, and their conversion to RGB values are shown in the following table.
The Red Green Blue color values determine the color for a vertical column of pixels in color graph 970 corresponding to the time interval of the movie segment. As is well-known, each pixel of a display can actually be a group of three light emitting elements: red, green and blue, and the intensity values for each of the elements determine the color perceived by the viewer. In some embodiments, the columns of color graph 970 are placed next to each other, like bar graphs having the same height, whereas in other embodiments, the colors are interpolated from column to column, to provide a more aesthetically pleasing appearance.
Cloud graph 980 has clouds or circles whose size corresponds to the popularity of a movie portion.
Other types of graphical representations of popularity may be employed, such as three dimensional strips with colored segments, three dimensional bar graphs, surface contour maps in three dimensions, and so on.
As used herein and in the claims, popularity statistics are based on one or more popularity metrics such as total number of requests for a movie, total number of times that the movie has been selected as a favorite (see
A further aspect of the movie may be provided by demographic or other selected data about viewers, such as age bracket, gender, location, income level and so on, instead of or in addition to the popularity metrics discussed above. Of course, this can be accomplished only when the viewer provides such demographic data, such as when establishing a new user account.
Popularity graphs can be reported to the owner of the movie.
As shown in
Popularity graphs can also be used internally by transcoder 40. Movie ads are placed in the popular parts, if they are long enough, or with respect to the popular parts, if the popular parts are shorter than the ad movie. For example, the movie ad could be placed so that its start is co-located with the start of a popular segment having a duration of at least 70% of the duration of the ad movie.
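This internal use of popularity graphs can be sketched as follows; the popular-segment list below is hypothetical and would in practice be derived from the play-trail statistics discussed above.

```python
def align_ad_with_popular_segment(popular_segments, ad_duration):
    """Place the ad so its start coincides with the start of a popular segment.

    popular_segments -- list of (start_second, end_second, popularity_score)
                        tuples (hypothetical output of the popularity analysis).
    A segment qualifies if its duration is at least 70% of the ad duration;
    among qualifying segments the most popular one wins.
    """
    qualifying = [
        (start, end, score)
        for start, end, score in popular_segments
        if (end - start) >= 0.7 * ad_duration
    ]
    if not qualifying:
        return None
    start, _, _ = max(qualifying, key=lambda seg: seg[2])
    return start  # ad start time, co-located with the popular segment's start


segments = [(10, 18, 0.9), (40, 70, 0.7), (90, 95, 0.95)]
print(align_ad_with_popular_segment(segments, ad_duration=20))  # 40
```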
Returning to
At step 1380, an owner of a movie requests custom statistics about the movie, such as the number of deep tags created by a particular demographic group during a particular time period. At step 1355, server 50 receives the custom statistics request. At step 1360, server 50 retrieves appropriate play trails, and at step 1365, server 50 retrieves appropriate user data. At step 1370, server 50 generates the requested custom statistics report. At step 1375, server 50 sends the custom statistics report to the requester. At step 1385, the requester receives the custom statistics report.
At step 600 of
If the movie has been previously viewed, at step 1515, transcoder 40 obtains movie characteristics and popularity statistics for the movie, such as popularity density graph 960, color graph 970, cloud graph 980, or a computed metric such as Total Interest, discussed above as a weighted sum of various play events. Transcoder 40 also obtains the placement rules for an ad. Typically, an ad provider specifies default placement rules when their account is set up, and can modify, or override for a particular ad or movie, these default placement rules as they wish. Default ad provider placement rules specify whether an ad must be placed manually or automatically, by transcoder 40. If automatically, the ad provider placement rules are able to further specify whether the ad should be aligned with the start of a popular portion, be centered on a popular portion, be aligned with the end of a popular portion, be placed in the popular portion of closest duration to the ad, or other placement procedure. The duration of the ad and the duration of the movie may or may not be identical. The ad provider placement rules are also able to specify whether the ad provider requires the most popular portion of the movie, or will accept another portion that is still popular, such as by indicating how much of a premium the ad provider is willing to pay. In some embodiments, for aesthetic reasons, transcoder 40 generally ensures that there is only one movie ad in any portion of a movie.
On the other hand, if the movie has never been viewed, which in some embodiments corresponds to the number of views being below a threshold, then transcoder 40 uses a predefined procedure based on movie characteristics to place ads. Instances of a predefined procedure for ad placement include: (i) placing ads so they are uniformly distributed in time throughout the movie, (ii) clustering ads into groups between movie portions of two to three minutes duration, permitting four to six ads to be inserted for every eight to ten minutes of movie content, as is done in broadcast television, (iii) placing ads at scene changes in the movie, (iv) placing ads at mid-scene in the movie, and so on. In some embodiments, for aesthetic reasons, transcoder 40 generally ensures that there is only one movie ad in any portion of a movie.
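Of the predefined procedures listed above, the simplest, uniform distribution in time, can be sketched directly; the movie duration and number of ads below are hypothetical inputs.

```python
def uniform_ad_positions(movie_duration, ad_count):
    """Spread `ad_count` ad insertion points uniformly in time through a movie.

    Positions fall at the interior division points of the timeline, so no ad
    lands at the very start or the very end.
    """
    step = movie_duration / (ad_count + 1)
    return [round(step * (i + 1), 1) for i in range(ad_count)]


# A hypothetical 600-second movie with three ads: insertion points every 150 s.
print(uniform_ad_positions(600, 3))  # [150.0, 300.0, 450.0]
```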
At step 1525, transcoder 40 places an ad in the movie, with the placement determined either by popularity statistics or the predefined procedure. At step 1530, transcoder 40 stores the movie with ad(s) in videobase 20.
Other procedures may be used to automatically insert an ad into a movie, instead of or in combination with the techniques described above.
Motion analysis of a movie may be used to determine where to place an ad, for example, in a scene having the least amount of motion, or alternatively in a scene having a lot of motion but in a relatively static area of the high motion scene. Motion analysis ensures that ad placement is complementary to activity in the movie, rather than obliterating it. For example, some advertisers may specify that their ad should be placed so that it does not interfere or intersect with dramatic scene changes or scenes of high interest with primary characters visible in the scene.
Face analysis of a movie may be used to ensure that an ad is not placed over a face and/or to ensure that an ad is placed in a scene with at least one human face. Advertiser business rules may indicate a preference for ad placement near faces (i.e., just above or below) so that they do not obscure any characters but are assured that their advertisement will be in an important scene.
Image analysis of a movie may be used to ensure that text or logos are not obscured by an ad. Alternatively, it may be desirable to obscure a logo in the movie with an ad.
Examples of products that perform object recognition and face detection on a video stream are:
Ad area 1110 includes ad thumbnail windows 1111-1120 and navigation buttons 1101, 1102 for altering which thumbnails of the ad thumbnails in videobase 20 are displayed in area 1110. Filename entry window 1103 enables a user to type in the name of an ad file, or select an ad filename from a directory, which puts the file's thumbnail in ad thumbnail window 1111. Ad area 1110 also includes automatic button 1104, for indicating that movie system 10 should select an ad.
An adpack is a pairing of an ad movie and an adpack description. An adpack description controls how the ad movie will be displayed in the transcoded movie. Examples of adpack descriptions are:
The Interactive Advertising Bureau has defined standard ad display areas, and these may be used as adpack descriptions; see, for example, http://www.iab.net/standards/popup/index.asp.
An adpack can also include one or more executable commands, as described above with respect to the tree playback product editor. The provider of an ad movie typically specifies certain adpacks as selectable for its ad movie, and configures the commands to be associated with its ad movie. Examples of commands are: (i) a hyperlink to the ad provider's web site, (ii) sending the email address of the viewer of the transcoded movie to the ad provider, and (iii) requesting a file download from the ad provider to the viewer of the transcoded movie; other commands are also contemplated.
Adpack area 1121 includes adpack description windows 1125-1128 and navigation buttons 1122, 1123 for altering which adpack descriptions are displayed in area 1121. Filename entry window 1124 enables a user to type in the name of an adpack, or select an adpack from a directory, which puts the adpack in adpack description window 1125. Adpack area 1121 also includes automatic button 1105, for indicating that movie system 10 should select the placement of the ad in the frame.
Texture strip 1130 includes frame representations of the movie being edited, such as frame representation 1131. Slider 1135 indicates frame representations in the texture strip; the editor can adjust the left and right edges of slider 1135. Automatic button 1106 is used when the editor wishes to indicate that movie system 10 should select the frames in which the ad is placed.
To manually insert an ad, the editor selects an ad, such as by clicking ad thumbnail 1117, then selects an adpack description to control where in the frame the ad is placed, such as by clicking adpack description 1127, then adjusts slider 1135 to indicate which frames the ad is placed in, then clicks insert button 1133.
To instruct movie system 10 to select an ad and put it in the movie being edited, the editor clicks automatic buttons 1104, 1105 and 1106.
Rules button 1107 enables the editor to specify the advertiser's ad placement preferences. Advertiser preferences can specify movie characteristics, movie popularity characteristics, viewer characteristics, and how much the advertiser is willing to pay depending on the combination of features delivered, which may include whether the ad is viewed or whether the viewer takes an action relating to the ad. For example, some ads include hyperlinks that a viewer can click on, and the advertiser may be willing to pay a first rate if the ad is merely presented to the viewer, and a second rate if the viewer clicks on the hyperlink. Clicking rules button 1107 makes another window (not shown) appear, that guides the editor through specifying the advertiser preferences. In some embodiments, when an advertiser first registers with movie system 10, the advertiser specifies their default preferences; in these embodiments, the window (not shown) has the defaults filled in, and enables the editor to override the defaults.
To remove an ad, the editor adjusts slider 1135 and then clicks cut button 1134.
Buttons 896, 897, 898, 899 function as described above.
Returning to
User2 sees segment 904 as movie 910. User2 can define a deep tag indicating segment 914 of movie 910, and permit an outside user, user3, to access segment 914. Now, in movie 910, segments 912 and 916 are private to user2, while segment 914 is shared.
User3 sees segment 914 as movie 920, and can similarly enable access to all or parts of movie 920.
The transcoded movie is then stored in videobase 20, and database 30 is suitably updated.
A use case will now be discussed.
Let it be assumed that a user uploads three files (
Movie system 10 converts each of these three files to proxy files in H.263 Flash format (
Now the user creates a mash-up (
Next, the user adds special effects (
Finally, the user inserts two ads, allowing the system to automatically choose the ads and their placement (
It will be appreciated that some editing functions can be accomplished in several ways. For example, if the user wishes to delete a frame from the movie, this can be accomplished via (1) creating a mash-up comprising the frames before the deleted frame, followed by the frames after the deleted frame, and not including the deleted frame; (2) adding a special effect to cause the deleted frame to be entirely dark or entirely light, as part of a scene transition; (3) directly selecting “delete frame” from the special effects menu; (4) enabling sharing of the entirety of the movie except for the to-be-deleted frame; and so on.
Elements 720, 730, 740 and 804-807 function as described above; the description will not be repeated here for brevity.
Send button 1392 is used to email a hyperlink of the movie to an email account. Clicking on send button 1392 causes a window to pop-up for entry of an email address. The pop-up window may have a drop down menu of previously used email addresses arranged by frequency of use, or alphabetical order and so on. In some embodiments, the window enables provision of a text message accompanying the hyperlink.
Favorites button 1393 is used to add the movie to the viewer's list of favorite movies maintained by the player for the user.
Popularity selector 1395 enables the user to control display of a popularity graph, as explained above with regard to
Rating selector 1396 enables the user to rate the movie. Rating selector 1396 comprises a set of so-called radio buttons, from which one button can be selected. The default is “unrated”. If a viewer wishes to rate the movie, she or he clicks on the appropriate number of stars, from one star to five stars. In other embodiments, rating selector 1396 uses a different format, such as a slider bar instead of radio buttons.
Although an illustrative embodiment of the invention, and various modifications thereof, have been described in detail herein with reference to the accompanying drawings, it is to be understood that the invention is not limited to this precise embodiment and the described modifications, and that various changes and further modifications may be effected therein by one skilled in the art without departing from the scope or spirit of the invention as defined in the appended claims.
This application is a continuation of, and claims a benefit of priority under 35 U.S.C. § 120 from, U.S. patent application Ser. No. 16/595,137, filed Oct. 7, 2019, entitled “VIDEO CONTENT PLACEMENT OPTIMIZATION BASED ON BEHAVIOR AND CONTENT ANALYSIS,” now U.S. Pat. No. 10,863,224, which is a continuation of, and claims a benefit of priority under 35 U.S.C. § 120 from, U.S. patent application Ser. No. 15/450,836, filed Mar. 6, 2017, entitled “MOVIE ADVERTISING PLACEMENT OPTIMIZATION BASED ON BEHAVIOR AND CONTENT ANALYSIS,” now U.S. Pat. No. 10,491,935, which is a continuation of, and claims a benefit of priority under 35 U.S.C. § 120 from, U.S. patent application Ser. No. 13/408,843, filed Feb. 29, 2012, entitled “MOVIE ADVERTISING PLACEMENT OPTIMIZATION BASED ON BEHAVIOR AND CONTENT ANALYSIS,” now U.S. Pat. No. 9,654,735, which is a continuation of, and claims a benefit of priority under 35 U.S.C. § 120 from, U.S. patent application Ser. No. 11/592,901, filed Nov. 3, 2006, now U.S. Pat. No. 8,145,528, entitled “MOVIE ADVERTISING PLACEMENT OPTIMIZATION BASED ON BEHAVIOR AND CONTENT ANALYSIS,” which claims a benefit of priority from U.S. Provisional Application No. 60/683,662, filed May 23, 2005, and which is a continuation-in-part of U.S. patent application Ser. No. 11/439,600, filed May 23, 2006, now U.S. Pat. No. 8,755,673, entitled “METHOD, SYSTEM AND COMPUTER PROGRAM PRODUCT FOR EDITING MOVIES IN DISTRIBUTED SCALABLE MEDIA ENVIRONMENT,” U.S. patent application Ser. No. 11/439,594, filed May 23, 2006, now U.S. Pat. No. 8,724,969, entitled “METHOD, SYSTEM AND COMPUTER PROGRAM PRODUCT FOR EDITING MOVIES IN DISTRIBUTED SCALABLE MEDIA ENVIRONMENT,” and U.S. patent application Ser. No. 11/439,593, filed May 23, 2006, now U.S. Pat. No. 7,877,689, entitled “DISTRIBUTED SCALABLE MEDIA ENVIRONMENT FOR MOVIE ADVERTISING PLACEMENT IN USER-CREATED MOVIES.” The disclosures of all applications referenced in this paragraph are hereby incorporated by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
3143705 | Currey et al. | Aug 1964 | A |
3192313 | Rubinstein et al. | Jun 1965 | A |
5101364 | Davenport | Mar 1992 | A |
5453764 | Inagaki | Sep 1995 | A |
5577191 | Bonomi | Nov 1996 | A |
5659793 | Escobar et al. | Aug 1997 | A |
5708845 | Wistendahl et al. | Jan 1998 | A |
5715018 | Fasciano et al. | Feb 1998 | A |
5751883 | Ottesen et al. | May 1998 | A |
5828370 | Moeller et al. | Oct 1998 | A |
5880722 | Brewer | Mar 1999 | A |
5886692 | Brewer | Mar 1999 | A |
6005678 | Higashida et al. | Dec 1999 | A |
6027257 | Richards et al. | Feb 2000 | A |
6088722 | Herz et al. | Jul 2000 | A |
6154600 | Newman | Nov 2000 | A |
6157771 | Brewer | Dec 2000 | A |
6173317 | Chaddha et al. | Jan 2001 | B1 |
6181883 | Oswal | Jan 2001 | B1 |
6192183 | Taniguchi | Feb 2001 | B1 |
6201925 | Brewer | Mar 2001 | B1 |
6262777 | Brewer | Jul 2001 | B1 |
6263150 | Okada | Jul 2001 | B1 |
6285361 | Brewer | Sep 2001 | B1 |
6317141 | Pavley et al. | Nov 2001 | B1 |
6317147 | Tanaka | Nov 2001 | B1 |
6357042 | Srinivasan et al. | Mar 2002 | B2 |
6400886 | Brewer | Jun 2002 | B1 |
6414686 | Protheroe | Jul 2002 | B1 |
6453459 | Brodersen | Sep 2002 | B1 |
6493872 | Rangan et al. | Dec 2002 | B1 |
6496872 | Katz et al. | Dec 2002 | B1 |
6496981 | Wistendahl et al. | Dec 2002 | B1 |
6504990 | Abecassis | Jan 2003 | B1 |
6546188 | Ishii et al. | Apr 2003 | B1 |
6580437 | Liou et al. | Jun 2003 | B1 |
6587119 | Anderson | Jul 2003 | B1 |
6597375 | Yawitz | Jul 2003 | B1 |
6631522 | Erdelyi | Oct 2003 | B1 |
6647061 | Panusopone | Nov 2003 | B1 |
6661430 | Brewer | Dec 2003 | B1 |
6698020 | Zigmond et al. | Feb 2004 | B1 |
6710785 | Asai | Mar 2004 | B1 |
6714216 | Abe | Mar 2004 | B2 |
6774908 | Bates et al. | Aug 2004 | B2 |
6973130 | Wee et al. | Dec 2005 | B1 |
7027102 | Sacca | Apr 2006 | B2 |
7055168 | Errico | May 2006 | B1 |
7158676 | Rainsford | Jan 2007 | B1 |
7337403 | Pavley et al. | Feb 2008 | B2 |
7421729 | Zenoni | Sep 2008 | B2 |
7516129 | Risberg et al. | Apr 2009 | B2 |
7559017 | Datar | Jul 2009 | B2 |
7818763 | Sie et al. | Oct 2010 | B2 |
7877689 | Gilley | Jan 2011 | B2 |
8141111 | Gilley | Mar 2012 | B2 |
8145528 | Gilley | Mar 2012 | B2 |
8171509 | Girouard et al. | May 2012 | B1 |
8260656 | Harbick | Sep 2012 | B1 |
8724969 | Gilley | May 2014 | B2 |
8739205 | Gilley et al. | May 2014 | B2 |
8755673 | Gilley | Jun 2014 | B2 |
8949899 | Errico | Feb 2015 | B2 |
9330723 | Gilley et al. | May 2016 | B2 |
9648281 | Gilley et al. | May 2017 | B2 |
9653120 | Gilley et al. | May 2017 | B2 |
9654735 | Gilley et al. | May 2017 | B2 |
9934819 | Gilley | Apr 2018 | B2 |
9940971 | Gilley | Apr 2018 | B2 |
9947365 | Gilley | Apr 2018 | B2 |
10090019 | Gilley | Oct 2018 | B2 |
10192587 | Gilley | Jan 2019 | B2 |
10491935 | Gilley et al. | Nov 2019 | B2 |
10504558 | Gilley | Dec 2019 | B2 |
10510376 | Gilley | Dec 2019 | B2 |
10594981 | Gilley et al. | Mar 2020 | B2 |
10650863 | Gilley et al. | May 2020 | B2 |
10672429 | Gilley et al. | Jun 2020 | B2 |
10789986 | Gilley et al. | Sep 2020 | B2 |
10796722 | Gilley | Oct 2020 | B2 |
10863224 | Gilley | Dec 2020 | B2 |
10950273 | Gilley | Mar 2021 | B2 |
10958876 | Gilley | Mar 2021 | B2 |
11153614 | Gilley et al. | Oct 2021 | B2 |
11589087 | Gilley et al. | Feb 2023 | B2 |
11626141 | Gilley | Apr 2023 | B2 |
11706388 | Gilley et al. | Jul 2023 | B2 |
11930227 | Gilley et al. | Mar 2024 | B2 |
20010023436 | Srinivasan et al. | Sep 2001 | A1 |
20020013948 | Aguayo, Jr. et al. | Jan 2002 | A1 |
20020016961 | Goode | Feb 2002 | A1 |
20020028060 | Murata et al. | Mar 2002 | A1 |
20020053078 | Holtz et al. | May 2002 | A1 |
20020063737 | Feig et al. | May 2002 | A1 |
20020065678 | Peliotis et al. | May 2002 | A1 |
20020069218 | Sull et al. | Jun 2002 | A1 |
20020069405 | Chapin | Jun 2002 | A1 |
20020073417 | Kondo et al. | Jun 2002 | A1 |
20020100042 | Khoo | Jun 2002 | A1 |
20020092019 | Marcus | Jul 2002 | A1 |
20020112249 | Hendricks | Aug 2002 | A1 |
20020131511 | Zenoni | Sep 2002 | A1 |
20020144262 | Plotnick | Oct 2002 | A1 |
20020144263 | Eldering et al. | Oct 2002 | A1 |
20020145622 | Kauffman et al. | Oct 2002 | A1 |
20020152117 | Cristofalo | Oct 2002 | A1 |
20020156829 | Yoshimine | Oct 2002 | A1 |
20020156842 | Signes et al. | Oct 2002 | A1 |
20030028873 | Lemmons | Feb 2003 | A1 |
20030093790 | Logan | May 2003 | A1 |
20030093792 | Labeeb et al. | May 2003 | A1 |
20030115552 | Jahnke | Jun 2003 | A1 |
20030121058 | Dimitrova | Jun 2003 | A1 |
20030149975 | Eldering | Aug 2003 | A1 |
20030156824 | Lu | Aug 2003 | A1 |
20030191816 | Landress et al. | Oct 2003 | A1 |
20030225641 | Gritzmacher et al. | Dec 2003 | A1 |
20040008970 | Junkersfeld | Jan 2004 | A1 |
20040021685 | Denoue | Feb 2004 | A1 |
20040030599 | Sie et al. | Feb 2004 | A1 |
20040096110 | Yogeshwar et al. | May 2004 | A1 |
20040103429 | Carlucci et al. | May 2004 | A1 |
20040133909 | Ma | Jun 2004 | A1 |
20040128317 | Sull et al. | Jul 2004 | A1 |
20040131330 | Wilkins et al. | Jul 2004 | A1 |
20040133924 | Wilkins et al. | Jul 2004 | A1 |
20040139233 | Kellerman et al. | Jul 2004 | A1 |
20040158858 | Paxton et al. | Aug 2004 | A1 |
20040163101 | Swix et al. | Aug 2004 | A1 |
20040187160 | Cook et al. | Sep 2004 | A1 |
20040193488 | Khoo | Sep 2004 | A1 |
20040194128 | McIntyre et al. | Sep 2004 | A1 |
20040199923 | Russek | Oct 2004 | A1 |
20040268223 | Tojo | Dec 2004 | A1 |
20050058460 | Wang | Mar 2005 | A1 |
20050097599 | Plotnick et al. | May 2005 | A1 |
20050137958 | Huber | Jun 2005 | A1 |
20050154679 | Bielak | Jul 2005 | A1 |
20050210145 | Kim et al. | Sep 2005 | A1 |
20050220439 | Carton et al. | Oct 2005 | A1 |
20050229231 | Lippincott | Oct 2005 | A1 |
20050262539 | Barton et al. | Nov 2005 | A1 |
20050283754 | Vignet | Dec 2005 | A1 |
20060013554 | Poslinski | Jan 2006 | A1 |
20060015904 | Marcus | Jan 2006 | A1 |
20060020961 | Chiu | Jan 2006 | A1 |
20060026655 | Perez | Feb 2006 | A1 |
20060098941 | Abe et al. | May 2006 | A1 |
20060120689 | Baxter | Jun 2006 | A1 |
20060179453 | Kadie et al. | Aug 2006 | A1 |
20060212897 | Li et al. | Sep 2006 | A1 |
20060224940 | Lee | Oct 2006 | A1 |
20060239648 | Varghese | Oct 2006 | A1 |
20060248558 | Barton | Nov 2006 | A1 |
20060267995 | Radloff et al. | Nov 2006 | A1 |
20070124762 | Chickering et al. | May 2007 | A1 |
20070157228 | Bayer et al. | Jul 2007 | A1 |
20070157242 | Cordray | Jul 2007 | A1 |
20080212937 | Son | Sep 2008 | A1 |
20080219638 | Haot et al. | Sep 2008 | A1 |
20090077580 | Konig et al. | Mar 2009 | A1 |
20090262749 | Graumann et al. | Oct 2009 | A1 |
20110116760 | Gilley | May 2011 | A1 |
20120251083 | Svendsen et al. | Oct 2012 | A1 |
20140212109 | Gilley | Jul 2014 | A1 |
20140212111 | Gilley | Jul 2014 | A1 |
20160099024 | Gilley | Apr 2016 | A1 |
20170084308 | Gilley et al. | Mar 2017 | A1 |
20170180771 | Gilley et al. | Jun 2017 | A1 |
20170223391 | Gilley et al. | Aug 2017 | A1 |
20170229148 | Gilley et al. | Aug 2017 | A1 |
20180158486 | Gilley | Jun 2018 | A1 |
20180204598 | Gilley | Jul 2018 | A1 |
20180366161 | Gilley | Dec 2018 | A1 |
20190122701 | Gilley et al. | Apr 2019 | A1 |
20200037016 | Gilley et al. | Jan 2020 | A1 |
20200118592 | Gilley et al. | Apr 2020 | A1 |
20200154079 | Gilley et al. | May 2020 | A1 |
20200251147 | Gilley et al. | Aug 2020 | A1 |
20200402540 | Gilley | Dec 2020 | A1 |
20200411057 | Gilley | Dec 2020 | A1 |
20210193182 | Gilley et al. | Jun 2021 | A1 |
20210211610 | Gilley et al. | Jul 2021 | A1 |
20220038759 | Gilley et al. | Feb 2022 | A1 |
20220303504 | Gilley | Sep 2022 | A1 |
20230197113 | Gilley et al. | Jun 2023 | A1 |
20230199234 | Gilley et al. | Jun 2023 | A1 |
20230319235 | Gilley et al. | Oct 2023 | A1 |
Number | Date | Country |
---|---|---|
0526064 | Feb 1993 | EP |
1513151 | Mar 2005 | EP |
1667454 | Jun 2006 | EP |
WO9430008 | Dec 1994 | WO |
WO9739411 | Oct 1997 | WO |
WO9806098 | Feb 1998 | WO |
WO9926415 | May 1999 | WO |
WO0014951 | Mar 2000 | WO
WO0110127 | Feb 2001 | WO |
WO03085633 | Oct 2003 | WO |
WO2004104773 | Dec 2004 | WO |
WO2005018233 | Feb 2005 | WO |
Entry |
---|
Karidis et al., A Collaborative Working Paradigm for Distributed High-End Audio-Visual Content Creation, IEEE Int'l Conference on Multimedia Computing and Systems, 1999, (vol. 2) Jun. 7-11, 1999, pp. 328-332, IEEE #0-7695-0253-9. |
Distributed Video Production (DVP) project, EU ACTS Project (AC089), OFES 95.0493, Apr. 12, 1997, 7 pgs., Computer Vision Group, Geneva, Switzerland, at http://cui.unige.ch/~vision/ResearchProjects/DVP/. |
Rowe et al., The Berkeley Distributed Video-on-Demand System, 1996, 17 pgs., University of California, Berkeley, CA, at http://bmrc.berkeley.edu/research/publications/1996/NEC95.html#intro. |
Office Action for U.S. Appl. No. 11/592,901, dated Oct. 5, 2009, 13 pgs. |
Office Action for U.S. Appl. No. 11/439,593 dated Dec. 29, 2009, 12 pgs. |
Hurwicz et al., “Overview and Table of Contents”, Special Edition Using Macromedia Flash MX, Aug. 15, 2002, Que Publishing, 5 pgs. at http://proquest.safaribooksonline.com/0-7897-2762-5. |
Office Action for U.S. Appl. No. 11/439,593 dated Mar. 18, 2010, 14 pgs. |
European Search Report and Written Opinion for Application No. EP 06771233 dated May 12, 2010 and completed May 5, 2010, 10 pgs. |
Solid Base from Which to Launch New Digital Video Research Efforts, PR Newswire Association LLC, Mar. 19, 2003, 5 pgs. |
International Search Report and Written Opinion for International Patent Application No. PCT/US2007/023185, dated May 14, 2008, 8 pgs. |
International Preliminary Report on Patentability (Chapter I) for International Patent Application No. PCT/US2007/023185, dated May 5, 2009, 7 pgs. |
International Search Report and Written Opinion for International Patent Application No. PCT/US2006/020343, dated Sep. 16, 2008, 8 pgs. |
International Preliminary Report on Patentability (Chapter I) for International Patent Application No. PCT/US2006/020343, dated Oct. 14, 2008, 8 pgs. |
Office Action for U.S. Appl. No. 11/439,600, dated Aug. 11, 2010, 6 pgs. |
Office Action for U.S. Appl. No. 11/439,594, dated Aug. 12, 2010, 10 pgs. |
Office Action for U.S. Appl. No. 11/439,594, dated Jan. 19, 2011, 10 pgs. |
Office Action for U.S. Appl. No. 11/439,600, dated Jan. 26, 2011, 11 pgs. |
Office Action for U.S. Appl. No. 11/713,116, dated Mar. 17, 2011, 16 pgs. |
Office Action for U.S. Appl. No. 11/713,115, dated Mar. 23, 2011, 21 pgs. |
Gowans, “Cookies Tutorial, Part 1—Introduction to Cookies”, 2001, 3 pgs. at http://www.freewebmasterhelp.com/tutorials/cookies. |
Learn That, Apr. 2004, 2 pgs., WayBackMachine Internet Archive, at http://www.learnthat.com/courses/computer/attach/index.html. |
Revision Control, Mar. 2005, 7 pgs., WayBackMachine Internet Archive, at Wikipedia, http://en.wikipedia.org/wiki/Revision_control. |
Carden, Making a Slideshow—Macromedia Flash Tutorial, 2001, 10 pgs. at http://www.designertoday.com/Tutorials/Flash/939/Making.a.slideshow.Macromedia.Flash.Tutorial.aspx. |
“Control Remote Desktops through a Web Browser,” Winbook Tech Article, Feb. 2004, 2 pgs., www.winbookcorp.com, at http://replay.waybackmachine.org/20040205135404/http://winbookcorp.com/_technote/WBTA20000870.htm. |
European Search Report for European Patent Application No. 10195448.5, dated Mar. 9, 2011, 11 pgs. |
European Search Report for European Patent Application No. 10195472.5, dated Mar. 9, 2011, 10 pgs. |
Office Action for U.S. Appl. No. 11/713,115, dated Sep. 8, 2011, 23 pgs. |
Flickr, “Tags,” Dec. 30, 2004, 1 pg., WayBackMachine Internet Archive at http://www.flickr.com/photos/tags/. |
Office Action for U.S. Appl. No. 11/713,115, dated Mar. 13, 2012, 18 pgs. |
Examination Report for European Patent Application No. 10195472.5, dated Apr. 17, 2012, 6 pgs. |
Examination Report for European Patent Application No. 06771233.1, dated Mar. 27, 2012, 5 pgs. |
WebmasterWorld.com, “Ecommerce Forum: Using email address as a username,” Oct. 2006, 3 pgs. |
European Search Report for European Patent Application No. 10195448.5, dated Jul. 13, 2012, 5 pgs. |
Final Office Action for U.S. Appl. No. 11/713,115, dated Nov. 14, 2012, 29 pgs. |
Office Action for U.S. Appl. No. 13/011,002 dated Dec. 21, 2012, 19 pgs. |
Office Action for U.S. Appl. No. 13/011,002, dated Apr. 22, 2013, 19 pgs. |
Office Action for U.S. Appl. No. 11/439,600, dated May 30, 2013, 10 pgs. |
Office Action for U.S. Appl. No. 11/439,594, dated Jun. 14, 2013, 11 pgs. |
Office Action for U.S. Appl. No. 13/404,911, dated Aug. 8, 2013, 11 pgs. |
Office Action for U.S. Appl. No. 13/408,843, dated Oct. 7, 2013, 15 pages. |
Office Action for U.S. Appl. No. 13/408,843, dated May 7, 2014, 12 pgs. |
Office Action for U.S. Appl. No. 13/011,002, dated Aug. 27, 2014, 22 pgs. |
Office Action for U.S. Appl. No. 13/408,843, dated Aug. 28, 2014, 14 pgs. |
Office Action for U.S. Appl. No. 14/229,601, dated Nov. 24, 2014, 8 pgs. |
Office Action for U.S. Appl. No. 14/242,277, dated Nov. 25, 2014, 8 pgs. |
Office Action for U.S. Appl. No. 11/713,115, dated Dec. 2, 2014, 27 pgs. |
Duplicate file checking, I/O and Streams forum, CodeRanch, Nov. 2004, retrieved from <http://www.coderanch.com/t/277045/java-io/java/Duplicate-file-file-checking> and printed Oct. 21, 2014, 3 pgs. |
Office Action for U.S. Appl. No. 13/408,843, dated Dec. 10, 2014, 16 pgs. |
Summons to Attend Oral Proceedings pursuant to Rule 115(1) EPC for European Patent Application No. 10195448.5, dated Nov. 3, 2014, 4 pgs. |
Summons to Attend Oral Proceedings pursuant to Rule 115(1) EPC for European Patent Application No. 10195472.5, dated Dec. 10, 2014, 5 pgs. |
Office Action for U.S. Appl. No. 14/229,601, dated Mar. 16, 2015, 9 pgs. |
Office Action for U.S. Appl. No. 14/242,277, dated Mar. 25, 2015, 10 pgs. |
Office Action for U.S. Appl. No. 14/247,059, dated Jun. 3, 2015, 9 pgs. |
Office Action for U.S. Appl. No. 14/229,601, dated Jun. 12, 2015, 12 pgs. |
Office Action for U.S. Appl. No. 14/242,277, dated Jun. 24, 2015, 10 pgs. |
Office Action for U.S. Appl. No. 13/408,843, dated Jul. 1, 2015, 16 pgs. |
Office Action for U.S. Appl. No. 14/229,601, dated Oct. 5, 2015, 11 pgs. |
Office Action for U.S. Appl. No. 14/242,277, dated Oct. 6, 2015, 11 pgs. |
Office Action for U.S. Appl. No. 11/713,115, dated Oct. 5, 2015, 24 pgs. |
Final Office Action issued for U.S. Appl. No. 13/011,002, dated Dec. 16, 2015, 22 pgs. |
Final Office Action issued for U.S. Appl. No. 13/408,843, dated Jan. 13, 2016, 21 pgs. |
Office Action for U.S. Appl. No. 14/885,632, dated Feb. 12, 2016, 23 pgs. |
Office Action for U.S. Appl. No. 14/229,601, dated Mar. 18, 2016, 14 pgs. |
Office Action for U.S. Appl. No. 14/242,277, dated May 5, 2016, 16 pgs. |
Office Action for U.S. Appl. No. 14/968,425, dated Apr. 29, 2016, 8 pgs. |
Office Action for U.S. Appl. No. 14/885,632, dated Jun. 3, 2016, 26 pgs. |
Office Action for U.S. Appl. No. 13/408,843, dated Jun. 30, 2016, 24 pgs. |
European Search Report for Application No. EP 15189067.0 dated Jun. 20, 2016, 10 pgs. |
Office Action issued for U.S. Appl. No. 13/011,002, dated Sep. 8, 2016, 14 pages. |
Office Action for U.S. Appl. No. 14/229,601, dated Oct. 24, 2016, 15 pgs. |
Office Action for U.S. Appl. No. 14/242,277, dated Nov. 7, 2016, 15 pgs. |
Office Action for U.S. Appl. No. 14/968,425, dated Nov. 21, 2016, 12 pgs. |
Office Action for U.S. Appl. No. 13/011,002, dated Apr. 26, 2017, 15 pages. |
Office Action issued for U.S. Appl. No. 15/450,836, dated Sep. 13, 2017, 23 pages. |
Office Action issued for U.S. Appl. No. 13/011,002, dated Oct. 17, 2017, 16 pages. |
Office Action issued for U.S. Appl. No. 15/489,191, dated Mar. 7, 2018, 24 pages. |
Office Action issued for U.S. Appl. No. 15/450,836, dated Apr. 17, 2018, 34 pages. |
Office Action issued for U.S. Appl. No. 15/450,836, dated Oct. 2, 2018, 28 pages. |
Office Action issued for U.S. Appl. No. 15/891,885, dated Nov. 2, 2018, 13 pages. |
Office Action issued for U.S. Appl. No. 15/918,804, dated Nov. 9, 2018, 11 pages. |
Office Action issued for U.S. Appl. No. 15/365,308, dated Jan. 7, 2019, 11 pages. |
Office Action issued for European Patent Application No. 15189067.0, dated Dec. 20, 2018, 4 pages. |
Moritz et al., Understanding MPEG-4 Technology and Business Insights, Copyright 2005. |
Office Action issued for U.S. Appl. No. 16/113,636, dated Apr. 2, 2019, 8 pages. |
Kang et al., “Creating Walk-Through Images from a Video Sequence of a Dynamic Scene,” Teleoperators and Virtual Environments, Dec. 2004, 1 page. |
Office Action issued for U.S. Appl. No. 15/495,041, dated Jun. 24, 2019, 32 pages. |
Summons to Attend Oral Proceedings issued for European Patent Application No. 15189067.0, mailed Jul. 23, 2019, 5 pages. |
Office Action issued for U.S. Appl. No. 16/227,481, dated Sep. 30, 2019, 11 pages. |
Office Action issued for U.S. Appl. No. 15/495,041, dated Nov. 18, 2019, 44 pages. |
Office Action issued for U.S. Appl. No. 16/706,380, dated Jan. 22, 2020, 8 pages. |
Office Action issued for U.S. Appl. No. 16/595,137, dated Feb. 7, 2020, 15 pages. |
Office Action issued for U.S. Appl. No. 15/495,041, dated Jul. 23, 2020, 14 pages. |
Office Action issued for U.S. Appl. No. 16/744,840, dated Jul. 22, 2020, 5 pages. |
Office Action issued for U.S. Appl. No. 16/854,739, dated Apr. 12, 2021, 8 pages. |
Office Action issued for U.S. Appl. No. 17/093,727, dated May 25, 2021, 16 pages. |
Notice of Allowance issued for U.S. Appl. No. 16/854,739, dated Jul. 7, 2021, 7 pages. |
Office Action issued for U.S. Appl. No. 17/010,731 dated Sep. 2, 2021, 13 pages. |
Office Action issued for U.S. Appl. No. 17/021,479 dated Sep. 2, 2021, 13 pages. |
Office Action issued for U.S. Appl. No. 17/209,101 dated Nov. 23, 2021, 11 pages. |
Office Action issued for U.S. Appl. No. 17/010,731 dated Apr. 1, 2022, 14 pages. |
Office Action issued for U.S. Appl. No. 17/021,479 dated Mar. 28, 2022, 14 pages. |
Notice of Allowance issued for U.S. Appl. No. 17/209,101 dated Mar. 18, 2022, 7 pages. |
Office Action issued for U.S. Appl. No. 17/021,479, dated Jul. 22, 2022, 17 pages. |
Office Action issued for U.S. Appl. No. 17/196,087, dated Aug. 12, 2022, 37 pages. |
Office Action issued for U.S. Appl. No. 17/502,395, dated Sep. 14, 2022, 8 pages. |
Office Action issued for U.S. Appl. No. 17/836,860 dated Oct. 13, 2022, 8 pages. |
Office Action issued for U.S. Appl. No. 17/010,731 dated Oct. 27, 2022, 20 pages. |
Notice of Allowance issued for U.S. Appl. No. 17/021,479 dated Nov. 17, 2022, 10 pages. |
Notice of Allowance issued for U.S. Appl. No. 17/502,395 dated Dec. 15, 2022, 7 pages. |
Final Office Action issued for U.S. Appl. No. 17/196,087 dated Jan. 10, 2023, 40 pages. |
Office Action issued for U.S. Appl. No. 17/010,731, dated May 16, 2023, 21 pages. |
Notice of Allowance issued for U.S. Appl. No. 17/196,087 mailed Jul. 20, 2023, 14 pages. |
Office Action issued for U.S. Appl. No. 18/170,764 mailed Jul. 7, 2023, 14 pages. |
Office Action issued for U.S. Appl. No. 18/171,287 mailed Jul. 11, 2023, 10 pages. |
Office Action issued for U.S. Appl. No. 17/010,731 mailed Sep. 8, 2023, 9 pages. |
Notice of Allowance issued for U.S. Appl. No. 18/171,287, mailed Oct. 30, 2023, 9 pages. |
Notice of Allowance issued for U.S. Appl. No. 18/309,557, mailed Nov. 30, 2023, 9 pages. |
Notice of Allowance issued for U.S. Appl. No. 17/010,731, mailed Dec. 29, 2023, 9 pages. |
Office Action issued for U.S. Appl. No. 18/170,764 mailed Dec. 29, 2023, 12 pages. |
Notice of Allowance issued for U.S. Appl. No. 17/196,087, mailed Feb. 22, 2024, 11 pages. |
Number | Date | Country | |
---|---|---|---|
20210058662 A1 | Feb 2021 | US |
Number | Date | Country | |
---|---|---|---|
60683662 | May 2005 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16595137 | Oct 2019 | US |
Child | 17093727 | US | |
Parent | 15450836 | Mar 2017 | US |
Child | 16595137 | US | |
Parent | 13408843 | Feb 2012 | US |
Child | 15450836 | US | |
Parent | 11592901 | Nov 2006 | US |
Child | 13408843 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 11439593 | May 2006 | US |
Child | 11592901 | US | |
Parent | 11439594 | May 2006 | US |
Child | 11439593 | US | |
Parent | 11439600 | May 2006 | US |
Child | 11439594 | US |