The present disclosure relates to technology for curating filters for audiovisual content.
Users are increasingly turning to the Internet for their entertainment needs. In recent years, online content providers like YouTube™, Netflix™, and Amazon Prime Instant Video™ have experienced explosive growth due to user demand for streaming of multimedia content. Many online content providers now offer unlimited streaming of the digital content they provide. Users can also rent movies online and stream the content on-demand to their consumption devices, such as smartphones, tablets, laptops, Internet-enabled televisions, etc.
These online content providers generally do not, however, provide users with a way to personalize the playback of the content, such as suppressing mature or otherwise offensive content that is often present in offered multimedia content. Some hardware solutions, e.g., specialized DVD players, enable personalized playback of movies, as allowed by the Family Entertainment and Copyright Act, but these solutions suffer from multiple shortcomings: they are limited to DVD playback, they require users to purchase expensive dedicated hardware, and they do not allow users to collectively curate filters for the audiovisual content, among other things.
Further, tagging a movie to identify filterable content can be burdensome and time-consuming. What is needed is a way for groups, or communities, to collaborate in tagging movies and other content, and also a way to curate collaboratively prepared movie tagging, i.e., to audit and ensure its quality.
This application discloses a filter curation platform that enables users to curate and access custom filters that are usable to adapt the playback of audiovisual content.
In one embodiment, video tags may be prepared for a movie or other audio, visual, or audiovisual content. A video tag may identify a start time and a stop time of a segment for possible filtering, and may further identify one or more categories of filterable content associated with the identified segment. A video map is a collection of one or more video tags for a particular movie, and effectively tags segments of filterable content in the movie.
A video viewer, via a media player interface, may define filters using a video map of a movie. The video viewer may customize the filter to display (or make audible) some categories or specific segments of filterable content, but not others.
The disclosed curation platform may enable users, who may have different roles, to create one or more video tags for a movie, and thereby create a full or partial video map for the movie. Roles may include video viewer, video tagger, video reviewer, and video publisher.
A video tagger is a user who may create video maps for audiovisual content. A video reviewer is a user who may review video maps for mistakes, make corrections, and provide feedback on the video maps created by video taggers. A video publisher is a user who may prepare, finalize, and publish video maps to a multimedia portal.
The video-tagging process may be collaborative and iterative. For example, multiple video taggers may tag the same portions of a movie, and a video reviewer may access the video maps from the multiple video taggers. In another embodiment, a video reviewer may review a video map created by a video tagger and send the video map to a different video tagger for further tagging. The process may be iterative in many ways, so that multiple video taggers, video reviewers, and video publishers may prepare, review, edit, and pass among each other video maps in various orders and workflows.
Video viewers, taggers, reviewers, and publishers may assign scores to video tags and/or video maps, reflective of the quality of a video tag or video map. The disclosed curation process may also employ an incentive system to motivate users with various roles to perform their roles quickly and with high quality.
The curation system disclosed herein may further be configured to automatically generate video tags and video maps.
Audiovisual content, as referred to herein, includes any audiovisual content available via a computer network, e.g., the Internet. It should be understood that the technology herein is applicable also to other forms of media including streaming audio and stored audio and/or video (e.g., read from a non-transitory physical medium such as a hard drive, flash memory, CD, DVD, Blu-ray™, etc.). In some embodiments, servers of various content providers may host the audiovisual content and stream it in real-time to the client devices of various users via the network. YouTube™ is an exemplary content provider. The audiovisual content may be freely available or may be paid content requiring a user account and/or a rental, subscription, or purchase to access.
In some embodiments, the technology facilitates a process for curating user-defined filters for digital audiovisual content. The audiovisual content may be embodied by a multimedia file that includes audio and/or visual content. Audio content (or audio for short) includes only content that is audible during playback. Visual content (or video for short) includes content that is audible and visual, or just visual, during playback.
A video tag (also referred to as a VidTag) is a short description of a segment/clip of a multimedia file. A video tag includes a type, start time, end time, and a category. Examples of the different types of video tags include audio, video, and audiovisual, in which audio refers to the audio content, video refers to the video content, and audiovisual refers to both the audio and the video content. A video tag start time refers to the start time of the segment relative to the total time of the multimedia file, and an end time refers to the end time of the segment relative to the total time of the multimedia file. In alternate embodiments, a video tag start time or end time may be relative to a time other than the total time of the multimedia file, as long as the video tag start time or end time identifies the beginning or ending, respectively, of a segment. Examples of video tag categories may include positive and negative categories, such as Action, Dramatic, Scary, Alcohol/Drugs, Profane/Crude Language, Sex/Nudity, Violence, Other (e.g., Negative, Positive) Elements, etc.
A video map (also referred to as a VidMap) is a collection of video tags that describe the content of a multimedia file. It is analogous to a review of the multimedia file content. In some embodiments, a user, via a media player interface (also referred to as a VidPlayer), may define filters using a video map of the multimedia file displayed via the VidPlayer.
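By way of a non-limiting illustration, a video tag and video map might be modeled as follows. The type names and fields below are assumptions made for illustration only and do not reflect an actual schema of the platform:

```typescript
// Hypothetical model of a video tag (VidTag) and video map (VidMap).
// Times are in seconds relative to the total running time of the file.

type TagType = "audio" | "video" | "audiovisual";

type TagCategory =
  | "Action"
  | "Dramatic"
  | "Scary"
  | "Alcohol/Drugs"
  | "Profane/Crude Language"
  | "Sex/Nudity"
  | "Violence"
  | "Other";

interface VidTag {
  type: TagType;     // which content the tag applies to
  startTime: number; // start of the segment, in seconds
  endTime: number;   // end of the segment, in seconds
  category: TagCategory;
}

// A video map is a collection of video tags describing one multimedia file.
interface VidMap {
  mediaId: string; // identifies the multimedia file
  tags: VidTag[];
}
```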
The curation platform may use different user roles to curate the custom filters, such as, but not limited to, a video viewer (also referred to as a VidViewer), a video tagger (also referred to as a VidTagger), a video reviewer (also referred to as a VidReviewer), and a video publisher (also referred to as a VidPublisher). A video viewer is a user who can access video maps and create filters to use during playback of audiovisual content. For instance, a video viewer may create various filters on a per-show basis by referring to the video map associated with the show and defining filter settings from his own selection of video tags. In some embodiments, any user may act as a video viewer.
A video tagger is a user who may create video maps for audiovisual content. A video reviewer is a user who may review video maps for mistakes, make corrections, and provide feedback on the video maps created by video taggers. A video publisher is a user who may prepare, finalize, and publish video maps to a multimedia portal, such as the portals accessible at www.vidangel.com, www.youtube.com, etc. In some embodiments, a user must be granted authorization (e.g., by an administrator of the curation platform) before acting in the role of video viewer, video tagger, video reviewer, and video publisher. For instance, the roles various users have been granted may be stored in a user profile associated with the user in the data store of the curation system, and may be referenced during the user login to determine if the user is authorized to act in that role.
A user may access the multimedia portal to consume different audiovisual content (e.g., movies, shows, etc.) via a media player provided by the portal, such as an embedded media player capable of playing back the audiovisual content. The media player may be configured (e.g., using an API) to reference the video map created using the curation platform and augment the playback based on the filters defined by the user.
Network 702 may include any number of networks and/or network types. For example, network 702 may include, but is not limited to, one or more local area networks (LANs), wide area networks (WANs) (e.g., the Internet), virtual private networks (VPNs), mobile (cellular) networks (e.g., mobile network 703), wireless wide area networks (WWANs), WiMAX® networks, Bluetooth® communication networks, peer-to-peer networks, other interconnected data paths across which multiple devices may communicate, various combinations thereof, etc. Data transmitted by network 702 may include packetized data (e.g., Internet Protocol (IP) data packets) that is routed to designated computing devices coupled to network 702. In some implementations, network 702 may include a combination of wired and wireless networking software and/or hardware that interconnects the computing devices of system 700. For example, network 702 may include packet-switching devices that route the data packets to the various computing devices based on information included in a header of the data packets.
Mobile network 703 may include a cellular network having distributed radio networks and a hub. In some implementations, client devices 706a . . . 706n may send and receive signals to and from a transmission node of mobile network 703 over one or more of a control channel, a voice channel, a data channel, etc. In some implementations, one or more client devices 706a . . . 706n may connect to network 702 via a wireless wide area network (WWAN) of mobile network 703. For instance, mobile network 703 may route the network data packets sent and received by client device 706a to the other entities 706n, 716, 722, 730, and/or 734 that are connected to network 702 (e.g., via the Internet, a VPN, etc.). Mobile network 703 and client devices 706 may use a multiplexing protocol or a combination of multiplexing protocols to communicate, including, for example, FDMA, CDMA, SDMA, WDMA, or any derivative protocols, etc. Mobile network 703 and client devices 706 may also employ multiple-input and multiple-output (MIMO) channels to increase the data throughput over the signal lines coupling mobile network 703 and client devices 706. Mobile network 703 may be any generation mobile phone network. In some instances, mobile network 703 may be a 2G or 2.5G GSM, IS-95, etc., network; a 3G UMTS, IS-2000, etc., network; a 4G HSPA+, 3GPP LTE, WiMAX™, etc., network; etc. In some instances, mobile network 703 may include a backwards-compatible multi-generational network that supports two or more technology standards.
Client devices 706a . . . 706n (also referred to individually and collectively as 706) are computing devices having data processing and communication capabilities. In some embodiments, a client device 706 may include a processor (e.g., virtual, physical, etc.), a memory, a power source, a network interface, and/or other software and/or hardware components, such as a display, graphics processor, wireless transceivers, keyboard, camera, sensors, firmware, operating systems, drivers, various physical connection interfaces (e.g., USB, HDMI, etc.), etc. Client devices 706a . . . 706n may couple to and communicate with one another and the other entities of system 700 via network 702 using a wireless and/or wired connection.
Examples of client devices 706 may include, but are not limited to, mobile phones (e.g., feature phones, smart phones, etc.), tablets, laptops, desktops, netbooks, server appliances, servers, virtual machines, TVs, set-top boxes, media streaming devices, portable media players, navigation devices, personal digital assistants, etc. While two or more client devices 706 are depicted, system 700 may include any number of client devices 706.
In the depicted implementation, client devices 706a . . . 706n respectively contain instances 708a . . . 708n of a user application (also referred to individually and collectively as 708). User application 708 may be storable in a memory 804 and executable by a processor 802 of a client device 706.
In some implementations, user application 708 may generate and present various user interfaces for performing various acts and/or functionality, which may in some cases be based at least in part on information received from curation server 716 and/or media distribution server 722, etc., via network 702. In some implementations, user application 708 is code operable in a web browser, a native application (e.g., a mobile app), a combination of both, etc. Example interfaces that can be rendered and displayed by user application 708 are depicted in the accompanying figures and described elsewhere herein.
Curation server 716 and media distribution server 722 may include one or more computing devices having data processing, storing, and communication capabilities. For example, these entities 716 and 722 may include one or more hardware servers, virtual servers, server arrays, storage devices and/or systems, etc., and/or may be centralized or distributed/cloud-based. In some implementations, entities 716 and/or 722 may include one or more virtual servers, which operate in a host server environment and access the physical hardware of the host server including, for example, a processor, memory, storage, network interfaces, etc., via an abstraction layer (e.g., a virtual machine manager).
In the depicted implementation, curation server 716 may include a curation engine 718 operable to curate video maps, tags, and filters; facilitate collaboration between various stakeholders during the curation process; provide curation-related data (e.g., filters and video maps) to other entities of the system for use thereby to personalize playback of audiovisual content; provide users with a media portal providing access to media content; etc. Curation engine 718 may send data to and receive data from the other entities of the system, such as client devices 706 and media distribution server 722. It should be understood that curation server 716 is not limited to providing the above-noted acts and/or functionality and may include other network-accessible services. In addition, while a single curation server 716 is depicted, system 700 may include any number of curation servers 716.
In some embodiments, the platform may include various access levels, such as a community level and a premium level. The community level may be free to all users and the premium level may provide users access to premium content, video maps, parental controls, and filters in exchange for a payment, e.g., monthly or annual subscription fee.
A filter is a user-defined collection of one or more audio and/or video lineups. An audio lineup is a set of audio clips from a multimedia file (e.g., a movie) that are to be played during playback of the multimedia file by the media player. A video lineup is a set of video clips from the multimedia file that are to be played during playback of the multimedia file by the media player.
The clips included in the respective audio and video lineups for a given filter are selected based on the filter settings set by the system or a user. For instance, via a user interface a user may specify the type of content he wishes to exclude from the playback of a multimedia file, and the curation platform may select the clips to include in the respective lineups using the video tags associated with the video map for the multimedia file and the content settings specified by the user. As a further example, using toggles 110a-110j and 122a-122r depicted in an example filter interface, a user may switch the filtering of individual categories or sub-categories of tagged content on or off.
Each lineup, whether audio or video, may be comprised of data describing various sequences of clips from the multimedia file that match the filter settings. In some embodiments, each clip may be denoted by timecodes corresponding to start and end points of the clip.
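As an illustrative sketch of this structure, a clip may be represented as a pair of timecodes, and a lineup may be derived as the complement of the excluded segments over the file's running time. The names and the derivation below are assumptions for illustration, not the platform's actual implementation:

```typescript
// Hypothetical shape of a filter and its lineups; a clip is a pair of
// timecodes (in seconds), and a lineup lists the clips that should play.
interface Clip {
  start: number;
  end: number;
}

interface Filter {
  audioLineup: Clip[];
  videoLineup: Clip[];
}

// Sketch: derive a lineup as the complement of the excluded segments over
// the file's running time, assuming `excluded` holds the segments that the
// user's filter settings screen out.
function buildLineup(duration: number, excluded: Clip[]): Clip[] {
  const sorted = [...excluded].sort((a, b) => a.start - b.start);
  const lineup: Clip[] = [];
  let cursor = 0;
  for (const seg of sorted) {
    if (seg.start > cursor) {
      lineup.push({ start: cursor, end: seg.start });
    }
    cursor = Math.max(cursor, seg.end);
  }
  if (cursor < duration) {
    lineup.push({ start: cursor, end: duration });
  }
  return lineup;
}
```

For example, buildLineup(120, [{ start: 30, end: 45 }]) yields the clips 0-30 and 45-120, so the tagged segment between them is never part of the lineup and is implicitly skipped or muted.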
During playback, the user application 708 configures the multimedia player to play back the multimedia file in a customized fashion based on the clips included in the audio and/or video lineups. More specifically, during playback of the multimedia file, the multimedia player renders audio for a given location in the multimedia file if that location corresponds to an audio clip in the audio lineup. In other words, the audio clips dictate which portions of the multimedia file are audibly played back. This results in the audio content between the audio clips not being rendered (e.g., being muted), and thereby implicitly eliminates the audio content the user does not want to hear.
Similarly, for video, during playback of the multimedia file, the multimedia player renders video for a given location in the multimedia file if that location corresponds to a video clip in the video lineup. In other words, the video clips dictate which portions of the multimedia file are visually played back. This results in the video content between the video clips not being rendered (e.g., being skipped, blanked, etc.), and thereby implicitly eliminates the video content the user does not want to see.
In some embodiments, the multimedia player determines whether a given location in the multimedia file being played back corresponds to an audio or video clip in the lineups by comparing the timecode associated with the current playback location to the timecodes of the audio and video clips in the lineups of the filter.
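A minimal sketch of this comparison follows, assuming the player exposes its current timecode on each time update; the wiring targets an HTML5 media element and is illustrative only, not the platform's actual player code:

```typescript
// (Clip and Filter as in the earlier sketch.)
interface Clip { start: number; end: number }
interface Filter { audioLineup: Clip[]; videoLineup: Clip[] }

// The player renders a location only if it falls inside a lineup clip.
function shouldRender(timecode: number, lineup: Clip[]): boolean {
  return lineup.some((clip) => timecode >= clip.start && timecode < clip.end);
}

// Sketch of wiring the check to an HTML5 media element: mute audio while
// outside the audio lineup, and seek past gaps in the video lineup.
// Assumes the video lineup is sorted by start time.
function attachFilter(player: HTMLVideoElement, filter: Filter): void {
  player.addEventListener("timeupdate", () => {
    const t = player.currentTime;
    player.muted = !shouldRender(t, filter.audioLineup);
    if (!shouldRender(t, filter.videoLineup)) {
      const next = filter.videoLineup.find((clip) => clip.start > t);
      if (next) {
        player.currentTime = next.start; // skip the filtered gap
      }
    }
  });
}
```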
It should be understood that in some embodiments, curation engine 718 may include instructions executable by processor 802 to automatically map and tag videos. For example, curation engine 718 may analyze a video (e.g., video data, audio data, etc.) for various patterns that match known patterns associated with certain types of content (objectionable, desirable, etc.) and may generate tags for the sections of the video corresponding to those patterns automatically and store them in association with a video map for that video. For instance, the analysis algorithms used to automatically generate tags may include known voice recognition and image recognition algorithms. When generating the tags, curation engine 718 may in some cases use the descriptors for the known patterns in the video tags to provide context. In some cases, the automatically-generated video tags may then be reviewed and published using the crowd-sourced curation process described herein. This is advantageous as it helps to ensure the accuracy of the automatically-generated tags. In some embodiments, curation engine 718 may monitor edits/inputs made during the curation process by the various different users, store tracking data for those edits/inputs, and then use the data to improve the accuracy of the video tags being generated. Curation engine 718 may use any known machine learning techniques for improving the automatic video map and tag generation process.
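As a toy illustration of this idea (the keyword list, padding, and category below are assumptions made for illustration; the recognition algorithms contemplated herein are not limited to keyword matching), automatic audio tags might be generated from a time-aligned speech-recognition transcript as follows:

```typescript
// Toy sketch: emit an audio tag around each transcript word that matches
// a known pattern list. Keywords, padding, and category are placeholders.
interface TranscriptWord {
  word: string;
  start: number; // seconds
  end: number;   // seconds
}

interface VidTag { // as in the earlier sketch
  type: "audio" | "video" | "audiovisual";
  startTime: number;
  endTime: number;
  category: string;
}

const FLAGGED_WORDS = new Set(["darn", "heck"]); // placeholder patterns

function autoTag(words: TranscriptWord[]): VidTag[] {
  const PAD = 0.5; // pad each match by half a second on either side
  return words
    .filter((w) => FLAGGED_WORDS.has(w.word.toLowerCase()))
    .map((w) => ({
      type: "audio",
      startTime: Math.max(0, w.start - PAD),
      endTime: w.end + PAD,
      category: "Profane/Crude Language",
    }));
}
```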
In some embodiments, the curation platform may be implemented using a web server (e.g., Apache), a MySQL database cluster, and a PHP interpreter, although it should be understood that any other suitable solution stack may be used. In some embodiments, the web server may transmit formatted data files (e.g., HTML, XML, JSON), software objects (e.g., JavaScript objects), and presentation information (e.g., CSS style sheets), etc., to user application 708, and the user application may render these files to display the various interfaces discussed herein. In further embodiments, various information may be cached client-side, and user application 708 may refresh this data by requesting it from the other entities of system 700. Additional structure and functionality of curation engine 718 and media portal 728a are discussed further elsewhere herein.
Media distribution server 722 provides audiovisual content (e.g., multimedia files representing various movies, shows, amateur videos, etc.) stored in a data store to the other entities of system 700, such as one or more client devices 706. In some embodiments, media engine 724 included in media distribution server 722 includes software and/or hardware for cataloging and providing access to media content, such as audiovisual content, audio content, etc. In some embodiments, media engine 724 may include APIs for accessing the audiovisual content subscribed to, purchased by, bookmarked, etc., by a user. For instance, media portal 728a included in the curation server 716 may be capable of ingesting the audiovisual content associated with (e.g., rented by, purchased by, bookmarked by, etc.) various users to provide a customized portal through which the users may consume that audiovisual content.
Media distribution server 722 can cooperate with media portal 728 to provide an electronic resource to a user for consumption. As an example, media portal 728 may transmit a file (e.g., a webpage) to a client device 706 for display to user 712. The file may include code (e.g., an embedded HTML5, Flash, etc., media player) executable to receive an audiovisual content data stream from media engine 724 of media distribution server 722 and play it back to the user. In a further example, user application 708 may include a dedicated media player configured to receive and play content received from media distribution server 722. The audiovisual content may be stored as media objects in a media data store included in media distribution server 722, and transmitted to the one or more client devices 706 on demand, etc. Media distribution server 722 may be coupled to the media data store to access audiovisual content and other data stored in the media data store. In some embodiments, the audiovisual content may be streamed from media distribution server 722 via network 702. In other embodiments, a user can download an instance of the video and audio media objects from media distribution server 722 to a local repository for storage and local playback.
In some implementations, media portal 728, media engine 724, and/or curation engine 718 may require users 712 to be registered to access the functionality provided by them. For example, to access various functionality provided by media portal 728, media engine 724, and/or curation engine 718, a user 712 may be required to authenticate his/her identity (e.g., by confirming a valid electronic address). In some instances, these entities 728, 724, and/or 718 may interact with a federated identity server (not shown) to register/authenticate users 712. Once registered, these entities 728, 724, and/or 718 may require a user 712 seeking access to authenticate by inputting credentials in an associated user interface.
Additional acts, structure, and functionality of client devices 706, curation server 716, media distribution server 722, and their constituent components are described further elsewhere herein.
It should be understood that system 700 is provided by way of example, and that a variety of different system environments and configurations are contemplated and are within the scope of the present disclosure.
The computing system 800 may reflect the architecture of one or more of the computing devices discussed herein, such as a client device 706, curation server 716, and/or media distribution server 722, and may include a processor 802, a memory 804, a communication unit 808, and a data store 810 communicatively coupled by a bus 806.
Processor 802 may execute software instructions by performing various input/output, logical, and/or mathematical operations. Processor 802 may have various computing architectures to process data signals including, for example, a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture, and/or an architecture implementing a combination of instruction sets. Processor 802 may be physical and/or virtual, and may include a single core or plurality of processing units and/or cores. In some implementations, processor 802 may be capable of generating and providing electronic display signals to a display device, supporting the display of images, capturing and transmitting images, performing complex tasks including various types of feature extraction and sampling, etc. In some implementations, processor 802 may be coupled to memory 804 via bus 806 to access data and instructions therefrom and store data therein.
Memory 804 may store and provide access to data to the other components of computing system 800.
Memory 804 includes a non-transitory computer-usable (e.g., readable, writeable, etc.) medium, which can be an apparatus or device that can contain, store, communicate, propagate or transport instructions, data, computer programs, software, code, routines, etc., for processing by or in connection with processor 802. In some implementations, memory 804 may include one or more of volatile memory and non-volatile memory. For example, memory 804 may include, but is not limited to, one or more of a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, a discrete memory device (e.g., a PROM, FPROM, ROM), a hard disk drive, an optical disk drive (CD, DVD, Blu-ray™, etc.). It should be understood that memory 804 may be a single device or may include multiple types of devices and configurations.
Bus 806 can include a communication bus for transferring data between components of a computing device or between computing devices, a network bus system including network 702 or portions thereof, a processor mesh, a combination thereof, etc. In some implementations, curation engine 718, media engine 724, and/or media portal 728, and/or various other software operating on computing device 706 (e.g., an operating system, device drivers, etc.) may cooperate and communicate via a software communication mechanism implemented in association with bus 806. The software communication mechanism can include and/or facilitate, for example, inter-process communication, local function or procedure calls, remote procedure calls, an object broker (e.g., CORBA), direct socket communication (e.g., TCP/IP sockets) among software modules, UDP broadcasts and receipts, HTTP connections, etc. Further, any or all of the communication could be secure (e.g., SSH, HTTPS, etc.).
Communication unit 808 may include one or more interface devices (I/F) for wired and/or wireless connectivity with network 702. For instance, communication unit 808 may include, but is not limited to, CAT-type interfaces; wireless transceivers for sending and receiving signals using Wi-Fi™, Bluetooth®, cellular communications, etc.; USB interfaces; various combinations thereof; etc. Communication unit 808 may include radio transceivers (4G, 3G, 2G, etc.) for communication with the mobile network 703, and radio transceivers for Wi-Fi™ and close-proximity (e.g., Bluetooth®, NFC, etc.) connectivity. Communication unit 808 may connect to and send/receive data via mobile network 703, a public IP network of network 702, a private IP network of network 702, etc. In some implementations, communication unit 808 can link processor 802 to network 702, which may in turn be coupled to other processing systems. Communication unit 808 can provide other connections to network 702 and to other entities of system 700 using various standard network communication protocols, including, for example, those discussed elsewhere herein.
Data store 810 is an information source for storing and providing access to data. In some implementations, data store 810 may be coupled to components 802, 804, and 808 of computing system 800 via bus 806 to receive and provide access to data. In some implementations, data store 810 may store data received from other elements of system 700 including, for example, media engine 724 and/or user application 708, and may provide data access to these entities.
Data store 810 may be included in computing system 800 or in another computing device and/or storage system distinct from but coupled to or accessible by computing system 800. Data store 810 can include one or more non-transitory computer-readable mediums for storing the data. In some implementations, data store 810 may be incorporated with memory 804 or may be distinct therefrom. In some implementations, data store 810 may include a database management system (DBMS) operable on computing system 800. For example, the DBMS could include a structured query language (SQL) DBMS, a NoSQL DBMS, or various combinations thereof, etc. In some instances, the DBMS may store data in multi-dimensional tables comprised of rows and columns, and manipulate, i.e., insert, query, update and/or delete, rows of data using programmatic operations.
In some implementations, computing system 800 may take the form of a client device 706, which may further include a display 910, an input device 912, and/or a sensor 914. Display 910 may include any conventional display device configured to display electronic images and data output by client device 706 for presentation to user 712.
Input device 912 may include any device for inputting information into client device 706. In some implementations, input device 912 may include one or more peripheral devices. For example, input device 912 may include a keyboard (e.g., a QWERTY keyboard), a pointing device (e.g., a mouse or touchpad), a microphone, an image/video capture device (e.g., camera), etc. In some implementations, input device 912 may include a touch-screen display capable of receiving input from the one or more fingers of user 712. For instance, the functionality of input device 912 and display 910 may be integrated, and a user 712 of client device 706 may interact with client device 706 by contacting a surface of display 910 using one or more fingers. In this example, user 712 could interact with an emulated (i.e., virtual or soft) keyboard displayed on touch-screen display 910 by using fingers to contact the display in the keyboard regions.
The sensor 914 may include one or more sensing devices for detecting changes in the state of the client device 706 (e.g., movement, rotation, temperature, etc.). Example sensors may include, but are not limited to, accelerometers, gyroscopes, thermocouples, etc. The sensor may be coupled to bus 806 to send the signals describing the changes it detects to the other components of client device 706, which can use them to provide various functionality and information to user 712.
In response, at step 1042 curation engine 718 may provide the video map to video reviewer 1040 for review. For instance, video reviewer 1040 may log into the curation platform and, upon doing so, may receive a notification that the video map is ready for review, may search for and find the video map to be available for review, etc., and select to review the video map. At step 1044, while reviewing the video map, video reviewer 1040 may further configure the tags of the video map. For instance, video reviewer 1040 may correct any incorrect tags, input new tags, delete superfluous tags, etc., using a corresponding interface. User application 708 may then transmit the reviewer's edits to curation engine 718.
Similar to video reviewer 1040, at steps 1062, 1064, and 1066, video publisher 1060 may log in and review and edit the video map, and once satisfied, publish the video map at step 1068 via an associated user interface. In response, user application 708 may transmit the publication request to curation engine 718, which may flag the video map as available for use by video viewers (e.g., via the media portal 728b).
At steps 1082-1090, video viewer 1080 may select to configure filters for a given multimedia file, and in response, media portal 728b may provide a filter interface for personalizing the playback of the multimedia file. To generate the interface, curation engine 718 may group the various tags of the video map by category, sub-category, language type, etc., and the interface may associate each group with a specific filter toggle and a set of user-selectable settings. Using the interface, the video viewer may define the settings associated with different groups of tags from the video map, customize one or more of the settings (e.g., by toggling the corresponding filters), and then save the filter definition via the interface. In response, user application 708 may transmit the filter definition to media portal 728b. In response, at step 1082, media portal 728b may provide a video interface that includes a media player (e.g., an embedded object representing the media player). At step 1088, the media portal 728b may also provide the video map associated with the multimedia file and the filter definition configured by the user. At step 1090, the media player may then personalize the playback of the audiovisual content based on the video map and the filter definition. Should the user have any feedback regarding the personalized playback experience (e.g., the video map, filter definition, etc.), the user may enter the feedback into the video interface in an associated field and submit that feedback to media portal 728b, which may at step 1091 receive and store the feedback in data store 810. At step 1092, curation engine 718 may then provide the feedback to the relevant stakeholder(s) (e.g., the video publisher), who may then incorporate it into the video map using the editing features thereof.
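A minimal sketch of this grouping step follows, assuming the tag and map shapes from the earlier sketch; the function names are illustrative:

```typescript
// Sketch: group a video map's tags by category to drive the per-category
// toggles in the filter interface, then collect the segments the user's
// settings exclude. (VidMap/VidTag as in the earlier sketch.)
interface VidTag { type: string; startTime: number; endTime: number; category: string }
interface VidMap { mediaId: string; tags: VidTag[] }

function groupByCategory(map: VidMap): Map<string, VidTag[]> {
  const groups = new Map<string, VidTag[]>();
  for (const tag of map.tags) {
    const list = groups.get(tag.category) ?? [];
    list.push(tag);
    groups.set(tag.category, list);
  }
  return groups;
}

// `settings` maps a category name to true when the user filters it out.
function excludedSegments(map: VidMap, settings: Map<string, boolean>): VidTag[] {
  return map.tags.filter((tag) => settings.get(tag.category) === true);
}
```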
Video maps may be associated with a quality rating, referred to herein as a halo score, which may reflect feedback from video viewers and curators on the quality of the video map.
In some embodiments, when a viewer watches a community multimedia file, curation engine 718 may be configured by default to provide the most recent version of the video map to customize playback of the video. In situations where the most recent revision causes the video map's halo score to decrease, curation engine 718 may revert to providing the previous revision of the video map.
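A minimal sketch of this default-version logic, assuming revisions are stored in chronological order with their halo scores (the names are illustrative):

```typescript
// Sketch: serve the most recent revision of a video map unless that
// revision lowered the halo score, in which case fall back to the prior
// revision. Assumes at least one revision exists.
interface VidMapRevision {
  revision: number;
  haloScore: number;
}

function selectRevision(revisions: VidMapRevision[]): VidMapRevision {
  const latest = revisions[revisions.length - 1];
  const previous = revisions[revisions.length - 2];
  if (previous && latest.haloScore < previous.haloScore) {
    return previous; // the latest edit hurt quality; revert to prior map
  }
  return latest;
}
```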
In some embodiments, once the score of a particular video map reaches a certain threshold (e.g., four halos or higher), any further revisions may be curated using a process the same as or similar to the process for curating premium content described above.
In some embodiments, to keep the various stakeholders engaged, the process may provide various incentives to the stakeholders to help ensure that the filters curated by them and provided to video viewers are of the highest quality. For instance, curation engine 718 may be capable of tracking the activity of the different stakeholders and giving them a predetermined amount of credit for the different functions that they perform when curating the video maps and filters. For instance, for every video tag that a video tagger adds to a video map, curation engine 718 may attribute X points to that video tagger (e.g., by storing a corresponding entry in the data store 810). In another example, for every valid video tag change that a reviewer makes, curation engine 718 may attribute Y points to that video reviewer. After a given user accumulates a predetermined amount of points (e.g., 50,000), curation engine 718 may be configured to add an incentive to the user's account (e.g., a free video rental, premium subscription credit, etc.).
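A minimal sketch of such a point ledger, in which the point values (the X and Y above) and the reward threshold are placeholder assumptions:

```typescript
// Sketch: credit points per curation action and flag a reward when the
// balance crosses a threshold. Values below are placeholders.
const POINTS = { tagAdded: 10, validTagChange: 5 } as const;
const REWARD_THRESHOLD = 50_000;

const balances = new Map<string, number>();

function credit(userId: string, action: keyof typeof POINTS): number {
  const total = (balances.get(userId) ?? 0) + POINTS[action];
  balances.set(userId, total);
  if (total >= REWARD_THRESHOLD) {
    console.log(`${userId} qualifies for an incentive (e.g., a free rental)`);
  }
  return total;
}
```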
Curation engine 718 may also analyze the tracked activity of the various users (e.g., taggers, reviewers, publishers) to determine how much rework was required before the video maps for a multimedia file were successfully published. For instance, curation engine 718 can quantify the accuracy of the initial tags created by the video tagger based on the number of changes the video reviewer and/or video publisher made. Similarly, curation engine 718 can quantify the accuracy of the review by the video reviewer by determining the number of changes that the video publisher had to subsequently make to the video map to ready it for publishing.
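One plausible way to quantify this accuracy (an assumption for illustration; the disclosure does not fix a particular formula) is the fraction of a curator's output that survived downstream editing:

```typescript
// Sketch: accuracy as the fraction of tags not changed downstream.
function tagAccuracy(initialTags: number, downstreamChanges: number): number {
  if (initialTags === 0) return 0;
  return Math.max(0, 1 - downstreamChanges / initialTags);
}

// e.g., 40 initial tags with 4 reviewer corrections -> accuracy 0.9
```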
Curation engine 718 can also analyze user performance over time, over the publishing of numerous video maps, to determine a performance trend. In some cases, should the performance trend drop below a certain threshold, the user may be demoted to a more subordinate role or may be cut off from participating in the curation process. On the other hand, users who perform their roles well may be promoted to a more prestigious role (e.g., from video tagger to video reviewer, or video reviewer to video publisher).
In some cases, curation engine 718 may further base the performance analysis of its curators on the video viewer ratings of the multimedia files and/or demand for the multimedia files. If a given multimedia file consistently receives poor ratings and/or has low demand, then curation engine 718 may determine that creators, reviewers, and publishers of the tags of the video map associated with the multimedia file did a low-quality job in curating the tags and corresponding filters.
In some cases, the curators (e.g., video taggers, video reviewers, and video publishers) may earn a certain amount of money for each video map they curate. For instance, for premium content, $150 may be collectively earned by the curators, and for community content, $110 may be earned. Curation engine 718 may split up the amount based on the roles of the users. For instance, for each $110 multimedia file, the video tagger may earn $60, the video reviewer may earn $30, and the video publisher may earn $10. However, curation engine 718 may adjust the monetary ratios downward or upward based on the actual contribution of the users. For instance, if, upon analyzing the activity of the various users, curation engine 718 determines that the video tagger did not spend enough time creating the tags, and as a result, missed several tags that the video reviewer and the video publisher had to make up for, curation engine 718 may increase the portion paid to the video reviewer and video publisher and decrease the portion paid to the video tagger. In some embodiments, the multimedia file that was curated must receive a certain amount of traffic (e.g., must be streamed a certain number of times over a predetermined period of time) before curation engine 718 gives the curators credit for the work they did curating the video map for the multimedia file. This allows time to receive feedback from viewers and allows curation engine 718 to account for this feedback when allocating rewards to various curators.
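A minimal sketch of such a split, using the example figures above as the baseline and an illustrative, assumed adjustment scheme:

```typescript
// Sketch: baseline per-role allocation, shifted when activity analysis
// shows the tagger under-contributed. The adjustment is an assumption.
type Role = "tagger" | "reviewer" | "publisher";

const BASELINE: Record<Role, number> = { tagger: 60, reviewer: 30, publisher: 10 };

// `shift` is the dollar amount moved away from the tagger and split
// between the reviewer and publisher who made up the missed tags.
function adjustedPayout(shift: number): Record<Role, number> {
  return {
    tagger: BASELINE.tagger - shift,
    reviewer: BASELINE.reviewer + shift / 2,
    publisher: BASELINE.publisher + shift / 2,
  };
}
```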
The technology described herein can take the form of an entirely hardware implementation, an entirely software implementation, or implementations containing both hardware and software elements.
This application claims priority to Provisional Application No. 61/941,228 filed on Feb. 18, 2014.