Edited video clips or snippets are often made available to users for viewing. For example, to create an edited video clip from an original full-length video, the full-length video is often first converted into a format convenient for playing on a computing device. The converted video can then be accessed by a video editing tool that allows the converted video to be trimmed to produce a video clip. The video clip can be edited with video effects, a sound track and so on, and then saved as a new file. Metadata such as names of people, places and dates can be added to a video file. The edited and saved video clip can then be transferred to a medium for distribution or viewing by a selected viewership. The medium used for distribution can be an online internet site, a mobile software application, a hard drive, a DVD, storage media, film and so on. Once transferred, the video clip is ready for on-demand viewing by anyone with access and/or authorization to view it.
With the popularization of mobile hand-held devices such as smart phones, available processing power and video recording capabilities allow users to perform many tasks pertaining to video capture, editing and distribution on mobile hand-held devices. For desktop computers, some of these tasks are likewise being optimized. With the use of an external conversion system as described herein, digitally stored media and film can interact immediately with a computing-device video bit processing system. As such, the instant disclosure identifies and addresses a need for a software-automated method for converting, creating, and distributing a video bit from a computing system. A video bit is a short video clip of predetermined length, for example, ten seconds.
As is described in greater detail below, the instant disclosure generally pertains to methods and systems for software-automated conversion, creation and distribution of video bits from a computing system. In one example, video is captured and then converted into a format that can interact with a user's personal computing device (referred to as a client-side personal computing device). The captured video is made accessible to a user by a client-side video bit processing system and is played for the user. While the video plays, the user can make a selection, for example by performing a double-tap within a defined viewable area using the device's selection function, such as a mouse, touch screen or keyboard. As a result of the selection, a video bit is created. An end time for the video bit is based on the point in the video where the user selection is made. For example, the end time is at the point in the video where the user made the selection. Alternatively, the end time for the video bit is a predetermined amount of time before or after the user made the selection, for example two seconds before or two seconds after. The beginning of the video bit occurs a predetermined amount of time before the end time; a default value for this predetermined time is, for example, ten seconds. For example, the predetermined times are user adjustable through the client-side user interface.
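By way of illustration only, the following sketch shows the boundary arithmetic just described: an end time at or offset from the selection point, and a start time a predetermined length of time earlier. The function name, default values and clamping behavior are assumptions for illustration and are not taken from the disclosure.

```python
# Illustrative sketch of the video-bit boundary calculation described above.
# Names and defaults are assumptions for illustration, not part of the disclosure.

DEFAULT_BIT_LENGTH_S = 10.0   # default predetermined length of a video bit
END_OFFSET_S = 0.0            # optional offset before/after the selection (e.g. -2.0 or +2.0)

def video_bit_bounds(selection_time_s: float,
                     video_duration_s: float,
                     bit_length_s: float = DEFAULT_BIT_LENGTH_S,
                     end_offset_s: float = END_OFFSET_S) -> tuple[float, float]:
    """Return (start_time, end_time) of a video bit for a selection made
    at selection_time_s during playback."""
    end = min(max(selection_time_s + end_offset_s, 0.0), video_duration_s)
    start = max(end - bit_length_s, 0.0)
    return start, end

# Example: a double-tap at 37.4 s into a 120 s video yields a bit covering
# roughly 27.4 s to 37.4 s of the original video.
print(video_bit_bounds(37.4, 120.0))
```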
For example, the number of video bits created by a user during the play of a video is unrestricted. That is, as the video is played the user can make a selection at any point, and a video bit is created based on the point in the video where the selection is made, repeatedly until the end of the video is reached.
For example, each video bit is automatically stored on the client-side personal computing device as well as uploaded to the application's server-side cloud-based storage, where each video bit is organized in its own folder hierarchy, linked to the user's application identifier, and time-ordered on a first-in basis. For example, a video bit residing either on the client or in the server-side cloud-based storage can be edited. The editing features available include, for example, adding text, slow motion, fast motion, adding music, grouping video bits to create a series of video bits (a video bit stream), adding emoticons, adjusting colors, adding shapes or objects, adding pictures, adjusting the length of the video bit, and any other digital visual editing effect. For example, the user can optionally upload any additional client-side edited video to the server to be stored and utilized with the stored video bits. For example, based on a user identifier, all uploaded video bits can be linked to a user's feed settings so that there is automatic distribution of video bits to a community of users who have been authorized to receive and view the video bits of the user.
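As a loose illustration of the storage behavior described above (a client-side copy plus a cloud upload, one folder hierarchy per user, first-in ordering), the following sketch builds a storage path for each new video bit. The path layout, names and timestamp format are hypothetical and not prescribed by the disclosure.

```python
# Hypothetical sketch of how a video bit could be filed on upload: one folder
# per user/event, with a first-in sequence number. Paths and field names are
# assumptions for illustration only.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class VideoBitStore:
    root: str                                     # e.g. client path or cloud bucket prefix
    counters: dict = field(default_factory=dict)  # per-folder first-in counters

    def store_path(self, user_id: str, event_folder: str) -> str:
        key = (user_id, event_folder)
        seq = self.counters.get(key, 0) + 1       # first-in ordering
        self.counters[key] = seq
        stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
        return f"{self.root}/{user_id}/{event_folder}/bit_{seq:04d}_{stamp}.mp4"

store = VideoBitStore(root="videobits")
print(store.store_path("user123", "event_Y"))  # e.g. videobits/user123/event_Y/bit_0001_....mp4
```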
For example, an external input interface receives and reads various media, such as a solid state drive, a flash drive, a USB drive, a memory stick, and any other media that can store corresponding original video for access into a user's computing device. This allows a client-side video bit processing system to begin creating video bits.
Similarly, for example, an external system can be used to convert film based video into a digital format for access into a user's computing device. This also allows a client-side video bit processing system running on the user's computing device to begin creating video bits.
Features from any of the above-mentioned embodiments may be used in combination with one another in accordance with the general principles described herein. These and other embodiments, features, and advantages are more fully understood from the following detailed description in conjunction with the accompanying drawings and claims.
The present disclosure is generally directed to methods for automated creation of video bits. As is explained in greater detail below, optimizing the process of extracting video bits from a full-length original video enables a user to create video bits quickly and automatically with assistance from the system. The video bits can then be automatically distributed to a community of viewers authorized to view the video bits. Embodiments of the instant disclosure may also provide various other advantages and features, as discussed in greater detail below.
The following describes, with reference to
Network cloud 110 is used to enable communication between client-side personal computing devices 100 and server-side devices 120. Network cloud 110 may at times undergo technological change, but it is assumed that client-side and server-side devices interoperate seamlessly. Network cloud services are managed by a third-party operator, and it is assumed such services operate with 100% availability.
Server-side devices 120 provide storage and user identifiers such that video can be stored and organized for identified users. Once video bits are uploaded to one of server-side devices 120, that server-side device 120 acknowledges and recognizes security settings. The security settings enable user peers, with approved authorization, to access video bits from the creating user. Client-side personal computing devices 100, network cloud 110 and server-side devices 120 interoperate and establish a communication link so that the video bit processing system within client-side personal computing devices 100 can convert, create and distribute video bits.
Captured video is often stored on a physical medium, such as film, or electronically in digital format. When stored digitally on a client-side device (i.e., a user's personal computing device), video bit processing system 200 is able to interact with such video immediately as long as the memory device is accessible. However, when stored on film, video must first be converted into a digital format. Conversion system 500 (shown in
Creation block 202 provides for creation of a distinct video bit. Original video is played and displayed to a user. Real-time stamp data is embedded within the original video. During play of the video, the user can make a selection, for example by double-tapping a client-device selection tool (e.g., a mouse, touch screen or keyboard). As a result of this user selection, creation block 202 creates two distinct time stamps. The first time stamp is an end time, which marks the time within the original video where the user makes the selection. The second time stamp is a start time, which occurs a predetermined length of time before the end time. For example, a default value for the predetermined length of time is 10 seconds, and the user can vary the default value. The video between the start and end times is the video bit. Creation block 202 saves the video bit within the current client computing device of client-side personal computing devices 100 and, for example, stores a copy of the video bit on the current server-side device 120. The creation process can be repeated to create additional video bits.
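The disclosure does not specify how the clip between the two time stamps is cut. As one possible sketch, assuming the original video is a local MP4 file and the ffmpeg command-line tool is available on the device, the clip could be extracted as follows; the file names are illustrative.

```python
# Sketch of how creation block 202 could cut the clip between the two time
# stamps once they are known, assuming a local source file and ffmpeg.
import subprocess

def cut_video_bit(source: str, start_s: float, end_s: float, dest: str) -> None:
    duration = max(end_s - start_s, 0.0)
    subprocess.run(
        ["ffmpeg", "-y",
         "-ss", f"{start_s:.3f}",      # seek to the start time stamp
         "-i", source,
         "-t", f"{duration:.3f}",      # keep only the predetermined length
         "-c", "copy",                 # copy streams without re-encoding
         dest],
        check=True,
    )

# Example: a selection at 37.4 s with a 10-second default produces a clip
# covering 27.4 s to 37.4 s of the original video.
# cut_video_bit("original.mp4", 27.4, 37.4, "video_bit_0001.mp4")
```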
When video bits have been created, a user has the option to edit the video bits. The editing can include, for example, adding text, slow motion, fast motion, adding music, grouping video bits to create a series of video bits (a video bit stream), adding emoticons, adjusting colors, adding shapes or objects, adding pictures, and any other digital visual editing effect. Edited video bits can then optionally be uploaded to server-side storage of server-side device 120.
Within server-side device 120, video bit processing system 200 allows a user to, for example, arrange, manage and create video bit streams, as well as to delete video bit streams. To support this functionality, video bits are organized and identified in relation to a user on an event-related, first-in basis. For example, a user creates video bits related to event Y at instant Z. When the user makes a user selection to create video bit Z, video bit Z is stored on local memory of client-side personal computing device 100 and on server memory of server-side device 120. Video bit processing system 200 automatically creates folder Y to reference event Y at instant Z on both client-side personal computing devices 100 and server-side device 120. When each subsequent video bit Z+1, Z+2, . . . Z+N is created, the created video bit is stored in the same folder Y. If video for event W were to be used, corresponding to a different original video, video bit processing system 200 automatically creates the next folder Y+1 and restarts at instant Z. In all cases, the user has the option to rename and delete video bits. Video bit naming may also include pre-words that use a folder-instant convention as defined by video bit processing system 200. Such pre-words would then name the video bit, for example, “folder_event_Z” or similar, and the video bit would be stored under a general folder. While the folder and naming conventions are susceptible to various modifications and alternative forms, the definitions shown are by way of example. Actual implementation of the folder and video bit naming will vary across user types.
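A minimal sketch of the folder-instant convention described above might look like the following; the class, folder prefix and file extension are hypothetical and only meant to mirror the Y/Z numbering and the “folder_event_Z” naming by way of example.

```python
# Hypothetical sketch of the event-folder and naming convention: each new
# original video opens the next folder, and bits within a folder are numbered
# on a first-in basis. All names are illustrative.
class BitLibrary:
    def __init__(self):
        self.folder_index = 0          # folder Y, Y+1, ...
        self.instant = 0               # instant Z restarts for each new folder
        self.current_source = None

    def name_bit(self, source_video: str) -> str:
        if source_video != self.current_source:   # a different original video
            self.current_source = source_video
            self.folder_index += 1                 # move on to folder Y+1
            self.instant = 0                       # restart at instant Z
        self.instant += 1
        folder = f"event_{self.folder_index}"
        return f"{folder}/folder_event_{self.instant}.mp4"

lib = BitLibrary()
print(lib.name_bit("video_A.mp4"))  # event_1/folder_event_1.mp4
print(lib.name_bit("video_A.mp4"))  # event_1/folder_event_2.mp4
print(lib.name_bit("video_B.mp4"))  # event_2/folder_event_1.mp4
```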
Distribution block 203 operates based on the user's publishing settings. For example, the user can authorize immediate distribution to all users on video bit processing system 200. Alternatively, the user can authorize distribution to a select group of peers by providing viewing rights to them alone. Alternatively, the user can maintain privacy by allowing only the creating user to view the video bits. In all instances, distribution settings are stored on the server side of distribution block 203, where video bit processing system 200 recognizes such settings and acts as the gating device on client-side personal computing device 100.
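As a rough sketch of the gating behavior described for distribution block 203, assuming the three publishing settings just mentioned (all users, selected peers, creator only), an access check could look like the following. The setting names and data structures are assumptions for illustration.

```python
# Minimal sketch of a server-side gating check; "public", "peers" and
# "private" are illustrative labels for the three publishing settings.
from dataclasses import dataclass, field

@dataclass
class PublishSettings:
    mode: str = "private"                 # "public", "peers", or "private"
    authorized_peers: set = field(default_factory=set)

def may_view(viewer_id: str, creator_id: str, settings: PublishSettings) -> bool:
    if viewer_id == creator_id:
        return True                       # the creating user can always view
    if settings.mode == "public":
        return True                       # immediate distribution to all users
    if settings.mode == "peers":
        return viewer_id in settings.authorized_peers
    return False                          # private: creator only

settings = PublishSettings(mode="peers", authorized_peers={"peer42"})
print(may_view("peer42", "user123", settings))  # True
print(may_view("peer99", "user123", settings))  # False
```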
Display screen 410 shows video bits 1-12, navigation buttons 413 and 414, and operational buttons 415-420. The display screen shown in
For example, when adding text, the user selects an area where text will be added to the video bit and types words or sayings in the user-selected area. For example, when adding music, the user selects a music track to be played in parallel with the video. For example, when the user selects the video bit stream option, a stream of selected video bits can be created so that the video bits within the stream are played in order. For example, when adding emoticons is selected, the emoticon library of video bit processing system 200 superimposes emoticon artwork onto a video bit. Additional editing effects include changing or modifying colors seen on a video bit, adding shapes, lines, or graphical objects to a video bit, adding still pictures and superimposing them onto a video bit, and any other video or sound effect that enhances a video bit. Button D “Delete” removes the video bit from the library and then automatically re-organizes the remaining video bits.
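As one possible sketch of how a video bit stream could be rendered so that its video bits play in order, assuming ffmpeg and its concat demuxer are available on the device, the selected bits could be concatenated as follows; the file names are illustrative.

```python
# Sketch of rendering a "video bit stream" as a single file by concatenating
# the selected bits in play order, assuming ffmpeg is installed.
import subprocess, tempfile, os

def render_stream(bit_files: list[str], dest: str) -> None:
    # The concat demuxer reads a text file listing the inputs in play order.
    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
        for path in bit_files:
            f.write(f"file '{path}'\n")
        list_path = f.name
    try:
        subprocess.run(
            ["ffmpeg", "-y", "-f", "concat", "-safe", "0",
             "-i", list_path, "-c", "copy", dest],
            check=True,
        )
    finally:
        os.remove(list_path)

# render_stream(["bit_0001.mp4", "bit_0002.mp4"], "video_bit_stream.mp4")
```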
The process parameters and sequence of the steps described and/or illustrated herein are given by way of example only and can be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed. The various exemplary methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps in addition to those disclosed.
Unless otherwise noted, the terms “connected to” and “coupled to” (and their derivatives), as used in the specification and claims, are to be construed as permitting both direct and indirect (i.e., via other elements or components) connection. In addition, the terms “a” or “an,” as used in the specification and claims, are to be construed as meaning “at least one of.” Finally, for ease of use, the terms “including” and “having” (and their derivatives), as used in the specification and claims, are interchangeable with and have the same meaning as the word “comprising.”
The foregoing discussion discloses and describes merely exemplary methods and implementations. As will be understood by those familiar with the art, the disclosed subject matter may be embodied in other specific forms without departing from the spirit or characteristics thereof. Accordingly, the present disclosure is intended to be illustrative, but not limiting, of the scope, which is set forth in the following claims.
Number | Date | Country
--- | --- | ---
62182393 | Jun 2015 | US