This disclosure relates to facilitating user controlled bidirectional speed ramping of visual content during playback.
Video applications may allow a user to change the playback speed of a video based on playback time locations. Changing playback speeds based on playback time locations may cause the video to run out of playback content before the desired end point. For example, a user may change the playback speed of a video having a duration of five minutes. At the four-minute mark, the user may set the playback speed of the video to 4× speed for thirty seconds. At the four-minute mark, however, the video has only fifteen seconds of playback content at 4× speed. The video will run out of playback content before the end of the desired thirty-second segment at 4× speed.
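The arithmetic behind this shortfall can be sketched as follows; the function name and values below are illustrative only, mirroring the five-minute example above:

```python
# Playing a segment at N-times speed consumes source content N times
# faster than it is displayed.
def source_needed(playback_seconds, speed):
    """Seconds of source content consumed by a playback segment."""
    return playback_seconds * speed

video_duration = 5 * 60           # five-minute video
ramp_start = 4 * 60               # ramp begins at the four-minute mark
needed = source_needed(30, 4.0)   # thirty seconds of playback at 4x -> 120 s
remaining = video_duration - ramp_start   # only 60 s of source remain
```

Because 120 seconds of source content are needed but only 60 remain, the video runs out of playback content partway through the ramp.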
This disclosure relates to bidirectional speed ramping. Electronic information defining visual content within video frames may be accessed. Video frames may be ordered in a source sequence. Positions in the source sequence may be associated with playback directions. Video frames may be ordered in a playback sequence based on the playback directions. The playback sequence may characterize a playback order in which video frames are displayed during playback. Video frames in the playback sequence may be associated with playback speeds. The playback speeds may determine perceived speeds with which the visual content is displayed during playback. Speed ramped video frames may be determined based on the playback sequence and the playback speeds. A speed ramped video may be generated based on the speed ramped video frames.
A system for bidirectional speed ramping may include one or more physical processors, and/or other components. The physical processor(s) may be configured by machine-readable instructions. Executing the machine-readable instructions may cause the physical processor(s) to facilitate bidirectional speed ramping. The machine-readable instructions may include one or more computer program components. The computer program components may include one or more of an access component, a playback direction component, a playback sequence component, a playback speed component, a speed ramped video frame component, a speed ramped video component, and/or other computer program components.
The access component may be configured to access electronic information and/or other information. The electronic information may be stored in a storage medium and/or in other locations. The electronic information may define visual content within video frames for playback. Visual content may refer to media content that may be observed visually. Visual content may include one or more videos stored in one or more formats/containers, and/or other visual content. The video frames may be ordered in a source sequence. In some implementations, the source sequence may characterize a source order corresponding to a sequence in which the visual content occurred at capture.
The playback direction component may be configured to associate one or more positions in the source sequence with one or more playback directions. The playback directions may include a forward playback direction, a reverse playback direction, and/or other directions.
The playback sequence component may be configured to order the video frames in a playback sequence. The video frames may be ordered in the playback sequence based on the playback direction(s) and/or other information. The playback sequence may characterize a playback order in which one or more of the video frames in the playback sequence may be displayed during playback. Ordering the video frames in the playback sequence may include designating one or more of the video frames in the source sequence in one or more playback positions in the playback sequence.
In some implementations, ordering the video frames in the playback sequence may include designating one of the video frames in the source sequence as a first video frame in the playback sequence and designating one of the video frames in the source sequence as a last video frame in the playback sequence. In some implementations, ordering the video frames in the playback sequence may include excluding one or more of the video frames in the source sequence from the playback sequence.
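As an illustrative sketch (the segment layout and function below are hypothetical, not part of this disclosure), ordering video frames into a playback sequence from direction-bearing segments of source positions might look like:

```python
def build_playback_sequence(segments):
    """Expand (start, end) source-position segments into a playback order.

    A segment with end >= start plays forward; end < start plays in
    reverse. Segment boundaries are shared, so each segment after the
    first skips its starting position to avoid showing the boundary
    frame twice.
    """
    order = []
    for i, (start, end) in enumerate(segments):
        step = 1 if end >= start else -1
        frames = list(range(start, end + step, step))
        if i > 0:
            frames = frames[1:]          # boundary frame already emitted
        order.extend(frames)
    return order

# Forward through positions 0-9, reverse back to 5, then forward to 19.
sequence = build_playback_sequence([(0, 9), (9, 5), (5, 19)])
```

A segment whose end position is lower than its start position plays in reverse; sharing boundary positions between segments avoids displaying the boundary frame twice in a row.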
The playback speed component may be configured to associate one or more of the video frames in the playback sequence with one or more playback speeds. The playback speeds may determine one or more perceived speeds with which the visual content is displayed during playback.
The speed ramped video frame component may be configured to determine speed ramped video frames to be included in one or more speed ramped videos. The speed ramped video frames may be determined based on the playback sequence, the playback speed(s), and/or other information. In some implementations, determining the speed ramped video frames may include identifying one or more playback positions in the playback sequence based on the playback speed(s) and/or other information. The playback position(s) may correspond to the speed ramped video frames.
In some implementations, determining the speed ramped video frames may include, in response to a playback position aligning with a position of a video frame in the source sequence, using the video frame in the source sequence as the speed ramped video frame. Determining the speed ramped video frames may include, in response to a playback position not aligning with a position of a video frame in the source sequence, using two of the video frames in the source sequence to determine an interpolated video frame for the playback position. Determining the speed ramped video frames may include, in response to a playback position not aligning with a position of a video frame in the source sequence, using one of the video frames in the source sequence to determine a duplicated video frame for the playback position. Determining the speed ramped video frames may include modifying one or more of the video frames to include motion blur.
The speed ramped video component may be configured to generate one or more speed ramped videos. The speed ramped video(s) may be generated based on the speed ramped video frames and/or other information.
These and other objects, features, and characteristics of the system and/or method disclosed herein, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and in the claims, the singular form of “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise.
Storage medium 12 may include an electronic storage medium that electronically stores information. Storage medium 12 may store software algorithms, information determined by processor 11, information received remotely, and/or other information that enables system 10 to function properly. For example, storage medium 12 may store information relating to visual content, video frames, source sequence, playback directions, playback sequence, playback speeds, speed ramped video frames, speed ramped video, and/or other information. System 10 may include electronic storage separate from storage medium 12. Electronic storage separate from storage medium 12 may perform one or more of the functionalities of storage medium 12 discussed above.
Processor 11 may be configured to provide information processing capabilities in system 10. As such, processor 11 may comprise one or more of a digital processor, an analog processor, a digital circuit designed to process information, a central processing unit, a graphics processing unit, a microcontroller, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information. Processor 11 may be configured to execute one or more machine-readable instructions 100 to facilitate bidirectional speed ramping. Machine-readable instructions 100 may include one or more computer program components. Machine-readable instructions 100 may include one or more of access component 102, playback direction component 104, playback sequence component 106, playback speed component 108, speed ramped video frame component 110, speed ramped video component 112, and/or other computer program components.
Access component 102 may be configured to access electronic information 20 and/or other information. Electronic information 20 may be stored in storage medium 12 and/or in other locations. Electronic information 20 may define visual content within video frames for playback. Visual content may refer to media content that may be observed visually. Visual content may include one or more videos stored in one or more formats/containers, and/or other visual content. A video may include a video clip captured by a video capture device, multiple video clips captured by a video capture device, and/or multiple video clips captured by separate video capture devices. A video may include multiple video clips captured at the same time and/or multiple video clips captured at different times. A video may include a video clip processed by a video application, multiple video clips processed by a video application, and/or multiple video clips processed by separate video applications.
The video frames may be ordered in a source sequence.
In some implementations, the source sequence may characterize a source order corresponding to a sequence ordered by one or more video edits. For example, the visual content may include multiple video clips arranged in a particular order by a video application. The source sequence may characterize the order in which the video frames appear in the visual content that includes the particular arrangement of the video clips.
Playback direction component 104 may be configured to associate one or more positions in the source sequence with one or more playback directions. The playback directions may include a forward playback direction, a reverse playback direction, and/or other directions.
Playback sequence component 106 may be configured to order the video frames in a playback sequence. The playback sequence may characterize a playback order in which one or more of the video frames in the playback sequence may be displayed during playback. For example, a playback sequence in which video frames #1-3 appear in the order of video frame #1, video frame #3, video frame #2 may characterize the playback order in which the first video frame displayed during playback is video frame #1, the second video frame displayed during playback is video frame #3, and the third video frame displayed during playback is video frame #2.
The video frames may be ordered in the playback sequence based on the playback direction(s) and/or other information. Ordering the video frames in the playback sequence may include designating one or more of the video frames in the source sequence in one or more playback positions in the playback sequence.
In some implementations, ordering the video frames in the playback sequence may include designating one of the video frames in the source sequence as a first video frame in the playback sequence and designating one of the video frames in the source sequence as a last video frame in the playback sequence.
In some implementations, ordering the video frames in the playback sequence may include excluding one or more of the video frames in the source sequence from the playback sequence.
Playback speed component 108 may be configured to associate one or more of the video frames in the playback sequence with one or more playback speeds. The playback speeds may determine one or more perceived speeds with which the visual content is displayed during playback. The playback speeds may allow the visual content to be displayed at a slower speed (e.g., 0.5× speed, etc.), at a default speed (e.g., 1× speed, etc.), and/or at a faster speed (e.g., 2× speed, etc.).
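As a rough sketch (the helper below is hypothetical, not part of this disclosure), the perceived on-screen duration of a run of frames follows directly from the playback speed and the display framerate:

```python
# A playback speed scales how long a run of source frames occupies the
# screen: at 0.5x each frame fills two display slots, at 2x every other
# source frame is shown.
def perceived_duration(frame_count, speed, fps=30.0):
    """Seconds of screen time for frame_count source frames at a speed."""
    return frame_count / (speed * fps)

assert perceived_duration(30, 1.0) == 1.0    # default speed: one second
assert perceived_duration(30, 2.0) == 0.5    # faster: half a second
assert perceived_duration(30, 0.5) == 2.0    # slower: two seconds
```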
The use of source video frame positions to determine playback directions and/or playback speeds of a video may allow a user/system to change the playback directions and/or the playback speeds of the video without the video running out of playback content prior to the desired end points. The use of source video frame positions to determine playback directions and/or the playback speeds of a video may allow a user/system to copy, cut, paste, and/or otherwise move a portion of the video without affecting the playback directions and/or the playback speeds of the portion of the video.
Although speed values in playback direction and playback speed plot A 700 are shown to not vary between ranges of source video frame positions in the same playback direction (e.g., between positions #0-9, #9-5, #5-11, #11-19), this is merely for ease of reference and is not limiting. For example, playback speeds associated with video frames may change linearly and/or non-linearly, and may include an increasing slope and/or a decreasing slope. Although speed values and playback directions in playback direction and playback speed plot A 700 are shown to change at whole number source video frame positions, this is merely for ease of reference and is not limiting. Speed values and/or playback directions may change at one or more non-whole number source video frame positions.
Speed ramped video frame component 110 may be configured to determine speed ramped video frames to be included in one or more speed ramped videos. The speed ramped video frames may be determined based on the playback sequence, the playback speed(s), and/or other information. Determining the speed ramped video frames may include identifying one or more playback positions in the playback sequence based on the playback speed(s) and/or other information. The playback position(s) may correspond to the speed ramped video frames. For example, speed ramped video frame component 110 may identify one or more playback positions in the playback sequence based on the playback speeds. From these playback positions, speed ramped video frame component 110 may select and/or generate one or more video frames to be included as speed ramped video frames. This may allow rendering of a new set of video frames that capture the ordering of video frames based on the playback sequence and the timing of the video frames based on the playback speeds.
For example, between video frames #0-9 of playback sequence 500, one playback position may be identified for every video frame. Playback positions identified may include playback positions #0, #1, #2, #3, #4, #5, #6, #7, #8, and #9. Identifying one playback position per video frame may correspond to a playback speed of 1×.
Between video frames #9-5 of playback sequence 500, one playback position may be identified for every two video frames. Playback positions identified may include playback positions #7 and #5. Identifying one playback position per two video frames may correspond to a playback speed of 2×.
Between video frames #5-11 of playback sequence 500, two playback positions may be identified for every video frame. Playback positions identified may include playback positions #5.5, #6, #6.5, #7, #7.5, #8, #8.5, #9, #9.5, #10, #10.5, and #11. Identifying two playback positions per video frame may correspond to a playback speed of 0.5×.
Between video frames #11-19 of playback sequence 500, one playback position may be identified for every four video frames. Playback positions identified may include playback positions #15 and #19. Identifying one playback position per four video frames may correspond to a playback speed of 4×.
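The walkthrough above can be reproduced with a short sketch; the `(start, end, speed)` segment tuples and the stepping rule below are hypothetical simplifications of the described behavior:

```python
def identify_playback_positions(segments):
    """Walk each (start, end, speed) segment, stepping |speed| source
    positions per output frame; direction comes from start vs end.

    The shared boundary position is emitted once, by the earlier segment.
    """
    positions = []
    for i, (start, end, speed) in enumerate(segments):
        step = speed if end >= start else -speed
        pos = start if i == 0 else start + step
        # emit until the walk steps past the segment's end position
        while (step > 0 and pos <= end) or (step < 0 and pos >= end):
            positions.append(pos)
            pos += step
    return positions

segments = [(0, 9, 1.0), (9, 5, 2.0), (5, 11, 0.5), (11, 19, 4.0)]
positions = identify_playback_positions(segments)
```

Running this over the four example segments yields positions #0 through #9, then #7 and #5, then #5.5 through #11 in half steps, then #15 and #19, matching the counts described above.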
Playback positions identified by speed ramped video frame component 110 may or may not correspond to positions of video frames in the source sequence. In some implementations, determining the speed ramped video frames may include, in response to a playback position aligning with a position of a video frame in the source sequence, using the video frame in the source sequence as the speed ramped video frame.
In some implementations, determining the speed ramped video frames may include, in response to a playback position not aligning with a position of a video frame in the source sequence, using multiple video frames in the source sequence to determine an interpolated video frame for the playback position. For example, for playback position #5.5, a speed ramped video frame may be interpolated using video frame #5 and video frame #6 in source sequence 300. In some implementations, one or more images may be given more weight than other images for video frame interpolation. For example, an identified playback position may include playback position #5.3. A speed ramped video frame generated for playback position #5.3 may include interpolation of video frame #5 and video frame #6 in the source sequence, where video frame #5 is given more influence over the interpolation. This may allow the interpolated video frame for playback position #5.3 to appear closer to video frame #5 than video frame #6.
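A minimal sketch of such weighted interpolation, assuming frames decoded as NumPy arrays (the function and stand-in frames are illustrative, not the disclosed implementation):

```python
import numpy as np

def interpolate_frame(frame_a, frame_b, t):
    """Blend two source frames; t is the fractional playback position,
    so t=0.3 keeps the result closer to frame_a than to frame_b."""
    return (1.0 - t) * frame_a.astype(np.float64) + t * frame_b.astype(np.float64)

# Playback position #5.3: video frame #5 gets the greater influence.
frame5 = np.full((2, 2), 100.0)   # stand-ins for decoded frames
frame6 = np.full((2, 2), 200.0)
blended = interpolate_frame(frame5, frame6, 0.3)   # pixels blend to 130.0
```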
In some implementations, determining the speed ramped video frames may include, in response to a playback position not aligning with a position of a video frame in the source sequence, using one of the video frames in the source sequence to determine a duplicated video frame for the playback position. For example, for playback position #6.5, a speed ramped video frame may be duplicated from video frame #6 in source sequence 300. Duplication of video frames may allow one or more video frames in source sequence 300 to appear multiple times in a row and simulate a slower playback speed. In some implementations, the speed ramped video frames may include interpolated video frames and duplicated video frames.
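A sketch of the duplication case, assuming decoded frames held in a list (names are illustrative): a fractional playback position simply reuses the nearest earlier source frame.

```python
import math

def duplicate_frame(source_frames, playback_position):
    """For a position that falls between source frames, reuse the
    nearest earlier frame instead of interpolating a new one."""
    return source_frames[math.floor(playback_position)]

frames = [f"frame#{i}" for i in range(12)]    # stand-ins for decoded frames
# Positions #6 and #6.5 both show frame #6, so it appears twice in a row,
# simulating a slower playback speed.
assert duplicate_frame(frames, 6.5) == "frame#6"
assert duplicate_frame(frames, 6.5) == duplicate_frame(frames, 6.0)
```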
In some implementations, determining the speed ramped video frames may include modifying one or more of the video frames to include motion blur. Motion blur may emphasize the perceived increase in speed of playback for the visual content.
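One simple way to approximate such motion blur, shown here as an illustrative sketch rather than the disclosed method, is to average a frame with its neighbors:

```python
import numpy as np

def motion_blur(source_frames, index, window=3):
    """Approximate motion blur by averaging a frame with its neighbors;
    a wider window suggests faster perceived motion."""
    lo = max(0, index - window // 2)
    hi = min(len(source_frames), index + window // 2 + 1)
    stack = np.stack(source_frames[lo:hi]).astype(np.float64)
    return stack.mean(axis=0)

# Stand-in frames with uniform pixel values 0, 30, 60, 90.
frames = [np.full((2, 2), float(v)) for v in (0, 30, 60, 90)]
blurred = motion_blur(frames, 1)     # averages frames 0, 1, and 2
```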
Speed ramped video component 112 may be configured to generate one or more speed ramped videos. The speed ramped video(s) may be generated based on the speed ramped video frames and/or other information. For example, speed ramped video component 112 may encode the speed ramped video frames into a new video file. The speed ramped video frames may be encoded using one or more encoding framerates (e.g., 30 frames per second, variable framerate, etc.).
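Because ordering and timing are already baked into the speed ramped frames, encoding can proceed at a single constant framerate; a sketch of the resulting presentation timestamps (illustrative only, not the disclosed encoder):

```python
def presentation_timestamps(frame_count, fps=30.0):
    """Timestamp, in seconds, at which each speed ramped frame is shown
    when encoded at a constant framerate."""
    return [i / fps for i in range(frame_count)]

ts = presentation_timestamps(4)   # frames shown 1/30 s apart
```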
While the present disclosure may be directed to videos, one or more other implementations of the system may be configured for other types of media content. Other types of media content may include one or more of audio content (e.g., music, podcasts, audio books, and/or other audio content), multimedia presentations, photos, slideshows, and/or other media content.
Although processor 11 and storage medium 12 are shown to be connected to an interface 13, any communication medium may be used to facilitate interaction between any components of system 10.
Although processor 11 is shown as a single entity, this is for illustrative purposes only. In some implementations, processor 11 may comprise a plurality of processing units.
It should be appreciated that although computer components 102, 104, 106, 108, 110, and 112 are illustrated as being co-located within a single processing unit, in implementations in which processor 11 comprises multiple processing units, one or more of the computer components may be located remotely from the other computer components.
The description of the functionality provided by the different computer program components described herein is for illustrative purposes, and is not intended to be limiting, as any of the computer program components may provide more or less functionality than is described. For example, one or more of computer program components 102, 104, 106, 108, 110, and/or 112 may be eliminated, and some or all of their functionality may be provided by other computer program components. As another example, processor 11 may be configured to execute one or more additional computer program components that may perform some or all of the functionality attributed to one or more of computer program components 102, 104, 106, 108, 110, and/or 112 described herein.
The electronic storage media of storage medium 12 may be provided integrally (i.e., substantially non-removable) with one or more components of system 10 and/or removable storage that is connectable to one or more components of system 10 via, for example, a port (e.g., a USB port, a Firewire port, etc.) or a drive (e.g., a disk drive, etc.). Storage medium 12 may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EPROM, EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media. Storage medium 12 may be a separate component within system 10, or storage medium 12 may be provided integrally with one or more other components of system 10 (e.g., processor 11). Although storage medium 12 is shown as a single entity, this is for illustrative purposes only. In some implementations, storage medium 12 may comprise a plurality of storage units.
In some implementations, method 200 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, a central processing unit, a graphics processing unit, a microcontroller, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information). The one or more processing devices may include one or more devices executing some or all of the operations of method 200 in response to instructions stored electronically on one or more electronic storage media. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 200.
At operation 201, electronic information may be accessed. The electronic information may define visual content within video frames for playback. The video frames may be ordered in a source sequence. In some implementations, operation 201 may be performed by a processor component the same as or similar to access component 102.
At operation 202, one or more positions in the source sequence may be associated with one or more playback directions. The playback directions may include a forward playback direction and a reverse playback direction. In some implementations, operation 202 may be performed by a processor component the same as or similar to playback direction component 104.
At operation 203, the video frames may be ordered in a playback sequence based on the one or more playback directions. The playback sequence may characterize a playback order in which one or more of the video frames in the playback sequence are displayed during playback. In some implementations, operation 203 may be performed by a processor component the same as or similar to playback sequence component 106.
At operation 204, one or more of the video frames in the playback sequence may be associated with one or more playback speeds. The one or more playback speeds may determine one or more perceived speeds with which the visual content is displayed during playback. In some implementations, operation 204 may be performed by a processor component the same as or similar to playback speed component 108.
At operation 205, speed ramped video frames to be included in a speed ramped video may be determined. The speed ramped video frames may be determined based on the playback sequence and the one or more playback speeds. In some implementations, operation 205 may be performed by a processor component the same as or similar to speed ramped video frame component 110.
At operation 206, the speed ramped video may be generated. The speed ramped video may be generated based on the speed ramped video frames. In some implementations, operation 206 may be performed by a processor component the same as or similar to speed ramped video component 112.
Although the system(s) and/or method(s) of this disclosure have been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred implementations, it is to be understood that such detail is solely for that purpose and that the disclosure is not limited to the disclosed implementations, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present disclosure contemplates that, to the extent possible, one or more features of any implementation can be combined with one or more features of any other implementation.