The present disclosure generally relates to video processing and more particularly, to systems and methods for rendering effects in 360 video.
As smartphones and other mobile devices have become ubiquitous, people can capture video virtually anytime. Furthermore, 360 videos have gained increasing popularity.
In a first embodiment, a computing device for inserting an effect into a 360 video receives an effect from a user. A target region is also received from the user, where the target region corresponds to a location within the 360 video for inserting the effect. The following steps are then performed for each frame in the 360 video. The effect is inserted on a surface of a spherical model based on the target region, and either two half-sphere frames or several projection frames containing the effect are generated from the spherical model. The two half-sphere frames or the projection frames are stitched to generate a panoramic representation of the effect, and the panoramic representation of the effect is blended with an original source panorama to generate a modified 360 video frame with the effect.
Another embodiment is a system that comprises a memory storing instructions and a processor coupled to the memory. The processor is configured by the instructions to receive an effect from a user and receive a target region from the user, the target region corresponding to a location within the 360 video for inserting the effect. The processor is further configured to perform the following steps for each frame in the 360 video. The effect is inserted on a surface of a spherical model based on the target region, and either two half-sphere frames or several projection frames containing the effect are generated from the spherical model. The two half-sphere frames or the projection frames are stitched to generate a panoramic representation of the effect, and the panoramic representation of the effect is blended with an original source panorama to generate a modified 360 video frame with the effect.
Another embodiment is a non-transitory computer-readable storage medium storing instructions to be implemented by a computing device having a processor. The instructions, when executed by the processor, cause the computing device to receive an effect from a user and receive a target region from the user, the target region corresponding to a location within the 360 video for inserting the effect. The computing device is further configured to perform the following steps for each frame in the 360 video. The effect is inserted on a surface of a spherical model based on the target region, and either two half-sphere frames or several projection frames containing the effect are generated from the spherical model. The two half-sphere frames or the projection frames are stitched to generate a panoramic representation of the effect, and the panoramic representation of the effect is blended with an original source panorama to generate a modified 360 video frame with the effect.
Various aspects of the disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
An increasing number of digital capture devices are equipped with the ability to record 360 degree video (hereinafter "360 video"), which offers viewers a fully immersive experience. The creation of 360 video generally involves capturing a full 360 degree view using multiple cameras, stitching the captured views together, and encoding the video. With the increasing popularity of 360 degree videos, users may wish to incorporate graphics (e.g., a sticker), text, an image, a picture-in-picture (PiP) effect, and/or other customized effects to further enhance 360 videos.
Various embodiments are disclosed for systems and methods for incorporating effects into a 360 video. Specifically, different techniques are disclosed for incorporating effects into a 360 video on a frame-by-frame basis, where an effect panorama frame containing the effect is generated and the effect panorama frame is then merged with the original source panorama frame to generate an edited frame. This process is repeated for each frame in the 360 video.
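By way of illustration only, the per-frame flow described above can be expressed as the following Python sketch. The four stage callables are hypothetical placeholders for the operations detailed later in this disclosure, not an actual implementation:

```python
def insert_effect(frames, place_effect, render_views, stitch, blend):
    """Per-frame editing loop (illustrative sketch). The four callables are
    hypothetical placeholders for the stages described in this disclosure:
      place_effect(i) -> spherical model with the effect applied for frame i
      render_views(m) -> two half-sphere frames (or several projection frames)
      stitch(views)   -> effect panorama frame containing the effect
      blend(src, fx)  -> modified 360 video frame
    """
    edited = []
    for i, src in enumerate(frames):
        model = place_effect(i)             # position the effect on the sphere
        views = render_views(model)         # render views of the textured sphere
        fx_pano = stitch(views)             # merge views into an effect panorama
        edited.append(blend(src, fx_pano))  # composite with the source panorama
    return edited
```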
A system for implementing the disclosed editing techniques is now described, followed by a discussion of the operation of the components within the system.
For some embodiments, the computing device 102 may be equipped with a plurality of cameras 104a, 104b, 104c where the cameras 104a, 104b, 104c are utilized to capture digital media content comprising 360 degree views. In accordance with such embodiments, the computing device 102 further comprises a stitching module 106 configured to process the 360 degree views. Alternatively, the computing device 102 may obtain 360 video from other digital recording devices 112 coupled to the computing device 102 through a network interface 109 over a network 111.
As one of ordinary skill will appreciate, the digital media content may be encoded in any of a number of formats including, but not limited to, Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-4, H.264, Third Generation Partnership Project (3GPP), 3GPP-2, Standard-Definition Video (SD-Video), High-Definition Video (HD-Video), Digital Versatile Disc (DVD) multimedia, Video Compact Disc (VCD) multimedia, High-Definition Digital Versatile Disc (HD-DVD) multimedia, Digital Television Video/High-definition Digital Television (DTV/HDTV) multimedia, Audio Video Interleave (AVI), Digital Video (DV), QuickTime (QT) file, Windows Media Video (WMV), Advanced Systems Format (ASF), Real Media (RM), Flash Media (FLV), MPEG Audio Layer III (MP3), MPEG Audio Layer II (MP2), Waveform Audio Format (WAV), Windows Media Audio (WMA), or any number of other digital formats.
A video processor 108 executes on a processor of the computing device 102 and configures the processor to perform various operations relating to the coding of 360 video. The video processor 108 includes an effects unit 110 configured to incorporate effects into 360 videos according to various effects editing techniques, where the effects unit 110 incorporates effects into a 360 video on a frame-by-frame basis. Specifically, the effects unit 110 generates an effect panorama frame containing the effect and then merges the effect panorama frame with the original source panorama frame to generate an edited frame, as described in more detail below. This process is repeated for each frame in the 360 video.
The processing device 202 may include any custom-made or commercially available processor, a central processing unit (CPU) or an auxiliary processor among several processors associated with the computing device 102, a semiconductor-based microprocessor (in the form of a microchip), a macroprocessor, one or more application-specific integrated circuits (ASICs), a plurality of suitably configured digital logic gates, and other well-known electrical configurations comprising discrete elements, both individually and in various combinations, to coordinate the overall operation of the computing system.
The memory 214 can include any one of a combination of volatile memory elements (e.g., random-access memory (RAM), such as DRAM, SRAM, etc.) and nonvolatile memory elements (e.g., ROM, hard drive, tape, CD-ROM, etc.). The memory 214 typically comprises a native operating system 216, one or more native applications, emulation systems, or emulated applications for any of a variety of operating systems and/or emulated hardware platforms, emulated operating systems, etc. For example, the applications may include application-specific software which may comprise some or all of the components of the computing device 102 depicted in
Input/output interfaces 204 provide any number of interfaces for the input and output of data. For example, where the computing device 102 comprises a personal computer, these components may interface with one or more user input/output interfaces 204, which may comprise a keyboard or a mouse, as shown in
In the context of this disclosure, a non-transitory computer-readable storage medium stores programs for use by or in connection with an instruction execution system, apparatus, or device. More specific examples of a computer-readable storage medium may include, by way of example and without limitation: a portable computer diskette, a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM, EEPROM, or Flash memory), and a portable compact disc read-only memory (CD-ROM) (optical).
Reference is made to
Although the flowchart 300 of
To begin, in block 310, the effects unit 110 receives a 360 video in addition to an effect to be incorporated into the 360 video. The effect may include, but is not limited to, graphics (e.g., a sticker), text, an image, or a PiP effect to be inserted onto or around objects in the 360 video, and so on. In block 320, the effects unit 110 positions the effect on a spherical representation corresponding to the 360 video, where the position is specified by the user. In this implementation, the position specified by the user corresponds to a desired location within the 360 video (i.e., a target region). Thus, the effect may be positioned directly at the target location on the spherical representation of the 360 video. As described in more detail below, the user may specify the desired position on either the spherical representation or on a two-dimensional (2D) panorama frame. The user may also set the size of the target region.
In block 330, the effects unit 110 places the effect into the target region of the spherical representation. In block 340, the effects unit 110 then renders the effect by setting camera positions on opposite sides of the spherical representation with the desired effect applied to the model. Two half-sphere frames containing the effect are then generated. The projection model of the camera may be adjusted to obtain the entire content of the half-sphere frames.
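As one illustrative possibility (not mandated by the disclosure), the render in block 340 can be expressed as an inverse mapping: for each pixel of a half-sphere frame, compute the viewing direction and sample the effect layer. The sketch below assumes an equidistant fisheye projection with a 180-degree field of view and an effect layer stored in equirectangular form; both are assumptions made here for illustration:

```python
import numpy as np

def render_half_sphere(effect_equirect, n, front=True):
    """Render one 180-degree half-sphere view (equidistant fisheye model,
    assumed for illustration) of an equirectangular effect layer.
    The front camera looks down +z; the back camera is yawed 180 degrees."""
    j, i = np.meshgrid(np.arange(n), np.arange(n))
    u = (j / (n - 1)) * 2.0 - 1.0            # -1 .. 1, left to right
    v = 1.0 - (i / (n - 1)) * 2.0            # 1 .. -1, top to bottom
    r = np.hypot(u, v)
    inside = r <= 1.0                        # pixels within the image circle
    theta = r * (np.pi / 2)                  # equidistant: r = 1 -> 90 degrees
    phi = np.arctan2(v, u)
    x = np.sin(theta) * np.cos(phi)          # viewing direction, camera space
    y = np.sin(theta) * np.sin(phi)
    z = np.cos(theta)
    if not front:                            # back camera: yaw by 180 degrees
        x, z = -x, -z
    lon = np.arctan2(x, z)
    lat = np.arcsin(np.clip(y, -1.0, 1.0))
    h, w = effect_equirect.shape[:2]
    px = np.clip(((lon / (2 * np.pi) + 0.5) * (w - 1)).astype(int), 0, w - 1)
    py = np.clip(((0.5 - lat / np.pi) * (h - 1)).astype(int), 0, h - 1)
    out = np.zeros((n, n) + effect_equirect.shape[2:], dtype=effect_equirect.dtype)
    out[inside] = effect_equirect[py[inside], px[inside]]
    return out
```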
In block 350, an effect panorama frame containing the effect is generated by applying a stitching technique to merge the two half-sphere frames into a single panorama frame. This may comprise first warping the two half-sphere frames into two square frames and then stitching or merging the two square frames into a single effect panorama frame.
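A minimal sketch of this stitching step, under the same assumed conventions as the render above (equidistant fisheye half-sphere frames, front camera on +z, back camera yawed 180 degrees), using nearest-neighbour sampling:

```python
import numpy as np

def halfspheres_to_equirect(front, back, out_h):
    """Stitch two 180-degree half-sphere frames into one equirectangular
    effect panorama. Square fisheye frames and the camera orientation
    convention are assumptions for illustration."""
    out_w = 2 * out_h
    n = front.shape[0]
    j, i = np.meshgrid(np.arange(out_w), np.arange(out_h))
    lon = (j / out_w - 0.5) * 2 * np.pi
    lat = (0.5 - i / out_h) * np.pi
    x = np.cos(lat) * np.sin(lon)            # direction for each output pixel
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    out = np.zeros((out_h, out_w) + front.shape[2:], dtype=front.dtype)
    for img, mask, xc, zc in ((front, z >= 0, x, z), (back, z < 0, -x, -z)):
        xs, ys, zs = xc[mask], y[mask], zc[mask]
        theta = np.arccos(np.clip(zs, -1.0, 1.0))   # angle off the camera axis
        r = theta / (np.pi / 2)                     # equidistant: r = 1 at 90 deg
        s = np.hypot(xs, ys)
        s[s == 0] = 1.0                             # guard the pole (0/0)
        px = np.clip(((1 + r * xs / s) * 0.5 * (n - 1)).astype(int), 0, n - 1)
        py = np.clip(((1 - r * ys / s) * 0.5 * (n - 1)).astype(int), 0, n - 1)
        out[mask] = img[py, px]
    return out
```

This mapping is the inverse of the half-sphere render sketched above, so the two functions use consistent coordinate conventions.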
Additional details regarding various steps in the flowchart of
To further illustrate, reference is made to
Reference is made to
Although the flowchart 900 of
To begin, in block 910, the effects unit 110 receives a 360 video in addition to an effect to be incorporated into the 360 video. In block 920, the effects unit 110 obtains the desired location of the effect on a 2D view of the 360 video. In block 930, the obtained location is converted to a location of the effect on the spherical representation, and the effect is placed onto the surface of the spherical representation (block 940).
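For an equirectangular 2D view, the conversion in block 930 reduces to a linear mapping from pixel coordinates to longitude and latitude. A minimal sketch, using one common coordinate convention (the disclosure does not prescribe a particular formula):

```python
import math

def panorama_to_sphere(x, y, width, height):
    """Map pixel (x, y) on a width x height equirectangular frame to
    spherical coordinates (longitude, latitude) in radians."""
    lon = (x / width - 0.5) * 2.0 * math.pi    # -pi (left) .. +pi (right)
    lat = (0.5 - y / height) * math.pi         # +pi/2 (top) .. -pi/2 (bottom)
    return lon, lat

def sphere_to_point(lon, lat, radius=1.0):
    """Convert (longitude, latitude) to a 3-D point on the spherical model."""
    return (radius * math.cos(lat) * math.sin(lon),
            radius * math.sin(lat),
            radius * math.cos(lat) * math.cos(lon))
```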
In block 950, the effects unit 110 then renders the effect as two half-sphere frames and generates an effect panorama frame from the two half-sphere frames (block 960). In block 970, the effect panorama frame is then blended or merged with the original source panorama frame. Thereafter, the process in
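The blending in block 970 may be, for example, a per-pixel alpha composite of the effect panorama over the source panorama. A sketch assuming 8-bit RGB frames and an 8-bit alpha mask that is zero wherever the effect layer is empty:

```python
import numpy as np

def blend_panoramas(source_rgb, effect_rgb, effect_alpha):
    """Alpha-composite the effect panorama over the source panorama.
    source_rgb, effect_rgb: (H, W, 3) uint8; effect_alpha: (H, W) uint8."""
    a = effect_alpha[..., None].astype(np.float32) / 255.0
    out = (effect_rgb.astype(np.float32) * a
           + source_rgb.astype(np.float32) * (1.0 - a))
    return out.astype(np.uint8)
```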
Additional details for the various operations in
Next, 3D modeling and texture mapping techniques are applied for placing the desired effect onto the 3D spherical surface. Reference is made to
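To illustrate just the mapping involved, the simplified sketch below writes the effect directly into an empty equirectangular layer rather than texturing a 3D mesh, and assumes the target region lies fully inside the frame without crossing the panorama seam:

```python
import numpy as np

def rasterize_effect(effect_rgba, pano_w, pano_h,
                     lon_c, lat_c, lon_span, lat_span):
    """Place a flat RGBA effect into an empty equirectangular layer at a
    target region centered on (lon_c, lat_c) radians and spanning
    lon_span x lat_span radians. Nearest-neighbour resampling; the region
    is assumed not to cross the seam or the frame borders."""
    layer = np.zeros((pano_h, pano_w, 4), dtype=np.uint8)
    x0 = int(((lon_c - lon_span / 2) / (2 * np.pi) + 0.5) * pano_w)
    x1 = int(((lon_c + lon_span / 2) / (2 * np.pi) + 0.5) * pano_w)
    y0 = int((0.5 - (lat_c + lat_span / 2) / np.pi) * pano_h)
    y1 = int((0.5 - (lat_c - lat_span / 2) / np.pi) * pano_h)
    rh, rw = max(y1 - y0, 1), max(x1 - x0, 1)
    h, w = effect_rgba.shape[:2]
    ys = np.arange(rh) * h // rh          # nearest-neighbour row indices
    xs = np.arange(rw) * w // rw          # nearest-neighbour column indices
    layer[y0:y0 + rh, x0:x0 + rw] = effect_rgba[ys[:, None], xs[None, :]]
    return layer
```

Note that pasting directly in panorama space stretches the effect toward the poles; rendering the textured spherical model through virtual cameras, as described in this disclosure, avoids that distortion.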
As described above in connection with the flowchart of
In accordance with alternative embodiments, multiple rectilinear frames (rather than two half-sphere frames) are captured and stitched together. In accordance with such embodiments, the effect is applied to a desired region on the sphere, and the effect is then rendered as six rectilinear frames. An effect panorama frame is then generated from the six rectilinear frames. The effect panorama is then blended with the source panorama frame. To illustrate, reference is made to
After the six rectilinear views are captured, the views are then stitched or merged together to form a single panorama frame. Reference is made to
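A sketch of this stitching step, assuming six square 90-degree faces and the axis convention noted in the comments (one possibility among many):

```python
import numpy as np

def cubemap_to_equirect(faces, out_h):
    """Merge six 90-degree rectilinear views into one equirectangular
    panorama (nearest-neighbour). faces maps 'front', 'back', 'left',
    'right', 'top', 'bottom' to (N, N, ...) arrays; 'front' looks down +z,
    'right' down +x, 'top' down +y (an assumed convention)."""
    out_w = 2 * out_h
    n = faces['front'].shape[0]
    j, i = np.meshgrid(np.arange(out_w), np.arange(out_h))
    lon = (j / out_w - 0.5) * 2 * np.pi
    lat = (0.5 - i / out_h) * np.pi
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    ax, ay, az = np.abs(x), np.abs(y), np.abs(z)
    out = np.zeros((out_h, out_w) + faces['front'].shape[2:],
                   dtype=faces['front'].dtype)

    def paste(name, un, vn, dn, mask):
        # Project the masked directions onto the face plane: u = un/dn, v = vn/dn.
        u, v = un[mask] / dn[mask], vn[mask] / dn[mask]
        px = np.clip(((u + 1) * 0.5 * (n - 1)).astype(int), 0, n - 1)
        py = np.clip(((1 - v) * 0.5 * (n - 1)).astype(int), 0, n - 1)
        out[mask] = faces[name][py, px]

    x_dom = (ax >= ay) & (ax >= az)       # which axis dominates the direction
    y_dom = ~x_dom & (ay >= az)
    z_dom = ~x_dom & ~y_dom
    paste('right',  -z,  y, ax, x_dom & (x > 0))
    paste('left',    z,  y, ax, x_dom & (x <= 0))
    paste('top',     x, -z, ay, y_dom & (y > 0))
    paste('bottom',  x,  z, ay, y_dom & (y <= 0))
    paste('front',   x,  y, az, z_dom & (z > 0))
    paste('back',   -x,  y, az, z_dom & (z <= 0))
    return out
```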
It should be emphasized that the above-described embodiments of the present disclosure are merely possible examples of implementations set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.
This application claims priority to, and the benefit of, U.S. Provisional patent application entitled, “Systems and Methods for Adjusting Directional Audio in a 360 Video,” having Ser. No. 62/367,716, filed on Jul. 28, 2016, which is incorporated by reference in its entirety.
U.S. Patent Documents:

Number | Name | Date | Kind |
---|---|---|---|
7483590 | Nielsen et al. | Jan 2009 | B2 |
8554014 | Levy et al. | Oct 2013 | B2 |
8717412 | Linder et al. | May 2014 | B2 |
9071752 | Kuo et al. | Jun 2015 | B2 |
9124867 | Furumura et al. | Sep 2015 | B2 |
9135678 | Feng et al. | Sep 2015 | B2 |
9277122 | Imura et al. | Mar 2016 | B1 |
10127632 | Burke | Nov 2018 | B1 |
10127714 | Kvaalen | Nov 2018 | B1 |
20090021576 | Linder | Jan 2009 | A1 |
20100045773 | Ritchey | Feb 2010 | A1 |
20130044108 | Tanaka | Feb 2013 | A1 |
20130063550 | Ritchey | Mar 2013 | A1 |
20140267596 | Geerds | Sep 2014 | A1 |
20170336705 | Zhou | Nov 2017 | A1 |
20170339391 | Zhou | Nov 2017 | A1 |
20170366748 | Festa | Dec 2017 | A1 |
20180211443 | Abbas | Jul 2018 | A1 |
Foreign Patent Documents:

Number | Date | Country |
---|---|---|
2685707 | Jan 2014 | EP |
2013020143 | Feb 2013 | WO |
Other Publications:

- Niclas Bahn, "360VR Toolbox Public Beta Released," Sep. 12, 2015.
- HDR Panoramas with PTGui Pro, https://www.ptgui.com/hdrtutorial.html (printed Jul. 21, 2017).
- Spherical Panorama, https://www.ptgui.com/info/spherical_panorama.html (printed Jul. 21, 2017).
- HDR Workflow with Hugin, http://hugin.sourceforge.net/docs/manual/HDR_workflow_with_hugin.html (printed Jul. 21, 2017).
Prior Publication Data:

Number | Date | Country |
---|---|---|
20180033176 A1 | Feb 2018 | US |
Related U.S. Application Data:

Number | Date | Country |
---|---|---|
62367716 | Jul 2016 | US |