The present disclosure generally relates to systems and methods for performing distributed playback of 360-degree video in a plurality of viewing windows.
As smartphones and other mobile devices have become ubiquitous, people have the ability to capture video virtually anytime. Furthermore, 360-degree videos have gained increasing popularity.
Systems and methods for performing distributed playback of 360-degree video in a plurality of viewing windows are disclosed. In a first embodiment, a computing device receives a 360-degree video bitstream. The computing device also receives a field of view angle for a main viewing window from a user. A user interface comprising the main viewing window and a plurality of peripheral viewing windows is generated, where the plurality of peripheral viewing windows each have a corresponding field of view angle. The computing device executes distributed playback of the 360-degree video in the main viewing window and the plurality of peripheral viewing windows based on the field of view angles of the main viewing window and the plurality of peripheral viewing windows.
Another embodiment is a system that comprises a memory device storing instructions and a processor coupled to the memory device. The processor is configured by the instructions to receive a 360-degree video bitstream and receive a field of view angle for a main viewing window from a user. The processor is further configured to generate a user interface comprising the main viewing window and a plurality of peripheral viewing windows, the plurality of peripheral viewing windows each having a corresponding field of view angle. The processor is further configured to execute distributed playback of the 360-degree video in the main viewing window and the plurality of peripheral viewing windows based on the field of view angles of the main viewing window and the plurality of peripheral viewing windows.
Another embodiment is a non-transitory computer-readable storage medium storing instructions to be implemented by a computing device having a processor. The instructions, when executed by the processor, cause the computing device to receive a 360-degree video bitstream and receive a field of view angle for a main viewing window from a user. The instructions further cause the computing device to generate a user interface comprising the main viewing window and a plurality of peripheral viewing windows, the plurality of peripheral viewing windows each having a corresponding field of view angle. The instructions further cause the computing device to execute distributed playback of the 360-degree video in the main viewing window and the plurality of peripheral viewing windows based on the field of view angles of the main viewing window and the plurality of peripheral viewing windows.
Various aspects of the disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
An increasing number of digital capture devices are capable of recording 360 degree video (hereinafter “360-degree video”). The creation of 360-degree video generally involves capturing a full 360 degree view using multiple cameras, stitching the captured views together, and encoding the video. However, when viewing 360-degree video, users are typically limited to viewing content in the field of view angle associated with the playback window and then navigating the 360-degree video using a dial component or other user interface component.
Various embodiments are disclosed for providing users with a plurality of viewing windows for playback of a 360-degree video where the plurality of viewing windows provide the user with an improved way of viewing the 360-degree video, thereby providing the user with a fully immersive experience. Specifically, various embodiments provide an improved mechanism for viewing 360-degree video by providing a virtual rear-view mirror effect whereby the user can view content straight ahead while at the same time viewing content behind the user without panning rearward. The combined field of view angles provided by the plurality of viewing windows may collectively provide a combined field of view of less than 360 degrees or greater than 360 degrees, as described in more detail below. For instances where the combined field of view angles is greater than 360 degrees, the fields of view of two or more of the viewing windows at least partially overlap. In accordance with exemplary embodiments, the content displayed in the main viewing window and in the peripheral windows originates from a common source (e.g., a 360-degree camera).
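The combined-coverage cases above can be illustrated with a short sketch. The function and variable names below are illustrative assumptions, not terms from the disclosure: each window is modeled as a (center azimuth, field of view angle) pair in degrees, and two windows partially overlap when the angular distance between their centers is less than the sum of their half-angles.

```python
def circular_distance(a_deg, b_deg):
    """Shortest angular distance between two azimuths, in degrees."""
    d = abs(a_deg - b_deg) % 360.0
    return min(d, 360.0 - d)

def combined_fov(windows):
    """Sum of the per-window field of view angles.

    A sum greater than 360 implies that at least two windows'
    fields of view partially overlap; a sum less than 360 leaves
    part of the full 360-degree view uncovered.
    """
    return sum(fov for _, fov in windows)

def windows_overlap(w1, w2):
    """True if two (center_azimuth, fov) windows partially overlap."""
    (c1, f1), (c2, f2) = w1, w2
    return circular_distance(c1, c2) < (f1 + f2) / 2.0

# Example: a 180-degree main window flanked by two 90-degree
# peripheral windows tiles the full 360 degrees with no overlap.
main, left, right = (0.0, 180.0), (225.0, 90.0), (135.0, 90.0)
```

A combined angle of exactly 360 degrees, as in this example, corresponds to the windows tiling the full view with neither overlap nor gaps.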
To capture 360-degree video, multiple cameras are typically placed in the same location to obtain separate bitstreams from different points of view where the bitstreams are then stitched together to form a 360-degree video. The different bitstreams correspond to the same 360-degree video content. Note that this is in contrast to other configurations where different cameras are positioned at various locations and configured to capture digital content independently of the other cameras. In accordance with various embodiments, when the user adjusts the field of view associated with a first viewing window (e.g., a main viewing window), the field of view of each of the remaining windows is adjusted accordingly. Furthermore, in accordance with various embodiments, the user is able to specify a field of view angle for each viewing window for purposes of playback distribution among the plurality of viewing windows.
A system for implementing the distributed playback techniques disclosed herein is now described, followed by a discussion of the operation of the components within the system.
For some embodiments, the computing device 102 may be equipped with a plurality of cameras where the cameras are utilized to directly capture digital media content comprising 360-degree views. In accordance with such embodiments, the computing device 102 further comprises a stitching module (not shown) configured to stitch the captured 360-degree views together. Alternatively, the computing device 102 may obtain 360-degree video from other digital recording devices. For example, the computing device 102 may also be configured to access one or more content sharing websites hosted on a server via a network to retrieve digital media content.
As one of ordinary skill will appreciate, the digital media content may be encoded in any of a number of formats including, but not limited to, Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-4, H.264, H.265, Third Generation Partnership Project (3GPP), 3GPP-2, Standard-Definition Video (SD-Video), High-Definition Video (HD-Video), Digital Versatile Disc multimedia, Digital Television Video/High-definition Digital Television (DTV/HDTV) multimedia, Audio Video Interleave (AVI), Digital Video (DV), QuickTime (QT), Windows Media Video (WMV), Advanced System Format (ASF), Real Media (RM), Flash Media (FLV), Full HD, Ultra HD, 8K, or any number of other digital formats.
The computing device 102 includes a video processor 104 for generating a plurality of viewing windows and for executing distributed playback of a 360-degree video using the plurality of viewing windows. The video processor 104 includes a video decoder 106 configured to process an incoming 360-degree video 120 stored in a data store 122 of the computing device 102. The video processor 104 further comprises a user interface (UI) component 108 configured to render the plurality of viewing windows for display. The UI component 108 is further configured to obtain user input comprising a field of view angle for a main viewing window and/or each of a plurality of peripheral viewing windows. For some embodiments, the field of view angle of the main viewing window may be specified as a ratio relating to the field of view angles for each of the main viewing window and the peripheral viewing windows, where distribution of playback of the 360-degree video is executed based on the specified ratio. For example, the user may elect to have a field of view angle of 180 degrees for the main viewing window while two peripheral windows each have a field of view angle equal to half the field of view angle of the main viewing window, thereby providing a full 360 degree view.
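The ratio-based distribution just described can be sketched as follows. The function name and dictionary keys are illustrative assumptions for this sketch, not part of the disclosure: each ratio expresses a peripheral window's field of view angle as a fraction of the main window's angle.

```python
def distribute_fov(main_fov_deg, peripheral_ratios):
    """Derive each peripheral window's field of view angle from the
    main window's angle and the user-specified ratios (each ratio is
    the peripheral angle expressed as a fraction of the main angle).
    """
    peripherals = [main_fov_deg * r for r in peripheral_ratios]
    return {
        "main": main_fov_deg,
        "peripherals": peripherals,
        "combined": main_fov_deg + sum(peripherals),
    }

# Example from the text: a 180-degree main window and two peripheral
# windows at half that angle, yielding a full 360-degree view.
layout = distribute_fov(180.0, [0.5, 0.5])
```

Keeping the ratios, rather than the absolute angles, as the stored user preference is what allows a later change to the main window's angle to propagate automatically to the peripheral windows.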
The video processor 104 further comprises a navigation unit 110 for receiving navigation data generated by the user. In some embodiments, the UI component 108 may generate a user interface whereby the user is provided with a dial component for panning to the right, left, up, down, etc. The user generates navigation data using the dial component in the user interface. Based on the generated navigation data and specified field of view angles for the various viewing windows, the view generator 112 in the video processor 104 distributes the decoded 360-degree video bitstream to each of the main viewing window and peripheral viewing windows, wherein the content in each of the windows is displayed simultaneously. The video bitstream output by the view generator 112 is received by an audio/video (A/V) synchronizer 114 in the computing device 102. The A/V synchronizer 114 synchronizes the video content with audio content generated by the audio decoder 116 and outputs the content to an A/V output device 118.
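One simplified way to picture how the view generator 112 might distribute a decoded frame among the windows is to map each window's view direction and field of view angle onto pixel columns of an equirectangular frame. This is a sketch under stated assumptions (real viewport rendering involves a spherical reprojection, and the names below are illustrative, not from the disclosure):

```python
def viewport_columns(frame_width, yaw_deg, fov_deg):
    """Return the horizontal pixel columns of an equirectangular
    frame covered by a window looking toward yaw_deg with the given
    field of view angle; the range wraps around the frame edge."""
    deg_per_px = 360.0 / frame_width
    start_deg = (yaw_deg - fov_deg / 2.0) % 360.0
    first = int(start_deg / deg_per_px)
    count = int(fov_deg / deg_per_px)
    return [(first + i) % frame_width for i in range(count)]

# A 90-degree window looking straight ahead (yaw 0) on a
# 360-pixel-wide frame wraps around the frame's seam.
cols = viewport_columns(360, 0.0, 90.0)
```

Because all windows index into the same decoded frame, the per-window content stays synchronized by construction, with the A/V synchronizer handling alignment against the audio stream.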
The processing device 202 may include any custom made or commercially available processor, a central processing unit (CPU) or an auxiliary processor among several processors associated with the computing device 102, a semiconductor-based microprocessor (in the form of a microchip), a macroprocessor, one or more application-specific integrated circuits (ASICs), a plurality of suitably configured digital logic gates, and other well-known electrical configurations comprising discrete elements, both individually and in various combinations, to coordinate the overall operation of the computing system.
The memory 214 can include any one of a combination of volatile memory elements (e.g., random-access memory (RAM), such as DRAM, SRAM, etc.) and nonvolatile memory elements (e.g., ROM, hard drive, tape, CD-ROM, etc.). The memory 214 typically comprises a native operating system 216, one or more native applications, emulation systems, or emulated applications for any of a variety of operating systems and/or emulated hardware platforms, emulated operating systems, etc. For example, the applications may include application-specific software which may comprise some or all of the components of the computing device 102 depicted in
Input/output interfaces 218 provide any number of interfaces for the input and output of data. For example, where the computing device 102 comprises a personal computer, these components may interface with one or more user input/output interfaces, which may comprise a keyboard or a mouse, as shown in
In the context of this disclosure, a non-transitory computer-readable medium stores programs for use by or in connection with an instruction execution system, apparatus, or device. More specific examples of a computer-readable medium may include by way of example and without limitation: a portable computer diskette, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM, EEPROM, or Flash memory), and a portable compact disc read-only memory (CDROM) (optical).
Reference is made to
Although the flowchart of
To begin, in block 310, a 360-degree video bitstream is received by the computing device 102.
In block 340, the computing device executes distributed playback of the 360-degree video in the main viewing window and the plurality of peripheral viewing windows based on the field of view angles of the main viewing window and the plurality of peripheral viewing windows. In some embodiments, a modified field of view angle for the main viewing window is received by the computing device 102, where the modified field of view angle corresponds to a change in the field of view in either the horizontal direction or in the vertical direction. When a modified field of view angle is received by the computing device 102, the field of view angles for the peripheral viewing windows are automatically adjusted based on the modified field of view angle and the ratio previously specified by the user. Thereafter, the process in
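The automatic adjustment described for block 340 can be sketched with a small stateful helper. The class and attribute names are illustrative assumptions; the point is that the peripheral angles are derived from the stored ratios whenever the main angle changes, rather than being stored independently.

```python
class WindowLayout:
    """Tracks the main window's field of view angle and the
    user-specified ratios for the peripheral windows."""

    def __init__(self, main_fov_deg, peripheral_ratios):
        self.main_fov = main_fov_deg
        self.ratios = list(peripheral_ratios)

    def set_main_fov(self, new_fov_deg):
        """Apply a modified main field of view angle; the peripheral
        angles follow automatically via the previously specified
        ratios, as described above."""
        self.main_fov = new_fov_deg

    @property
    def peripheral_fovs(self):
        """Peripheral angles recomputed from the current main angle."""
        return [self.main_fov * r for r in self.ratios]

layout = WindowLayout(180.0, [0.5, 0.5])
layout.set_main_fov(120.0)   # user narrows the main window's view
```

The same recomputation applies whether the modification is in the horizontal or the vertical direction, provided a ratio is tracked per direction.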
Having generally described an exemplary process for providing distributed playback of a 360-degree video in a plurality of viewing windows, additional details regarding various steps and concepts in the flowchart of
As shown in
In some embodiments, a modified field of view angle for one of the peripheral windows is received in either the vertical direction or in the horizontal direction. In response to receiving the modified field of view angle in the vertical direction, the field of view angles for the remaining peripheral windows are automatically adjusted in the vertical direction based on the modified field of view angle and the ratio. In accordance with such embodiments, the field of view angle for the main viewing window remains fixed.
Generally, the main viewing window provides the user with a front view while the peripheral windows allow the user to view content to the side or behind the user. For some embodiments, a modified field of view angle for one of the peripheral windows is received in either the vertical direction or in the horizontal direction. When the user modifies the right peripheral viewing window in either the vertical direction or in the horizontal direction, the field of view angles for the main viewing window and left peripheral viewing window remain fixed. When the user modifies the left peripheral viewing window in either the vertical direction or in the horizontal direction, the field of view angles for the main viewing window and right peripheral viewing window remain fixed.
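The fixed-window rule just described (only the modified peripheral window changes; the main window and the other peripheral window keep their angles) might look like the following sketch. The function name and window labels are illustrative assumptions:

```python
def modify_peripheral(fov_angles, window_name, new_fov_deg):
    """Return updated per-window field of view angles. Only the
    modified peripheral window changes; the main viewing window and
    the other peripheral window remain fixed."""
    if window_name not in ("left", "right"):
        raise ValueError("only peripheral windows may be modified here")
    updated = dict(fov_angles)  # leave the caller's mapping untouched
    updated[window_name] = new_fov_deg
    return updated

angles = {"main": 180.0, "left": 90.0, "right": 90.0}
adjusted = modify_peripheral(angles, "right", 60.0)
```

Returning a new mapping rather than mutating in place makes it easy to compare the layouts before and after a user modification.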
As shown in
In accordance with some embodiments, the field of view angles for the viewing windows are automatically adjusted in either the horizontal direction or in the vertical direction based on user input, where the user input may comprise, for example, manipulation of a user interface component, a gesture performed on a touchscreen, and so on. For example, the user input may cause the viewer's various vantage points of the 360-degree video to rotate by 120 degrees. For example, user input (e.g., a tap, an indicator, or pressing of an arrow key) received in the display area of the left peripheral window may cause the 360-degree video displayed in the main viewing window and the right peripheral window to rotate by 120 degrees. As another example, user input (e.g., a tap, an indicator, or pressing of an arrow key) received in the display area of the main viewing window may cause the 360-degree video displayed in the left peripheral window and the right peripheral window to rotate by 120 degrees. As another example, user input (e.g., a tap, an indicator, or pressing of an arrow key) received in the display area may cause the 360-degree video displayed in the main viewing window, the left peripheral window, and the right peripheral window to rotate by 120 degrees.
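The 120-degree rotation examples above can be sketched as a single helper that rotates the view directions of a chosen set of windows while the remaining windows keep their directions. All names here are illustrative assumptions:

```python
def rotate_views(view_centers, target_windows, delta_deg=120.0):
    """Rotate the view direction of each named window by delta_deg
    (wrapping at 360 degrees); other windows keep their direction."""
    return {
        name: (center + delta_deg) % 360.0 if name in target_windows else center
        for name, center in view_centers.items()
    }

centers = {"main": 0.0, "left": 240.0, "right": 120.0}
# Tap in the left peripheral window: rotate the main and right views.
after_tap = rotate_views(centers, {"main", "right"})
```

With three windows spaced 120 degrees apart, each such rotation effectively cycles the scene content among the windows.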
In accordance with some embodiments, the content displayed in the main viewing window may be switched with the content displayed in a peripheral window in response to user input (e.g., manipulation of a user interface component, a gesture performed on a touchscreen interface). For example, as shown in
In accordance with some embodiments, the system is configured to detect the presence of an object of interest (e.g., a vehicle, face, or flower) in the content displayed in one of the peripheral windows. When the system detects the presence of the object of interest, the system switches the content displayed in the peripheral window to the main viewing window upon receiving user input so that the object of interest is displayed in the main viewing window. The system may be further configured to automatically adjust the field of view such that the object of interest is centrally located within the main viewing window. In accordance with some embodiments, no user input is required for switching the content between the peripheral window and the main viewing window.
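A sketch of this object-of-interest behavior might look like the following. The detector is stubbed out as a precomputed mapping and all names are illustrative assumptions (a real system would run, e.g., a face or object detector on the decoded frames):

```python
def handle_detections(view_centers, detections, auto_switch=True):
    """view_centers: window name -> view-direction azimuth (degrees).
    detections: window name -> azimuth of a detected object of
    interest, or None if nothing was detected in that window.
    Returns the name of the peripheral window that triggered a
    switch, or None."""
    for name, obj_azimuth in detections.items():
        if name == "main" or obj_azimuth is None:
            continue
        if auto_switch:
            # Swap the peripheral window's content into the main window...
            view_centers["main"], view_centers[name] = (
                view_centers[name], view_centers["main"])
            # ...then recenter so the object of interest sits in the
            # middle of the main viewing window.
            view_centers["main"] = obj_azimuth % 360.0
        return name
    return None

centers = {"main": 0.0, "left": 240.0, "right": 120.0}
hit = handle_detections(centers, {"left": 245.0, "right": None})
```

Setting auto_switch to False models the embodiments in which the detection is reported but the actual switch waits for user input.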
It should be emphasized that the above-described embodiments of the present disclosure are merely possible examples of implementations set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.
This application claims priority to, and the benefit of, U.S. Provisional Patent Application entitled, “Systems and Methods for Performing Distributed Playback of 360-degree video in a Plurality of Viewing Windows,” having Ser. No. 62/399,517, filed on Sep. 26, 2016, which is incorporated by reference in its entirety.
Number | Date | Country | |
---|---|---|---|
20180091852 A1 | Mar 2018 | US |
Number | Date | Country | |
---|---|---|---|
62399517 | Sep 2016 | US |