The present disclosure relates to the field of information presentation technologies and, more particularly, relates to immersive information presentation.
For years, researchers have been striving to bring immersive experiences to home entertainment. Immersive experiences in which additional light sources (e.g., projected lights, back-light displays, and so on) surround a big screen (e.g., a TV display, monitor, and so on) have been extensively explored in recent years, based on the observation that a larger display with a wider field of view may deliver a more immersive and present experience to the user. Recent research has focused on how to extend the screen content in focus with surrounding effects or rich information to create immersive experiences. As shown in
Many types of illusions, such as edge effects, lighting changes, starfield effects, and so on, are created to increase the user's sense of presence. It is important to realize that the efforts mentioned above are based on the Focus+Context display concept. The surrounding contexts are used to enhance the focus display, and a single content source is considered for the focus display, which is either gaming or video streaming content displayed on a TV screen.
Nowadays, mobile devices such as smartphones and tablets are becoming more powerful in terms of computing and display; projectors are becoming smaller, quieter, and higher in resolution; TVs are becoming big-screen computers with easy access to the Internet and many video content providers (e.g., YouTube and Netflix); and the link capacity around individual devices has increased significantly. As a result, home entertainment systems are experiencing revolutionary changes. Many devices surround people in their daily lives and provide various ways to access, retrieve, and view content. People use different devices in different scenarios. For example, a user may enjoy a show on a big-screen TV at home, switch to a tablet when going to a room without a TV, and continue the experience on a smartphone when stepping out of the house. Such a continuous and seamless viewing experience requires collaboration among devices and intelligence in determining user intention.
The usage model of the main screen is transitioning from a single content source to support for multiple content sources. In 2014, Samsung released its Multi-Link Screen product, which lets users display up to four screens on a UHD TV, or two screens on a Full HD TV, simultaneously on one screen for a multi-viewing experience, so that the user can watch TV and YouTube videos, browse the web while watching TV, watch two different TV shows at the same time, or watch TV while using apps.
Overall, immersive entertainment is expected to be a collaborative effort among devices, users, and content. The disclosed method and system are directed to solving one or more problems in this area. It should be noted that, unless explicitly acknowledged, the above background information is part of the present disclosure and is not intended to be prior art.
One aspect of the present disclosure provides an information presentation method. The method includes: providing a plurality of displays comprising a main screen and at least one picture-in-picture (PiP) window, the main screen having a first displaying region for displaying a first content in the first displaying region, and the at least one PiP window displaying a second content in at least a portion of the first displaying region; determining an immersive impact region and a non-immersive impact region for each of the plurality of displays; providing a surrounding display around the plurality of displays for displaying a third content, the surrounding display being displayed on one or more overlapping immersive impact regions; calculating an immersive effect (IE) value for each of the plurality of displays; and displaying the third content on the surrounding display based on the IE values.
Another aspect of the present disclosure provides an information presentation system. The system includes: a plurality of displays comprising a main screen and at least one picture-in-picture (PiP) window, the main screen having a first displaying region for displaying a first content in the first displaying region, and the at least one PiP window displaying a second content in at least a portion of the first displaying region; and a surrounding display around the plurality of displays for displaying a third content; wherein the first content or the second content is displayed in a non-immersive impact region and the third content is displayed in an immersive impact region based on an immersive effect (IE) of the plurality of displays.
Other aspects of the present disclosure can be understood by those skilled in the art in light of the description, the claims, and the drawings of the present disclosure.
The following drawings are merely examples for illustrative purposes according to various disclosed embodiments and are not intended to limit the scope of the present disclosure.
Reference will now be made in detail to exemplary embodiments of the invention, which are illustrated in the accompanying drawings. Hereinafter, embodiments consistent with the disclosure will be described with reference to drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. It is apparent that the described embodiments are some but not all of the embodiments of the present invention. Based on the disclosed embodiment, persons of ordinary skill in the art may derive other embodiments consistent with the present disclosure, all of which are within the scope of the present invention.
In the present disclosure, a display refers to a display area with a source providing on-screen content. A display may be a main screen display or a Picture-in-Picture (PiP) window in a main display. A surrounding display (SD) refers to a display area with content derived from one or more displays. A display or an SD may be an LCD screen, a projected display, or any other form of display. In embodiments of the present disclosure, the image in the SD may be generated to enhance the visual effect of one or more displays. The immersive effect (IE) for a display refers to the light intensity distribution of the light-emitting source for the display or the SD.
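As a concrete illustration of these terms, the following minimal Python sketch models a display and a surrounding display as simple data records; the class and field names are illustrative assumptions rather than identifiers defined by this disclosure.

    from dataclasses import dataclass, field

    @dataclass
    class Display:
        """A display area with a content source providing on-screen content,
        e.g., the main screen (MS) or a Picture-in-Picture (PiP) window."""
        x: float      # top-left corner of the display area
        y: float
        w: float      # width of the display area
        h: float      # height of the display area
        source: str   # identifier of the content source (illustrative)

    @dataclass
    class SurroundingDisplay:
        """A display area (SD) whose content is derived from one or more displays,
        e.g., to enhance their visual effect with an immersive-effect image."""
        width: int
        height: int
        displays: list = field(default_factory=list)   # displays from which the SD content is derived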
The present disclosure provides a usage model of co-existing, multiple levels of displays with immersive experiences.
To consider more generic multiple-content-source scenarios, the main TV screen may contain multiple smaller windows at arbitrary locations with arbitrary window sizes, and each window may render content from a different source.
The present disclosure provides the usage model of co-existence of multiple levels of displays with immersive experiences. Referring to
In step S02, the information presentation method may further determine an immersive impact region and a non-immersive impact region for each of the plurality of displays. In step S03, the information presentation method may further provide a surrounding display located around the plurality of displays for displaying a third content, the surrounding display being displayed on one or more overlapping immersive impact regions.
For example, as shown in
The PiP Windows 603 may be expanded to fully cover the space of the Main Screen 601 when needed. For example, on some TV displays, the user may have the choice to split the rectangular Main Screen 601 space into four equally sized rectangular PiP Windows 603. The impact of the immersive effect from the PiP Windows 603 on the Surrounding Display 605 may be evaluated. Generally, the content on the Main Screen 601 may not be compromised by the extended effect of the PiP Windows 603 unless the Main Screen 601 is fully covered by the PiP Windows 603. In
As shown in
In the embodiments of the present disclosure, the immersive contribution from each display (MS 601 or PiP Window 603) to the SD 605 may follow the same principle. The immersive effect (IE), or the intensity contribution, may be degraded according to the distance from the light source. The immersive effect (IE) may be the light intensity distribution of the light emitted for the displays MS 601, PiP Window 603, or SD 605.
The embodiments of the present disclosure may denote (Bi, Ti) as the lower and upper bound thresholds of the immersive effect for the i-th display corresponding to the light degradation, where both thresholds are in the range of [0.0, 1.0]. The threshold pair may be set by users for each display so that they are able to control the overall immersive effect contributed from each content source. For example, if the pair is set to (0.0, 0.0), the immersive effect of this display is turned off. If Bi=Ti, then the degradation effect is turned off.
As shown in
Bi is a lower bound threshold of the immersive impact for the i-th display of the plurality of displays; Ti is an upper bound threshold of the immersive impact for the i-th display of the plurality of displays; both Bi and Ti are in the range of [0.0, 1.0]; (x, y) is a starting point of the display; (Px, Py) is a point inside the immersive impact region; (w, h) are the width and height of the display; and the immersive impact region of the display has a width of W and a height of H.
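Eq. (1) itself is not reproduced in this text, so the following Python sketch should be read only as one plausible instantiation of the described behavior: the immersive impact factor (IIF) of the i-th display at a pixel (Px, Py) starts at Ti on the display edge and degrades with distance toward Bi at the boundary of the W x H impact region, which is assumed here to be centered on the display. The function name and the linear fall-off are assumptions made for illustration.

    def immersive_impact_factor(px, py, x, y, w, h, W, H, Bi, Ti):
        """Illustrative stand-in for Eq. (1): degrade the immersive impact of a
        display with distance from the display toward the edge of its W x H
        immersive impact region, clamped between the thresholds Bi and Ti."""
        # Margins of the impact region around the display (assumed centered).
        margin_x = max((W - w) / 2.0, 1e-6)
        margin_y = max((H - h) / 2.0, 1e-6)
        # Horizontal and vertical distance of the pixel from the display rectangle.
        dx = max(x - px, 0.0, px - (x + w))
        dy = max(y - py, 0.0, py - (y + h))
        # Normalized distance: 0 on the display edge, 1 at the impact region boundary.
        d = min(max(dx / margin_x, dy / margin_y), 1.0)
        # Linear degradation from Ti down to Bi; if Bi equals Ti, degradation is off.
        return Ti - (Ti - Bi) * d

Under this formulation, setting Bi equal to Ti naturally turns the degradation off, and setting the pair to (0.0, 0.0) yields no immersive contribution, consistent with the behavior described above.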
Returning to
In most scenarios, the user may choose not to impose immersive impact from the PiP Windows 603 over the Main Screen 601; thus the surrounding display (SD) 605 areas are impacted by all other displays. Therefore, the rendering of the SD 605 is conducted by re-calculating each pixel, for example, (Px, Py) in
Step 801: Check the next display in the list of (n-1) displays to determine whether the pixel is inside the immersive impact region of that display (see example in
Step 802: Calculate the immersive impact factor (IIF) value using Eq. (1) and store the value.
Step 803: Obtain the intensity value of the generated immersive effect (IE), and store the value.
Step 804: If this is the last display, go to step 805, otherwise, go to step 801.
Step 805: Obtain the background value of the SD for this pixel (if available) and assign it to IEn; obtain the default IIF setting for the SD set by the user (if not available, set it to 0.0) and assign it to IIFn.
Step 806: Blend all the stored IIFj and IEj (j=1, . . . , n) with the following equation to achieve the intensity for this pixel (a sketch of one possible blending is given after these steps):
Step 807: End of the process.
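Eq. (2) is likewise not reproduced here; as a minimal sketch, the per-pixel procedure of steps 801 through 806 might be written as follows, assuming a normalized weighted average as the blend. The function name blend_pixel and the averaging rule are illustrative assumptions rather than the disclosure's actual equation.

    def blend_pixel(contributions, sd_background=None, sd_default_iif=0.0):
        """Blend the immersive contributions of all displays at one SD pixel.

        contributions holds (IIF_j, IE_j) pairs for every display whose
        immersive impact region contains this pixel (steps 801-803).
        Step 805: the SD background value (if available) joins the list as
        the n-th entry, weighted by the user's default IIF setting for the SD.
        Step 806: Eq. (2) is assumed here to be a normalized weighted average.
        """
        pairs = list(contributions)
        if sd_background is not None:
            pairs.append((sd_default_iif, sd_background))   # (IIF_n, IE_n)
        total_weight = sum(iif for iif, _ in pairs)
        if total_weight <= 0.0:
            # No display impacts this pixel: fall back to the SD background, if any.
            return sd_background if sd_background is not None else 0.0
        return sum(iif * ie for iif, ie in pairs) / total_weight

For example, a pixel impacted by the main screen with (IIF, IE) = (0.8, 200) and by a PiP window with (0.2, 80) would, under this assumed blend, receive an intensity of (0.8*200 + 0.2*80) / (0.8 + 0.2) = 176.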
In step 505, the information presentation method further displays the first content or the second content on the SD with different intensities based on the calculation of Equation (2) and the non-immersive impact region. The first content may be the image used to create the immersive effect for a display. The second content may be the on-screen content from the content source of the display. The boundary between the main screen and the surrounding display may be a blurred boundary, and the boundary between the at least one PiP window and the surrounding display may be a blurred boundary as well.
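The disclosure does not specify how the blurred boundary is produced; one simple possibility, shown below purely as an illustration, is to feather linearly between a display's pixel value and the SD's pixel value over a narrow band outside the display edge. The helper name and the band width feather_px are hypothetical.

    def feather_boundary(display_value, sd_value, dist_to_edge, feather_px=24):
        """Blend linearly from the display's pixel value to the SD's pixel value
        over a feather_px-wide band outside the display edge, so that the
        boundary between the display and the surrounding display appears blurred."""
        if dist_to_edge <= 0:
            return display_value   # inside the display: keep its own content
        t = min(dist_to_edge / float(feather_px), 1.0)
        return (1.0 - t) * display_value + t * sd_value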
Furthermore, the third content displayed on the surrounding display 605 may provide a surrounding context related to the first content (e.g., a TV program) for enhancing the first content displayed on the main screen 601. For example, a high-resolution display MS may be surrounded by a lower resolution projection screen SD to enhance immersive visual experiences. The users may perceive the scene-consistent, low-resolution color, light, and movement patterns projected into their peripheral vision as a seamless extension of the primary content. The surrounding context may be automatically generated in real time based on the first content.
In another embodiment of the present disclosure, the third content displayed on the surrounding display SD may comprise social network content, news or an article related to the first content, an instant messaging window, or a different-angle view of the first content.
Another embodiment of the present disclosure provides an information presentation system. As shown in
In the embodiments of the present disclosure, the immersive contribution from each display (MS 601 or PiP Window 603) to the SD 605 may follow the same principle. The immersive effect (IE), or the intensity contribution, may be degraded according to the distance from the light source. The embodiments of the present disclosure may denote (Bi, Ti) as the lower and upper bound thresholds of the immersive effect for the i-th display corresponding to the light degradation, where both thresholds are in the range of [0.0, 1.0]. The threshold pair may be set by users for each display so that they are able to control the overall immersive effect contributed from each content source. For example, if the pair is set to (0.0, 0.0), the immersive effect of this display is turned off. If Bi=Ti, then the degradation effect is turned off.
The embodiments of the present disclosure may denote (x,y) as the starting point of a display, and (w, h) as the width and height of the display. The immersive impact region 701 has a width of W and height of H, as shown in
Bi is a lower bound threshold of the immersive impact for the i-th display of the plurality of displays; Ti is an upper bound threshold of the immersive impact for the i-th display of the plurality of displays; both Bi and Ti are in the range of [0.0, 1.0]; (x, y) is a starting point of the display; (Px, Py) is a point inside the immersive impact region; (w, h) are the width and height of the display; and the immersive impact region of the display has a width of W and a height of H.
The information presentation system may further calculate the immersive effect (IE) of the plurality of displays. Before rendering, each display generates its immersive impact region, either by automatically generating the effect or by pre-designing the immersive effect with human interaction or input. In most scenarios, the user may choose not to impose the immersive impact from the PiP Windows 603 on the Main Screen 601; thus the surrounding display (SD) 605 areas are impacted by all other displays. When the main screen 601 or the at least one PiP window 603 is located inside the non-immersive impact region, the information presentation system may further determine an immersive impact factor (IIF) for the main screen or for the at least one PiP window based on an intensity value IE, a background value for the surrounding display that is assigned to the IE, and a default IIF setting defined by a user that is assigned to the IIF. The IIF and the IE of the plurality of displays may be combined to re-calculate the intensity of each pixel, for example, at location (Px, Py) in
Therefore, the rendering of the SD 605 may be conducted by re-calculating each pixel in the immersive impact region 701 with the following steps shown in
Step 801: Check the next display in the list of (n-1) displays to determine whether the pixel is inside the immersive impact region of that display (see example in
Step 802: Calculate the immersive impact factor (IIF) value using Eq. (1) for the display, and store the value.
Step 803: Obtain and store the intensity value of the generated immersive effect (IE) for the display.
Step 804: If this is the last display, go to step 805 below, otherwise, go to step 801.
Step 805: Obtain the background value of the SD at this pixel (if available) and assign it to IEn; fetch the default IIF setting for the SD set by the user (if not available, set it to 0.0) and assign it to IIFn.
Step 806: Blend all the stored IIFj and IEj (j=1, . . . n) using the following equation to determine the intensity for this pixel:
Step 807: End of the process.
The information presentation system may further display the first content or the second content on the SD with different intensities based on the calculation of Equation (2) and the non-immersive impact region. The first content may be the image used to create the immersive effect for a display. The second content may be the on-screen content from the content source of the display. The boundary between the main screen and the surrounding display may be a blurred boundary, and the boundary between the at least one PiP window and the surrounding display may be a blurred boundary as well.
Furthermore, the third content displayed on the surrounding display 605 may be a surrounding context related to the first content (e.g., a TV program) for enhancing the first content displayed on the main screen 601. For example, a high-resolution display MS may be surrounded by a lower resolution projection screen SD to enhance immersive visual experiences. The users may perceive the scene-consistent, low-resolution color, light, and movement patterns projected into their peripheral vision as a seamless extension of the primary content. The surrounding context may be automatically generated in real time based on the first content.
In another embodiment of the present disclosure, the third content displayed on the surrounding display SD may comprise social network content, news or an article related to the first content, an instant messaging window, or a different-angle view of the first content.
The present disclosure proposes a novel framework to enable the immersive experience under such complicated conditions of multiple displays and multiple content sources.
It is understood that the disclosed collaborative and scalable information presentation system is not limited to the sports-watching scenario. The disclosed systems and methods can also be applied to other information presentation scenarios, such as watching news or movies, playing video games, displaying exhibits, presenting technologies and business plans, etc. Further, the disclosed systems and methods can be applied to any devices with displays, such as smart phones, tablets, PCs, smart watches, and so on.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the claims.