The present invention relates to stereoscopic rendering, and more particularly to adjusting a depth of displayed objects.
Traditionally, three-dimensional rendering has enabled the display of one or more objects to a user. For example, one or more objects may be displayed three-dimensionally to a user, utilizing a display. However, current techniques for displaying objects three-dimensionally have been associated with various limitations.
For example, current methods for displaying objects utilizing a display may display objects that are clipped by the edge of the display. This clipping may confuse viewers of the displayed objects and may also contribute to user eyestrain, headaches, etc.
There is thus a need for addressing these and/or other issues associated with the prior art.
A system, method, and computer program product are provided for adjusting a depth of displayed objects within a region of a display. In use, a display that displays one or more objects three-dimensionally is identified. Additionally, a region within the display is determined. Further, a depth of objects displayed within the region is adjusted.
Additionally, in one embodiment, the display may display the one or more objects three-dimensionally using one or more three-dimensional display techniques. For example, the display may display objects to one or more viewers utilizing stereoscopic imaging. In another embodiment, an additional viewing device (e.g., glasses, goggles, etc.) may be used in conjunction with the display in order to display the one or more objects three-dimensionally. In yet another embodiment, the one or more objects may be displayed as part of a scene.
Additionally, as shown in operation 104, a region within the display is determined. In one embodiment, the region may include a predetermined area of the display. For example, the region may include an area from a plurality of edges of the display to a location a predetermined distance from the edges of the display. In another example, the region may include an area at the outer perimeter of the display that has a predetermined width.
Further, in one embodiment, the region may include an outer perimeter of the display that has a width larger than its height. In another embodiment, the region may include an area at an outer perimeter of the display that has a height larger than its width. In yet another embodiment, the region within the display may be determined as a percentage of the display. In still another embodiment, the region may be determined based on one or more factors. For example, the region may be determined based on content that is displayed on the display. In another example, the region may be determined based on a range of depth in a scene displayed by the display.
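By way of illustration only, one possible determination of such a region is sketched below, assuming a pixel-based description of the display; the names (Display, PerimeterRegion, determineRegion, inPerimeterRegion) and the fraction parameters are hypothetical and are not prescribed by the embodiments above.

```cpp
// Illustrative (hypothetical) description of a display and its perimeter
// region; widths are in pixels and may be derived as a percentage of the
// display dimensions, as described above.
struct Display {
    int width;   // horizontal resolution in pixels
    int height;  // vertical resolution in pixels
};

struct PerimeterRegion {
    int sideWidth;        // width of the left/right sections
    int topBottomHeight;  // height of the top/bottom sections
};

// Determine the perimeter region as a percentage of the display size; the
// top/bottom sections may differ in size from the left/right sections.
PerimeterRegion determineRegion(const Display& d,
                                float sideFraction,
                                float topBottomFraction) {
    PerimeterRegion r;
    r.sideWidth       = static_cast<int>(d.width  * sideFraction);
    r.topBottomHeight = static_cast<int>(d.height * topBottomFraction);
    return r;
}

// Test whether a pixel position lies inside the perimeter region.
bool inPerimeterRegion(const Display& d, const PerimeterRegion& r,
                       int x, int y) {
    return x < r.sideWidth || x >= d.width - r.sideWidth ||
           y < r.topBottomHeight || y >= d.height - r.topBottomHeight;
}
```

For example, determineRegion(display, 0.1f, 0.1f) would mark the outer 10% of each dimension as the perimeter region.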
Further still, as shown in operation 106, a depth of objects displayed within the region is adjusted. In one embodiment, the depth of the objects may include a depth at which an object is perceived three-dimensionally within the display. In another embodiment, the depth of the objects may include a W value of the objects that are rendered within a scene shown utilizing the display.
Also, in one embodiment, adjusting the depth of the objects may include warping the objects. For example, adjusting the depth of the objects may include performing an artificial non-linear alteration of the depth of the objects. In another embodiment, the depth of the objects may be adjusted as the objects travel within the region. For example, the depth of one of the objects may be decreased with respect to a viewer's perspective of the object as the object travels from one part of the region to another part of the region (e.g., from an inner perimeter of the region to an edge of the display, etc.). In another example, the depth of one of the objects may be increased with respect to a viewer's perspective of the object as the object travels from one part of the region to another part of the region.
Additionally, in one embodiment, the depth of the objects may be adjusted automatically according to one or more algorithms. For example, one or more algorithms may perform scaled depth adjustments for the objects based on the location of the objects within the region. In another embodiment, the depth of the objects may be adjusted linearly as an object moves within the region. In yet another embodiment, the depth of the objects may be adjusted in a non-linear fashion. For example, the depth of the objects may be adjusted more heavily as the object approaches an edge of the display.
In this way, the depth of the objects may be gradually and artificially adjusted within areas of the display where eyestrain may occur, such that when three-dimensional viewing is performed, the objects may appear to be closer, depth-wise, to the display, thereby reducing eyestrain.
More illustrative information will now be set forth regarding various optional architectures and features with which the foregoing framework may or may not be implemented, per the desires of the user. It should be strongly noted that the following information is set forth for illustrative purposes and should not be construed as limiting in any manner. Any of the following features may be optionally incorporated with or without the exclusion of other features described.
As shown, a user 204 (e.g., a viewer, etc.) views one or more objects that are displayed three-dimensionally by a stereoscopic display 202. For example, the user 204 may view visual content (e.g., a movie, television show, video game, computer display, etc.) three-dimensionally through the display 202 utilizing stereoscopic rendering. Additionally, the objects displayed three-dimensionally to the user 204 at a central region 208 of the display 202 appear at a predetermined depth 206 to the user 204. For example, the objects may be displayed at the predetermined depth in a scene presented to the user 204.
Further, the display 202 includes perimeter regions 210A and 210B near the perimeter of the display 202. In one embodiment, an object displayed to the user 204 may move from the central region 208 of the display 202 into one of the perimeter regions 210A and 210B, and may exit the scene displayed by the display 202 at one of the edges 212A or 212B of the display 202.
In another embodiment, from the time the object enters one of the perimeter regions 210A and 210B until the object exits the scene at one of the edges 212A or 212B of the display 202, the depth at which the object is displayed on the display 202 may be reduced from the predetermined depth 206 to the depth of the display 202. In another embodiment, the depth level 214 of the display 202 may represent the depth of one or more displayed objects at various locations within the display 202. In another embodiment, the edges 212A or 212B of the display 202 may include the top and bottom edges of the display 202, the left and right side edges of the display 202, etc.
In one embodiment, adjusting the depth of the object may include reducing an in-screen depth of the object with respect to the display 202. For example, the object may be shifted toward the display 202 on a pixel-by-pixel basis. In another embodiment, the depth of the object may be automatically adjusted according to one or more algorithms.
For example, assuming a position Pn is a point of interest in one of the perimeter regions 210A and 210B, position P1 is at one of the edges 212A or 212B of the display 202, and P0 is at the start of the region of interest (e.g., where the central region 208 meets one of the perimeter regions 210A or 210B, etc.), the depth Dn may be calculated as shown in Table 1. Of course, it should be noted that the depth calculations shown in Table 1 are set forth for illustrative purposes only, and thus should not be construed as limiting in any manner.
Dn adjusted = Dn − |Dn − ZPD| * |(P0 − Pn)/(P0 − P1)| for Dn > ZPD; otherwise, no adjustment
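As a non-limiting illustration, the Table 1 calculation might be implemented as follows, where zpd denotes the zero-parallax depth at the surface of the display 202 and all function and parameter names are illustrative:

```cpp
#include <cmath>

// Illustrative implementation of the Table 1 adjustment. p0 is the position
// where the central region 208 meets the perimeter region, p1 is the position
// of the display edge 212A or 212B, and zpd is the depth at the surface of
// the display 202 (the zero-parallax depth).
float adjustDepth(float dn, float pn, float p0, float p1, float zpd) {
    if (dn <= zpd) {
        return dn;  // no adjustment for depths at or in front of the screen
    }
    float t = std::fabs((p0 - pn) / (p0 - p1));  // 0 at p0, 1 at p1
    return dn - std::fabs(dn - zpd) * t;         // reaches zpd at the edge
}
```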
In another embodiment, the depth calculation of Table 1 may simply scale the depth Dn of the vertex at point Pn by a percentage of the difference between the depth Dn and the depth at the surface (ZPD) of the display 202. In yet another embodiment, that percentage may be based on position, such that it is 0% at P0 and 100% at P1. In still another embodiment, the X or Y dimension may be selected based on whether the object is heading toward a side edge or the top edge. This adjustment is shown in Table 2. Of course, it should be noted that the calculations shown in Table 2 are set forth for illustrative purposes only, and thus should not be construed as limiting in any manner.
Dn adjusted = current Dn reduced by a percentage of the difference between current Dn and ZPD.
The adjustment illustrated in Table 2 may include a simplified linear adjustment to the depth of an object as viewed by the user 204 as the object approaches an edge 212A or 212B of the display 202. In another embodiment, a non-linear adjustment curve may be utilized instead of a linear adjustment. This may avoid sharp adjustments in the central viewing area close to the central region 208 and may make larger adjustments closer to the edges 212A and 212B, which may lie in the periphery of the user's 204 field of view. In still another embodiment, the non-linear adjustment curve may be enabled by changing the percentage factor in the equation in Table 2 to an X-squared (X^2)-like factor, which may result in the desired non-linear adjustment. In this way, the object depth adjustment may become more extreme as the object approaches one of the edges 212A or 212B of the display 202.
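Under the same illustrative assumptions as the sketch above, one possible non-linear variant simply squares the interpolation factor:

```cpp
#include <cmath>

// Illustrative non-linear variant: squaring the interpolation factor keeps
// adjustments small near the central region 208 and makes them progressively
// larger toward the edges 212A and 212B.
float adjustDepthNonLinear(float dn, float pn, float p0, float p1, float zpd) {
    if (dn <= zpd) {
        return dn;
    }
    float t = std::fabs((p0 - pn) / (p0 - p1));
    return dn - std::fabs(dn - zpd) * (t * t);  // X^2-like falloff
}
```

Because t * t < t for 0 < t < 1, the adjustment stays small near the inner perimeter of the region and grows rapidly near the edge.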
In another embodiment, adjusting the depth of the object as it moves across a scene presented by the display 202 may include calculating one or more statistics associated with the object. For example, a mechanism may be created in hardware to keep statistics associated with the depth of one or more objects at various places within the display 202. See, for example, U.S. patent application Ser. No. 12/574,527, filed Oct. 6, 2009, which is hereby incorporated by reference in its entirety, and which describes methods for calculating statistics associated with a surface to be rendered utilizing a graphics processor.
As shown, the display 300 includes a central display region 302 as well as a perimeter display region 304. In one embodiment, the display 300 may convey one or more objects three-dimensionally at one or more particular depths to a user as part of a scene. Additionally, in another embodiment, one or more of the objects may move from within the central display region 302 toward an edge 308A-D of the display 300, and may therefore enter the perimeter display region 304.
Further, in one embodiment, a depth of the object may be adjusted from the time the object enters the perimeter display region 304 until the time the object exits the display 300 at the edge 308A-D. For example, the depth at which the object is displayed to a viewer may be reduced incrementally from the time the object enters the perimeter display region 304 until the time the object exits the display 300 at the edge 308, such that the depth of the object is adjusted a small amount when the object crosses the inner perimeter 306A-D of the perimeter display region 304 and the depth of the object is adjusted a greater amount as the object approaches the edge 308A-D of the display 300.
Further still, in one embodiment, an object may be displayed at the depth of the display 300 when the object reaches the edge 308A-D of the display 300. In another embodiment, the size of the perimeter display region 304 may be variable. For example, the size of the perimeter display region 304 may be based on the type of content in the scene displayed by the display 300.
In another example, the size of the perimeter display region 304 may be based on a range of depths in the scene. In another embodiment, the depth of an object may be artificially increased going toward the edge 308A-D of the display 300. In yet another embodiment, the objects may be out-of-screen with respect to the display 300 rather than in-screen. In still another embodiment, the depth of an object may be decreased at a different rate at the top and bottom sections of the perimeter display region 304 compared to the left and right sections of the perimeter display region 304.
Additionally, in another embodiment, the top and bottom sections of the perimeter display region 304 may be a different size (e.g., may have a different area, etc.) than the left and right sections of the perimeter display region 304. In this way, the dimensions of the perimeter display region 304 may be adjusted.
Also, in one embodiment, adjusting the depth of one or more objects within the perimeter display region 304 of the display 300 may be performed automatically by adjusting a depth (W) value of the objects within the perimeter display region 304 of the display 300 in direct proportion to the distance of each object from the edge 308A-D of the display 300. For example, artificial non-linear warping of the depth of objects within one or more sides of the perimeter display region 304 may be performed. In another embodiment, the depth adjustments may produce a nonlinear three-dimensional (3D) effect.
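One possible form of such an automatic adjustment is sketched below, assuming normalized [0,1] screen coordinates and separate section sizes for the left/right and top/bottom portions of the perimeter display region 304; all names and parameters are illustrative rather than part of the described embodiments:

```cpp
#include <algorithm>

// Illustrative blend factor: 0 inside the central display region 302, rising
// to 1 at the nearest edge 308A-D. Separate widths for the left/right and
// top/bottom sections allow the depth to change at different rates per section.
float edgeBlendFactor(float x, float y,        // vertex position in [0,1]
                      float sideWidth,          // width of left/right sections
                      float topBottomHeight) {  // height of top/bottom sections
    float dx = std::min(x, 1.0f - x);           // distance to nearest side edge
    float dy = std::min(y, 1.0f - y);           // distance to nearest top/bottom edge
    float tx = 1.0f - std::min(dx / sideWidth, 1.0f);
    float ty = 1.0f - std::min(dy / topBottomHeight, 1.0f);
    return std::max(tx, ty);                    // take the stronger adjustment
}

// Pull the depth (W) value toward the zero-parallax depth by the blend factor.
float adjustW(float w, float zpd, float blend) {
    return w - (w - zpd) * blend;
}
```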
In this way, only the depth of objects within the perimeter display region 304 of the display 300 may be adjusted, as eyestrain may occur only within that region. In one embodiment, the objects residing in the perimeter display region 304 may have their depth artificially changed so that when stereopsis is applied, those objects may appear to be closer, depth-wise, to the display 300, which may reduce eyestrain.
Additionally, in one embodiment, scenes including the objects may be further brought out with respect to the display 300 while still remaining comfortable for a user to view. In yet another embodiment, any distortion of the objects resulting from the depth adjustment may be minimized because such adjustments occur in the peripheral vision of the user, and the offset of the objects may make them more comfortable to look at.
The system 400 also includes a graphics processor 406 and a display 408, i.e., a computer monitor. In one embodiment, the graphics processor 406 may include a plurality of shader modules, a rasterization module, etc. Each of the foregoing modules may even be situated on a single semiconductor platform to form a graphics processing unit (GPU).
In the present description, a single semiconductor platform may refer to a sole unitary semiconductor-based integrated circuit or chip. It should be noted that the term single semiconductor platform may also refer to multi-chip modules with increased connectivity which simulate on-chip operation, and make substantial improvements over utilizing a conventional central processing unit (CPU) and bus implementation. Of course, the various modules may also be situated separately or in various combinations of semiconductor platforms per the desires of the user.
The system 400 may also include a secondary storage 410. The secondary storage 410 includes, for example, a hard disk drive and/or a removable storage drive, representing a floppy disk drive, a magnetic tape drive, a compact disk drive, etc. The removable storage drive reads from and/or writes to a removable storage unit in a well-known manner.
Computer programs, or computer control logic algorithms, may be stored in the main memory 404 and/or the secondary storage 410. Such computer programs, when executed, enable the system 400 to perform various functions. Memory 404, storage 410 and/or any other storage are possible examples of computer-readable media.
In one embodiment, the architecture and/or functionality of the various previous figures may be implemented in the context of the host processor 401, the graphics processor 406, an integrated circuit (not shown) that is capable of at least a portion of the capabilities of both the host processor 401 and the graphics processor 406, a chipset (i.e., a group of integrated circuits designed to work and be sold as a unit for performing related functions, etc.), and/or any other integrated circuit for that matter.
Still yet, the architecture and/or functionality of the various previous figures may be implemented in the context of a general computer system, a circuit board system, a game console system dedicated for entertainment purposes, an application-specific system, and/or any other desired system. For example, the system 400 may take the form of a desktop computer, laptop computer, and/or any other type of logic. Still yet, the system 400 may take the form of various other devices including, but not limited to, a personal digital assistant (PDA) device, a mobile phone device, a television, etc.
Further, while not shown, the system 400 may be coupled to a network (e.g. a telecommunications network, local area network (LAN), wireless network, wide area network (WAN) such as the Internet, peer-to-peer network, cable network, etc.) for communication purposes.
While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of a preferred embodiment should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.