A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever. The following notice applies to the software and data as described below and in the drawings hereto: Copyright © 2010-2012, Dean E. Wolf, All Rights Reserved.
The present disclosure relates to gaming environments. More particularly, the present disclosure relates to dynamic lighting and rendering techniques implemented in gaming environments.
Various aspects described or referenced herein are directed to different methods, systems, and computer program products for facilitating dynamic, real-time adjustment and rendering of lighting, shading, and/or other display characteristics associated with stationary virtual objects and/or dynamically movable virtual objects. In at least one embodiment, the dynamic rendering of pixels for a selected virtual object within a given scene may be performed in real-time using predefined light source influence criteria representing the amount of light intensity or light influence which each individual light source (of the scene) has on the pixel being rendered.
A first aspect is directed to different methods, systems, and computer program products for operating a gaming system. According to different embodiments, the gaming system may be operable to facilitate, enable, initiate, and/or perform one or more of the following operation(s), action(s), and/or feature(s) (or combinations thereof): real-time, dynamic adjustment of lighting characteristics of individual light sources within a rendered scene; real-time dynamic adjustment of rendered lighting characteristics of virtual object(s) (e.g., including both static objects and movable objects) within a rendered scene based on the dynamic adjustment of the lighting characteristics of individual light sources within the rendered scene; real-time dynamic rendering and shading of virtual object(s) (e.g., including both static objects and movable objects) and/or shadows within a rendered scene; real-time dynamic rendering and shading of virtual object(s) (e.g., including both static objects and movable objects) and/or shadows within a rendered scene based on the dynamic adjustment of lighting characteristics of individual light sources within the rendered scene; real-time dynamic rendering of lighting and shading characteristics of virtual objects and/or shadows within a given scene based on the lighting characteristics of individual light sources within that scene; real-time projection of properties of reflected light (radiosity) onto virtual objects within a given scene based on the lighting characteristics of individual light sources and objects within that scene; real-time adjustment of lighting intensity, color and falloff; calculation of Light Source Influence criteria for selected features (e.g., virtual objects, pixels, Light Influence Grid Points, object vertices, etc.) of a virtual scene. In at least one embodiment, the Light Source Influence criteria associated with a given pixel characterizes the amount of light intensity or light influence which each distinct, identified light source has on that particular pixel; dynamic calculation (e.g., during runtime) of rendered display characteristics for selected features (e.g., virtual objects, pixels, Light Influence Grid Points, object vertices, etc.) of a virtual scene using predetermined Light Source Influence criteria and using current lighting characteristics of light source(s) within the scene; enabling the rendered color and/or brightness characteristics of one or more pixels, vertices, Light Influence Grid Points and/or other points in space of the scene being rendered to dynamically change over one or more different time intervals, for example, by dynamically adjusting the lighting characteristics (e.g., RGB, brightness, falloff, etc.) of one or more light sources in the scene; dynamic runtime rendering of pixels for a selected virtual object within a given scene using predefined light source influence criteria representing an amount of light intensity or light influence which each individual light source (of the scene) has on the pixel being rendered.
A second aspect is directed to different methods, systems, and computer program products for operating a gaming system. According to different embodiments, the gaming system may be operable to facilitate, enable, initiate, and/or perform one or more of the following operation(s), action(s), and/or feature(s) (or combinations thereof): initiate a first active gaming session at the first gaming system; identify a first virtual scene to be rendered for display during the first active gaming session; identify a first virtual light source associated with the first virtual scene, the first virtual light source having associated therewith a first portion of lighting characteristics; dynamically set the first portion of lighting characteristics to be in accordance with a first set of values; dynamically render and display, during the first active gaming session and using the first portion of lighting characteristics, a first virtual object of the first scene, wherein the first rendered virtual object is displayed in accordance with a first set of rendered lighting characteristics relating to a first visual appearance of the first rendered virtual object; dynamically modify the first portion of lighting characteristics to be in accordance with a second set of values; dynamically adjust the visual appearance of the first rendered virtual object in response to the dynamic modification of the first portion of lighting characteristics, wherein the dynamic adjustment of the visual appearance of the first rendered virtual object includes dynamically calculating, using the second set of values, a modified set of rendered lighting characteristics relating to a modified visual appearance of the first rendered virtual object; identify a second virtual light source associated with the first virtual scene, the second virtual light source having associated therewith a second portion of lighting characteristics; dynamically render and display, during the first active gaming session and using the first portion of lighting characteristics and second portion of lighting characteristics, the first virtual object of the first scene, wherein the first rendered virtual object is displayed in accordance with a third set of rendered lighting characteristics relating to a third visual appearance of the first rendered virtual object; dynamically modify the second portion of lighting characteristics to be in accordance with a modified set of values; dynamically adjust the visual appearance of the first rendered virtual object in response to the dynamic modification of the second portion of lighting characteristics, wherein the dynamic adjustment of the visual appearance of the first rendered virtual object includes dynamically calculating, using the modified set of values, a second modified set of rendered lighting characteristics relating to a second modified visual appearance of the first rendered virtual object; identify a first pixel associated with the first virtual object; identify first light source influence criteria relating to the identified first pixel, wherein the first light source influence criteria characterizes a first amount of light intensity or light influence which the first virtual light source has over the first pixel; identify second light source influence criteria relating to the first pixel, wherein the second light source influence criteria characterizes a second amount of light intensity or light influence which the second virtual light source has over the first pixel; dynamically calculate, during rendering of the first virtual object and using the first and second light source influence criteria, a set of first pixel lighting characteristics relating to a visual appearance of the first pixel; dynamically calculate, during rendering of the first virtual object and using the first and second light source influence criteria, a set of first pixel lighting characteristics relating to a composite light intensity of the first pixel; identify a first current color profile associated with the first virtual light source; identify a second current color profile associated with the second virtual light source; dynamically determine, during rendering of the first virtual object, at least one color characteristic of the first pixel using the set of first pixel lighting characteristics and using the first and second current color profiles.
Various objects, features and advantages of the various aspects described or referenced herein will become apparent from the following descriptions of example embodiments thereof, which descriptions should be taken in conjunction with the accompanying drawings.
Various techniques will now be described in detail with reference to a few example embodiments thereof as illustrated in the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of one or more aspects and/or features described or referenced herein. It will be apparent, however, to one skilled in the art, that one or more aspects and/or features described or referenced herein may be practiced without some or all of these specific details. In other instances, well-known process steps and/or structures have not been described in detail in order to not obscure some of the aspects and/or features described or referenced herein.
One or more different inventions may be described in the present application. Further, for one or more of the invention(s) described herein, numerous embodiments may be described in this patent application, and are presented for illustrative purposes only. The described embodiments are not intended to be limiting in any sense. One or more of the invention(s) may be widely applicable to numerous embodiments, as is readily apparent from the disclosure.
These embodiments are described in sufficient detail to enable those skilled in the art to practice one or more of the invention(s), and it is to be understood that other embodiments may be utilized and that structural, logical, software, electrical and other changes may be made without departing from the scope of the one or more of the invention(s). Accordingly, those skilled in the art will recognize that the one or more of the invention(s) may be practiced with various modifications and alterations. Particular features of one or more of the invention(s) may be described with reference to one or more particular embodiments or figures that form a part of the present disclosure, and in which are shown, by way of illustration, specific embodiments of one or more of the invention(s). It should be understood, however, that such features are not limited to usage in the one or more particular embodiments or figures with reference to which they are described. The present disclosure is neither a literal description of all embodiments of one or more of the invention(s) nor a listing of features of one or more of the invention(s) that must be present in all embodiments.
Headings of sections provided in this patent application and the title of this patent application are for convenience only, and are not to be taken as limiting the disclosure in any way.
Devices that are in communication with each other need not be in continuous communication with each other, unless expressly specified otherwise. In addition, devices that are in communication with each other may communicate directly or indirectly through one or more intermediaries.
A description of an embodiment with several components in communication with each other does not imply that all such components are required. To the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of one or more of the invention(s).
Further, although process steps, method steps, algorithms or the like may be described in a sequential order, such processes, methods and algorithms may be configured to work in alternate orders. In other words, any sequence or order of steps that may be described in this patent application does not, in and of itself, indicate a requirement that the steps be performed in that order. The steps of described processes may be performed in any order practical. Further, some steps may be performed simultaneously despite being described or implied as occurring non-simultaneously (e.g., because one step is described after the other step). Moreover, the illustration of a process by its depiction in a drawing does not imply that the illustrated process is exclusive of other variations and modifications thereto, does not imply that the illustrated process or any of its steps are necessary to one or more of the invention(s), and does not imply that the illustrated process is preferred.
When a single device or article is described, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described (whether or not they cooperate), it will be readily apparent that a single device/article may be used in place of the more than one device or article.
The functionality and/or the features of a device may be alternatively embodied by one or more other devices that are not explicitly described as having such functionality/features. Thus, other embodiments of one or more of the invention(s) need not include the device itself.
Techniques and mechanisms described or referenced herein will sometimes be described in singular form for clarity. However, it should be noted that particular embodiments include multiple iterations of a technique or multiple instantiations of a mechanism unless noted otherwise.
Various aspects described or referenced herein are directed to different methods, systems, and computer program products for facilitating dynamic, real-time adjustment and rendering of lighting, shading, and/or other display characteristics associated with stationary virtual objects and/or dynamically movable virtual objects.
In at least one embodiment, each light source (L1, L2, L3) may have associated therewith a respective set of unique lighting characteristics, which, for example, may include, but are not limited to, one or more of the following (or combinations thereof):
In at least one embodiment, the RGB color model may be used to describe the lighting characteristics of a given light source by defining or indicating the respective amounts of red, green, and blue components which collectively make up the composite light emitted from that light source. For example, in one embodiment, the lighting characteristics of a given light source may be expressed as an RGB triplet (R,G,B), where the respective value of each component may vary from zero to a defined maximum value. If all the components are at zero the result is black; if all are at maximum, the result is the brightest representable white. According to different embodiments, these ranges may be quantified in several different ways, such as, for example, one or more of the following (or combinations thereof): from 0 to 1, with any fractional value in between; as a percentage, e.g., from 0% to 100%; as integer numbers in the range 0 to 255 (e.g., the range that a single 8-bit byte can offer); as integer numbers in the range 0 to 1023 (e.g., the range able to be represented using 10 bits); as integer numbers in the range 0 to 65535 (e.g., the range able to be represented using 16 bits); etc.
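For purposes of illustration, the following minimal sketch (expressed in Python, with hypothetical helper names) shows how a normalized RGB component might be converted among the quantized integer ranges described above; it is a sketch of one possible convention, not a required implementation:

    def quantize(component, bits):
        # Map a normalized RGB component (0.0-1.0) onto an integer range such as
        # 0-255 (8 bits), 0-1023 (10 bits), or 0-65535 (16 bits).
        max_value = (1 << bits) - 1
        return round(component * max_value)

    def normalize(value, bits):
        # Map a quantized integer component back onto the 0.0-1.0 range.
        max_value = (1 << bits) - 1
        return value / max_value

    # Example: a 75% red component expressed in three of the ranges above.
    assert quantize(0.75, 8) == 191       # 0 to 255
    assert quantize(0.75, 10) == 767      # 0 to 1023
    assert quantize(0.75, 16) == 49151    # 0 to 65535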
As illustrated in the example embodiment of
L1=[R1,G1,B1];
L2=[R2,G2,B2];
L3=[R3,G3,B3].
As illustrated in the example embodiment of
Several conventional computer graphic lighting techniques may be utilized for lighting virtual objects in a rendered image. For example, one popular lighting technique known as “per-pixel lighting” refers to a technique for lighting an image or scene that calculates illumination for each pixel on a rendered image. Many modern video game engineers prefer to implement lighting using per-pixel techniques to achieve increased detail and realism. However, per-pixel lighting is considered to be “costly” or “expensive” in terms of the computational resources required to calculate illumination on a per-pixel basis, which can often introduce latency issues with respect to real-time rendering of scenes. In particular, the lighting influence from each light in the scene must be calculated on a per-pixel basis. Color is then derived from the light influence multiplied by the light color. This approach is high quality but also very computationally intensive. Accordingly, in an effort to reduce such computational resource and latency issues, some video games employ a lighting technique referred to as “pre-baked” vertex lighting, in which the lighting characteristics at each vertex of a 3D model are pre-calculated based upon a composite of predetermined, static light sources within the scene, and the “pre-baked” lighting characteristics of the vertices are then interpolated over the model's outer surfaces to calculate the final per-pixel lighting characteristics. However, one significant limitation of the pre-baked vertex lighting technique is that it does not offer the ability to permit dynamic modification of the lighting characteristics of individual light sources within the scene. For example, referring to the example embodiment of
In contrast, the various dynamic lighting and rendering techniques described herein may provide functionality for supporting real-time, dynamic adjustment of the lighting characteristics of individual light sources within a rendered scene, and may provide functionality for supporting real-time dynamic adjustment of rendered lighting characteristics of virtual object(s) (e.g., including both static objects and movable objects) within a rendered scene based on the dynamic adjustment of the lighting characteristics of individual light sources within the rendered scene. Moreover, as described in greater detail herein, the dynamic lighting and rendering functionality disclosed herein may be implemented using vertex-based lighting techniques, pixel-based lighting techniques, and/or grid-based lighting techniques.
For purposes of illustration, an example embodiment of the dynamic lighting/rendering technique will now be described by way of example with reference to
Taking into account the various different types of environmental and lighting features, characteristics, properties, and objects (e.g., radiosity, stationary v. dynamic objects, shadows, static v. dynamic light sources, ambient occlusion, falloff, etc.) which may influence lighting within scene portion 700, numerous algorithmic and/or complex computations may be performed (e.g., in advance of real-time game play) to thereby generate a set (or matrix) of Composite Light Intensity value(s) for each vertex (V1-V5) of virtual object 710. In at least one embodiment, the Composite Light Intensity Value for a given vertex (e.g., vertex V1) may be dynamically calculated (e.g., during runtime) using a set of one or more separately defined Light Source Intensity values each representing an amount of light intensity or light influence which a given light source (e.g., L1, L2, L3) has on that particular vertex. Thus, for example, in the example scene 700 which includes three different light sources (L1, L2, L3), the Composite Light Intensity Value I(V1) for vertex V1 may be defined as a set of separate Light Source Intensity values such as:
I(V1)=I(L1@V1)+I(L2@V1)+I(L3@V1),
where:
I(L1@V1) represents the computed amount of light intensity or light influence which light source L1 has upon vertex V1 (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and shadows which affect Light Source L1's influence at vertex V1);
I(L2@V1) represents the computed amount of light intensity or light influence which light source L2 has upon vertex V1 (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and shadows which affect Light Source L2's influence at vertex V1); and
I(L3@V1) represents the computed amount of light intensity or light influence which light source L3 has upon vertex V1 (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and shadows which affect Light Source L3's influence at vertex V1).
Additionally, in at least one embodiment, the value I(L1@V1) (representing the amount of light intensity or light influence which light source L1 has upon vertex V1) may be expressed, for example, as
I(L1@V1)=I(R1@V1)+I(G1@V1)+I(B1@V1),
where:
I(R1@V1) represents the amount of light intensity or light influence which the red component (R1) of light source L1 has upon vertex V1;
I(G1@V1) represents the amount of light intensity or light influence which the green component (G1) of light source L1 has upon vertex V1; and
I(B1@V1) represents the amount of light intensity or light influence which the blue component (B1) of light source L1 has upon vertex V1.
Accordingly, in at least one embodiment, the Composite Light Intensity Value(s) for vertices V1, V2, V3, V4, V5 (
I(V1)=I(L1@V1)+I(L2@V1)+I(L3@V1);
I(V2)=I(L1@V2)+I(L2@V2)+I(L3@V2);
I(V3)=I(L1@V3)+I(L2@V3)+I(L3@V3);
I(V4)=I(L1@V4)+I(L2@V4)+I(L3@V4);
I(V5)=I(L1@V5)+I(L2@V5)+I(L3@V5);
where:
I(L1@V1)=I(R1@V1)+I(G1@V1)+I(B1@V1);
I(L2@V1)=I(R2@V1)+I(G2@V1)+I(B2@V1);
I(L3@V1)=I(R3@V1)+I(G3@V1)+I(B3@V1);
I(L1@V2)=I(R1@V2)+I(G1@V2)+I(B1@V2);
I(L2@V2)=I(R2@V2)+I(G2@V2)+I(B2@V2);
I(L3@V2)=I(R3@V2)+I(G3@V2)+I(B3@V2);
etc.
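For purposes of illustration, the following minimal sketch (expressed in Python, with hypothetical influence values, and simplified to one scalar influence per light source rather than separate red, green, and blue components) shows one possible data structure for the per-vertex Light Source Influence values tabulated above:

    # Precomputed Light Source Influence values: one row per vertex of virtual
    # object 710, one scalar per light source (hypothetical values).
    light_source_influence = {
        "V1": {"L1": 0.42, "L2": 0.10, "L3": 0.31},
        "V2": {"L1": 0.18, "L2": 0.55, "L3": 0.07},
        # ... one row each for V3, V4, V5, etc.
    }

    def composite_intensity(vertex):
        # I(Vn) = I(L1@Vn) + I(L2@Vn) + I(L3@Vn)
        return sum(light_source_influence[vertex].values())

    print(composite_intensity("V1"))  # I(V1)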
In at least one embodiment, the calculated Light Source Influence values may factor in various types of other lighting attributes and/or characteristics described and/or referenced herein. In at least one embodiment, the lighting characteristics relating to the color or hue of a given vertex may be dynamically determined at render time using the Light Source Influence values and the current lighting characteristics of the identified light source(s) of the scene. For example, in one embodiment, lighting characteristics relating to the color or hue of a given vertex may be calculated as the sum, over the light sources, of the influence of each light source on the vertex multiplied by that light source's color. In one embodiment, the light color may not affect the amount of influence which a given light source has on the vertex, but may only affect the resulting color.
Thus, for example, in at least one embodiment, the amount of light intensity or light influence which a given light source has over a pixel (or vertex or other point in space of the scene being rendered) may be pre-calculated and/or recorded as one or more static values, and this information (e.g., the Light Source Influence criteria) may be used to render and/or to dynamically determine the color and/or brightness characteristics of one or more pixels, vertices, Light Influence Grid Points and/or other points in space of the scene being rendered. Moreover, the dynamic lighting and rendering techniques described herein enable the rendered color and/or brightness characteristics of pixels/objects in a given scene to dynamically change over one or more different time intervals, for example, by dynamically adjusting the lighting characteristics (e.g., RGB, brightness, falloff, etc.) of one or more light sources in the scene.
Thus, for example, in one embodiment, the color characteristics for an object may not be determined until the pixels are rendered at runtime, and may be based on the current color attributes of each light source at that moment of rendering. In this respect, unlike prior art techniques which may compute and store a static composite light value (RGB) for each vertex, the dynamic lighting and rendering techniques described herein may compute and store the Light Source Influence values (e.g., I(V1), I(V2), I(V3), etc.) characterizing the amount of light intensity or light influence which a given light source has over one or more pixels, vertices, Light Influence Grid Points and/or other points in space of the scene being rendered. Thereafter, at runtime, color characteristics (and/or other display characteristics) for the scene features, pixels, objects, etc. may be dynamically calculated using the Light Source Influence values and the current composite RGB light values of the light sources.
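For purposes of illustration, the following minimal sketch (expressed in Python, with hypothetical values and names) shows this runtime calculation: the precomputed influence of each light source is multiplied by that light source's current RGB color, and the products are summed, so that changing a light's color between frames changes the rendered result without recomputing any influence values:

    def shade(influences, light_colors):
        # Rendered color = sum over light sources of
        # (precomputed light influence) x (current light color).
        return tuple(
            sum(influences[name] * rgb[ch] for name, rgb in light_colors.items())
            for ch in range(3)
        )

    # Hypothetical precomputed influences at one vertex/pixel, and the current
    # (dynamically adjustable) RGB color of each light source.
    influences = {"L1": 0.42, "L2": 0.10, "L3": 0.31}
    light_colors = {"L1": (1.0, 0.9, 0.8), "L2": (0.2, 0.2, 1.0), "L3": (0.5, 0.5, 0.5)}
    print(shade(influences, light_colors))

    # Dimming L1 at runtime changes the rendered result with no re-baking.
    light_colors["L1"] = (0.1, 0.09, 0.08)
    print(shade(influences, light_colors))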
In at least one embodiment, once the Composite Light Intensity Values have been computed for each of the vertices of the virtual object 710, the lighting characteristics of any point or pixel (e.g., 711) on the surface of the virtual object may be relatively quickly calculated (e.g., in real-time) using a vertex-based lighting technique. For example, in one embodiment, the lighting characteristics associated with virtual object point 711 (e.g., at pixel coordinates X1,Y1,Z1) may be determined or calculated by interpolating the Composite Light Intensity Values of the n nearest vertices to the identified point (e.g., the n nearest vertices which are within direct “line of sight” of the identified point), which in this particular example are assumed to be vertices V3, V4, and V5. Thus, for example, in the example scene 700, the lighting characteristics for the point 711 of the virtual object's surface may be calculated and/or determined (e.g., in real-time), for example, by performing one or more of the following operations (or combinations thereof): identifying selected vertices of the virtual object to be used for interpolating the lighting characteristics of the selected point; and calculating a Composite Light Intensity Value for the identified point (711) using a vertex-based lighting technique wherein the lighting characteristics of the identified point are interpolated based on the relative amount of lighting influence which each of the identified vertices (e.g., V3, V4, V5) is determined to have on the selected point. For example, in at least one embodiment, the Composite Light Intensity Value for the identified point 711 may be determined according to:
I(X1,Y1,Z1)=a*I(V3)+b*I(V4)+c*I(V5),
where a, b, c are weighted variables representing interpolations of the relative influences of each identified vertex (V3, V4, V5), and where a+b+c=1. In at least one embodiment, the values assigned to weighted variables a, b, c may be based, at least in part, on the respective distance of each identified vertex to the identified coordinate (X1,Y1,Z1). In at least one embodiment, the weighted values for a, b, c may be determined by normalizing the relative distances of each identified vertex to the identified coordinate. For example, in at least one embodiment, the weighted values for a, b, c may be determined according to:
dSUM=dA+dB+dC;
a=dA/dSUM;
b=dB/dSUM;
c=dC/dSUM.
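For purposes of illustration, the following minimal sketch (expressed in Python, with hypothetical vertex positions and intensities) implements the distance-normalized interpolation described above, in which the weights a, b, c are the distances dA, dB, dC divided by their sum:

    import math

    def interpolate_intensity(point, vertices):
        # I(point) = a*I(V3) + b*I(V4) + c*I(V5), with a = dA/dSUM, etc.,
        # so the weights always sum to 1.
        distances = [math.dist(point, v["position"]) for v in vertices]
        d_sum = sum(distances)
        return sum((d / d_sum) * v["intensity"]
                   for d, v in zip(distances, vertices))

    # Hypothetical data for vertices V3, V4, V5 and identified point 711.
    vertices = [
        {"position": (0.0, 0.0, 0.0), "intensity": 0.8},  # V3
        {"position": (1.0, 0.0, 0.0), "intensity": 0.5},  # V4
        {"position": (0.0, 1.0, 0.0), "intensity": 0.3},  # V5
    ]
    print(interpolate_intensity((0.25, 0.25, 0.0), vertices))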
In at least one embodiment, the values I(V3), I(V4), and I(V5) (representing the Composite Light Intensity Values for vertices V3, V4, V5, respectively) may be expressed as a function of the amount of light intensity or light influence which each of the respective light sources L1, L2, and L3 has upon vertices V3, V4, V5, such as, for example:
I(V3)=I(L1@V3)+I(L2@V3)+I(L3@V3);
I(V4)=I(L1@V4)+I(L2@V4)+I(L3@V4);
I(V5)=I(L1@V5)+I(L2@V5)+I(L3@V5).
Alternatively, in some embodiments, the values I(V3), I(V4), and I(V5) (representing the Composite Light Intensity Values for vertices V3, V4, V5, respectively) may be expressed as a function of the amount of light intensity or light influence which each of the composite red, green, and blue channels (as determined from light sources L1, L2, and L3) has upon vertices V3, V4, V5, such as, for example:
I(V3)=[R(L1@V3)+R(L2@V3)+R(L3@V3)]+[G(L1@V3)+G(L2@V3)+G(L3@V3)]+[B(L1@V3)+B(L2@V3)+B(L3@V3)];
and similarly for I(V4) and I(V5);
where, for example:
R(L1@V3)+R(L2@V3)+R(L3@V3) represents a “red channel” composite light influence for vertex V3, which may be calculated based on the respective amounts of red component light influence which each of the light sources L1, L2, and L3 has on vertex V3;
G(L1@V3)+G(L2@V3)+G(L3@V3) represents a “green channel” composite light influence for vertex V3, which may be calculated based on the respective amounts of green component light influence which each of the light sources L1, L2, and L3 has on vertex V3; and
B(L1@V3)+B(L2@V3)+B(L3@V3) represents a “blue channel” composite light influence for vertex V3, which may be calculated based on the respective amounts of blue component light influence which each of the light sources L1, L2, and L3 has on vertex V3.
In at least one embodiment, the shape of the virtual object may influence the number of vertices of that object which are used for interpolating the lighting characteristics of any given pixel on the surface of the virtual object. In some embodiments, there may be upper and/or lower limits to the number of vertices which may be used for interpolation of lighting characteristics of the virtual object's surface pixels.
In at least one embodiment, using the dynamic lighting and rendering techniques described herein, each (or selected ones) of the light sources in a given scene (and/or a given game) may have associated therewith a respective, dynamically adjustable light source profile which, for example, may be provided to a graphics processor (e.g., during game initialization) and which may be used to facilitate real-time, dynamic adjustment of the lighting characteristics of individual light sources within a rendered scene, and/or may be used to facilitate real-time dynamic adjustment of rendered lighting characteristics of virtual object(s) (e.g., including both static objects and movable objects) within a rendered scene based on the dynamic adjustment of the lighting characteristics of individual light sources within the rendered scene. According to different embodiments, the dynamically adjustable light source profile(s) may include at least one matrix or table of values (e.g.,
In some embodiments, using the dynamic lighting and rendering techniques described herein, each (or selected ones) of the virtual object(s) of a given scene (and/or a given game) may have associated therewith a respective dynamic light influence profile which, for example, may be provided to a graphics processor (e.g., during game initialization) and which may be used to facilitate real-time dynamic adjustment of rendered lighting characteristics of that virtual object within a given scene based on the dynamic adjustment of the lighting characteristics of individual light sources within that scene. According to different embodiments, the dynamic light influence profile(s) may include at least one matrix or table of values representing different lighting characteristics of that particular virtual object.
For example,
In at least one embodiment, light influence characteristics for one or more vertices may be defined relative to a virtual object or to a group of virtual objects (or other types of groupings) in a manner which facilitates the calculation of dynamic light influence characteristics of rotating or moving virtual objects or groups of virtual objects.
In at least one embodiment, the dynamic lighting and rendering techniques described herein may be advantageously leveraged and used to support dynamic adjustment of light source characteristics, which, for example, may provide or enable additional light source features, functionalities, and/or characteristics such as, for example, one or more of the following (or combinations thereof):
According to different embodiments, such additional light source features, functionalities, and/or characteristics may be advantageously used (e.g., by game software developers) to provide new types of visual effect(s) in one or more scenes, such as, for example, one or more of the following (or combinations thereof):
In some embodiments, using the dynamic lighting and rendering techniques described herein, each (or selected ones) of the Light Influence Grid Points of a given scene may have associated therewith a respective set of predefined Light Source Influence criteria, which, for example, may be provided to a graphics processor (e.g., during game initialization), and which may be used to facilitate real-time dynamic rendering of lighting and shading characteristics of virtual objects and/or shadows within a given scene based on the lighting characteristics of individual light sources within that scene. According to different embodiments, the Light Source Influence criteria may include at least one matrix or table of values representing different Light Source Influence criteria.
For example, taking into account the various different types of environmental and lighting features, characteristics, properties, and objects (e.g., radiosity, stationary objects, shadows, light sources, ambient occlusion, etc.) which may influence lighting within scene portion 1300, numerous algorithmic and/or complex computations may be performed (e.g., in advance of real-time game play) to thereby generate a set (or matrix) of Composite Light Intensity value(s) for each Light Influence Grid Point 1302 represented in scene portion 1300. In at least one embodiment, the Composite Light Intensity Value(s) for a given Light Influence Grid Point (e.g., Light Influence Grid Point A) may be dynamically calculated (e.g., during runtime) using a set of one or more separately defined Light Source Intensity values each representing an amount of light intensity or light influence which a given light source (e.g., L1, L2, L3) has on that particular Light Influence Grid Point. Thus, for example, in the example scene 1300 which includes three different light sources (L1, L2, L3), the Composite Light Intensity Value for Light Influence Grid Point A (I(A)) may be defined as a set of separate Light Source Intensity values such as:
I(A)=I(L1@A),I(L2@A),I(L3@A),
where:
I(L1@A) represents the computed amount of light intensity or light influence which light source L1 has upon Light Influence Grid Point A (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and shadows which affect Light Source L1's influence at Light Influence Grid Point A);
I(L2@A) represents the computed amount of light intensity or light influence which light source L2 has upon Light Influence Grid Point A (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and shadows which affect Light Source L2's influence at Light Influence Grid Point A); and
I(L3@A) represents the computed amount of light intensity or light influence which light source L3 has upon Light Influence Grid Point A (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and shadows which affect Light Source L3's influence at Light Influence Grid Point A).
Additionally, in at least one embodiment, the value I(L1@A) (representing the amount of light intensity or light influence which light source L1 has upon Light Influence Grid Point A) may be expressed, for example, as
I(L1@A)=I(R1@A)+I(G1@A)+I(B1@A),
where:
I(R1@A) represents the amount of light intensity or light influence which the red component (R1) of light source L1 has upon Light Influence Grid Point A;
I(G1@A) represents the amount of light intensity or light influence which the green component (G1) of light source L1 has upon Light Influence Grid Point A; and
I(B1@A) represents the amount of light intensity or light influence which the blue component (B1) of light source L1 has upon Light Influence Grid Point A.
Accordingly, in at least one embodiment, the Composite Light Intensity Value(s) for Light Influence Grid Points A, B, C, D (
I(A)=I(L1@A)+I(L2@A)+I(L3@A);
I(B)=I(L1@B)+I(L2@B)+I(L3@B);
I(C)=I(L1@C)+I(L2@C)+I(L3@C);
I(D)=I(L1@D)+I(L2@D)+I(L3@D);
where:
I(L1@A)=I(R1@A)+I(G1@A)+I(B1@A);
I(L2@A)=I(R2@A)+I(G2@A)+I(B2@A);
I(L3@A)=I(R3@A)+I(G3@A)+I(B3@A);
I(L1@B)=I(R1@B)+I(G1@B)+I(B1@B);
I(L2@B)=I(R2@B)+I(G2@B)+I(B2@B);
I(L3@B)=I(R3@B)+I(G3@B)+I(B3@B);
etc.
In at least one embodiment, once the Composite Light Intensity Values have been computed for each of the Light Influence Grid Points in the scene, the lighting characteristics of a virtual object moving within the scene may be relatively quickly calculated (e.g., in real-time) using a Light Influence Grid Point Influence Interpolation technique wherein the lighting characteristics of the virtual object (e.g., 1310) are interpolated based on the Composite Light Intensity Values of the n nearest Light Influence Grid Points to the virtual object (at a given point in time). In at least one embodiment, the lighting characteristics of the virtual object (e.g., 1310) may be interpolated based on the respective distances of each of the identified Light Influence Grid Points to the virtual object (at that specific time). Thus, for example, in the example scene 1300, if at time T1 it is determined that a representative pixel (1311) of virtual object 1310 is located at coordinates (X1,Y1), the lighting characteristics of the identified pixel 1311 of virtual object 1310 may be calculated and/or determined (e.g., in real-time), for example, by performing one or more of the following operations (or combinations thereof):
I(X1,Y1)=a*I(A)+b*I(B)+c*I(C)+d*I(D),
where:
dSUM=dA+dB+dC+dD;
a=dA/dSUM;
b=dB/dSUM;
c=dC/dSUM;
d=dD/dSUM.
In at least one embodiment, the Light Influence Grid Point Influence Interpolation technique may include calculating separate Composite Light Intensity Value(s) for each light source affecting or influencing the identified virtual object pixel coordinate 1311. For example, in at least one embodiment, the Composite Light Intensity Value(s) for the identified pixel coordinate 1311 may be determined according to:
I(L1@(X1,Y1))=a*I(L1@A)+b*I(L1@B)+c*I(L1@C)+d*I(L1@D);
I(L2@(X1,Y1))=a*I(L2@A)+b*I(L2@B)+c*I(L2@C)+d*I(L2@D);
I(L3@(X1,Y1))=a*I(L3@A)+b*I(L3@B)+c*I(L3@C)+d*I(L3@D).
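For purposes of illustration, the following minimal sketch (expressed in Python, with hypothetical influence values and weights) shows the per-light interpolation above; each light source's influence is interpolated separately so that the light's current color can still be applied at render time:

    # Hypothetical precomputed influence of each light source at Light
    # Influence Grid Points A-D.
    grid_influence = {
        "A": {"L1": 0.60, "L2": 0.25, "L3": 0.05},
        "B": {"L1": 0.40, "L2": 0.35, "L3": 0.10},
        "C": {"L1": 0.20, "L2": 0.50, "L3": 0.15},
        "D": {"L1": 0.10, "L2": 0.30, "L3": 0.45},
    }
    # Hypothetical distance-normalized weights (a + b + c + d = 1).
    weights = {"A": 0.4, "B": 0.3, "C": 0.2, "D": 0.1}

    def interpolate_per_light(light):
        # I(Ln@(X1,Y1)) = a*I(Ln@A) + b*I(Ln@B) + c*I(Ln@C) + d*I(Ln@D)
        return sum(w * grid_influence[p][light] for p, w in weights.items())

    print({light: interpolate_per_light(light) for light in ("L1", "L2", "L3")})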
In some embodiments, the lighting characteristics of the other portions/regions of the identified virtual object (1310) may be dynamically determined (e.g., in real-time) using the lighting characteristics, properties and/or attributes of the identified virtual object (1310) and the lighting characteristics of the identified Light Influence Grid Points.
In at least one embodiment, the spacing between Light Influence Grid Points may be configured or designed to match the pixel spacing of the display screen for which the scene is to be rendered and displayed. In some embodiments, spacing between Light Influence Grid Points may be configured or designed to match pixel spacing based on one or more of the following (or combinations thereof): camera angle, position, perspective, etc. In at least one embodiment, the granularity of the Light Influence Grid Array for a given scene may be adjusted based upon the relative size(s) of the virtual objects to be displayed in that particular scene, and/or may be based upon other desired criteria.
Alternatively, in some embodiments, the values I(A), I(B), I(C), and I(D) (representing the Composite Light Intensity Values for Light Influence Grid Points A, B, C, D respectively) may be expressed as a function of the amount of light intensity or light influence which each of the composite red, green, and blue channels (as determined from light sources L1, L2, and L3) has upon Light Influence Grid Points A, B, C, D, such as, for example:
I(A)=[R(L1@A)+R(L2@A)+R(L3@A)]+[G(L1@A)+G(L2@A)+G(L3@A)]+[B(L1@A)+B(L2@A)+B(L3@A)];
and similarly for I(B), I(C), and I(D);
where, for example:
R(L1@A)+R(L2@A)+R(L3@A) represents a “red channel” composite light influence for Light Influence Grid Point A, which may be calculated based on the respective amounts of red component light influence which each of the light sources L1, L2, and L3 has on Light Influence Grid Point A;
G(L1@A)+G(L2@A)+G(L3@A) represents a “green channel” composite light influence for Light Influence Grid Point A, which may be calculated based on the respective amounts of green component light influence which each of the light sources L1, L2, and L3 has on Light Influence Grid Point A; and
B(L1@A)+B(L2@A)+B(L3@A) represents a “blue channel” composite light influence for Light Influence Grid Point A, which may be calculated based on the respective amounts of blue component light influence which each of the light sources L1, L2, and L3 has on Light Influence Grid Point A.
In some embodiments, the lighting characteristics of the identified virtual object pixel coordinate 1311 (of virtual object 1310) may be calculated and/or determined (e.g., in real-time), for example, by determining the separate RGB channel values for the identified virtual object pixel coordinate using the respective RGB component values associated with each influencing light source (e.g., L1, L2, L3). For example, in the example scene portion 1300, separate RGB channel values for the identified virtual object pixel coordinate 1311 may be calculated according to:
R(X1,Y1)=R(L1@(X1,Y1))+R(L2@(X1,Y1))+R(L3@(X1,Y1));
G(X1,Y1)=G(L1@(X1,Y1))+G(L2@(X1,Y1))+G(L3@(X1,Y1));
B(X1,Y1)=B(L1@(X1,Y1))+B(L2@(X1,Y1))+B(L3@(X1,Y1));
where:
R(L1@(X1,Y1))=R1*I(L1@(X1,Y1))
G(L1@(X1,Y1))=G1*I(L1@(X1,Y1))
B(L1@(X1,Y1))=B1*I(L1@(X1,Y1))
R(L2@(X1,Y1))=R2*I(L2@(X1,Y1))
G(L2@(X1,Y1))=G2*I(L2@(X1,Y1))
B(L2@(X1,Y1))=B2*I(L2@(X1,Y1))
R(L3@(X1,Y1))=R3*I(L3@(X1,Y1))
G(L3@(X1,Y1))=G3*I(L3@(X1,Y1))
B(L3@(X1,Y1))=B3*I(L3@(X1,Y1))
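For purposes of illustration, the following minimal sketch (expressed in Python, with hypothetical values) evaluates the per-channel equations above, multiplying each light source's interpolated influence at pixel coordinate 1311 by that light source's current RGB components and summing the products per channel:

    # Hypothetical interpolated influences I(L1@(X1,Y1)), I(L2@(X1,Y1)),
    # I(L3@(X1,Y1)) and the current RGB triplets (R,G,B) of L1, L2, L3.
    influence_at_pixel = {"L1": 0.47, "L2": 0.33, "L3": 0.12}
    light_colors = {"L1": (1.0, 0.9, 0.8), "L2": (0.2, 0.2, 1.0), "L3": (0.5, 0.5, 0.5)}

    def channel(ch):
        # R(X1,Y1) = R1*I(L1@(X1,Y1)) + R2*I(L2@(X1,Y1)) + R3*I(L3@(X1,Y1)),
        # and likewise for the green (ch=1) and blue (ch=2) channels.
        return sum(light_colors[L][ch] * influence_at_pixel[L] for L in light_colors)

    r, g, b = channel(0), channel(1), channel(2)
    print(r, g, b)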
In at least one embodiment, the shape of the virtual object may influence the number of Light Influence Grid Points which are used for interpolating the lighting characteristics of any given pixel on the surface of the virtual object. In some embodiments, there may be upper and/or lower limits to the number of Light Influence Grid Points which may be used for interpolation of lighting characteristics of the virtual object's surface pixels.
For example, referring to the example scene of
where, for example:
R(L1@X1,Y1,Z1) represents a “red channel” composite light influence for pixel coordinate 1212, which may be calculated based on the respective amounts of red component light influence which each of the light sources L1, L2, and L3 has on pixel coordinate 1212;
G(L1@X1,Y1,Z1) represents a “green channel” composite light influence for pixel coordinate 1212, which may be calculated based on the respective amounts of green component light influence which each of the light sources L1, L2, and L3 has on pixel coordinate 1212; and
B(L1@X1,Y1,Z1) represents a “blue channel” composite light influence for pixel coordinate 1212, which may be calculated based on the respective amounts of blue component light influence which each of the light sources L1, L2, and L3 has on pixel coordinate 1212.
In at least one embodiment, the gaming system may use the Red Channel to store I(L1@X1,Y1,Z1), the Green Channel to store I(L2@X1,Y1,Z1), etc. In this way, at render time or runtime, the current RGB component color characteristics of each light source may be used to calculate the real-time color characteristics of the rendered pixel, for example, according to:
R(X1,Y1,Z1)=R1*I(L1@X1,Y1,Z1)+R2*I(L2@X1,Y1,Z1)+R3*I(L3@X1,Y1,Z1);
G(X1,Y1,Z1)=G1*I(L1@X1,Y1,Z1)+G2*I(L2@X1,Y1,Z1)+G3*I(L3@X1,Y1,Z1);
B(X1,Y1,Z1)=B1*I(L1@X1,Y1,Z1)+B2*I(L2@X1,Y1,Z1)+B3*I(L3@X1,Y1,Z1).
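For purposes of illustration, the following minimal sketch (expressed in Python, with hypothetical values and helper names) shows one way such channel packing might work, assuming a three-light scene and an ordinary 8-bit-per-channel texture:

    def pack_influences(i_l1, i_l2, i_l3):
        # Store the per-light influences (each 0.0-1.0) for one texel in the
        # R, G, and B channels of an 8-bit texture: R holds I(L1@X1,Y1,Z1),
        # G holds I(L2@X1,Y1,Z1), and B holds I(L3@X1,Y1,Z1).
        return (round(i_l1 * 255), round(i_l2 * 255), round(i_l3 * 255))

    def unpack_and_shade(texel, light_colors):
        # At render time, recover the three influences and apply each light
        # source's current RGB color to produce the rendered pixel color.
        influences = [c / 255 for c in texel]
        return tuple(
            sum(influences[i] * light_colors[i][ch] for i in range(3))
            for ch in range(3)
        )

    texel = pack_influences(0.47, 0.33, 0.12)
    print(unpack_and_shade(texel, [(1.0, 0.9, 0.8), (0.2, 0.2, 1.0), (0.5, 0.5, 0.5)]))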
According to different embodiments, at least a portion of the various types of functions, operations, actions, and/or other features provided by the dynamic lighting and rendering procedures described herein may be implemented at one or more gaming system(s), at one or more server system(s), and/or combinations thereof.
In at least one embodiment, the dynamic rendering of pixels for a selected virtual object within a given scene may be performed in real-time using predefined light source influence criteria representing the amount of light intensity or light influence which each distinct light source (of the scene) has on each (or selected) pixels of the virtual object being rendered. In at least one embodiment, lighting characteristics for pixels of a selected static object (within a given scene) may be “pre-baked” into the pixels associated with that object. In at least one embodiment, the lighting characteristics for pixels of a selected static object may be “pre-baked” into the texture map (or texture-related pixels) associated with that object.
As illustrated in the example embodiment of
For example, taking into account the various different types of environmental and lighting features, characteristics, properties, and objects (e.g., radiosity, stationary objects, shadows, light sources, ambient occlusion, etc.) which may influence lighting within scene portion 1200, numerous algorithmic and/or complex computations may be performed (e.g., in advance of real-time game play) to thereby generate a set (or matrix) of Composite Light Intensity value(s) for each pixel 1202 represented in scene portion 1200. In at least one embodiment, the Composite Light Intensity Value(s) for a given pixel (e.g., pixel A) may be dynamically calculated (e.g., during runtime) using a set of one or more separately defined Light Source Intensity values each representing an amount of light intensity or light influence which a given light source (e.g., L1, L2, L3) has on that particular pixel. Thus, for example, in the example scene 1200 which includes three different light sources (L1, L2, L3), the Composite Light Intensity Value for pixel F (I(F)) may be defined as a set of separate Light Source Intensity values such as:
I(F)=I(L1@F),I(L2@F),I(L3@F),
where:
I(L1@F) represents the computed amount of light intensity or light influence which light source L1 has upon pixel F (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and shadows which affect Light Source L1's influence at pixel F);
I(L2@F) represents the computed amount of light intensity or light influence which light source L2 has upon pixel F (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and shadows which affect Light Source L2's influence at pixel F); and
I(L3@F) represents the computed amount of light intensity or light influence which light source L3 has upon pixel F (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and shadows which affect Light Source L3's influence at pixel F).
Additionally, in at least one embodiment, the value I(L1@F) (representing the amount of light intensity or light influence which light source L1 has upon pixel F) may be expressed, for example, as
I(L1@F)=I(R1@F)+I(G1@F)+I(B1@F),
where:
I(R1@F) represents the amount of light intensity or light influence which the red component (R1) of light source L1 has upon pixel F;
I(G1@F) represents the amount of light intensity or light influence which the green component (G1) of light source L1 has upon pixel F; and
I(B1@F) represents the amount of light intensity or light influence which the blue component (B1) of light source L1 has upon pixel F.
Similarly, the value I(L2@F) (representing the amount of light intensity or light influence which light source L2 has upon pixel F) may be expressed, for example, as
I(L2@F)=I(R2@F)+I(G2@F)+I(B2@F),
where:
I(R2@F) represents the amount of light intensity or light influence which the red component (R2) of light source L2 has upon pixel F;
I(G2@F) represents the amount of light intensity or light influence which the green component (G2) of light source L2 has upon pixel F; and
I(B2@F) represents the amount of light intensity or light influence which the blue component (B2) of light source L2 has upon pixel F.
In at least one embodiment, the Dynamic Light Influence Rendering Procedure may be operable to perform and/or implement various types of functions, operations, actions, and/or other features such as one or more of those described and/or referenced herein.
In at least one embodiment, the Dynamic Light Influence Rendering Procedure may be operable to utilize and/or generate various different types of data and/or other types of information when performing specific tasks and/or operations. This may include, for example, input data/information and/or output data/information. For example, in at least one embodiment, the Dynamic Light Influence Rendering Procedure may be operable to access, process, and/or otherwise utilize information from one or more different types of sources, such as, for example, one or more local and/or remote memories, devices and/or systems. Additionally, in at least one embodiment, the Dynamic Light Influence Rendering Procedure may be operable to generate one or more different types of output data/information, which, for example, may be stored in memory of one or more local and/or remote devices and/or systems. Examples of different types of input data/information and/or output data/information which may be accessed and/or utilized by the Dynamic Light Influence Rendering Procedure may include, but are not limited to, one or more of those described and/or referenced herein.
In at least one embodiment, a given instance of the Dynamic Light Influence Rendering Procedure may access and/or utilize information from one or more associated databases. In at least one embodiment, at least a portion of the database information may be accessed via communication with one or more local and/or remote memory devices. Examples of different types of data which may be accessed by the Dynamic Light Influence Rendering Procedure may include, but are not limited to, one or more of those described and/or referenced herein.
According to specific embodiments, multiple instances or threads of the Dynamic Light Influence Rendering Procedure may be concurrently implemented and/or initiated via the use of one or more processors and/or other combinations of hardware and/or hardware and software. For example, in at least some embodiments, various aspects, features, and/or functionalities of the Dynamic Light Influence Rendering Procedure may be performed, implemented and/or initiated by one or more of the various systems, components, devices, procedures, processes, etc., described and/or referenced herein.
According to different embodiments, one or more different threads or instances of the Dynamic Light Influence Rendering Procedure may be initiated in response to detection of one or more conditions or events satisfying one or more different types of minimum threshold criteria for triggering initiation of at least one instance of the Dynamic Light Influence Rendering Procedure. Various examples of conditions or events which may trigger initiation and/or implementation of one or more different threads or instances of the Dynamic Light Influence Rendering Procedure may include, but are not limited to, one or more of those described and/or referenced herein.
According to different embodiments, one or more different threads or instances of the Dynamic Light Influence Rendering Procedure may be initiated and/or implemented manually, automatically, statically, dynamically, concurrently, and/or combinations thereof. Additionally, different instances and/or embodiments of the Dynamic Light Influence Rendering Procedure may be initiated at one or more different time intervals (e.g., during a specific time interval, at regular periodic intervals, at irregular periodic intervals, upon demand, etc.).
In at least one embodiment, initial configuration of a given instance of the Dynamic Light Influence Rendering Procedure may be performed using one or more different types of initialization parameters. In at least one embodiment, at least a portion of the initialization parameters may be accessed via communication with one or more local and/or remote memory devices. In at least one embodiment, at least a portion of the initialization parameters provided to an instance of the Dynamic Light Influence Rendering Procedure may correspond to and/or may be derived from the input data/information.
In the specific example embodiment of
As shown at 1502, it is assumed that a specific virtual scene is identified. In at least one embodiment, the static virtual objects and/or light sources of the scene are also identified. For purposes of illustration, it is assumed that the identified scene is scene 1200 of
As shown at 1504, Light Source Influence criteria for selected features of the scene (e.g., virtual objects, pixels, Light Influence Grid Points, object vertices, etc.) are calculated. In at least one embodiment, the Light Source Influence criteria associated with a given pixel characterizes the amount of light intensity or light influence which each distinct, identified light source has on that particular pixel.
In at least one embodiment, calculation of the Light Source Influence criteria may be performed in advance of runtime. In at least one embodiment, the calculated Light Source Influence criteria may be “pre-baked” into one or more selected features (e.g., pixels, Light Influence Grid Points, object vertices, texture maps, etc.) of the scene.
As shown at 1506, a virtual object may be identified to be rendered within the identified scene at runtime.
As shown at 1508, the current lighting characteristics of the identified light source(s) are identified. Examples of such current lighting characteristics may include, but are not limited to, one or more of the following (or combinations thereof): hue, falloff, brightness/intensity, etc.
As shown at 1510, display characteristics for the pixels of the identified virtual object may be dynamically calculated (e.g., during runtime) using the pre-calculated Light Source Influence criteria and current lighting characteristics of the identified light source(s).
As shown at 1512, the identified virtual object may be dynamically rendered and displayed at runtime using the dynamically calculated pixel display characteristics.
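For purposes of illustration, the following minimal, self-contained sketch (expressed in Python, with hypothetical data and names) traces the flow at 1502 through 1512: Light Source Influence criteria are precomputed per pixel, and display characteristics are then dynamically calculated at runtime from each identified light source's current color and rendered:

    # 1502/1504: identified scene, with precomputed per-pixel, per-light
    # Light Source Influence criteria (hypothetical values).
    influence = {
        (0, 0): {"L1": 0.6, "L2": 0.2, "L3": 0.1},
        (0, 1): {"L1": 0.3, "L2": 0.5, "L3": 0.2},
    }

    def render_frame(current_light_colors):
        # 1508-1512: combine the precomputed influence criteria with the
        # current lighting characteristics of each identified light source.
        return {
            pixel: tuple(
                sum(w * current_light_colors[L][ch] for L, w in per_light.items())
                for ch in range(3)
            )
            for pixel, per_light in influence.items()
        }

    # The same scene rendered under two different runtime light settings.
    print(render_frame({"L1": (1, 1, 1), "L2": (1, 0, 0), "L3": (0, 0, 1)}))
    print(render_frame({"L1": (0.2, 0.2, 0.2), "L2": (1, 0, 0), "L3": (0, 0, 1)}))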
According to different embodiments, the Gaming Network 100 may include a plurality of different types of components, devices, modules, processes, systems, etc., which, for example, may be implemented and/or instantiated via the use of hardware and/or combinations of hardware and software. For example, as illustrated in the example embodiment of
In at least one embodiment, a gaming device may be operable to detect gross motion or gross movement of a user. For example, in one embodiment, a gaming device may include motion detection component(s) which may be operable to detect gross motion or gross movement of a user's body and/or appendages such as, for example, hands, fingers, arms, head, etc.
According to different embodiments, at least some Gaming Network(s) may be configured, designed, and/or operable to provide a number of different advantages and/or benefits and/or may be operable to initiate, and/or enable various different types of operations, functionalities, and/or features, such as, for example, one or more of those described or referenced herein.
According to different embodiments, at least a portion of the various types of functions, operations, actions, and/or other features provided by the Gaming Network 100 may be implemented at one or more client system(s), at one or more server system(s), and/or combinations thereof.
According to different embodiments, the Gaming Network may be operable to utilize and/or generate various different types of data and/or other types of information when performing specific tasks and/or operations. This may include, for example, input data/information and/or output data/information. For example, in at least one embodiment, the Gaming Network may be operable to access, process, and/or otherwise utilize information from one or more different types of sources, such as, for example, one or more local and/or remote memories, devices and/or systems. Additionally, in at least one embodiment, the Gaming Network may be operable to generate one or more different types of output data/information, which, for example, may be stored in memory of one or more local and/or remote devices and/or systems. Examples of different types of input data/information and/or output data/information which may be accessed and/or utilized by the Gaming Network may include, but are not limited to, one or more of those described and/or referenced herein.
According to specific embodiments, multiple instances or threads of the Gaming Network may be concurrently implemented and/or initiated via the use of one or more processors and/or other combinations of hardware and/or hardware and software. For example, in at least some embodiments, various aspects, features, and/or functionalities of the Gaming Network may be performed, implemented and/or initiated by one or more of the various systems, components, devices, procedures, processes, etc., described and/or referenced herein.
In at least one embodiment, a given instance of the Gaming Network may access and/or utilize information from one or more associated databases. In at least one embodiment, at least a portion of the database information may be accessed via communication with one or more local and/or remote memory devices. Examples of different types of data which may be accessed by the Gaming Network may include, but are not limited to, one or more of those described and/or referenced herein.
According to different embodiments, one or more different threads or instances of the Gaming Network may be initiated in response to detection of one or more conditions or events satisfying one or more different types of minimum threshold criteria for triggering initiation of at least one instance of the Gaming Network. Various examples of conditions or events which may trigger initiation and/or implementation of one or more different threads or instances of the Gaming Network may include, but are not limited to, one or more of those described and/or referenced herein.
It will be appreciated that the Gaming Network of
Generally, the dynamic lighting and rendering techniques described herein may be implemented in hardware and/or hardware+software. For example, they can be implemented in an operating system kernel, in a separate user process, in a library package bound into network applications, on a specially constructed machine, or on a network interface card. In a specific embodiment, various aspects described herein may be implemented in software such as an operating system or in an application running on an operating system.
Hardware and/or software+hardware hybrid embodiments of the dynamic lighting and rendering techniques described herein may be implemented on a general-purpose programmable machine selectively activated or reconfigured by a computer program stored in memory. Such programmable machines may include, for example, mobile or handheld computing systems, PDAs, smart phones, notebook computers, tablets, netbooks, desktop computing systems, server systems, cloud computing systems, network devices, etc.
In one implementation, processor 210 and master game controller 212 are included in a logic device 213 enclosed in a logic device housing. The processor 210 may include any conventional processor or logic device configured to execute software allowing various configuration and reconfiguration tasks such as, for example: a) communicating with a remote source via communication interface 206, such as a server that stores authentication information or games; b) converting signals read by an interface to a format corresponding to that used by software or memory in the gaming machine; c) accessing memory to configure or reconfigure game parameters in the memory according to indicia read from the device; d) communicating with interfaces, various peripheral devices 222 and/or I/O devices; e) operating peripheral devices 222 such as, for example, card readers, paper ticket readers, etc.; f) operating various I/O devices such as, for example, displays 235, input devices 230; etc. For instance, the processor 210 may send messages including game play information to the displays 235 to inform players of cards dealt, wagering information, and/or other desired information.
The gaming machine 200 also includes memory 216 which may include, for example, volatile memory (e.g., RAM 209), non-volatile memory 219 (e.g., disk memory, FLASH memory, EPROMs, etc.), unalterable memory (e.g., EPROMs 208), etc. The memory may be configured or designed to store, for example: 1) configuration software 214 such as all the parameters and settings for a game playable on the gaming machine; 2) associations 218 between configuration indicia read from a device with one or more parameters and settings; 3) communication protocols allowing the processor 210 to communicate with peripheral devices 222 and I/O devices 211; 4) a secondary memory storage device 215 such as a non-volatile memory device, configured to store gaming software related information (the gaming software related information and memory may be used to store various audio files and games not currently being used and invoked in a configuration or reconfiguration); 5) communication transport protocols (such as, for example, TCP/IP, USB, Firewire, IEEE 1394, Bluetooth, IEEE 802.11x (IEEE 802.11 standards), HiperLAN/2, HomeRF, etc.) for allowing the gaming machine to communicate with local and non-local devices using such protocols; etc. In one implementation, the master game controller 212 communicates using a serial communication protocol. A few examples of serial communication protocols that may be used to communicate with the master game controller include but are not limited to USB, RS-232 and Netplex (a proprietary protocol developed by IGT, Reno, Nev.).
A plurality of device drivers 242 may be stored in memory 216. Examples of different types of device drivers may include device drivers for gaming machine components, device drivers for peripheral components 222, etc. Typically, the device drivers 242 utilize a communication protocol of some type that enables communication with a particular physical device. The device driver abstracts the hardware implementation of a device. For example, a device driver may be written for each type of card reader that may potentially be connected to the gaming machine. Examples of communication protocols used to implement the device drivers include Netplex, USB, serial, Ethernet 275, Firewire, I/O debouncer, direct memory map, PCI, parallel, RF, Bluetooth™, near-field communications (e.g., using near-field magnetics), 802.11 (WiFi), etc. Netplex is a proprietary IGT standard while the others are open standards. According to a specific embodiment, when one type of a particular device is exchanged for another type of the particular device, a new device driver may be loaded from the memory 216 by the processor 210 to allow communication with the device. For instance, one type of card reader in gaming machine 200 may be replaced with a second type of card reader where device drivers for both card readers are stored in the memory 216.
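The following minimal sketch illustrates one way such a driver abstraction might be organized: a common interface hides each device's protocol, and a registry keyed by device type lets the processor load a different driver when one card reader is exchanged for another. All class and function names here are hypothetical, chosen purely for illustration.

```cpp
// Illustrative driver-abstraction sketch (hypothetical names throughout).
#include <map>
#include <memory>
#include <string>

class DeviceDriver {
public:
    virtual ~DeviceDriver() = default;
    virtual void Open() = 0;                // establish communication channel
    virtual std::string ReadIndicia() = 0;  // protocol-specific read
};

class MagStripeCardReaderDriver : public DeviceDriver {
public:
    void Open() override { /* e.g., open a serial or USB channel */ }
    std::string ReadIndicia() override { return "track-data"; }
};

// Registry of drivers stored in memory; when one type of card reader is
// exchanged for another, the matching driver is simply loaded by key.
class DriverRegistry {
public:
    void Register(const std::string& deviceType,
                  std::unique_ptr<DeviceDriver> driver) {
        drivers_[deviceType] = std::move(driver);
    }
    DeviceDriver* Load(const std::string& deviceType) {
        auto it = drivers_.find(deviceType);
        return it == drivers_.end() ? nullptr : it->second.get();
    }
private:
    std::map<std::string, std::unique_ptr<DeviceDriver>> drivers_;
};
```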
In some embodiments, the software units stored in the memory 216 may be upgraded as needed. For instance, when the memory 216 is a hard drive, new games, game options, various new parameters, new settings for existing parameters, new settings for new parameters, device drivers, and new communication protocols may be uploaded to the memory from the master game controller 212 or from some other external device. As another example, when the memory 216 includes a CD/DVD drive including a CD/DVD designed or configured to store game options, parameters, and settings, the software stored in the memory may be upgraded by replacing a first CD/DVD with a second CD/DVD. In yet another example, when the memory 216 uses one or more flash memory 219 or EPROM 208 units designed or configured to store games, game options, parameters, settings, the software stored in the flash and/or EPROM memory units may be upgraded by replacing one or more memory units with new memory units which include the upgraded software. In another embodiment, one or more of the memory devices, such as the hard-drive, may be employed in a game software download process from a remote software server.
In some embodiments, the gaming machine 200 may also include various authentication and/or validation components 244 which may be used for authenticating/validating specified gaming machine components such as, for example, hardware components, software components, firmware components, information stored in the gaming machine memory 216, etc. Examples of various authentication and/or validation components are described in U.S. Pat. No. 6,620,047, titled, “ELECTRONIC GAMING APPARATUS HAVING AUTHENTICATION DATA SETS,” incorporated herein by reference in its entirety for all purposes.
Peripheral devices 222 may include several device interfaces such as, for example: transponders 254, wired/wireless power distribution components 258, input device(s) 230, sensors 260, audio and/or video devices 262 (e.g., cameras, speakers, etc.), wireless communication components 256, gaming device function control components 262, side wagering management components 264, etc.
Sensors 260 may include, for example, optical sensors, pressure sensors, RF sensors, infrared sensors, image sensors, thermal sensors, biometric sensors, etc. Such sensors may be used for a variety of functions such as, for example, detecting the presence and/or identity of various persons (e.g., players, casino employees, etc.), devices (e.g., gaming devices), and/or systems within a predetermined proximity to the gaming machine. In one implementation, at least a portion of the sensors 260 and/or input devices 230 may be implemented in the form of touch keys selected from a wide variety of commercially available touch keys used to provide electrical control signals. Alternatively, some of the touch keys may be implemented as touch sensors, such as those provided by a touchscreen display. For example, in at least one implementation, the gaming machine player displays and/or gaming device displays may include input functionality for allowing players to provide desired information (e.g., game play instructions and/or other input) to the gaming machine, game table, and/or other gaming system components using the touch keys and/or other player control sensors/buttons. Additionally, such input functionality may also be used for allowing players to provide input to other devices in the casino gaming network (such as, for example, player tracking systems, side wagering systems, etc.).
Wireless communication components 256 may include one or more communication interfaces having different architectures and utilizing a variety of protocols such as, for example, 802.11 (WiFi), 802.15 (including Bluetooth™), 802.16 (WiMax), 802.22, Cellular standards such as CDMA, CDMA2000, WCDMA, Radio Frequency (e.g., RFID), Infrared, Near Field Magnetic communication protocols, etc. The communication links may transmit electrical, electromagnetic or optical signals which carry digital data streams or analog signals representing various types of information.
Power distribution components 258 may include, for example, components or devices which are operable for providing wired or wireless power to other devices. For example, in one implementation, the power distribution components 258 may include a magnetic induction system which is adapted to provide wireless power to one or more gaming devices near the gaming machine. In one implementation, a gaming device docking region may be provided which includes a power distribution component that is able to recharge a gaming device without requiring metal-to-metal contact.
In at least one embodiment, gaming device function control components 262 may be operable to control operating mode selection functionality, features, and/or components associated with one or more gaming devices (e.g., 250). In at least one embodiment, gaming device function control components 262 may be operable to remotely control and/or configure components of one or more gaming devices 250 based on various parameters and/or upon detection of specific events or conditions such as, for example: time of day, player activity levels; location of the gaming device; identity of gaming device user; user input; system override (e.g., emergency condition detected); proximity to other devices belonging to same group or association; proximity to specific objects, regions, zones, etc.
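As a non-authoritative sketch only, such event-driven function control might be organized as a set of condition/action triggers evaluated whenever new device state arrives; the DeviceContext fields and all other names below are assumptions chosen for illustration, not components of this disclosure.

```cpp
// Illustrative condition-triggered function control sketch.
#include <functional>
#include <utility>
#include <vector>

struct DeviceContext {
    int   hourOfDay;             // e.g., time-of-day based triggers
    bool  emergencyOverride;     // e.g., system override condition
    float distanceToZoneMeters;  // e.g., proximity to a region or zone
};

struct Trigger {
    std::function<bool(const DeviceContext&)> condition;
    std::function<void()>                     action;
};

class FunctionControl {
public:
    void AddTrigger(Trigger t) { triggers_.push_back(std::move(t)); }

    // Called whenever new sensor/state data arrives for the gaming device;
    // each matching condition fires its configuration action.
    void Evaluate(const DeviceContext& ctx) {
        for (const auto& t : triggers_)
            if (t.condition(ctx)) t.action();
    }
private:
    std::vector<Trigger> triggers_;
};
```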
In at least one embodiment, side wagering management components 264 may be operable to manage side wagering activities associated with one or more side wager participants. Side wagering management components 264 may also be operable to manage or control side wagering functionality associated with one or more gaming devices 250. In accordance with at least one embodiment, side wagers may be associated with specific events in a wager-based game that are uncertain at the time the side wager is made. The events may also be associated with particular players, gaming devices (e.g., EGMs), game themes, bonuses, denominations, and/or paytables. In embodiments where the wager-based game is being played by multiple players, the side wagers may be made by participants who are not players of the game, and who are thus at least one level removed from the actual play of the game.
In instances where side wagers are made on events that depend at least in part on the skill of a particular player, it may be beneficial to provide observers (e.g., side wager participants) with information which is useful for determining whether a particular side wager should be placed, and/or for helping to determine the amount of such side wager. In at least one embodiment, side wagering management components 264 may be operable to manage and/or facilitate data access to player ratings, historical game play data, historical payout data, etc. For example, in one embodiment, a player rating for a player of the wager-based game may be computed based on historical data associated with past play of the wager-based game by that player in accordance with a pre-determined algorithm. The player rating for a particular player may be displayed to other players and/or observers, possibly at the option (or permission) of the player. By using player ratings in the consideration of making side wagers, decisions by observers to make side wagers on certain events need not be made completely at random. Player ratings may also be employed by the players themselves to aid them in determining potential opponents, for example.
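Since the pre-determined rating algorithm is left unspecified above, the following sketch simply assumes a blend of win frequency and return-on-wager scaled to a 0-100 range; the GameRecord fields and the ComputePlayerRating name are illustrative only.

```cpp
// Illustrative player-rating sketch over historical game play data.
#include <vector>

struct GameRecord {
    bool  won;
    float wagered;
    float paidOut;
};

// Hypothetical rating: equal-weight blend of win frequency and (capped)
// return-on-wager, scaled to 0-100 for display to observers and
// side-wager participants.
float ComputePlayerRating(const std::vector<GameRecord>& history) {
    if (history.empty()) return 0.0f;
    float wins = 0.0f, wagered = 0.0f, paid = 0.0f;
    for (const auto& rec : history) {
        wins    += rec.won ? 1.0f : 0.0f;
        wagered += rec.wagered;
        paid    += rec.paidOut;
    }
    const float winRate = wins / static_cast<float>(history.size());
    const float roi     = wagered > 0.0f ? paid / wagered : 0.0f;
    const float cappedRoi = roi > 1.0f ? 1.0f : roi;
    return 100.0f * (0.5f * winRate + 0.5f * cappedRoi);
}
```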
Dynamic Lighting and Rendering Component(s) 292 may be configured or designed to provide functionality for facilitating dynamic, real-time adjustment and rendering of lighting, shading, and/or other display characteristics associated with stationary virtual objects, dynamically movable virtual objects, shadow mapping, etc. In at least one embodiment, the Dynamic Lighting and Rendering Component(s) 292 may be configured or designed to facilitate, enable, initiate, and/or perform one or more of the following operation(s), action(s), and/or feature(s) (or combinations thereof):
In other embodiments (not shown), other peripheral devices may include: player tracking devices, card readers, bill validator/paper ticket readers, etc. Such devices may each comprise resources for handling and processing configuration indicia, such as a microcontroller that converts voltage levels for one or more scanning devices to signals provided to processor 210. In one embodiment, application software for interfacing with peripheral devices 222 may store instructions (such as, for example, how to read indicia from a portable device) in a memory device such as, for example, non-volatile memory, a hard drive, or a flash memory.
In at least one implementation, the gaming machine may include card readers such as used with credit cards, or other identification code reading devices to allow or require player identification in connection with play of the card game and associated recording of game action. Such a user identification interface can be implemented in the form of a variety of magnetic card readers commercially available for reading user-specific identification information. The user-specific information can be provided on specially constructed magnetic cards issued by a casino, or on magnetically coded credit cards or debit cards frequently used with national credit organizations such as VISA™, MASTERCARD™, banks, and/or other institutions.
The gaming machine may include other types of participant identification mechanisms which may use a fingerprint image, an eye blood vessel image reader, or other suitable biological information to confirm the identity of the user. Still further, it is possible to provide such participant identification information by having the dealer manually code in the information in response to the player indicating his or her code name or real name. Such additional identification could also be used to confirm credit use of a smart card, transponder, and/or player's gaming device.
It will be apparent to those skilled in the art that other memory types, including various computer readable media, may be used for storing and executing program instructions pertaining to the operation of the EGMs described herein. Because such information and program instructions may be employed to implement the systems/methods described herein, example embodiments may relate to machine-readable media that include program instructions, state information, etc. for performing various operations described herein. Examples of machine-readable media include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory devices (ROM) and random access memory (RAM). Example embodiments may also be embodied in a carrier wave traveling over an appropriate medium such as airwaves, optical lines, electric lines, etc. Examples of program instructions include both machine code, such as produced by a compiler, and files including higher level code that may be executed by the computer using an interpreter.
The exemplary computer system 300 includes a processor 302 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 304 and a static memory 306, which communicate with each other via a bus 308. The computer system 300 may further include a video display unit 310 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 300 also includes an alphanumeric input device 312 (e.g., a keyboard), a user interface (UI) navigation device 314 (e.g., a mouse), a disk drive unit 316, a signal generation device 318 (e.g., a speaker) and a network interface device 320.
The disk drive unit 316 includes a machine-readable medium 322 on which is stored one or more sets of instructions and data structures (e.g., software 324) embodying or utilized by any one or more of the methodologies or functions described herein. The software 324 may also reside, completely or at least partially, within the main memory 304 and/or within the processor 302 during execution thereof by the computer system 300, the main memory 304 and the processor 302 also constituting machine-readable media.
The software 324 may further be transmitted or received over a network 326 via the network interface device 320 utilizing any one of a number of well-known transfer protocols (e.g., HTTP).
While the machine-readable medium 322 is shown in an exemplary embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present invention, or that is capable of storing, encoding or carrying data structures utilized by or associated with such a set of instructions. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals. Although an embodiment of the present invention has been described with reference to specific exemplary embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
According to various embodiments, Client Computer System 300 may include a variety of components, modules and/or systems for providing various types of functionality. For example, in at least one embodiment, Client Computer System 300 may include a web browser application which is operable to process, execute, and/or support the use of scripts (e.g., JavaScript, AJAX, etc.), Plug-ins, executable code, virtual machines, vector-based web animation (e.g., Adobe Flash), etc.
In at least one embodiment, the web browser application may be configured or designed to instantiate components and/or objects at the Client Computer System in response to processing scripts, instructions, and/or other information received from a remote server such as a web server. Examples of such components and/or objects may include, but are not limited to, one or more of the following (or combinations thereof):
In at least one embodiment, Client Computer System 300 may be configured or designed to include Dynamic Lighting and Rendering functionality for facilitating dynamic, real-time adjustment and rendering of lighting, shading, and/or other display characteristics associated with stationary virtual objects, dynamically movable virtual objects, shadow mapping, etc. In at least one embodiment, Client Computer System 300 may include Dynamic Lighting and Rendering Component(s), which, for example, may be configured or designed to facilitate, enable, initiate, and/or perform one or more of the following operation(s), action(s), and/or feature(s) (or combinations thereof):
According to specific embodiments, various aspects, features, and/or functionalities of the gaming device may be performed, implemented and/or initiated by one or more of the following types of systems, components, devices, procedures, processes, etc. (or combinations thereof): Processor(s) 410; Device Drivers 442; Memory 416; Interface(s) 406; Power Source(s)/Distribution 443; Geolocation module 446; Display(s) 435; I/O Devices 430; Audio/Video device(s) 439; Peripheral Devices 431; Motion Detection module 440; User Identification/Authentication module 447; Client App Component(s) 460; Other Component(s) 468; UI Component(s) 462; Database Component(s) 464; Processing Component(s) 466; Software/Hardware Authentication/Validation 444; Wireless communication module(s) 445; Information Filtering module(s) 449; Operating mode selection component 448; Speech Processing module 454; Scanner/Camera 452; OCR Processing Engine 456; Dynamic Lighting and Rendering Component(s); etc.
As illustrated in the example of
In at least one embodiment, the gaming device Application component(s) may be operable to perform and/or implement various types of functions, operations, actions, and/or other features such as, for example, one or more of those described and/or referenced herein.
According to specific embodiments, multiple instances or threads of the gaming device Application component(s) may be concurrently implemented and/or initiated via the use of one or more processors and/or other combinations of hardware and/or hardware and software. For example, in at least some embodiments, various aspects, features, and/or functionalities of the gaming device Application component(s) may be performed, implemented and/or initiated by one or more of the various systems, components, devices, procedures, processes, etc., described and/or referenced herein.
According to different embodiments, one or more different threads or instances of the gaming device Application component(s) may be initiated in response to detection of one or more conditions or events satisfying one or more different types of minimum threshold criteria for triggering initiation of at least one instance of the gaming device Application component(s). Various examples of conditions or events which may trigger initiation and/or implementation of one or more different threads or instances of the gaming device Application component(s) may include, but are not limited to, one or more of those described and/or referenced herein.
In at least one embodiment, a given instance of the gaming device Application component(s) may access and/or utilize information from one or more associated databases. In at least one embodiment, at least a portion of the database information may be accessed via communication with one or more local and/or remote memory devices. Examples of different types of data which may be accessed by the gaming device Application component(s) may include, but are not limited to, one or more of those described and/or referenced herein.
According to different embodiments, gaming device 400 may further include, but is not limited to, one or more of the following types of components, modules and/or systems (or combinations thereof):
According to a specific embodiment, the gaming device may be adapted to implement at least a portion of the features associated with the mobile game service system described in U.S. patent application Ser. No. 10/115,164, which is now U.S. Pat. No. 6,800,029, issued Oct. 5, 2004 (previously incorporated by reference in its entirety). For example, in one embodiment, the gaming device may comprise a hand-held game service user interface device (GSUID) and a number of input and output devices. The GSUID generally comprises a display screen which may display a number of game service interfaces. These game service interfaces are generated on the display screen by a microprocessor of some type within the GSUID. Examples of a hand-held GSUID which may accommodate the game service interfaces are manufactured by Symbol Technologies, Incorporated of Holtsville, N.Y.
The game service interfaces may be used to provide a variety of game service transactions and gaming operations services. The game service interfaces may include a login interface, an input/output interface, a transaction reconciliation interface, a ticket validation interface, a prize services interface, a food services interface, an accommodation services interface, a gaming operations interface, a multi-game/multi-denomination meter data transfer interface, etc. Each interface may be accessed via a main menu with a number of sub-menus that allow a game service representative to access the different display screens relating to the particular interface. Using the different display screens within a particular interface, the game service representative may perform various operations needed to provide a particular game service. For example, the login interface may allow the game service representative to enter a user identification of some type and verify the user identification with a password. When the display screen is a touch screen, the user may enter the user/operator identification information on a display screen comprising the login interface using the input stylus and/or using the input buttons. Using a menu on the display screen of the login interface, the user may select other display screens relating to the login and registration process. For example, another display screen obtained via a menu on a display screen in the login interface may allow the GSUID to scan a fingerprint of the game service representative for identification purposes or scan the fingerprint of a game player.
The user identification information and user validation information may allow the game service representative to access all or some subset of the available game service interfaces available on the GSUID. For example, certain users, after logging into the GSUID (e.g., entering a user identification and valid user validation information), may be able to access a variety of different interfaces, such as, for example, one or more of: input/output interface, communication interface, food services interface, accommodation services interface, prize service interface, gaming operation services interface, transaction reconciliation interface, voice communication interface, gaming device performance or metering data transfer interface, etc., and perform a variety of services enabled by such interfaces. Other users may only be able to access the award ticket validation interface and perform EZ pay ticket validations. The GSUID may also output game service transaction information to a number of different devices (e.g., card reader, printer, storage devices, gaming machines, remote transaction servers, etc.).
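One possible (purely illustrative) way to model such per-user interface access is a permission table consulted after login; the InterfaceAccessPolicy class and all identifiers below are hypothetical names, not taken from the referenced application.

```cpp
// Illustrative per-user interface access sketch for a GSUID-style device.
#include <map>
#include <set>
#include <string>

using InterfaceId = std::string;  // e.g., "ticket-validation", "food-services"

class InterfaceAccessPolicy {
public:
    void Grant(const std::string& userId, const InterfaceId& iface) {
        permitted_[userId].insert(iface);
    }
    // Consulted after login: only permitted interfaces are presented.
    bool MayAccess(const std::string& userId, const InterfaceId& iface) const {
        auto it = permitted_.find(userId);
        return it != permitted_.end() && it->second.count(iface) > 0;
    }
private:
    std::map<std::string, std::set<InterfaceId>> permitted_;
};

// Usage: a floor attendant might be granted only "ticket-validation", while
// a supervisor is granted the full set of game service interfaces.
```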
In addition to the features described above, various embodiments of gaming devices described herein may also include additional functionality for displaying, in real-time, filtered information to the user based upon a variety of criteria such as, for example, geolocation information, casino data information, player tracking information, etc.
According to one embodiment, network device 560 may include a master central processing unit (CPU) 562, interfaces 568, and a bus 567 (e.g., a PCI bus). When acting under the control of appropriate software or firmware, the CPU 562 may be responsible for implementing specific functions associated with the functions of a desired network device. For example, when configured as a server, the CPU 562 may be responsible for analyzing packets; encapsulating packets; forwarding packets to appropriate network devices; instantiating various types of virtual machines, virtual interfaces, virtual storage volumes, virtual appliances; etc. The CPU 562 preferably accomplishes at least a portion of these functions under the control of software including an operating system (e.g., Linux), and any appropriate system software (such as, for example, AppLogic™ software).
CPU 562 may include one or more processors 563 such as, for example, one or more processors from the AMD, Motorola, Intel and/or MIPS families of microprocessors. In an alternative embodiment, processor 563 may be specially designed hardware for controlling the operations of Server System 580. In a specific embodiment, a memory 561 (such as non-volatile RAM and/or ROM) also forms part of CPU 562. However, there may be many different ways in which memory could be coupled to the system. Memory block 561 may be used for a variety of purposes such as, for example, caching and/or storing data, programming instructions, etc.
The interfaces 568 may typically be provided as interface cards (sometimes referred to as “line cards”). Alternatively, one or more of the interfaces 568 may be provided as on-board interface controllers built into the system motherboard. Generally, they control the sending and receiving of data packets over the network and sometimes support other peripherals used with the Server System 580. Among the interfaces that may be provided may be FC interfaces, Ethernet interfaces, frame relay interfaces, cable interfaces, DSL interfaces, token ring interfaces, Infiniband interfaces, and the like. In addition, various very high-speed interfaces may be provided, such as fast Ethernet interfaces, Gigabit Ethernet interfaces, ATM interfaces, HSSI interfaces, POS interfaces, FDDI interfaces, ASI interfaces, DHEI interfaces, and the like. Other interfaces may include one or more wireless interfaces such as, for example, 802.11 (WiFi) interfaces, 802.15 interfaces (including Bluetooth™), 802.16 (WiMax) interfaces, 802.22 interfaces, and Cellular standards such as CDMA interfaces, CDMA2000 interfaces, WCDMA interfaces, TDMA interfaces, Cellular 3G interfaces, etc.
Generally, one or more interfaces may include ports appropriate for communication with the appropriate media. In some cases, they may also include an independent processor and, in some instances, volatile RAM. The independent processors may control such communications intensive tasks as packet switching, media control and management. By providing separate processors for the communications intensive tasks, these interfaces allow the master microprocessor 562 to efficiently perform routing computations, network diagnostics, security functions, etc.
In at least one embodiment, some interfaces may be configured or designed to allow the Server System 580 to communicate with other network devices associated with various local area networks (LANs) and/or wide area networks (WANs). Other interfaces may be configured or designed to allow network device 560 to communicate with one or more direct attached storage device(s) 570.
Although the system shown in
Regardless of the network device's configuration, it may employ one or more memories or memory modules (such as, for example, memory block 565, which, for example, may include random access memory (RAM)) configured to store data, program instructions for the general-purpose network operations, and/or other information relating to the functionality of the various dynamic lighting and rendering techniques described herein. The program instructions may control the operation of an operating system and/or one or more applications, for example. The memory or memories may also be configured to store data structures and/or other specific non-program information described herein.
Because such information and program instructions may be employed to implement the systems/methods described herein, one or more embodiments relate to machine-readable media that include program instructions, state information, etc. for performing various operations described herein. Examples of machine-readable storage media include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks; magneto-optical media such as floptical disks; and hardware devices that may be specially configured to store and perform program instructions, such as read-only memory devices (ROM) and random access memory (RAM). Some embodiments may also be embodied in transmission media such as, for example, a carrier wave traveling over an appropriate medium such as airwaves, optical lines, electric lines, etc. Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter.
In at least one embodiment, the Server System may include a plurality of components operable to perform and/or implement various types of functions, operations, actions, and/or other features such as, for example, one or more of the following (or combinations thereof):
Although several example embodiments of one or more aspects and/or features have been described in detail herein with reference to the accompanying drawings, it is to be understood that aspects and/or features are not limited to these precise embodiments, and that various changes and modifications may be effected therein by one skilled in the art without departing from the scope or spirit of the invention(s) as defined, for example, in the appended claims.
The present application claims benefit, pursuant to the provisions of 35 U.S.C. §119, of U.S. Provisional Application Ser. No. 61/504,141 (Attorney Docket No. 3GSTP001P), titled “USER BEHAVIOR, SIMULATION AND GAMING TECHNIQUES”, naming Kosta et al. as inventors, and filed Jul. 1, 2011, the entirety of which is incorporated herein by reference for all purposes.