Non-limiting example embodiments of the disclosure are further described in the detailed description, which follows, by reference to the noted drawings, in which like reference numerals represent similar parts throughout the several views of the drawings, and wherein:
Referring now to the drawings in greater detail, mobile device 11 may, for example, be a mobile phone. The illustrated mobile device 11 is an embedded device, which captures, receives, and/or transmits voice, data, text, and/or images. The illustrated mobile device 11 further includes keys 13 to allow the control of mobile device 11 and the input of information into mobile device 11.
The illustrated device content development platform(s) may be implemented as a single platform, multiple platforms, or a distributed platform. The illustrated platform(s) includes a number of software interfaces which interact with and provide corresponding windows or screens on a computer platform. These include a scripting window 16a and a corresponding scripting language interface 16b. A preview window 18a is provided which corresponds to a preview interface 18b. A source code window 20a is provided which corresponds to a source code interface 20b. A debugging window 22a is provided which corresponds to a debugging interface 22b. A device export window 24a is provided which corresponds to a device export interface 24b. A 3D modeling and/or image processing window 26a is provided which corresponds to a modeling/image processing interface 26b.
The illustrated 3D graphical virtual interface 10 graphically portrays and simulates a physical device with its interface components, and therefore serves as a three-dimensional (3D) user interface, with icons embedded therein.
Scripting language interface 16b is coupled to, and generates, one or more script files 28, which cater to the building of 3D user interfaces. Those script files 28 provide information for 3D icon and scene definition as well as for programming the animation of the defined 3D icons and scenes. The 3D icons and scenes, as animated, may be tied to or associated with mobile device 11, and tools thereof, to control or input and/or to display or output various mobile device operations, settings, events, and/or statuses.
Each of the interfaces 16b, 18b, 20b, 22b, 24b, and 26b is operable, through the use of its corresponding window, to receive controls and information via a computer screen and to display information to the user.
Preview interface 18b causes a viewer to load textures and animations. All files associated with a particular 3D model may be played, along with material animations and hierarchical animations of that 3D model.
Source code interface 20b, in connection with the source code window 20a, allows for the creation of a program using source code, typically using commands provided in code supplied for original equipment manufacturers (OEMs).
Debugging interface 22b, interacting with debugging window 22a, facilitates the simulation of script files 28 for purposes of checking and debugging the script file. Device export interface 24b, together with device export window 24a, may allow a user to cause compiled script and/or source code to be exported to a mobile device 11.
Modeling/image processing interface 26b includes software that allows an artist to perform 3D modeling and/or image processing through the use of 3D modeling and/or image processing window 26a. The artist may thereby create 3D assets for conversion into user interface assets, and define user interface layouts, to form and ultimately define a 3D user interface.
Scripting language interface 16b produces script files 28, while source code interface 20b produces source code 30. Either or each of these types of code may be compiled to produce compiled script and/or source code 32.
A file exporter 34 is provided to export files, i.e., convert such files, from modeling/image processing interface 26b into certain types of files that are usable by the compiled script and/or source code 32 to create a particular type of 3D user interface which can be exported to mobile device 11. The “exporting” performed by file exporter 34 is distinct from the exporting performed by device export interface 24b. File exporter 34 simply converts information into files that are compatible with the compiled script and/or source code 32 (and also usable by a graphics engine that operates in accordance with the compiled code), while device export interface 24b facilitates the physical exporting of such compiled script and/or source code, and associated user interface assets and user interface layout files, into mobile device 11.
In the illustrated embodiment, file exporter 34 exports information from modeling/image processing interface 26b into a set of files defining user interface assets 35, 36, and 37, and a set of files defining user interface layouts 38. Specifically, the user interface assets include 3D models 35, animations 36, and textures 37. Modeling/image processing interface 26b and the corresponding 3D modeling and/or image processing window 26a may be implemented with standard, commercially available software, for example, Maya.
The base structures and APIs 46 include textures 54, meshes 56, animations 58, cameras 60, and math and utilities 62. These structures and APIs provide full access to all geometry, animation streams, and other underlying engine data types. In addition, fixed-point math and container structures may be provided that can be used independently of the rest of the engine. Applications may be implemented, embodied within compiled script and/or source code 40, so as to interface through managed APIs 44 for some or all functions. Alternatively, they may implement their own resource management and memory instantiation techniques and, accordingly, interface directly with base structures and APIs 46. Moreover, completely bypassing managed APIs 44 is possible in the event an OEM developer wishes to write source code that takes advantage of exporters and mesh optimization tools or otherwise to retain control over how higher-level functionality is implemented.
Managed APIs 44 together with base structures and APIs 46 comprise an optimization engine layer 42. The hardware-level API 64 may include, for example, OpenGL ES, Direct3D Mobile, and SKT GIGA software. Files 35, 36, 37, and 38 are exported assets that define 3D models and animations of the 3D models. The exported assets are exported from modeling/image processing interface 26b, which, in the illustrated embodiment, includes a standard 3D modeling or image processing system. File exporter 34 exports the 3D model and animation assets from a 3D image defining system (which includes modeling/image processing interface 26b), to cause the assets to be in a format usable in a graphics engine, i.e., by one or both of managed APIs 44 and base structures and APIs 46. The exported assets may define 3D models including 3D icons and scenes of icons. In addition to models and animations, the assets may further include textures associated with shapes in the 3D models. The graphics engine may be an OpenGL ES-compatible, Direct3D Mobile-compatible, and SKT GIGA-compatible graphics engine. The embedded device may be a mobile device. Specifically, the embedded device may include a hand-held mobile communications device platform, wherein the device platform includes one or more integrated circuits. The exported assets may be stored in a memory on an integrated circuit.
The illustrated system includes a tool chain which includes file exporter 34. The tool chain may further include scripting language interface 16b to receive, via computer screen input through scripting window 16a, script statements defining a 3D user interface, and to generate a set of script files 28 representing those script statements. The script files 28 may be stored, and may be XML script files.
The tool chain may further include icon association mechanisms to associate a given 3D object in a scene with a mobile phone interface tool to cause, by manipulation of the given 3D object, at least one of an input and an output of a signal or information regarding the mobile phone.
Such input may involve a controlling function, a switch state change, and/or textual input. Such output may involve an information display, or a state or status indication. The control, input, and output may all pertain to operations, settings, events, states, and statuses of a mobile phone.
The file exporter 34 includes a file generator to generate 3D model files 35, animation files 36, texture files 37, and user interface layout files 38.
Each of the subsystems depicted in the platform(s) 14 may include software that is running on a common platform or on different computers. For example, scripting language interface 16b and corresponding scripting window 16a may be running on one computer, while, for example, modeling/image processing interface 26b and corresponding 3D modeling and/or image processing window 26a are running on a different computer.
Scene nodes are all those nodes that are not linked to a 3D model. Link nodes have 3D models associated therewith. The models associated with link nodes are exported to their own files, and their link node association is specified in the UI definition file 92. The scripting language is used to provide a set of tags that can be used in connection with externally created 3D assets, otherwise referred to as user interface assets, produced by another set of software, modeling/image processing interface 26b, as shown in the drawings.
In embodiments herein, a scene is a type of 3D “world”. It is the place where 3D elements actually live. If a person is animated, then the scene in which that person is located can be a room. A scene can be large or small. For example, a set of 3D icons may each be in their own little world. For example, a scene for a particular icon can be a box surrounding that icon. Alternatively, all of the 3D icons could coexist in one scene comprising a bigger box that takes up the whole computer screen. A user interface can support several scenes, or alternatively, just one big scene.
A scene includes nodes. A node is a point in a scene to which objects may be attached. A person in a room may be “attached” to a point represented by an X on the floor. A light may be attached to a point marked by where the electric cord comes out of the ceiling. There may be many nodes in a scene, and nodes themselves can be animated.
“Model” is a term that describes each of the objects in a scene. Each object may have its own model, or a model can comprise several objects. In one example, a puppy may be depicted which chases and fetches a ball when instructed. This puppy is defined as a model, which can be represented by an invisible wire-frame describing its shape and behavior; in this case, the model also includes the ball that the puppy chases and the checker-board base the puppy sits on. All three components are part of the same model. There can be many models in a scene, but each model is essentially independent of the others.
Mesh geometry can be drawn in various ways; it can be painted with a solid color, smoothly shaded between the colors at its vertices, or drawn with a texture map. “Texture” is the name for a specially formatted image which is used to “drape” over the geometry represented by a model in order to give it a detailed surface. Textures are defined in texture files, which, in the illustrated embodiment, may, for example, have the extension .qxt. Those textures are associated with the geometry they modify, for example, by the manner in which the name of the file is specified.
Each scene has at least one camera. The camera is the view onto the scene and, much like an animated object, is defined by a node in the scene specified by the author. In the illustrated embodiment, a camera must be activated before one can see through it. Switching on another camera may result in the automatic turning off of an already-active camera. A default camera (looking at the center of the world) may be provided for every scene, which is activated if no other camera is turned on.
A scene may have one or more lights. In addition, or alternatively, a scene may include default ambient “all-over” lighting. It is possible to bake lighting into the vertex color and texture of a model to simulate static lighting in this ambient mode. Life-like dynamic lighting may be achieved by adding a light to a scene. A light is attached to a node and, in addition, is associated with another node; that association defines the direction in which the light shines. Accordingly, a light can be pointed like a “torch”. A light definition may also include parameters to specify the color of the light that is shone into the scene.
One or more animation files, files with the extension .qxa in the illustrated embodiment, may be provided, that describe how an object is animated. When an animation file is called upon, it is applied to a specific node within the scene. Animation files are a bit like a strip of film (or a timeline in Flash), and contain a set of frames. These frames do not have to represent a continuous sequence, and can contain several completely different animations in the same frame “stack”, which is why, in the illustrated embodiment, when they are called upon, both a start frame and an end frame are specified.
When an animation is activated, it is applied to a specific named node that it is meant to animate. By way of example, one animation file may be provided for animating a puppy, while a separate animation file is provided for animating the camera and light. The instructions specified in an animation file are applied to the object attached to that node. For example, a puppy may spin on a spot, fly around the scene, or jump up and down.
A 4-way navigation key that typically is provided in a mobile device keyboard can be used to animate a puppy in various ways. In this example, one may press the right nav key and the ball rolls off to the right, shortly followed by the chasing puppy, which retrieves it.
The 3D Scene.
q3dscene may be used as a tag that defines a 3D scene (or world). It takes a resource path which points to an imported filename.uis file (uis = UI Scene). Resources may be exported from the puppy source using, for example, a plug-in (file exporter 34, as shown in the drawings); the exported .uis file is then imported as a resource that the q3dscene tag can reference.
A Resource Tree folder may be provided, e.g., holding a resource called PuppyScene_uis which contains the external file scene_puppy.uis—note that the names do not have to correspond.
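By way of a minimal sketch, such a scene declaration might appear as follows; the tag name and the use of a resource path reflect the description above, while the id attribute and the exact attribute spellings are illustrative assumptions rather than confirmed syntax:

    <q3dscene id="puppyworld" resource="3D/PuppyScene_uis"/>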
The 3D Model.
A model is a mesh (or wire-frame) describing one or more visible objects. In this example, the puppy, the base it sits on, and the ball are all part of the same model—this is because they are all dependent upon each other in some way. It would also be quite acceptable to have several models in the same scene—all attached to different nodes. In that scenario, however, each model would be independent of the others.
The attributes that the q3dmodel tag takes are a resource path (in this case PuppyModel_qxm, which imports the external file link_puppy.qxm) as well as an anchor node—this is the point (node) in the scene that this object (the puppy) attaches to. The model may be provided with one or more textures, without which it will appear in a uniform or per-vertex color. This command loads a 3D model asset (see the drawings).
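A corresponding sketch of the model declaration is given below; the resource path and anchor node are the attributes described above, though their exact spellings here are assumptions:

    <q3dmodel resource="3D/PuppyModel_qxm" anchornode="link_puppy"/>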
Textures.
Textures may behave slightly differently than other imported resources. Models within a scene use textures and apply them so that they can be seen; however, there is no need to provide an attribute or tag to define a texture, as it is something that only the 3D content knows about. When a texture is accessed in this fashion, in the illustrated example embodiment, it is loaded into memory by resource manager 50 (shown in the drawings).
The Camera.
In order to be able to look at the scene, at least one camera is provided. A camera may be defined with the q3dcamera tag and, once again, is something that the author of the content specifies. In the puppy example, a camera is attached to a node called Puppycam, and the camera tag has an attribute cameranode=“Puppycam”. There is also a startactive=“_true” attribute to make sure that the camera is turned on as soon as this scene loads.
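Putting this together, a sketch of the camera declaration might read as follows, using the cameranode and startactive attributes named above (any other detail is illustrative):

    <q3dcamera cameranode="Puppycam" startactive="_true"/>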
It is possible to define more than one camera in a scene, but in the embodiment illustrated herein, to simplify the program, only one can be active at any one time. Turning on a different camera automatically switches off the current camera.
The Lighting.
The optional tag q3dlight allows one to illuminate the scene by shining a light from one node to another. It is optional because, without it, the scene will assume its own default ambient lighting. A list of lights available in a scene should be provided by the 3D author.
In the example, three q3dlight tags are defined: they have the IDs white, red, and green, are all attached (anchored) to the scene node x_light, and all point to the node link_puppy. Remember that the puppy model is attached to the node link_puppy, so in effect the light is shining on the puppy. Only the first light, white, is set to be active on startup, and so when this scene first loads, there will be light shining on the puppy. As with the camera, only one light is allowed at a time (per node)—the content will later use events to switch each of these lights on in turn (and in doing so give the effect of changing the light's color).
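A sketch of the three light declarations, consistent with the IDs, nodes, and startup state described above, might read as follows; the anchor and target attribute spellings are assumptions:

    <q3dlight id="white" anchornode="x_light" targetnode="link_puppy" startactive="_true"/>
    <q3dlight id="red" anchornode="x_light" targetnode="link_puppy"/>
    <q3dlight id="green" anchornode="x_light" targetnode="link_puppy"/>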
At this point, a world has been defined and populated with a puppy and ball, a light has been shone on it, and a camera has been provided so it can be seen. The real power of 3D animation comes with what can be done with this object (or objects) once defined, and this is exercised with the q3danim tag. The tag itself takes several attributes: sframe and eframe (start and end frame) provide a way of selecting particular parts of a given animation sequence. It also takes a resource, which points to an animation file 36; once loaded, it may be managed in memory by resource manager 50.
In the puppy example, the 3D resource tree folder includes two qxa (animation) files defined—PuppyModel_qxa and PuppyScene_qxa. The former describes animation specific to the puppy, and the latter contains animations for the light and the camera—it is very much an authoring decision how these animations are broken down. An attribute may be provided specifying the node to which the animation is applied, but this is not a random choice: each sequence of animations is specific to a particular object, so it would be no use, for example, to use an animation sequence meant for the camera and apply it to the puppy. The author of the 3D content should therefore make it clear which animation refers to which object. The final attribute provides control over whether or not the animation is looped.
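A sketch of a single animation invocation using the attributes just described might read as follows; sframe and eframe are named above, while the node, resource, and loop attribute spellings are assumptions:

    <q3danim node="link_puppy" resource="3D/PuppyModel_qxa" sframe="250" eframe="350" loop="1"/>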
Animating the Puppy.
In the fragment trigml/main, a few variables may be pre-defined. These include, for example, /var/object, whose value records whether the device is in “puppy”, “camera”, or “light” mode.
The first group of q3danim tags uses the state of one of these, /var/object, to test whether the device is in “puppy” mode. The reason for this is that the same 4-way navigation key will be used to drive several different animations (for the puppy, the camera and the light). So this first q3danim listener determines when there has been a _keyleft key press while the device is in puppy mode, and when this occurs, it applies the animation contained in the resource 3D/PuppyModel_qxa to the node link_puppy. It will use frames 250 through 350, and loop once. It will then listen for the remaining three possible keypresses (while ensuring that the device is still in “puppy” mode), and apply the relevant sequence of frames to the puppy in order to animate it.
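Purely for illustration, one such listener might be sketched as follows. The enclosing listener element and its event and condition attributes are assumptions (the actual TrigML event syntax is not reproduced here); the mode test, frame range, resource, and node come from the description above:

    <listener event="_keyleft" cond="/var/object='puppy'">
        <q3danim node="link_puppy" resource="3D/PuppyModel_qxa" sframe="250" eframe="350" loop="1"/>
    </listener>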
The script may use select sequences from this list. For example, frames 10-15 (Go to sit position) and 15-45 (Sit idle anim) may not be featured in the puppy animation sequences that have been used. One possible exercise would be to change the puppy navkey listeners from a single q3danim into a <seq> of two animations, as described below.
The <seq> tag works by processing each child in turn, but does not fire a child until the previous one has completed. Thus, frames 250-350 are run once, and then frames 15-45 are looped once the first sequence has finished and until some other animation event has fired.
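The exercise described above might then be sketched as follows; as before, the attribute spellings other than sframe and eframe are assumptions, and the loop values are illustrative:

    <seq>
        <q3danim node="link_puppy" resource="3D/PuppyModel_qxa" sframe="250" eframe="350" loop="1"/>
        <q3danim node="link_puppy" resource="3D/PuppyModel_qxa" sframe="15" eframe="45" loop="_loop"/>
    </seq>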
Camera Motion.
The next group of four listeners does very much the same thing for the camera, this time testing that the device is in “camera” mode (the camera has two sub-modes, smooth and stepped, so the listeners test that the device is in smooth mode in this instance). If so, animations are applied from the resource PuppyScene_qxa to the node Puppycam. Note how this differs from the puppy animations: a different set of animations is taken from a different file and applied to a different node. Because there is a different animation file, the frame numbers are different—this time the effect of the animation is to take the node to which the camera is attached and spin it around in a circle about the puppy. As with the puppy, the frames available for the camera are enumerated in the corresponding animation file.
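Under the same assumptions as the puppy listener sketch above, one of these camera listeners might read as follows; the event and condition syntax, the smooth-mode test, and the frame numbers are illustrative, while the resource and node come from the description:

    <listener event="_keyright" cond="/var/object='camera'">
        <!-- frame numbers are illustrative; the camera file's actual ranges differ -->
        <q3danim node="Puppycam" resource="3D/PuppyScene_qxa" sframe="100" eframe="200" loop="1"/>
    </listener>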
The processing performed by each of the elements shown in the figures may be performed by a general purpose computer alone or in connection with a specialized processing computer. Such processing may be performed by a single platform or by a distributed processing platform. In addition, such processing can be implemented in the form of special purpose hardware or in the form of software being run by a general purpose computer. Any data handled in such processing or created as a result of such processing can be stored in any type of memory. Such data may be stored in a temporary memory, such as in the RAM of a given computer system or subsystem. In addition, or in the alternative, such data may be stored in longer-term storage devices, for example, magnetic discs, rewritable optical discs, and so on. For purposes of the disclosure herein, computer-readable media may comprise any form of data storage mechanism, including such different memory technologies as well as hardware or circuit representations of such structures and of such data.
The claims, as originally presented and as they may be amended, encompass variations, alternatives, modifications, improvements, equivalents, and substantial equivalents of the embodiments and teachings disclosed herein, including those that are presently unforeseen or unappreciated, and that, for example, may arise from applicants/patentees, and others.