The present application relates to the field of computers, and in particular, to 3D image implementation.
With the development of computer technologies, three-dimensional (3D) images have become increasingly popular among users. Therefore, a 3D model format was proposed and widely used in various scenarios, such as live streaming and games, to implement various 3D visual designs.
Some embodiments of the present application provide a method for implementing a 3D image, a computer device, and a non-transitory computer-readable storage medium.
An aspect of the embodiments of the present application provides a method, including:
An aspect of the embodiments of the present application further provides a computer device including a processor and a memory, where the memory stores computer-readable instructions that, when executed by the processor, cause the processor to:
An aspect of the embodiments of the present application further provides a non-transitory computer-readable storage medium storing computer-readable instructions that, when executed by at least one processor, cause the at least one processor to:
To make the objectives, technical solutions, and advantages of the present application clearer and more comprehensible, the present application will be further described in detail with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely intended to explain the present application, and are not intended to limit the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative efforts shall fall within the protection scope of the present application.
It should be noted that the descriptions related to “first”, “second”, and the like in the embodiments of the present application are merely used for the illustrative purpose, and should not be construed as indicating or implying the relative importance thereof or implicitly indicating the number of technical features indicated. Thus, features defined with “first” and “second” may explicitly or implicitly include at least one of the features. In addition, technical solutions of various embodiments can be combined with each other, but they must be based on the implementation by those of ordinary skill in the art. When a combination of technical solutions is contradictory or cannot be implemented, it should be considered that such a combination of the technical solutions neither exists, nor falls within the protection scope claimed by the present application.
In the description of the present application, it should be understood that, the reference numerals of steps do not indicate the order of execution of the steps, but are merely to facilitate the description of the present application and differentiation between the steps, and thus will not be interpreted as limiting the present application.
To help those skilled in the art understand the technical solutions provided in the embodiments of the present application, the related technologies are described below:
FBX, DAZ, USD, and other formats cannot be loaded at runtime. Using these formats requires intermediate data to be generated in advance in a game engine for runtime rendering. These formats cannot be directly used as a transmission carrier provided to a client; they are more applicable as production tools rather than consumption carriers, and can only be used by professional tools in professional fields as a productivity medium.
AssetBundle, Pak, and other formats are strongly bound to an engine version, and upgrading the engine version may require all resources to be repackaged. As a result, these formats are not applicable to products centered on player creation. These formats are also strongly correlated with operating systems: resource packs for different platforms are not interchangeable and need to be generated separately; they cannot be propagated and traded as independent resources and cannot be assigned the value of virtual assets; and they cannot be exported at runtime, cannot be re-created or modified, and their resources cannot be reused.
The MMD (MikuMikuDance) format is used for 3D animated movie scenes, and only supports exporting videos as projects in exclusive tools. It has commercial licensing restrictions and no ecosystem to support its application in games or virtual YouTuber (vTuber, a virtual uploader) scenarios.
The VRM format is used in virtual live streaming and social VR games, but it contains only data of the character part, cannot be extended to a larger usage scenario, has a poor rendering effect, and has regional restrictions. For example, it supports lip synchronization only in Japanese, and its shaders support only MToon (a toon shader with global illumination), Unlit (a shader that makes materials unaffected by lighting), and physically based rendering (PBR). As a result, this format has low extension flexibility: for example, it supports neither animation nor scene loading and cannot be functionally extended by third parties, which hinders the development of vTubers.
As mentioned above, each of the above 3D file formats has specific limitations; for example, the inventors realized that the cross-platform capability of existing 3D model formats cannot meet the requirements of application in various scenarios. The present application provides a new file format to support players in creating high-degree-of-freedom 3D scenes that can be shared and traded. The new file format is not affected by technical factors such as operating systems, tool types, and tool versions; it is easy to use, create, and modify, and convenient to load and export at runtime.
According to the technical solutions provided in the present application, functions are developed based on the Extensions and Extras fields reserved in the glTF format, and compatibility with existing glTF files is maintained, thereby ensuring that the JSON Schema of the standard glTF format is not destroyed and that files can be opened and modified by other tools. Conventional glTF tools remain able to preview files in the new file format, such that the previewing and editing capabilities of non-special-purpose tools are retained to some extent, a minimal data structure of the files is ensured, and default data may be used in the fields. A large amount of reused data does not need to be saved in the Extras field; data with strong commonality and strong reusability is saved in the Extensions field.
Terms in the present application are explained below:
Three-dimensional (3D) image: a type of image file used to store information of a 3D model. A 3D image includes a 3D model, a 3D animation, and a 3D project file. The 3D image may include model information composed of polygons and vertices in a 3D space interpreted by 3D software, and the model information may further include information such as color, texture, geometric shape, light source, and shadow. 3D image file formats may be used in VR, 3D printing, games, movie effects, architecture, medicine, and other related scenarios.
Graphics Language Transmission Format (glTF): a 3D computer graphics format and standard. It supports storage of 3D models, appearance, scenes, and animations, and is a simplified and interoperable format for 3D assets that minimizes the file size and the processing difficulty for applications. glTF assets include a JSON file (.gltf) that describes a complete scene: the hierarchical structure of nodes, materials, cameras, meshes, animations, and descriptor information of other constructs; binary files (.bin) including geometry and animation data as well as other data in buffers; and textures (.jpg, .png). 3D objects in a scene are defined using meshes connected to the nodes. The material defines the appearance of the objects. The animation describes how a 3D object is transformed over time. The skin defines how the geometry of the object is deformed based on a skeleton pose. The camera describes the view configuration for the renderer.
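For orientation, the following is a minimal sketch of a standard glTF asset, written as a TypeScript object literal so that explanatory comments can be attached; the top-level field names follow the public glTF 2.0 specification, and the concrete values are illustrative only.

    // Minimal sketch of a glTF 2.0 asset (values are illustrative).
    const gltfAsset = {
      asset: { version: "2.0" },                 // required version declaration
      scene: 0,                                  // index of the default scene
      scenes: [{ nodes: [0] }],                  // a scene refers to one or more nodes
      nodes: [{ name: "example", mesh: 0 }],     // a node may refer to meshes, cameras, skins, etc.
      meshes: [{ primitives: [{ attributes: { POSITION: 0 } }] }], // mesh data of a 3D object
      buffers: [{ uri: "data.bin", byteLength: 1024 }],            // external binary data
      accessors: [],                             // typed views into buffer data
      materials: []                              // appearance of the objects
    };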
Resources: may include an image, a shader (shading code), a material, a model, an animation, etc.
Material: a data set expressing interaction of an object with light and read by the renderer, including a map, a lighting algorithm, etc.
Texture: a regular, repeatable segment of a bitmap, and the basic unit of data input.
Map: includes a texture and many other information, such as a texture coordinate set, map input and output control, etc. Maps have a plurality of forms, such as lightmaps, environment maps, reflection maps, etc. A lightmap is used to simulate a lighting effect of a surface of an object. An environment map includes six textures and a corresponding texture coordinate set.
Texture mapping: used to map a texture onto a surface of a 3D object using a coordinate set (such as UV coordinates).
AssetBundle: a file storage format supported by Unity and also a resource storage and update method officially recommended by Unity. It may compress resources (assets), pack them into groups, load them dynamically, and implement hot updates.
FBX: a format used by the FilmBox software (later renamed MotionBuilder). FBX may be used for the import and export of models, materials, actions, and camera information between software such as Max, Maya, Softimage, etc.
DAZ: a file format for 3D scenes created by the modeling program DAZ Studio.
Universal Scene Description (USD): a file format based on a whole animated movie process and provided by Pixar.
Pak: an image resource format customized by the GUM engine.
Virtual Reality Modeling (VRM): a format for a virtual 3D humanoid model.
Avatar: a human-like 3D character model.
Metaverse: also referred to as meta-universe, post-universe, metaphysical universe, extra-sensory space, or virtual space; a network of 3D virtual worlds focused on social connections. A metaverse may involve a persistent and decentralized online 3D virtual environment.
Game engine: the core component of some pre-written, editable computer game systems or interactive real-time image applications. These systems provide game designers with the various tools required to write games, and aim to allow game designers to make game programs easily and quickly without starting from scratch. Most game engines support a plurality of operating platforms, such as Linux, Mac OS X, and Microsoft Windows. A game engine includes the following systems: a rendering engine (namely, the "renderer", including a 2D image engine and a 3D image engine), a physics engine, a collision detection system, sound effects, a script engine, computer animations, artificial intelligence, a network engine, and scene management.
The technical solutions provided in the embodiments of the present application are described below using an exemplary application environment.
The computer device 2 may run operating systems such as Windows, Android™, or iOS. In addition, a user may install various applications according to requirements, for example, install a software development kit (SDK) for making 3D games, 3D animations, 3D videos, and the like. Hereinafter, the present application provides a method and system for implementing a 3D image, a computer device, and a computer-readable storage medium.
As shown in
In S200, a newly added attribute is defined in an attribute extension field of a target formatted file associated with a target format compatible with the glTF format, where the target format is obtained by defining extension field information of the glTF format.
In S202, a 3D image is generated based on the target formatted file.
In S204, a newly added effect/function supported by the 3D image is implemented based on the newly added attribute in the target formatted file.
Various elements that construct a 3D image are defined in the glTF format. These elements are scene, node, mesh, camera, material, texture, and skin.
The scene refers to entries for describing a scene structure and defines a scene graph by referring to one or more nodes.
The node is attached to the scene. The node may refer to child nodes, meshes, cameras, skins that describe mesh transformations, etc.
The mesh is used to describe mesh data of a 3D object appearing in a scene.
The camera refers to viewing frustum configurations for rendering the scene.
Each of the above elements has one or more attributes. An attribute is used to define properties, features, characteristics, descriptions, etc. of a corresponding element.
For example, an attribute list of a node may include: camera, child node, skin, matrix, mesh, quaternion rotation, scale ratio, position information, weight array of meshes, name, an attribute extension field, and an attribute extras field.
In this embodiment of the present application, a new 3D file format (a target format) is provided on the basis of the glTF format.
In the target format, all the functions and effects supported by the glTF format are inherited, and on the premise of not destroying the structure of the glTF format, the attribute extension field and the attribute extras field are used to define the newly added attribute of the target format.
In exemplary application, the newly added attribute includes: an attribute defined in the attribute extension field to be pointed to by a node; an attribute defined in the attribute extension field, to which no node points; and/or an attribute defined in a node.
In exemplary application, the newly added attribute may include an audio file attribute, an audio behavior attribute, an expression transformation attribute, a collider attribute, a humanoid bone attribute, a cloth changing attribute, a lightmap attribute, a metadata attribute, a bone dynamics attribute, a post-processing attribute, a dynamic script attribute, a rendering attribute for a scene, a skybox attribute, a cubemap attribute, a story timeline attribute, a sprite attribute, a streaming media attribute, a resource variable attribute, an export attribute, etc. Certainly, other attributes supported by the engine or web may be included, so as to support more effects and functions.
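To make this mechanism concrete, the following is a minimal sketch, as a TypeScript object literal, of how a node might point to a newly added attribute defined in the attribute extension field; the extension name EXT_target_format and all field names here are hypothetical placeholders, not the normative schema of the target format.

    // Hypothetical sketch only: "EXT_target_format" and its fields are assumed names.
    const targetFormattedFile = {
      asset: { version: "2.0" },                    // the regular glTF fields are unchanged
      extensions: {
        EXT_target_format: {
          audios: [{ uri: "voice.wav", mimeType: "audio/wav" }] // common, reusable data
        }
      },
      nodes: [{
        name: "speaker",
        extensions: { EXT_target_format: { audio: 0 } }, // the node points to the new attribute
        extras: { customNote: "non-common, node-specific supplement" } // attribute extras field
      }]
    };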
Providing the target formatted file in the target format (the new 3D file format) has the following advantages:
It should be noted that, to optimize the loading of the target formatted file and reduce memory usage, two different loading mechanisms are provided to adapt to different usage scenarios: a large amount of reused attribute information does not need to be saved in the attribute extras field, and data with strong commonality and strong reusability is saved in the attribute extension field.
The newly added attribute of the target formatted file is described below through a plurality of optional embodiments.
As an optional embodiment, the target formatted file may support an audio function.
As shown in
The audio file attribute may be pointed to by the node, so as to be used by the node.
The audio file attribute may include:
As listed in Table 1, the audio file attribute defined in the attribute extension field includes the following information:
The target formatted file may be exported with either a .gltf suffix or a .glb suffix. When the file is exported as a separate .gltf file, the uri field is used; when the file is exported as a .glb file, the information is stored in the bufferView field. It should be noted that more suffixes may be defined for different export types in the future, for example, different suffixes for files based on character models or scenes, as a functional distinction.
In the table, mimeType specifies an audio format, such as the supported WAV and OGG formats.
Based on the newly added audio file attribute, an audio segment may be exported in Unity in the WAV or OGG format. By default, a short audio clip may be exported in the WAV format, and a long audio clip may be compressed and exported in the OGG format.
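A minimal sketch of what such an audio file attribute might look like is given below as a TypeScript object literal; uri, mimeType, and bufferView are the fields named above, while the surrounding array layout and all values are assumptions.

    // Hypothetical sketch; only uri, mimeType, and bufferView are named in the text above.
    const audioFileAttribute = {
      audios: [
        { name: "shortClip", uri: "clip.wav", mimeType: "audio/wav" }, // separate .gltf export uses uri
        { name: "longClip", bufferView: 3, mimeType: "audio/ogg" }     // .glb export stores data in bufferView
      ]
    };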
As an optional embodiment, the target formatted file may support an audio control function.
As shown in
On the basis of referring to the audio file attribute, the node may further refer to the audio behavior attribute.
As listed in Table 2, the audio behavior attribute defined in the attribute extension field includes the following information:
As an optional embodiment, the target formatted file may support an expression transformation function.
As shown in
The expression transformation attribute may be pointed to by the node, so as to be used by the node.
As listed in Table 3, the expression transformation attribute defined in the attribute extension field includes the following information:
In the table, blendShapeValues defines a mapping table that records the weights of a plurality of mesh transformations for expression transformations; materialVector4Values defines a list that records a plurality of sets of material parameters, each set including four-component vectors (for example, a mesh tangent and a shader); materialColorValues defines another list that records a plurality of sets of material parameters representing colors; and materialFloatValues defines still another list that includes a plurality of sets of material parameters of the float type.
When the above information is exported from an expression transformation module, required information is exported as extensions and then referred to under the node.
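As a hedged illustration, the expression transformation attribute might be laid out as follows; the four list names are taken from the description above, while the entry structure and values are assumptions.

    // Hypothetical sketch; entry layouts are assumed, list names follow the description above.
    const expressionTransformAttribute = {
      blendShapeValues: [{ mesh: 0, index: 2, weight: 100 }],  // mesh transformation weights
      materialVector4Values: [{ material: 1, propertyName: "_Tangent", targetValue: [0, 0, 1, 1] }],
      materialColorValues: [{ material: 1, propertyName: "_Color", targetValue: [1, 0.8, 0.8, 1] }],
      materialFloatValues: [{ material: 1, propertyName: "_Glossiness", targetValue: 0.5 }]
    };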
As an optional embodiment, the target formatted file may support a collision effect.
As shown in
The collider attribute may be pointed to by the node, so as to be used by the node.
As listed in Table 4, the collider attribute defined in the attribute extension field includes the following information:
As an optional embodiment, the target formatted file may support a humanoid bone effect.
As shown in
The humanoid bone attribute may be pointed to by the node, so as to be used by the node. The node corresponds to actual humanoid bone points.
The humanoid bone attribute defines Avatar (a virtual character) used in a humanoid model.
Any model imported as a humanoid animation type may generate Avatar resources that store information about driving the animator.
The Avatar system is used to tell the game engine how to recognize that a particular animated model is a humanoid model in layout, and which parts of the model correspond to the legs, arms, head, and body. After this step, the animation data may be "reused". It should be noted that, due to the similarity in bone structures between different humanoid characters, it is possible to map one humanoid character to another in an animation, enabling retargeting and inverse kinematics.
As listed in Table 5, the humanoid bone attribute defined in the attribute extension field includes the following information:
In the table, humanBones records a plurality of joints and the connection and spatial transformation relationships between the joints (such as the neck and the head).
On the basis of referring to the humanoid bone attribute, the node may further refer to a bone change attribute.
The bone change attribute further includes the content listed in Table 6.
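As a hedged illustration, the humanoid bone attribute might map logical joints to scene nodes as follows; humanBones is named above, and the entry layout is an assumption.

    // Hypothetical sketch; humanBones is named above, the entry layout is assumed.
    const humanoidBoneAttribute = {
      humanBones: [
        { bone: "neck", node: 11 },          // logical joint mapped to a scene node
        { bone: "head", node: 12 },          // connection/spatial relationships follow the node tree
        { bone: "leftUpperLeg", node: 4 }
      ]
    };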
As an optional embodiment, the target formatted file may support a cloth changing function.
As shown in
where the cloth changing attribute includes a list of different cloth changing solutions and a material parameter list of each cloth changing solution.
The cloth changing attribute may be pointed to by the node, so as to be used by the node.
On the premise that there is the Avatar, the node may refer to/point to the cloth changing attribute, thereby supporting cloth changing of characters.
The cloth changing system implements cloth changing by changing the mesh visibility or materials on the mesh.
As listed in Tables 7 to 9, the cloth changing attribute defined in the attribute extension field includes the following information:
Table 7 lists a set of cloth changing solutions, Table 8 lists information of each cloth changing solution, and Table 9 lists changes included in a single cloth changing.
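A minimal sketch of this three-level structure (a set of solutions, per-solution information, and per-change entries) might look as follows; all names here are hypothetical.

    // Hypothetical sketch mirroring the three levels described above (all names assumed).
    const clothChangingAttribute = {
      dressUpConfigs: [                  // the set of cloth changing solutions (Table 7)
        {
          name: "casual",                // information of one cloth changing solution (Table 8)
          changes: [                     // changes included in a single cloth changing (Table 9)
            { node: 7, visible: true },  // toggle mesh visibility
            { node: 9, material: 2 }     // or swap the material on the mesh
          ]
        }
      ]
    };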
As an optional embodiment, the target formatted file may support a lightmap effect.
As shown in
where the lightmap attribute is used to instruct an engine to pre-calculate a change in brightness of surfaces in a scene. The lightmap attribute is defined in the attribute extension field and does not need to be pointed to in other objects.
As listed in Table 10, the lightmap attribute defined in the attribute extension field includes the following information:
In the table, each map stores different information about lighting for the scene of the user.
For example, LightmapTextureInfo[] includes: the color of the incident light (required), the main direction of the incident light (required), the occlusion mask for each light (not required), etc.
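As a hedged illustration, a lightmap attribute entry might look as follows; LightmapTextureInfo is named above, and the field names within each entry are assumptions.

    // Hypothetical sketch; per-entry field names are assumed.
    const lightmapAttribute = {
      lightmapTextureInfos: [
        {
          color: 4,        // required: texture index for the color of the incident light
          direction: 5,    // required: texture index for the main direction of the incident light
          shadowMask: -1   // not required: occlusion mask for each light (-1 assumed to mean "none")
        }
      ]
    };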
As an optional embodiment, the target formatted file may support metadata management, so as to extend support for element-based management and the like.
As shown in
where the metadata attribute includes resource description information, resource management information, legal information, and/or content reference information. The metadata attribute is defined in the attribute extension field and does not need to be pointed to in other objects.
The resource description information is used for discovery and recognition, and may include elements such as title, abstract, author, and keywords, which are arranged in sequence to form a chapter; it describes the type, version, relationship, and other characteristics of digital materials.
The resource management information is information such as resource types and permissions for managing resources.
The legal information provides information about creators, copyright owners, and the public license.
The content reference information is information about content.
As listed in Table 11, the metadata attribute defined in the attribute extension field includes the following information:
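As a hedged illustration, a metadata attribute covering the four categories of information described above might look as follows; all field names and values are assumptions.

    // Hypothetical sketch; the grouping follows the four categories described above.
    const metadataAttribute = {
      title: "Example Scene", author: "Alice", keywords: ["demo"],  // resource description information
      resourceType: "scene", permissions: "read-only",              // resource management information
      copyright: "Alice, 2022", license: "CC-BY-4.0",               // legal information
      reference: "https://example.com/source"                       // content reference information (assumed URL)
    };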
As an optional embodiment, the target formatted file may support an animation based on bone dynamics.
As shown in
The bone dynamics attribute may be pointed to by the node, so as to be used by the node.
In exemplary application, movement of skirts, hair, pendants, and the like following the movement of bones and bodies may be simulated.
As an optional embodiment, the target formatted file may support post-processing, so as to extend support for post-processing functions and the like.
As shown in
The post-processing attribute may be pointed to by the node, so as to be used by the node.
The volume component includes attributes that control how it affects the camera and how it interacts with other volumes. It is an effect that acts on the full screen, is used for 3D rendering, improves rendering quality, and requires very little time to set up.
The attributes of a volume component are described below. As listed in Table 12, the attributes of a volume component include the following information:
Which effect is to be used may be specified based on the ID of the configuration file.
Regardless of whether the effect is used globally or locally, it needs to be pointed to by the node to serve the node where the post-processing attribute is specified.
The supported post-processing effects may include: ambient occlusion, bloom, channel mixer, chromatic aberration, color adjustment, color curve, depth of field, film grain, lens distortion, lift/gamma/gain, motion blur, Panini projection, shadows/midtones/highlights, split toning, tone mapping, vignetting, and white balance.
For each post-processing effect, a corresponding attribute may be defined in the attribute extension field.
For example, vignetting is a reduction of an image's brightness or saturation toward the periphery compared to the image center. Vignetting includes the attributes in Table 13.
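As a hedged illustration, a post-processing attribute combining a volume with a vignetting effect might look as follows; the volume/profile split follows the description above, and all field names are assumptions.

    // Hypothetical sketch; field names are assumed.
    const postProcessingAttribute = {
      profiles: [
        { id: 0, vignette: { intensity: 0.4, smoothness: 0.2, color: [0, 0, 0] } } // effect settings
      ],
      volumes: [
        { profile: 0, isGlobal: true, weight: 1.0 } // which configuration file (by ID) the volume uses
      ]
    };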
As an optional embodiment, the target formatted file may execute a dynamic script, for example, to implement a hot update.
As shown in
In exemplary application, the above character string may point to an external script, such as Puerts, Lua, and the like.
Rendering events and events from an input device are received, and the script engine executes scripts after receiving corresponding events.
The events may include: rendering of the first frame of an object, enabling of an object component, disabling of the object component, destruction, per-frame updating, and periodic calls according to time after all objects are updated.
Further, the events may include manually triggered events, such as events triggered by the following devices: a keyboard, a mouse, a joystick, a controller, a touch screen, a motion sensing device (such as an accelerometer or a gyroscope), a virtual reality (VR) or augmented reality (AR) controller, etc.
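As a hedged illustration, a dynamic script attribute binding a script to such events might look as follows; the event names paraphrase the list above, and all field names are assumptions.

    // Hypothetical sketch; field and event names are assumed.
    const scriptAttribute = {
      scripts: [{
        language: "lua",                          // e.g., an external script engine such as Lua or Puerts
        code: "function onUpdate() end",          // the character string carrying/pointing to the script
        events: ["firstFrame", "enable", "disable", "destroy", "update", "lateUpdate"]
      }]
    };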
As an optional embodiment, the target formatted file may support scene rendering.
As shown in
As listed in Table 14, the rendering attribute for the scene defined in the attribute extension field includes the following information:
As an optional embodiment, the target formatted file may support a skybox effect.
As shown in
A video game level is used as an example. When a skybox is used, the level is enclosed in a cube. The sky, distant mountains, distant buildings, and other inaccessible objects are projected onto the surfaces of the cube, creating the illusion of a distant 3D environment. Similarly, the skydome uses a sphere or hemisphere instead of a cube.
As an optional embodiment, the target formatted file may support a cubemap effect.
As shown in
The cubemap attribute is not pointed to by a node, but is used as a special map type to be pointed to in the material.
As listed in Table 16, the cubemap attribute may include the following information:
A cubemap is a set of six square textures that represent reflections in the environment. The six squares form the faces of an imaginary cube surrounding an object; each face represents a view along one world axis direction (up, down, left, right, front, or back). The image type (imageType) includes a texture formed by combining six squares in one row or column (with an aspect ratio of 6:1 or 1:6), as well as three types of panoramic images (with an aspect ratio of 2:1).
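As a hedged illustration, a cubemap attribute might look as follows; imageType is named above, and the remaining fields are assumptions.

    // Hypothetical sketch; imageType is named above, other fields and the enum value are assumed.
    const cubemapAttribute = {
      cubemaps: [{
        imageType: "Row6x1",  // assumed value: six squares combined in one row (aspect ratio 6:1)
        texture: 2            // referred to from a material as a special map type
      }]
    };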
As an optional embodiment, the target formatted file may implement operations based on the story timeline.
The target formatted file includes a node, and the story timeline attribute is attached to the node to extend editing functions and the like.
As shown in
The story timeline attribute may be pointed to by a node, so as to be used by the node.
The story timeline attribute may include the following information:
Each track includes the following parameters: a resource name, a start time, an end time, and a resource ID. The resource ID specifies the subscript position of the data source, which may be an animation, a map, audio, or other data.
The track parameters may include: a track name (the character string type, not required), a start time (the float type, required), and an end time (the float type, required).
A generic type may be used to represent sub-track data contained in the track group of each category, for example, describe the set of all sub-tracks under the category.
Track data under different categories may be obtained by inheriting a specified generic type, for example, two track groups respectively representing animation and audio may be obtained.
Material curve parameters may all inherit from the generic type; for example, they may specify which of a plurality of materials on the renderer is used, whether to execute in reverse after execution, and the curve data.
The expression transformation curve is used for smooth facial capture expression transformation.
For the floating-point parameter curve of materials, material parameters of the float type may be continuously updated over time, including: the name of the material parameter to be set.
For the color parameter curve of materials, material parameters of the color type may be continuously updated over time; this curve inherits from the above and includes the color values at the start and the end. Interpolation is performed based on time, and the color of each frame is continuously updated.
The animation component on the specified node is obtained; only the node ID is exported, and the remaining variables are created during loading.
When parameters in the story timeline attribute are used in the node, the playing behavior of the story timeline may be specified, where the playing parameters controlling the playing behavior may include: an ID (describing a track name, required), whether to play automatically after loading (the bool type, not required), and whether to play in a loop (the bool type, not required).
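As a hedged illustration, a story timeline attribute and its node-side playing parameters might look as follows; the parameter names paraphrase the track and playing parameters described above and are assumptions.

    // Hypothetical sketch; field names paraphrase the parameters described above.
    const storyTimelineAttribute = {
      timelines: [{
        name: "intro",
        animationTracks: [{ trackName: "wave", startTime: 0.0, endTime: 2.5, resourceId: 3 }],
        audioTracks: [{ trackName: "greeting", startTime: 0.5, endTime: 2.0, resourceId: 0 }]
      }]
    };
    // Node-side playing parameters: an ID (required), auto-play and loop flags (optional).
    const nodeTimelineRef = { timeline: "intro", autoPlay: true, loop: false };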
As an optional embodiment, the target formatted file may implement sprite operations to extend texture management and combination, and the like.
As shown in
The sprite attribute may be pointed to by the node, so as to be used by the node. As listed in Table 17, the sprite attribute defined in the attribute extension field may include the following information:
A sprite is a two-dimensional graphic object. In 3D scenes, sprites are generally standard textures. Textures may be combined and managed through the above sprite attribute, so as to improve efficiency and convenience during development.
As an optional embodiment, the target formatted file may implement streaming media playing and control, so as to extend editing functions and the like.
As shown in
As listed in Table 18, the streaming media attribute defined in the attribute extension field may include the following information:
As an optional embodiment, the target formatted file may support the use of other files.
The target formatted file includes a node, and the resource variable attribute is attached to the node to extend the use of other files.
As shown in
where the resource variable attribute includes a variable type and a set of indexes pointing to reference fields, so as to support the use of resources.
As listed in Table 19, the resource variable attribute defined in the attribute extension field may include the following information:
The resource variable attribute is used to support some resources that are not currently used but may be used in the future. Exemplarily, these resources may be textures, cubemaps, materials, audio segments, animation segments, and lightmaps.
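As a hedged illustration, a resource variable attribute pairing a variable type with a set of reference indexes might look as follows; all field names are assumptions.

    // Hypothetical sketch; field names are assumed.
    const resourceVariableAttribute = {
      variables: [
        { type: "texture", indexes: [2, 5] },   // reserved textures not yet referenced elsewhere
        { type: "audioClip", indexes: [1] }     // reserved audio segments for future use
      ]
    };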
As an optional embodiment, the method further includes:
Non-common parameters are defined relative to common parameters, and refer to parameters that, for example, are not global and undergo frequent update iterations.
In the target format of this embodiment, in addition to regular fields, the attribute extension field (Extensions) and the attribute extras field (Extras) are included. The regular fields in the target format are the same as the fields in the glTF format, ensuring that the target format is compatible with the glTF format. The attribute extension field is used to carry common and standard content, whereas the attribute extras field is used to supplement non-standard, non-common information. The attribute extras field is generally attached to a node or object, providing specific functional supplements; for example, it may record attributes of a few components supported by the engine, or attributes of components that are frequently updated (after some components are updated, their attribute names are changed or new fields are added). A code generator for generating serialization and deserialization code is provided to help users of the software development kit (SDK) customize non-common functional supplements. The attribute extension field is used to record information with strong commonality; in other words, the attributes recorded in the attribute extension field are more common and reusable than those recorded in the attribute extras field.
For example, the following attribute information may be recorded in Extras:
Currently supported information includes the export of information for animation, sound, camera, light, material, physics, rendering, and other types of components; variables publicly accessible by custom scripts also support export using code generation tools.
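As a hedged illustration, a node-level Extras supplement for a frequently updated component might look as follows; the component and field names are assumptions.

    // Hypothetical sketch; the component dump and its fields are assumed.
    const nodeWithExtras = {
      name: "prop",
      extras: {
        // Non-common, frequently iterated component attributes are kept in Extras,
        // rather than in the standardized, strongly reusable Extensions schema.
        FanComponent: { speed: 1.5, autoStart: true }
      }
    };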
As an optional embodiment, the target formatted file may implement custom import and export, so as to extend the export function and the like.
As shown in
where the export mode is used to define an export of a provided material parameter and/or an export of a provided component parameter.
For example, parameters of a specified type (for example, a shader type) and information of various material parameters to be exported may be defined.
For another example, parameters to be exported as extras field information under the node may be defined, for example, parameters of a specified component type (for example, an animation component) and information of the various public parameters to be exported.
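As a hedged illustration, a custom export definition covering both cases described above might look as follows; all field names are assumptions.

    // Hypothetical sketch; field names are assumed.
    const exportConfig = {
      materialExports: [
        { shaderType: "Toon", parameters: ["_Color", "_MainTex"] }                       // material parameters to export
      ],
      componentExports: [
        { componentType: "Animation", publicParameters: ["clip", "playAutomatically"] }  // public parameters exported as extras under the node
      ]
    };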
According to the above content, compared with the glTF format, the target format provided in the embodiment defines a large number of newly added attributes to support implementation of a large number of functions or effects, as follows:
To make the advantages of the present application clearer, a comparison between the VRM format and the target format is provided below.
Virtual reality modeling (VRM) is also a 3D file format developed based on glTF. A VRM file allows all supported applications to run the same virtual avatar data (3D model).
As a new format developed based on the glTF format, the target format has the following advantages compared with the VRM format: It is compatible with the glTF format, may be used in various game engines and WebGL, and may be opened and edited by professional design software (such as Maya, Blender, C4D, etc.).
It supports scene export, animation, multimedia, skybox, mesh compression, customization of material parameters and script parameters, etc., and the functionality can be continuously extended.
It is cross-platform and cross-tool and supports version compatibility. One file is compatible with all devices; all that is needed is the runtime. It is not affected by the engine version or the target operating device, and is therefore well suited as an exchange medium to be put on store shelves to create an ecosystem.
Materials may be selected by users, and users' standard specifications may be established; the format also includes code generation tools, which can meet the requirements of rapid transformations.
Components or logic at the client may be customized flexibly based on services, and such data may be exported as files. For example, the application VR Kanojo may be placed in a file and loaded by the program framework instead of being generated as an independent application, which is convenient for long-term service development and ecosystem construction.
For details, see Table 20 below.
In some embodiments, the newly added attribute includes:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the audio file attribute includes:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the target formatted file includes a node.
The defining module 2210 is further configured to:
In some embodiments, the target formatted file includes a node.
The defining module 2210 is further configured to:
In some embodiments, the defining module 2210 is further configured to:
In some embodiments, the defining module 2210 is further configured to:
The memory 10010 includes at least one type of computer-readable storage medium, and the readable storage medium includes a flash memory, a hard disk, a multimedia card, a card-type memory (for example, an SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disc, and the like. In some embodiments, the memory 10010 may be an internal storage module of the computer device 2, for example, a hard disk or memory of the computer device 2. In some other embodiments, the memory 10010 may alternatively be an external storage device of the computer device 2, for example, a plug-in type hard disk equipped on the computer device 2, a smart media card (SMC for short), a secure digital (SD for short) card, or a flash card. Certainly, the memory 10010 may alternatively include both the internal storage module of the computer device 2 and the external storage device of the computer device. In this embodiment, the memory 10010 is generally configured to store an operating system and various types of application software installed on the computer device 2, such as program code for the method for implementing a 3D image. In addition, the memory 10010 may be further configured to temporarily store various types of data that have been output or are to be output.
The processor 10020 may be, in some embodiments, a central processing unit (CPU for short), a controller, a microcontroller, a microprocessor, or other data processing chips. The processor 10020 is generally configured to control overall operation of the computer device 2, for example, execute control, processing, and the like related to data exchange or communication with the computer device 2. In this embodiment, the processor 10020 is configured to run program code stored in the memory 10010 or to process data.
The network interface 10030 may include a wireless network interface or a wired network interface. The network interface 10030 is generally configured to establish a communication link between the computer device 2 and other computer devices. For example, the network interface 10030 is configured to connect the computer device 2 to an external terminal by using a network, and establish a data transmission channel, a communication link, and the like between the computer device 2 and the external terminal. The network may be a wireless or wired network, such as Intranet, Internet, the Global System for Mobile Communications (GSM for short), wideband code division multiple access (WCDMA for short), a 4G network, a 5G network, Bluetooth, or Wi-Fi.
It should be noted that
In this embodiment, the method for implementing a 3D image stored in the memory 10010 may alternatively be divided into one or more program modules and executed by one or more processors (the processor 10020 in this embodiment) to implement the embodiments of the present application.
The present application further provides a computer-readable storage medium storing computer-readable instructions, where when the computer-readable instructions are executed by a processor, the following steps are implemented:
In this embodiment, the computer-readable storage medium includes a flash memory, a hard disk, a multimedia card, a card-type memory (for example, an SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disc, and the like. In some embodiments, the computer-readable storage medium may be an internal storage unit of the computer device, for example, a hard disk or memory of the computer device. In some other embodiments, the computer-readable storage medium may alternatively be an external storage device of the computer device, for example, a plug-in type hard disk equipped on the computer device, a smart media card (SMC for short), a secure digital (SD for short) card, or a flash card. Certainly, the computer-readable storage medium may alternatively include both the internal storage unit of the computer device and the external storage device of the computer device. In this embodiment, the computer-readable storage medium is generally configured to store an operating system and various types of application software installed on the computer device, such as program code for the method for implementing a 3D image in the embodiments. In addition, the computer-readable storage medium may be configured to temporarily store various types of data that have been output or are to be output.
It will be apparent to those skilled in the art that the various modules or steps in the embodiments of the present application can be implemented by a general-purpose computing apparatus that can be centralized on a single computing apparatus or distributed across a network formed by a plurality of computing apparatuses. Optionally, they may be implemented by program code executable by the computing apparatus, such that they may be stored in a storage apparatus and executed by the computing apparatus, and in some cases, the steps shown or described may be performed in a sequence different from the sequence described herein, or they may be respectively fabricated into individual integrated circuit modules, or a plurality of modules or steps thereof may be implemented as a single integrated circuit module. In this way, the embodiments of the present application are not limited to any specific combination of hardware and software.
It should be noted that the foregoing descriptions are merely exemplary embodiments of the present application, and are not intended to limit the patent scope of the present application. Any transformation of equivalent structures or equivalent processes that is made using the contents of the description and accompanying drawings of the present application, or any direct or indirect application thereof in other related technical fields shall equally fall within the patent protection scope of the present application.
Foreign application priority data: Application No. 202210814010.3, filed July 2022, CN, national.
This application is a continuation under 35 U.S.C. 111(a) of PCT International Application No. PCT/CN2022/116577, filed on Sep. 1, 2022, which claims priority to Chinese Patent Application No. 202210814010.3, filed on Jul. 11, 2022, the entire contents of which are hereby incorporated by reference in their entirety for all purposes.
Related application data: Parent: PCT/CN2022/116577, filed September 2022, US; Child: Application No. 18131674, US.