This application claims priority to Chinese Application No. 202310140524.X filed on Feb. 13, 2023, the disclosure of which is incorporated herein by reference in its entirety.
The present application belongs to the technical field of network live streaming, and in particular to a display method, a data processing method, corresponding apparatuses, an electronic device and a computer medium.
With the rapid development of Internet and streaming media technologies, various live streaming application programs emerge endlessly, and viewing network live streaming has become a daily entertainment activity for many users. In related arts, the picture of a network live streaming room can only be displayed in a two-dimensional mode, such that the display mode of the live streaming room is relatively monotonous, and thus the user experience is poor.
Embodiments of the present application provide an implementation solution different from the related arts, so as to solve the technical problem in the related arts that the picture of a network live streaming room can only be displayed in a two-dimensional mode, such that the display mode of the live streaming room is relatively monotonous, and thus the user experience is poor.
In a first aspect, the present application provides a display method, including:
In a second aspect, the present application provides a data processing method, including:
In a third aspect, the present application provides a display apparatus, including:
In a fourth aspect, the present application provides a data processing apparatus, including:
In a fifth aspect, the present application provides an electronic device, including:
In a sixth aspect, an embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements any method in the first aspect, the second aspect, various possible implementations of the first aspect, or various possible implementations of the second aspect.
By means of the solution provided in the present application, in which the first live streaming scene is displayed in the set playing field space, the first display screen is presented at the first location in the playing field space in response to the preset operation, and the second live streaming scene is displayed in the first display screen, the first live streaming scene can be displayed in a three-dimensional playing field space, thereby improving the flexibility of the display mode of a live streaming room and improving the user experience.
To illustrate the technical solutions in the embodiments of the present application or in the related arts more clearly, a brief introduction to the drawings which are needed in the description of the embodiments or the related arts is given below. Apparently, the drawings in the description below are merely some of the embodiments of the present application, based on which other drawings may be obtained by those of ordinary skill in the art without any creative effort.
Embodiments of the present application are described in detail below, and examples of the embodiments are shown in the drawings. The embodiments described below with reference to the drawings are exemplary, and are intended to explain the present application, but cannot be construed as limiting the present application.
The terms “first” and “second” and the like in the specification, the claims and the drawings of the embodiments of the present application are used for distinguishing similar objects, and are not necessarily used for describing a specific sequence or precedence order. It should be understood that the data used in this way may be interchanged under appropriate circumstances, so that the embodiments of the present application described herein may be implemented in a sequence other than those illustrated or described herein. In addition, the terms “including” and “having”, and any variations thereof are intended to cover non-exclusive inclusions, for example, processes, methods, systems, products or devices including a series of steps or units are not necessarily limited to those clearly listed steps or units, but may include other steps or units that are not clearly listed or are inherent to these processes, methods, products or devices.
First, some terms in the embodiments of the present application are explained below, so as to facilitate the understanding of those skilled in the art.
An AR technology is a technology for skillfully fusing virtual information with the real world. It widely uses various technical means such as multimedia, three-dimensional modeling, real-time tracking and registration, intelligent interaction and sensing to simulate virtual information generated by a computer, such as text, images, three-dimensional models, music and videos, and then applies the virtual information to the real world. The two kinds of information complement each other, thereby implementing “enhancement” of the real world.
A VR technology is also referred to as a virtual reality or spiritual realm technology. Its basic implementation is to rely mainly on computer technology and utilize the latest achievements of a variety of high technologies to generate, by means of a computer and other devices, a vivid virtual world providing various sensory experiences such as three-dimensional vision, tactile sense and smell, so that a person in the virtual world has an immersive feeling.
MR is a further development of the virtual reality technology. In this technology, virtual scene information is presented in a real scene, and an interaction and feedback information loop is built among the real world, the virtual world and the user, so as to enhance the sense of reality of the user experience.
A global positioning system (GPS) is a high-precision radio navigation positioning system based on artificial earth satellites, which may provide accurate geographical location, vehicle speed and precise time information at any place in the world and in near-earth space.
2D: short for 2-dimension, that is, two-dimensional; content on a plane is two-dimensional. The two dimensions refer to the left-right direction and the front-rear direction, with no upper-lower direction.
3D: short for 3-dimension, that is, three-dimensional, which refers to a space system formed by adding a direction vector to a planar two-dimensional system. The three dimensions correspond to the three coordinate axes, that is, an x-axis, a y-axis and a z-axis, where x represents the left-right space, y represents the front-rear space, and z represents the upper-lower space.
A 3D camera has dozens of functions such as facial recognition, gesture recognition, human skeleton recognition, three-dimensional measurement, environmental perception, three-dimensional map reconstruction, and so on, and may be widely applied to the fields of televisions, mobile phones, robots, unmanned aerial vehicles, logistics, VR/AR, smart home, security and protection, automobile driving assistance, etc.
With the rapid development of Internet and streaming media technologies, various live streaming application programs emerge endlessly, and viewing network live streaming has become a daily entertainment activity for many users. In related arts, the picture of a network live streaming room can only be displayed in a two-dimensional mode, such that the display mode of a live streaming room is relatively monotonous, and thus the user experience is poor. In order to solve this technical problem, the present application provides a display method and apparatus, a data processing method and apparatus, an electronic device and a computer medium, so as to solve the technical problem in the related arts that the picture of a network live streaming room can only be displayed in the two-dimensional mode, such that the display mode of the live streaming room is relatively monotonous, and thus the user experience is poor.
The technical solutions of the present application and how the technical solutions of the present application solve the above technical problem will be described in detail below with specific embodiments. The following several specific embodiments may be combined with each other, and the same or similar concepts or processes may not be repeated in some embodiments. The embodiments of the present application will be described below in combination with the drawings.
The first device 10 may be a terminal device such as a smart phone, a tablet computer, a computer, and the like; and the second device 20 may be a server, the server may be an independent physical server, or may be a server cluster or a distributed system composed of a plurality of physical servers, or may be a cloud server that provides basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a content delivery network (CDN), big data and artificial intelligence platforms, etc.
The first device 10 may be connected with an input device such as a camera and a microphone, and is used for collecting live streaming data and sending the live streaming data to the second device 20, wherein the camera may be a 3D camera.
In some embodiments, the second device 20 is used for executing the following data processing method: acquiring picture information, wherein the picture information comprises: a first live streaming scene corresponding to the first device, a first location, first information used for instructing the head-mounted display device to display a first display screen, and a second live streaming scene to be displayed in the first display screen; and sending the picture information to the head-mounted display device to cause the head-mounted display device to display the first live streaming scene in a set playing field space, in response to a preset operation, presenting the first display screen at the first location in the playing field space and displaying the second live streaming scene in the first display screen.
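For ease of understanding only, a minimal Python sketch of the data processing flow described above is given below. The class name PictureInfo, the function send_to_head_mounted_display and the specific field types are illustrative assumptions and are not prescribed by the present application.

```python
from dataclasses import dataclass


@dataclass
class PictureInfo:
    # Hypothetical container for the picture information described above.
    first_live_streaming_scene: bytes   # e.g. a segment of the 180-degree 3D video stream
    first_location: tuple                # three-dimensional coordinate in the playing field space
    first_information: dict              # instructs the head-mounted display device to show the first display screen
    second_live_streaming_scene: bytes   # e.g. a segment of the 2D video stream for the first display screen


def send_to_head_mounted_display(info: PictureInfo) -> None:
    # Placeholder transport; a real system would serialize the data and push it
    # to the head-mounted display device over the network.
    print(f"sending picture info, first location: {info.first_location}")


picture_info = PictureInfo(
    first_live_streaming_scene=b"<180-degree 3D stream>",
    first_location=(0.0, -0.8, 1.2),
    first_information={"display_first_screen": True},
    second_live_streaming_scene=b"<2D stream>",
)
send_to_head_mounted_display(picture_info)
```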
The current first live streaming scene is obtained on the basis of live streaming data sent by the first device.
In some embodiments, the head-mounted display device 30 may be used for executing the following display method: displaying a first live streaming scene in a set playing field space; and in response to a preset operation, presenting a first display screen at a first location in the playing field space and displaying a second live streaming scene in the first display screen.
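Correspondingly, a minimal sketch of the display method on the head-mounted display device side is shown below. The function names and the dictionary-based representation of the playing field space are illustrative assumptions, not the API of any particular head-mounted display platform.

```python
# Sketch of the display method executed by the head-mounted display device.

def display_first_live_streaming_scene(playing_field_space: dict, scene: str) -> None:
    # Display the first live streaming scene in the set playing field space.
    playing_field_space["first_scene"] = scene


def on_preset_operation(playing_field_space: dict, first_location: tuple, second_scene: str) -> None:
    # In response to the preset operation, present the first display screen at
    # the first location and display the second live streaming scene in it.
    playing_field_space["first_display_screen"] = {
        "location": first_location,
        "content": second_scene,
    }


space: dict = {}
display_first_live_streaming_scene(space, "180-degree 3D live streaming room picture")
on_preset_operation(space, (0.0, -0.8, 1.2), "2D live streaming room picture")
print(space)
```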
Optionally, the head-mounted display device 30 may be provided with a display, and the playing field space presented on the display may be a three-dimensional picture.
In some embodiments, the playing field space may be a virtual reality scene space, a mixed reality scene space, or an augmented reality scene space, etc.
In some embodiments, the first live streaming scene is a live streaming room picture of the first device.
The detailed implementation of the display method and the specific functions of the head-mounted display device 30 or the second device 20 mentioned above are respectively described in detail below. It should be noted that the following description sequence of the embodiments is not used as a limitation on the priority sequence of the embodiments.
S201, displaying a first live streaming scene in a set playing field space.
In some embodiments, the set playing field space is a 3D space.
In some embodiments, the set playing field space is a 180° spherical space or a 360° spherical space.
In some embodiments, the head-mounted display device is provided with a display, and the set playing field space may be presented on the display.
In some embodiments, the first live streaming scene is a 180° 3D video stream picture, and the second live streaming scene is a 2D video stream picture.
In some embodiments, the 3D video stream picture may be a picture of a virtual reality scene, a picture of a mixed reality scene, or a picture of an augmented reality scene, etc.
In some embodiments, the first live streaming scene may be a video stream picture of a live streaming room which a user wearing the head-mounted display device selects for viewing.
In some embodiments, in the set playing field space, the current first live streaming scene may be displayed in a curved-surface picture, or may be displayed in a planar picture.
In some embodiments, when the first live streaming scene is displayed in the curved-surface picture, referring to
S202, in response to a preset operation, presenting a first display screen at a first location in the playing field space and displaying a second live streaming scene in the first display screen.
In some embodiments, the distance between the first location and a virtual object corresponding to the user is less than the distance between the first live streaming scene and the virtual object.
In some embodiments, the second live streaming scene may be a live streaming room scene, which currently interacts with the first live streaming scene (such as microphone connection and competition).
In some embodiments, the first display screen may be a curved-surface screen or a planar screen.
In some embodiments, the second live streaming scene is a 2D video stream picture; and
In some embodiments, the virtual object corresponding to the user is included in the set playing field space. Referring to
In some embodiments, the first location may be preset, and the first location may be a three-dimensional coordinate. In some embodiments, the distance between the first location and the virtual object corresponding to the user is the distance between the first location and the fifth location.
In some embodiments, in the present application, when the first display screen, a second display screen, a third display screen or a fourth display screen is displayed at a given location, it means that the center of the corresponding screen, or the center of the bottom of the screen, coincides with that location. The location may refer to any one of the first location, a second location, a third location and a fourth location.
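As an illustration only, the following Python sketch shows one way to compute the distance between two three-dimensional coordinates (e.g., the first location and the fifth location of the virtual object) and to place a screen so that its center or bottom-edge center coincides with a location. The coordinate values, screen dimensions and function names are assumptions; the axis convention follows the definition above (x left-right, y front-rear, z upper-lower).

```python
import math


def distance(a: tuple, b: tuple) -> float:
    # Euclidean distance between two three-dimensional coordinates,
    # e.g. between the first location and the fifth location.
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))


def place_screen(location: tuple, width: float, height: float, anchor: str = "center") -> dict:
    # Place a display screen so that either its center or the center of its
    # bottom edge coincides with the given location.
    x, y, z = location
    if anchor == "bottom_center":
        z += height / 2.0  # raise the screen so its bottom-edge center sits on the location
    return {"center": (x, y, z), "width": width, "height": height}


first_location = (0.0, -0.8, 1.2)   # illustrative three-dimensional coordinate
fifth_location = (0.0, 0.0, 1.6)    # illustrative location of the virtual object
print(distance(first_location, fifth_location))
print(place_screen(first_location, width=1.6, height=0.9, anchor="bottom_center"))
```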
According to the solution provided in the present application, by displaying the first live streaming scene in the set playing field space, and, in response to the preset operation, presenting the first display screen at the first location in the playing field space and displaying the second live streaming scene in the first display screen, the first live streaming scene can be displayed in a three-dimensional playing field space, thereby improving the flexibility of the display mode of a live streaming room and improving the user experience.
In some embodiments, the method further comprises:
In some embodiments, the second display screen may be a curved-surface screen or a planar screen, and when the second display screen is a curved-surface screen, referring to
In some embodiments, the second location may be preset, and the second location may be a three-dimensional coordinate.
In some embodiments, the distance between the second location and the virtual object corresponding to the user is the distance between the second location and the fifth location.
Referring to
In some embodiments, for display content of the playing field space when the user enters the first live streaming scene, the method further comprises: in response to detecting that the user enters the first live streaming scene, triggering the execution of: displaying the first live streaming scene in the set playing field space; in response to the preset operation, presenting the first display screen at the first location in the playing field space and displaying the second live streaming scene in the first display screen; displaying the second display screen at the second location in the playing field space; and displaying a preset opening animation of the competition at a sixth location of the playing field space. Referring to
In some embodiments, the fifth location, the sixth location and the center of the first live streaming scene are on the same plane.
In some embodiments, in a case that the first display screen is displayed at the first location in the playing field space, and the second display screen is displayed at the second location in the playing field space, the user cannot trigger the first display screen.
In some embodiments, in a case that the first display screen is displayed at the first location in the playing field space, and the second display screen is displayed at the second location in the playing field space, the transparency of the first display screen may be 50%.
In some embodiments, for the adjustment on the location where the first display screen and/or the second display screen is located, the method further comprises:
In some embodiments, in response to detecting that the user has not interacted with the second display screen for longer than a preset duration, the second display screen is turned off; and
In response to detecting that the second display screen is turned off, the first display screen is controlled to move to the second location, which is closer to the virtual object corresponding to the user, so that the user can view the first display screen more clearly, thereby improving the user experience.
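A minimal sketch of this behavior is given below, assuming a dictionary-based screen state, an illustrative preset duration and illustrative coordinates; none of these names are prescribed by the present application.

```python
PRESET_DURATION = 30.0  # seconds; illustrative value


def update_screens(screens: dict, last_interaction_time: float, now: float,
                   second_location: tuple) -> None:
    # Turn off the second display screen if the user has not interacted with it
    # for longer than the preset duration, then move the first display screen
    # to the second location, which is closer to the virtual object.
    if screens.get("second_on") and now - last_interaction_time > PRESET_DURATION:
        screens["second_on"] = False
    if not screens.get("second_on"):
        screens["first_location"] = second_location


screens = {"second_on": True, "first_location": (0.0, -0.8, 1.2)}
update_screens(screens, last_interaction_time=0.0, now=45.0, second_location=(0.0, -0.4, 1.1))
print(screens)
```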
In some embodiments, for the adjustment on the location where the first display screen and/or the second display screen is located, the method further comprises:
In some embodiments, the user may perform the hovering or dragging operation on a display screen of the competition through a handheld apparatus such as a handle or a motion capture glove. In response to detecting the hovering or dragging operation performed by the user on the first display screen, a virtual hand corresponding to the virtual object may be displayed in the playing field space, and the virtual hand may further hold a virtual handheld apparatus (e.g., a virtual pen), so that the user can flexibly adjust the locations of the first display screen and the second display screen in the playing field space by using the handheld apparatus, thereby improving the user experience.
In some embodiments, the method further comprises:
Referring to
In some embodiments, the method further comprises:
In some embodiments, in response to detecting that the first live streaming scene is engaged in a player kill (PK) competition with another live streaming scene (the second live streaming scene), it is determined that the first live streaming scene is in the competition state. The live streamer of the first live streaming scene may initiate the PK with the other live streaming scene (the second live streaming scene) in an invitation or random matching manner.
In some embodiments, the information of a situation of the competition may further include: identification information of a winner, identification information of a loser, health point bars corresponding to the live streaming scenes of both parties in the competition, and head portrait information of some of the voters participating in the current competition, etc.
In some embodiments, the information of a situation of the competition may further include the numbers of votes (125 and 180 as shown in
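For illustration only, the items of the information of a situation of the competition listed above could be organized in a structure such as the following sketch; the class and field names are assumptions, and only the vote counts 125 and 180 come from the description above.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class CompetitionSituation:
    # Illustrative container for the information of a situation of the competition.
    winner_id: str = ""
    loser_id: str = ""
    health_points: Dict[str, int] = field(default_factory=dict)   # health point bars of both parties
    voter_portraits: List[str] = field(default_factory=list)      # head portraits of some voters
    votes: Dict[str, int] = field(default_factory=dict)           # numbers of votes for each party


situation = CompetitionSituation(votes={"first_scene": 125, "second_scene": 180})
print(situation)
```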
In some embodiments, the method further comprises:
In some embodiments, the user may perform the hovering or dragging operation on the fourth display screen of the competition through a handheld apparatus such as a handle or a motion capture glove, so as to flexibly adjust the locations of the second display screen and the fourth display screen in the scene picture, thereby improving the user experience.
In some embodiments, the method further comprises:
In some embodiments, the step of displaying, in response to the click operation on the fourth display screen, the ranking information of the current competition on the fourth display screen comprises:
In some embodiments, referring to
In some embodiments, the method further comprises:
Specifically, the user may trigger the turn-off instruction for the ranking information of the current competition by triggering a shortcut key on the handheld apparatus and/or by clicking a preset area in the scene picture.
In some embodiments, the method further comprises:
In some embodiments, the first pose information includes first location information and first posture information, wherein the first location information is coordinate information of the head-mounted display device in a world coordinate system (i.e., a real world), and the first posture information is posture information of the head-mounted display device in the world coordinate system (i.e., the real world).
Optionally, the coordinate information of the head-mounted display device in the world coordinate system (i.e., the real world) may be determined according to a positioning function of the head-mounted display device. For the specific determination of the coordinate information of the head-mounted display device in the world coordinate system (i.e., the real world), reference may be made to the related arts, for example, the GPS technology, and details will not be repeated here.
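A minimal sketch of how the first pose information could be represented is shown below; the class name PoseInfo, the (x, y, z) location tuple and the (yaw, pitch, roll) posture tuple are illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass
class PoseInfo:
    # First pose information of the head-mounted display device in the world
    # coordinate system (i.e., the real world); field names are illustrative.
    location: tuple  # first location information, e.g. (x, y, z) coordinates
    posture: tuple   # first posture information, e.g. (yaw, pitch, roll) in degrees


pose = PoseInfo(location=(1.0, 2.0, 1.6), posture=(15.0, -5.0, 0.0))
print(pose)
```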
In some embodiments, the determining the target scene picture in the current first live streaming scene based on the first pose information comprises:
In some embodiments, the location relationship between the first location and the virtual object corresponding to the user is fixed.
In some embodiments, the location relationship between the first location and an initial center of the first live streaming scene is fixed.
In some embodiments, the initial center of the first live streaming scene is a center point of the first live streaming scene at a moment when the user enters the first live streaming scene.
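By way of illustration only, one possible way to determine the target scene picture from the first pose information is to use the yaw and pitch of the head-mounted display device to select a viewing window within the 180° panoramic picture, as in the sketch below. The mapping, the field-of-view values and the clamping to the picture boundaries are assumptions; the present application does not prescribe a specific formula.

```python
def target_window(yaw_deg: float, pitch_deg: float,
                  fov_h: float = 90.0, fov_v: float = 60.0) -> dict:
    # Select the viewing window of the 180-degree panoramic picture from the
    # yaw and pitch of the head-mounted display device, clamped so the window
    # stays inside the picture; field-of-view values are illustrative.
    half_h, half_v = fov_h / 2.0, fov_v / 2.0
    yaw = max(-90.0 + half_h, min(90.0 - half_h, yaw_deg))
    pitch = max(-90.0 + half_v, min(90.0 - half_v, pitch_deg))
    return {
        "horizontal_range_deg": (yaw - half_h, yaw + half_h),
        "vertical_range_deg": (pitch - half_v, pitch + half_v),
    }


print(target_window(yaw_deg=20.0, pitch_deg=-10.0))
```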
In some embodiments, the step of displaying the fourth display screen above the first display screen comprises:
In some embodiments, in a case that the first preset display mode is the splitting effect, referring to
In some embodiments, displaying the first display screen at the first location in the playing field space comprises: displaying the first display screen at the first location in the playing field space according to a second preset display mode, wherein the second preset display mode includes any one of: a splitting effect, a wipe effect, a fly-in effect and a fade-out effect.
In some embodiments, in a case that the second preset display mode is the wipe effect, referring to
In some embodiments, in response to a turn-off operation of the user for the first display screen and/or the fourth display screen, the first display screen and/or the fourth display screen is turned off in the playing field space according to a third preset display mode. The third preset display mode may be a fade-out effect, that is, the transparency of the first display screen and/or the fourth display screen is controlled to gradually change from 100% to 0%, and when the transparency of the first display screen and/or the fourth display screen is 0%, the first display screen and/or the fourth display screen is turned off.
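The following is a minimal sketch of the fade-out effect as described above, with the transparency changing from 100% to 0% before the screen is turned off; the dictionary-based screen state and the step count are illustrative assumptions.

```python
def fade_out(screen: dict, steps: int = 10) -> None:
    # Gradually change the transparency of the screen from 100% to 0% and turn
    # the screen off once the transparency reaches 0%, as described above.
    for i in range(steps, -1, -1):
        screen["transparency"] = 100.0 * i / steps
    if screen["transparency"] == 0.0:
        screen["on"] = False


first_display_screen = {"on": True, "transparency": 100.0}
fade_out(first_display_screen)
print(first_display_screen)
```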
In some optional embodiments, the set playing field space is a 180° spherical space or a 360° spherical space.
In some optional embodiments, the first live streaming scene is a 180° 3D video stream picture, and the second live streaming scene is a 2D video stream picture.
In some optional embodiments, the apparatus 120 is further configured to: display a second display screen at a second location in the playing field space, wherein the second display screen is used for displaying at least one of the following: a live streaming list of a live streaming platform in a live streaming state, live streamer information corresponding to the first live streaming scene, online audience information corresponding to the first live streaming scene, a definition setting option corresponding to the first live streaming scene, a dimension setting option corresponding to the first live streaming scene, and a setting option of whether to display bullet-screen messages of the competition.
In some optional embodiments, the apparatus 120 is further configured to: in response to a hovering or dragging operation of the user for the first display screen, control the first display screen to move to the second location, and control the second display screen to move to the first location.
In some optional embodiments, the apparatus 120 is further configured to: in response to detecting that the second display screen is turned off, control the first display screen to move to the second location.
In some optional embodiments, the apparatus 120 is further configured to: display a third display screen at a third location in the playing field space, wherein the third display screen is used for displaying bullet-screen messages sent by the user in the first live streaming scene; and
In some optional embodiments, the apparatus 120 is further configured to: in response to detecting that the first live streaming scene is in a competition state, display a fourth display screen above the first display screen, wherein the fourth display screen is used for displaying information of a situation of the competition.
In some optional embodiments, the apparatus 120 is further configured to: in response to a hovering or dragging operation of the user for the fourth display screen, control the fourth display screen to move to the second location, and control the second display screen to move to the first location or a fourth location, wherein the fourth location is located above or below the first location.
In some optional embodiments, the apparatus 120 is further configured to: in a case that the first display screen is located at the second location and the fourth display screen is located above the first display screen, in response to a click operation on the fourth display screen, display ranking information of the current competition on the fourth display screen, and control the first display screen to move to the first location.
In some optional embodiments, when being used for displaying the first live streaming scene in the set playing field space, the apparatus 120 is specifically configured to:
In some optional embodiments, the location relationship between the first location and a virtual object corresponding to the user is fixed.
In some optional embodiments, the location relationship between the first location and an initial center of the first live streaming scene is fixed.
In some optional embodiments, when being configured to display the fourth display screen above the first display screen, the first display unit 121 is specifically configured to:
In some optional embodiments, when being configured to present the first display screen at the first location in the playing field space, the second display unit 122 is specifically configured to:
It should be understood that the apparatus embodiments and the method embodiments may correspond to each other, and similar description may refer to the method embodiments. In order to avoid repetition, details are not described herein again. Specifically, the apparatus may execute the above method embodiments, and the foregoing and other operations and/or functions of the modules in the apparatus are respectively corresponding processes in the methods in the above method embodiments, and are not described herein again for brevity.
The apparatus in the embodiments of the present application is described above in the perspective of a functional module in conjunction with the drawings. It should be understood that the functional module may be implemented in a hardware form, or may be implemented by instructions in a software form, or may be implemented by a combination of hardware and software modules. Specifically, the steps of the method embodiments in the embodiments of the present application may be completed by an integrated logic circuit of hardware in a processor and/or instructions in the software form, and the steps of the method disclosed in combination with the embodiments of the present application may be directly executed by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor. Optionally, the software module may be located in a mature storage medium in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory and a register. The storage medium is located in a memory, and the processor reads information in the memory and completes the steps in the above method embodiments in combination with the hardware of the memory.
For example, the processor 702 may be used for executing the above method embodiments according to instructions in the computer program.
In some embodiments of the present application, the processor 702 may include, but is not limited to:
In some embodiments of the present application, the memory 701 includes, but is not limited to:
In some embodiments of the present application, the computer program may be divided into one or more modules, and the one or more modules are stored in the memory 701 and are executed by the processor 702, so as to complete the method provided in the present application. The one or more modules may be a series of computer program instruction segments capable of completing a specific function, and the instruction segments are used for describing an execution process of the computer program in the electronic device.
As shown in
The processor 702 may control the transceiver 703 to communicate with other devices, and specifically, may send information or data to the other devices, or receive information or data sent by the other devices. The transceiver 703 may include a transmitter and a receiver. The transceiver 703 may further include an antenna, and there may be one or more antennas.
It should be understood that the components in the electronic device are connected by means of a bus system, wherein the bus system further includes a power bus, a control bus and a state signal bus in addition to a data bus.
The present application further provides a non-transitory computer storage medium, on which a computer program is stored, wherein the computer program, when executed by a computer, enables the computer to execute the method in the above method embodiments. Or, an embodiment of the present application further provides a computer program product including an instruction, wherein the instruction, when executed by a computer, causes the computer to execute the method in the above method embodiments.
When implemented using software, the embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable apparatuses. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center in a wired manner (e.g., a coaxial cable, an optical fiber, a digital subscriber line (DSL)) or a wireless manner (e.g., infrared, wireless, microwave, and the like). The computer-readable storage medium may be any available medium accessible by the computer, or a data storage device such as a server or a data center integrated with one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a digital video disc (DVD)), or a semiconductor medium (e.g., a solid state disk (SSD)), etc.
According to one or more embodiments of the present application, provided is a display method, comprising:
According to one or more embodiments of the present application, the set playing field space is a 180° spherical space or a 360° spherical space.
According to one or more embodiments of the present application, the first live streaming scene is a 180° 3D video stream picture, and the second live streaming scene is a 2D video stream picture.
According to one or more embodiments of the present application, the method further comprises:
According to one or more embodiments of the present application, the method further comprises:
According to one or more embodiments of the present application, the method further comprises:
According to one or more embodiments of the present application, the method further comprises:
According to one or more embodiments of the present application, the method further comprises:
According to one or more embodiments of the present application, the method further comprises:
According to one or more embodiments of the present application, the method further comprises:
According to one or more embodiments of the present application, the step of displaying the first live streaming scene in the set playing field space comprises:
According to one or more embodiments of the present application, the location relationship between the first location and a virtual object corresponding to the user is fixed.
According to one or more embodiments of the present application, the location relationship between the first location and an initial center of the first live streaming scene is fixed.
According to one or more embodiments of the present application, the step of displaying the fourth display screen above the first display screen comprises:
According to one or more embodiments of the present application, the step of presenting the first display screen at the first location in the playing field space comprises:
According to one or more embodiments of the present application, provided is a data processing method, including:
According to one or more embodiments of the present application, provided is a display apparatus, including:
According to one or more embodiments of the present application, the set playing field space is a 180° spherical space or a 360° spherical space.
According to one or more embodiments of the present application, the first live streaming scene is a 180° 3D video stream picture, and the second live streaming scene is a 2D video stream picture.
According to one or more embodiments of the present application, the apparatus is further configured to:
According to one or more embodiments of the present application, the apparatus is further configured to:
According to one or more embodiments of the present application, the apparatus is further configured to:
According to one or more embodiments of the present application, the apparatus is further configured to:
According to one or more embodiments of the present application, the apparatus is further configured to:
According to one or more embodiments of the present application, the apparatus is further configured to:
According to one or more embodiments of the present application, the apparatus is further configured to:
According to one or more embodiments of the present application, when being used for displaying the first live streaming scene in the set playing field space, the apparatus is specifically configured to:
According to one or more embodiments of the present application, the location relationship between the first location and a virtual object corresponding to the user is fixed.
According to one or more embodiments of the present application, the location relationship between the first location and an initial center of the first live streaming scene is fixed.
According to one or more embodiments of the present application, when being used for displaying the fourth display screen above the first display screen, the apparatus is specifically configured to:
According to one or more embodiments of the present application, when being configured to present the first display screen at the first location in the playing field space, the apparatus is specifically configured to:
According to one or more embodiments of the present application, provided is a data processing apparatus, including:
According to one or more embodiments of the present application, provided is an electronic device, including:
According to one or more embodiments of the present application, provided is a computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the above methods.
According to one or more embodiments of the present application, provided is a computer program, which, when executed by a processor, implements the above methods.
Those of ordinary skill in the art may be aware that, modules and algorithm steps of the examples described in combination with the embodiments disclosed herein may be implemented by electronic hardware, or a combination of computer software and electronic hardware. Whether these functions are executed by hardware or software depends on the specific applications and design constraint conditions of the technical solutions. Those skilled in the art may use different methods to implement the described functions for each specific application, but it should not be considered that the implementation goes beyond the scope of the present application.
In several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely exemplary, for example, the division of the modules is only a logic function division, there may be other division manners in practical implementation, for example, a plurality of modules or components may be combined or integrated to another system, or some features may be omitted or not executed. From another point of view, the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection of apparatuses or modules through some interfaces, and may be in electrical, mechanical or other forms.
The modules described as separate components may be separated physically or not, components displayed as modules may be physical modules or not, namely, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected to implement the purposes of the solutions in the present embodiment according to actual demands. For example, the functional modules in various embodiments of the present application may be integrated in one processing module, or the modules individually exist physically, or two or more modules are integrated in one module.
The foregoing descriptions are merely specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any person skilled in the art could readily conceive of variations or substitutions within the technical scope disclosed in the present application, and these variations or substitutions shall fall within the protection scope of the present application. Accordingly, the protection scope of the present application shall be subject to the protection scope of the claims.
Number | Date | Country | Kind |
---|---|---|---|
202310140524.X | Feb 2023 | CN | national |