During 2D (two-dimensional) planar display and 3D (three-dimensional) stereo modeling, for target objects and positioning points in different operation areas (2D display areas, or 3D display areas obtained through 3D stereo modeling), it is necessary to feed back the spatial positioning of the target objects and the positioning points in a plurality of operation areas to the user for viewing. However, in the related art, the display mode of the spatial positioning is not intuitive, such that the user cannot obtain display feedback on the spatial positioning in time.
Embodiments of the present disclosure relate to the technical field of spatial positioning, and in particular to a method and apparatus for interactive display of image positioning, an electronic device, and a storage medium.
Embodiments of the present disclosure provide a method and apparatus for interactive display of image positioning, an electronic device, and a storage medium.
The technical solutions of the embodiments of the present disclosure are implemented as follows.
An embodiment of the present disclosure provides a method for interactive display of image positioning, the method including: obtaining a positioning point in response to a selection operation of a target object; and obtaining an interactive object displayed at a position corresponding to the positioning point in each of a plurality of operation areas according to correspondence relationships between the plurality of operation areas with respect to the positioning point.
An embodiment of the present disclosure provides a device for interactive display of image positioning, including a memory storing processor-executable instructions, and a processor. The processor is configured to execute the stored processor-executable instructions to perform operations of: obtaining a positioning point in response to a selection operation of a target object; and obtaining an interactive object displayed at a position corresponding to the positioning point in each of a plurality of operation areas according to correspondence relationships between the plurality of operation areas with respect to the positioning point.
An embodiment of the present disclosure provides a non-transitory computer storage medium having stored thereon computer-readable instructions that, when executed by a processor, cause the processor to perform operations of a method for interactive display of image positioning, the method including: obtaining a positioning point in response to a selection operation of a target object; and obtaining an interactive object displayed at a position corresponding to the positioning point in each of a plurality of operation areas according to correspondence relationships between the plurality of operation areas with respect to the positioning point.
It is to be understood that the above general descriptions and detailed descriptions below are only exemplary and explanatory and not intended to limit the embodiments of the disclosure.
Other characteristics and aspects of the embodiments of the disclosure will become apparent from the following detailed description of exemplary embodiments with reference to the accompanying drawings.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and, together with the description, serve to explain the technical solutions in the embodiments of the disclosure.
Various exemplary embodiments, features, and aspects of the disclosure will be described in detail below with reference to the accompanying drawings. The same reference signs in the drawings represent components with the same or similar functions. Although each aspect of the embodiments is shown in the drawings, the drawings are not required to be drawn to scale, unless otherwise specified.
Herein, the special term “exemplary” means “serving as an example, embodiment, or illustration”. Any embodiment described herein as “exemplary” is not to be construed as superior to or better than other embodiments.
In the disclosure, the term “and/or” merely describes an association relationship between associated objects and represents that three relationships may exist. For example, A and/or B may represent three conditions: independent existence of A, existence of both A and B, and independent existence of B. In addition, the term “at least one” in the disclosure represents any one of multiple items or any combination of at least two of multiple items. For example, including at least one of A, B, and C may represent including any one or more elements selected from a set formed by A, B, and C.
In addition, for a better description of the embodiments of the disclosure, many specific details are presented in the following specific implementations. It is to be understood by those skilled in the art that the disclosure may still be implemented without certain specific details. In some examples, methods, means, components, and circuits well known to those skilled in the art are not described in detail, so as to highlight the subject matter of the disclosure.
In operation S101, in response to a selection operation of a target object, a positioning point is obtained.
In an embodiment, the target object may be various human body parts (e.g., sensory organs such as eyes, ears and the like, or visceral organs such as heart, liver, stomach and the like), human body tissues (e.g., epithelial tissue, muscle tissue, nerve tissue and the like), human body cells, blood vessels and the like in a medical scene.
In an embodiment, before obtaining the positioning point in response to a selection operation of a target object, the method further includes: obtaining a feature vector of the target object, and recognizing the target object according to the feature vector and a recognition network.
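As an illustration of this recognition step, the following is a minimal sketch in Python using PyTorch. The network architecture, the class list, and all names here are hypothetical assumptions for illustration, not the recognition network actually used in the disclosure.

```python
import torch
import torch.nn as nn

# Hypothetical class labels for recognizable target objects (illustrative only).
TARGET_CLASSES = ["vessel", "heart", "liver", "stomach", "other_tissue"]

class RecognitionNetwork(nn.Module):
    """A toy recognition network mapping a feature vector to a target-object class."""

    def __init__(self, feature_dim: int = 128, num_classes: int = len(TARGET_CLASSES)):
        super().__init__()
        self.classifier = nn.Sequential(
            nn.Linear(feature_dim, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, feature_vector: torch.Tensor) -> torch.Tensor:
        return self.classifier(feature_vector)

def recognize_target_object(feature_vector: torch.Tensor, net: RecognitionNetwork) -> str:
    """Return the class label of the target object encoded by the feature vector."""
    with torch.no_grad():
        logits = net(feature_vector)
    return TARGET_CLASSES[int(logits.argmax())]

# Usage: an upstream extractor would supply the feature vector of the selected object.
net = RecognitionNetwork()
label = recognize_target_object(torch.randn(128), net)
```

In practice, the feature vector would be extracted from the medical image by an upstream feature extractor, and the network would be trained on labeled scans.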
In operation S102, an interactive object displayed at a position corresponding to the positioning point in each of the plurality of operation areas is obtained according to the correspondence relationships between the plurality of operation areas with respect to the positioning point.
The interactive object may exhibit different display states, such as a cross, a flat cylinder and the like, as the relative positional relationship between the positioning point and the target object changes. The relative positional relationship may include: the positioning point is located inside or outside the target object. In other embodiments, the relative positional relationship may also be subdivided, e.g., by the angle, direction, and distance of the positioning point outside the target object.
In an example, in the case where the plurality of operation areas represent a 2D image and a 3D image, the position corresponding to the positioning point in each of the plurality of operation areas is respectively obtained according to a correspondence relationship of the positioning point in the 2D image and the 3D image, and interactive objects interlocking between the plurality of operation areas are displayed at the positions corresponding to the positioning point in the plurality of operation areas.
The operation area 201 includes a cross line for positioning, which consists of a first identification line 221 and a second identification line 222. The position indicated by the cross line is consistent with the 2D coordinates of the positioning point in the operation area 202, and there is a spatial correspondence relationship in the 3D space. For example, the center position of a circle in the 2D plane is the center positioning point of a sphere in the corresponding 3D space.
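The following sketch illustrates one way such a 2D-3D correspondence relationship could be computed, assuming the 2D operation area is a planar slice with known origin, orientation, and pixel spacing in the 3D space. All function names and parameters here are illustrative assumptions, not the disclosure's actual implementation.

```python
import numpy as np

def plane_to_world(uv, origin, u_dir, v_dir, spacing):
    """Map 2D in-plane coordinates (u, v) in pixels to a 3D world-space point.

    origin: 3D position of the plane's (0, 0) pixel.
    u_dir, v_dir: orthonormal unit vectors of the plane's row/column directions in 3D.
    spacing: (pixel width, pixel height) in world units.
    """
    u, v = uv
    return (np.asarray(origin, dtype=float)
            + u * spacing[0] * np.asarray(u_dir, dtype=float)
            + v * spacing[1] * np.asarray(v_dir, dtype=float))

def world_to_plane(p, origin, u_dir, v_dir, spacing):
    """Inverse mapping: project a 3D point onto the plane's pixel grid."""
    d = np.asarray(p, dtype=float) - np.asarray(origin, dtype=float)
    return (float(d @ np.asarray(u_dir)) / spacing[0],
            float(d @ np.asarray(v_dir)) / spacing[1])

# Example: the cross line at pixel (120, 85) in the 2D area corresponds to a
# unique 3D point, which in turn positions the marker in the 3D area.
origin, u_dir, v_dir, spacing = [0, 0, 50], [1, 0, 0], [0, 1, 0], (0.5, 0.5)
p3d = plane_to_world((120, 85), origin, u_dir, v_dir, spacing)
uv = world_to_plane(p3d, origin, u_dir, v_dir, spacing)  # round-trips to (120.0, 85.0)
```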
Since there are correspondence relationships between the original 2D image and the plurality of 3D reconstruction images for any positioning point in the spatial positioning and reconstruction process, the positions corresponding to the positioning point in the plurality of operation areas can be obtained according to the correspondence relationships (for example, the correspondence relationship between the 3D stereoscopic image of the blood vessel in the operation area 202 and the 2D cross section of the blood vessel in the operation area 201). Therefore, according to the embodiment of the present disclosure, the interactive objects corresponding to the operation areas can be respectively displayed at the positions corresponding to the plurality of operation areas, and the position changes of the interactive objects in the plurality of operation areas, caused by tracking of the positioning point, are displayed in an interlocking way.
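A minimal sketch of such interlocked display might maintain one shared 3D positioning point and notify every operation area of a change, with each area converting the shared point into its own coordinates. The class and function names here are hypothetical.

```python
class OperationArea:
    """Minimal stand-in for an operation area (a 2D or 3D view) showing a marker."""

    def __init__(self, name, world_to_local):
        self.name = name
        self.world_to_local = world_to_local  # maps the shared 3D point to area coords

    def show_marker(self, world_point):
        print(f"{self.name}: marker at {self.world_to_local(world_point)}")

def update_positioning_point(world_point, areas):
    """Interlocked update: one shared 3D positioning point drives every area."""
    for area in areas:
        area.show_marker(world_point)

# Example: a 3D view keeps world coordinates; a 2D cross-section drops one axis.
areas = [OperationArea("3D view (area 202)", lambda p: p),
         OperationArea("2D cross-section (area 201)", lambda p: (p[0], p[1]))]
update_positioning_point((60.0, 42.5, 50.0), areas)
```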
According to the embodiment of the present disclosure, it is also possible to indicate the positioning point with different interactive objects at positions of the positioning point in different operation areas.
According to the embodiment of the present disclosure, after the positioning point is obtained, the interactive object displayed at the position corresponding to the positioning point in each of the plurality of operation areas can be obtained according to the correspondence relationships between the plurality of operation areas with respect to the positioning point. Therefore, the positions corresponding to the positioning point in the plurality of operation areas can be synchronized according to these correspondence relationships, so that the interactive object can be displayed at the corresponding positions. Through the correspondence matching of the spatial positioning and the intuitive display mode, the spatial positioning of the target object and the positioning point in the plurality of operation areas can be timely fed back to the user for viewing. In this way, the user can check and view the same positioning point by using the multiple operation areas, and can intuitively obtain different interactive display states, thereby not only improving the display feedback effect, but also enabling the user to timely perform the next expected processing according to the display feedback, which improves the interactive feedback speed.
In a possible implementation, the method further includes: after the interactive object displayed at the position corresponding to the positioning point in each of the plurality of operation areas is obtained, a relative positional relationship between the positioning point and the target object is obtained in response to a position change of the positioning point; and a display state of the interactive object is adjusted according to the relative positional relationship.
The position change of the positioning point may be that the positioning point is changed from inside the target object to outside the target object, that the positioning point is changed from outside the target object to inside the target object, or that the positioning point is always located outside the target object, but the relative angle, the relative distance, or the relative direction between the positioning point and the target object changes.
For example, when the target object is a blood vessel, the relative positional relationship includes: the positioning point is in the blood vessel, or the positioning point moves out of the blood vessel (in this case, the positioning point may be on human tissue other than the blood vessel). Because different positional relationships result in different interactive objects, the display state of the interactive object needs to be adjusted. In an example, after the target object is recognized and the position of the positioning point and its position change are determined, different correspondence relationship representations can be displayed in real time according to the position of the positioning point relative to the target object (such as a blood vessel), so as to adjust the display state of the interactive object. The display effects of the interactive object after the display state is adjusted are as shown in the accompanying drawings.
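As a hedged sketch of how the relative positional relationship could be computed and mapped to a display state, the following example uses a sphere as a stand-in for the target object's geometry; in a real system the test would run against the segmented vessel or tissue surface, and all names here are illustrative assumptions.

```python
from dataclasses import dataclass
from enum import Enum, auto
import numpy as np

class DisplayState(Enum):
    CROSS = auto()          # e.g., positioning point on non-vascular tissue
    FLAT_CYLINDER = auto()  # e.g., positioning point on/inside a vessel

@dataclass
class RelativePosition:
    inside: bool           # is the point inside the target object?
    distance: float        # distance to the target surface (0 if inside)
    direction: np.ndarray  # unit vector from the target center toward the point

def relative_position(point, target_center, target_radius):
    """Toy relative-position test against a spherical stand-in for the target."""
    offset = np.asarray(point, dtype=float) - np.asarray(target_center, dtype=float)
    dist_to_center = float(np.linalg.norm(offset))
    inside = dist_to_center <= target_radius
    distance = 0.0 if inside else dist_to_center - target_radius
    direction = offset / dist_to_center if dist_to_center > 0 else np.zeros(3)
    return RelativePosition(inside, distance, direction)

def choose_display_state(rel: RelativePosition, target_is_vessel: bool) -> DisplayState:
    """Pick the interactive object's display state from the relative position."""
    if target_is_vessel and rel.inside:
        return DisplayState.FLAT_CYLINDER
    return DisplayState.CROSS

# Example: a point just outside a vessel of radius 2.0 centered at the origin.
rel = relative_position([3.0, 0.0, 0.0], [0.0, 0.0, 0.0], 2.0)
state = choose_display_state(rel, target_is_vessel=True)  # -> DisplayState.CROSS
```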
In a possible implementation, the operation that the display state of the interactive object is adjusted according to the relative positional relationship includes: in response to the relative positional relationship being a first positional relationship, the display state of the interactive object is adjusted into an interactive display state that has at least one of the following relationships with the target object: an angle, a direction, a displacement, and a visual effect.
The first positional relationship may be that the positioning point is always located outside the target object, but the relative angle, the relative distance, or the relative direction between the positioning point and the target object changes.
In a possible implementation, the operation that the display state of the interactive object is adjusted according to the relative positional relationship includes: in response to the relative positional relationship being a second positional relationship, the display state of the interactive object is adjusted into an interactive display state indicating a position of the positioning point in the target object.
The second positional relationship may be that the positioning point is changed from inside the target object to outside the target object, or that the positioning point is changed from outside the target object to inside the target object.
In an example, the interactive display state includes: a cross obtained by lateral and longitudinal positioning identifications in the case where the target object is a non-vascular human body part, non-vascular human tissue, or a non-vascular human cell, as shown in the accompanying drawings.
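The two branches above might be combined into a single dispatch, as in the following illustrative sketch: `rel` is assumed to be the `RelativePosition` from the earlier sketch (or any object with `inside`, `distance`, and `direction` attributes), and the returned render parameters are hypothetical.

```python
import numpy as np
from types import SimpleNamespace

def adjust_interactive_object(rel, vessel_tangent=None):
    """Return hypothetical render parameters for the interactive object.

    First positional relationship: the point stays outside the target while its
    angle/direction/distance changes, so keep a cross marker and re-orient it.
    Second positional relationship: the point crosses into the target, so switch
    to a state marking the point's position inside the target (e.g., a flat
    cylinder aligned with the local vessel direction).
    """
    if rel.inside:
        axis = vessel_tangent if vessel_tangent is not None else np.array([0.0, 0.0, 1.0])
        return {"shape": "flat_cylinder", "axis": axis, "offset": 0.0}
    return {"shape": "cross", "axis": rel.direction, "offset": rel.distance}

# Example with a duck-typed relative position (point outside the target).
rel_outside = SimpleNamespace(inside=False, distance=2.5,
                              direction=np.array([1.0, 0.0, 0.0]))
params = adjust_interactive_object(rel_outside)  # -> cross marker, offset 2.5
```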
In the case of medical images related to, for example, a cardiac vessel, or a scene in which the 2D plane has a correspondence relationship with the 3D stereo model in terms of positioning, it is necessary to match the points or contents on the 2D plane to the positions of the 3D stereo model. There are some parts in which the contents need to be moved in a constrained way; for example, the movement is limited to be on the vessel, on the trachea, or on other objects, so that a synchronized matching positional relationship can be seen by using the technical solutions of the present disclosure.
In the related art, a matching relationship between a point in a 2D plane and a point in a 3D stereoscopic model is realized, but a specific operation mode and operable prompts are not provided for different parts, so that a usable operation expectation cannot be well conveyed to the user, and the specific content corresponding to the position cannot be expressed in an intuitive way.
The method for interactive display of image positioning described in one or more embodiments of the present disclosure is illustrated by way of example below.
The technical solutions according to embodiments of the present disclosure can be applied to the process of searching for vascular lesions. A physician needs to diagnose a patient by viewing the vessels one by one. Through the technical solutions of the present disclosure, it is possible to automatically recognize all blood vessels, sub-vascular lesions, and plaque attribute information based on an Artificial Intelligence (AI) algorithm. In the confirmation process, the original 2D image and the reconstructed 3D image need to be viewed correspondingly: after a certain blood vessel is selected on the 3D image, a pointer can be moved along the blood vessel on the planar image (i.e., the original 2D image), and the movement of the point will be displayed on the 3D image correspondingly. It is also possible to view the blood vessel, switch between blood vessels, and move the control point along the blood vessel on the 3D image; the image correspondence relationship of the control point can be seen in real time, and the control point can be moved to a region of interest in other tissues.
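One plausible way to keep the control point's movement limited to the vessel, as described above, is to snap the dragged point to the nearest sample of the selected vessel's centerline. The following sketch assumes such a centerline is available as an array of 3D points; the names are illustrative, not the disclosure's actual implementation.

```python
import numpy as np

def snap_to_centerline(point, centerline):
    """Constrain a dragged control point to the nearest vessel-centerline sample.

    centerline: (N, 3) array of 3D points sampled along the selected vessel.
    Returns the snapped 3D position, which both the 2D and 3D operation areas
    would then display at their corresponding coordinates.
    """
    centerline = np.asarray(centerline, dtype=float)
    distances = np.linalg.norm(centerline - np.asarray(point, dtype=float), axis=1)
    return centerline[int(np.argmin(distances))]

# Example: dragging toward (10.2, 4.9, 33.0) snaps the control point onto the vessel.
centerline = np.array([[10.0, 5.0, 30.0], [10.1, 5.0, 31.0], [10.3, 4.8, 32.0]])
snapped = snap_to_centerline([10.2, 4.9, 33.0], centerline)
```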
Assume that the physician selects a blood vessel, as shown in the accompanying drawings.
In the embodiments of the present disclosure, it is possible to recognize and embody the 3D spatial positional relationship, adjust the operation representation of the corresponding part in real time, display different correspondence relationship representations in real time according to the positions of different parts, provide feedback in real time, and guide other expected operations of the user. The operability of the part is represented in a vivid way, and the relative spatial relationship and the 3D spatial relationship are well reflected to ease the user's understanding.
In the embodiments of the present disclosure, it is possible to recognize body parts and adjust the operation form in real time, whereas in the related art, the form of the operation is fixed and remains the same for different body parts. In the embodiments of the present disclosure, it is possible to vary the angle of the flat cylinder in real time according to the spatial positional relationship of the body tissues, whereas in the related art, the body part is displayed in a fixed form, such as a point.
According to the embodiments of the present disclosure, it is possible to display different correspondence relationship representations in real time according to the positions of different parts, provide feedback in real time, and guide other expected operations of the user. The operability of the part is represented in a vivid way, and the relative spatial relationship and the 3D spatial relationship are well reflected to accelerate the disease search by a doctor and ease the user's understanding.
The embodiments of the present disclosure may be applied to all logical operations having a correspondence relationship, such as scanning workstations (e.g., an imaging department reading system, and Computed Tomography (CT), Magnetic Resonance (MR), and Positron Emission Tomography (PET) workstations), AI-assisted diagnosis, AI labeling systems, telemedicine diagnosis, cloud platform-assisted intelligent diagnosis, and the like.
It can be understood that, in the above method embodiments, the order in which the steps are written does not imply a strict order of execution or constitute any limitation on the implementation process; the specific execution sequence of each step should be determined in terms of its function and possible internal logic.
The method embodiments mentioned in the disclosure may be combined with each other to form a combined embodiment without departing from the principle and logic, which is not elaborated in the embodiments of the disclosure for the sake of simplicity.
In addition, the disclosure further provides an image processing apparatus, an electronic device, a computer-readable storage medium, and a program, all of which may be configured to implement any image processing method provided by the disclosure. The corresponding technical solutions and descriptions refer to the corresponding descriptions in the method and will not be elaborated herein.
In a possible implementation, the interactive display unit is configured to:
In the case where the plurality of operation areas respectively represent a 2D image and a 3D image, obtain the position corresponding to the positioning point in each of the plurality of operation areas according to a correspondence relationship of the positioning point in the 2D image and the 3D image;
and display the interactive objects interlocking between the plurality of operation areas at the positions corresponding to the positioning point in the plurality of operation areas.
In a possible implementation, the response unit is configured to obtain a relative positional relationship between the positioning point and the target object in response to the position change of the positioning point;
and the interactive display unit is configured to adjust the display state of the interactive object according to the relative positional relationship.
In a possible implementation, the interactive display unit is configured to:
in response to the relative positional relationship being a first positional relationship, adjust the display state of the interactive object into an interactive display state that has at least one of the following relationships with the target object: an angle, a direction, a displacement, and a visual effect.
In a possible implementation, the interactive display unit is configured to:
in response to the relative positional relationship being a second positional relationship, adjust the display state of the interactive object into an interactive display state indicating a position of the positioning point in the target object.
In a possible implementation, the device further includes a recognition unit, configured to:
obtain a feature vector of the target object;
recognize the target object according to the feature vector and a recognition network.
In some embodiments, the functions or modules of the apparatus provided by the embodiments of the present disclosure may be configured to execute the method described in the above method embodiments, and the specific implementation may refer to the description in the above method embodiments. For simplicity, the details are not elaborated herein.
The embodiments of the disclosure further provide a computer-readable storage medium, in which a computer program instruction is stored, the computer program instruction being executed by a processor to implement any image processing method described above. The computer-readable storage medium may be a non-volatile computer-readable storage medium.
An embodiment of the present disclosure also provides a computer program product including computer-readable code; when the computer-readable code runs on a device, a processor in the device executes instructions for implementing the method for interactive display of image positioning as provided in any one of the above embodiments.
The embodiments of the present disclosure also provide another computer program product, configured to store computer-readable instructions that, when executed, cause a computer to perform operations of the method for interactive display of image positioning provided in any one of the above embodiments.
The computer program product may be implemented in hardware, software, or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium, and in another alternative embodiment, the computer program product is embodied as a software product, such as a Software Development Kit (SDK) and the like.
An embodiment of the present disclosure further provides an electronic device, including: a processor; and a memory configured to store instructions executable by the processor, where the processor is configured to implement the method as described above.
The electronic device may be provided as a terminal, a server, or other form of device.
Referring to the accompanying drawings, the electronic device 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an Input/Output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 typically controls the overall operations of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps in the above described methods. Moreover, the processing component 802 may include one or more modules which facilitate the interaction between the processing component 802 and other components. For instance, the processing component 802 may include a multimedia module to facilitate the interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support the operation of the electronic device 800. Examples of such data include instructions for any application or method operated on the electronic device 800, contact data, phone book data, messages, pictures, video, etc. The memory 804 may be implemented by any type of volatile or non-volatile memory device, or a combination thereof, such as a Static Random Access Memory (SRAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), an Erasable Programmable Read-Only Memory (EPROM), a Programmable Read-Only Memory (PROM), a Read-Only Memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disk.
The power component 806 provides power to various components of the electronic device 800. The power component 806 may include a power management system, one or more power sources, and other components associated with the generation, management, and distribution of power in the electronic device 800.
The multimedia component 808 includes a screen providing an output interface between the electronic device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes the TP, the screen may be implemented as a touch screen to receive input signals from a user. The TP includes one or more touch sensors to sense touches, swipes, and gestures on the TP. The touch sensor may not only sense the boundary of a touch or swipe action, but also sense a period of time and a pressure associated with the touch or swipe action. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. The front camera and/or the rear camera may receive external multimedia data when the electronic device 800 is in an operation mode, such as a photographing mode or a video mode. Each of the front camera and the rear camera may be a fixed optical lens system or have focus and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a microphone (MIC) configured to receive an external audio signal when the electronic device 800 is in an operating mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signal may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, the audio component 810 further includes a speaker configured to output an audio signal.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules. The peripheral interface modules may be a keyboard, a click wheel, buttons, and the like. The buttons may include, but are not limited to, a home page button, a volume button, a starting button, and a locking button.
The sensor component 814 includes one or more sensors to provide status assessments of various aspects of the electronic device 800. For instance, the sensor component 814 may detect an on/off state of the electronic device 800 and the relative positioning of components, such as a display and a keypad of the electronic device 800, and may further detect a change in the position of the electronic device 800 or a component thereof, the presence or absence of contact between the user and the electronic device 800, the orientation or acceleration/deceleration of the electronic device 800, and a change in the temperature of the electronic device 800. The sensor component 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 814 may also include a light sensor, such as a Complementary Metal Oxide Semiconductor (CMOS) or Charge-Coupled Device (CCD) image sensor, configured for use in an imaging application. In some embodiments, the sensor component 814 may also include an accelerometer sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a communication-standard-based wireless network, such as WiFi, 2nd-Generation (2G) or 3rd-Generation (3G) wireless telephone technology, or a combination thereof. In one exemplary embodiment, the communication component 816 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on a Radio Frequency Identification (RFID) technology, an Infrared Data Association (IrDA) technology, an Ultra Wide Band (UWB) technology, a Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application-Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, and is configured to execute the above method.
In an exemplary embodiment, a non-volatile computer-readable storage medium, for example, a memory 804 including a computer program instruction, is also provided. The computer program instruction may be executed by a processor 820 of the electronic device 800 to implement the above method.
The electronic device 900 may further include a power component 926 configured to execute power management of the electronic device 900, a wired or wireless network interface 950 configured to connect the electronic device 900 to a network, and an input/output (I/O) interface 958. The electronic device 900 may be operated based on an operating system stored in the memory 932, for example, Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™ and the like.
In an exemplary embodiment, a non-volatile computer-readable storage medium, for example, the memory 932 including a computer program instruction, is also provided. The computer program instruction may be executed by a processing component 922 of the electronic device 900 to implement the above method.
The present disclosure may be a system, a method and/or a computer program product. The computer program product may include a computer-readable storage medium, in which a computer-readable program instruction configured to enable a processor to implement each aspect of the present disclosure is stored.
The computer-readable storage medium may be a physical device capable of retaining and storing an instruction used by an instruction execution device. The computer-readable storage medium may be, but is not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination thereof. More specific examples (a non-exhaustive list) of the computer-readable storage medium include a portable computer disk, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a Compact Disc Read-Only Memory (CD-ROM), a Digital Video Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, a punched card or in-slot raised structure with an instruction stored therein, and any appropriate combination thereof. Herein, the computer-readable storage medium is not to be explained as a transient signal, for example, a radio wave or another freely propagated electromagnetic wave, an electromagnetic wave propagated through a waveguide or another transmission medium (for example, a light pulse propagated through an optical fiber cable), or an electric signal transmitted through an electric wire.
The computer-readable program instruction described here may be downloaded from the computer-readable storage medium to each computing/processing device, or downloaded to an external computer or an external storage device through a network such as the Internet, a local area network, a wide area network, and/or a wireless network. The network may include a copper transmission cable, optical fiber transmission, wireless transmission, a router, a firewall, a switch, a gateway computer, and/or an edge server. A network adapter card or network interface in each computing/processing device receives the computer-readable program instruction from the network and forwards the computer-readable program instruction for storage in the computer-readable storage medium in each computing/processing device.
The computer program instruction configured to execute the operations of the present disclosure may be an assembly instruction, an Instruction Set Architecture (ISA) instruction, a machine instruction, a machine-related instruction, a microcode, a firmware instruction, state setting data, or source code or object code written in any combination of one or more programming languages, the programming languages including an object-oriented programming language such as Smalltalk or C++, and a conventional procedural programming language such as the “C” language or a similar programming language. The computer-readable program instruction may be completely executed in a computer of a user, executed as an independent software package, executed partially in the computer of the user and partially in a remote computer, or executed completely in the remote computer or a server. In the case involving a remote computer, the remote computer may be connected to the user's computer via any type of network including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet by using an Internet service provider). In some embodiments, an electronic circuit, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), is customized by using state information of the computer-readable program instruction to implement each aspect of the present disclosure.
Herein, each aspect of the embodiments of the present disclosure is described with reference to flowcharts and/or block diagrams of the methods, devices (systems), and computer program products according to the embodiments of the present disclosure. It is to be understood that each block in the flowcharts and/or block diagrams and a combination of the blocks in the flowcharts and/or block diagrams may be implemented by computer-readable program instructions.
The computer-readable program instructions may be provided for a general-purpose computer, a dedicated computer, or a processor of another programmable data processing device, thereby generating a machine, so that a device realizing the functions/actions specified in one or more blocks in the flowcharts and/or the block diagrams is generated when the instructions are executed through the computer or the processor of the other programmable data processing device. These computer-readable program instructions may also be stored in a computer-readable storage medium, and through these instructions, the computer, the programmable data processing device, and/or another device may work in a specific manner, so that the computer-readable medium including the instructions includes a product including instructions for implementing each aspect of the functions/actions specified in one or more blocks in the flowcharts and/or the block diagrams.
The computer-readable program instructions may further be loaded onto the computer, the other programmable data processing device, or another device, so that a series of operating steps are executed in the computer, the other programmable data processing device, or the other device to generate a computer-implemented process, so that the instructions executed in the computer, the other programmable data processing device, or the other device realize the functions/actions specified in one or more blocks in the flowcharts and/or the block diagrams.
The flowcharts and block diagrams in the drawings illustrate possibly implemented system architectures, functions, and operations of the systems, methods, and computer program products according to multiple embodiments of the present disclosure. On this aspect, each block in the flowcharts or block diagrams may represent a module, a program segment, or part of an instruction, and the module, the program segment, or the part of the instruction includes one or more executable instructions configured to realize a specified logical function. In some alternative implementations, the functions marked in the blocks may also be realized in a sequence different from that marked in the drawings. For example, two continuous blocks may actually be executed in a substantially concurrent manner, and may also sometimes be executed in a reverse sequence, which is determined by the involved functions. It is further to be noted that each block in the block diagrams and/or flowcharts and a combination of the blocks in the block diagrams and/or the flowcharts may be implemented by a dedicated hardware-based system configured to execute a specified function or operation, or may be implemented by a combination of dedicated hardware and computer instructions.
Various embodiments of the present disclosure may be combined with each other without departing from the logic. The description of each embodiment has its own focus, and for the parts not described in detail in one embodiment, reference may be made to the description of other embodiments.
Each embodiment of the present disclosure has been described above. The above descriptions are exemplary, non-exhaustive, and not limited to the disclosed embodiments. Many modifications and variations are apparent to those of ordinary skill in the art without departing from the scope and spirit of each described embodiment of the present disclosure. The terms used herein are selected to best explain the principles and practical applications of each embodiment, or the technical improvements over technologies in the market, or to enable others of ordinary skill in the art to understand each embodiment disclosed herein.
In the embodiments of the present disclosure, through the correspondence matching of the spatial positioning and the intuitive display mode, the spatial positioning of the target object and the positioning point in a plurality of operation areas can be timely fed back to the user for viewing, which not only improves the display feedback effect, but also enables the user to perform the next expected processing in time according to the display feedback effect, thereby improving the interactive feedback speed.
This application is a continuation of International Application No. PCT/CN2020/100928, filed on Jul. 8, 2020, which claims priority to Chinese Patent Application No. 201911203808.9, filed on Nov. 29, 2019. The disclosures of International Application No. PCT/CN2020/100928 and Chinese Patent Application No. 201911203808.9 are hereby incorporated by reference in their entireties.
Continuation data: parent application PCT/CN2020/100928 (filed Jul. 2020, US); child application Ser. No. 17547286 (US).