The present disclosure claims priority to Chinese patent application No. 202311754545.7, filed on Dec. 19, 2023, titled “RECONSTRUCTING METHOD AND APPARATUS OF MEDICAL IMAGE, COMPUTER DEVICE, AND STORAGE MEDIUM”, the content of which is hereby incorporated by reference in its entirety.
The present disclosure generally relates to the field of medical imaging, and in particular, to a reconstructing method of a medical image, a reconstructing apparatus of a medical image, a computer device, a storage medium, and a computer program product.
Positron emission tomography (PET) is a medical imaging technology used to observe biological activities and functions inside a detected object. A radioactive tracer (e.g., glucose labeled with a radioactive isotope) is injected into the body of the detected object, and then a PET scanner detects and records the spatial distribution of the radioactive tracer in the body.
During a PET detection, trace signals emitted by the radioactive tracer (such as the two gamma-ray signals that travel in opposite directions, produced when a positron annihilates with a negatron it meets in the body) are detected and recorded by the PET scanner. A tomographic image of the body is generated by analyzing and reconstructing the data of the trace signals, so as to observe and evaluate metabolic activity, blood flow, and brain function of different tissues and organs.
However, in a current PET system, the data volume and the counting rate are multiplied by an increase in system sensitivity. As a result, in a corresponding special scenario (a high-activity radiation-source scenario such as myocardial injection or dynamic scanning in clinical practice), the PET system captures a large quantity of trace signals in a short time. These trace signals place great pressure on data processing, data storage, and reconstruction in the PET system, and affect data availability, data correctness, and consistency of the reconstructed image.
According to various embodiments of the present disclosure, a reconstructing method of a medical image, a reconstructing apparatus of a medical image, a computer device, a storage medium, and a computer program product are provided.
In a first aspect, a reconstructing method of a medical image is provided, including: acquiring a signal data set obtained by performing signal detection on a detected object via a detector array; determining an effective detecting range of each detector in the detector array when a detecting scenario is a target scenario; extracting target signal data of each detector within a respective effective detecting range from the signal data set; and reconstructing a medical detecting image of the detected object based on the target signal data. The detector array is configured to siege the detected object in three dimensions and detect a trace signal emitted by the detected object to obtain the signal data set.
In an embodiment, determining an effective detecting range of each detector in the detector array further includes: acquiring a detecting region of the detector array for the detected object, and determining, based on the detecting region, the effective detecting range corresponding to each detector.
In an embodiment, the detecting region includes a first projection region of the detected object in an axial view of the detector array. Determining, based on the detecting region, the effective detecting range corresponding to each detector further includes: determining, based on the first projection region, a first effective detecting range of each detector in the axial view.
In an embodiment, determining the effective detecting range corresponding to each detector further includes: for the axial view, taking the detector as an endpoint and two corresponding first siege lines as sidelines, sieging the first projection region to obtain a first siege region, and taking a view range corresponding to the first siege region as the first effective detecting range. An end of each of the first siege lines is located at the endpoint, the other end of each of the first siege lines is located within an axial length range of the detector array, and the first siege region is configured to siege at least a part of the first projection region.
In an embodiment, the detecting region includes a second projection region of the detected object in a circumferential view of the detector array. Determining, based on the detecting region, the effective detecting range corresponding to each detector further includes: determining, based on the second projection region, a second effective detecting range of each detector in the circumferential view.
In an embodiment, determining the effective detecting range corresponding to each detector further includes: for the circumferential view, taking the detector as an endpoint and two corresponding second siege lines as sidelines, sieging the second projection region to obtain a second siege region, and taking a view range corresponding to the second siege region as the second effective detecting range. An end of each of the second siege lines is located at the endpoint, the other end of each of the second siege lines is located within a circumferential length range of the detector array, and the second siege region is configured to siege the entire second projection region.
In an embodiment, each detector has at least one coincidence line within the respective effective detecting range, an end of the at least one coincidence line is a local-end detector corresponding to the effective detecting range, and the other end of the at least one coincidence line is a peer-end detector. Extracting target signal data of each detector within the respective effective detecting range from the signal data set further includes: extracting, from the signal data set, target trace signals of the local-end detector and the peer-end detector that are on each of the at least one coincidence line, and taking the target trace signals as the target signal data. Each of the at least one coincidence line belongs to the corresponding effective detecting range of each detector.
In an embodiment, determining the detecting scenario of the detector array for the detected object based on the data feature of the signal data set further includes: performing signal identification on each piece of signal data in the signal data set, determining statistical data of the trace signal emitted from the detected object based on the radioactive tracer, and determining the detecting scenario of the detector array based on the activity level. The statistical data is configured to represent the activity level of the radioactive tracer in the detected object.
In an embodiment, performing signal identification on each piece of signal data in the signal data set, and determining statistical data of the trace signal emitted from the detected object based on the radioactive tracer further includes: identifying a preset radioactive signal and a positron signal that are emitted by the detected object in a coincident time window, obtaining a data amount of the radioactive signal and a counting rate of the positron signal, and taking either or both of the data amount and the counting rate as the statistical data of the trace signal.
In a second aspect, a reconstructing apparatus of a medical image is further provided in the present disclosure, including: means for acquiring a signal data set obtained by performing signal detection on a detected object via a detector array; means for determining an effective detecting range of each detector in the detector array when the detecting scenario is a target scenario; and means for extracting target signal data of each detector within a respective effective detecting range from the signal data set and reconstructing a medical detecting image of the detected object based on the target signal data. The detector array is configured to siege the detected object in three dimensions and detect a trace signal emitted by the detected object to obtain the signal data set.
In a third aspect, a computer device is further provided in the present disclosure, including a memory and a processor. The memory is configured to store an executable instruction of the processor, and the processor is configured to execute the executable instruction to implement the reconstructing method of the medical image.
In a fourth aspect, a computer-readable storage medium is further provided in the present disclosure, including program data. When the program data is executed by a processor of a computer device, the computer device can implement the reconstructing method of the medical image.
In a fifth aspect, a computer program product is further provided in the present disclosure, including a computer program. The computer program is executed by a processor to implement the reconstructing method of the medical image.
Details of one or more embodiments of the present disclosure are set forth in the following accompanying drawings and description. Other features, objectives, and advantages of the present disclosure become obvious with reference to the specification, the accompanying drawings, and the claims.
In order to more clearly illustrate the technical solutions in the embodiments of the present disclosure or in the related art, the accompanying drawings needed in the description of the embodiments or the related art are briefly introduced below. Obviously, the accompanying drawings in the following description are merely some embodiments of the present disclosure, and one skilled in the art may obtain other accompanying drawings based on these accompanying drawings without creative effort.
To make objectives, technical solutions, and advantages of the present disclosure clearer, the following further describes the present disclosure in detail with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely used to explain the present disclosure, and are not intended to limit the present disclosure.
The term “and/or” in the embodiments of the present disclosure refers to any and all possible combinations of one or more of the associated enumerated items. It is also noted that, when used in this specification, “include/comprise” specifies the presence of stated features, integers, steps, operations, elements, and/or components, but does not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The terms “first” and “second” in the present disclosure are used to distinguish between different objects, and are not used to describe a specific sequence. In addition, the terms “include”, “have”, and any variations thereof are intended to cover a non-exclusive inclusion. For example, a process, a method, a system, a product, or a device that includes a series of steps or units is not limited to the listed steps or units, but may alternatively further include an unlisted step or unit, or may alternatively further include another step or unit inherent to the process, the method, the product, or the device.
In addition, in the present disclosure, although the terms “first”, “second” and the like are repeatedly used to describe various operations (elements, applications, instructions, or data) and the like, these operations (elements, applications, instructions, or data) are not limited to these terms. These terms are used only to distinguish one operation (an element, an application, an instruction, or data) from another operation (an element, an application, an instruction, or data).
A reconstructing method of a medical image provided in an embodiment of the present disclosure is applied to an application environment shown in
In some embodiments, referring to
In some embodiments, the terminal 102 (such as a mobile terminal and a fixed terminal) may be implemented in various forms. The terminal 102 may be a mobile terminal that can reconstruct the medical detecting image of the detected object according to target signal data of the detector within the effective detecting range, such as a mobile phone, a smart phone, a notebook computer, a portable handheld device, a Personal Digital Assistant (PDA), a tablet computer, or the like. The terminal 102 may also be a fixed terminal that can reconstruct the medical detecting image of the detected object according to target signal data of the detector within the effective detecting range, such as an Automated Teller Machine (ATM), an all-in-one computer, a digital TV, a desktop computer, or the like.
The following description assumes that the terminal 102 is a fixed terminal. However, one skilled in the art will understand that, except for an operation or a component specifically intended for a mobile purpose, the reconstruction according to the implementation manners disclosed in the present disclosure may also be applied to a mobile-type terminal 102.
In some embodiments, a data processing component running on the server 104 may load any of various additional server applications and/or middle-tier applications to be executed, such as an HTTP (Hyper Text Transfer Protocol) server, an FTP (File Transfer Protocol) server, a CGI (Common Gateway Interface) server, an RDBMS (Relational Database Management System), or the like.
In some embodiments, the server 104 may implement a corresponding network function by deploying a server (such as an independent server or a server cluster consisting of a plurality of servers). The server 104 may also be adapted to run one or more application services or software components of the terminal 102 described in the foregoing disclosure.
In some embodiments, the application services may include providing a user with a service interface for reconstructing the medical detecting image (e.g., a trace-signal collection interface, a display interface for signal data processing, or the like), a corresponding program service, or the like. The software components may include, for example, a Software Development Kit (SDK) or a client (APP) that has an image reconstructing function according to a detecting scenario.
In some embodiments, the SDK or the APP provided by the server 104, which has the image reconstructing function according to the detecting scenario, may include a portal that provides a one-to-one application service to a user in the foreground and multiple service systems that perform data processing in the background, so as to extend the image reconstruction processing function to the APP or the client, allowing the user to use and access the image reconstruction processing function anywhere at any time.
In some embodiments, the image reconstruction processing function in the APP or the client may be a computer program running in a user mode to complete one or more particular tasks; it may interact with the user and has a visual user interface. The APP or the client may include two parts: a graphical user interface (GUI) and an engine. Together, the graphical user interface and the engine can provide multiple application services to the user in the form of user interfaces.
In some embodiments, the user may input corresponding code data or control parameters to the APP or the client by using a preset input apparatus or an automatic control program, so as to execute the application service of the computer program in the server 104 and display the application service in the user interface.
In some embodiments, the operating system on which the APP or the client runs may include various versions of Microsoft Windows®, Apple Macintosh®, and/or Linux operating systems, various commercial or UNIX®-like operating systems (including but not limited to various GNU/Linux operating systems, Google Chrome OS, and the like), and/or mobile operating systems such as iOS®, Windows® Phone, Android® OS, BlackBerry® OS, and Palm® OS, as well as other online or offline operating systems, which are not specifically limited herein.
In some embodiments, referring to
Step 11 includes acquiring a signal data set obtained by performing signal detection on a detected object via a detector array.
The detected object may be a medical diagnosis and treatment object that has been injected with a radioactive tracer, such as a human object, an animal object, a plant object, or the like. The detected object may also be a detection phantom.
In an embodiment, the radioactive tracer is a radioactive isotope-labeled compound that acts in a living object and can participate in blood flow or metabolism of a tissue of the object, such as radioactive isotope-labeled glucose.
In an embodiment, the detector array is configured to siege the detected object in three dimensions and detect a trace signal emitted by the detected object to obtain the signal data set.
In some embodiments, the radioactive tracer emits positrons after being injected into the detected object, and these positrons annihilate when meeting negatrons in the detected object, producing two gamma rays that move in opposite directions; the detectors in the detector array then detect and record the gamma rays to obtain the signal data set.
In an exemplary embodiment, referring to
In some embodiments, the detector array 104 may include one or more crystal elements (e.g., detector crystals). When the trace signal (e.g., a gamma-ray photon) reaches a detector crystal, the detector crystal may absorb the energy of the trace signal and convert the absorbed energy into visible light. In some embodiments, the detector crystals may be arranged in N rows and M columns, where N may be an integer greater than 0 and M may be an integer greater than 0. In some embodiments, a barrier material (e.g., a reflective film) may be filled between two adjacent detector crystals of the detector array 104. The detector crystals may use one or more types of crystals, including, e.g., one or a combination of NaI(Tl), BGO, LSO, YSO, GSO, LYSO, LaBr3, LFS, LuAP, LuI3, BaF2, CeF, CsI(Tl), CsI(Na), CaF2(Eu), CdWO4, and YAP.
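The per-crystal signal records described above can be organized as follows. This is a minimal illustrative sketch: the record fields (`detector_row`, `energy_kev`, etc.) and function names are assumptions for illustration, not structures defined in the disclosure.

```python
from dataclasses import dataclass

# Hypothetical record for one detected trace signal; the field names are
# illustrative, not taken from the disclosure.
@dataclass
class SignalRecord:
    detector_row: int      # crystal row index (0..N-1)
    detector_col: int      # crystal column index (0..M-1)
    energy_kev: float      # deposited energy of the gamma photon
    timestamp_us: float    # detection time in microseconds

def make_signal_data_set(records):
    """Group raw records per detector crystal to form the signal data set."""
    data_set = {}
    for r in records:
        data_set.setdefault((r.detector_row, r.detector_col), []).append(r)
    return data_set
```

Grouping by crystal index keeps later per-detector range extraction a simple dictionary lookup.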
Step 12 includes determining an effective detecting range of each detector in the detector array when a detecting scenario is a target scenario.
For example, an effective detecting range corresponding to the detector in a detecting scenario with high activity (the activity level of the radioactive tracer in the detected object is relatively high) may be a first range; an effective detecting range corresponding to the detector in a detecting scenario with medium activity (the activity level is moderate) may be a second range; and an effective detecting range corresponding to the detector in a detecting scenario with low activity (the activity level is relatively low) may be a third range. The effective detecting ranges corresponding to the detector in different detecting scenarios are different: the first range is greater than the second range, and the second range is greater than the third range. The target scenario may be the detecting scenario with high activity, such as myocardial injection or dynamic scanning in clinical practice. The target scenario may also be a long-axis scanning scenario.
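The scenario-to-range mapping described above can be sketched as a simple lookup. The scenario labels and the angle values are placeholders chosen only to preserve the stated ordering (first range > second range > third range); the disclosure does not fix concrete values.

```python
# Illustrative mapping from detecting scenario to effective detecting range,
# expressed here as a half-angle of view per detector in degrees.
# The labels and numbers are assumptions, not values from the disclosure.
SCENARIO_RANGES_DEG = {
    "high_activity": 60.0,    # first range (widest)
    "medium_activity": 40.0,  # second range
    "low_activity": 25.0,     # third range (narrowest)
}

def effective_range_for(scenario: str) -> float:
    """Return the effective detecting range (half-angle) for a scenario."""
    return SCENARIO_RANGES_DEG[scenario]
```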
In an embodiment, each detector may have at least one coincidence line within the respective effective detecting range. An end of the at least one coincidence line may be a local-end detector corresponding to the effective detecting range, and the other end of the at least one coincidence line may be a peer-end detector.
In an exemplary embodiment, referring to
The method may further include determining the detecting scenario of the detector array for the detected object based on a data feature of the signal data set.
Specifically, the server may first perform signal identification on signal data in the signal data set to determine statistical data of the trace signal emitted by the detected object based on the radioactive tracer. Then, the detecting scenario of the detector array may be determined based on the statistical data.
The statistical data is configured to represent an activity level of the radioactive tracer in the detected object. The higher the statistical data of the trace signal emitted by the detected object, the higher the activity level of the radioactive tracer and the higher the activity level of the detecting scenario of the detector array; the lower the statistical data, the lower the activity level of the radioactive tracer and the lower the activity level of the detecting scenario of the detector array.
In an embodiment, determining, by the server, the statistical data of the trace signal emitted by the detected object based on the radioactive tracer may further include: identifying a radioactive signal and a positron signal that are emitted by the detected object in a coincident time window, obtaining a data amount of the radioactive signal and a counting rate of the positron signal, and taking either or both of the data amount and the counting rate as the statistical data of the trace signal.
Exemplarily, the signal data set collected by the detector array in the coincident time window (e.g., 0 μs to 15 μs) may include X pieces of signal data. The server may identify the X pieces of signal data one by one to determine the data amount of the radioactive signal (e.g., gamma rays) and the counting rate of the positron signal, and then use either or both of the data amount and the counting rate as the statistical data of the trace signal.
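The statistics described above (a data amount of the radioactive signal and a counting rate of the positron signal) can be computed with a sketch like the following. The `(kind, timestamp)` record layout and the classification thresholds are assumptions for illustration; the disclosure does not specify them.

```python
def trace_signal_statistics(records, window_start_us, window_end_us):
    """Compute the data amount of radioactive (gamma) signals and the
    counting rate of positron signals within a coincidence time window.
    Records are assumed to be (kind, timestamp_us) tuples with kind in
    {"gamma", "positron"}; this layout is illustrative."""
    in_window = [r for r in records
                 if window_start_us <= r[1] <= window_end_us]
    data_amount = sum(1 for kind, _ in in_window if kind == "gamma")
    positron_count = sum(1 for kind, _ in in_window if kind == "positron")
    duration_us = window_end_us - window_start_us
    counting_rate = positron_count / duration_us  # counts per microsecond
    return data_amount, counting_rate

def classify_scenario(data_amount, counting_rate,
                      amount_threshold=1000, rate_threshold=10.0):
    """Placeholder thresholds: either statistic exceeding its threshold
    marks a high-activity detecting scenario."""
    if data_amount >= amount_threshold or counting_rate >= rate_threshold:
        return "high_activity"
    return "normal"
```

Either statistic alone, or both together, can serve as the statistical data, mirroring the "either or both" wording above.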
In an embodiment, during a scanning process, a reconstruction preview may be performed on real-time signal data over a time slice to obtain a rough outline of the detected object, so as to identify the detecting scenario.
In an embodiment, the detecting scenario may be determined according to a scanning protocol selected by the user.
Step 13 includes extracting target signal data of each detector within a respective effective detecting range from the signal data set.
Specifically, the server may extract, from the signal data set, a target trace signal detected by a local-end detector and a peer-end detector on each coincidence line in a corresponding effective detecting range of each detector, and take the target trace signal as the target signal data.
The target trace signal may be a trace signal that is generated on a coincidence line and corresponding to a coincidence event.
In some embodiments, the server may first take the trace signals detected by the local-end detector and the peer-end detector on each coincidence line as a signal data group, to obtain a set of signal data groups of the coincidence lines. Then, coincidence-event determination based on the coincidence time window may be performed on each signal data group in the set. When it is determined that a coincidence event is generated, the server may record response data of the trace signal (including information such as energy strength, response time, and response location when the coincidence event is generated), and store the response data of the trace signal as the target signal data.
After the detected object is injected with the radioactive tracer, a positron emitted by decay of the radionuclide combines with a negatron in the tissue and annihilates, generating two gamma photons with equal energy and opposite directions. Since the two gamma photons travel different distances in the body of the detected object, their times of arrival at the two detectors also differ. When the two gamma photons, which are emitted at 180° to each other, are detected by the local-end detector and the peer-end detector on the coincidence line within a specified coincidence time window (e.g., 0 μs to 15 μs), a coincidence event is formed.
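The coincidence-event determination described above can be sketched as a greedy pairing of hits from the local-end and peer-end detectors on one coincidence line within the coincidence time window. The `(timestamp, energy)` hit layout, the default window, and the greedy strategy are illustrative assumptions, not the disclosure's exact procedure.

```python
def find_coincidence_events(local_hits, peer_hits, window_us=15.0):
    """Pair hits from a local-end and a peer-end detector on one coincidence
    line when their arrival times differ by at most window_us (matching the
    0 us to 15 us example window above). Hits are (timestamp_us, energy)
    tuples; each peer hit is used at most once (greedy sketch)."""
    events = []
    peers = sorted(peer_hits)
    i = 0
    for t_local, e_local in sorted(local_hits):
        # Skip peer hits that arrived too long before this local hit.
        while i < len(peers) and peers[i][0] < t_local - window_us:
            i += 1
        if i < len(peers) and abs(peers[i][0] - t_local) <= window_us:
            # Record response data: times and energies of both ends.
            events.append((t_local, e_local, peers[i][0], peers[i][1]))
            i += 1
    return events
```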
Step 14 includes reconstructing a medical detecting image of the detected object based on the target signal data.
Specifically, the server may sequentially perform data preprocessing, algorithm reconstruction processing, image post-processing, and image display processing on the target signal data to obtain a corresponding medical detecting image.
Data preprocessing is configured to preprocess the collected target signal data, including energy cutting, time correction, scatter correction, or the like. These preprocessing steps may help improve the quality and accuracy of the image.
Algorithm reconstruction processing is configured to convert the preprocessed data into a three-dimensional PET image. Common reconstruction algorithms may include an iterative algorithm, a filtered back projection algorithm, or the like. These algorithms may restore an image from the signal data, according to its characteristics and sampling manner, by means of a mathematical model and image processing technology.
Image post-processing is configured to perform post-processing on the reconstructed three-dimensional PET image, including noise removal, correction of scanner non-uniformity, correction of gradient drift, contrast enhancement, and smoothing of the image. These post-processing steps may help improve the visualization and diagnostic value of the image.
Image display processing is configured to display the reconstructed PET image on a screen for analysis and diagnosis by a physician and a researcher.
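The pipeline above (preprocessing, reconstruction, post-processing) can be sketched as a chain of stand-in stages. Each stage is a deliberately simplified placeholder for the real algorithms (energy gating, iterative or filtered back projection reconstruction, denoising), which the disclosure does not detail; the function names are illustrative.

```python
import numpy as np

def preprocess(sinogram):
    # Stand-in for preprocessing (energy cutting, time correction, etc.):
    # here we only clip negative counts introduced by earlier corrections.
    return np.clip(sinogram, 0.0, None)

def reconstruct(sinogram):
    # Stand-in for iterative or filtered back projection reconstruction:
    # an unfiltered back projection that averages over projection angles
    # (axis 0 = projection angle, axis 1 = detector bin).
    return sinogram.mean(axis=0)

def postprocess(image, kernel=3):
    # Stand-in for noise removal: a simple moving-average smoothing.
    pad = kernel // 2
    padded = np.pad(image, pad, mode="edge")
    return np.array([padded[i:i + kernel].mean() for i in range(len(image))])

def reconstruct_medical_image(sinogram):
    """Run the three computational stages; display is handled elsewhere."""
    return postprocess(reconstruct(preprocess(sinogram)))
```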
In the above reconstructing process of the medical image, firstly, the signal data set obtained by performing signal detection on the detected object via the detector array is acquired. The detector array is configured to siege the detected object in three dimensions and detect the trace signal emitted by the detected object to obtain the signal data set. Then, the detecting scenario of the detector array for the detected object is determined based on the data feature of the signal data set. When the detecting scenario is the target scenario, the effective detecting range of each detector in the detector array is determined. Finally, the target signal data of each detector within the respective effective detecting range is extracted from the signal data set, and the medical detecting image of the detected object is reconstructed based on the target signal data. In this way, in one aspect, the detecting scenario of the detector array for the detected object is determined from the data feature of the detected signal data set, and the effective detecting range of each detector is then determined according to the corresponding detecting scenario, so that the medical detecting image is reconstructed based on the target signal data in the corresponding effective detecting range. Therefore, the procedure of reconstructing the medical image is optimized; compared with manners in the related art, the reconstruction efficiency of the medical image is effectively improved by a standardized execution program, and labor and material costs are reduced. In another aspect, the trace signal emitted by the detected object is first detected by sieging the detected object in three dimensions, and the target signal data in the effective detecting range corresponding to each detector is then extracted from the trace signal according to the detecting scenario of the detector array, so as to construct the medical detecting image.
Therefore, accuracy and feasibility of reconstructing the medical image are improved, which facilitates providing further effective reference information for subsequent medical treatment.
One skilled in the art may understand that the foregoing method may be implemented in other specific manners. For example, the described implementation in which the server determines the effective detecting range of each detector in the detector array according to the detecting scenario of the detector array for the detected object is merely exemplary.
In an exemplary embodiment, referring to
Step 121 may include acquiring a detecting region of the detector array for the detected object.
In an embodiment, the detecting region may be a physical space region of the detected object, which includes a first projection region of the detected object in an axial view of the detector array and a second projection region of the detected object in a circumferential view of the detector array.
Specifically, in the axial view of the detector array, a region that belongs to the detecting region of the detector array for the detected object and that is in an object contour region occupied by the detected object may be taken as the first projection region. In addition, in the circumferential view of the detector array, a region that belongs to the detecting region of the detector array for the detected object and that is in the object contour region occupied by the detected object may be taken as the second projection region. The axial view may be a field of view along a forward direction of a bed in the space inside the detector array. The circumferential view may be a field of view along a direction perpendicular to the forward direction of the bed in a cross-section of the space inside the detector array.
In an embodiment, the detecting region may be obtained based on a current scanning protocol of the detector array, a photographing region of a 2D/3D camera mounted in the detector array, or a region of interest manually planned by a physician. The manner of determining the detecting region is not specifically limited herein.
In some embodiments, the detector array may scan the detected object based on a preset scanning protocol. Detecting regions of the detector array for the detected object in different protocols may be different.
In some embodiments, the scan protocol may include a scan program and parameters that are configured when the detector array scans and detects the detected object, such as a corresponding scan parameter, a scan time, and a scan region.
Exemplarily, when the current scanning protocol represents that the detector array performs scanning of a head of a patient, the detecting region of the detector array may be a space region of the head of the patient, and the space region of the head may be displayed in the axial view of the detector array as the first projection region of a head entity, and may be displayed in the circumferential view as the second projection region of the head entity. When the current scanning protocol represents that the detector array performs scanning of a heart of the patient, the detecting region of the detector array may be a space region of the heart of the patient, and the space region of the heart may be displayed in the axial view of the detector array as the first projection region of a cardiac entity, and may be displayed in the circumferential view as the second projection region of the cardiac entity.
Step 122 may include determining, based on the detecting region, the effective detecting range corresponding to each detector.
In an embodiment, for the axial view, the server may determine a first effective detecting range of each detector in the axial view based on the first projection region.
Specifically, the server may first siege the first projection region by using a detector as an endpoint and two corresponding first siege lines as sidelines, to obtain a first siege region. Then, a view range corresponding to the first siege region may be taken as the first effective detecting range.
An end of each of the first siege lines may be located at the endpoint, the other end of each of the first siege lines may be located within an axial length range of the detector array, and the first siege region is configured to siege at least a part of the first projection region.
In an exemplary embodiment, referring to
Referring to a in
Similarly, referring to b in
In another embodiment, for the circumferential view, the server may determine a second effective detecting range of each detector in the circumferential view based on the second projection region. The final detecting range in three dimensions may include the first effective detecting range and the second effective detecting range.
Specifically, the server may first siege the second projection region by using a detector as an endpoint and two corresponding second siege lines as sidelines, to obtain the second siege region. Then, a view range corresponding to the second siege region may be taken as the second effective detecting range.
An end of each of the second siege lines may be located at the endpoint, the other end of each of the second siege lines may be located within a circumferential length range of the detector array, and the second siege region is configured to siege the entire second projection region.
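Since the second siege region must enclose the entire second projection region, the two second siege lines can be modelled as the tangents from the detector to the object's cross-section. The sketch below is a hypothetical simplification (the projection region is taken as a centred disc of radius `object_radius` inside a ring of radius `ring_radius`); it is not the disclosed implementation.

```python
import math

def circumferential_effective_range(theta_d, ring_radius, object_radius):
    """Sketch of the second effective detecting range in the
    circumferential view.

    The detector (the endpoint) is at angle theta_d on the ring; the second
    projection region is modelled as a centred disc of radius object_radius.
    The two second siege lines are the tangents from the detector to the
    disc, so the second siege region encloses the entire projection region.
    By the inscribed-angle theorem each tangent re-crosses the ring at
    theta_d + pi -/+ 2*alpha, with alpha = asin(object_radius / ring_radius),
    giving the effective arc on the opposite side of the ring.
    """
    alpha = math.asin(object_radius / ring_radius)
    return (theta_d + math.pi - 2.0 * alpha, theta_d + math.pi + 2.0 * alpha)
```

With a disc half the ring radius, each tangent makes a 30-degree half-angle, so the effective arc spans 120 degrees opposite the detector.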
In an exemplary embodiment, referring to
Referring to a in
Similarly, referring to b in
In this way, in one aspect, the detection scenario of the detector array for the detected object is determined by the data feature of the detected signal data set, and then the effective detecting range of each detector is determined according to the corresponding detection scenario, so that the medical detecting image is reconstructed based on the target signal data in the corresponding effective detecting range. Therefore, the procedure of reconstructing the medical image is optimized; compared with a manner in the related art, reconstruction efficiency of the medical image is effectively improved by a standardized execution program, and consumption of human and material resources is reduced. In another aspect, the trace signal emitted by the detected object is first detected by sieging the detected object in three dimensions, and then the target signal data in the effective detecting range corresponding to each detector is extracted from the trace signal according to the detecting scenario of the detector array, so as to reconstruct the medical detecting image. Therefore, accuracy and feasibility of reconstructing the medical image are improved, and effective reference information is further provided for subsequent medical treatment.
It should be understood that, although steps in the flowcharts of
It may be understood that a same or similar part of the foregoing method embodiments in the specification may refer to each other, and each embodiment focuses on a difference from another embodiment. For related parts, reference may be made to the descriptions of another method embodiment.
The scanning bed 11A is configured to support the detected object (i.e., the radiation source), and position the detected object in a required position in the PET device 13A. In some embodiments, the detected object may lie on the scanning bed 11A. The scanning bed 11A may move under the control of the server and reach a desired location of the PET device 13A. In some embodiments, the scanning bed 11A may have a relatively long axial view and a relatively large circumferential view, such as an axial view of 2 meters.
The CT device 12A is configured to scan CT data of the detected object to reconstruct a CT image. The CT image may provide anatomical structure information with high resolution, including detailed structures of a bone, an organ, and a soft tissue. This information may help the physician to accurately locate the specific location of the abnormal metabolic region in the PET image, thereby more accurately diagnosing the lesion.
The PET device 13A is configured to perform signal detection on the detected object by the detector array to obtain the signal data set. The detector array is configured to siege the detected object in three dimensions and detect the trace signal emitted by the detected object to obtain the signal data set.
The coincidence processing module 14A is configured to perform a coincidence operation on the target signal data of the detector in the respective effective detecting range, so as to reconstruct a medical detecting image (i.e., a PET image) for the detected object according to an operation result of the target signal data.
The display device 15A is configured to display a reconstructed CT image and a reconstructed PET image on a display interface, so that the physician and the researcher perform analysis and diagnosis.
In a specific implementation scenario, the PET/CT system 10A may first collect statistics about the counting rate of the positron signal on the signal data set collected by the PET device 13A in real time, so as to determine the current scanning protocol of the PET device 13A. Then, an ineffective scanning view region of the PET device 13A may be determined according to the current scanning protocol of the PET device 13A (i.e., a non-concern region in the coincidence lines may be determined). Then, a coincidence line that needs to be closed in the PET device 13A may be determined according to the ineffective scanning view region (i.e., a response line among the coincidence lines that is outside a limiting coincidence range or an effective view of the detector crystal may be determined). Finally, the target signal data of the unclosed coincidence lines may be extracted from the signal data set, and the PET image of the detected object may be reconstructed based on the target signal data.
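The selection step of that scenario can be sketched as a simple filter. The code below is a hypothetical illustration, not the disclosed implementation: events are tuples of `(local_det, peer_det, payload)`, `effective_ranges` maps each local-end detector to the interval of peer-end detector indices whose coincidence lines stay open, and `rate_threshold` stands in for the counting-rate criterion that identifies a high-activity scenario.

```python
def select_target_events(events, counting_rate, rate_threshold, effective_ranges):
    """Sketch of closing coincidence lines in a high-activity scenario.

    If the measured positron counting rate stays below the threshold
    (normal scenario), every coincidence line stays open and all events are
    kept. Otherwise only events whose peer-end detector lies within the
    local-end detector's effective detecting range are retained as target
    signal data; all other coincidence lines are treated as closed.
    """
    if counting_rate < rate_threshold:
        return list(events)          # normal scenario: all lines stay open
    kept = []
    for local_det, peer_det, payload in events:
        lo, hi = effective_ranges[local_det]
        if lo <= peer_det <= hi:     # line of response not closed
            kept.append((local_det, peer_det, payload))
    return kept
```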
The signal detecting module 11 is configured for acquiring a signal data set obtained by performing signal detection on a detected object via a detector array. The detector array is configured to siege the detected object in three dimensions and detect a trace signal emitted by the detected object to obtain the signal data set.
The range detecting module 13 is configured for determining an effective detecting range of each detector in the detector array when a detecting scenario is a target scenario. Effective detecting ranges corresponding to the detector in different detection scenarios are different.
The image constructing module 14 is configured for extracting target signal data of each detector within a respective effective detecting range from the signal data set, and reconstructing a medical detecting image of the detected object based on the target signal data.
In some embodiments, the range detecting module 13 is further configured for acquiring a detecting region of the detector array for the detected object, and determining, based on the detecting region, the effective detecting range corresponding to each detector.
In some embodiments, the detecting region may include a first projection region of the detected object in an axial view of the detector array. The range detecting module 13 is further configured for determining, based on the first projection region, a first effective detecting range of each detector in the axial view.
In some embodiments, for the axial view, the range detecting module 13 is further configured for taking the detector as an endpoint and two corresponding first siege lines as sidelines, sieging the first projection region to obtain a first siege region, and taking a view range corresponding to the first siege region as the first effective detecting range. An end of each of the first siege lines may be located at the endpoint, the other end of each of the first siege lines may be located within an axial length range of the detector array, and the first siege region is configured to siege at least a part of the first projection region.
In some embodiments, the detecting region may include a second projection region of the detected object in a circumferential view of the detector array. The range detecting module 13 is further configured for determining, based on the second projection region, a second effective detecting range of each detector in the circumferential view.
In some embodiments, for the circumferential view, the range detecting module 13 is further configured for taking the detector as an endpoint and two corresponding second siege lines as sidelines, sieging the second projection region to obtain a second siege region, and taking a view range corresponding to the second siege region as the second effective detecting range. An end of each of the second siege lines may be located at the endpoint, the other end of each of the second siege lines may be located within a circumferential length range of the detector array, and the second siege region is configured to siege the entire second projection region.
In some embodiments, each detector has at least one coincidence line within the respective effective detecting range, an end of the at least one coincidence line is a local-end detector corresponding to the effective detecting range, and the other end of the at least one coincidence line is a peer-end detector. The image constructing module 14 is further configured for extracting, from the signal data set, target trace signals of the local-end detector and the peer-end detector that are on each of the at least one coincidence line, and taking the target trace signals as the target signal data. Each of the at least one coincidence line may belong to the corresponding effective detecting range of each detector.
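Indexing the extracted target trace signals by coincidence line, identified by its local-end and peer-end detectors, lets the subsequent coincidence operation process one line at a time. The sketch below is an illustrative assumption, not the disclosed implementation; the tuple layout `(local_det, peer_det, signal)` is hypothetical.

```python
from collections import defaultdict

def group_by_coincidence_line(target_events):
    """Sketch: index target trace signals by their coincidence line.

    Each event carries the local-end detector, the peer-end detector, and
    the trace signal itself; the (local_det, peer_det) pair identifies one
    coincidence line within the local-end detector's effective detecting
    range.
    """
    lines = defaultdict(list)
    for local_det, peer_det, signal in target_events:
        lines[(local_det, peer_det)].append(signal)
    return dict(lines)
```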
Referring to
In some embodiments, the scenario identifying module 12 is further configured for performing signal identification on each signal data in the signal data set, determining statistical data of the trace signal emitted from the detected object based on the radioactive tracer, and determining the detecting scenario of the detector array based on an activity level. The statistical data is configured to represent the activity level of the radioactive tracer in the detected object.
In some embodiments, the scenario identifying module 12 is further configured for identifying a preset radioactive signal and a positron signal that are emitted by the detected object in a coincident time window, obtaining a data amount of the radioactive signal and a counting rate of the positron signal, and taking either or both of the data amount and the counting rate as the statistical data for the trace signal.
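The scenario identification described above can be sketched as pairing singles within one coincidence time window and thresholding the resulting counting rate. The snippet is a hypothetical simplification (timestamps in picoseconds, a fixed-width window, a single rate threshold); the names `classify_scenario` and `rate_threshold_cps` are assumptions, and real coincidence sorters are considerably more involved.

```python
def classify_scenario(event_times_ps, window_ps, duration_s, rate_threshold_cps):
    """Sketch of scenario identification from the positron counting rate.

    Single events whose timestamps fall within one coincidence time window
    are paired into positron (coincidence) signals. The resulting counting
    rate, i.e. the statistical data representing the activity level of the
    radioactive tracer, is compared against a threshold to label the
    detecting scenario.
    """
    times = sorted(event_times_ps)
    coincidences = 0
    i = 0
    while i + 1 < len(times):
        if times[i + 1] - times[i] <= window_ps:
            coincidences += 1
            i += 2                    # both singles consumed by the pair
        else:
            i += 1
    rate = coincidences / duration_s
    scenario = "high-activity" if rate >= rate_threshold_cps else "normal"
    return rate, scenario
```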
In some embodiments, the computer device 20 may be an electronic device, and a computing system in the electronic device may run one or more operating systems, including any operating system discussed above and any commercially available server operating system. The computer device 20 may further run any one of various additional server applications and/or intermediate layer applications, including an HTTP (Hypertext Transfer Protocol) server, an FTP (File Transfer Protocol) server, a CGI (Common Gateway Interface) server, a super server, a database server, or the like. Exemplary database servers may include, but are not limited to, database servers commercially available from International Business Machines (IBM) or the like.
In some embodiments, the processor 21 may generally control the overall operation of the computer device 20, such as operations associated with display, data processing, data communication, and recording operations. The processor 21 may include one or more processor components to execute the computer program, so as to complete all or a part of the steps of the foregoing method. In addition, the processor component may include one or more modules, so as to facilitate interaction between the processor component and another component. For example, the processor component may include a multimedia module, so as to facilitate interaction between the multimedia component of the computer device 20 and the processor 21.
In some embodiments, the processor component in the processor 21 may further be referred to as a CPU (Central Processing Unit). The processor component may be an electronic chip with a signal processing capability. The processor may further be a general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The general-purpose processor may be a microprocessor, or any conventional processor component. In addition, the processor component may be jointly implemented by an integrated circuit chip.
In some embodiments, the memory 22 is configured to store various types of data to support operation at the computer device 20. Examples of these data may include instructions, data collection, messages, pictures, videos, or the like for any application program or method to operate on the computer device 20. The memory 22 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk, an optical disc, or a graphene memory.
In some embodiments, the memory 22 may be a memory bar, a TF (Trans-flash) card, or the like, and all information in the computer device 20, including input raw data, a computer program, an intermediate running result, and a final running result, may be stored in the memory 22. In some embodiments, the memory 22 may store and extract information according to a location specified by the processor. In some embodiments, with the memory 22, the computer device 20 may have a memory function, so that normal operation may be ensured. In some embodiments, the memory 22 of the computer device 20 may be classified into a primary memory (memory) and an auxiliary memory (external memory) by use, or may be classified into an external memory and an internal memory. The external memory may be usually a magnetic medium or an optical disc, and may store information for a long time. The memory may refer to the storage member on the main board, which is configured to store the data and programs currently being executed; it only temporarily stores the programs and data, and the data is lost when the power is off.
In some embodiments, the computer device 20 may further include: a power supply assembly 23 configured to perform power management of the computer device 20, a wired or wireless network interface 24 configured to connect the computer device 20 to a network, and an input/output (I/O) interface 25. The computer device 20 may operate an operating system that is stored in the memory 22, such as Windows Server, Mac OS X, Unix, Linux, FreeBSD, or the like.
In some embodiments, the power supply assembly 23 may provide power for various components of the computer device 20. The power supply assembly 23 may include a power supply management system, one or more power supplies, and other components associated with generating, managing, and allocating power for the computer device 20.
In some embodiments, the wired or wireless network interface 24 is configured to facilitate wired or wireless communication between the computer device 20 and another device. The computer device 20 may access a wireless network based on a communications standard, such as WIFI, an operator network (such as 2G, 3G, 4G, or 5G), or a combination thereof.
In some embodiments, the wired or wireless network interface 24 may receive broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the wired or wireless network interface 24 may further include a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on a radio frequency identification (RFID) technology, an infrared data association (IrDA) technology, an ultra-wideband (UWB) technology, a Bluetooth (BT) technology, and another technology.
In some embodiments, the input/output (I/O) interface 25 may provide an interface between the processor 21 and a peripheral interface module. The peripheral interface module may be a keyboard, a tap wheel, a button, or the like. The button may include but is not limited to a home button, a volume button, a start button, and a lock button.
When units integrated into the functional units in the embodiments of the present disclosure are implemented in a form of a software functional unit and sold or used as an independent product, the units may be stored in the computer-readable storage medium 30. Based on this understanding, the technical solutions of the present disclosure essentially, the part contributing to the related art, or all or a part of the technical solutions may be implemented in a form of a software product. The computer program 31 in the computer-readable storage medium 30 may include several instructions for instructing a computer device (which may be a personal computer, a system server, or a network device), an electronic device (such as an MP3 or MP4 player, an intelligent terminal such as a mobile phone, a tablet computer, or a wearable device, or a desktop computer), or a processor to perform all or a part of the steps of the methods of the embodiments of the present disclosure.
One skilled in the art should understand that the embodiments of the present disclosure may provide the reconstructing method of the medical image, the reconstructing apparatus 10 of the medical image, the computer device 20, the computer-readable storage medium 30, or the computer program product 40. Therefore, the present disclosure may use a form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware aspects. In addition, the present disclosure may be used in a form of a computer program product 40 implemented on one or more computer-readable storage media (including but not limited to a disk memory, a CD-ROM, and an optical memory) that include computer available program code, with computer program instructions 41 stored thereon.
The present disclosure is described with reference to a flowchart and/or a block diagram of the reconstructing method of the medical image, the reconstructing apparatus 10 of the medical image, the computer device 20, the computer-readable storage medium 30, or the computer program product 40 according to embodiments of the present disclosure. It should be understood that the computer program product 40 may implement each process and/or block in the flowchart and/or the block diagram, and a combination of processes and/or blocks in the flowchart and/or the block diagram. The computer program product 40 may be provided to a processor of a general-purpose computer, a dedicated computer, an embedded processor, or another programmable data processing device to produce a machine, such that the program instructions 41 executed by a processor of a computer or another programmable data processing device may generate an apparatus for implementing a function specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
These computer program products 40 may also be stored in a computer-readable memory that can instruct a computer or another programmable data processing device to operate in a specific manner, so that the program instructions 41 stored in the computer program product 40 may generate an article of manufacture including an instruction apparatus. The instruction apparatus may implement a function specified in one or more procedures of the flowchart and/or one or more blocks of the block diagram.
These program instructions 41 may also be loaded onto a computer or another programmable data processing device, so that a series of operation steps may be performed on the computer or another programmable device to generate computer-implemented processing, so that the program instructions 41 executed on the computer or another programmable device may provide steps for implementing a function specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
It should be noted that the foregoing various methods, apparatuses, electronic devices, computer-readable storage media, computer program products, and the like may further include another implementation manner according to descriptions of method embodiments. For a specific implementation manner, reference may be made to descriptions of related method embodiments, and details are not described herein again.
One skilled in the art may easily think of another implementation solution of the present disclosure after considering the specification and practicing the invention disclosed herein. The present disclosure is intended to cover any variations, uses, or adaptations of the present disclosure that follow the general principles of the present disclosure and include well-known or commonly used technical means in the art that are not disclosed herein. The specification and embodiments are merely exemplary, and the true scope and spirit of the present disclosure are stated by the claims.
It should be understood that the present disclosure is not limited to the precise structure described above and shown in the accompanying drawings, and various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.
Number | Date | Country | Kind |
---|---|---|---|
202311754545.7 | Dec 2023 | CN | national |