The present invention relates to an ultrasonic wave imaging apparatus for capturing an ultrasonic wave image by inserting, into biomedical tissue, a guidewire equipped with, for instance, a photoacoustic ultrasonic wave generator; a therapy support system; and an image display method.
Catheterization has been widely used as the primary treatment for conditions such as stenosis because the burden on a patient is smaller than in surgery such as thoracotomy. Since it is critical to grasp the positional relationship between a treatment target area and a catheter during catheterization, X-ray fluoroscopy has been used as an imaging support method. In addition, JP 2019-213680A discloses an attempt to use an ultrasonic wave image as a support image instead of an X-ray fluoroscopic image.
Specifically, JP 2019-213680A discloses a technology in which, for an ultrasonic wave generated by an ultrasonic wave generator installed on a guidewire, the arrival time difference that occurs when the ultrasonic wave from the ultrasonic wave generator arrives at an element array included in an ultrasonic probe, or an image of the ultrasonic wave generator that depends on the distance within the imaging area, is used to estimate the tip position of the guidewire; the estimation result is then used to grasp the relative positional relationship between the imaging output and the guidewire tip.
The above prior technology makes it possible to estimate the tip position of an insert (guidewire) within the imaging area. Unfortunately, to grasp the 3D positional relationship between a living-body imaging target such as a blood vessel and the insert tip, a 2D-array probe is required in which the element arrays constituting the ultrasonic probe are arranged in a matrix.
In addition, although the relative positional relationship of, for instance, a catheter with respect to the imaging area can be grasped, its absolute position cannot be detected. Consequently, registration with in vivo information acquired by, for instance, another imaging technique is impossible.
The purpose of the present invention is to provide an ultrasonic wave imaging apparatus, a therapy support system, and an image display method such that a linear array probe can be used to grasp the 3D positional relationship between a guidewire and an imaging target such as a blood vessel.
An aspect of the present invention provides an ultrasonic wave imaging apparatus comprising:
an ultrasonic probe configured to irradiate a subject with an ultrasonic wave, receive a reflected wave of the ultrasonic wave, and receive an ultrasonic wave from a beacon inserted into the subject;
a probe position-acquiring unit configured to acquire a 3D position and an orientation of the ultrasonic probe;
a beacon location-acquiring unit configured to determine a 3D location of the beacon from a relative location and a relative speed of the beacon relative to the ultrasonic probe as calculated from an ultrasonic wave image of the ultrasonic waves received at the ultrasonic probe and the 3D position and the orientation of the ultrasonic probe as acquired by the probe position-acquiring unit; and
a display image formation section configured to use the ultrasonic wave image of the ultrasonic waves received at the ultrasonic probe to form an image displayed on a display unit.
According to the invention, a linear array probe can be used to present to a surgeon the 3D positional relationship between a guidewire and an imaging target such as a blood vessel.
Hereinafter, embodiments of the invention will be described in detail with reference to the Drawings.
Here,
As shown in
Examples of the body insertion instrument 11 include long and thin tubular medical devices such as balloon catheters, microcatheters, nutritional catheters, and other therapeutic devices as well as guidewires for delivering each therapeutic device to a target site. A case where the body insertion instrument is a guidewire is described below.
The following describes an ultrasonic wave-generating device 10 provided with a PA signal generator 13 that uses a photoacoustic (PA) effect to generate an ultrasonic wave signal. However, the ultrasonic wave may instead be generated by a piezoelectric element.
As shown in
This makes it possible to grasp the location of the guidewire 11 in the subject 80 or the blood vessel 82.
The PA signal generator 13 is made of a material that undergoes adiabatic expansion upon receiving a laser beam, thereby generating an ultrasonic wave such as a PA signal. Examples of the material include known pigments (photosensitizers), metal nanoparticles, and carbon-based compounds. The tip of the optical fiber 12 including the PA signal generator 13 is covered with a resin sealing member. Note that in
Next, the ultrasonic imaging module 30 as a component of the ultrasonic wave imaging apparatus of this embodiment will be described in detail.
The ultrasonic imaging module 30 includes: a controller 40 detailed later; a transmitter 31 for transmitting an ultrasonic wave signal to the ultrasonic probe 20; a receiver 32 for receiving a reflected wave (RF signal) detected at the ultrasonic probe 20 and performing, for instance, phasing and/or addition processing; an input unit 33 for allowing a user to input instructions and/or conditions required for imaging; a display unit 34 for displaying, for instance, an ultrasonic wave image acquired by the ultrasonic imaging module 30 and/or a graphical user interface (GUI); and a memory 35 for storing, for instance, a processing output and/or a display image formed, based on the processing output, by a display image formation section 43.
The controller 40 includes: a signal-processing section 41 configured to process an ultrasonic wave image (including a reflected ultrasonic wave signal and a PA signal) received at the ultrasonic probe 20; a PA source location detection section 42 configured to use a signal processed by the signal-processing section 41 to detect the source location of the PA signal; a display image formation section 43 configured to form an image displayed on the display unit 34 using the PA source location detected by the PA source location detection section 42 and pre-acquired 3D anatomical information 44 about the subject; and an operation planning section 45 configured to use the source location detected by the PA source location detection section 42 and the pre-acquired 3D anatomical information 44 about the subject to determine an operation position of the ultrasonic probe 20.
The pre-acquired 3D anatomical information 44 used may be a 3D volume image obtained by CT (Computed Tomography) and/or MRI (Magnetic Resonance Imaging), or a 3D volume image captured by sweeping with the ultrasonic wave imaging apparatus. The signal-processing section 41 includes: a reflected ultrasonic wave signal-processing unit 411 configured to use the RF signal, which is a reflected wave received by the receiver 32, to create an ultrasonic wave image such as a B-mode image; and an ultrasonic signal analyzing unit (PA signal analyzing unit) 412 configured to detect and process, based on the beam-emitting timing of the laser beam from the photogeneration module 15, a PA signal that is generated from the PA signal generator 13 and is then detected at each transducer element of the ultrasonic probe 20.
Note that the controller 40 is configured like a common ultrasonic wave imaging apparatus except for the addition of the PA signal analyzing unit 412, which receives the PA signal, and the PA source location detection section 42, which detects the location where the PA signal is generated.
The PA source location detection section 42 includes: a relative position detecting unit 421 configured to estimate, from the PA signal analyzed by the PA signal analyzing unit 412, the location of the PA signal generator 13 in the imaging area; a relative speed-measuring unit 422 configured to derive the relative speed by differentiating the location of the PA signal generator 13 as detected by the relative position detecting unit 421; a probe speed-measuring unit 425 configured to measure, based on operation position information about the robot arm 90, the speed of the ultrasonic probe 20; an absolute position-detecting unit 424 configured to detect the absolute location of the PA signal generator 13 on the basis of the speed measured by the relative speed-measuring unit 422 and the speed of the ultrasonic probe 20 as measured by the probe speed-measuring unit 425; and a position filter 423 used to reduce an error by filtering the absolute position detected by the absolute position-detecting unit 424.
In the position filter 423, any filtering may be used to reduce an error included in the detection results of the absolute position-detecting unit 424. For instance, in the case where the PA signal generator 13 moves at a low speed, an error during the position detection may be reduced by smoothing with a filter such as a moving-average filter or a low-pass filter (LPF).
In addition, the position filter 423 may be a filter based on an error model that models a detection error which can occur, in principle, in the position detection technique of the PA source location detection section 42. Specifically, the location and speed of the PA signal generator 13 as detected by the PA source location detection section 42, the position, speed, and attitude of the ultrasonic probe 20, and an error model for the PA source location detection section 42 may be considered together to apply a Kalman filter and/or a particle filter that can infer the statistically most likely state.
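As a concrete illustration of the simpler smoothing option, the following is a minimal sketch of a moving-average position filter; the window length, array shapes, and function name are illustrative assumptions, and a Kalman or particle filter as described above would take the place of this step in the model-based variant.

```python
import numpy as np

def moving_average_filter(positions, window=5):
    """Minimal sketch of the position filter 423 as a moving-average smoother.

    positions : (N, 3) array of successive absolute locations of the PA signal generator 13
    window    : number of recent samples to average (illustrative value)
    """
    kernel = np.ones(window) / window
    # smooth each coordinate independently; mode="same" keeps the original length
    return np.column_stack(
        [np.convolve(positions[:, k], kernel, mode="same") for k in range(positions.shape[1])]
    )
```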
The details of the other configurations in the PA source location detection section 42 will be described later.
Part or all of the functions of the controller 40 may be implemented as software by executing a program for the functions on a computer provided with a CPU or GPU and a memory. In addition, part or all of the functions of each unit may be implemented using hardware such as an electronic circuit, an ASIC, or an FPGA. Note that the controller 40 may be installed on a single computer, or the functions may be distributed among a plurality of computers.
The ultrasonic probe 20 may be a 1D-array probe (linear array probe) having one array sequence of multiple transducer elements aligned in a 1D direction. In addition, various kinds of the ultrasonic probe 20 may be used, including a 3D-array probe having 2 or 3 array sequences in directions perpendicular to an array sequence direction of the 1D array probe; or a 2D-array probe having multiple array sequences in 2D directions. The signal-processing section 41 employs an analysis technique depending on the type of the ultrasonic probe 20 used.
Next, how the medical support system 100 in this embodiment works will be described.
Here, the ultrasonic wave imaging apparatus is configured such that, while the ultrasonic probe 20 (a 1D-array probe) captures an ultrasonic wave from the subject 80, the guidewire 11 (or a catheter), which is guided by a surgeon and has the PA signal generator 13 at its tip, is inserted into the body of the subject; the ultrasonic wave imaging apparatus then monitors the tip location of the guidewire 11 by using the PA signal. The following describes a case of operating the robot arm 90 such that the ultrasonic probe 20 tracks the tip location of the guidewire.
At step S401, the ultrasonic wave imaging apparatus determines whether or not a support operation for tracking the ultrasonic probe 20 is in action. If not in action (S401: No), the processing is ended. If in action (S401: Yes), the processing goes to step S402.
At step S402, the ultrasonic wave imaging apparatus uses the ultrasonic probe 20 to capture a reflected ultrasonic wave (this operation is hereinafter referred to as the imaging mode). In this imaging mode, an ultrasonic wave image is captured in the same manner as in conventional ultrasonic wave imaging apparatuses.
Specifically, the transmitter 31 transmits an ultrasonic wave through the ultrasonic probe 20, and the ultrasonic probe 20 receives the reflected wave after the transmitted ultrasonic wave is reflected from tissue inside the subject. The receiver 32 performs phasing/addition processing of the reception signal received, frame by frame, from the ultrasonic probe 20 and sends the results to the signal-processing section 41. The reflected ultrasonic wave signal-processing unit 411 uses the frame signal from the receiver 32 to create an ultrasonic wave image such as a B-mode image and transfers the image to the display image formation section 43 configured to form an image displayed on the display unit 34.
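The phasing/addition processing by the receiver 32 is commonly realized as delay-and-sum beamforming. The following is a minimal sketch for a single scanline; a plane-wave transmit at normal incidence is assumed, and all names and parameters are illustrative rather than the actual implementation of the receiver 32.

```python
import numpy as np

def delay_and_sum_line(rf, elem_x, fs, c, depths):
    """Delay-and-sum sketch for one scanline located at lateral position x = 0.

    rf      : (n_elements, n_samples) received RF frame
    elem_x  : (n_elements,) lateral element positions [m]
    fs      : sampling frequency [Hz]
    c       : assumed sound speed [m/s]
    depths  : (n_depths,) imaging depths [m]
    """
    n_elem, n_samp = rf.shape
    line = np.zeros(len(depths))
    for i, z in enumerate(depths):
        # two-way travel time: plane-wave transmit to depth z, echo back to each element
        t = (z + np.sqrt(z ** 2 + elem_x ** 2)) / c
        idx = np.round(t * fs).astype(int)
        valid = idx < n_samp
        line[i] = rf[np.arange(n_elem)[valid], idx[valid]].sum()
    return line
```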
In this imaging mode, if the ultrasonic probe 20 is a 1D-array probe, information about the intensity of the reflected wave in the array direction and the depth direction, that is, 2D information about the intensity of the reflected wave, can be acquired.
Meanwhile, in the case of using a 2D-array probe as the ultrasonic probe 20, the imaging mode may be performed just once to obtain 3D information about the intensity of the reflected wave in the probe plane directions and the depth direction together.
At step S403, the ultrasonic wave imaging apparatus uses the ultrasonic probe 20 to receive a PA signal (hereinafter, referred to as a PA reception mode). This PA reception mode can be used to monitor a PA signal from the PA signal generator 13 when a catheter is being inserted into the body (e.g., a blood vessel) of the subject.
Specifically, during the PA reception mode, the operation of the transmitter 31 is temporarily stopped, and the photogeneration module 15 is actuated to emit a pulsed laser beam. The PA signal generator 13 is irradiated with the emitted beam through the optical fiber 12 of the guidewire 11 inserted into the body. This irradiation causes a PA signal (ultrasonic wave) to be generated from the photoacoustic material of the PA signal generator 13. The PA signal is then detected by the elements of the ultrasonic probe 20.
The PA signal analyzing unit 412 uses the PA signal received at the ultrasonic probe 20 to prepare signal data synchronized with the beam emitted from the photogeneration module 15. The data are then transferred to the relative position detecting unit 421 in the PA source location detection section 42. To synchronize the received PA signal, the timing of each emission may be obtained from a trigger signal output to the PA signal analyzing unit 412 upon emission of the beam from the photogeneration module 15. Alternatively, each beam emission timing may be inferred from the PA signal received by the elements of the ultrasonic probe 20.
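One simple way to detect the PA signal under this synchronization is sketched below; it assumes that sample 0 of each received frame is aligned with the laser trigger, and envelope peak picking is only one possible detector (the names are illustrative).

```python
import numpy as np

def pa_arrival_times(pa_frame, fs):
    """Estimate, per transducer element, the PA arrival time relative to the laser trigger.

    pa_frame : (n_elements, n_samples) PA reception data, sample 0 aligned with the trigger
    fs       : sampling frequency [Hz]
    """
    envelope = np.abs(pa_frame)  # crude envelope; a Hilbert-transform envelope is also possible
    return np.argmax(envelope, axis=1) / fs  # arrival time per element [s]
```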
In addition, in the case of using an ultrasonic wave-generating device 10 configured to generate the ultrasonic wave with a piezoelectric element, the transmitter 31 may be used to transmit the ultrasonic wave so that the signal generation timing can be inferred. Alternatively, an external signal source and a trigger signal may be used for the synchronization, as in the case of using the PA signal generator.
At step S404, the PA source location detection section 42 detects the location of the PA signal generator 13 on the basis of information about the PA signal transferred from the PA signal analyzing unit 412 and the absolute (3D) position and attitude (orientation) of the ultrasonic probe 20 as sent from the robot arm 90.
Specifically, in the PA source location detection section 42, first, the relative position detecting unit 421 detects the relative location of the PA signal generator 13 relative to the ultrasonic probe 20 on the basis of the PA signal transferred from the PA signal analyzing unit 412.
Next, the relative speed-measuring unit 422 detects the relative speed from a temporal change in the relative position detected. Then, the probe speed-measuring unit 425 detects the speed and angular velocity of the ultrasonic probe 20 from a temporal change in information about the absolute position and attitude (orientation) of the ultrasonic probe 20 as sent from the robot arm 90.
In addition, the relative speed-measuring unit 422 may measure the relative speed by using a Doppler effect occurring in the received PA signal.
After that, the absolute position-detecting unit 424 uses these values to detect the absolute location of the PA signal generator 13.
The detected absolute location of the PA signal generator 13 is filtered with the position filter 423 to reduce a detection error.
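The speed values used above can be obtained, for example, by finite differences of successive detections. The sketch below assumes a fixed frame interval dt, a planar (2D) relative location, and a single rotation angle about the depth axis; these shapes and names are illustrative, and the Doppler-based alternative mentioned earlier is not shown.

```python
import numpy as np

def finite_difference_speeds(rel_pos, probe_pos, probe_yaw, dt):
    """Finite-difference sketch of the speed estimates used in step S404.

    rel_pos   : (N, 2) relative locations of the PA signal generator 13 in the image plane [m]
    probe_pos : (N, 3) absolute positions of the ultrasonic probe 20 [m]
    probe_yaw : (N,)   probe rotation angle about the depth axis [rad]
    dt        : frame interval [s]
    """
    rel_speed   = np.diff(rel_pos, axis=0) / dt    # relative speed-measuring unit 422
    probe_speed = np.diff(probe_pos, axis=0) / dt  # probe speed-measuring unit 425
    probe_omega = np.diff(probe_yaw) / dt          # angular velocity of the probe
    return rel_speed, probe_speed, probe_omega
```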
At step S405, the display image formation section 43 forms a display content to be displayed on the display unit 34.
Specifically, the display image formation section 43 uses a reflected ultrasonic wave image formed by the reflected ultrasonic wave signal-processing unit 411, the location of the PA signal generator 13 as detected by the PA source location detection section 42, and a 3D volume image of the anatomical structure of the subject 80 recorded as the pre-acquired 3D anatomical information 44. From these, it forms a display image to be shown on the display unit 34 in such a manner that a surgeon can understand the 3D positional relationship of the PA signal generator 13.
At step S406, the operation planning section 45 uses the pre-acquired 3D anatomical information 44 about the subject 80, the location and coordinates of the probe as obtained from the robot arm 90, and the reflected ultrasonic wave image and the PA source location acquired by the ultrasonic wave imaging apparatus to plan an operation of the robot arm 90 and then instruct the robot arm 90 about the operation position.
For instance, the operation planning section 45 causes the robot arm 90 to move such that the PA signal generator 13 is positioned at a given position on the reflected ultrasonic wave image obtained at step S402, so as to track the PA signal generator 13. This makes it possible for the ultrasonic probe 20 to continue receiving the PA signal (ultrasonic wave) generated from the PA signal generator 13.
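As one possible planning rule (the patent does not specify a control law), the tracking can be sketched as a proportional correction that moves the probe so that the detected PA source returns to the target position in the image; the function name, coordinates, and gain are illustrative assumptions.

```python
def tracking_command(pa_xy_in_image, target_xy, gain=0.5):
    """Sketch of the tracking idea in step S406.

    pa_xy_in_image : detected (x, y) of the PA signal generator 13 in image coordinates [m]
    target_xy      : desired (x, y) in image coordinates [m]
    gain           : proportional gain (illustrative value)
    """
    dx = target_xy[0] - pa_xy_in_image[0]
    dy = target_xy[1] - pa_xy_in_image[1]
    # command an in-plane translation of the ultrasonic probe 20 that reduces the error
    return gain * dx, gain * dy
```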
Subsequently, the processing returns to step S401, and steps S402 to S406 are then repeated.
Note that the imaging mode (step S402) and the PA reception mode (step S403) are not limited to those in the flowchart of
Hereinafter, processing of the absolute position-detecting unit 424 at step S404 in
If the ultrasonic probe 20 is a 2D-array probe with multiple array sequences in the 2D direction, the orientation of the major axis 22 or the minor axis 23 may be determined arbitrarily.
First,
As shown in
Thus, since the position 715 of the PA signal generator 13 is the point of intersection between the arc 714A of radius equal to the distance 713A and the arc 714B of radius equal to the distance 713B, its absolute location can be calculated from the absolute positions 712A and 712B of the ultrasonic probe 20 and the corresponding distances 713A and 713B to the PA signal generator 13.
Specifically, the distance 713A (l_a) or 713B (l_b) to the PA signal generator 13 can be determined using formula (1), where t_a and t_b are the arrival times of the PA signal received at the positions 712A and 712B, respectively.
[Formula 1]
l_b = c · t_b,  l_a = c · t_a   (1)
where c is the sound speed.
Then, if the ultrasonic probe 20 is translated from the position 712B to the position 712A at a speed v during a small time period Δt, the position 715 (y_PA, z_PA) of the PA signal generator 13 can be determined using formula (2).
Here, y represents the relative coordinate in the direction of the minor axis 23, z represents the relative coordinate in the direction of the depth axis 24, and the approximation l_a ≈ l_b is made because the small time period Δt is sufficiently short.
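Formula (2) itself is not reproduced in this text. The following is a sketch of the relation implied by the arc-intersection geometry above, taking the origin at the probe position 712A after the movement and a translation distance vΔt along the minor axis 23; it is an assumption consistent with the description, not a quotation of formula (2).

```latex
y_{PA} \;=\; \frac{l_b^{2}-l_a^{2}-(v\Delta t)^{2}}{2\,v\Delta t}
\;\approx\; \frac{l\,c\,(t_b-t_a)}{v\Delta t},
\qquad
z_{PA} \;=\; \sqrt{l_a^{2}-y_{PA}^{2}}
```

Here l denotes the common value of l_a ≈ l_b, and the sign of y_PA depends on the direction of the translation.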
In the above case, the PA signal generator 13 is assumed to be at rest. In practice, however, it may move, and this movement causes an error in the calculated absolute location of the PA signal generator 13. The position filter 423 may be used to reduce this error.
Next,
Since the location 725 of the PA signal generator 13 is the point of intersection between the straight line 724A and the straight line 724B, the absolute positions 723A and 723B can be used to calculate the absolute position of the location 725 of the PA signal generator 13.
Specifically, given the change x_PA in position of the PA signal generator 13 in the direction of the major axis 22 between before and after the movement, the angular velocity ω of the rotation 721 about the depth axis 24, the arrival time t of the PA signal, and the sound speed c, the location 725 (y_PA, z_PA) of the PA signal generator 13 can be determined using formula (3).
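Formula (3) is likewise not reproduced here. As an illustrative sketch of the intersection stated above, write the straight lines 724A and 724B using the absolute positions p_A and p_B (723A and 723B) and in-plane unit direction vectors d_A and d_B of the imaging plane before and after the rotation 721 (these vector names are assumptions for illustration); the location 725 then follows from the 2×2 linear system for the line parameters s and u:

```latex
\mathbf{p}_A + s\,\mathbf{d}_A \;=\; \mathbf{p}_B + u\,\mathbf{d}_B
\quad\Longleftrightarrow\quad
\begin{pmatrix} \mathbf{d}_A & -\mathbf{d}_B \end{pmatrix}
\begin{pmatrix} s \\ u \end{pmatrix}
\;=\; \mathbf{p}_B - \mathbf{p}_A
```

and the intersection point is p_A + s·d_A.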
In the above methods for detecting the absolute location of the PA signal generator 13 from the change in position of the ultrasonic probe 20 and the PA signal generator 13 or from the change in rotational position of the imaging plane of the ultrasonic probe 20, the positions before and after the movement are used to detect the absolute location of the PA signal generator 13. However, if the time difference between before and after the movement is small, the speed of the ultrasonic probe 20 may be used instead to detect the absolute location of the PA signal generator 13.
In this flowchart, the translation and/or the rotation are extracted from the movement of the ultrasonic probe 20 to detect the absolute location of the PA signal generator 13.
At step S801, the probe speed-measuring unit 425 measures the movement speed and angular velocity of the ultrasonic probe 20. Here, the movement speed and angular velocity of the ultrasonic probe 20 may be acquired from the robot arm 90.
At step S802, the absolute position-detecting unit 424 uses the movement speed and angular velocity of the ultrasonic probe 20 to extract the translation speed of the ultrasonic probe 20 in the direction of minor axis 23 and the speed component of the rotation of the ultrasonic probe 20 about the depth axis 24.
At step S803, the absolute position-detecting unit 424 determines whether or not the extracted translation speed in the direction of minor axis 23 is equal to 0. If equal to 0 (S803: “=0”), the processing goes to step S805. If unequal to 0 (S803: “not 0”), the processing goes to step S804.
At step S804, like the method described in
At step S805, the absolute position-detecting unit 424 determines whether or not the extracted speed of rotation of the ultrasonic probe 20 about the depth axis 24 is equal to 0. If equal to 0 (S805: “=0”), the processing goes to step S807. If unequal to 0 (S805: “not 0”), the processing goes to step S806.
At step S806, like the method described in
At step S807, the position filter 423 is used to filter at least one of the absolute locations of the PA signal generator 13 as calculated at steps S804 and S806. This improves the accuracy of the position detection.
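A minimal sketch of the branch structure of this flowchart is shown below; the callables standing in for the processing at steps S804, S806, and S807 are placeholders, not actual functions of the apparatus.

```python
def detect_absolute_location(v_minor, omega_depth,
                             detect_from_translation, detect_from_rotation,
                             position_filter):
    """Dispatch sketch for steps S803 to S807 (placeholder callables are assumed)."""
    candidates = []
    if v_minor != 0:      # steps S803/S804: translation along the minor axis 23
        candidates.append(detect_from_translation())
    if omega_depth != 0:  # steps S805/S806: rotation about the depth axis 24
        candidates.append(detect_from_rotation())
    # step S807: filter whichever absolute locations were obtained
    return [position_filter(p) for p in candidates]
```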
Hereinabove, the case of using a 1D-array probe as the ultrasonic probe 20 has been described. Here, the case of using a 2D-array probe as the ultrasonic probe 20 will be described.
In this case, the 3D location of the PA signal generator 13 relative to the ultrasonic probe 20 can be obtained directly, without deriving it from information about the translation and/or rotation of the ultrasonic probe 20. The acquired 3D relative location may then be added to the 3D absolute position and attitude of the ultrasonic probe 20 as notified from the robot arm 90 to detect the 3D absolute location of the PA signal generator 13.
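For this 2D-array case, combining the probe-relative 3D location with the probe pose reported by the robot arm 90 can be sketched as follows; representing the attitude as a rotation matrix is an assumption for illustration.

```python
import numpy as np

def beacon_absolute_position(p_probe, R_probe, p_rel):
    """Transform the beacon's probe-relative 3D location into world coordinates.

    p_probe : (3,) absolute position of the ultrasonic probe 20
    R_probe : (3, 3) rotation matrix describing the probe attitude
    p_rel   : (3,) location of the PA signal generator 13 in the probe coordinate frame
    """
    return p_probe + R_probe @ p_rel
```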
In the ultrasonic wave imaging apparatus and therapy support system described above, the robot arm 90 operates the ultrasonic probe 20, and the 3D absolute location of the PA signal generator 13 is detected based on the absolute position of the ultrasonic probe 20 as detected by the robot arm 90. Alternatively, a probe position/attitude sensor 91 for detecting the absolute position of the ultrasonic probe 20 may be provided. In addition, the robot arm 90 may be used to operate the ultrasonic probe 20 while the probe position/attitude sensor 91 detects the absolute position of the ultrasonic probe 20.
The probe position/attitude sensor 91 may be built in the ultrasonic probe 20 or may be configured as a separate body.
The probe position/attitude sensor 91 may be configured by combining, for instance, a positioning sensor (e.g., a geomagnetic sensor), an accelerometer, and/or a gyro sensor, or it may use an external camera. The sensor is not limited as long as it can measure the position and attitude of the probe.
The ultrasonic wave imaging apparatus in
The probe position/attitude sensor 91 is configured to periodically detect the absolute position and attitude of the ultrasonic probe 20 and notify the controller 40 about them.
Then, the probe speed-measuring unit 425 detects the speed and angular velocity of the ultrasonic probe 20 from a temporal change in information about the absolute position and attitude (orientation) of the ultrasonic probe 20 as sent from the probe position/attitude sensor 91 instead of the robot arm 90.
The display image formation section 43 uses the absolute position of the ultrasonic probe 20 as detected by the probe position/attitude sensor 91 to perform substantially the same processing as in the case of detecting the 3D absolute location of the PA signal generator 13.
The differences from the flowchart of processing in the ultrasonic wave imaging apparatus described in
At step S407, the PA source location detection section 42 detects the location of the PA signal generator 13 on the basis of information about the PA signal transferred from the PA signal analyzing unit 412 and the (3D) absolute position and attitude (orientation) of the ultrasonic probe 20 as sent from the probe position/attitude sensor 91.
This enables the 3D absolute location of the PA signal generator 13 to be detected even in the case of manually operating the ultrasonic probe 20.
Hereinabove, the embodiments for detecting the location of the PA signal generator 13 have been described. However, the respective embodiments may be used in combination, if appropriate, as long as they are technically consistent, and such a combination should be included in the invention.
Here, described are display images displayed on the display unit 34 at step S405 in
As shown in
In addition, the image 511 and the image 512 may each be displayed using a display image obtained by imaging with the ultrasonic imaging module 30 and formed by the display image formation section 43. Alternatively, the images may be formed by computer graphics (CG) using information about the PA signal generator 13 detected by the PA source location detection section 42 and the pre-acquired 3D anatomical information 44 about the subject 80. In this case, the display image formation section 43 may display information about the anatomical structure outside the imaging area of the ultrasonic imaging module 30.
The details of the image displayed on the display unit 34 may include any processed image that supports surgery, such as a display in which a notable point, for example a blood vessel or a lesion present in the ultrasonic wave image or the CG, is emphasized, and/or a display in which a lesion site outside the screen is indicated.
During the image formation by means of the CG, the pre-acquired 3D anatomical information 44 about the subject 80 may be used to determine, for registration, where the position detected by the robot arm 90 lies in the absolute coordinate system. For this purpose, features in the ultrasonic wave image formed by the display image formation section 43 and features in the pre-acquired 3D anatomical information 44 are compared to detect agreement points. Examples of usable features include blood vessel branches and bones, among others.
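One possible way to turn such agreement points into a coordinate alignment is a least-squares rigid registration of the matched landmarks (e.g., blood vessel branches); the patent does not prescribe a specific algorithm, so the following Kabsch-style sketch is an illustrative assumption.

```python
import numpy as np

def rigid_registration(src, dst):
    """Least-squares rigid alignment of matched landmark pairs.

    src, dst : (N, 3) corresponding landmark coordinates, e.g. features in the
               ultrasonic wave image and in the pre-acquired 3D anatomical information 44
    Returns R (3x3 rotation) and t (3,) such that dst[i] ~= R @ src[i] + t.
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)                           # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # guard against reflections
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t
```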
In addition, during the image formation by means of the CG, the pre-acquired 3D anatomical information 44 may be used directly to form the display image. Alternatively, the display image may be formed after modifying the pre-acquired 3D anatomical information 44: deformation of the anatomical structure during surgery relative to the pre-acquired anatomical structure is detected, and the pre-acquired 3D anatomical information 44 is fitted to the structure during surgery. For instance, modification processing is conceivable in which a blood vessel wall appearing in a reflected ultrasonic wave image formed during surgery is recognized, and the corresponding blood vessel wall included in the pre-acquired 3D anatomical information 44 is fitted to the shape of the blood vessel wall in the reflected ultrasonic wave image.
The content of the image displayed on the display unit 34 is not limited to a screen image displayed in two directions as exemplified in
The display image 52 includes a blood vessel lateral view 521, a blood vessel transverse view 522, and a blood vessel top view 523, which are images in three directions.
Reference sign 528 in
The following details how to instruct movement of the robot arm, which is planned by the operation planning section 45 at step S406 in
In
In the ultrasonic wave image, the blood vessel 82 of the subject 80 and the PA signal generator 13 (location 615) are depicted. This makes it easier to grasp the location 615 of the PA signal generator 13 in the blood vessel 82 of the subject 80.
In
In the ultrasonic wave image, the blood vessel 82 of the subject 80 and the PA signal generator 13 (location 625) are depicted. This makes it easier to grasp the positional relationship between a stenosis lesion 83 and the PA signal generator 13 (location 625) when the stenosis lesion 83 is present in the blood vessel 82 of the subject 80.
In this regard, however, any way of moving the ultrasonic probe 20 is permitted as long as a reflected ultrasonic wave signal and a PA signal necessary for the formation of a display image can be acquired during the display image formation at step S405 in
As illustrated in
Then, the robot arm 90 operates and moves the ultrasonic probe 20 in a swinging motion such that the speed v_e of the translation 711 and the angular velocity ω_d of the rotation of the ultrasonic probe 20 are set according to formula (4). As a result, the translation 711 and the rotation 721 are not equal to 0 at the same time, which allows the location of the PA signal generator 13 to be detected constantly and continuously. Here, v_0 and ω_0 are constants for adjusting the speed, and T represents the period of the back-and-forth movement.
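Formula (4) is not reproduced in this text. One choice consistent with the requirement that the translation 711 and the rotation 721 never vanish simultaneously is a pair of quadrature sinusoids, given here as an assumption rather than the actual formula:

```latex
v_e(t) \;=\; v_0 \cos\!\left(\frac{2\pi t}{T}\right),
\qquad
\omega_d(t) \;=\; \omega_0 \sin\!\left(\frac{2\pi t}{T}\right)
```

so that at every time t at least one of v_e and ω_d is nonzero.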
Hereinabove, the embodiments of the ultrasonic wave imaging apparatus and the catheterization support system in the invention have been described. However, the respective embodiments may be used in combination, if appropriate, as long as they are technically consistent, and such a combination should be included in the invention.
Priority application: Number 2020-094987; Date: May 2020; Country: JP; Kind: national.