Embodiments of the present invention relate to capturing ultrasound images of a patient's body and, more specifically, to a remotely controlled ultrasound apparatus and ultrasound treatment system.
Medical imaging systems are commonly used for performing diagnostic activities on a patient's body. One commonly used system is ultrasound based medical imaging, which captures images of various body parts of a patient. Ultrasound based imaging is used to visualize muscles and many internal organs, capturing their size, structure and any pathological lesions, in addition to producing real-time tomographic images. A typical ultrasound imaging system includes an imaging probe that captures images when positioned and moved over a body portion. The imaging probe includes multiple transducer sensors that generate ultrasound signals as short bursts. These ultrasound signals are transmitted through the body portion, and the ultrasound imaging system waits for return signals during a small time window. The return signals received are then processed to generate multiple images; return signals arriving outside this window are ignored.
Generally, to conduct an ultrasound diagnostic procedure, the patient and a medical expert need to be in the same location. The medical expert may be, but is not limited to, a doctor, a technician or a sonographer. The medical expert moves the imaging probe over an area of interest on the patient's body to capture images. If the patient and the medical expert are in different locations, then either the patient or the medical expert needs to travel so that they are in the same location to conduct the diagnosis. This may be time consuming and cumbersome for both parties. Moreover, the patient's health condition may not permit travel, so reaching the location of the medical expert's ultrasound imaging system may be nearly impossible.
Moreover, trained medical experts are required for capturing ultrasound images with accuracy, and trained medical experts may be unavailable when a need arises for conducting a diagnosis, especially in rural regions. To address these situations, ultrasound images are captured, then compressed and transferred over a network to a trained medical expert in a remote location. As the medical expert does not have control over the ultrasound imaging system, the medical expert examines these images and then provides suggestions and instructions to a local technician for taking further ultrasound images if required. The new images are then sent to the medical expert for examination, which renders the process time consuming and cumbersome.
Thus, there is a need for an ultrasound apparatus that conveniently captures images of the patient for examination by a medical expert in a remote location.
The above-mentioned shortcomings, disadvantages and problems are addressed herein which will be understood by reading and understanding the following specification.
As discussed in detail below, embodiments of the present invention include an ultrasound apparatus for capturing a medical image of a patient. The ultrasound apparatus is remotely controlled by a medical expert. The ultrasound apparatus includes a sensor assembly and a remote control unit. The sensor assembly is configured to be positioned on a patient body portion. The sensor assembly comprises an array of sensors configured to activate for acquiring multiple images of the patient body portion. The remote control unit comprises a detector interface capable of communicating wirelessly with the sensor assembly. A user operates an input device for interacting with the detector interface. The user may be a medical expert such as a doctor, a sonographer or a technician. During the interaction, the input device transmits multiple input signals to the sensor assembly. One or more input signals activate one or more sensors for capturing one or more images of the patient body portion.
In an embodiment, an ultrasound treatment system for capturing a medical image of a patient is disclosed. The ultrasound treatment system includes a remote control unit and a sensor control unit. The remote control unit includes a detector module for receiving multiple input signals from the user through a detector interface. The input received activates one or more sensors of an array of sensors positioned proximal to the patient body. These multiple input signals are transmitted to the sensor control unit using a transceiver. A transceiver unit of the sensor control unit receives the multiple input signals over a wireless network. A sensor module activates one or more sensors of the array of sensors based on the received input signals. The signals transmitted by the sensors are used for generating multiple images of the patient body portion.
In an embodiment, a method for remotely controlling an ultrasound procedure on a patient body is disclosed. The method includes receiving multiple input signals over a wireless network in response to an interaction of an input device with a detector interface. The input device is operated by a user on the detector interface present at a remote location. Thereafter, one or more sensors of an array of sensors positioned proximal to the patient body are activated based on the multiple input signals. These activated sensors capture one or more images of the patient body portion.
Various other features, objects, and advantages of the invention will be made apparent to those skilled in the art from the accompanying drawings and detailed description thereof.
In the following detailed description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific embodiments that may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the embodiments, and it is to be understood that other embodiments may be utilized and that logical, mechanical, electrical and other changes may be made without departing from the scope of the embodiments. The following detailed description is, therefore, not to be taken as limiting the scope of the invention.
In an embodiment, an ultrasound apparatus for capturing a medical image of a patient is disclosed. The ultrasound apparatus is remotely controlled by a medical expert. The ultrasound apparatus includes a sensor assembly and a remote control unit. The sensor assembly is configured to be positioned on a patient body portion. The sensor assembly comprises an array of sensors configured to activate for acquiring multiple images of the patient body portion. The remote control unit comprises a detector interface capable of communicating wirelessly with the sensor assembly. A user operates an input device for interacting with the detector interface. The user may be a medical expert such as a doctor, a sonographer or a technician. During the interaction, the input device transmits multiple input signals to the sensor assembly. One or more input signals activate one or more sensors for capturing one or more images of the patient body portion.
The ultrasound apparatus 100 comprises a sensor assembly 102 configured to be positioned on the patient body portion. The sensor assembly 102 may include an array of sensors. The sensors may be, for example, ultrasound transducers. In an embodiment, the sensor assembly 102 may be wearable by the patient. The sensor assembly 102 may be worn to cover the patient body portion that needs to be diagnosed. The sensor assembly 102 may be in the form of a belt-like structure having a fastening unit so as to be wearable on the patient's body. In an embodiment, the sensor assembly 102 may include multiple separate sensor units that may be placed at different positions on the patient's body. Each sensor unit may include one or more sensors. Each sensor unit may have any fastening mechanism for adhering to the patient's body. The array of sensors in the sensor assembly 102 may be activated to acquire one or more images of the body portion. The activation or operation of the sensors is controlled by the user through a remote control unit 104. The remote control unit 104 includes a detector interface 106 and an input device 108 capable of interacting with the detector interface 106. The detector interface 106 may be, but is not limited to, a hardware detector interface or a virtual detector interface. The input device 108 may be operated by the user. The user moves the input device 108 with respect to the detector interface 106 to initiate an interaction. This interaction results in the generation of multiple input signals that may be transmitted to the sensor assembly 102. The input signals facilitate activating one or more sensors of the array of sensors. A position of interaction of the input device 108 on the detector interface 106 may determine a sensor that needs to be activated. For example, if the input device 108 interacts to cover an area of the detector interface 106, then the sensors of the array corresponding to the covered area may be activated.
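The mapping from a covered area on the detector interface to the sensors that get activated can be sketched as follows. This is an illustrative Python sketch; the cell coordinates, sensor identifiers and function names are assumptions for illustration, not taken from the disclosure.

```python
# Illustrative sketch: resolving an area covered on the detector interface
# grid to the set of sensors to activate. All names and IDs are assumed.

def sensors_for_area(covered_cells, cell_to_sensors):
    """Return the set of sensor IDs mapped to the covered grid cells."""
    active = set()
    for cell in covered_cells:
        active.update(cell_to_sensors.get(cell, ()))
    return active

# Example mapping: each (row, col) cell of the grid drives a few transducers.
cell_to_sensors = {
    (0, 0): [400, 402],
    (0, 1): [402, 404],
}

print(sorted(sensors_for_area([(0, 0), (0, 1)], cell_to_sensors)))
# A selection covering both cells activates sensors 400, 402 and 404.
```

A shared sensor (402 here) is activated once even when it is mapped from two covered cells, which is why a set rather than a list is used.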
The activated sensors send ultrasonic signals to the patient's body portion and subsequently reflected signals may be processed to generate one or more images. The images are then presented to the medical expert.
Now referring to the accompanying drawings.
In order to send out the ultrasonic signals, the sensors need to be activated. The sensors may be activated based on a user input. The user input may be received through a remote control unit 204. The remote control unit 204 receives instructions for activating the sensors in response to an interaction between an input device 206 and a detector interface 208. The remote control unit 204, the input device 206 and the detector interface 208 may be present at a remote location. The user may move the input device 206 with respect to the detector interface 208. In an embodiment, the detector interface 208 may have a grid configuration.
Now considering an embodiment wherein the detector interface 208 may be a hardware detector interface, the input device 206 may be moved while maintaining contact with the detector interface 208. The input device 206 may be moved to contact one or more cells of the detector interface 208. While contact is maintained, input signals are generated for activating sensors in the sensor assembly 202. Further, the input device 206 may be used to apply pressure on the detector interface 208. This pressure may be translated into the generation of higher intensity signals from the sensors. For example, a detector interface may be a pad-like structure having a grid configuration. The pad-like structure may be, but is not limited to, a two dimensional structure or a three dimensional structure. A sonographer or a technician may move an input device on the detector interface to select one or more cells of interest. Once the cells are selected, input signals are sent by the detector interface to a sensor control unit. Alternatively, the input device sends the input signals to the sensor control unit. Consequently, the corresponding sensors present in the sensor assembly are activated based on the input signals. The activated sensors send ultrasonic signals.
The ultrasonic signals transmitted by the sensors may be of different frequencies or intensities. The intensity of the ultrasonic signals depends on the pressure applied by the input device on the detector interface. More specifically, when the pressure applied by the input device on a cell of the one or more cells of interest is greater, the intensity of the ultrasonic signals generated by the corresponding sensors of that cell may be higher. When the intensity of the ultrasonic signals is higher, the images generated may have higher resolution. Moreover, in an embodiment, a direction of application of pressure on the detector interface may determine a direction of transmission of the ultrasonic signals by the activated sensors to the patient's body portion. This is explained in detail in conjunction with the embodiments depicted in the accompanying figures.
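One way to realize the pressure-to-intensity translation described above is a simple clamped linear map. The sketch below is a hedged illustration; the pressure range, intensity scale and function name are assumed values, not specified by the disclosure.

```python
def intensity_from_pressure(pressure, p_min=0.0, p_max=10.0,
                            i_min=0.1, i_max=1.0):
    """Linearly map applied pressure to a normalized transmit intensity.

    The pressure range (0..10) and intensity range (0.1..1.0) are assumed
    for illustration; greater pressure yields higher-intensity ultrasound.
    """
    p = min(max(pressure, p_min), p_max)     # clamp to the valid range
    return i_min + (i_max - i_min) * (p - p_min) / (p_max - p_min)

print(intensity_from_pressure(5.0))   # mid-range pressure -> 0.55
```

Clamping keeps an out-of-range pressure reading from commanding an intensity beyond the transducer's limits.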
In an embodiment, the detector interface 208 may be a virtual detector interface. The virtual detector interface may be presented to the user. The user may employ the input device 206 to perform gestures for selecting one or more cells of a grid configuration in the detector interface 208. The gestures may include, but are not limited to, moving or maneuvering the input device 206 in space along multiple axes. A direction of movement along the multiple axes determines a direction in which the ultrasonic signals need to be transmitted from the sensors of interest to the patient body portion. Further, a velocity or speed of movement of the input device 206 with respect to the detector interface 208 may be translated into an intensity of the ultrasonic signals transmitted by the sensors of interest. The detector interface 208 may be configured to detect these movements of the input device 206 and then send input signals to the sensor control unit 210 for activating the corresponding sensors.
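The gesture interpretation described above, where the direction of movement sets the transmit direction and the speed of movement sets the intensity, might be sketched as follows. The speed scale, coordinate convention and function name are illustrative assumptions.

```python
import math

def interpret_gesture(dx, dy, dt, full_speed=100.0):
    """Translate an input-device movement into a transmit direction and intensity.

    dx, dy:     displacement of the input device along the X and Y axes
    dt:         duration of the movement in seconds
    full_speed: gesture speed mapped to full intensity (assumed scale)
    """
    angle = math.atan2(dy, dx)                 # direction of movement -> beam direction
    speed = math.hypot(dx, dy) / dt            # gesture speed
    intensity = min(speed / full_speed, 1.0)   # faster movement -> higher intensity
    return angle, intensity

angle, intensity = interpret_gesture(3.0, 4.0, 0.1)
print(round(intensity, 2))   # speed 50 units/s -> intensity 0.5
```

Capping the intensity at 1.0 mirrors the clamping used for pressure input: an unusually fast gesture should not command more output than the sensors support.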
Turning now to an embodiment depicted in the accompanying figures.
During operation, the user may position the input device 206 on the detector interface 208 and select, for example, the cell 302. Consequently, a position of the cell 302 may be identified with respect to multiple axes. For instance, the position of the cell 302 may be determined with respect to the X and Y axes. This position information of the cell 302 may be sent to the sensor control unit 210, and subsequently the sensor control unit 210 processes the position information to determine that the cell 302 is selected. As a result, the corresponding sensor 400, sensor 402 and sensor 404 may be activated. The sensor control unit 210 may include information indicating a mapping between the sensor 400, the sensor 402 and the sensor 404 and the cell 302. Similarly, mapping information between other cells and the corresponding sensors may be stored in the sensor control unit 210. The sensor control unit 210 may have a memory for storing this mapping information. In an alternate embodiment, one of the detector interface 208 and the input device 206 may store the mapping information and send the information to the sensor control unit 210 based on the cell selected. In this case, the detector interface 208 and the input device 206 may have a memory.
Further, in an embodiment, an orientation of the input device 206 with respect to the detector interface 208 may determine the direction of transmission of the ultrasonic signals from the sensor 400, the sensor 402 and the sensor 404. For instance, the input device 206 may be oriented at an angle with respect to the Y axis, and the angular orientation information may be transmitted to the sensor control unit 210. The sensor control unit 210 processes this information and activates the corresponding sensors, for example the sensor 400, to transmit the ultrasonic signal into the patient's body portion at the oriented angle. When ultrasonic signals are transmitted from different angles, the ultrasonic signals reflected from tissues of the patient's body portion may be received and processed to generate images of the tissues from different angles. This also facilitates improved diagnosis of the tissues.
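In practice, steering a transmitted ultrasonic beam to a commanded angle is done by delaying the firing of individual array elements (classic phased-array steering, not a technique the disclosure itself spells out). The sketch below shows that delay computation; the element count, element pitch and speed of sound are illustrative values.

```python
import math

def steering_delays(n_elements, pitch_m, angle_rad, c=1540.0):
    """Per-element firing delays (seconds) that steer the beam to angle_rad.

    Element n is delayed by n * pitch * sin(angle) / c, the time the
    wavefront needs to cross the extra path length; c is the approximate
    speed of sound in soft tissue (m/s). Delays are shifted so the
    earliest-firing element has delay zero.
    """
    delays = [n * pitch_m * math.sin(angle_rad) / c for n in range(n_elements)]
    shift = min(delays)
    return [d - shift for d in delays]

print(steering_delays(4, 3e-4, 0.0))   # no steering -> all elements fire together
```

For a negative steering angle, the raw delays are negative and the shift reverses the firing order, so the same function covers steering to either side of the array normal.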
Moreover, the ultrasonic signals may be transmitted to different depths of the body portion to capture more detailed images of various tissues. Hence, the input device 206 may be pressed against the detector interface 208 so that the pressure applied may be sensed by the detector interface 208. Then the remote control unit 204 processes the information associated with the pressure applied to activate the corresponding sensors, such as the sensor 400, the sensor 402 and the sensor 404, in the sensor assembly 202. In this embodiment, the pressure applied on the detector interface 208 may be measured and a corresponding intensity level at which the ultrasonic signals need to be generated may be identified. Then, information regarding the intensity level may be communicated to the sensor control unit 210 for activating the corresponding sensors to generate ultrasonic signals at that intensity level. The ultrasonic signals transmitted at higher intensity reach greater depths of the patient's body portion. The ultrasonic signals reflected back from various tissues of the patient's body portion may be processed by the sensor control unit 210. The processed signals may be converted into images by an image processing unit (not shown in the figures) and presented to the medical expert through the display unit 214.
In an embodiment, the detector interface 208 may be a three dimensional interface. For example, the detector interface 208 may have a shape of a body portion such as, but not limited to, a limb or another body part. However, it is contemplated that other embodiments of the detector interface 208 may have any other shape. In this scenario, the input device 206 may be positioned on a portion of the detector interface 208, and sensors proximal to the corresponding portion of the patient's body may be activated. For example, if an input device is positioned on a palm portion of a detector interface having the shape of a hand, then sensors proximal to a palm portion of the patient's hand are activated. The sensors may transmit ultrasonic signals to capture the image of the palm portion. Multiple ultrasonic signals may be transmitted for the time duration during which the input device is in contact with the detector interface. Reflected signals received from the palm portion may be processed to generate real time images of the palm portion.
In an embodiment, a display unit may act as a detector interface and receive user input.
The model may be, but is not limited to, a two dimensional model or a three dimensional model. The model may be a two dimensional structure similar to the pad-like structure described above.
Turning now to the embodiment depicted in the accompanying figures.
The input signals are received by a transceiver unit 606 in the sensor control unit 210 over the wireless network 212. The sensor module 608 processes the input signals to activate the one or more sensors of the array of sensors present in the sensor assembly 202. If the input signals include information identifying the sensors, then the sensor module 608 activates the sensors directly. In an alternate embodiment, if the input signals include the position information of the selected cells, then the sensor module 608 determines position information of the corresponding sensors. A mapping between the positions of the cells and the positions of the sensors may be stored in the sensor control unit 210. Subsequently, the sensor module 608 activates the sensors. The sensors then send the ultrasonic signals to the patient body portion. The signals may be transmitted in different directions based on the orientation and interaction of the input device 206 with respect to the detector interface 208. Further, the ultrasonic signals sent may have one or more focus attributes. The focus attributes may include, but are not limited to, the frequency of the ultrasonic signals and the depth at which the ultrasonic signals are transmitted into the patient's body portion. The focus attributes may vary depending on the interaction of the input device 206 with the detector interface 208 along the multiple axes, such as a Z axis. The signals transmitted to the patient body portion are reflected and received at the sensor control unit 210. The sensor control unit 210 may process the reflected signals and send them to the image processing unit. Thereafter, the image processing unit processes the reflected signals to generate the images.
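A sensor module dispatching received input signals, along the lines described above, could be sketched as follows. The signal field names and the default focus attributes are hypothetical; the disclosure only says that signals either identify sensors directly or carry cell positions resolved through a stored mapping.

```python
def process_input_signal(signal, cell_to_sensors):
    """Resolve one received input signal into (sensor, focus-attribute) pairs.

    A signal either names sensors directly or carries the position of a
    selected cell, which is resolved through the stored cell-to-sensor
    mapping. Field names and default values are illustrative assumptions.
    """
    if "sensor_ids" in signal:                        # sensors named directly
        sensors = signal["sensor_ids"]
    else:                                             # resolve via the mapping
        sensors = cell_to_sensors.get(signal["cell"], [])
    focus = {
        "frequency_hz": signal.get("frequency_hz", 3.5e6),  # focus attribute
        "depth_cm": signal.get("depth_cm", 10.0),           # focus attribute
    }
    return [(sensor, focus) for sensor in sensors]

activations = process_input_signal({"cell": (0, 1)}, {(0, 1): [402, 404]})
print([sensor for sensor, _ in activations])   # -> [402, 404]
```

Keeping the focus attributes in a separate dictionary lets the same activation path carry the frequency and depth variations discussed above without changing the sensor-resolution logic.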
Now turning to an embodiment depicted in the accompanying figures.
Once the cells of interest are selected, one or more sensors of the array of sensors may be activated at step 704. The activated sensors, proximal to a region of interest of the patient body portion, send ultrasonic signals. The signals reflected from the patient body portion may be processed to obtain one or more images of the patient body portion.
Once the cells of interest are selected, one or more sensors of the array of sensors may be activated at step 806. The activated sensors, proximal to a region of interest of the patient body portion, send ultrasonic signals. The signals reflected from the patient body portion may be processed to obtain one or more images of the patient body portion.
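The overall flow of the methods, receiving input signals over the network, activating the mapped sensors and collecting echoes for image generation, can be sketched end-to-end as below. Every name and the stand-in acquisition function are illustrative; the disclosure does not specify concrete signal processing.

```python
def run_remote_ultrasound(input_signals, cell_to_sensors, acquire_echo):
    """Activate mapped sensors for each received input signal, collect echoes."""
    frames = []
    for signal in input_signals:                    # received over the wireless network
        for sensor in cell_to_sensors.get(signal["cell"], []):
            echo = acquire_echo(sensor)             # transmit burst, record reflection
            frames.append({"sensor": sensor, "echo": echo})
    return frames                                   # frames would feed image processing

frames = run_remote_ultrasound(
    [{"cell": (0, 0)}],
    {(0, 0): [400, 402]},
    acquire_echo=lambda sensor: b"raw-echo",        # stand-in for real acquisition
)
print(len(frames))   # one frame per activated sensor -> 2
```

Passing the acquisition step in as a function keeps the remote-control flow separate from the hardware-specific transmit/receive details, which the methods above treat as a black box.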
The methods 700 and 800 can be performed using a processor or any other processing device. The method steps can be implemented using coded instructions (e.g., computer readable instructions) stored on a tangible computer readable medium. The tangible computer readable medium may be, for example, a flash memory, a read-only memory (ROM), a random access memory (RAM) or any other computer readable storage medium. Although the method of remotely controlling an ultrasound procedure on a patient body is explained with reference to the flow charts of the methods 700 and 800, other sequences of steps may alternatively be used.
This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any computing system or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
Number | Date | Country | Kind |
---|---|---|---|
2568/CHE/2012 | Jun 2012 | IN | national |