CN116531089B - Image-enhancement-based blocking anesthesia ultrasonic guidance data processing method - Google Patents
- Publication number
- CN116531089B (application CN202310821825.9A)
- Authority
- CN
- China
- Prior art keywords
- image
- data
- path
- image data
- injection
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B17/3403—Needle locating or guiding means
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- G06T5/40—Image enhancement or restoration using histogram techniques
- G06T5/70—Denoising; Smoothing
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T7/0012—Biomedical image inspection
- A61B2017/3413—Needle locating or guiding means guided by ultrasound
- A61B2034/107—Visualisation of planned trajectories or target regions
- A61B2034/108—Computer aided selection or customisation of medical implants or cutting guides
- A61B2034/2055—Optical tracking systems
- A61B2034/2065—Tracking using image or pattern recognition
- G06T2207/10132—Ultrasound image
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Abstract
The invention relates to the technical field of processing block-anesthesia ultrasound guidance data, and provides an image-enhancement-based method for processing such data. The collected ultrasound image is preprocessed with an image-enhancement algorithm: it is denoised, its contrast is enhanced, and its brightness is adjusted, making the image clearer and easier to interpret. Computer-vision techniques then perform feature extraction and segmentation on the preprocessed ultrasound image and identify and locate the anatomical structures relevant to block anesthesia; the method is thus automated and highly accurate and improves operating efficiency and precision. An image-registration algorithm computes the guidance positioning accurately and plans the injector path, providing more accurate and stable injector guidance, reducing operative risk, and improving the operative success rate. By tracking the injector's position and movement in real time, the method provides the doctor with a real-time guidance image.
Description
Technical Field
The invention relates to the technical field of processing of blocking anesthesia ultrasonic guidance data, in particular to a method for processing blocking anesthesia ultrasonic guidance data based on image enhancement.
Background
In conventional block anesthesia procedures, ultrasound guidance is widely used to locate the target and guide the syringe to the exact position. However, ultrasound image quality is limited by many factors, such as noise and occlusion by bone structures, which degrade image clarity and visualization and in turn affect the effectiveness and safety of the anesthesia process. A method for processing image-enhanced block-anesthesia ultrasound guidance data is therefore proposed.
Disclosure of Invention
The invention aims to provide an image-enhancement-based blocking anesthesia ultrasonic guidance data processing method for solving the problems in the background art.
To achieve the above object, there is provided an image-enhancement-based method for processing ultrasound guidance data for blocking anesthesia, comprising the steps of:
S1, defining a specific intraoperative area, collecting image data of the area, and converting the image data;
S2, performing visualization processing on the region image data acquired in S1, and extracting an injection path from the processed image data;
S3, detecting change data for the injection path extracted in S2 in the image data in real time;
S4, adjusting the image according to the change data detected in S3, and marking the image data according to the anatomical result;
S5, performing three-dimensional reconstruction and analysis according to the image information, and obtaining the surgical spatial relationship from the analysis data.
As a further improvement of the present technical solution, the step of S1 acquiring image data of the area is as follows:
s1.1, defining a specific area in an operation range according to the position of a patient;
s1.2, carrying out image acquisition scanning on the specific area defined in the S1.1, obtaining ultrasonic image data, and converting and storing the ultrasonic image data into a digital image.
As a further improvement of the present technical solution, the step of S2 extracting the injection path is as follows:
s2.1, performing image processing on the ultrasonic image data acquired in the S1.2 by using a filtering algorithm, a histogram equalization algorithm and a brightness adjustment algorithm;
s2.2, carrying out feature extraction based on the image data processed in the step S2.1, and analyzing according to the extracted feature data to obtain an injection path.
As a further improvement of the present technical solution, the step of S2.2 obtaining the injection path is as follows:
extracting and dividing the characteristics of the preprocessed ultrasonic image by utilizing a computer vision technology;
segmentation: performing edge detection on the image by adopting an edge detection algorithm, and identifying and positioning related anatomical structures such as nerve bundles and blood vessels required by blocking anesthesia;
based on the results of feature extraction and segmentation, performing accurate calculation of blocking anesthesia guidance positioning, and planning a path of an injection device, wherein the expression is as follows:
$$\min\; L = \sum_{i=1}^{n} l_i, \qquad \text{s.t.}\; x_{i+1} = f(x_i, u_i)$$
wherein $L$ represents the total length of the path, $l_i$ represents the length of each road segment, $x_i$ indicates the current position, $u_i$ represents the control variable, and $f$ represents the transfer function describing the change of position and orientation at each step; the optimal path of the injection device can be obtained by solving the above optimization problem.
As a further improvement of the present technical solution, the step of performing real-time path change detection in S3 is as follows:
s3.1, performing visual display based on the image data processed in the S2.1;
s3.2, acquiring the position of the injection device in the image data in real time, and comparing the acquired data with an injection path for analysis.
As a further improvement of the present technical solution, the step of comparing and analyzing the collected data and the injection route in S3.2 is as follows:
converting the image data into a visual object by utilizing a three-dimensional visual technology, and simultaneously setting transparency, a light source and material parameters according to requirements to generate a final visual result;
fusing the visual result with the real-time position of the injection device and displaying the visual result on display equipment;
position measurement is carried out by adopting a sensor, and real-time position data of the injection device are obtained;
comparing and analyzing the position data of the injection device acquired in real time with a preset injection path, judging whether the injection device deviates from the path, and if the injection device deviates from the path, immediately carrying out real-time alarm and reminding adjustment by a system, wherein the real-time alarm can be judged and triggered by the following expression:
$$d^2 = \lVert p - p_0 \rVert^2$$
Alarm triggering condition:
$$d^2 > \varepsilon^2$$
wherein $p$ is the current injection device position, $p_0$ is the corresponding position on the preset injection path, $d^2$ represents the square of the distance between the two, and $\varepsilon$ represents the maximum allowed deviation; when $d^2$ exceeds the allowable deviation range, the system immediately triggers an alarm to prompt the operator to adjust, ensuring the success of the injection.
As a further improvement of the present technical solution, the step of marking the image data according to the anatomical result in S4 is as follows:
s4.1, marking different anatomical structures by using a color mapping technology in the visual display image based on the S3.1;
and S4.2, evaluating according to the position of the injection device in the visual display image, and adjusting the visual angle and the proportion of the image display according to the evaluation result.
As a further improvement of the present technical solution, the expression of the S4.1 for marking different anatomical structures using color mapping techniques is as follows:
the gray values of the image pixels can be mapped to the RGB (red, green, blue) color space, taking red and green as examples, by the following color mapping formula:
If $g < G_1$: $R = 255\,g/G_1$, $G = 0$, $B = 0$;
if $G_1 \le g < G_2$: $R = 255$, $G = 255\,(g - G_1)/(G_2 - G_1)$, $B = 0$;
if $g \ge G_2$: $R = 255$, $G = 255$, $B = 255$;
wherein $G_1$ and $G_2$ are preset gray-level thresholds of the gray image and $g$ is the gray value of the pixel;
And then carrying out three-dimensional visualization on the image endowed with the color to display marks of different anatomical structures.
As a further improvement of the present technical solution, the step of S5 of obtaining the surgical spatial relationship according to the analysis data is as follows:
firstly, edge detection is carried out on a two-dimensional image to extract an interested region;
then dividing the two-dimensional image into a plurality of voxels by using a distance dividing algorithm to form a three-dimensional matrix;
constructing a three-dimensional model by using a gradient method according to the position and the attribute value of the voxel;
finally, the anatomical structure of the ultrasound image is presented via texture mapping, allowing the doctor to understand the spatial relationship of the operative area.
Compared with the prior art, the invention has the beneficial effects that:
in the image-enhancement-based blocking anesthesia ultrasonic guidance data processing method, the acquired ultrasonic image is preprocessed by utilizing an image enhancement algorithm, the noise is removed, the contrast is enhanced, the brightness is adjusted, the image is clearer and more discernable, the preprocessed ultrasonic image is subjected to feature extraction and segmentation by utilizing a computer vision technology, and the relevant anatomical structure required by blocking anesthesia is identified and positioned, so that the method has the characteristics of automation and high accuracy, can improve the operation efficiency and the accuracy, realizes the accurate calculation of guidance positioning by utilizing an image registration algorithm, plans the path of the injector, provides more accurate and more stable injector guidance positioning, reduces the operation risk, improves the operation success rate, and provides a doctor with real-time guidance image by tracking the position and the movement of the injector and visually displaying the processed ultrasonic image, thereby helping the doctor to better control the operation progress.
Drawings
FIG. 1 is an overall flow diagram of the present invention;
FIG. 2 is a block flow diagram of the present invention for acquiring image data of the region;
FIG. 3 is a block diagram of the injection path extraction process of the present invention;
FIG. 4 is a flow chart of the real-time path detection change data according to the present invention;
fig. 5 is a block flow diagram of marking image data according to the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Examples
Referring to fig. 1-5, the present embodiment is directed to a method for processing image-enhanced ultrasound guidance data for blocking anesthesia, comprising the following steps:
S1, defining a specific intraoperative area, collecting image data of the area, and converting the image data;
the step of S1 collecting the image data of the region is as follows:
S1.1, defining a specific area within the operative range according to the patient's position; the operative area is preset and strictly specified before the operation, the position and size of each operative region are determined, and the operative area is scanned with a scanning device to determine the position and state of nearby objects relevant to the operation, keeping the scanning range from being so large that extraneous material interferes with the image data;
S1.2, performing image-acquisition scanning of the specific area defined in S1.1 to obtain ultrasound image data, then converting and storing it as a digital image. The relevant sites in the block-anesthesia procedure are scanned with ultrasound equipment, and the resulting ultrasound image data are stored as digital images; DICOM is used for data transmission and storage, so the ultrasound image data are converted to DICOM format, and they can also be converted to other digital image formats such as JPEG and PNG;
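As an illustrative sketch only (not part of the claimed method), the conversion of a raw ultrasound frame to an 8-bit digital image suitable for PNG/JPEG storage might look as follows; the `pydicom` and Pillow calls in the comments are assumptions about the I/O libraries, and only the NumPy normalization is shown:

```python
import numpy as np

def to_uint8(frame: np.ndarray) -> np.ndarray:
    """Normalize a raw ultrasound frame (any numeric dtype) to an
    8-bit grayscale image by linear min-max scaling to [0, 255]."""
    f = frame.astype(np.float64)
    lo, hi = f.min(), f.max()
    if hi == lo:                      # flat image: avoid division by zero
        return np.zeros(f.shape, dtype=np.uint8)
    return np.round(255.0 * (f - lo) / (hi - lo)).astype(np.uint8)

# In practice the frame would come from the stored DICOM file, e.g.:
#   frame = pydicom.dcmread("scan.dcm").pixel_array    # requires pydicom
# and the result would be saved with an imaging library, e.g.:
#   PIL.Image.fromarray(to_uint8(frame)).save("scan.png")
```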
S2, performing visualization processing on the region image data acquired in S1, and extracting an injection path from the processed image data;
the step of extracting the injection route in the S2 is as follows:
s2.1, performing image processing on the ultrasonic image data acquired in the S1.2 by using a filtering algorithm, a histogram equalization algorithm and a brightness adjustment algorithm; filtering algorithm:
$$g(x, y) = \frac{1}{m\,n} \sum_{i=-\lfloor m/2 \rfloor}^{\lfloor m/2 \rfloor} \; \sum_{j=-\lfloor n/2 \rfloor}^{\lfloor n/2 \rfloor} f(x + i,\; y + j)$$
wherein $g(x, y)$ is the filtered pixel value of the image, $(x, y)$ are the coordinates of the current pixel in the image, $m$ and $n$ are the width and height of the filter, and $m\,n$ is the size of the filter;
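A minimal sketch of the mean (box) filter described above, assuming a single-channel NumPy image; edge replication at the borders is an implementation choice not stated in the source:

```python
import numpy as np

def mean_filter(image: np.ndarray, m: int = 3, n: int = 3) -> np.ndarray:
    """Mean filter: each output pixel is the average of its m x n
    neighborhood; borders are handled by edge replication."""
    pad_y, pad_x = m // 2, n // 2
    padded = np.pad(image.astype(np.float64),
                    ((pad_y, pad_y), (pad_x, pad_x)), mode="edge")
    h, w = image.shape
    out = np.zeros((h, w), dtype=np.float64)
    for i in range(m):          # accumulate shifted windows, then average
        for j in range(n):
            out += padded[i:i + h, j:j + w]
    return out / (m * n)
```

In production this would typically be replaced by an optimized library routine (e.g. an OpenCV box filter), but the loop above follows the formula term by term.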
s2.2, carrying out feature extraction based on the image data processed in the step S2.1, and analyzing according to the extracted feature data to obtain an injection path.
The step of S2.2 obtaining the injection route is as follows:
extracting and dividing the characteristics of the preprocessed ultrasonic image by utilizing a computer vision technology;
segmentation: performing edge detection on the image by adopting an edge detection algorithm, and identifying and positioning related anatomical structures such as nerve bundles and blood vessels required by blocking anesthesia;
based on the results of feature extraction and segmentation, performing accurate calculation of blocking anesthesia guidance positioning, and planning a path of an injection device, wherein the expression is as follows:
$$\min\; L = \sum_{i=1}^{n} l_i, \qquad \text{s.t.}\; x_{i+1} = f(x_i, u_i)$$
wherein $L$ represents the total length of the path, $l_i$ represents the length of each road segment, $x_i$ indicates the current position, $u_i$ represents the control variable, and $f$ represents the transfer function describing the change of position and orientation at each step; the optimal path of the injection device can be obtained by solving the above optimization problem. In addition, the injection angle formula may be used when determining the drug injection angle:
$$\theta = \arctan\!\left(\frac{h}{d}\right)$$
wherein $\theta$ indicates the injection angle, $h$ indicates the depth the drug needs to reach, and $d$ indicates the distance of the injection device from the injection point. The formula follows from basic trigonometry: by measuring the distance from the injection device to the injection point and the depth the drug needs to reach, the injection angle of the drug can be calculated.
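The injection-angle computation can be sketched as follows; note that the arctangent form (angle from the depth-to-distance ratio) is a reconstruction inferred from the stated trigonometric derivation, since the original formula image is not recoverable:

```python
import math

def injection_angle(depth: float, distance: float) -> float:
    """Injection angle in degrees, from the depth the drug must reach
    and the horizontal distance of the injection device from the
    injection point: theta = arctan(depth / distance)."""
    return math.degrees(math.atan2(depth, distance))
```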
S3, detecting change data for the injection path extracted in S2 in the image data in real time;
The step of performing real-time path change detection in S3 is as follows:
s3.1, performing visual display based on the image data processed in the S2.1;
visual display transformation formula:
$$s = (L - 1)\,\frac{r - r_{\min}}{r_{\max} - r_{\min}}$$
wherein $s$ and $r$ represent the gray values of the output and input picture pixels respectively, $r_{\min}$ and $r_{\max}$ represent the minimum and maximum gray values in the input picture, and $L$ is the number of gray levels in the output picture;
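A sketch of this linear gray-level stretch (output left in floating point; quantization to integer gray levels is the caller's choice):

```python
import numpy as np

def stretch_gray(img: np.ndarray, levels: int = 256) -> np.ndarray:
    """Linear contrast stretch: s = (L - 1) * (r - r_min) / (r_max - r_min)."""
    r = img.astype(np.float64)
    r_min, r_max = r.min(), r.max()
    if r_max == r_min:               # constant image: nothing to stretch
        return np.zeros_like(r)
    return (levels - 1) * (r - r_min) / (r_max - r_min)
```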
s3.2, acquiring the position of the injection device in the image data in real time, and comparing the acquired data with an injection path for analysis.
The step of comparing and analyzing the acquired data and the injection path in S3.2 is as follows:
converting the image data into a visual object by utilizing a three-dimensional visual technology, and simultaneously setting transparency, a light source and material parameters according to requirements to generate a final visual result;
fusing the visual result with the real-time position of the injection device and displaying the visual result on display equipment;
position measurement is carried out by adopting a sensor, and real-time position data of the injection device are obtained;
comparing and analyzing the position data of the injection device acquired in real time with a preset injection path, judging whether the injection device deviates from the path, and if the injection device deviates from the path, immediately carrying out real-time alarm and reminding adjustment by a system, wherein the real-time alarm can be judged and triggered by the following expression:
$$d^2 = \lVert p - p_0 \rVert^2$$
Alarm triggering condition:
$$d^2 > \varepsilon^2$$
wherein $p$ is the current injection device position, $p_0$ is the corresponding position on the preset injection path, $d^2$ represents the square of the distance between the two, and $\varepsilon$ represents the maximum allowed deviation; when $d^2$ exceeds the allowable deviation range, the system immediately triggers an alarm to prompt the operator to adjust, ensuring the success of the injection.
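The deviation check and alarm condition can be sketched as follows (function and parameter names are illustrative):

```python
def deviation_alarm(current, target, epsilon):
    """Compare the current injector position against the preset path
    point. Returns (d_squared, alarm): the squared distance and whether
    it exceeds the allowed deviation, i.e. d^2 > epsilon^2."""
    d_sq = sum((c - t) ** 2 for c, t in zip(current, target))
    return d_sq, d_sq > epsilon ** 2
```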
S4, adjusting the image according to the change data detected in S3, and marking the image data according to the anatomical result;
the step of marking the image data according to the anatomical result is as follows:
S4.1, marking different anatomical structures with a color mapping technique in the visual display image from S3.1, to further enhance the visual effect of the image;
the expression of the S4.1 marking different anatomical structures using color mapping techniques is as follows:
the gray values of the image pixels can be mapped to the RGB (red, green, blue) color space, taking red and green as examples, by the following color mapping formula:
If $g < G_1$: $R = 255\,g/G_1$, $G = 0$, $B = 0$;
if $G_1 \le g < G_2$: $R = 255$, $G = 255\,(g - G_1)/(G_2 - G_1)$, $B = 0$;
if $g \ge G_2$: $R = 255$, $G = 255$, $B = 255$;
wherein $G_1$ and $G_2$ are preset gray-level thresholds of the gray image and $g$ is the gray value of the pixel. The image endowed with color is then visualized in three dimensions to display the marks of the different anatomical structures.
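A sketch of a piecewise gray-to-RGB mapping of this kind; the threshold values `g1` and `g2` and the exact ramps are illustrative assumptions, since the original formula images are not recoverable:

```python
def gray_to_rgb(g, g1=85, g2=170):
    """Map a gray value to (R, G, B): low grays ramp up red, middle
    grays add green, and high grays saturate to white. g1 < g2 are
    preset thresholds (illustrative values here)."""
    if g < g1:
        return (round(255 * g / g1), 0, 0)
    if g < g2:
        return (255, round(255 * (g - g1) / (g2 - g1)), 0)
    return (255, 255, 255)
```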
And S4.2, evaluating according to the position of the injection device in the visual display image, and adjusting the visual angle and the proportion of the image display according to the evaluation result.
Distance assessment:
$$d = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}$$
wherein $d$ represents the distance between the two points, and $(x_1, y_1)$ and $(x_2, y_2)$ represent the coordinates of the two points.
Direction evaluation:
$$\theta = \left|\, \arctan\frac{y_2 - y_1}{x_2 - x_1} - \arctan\frac{y_3 - y_1}{x_3 - x_1} \,\right|$$
wherein $\theta$ represents the angle between the two lines, $(x_1, y_1)$ and $(x_2, y_2)$ represent the coordinates of two points on one line, and $(x_3, y_3)$ represents the coordinates of a point on the other line; $\theta$ is the direction evaluation value. Combining $d$ and $\theta$, the numerical difference is used to scale the displayed image: the larger the difference, the smaller the image proportion;
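The distance and direction evaluations can be sketched as follows; representing the lines by a shared point `p1` is an assumption made for illustration:

```python
import math

def point_distance(p1, p2):
    """Euclidean distance between two 2-D points."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def line_angle(p1, p2, p3):
    """Angle (radians) between line p1->p2 and line p1->p3,
    computed as the absolute difference of their direction angles."""
    a1 = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    a2 = math.atan2(p3[1] - p1[1], p3[0] - p1[0])
    return abs(a1 - a2)
```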
S5, performing three-dimensional reconstruction and analysis according to the image information, and obtaining the surgical spatial relationship from the analysis data.
The step of S5 obtaining the operation space relation according to the analysis data is as follows:
firstly, edge detection is carried out on a two-dimensional image to extract an interested region;
then dividing the two-dimensional image into a plurality of voxels by using a distance dividing algorithm to form a three-dimensional matrix;
the expression of the distance segmentation algorithm for segmenting a two-dimensional image into a plurality of voxels is:
$$v(p) = \left\lfloor \frac{\lVert p - c \rVert}{s} \right\rfloor$$
wherein $p$ is the pixel coordinate, $c$ is the center-point coordinate of the target contour, and $s$ is the segmentation value;
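A sketch of distance-based binning of pixels into voxel indices; the floor-of-scaled-distance rule mirrors the reconstructed formula and is an assumption about the original algorithm:

```python
import numpy as np

def distance_bins(shape, center, s):
    """Assign each pixel of a 2-D image of the given shape to a bin
    index floor(||p - c|| / s), where c is the contour center and s
    is the segmentation value."""
    ys, xs = np.indices(shape)
    d = np.sqrt((ys - center[0]) ** 2 + (xs - center[1]) ** 2)
    return (d // s).astype(int)
```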
constructing a three-dimensional model by using a gradient method according to the position and the attribute value of the voxel;
voxel gradient may be calculated using the following formula:
$$\nabla f = (g_x, g_y, g_z), \qquad \lVert \nabla f \rVert = \sqrt{g_x^2 + g_y^2 + g_z^2}$$
wherein $\nabla f$ represents the gradient of the voxel, $g_x$, $g_y$ and $g_z$ represent the gradients of the voxel values in the x, y and z directions respectively, and $\lVert \nabla f \rVert$ is the voxel gradient magnitude;
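The per-voxel gradient magnitude can be computed with central differences, e.g.:

```python
import numpy as np

def voxel_gradient_magnitude(vol: np.ndarray) -> np.ndarray:
    """sqrt(gx^2 + gy^2 + gz^2) per voxel, using np.gradient
    (central differences in the interior, one-sided at the borders)."""
    gx, gy, gz = np.gradient(vol.astype(np.float64))
    return np.sqrt(gx ** 2 + gy ** 2 + gz ** 2)
```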
finally, the anatomical structure of the ultrasound image is revealed by the texture map, and the spatial relationship of the operation area is understood by a doctor. The main function of texture mapping is to attach colors or patterns to the 3D model surface. For ultrasound images, the anatomy of the ultrasound image can be presented more intuitively by attaching the image of the anatomy to the 3D model surface. In this process, the coordinates (u, v) of the texture map can be calculated using the following formula:
$$u = \frac{x}{W}, \qquad v = \frac{y}{H}$$
wherein $(x, y)$ is the pixel coordinate, and $W$ and $H$ are the width and height of the texture picture, respectively.
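The texture-coordinate mapping is a simple normalization:

```python
def texture_coords(x, y, width, height):
    """Map a pixel coordinate (x, y) to normalized texture
    coordinates u = x / W, v = y / H in [0, 1]."""
    return x / width, y / height
```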
The foregoing has shown and described the basic principles, principal features and advantages of the invention. It will be understood by those skilled in the art that the present invention is not limited to the above-described embodiments, and that the above-described embodiments and descriptions are only preferred embodiments of the present invention, and are not intended to limit the invention, and that various changes and modifications may be made therein without departing from the spirit and scope of the invention as claimed. The scope of the invention is defined by the appended claims and their equivalents.
Claims (5)
1. A computer storage medium storing a program that implements an image-enhancement-based method for processing block-anesthesia ultrasound guidance data, characterized in that the method comprises the following steps:
S1, defining a specific intraoperative area, collecting image data of the area, and converting the image data;
S2, performing visualization processing on the region image data acquired in S1, and extracting an injection path from the processed image data;
S3, detecting change data in real time in the image data based on the injection path extracted in S2, wherein the change data is determined by the real-time position of the injection device and the injection path;
S4, adjusting the image according to the change data detected in S3, and marking the image data according to the anatomical result;
S5, performing three-dimensional reconstruction and analysis according to the image information, and obtaining the surgical spatial relationship from the analysis data;
the step of S1 collecting the image data of the region is as follows:
s1.1, defining a specific area in an operation range according to the position of a patient;
s1.2, carrying out image acquisition scanning on the specific area defined in the S1.1 to obtain ultrasonic image data, and converting and storing the ultrasonic image data into a digital image;
the step of extracting the injection route in the S2 is as follows:
s2.1, performing image processing on the ultrasonic image data acquired in the S1.2 by using a filtering algorithm, a histogram equalization algorithm and a brightness adjustment algorithm;
s2.2, carrying out feature extraction based on the image data processed in the S2.1, and analyzing according to the extracted feature data to obtain an injection path;
the step of S2.2 obtaining the injection route is as follows:
performing feature extraction and segmentation on the preprocessed ultrasonic image by utilizing computer vision technology;
segmentation: performing edge detection on the image by adopting an edge detection algorithm, and identifying and positioning related anatomical structures required by blocking anesthesia;
based on the results of feature extraction and segmentation, performing accurate calculation of blocking anesthesia guidance positioning, and planning a path of an injection device, wherein the expression is as follows:
minimize L = l_1 + l_2 + ... + l_n, subject to x_(i+1) = f(x_i, u_i);
wherein L represents the total length of the path, l_i represents the length of each road segment, x_i indicates the current position, u_i represents the control variable, and f represents the transfer function describing the position and orientation change of each step; the optimal path of the injection device is obtained by solving this optimization problem;
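The optimization just described (minimize the total length L of the segments produced by a transfer function f under controls u_i) can be sketched as a brute-force search; the 2-D grid, the unit-step control set, and the function name are illustrative assumptions, not the patent's model:

```python
import itertools
import math

def plan_path(start, target, max_steps=6):
    """Brute-force sketch of minimum-length path planning.

    Each control u is a unit grid move; the transfer function is
    x_(i+1) = f(x_i, u_i) = x_i + u_i.  Returns (L, path) for the
    shortest control sequence whose endpoint reaches the target,
    or None if the target is unreachable within max_steps.
    """
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]  # control set U
    best = None
    for n in range(1, max_steps + 1):
        for controls in itertools.product(moves, repeat=n):
            x, path = start, [start]
            for u in controls:  # apply the transfer function step by step
                x = (x[0] + u[0], x[1] + u[1])
                path.append(x)
            if x == target:
                L = sum(math.dist(path[i], path[i + 1])
                        for i in range(len(path) - 1))
                if best is None or L < best[0]:
                    best = (L, path)
        if best:  # shorter sequences were tried first, so stop here
            break
    return best
```

A real planner would use dynamic programming or sampling-based search instead of enumeration; this only illustrates the objective and constraint.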
the step of S5 obtaining the operation space relation according to the analysis data is as follows:
firstly, edge detection is carried out on a two-dimensional image to extract an interested region;
dividing the two-dimensional image into a plurality of voxels by using a distance dividing algorithm to form a three-dimensional matrix;
constructing a three-dimensional model by using a gradient method according to the position and the attribute value of the voxel;
the anatomical structures of the ultrasound image are revealed through the texture map, showing the spatial relationships so that the doctor can understand the surgical field.
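The edge-detection step named in S2.2 and S5 is not tied to a specific operator in the claims; a minimal Sobel-based sketch (threshold value is an illustrative assumption):

```python
import numpy as np

def sobel_edges(img: np.ndarray, thresh: float = 128.0) -> np.ndarray:
    """Boolean edge mask from the Sobel gradient magnitude.

    One common choice of edge detector; the claims leave the
    algorithm open.  `img` is a 2-D grayscale array.
    """
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    padded = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):  # direct 3x3 convolution, clear rather than fast
        for j in range(w):
            win = padded[i:i + 3, j:j + 3]
            gx[i, j] = (win * kx).sum()
            gy[i, j] = (win * ky).sum()
    return np.hypot(gx, gy) > thresh
```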
2. The computer storage medium of claim 1, having stored thereon an image-enhancement-based method of processing ultrasound-guided data for block anesthesia, wherein: the step of detecting change data on the injection path in real time in S3 is as follows:
s3.1, performing visual display based on the image data processed in the S2.1;
s3.2, acquiring the position of the injection device in the image data in real time, and comparing the acquired data with an injection path for analysis.
3. The computer storage medium of claim 2, having stored thereon an image-enhancement-based method of processing ultrasound-guided data for block anesthesia, wherein: the step of comparing and analyzing the acquired data and the injection path in S3.2 is as follows:
converting the image data into a visual object by utilizing a three-dimensional visual technology, and simultaneously setting transparency, a light source and material parameters according to requirements to generate a final visual result;
fusing the visual result with the real-time position of the injection device and displaying the visual result on display equipment;
position measurement is carried out by adopting a sensor, and real-time position data of the injection device are obtained;
comparing and analyzing the position data of the injection device acquired in real time with a preset injection path, judging whether the injection device deviates from the path, and if the injection device deviates from the path, immediately carrying out real-time alarm and reminding adjustment by a system, wherein the real-time alarm is judged and triggered by the following expression:
d^2 = (x - x_0)^2 + (y - y_0)^2 + (z - z_0)^2;
Alarm triggering condition:
d^2 > ε^2;
wherein (x, y, z) is the current injection device position, (x_0, y_0, z_0) is the preset position on the injection path, d^2 represents the square of the distance between the two, and ε represents the maximum deviation allowed; when d^2 exceeds the allowable range ε^2, the system immediately triggers an alarm to prompt the operator to adjust, thereby ensuring the success of the injection.
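The squared-distance deviation test above reduces to a few lines; the function name is ours, and the coordinate layout (equal-length position tuples) is an assumption:

```python
def check_deviation(p_current, p_path, epsilon):
    """Return True (alarm) when the squared distance d^2 between the
    injection device and the preset path point exceeds epsilon^2."""
    d2 = sum((a - b) ** 2 for a, b in zip(p_current, p_path))
    return d2 > epsilon ** 2
```

Comparing squared quantities avoids a square root in the real-time loop, which is why the claim states the condition in terms of d^2.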
4. The computer storage medium of claim 2, having stored thereon an image-enhancement-based method of processing ultrasound-guided data for block anesthesia, wherein: the step of marking the image data according to the anatomical result is as follows:
s4.1, marking different anatomical structures by using a color mapping technology based on the S3.1 visual display image;
and S4.2, evaluating according to the position of the injection device in the visual display image, and adjusting the visual angle and the proportion of the image display according to the evaluation result.
5. The computer storage medium of claim 4, having stored thereon an image-enhancement-based method of processing ultrasound-guided data for block anesthesia, wherein: the expression of the S4.1 marking different anatomical structures using color mapping techniques is as follows:
the gray values of the image pixels are mapped to the RGB color space by the following color mapping formula:
when g <= T_1: R = 0, G = 255 * g / T_1, B = 255;
when T_1 < g <= T_2: R = 255 * (g - T_1) / (T_2 - T_1), G = 255, B = 255 * (T_2 - g) / (T_2 - T_1);
when g > T_2: R = 255, G = 255 * (g_max - g) / (g_max - T_2), B = 0;
wherein T_1 and T_2 are respectively preset thresholds, g_max is the maximum gray value in the gray-scale image, and g is the gray value of the current pixel; the image endowed with color is then visualized in three dimensions to display the marks of the different anatomical structures.
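A piecewise gray-to-RGB mapping of the kind S4.1 describes can be sketched as follows; the claim's original formula images are not reproduced in the text, so the thresholds (T_1 = 85, T_2 = 170) and the exact branch formulas here are illustrative assumptions:

```python
import numpy as np

def gray_to_rgb(gray: np.ndarray, t1=85, t2=170, g_max=255) -> np.ndarray:
    """Map 8-bit gray values to RGB with a three-branch piecewise ramp:
    low grays render blue, middle grays green, high grays red."""
    g = gray.astype(float)
    r = np.zeros_like(g)
    gr = np.zeros_like(g)
    b = np.zeros_like(g)
    low = g <= t1
    mid = (g > t1) & (g <= t2)
    high = g > t2
    b[low] = 255
    gr[low] = 255 * g[low] / t1
    r[mid] = 255 * (g[mid] - t1) / (t2 - t1)
    gr[mid] = 255
    b[mid] = 255 * (t2 - g[mid]) / (t2 - t1)
    r[high] = 255
    gr[high] = 255 * (g_max - g[high]) / (g_max - t2)
    return np.stack([r, gr, b], axis=-1).astype(np.uint8)
```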
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310821825.9A CN116531089B (en) | 2023-07-06 | 2023-07-06 | Image-enhancement-based blocking anesthesia ultrasonic guidance data processing method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116531089A CN116531089A (en) | 2023-08-04 |
CN116531089B true CN116531089B (en) | 2023-10-20 |
Family
ID=87449222
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117357253B (en) * | 2023-11-28 | 2024-04-12 | 哈尔滨海鸿基业科技发展有限公司 | Portable medical imaging tracking navigation device |
CN117547686B (en) * | 2024-01-12 | 2024-03-19 | 广东省人民医院 | Ultrasonic guidance method and system for botulinum toxin injection |
CN117745989B (en) * | 2024-02-21 | 2024-05-14 | 首都医科大学附属北京积水潭医院 | Nerve root blocking target injection path planning method and system based on vertebral canal structure |
CN118154444A (en) * | 2024-05-10 | 2024-06-07 | 陕西省人民医院(陕西省临床医学研究院) | Method for processing blocking anesthesia ultrasonic guidance data |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004102603A (en) * | 2002-09-09 | 2004-04-02 | Matsushita Electric Ind Co Ltd | Data characteristics extracting device and data collating device |
JP2004208859A (en) * | 2002-12-27 | 2004-07-29 | Toshiba Corp | Ultrasonic diagnostic equipment |
CN101794460A (en) * | 2010-03-09 | 2010-08-04 | 哈尔滨工业大学 | Method for visualizing three-dimensional anatomical tissue structure model of human heart based on ray cast volume rendering algorithm |
CN102947862A (en) * | 2010-03-11 | 2013-02-27 | 皇家飞利浦电子股份有限公司 | Probabilistic refinement of model-based segmentation |
CN107789056A (en) * | 2017-10-19 | 2018-03-13 | 青岛大学附属医院 | A kind of medical image matches fusion method |
CN109124764A (en) * | 2018-09-29 | 2019-01-04 | 上海联影医疗科技有限公司 | Guide device of performing the operation and surgery systems |
JP2020058630A (en) * | 2018-10-10 | 2020-04-16 | キヤノンメディカルシステムズ株式会社 | X-ray CT system |
CN112037163A (en) * | 2019-05-17 | 2020-12-04 | 深圳市理邦精密仪器股份有限公司 | Blood flow automatic measurement method and device based on ultrasonic image |
CN114882256A (en) * | 2022-04-22 | 2022-08-09 | 中国人民解放军战略支援部队航天工程大学 | Heterogeneous point cloud rough matching method based on geometric and texture mapping |
CN115317128A (en) * | 2022-01-04 | 2022-11-11 | 中山大学附属第一医院 | Ablation simulation method and device |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9248316B2 (en) * | 2010-01-12 | 2016-02-02 | Elekta Ltd. | Feature tracking using ultrasound |
US20130131501A1 (en) * | 2011-11-18 | 2013-05-23 | Michael Blaivas | Neuro-vasculature access system and device |
EP2807978A1 (en) * | 2013-05-28 | 2014-12-03 | Universität Bern | Method and system for 3D acquisition of ultrasound images |
CN107278316B (en) * | 2016-07-14 | 2022-05-24 | 中山大学附属第一医院 | Three-dimensional reconstruction visualization integration method for internal bundle structure of peripheral nerve of human body |
US10660613B2 (en) * | 2017-09-29 | 2020-05-26 | Siemens Medical Solutions Usa, Inc. | Measurement point determination in medical diagnostic imaging |
US20200069285A1 (en) * | 2018-08-31 | 2020-03-05 | General Electric Company | System and method for ultrasound navigation |
Non-Patent Citations (1)
Title |
---|
Cao Xiaojing, Yu Mingan. Ultrasound-guided thermal ablation for papillary thyroid microcarcinoma: a multicenter retrospective study. International Journal of Hyperthermia. 2021, Vol. 38 (No. 1), pp. 916-922. * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN116531089B (en) | Image-enhancement-based blocking anesthesia ultrasonic guidance data processing method | |
CN110338840B (en) | Three-dimensional imaging data display processing method and three-dimensional ultrasonic imaging method and system | |
US11986252B2 (en) | ENT image registration | |
CN107909622B (en) | Model generation method, medical imaging scanning planning method and medical imaging system | |
US7889895B2 (en) | Method and apparatus for identifying pathology in brain images | |
CN106725593B (en) | Ultrasonic three-dimensional fetal face contour image processing method and system | |
CN108573502B (en) | Method for automatically measuring Cobb angle | |
CN108294728B (en) | Wound state analysis system | |
US20140018681A1 (en) | Ultrasound imaging breast tumor detection and diagnostic system and method | |
CN108186051B (en) | Image processing method and system for automatically measuring double-apical-diameter length of fetus from ultrasonic image | |
Joshi et al. | Vessel bend-based cup segmentation in retinal images | |
EP3242602B1 (en) | Ultrasound imaging apparatus and method for segmenting anatomical objects | |
JP2020161129A (en) | System and method for scoring color candidate poses against color image in vision system | |
JP4203279B2 (en) | Attention determination device | |
CN112603373A (en) | Method and system for diagnosing tendon injury via ultrasound imaging | |
KR101251822B1 (en) | System and method for analysising perfusion in dynamic contrast-enhanced lung computed tomography images | |
CN111932502B (en) | Cornea image point cloud selection method, cornea image point cloud selection system, intelligent terminal and storage medium | |
EP3292835B1 (en) | Ent image registration | |
CN117315210A (en) | Image blurring method based on stereoscopic imaging and related device | |
CN115423804B (en) | Image calibration method and device and image processing method | |
CN116703882A (en) | Non-contact type heart three-dimensional mapping method and system | |
CN110675444B (en) | Method and device for determining head CT scanning area and image processing equipment | |
CN113554688A (en) | Monocular vision-based O-shaped sealing ring size measurement method | |
CN114092399A (en) | Focus marking method, device, electronic equipment and readable storage medium | |
Wang et al. | 3D surgical overlay with markerless image registration using a single camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||