CN112184817A - Brachial plexus image identification and anesthesia puncture guiding method and system - Google Patents


Info

Publication number
CN112184817A
CN112184817A (application CN202011057047.3A)
Authority
CN
China
Prior art keywords
image
brachial plexus
anesthesia puncture
ultrasonic image
ultrasonic
Prior art date
Legal status
Granted
Application number
CN202011057047.3A
Other languages
Chinese (zh)
Other versions
CN112184817B (en)
Inventor
邱逦
向茜
王丽芸
朱笔挥
李科君
Current Assignee
Chengdu Wangwang Technology Co ltd
West China Hospital of Sichuan University
Original Assignee
Chengdu Wangwang Technology Co ltd
West China Hospital of Sichuan University
Priority date
Filing date
Publication date
Application filed by Chengdu Wangwang Technology Co ltd and West China Hospital of Sichuan University
Priority to CN202011057047.3A
Publication of CN112184817A
Application granted
Publication of CN112184817B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 17/34 - Trocars; Puncturing needles
    • A61B 17/3401 - Puncturing needles for the peridural or subarachnoid space or the plexus, e.g. for anaesthesia
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 17/34 - Trocars; Puncturing needles
    • A61B 17/3403 - Needle locating or guiding means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/22 - Matching criteria, e.g. proximity measures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/25 - Fusion techniques
    • G06F 18/253 - Fusion techniques of extracted features
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/08 - Learning methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/0002 - Inspection of images, e.g. flaw detection
    • G06T 7/0012 - Biomedical image inspection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10132 - Ultrasound image


Abstract

The invention discloses a brachial plexus image identification and anesthesia puncture guiding method and system. By constructing a dedicated model for the brachial plexus anesthesia puncture guiding process, the system assists the physician in identifying and marking the brachial plexus on ultrasound-scanned images, and also identifies and marks the anterior and middle scalene muscles in front of and behind the brachial plexus. During the actual anesthesia injection, the anesthesiologist can therefore perform the nerve block accurately and get started quickly from an ultrasound image annotated with these anatomical structures, while also improving working efficiency.

Description

Brachial plexus image identification and anesthesia puncture guiding method and system
Technical Field
The invention belongs to the technical field of image recognition, and particularly relates to a brachial plexus image recognition and anesthesia puncture guiding method and system.
Background
An anesthesia method in which a local anesthetic is injected around the brachial plexus trunk to block nerve conduction in the region it innervates is called brachial plexus block anesthesia. It is one of the anesthesia methods commonly used in clinical practice and is suitable for operations on the hand, forearm, upper arm, and shoulder. Since ultrasound imaging equipment was introduced into clinical use, particularly in the field of anesthesia puncture, an anesthesiologist must undergo long-term professional training on the ultrasound machine before skillfully performing anesthesia puncture under ultrasound guidance. Therefore, an auxiliary technical means is urgently needed to help the anesthesiologist quickly and accurately find the target position during ultrasound-guided brachial plexus anesthesia puncture.
To solve the above problems, the inventors found no relevant devices or techniques in the prior art. In view of this, the inventors sought to use artificial intelligence to help an anesthesiologist quickly confirm the accurate target position of an anesthesia puncture in an ultrasound image. For complicated ultrasound images, or for physicians with less experience, finding the target position is very difficult, so auxiliary software for processing the ultrasound images is needed to help physicians locate the target quickly; at present, however, no such method or software exists.
In the course of completing the present invention, the inventors found the following problems in the prior art. Existing methods for processing ultrasound images include Haar feature processing and Cascade processing. Haar feature processing extracts Haar feature values from the image, but these values only reflect gray-level changes in the ultrasound image, so the feature-extraction capability is weak. Cascade processing uses the cascade idea to locate positions from a sliding window, but it relies only on weak features, and a detector built on the cascade idea is also a weak classifier, so its interference resistance and generalization are poor. As a result, the accuracy of locating the nerve plexus in the ultrasound image is low.
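The Haar-feature weakness described here is easy to see in code: a Haar feature is just a difference of rectangle sums over the gray image, computed in constant time from an integral image, so it captures only coarse gray-level contrast. The sketch below is an illustrative NumPy implementation, not taken from the patent; all function names are our own.

```python
import numpy as np

def integral_image(img):
    # ii[y, x] = sum of img[:y+1, :x+1]; enables O(1) rectangle sums
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, y, x, h, w):
    # sum of img[y:y+h, x:x+w] from the integral image
    total = ii[y + h - 1, x + w - 1]
    if y > 0:
        total -= ii[y - 1, x + w - 1]
    if x > 0:
        total -= ii[y + h - 1, x - 1]
    if y > 0 and x > 0:
        total += ii[y - 1, x - 1]
    return total

def haar_two_rect(img, y, x, h, w):
    # two-rectangle Haar feature: left-half brightness minus right-half brightness
    ii = integral_image(img.astype(np.int64))
    half = w // 2
    return rect_sum(ii, y, x, h, half) - rect_sum(ii, y, x + half, h, half)
```

The single scalar this produces per window illustrates why the patent calls the feature-extraction capability "weak": it summarizes an entire region as one gray-level difference.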
Disclosure of Invention
Aiming at the above defects in the prior art, the brachial plexus image identification and anesthesia puncture guiding method and system provided by the invention solve the prior-art problems that nerve block demands extensive physician experience and that the brachial plexus is difficult to position accurately.
In order to achieve the purpose of the invention, the invention adopts the following technical scheme: a brachial plexus image identification and anesthesia puncture guiding method comprises the following steps:
S1, constructing a brachial plexus image recognition and anesthesia puncture guiding model;
S2, scanning to obtain an ultrasonic image of the brachial plexus section;
S3, marking anatomical structures on the ultrasonic image;
S4, training the constructed brachial plexus image recognition and anesthesia puncture guiding model with the marked ultrasonic images;
S5, recognizing the ultrasonic image to be recognized with the trained brachial plexus image recognition and anesthesia puncture guiding model to obtain an anesthesia puncture guiding image that accurately locates the brachial plexus.
Further, the brachial plexus image identification and anesthesia puncture guidance model in step S1 is a deep convolutional neural network with a ladder structure whose upper and lower stages are connected layer by layer.
Further, the brachial plexus section in step S2 includes a brachial plexus root section, a brachial plexus trunk section, and a brachial plexus division section;
the method for scanning and obtaining the ultrasonic image of the brachial plexus section is specifically as follows: the probe is placed transversely below the neck, lateral to the thyroid, and slid downward to scan to the supraclavicular region, displaying in real time an ultrasound image containing the anterior scalene muscle, the middle scalene muscle, and the interscalene segment of the brachial plexus.
Further, in step S3, the anatomical structures to be marked in the ultrasound image include the brachial plexus, the anterior scalene muscle, and the middle scalene muscle;
in the ultrasound image, the brachial plexus lies between the anterior and middle scalene muscles, so marking these two muscles enables accurate localization of the actual position of the brachial plexus.
Further, the method for training the brachial plexus image recognition and anesthesia puncture guidance model in step S4 is specifically:
training the brachial plexus image recognition and anesthesia puncture guiding model with affine-transformation data augmentation of the input, and validating the model by cross-validation;
where the training parameters include setting the learning_rate to 0.001, the batch size to 32, and the number of steps to 35000.
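As a rough illustration of the affine-transformation augmentation and the stated hyperparameters, the sketch below composes a rotation, scale, and translation matrix and applies it to label coordinates (e.g. the corners of a marked anatomical region). This is a minimal NumPy sketch under our own naming; the patent specifies the hyperparameters but not the augmentation ranges or implementation.

```python
import numpy as np

# Hyperparameters stated in the patent
CONFIG = {"learning_rate": 0.001, "batch_size": 32, "steps": 35000}

def affine_matrix(angle_deg=0.0, scale=1.0, tx=0.0, ty=0.0):
    # 2x3 affine matrix combining rotation, isotropic scaling, and translation
    a = np.deg2rad(angle_deg)
    c, s = np.cos(a) * scale, np.sin(a) * scale
    return np.array([[c, -s, tx],
                     [s,  c, ty]])

def warp_points(pts, M):
    # apply the affine transform to an Nx2 array of (x, y) label coordinates
    pts = np.asarray(pts, dtype=float)
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coordinates
    return pts_h @ M.T
```

In practice the same matrix would also be used to warp the ultrasound frame itself so image and labels stay aligned.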
Further, in step S5, the method for identifying the ultrasound image to be identified specifically includes:
S51, performing framing processing on the ultrasonic image to be identified to obtain continuous single-frame ultrasonic images;
S52, extracting the features of each single-frame ultrasonic image;
S53, performing similarity matching on the features in each single-frame ultrasonic image to obtain a target image;
S54, carrying out feature fusion processing on the extracted features to obtain target parameters;
S55, positioning the obtained target parameters in the target image to obtain an anesthesia puncture guide image that accurately locates the brachial plexus.
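The S51-S55 pipeline can be sketched as a chain of functions over a clip of frames. The feature extractor below is a trivial stand-in (per-frame mean and standard deviation) for the deep network's convolutional features, and the reference vector plays the role of the stored standard image; everything here is illustrative, not the patent's implementation.

```python
import numpy as np

def split_frames(video):
    # S51: split an ultrasound clip, shaped (frames, height, width), into frames
    return [video[t] for t in range(video.shape[0])]

def extract_features(frame):
    # S52: stand-in feature extractor (the patent uses convolutional features)
    return np.array([frame.mean(), frame.std()])

def select_target(frames, feats, reference):
    # S53: similarity matching: pick the frame whose features are closest
    # to the stored reference (standard image) features
    dists = [np.linalg.norm(f - reference) for f in feats]
    return frames[int(np.argmin(dists))]

def fuse_features(feats):
    # S54: feature fusion; a simple average stands in for matrix splicing
    return np.mean(feats, axis=0)

def guide_image(video, reference):
    # S55: combine the target image and the fused target parameters
    frames = split_frames(video)
    feats = [extract_features(f) for f in frames]
    return {"target": select_target(frames, feats, reference),
            "params": fuse_features(feats)}
```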
Further, the step S53 is specifically:
A1, constructing a multi-layer ladder structure for each single-frame ultrasonic image, and acquiring the spatial resolution of each layer of the ladder structure;
A2, sequentially judging whether the spatial resolution of the higher-level ladder structure in each single-frame ultrasonic image is higher than that of the lower-level ladder structure;
if yes, proceeding to step A3;
if not, returning to step S52;
A3, increasing the semantic value of the higher-level ladder structure in the single-frame ultrasonic image, thereby obtaining the overall semantic information of the current single-frame ultrasonic image;
A4, taking the sum of the overall semantic information of all single-frame ultrasonic images as the denominator and the overall semantic information of each single-frame ultrasonic image in turn as the numerator to obtain a score for each item of overall semantic information;
A5, using each overall semantic information score as the feature weight of the corresponding single-frame ultrasonic image;
A6, matching the parameters of each feature weight for similarity against the parameters of the standard ultrasonic image stored in the brachial plexus image identification and anesthesia puncture guiding model to obtain the corresponding confidence;
A7, using the single-frame ultrasonic image with the maximum confidence as the target image.
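Steps A4-A7 amount to normalizing each frame's overall semantic value against the sum across frames, treating the scores as feature weights, and then picking the frame whose weights best match the stored standard. A minimal sketch, assuming cosine similarity as the matching criterion (the patent does not name a specific one):

```python
import numpy as np

def semantic_scores(semantics):
    # A4-A5: each frame's overall semantic value divided by the sum over frames
    s = np.asarray(semantics, dtype=float)
    return s / s.sum()

def confidences(weights, standard):
    # A6: similarity of each frame's feature weights to the stored standard
    # (cosine similarity is one plausible matching criterion, assumed here)
    w = np.asarray(weights, dtype=float)
    std = np.asarray(standard, dtype=float)
    return (w @ std) / (np.linalg.norm(w, axis=1) * np.linalg.norm(std))

def target_index(weights, standard):
    # A7: index of the single frame with maximum confidence
    return int(np.argmax(confidences(weights, standard)))
```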
Further, the step S54 is specifically:
by classifying and regressing the extracted features, position information and name information of each anatomical structure are output as target parameters.
A brachial plexus image identification and anesthesia puncture guiding system comprises:
the image acquisition module is used for scanning the brachial plexus section and acquiring the corresponding ultrasonic image;
the image transmission module is used for transmitting the acquired ultrasonic image to the image framing module in a wired or wireless transmission mode;
the image framing module is used for dividing the received ultrasonic image into continuous single-frame ultrasonic images;
the image identification module is used for sequentially identifying each single-frame ultrasonic image to obtain the ultrasonic image marked with the position and the name of the anatomical structure;
and the image display module is used for displaying the ultrasonic image marked with the positions and names of the anatomical structures as the brachial plexus anesthesia puncture guide image.
Further, the image acquisition module is a portable ultrasound machine or a handheld (palm) ultrasound device;
the image acquisition module scans from below the neck and lateral to the thyroid down to the supraclavicular region and acquires the corresponding ultrasonic images.
Furthermore, the image identification module comprises a feature extraction unit, a target image extraction unit, a feature fusion unit and a parameter positioning unit;
the feature extraction unit is used for extracting features for characterizing an anatomical structure in each single-frame ultrasonic image;
the target image extraction unit is used for carrying out similarity matching on the extracted features to obtain a target image with a complete anatomical structure;
the feature fusion unit is used for performing feature fusion on all the extracted features to obtain the position and name information of the anatomical structure as target parameters;
the parameter positioning unit is used for positioning the acquired target parameters into the target image to obtain the ultrasonic image marked with the position and name parameters of the anatomical structure.
Further, the anatomical structures marked in the brachial plexus anesthesia puncture guide image displayed by the image display module include the brachial plexus, the anterior scalene muscle, and the middle scalene muscle.
Furthermore, the image acquisition module, image transmission module, image framing module, image recognition module, and image display module form an AI model for brachial plexus image recognition and anesthesia puncture guidance, and the AI model can run on any of the Windows, Linux, iOS, and Android systems.
Furthermore, the AI model can run not only on a large server but also on any of an ordinary desktop computer, a notebook computer, or a mobile device, with or without a discrete graphics card;
the mobile devices include tablet computers and smartphones.
The invention has the following beneficial effects:
(1) by constructing a dedicated brachial plexus identification model, the invention identifies and marks the brachial plexus on the scanned ultrasonic image, and also identifies and marks the anterior and middle scalene muscles in front of and behind the brachial plexus, so that during the actual anesthesia puncture the anesthesiologist can independently interpret the ultrasound image based on the anatomical structure marks and perform the nerve block accurately;
(2) when identifying and locating the brachial plexus in an ultrasonic image, the invention effectively distinguishes the differences between classes in the image while keeping the distribution within each class well constrained, achieving a good supervision effect;
(3) by adopting a deep convolutional neural network to extract brachial plexus features, the invention ensures strong feature-extraction capability, captures the spatial, high-frequency, and low-frequency features of the image well, and offers good robustness;
(4) the brachial plexus image identification and anesthesia puncture guiding model has low hardware requirements and can be widely applied on various mobile devices and in various environments; it suits grassroots institutions and places where large ultrasound machines are inconvenient to carry, such as battlefield environments where anesthesiologists must perform anesthesia puncture with handheld ultrasound, and therefore has high universality.
Drawings
Fig. 1 is a flowchart of the brachial plexus image recognition and anesthesia puncture guiding method provided by the present invention.
Fig. 2 is a block diagram of the brachial plexus image recognition and anesthesia puncture guiding system according to the present invention.
Fig. 3 is a schematic diagram of a brachial plexus anesthesia puncture guiding image provided by the present invention.
Detailed Description
The following description of the embodiments of the present invention is provided to help those skilled in the art understand the invention. It should be understood, however, that the invention is not limited to the scope of these embodiments; for those skilled in the art, various changes are possible within the spirit and scope of the invention as defined by the appended claims, and all matter produced using the inventive concept is protected.
Example 1:
As shown in Fig. 1, a brachial plexus image recognition and anesthesia puncture guiding method includes the following steps:
S1, constructing a brachial plexus image recognition and anesthesia puncture guiding model;
S2, scanning to obtain an ultrasonic image of the brachial plexus section;
S3, marking anatomical structures on the ultrasonic image;
S4, training the constructed brachial plexus image recognition and anesthesia puncture guiding model with the marked ultrasonic images;
S5, recognizing the ultrasonic image to be recognized with the trained brachial plexus image recognition and anesthesia puncture guiding model to obtain an anesthesia puncture guiding image that accurately locates the brachial plexus.
In the embodiment of the invention, the acquired ultrasonic image containing the brachial plexus section is input into the constructed identification model, and the brachial plexus and its associated positioning auxiliary structures are labeled, so that an anesthesiologist can quickly locate the brachial plexus from the labeled image without the assistance of a sonographer and then perform the nerve block.
In this embodiment, the brachial plexus image recognition and anesthesia puncture guiding model in step S1 is a deep convolutional neural network with a ladder structure whose upper and lower stages are connected layer by layer;
the brachial plexus section in step S2 of this embodiment includes a brachial plexus root section, a brachial plexus trunk section, and a brachial plexus division section;
the method for scanning and obtaining the ultrasonic image of the brachial plexus section is specifically as follows: the probe is placed transversely below the neck, lateral to the thyroid, and slid downward to scan to the supraclavicular region, displaying in real time an ultrasound image containing the anterior scalene muscle, the middle scalene muscle, and the interscalene segment of the brachial plexus.
In step S3 of this embodiment, the anatomical structures to be marked in the ultrasound image include the brachial plexus, the anterior scalene muscle, and the middle scalene muscle; in the ultrasound image, the brachial plexus lies between the anterior and middle scalene muscles, and marking these two muscles enables accurate localization of the actual position of the brachial plexus. Specifically, a clinical nerve block generally blocks the interscalene segment of the brachial plexus, so marking the anterior and middle scalene muscles helps the anesthesiologist confirm the block location accurately.
The method for training the brachial plexus image recognition and anesthesia puncture guiding model in step S4 of the embodiment of the present invention is specifically: training the model with affine-transformation data augmentation of the input, and validating it by cross-validation;
where the training parameters include setting the learning_rate to 0.001, the batch size to 32, and the number of steps to 35000;
the loss function of the brachial plexus image recognition and anesthesia puncture guiding model training process is as follows:
Figure BDA0002711128330000081
in the formula, yiExactly labeled result, y ', for the ith data in one batch'iAnd n is the total number of input data.
In step S5 of this embodiment, the method for identifying an ultrasound image to be identified specifically includes:
S51, performing framing processing on the ultrasonic image to be identified to obtain continuous single-frame ultrasonic images;
S52, extracting the features of each single-frame ultrasonic image;
S53, performing similarity matching on the features in each single-frame ultrasonic image to obtain a target image;
S54, carrying out feature fusion processing on the extracted features to obtain target parameters;
S55, positioning the obtained target parameters in the target image to obtain an anesthesia puncture guide image that accurately locates the brachial plexus.
In the above process, the acquired ultrasonic image is split into several continuous frames using a conventional video framing technique, so that the brachial plexus image identification and anesthesia puncture guiding model can conveniently carry out the image identification processing of steps S52 to S55. Specifically, feature extraction obtains a multi-dimensional weight matrix through a combination of convolution layers; feature fusion splices the multi-dimensional weight matrices; similarity matching performs non-negative matrix factorization of the feature weight matrix; and parameter positioning outputs the name and region coordinates of the located target through a convolutional network.
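The description states that similarity matching decomposes the feature weight matrix by non-negative matrix factorization. A minimal multiplicative-update NMF in NumPy, purely illustrative of the technique rather than of the patent's actual matching step:

```python
import numpy as np

def nmf(V, rank, iters=500, seed=0):
    # Multiplicative-update NMF: factor V (non-negative) as V ~ W @ H,
    # with W and H kept non-negative throughout the updates.
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + 0.1
    H = rng.random((rank, m)) + 0.1
    eps = 1e-9  # guards against division by zero
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

On a feature weight matrix, the low-rank factors can then be compared against the factors of the stored standard image.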
The step S53 is specifically:
A1, constructing a multi-layer ladder structure for each single-frame ultrasonic image, and acquiring the spatial resolution of each layer of the ladder structure;
A2, sequentially judging whether the spatial resolution of the higher-level ladder structure in each single-frame ultrasonic image is higher than that of the lower-level ladder structure;
if yes, proceeding to step A3;
if not, returning to step S52;
A3, increasing the semantic value of the higher-level ladder structure in the single-frame ultrasonic image, thereby obtaining the overall semantic information of the current single-frame ultrasonic image;
A4, taking the sum of the overall semantic information of all single-frame ultrasonic images as the denominator and the overall semantic information of each single-frame ultrasonic image in turn as the numerator to obtain a score for each item of overall semantic information;
A5, using each overall semantic information score as the feature weight of the corresponding single-frame ultrasonic image;
A6, matching the parameters of each feature weight for similarity against the parameters of the standard ultrasonic image stored in the brachial plexus image identification and anesthesia puncture guiding model to obtain the corresponding confidence;
A7, using the single-frame ultrasonic image with the maximum confidence as the target image.
In the process, the single-frame image is processed by constructing the deep convolutional neural network structure of the multilayer ladder structure, and the connection between layers is enhanced by using the multilayer structure, so that the characteristic image is effectively utilized, and rich high-frequency characteristics capable of representing the required anatomical structure of the image are extracted.
The step S54 is specifically:
outputting the position information and name information of each anatomical structure as target parameters by classifying and regressing the extracted features. The extracted features belonging to the same anatomical structure are classified so that the features of each structure are as complete as possible, and the features of anatomical structures under the same class are fused, so that the constructed and trained recognition model can recognize anatomical structures across slightly different ultrasonic images, improving the recognition efficiency of the model.
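The classification-plus-regression output of step S54 can be sketched as a tiny two-branch head: a softmax branch yields the anatomy name and a linear branch yields position parameters (x, y, w, h). The class list and weight matrices below are hypothetical placeholders for illustration, not the patent's trained model.

```python
import numpy as np

# Hypothetical class list mirroring the structures the patent marks
CLASSES = ["brachial_plexus", "anterior_scalene", "middle_scalene"]

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def detection_head(features, W_cls, W_box):
    # classification branch -> anatomy name; regression branch -> box (x, y, w, h)
    probs = softmax(features @ W_cls)
    box = features @ W_box
    return CLASSES[int(np.argmax(probs))], box
```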
Example 2:
As shown in Fig. 2, this embodiment provides a brachial plexus image recognition and anesthesia puncture guiding system corresponding to the method of Embodiment 1, which includes:
the image acquisition module is used for scanning the brachial plexus section and acquiring the corresponding ultrasonic image;
the image transmission module is used for transmitting the acquired ultrasonic image to the image framing module in a wired or wireless transmission mode;
the image framing module is used for dividing the received ultrasonic image into continuous single-frame ultrasonic images;
the image identification module is used for sequentially identifying each single-frame ultrasonic image to obtain the ultrasonic image marked with the position and the name of the anatomical structure;
and the image display module is used for displaying the ultrasonic image marked with the positions and names of the anatomical structures as the brachial plexus anesthesia puncture guide image.
The image acquisition module in this embodiment is an ultrasound machine or a handheld ultrasound instrument; it scans from below the neck and lateral to the thyroid down to the supraclavicular region and acquires the corresponding ultrasonic images. Acquiring the ultrasonic image by scanning only the region around the brachial plexus avoids scanning unnecessary areas, which would reduce identification efficiency.
The image recognition module in this embodiment has the same function as the brachial plexus image recognition and anesthesia puncture guiding model in embodiment 1, and includes a feature extraction unit, a target image extraction unit, a feature fusion unit, and a parameter positioning unit;
the feature extraction unit is used for extracting features for characterizing the anatomical structure in each single-frame ultrasonic image; the target image extraction unit is used for carrying out similarity matching on the extracted features to obtain a target image with a complete anatomical structure; the feature fusion unit is used for performing feature fusion on all the extracted features to obtain the position and name information of the anatomical structure as target parameters; the parameter positioning unit is used for positioning the acquired target parameters into the target image to obtain the ultrasonic image marked with the anatomical structure position and the name parameters.
The target image extraction unit performs similarity matching using multiple image similarity indexes, including the Manhattan distance, the Euclidean distance, the Mahalanobis distance, the image mean, and the image standard deviation; the more similarity indexes are used, the higher the resulting matching precision and the more accurate the target image. The feature fusion unit fuses all features of the same anatomical structure so that the determined target parameters can represent each anatomical structure as fully as possible.
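The similarity indexes named here are standard and easy to state precisely; the sketch below implements them in NumPy over feature vectors. Note the Mahalanobis distance additionally requires a covariance matrix, which the patent does not specify, so it is a free parameter here.

```python
import numpy as np

def manhattan(a, b):
    # L1 distance between two feature vectors
    return np.abs(a - b).sum()

def euclidean(a, b):
    # L2 distance between two feature vectors
    return np.sqrt(((a - b) ** 2).sum())

def mahalanobis(a, b, cov):
    # distance weighted by the inverse covariance; with cov = I it equals L2
    d = a - b
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

def image_stats(img):
    # per-image mean and standard deviation, used as scalar similarity indexes
    return img.mean(), img.std()
```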
The anatomical structures marked in the brachial plexus anesthesia puncture guide image displayed by the image display module in this embodiment include the brachial plexus, the anterior scalene muscle, and the middle scalene muscle. In the ultrasound image, the brachial plexus lies between the anterior and middle scalene muscles; a clinical nerve block generally blocks the interscalene segment, and marking the two scalene muscles helps the anesthesiologist confirm the block location accurately.
The image acquisition module, image transmission module, image framing module, image recognition module, and image display module in this embodiment form an AI model for brachial plexus image recognition and anesthesia puncture guidance, and the AI model can run on any of the Windows, Linux, iOS, and Android systems. In addition, the AI model can run on a large server or on any of an ordinary desktop computer, a notebook computer, or a mobile device, with or without a discrete graphics card; the mobile devices include tablet computers, smartphones, and the like.
Running conventional AI models of this kind requires a large amount of computing resources, which is why such models are usually integrated into large conventional ultrasound machines. The invention breaks through this limitation by reducing the demand on computing resources: the hardware environment for running the model can be an ordinary desktop or notebook computer (no professional-grade graphics card required) or a mobile device such as a tablet or mobile phone, and the software environment supports various versions of Linux, Windows, iOS and Android. This effectively supports and matches handheld ultrasound work, reduces the difficulty of use and expands the application range.
Fig. 3 is an image with anatomical-structure labels displayed by the display module in the embodiment of the present invention. It can be seen that the method of the present invention clearly labels the brachial plexus, the anterior scalene muscle and the middle scalene muscle, which facilitates accurate positioning of the brachial plexus by the anesthesiologist during the procedure for performing the nerve block.

Claims (14)

1. A brachial plexus image identification and anesthesia puncture guiding method is characterized by comprising the following steps:
S1, constructing a brachial plexus image recognition and anesthesia puncture guiding model;
S2, scanning to obtain an ultrasonic image of the brachial plexus section;
S3, marking an anatomical structure on the ultrasonic image;
S4, training the constructed brachial plexus image recognition and anesthesia puncture guiding model by using the marked ultrasonic image;
S5, recognizing the ultrasonic image to be recognized by using the trained brachial plexus image recognition and anesthesia puncture guiding model to obtain an anesthesia puncture guiding image that accurately positions the brachial plexus.
2. The brachial plexus image identification and anesthesia puncture guiding method according to claim 1, wherein the brachial plexus image identification and anesthesia puncture guiding model in step S1 is a deep convolutional neural network having a ladder structure whose stages are connected layer by layer from top to bottom.
3. The brachial plexus image recognition and anesthesia puncture guidance method according to claim 1, wherein the brachial plexus cut planes in step S2 include a brachial plexus root cut plane, a brachial plexus trunk cut plane and a brachial plexus division cut plane;
the method for scanning and obtaining the ultrasonic image of the brachial plexus section specifically comprises: placing the probe transversely at the lower neck, lateral to the thyroid, and sliding it downwards to scan to the supraclavicular region, while displaying in real time an ultrasonic image showing the anterior scalene muscle, the middle scalene muscle and the interscalene segment of the brachial plexus.
4. The brachial plexus image recognition and anesthesia puncture guiding method according to claim 1, wherein in step S3 the anatomical structures to be marked in the ultrasound image include the brachial plexus, the anterior scalene muscle and the middle scalene muscle;
in the ultrasonic image, the brachial plexus is positioned between the anterior scalene muscle and the middle scalene muscle, and marking the anterior and middle scalene muscles enables accurate positioning of the actual position of the brachial plexus.
5. The brachial plexus image recognition and anesthesia puncture guiding method according to claim 1, wherein the method for training the brachial plexus image recognition and anesthesia puncture guiding model in step S4 specifically comprises:
training the brachial plexus image recognition and anesthesia puncture guiding model with affine-transformation-based augmentation of the input data, and verifying and testing the model by cross-validation;
wherein the parameter settings in the training process comprise setting the learning_rate to 0.001, the batch size to 32 and the number of training steps to 35000.
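As a hedged illustration only (the patent discloses neither a framework nor augmentation ranges; the sampling ranges and helper name below are assumptions), the recited hyper-parameters and affine augmentation map naturally onto a setup such as:

```python
import random

# Hyper-parameters recited in claim 5.
LEARNING_RATE = 0.001
BATCH_SIZE = 32
TRAIN_STEPS = 35000

def random_affine_params(max_angle=10.0, max_shift=0.05, max_scale=0.1):
    """Sample parameters for affine-transformation data augmentation.
    The ranges are illustrative assumptions, not taken from the patent."""
    return {
        "angle": random.uniform(-max_angle, max_angle),       # rotation, degrees
        "shift": (random.uniform(-max_shift, max_shift),      # translation as a
                  random.uniform(-max_shift, max_shift)),     # fraction of size
        "scale": 1.0 + random.uniform(-max_scale, max_scale), # zoom factor
    }
```

Each training batch of 32 marked ultrasound images would be transformed with freshly sampled parameters before being fed to the network, for 35000 optimizer steps at learning rate 0.001.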
6. The brachial plexus image identification and anesthesia puncture guiding method according to claim 1, wherein in step S5 the method for identifying the ultrasound image to be identified specifically comprises:
S51, performing framing processing on the ultrasonic image to be identified to obtain continuous single-frame ultrasonic images;
S52, extracting the features of each single-frame ultrasonic image;
S53, performing similarity matching on the features in each single-frame ultrasonic image to obtain a target image;
S54, performing feature fusion processing on the extracted features to obtain target parameters;
S55, positioning the obtained target parameters into the target image to obtain an anesthesia puncture guide image that accurately positions the brachial plexus.
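Steps S51-S55 can be read as a frame-select-and-annotate pipeline. The following minimal sketch shows the data flow only; `extract`, `match` and `fuse` are placeholder callables standing in for the claimed model components, which the patent does not specify:

```python
def recognize(frames, extract, match, fuse):
    """Sketch of steps S51-S55: given the framed ultrasound video `frames`
    (S51 already done), extract features per frame (S52), pick the frame
    whose features match best (S53), fuse all features into target
    parameters (S54), and attach the parameters to the target frame (S55)."""
    feats = [extract(f) for f in frames]                            # S52
    best = max(range(len(frames)), key=lambda i: match(feats[i]))   # S53
    params = fuse(feats)                                            # S54
    return {"image": frames[best], "annotations": params}           # S55
```

With toy callables, e.g. `recognize(["f0", "f1", "f2"], extract=lambda f: int(f[1]), match=lambda x: x, fuse=sum)`, the frame with the highest match score is returned together with the fused parameters.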
7. The brachial plexus image identification and anesthesia puncture guiding method according to claim 6, wherein step S53 specifically comprises:
A1, constructing a multilayer ladder structure for each single-frame ultrasonic image and acquiring the spatial resolution of each layer of the ladder structure;
A2, judging in sequence whether the spatial resolution of a higher-level ladder structure in each single-frame ultrasonic image is higher than that of the lower-level ladder structure;
if yes, proceeding to step A3;
if not, returning to step S52;
A3, increasing the semantic value of the higher-level ladder structure in the single-frame ultrasonic image, thereby obtaining the overall semantic information of the current single-frame ultrasonic image;
A4, taking the sum of the overall semantic information of all single-frame ultrasonic images as the denominator and the overall semantic information of each single-frame ultrasonic image in turn as the numerator, to obtain a score for each item of overall semantic information;
A5, taking each overall semantic information score as the feature weight of the corresponding single-frame ultrasonic image;
A6, performing similarity matching between the parameters of each feature weight and the parameters of a standard ultrasonic image stored in the brachial plexus image identification and anesthesia puncture guiding model to obtain the corresponding confidence;
A7, taking the single-frame ultrasonic image corresponding to the maximum confidence as the target image.
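Steps A4-A7 describe a normalisation of per-frame semantic information followed by an argmax over confidences. A compact sketch (the function matching a weight against the stored standard image is a placeholder; the patent does not define it):

```python
def select_target(semantic_info, match_confidence):
    """A4-A5: normalise each frame's overall semantic information by the
    sum over all frames to get per-frame feature weights; A6: obtain a
    confidence for each weight via similarity matching against the stored
    standard image; A7: return the index of the highest-confidence frame."""
    total = sum(semantic_info)
    weights = [s / total for s in semantic_info]           # A4-A5
    confidences = [match_confidence(w) for w in weights]   # A6
    return max(range(len(confidences)), key=confidences.__getitem__)  # A7
```

With a trivial identity matcher, the frame with the largest semantic-information share is selected as the target image.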
8. The brachial plexus image identification and anesthesia puncture guiding method according to claim 6, wherein the step S54 specifically comprises:
by classifying and regressing the extracted features, position information and name information of each anatomical structure are output as target parameters.
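Claim 8's "classify and regress" step corresponds to the common detection-head pattern: a classification branch yields each structure's name and a regression branch its position. A toy decoding sketch follows; the class list reuses the anatomy named in the patent, but the input format (per-structure score vectors and box coordinates) is an assumption:

```python
# Anatomical classes named in the patent description.
ANATOMY_CLASSES = ["brachial plexus", "anterior scalene", "middle scalene"]

def decode_detections(class_scores, box_regressions):
    """Decode (hypothetical) network outputs into target parameters:
    the classification branch gives the name (argmax over class scores),
    the regression branch gives the position (bounding-box coordinates)."""
    results = []
    for scores, box in zip(class_scores, box_regressions):
        name = ANATOMY_CLASSES[max(range(len(scores)), key=scores.__getitem__)]
        results.append({"name": name, "position": box})
    return results
```

The resulting name/position pairs are exactly the "target parameters" that step S55 positions into the target image.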
9. A brachial plexus image identification and anesthesia puncture guiding system is characterized by comprising:
the image acquisition module is used for scanning the brachial plexus section and acquiring the corresponding ultrasonic image;
the image transmission module is used for transmitting the acquired ultrasonic image to the image framing module in a wired or wireless transmission mode;
the image framing module is used for dividing the received ultrasonic image into continuous single-frame ultrasonic images;
the image identification module is used for sequentially identifying each single-frame ultrasonic image to obtain the ultrasonic image marked with the position and the name of the anatomical structure;
and the image display module is used for displaying the ultrasonic image marked with the position and the name of the anatomical structure as a brachial plexus anesthesia puncture guide image.
10. The brachial plexus image identification and anesthesia puncture guidance system according to claim 9, wherein the image acquisition module is a portable ultrasound machine or a handheld (palm) ultrasound device;
the image acquisition module scans from the lower neck, lateral to the thyroid, down to the supraclavicular region and acquires the corresponding ultrasonic image.
11. The brachial plexus image identification and anesthesia puncture guiding system according to claim 9, wherein the image identification module comprises a feature extraction unit, a target image extraction unit, a feature fusion unit and a parameter positioning unit;
the feature extraction unit is used for extracting features for characterizing an anatomical structure in each single-frame ultrasonic image;
the target image extraction unit is used for carrying out similarity matching on the extracted features to obtain a target image with a complete anatomical structure;
the feature fusion unit is used for performing feature fusion on all the extracted features to obtain the position and name information of the anatomical structure as target parameters;
the parameter positioning unit is used for positioning the acquired target parameters into the target image to obtain the ultrasonic image marked with the position and name parameters of the anatomical structure.
12. The brachial plexus image identification and anesthesia puncture guidance system according to claim 9, wherein the anatomical structures labeled in the brachial plexus anesthesia puncture guidance image displayed in the image display module include the brachial plexus, the anterior scalene muscle and the middle scalene muscle.
13. The brachial plexus image identification and anesthesia puncture guidance system according to claim 9, wherein the image acquisition module, the image transmission module, the image framing module, the image identification module and the image display module constitute an AI model for brachial plexus image identification and anesthesia puncture guidance, and the AI model can run on any one of a Windows system, a Linux system, an iOS system and an Android system.
14. The brachial plexus image identification and anesthesia puncture guidance system according to claim 13, wherein the AI model can run not only on a large server but also on any one of an ordinary desktop computer, a notebook computer and a mobile device, regardless of whether a discrete graphics card is available;
the mobile device comprises a tablet computer and a smart phone.
CN202011057047.3A 2020-09-30 2020-09-30 Brachial plexus image identification method and system Active CN112184817B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011057047.3A CN112184817B (en) 2020-09-30 2020-09-30 Brachial plexus image identification method and system


Publications (2)

Publication Number Publication Date
CN112184817A true CN112184817A (en) 2021-01-05
CN112184817B CN112184817B (en) 2022-12-02

Family

ID=73946458

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011057047.3A Active CN112184817B (en) 2020-09-30 2020-09-30 Brachial plexus image identification method and system

Country Status (1)

Country Link
CN (1) CN112184817B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118037751A (en) * 2024-02-27 2024-05-14 中国医学科学院肿瘤医院 Nerve block automatic segmentation and visualization method and system based on ultrasonic image

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009110410A1 (en) * 2008-03-04 2009-09-11 NEC Corporation Image matching device, image matching feature amount storage medium, image matching method, and image matching program
US20120034587A1 (en) * 2000-10-23 2012-02-09 Toly Christopher C Physiological simulator for use as a brachial plexus nerve block trainer
CN108804547A (en) * 2018-05-18 2018-11-13 深圳华声医疗技术股份有限公司 Ultrasonoscopy teaching method, device and computer readable storage medium
CN110384698A (en) * 2019-08-14 2019-10-29 牡丹江医学院 A kind of anesthetic and its application suitable under color ultrasound guidance to brachial plexus nerve anesthesia
CN110399915A (en) * 2019-07-23 2019-11-01 王英伟 A kind of Ultrasound Image Recognition Method and its system based on deep learning
CN110414405A (en) * 2019-07-23 2019-11-05 王英伟 The recognition methods and its system of Scalene gap brachial plexus nerve based on deep learning
CN110706807A (en) * 2019-09-12 2020-01-17 北京四海心通科技有限公司 Medical question-answering method based on ontology semantic similarity
CN111260786A (en) * 2020-01-06 2020-06-09 南京航空航天大学 Intelligent ultrasonic multi-mode navigation system and method
CN111292324A (en) * 2020-03-20 2020-06-16 电子科技大学 Multi-target identification method and system for brachial plexus ultrasonic image


Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
ABRAHAM N et al.: "Deep Learning for Semantic Segmentation of Brachial Plexus Nerves in Ultrasound Images Using U-Net and M-Net", 2019 3rd International Conference on Imaging, Signal Processing and Communication *
SILVA et al.: "Impaired Visual Hand Recognition in Preoperative Patients during Brachial Plexus Anesthesia", Anesthesiology *
WANG R et al.: "Ultrasound Nerve Segmentation of Brachial Plexus Based on Optimized ResU-Net", 2019 IEEE International Conference on Imaging Systems and Techniques *
YANG TONG et al.: "Brachial Plexus Ultrasound Image Optimization Based on Deep Learning and Adaptive Contrast Enhancement", Computer Science *
YIN QINQIN et al.: "Two volumes of ropivacaine for ultrasound-guided interscalene brachial plexus block and *** imaging of the diaphragm", Journal of Clinical Anesthesiology *
WANG SANYUE et al.: "Comparison of the effects of different localization methods for brachial plexus block anesthesia", Chinese Journal of General Practice *
LONG FANING et al.: "Convolutional-neural-network-based segmentation method for brachial plexus ultrasound images", Journal of Hefei University of Technology (Natural Science Edition) *


Also Published As

Publication number Publication date
CN112184817B (en) 2022-12-02

Similar Documents

Publication Publication Date Title
US11776421B2 (en) Systems and methods for monitoring and evaluating body movement
CN110232311B (en) Method and device for segmenting hand image and computer equipment
US11783615B2 (en) Systems and methods for language driven gesture understanding
CN110377779B (en) Image annotation method, and annotation display method and device based on pathological image
US20170243042A1 (en) Systems and methods for biometric identification
WO2022089257A1 (en) Medical image processing method, apparatus, device, storage medium, and product
US11527102B2 (en) Systems and methods of automated biometric identification reporting
CN112184817B (en) Brachial plexus image identification method and system
US11727710B2 (en) Weakly supervised semantic parsing
CN115331314A (en) Exercise effect evaluation method and system based on APP screening function
CN113707304A (en) Triage data processing method, device, equipment and storage medium
CN113642562A (en) Data interpretation method, device and equipment based on image recognition and storage medium
CN117237351A (en) Ultrasonic image analysis method and related device
CN111523507A (en) Artificial intelligence wound assessment area measuring and calculating method and device
CN116130088A (en) Multi-mode face diagnosis method, device and related equipment
CN114708981A (en) Disease map construction method, interpretation method and system based on medical image report
Li Badminton motion capture with visual image detection of picking robotics
CN115019396A (en) Learning state monitoring method, device, equipment and medium
CN114519804A (en) Human body skeleton labeling method and device and electronic equipment
CN113436143A (en) Joint detection method and device based on artificial intelligence and electronic equipment
CN112331312A (en) Method, device, equipment and medium for determining labeling quality
CN117173491B (en) Medical image labeling method and device, electronic equipment and storage medium
WO2022178504A1 (en) System for correcting user movement
Vasefi et al. Diagnosis and management of hand arthritis using a mobile medical application
CN115809989A (en) Spinal cord injury analysis method and device based on deep learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant