CN113269196B - Method for realizing hyperspectral medical component analysis of graph convolution neural network - Google Patents


Info

Publication number
CN113269196B
CN113269196B (application CN202110811547.XA)
Authority
CN
China
Prior art keywords
pixel
hyperspectral
super
medical
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110811547.XA
Other languages
Chinese (zh)
Other versions
CN113269196A (en)
Inventor
王耀南
尹阿婷
毛建旭
曾凯
张辉
朱青
周显恩
李亚萍
赵禀睿
陈煜嵘
苏学叁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan University
Original Assignee
Hunan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan University
Priority to CN202110811547.XA
Publication of CN113269196A
Application granted
Publication of CN113269196B
Priority to PCT/CN2022/076023 (WO2023000653A1)
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/2823Imaging spectrometer
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/66Analysis of geometric attributes of image moments or centre of gravity
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16CCOMPUTATIONAL CHEMISTRY; CHEMOINFORMATICS; COMPUTATIONAL MATERIALS SCIENCE
    • G16C20/00Chemoinformatics, i.e. ICT specially adapted for the handling of physicochemical or structural data of chemical particles, elements, compounds or mixtures
    • G16C20/20Identification of molecular entities, parts thereof or of chemical compositions

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Chemical & Material Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Health & Medical Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Crystallography & Structural Chemistry (AREA)
  • Computing Systems (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method for realizing hyperspectral medicine component analysis with a graph convolution neural network. On one hand, the medicine hyperspectral image data are processed into graph data, so that the number of pixels, and hence the data volume, is greatly reduced. On the other hand, the characteristic information of the medicine is extracted by the graph convolution neural network model, which effectively learns the spatial relationship between the visual characteristics and the medicine components in the medicine hyperspectral image, improves the expression capability of the medicine component classification features, and improves the accuracy of predicting the components and attributes of the detected medicine, enabling nondestructive and rapid detection and analysis of medicine components and quality.

Description

Method for realizing hyperspectral medical component analysis of graph convolution neural network
Technical Field
The invention relates to the field of hyperspectral intelligent detection and analysis of high-end medicines, in particular to a method for realizing hyperspectral medicine component analysis of a graph convolution neural network.
Background
Drug safety bears directly on public health and economic development; it has long been a matter of civil and public-safety concern, and is of great significance for maintaining national stability and social harmony. Existing methods for detecting the quality of medicine components, such as chemical assays and spectrophotometry, are only suitable for sampling inspection, are destructive, and cannot meet the requirements of nondestructive medicine quality detection. In recent years, near-infrared spectral detection technology has been widely applied in the field of drug analysis; its spectral information is a robust, fingerprint-like feature that can be used to quantify and classify different drug components. The spectral detection method is recorded in the 2015 edition of the Chinese Pharmacopoeia as a guarantee for testing medicine quality, but it can only detect quantitative component information of a tested sample at the light-source irradiation point and cannot analyze the components of the medicine as a whole. Therefore, there is a need to develop a novel, versatile and reliable method for batch spectral detection and analysis of pharmaceutical ingredients.
Hyperspectral imaging technology can simultaneously acquire the spectral information and the spatial information of the detected medicine; the acquired data are very rich in information, can accurately reflect the overall properties of the detected medicine, and thus meet the current requirement for nondestructive analysis of a medicine's overall components. At present, hyperspectral imaging combined with chemometrics algorithms has been applied in the pharmaceutical field to studies such as the identification of medicinal materials and tablets, the detection of the uniformity of active ingredients and auxiliary materials in solid tablets, and the monitoring of the composition and distribution of drug-carrying films, showing that hyperspectral imaging can serve as an efficient nondestructive quality detection means in the pharmaceutical field. However, because medicines are of many types with complex components and the amount of hyperspectral data is huge, chemometric methods have difficulty extracting effective characteristic information of medicines, and the accuracy of predicting the components and attributes of the detected medicine is not high. Deep learning excels at exploring complex relationships in multidimensional data and is currently one of the best methods for processing and analyzing massive data. Graph neural networks are a class of neural networks for processing graph-domain information; owing to their strong interpretability for biomolecular structures and inter-molecular functional relationships, they have attracted wide attention in medical fields such as brain science, medical diagnosis, and drug discovery. Graph neural networks learn the spatial characteristics of topological data structures well, but they are difficult to use directly for component analysis of medical hyperspectral images.
Therefore, the visual information of the hyperspectral image of the medicine is urgently needed to be deeply explored aiming at the difficult problems of various medicine types and complex medicine component analysis, and the accuracy of the medicine component analysis is improved by combining the spatial characteristics of the medicine to be detected.
Disclosure of Invention
In view of the above, the invention provides a method for realizing hyperspectral medical component analysis with a graph convolution neural network, which effectively realizes lossless medical component analysis and rapid quality detection by learning the spectral information characteristics and the spatial distribution characteristics of the active components of a drug in a hyperspectral medical image.
In one aspect, the invention provides a method for realizing hyperspectral medical component analysis of a graph convolution neural network
The method comprises the following steps:
step 1, acquiring a medical hyperspectral image, and constructing a medical hyperspectral data set, wherein the medical hyperspectral data set comprises a training set and a testing set;
step 2, segmenting the medical hyperspectral images in the training set by utilizing a superpixel segmentation algorithm to obtain mutually non-overlapping superpixels, wherein the mutually non-overlapping superpixels form a medical hyperspectral superpixel set;
step 3, respectively counting the pixel mean value, the centroid pixel position, the perimeter, the area and the region azimuth angle of each super pixel, and the characteristic parameters of the distance from the centroid pixel to the boundary of each super pixel region, and constructing a characteristic matrix of the graph data;
step 4, constructing a region adjacency graph by taking each super pixel as a graph node and the nearest neighbor super pixel as an edge, and obtaining an adjacency weight matrix of graph data;
step 5, inputting the feature matrix, the adjacency weight matrix and the medical hyperspectral component labels corresponding to the medical hyperspectral images in the training set into a graph convolution neural network for training to obtain model parameters of the graph convolution neural network;
and 6, repeating the steps 2 to 4 on the medical hyperspectral images in the test set to obtain a region adjacency graph needing to be subjected to medicine component analysis, obtaining a feature matrix and an adjacency weight matrix of the region adjacency graph needing to be subjected to the medicine component analysis, and inputting the feature matrix and the adjacency weight matrix obtained in the test set into a graph convolution neural network model initialized by the model parameters trained in the step 5 to obtain a medicine component analysis result.
Further, the step 1 specifically includes the following steps:
step 1.1, preparing drug samples: seven drug samples of cefprozil tablets, oxytetracycline tablets, chlorphenamine maleate tablets, furosemide tablets, aspirin enteric-coated tablets, pelteubicin tablets and callicarpa nudiflora dispersible tablets;
step 1.2, acquiring medicine hyperspectral images and constructing a medicine hyperspectral data set Q: acquiring medical hyperspectral images of the drug samples with a hyperspectral sorter, performing reflectivity correction on the acquired medical hyperspectral images, and taking each corrected image as a sample of the medical hyperspectral data set;
step 1.3, randomly partitioning the medical hyperspectral data set Q = {(x_i, y_i)}_{i=1}^{d} into a training set T = {(x_i^t, y_i^t)}_{i=1}^{s} and a test set E = {(x_i^e, y_i^e)}_{i=1}^{m}, with Q = T ∪ E, where x_i is the image of the i-th sample in Q, y_i is the drug component label corresponding to the i-th sample in Q, x_i^t and y_i^t are the image and drug component label of the i-th sample in the training set T, x_i^e and y_i^e are the image and drug component label of the i-th sample in the test set E, d represents the total number of samples in Q, s represents the total number of samples in the training set T, and m represents the total number of samples in the test set E.
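The random partition of step 1.3 can be sketched as follows (the sample identifiers and the 80/20 split ratio are hypothetical; in the patent the samples are reflectivity-corrected hyperspectral images):

```python
import random

def split_dataset(samples, train_ratio=0.8, seed=0):
    """Randomly partition a list of (image, label) samples into
    a training set and a test set (step 1.3)."""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    s = int(len(shuffled) * train_ratio)   # s training samples
    return shuffled[:s], shuffled[s:]      # (T, E); m = d - s

# Hypothetical data set Q with d = 10 samples: (image id, component label)
Q = [("cube_%d.hdr" % i, i % 7) for i in range(10)]
T, E = split_dataset(Q, train_ratio=0.8)
```

Any split ratio can be substituted; the patent does not fix one.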
Further, a K-fold cross validation method is adopted to validate the division of the medical hyperspectral data set into the training set and the test set in step 1.3.
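A minimal sketch of generating the K-fold splits used to validate the division (index generation only; the patent does not specify a value of K):

```python
def k_fold_indices(d, k):
    """Yield (train_idx, test_idx) index pairs for K-fold cross
    validation over a data set of d samples (step 1.3 validation)."""
    idx = list(range(d))
    fold = d // k
    for i in range(k):
        lo = i * fold
        hi = (i + 1) * fold if i < k - 1 else d   # last fold takes remainder
        test = idx[lo:hi]
        train = idx[:lo] + idx[hi:]
        yield train, test

folds = list(k_fold_indices(10, 5))
```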
Further, the step 2 is embodied as: the medical hyperspectral images in the training set are segmented by the SLIC algorithm; the spatial distance and the spectral distance between pixel points are calculated, the superpixel clustering centers and boundary ranges are updated iteratively, and the iteration stops when the error between the new and old clustering centers is smaller than a preset threshold, thereby obtaining superpixels which do not overlap each other; these mutually non-overlapping superpixels form the medical hyperspectral superpixel set V = {v_1, v_2, ..., v_N}, where v_i is the i-th superpixel and N is the number of mutually non-overlapping superpixels.
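A toy sketch of the iterative clustering described in step 2 (a simplified SLIC-style loop on a tiny single-band image, not the full SLIC algorithm; the distance weighting, grid initialization and stopping threshold are assumptions):

```python
import math

def slic_like_segment(img, k=2, weight=0.5, tol=1e-3, max_iter=50):
    """Assign each pixel to the nearest cluster center under a combined
    spectral + spatial distance, and update centers until the center
    shift falls below a preset threshold (the stop rule of step 2)."""
    h, w = len(img), len(img[0])
    flat = [(r, c, img[r][c]) for r in range(h) for c in range(w)]
    step = max(1, len(flat) // k)
    # initialize centers spread over the pixel grid: (row, col, value)
    centers = [flat[min(i * step, len(flat) - 1)] for i in range(k)]
    for _ in range(max_iter):
        labels = []
        for r, c, v in flat:
            d = [math.hypot(v - cv, weight * math.hypot(r - cr, c - cc))
                 for cr, cc, cv in centers]
            labels.append(d.index(min(d)))
        new_centers = []
        for j in range(k):
            members = [p for p, l in zip(flat, labels) if l == j]
            if not members:
                new_centers.append(centers[j])
                continue
            n = len(members)
            new_centers.append((sum(p[0] for p in members) / n,
                                sum(p[1] for p in members) / n,
                                sum(p[2] for p in members) / n))
        # spatial shift between old and new centers (error criterion)
        shift = max(math.hypot(a[0] - b[0], a[1] - b[1])
                    for a, b in zip(centers, new_centers))
        centers = new_centers
        if shift < tol:
            break
    return labels  # one label per pixel: superpixels do not overlap

# 4x4 toy image with a bright right half
img = [[0, 0, 10, 10]] * 4
labels = slic_like_segment(img, k=2)
```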
Further, the step 3 is embodied as: for each superpixel v_i obtained in step 2, obtain its pixel mean h_i, the position (x_i, y_i) of its centroid pixel c_i, its perimeter p_i, area a_i and region azimuth θ_i, and the distances from the centroid pixel c_i to the superpixel region boundary in the eight directions east, south, west, north, southeast, southwest, northeast and northwest, thereby obtaining the feature matrix X ∈ R^(N×M), where N is the number of superpixels, M is the feature dimension, and R represents the set of real numbers.
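The statistics of step 3 can be sketched for a tiny labeled image as follows (the region azimuth and the eight centroid-to-boundary distances are omitted for brevity; they are computed analogously from the same pixel sets):

```python
def superpixel_features(img, labels):
    """Per-superpixel characteristic parameters for the feature matrix X
    of step 3: pixel mean, centroid position, perimeter and area."""
    h, w = len(img), len(img[0])
    regions = {}
    for r in range(h):
        for c in range(w):
            regions.setdefault(labels[r][c], []).append((r, c, img[r][c]))
    X = []
    for sp in sorted(regions):
        pts = regions[sp]
        area = len(pts)
        mean = sum(p[2] for p in pts) / area        # pixel mean
        cy = sum(p[0] for p in pts) / area          # centroid row
        cx = sum(p[1] for p in pts) / area          # centroid col
        # perimeter: pixels with at least one 4-neighbor outside the region
        member = {(p[0], p[1]) for p in pts}
        perim = sum(1 for (r, c) in member
                    if any((r + dr, c + dc) not in member
                           for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))))
        X.append([mean, cy, cx, perim, area])
    return X  # one row per superpixel (N rows, M feature columns)

img = [[0, 0, 9, 9],
       [0, 0, 9, 9]]
labels = [[0, 0, 1, 1],
          [0, 0, 1, 1]]
X = superpixel_features(img, labels)
```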
Further, the specific implementation of the adjacency weight matrix in step 4 includes the following steps:
step 4.1, according to the medical hyperspectral superpixel set V obtained in step 2, each superpixel v_i in the set forms a graph node, and a K nearest neighbor algorithm is adopted to select the K superpixel points nearest to v_i to construct edges, thereby forming a region adjacency graph G;
step 4.2, according to each superpixel region in the medical hyperspectral superpixel set V obtained in step 2, the adjacent superpixels of each superpixel region are counted to obtain the adjacent superpixel set N(v_i);
step 4.3, according to the pixel mean h_i of each superpixel v_i obtained in step 3, the pixel mean distance d_s(i,j) between superpixels is calculated;
step 4.4, according to the position (x_i, y_i) of the centroid pixel c_i of each superpixel v_i obtained in step 3, the centroid coordinate distance d_c(i,j) between superpixels is calculated;
step 4.5, according to the pixel mean distance d_s(i,j) obtained in step 4.3 and the centroid coordinate distance d_c(i,j) obtained in step 4.4, the adjacency weight matrix A ∈ R^(N×N) is calculated.
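The construction of steps 4.1 to 4.5 can be sketched as follows; the exp(-(d_s + d_c)) edge weight is an assumed kernel standing in for the patent's formula (3), whose image is lost in this text, and the toy means and centroids are hypothetical:

```python
import math

def adjacency_weight_matrix(means, centroids, K=2):
    """Build the adjacency weight matrix A (steps 4.1-4.5, sketched):
    each superpixel is a graph node, edges link the K nearest neighbors
    by centroid distance d_c, and each edge weight combines d_c with
    the pixel-mean distance d_s.  exp(-(d_s + d_c)) is an assumed
    kernel, not the patent's exact formula (3)."""
    N = len(means)
    A = [[0.0] * N for _ in range(N)]
    for i in range(N):
        d_c = [math.dist(centroids[i], centroids[j]) for j in range(N)]
        order = sorted((j for j in range(N) if j != i), key=lambda j: d_c[j])
        for j in order[:K]:                    # K nearest neighbors of v_i
            d_s = abs(means[i] - means[j])     # pixel-mean distance
            w = math.exp(-(d_s + d_c[j]))
            A[i][j] = A[j][i] = w              # symmetric edge weight
    return A

means = [0.0, 0.1, 5.0]                 # hypothetical superpixel pixel means
centroids = [(0, 0), (0, 1), (4, 4)]    # hypothetical centroid positions
A = adjacency_weight_matrix(means, centroids, K=1)
```

Nodes that are close both spectrally and spatially thus receive the largest weights, matching the intent stated in step 4.5.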
further, the specific implementation of step 5 includes the following steps:
step 5.1, initializing model parameters of graph convolution neural network model by using Xavier method
Figure 905153DEST_PATH_IMAGE037
Step 5.2, calculating a degree matrix D of each graph node according to the region adjacency graph G constructed in the step 4,
Figure 643302DEST_PATH_IMAGE038
step 5.3, calculating the characteristic H of each layer of the graph convolution neural network GCN in the graph convolution neural network model according to the following formula:
Figure 553489DEST_PATH_IMAGE039
(4)
wherein,
Figure 275457DEST_PATH_IMAGE040
Wfor the matrix of weight parameters that can be learned,
Figure 30924DEST_PATH_IMAGE041
is an activation function, andlwhen the value is not less than 0, the reaction time is not less than 0,
Figure 656202DEST_PATH_IMAGE042
Xis a feature matrix;
step 5.4, in the training phase, the adjustment is carried out through graph convolution and micro-poolingWTo reduce the error on a continuous basis to optimize the output, the loss function is calculated by:
Figure 737291DEST_PATH_IMAGE043
(5)
wherein,
Figure 680976DEST_PATH_IMAGE044
is a training sample
Figure 240134DEST_PATH_IMAGE045
The real label of (a) is,sin order to train the number of samples,Lis a loss function;
step 5.5, according to the loss functionLThe gradient of the whole graph convolution neural network model is adjusted through back propagation
Figure 218454DEST_PATH_IMAGE046
Using the obtained parameter as the network initialization parameter in step 5.1, continuously iterating step 5.1 to step 5.5 until the graph volumeThe analytical precision of the product neural network model to the medicine components tends to be stable.
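A minimal sketch of the layer propagation in step 5.3, assuming formula (4) takes the standard symmetric-normalization form H' = σ(D^(-1/2) A D^(-1/2) H W) with σ = ReLU (the patent's formula image is lost, so this form is an assumption; the toy graph and weights are hypothetical):

```python
import math

def gcn_layer(A, H, W):
    """One graph-convolution layer: H' = relu(D^-1/2 A D^-1/2 H W),
    where D is the degree matrix D_ii = sum_j A_ij (step 5.2)."""
    N = len(A)
    deg = [sum(row) for row in A]                        # degree diagonal
    d_inv = [1.0 / math.sqrt(d) if d > 0 else 0.0 for d in deg]
    # symmetric normalization: A_hat = D^-1/2 A D^-1/2
    A_hat = [[d_inv[i] * A[i][j] * d_inv[j] for j in range(N)]
             for i in range(N)]

    def matmul(P, Q):
        return [[sum(P[i][k] * Q[k][j] for k in range(len(Q)))
                 for j in range(len(Q[0]))] for i in range(len(P))]

    Z = matmul(matmul(A_hat, H), W)
    return [[max(0.0, z) for z in row] for row in Z]     # ReLU activation

# toy graph: 3 superpixel nodes with self-loops, 2 features, identity weights
A = [[1.0, 1.0, 0.0],
     [1.0, 1.0, 1.0],
     [0.0, 1.0, 1.0]]
H = [[1.0, 0.0],
     [0.0, 1.0],
     [1.0, 1.0]]
W = [[1.0, 0.0],
     [0.0, 1.0]]
H1 = gcn_layer(A, H, W)
```

Stacking such layers, with H^(0) = X, yields the per-layer features of step 5.3.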
Further, the pixel mean distance d_s(i,j) between superpixels in step 4.3 is calculated by the following formula:
d_s(i,j) = ||h_i - h_j||_2 (1)
in the formula, h_i denotes the pixel mean of the i-th superpixel, and h_j denotes the pixel mean of the j-th superpixel.
Further, the centroid coordinate distance d_c(i,j) between superpixels in step 4.4 is calculated by the following formula:
d_c(i,j) = sqrt((x_i - x_j)^2 + (y_i - y_j)^2) (2)
in the formula, c_i = (x_i, y_i) denotes the centroid of the i-th superpixel, c_j = (x_j, y_j) denotes the centroid of the j-th superpixel, x_i and y_i denote the abscissa and ordinate of the centroid of the i-th superpixel, and x_j and y_j denote the abscissa and ordinate of the centroid of the j-th superpixel.
Further, the adjacency weight matrix A in step 4.5 is calculated by the following formula:
A_ij = exp(-(d_s(i,j) + d_c(i,j))) if v_j is among the K nearest neighbors of v_i, and A_ij = 0 otherwise (3).
therefore, the method for realizing the hyperspectral medical component analysis of the atlas neural network comprises the steps of firstly, obtaining a medical hyperspectral image, and constructing a medical hyperspectral data set comprising a training set and a testing set; secondly, segmenting the medical hyperspectral images in the training set by utilizing a superpixel segmentation algorithm to obtain mutually non-overlapping superpixels; then, respectively counting the pixel mean value, the centroid pixel position, the perimeter, the area and the region azimuth angle of each super pixel, and the characteristic parameters of the distance from the centroid pixel to the boundary of each super pixel region, and constructing a characteristic matrix of the graph data; then, each super pixel is taken as a graph node, the nearest neighbor super pixel is taken as an edge, a region adjacency graph is constructed, and an adjacency weight matrix of graph data is obtained; thirdly, inputting the feature matrix, the adjacent weight matrix and the medical hyperspectral component labels corresponding to the medical hyperspectral images in the training set into a atlas neural network for training to obtain model parameters of the atlas neural network; and finally, steps 2 to 4, obtaining a region adjacency graph needing to be subjected to medicine component analysis, obtaining a feature matrix and an adjacency weight matrix of the region adjacency graph needing to be subjected to medicine component analysis, and inputting the feature matrix and the adjacency weight matrix obtained in the test set into a graph convolution neural network model initialized by the model parameters trained in step 5 to obtain a medicine component analysis result. 
Compared with the prior art, on one hand, the medical hyperspectral image data are processed into graph data, so that the number of pixels, and hence the data volume, is greatly reduced; on the other hand, the characteristic information of the medicine is extracted by the graph convolution neural network model, which effectively learns the spatial relationship between the visual characteristics and the medicine components in the medicine hyperspectral image, improves the expression capability of the medicine component classification features, improves the accuracy of predicting the components and attributes of the detected medicine, overcomes the problems of numerous medicine types, complex compositions and differing physical characteristics, and realizes nondestructive medicine component analysis and rapid quality detection.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an embodiment of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a flowchart of a method for implementing hyperspectral pharmaceutical composition analysis by a graph convolution neural network according to an embodiment of the present invention;
FIG. 2 is a flowchart of a method for implementing hyperspectral pharmaceutical composition analysis by a graph convolution neural network according to a second embodiment of the present invention;
FIG. 3 is a flowchart illustrating an embodiment of a process for obtaining an adjacency weight matrix;
FIG. 4 is a schematic structural framework diagram of the graph convolution neural network model according to an embodiment of the present invention;
FIG. 5 is a sample schematic diagram of a portion of a hyperspectral pharmaceutical composition analysis dataset according to an embodiment of the invention.
Detailed Description
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 is a flowchart of a method for implementing hyperspectral pharmaceutical composition analysis by a graph convolution neural network according to an embodiment of the present invention. As shown in FIG. 1, the method for realizing hyperspectral medical component analysis with a graph convolution neural network is realized by the following steps:
step 1, acquiring a medical hyperspectral image, and constructing a medical hyperspectral data set, wherein the medical hyperspectral data set comprises a training set and a testing set;
step 2, segmenting the medical hyperspectral images in the training set by using a superpixel segmentation algorithm to obtain mutually non-overlapping superpixels, wherein the mutually non-overlapping superpixels form a medical hyperspectral superpixel set;
step 3, respectively counting the pixel mean value, the centroid pixel position, the perimeter, the area and the region azimuth angle of each super pixel, and the characteristic parameters of the distance from the centroid pixel to the boundary of each super pixel region, and constructing a characteristic matrix of the graph data;
step 4, constructing a region adjacency graph by taking each super pixel as a graph node and the nearest neighbor super pixel as an edge, and obtaining an adjacency weight matrix of graph data;
step 5, inputting the feature matrix, the adjacency weight matrix and the medical hyperspectral component labels corresponding to the medical hyperspectral images in the training set into a graph convolution neural network for training to obtain model parameters of the graph convolution neural network;
and 6, repeating the steps 2 to 4 on the medical hyperspectral images in the test set to obtain a region adjacency graph needing to be subjected to medicine component analysis, obtaining a feature matrix and an adjacency weight matrix of the region adjacency graph needing to be subjected to the medicine component analysis, and inputting the feature matrix and the adjacency weight matrix obtained in the test set into a graph convolution neural network model initialized by the model parameters trained in the step 5 to obtain a medicine component analysis result.
Firstly, medical hyperspectral images are acquired and a medical hyperspectral data set comprising a training set and a test set is constructed; secondly, the medical hyperspectral images in the training set are segmented with a superpixel segmentation algorithm to obtain mutually non-overlapping superpixels; then, the pixel mean, centroid pixel position, perimeter, area and region azimuth of each superpixel, together with the distances from the centroid pixel to the superpixel region boundary, are counted as characteristic parameters to construct the feature matrix of the graph data; then, with each superpixel as a graph node and its nearest neighboring superpixels as edges, a region adjacency graph is constructed and the adjacency weight matrix of the graph data is obtained; thirdly, the feature matrix, the adjacency weight matrix and the medical hyperspectral component labels corresponding to the medical hyperspectral images in the training set are input into the graph convolution neural network for training to obtain its model parameters; and finally, steps 2 to 4 are repeated on the medical hyperspectral images in the test set to obtain the region adjacency graph requiring medicine component analysis together with its feature matrix and adjacency weight matrix, which are input into the graph convolution neural network model initialized with the model parameters trained in step 5 to obtain the medicine component analysis result.
Compared with the prior art, the method can accurately analyze different components of the medicine sample in the medical hyperspectral image, solves the problems of various medicine types, complex composition, different physical characteristics and the like, and realizes nondestructive medicine component analysis and rapid quality detection.
Referring to figs. 2 to 4, FIG. 2 is a flowchart of a method for implementing hyperspectral pharmaceutical composition analysis by a graph convolution neural network according to a second embodiment of the present invention; FIG. 3 is a flowchart of the acquisition process of the adjacency weight matrix in the embodiment of the present invention; and FIG. 4 is a schematic structural framework diagram of the graph convolution neural network model in the embodiment of the present invention.
A method for realizing hyperspectral medical component analysis of a graph convolution neural network comprises the following steps:
1.1, preparing a plurality of different drug samples;
it should be noted that, in this example, experiments were performed on seven kinds of drug samples, including cefprozil tablets, oxytetracycline tablets, chlorphenamine maleate tablets, furosemide tablets, aspirin enteric tablets, peruviol tablets, and callicarpa nudiflora dispersible tablets, but the number and kind of the drugs are not limited thereto. Fig. 5 is a sample diagram of a part of a hyperspectral pharmaceutical ingredient analysis dataset of a cefprozil tablet, a chlorphenamine maleate tablet and a callicarpa nudiflora dispersible tablet, specifically, in fig. 5, (a) shows a sample diagram of a callicarpa nudiflora dispersible tablet, (b) shows a sample diagram of a cefprozil tablet, and (c) shows a sample diagram of a chlorphenamine maleate tablet.
Step 1.2, acquiring medicine hyperspectral images and constructing a medicine hyperspectral data set Q: acquiring medical hyperspectral images of the drug samples with a hyperspectral sorter, performing reflectivity correction on the acquired medical hyperspectral images, and taking each corrected image as a sample of the medical hyperspectral data set;
in this process, the hyperspectral sorter preferably adopts the Sichuan Lianghe spectrum hyperspectral sorter (V10E, N25E-SWIR), with spectral ranges of 400-1000 nm and 1000-2500 nm respectively;
step 1.3, randomly partitioning the medical hyperspectral data set Q = {(x_i, y_i)}_{i=1}^{d} into a training set T = {(x_i^t, y_i^t)}_{i=1}^{s} and a test set E = {(x_i^e, y_i^e)}_{i=1}^{m}, where x_i is the image of the i-th sample in Q, y_i is the drug component label corresponding to the i-th sample in Q, x_i^t and y_i^t are the image and drug component label of the i-th sample in the training set T, x_i^e and y_i^e are the image and drug component label of the i-th sample in the test set E, d represents the total number of samples in Q, s represents the total number of samples in the training set T, and m represents the total number of samples in the test set E;
step 2, segmenting the medical hyperspectral images in the training set by utilizing a superpixel segmentation algorithm to obtain mutually non-overlapping superpixels, wherein the mutually non-overlapping superpixels form a medical hyperspectral superpixel set;
preferably, this step is embodied as: the medical hyperspectral images in the training set are divided by adopting the Simple Linear Iterative Clustering (SLIC) algorithm, the superpixel clustering centers and boundary ranges are iteratively updated by calculating the spatial distance and the spectral distance between pixel points, and iteration stops when the error between the new and old clustering centers is smaller than a preset threshold value, thereby obtaining mutually non-overlapping superpixels which form the medical hyperspectral superpixel set V = {v_1, v_2, ..., v_N}, where v_i is the i-th superpixel and N is the number of mutually non-overlapping superpixels;
step 3, respectively counting the pixel mean, the centroid pixel position, the perimeter, the area and the region azimuth angle of each superpixel, together with the distances from the centroid pixel to the superpixel region boundary, as characteristic parameters, and constructing the feature matrix of the graph data;
specifically, this step is expressed as: for each superpixel v_i obtained in step 2, its pixel mean g_i, centroid pixel c_i with position (u_i, w_i), perimeter p_i, area a_i, region azimuth angle θ_i, and the distances d_i1, ..., d_i8 from the centroid pixel c_i to the superpixel region boundary in the 8 directions east, south, west, north, southeast, northeast, southwest and northwest are obtained, thereby obtaining the feature matrix X, X ∈ R^{N×M}, where N is the number of superpixels, M is the feature dimension, and R denotes the set of real numbers;
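A minimal sketch of the feature-matrix construction described above, assuming a superpixel label map and hyperspectral cube as produced in step 2; only the pixel mean, centroid position and area are computed here (the perimeter, azimuth angle and 8 boundary distances are omitted for brevity, and the toy `labels`/`cube` arrays are stand-ins):

```python
import numpy as np

rng = np.random.default_rng(1)
cube = rng.random((32, 32, 5))  # H x W x bands, toy stand-in
# 16 square "superpixels" of 8x8 pixels each, standing in for SLIC output
labels = np.arange(32)[:, None] // 8 * 4 + np.arange(32)[None, :] // 8

N = labels.max() + 1
rows, cols = np.indices(labels.shape)
feats = []
for i in range(N):
    mask = labels == i
    mean_spectrum = cube[mask].mean(axis=0)        # per-band pixel mean g_i
    cy, cx = rows[mask].mean(), cols[mask].mean()  # centroid pixel position
    area = mask.sum()                              # region area
    feats.append(np.concatenate([mean_spectrum, [cy, cx, area]]))
X = np.stack(feats)  # feature matrix, one row per superpixel (N x M)
print(X.shape)       # → (16, 8)
```

In the full method each row would additionally carry the perimeter, azimuth angle and the 8 centroid-to-boundary distances, e.g. via `skimage.measure.regionprops`.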
step 4, constructing a region adjacency graph by taking each superpixel as a graph node and its nearest-neighbor superpixels as edges, and obtaining the adjacency weight matrix of the graph data; specifically, referring to fig. 3, this step is decomposed into the following processes:
step 4.1, according to the medical hyperspectral superpixel set V obtained in step 2, each superpixel v_i in the set forms an individual graph node; a K-nearest-neighbor algorithm is adopted to select the K superpixel points nearest to v_i, and edges are constructed to them, thereby forming the region adjacency graph G, where K takes the value 8;
step 4.2, according to each superpixel region in the medical hyperspectral superpixel set V obtained in step 2, the adjacent superpixels of each superpixel region are counted to obtain the adjacent superpixel set N(v_i);
step 4.3, according to the pixel mean g_i of superpixel v_i obtained in step 3, the pixel mean distance d_g(i,j) between the superpixels is calculated; the pixel mean distance d_g(i,j) between the superpixels is calculated from the following formula:
d_g(i,j) = ‖g_i − g_j‖ (1)
where g_i denotes the pixel mean of the i-th superpixel and g_j denotes the pixel mean of the j-th superpixel;
step 4.4, according to the centroid pixel c_i with position (u_i, w_i) obtained in step 3, the inter-superpixel centroid coordinate distance d_c(i,j) is calculated; the inter-superpixel centroid coordinate distance d_c(i,j) is calculated from the following formula:
d_c(i,j) = √((u_i − u_j)² + (w_i − w_j)²) (2)
where c_i denotes the centroid of the i-th superpixel, c_j denotes the centroid of the j-th superpixel, u_i and w_i denote the abscissa and ordinate of the i-th superpixel centroid, and u_j and w_j denote the abscissa and ordinate of the j-th superpixel centroid;
step 4.5, from the pixel mean distance d_g(i,j) obtained in step 4.3 and the inter-superpixel centroid coordinate distance d_c(i,j) obtained in step 4.4, the adjacency weight matrix A is calculated, A ∈ R^{N×N}; each entry A_ij of the adjacency weight matrix A is a weight determined jointly by d_g(i,j) and d_c(i,j) when v_j is adjacent to v_i, and 0 otherwise (3);
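The distance computations of equations (1) and (2) and the K-nearest-neighbor weighting can be sketched as follows; because equation (3) survives only as a figure in the source, the Gaussian kernel combining the two distances is an assumed form, and the random `means`/`cents` arrays stand in for real superpixel statistics:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 20
means = rng.random((N, 5))       # per-superpixel pixel means g_i
cents = rng.random((N, 2)) * 32  # centroid coordinates (u_i, w_i)

# Eq. (1): Euclidean distance between superpixel pixel means
d_g = np.linalg.norm(means[:, None] - means[None, :], axis=-1)
# Eq. (2): Euclidean distance between superpixel centroids
d_c = np.linalg.norm(cents[:, None] - cents[None, :], axis=-1)

K, sigma = 8, 10.0
A = np.zeros((N, N))
for i in range(N):
    nbrs = np.argsort(d_c[i])[1:K + 1]  # K nearest neighbours, self excluded
    # Gaussian kernel of the two distances: an ASSUMED form of eq. (3).
    A[i, nbrs] = np.exp(-(d_g[i, nbrs] ** 2 + (d_c[i, nbrs] / 32) ** 2) / sigma)
A = np.maximum(A, A.T)  # symmetrise so the region adjacency graph is undirected
print(A.shape)
```

Whatever the exact kernel, the resulting A is sparse (each node keeps roughly K neighbours), which is what makes the graph far cheaper to process than the raw pixel grid.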
step 5, inputting the feature matrix, the adjacency weight matrix and the medical hyperspectral component labels corresponding to the medical hyperspectral images in the training set into the graph convolution neural network for training to obtain the model parameters of the graph convolution neural network; FIG. 4 is a schematic structural framework diagram of the graph convolution neural network model according to an embodiment of the present invention;
step 6, repeating steps 2 to 4 on the medical hyperspectral images in the test set to obtain the region adjacency graph requiring medicine component analysis together with its feature matrix and adjacency weight matrix, and inputting the feature matrix and adjacency weight matrix obtained on the test set into a graph convolution neural network model initialized with the model parameters trained in step 5 to obtain the medicine component analysis result.
As a preferred embodiment of the invention, a K-fold cross-validation method is adopted to perform the division of the medical hyperspectral dataset T into the training set T_tr and the test set T_te in step 1.3, wherein K takes the value 10.
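The K-fold division (K = 10) can be sketched with scikit-learn's `KFold`; the sample count `d = 50` is a toy stand-in for the real dataset size:

```python
import numpy as np
from sklearn.model_selection import KFold

d = 50  # stand-in for the total number of samples in T
kf = KFold(n_splits=10, shuffle=True, random_state=0)
for fold, (train_idx, test_idx) in enumerate(kf.split(np.arange(d))):
    # train_idx / test_idx index the samples of T_tr and T_te for this fold
    assert len(train_idx) + len(test_idx) == d
print(fold + 1)  # → 10
```

Each of the 10 folds serves once as the test set T_te, so every sample is evaluated exactly once.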
Meanwhile, in a further technical scheme, the specific implementation of step 5 (inputting the feature matrix, the adjacency weight matrix and the medical hyperspectral component labels corresponding to the medical hyperspectral images in the training set into the graph convolution neural network for training to obtain the model parameters of the graph convolution neural network) comprises the following steps:
step 5.1, initializing the model parameters W of the graph convolution neural network model by using the Xavier method; it should be noted that the Xavier method is an effective neural network parameter initialization method whose main purpose is to make the output variance of each layer of the neural network as equal as possible;
step 5.2, calculating the degree matrix D of the graph nodes according to the region adjacency graph G constructed in step 4, with D_ii = Σ_j A_ij;
step 5.3, calculating the feature H of each layer of the graph convolution neural network GCN in the graph convolution neural network model according to the following formula:
H^(l+1) = σ(D^(−1/2) Â D^(−1/2) H^(l) W^(l)) (4)
where Â = A + I_N, W is the learnable weight parameter matrix, σ(·) is the activation function, and when l = 0, H^(0) = X, with X being the feature matrix;
step 5.4, in the training phase, W is adjusted through graph convolution and differentiable pooling to continuously reduce the error and thereby optimize the output; the training objective is calculated by the following formula:
min_W (1/s) Σ_{i=1}^{s} L(y_i, ŷ_i) (5)
where y_i is the real label of training sample x_i, s is the number of training samples, and L is the loss function; L is calculated using the following cross-entropy formula:
L(y_i, ŷ_i) = −Σ_c y_{i,c} log ŷ_{i,c} (6)
where y_{i,c} is the real component of training sample x_i for class c, ŷ_{i,c} is the predicted component, and the loss is averaged over the s samples as in (5).
The graph convolution neural network model in fig. 4 includes the graph convolution layer, the graph pooling layer and the output layer.
step 5.5, back-propagating the gradient ∂L/∂W of the loss function L through the whole graph convolution neural network model to adjust W, taking the adjusted parameters as the network initialization parameters in step 5.1, and continuously iterating steps 5.1 to 5.5 until the drug component analysis precision of the graph convolution neural network model tends to be stable.
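The per-layer propagation of equation (4), together with a Xavier-style initialization of W, can be sketched in NumPy as follows; the graph size, feature dimension, uniform-Xavier variant and ReLU activation are illustrative assumptions (the patent only names the Xavier method and an unspecified activation):

```python
import numpy as np

rng = np.random.default_rng(3)
N, M, C = 6, 4, 3                 # nodes, feature dim, output classes
X = rng.random((N, M))            # feature matrix H^(0) = X
A = (rng.random((N, N)) < 0.4).astype(float)
A = np.maximum(A, A.T)
np.fill_diagonal(A, 0)            # symmetric adjacency, no self-loops yet

A_hat = A + np.eye(N)             # Â = A + I_N
# D^{-1/2} from the degree matrix D_ii = Σ_j Â_ij (self-loops keep D_ii >= 1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))

# Xavier-uniform initialisation of the learnable weight matrix W (assumption)
limit = np.sqrt(6.0 / (M + C))
W = rng.uniform(-limit, limit, size=(M, C))

# Eq. (4): H^(1) = σ(D^{-1/2} Â D^{-1/2} H^(0) W), here with σ = ReLU
H1 = np.maximum(0.0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W)
print(H1.shape)  # → (6, 3)
```

The normalised propagation D^(−1/2) Â D^(−1/2) averages each node's features with those of its neighbours, which is how the model learns the spatial relationship between superpixels.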
Compared with the prior art, processing the medical hyperspectral image data into graph data makes the number of graph nodes far smaller than the number of pixels and effectively reduces the data volume; the graph convolution neural network extracts the characteristic information of the medicine, effectively learns the spatial relationship between the visual features and the medicine components in the medical hyperspectral image, improves the expressive power of the medicine component classification features and the precision of the detected components and attributes, and can realize nondestructive and rapid detection and analysis of medicine components and quality.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (9)

1. A method for realizing hyperspectral medical component analysis of a graph convolution neural network is characterized by comprising the following steps:
step 1, acquiring a medical hyperspectral image, and constructing a medical hyperspectral data set, wherein the medical hyperspectral data set comprises a training set and a testing set;
step 2, segmenting the medical hyperspectral images in the training set by utilizing a superpixel segmentation algorithm to obtain mutually non-overlapping superpixels, wherein the mutually non-overlapping superpixels form a medical hyperspectral superpixel set, the detailed expression being as follows: dividing the medical hyperspectral images in the training set by adopting an SLIC algorithm, iteratively updating the superpixel clustering centers and boundary ranges by calculating the spatial distance and the spectral distance between pixel points, and stopping iteration when the error between the new and old clustering centers is smaller than a preset threshold value, thereby obtaining mutually non-overlapping superpixels which form the medical hyperspectral superpixel set V = {v_1, v_2, ..., v_N}, where v_i is the i-th superpixel and N is the number of mutually non-overlapping superpixels;
step 3, respectively counting the pixel mean value, the centroid pixel position, the perimeter, the area and the region azimuth angle of each super pixel, and the characteristic parameters of the distance from the centroid pixel to the boundary of each super pixel region, and constructing a characteristic matrix of the graph data;
step 4, constructing a region adjacency graph by taking each superpixel as a graph node and its nearest-neighbor superpixels as edges, and obtaining the adjacency weight matrix of the graph data; specifically, according to the medical hyperspectral superpixel set V obtained in step 2, each superpixel v_i in the set forms an individual graph node, a K-nearest-neighbor algorithm is adopted to select the K superpixel points nearest to v_i, and edges are constructed so as to form the region adjacency graph G;
step 5, inputting the feature matrix, the adjacency weight matrix and the medical hyperspectral component labels corresponding to the medical hyperspectral images in the training set into the graph convolution neural network for training to obtain the model parameters of the graph convolution neural network;
step 6, repeating steps 2 to 4 on the medical hyperspectral images in the test set to obtain the region adjacency graph requiring medicine component analysis together with its feature matrix and adjacency weight matrix, and inputting the feature matrix and adjacency weight matrix obtained on the test set into a graph convolution neural network model initialized with the model parameters trained in step 5 to obtain the medicine component analysis result.
2. The method for realizing the hyperspectral medical component analysis of the graph convolution neural network according to claim 1, wherein the step 1 specifically comprises the following steps:
step 1.1, preparing a plurality of different drug samples;
step 1.2, acquiring medicine hyperspectral images and constructing a medical hyperspectral dataset T: acquiring a medical hyperspectral image of a medicine sample by adopting a hyperspectral sorter, performing reflectivity correction on the acquired medical hyperspectral image, and taking the corrected image as a sample of the medical hyperspectral dataset;
step 1.3, randomly partitioning the medical hyperspectral dataset T into a training set T_tr and a test set T_te, with T = T_tr ∪ T_te and T = {(x_i, y_i)}_{i=1}^{d}, where x_i is the i-th sample image in T and y_i is the drug component label corresponding to the i-th sample; T_tr = {(x_i^tr, y_i^tr)}_{i=1}^{s}, where x_i^tr is the i-th sample image in the training set T_tr and y_i^tr is the drug component label corresponding to the i-th sample; T_te = {(x_i^te, y_i^te)}_{i=1}^{m}, where x_i^te is the i-th sample image in the test set T_te and y_i^te is the drug component label corresponding to the i-th sample; d denotes the total number of samples in the medical hyperspectral dataset T, s denotes the total number of samples in the training set T_tr, and m denotes the total number of samples in the test set T_te.
3. The method for realizing the hyperspectral medical component analysis of the graph convolution neural network according to claim 2, wherein a K-fold cross-validation method is adopted to perform the division of the medical hyperspectral dataset T into the training set T_tr and the test set T_te in step 1.3.
4. The method for implementing hyperspectral medical component analysis of the graph convolution neural network according to claim 3, wherein the step 3 is embodied as: for each superpixel v_i obtained in step 2, obtaining its pixel mean g_i, centroid pixel c_i with position (u_i, w_i), perimeter p_i, area a_i, region azimuth angle θ_i, and the distances d_i1, ..., d_i8 from the centroid pixel c_i to the superpixel region boundary in the 8 directions east, south, west, north, southeast, northeast, southwest and northwest, thereby obtaining the feature matrix X, X ∈ R^{N×M}, where N is the number of superpixels, M is the feature dimension, and R denotes the set of real numbers.
5. The method for realizing the hyperspectral medical component analysis of the graph convolution neural network according to claim 4, wherein the concrete realization of the adjacency weight matrix in step 4 comprises the following steps:
step 4.1, according to the medical hyperspectral superpixel set V obtained in step 2, each superpixel v_i in the set forms an individual graph node, a K-nearest-neighbor algorithm is adopted to select the K superpixel points nearest to v_i, and edges are constructed so as to form the region adjacency graph G;
step 4.2, according to each superpixel region in the medical hyperspectral superpixel set V obtained in step 2, counting the adjacent superpixels of each superpixel region to obtain the adjacent superpixel set N(v_i);
step 4.3, according to the pixel mean g_i of the superpixel v_i obtained in step 3, calculating the pixel mean distance d_g(i,j) between the superpixels;
step 4.4, according to the centroid pixel c_i with position (u_i, w_i) obtained in step 3, calculating the inter-superpixel centroid coordinate distance d_c(i,j);
step 4.5, calculating the adjacency weight matrix A, A ∈ R^{N×N}, from the pixel mean distance d_g(i,j) obtained in step 4.3 and the inter-superpixel centroid coordinate distance d_c(i,j) obtained in step 4.4.
6. The method for realizing hyperspectral medical component analysis of the graph convolution neural network according to claim 5, wherein the concrete implementation of the step 5 comprises the following steps:
step 5.1, initializing the model parameters W of the graph convolution neural network model by using the Xavier method;
step 5.2, calculating the degree matrix D of the graph nodes according to the region adjacency graph G constructed in step 4, with D_ii = Σ_j A_ij;
step 5.3, calculating the feature H of each layer of the graph convolution neural network GCN in the graph convolution neural network model according to the following formula:
H^(l+1) = σ(D^(−1/2) Â D^(−1/2) H^(l) W^(l)) (4)
where Â = A + I_N, W is the learnable weight parameter matrix, σ(·) is the activation function, and when l = 0, H^(0) = X, with X being the feature matrix;
step 5.4, in the training phase, adjusting W through graph convolution and differentiable pooling to continuously reduce the error and optimize the output, the training objective being calculated by the following formula:
min_W (1/s) Σ_{i=1}^{s} L(y_i, ŷ_i) (5)
where y_i is the real label of training sample x_i, s is the number of training samples, and L is the loss function;
step 5.5, back-propagating the gradient ∂L/∂W of the loss function L through the whole graph convolution neural network model to adjust W, taking the adjusted parameters as the network initialization parameters in step 5.1, and continuously iterating steps 5.1 to 5.5 until the drug component analysis precision of the graph convolution neural network model tends to be stable.
7. The method for implementing the hyperspectral medical component analysis of the graph convolution neural network of claim 5, wherein the pixel mean distance d_g(i,j) between the superpixels in step 4.3 is calculated from the following formula:
d_g(i,j) = ‖g_i − g_j‖ (1)
where g_i denotes the pixel mean of the i-th superpixel and g_j denotes the pixel mean of the j-th superpixel.
8. The method for performing hyperspectral medical component analysis according to claim 7, wherein the inter-superpixel centroid coordinate distance d_c(i,j) in step 4.4 is calculated from the following formula:
d_c(i,j) = √((u_i − u_j)² + (w_i − w_j)²) (2)
where c_i denotes the centroid of the i-th superpixel, c_j denotes the centroid of the j-th superpixel, u_i and w_i denote the abscissa and ordinate of the i-th superpixel centroid, and u_j and w_j denote the abscissa and ordinate of the j-th superpixel centroid.
9. The method for implementing the hyperspectral medical component analysis of the graph convolution neural network of claim 8, wherein the adjacency weight matrix A in step 4.5 is calculated by formula (3), each entry A_ij being a weight determined jointly by the pixel mean distance d_g(i,j) and the inter-superpixel centroid coordinate distance d_c(i,j) when v_j is adjacent to v_i, and 0 otherwise.
CN202110811547.XA 2021-07-19 2021-07-19 Method for realizing hyperspectral medical component analysis of graph convolution neural network Active CN113269196B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110811547.XA CN113269196B (en) 2021-07-19 2021-07-19 Method for realizing hyperspectral medical component analysis of graph convolution neural network
PCT/CN2022/076023 WO2023000653A1 (en) 2021-07-19 2022-02-11 Method for implementing hyperspectral medical component analysis by using graph convolutional neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110811547.XA CN113269196B (en) 2021-07-19 2021-07-19 Method for realizing hyperspectral medical component analysis of graph convolution neural network

Publications (2)

Publication Number Publication Date
CN113269196A CN113269196A (en) 2021-08-17
CN113269196B true CN113269196B (en) 2021-09-28

Family

ID=77236799

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110811547.XA Active CN113269196B (en) 2021-07-19 2021-07-19 Method for realizing hyperspectral medical component analysis of graph convolution neural network

Country Status (2)

Country Link
CN (1) CN113269196B (en)
WO (1) WO2023000653A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113269196B (en) * 2021-07-19 2021-09-28 湖南大学 Method for realizing hyperspectral medical component analysis of graph convolution neural network
CN113989525B (en) * 2021-12-24 2022-03-29 湖南大学 Hyperspectral traditional Chinese medicinal material identification method of self-adaptive random block convolution kernel network
CN115825316B (en) * 2023-02-15 2023-06-16 武汉宏韧生物医药股份有限公司 Method and device for analyzing active ingredients of medicine based on supercritical chromatography
CN115979973B (en) * 2023-03-20 2023-06-16 湖南大学 Hyperspectral Chinese herbal medicine identification method based on dual-channel compressed attention network
CN116563711B (en) * 2023-05-17 2024-02-09 大连民族大学 Hyperspectral target detection method of binary encoder network based on momentum update
CN116429710B (en) * 2023-06-15 2023-09-26 武汉大学人民医院(湖北省人民医院) Drug component detection method, device, equipment and readable storage medium
CN116612333B (en) * 2023-07-17 2023-09-29 山东大学 Medical hyperspectral image classification method based on rapid full convolution network
CN116662593B (en) * 2023-07-21 2023-10-27 湖南大学 FPGA-based full-pipeline medical hyperspectral image neural network classification method
CN117333486B (en) * 2023-11-30 2024-03-22 清远欧派集成家居有限公司 UV finish paint performance detection data analysis method, device and storage medium
CN118096536B (en) * 2024-04-29 2024-06-21 中国科学院长春光学精密机械与物理研究所 Remote sensing hyperspectral image super-resolution reconstruction method based on hypergraph neural network

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106022353A (en) * 2016-05-05 2016-10-12 浙江大学 Image semantic annotation method based on super pixel segmentation
WO2020165913A1 (en) * 2019-02-12 2020-08-20 Tata Consultancy Services Limited Automated unsupervised localization of context sensitive events in crops and computing extent thereof
CN111681249A (en) * 2020-05-14 2020-09-18 中山艾尚智同信息科技有限公司 Grabcut-based sandstone particle improved segmentation algorithm research
CN112381813A (en) * 2020-11-25 2021-02-19 华南理工大学 Panorama visual saliency detection method based on graph convolution neural network
CN112446417A (en) * 2020-10-16 2021-03-05 山东大学 Spindle-shaped fruit image segmentation method and system based on multilayer superpixel segmentation
CN113095305A (en) * 2021-06-08 2021-07-09 湖南大学 Hyperspectral classification detection method for medical foreign matters

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10748036B2 (en) * 2017-11-21 2020-08-18 Nvidia Corporation Training a neural network to predict superpixels using segmentation-aware affinity loss
CN111695636B (en) * 2020-06-15 2023-07-14 北京师范大学 Hyperspectral image classification method based on graph neural network
CN113269196B (en) * 2021-07-19 2021-09-28 湖南大学 Method for realizing hyperspectral medical component analysis of graph convolution neural network

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106022353A (en) * 2016-05-05 2016-10-12 浙江大学 Image semantic annotation method based on super pixel segmentation
WO2020165913A1 (en) * 2019-02-12 2020-08-20 Tata Consultancy Services Limited Automated unsupervised localization of context sensitive events in crops and computing extent thereof
CN111681249A (en) * 2020-05-14 2020-09-18 中山艾尚智同信息科技有限公司 Grabcut-based sandstone particle improved segmentation algorithm research
CN112446417A (en) * 2020-10-16 2021-03-05 山东大学 Spindle-shaped fruit image segmentation method and system based on multilayer superpixel segmentation
CN112381813A (en) * 2020-11-25 2021-02-19 华南理工大学 Panorama visual saliency detection method based on graph convolution neural network
CN113095305A (en) * 2021-06-08 2021-07-09 湖南大学 Hyperspectral classification detection method for medical foreign matters

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"Multiscale Dynamic Graph Convolutional Network for Hyperspectral Image Classification"; Sheng Wan et al.; IEEE Transactions on Geoscience and Remote Sensing; May 31, 2020; pp. 3162-3177 *
"Stack Attention-Pruning Aggregates Multiscale Graph Convolution Networks for Hyperspectral Remote Sensing Image Classification"; Na Liu et al.; IEEE; March 26, 2021; pp. 44974-44988 *
"Hyperspectral Image Classification Based on Graph Models" (in Chinese); Chen Yi; Master's Thesis Repository; December 31, 2020; pp. 1-59 *
"Graph Convolutional Neural Networks Based on Attention and Feature Reuse Mechanisms" (in Chinese); Yang Meng; Master's Thesis Repository; December 31, 2020; pp. 1-81 *

Also Published As

Publication number Publication date
WO2023000653A1 (en) 2023-01-26
CN113269196A (en) 2021-08-17


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant