CN110584605B - Similarity-matched diagnosis and monitoring comprehensive medical system and matching method thereof - Google Patents

Info

Publication number: CN110584605B
Application number: CN201910856897.0A
Authority: CN (China)
Other versions: CN110584605A (Chinese)
Prior art keywords: case, image, diagnosis, sequence, point
Inventor: 贾英 (Jia Ying)
Assignee: Jiangsu Hengsheng Information Technology Co., Ltd.
Legal status: Active (granted; the status is an assumption by Google Patents and not a legal conclusion)

Classifications

    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/021 Measuring pressure in heart or blood vessels
    • A61B 5/145 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
    • A61B 5/318 Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • A61B 6/03 Computed tomography [CT]
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 50/20 ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems


Abstract

A similarity-matched diagnosis and monitoring comprehensive medical system comprises a client, a workstation, a cloud server, an image acquisition device and a physiological parameter sensor. A patient uploads his or her basic information to the cloud server through the client, and a decision-making user uploads case profile information, the patient's chief symptoms and physical signs, test results, pathological information, a basic judgment of the patient's condition, treatment means, postoperative symptoms, and nursing suggestions and strategies to the cloud server through the workstation. The cloud server stores a diagnosis and treatment case library and combines the uploaded information into cases. The decision-making user searches the diagnosis and treatment case library through the workstation and, by similarity matching, finds the most similar case in the library.

Description

Similarity-matched diagnosis and monitoring comprehensive medical system and matching method thereof
Technical Field
The invention belongs to the field of comprehensive medical treatment, and particularly relates to a similarity-matched diagnosis and monitoring comprehensive medical system.
Background
Diagnosis and treatment decisions in the medical community at home and abroad, and diagnostic decision methods in particular, are mostly based on the results of examining patients with various medical instruments: doctors analyse and judge the state of an illness according to their own medical knowledge and clinical experience accumulated over many years, and make corresponding decisions. The effectiveness of such a decision method depends on the doctor's level of skill and is strongly influenced by subjective factors.
Medical image fusion techniques offer superior performance in integrating anatomical and functional information from different imaging modalities, which information facilitates accurate diagnosis of disease even at an early stage. Therefore, multi-modal medical image fusion plays a crucial role in information integration in the medical field.
Medical data is often highly sensitive and private; if improperly used or disclosed, it may compromise patients' privacy and even endanger their lives. The security, privacy and confidentiality of personal health information are among the most important concerns for medical information systems, which therefore face high requirements for security and reliability.
Disclosure of Invention
The invention provides a similarity-matched diagnosis and monitoring integrated medical system, which comprises a client, a workstation, a cloud server, an image acquisition device and a physiological parameter sensor. The physiological parameter sensor transmits its data over a short-range wireless link to a network coordinator connected to the client; the client is connected to the cloud server through the internet; and the image acquisition device is connected to the workstation, which uploads the acquired images to the cloud server. A patient user uploads his or her basic information to the cloud server through the client, and a decision-making user uploads case profile information, the patient user's chief symptoms and signs, test results, pathological information, a basic judgment of the patient's condition, treatment means, postoperative symptoms, and nursing suggestions and strategies to the cloud server through the workstation. The cloud server stores a diagnosis and treatment case library and combines the uploaded information into cases. The decision-making user searches the case library through the workstation and, by similarity matching, finds the most similar case in the library. The image acquisition device comprises a CT machine, an ultrasonic instrument and a nuclear magnetic resonance instrument; the physiological parameter sensor is a wearable device comprising a heart rate sensor, a sphygmomanometer, an oximeter and an accelerometer; and the workstation fuses the acquired ultrasonic and nuclear magnetic resonance images using an image fusion method and uploads the fused image to the cloud server.
The decision-making user searches the diagnosis and treatment case library through the workstation, finds out the most similar case in the diagnosis and treatment case library through similarity matching, and the similarity matching method specifically comprises the following steps:
step 1, defining characteristic attributes of a source decision case according to medical clinical diagnosis and treatment characteristics, and establishing a diagnosis and treatment case library;
step 2, after the patient user visits a doctor, inputting characteristic attribute information through a human-computer interaction interface, and extracting a characteristic vector;
wherein the feature vectors include:
a case profile information vector, whose characteristic attributes include the case number, case name, time of occurrence of the case, attending physician's name, resident's name and caregiver's name;
a patient basic information vector, whose characteristic attributes include the patient user's sex, age, height, weight, family history, health history, drug allergy history, admission date and discharge date;
a vector of the patient user's chief symptoms and signs, whose characteristic attributes include the patient user's subjective abnormal sensations and physical signs;
a test result vector, whose characteristic attributes include blood routine, immune panel, coagulogram, urine routine, blood gas analysis, B-mode ultrasonography, CT examination, electrocardiogram, blood pressure and blood oxygen saturation;
a pathological information vector, whose characteristic attributes include the tumor size, the number of invaded lymph nodes, whether nodules are present, the degree of malignancy, and the position, quadrant, distribution characteristics and growth characteristics of the tumor mass;
a basic judgment vector of the patient user's condition, comprising the decision-making user's subjective judgment, i.e. characteristic attributes including malignant, benign, prone to recurrence and unlikely to recur;
a treatment vector, whose characteristic attributes include the disease description, the diagnostic process, whether treatment was given, the treatment process, the treatment effect and intermediate examinations;
a postoperative symptom vector, i.e. characteristic attributes describing the patient user's symptoms after treatment (for example, for a breast cancer patient);
a vector of care recommendations and strategies for the patient user's specific illness and symptom characteristics.
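For illustration only, the feature vectors listed above can be grouped into a single case record. This is a minimal sketch in plain Python; every field name and value here is a hypothetical stand-in, not taken from the patent.

```python
# Hypothetical representation of one diagnosis-and-treatment case as plain data.
# All field names are illustrative stand-ins for the attributes listed above.

def make_case():
    """Build a case record grouping the nine feature vectors described above."""
    return {
        "profile": {"case_number": "C-0001", "case_name": "example",
                    "attending_physician": "Dr. A"},
        "basic_info": {"sex": "F", "age": 52, "height_cm": 162, "weight_kg": 58},
        "symptoms_and_signs": ["subjective abnormal sensation"],
        "test_results": {"blood_pressure": "120/80", "spo2": 0.98},
        "pathology": {"tumor_size_cm": 1.8, "invaded_lymph_nodes": 2},
        "basic_judgment": "benign",
        "treatment": {"treated": True, "effect": "good"},
        "postoperative_symptoms": [],
        "care": ["routine follow-up"],
    }

case = make_case()
print(sorted(case))  # the nine feature-vector groups
```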
Step 3, inputting the characteristic vector, and finding a relevant case in a diagnosis and treatment case library by a fuzzy diagnosis and treatment knowledge finding program;
step 4, acquiring the optimal weight of each characteristic attribute in each characteristic vector through a genetic algorithm; the method specifically comprises the following steps:
step 4.1, encoding the characteristic attributes;
step 4.2, generating initial weights for all the characteristic attributes;
step 4.3, calculating the fitness and retaining the preferred chromosomes;
step 4.4, finding an accurate value;
step 4.5, performing reproduction and selection using the roulette-wheel selection method;
step 4.6, performing crossover;
step 4.7, performing mutation;
step 4.8, selecting the preferred chromosomes;
step 4.9, replacing the existing chromosomes;
step 4.10, if the number of evolutionary iterations has reached the upper limit, stopping the iteration and proceeding to step 4.11; otherwise, returning to step 4.3 to continue iterating;
step 4.11, decoding the current chromosomes to generate the optimal weights;
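Steps 4.1-4.11 can be sketched as a small real-coded genetic algorithm. The fitness function below is a toy surrogate (the patent does not spell one out), and the population size, generation count, mutation rate and `TARGET` weights are all illustrative assumptions:

```python
import random

# Toy GA for attribute weights: real-coded chromosomes normalised to sum to 1.
# fitness() is a hypothetical stand-in for the patent's retrieval-quality fitness.

random.seed(0)
N_ATTRS, POP, GENS = 5, 30, 40
TARGET = [0.4, 0.3, 0.15, 0.1, 0.05]  # hypothetical "true" weights

def normalise(w):
    s = sum(w)
    return [x / s for x in w]

def fitness(w):
    # Higher when w is close to TARGET (toy surrogate for retrieval accuracy).
    return 1.0 / (1e-9 + sum((a - b) ** 2 for a, b in zip(w, TARGET)))

def roulette(pop, fits):
    # Step 4.5: roulette-wheel selection proportional to fitness.
    r, acc = random.uniform(0, sum(fits)), 0.0
    for ind, f in zip(pop, fits):
        acc += f
        if acc >= r:
            return ind
    return pop[-1]

pop = [normalise([random.random() for _ in range(N_ATTRS)]) for _ in range(POP)]
best = max(pop, key=fitness)
for _ in range(GENS):                              # step 4.10: iterate to the limit
    fits = [fitness(ind) for ind in pop]
    best = max(pop + [best], key=fitness)          # steps 4.3/4.8: keep preferred chromosome
    nxt = [best]                                   # elitism (step 4.9)
    while len(nxt) < POP:
        a, b = roulette(pop, fits), roulette(pop, fits)
        cut = random.randrange(1, N_ATTRS)         # step 4.6: one-point crossover
        child = a[:cut] + b[cut:]
        if random.random() < 0.1:                  # step 4.7: mutation
            child[random.randrange(N_ATTRS)] = random.random()
        nxt.append(normalise(child))
    pop = nxt

optimal_weights = best                             # step 4.11: decoded weights
print([round(w, 2) for w in optimal_weights])
```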
step 5, searching the diagnosis and treatment case library through a case retrieval program, combining the optimal weights with the characteristic attributes, and performing similarity analysis to obtain the most similar case; the method specifically comprises the following steps:
step 5.1, determining positive and negative ideal cases of decision users, constructing a fuzzy matrix and carrying out normalization;
step 5.2, performing weighted calculation on the fuzzy matrix;
step 5.3, weighting the index values of the attributes of the positive and negative ideal cases;
step 5.4, calculating the distance between the positive and negative ideal cases, the target case and the input case,
$ID_i^+ = \sum_{j=1}^{L} d(x_{ij}, x_j^+)$

$ID_i^- = \sum_{j=1}^{L} d(x_{ij}, x_j^-)$

$ID_i^T = \sum_{j=1}^{L} d(x_{ij}, x_j^T)$

wherein i is the feature vector ordinal; j is the characteristic attribute ordinal; N is the total number of feature vectors; L is the total number of characteristic attributes; $x_{ij}$ is the j-th characteristic attribute value of the i-th feature vector; $ID_i^+$ is the distance to the positive ideal point; $ID_i^-$ is the distance to the negative ideal point; $ID_i^T$ is the distance between each case and the target case; $x_j^+$ is the positive ideal point value, $x_j^-$ is the negative ideal point value, $x_j^T$ is the target point value, and $d(\cdot,\cdot)$ is a distance function between fuzzy numbers;
step 5.5, calculating the degree of closeness to determine the ordering among the cases and find the case most similar to the target case; the closeness of each case to the target case is calculated as

$TH_i = \dfrac{ID_i^-}{ID_i^+ + ID_i^- + ID_i^T}$

wherein $TH_i$ is the degree of closeness; the higher the TH value, the closer the two cases are, and the case with the highest closeness is selected as the most similar case;
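Steps 5.1-5.5 can be sketched with crisp numbers standing in for fuzzy numbers, so that $d(a,b)=|a-b|$. The weights, ideal points and library cases below are made-up illustrations, and reading the closeness as $ID^-/(ID^+ + ID^- + ID^T)$ is an assumption, since the patent's formula images are not reproduced in this text:

```python
# TOPSIS-style similarity sketch for steps 5.1-5.5, with crisp values as a
# stand-in for fuzzy numbers and d(a, b) = |a - b| as the distance function.

def distance_sums(case, pos_ideal, neg_ideal, target, weights):
    """Weighted distances of one case to the positive/negative ideal and target."""
    d = lambda a, b: abs(a - b)
    id_pos = sum(w * d(x, p) for w, x, p in zip(weights, case, pos_ideal))
    id_neg = sum(w * d(x, q) for w, x, q in zip(weights, case, neg_ideal))
    id_tgt = sum(w * d(x, t) for w, x, t in zip(weights, case, target))
    return id_pos, id_neg, id_tgt

def closeness(case, pos_ideal, neg_ideal, target, weights):
    ip, ineg, it = distance_sums(case, pos_ideal, neg_ideal, target, weights)
    return ineg / (ip + ineg + it + 1e-12)   # higher TH = closer to the target

weights   = [0.5, 0.3, 0.2]                  # e.g. from the genetic algorithm
pos_ideal = [1.0, 1.0, 1.0]
neg_ideal = [0.0, 0.0, 0.0]
target    = [0.9, 0.8, 0.7]
library   = {"case1": [0.85, 0.75, 0.65], "case2": [0.2, 0.3, 0.1]}

most_similar = max(library, key=lambda k: closeness(
    library[k], pos_ideal, neg_ideal, target, weights))
print(most_similar)  # case1 lies much closer to the target than case2
```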
step 6, returning the most similar cases to the decision user, if the requirements are met, generating diagnosis cases, and entering step 7; if the requirements are not met, the decision-making user inputs specific requirements, a case correction program is started, the obtained cases are corrected until the cases meet the requirements, and the step 7 is carried out;
and 7: and (4) the decision user reviews and evaluates the value of the retrieved case, if the case is considered to be valuable, a corresponding diagnosis conclusion is generated and stored in the diagnosis and treatment case library, and if the case is considered to be not valuable, a corresponding diagnosis conclusion is generated and quit.
The invention has the beneficial effects that:
(1) Searching for the case closest to the patient's through similarity matching provides a reliable basis for the doctor's diagnosis, the treatment scheme and the choice of the patient's treatment environment, and greatly reduces the doctor's workload.
(2) Image fusion reveals the details of the patient's lesion images, providing a strong basis for the doctor's diagnosis.
(3) The patient's posture is recognized by judging the motion state, so the patient's condition can be monitored in real time.
(4) Encrypted communication effectively prevents disclosure of the patient's private information.
Drawings
FIG. 1 is a block diagram of the system of the present invention;
FIG. 2 is a flow chart of similarity matching according to the present invention.
Detailed Description
The invention is further described with reference to the following figures and examples.
Embodiments of the present invention are illustrated with reference to fig. 1-2.
A similarity-matched diagnosis and monitoring integrated medical system comprises a client, a workstation, a cloud server, an image acquisition device and a physiological parameter sensor. The physiological parameter sensor transmits its data over a short-range wireless link to a network coordinator connected to the client; the client is connected to the cloud server through the Internet; and the image acquisition device is connected to the workstation, which uploads the acquired images to the cloud server. A patient uploads his or her basic information to the cloud server through the client, and a decision-making user uploads case profile information, the patient user's chief symptoms and physical signs, test results, pathological information, a basic judgment of the patient user's condition, treatment means, postoperative symptoms, and nursing suggestions and strategies to the cloud server through the workstation. The cloud server stores a diagnosis and treatment case library and combines the uploaded information into cases in the library. The decision-making user searches the diagnosis and treatment case library through the workstation to find the most similar case by similarity matching. The image acquisition device comprises a CT machine, an ultrasonic instrument and a nuclear magnetic resonance instrument; the physiological parameter sensor comprises a heart rate sensor, a sphygmomanometer, an oximeter and an accelerometer; and the workstation fuses the acquired ultrasonic and nuclear magnetic resonance images using an image fusion method and uploads the fused image to the cloud server.
The workstation and the cloud server communicate by using an encryption algorithm, and the encryption and decryption processes are as follows:
step C1, key agreement between the sender and the receiver: the sender generates a shared secret key from the receiver's public key;
step C2, the sender uses the shared key and other selectable parameters as the input of a key derivation function, and the derivation result forms a symmetric encryption key and an MAC key pair;
step C3, using a symmetric encryption algorithm to take a symmetric encryption key and a plaintext as input to generate a ciphertext;
step C4, using the encryption information, the MAC key and other public parameters as input, and selecting an MAC function to generate an identifier;
step C5, the sender uses the temporary public key, the identifier and the ciphertext to form encryption information and sends the encryption information to the receiver;
step C6, the receiver receives the encrypted information and obtains the temporary public key, the identifier and the ciphertext of the sender, so that the encrypted information is decomposed;
step C7, the receiving party multiplies the received temporary public key by the private key of the receiving party;
step C8, using the shared secret key and the public parameters used by other senders, the receiver performs the same derivation process as the sender, thereby obtaining an encryption secret key and an MAC secret key pair;
step C9, using the encryption information, the MAC key and other public parameters used by the sender as input, calculating a new identifier by the receiver, comparing the new identifier with the received identifier, and if the new identifier is different from the received identifier, failing to verify and refusing to receive the ciphertext;
and step C10, if the new identifier is the same as the received identifier, the receiver continues the decryption process: using the ciphertext and the decryption key as input to the symmetric decryption algorithm, it recovers and outputs the sender's plaintext.
Either the workstation or the cloud server can act as the sender, with the other acting as the receiver.
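The C1-C10 flow is an ECIES-style hybrid scheme. The sketch below keeps the structure but uses deliberate stand-ins so it runs on the standard library alone: classic Diffie-Hellman over the RFC 3526 group-14 prime replaces elliptic-curve point multiplication, a SHA-256 counter keystream replaces the symmetric cipher, and the key-derivation step is a simple hash split. None of these concrete primitives are specified by the patent, and they are not production choices:

```python
import hashlib, hmac, secrets

# ECIES-style encrypt/decrypt sketch for steps C1-C10 (stand-in primitives).

P = int(  # RFC 3526 MODP group 14 prime (stand-in for an elliptic-curve group)
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74"
    "020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F1437"
    "4FE1356D6D51C245E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7ED"
    "EE386BFB5A899FA5AE9F24117C4B1FE649286651ECE45B3DC2007CB8A163BF05"
    "98DA48361C55D39A69163FA8FD24CF5F83655D23DCA3AD961C62F356208552BB"
    "9ED529077096966D670C354E4ABC9804F1746C08CA18217C32905E462E36CE3B"
    "E39E772C180E86039B2783A2EC07A28FB5C55DF06F4C52C9DE2BCBF695581718"
    "3995497CEA956AE515D2261898FA051015728E5A8AACAA68FFFFFFFFFFFFFFFF", 16)
G = 2

def kdf(shared: bytes):                      # C2/C8: derive (enc_key, mac_key)
    return (hashlib.sha256(b"enc" + shared).digest(),
            hashlib.sha256(b"mac" + shared).digest())

def xor_stream(key: bytes, data: bytes) -> bytes:   # symmetric-cipher stand-in
    out, ctr = bytearray(), 0
    while len(out) < len(data):
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(a ^ b for a, b in zip(data, out))

def encrypt(receiver_pub: int, plaintext: bytes):
    eph_priv = secrets.randbelow(P - 2) + 1
    eph_pub = pow(G, eph_priv, P)                        # temporary public key
    shared = pow(receiver_pub, eph_priv, P).to_bytes(256, "big")   # C1
    enc_key, mac_key = kdf(shared)                       # C2
    ct = xor_stream(enc_key, plaintext)                  # C3
    tag = hmac.new(mac_key, ct, hashlib.sha256).digest() # C4: identifier
    return eph_pub, tag, ct                              # C5

def decrypt(receiver_priv: int, eph_pub: int, tag: bytes, ct: bytes):
    shared = pow(eph_pub, receiver_priv, P).to_bytes(256, "big")   # C7
    enc_key, mac_key = kdf(shared)                       # C8
    if not hmac.compare_digest(tag, hmac.new(mac_key, ct, hashlib.sha256).digest()):
        raise ValueError("verification failed")          # C9: reject ciphertext
    return xor_stream(enc_key, ct)                       # C10

recv_priv = secrets.randbelow(P - 2) + 1
recv_pub = pow(G, recv_priv, P)
msg = b"patient record"
print(decrypt(recv_priv, *encrypt(recv_pub, msg)) == msg)
```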
The physiological parameter sensor is a wearable device; it judges the patient's motion state from the accelerometer's acceleration data, and the specific judgment process is as follows:
a1, acquiring acceleration data from an accelerometer;
step A2, data preprocessing:
the acquired acceleration data are assembled into an acceleration time series, and a sliding window is applied in the time domain; the data obtained from each windowing are processed and recognized separately. The sliding window segments and extracts subsequences along the time axis, with a window length w and an overlap length o.
For an acceleration time series x_1, x_2, …, x_n, the first window is {x_1, …, x_w}, the second window is {x_{w-o+1}, …, x_{2w-o}}, and in general the k-th window is {x_{(k-1)(w-o)+1}, …, x_{(k-1)(w-o)+w}}, where w = 60 and o = 10, so consecutive windows share o samples.
Overlapping reduces spectral leakage and mitigates the loss of information caused by truncating the signal.
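The windowing in step A2 can be sketched as follows, with consecutive windows of length w starting (w - o) samples apart so that they overlap by o samples:

```python
# Sliding-window segmentation for step A2: window length w, overlap o.

def sliding_windows(series, w=60, o=10):
    step = w - o                            # consecutive window starts
    return [series[start:start + w]
            for start in range(0, len(series) - w + 1, step)]

data = list(range(200))                     # stand-in acceleration time series
wins = sliding_windows(data)
print(len(wins), wins[0][:3], wins[1][:3])
```

With 200 samples, w = 60 and o = 10, the windows start at samples 0, 50 and 100, and each pair of neighbours shares its last/first 10 samples.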
Step A3, data feature generation:
the acceleration data features include time-domain features, namely the mean, root mean square, variance and absolute mean difference of the acceleration over a window of length w, and a frequency-domain feature, namely the energy over the window of length w.
The mean $\mu_x$ is the average magnitude of the acceleration values, i.e. the DC component of the acceleration:
$\mu_x = \frac{1}{w}\sum_{i=1}^{w} x_i$
The root mean square D reflects the overall magnitude of the acceleration values:
$D = \sqrt{\frac{1}{w}\sum_{i=1}^{w} x_i^2}$
The variance $\sigma_x^2$ reflects the degree of dispersion of the acceleration and clearly reflects the intensity of change in the data:
$\sigma_x^2 = \frac{1}{w}\sum_{i=1}^{w} (x_i - \mu_x)^2$
The absolute mean difference K is calculated as:
$K = \frac{1}{w}\sum_{i=1}^{w} |x_i - \mu_x|$
The energy E is computed by applying the FFT to the signal and summing the squares of the amplitudes of all components:
$E = \sum_{i=1}^{N} F_i^2$
wherein $F_i$ is the amplitude of the i-th component of the FFT of the acceleration sequence and N is the number of components.
And respectively obtaining an average value sequence, a root mean square sequence, a variance sequence, an absolute average difference sequence and an energy sequence of a plurality of windows through the calculation of the average value, the root mean square, the variance, the absolute average difference and the energy.
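The per-window features of step A3 can be computed directly from the definitions above. A naive DFT is used here so the example stays dependency-free; in practice an FFT routine (e.g. `numpy.fft`) would replace it:

```python
import cmath, math

# Per-window features for step A3: mean, RMS, variance, absolute mean
# difference (time domain) and energy (frequency domain, via a naive DFT).

def dft_amplitudes(x):
    N = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / N)
                    for t in range(N)))
            for k in range(N)]

def window_features(x):
    w = len(x)
    mean = sum(x) / w                                   # DC component
    rms = math.sqrt(sum(v * v for v in x) / w)          # overall magnitude
    var = sum((v - mean) ** 2 for v in x) / w           # dispersion
    amd = sum(abs(v - mean) for v in x) / w             # absolute mean difference
    energy = sum(a * a for a in dft_amplitudes(x))      # sum of squared amplitudes
    return {"mean": mean, "rms": rms, "var": var, "amd": amd, "energy": energy}

feats = window_features([0.0, 1.0, 0.0, -1.0] * 15)     # toy 60-sample window
print({k: round(v, 3) for k, v in feats.items()})
```

Applying this to every window of every windowed sequence yields the mean, RMS, variance, absolute-mean-difference and energy sequences described above.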
Feature generation is crucial in pattern recognition: given a set of measurements, its goal is to uncover the natural patterns in the acquired data and to re-represent the information. A good feature generation process compresses the essential information in the data while eliminating redundancy, thereby reducing the dimensionality of the original data space.
Step A4 of matching and recognizing,
generating a sample sequence, wherein the sample sequence comprises an average value sequence, a root mean square sequence, a variance sequence, an absolute average difference sequence and an energy sequence in a walking state, a running state, a jumping state, a falling state and a static state, and the sample sequence is preset or is set by a user by executing a corresponding state;
and respectively matching the obtained average value sequence, root-mean-square sequence, variance sequence, absolute average difference sequence and energy sequence with sample sequences in different states, and taking the state of the sample sequence with the highest matching degree as the motion state of the current user.
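The matching step above can be sketched minimally as a nearest-sample lookup; the sum-of-squared-differences score, state names and sample values here are illustrative assumptions, since the text does not fix a particular matching measure:

```python
def match_state(feature_seq, samples):
    """Return the state whose sample sequence is closest to feature_seq."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(samples, key=lambda s: dist(feature_seq, samples[s]))

# Illustrative sample sequences per motion state (placeholders).
samples = {
    "static":  [0.0, 0.0, 0.0],
    "walking": [1.0, 1.2, 1.1],
    "running": [3.0, 3.4, 3.2],
}
state = match_state([1.1, 1.15, 1.05], samples)
```

In practice each of the five feature sequences would be matched this way and the best-scoring state taken as the current motion state.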
A fused color image is generated from an ultrasonic image and a nuclear magnetic resonance image; the fusion process is as follows:
step B1, converting images A and B from RGB space to HSV space through the HSV transform to obtain the three components H, S and V;
step B2, obtaining the fused H component through neighborhood superposition;
step B3, obtaining the fused S component and V component through gradient calculation;
step B4, forming the final fused image from the H, S and V components of steps B2-B3 through the inverse HSV transform.
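Steps B1 and B4 (the forward and inverse HSV transforms) can be illustrated with the Python standard library's colorsys module; the per-component fusion of steps B2-B3 is stubbed out here with placeholder choices, not the patent's actual rules:

```python
import colorsys

def rgb_to_hsv_image(pixels):
    # Step B1: per-pixel RGB -> HSV (all channel values in [0, 1]).
    return [colorsys.rgb_to_hsv(r, g, b) for (r, g, b) in pixels]

def hsv_to_rgb_image(pixels):
    # Step B4: per-pixel inverse HSV transform.
    return [colorsys.hsv_to_rgb(h, s, v) for (h, s, v) in pixels]

a_hsv = rgb_to_hsv_image([(0.2, 0.4, 0.6)])   # e.g. image A
b_hsv = rgb_to_hsv_image([(0.6, 0.4, 0.2)])   # e.g. image B
# Placeholder fusion standing in for steps B2-B3: H and S from A, V from B.
fused = [(ha, sa, vb) for (ha, sa, _), (_, _, vb) in zip(a_hsv, b_hsv)]
out = hsv_to_rgb_image(fused)
```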
Wherein, the step B2 specifically comprises the following steps:
step B2.1, initialization: n = 1, $Y_{i,j}(0)=0$, $L_{i,j}(0)=1$, $\theta_{i,j}(0)=1$, $T_{i,j}(0)=0$;
Step B2.2, iteration, for each source image $X \in \{A, B\}$:
$$L^{X}_{i,j}(n) = e^{-\alpha}\, L^{X}_{i,j}(n-1) + V_L \sum_{k,l} W_{k,l}\, Y^{X}_{i+k,\,j+l}(n-1)$$
$$U^{X}_{i,j}(n) = H^{X}_{i,j}\left(1 + \beta\, L^{X}_{i,j}(n)\right)$$
$$\theta^{X}_{i,j}(n) = e^{-\alpha}\, \theta^{X}_{i,j}(n-1) + V_\theta\, Y^{X}_{i,j}(n-1)$$
$$Y^{X}_{i,j}(n) = \begin{cases} 1, & U^{X}_{i,j}(n) > \theta^{X}_{i,j}(n) \\ 0, & \text{otherwise} \end{cases}$$
where n is the number of iterations; $L^{A}_{i,j}(n)$, $L^{B}_{i,j}(n)$ are the linear link inputs of images A and B after the n-th superposition at point (i, j); $U^{A}_{i,j}(n)$, $U^{B}_{i,j}(n)$ are the activities of images A and B after the n-th superposition at point (i, j); $H^{A}_{i,j}$, $H^{B}_{i,j}$ are the H components of images A and B at point (i, j); $\theta^{A}_{i,j}(n)$, $\theta^{B}_{i,j}(n)$ are the dynamic thresholds of images A and B after the n-th superposition at point (i, j); $Y^{A}_{i,j}(n)$, $Y^{B}_{i,j}(n)$ are the binarized outputs of images A and B after the n-th superposition at point (i, j); $T^{A}_{i,j}(n)$, $T^{B}_{i,j}(n)$ are the numbers of times images A and B output a high value after the n-th superposition at point (i, j); $\alpha$ is the time decay constant; $\beta$ is the link strength coefficient; $W_{k,l}$ is the weight coefficient at point (i, j); $V_L$, $V_\theta$ are amplification factors; k, l index the vertical and horizontal link neighborhood of point (i, j).
Step B2.3, judge whether n > N, where N is the iteration threshold; if so, proceed to step B2.4; otherwise set n = n + 1 and return to step B2.2;
step B2.4, counting: compute the total number of high-value outputs for image A and image B respectively,
$$T_A(i,j) = \sum_{n=1}^{N} Y^{A}_{i,j}(n), \qquad T_B(i,j) = \sum_{n=1}^{N} Y^{B}_{i,j}(n)$$
step B2.5, performing the above steps B2.1-B2.4 for all points (i, j) in image A and image B to obtain $T_A(i,j)$ and $T_B(i,j)$;
And step B2.6, fusion:
$$H_{AB}(i,j) = \begin{cases} H_A(i,j), & T_A(i,j) \ge T_B(i,j) \\ H_B(i,j), & T_A(i,j) < T_B(i,j) \end{cases}$$
where $H_{AB}(i,j)$ is the fused H component at point (i, j), $H_A(i,j)$, $H_B(i,j)$ are the H components of the original images A and B at point (i, j), and $T_A(i,j)$, $T_B(i,j)$ are the total numbers of high-value outputs of images A and B at point (i, j),
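Step B2's neighborhood superposition follows a pulse-coupled-neural-network (PCNN) style iteration whose firing counts decide which source supplies each fused H value. The sketch below is a generic simplified PCNN, not the patent's exact formulation: the update forms, parameter values and the tie-breaking rule are illustrative assumptions.

```python
import math

def firing_counts(img, N=10, alpha=0.3, beta=0.2, v_l=1.0, v_theta=20.0):
    """Per-pixel firing counts of a simplified PCNN run on a 2-D image."""
    rows, cols = len(img), len(img[0])
    L = [[1.0] * cols for _ in range(rows)]       # link inputs
    theta = [[1.0] * cols for _ in range(rows)]   # dynamic thresholds
    Y = [[0] * cols for _ in range(rows)]         # binary outputs
    T = [[0] * cols for _ in range(rows)]         # firing counts
    for _ in range(N):
        new_l = [[0.0] * cols for _ in range(rows)]
        new_y = [[0] * cols for _ in range(rows)]
        for i in range(rows):
            for j in range(cols):
                # Link from the 8-neighborhood's previous outputs.
                link = sum(Y[i + k][j + l]
                           for k in (-1, 0, 1) for l in (-1, 0, 1)
                           if (k or l) and 0 <= i + k < rows and 0 <= j + l < cols)
                new_l[i][j] = math.exp(-alpha) * L[i][j] + v_l * link
                u = img[i][j] * (1 + beta * new_l[i][j])          # activity
                theta[i][j] = math.exp(-alpha) * theta[i][j] + v_theta * Y[i][j]
                new_y[i][j] = 1 if u > theta[i][j] else 0
                T[i][j] += new_y[i][j]
        L, Y = new_l, new_y
    return T

def fuse_h(h_a, h_b):
    """Pick, per pixel, the H value from the source that fired more often."""
    t_a, t_b = firing_counts(h_a), firing_counts(h_b)
    return [[h_a[i][j] if t_a[i][j] >= t_b[i][j] else h_b[i][j]
             for j in range(len(h_a[0]))] for i in range(len(h_a))]

fused = fuse_h([[2.0, 2.0], [2.0, 2.0]], [[0.5, 0.5], [0.5, 0.5]])
```

With these parameters the brighter source fires at least as often, so its H values are kept.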
wherein, the step B3 is specifically as follows:
step B3.1, gradient calculation:
$$\nabla_A(i,j) = \max\{G_i(s_A(i,j)),\ G_j(s_A(i,j)),\ G_{ij}(s_A(i,j)),\ G_{ji}(s_A(i,j))\}$$
$$\nabla_B(i,j) = \max\{G_i(s_B(i,j)),\ G_j(s_B(i,j)),\ G_{ij}(s_B(i,j)),\ G_{ji}(s_B(i,j))\}$$
Wherein, for $s \in \{s_A, s_B\}$,
$$G_i(s(i,j)) = \left|s(i,j+1) - s(i,j-1)\right|$$
$$G_j(s(i,j)) = \left|s(i+1,j) - s(i-1,j)\right|$$
$$G_{ij}(s(i,j)) = \left|s(i+1,j+1) - s(i-1,j-1)\right|$$
$$G_{ji}(s(i,j)) = \left|s(i+1,j-1) - s(i-1,j+1)\right|$$
$\nabla_A(i,j)$, $\nabla_B(i,j)$ are the gradients of the original images A and B at point (i, j); $G_i()$, $G_j()$, $G_{ij}()$, $G_{ji}()$ are the horizontal, vertical, 45° diagonal and 135° diagonal change functions at point (i, j); $s_A(i,j)$, $s_B(i,j)$ are the S components of the original images A and B at point (i, j).
Furthermore, { } here denotes shorthand introduced to avoid displaying overly long expressions, i.e. $G_i() \equiv G_i(s(i,j))$, and likewise for the other change functions.
And step B3.2, fusion:
$$S_{AB}(i,j) = \begin{cases} s_A(i,j), & \nabla_A(i,j) \ge \nabla_B(i,j) \\ s_B(i,j), & \nabla_A(i,j) < \nabla_B(i,j) \end{cases}$$
$S_{AB}(i,j)$ is the fused S component at point (i, j);
step B3.3, obtaining the fused V component $V_{AB}(i,j)$ of images A and B at point (i, j) by applying steps B3.1-B3.2 to the V components.
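Step B3's gradient-based selection can be sketched as below; the four directional differences are an assumed central-difference form (the claim's exact expressions are not reproduced here), and border pixels are handled by clamping:

```python
def directional_gradient(img, i, j):
    """Max of horizontal, vertical, 45° and 135° absolute differences."""
    rows, cols = len(img), len(img[0])
    def px(a, b):  # clamp indices at the image border
        return img[min(max(a, 0), rows - 1)][min(max(b, 0), cols - 1)]
    return max(abs(px(i, j + 1) - px(i, j - 1)),
               abs(px(i + 1, j) - px(i - 1, j)),
               abs(px(i + 1, j + 1) - px(i - 1, j - 1)),
               abs(px(i + 1, j - 1) - px(i - 1, j + 1)))

def fuse_component(c_a, c_b):
    """Per pixel, keep the value from the source with the larger gradient."""
    rows, cols = len(c_a), len(c_a[0])
    return [[c_a[i][j]
             if directional_gradient(c_a, i, j) >= directional_gradient(c_b, i, j)
             else c_b[i][j]
             for j in range(cols)] for i in range(rows)]

fused_s = fuse_component([[0.0, 2.0], [4.0, 6.0]], [[3.0, 3.0], [3.0, 3.0]])
```

The same routine is applied to both the S and the V components.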
Wherein image A is the ultrasonic image and image B is the nuclear magnetic resonance image, or vice versa.
The above-described embodiment merely represents one embodiment of the present invention and is not to be construed as limiting its scope. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, all of which fall within the scope of the present invention.

Claims (1)

1. A similarity matching method for a similarity-matched diagnosis and monitoring integrated medical system, the system comprising a client, a workstation, a cloud server, an image acquisition device and a physiological parameter sensor, wherein the physiological parameter sensor transmits data through a short-distance wireless communication technology to a network coordinator connected with the client, the client is connected with the cloud server through the Internet, the image acquisition device is connected with the workstation and uploads acquired images to the cloud server, a patient user uploads his or her basic information to the cloud server through the client, a decision user uploads case profile information, the patient user's main symptoms and signs, test results, pathological information, a basic judgment of the patient user's condition, treatment means, postoperative symptoms, and nursing suggestions and strategies to the cloud server through the workstation, the decision user searches the diagnosis and treatment case library through the workstation and finds the most similar cases in the diagnosis and treatment case library by matching, the image acquisition device comprises a CT machine, an ultrasonic instrument and a nuclear magnetic resonance instrument, and the physiological parameter sensor comprises an accelerometer and a heart rate sensor integrated in a wearable device,
the decision-making user searches the diagnosis and treatment case library through the workstation, finds out the most similar case in the diagnosis and treatment case library through similarity matching, and the similarity matching method specifically comprises the following steps:
step 1, defining characteristic attributes of a source decision case according to medical clinical diagnosis and treatment characteristics, and establishing a diagnosis and treatment case library;
step 2, after the patient user visits a doctor, inputting characteristic attribute information through a human-computer interaction interface, and extracting a characteristic vector;
step 3, inputting the characteristic vector, and finding out a relevant case in the diagnosis and treatment case library by a fuzzy diagnosis and treatment knowledge finding program;
step 4, acquiring the optimal weight of each characteristic attribute in each characteristic vector through a genetic algorithm;
step 5, searching in a diagnosis and treatment case library through a case searching program by combining the optimal weight and each characteristic attribute, and analyzing the similarity to obtain the most similar case;
step 6, returning the most similar cases to the decision user, if the requirements are met, generating diagnosis cases, and entering step 7; if the requirements are not met, the user is decided to input specific requirements, a case correction program is started, the obtained cases are corrected until the cases meet the requirements, and the step 7 is carried out;
step 7: the decision user reviews and evaluates the value of the retrieved case; if the case is considered valuable, a corresponding diagnosis conclusion is generated and stored in the diagnosis and treatment case library, and if the case is considered not valuable, a corresponding diagnosis conclusion is generated and the process exits;
in step 2, the feature vectors comprise:
case profile information vectors including characteristic attributes of case number, case name, time of occurrence of case, attending physician name, name of resident, name of caregiver;
a patient basic information vector which comprises characteristic attributes of the sex, the age, the height, the weight, the family history, the health history, the drug allergy history, the admission date and the discharge date of a patient user;
vectors of the principal symptoms and physical signs of the patient user comprise characteristic attributes including subjective abnormal feelings of the patient user and physical signs of the patient user;
test result vectors including characteristic attributes including blood routine, immune combination, coagulogram, urine routine, blood gas analysis, B-mode ultrasonography, CT examination, electrocardio, blood pressure and blood oxygen saturation;
pathological information vectors including the tumor size, the number of the invaded lymph nodes, whether the nodules occur or not, the malignant tumor degree, the position of the tumor mass, the quadrant of the tumor mass, the distribution characteristics and the growth characteristics in the pathological information vectors;
basic judgment vectors of the patient user illness state comprise subjective judgment results of decision users, namely characteristic attributes including malignancy, benign, easy recurrence and difficult recurrence;
treatment vectors, characteristic attributes including disease description, diagnosis process, whether radiotherapy is applied, treatment process, treatment effect and intermediate examination;
a post-operative symptom vector, namely characteristic attributes of the user symptom description of the breast cancer patient after treatment;
vectors of care recommendations and strategies for patient-user specific illness and symptom characteristics;
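For illustration only, the feature vectors listed above might be held in a structure like the following; every key and value here is a paraphrased placeholder, not the patent's actual schema:

```python
# Hypothetical container for one diagnosis-and-treatment case.
case = {
    "profile":   {"case_no": "C-001", "case_name": "example case"},
    "patient":   {"sex": "F", "age": 52, "height_cm": 162, "weight_kg": 58},
    "symptoms":  ["subjective abnormal feeling"],
    "tests":     {"blood_routine": "normal", "spo2": 0.97},
    "pathology": {"tumor_size_mm": 14, "invaded_lymph_nodes": 1},
    "judgment":  {"malignant": True, "recurrence_prone": False},
    "treatment": {"radiotherapy": False, "effect": "good"},
    "postop":    [],
    "care":      ["care strategy placeholder"],
}
# A numeric feature vector extracted for similarity matching:
feature_vector = [case["patient"]["age"], case["pathology"]["tumor_size_mm"]]
```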
the step 4 specifically comprises the following steps:
step 4.1, encoding the characteristic attribute;
step 4.2, generating initial weights of the characteristic attributes;
step 4.3, calculating fitness and keeping the preferred chromosome;
step 4.4, finding the accurate value;
step 4.5, copying and selecting by the roulette-wheel selection method;
step 4.6, crossover;
step 4.7, mutation;
step 4.8, retaining the preferred chromosomes;
step 4.9, replacing the existing chromosomes;
step 4.10, judging whether the number of evolution iterations reaches the upper limit; if so, stopping iteration and proceeding to step 4.11, otherwise returning to step 4.3 to continue iterating;
step 4.11, decoding the current chromosome to generate the optimal weights;
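Steps 4.1-4.11 describe a standard genetic algorithm over attribute weights. The sketch below implements the named operators (roulette-wheel selection, crossover, mutation, elitist retention); the fitness function, population size and rates are illustrative stand-ins, since the patent scores weights by retrieval quality:

```python
import random

def genetic_weights(n_attrs, fitness, pop=20, gens=30, pc=0.8, pm=0.05, seed=1):
    """Search for normalized attribute weights maximizing `fitness`."""
    rng = random.Random(seed)

    def norm(w):
        s = sum(w)
        return [x / s for x in w]

    # Steps 4.1-4.2: chromosomes encoded as normalized weight vectors.
    population = [norm([rng.random() + 1e-9 for _ in range(n_attrs)])
                  for _ in range(pop)]
    best = max(population, key=fitness)
    for _ in range(gens):
        scores = [fitness(ind) for ind in population]   # step 4.3: fitness
        total = sum(scores)

        def pick():                                     # step 4.5: roulette wheel
            r = rng.uniform(0, total)
            acc = 0.0
            for ind, s in zip(population, scores):
                acc += s
                if acc >= r:
                    return ind
            return population[-1]

        nxt = []
        while len(nxt) < pop:
            a, b = pick()[:], pick()[:]
            if rng.random() < pc:                       # step 4.6: crossover
                cut = rng.randrange(1, n_attrs)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            for ind in (a, b):
                if rng.random() < pm:                   # step 4.7: mutation
                    ind[rng.randrange(n_attrs)] = rng.random() + 1e-9
                nxt.append(norm(ind))
        population = nxt[:pop]
        best = max(population + [best], key=fitness)    # steps 4.8-4.9: elitism
    return best                                         # step 4.11: decoded weights

weights = genetic_weights(4, lambda w: w[0])  # toy fitness favouring attribute 0
```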
the step 5 specifically comprises the following steps:
step 5.1, determining positive and negative ideal cases of decision users, constructing a fuzzy matrix and normalizing;
step 5.2, carrying out weighted calculation on the fuzzy matrix;
step 5.3, weighting the index values of each attribute of the positive and negative ideal cases;
step 5.4, calculating the distances of each case to the positive ideal case, the negative ideal case and the target case,
$$ID_i^{+} = \sum_{j=1}^{l} d\left(x_{ij}, x_j^{+}\right)$$
$$ID_i^{-} = \sum_{j=1}^{l} d\left(x_{ij}, x_j^{-}\right)$$
$$ID_i^{T} = \sum_{j=1}^{l} d\left(x_{ij}, x_j^{T}\right)$$
wherein i is the feature vector ordinal number; j is the feature attribute ordinal; n is the total number of feature vectors; l is the total number of characteristic attributes; $x_{ij}$ is the j-th characteristic attribute value of the i-th feature vector; $ID_i^{+}$ is the distance to the positive ideal point; $ID_i^{-}$ is the distance to the negative ideal point; $ID_i^{T}$ is the distance between each case and the target case; $x_j^{+}$ is the positive ideal point value, $x_j^{-}$ is the negative ideal point value, $x_j^{T}$ is the target point value, and d(,) is a distance function between fuzzy numbers;
step 5.5, calculating the degree of closeness to determine the ordering among the cases and find the case most similar to the target case; the closeness between each case and the target case is calculated as
$$TH_i = \frac{ID_i^{-}}{ID_i^{+} + ID_i^{-}}$$
TH is the degree of closeness; the higher the TH value, the closer the two cases are, and the case with the highest closeness is selected as the most similar case;
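Steps 5.4-5.5 can be sketched under a common TOPSIS-style reading, with the fuzzy-number distance d(,) simplified to an absolute difference and closeness taken as ID⁻ / (ID⁺ + ID⁻); both simplifications are our assumptions:

```python
def closeness(case, positive_ideal, negative_ideal):
    """Relative closeness of one case to the positive ideal point."""
    d = lambda a, b: abs(a - b)                                  # stand-in d(,)
    id_pos = sum(d(x, p) for x, p in zip(case, positive_ideal))  # ID+
    id_neg = sum(d(x, n) for x, n in zip(case, negative_ideal))  # ID-
    return id_neg / (id_pos + id_neg)                            # TH

def most_similar(cases, positive_ideal, negative_ideal):
    """Rank cases by closeness and return the best match plus all scores."""
    th = {name: closeness(v, positive_ideal, negative_ideal)
          for name, v in cases.items()}
    return max(th, key=th.get), th

best, th = most_similar({"a": [0.9, 0.8], "b": [0.2, 0.1]},
                        positive_ideal=[1.0, 1.0], negative_ideal=[0.0, 0.0])
```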
the physiological parameter sensor is wearable equipment, and the motion state of the patient is judged according to the acceleration data of the accelerometer, and the specific judgment process is as follows:
step A1, acquiring acceleration data from an accelerometer;
step A2, data pre-processing,
generating acceleration time-series data from the acquired acceleration data, windowing the time domain of the acceleration time series with a sliding window, and processing and recognizing the acceleration time-series data obtained from each window; the sliding window segments and extracts the sequence along the time axis, with a defined window length w and overlapping window length o;
for an acceleration time series $x_1, x_2, \ldots, x_n$, the windows are: the first window $\{x_1, x_2, \ldots, x_w\}$, the second window $\{x_{w-o}, x_{w-o+1}, \ldots, x_{2w-o-1}\}$, the third window $\{x_{2w-2o-1}, x_{2w-2o}, \ldots, x_{3w-2o-2}\}$, and the n-th window $\{x_{n(w-o)-n+1}, x_{n(w-o)-n+2}, \ldots, x_{n(w-o)+w-n}\}$, where w = 60, o = 10, and $x_n$ is the acceleration time-series data;
the overlap serves to reduce spectral leakage and to lessen the information loss caused by signal truncation;
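The sliding-window segmentation of step A2 can be sketched as follows, using conventional start indices of the form k(w − o) (the claim's own index expressions differ from these by one):

```python
# Sliding-window segmentation: window length w, overlap o, so each new
# window starts w - o samples after the previous one.
def sliding_windows(series, w=60, o=10):
    step = w - o
    return [series[start:start + w]
            for start in range(0, len(series) - w + 1, step)]

# 200 samples with w = 60, o = 10 yield windows starting at 0, 50 and 100.
wins = sliding_windows(list(range(200)), w=60, o=10)
```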
and step A3, generating data characteristics,
the acceleration data characteristics comprise time domain characteristics and frequency domain characteristics, the time domain characteristics comprise accelerated average values, root mean square, variance and absolute average difference in the window length w, and the frequency domain characteristics are energy in the window length w;
the mean value $\bar{x}$, i.e. the average magnitude of the acceleration values (the DC component of the acceleration), is calculated as:
$$\bar{x} = \frac{1}{w}\sum_{i=1}^{w} x_i$$
the root mean square D is the overall magnitude of the acceleration values; its calculation formula is:
$$D = \sqrt{\frac{1}{w}\sum_{i=1}^{w} x_i^2}$$
the variance $\sigma_x^2$ is calculated as:
$$\sigma_x^2 = \frac{1}{w}\sum_{i=1}^{w}\left(x_i - \bar{x}\right)^2$$
the variance clearly reflects the intensity of the data change, wherein $\bar{x}$ is the mean of the accelerations;
the absolute mean difference K is calculated as:
$$K = \frac{1}{w}\sum_{i=1}^{w}\left|x_i - \bar{x}\right|$$
the energy E is computed by applying an FFT to the signal and summing the squared amplitudes of all components:
$$E = \sum_{i=1}^{N} F_i^2$$
wherein $F_i$ is the amplitude of the i-th component after the FFT of the acceleration sequence, and N is the number of components;
respectively obtaining an average value sequence, a root-mean-square sequence, a variance sequence, an absolute average difference sequence and an energy sequence of a plurality of windows through the calculation of the average value, the root-mean-square, the variance, the absolute average difference and the energy;
step A4 of matching and identification,
generating a sample sequence, wherein the sample sequence comprises an average value sequence, a root mean square sequence, a variance sequence, an absolute average difference sequence and an energy sequence in a walking state, a running state, a jumping state, a falling state and a static state, and the sample sequence is preset or is set by a user by executing a corresponding state;
respectively matching the obtained average value sequence, root mean square sequence, variance sequence, absolute average difference sequence and energy sequence with sample sequences in different states, and taking the state of the sample sequence with the highest matching degree as the motion state of the current user;
a fused color image is generated from an ultrasonic image and a nuclear magnetic resonance image; the fusion process is as follows:
step B1, converting images A and B from RGB space to HSV space through the HSV transform to obtain the three components H, S and V;
step B2, obtaining the fused H component through neighborhood superposition;
step B3, obtaining the fused S component and V component through gradient calculation;
step B4, forming the final fused image from the H, S and V components of steps B2-B3 through the inverse HSV transform;
wherein, the step B2 is specifically as follows:
step B2.1, initialization: n = 1, $Y_{i,j}(0)=0$, $L_{i,j}(0)=1$, $\theta_{i,j}(0)=1$, $T_{i,j}(0)=0$;
Step B2.2, iteration, for each source image $X \in \{A, B\}$:
$$L^{X}_{i,j}(n) = e^{-\alpha}\, L^{X}_{i,j}(n-1) + V_L \sum_{k,l} W_{k,l}\, Y^{X}_{i+k,\,j+l}(n-1)$$
$$U^{X}_{i,j}(n) = H^{X}_{i,j}\left(1 + \beta\, L^{X}_{i,j}(n)\right)$$
$$\theta^{X}_{i,j}(n) = e^{-\alpha}\, \theta^{X}_{i,j}(n-1) + V_\theta\, Y^{X}_{i,j}(n-1)$$
$$Y^{X}_{i,j}(n) = \begin{cases} 1, & U^{X}_{i,j}(n) > \theta^{X}_{i,j}(n) \\ 0, & \text{otherwise} \end{cases}$$
wherein n is the number of iterations; $L^{A}_{i,j}(n)$, $L^{B}_{i,j}(n)$ are the linear link inputs of images A and B after the n-th superposition at point (i, j); $U^{A}_{i,j}(n)$, $U^{B}_{i,j}(n)$ are the activities of images A and B after the n-th superposition at point (i, j); $H^{A}_{i,j}$, $H^{B}_{i,j}$ are the H components of images A and B at point (i, j); $\theta^{A}_{i,j}(n)$, $\theta^{B}_{i,j}(n)$ are the dynamic thresholds of images A and B after the n-th superposition at point (i, j); $Y^{A}_{i,j}(n)$, $Y^{B}_{i,j}(n)$ are the binarized outputs of images A and B after the n-th superposition at point (i, j); $T^{A}_{i,j}(n)$, $T^{B}_{i,j}(n)$ are the numbers of times images A and B output a high value after the n-th superposition at point (i, j); $\alpha$ is the time decay constant; $\beta$ is the link strength coefficient; $W_{k,l}$ is the weight coefficient at point (i, j); $V_L$, $V_\theta$ are amplification factors; k, l index the vertical and horizontal link neighborhood of point (i, j);
step B2.3, judge whether n > N, where N is the iteration threshold; if so, proceed to step B2.4; otherwise set n = n + 1 and return to step B2.2;
step B2.4, counting: compute the total number of high-value outputs for image A and image B respectively,
$$T_A(i,j) = \sum_{n=1}^{N} Y^{A}_{i,j}(n), \qquad T_B(i,j) = \sum_{n=1}^{N} Y^{B}_{i,j}(n)$$
step B2.5, performing the above steps B2.1-B2.4 for all points (i, j) in image A and image B to obtain $T_A(i,j)$ and $T_B(i,j)$;
And step B2.6, fusion:
$$H_{AB}(i,j) = \begin{cases} H_A(i,j), & T_A(i,j) \ge T_B(i,j) \\ H_B(i,j), & T_A(i,j) < T_B(i,j) \end{cases}$$
wherein $H_{AB}(i,j)$ is the fused H component at point (i, j), $H_A(i,j)$, $H_B(i,j)$ are the H components of the original images A and B at point (i, j), and $T_A(i,j)$, $T_B(i,j)$ are the total numbers of high-value outputs of images A and B at point (i, j),
wherein, the step B3 is specifically as follows:
step B3.1, gradient calculation:
$$\nabla_A(i,j) = \max\{G_i(s_A(i,j)),\ G_j(s_A(i,j)),\ G_{ij}(s_A(i,j)),\ G_{ji}(s_A(i,j))\}$$
$$\nabla_B(i,j) = \max\{G_i(s_B(i,j)),\ G_j(s_B(i,j)),\ G_{ij}(s_B(i,j)),\ G_{ji}(s_B(i,j))\}$$
Wherein, for $s \in \{s_A, s_B\}$,
$$G_i(s(i,j)) = \left|s(i,j+1) - s(i,j-1)\right|$$
$$G_j(s(i,j)) = \left|s(i+1,j) - s(i-1,j)\right|$$
$$G_{ij}(s(i,j)) = \left|s(i+1,j+1) - s(i-1,j-1)\right|$$
$$G_{ji}(s(i,j)) = \left|s(i+1,j-1) - s(i-1,j+1)\right|$$
$\nabla_A(i,j)$, $\nabla_B(i,j)$ are the gradients of the original images A and B at point (i, j); $G_i()$, $G_j()$, $G_{ij}()$, $G_{ji}()$ are the horizontal, vertical, 45° diagonal and 135° diagonal change functions at point (i, j); $s_A(i,j)$, $s_B(i,j)$ are the S components of the original images A and B at point (i, j),
furthermore, to avoid the inability to display excessively long expressions, { } here denotes shorthand, i.e. $G_i() \equiv G_i(s(i,j))$, and likewise for the other change functions;
And step B3.2, fusion:
$$S_{AB}(i,j) = \begin{cases} s_A(i,j), & \nabla_A(i,j) \ge \nabla_B(i,j) \\ s_B(i,j), & \nabla_A(i,j) < \nabla_B(i,j) \end{cases}$$
$S_{AB}(i,j)$ is the fused S component at point (i, j);
step B3.3, obtaining the fused V component $V_{AB}(i,j)$ of images A and B at point (i, j) by applying steps B3.1-B3.2 to the V components;
Wherein image A is the ultrasonic image and image B is the nuclear magnetic resonance image, or vice versa.
CN201910856897.0A 2019-09-10 2019-09-10 Similarity-matched diagnosis and monitoring comprehensive medical system and matching method thereof Active CN110584605B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910856897.0A CN110584605B (en) 2019-09-10 2019-09-10 Similarity-matched diagnosis and monitoring comprehensive medical system and matching method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910856897.0A CN110584605B (en) 2019-09-10 2019-09-10 Similarity-matched diagnosis and monitoring comprehensive medical system and matching method thereof

Publications (2)

Publication Number Publication Date
CN110584605A CN110584605A (en) 2019-12-20
CN110584605B true CN110584605B (en) 2023-01-31

Family

ID=68858728

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910856897.0A Active CN110584605B (en) 2019-09-10 2019-09-10 Similarity-matched diagnosis and monitoring comprehensive medical system and matching method thereof

Country Status (1)

Country Link
CN (1) CN110584605B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112382383A (en) * 2020-11-04 2021-02-19 北京紫冬认知科技有限公司 Diagnosis and treatment data processing method and device, server and storage medium
CN112542226A (en) * 2020-12-14 2021-03-23 上海理工大学 Decision support system for adaptation of rehabilitation aid for patients with limb dysfunction
CN112614589A (en) * 2020-12-30 2021-04-06 四川互慧软件有限公司 Central diagnosis and treatment method based on big data system
CN112786196A (en) * 2021-02-02 2021-05-11 吕丽 Intelligent nursing interaction system based on big data medical management
CN112949659A (en) * 2021-03-15 2021-06-11 东软集团股份有限公司 Method and device for processing check information, storage medium and electronic equipment
CN113793683B (en) * 2021-08-23 2024-04-02 广州医科大学附属第一医院(广州呼吸中心) PSA-based prostate cancer auxiliary decision-making method and system
CN114577457B (en) * 2022-03-10 2024-01-23 杭州承宇节能环保技术有限公司 Energy efficiency test system and method for condensing heat exchanger
CN115359868B (en) * 2022-09-14 2023-07-28 亿慧云智能科技(深圳)股份有限公司 Intelligent medical monitoring method and system based on cloud computing technology
CN115458160B (en) * 2022-10-12 2023-06-20 浙江大学 Whole-course intelligent management system, method, equipment and medium for breast tumor patients
CN116246776B (en) * 2023-04-04 2023-11-07 曜立科技(北京)有限公司 Method and system for assisting cerebral apoplexy diagnosis decision

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010119356A2 (en) * 2009-04-15 2010-10-21 Koninklijke Philips Electronics N.V. Clinical decision support systems and methods
CN103116707A (en) * 2013-03-04 2013-05-22 北京工业大学 Heart disease intelligent diagnostic method based on case reasoning
CN103338224A (en) * 2013-05-28 2013-10-02 美合实业(苏州)有限公司 Central diagnostic system of medical institution
CN106408126B (en) * 2016-09-23 2018-01-30 广东技术师范学院 A kind of three perfecting by stage methods concurrently gathered towards energy consumption data
CN106959675B (en) * 2017-03-21 2020-06-02 山东大学 Multi-objective scheduling optimization method for flow shop
CN109784722A (en) * 2019-01-15 2019-05-21 齐鲁工业大学 Web service selection method and system based on user preference

Also Published As

Publication number Publication date
CN110584605A (en) 2019-12-20

Similar Documents

Publication Publication Date Title
CN110584605B (en) Similarity-matched diagnosis and monitoring comprehensive medical system and matching method thereof
CN110600109B (en) Diagnosis and monitoring comprehensive medical system with color image fusion and fusion method thereof
Kawahara et al. Seven-point checklist and skin lesion classification using multitask multimodal neural nets
CN111133526B (en) Novel features useful in machine learning techniques, such as machine learning techniques for diagnosing medical conditions
WO2020132393A1 (en) A digital platform to identify health conditions and therapeutic interventions using an automatic and distributed artificial intelligence system
Walsh et al. Deep learning–based outcome prediction in progressive fibrotic lung disease using high-resolution computed tomography
CN110580951B (en) Diagnosis monitoring comprehensive medical system with encrypted communication and communication encryption method thereof
CN111095232B (en) Discovery of genomes for use in machine learning techniques
US20060120584A1 (en) Method and system for automatic diagnosis of possible brain disease
CN110575178B (en) Diagnosis and monitoring integrated medical system for judging motion state and judging method thereof
US20190189278A1 (en) Information processing device and information processing method
Liu et al. Big medical data decision-making intelligent system exploiting fuzzy inference logic for prostate cancer in developing countries
CN114566282B (en) Treatment decision system based on echocardiogram detection report
WO2021217906A1 (en) Posture detection method, apparatus and device based on gait features, and storage medium
JP2019532397A (en) Target data management system
KR20110113392A (en) Method for ubiquaters healthcare service
Soundrapandiyan et al. AI-based wavelet and stacked deep learning architecture for detecting coronavirus (COVID-19) from chest X-ray images
Kute et al. Machine learning–based disease diagnosis and prediction for E‐healthcare system
CN111863248A (en) Effective method for constructing clinical decision model
CN110600124B (en) Diagnosis and monitoring integrated medical system with gray level image fusion and fusion method thereof
US20190236389A1 (en) Methods and systems for identifying and profiling biological tissue
CN115935404A (en) System and method for generating a composite data set from an original data set
Gopalakrishnan et al. Cloud Security System for ECG Transmission and Monitoring Based on Chaotic Logistic Maps
Alagarsamy et al. Classification of covid patient image dataset using modified deep convolutional neural network system
Li et al. Shrink: A breast cancer risk assessment model based on medical social network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20221122

Address after: 215000 east area, 2nd floor, 425 Changyang street, Suzhou Industrial Park, Jiangsu Province

Applicant after: Jiangsu Hengsheng Information Technology Co.,Ltd.

Address before: 250012 School of medicine, Shandong University, 44 West Wenhua Road, Lixia District, Shandong, Ji'nan

Applicant before: Jia Ying

GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A Diagnosis Monitoring Integrated Medical System with Similarity Matching and Its Matching Method

Effective date of registration: 20231225

Granted publication date: 20230131

Pledgee: Agricultural Bank of China Limited Suzhou Industrial Park sub branch

Pledgor: Jiangsu Hengsheng Information Technology Co.,Ltd.

Registration number: Y2023980072811
