CN111723622B - Heart beat classification method, heart beat classification device, wearable equipment and storage medium - Google Patents
- Publication number
- CN111723622B (application CN201910220835.0A)
- Authority
- CN
- China
- Prior art keywords
- detected
- heart beat
- heart
- learning model
- deep learning
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2218/00—Aspects of pattern recognition specially adapted for signal processing
- G06F2218/12—Classification; Matching
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7203—Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
Abstract
The disclosure relates to a heart beat classification method, a heart beat classification device, a wearable device and a storage medium. The method comprises the following steps: acquiring target electrocardiographic data; identifying a plurality of heart beats to be detected from an electrocardiographic image corresponding to the target electrocardiographic data; and inputting the heart beats to be detected into a pre-trained deep learning model and determining the preset classification of each heart beat to be detected based on the output result of the deep learning model. By identifying a plurality of heart beats to be detected from the electrocardiographic image corresponding to the target electrocardiographic data and classifying them with a pre-trained deep learning model, the accuracy of classifying the heart beats to be detected can be improved.
Description
Technical Field
The disclosure relates to the field of wearable technology, and in particular relates to a heart beat classification method, a heart beat classification device, wearable equipment and a storage medium.
Background
An electrocardiogram (ECG) signal records the electrophysiological activity of the heart over time, and has important reference value for evaluating basic cardiac function and for pathological studies. In recent years, with the development of wearable technology, measuring ECG signals with wearable devices has become widely accepted.
Wearable devices capable of measuring ECG signals include bracelets, watches, chest patches, smart clothing and the like, providing great convenience for users to monitor their own ECG.
However, existing wearable devices are easily affected by various factors while acquiring a user's electrocardiographic data (for example, the user being in motion, dry skin, electromyographic interference, or electrode interference from the device's sensors), so the acquired electrocardiographic data may not be accurate enough, which reduces the accuracy of the subsequent heartbeat classification based on that data.
Disclosure of Invention
In order to overcome the problems in the related art, embodiments of the present disclosure provide a heart beat classification method, apparatus, wearable device, and storage medium, so as to solve the deficiencies in the related art.
According to a first aspect of an embodiment of the present disclosure, there is provided a beat classification method, including:
Acquiring target electrocardiographic data;
identifying a plurality of heart beats to be detected from an electrocardiograph image corresponding to the target electrocardiograph data;
inputting the plurality of heart beats to be detected into a pre-trained deep learning model, and determining the preset classification of each heart beat to be detected based on the output result of the deep learning model.
In an embodiment, the identifying a plurality of heart beats to be detected from the electrocardiographic image corresponding to the target electrocardiographic data includes:
dividing the target electrocardiographic data into a plurality of data segments based on a preset time interval;
and identifying the heart beat to be detected from the electrocardio images corresponding to each data segment.
In an embodiment, the method further comprises:
And determining the position of each heart beat to be measured in the plurality of heart beats to be measured.
In an embodiment, the determining the preset classification of each heart beat to be detected based on the output result of the deep learning model includes:
determining the probability of each heart beat to be detected belonging to each preset classification in a plurality of preset classifications based on the output result of the deep learning model;
and determining the preset classification of each heart beat to be detected based on the probability.
In an embodiment, the method further comprises training the deep learning model in advance based on:
acquiring a plurality of sample electrocardiographic data;
identifying a plurality of sample heart beats from the electrocardio images corresponding to each sample electrocardio data;
calibrating the preset classification corresponding to each sample heart beat;
taking the plurality of sample heart beats and the corresponding preset classifications as training sets, and training a deep learning model.
In one embodiment, the deep learning model comprises a deep learning model with a ResNet architecture trained using the Squeeze-and-Excitation block technique.
According to a second aspect of embodiments of the present disclosure, there is provided a heart beat classification device comprising:
The target data acquisition module is used for acquiring target electrocardiographic data;
the heart beat identification module to be detected is used for identifying a plurality of heart beats to be detected from the electrocardio images corresponding to the target electrocardio data;
And the heart beat classification module to be tested is used for inputting the heart beats to be tested into a pre-trained deep learning model, and determining the preset classification of each heart beat to be tested based on the output result of the deep learning model.
In an embodiment, the heart beat identification module to be tested includes:
The target data dividing unit is used for dividing the target electrocardiographic data into a plurality of data segments based on a preset time interval;
and the heart beat identification unit to be detected is used for identifying heart beats to be detected from the electrocardio images corresponding to each data segment.
In an embodiment, the device further comprises:
And the heart beat position determining module is used for determining the position of each heart beat to be detected in the plurality of heart beats to be detected.
In an embodiment, the heart beat classification module to be tested includes:
The classification probability determining unit is used for determining the probability that each heart beat to be detected belongs to each preset classification in a plurality of preset classifications based on the output result of the deep learning model;
and the heart beat classification unit to be detected is used for determining the preset classification of each heart beat to be detected based on the probability.
In an embodiment, the apparatus further comprises a deep learning model training module;
The deep learning model training module comprises:
a sample data acquisition unit for acquiring a plurality of sample electrocardiographic data;
the sample heart beat identification unit is used for identifying a plurality of sample heart beats from the electrocardio images corresponding to the electrocardio data of each sample;
The sample characteristic calibration unit is used for calibrating the preset classification corresponding to each sample heart beat;
And the deep learning model training unit is used for taking the plurality of sample heart beats and the corresponding preset classifications as training sets to train the deep learning model.
In one embodiment, the deep learning model comprises a deep learning model with a ResNet architecture trained using the Squeeze-and-Excitation block technique.
According to a third aspect of embodiments of the present disclosure, there is provided a wearable device, the device comprising:
A processor;
A memory configured to store processor-executable instructions;
wherein the processor is configured to perform any of the methods described above.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements any of the methods described above.
The technical scheme provided by the embodiment of the disclosure can comprise the following beneficial effects:
In the method, a plurality of heart beats to be detected are identified from the electrocardiographic image corresponding to the target electrocardiographic data, the heart beats are then input into a pre-trained deep learning model, and the preset classification of each heart beat to be detected is determined based on the output result of the model. Further, by determining the position of each heart beat among the plurality of identified heart beats, a heart beat of a particular type can be accurately located at the moment it is detected. Furthermore, classifying the heart beats to be detected with a deep learning model enhances robustness to noise.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart illustrating a beat classification method according to an example embodiment;
FIG. 2 is a flowchart illustrating how a beat to be measured is identified, according to an example embodiment;
FIG. 3 is a flow chart illustrating how a preset classification of heart beats to be measured is determined according to an exemplary embodiment;
FIG. 4 is a flow chart illustrating a method of beat classification according to yet another exemplary embodiment;
FIG. 5 is a block diagram illustrating a beat classification device according to an example embodiment;
FIG. 6 is a block diagram illustrating another beat classification device according to an example embodiment;
fig. 7 is a block diagram of a wearable device, shown according to an example embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
FIG. 1 is a flow chart illustrating a method of beat classification according to an exemplary embodiment. This embodiment may be used for a terminal device (e.g. a mobile terminal such as a mobile phone, a tablet computer, a personal computer, or a wearable device such as a smart band or a smart watch), as shown in fig. 1, and the method includes the following steps S101-S103:
in step S101, target electrocardiographic data is acquired.
In an alternative embodiment, when the user uses the terminal device, the terminal device may collect electrocardiographic data of the user through a built-in sensor, as target electrocardiographic data in this embodiment.
In an alternative embodiment, the sensor of the terminal device may collect electrocardiographic data of the user according to a preset frequency.
It should be noted that, the type of the sensor and the preset frequency of the sensor for acquiring the electrocardiographic data may be freely set by a developer according to actual needs, which is not limited in this embodiment.
In step S102, a plurality of heart beats to be detected are identified from the electrocardiographic image corresponding to the target electrocardiographic data.
In an optional embodiment, after acquiring the target electrocardiographic data, a corresponding electrocardiographic image may be drawn based on the target electrocardiographic data, and then a plurality of cardiac beats to be detected may be identified from the electrocardiographic image by using a preset image identification algorithm.
In an alternative embodiment, the preset image recognition algorithm may be selected by a developer according to actual service requirements, which is not limited in this embodiment.
In an alternative embodiment, the above manner of identifying the heart beat to be measured from the electrocardiographic image may also refer to the embodiment shown in fig. 2 described below, which will not be described in detail herein.
In step S103, the plurality of heart beats to be measured are input into a pre-trained deep learning model, and a preset classification of each heart beat to be measured is determined based on an output result of the deep learning model.
In an optional embodiment, after a plurality of cardiac beats to be detected are identified from the cardiac image corresponding to the target cardiac data, the cardiac beats to be detected may be input into a pre-trained deep learning model, so as to determine, according to an output result of the deep learning model, a preset classification to which each cardiac beat of the plurality of cardiac beats to be detected belongs.
In an alternative embodiment, after determining the preset classification of the heart beat to be measured, it may be determined whether the preset classification of the heart beat to be measured is a heart beat classification to be positioned, if yes, the heart beat is positioned to determine the position of the heart beat to be measured in the plurality of heart beats to be measured.
For example, suppose the heartbeat classification to be positioned is "class C". If the first heart beat to be detected is classified as "class A" and the second as "class C", the second heart beat matches the classification to be positioned; a preset positioning method can then be used to determine its position among the plurality of heart beats to be detected.
It should be noted that, the method for positioning the heartbeat may be selected by a developer according to actual needs, which is not limited in this embodiment.
It can be appreciated that by determining the position of each beat to be measured in the identified plurality of beats to be measured, accurate positioning of the beats of a particular type can be achieved while the beats are detected, facilitating subsequent analysis and processing thereof.
In an alternative embodiment, the preset classifications may be set by a developer according to actual service requirements, for example, one or more of the heart beat classes known in the art, such as normal beat, ventricular premature beat, ventricular escape beat, and atrial premature beat; this embodiment is not limited thereto.
In an alternative embodiment, the manner of determining the above-mentioned preset classification of each heart beat to be measured may also refer to the embodiment shown in fig. 3 described below, which will not be described in detail here.
Note that, the position of the heart beat may be the center position of the heart beat, or other positions on the heart beat may be set as the position of the heart beat by a developer according to actual service requirements, which is not limited in this embodiment.
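To make the positioning step above concrete, the following is a minimal Python sketch, not part of the patent: given the predicted classification of each heart beat, it returns the positions (indices) of the beats that match a target class to be positioned. The class names and list representation are illustrative assumptions.

```python
def locate_target_beats(predicted_classes, target_class):
    """Return the positions (indices) of heart beats whose predicted
    classification matches the heartbeat classification to be positioned."""
    return [i for i, c in enumerate(predicted_classes) if c == target_class]


# Example: the second and fourth beats (indices 1 and 3) belong to "class C".
positions = locate_target_beats(["A", "C", "A", "C"], target_class="C")
print(positions)  # [1, 3]
```

This mirrors the example in the description: only beats whose preset classification equals the class to be positioned are located for subsequent analysis.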
In an alternative embodiment, a deep learning model with a ResNet architecture may be trained in advance using the Squeeze-and-Excitation block technique; the specific training procedure for this model is described in the embodiment shown in fig. 4 below and is not detailed here.
In another embodiment, the type of the deep learning model may be further selected by a developer according to the actual service requirement to train the existing deep learning model, which is not limited in this embodiment.
As can be seen from the foregoing description, in the embodiment of the present disclosure, target electrocardiographic data is obtained, a plurality of heart beats to be detected are identified from the corresponding electrocardiographic image, the heart beats are input into a pre-trained deep learning model, and the preset classification of each heart beat to be detected is determined based on the model's output, so that the accuracy of classifying the heart beats to be detected can be improved.
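The three steps S101–S103 can be outlined as a single pipeline. The sketch below is a hypothetical Python illustration only: every function body is a toy stand-in (a synthetic signal, naive peak picking, a constant classifier), not the patented sensor acquisition, image recognition, or deep learning model.

```python
def acquire_target_ecg():
    # Stand-in for S101 (sensor acquisition): a short synthetic signal.
    return [0.0, 0.8, 0.1, 0.0, 0.9, 0.2, 0.0]


def identify_beats(ecg):
    # Stand-in for S102 (image-based beat identification):
    # treat local peaks of the signal as heart beats to be detected.
    return [i for i in range(1, len(ecg) - 1)
            if ecg[i] > ecg[i - 1] and ecg[i] > ecg[i + 1]]


def classify_beat(beat_index):
    # Stand-in for S103 (the deep learning model): always predict "normal".
    return "normal"


ecg = acquire_target_ecg()
beats = identify_beats(ecg)
labels = [classify_beat(b) for b in beats]
print(beats, labels)  # [1, 4] ['normal', 'normal']
```

The point of the sketch is the data flow (acquire, identify, classify), which matches the order of the claimed steps.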
FIG. 2 is a flowchart illustrating how a beat to be measured is identified, according to an example embodiment; the present embodiment is exemplified on the basis of the above-described embodiments by taking as an example how to recognize a heart beat to be measured. As shown in fig. 2, in the step S102, a plurality of heart beats to be detected are identified from the electrocardiographic image corresponding to the target electrocardiographic data, including the following steps S201 to S202:
In step S201, the target electrocardiographic data is divided into a plurality of data segments based on a preset time interval.
In an alternative embodiment, after the target electrocardiographic data is acquired, the target electrocardiographic data may be divided into a plurality of data segments based on a preset time interval.
In an alternative embodiment, a sliding window with a preset length may be used to sequentially slide on the target electrocardiographic data, so as to divide the target electrocardiographic data into a plurality of data segments.
Note that, the length of the sliding window may be freely set by a developer according to an actual requirement or a physiological limit of the subject, for example, set to 200 ms, which is not limited in this embodiment.
In an alternative embodiment, each two adjacent data segments in the plurality of data segments may overlap, which is not limited in this embodiment.
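The sliding-window segmentation of S201 can be sketched as follows. This is an illustrative Python snippet under assumed parameters (a 500 Hz signal and 200 ms windows with 50% overlap); the patent leaves the window length and step to the developer.

```python
def segment_ecg(samples, window_len, step):
    """Split ECG samples into fixed-length (possibly overlapping) data segments
    by sliding a window of `window_len` samples forward by `step` samples."""
    return [samples[i:i + window_len]
            for i in range(0, len(samples) - window_len + 1, step)]


# 1 s of a 500 Hz signal; 200 ms windows (100 samples) with 50% overlap.
signal = list(range(500))
segments = segment_ecg(signal, window_len=100, step=50)
print(len(segments))   # 9
print(segments[1][0])  # 50 (the second window starts 50 samples in)
```

With `step` equal to `window_len` the segments are disjoint; a smaller `step` produces the overlapping adjacent segments mentioned above.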
In step S202, a beat to be measured is identified from the electrocardiographic image corresponding to each data segment.
In an optional embodiment, after the target electrocardiographic data is divided into a plurality of data segments, an electrocardiographic image corresponding to each data segment may be drawn, and then a preset image recognition algorithm may be adopted to recognize a corresponding cardiac beat to be detected from the electrocardiographic image.
In an alternative embodiment, the preset image recognition algorithm may be selected by a developer according to actual service needs, for example, the YOLO (You Only Look Once) object detection algorithm, which is not limited in this embodiment.
As can be seen from the foregoing description, in this embodiment the target electrocardiographic data is divided into a plurality of data segments based on a preset time interval, and the heart beats to be detected are identified from the electrocardiographic image corresponding to each data segment. In this way a plurality of heart beats to be detected can be accurately identified from the electrocardiographic image corresponding to the target electrocardiographic data, providing an accurate basis for subsequent feature extraction and classification of the heart beats to be detected.
FIG. 3 is a flow chart illustrating how a preset classification of heart beats to be measured is determined according to an exemplary embodiment; the present embodiment is exemplified on the basis of the above-described embodiments by taking as an example how to determine a preset classification of heart beats to be measured. As shown in fig. 3, the step S103 determines a preset classification of each cardiac beat to be measured based on the output result of the deep learning model, and includes the following steps S301-S302:
In step S301, a probability that each of the heart beats to be measured belongs to each of a plurality of preset classifications is determined based on an output result of the deep learning model.
In an optional embodiment, after a plurality of heart beats to be detected are identified from the electrocardiographic image corresponding to the target electrocardiographic data, each heart beat to be detected may be respectively input into a pre-trained deep learning model, so as to obtain a model output result corresponding to each heart beat to be detected.
In an optional embodiment, the model output result may include a probability that each of the heart beats to be tested belongs to each of a plurality of preset classifications.
In an alternative embodiment, if N (N is a positive integer) beat types exist, the model output result may be set as an N-dimensional vector, and the ith element in the vector may be used to represent the probability that the beat to be measured belongs to the ith class.
In an alternative embodiment, each dimension of the vector may be set as a probability of a beat type, so that multiple beat types may be detected at a time in the form of a vector, thereby improving efficiency and accuracy of classifying the beat to be detected.
In step S302, a preset classification of each of the heart beats to be measured is determined based on the probabilities.
In an optional embodiment, after determining the probability that each heart beat to be detected belongs to each of the plurality of preset classifications, the classification with the highest probability may be selected as the preset classification of that heart beat.
In an alternative embodiment, the probabilities of the plurality of preset classifications may be arranged in ascending order or descending order, so as to determine the classification with the highest probability.
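Steps S301–S302 amount to taking the argmax over the model's N-dimensional output vector. The following Python sketch assumes four illustrative classes; the class names and probability values are examples, not taken from the patent.

```python
def classify_from_probabilities(probabilities, class_names):
    """Pick the preset classification with the highest model-output probability
    (the i-th vector element is the probability of the i-th class)."""
    best = max(range(len(probabilities)), key=probabilities.__getitem__)
    return class_names[best]


# Hypothetical 4-class output vector for one heart beat to be detected.
classes = ["normal", "ventricular premature", "ventricular escape", "atrial premature"]
probs = [0.10, 0.75, 0.05, 0.10]
print(classify_from_probabilities(probs, classes))  # ventricular premature
```

A full sort (ascending or descending, as described above) also works, but only the maximum element is needed to pick the classification.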
As can be seen from the foregoing description, in this embodiment, by determining, based on the output result of the deep learning model, the probability that each cardiac beat to be measured belongs to each of a plurality of preset classifications, and determining, based on the probability, the preset classification of each cardiac beat to be measured, it is possible to determine, based on the output result of the deep learning model, the preset classification of each cardiac beat to be measured, and it is possible to improve the accuracy of classifying cardiac beats to be measured.
FIG. 4 is a flow chart illustrating a method of beat classification according to yet another exemplary embodiment; this embodiment may be used for a terminal device (e.g., a mobile terminal such as a mobile phone, a tablet computer, a personal computer, or a wearable device such as a smart band or a smart watch), as shown in fig. 4, and the method includes the following steps S401 to S407:
In step S401, a plurality of sample electrocardiographic data are acquired.
In an alternative embodiment, sample electrocardiographic data for different sample users may be obtained.
In an alternative embodiment, the sample electrocardiographic data may be a plurality of electrocardiographic data with unequal lengths, and the length of the electrocardiographic data may be between 30 seconds and 300 seconds, which is not limited in this embodiment.
In step S402, a plurality of sample beats are identified from an electrocardiographic image corresponding to each of the sample electrocardiographic data.
In an alternative embodiment, after the sample electrocardiographic data are acquired, corresponding electrocardiographic images may be drawn based on the sample data, and a plurality of sample heart beats may then be identified from the images using a preset image recognition algorithm.
In an alternative embodiment, the preset image recognition algorithm may be selected by a developer according to actual service needs, for example, the YOLO (You Only Look Once) object detection algorithm, which is not limited in this embodiment.
In an alternative embodiment, the method for identifying the sample beat is the same as the method for identifying the beat to be measured in the embodiment shown in fig. 2, and will not be described herein.
In step S403, a preset classification corresponding to each of the sample beats is calibrated.
In an optional embodiment, after a plurality of sample cardiac beats are identified from the electrocardiographic image corresponding to each sample electrocardiographic data, a probability of each preset classification in a plurality of preset classifications corresponding to each sample cardiac beat may be determined, and thus a preset classification corresponding to each sample cardiac beat may be determined.
In an alternative embodiment, the preset classification corresponding to each heart beat may be represented as a multi-dimensional vector, where each dimension corresponds to one heart beat type; multiple heart beat types can then be detected at once in vector form, improving the efficiency and accuracy of classifying the heart beats to be detected.
It should be noted that, the above-mentioned determination of the probability of each preset classification corresponding to each sample beat may be determined by a developer according to actual service experience or a statistical result of related data, which is not limited in this embodiment.
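The multi-dimensional label vector described above can be illustrated with a simple one-hot encoding. This is a hypothetical sketch of one common way to calibrate such labels; the patent does not fix a specific encoding, and the class names are illustrative.

```python
def one_hot(label, class_names):
    """Encode a calibrated sample-beat classification as an N-dimensional
    vector, one dimension per preset heart beat type."""
    return [1.0 if name == label else 0.0 for name in class_names]


classes = ["normal", "ventricular premature", "ventricular escape", "atrial premature"]
print(one_hot("ventricular escape", classes))  # [0.0, 0.0, 1.0, 0.0]
```

At training time these vectors serve as targets for the model's N-dimensional probability output described in steps S301–S302.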
In step S404, the plurality of sample heart beats and the corresponding preset classifications are used as a training set to train a deep learning model.
In an optional embodiment, after calibrating the preset classification corresponding to each sample beat, the plurality of sample beats and the corresponding preset classifications may be used as a training set to train a pre-selected deep learning model.
It should be noted that the type of deep learning model may be selected by a developer according to actual needs, for example, a deep learning model with a ResNet architecture trained using the Squeeze-and-Excitation block technique, or a multi-label random forest regression model; this embodiment is not limited thereto.
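For reference, the forward pass of a Squeeze-and-Excitation block (the channel-attention mechanism referred to above) can be sketched in NumPy. This is a minimal illustration of the general SE technique under assumed shapes and random weights, not the patent's actual network.

```python
import numpy as np


def squeeze_excitation(feature_map, w1, w2):
    """Forward pass of a Squeeze-and-Excitation block on a (C, L) feature map:
    squeeze via global average pooling, excite through a two-layer bottleneck,
    then rescale each channel by its learned gate in (0, 1)."""
    z = feature_map.mean(axis=1)            # squeeze: (C,)
    s = np.maximum(w1 @ z, 0.0)             # ReLU bottleneck: (C // r,)
    gate = 1.0 / (1.0 + np.exp(-(w2 @ s)))  # sigmoid gate: (C,)
    return feature_map * gate[:, None]      # channel-wise rescaling


rng = np.random.default_rng(0)
C, L, r = 8, 16, 2                          # channels, length, reduction ratio
x = rng.standard_normal((C, L))
w1 = rng.standard_normal((C // r, C)) * 0.1
w2 = rng.standard_normal((C, C // r)) * 0.1
out = squeeze_excitation(x, w1, w2)
print(out.shape)  # (8, 16)
```

In an SE-ResNet, such a block sits inside each residual unit, reweighting channels before the skip connection is added.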
In step S405, target electrocardiographic data is acquired.
In step S406, a plurality of heart beats to be detected are identified from the electrocardiographic image corresponding to the target electrocardiographic data.
In step S407, the plurality of heart beats to be measured are input into a pre-trained deep learning model, and a preset classification of each heart beat to be measured is determined based on an output result of the deep learning model.
The explanation and explanation of steps S405-S407 may be referred to the above embodiments and are not repeated here.
As can be seen from the foregoing description, in this embodiment a plurality of sample electrocardiographic data are acquired, a plurality of sample heart beats are identified from the electrocardiographic image corresponding to each sample, and the preset classification of each sample beat is calibrated; the sample beats and their corresponding classifications are then used as a training set to train the deep learning model. The preset classification of each heart beat to be detected can thus be determined by the trained model, improving the accuracy of classifying the heart beats to be detected.
FIG. 5 is a block diagram illustrating a beat classification device according to an example embodiment; as shown in fig. 5, the apparatus includes: the target data acquisition module 110, the heart beat to be measured identification module 120, and the heart beat to be measured classification module 130, wherein:
A target data acquisition module 110, configured to acquire target electrocardiographic data;
the heart beat identification module 120 to be tested is used for identifying a plurality of heart beats to be tested from the electrocardio image corresponding to the target electrocardio data;
And the heart beat classification module 130 is used for inputting the heart beats to be detected into a pre-trained deep learning model, and determining the preset classification of each heart beat to be detected based on the output result of the deep learning model.
As can be seen from the foregoing description, in the embodiment of the present disclosure, target electrocardiographic data is acquired, a plurality of heart beats to be detected are identified from the corresponding electrocardiographic image, the heart beats are input into a pre-trained deep learning model, and the preset classification of each heart beat is determined based on the model's output; by classifying the input heart beats with a pre-trained deep learning model, the accuracy of classifying the heart beats to be detected may be improved.
FIG. 6 is a block diagram illustrating another heart beat classification device according to an example embodiment. The target data acquisition module 210, the heart beat to be detected identification module 220, and the heart beat to be detected classification module 230 are the same as the target data acquisition module 110, the heart beat to be detected identification module 120, and the heart beat to be detected classification module 130 in the embodiment shown in FIG. 5, and are not described again here.
As shown in FIG. 6, the heart beat to be detected identification module 220 may include:
A target data dividing unit 221, configured to divide the target electrocardiographic data into a plurality of data segments based on a preset time interval;
and a heart beat to be detected identifying unit 222, configured to identify the heart beat to be detected from the electrocardiographic image corresponding to each data segment.
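The segmentation performed by the target data dividing unit 221 can be sketched as below. The sampling frequency and segment length are assumed values for illustration; the patent only states that a preset time interval is used.

```python
# Minimal sketch of dividing target ECG data into data segments by a
# preset time interval. FS and SEGMENT_SECONDS are assumed values; the
# patent does not specify a sampling frequency or interval length.
FS = 250              # assumed sampling frequency in Hz
SEGMENT_SECONDS = 2   # assumed preset time interval in seconds

def split_into_segments(samples, fs=FS, seconds=SEGMENT_SECONDS):
    """Divide the target electrocardiographic data into consecutive segments."""
    step = fs * seconds
    return [samples[i:i + step] for i in range(0, len(samples), step)]

segments = split_into_segments(list(range(1000)))
```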
In an alternative embodiment, the heart beat to be detected classification module 230 may include:
a classification probability determining unit 231, configured to determine, based on an output result of the deep learning model, the probability that each heart beat to be detected belongs to each of a plurality of preset classifications;
and a heart beat to be detected classification unit 232, configured to determine the preset classification of each heart beat to be detected based on the probability.
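A common way to turn a model's raw outputs into per-class probabilities and then pick the preset classification with the highest probability is a softmax followed by an argmax. The sketch below is a plausible reading of units 231 and 232, not the patent's stated implementation; the class names are hypothetical.

```python
import math

# Hypothetical preset classifications; the patent does not enumerate them.
CLASSES = ["normal", "ventricular premature beat", "atrial premature beat"]

def softmax(logits):
    """Turn raw model outputs into per-class probabilities."""
    m = max(logits)                       # subtract max for numeric stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify_beat(logits, classes=CLASSES):
    """Pick the preset classification with the highest probability."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return classes[best], probs[best]

label, prob = classify_beat([2.0, 0.1, -1.0])
```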
In an alternative embodiment, the apparatus may further include a deep learning model training module 240;
the deep learning model training module 240 may include:
A sample data acquiring unit 241 for acquiring a plurality of sample electrocardiographic data;
A sample heart beat identification unit 242, configured to identify a plurality of sample heart beats from the electrocardiographic image corresponding to each sample electrocardiographic data;
A sample feature calibration unit 243, configured to calibrate a preset classification corresponding to each sample beat;
The deep learning model training unit 244 is configured to train the deep learning model by using the plurality of sample heart beats and the corresponding preset classifications as training sets.
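The training-set assembly performed by units 241-244 can be sketched as follows; `identify_beats` and `calibrate` are hypothetical placeholders for the image-recognition step and the manual calibration of each sample beat's preset classification.

```python
# Sketch of assembling the training set from calibrated sample beats.
# `identify_beats` and `calibrate` are hypothetical stand-ins for the
# image-recognition step and the per-beat calibration step.

def build_training_set(sample_records, identify_beats, calibrate):
    features, labels = [], []
    for record in sample_records:            # each sample ECG data record
        for beat in identify_beats(record):  # sample heart beats
            features.append(beat)
            labels.append(calibrate(beat))   # calibrated preset classification
    return features, labels

# Usage with trivial stand-ins:
X, y = build_training_set(
    sample_records=[[0.1, 0.2], [0.9, 1.1]],
    identify_beats=lambda rec: [rec],        # one beat per record, for brevity
    calibrate=lambda beat: "normal" if max(beat) < 1.0 else "abnormal",
)
```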
In an alternative embodiment, the apparatus may further include:
A heart beat position determining module 250, configured to determine the position of each heart beat to be detected among the plurality of heart beats to be detected.
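If, as in the claims, positions are only reported for beats whose classification requires localization, the position-determining step might look like the sketch below; the set of classes requiring localization is an assumed example.

```python
# Sketch of the position-determination step: for beats whose preset
# classification belongs to a class that needs locating (e.g. premature
# beats), report the beat's index among all heart beats to be detected.
# The set of classes requiring localization is an assumed example.
LOCALIZE = {"ventricular premature beat", "atrial premature beat"}

def positions_to_localize(labels, localize=LOCALIZE):
    return [i for i, label in enumerate(labels) if label in localize]

positions = positions_to_localize(
    ["normal", "ventricular premature beat", "normal", "atrial premature beat"]
)
```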
In an alternative embodiment, the deep learning model may include a ResNet-architecture deep learning model trained with the Squeeze-and-Excitation block technique.
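A Squeeze-and-Excitation block (the likely intended reading of the block technique named above) reweights feature channels by "squeezing" them with global average pooling and then "exciting" them through a small two-layer gating network. Below is a minimal numpy sketch for 1-D ECG feature maps; the weight shapes and reduction ratio are illustrative and not taken from the patent.

```python
import numpy as np

def squeeze_excitation(feature_map, w1, w2):
    """Rescale the channels of a (channels, length) feature map.

    Squeeze: global average pooling over the time axis.
    Excitation: two fully connected layers (ReLU, then sigmoid)
    producing one weight in (0, 1) per channel.
    """
    z = feature_map.mean(axis=1)               # squeeze -> (channels,)
    s = np.maximum(w1 @ z, 0.0)                # FC + ReLU -> (channels // r,)
    s = 1.0 / (1.0 + np.exp(-(w2 @ s)))        # FC + sigmoid -> (channels,)
    return feature_map * s[:, None]            # channel-wise rescaling

# Illustrative shapes: 4 channels, length 8, reduction ratio r = 2.
rng = np.random.default_rng(0)
channels, length, r = 4, 8, 2
fmap = rng.standard_normal((channels, length))
w1 = rng.standard_normal((channels // r, channels))
w2 = rng.standard_normal((channels, channels // r))
out = squeeze_excitation(fmap, w1, w2)
```

Since the sigmoid gate lies in (0, 1), each channel of the output is a damped copy of the input channel.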
Since the apparatus embodiments essentially correspond to the method embodiments, reference may be made to the description of the method embodiments for the relevant points. The apparatus embodiments described above are merely illustrative: units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the objectives of the disclosed solution. Those of ordinary skill in the art can understand and implement the present disclosure without undue burden.
The heart beat classification method can be executed by a wearable device with an electrocardiographic data acquisition function; the structure of the wearable device is shown in the schematic diagram of FIG. 7. As shown in FIG. 7, the wearable device may include a processor 1510, a communication interface (Communications Interface) 1520, a memory 1530, and a bus 1540. The processor 1510, the communication interface 1520, and the memory 1530 communicate with each other via the bus 1540.
The memory 1530 may store heart beat classification logic instructions and may be, for example, a non-volatile memory. The processor 1510 may invoke and execute the heart beat classification logic instructions in the memory 1530 to perform the heart beat classification method described above. For example, the heart beat classification logic instructions may be a program corresponding to part of the functions of control software of a medical image processing system; when the processor executes the instructions, the wearable device may display a functional interface corresponding to the instructions on its display interface.
If implemented in the form of software functional units and sold or used as a stand-alone product, the functionality of the heart beat classification logic instructions may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present disclosure, in essence or in the part contributing to the prior art, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the method according to the embodiments of the present disclosure. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following its general principles and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
Claims (12)
1. A heart beat classification method, comprising:
acquiring target electrocardiographic data, and generating a corresponding electrocardiographic image based on the target electrocardiographic data, wherein the target electrocardiographic data comprises electrocardiographic data of a user acquired by a sensor at a preset frequency;
identifying a plurality of heart beats to be detected from the electrocardiographic image corresponding to the target electrocardiographic data by using a preset image recognition algorithm;
inputting the plurality of heart beats to be detected into a pre-trained deep learning model, and determining a preset classification of each heart beat to be detected based on an output result of the deep learning model;
wherein the method further comprises:
if the preset classification of any heart beat to be detected among the plurality of heart beats to be detected belongs to a preset heart beat classification requiring localization, determining the position of that heart beat to be detected among the plurality of heart beats to be detected.
2. The method according to claim 1, wherein identifying a plurality of heart beats to be detected from the electrocardiographic image corresponding to the target electrocardiographic data comprises:
dividing the target electrocardiographic data into a plurality of data segments based on a preset time interval;
and identifying the heart beat to be detected from the electrocardiographic image corresponding to each data segment.
3. The method of claim 1, wherein determining the preset classification of each heart beat to be detected based on the output result of the deep learning model comprises:
determining the probability of each heart beat to be detected belonging to each preset classification in a plurality of preset classifications based on the output result of the deep learning model;
and determining the preset classification of each heart beat to be detected based on the probability.
4. The method of claim 1, further comprising training the deep learning model based on the following steps in advance:
acquiring a plurality of sample electrocardiographic data;
identifying a plurality of sample heart beats from the electrocardiographic image corresponding to each sample electrocardiographic data;
calibrating the preset classification corresponding to each sample heart beat;
taking the plurality of sample heart beats and the corresponding preset classifications as training sets, and training a deep learning model.
5. The method of any one of claims 1-4, wherein the deep learning model comprises a ResNet-architecture deep learning model trained based on the Squeeze-and-Excitation block technique.
6. A heart beat classification device, comprising:
a target data acquisition module, configured to acquire target electrocardiographic data and generate a corresponding electrocardiographic image based on the target electrocardiographic data, wherein the target electrocardiographic data comprises electrocardiographic data of a user acquired by a sensor at a preset frequency;
a heart beat to be detected identification module, configured to identify a plurality of heart beats to be detected from the electrocardiographic image corresponding to the target electrocardiographic data by using a preset image recognition algorithm;
a heart beat to be detected classification module, configured to input the plurality of heart beats to be detected into a pre-trained deep learning model and determine a preset classification of each heart beat to be detected based on an output result of the deep learning model;
and a heart beat position determining module, configured to determine the position of any heart beat to be detected among the plurality of heart beats to be detected when the preset classification of that heart beat belongs to a preset heart beat classification requiring localization.
7. The apparatus of claim 6, wherein the heart beat to be detected identification module comprises:
a target data dividing unit, configured to divide the target electrocardiographic data into a plurality of data segments based on a preset time interval;
and a heart beat to be detected identification unit, configured to identify the heart beat to be detected from the electrocardiographic image corresponding to each data segment.
8. The apparatus of claim 6, wherein the heart beat to be detected classification module comprises:
a classification probability determining unit, configured to determine, based on an output result of the deep learning model, the probability that each heart beat to be detected belongs to each of a plurality of preset classifications;
and a heart beat to be detected classification unit, configured to determine the preset classification of each heart beat to be detected based on the probability.
9. The apparatus of claim 6, further comprising a deep learning model training module;
The deep learning model training module comprises:
a sample data acquisition unit for acquiring a plurality of sample electrocardiographic data;
a sample heart beat identification unit, configured to identify a plurality of sample heart beats from the electrocardiographic image corresponding to each sample electrocardiographic data;
The sample characteristic calibration unit is used for calibrating the preset classification corresponding to each sample heart beat;
And the deep learning model training unit is used for taking the plurality of sample heart beats and the corresponding preset classifications as training sets to train the deep learning model.
10. The apparatus of any of claims 6-9, wherein the deep learning model comprises a ResNet-architecture deep learning model trained based on the Squeeze-and-Excitation block technique.
11. A wearable device, the device comprising:
A processor;
A memory configured to store processor-executable instructions;
wherein the processor is configured to perform the method of any of the preceding claims 1-5.
12. A computer-readable storage medium having stored thereon a computer program, characterized in that the program, when executed by a processor, implements the method of any of the preceding claims 1-5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910220835.0A CN111723622B (en) | 2019-03-22 | 2019-03-22 | Heart beat classification method, heart beat classification device, wearable equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111723622A CN111723622A (en) | 2020-09-29 |
CN111723622B true CN111723622B (en) | 2024-04-26 |
Family
ID=72562824
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910220835.0A Active CN111723622B (en) | 2019-03-22 | 2019-03-22 | Heart beat classification method, heart beat classification device, wearable equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111723622B (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0776631A1 (en) * | 1995-11-29 | 1997-06-04 | Hewlett-Packard Company | Method and apparatus for classifying heartbeats in an ECG waveform |
CN104706366A (en) * | 2013-12-13 | 2015-06-17 | *** Communications Group Co. | Distraction testing method, device and system |
CN108039203A (en) * | 2017-12-04 | 2018-05-15 | 北京医拍智能科技有限公司 | The detecting system of arrhythmia cordis based on deep neural network |
CN108030488A (en) * | 2017-11-30 | 2018-05-15 | 北京医拍智能科技有限公司 | The detecting system of arrhythmia cordis based on convolutional neural networks |
CN108464827A (en) * | 2018-03-08 | 2018-08-31 | 四川大学 | It is a kind of it is Weakly supervised under electrocardio image-recognizing method |
CN108596073A (en) * | 2018-04-19 | 2018-09-28 | 郑州大学 | A kind of lightweight algorithm identifying electrocardiogram (ECG) data based on deep learning |
CN109091138A (en) * | 2018-07-12 | 2018-12-28 | 上海微创电生理医疗科技股份有限公司 | The judgment means and Mapping System of arrhythmia cordis originating point |
CN109288515A (en) * | 2018-11-14 | 2019-02-01 | 东南大学 | Periodical monitoring method and device based on premature beat signal in wearable ECG signal |
CN109350032A (en) * | 2018-10-16 | 2019-02-19 | 武汉中旗生物医疗电子有限公司 | A kind of classification method, system, electronic equipment and storage medium |
CN109480825A (en) * | 2018-12-13 | 2019-03-19 | 武汉中旗生物医疗电子有限公司 | The processing method and processing device of electrocardiogram (ECG) data |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8024030B2 (en) * | 2009-08-12 | 2011-09-20 | Siemens Aktiengesellschaft | System and method for analyzing an electrocardiogram signal |
2019-03-22 | CN | Application CN201910220835.0A filed (patent CN111723622B, status: Active)
Non-Patent Citations (5)
Title |
---|
An Automated ECG Beat Classification System Using Convolutional Neural Networks; Muhammad Zubair; 2016 6th International Conference on IT Convergence and Security (ICITCS); 2016-11-10; full text *
Algorithm design and FPGA implementation of ECG heart rate variability analysis; Li Mengni; China Masters' Theses Full-text Database, Information Science and Technology; Vol. 2017, No. 03; pp. I136-383 *
Design and implementation of an ARM-based remote ECG monitoring platform; Yan Chenwei; China Masters' Theses Full-text Database, Information Science and Technology; Vol. 2014, No. 05; pp. I140-480 *
Development of an arrhythmia auxiliary diagnosis system; Chen Xiaoli et al.; Chinese Journal of Medical Physics; Vol. 27, No. 02; pp. 1-6 *
Wang Di; Research on heart beat classification algorithms with low computational complexity; China Masters' Theses Full-text Database, Information Science and Technology; 2019; pp. I136-361 *
Also Published As
Publication number | Publication date |
---|---|
CN111723622A (en) | 2020-09-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108186011B (en) | Atrial fibrillation detection method, atrial fibrillation detection device and readable storage medium | |
CN107981858B (en) | Automatic electrocardiogram heart beat identification and classification method based on artificial intelligence | |
US11562222B2 (en) | Systems and methods of identity analysis of electrocardiograms | |
KR102141617B1 (en) | Method, system and non-transitory computer-readable recording medium for estimating arrhythmia by using artificial neural network | |
WO2019100566A1 (en) | Artificial intelligence self-learning-based static electrocardiography analysis method and apparatus | |
Pollreisz et al. | A simple algorithm for emotion recognition, using physiological signals of a smart watch | |
WO2019161611A1 (en) | Ecg information processing method and ecg workstation | |
EP4003147A1 (en) | Deep end-to-end classification of electrocardiogram data | |
KR101796055B1 (en) | Method and device for monitoring brain status by constructing multiple brain networks | |
US10178974B2 (en) | Method and system for monitoring continuous biomedical signal | |
CN108652640B (en) | Non-invasive blood glucose detection method and system based on electrocardiosignals | |
CN112788200B (en) | Method and device for determining frequency spectrum information, storage medium and electronic device | |
CN116602642B (en) | Heart rate monitoring method, device and equipment | |
CN111631704B (en) | Diabetes early-stage detection system and method based on combination of electrocardio information and electroencephalogram information | |
CN108962379B (en) | Mobile phone auxiliary detection system for cranial nerve system diseases | |
CN111723622B (en) | Heart beat classification method, heart beat classification device, wearable equipment and storage medium | |
CN116636856A (en) | Electrocardiogram classification method, system, electronic equipment and storage medium | |
CN113171102B (en) | ECG data classification method based on continuous deep learning | |
CN111601539A (en) | Method for automatically diagnosing disease of subject and system for implementing method | |
US20230248295A1 (en) | Method for selecting features from electroencephalogram signals | |
Dan et al. | Sensor selection and miniaturization limits for detection of interictal epileptiform discharges with wearable EEG | |
CN112957018A (en) | Heart state detection method and device based on artificial intelligence | |
KR102149748B1 (en) | Method and apparatus for obtaining heart and lung sounds | |
CN112842355A (en) | Electrocardiosignal heart beat detection and identification method based on deep learning target detection | |
CN116229521B (en) | Method, device and equipment for detecting heart information based on multi-scale features |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||