CN115035057A - Method, device, storage medium and equipment for acquiring aqueous humor cell concentration of anterior chamber of eye - Google Patents


Info

Publication number
CN115035057A
CN115035057A (application CN202210606965.XA; granted as CN115035057B)
Authority
CN
China
Prior art keywords
gray
pixel points
anterior chamber
eye
anterior
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210606965.XA
Other languages
Chinese (zh)
Other versions
CN115035057B (en
Inventor
王晓春
惠博阳
周盛
王效宁
李泽萌
计建军
巩丽文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TIANJIN EYE HOSPITAL
Institute of Biomedical Engineering of CAMS and PUMC
Original Assignee
TIANJIN EYE HOSPITAL
Institute of Biomedical Engineering of CAMS and PUMC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TIANJIN EYE HOSPITAL, Institute of Biomedical Engineering of CAMS and PUMC filed Critical TIANJIN EYE HOSPITAL
Priority to CN202210606965.XA priority Critical patent/CN115035057B/en
Publication of CN115035057A publication Critical patent/CN115035057A/en
Application granted granted Critical
Publication of CN115035057B publication Critical patent/CN115035057B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10132Ultrasound image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30041Eye; Retina; Ophthalmic

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention provides a method, a device, a storage medium and equipment for acquiring the aqueous humor cell concentration of the anterior chamber of an eye. The method comprises the following steps: acquiring an anterior chamber region and a corresponding region range value from an ultrasonic image of a target anterior segment of the eye; denoising the anterior chamber region to obtain a plurality of gray pixel point regions divided by the denoised pixel points, and determining the gray pixel point regions as high-brightness gray-scale spots of the anterior chamber region; for each high-brightness gray-scale spot, taking the pixel points within its range as target pixel points and acquiring the gray difference value between each target pixel point and all of its adjacent pixel points; obtaining cell pixel points from all the gray difference values corresponding to the target pixel points and a preset gray difference threshold; and determining the number of cell pixel points as the number of cells. The aqueous humor cell concentration of the anterior chamber can then be obtained as the ratio of the cell number to the region range value, which improves the measurement accuracy of the aqueous humor cell concentration.

Description

Method, device, storage medium and equipment for acquiring the aqueous humor cell concentration of the anterior chamber of an eye
Technical Field
The invention relates to the technical field of aqueous humor cell concentration detection, and in particular to a method, device, storage medium and equipment for acquiring the aqueous humor cell concentration of the anterior chamber of an eye.
Background
Aqueous humor cell concentration is a physiological parameter of the human eye. Normally, owing to the blood-aqueous barrier, the aqueous humor is a clear, transparent liquid with very little protein and very few cells. When vascular permeability increases, cellular material such as blood cells, proteins, or fibrinous exudates permeates into the aqueous humor in large amounts. Acquiring the aqueous humor cell concentration therefore helps determine whether cellular material has infiltrated the eye and to what extent. However, in the prior art, the measurement accuracy of the aqueous humor cell concentration is low.
Disclosure of Invention
The invention aims to overcome the above defects and shortcomings of the prior art by providing a method, device, storage medium and equipment for acquiring the aqueous humor cell concentration of the anterior chamber of an eye, which can improve the measurement accuracy of that concentration.
One embodiment of the present invention provides a method for obtaining an aqueous humor cell concentration of an anterior chamber of an eye, including:
acquiring an ultrasonic image of a target anterior ocular segment;
acquiring an anterior chamber region and a corresponding region range value from the ultrasonic image;
denoising the anterior chamber region to obtain a plurality of gray pixel point regions divided by the denoised pixel points, and determining the gray pixel point regions as high-brightness gray-scale spots of the anterior chamber region, wherein a gray pixel point region refers to an aggregation region of a plurality of pixel points whose gray values remain greater than 0 after denoising, and the high-brightness gray-scale spots are formed by the scattering of ultrasound waves by cells in the anterior chamber region;
for each high-brightness gray scale spot, taking a pixel point in the range of each high-brightness gray scale spot as a target pixel point, traversing the target pixel point, and obtaining a gray difference value between each target pixel point and all adjacent pixel points;
obtaining cell pixel points according to all the gray level difference values corresponding to the target pixel points and a preset gray level difference threshold value;
determining the number of the cell pixel points as the number of cells;
and obtaining the aqueous humor cell concentration of the anterior chamber of the eye according to the ratio of the cell number to the area range value.
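For illustration only, the sequence of steps above can be sketched on a toy grayscale array. Every value below — the image data, the gray difference threshold, and the pixel-to-area calibration — is an assumption made for the sketch, not a value taken from the patent:

```python
from statistics import mean

# Toy 2D grayscale "anterior chamber region" (values 0-255); the data,
# threshold, and scale are illustrative assumptions, not patent values.
REGION = [
    [10,  12,  11,  10,  13],
    [11, 200,  12,  12,  10],
    [12,  13,  12, 180,  12],
    [10,  11,  12,  10,  11],
]
GRAY_DIFF_THRESHOLD = 30   # preset gray difference threshold (assumed)
MM2_PER_PIXEL = 0.01       # calibration: physical area of one pixel (assumed)

def cell_concentration(region, diff_thr, mm2_per_px):
    h, w = len(region), len(region[0])
    avg = mean(v for row in region for v in row)
    # Denoising: reduce the gray value of pixel points below the
    # regional average to 0 (one embodiment of the denoising step).
    den = [[v if v >= avg else 0 for v in row] for row in region]
    # A surviving gray pixel point counts as a cell pixel point when its
    # gray difference to every one of its 8 neighbours exceeds diff_thr.
    cells = 0
    for y in range(h):
        for x in range(w):
            if den[y][x] == 0:
                continue
            diffs = [den[y][x] - den[y + dy][x + dx]
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dy, dx) != (0, 0)
                     and 0 <= y + dy < h and 0 <= x + dx < w]
            if diffs and all(d > diff_thr for d in diffs):
                cells += 1
    # Concentration = cell number / region range value (here, an area).
    area = h * w * mm2_per_px
    return cells, cells / area

n_cells, concentration = cell_concentration(
    REGION, GRAY_DIFF_THRESHOLD, MM2_PER_PIXEL)
```

On this toy input the two isolated bright pixel points survive the mean-gray denoising and each exceeds all of its neighbours by more than the threshold, so they are counted as two cells.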
An embodiment of the present invention also provides an aqueous humor cell concentration obtaining apparatus including:
the image acquisition module is used for acquiring an ultrasonic image of a target anterior ocular segment;
the region range acquisition module is used for acquiring an anterior chamber region of the eye and a corresponding region range value from the ultrasonic image;
the high-brightness gray scale spot acquisition module is used for reducing noise of the anterior chamber region to obtain a plurality of gray scale pixel point regions divided by the noise-reduced pixel points, and determining the gray scale pixel point regions as the high-brightness gray scale spots of the anterior chamber region; the gray pixel point region refers to an aggregation region of a plurality of pixels with the gray values still larger than 0 after noise reduction; the high intensity gray scale spots are formed by the scattering of ultrasound waves on cells in the anterior chamber region of the eye;
the gray difference value acquisition module is used for taking the pixel points in the range of each high-brightness gray scale spot as target pixel points for each high-brightness gray scale spot, traversing the target pixel points and acquiring the gray difference value of each target pixel point and all adjacent pixel points;
the cell pixel acquisition module is used for acquiring cell pixel points according to all the gray level difference values corresponding to the target pixel points and a preset gray level difference threshold value;
the cell number acquisition module is used for determining the number of the cell pixel points as the cell number;
and the aqueous humor cell concentration calculating module is used for obtaining the aqueous humor cell concentration of the anterior chamber of the eye according to the ratio of the cell number to the area range value.
An embodiment of the present invention also provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the aqueous humor cell concentration obtaining method for the anterior chamber of the eye as described above.
An embodiment of the present invention further provides an electronic device comprising a memory, a processor, and a computer program stored in the memory and executable by the processor, wherein the processor executes the computer program to implement the steps of the method for obtaining the aqueous humor cell concentration of the anterior chamber of the eye as described above.
Compared with the prior art, the method for acquiring the aqueous humor cell concentration of the anterior chamber of an eye first denoises the anterior chamber region in the ultrasonic image to obtain the high-brightness gray-scale spots divided by the denoised pixel points. It then takes the pixel points within the range of each high-brightness gray-scale spot as target pixel points, obtains the gray difference value between each target pixel point and all of its adjacent pixel points, obtains cell pixel points from all the gray difference values corresponding to each target pixel point and a preset gray difference threshold, and determines the cell number from the number of cell pixel points. Finally, the aqueous humor cell concentration of the anterior chamber is calculated from the cell number and the region range value of the anterior chamber region in the ultrasonic image, thereby achieving the technical effect of improving the measurement accuracy of the aqueous humor cell concentration of the anterior chamber of the eye.
In order that the invention may be more clearly understood, specific embodiments thereof will be described hereinafter with reference to the accompanying drawings.
Drawings
FIG. 1 is a flowchart of a method for obtaining the aqueous humor cell concentration of the anterior chamber of an eye according to one embodiment of the present invention.
FIG. 2 is a schematic diagram of the anterior segment of the eye of a patient with turbid aqueous humor, to which a method for obtaining the aqueous humor cell concentration of the anterior chamber of an eye according to one embodiment of the present invention is applied.
FIG. 3 is an ultrasound image of the anterior segment of the eye of a patient with turbid aqueous humor, to which a method for obtaining the aqueous humor cell concentration of the anterior chamber of an eye according to one embodiment of the present invention is applied.
FIG. 4 is a block diagram of an aqueous humor cell concentration acquisition device for the anterior chamber of the eye according to one embodiment of the present invention.
100. cornea; 300. iris; 500. lens; 1. image acquisition module; 2. region range acquisition module; 3. high-brightness gray-scale spot acquisition module; 4. gray difference value acquisition module; 5. cell pixel acquisition module; 6. cell number acquisition module; 7. aqueous humor cell concentration calculation module.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It should be understood that the embodiments described are only some embodiments of the present application, and not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the embodiments in the present application.
When the following description refers to the accompanying drawings, like numbers in different drawings denote the same or similar elements unless otherwise indicated. In the description of the present application, it should be understood that the terms "first," "second," "third," and the like are used solely to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order, nor should they be construed as indicating or implying relative importance. The specific meaning of these terms in the present application can be understood by those of ordinary skill in the art as appropriate. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. The word "if" as used herein may be interpreted as "upon", "when", or "in response to determining".
Further, in the description of the present application, "a plurality" means two or more unless otherwise specified. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Please refer to fig. 1, which is a flowchart illustrating a method for obtaining an aqueous humor cell concentration of an anterior chamber of an eye according to an embodiment of the present invention, the method includes:
s1: an ultrasound image of the target anterior segment of the eye is acquired.
The target anterior segment of the eye includes the anterior chamber, the posterior chamber, the zonules of the lens, the angle of the chamber, a portion of the lens, the peripheral vitreous body, the retina and the attachment points of the extraocular muscles of the eye and the conjunctiva.
The ultrasound image may be a two-dimensional image or a three-dimensional image. The ultrasonic image can be obtained by performing ultrasonic scanning on the target anterior ocular segment through an array transducer.
The two-dimensional image can be obtained by performing ultrasonic scanning on the target anterior ocular segment through the array transducer, and can also be obtained by screenshot from the three-dimensional image.
When the ultrasonic image to be acquired is a three-dimensional image, the method comprises the following steps:
s11: and scanning the target anterior ocular segment through the array transducer to obtain a pre-scanned image of the target anterior ocular segment.
In step S11, the array transducer scans the target anterior ocular segment while moving along a preset moving track, the moving track being perpendicular to the straight-ahead line of sight of the target anterior segment.
S12: a centerline of rotation is determined from the midpoint of the array transducer and the midpoint of the anterior segment tissue in the pre-scan image.
Specifically, a moving route of a midpoint of the array transducer is obtained when the array transducer scans the target anterior segment, and a virtual line segment which is perpendicularly intersected with the moving route is established through the midpoint of the anterior segment tissue in the pre-scanned image, wherein the virtual line segment is the rotation center line.
S13: moving the array transducer to one side of the rotation center line, driving the array transducer to rotate around the rotation center line, obtaining panoramic information of the target anterior ocular segment, and constructing a three-dimensional image of the target anterior ocular segment according to the panoramic information. Preferably, the array transducer rotates about the centerline of rotation for at least one complete cycle of rotation while scanning the target anterior ocular segment. Rotation of the array transducer from a starting position about the centerline of rotation in a single clockwise direction of rotation until the starting position is again reached indicates a complete cycle of rotation of the array transducer. Alternatively, the step S13 may be performed by scanning with 2 or more array transducers of the same parameters.
The panoramic information of the target anterior ocular segment is acquired through rotary scanning, and a three-dimensional image of the target anterior ocular segment is constructed in three-dimensional space through a reconstruction pipeline of registration, interpolation, and segmentation.
S2: an anterior chamber region of the eye and corresponding region range values are acquired from the ultrasound image.
Referring to FIGS. 2-3, the anterior chamber region and the corresponding region range value may be obtained from the tissue structures surrounding the anterior chamber region in the ultrasonic image, such as the cornea 100, the iris 300, and the lens 500.
In the ultrasonic image of the target anterior segment of the eye, if the aqueous humor in the anterior chamber region is clear and uniform and produces no echo information, the anterior chamber region appears as a black image without high-brightness gray-scale spots, indicating that no cellular material has penetrated into the aqueous humor of the anterior chamber region. If cellular material has infiltrated the aqueous humor of the anterior chamber region, high-brightness gray-scale spots appear in the image of the anterior chamber region owing to the scattering of the sound waves after they contact the cells. The cellular material is not necessarily limited to objects having a cellular structure and includes, for example but without limitation, cells, protein particles, and fibrinous exudates.
S3: denoising the anterior chamber region to obtain a plurality of gray pixel point regions divided by the denoised pixel points, and determining the gray pixel point regions as high-brightness gray scale spots of the anterior chamber region; the gray pixel point region refers to an aggregation region of a plurality of pixels with the gray values still larger than 0 after noise reduction.
The denoising operation is to reduce the gray level of the noise pixel point to 0. The noise pixel points may be pixel points with gray values lower than a preset gray threshold.
Through the denoising operation, the gray values of the noise pixel points are reduced to 0. As a result, the pixel points in the anterior chamber region fall into two classes: noise pixel points, whose gray values become 0 through the denoising operation, and gray pixel points, which retain their original gray values. The removal of the noise pixel points isolates the gray pixel points, so that a plurality of gray pixel point regions, each formed by an aggregation of gray pixel points, become visible; these gray pixel point regions are the high-brightness gray-scale spots. Because the high-brightness gray-scale spots are formed by the scattering of ultrasound waves by cells in the anterior chamber region, the gray values of their pixel points are greater than the preset gray threshold and are therefore unaffected by the denoising operation; that is, a gray pixel point region formed by the aggregation of gray pixel points unaffected by the denoising operation is a high-brightness gray-scale spot.
S4: and for each high-brightness gray scale spot, taking the pixel point in the range of each high-brightness gray scale spot as a target pixel point, traversing the target pixel point, and obtaining the gray difference value of each target pixel point and all adjacent pixel points.
The adjacent pixel point is a relative concept: it means any other pixel point adjacent to the current target pixel point, and the adjacent pixel points surround the corresponding current target pixel point. An adjacent pixel point need not lie within the high-brightness gray-scale spot; that is, it may be a pixel point that has undergone the denoising processing.
The gray scale difference has positive and negative values, for example: when the gray value of the target pixel point is smaller than that of an adjacent pixel point, the gray difference value of the target pixel point and the adjacent pixel point is a negative number smaller than 0, and when the gray value of the target pixel point is larger than that of the adjacent pixel point, the gray difference value of the target pixel point and the adjacent pixel point is a positive number larger than 0; and when the gray value of the target pixel point is equal to the gray value of an adjacent pixel point, the gray difference value of the target pixel point and the adjacent pixel point is 0.
When the ultrasonic image is a two-dimensional image, the total number of adjacent pixel points corresponding to a single target pixel point is 8, and when the ultrasonic image is a three-dimensional image, the total number of adjacent pixel points corresponding to a single target pixel point is 26.
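The 8- and 26-neighbour counts quoted above follow from enumerating all unit offsets around a pixel; a short sketch (the function name is illustrative):

```python
from itertools import product

def neighbor_offsets(ndim):
    # All offsets to the pixels adjacent to a given pixel in an
    # ndim-dimensional image, excluding the pixel itself.
    return [off for off in product((-1, 0, 1), repeat=ndim) if any(off)]

# A 2D pixel has 3**2 - 1 = 8 adjacent pixel points;
# a 3D voxel has 3**3 - 1 = 26.
offsets_2d = neighbor_offsets(2)
offsets_3d = neighbor_offsets(3)
```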
S5: and obtaining cell pixel points according to all the gray level difference values corresponding to the target pixel points and a preset gray level difference threshold value.
Specifically, in step S5, all the gray difference values corresponding to the target pixel points may be judged against a preset gray difference threshold according to a preset judgment rule, so as to determine the cell pixel points among the pixel points of each high-brightness gray-scale spot. The cell pixel points are used to represent a cellular substance or the center of a cellular substance.
S6: and determining the number of the cell pixel points as the cell number.
Since the cell pixels are used to represent a cell material or the center of a cell material, the number of cells is equal to the number of cell pixels. Here, the number of cells refers to the number of cellular substances, and the cellular substances are not necessarily limited to only objects having a cellular structure.
S7: and obtaining the aqueous humor cell concentration of the anterior chamber of the eye according to the ratio of the cell number to the area range value.
Compared with the prior art, the method for acquiring the aqueous humor cell concentration of the anterior chamber of an eye first denoises the anterior chamber region in the ultrasonic image to obtain the high-brightness gray-scale spots divided by the denoised pixel points. It then takes the pixel points within the range of each high-brightness gray-scale spot as target pixel points, obtains the gray difference value between each target pixel point and all of its adjacent pixel points, obtains cell pixel points from all the gray difference values corresponding to each target pixel point and a preset gray difference threshold, and determines the cell number from the number of cell pixel points. Finally, the aqueous humor cell concentration of the anterior chamber is calculated from the cell number and the region range value of the anterior chamber region in the ultrasonic image, thereby achieving the technical effect of improving the measurement accuracy of the aqueous humor cell concentration of the anterior chamber of the eye.
In one possible embodiment, the ultrasound image is a two-dimensional image; the two-dimensional image is a section image passing through the pupil center of the anterior segment of the eye; in step S2, the step of obtaining the anterior chamber region of the eye and the corresponding region range value from the ultrasound image includes:
s201: from the two-dimensional image, an inner corneal cortex interface, an anterior iris interface, and an anterior lens capsule interface are identified.
The step S201 may be implemented by a pre-trained neural network interface recognition model, or may be implemented by determining a region range selected by the user in the two-dimensional image as the corneal endothelial layer interface, the iris anterior interface, or the lens anterior capsule interface, so as to implement an effect of recognizing the corneal endothelial layer interface, the iris anterior interface, or the lens anterior capsule interface.
S202: and determining a two-dimensional closed area enclosed by the corneal endothelial layer interface, the iris front interface and the lens front capsule interface as the anterior chamber area of the eye.
After the corneal endothelial layer interface, the anterior iris interface, and the anterior lens capsule interface are identified, the two-dimensional closed region surrounded by these three interfaces can be further identified through a pre-trained neural network anterior chamber region identification model, so as to obtain the anterior chamber region of the eye.
S203: determining an area of the anterior chamber region of the eye as the region range value.
The area of the anterior chamber region of the eye can be calculated by the number of pixel points and the size ratio of the two-dimensional image to the actual section of the pupil center of the anterior segment of the eye.
In the two-dimensional image, the anterior chamber region can be identified by the corneal endothelial layer interface, the anterior iris interface and the anterior lens capsule interface so as to obtain an area parameter of the anterior chamber region in the two-dimensional image.
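The region range value in either case reduces to the pixel (or voxel) count scaled by the image calibration; a minimal sketch, assuming a known mm-per-pixel factor (the function name and example numbers are illustrative, not from the patent):

```python
def region_range_value(num_pixels, mm_per_pixel, ndim):
    # Convert a pixel count (2D) or voxel count (3D) into a physical
    # area or volume, given the physical edge length of one pixel.
    # mm_per_pixel is the ultrasound system's calibration factor
    # (an assumed input; the patent does not specify its value).
    return num_pixels * mm_per_pixel ** ndim

# e.g. 50,000 pixels at 0.02 mm per pixel -> 20.0 mm^2 of anterior chamber
area_mm2 = region_range_value(50_000, 0.02, ndim=2)
# e.g. 1,000,000 voxels at 0.02 mm per voxel edge -> 8.0 mm^3
volume_mm3 = region_range_value(1_000_000, 0.02, ndim=3)
```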
In one possible embodiment, the ultrasound image is a three-dimensional image; in step S2, the step of obtaining the anterior chamber region of the eye and the corresponding region range value from the ultrasound image includes:
s211: from the three-dimensional image, the corneal endothelium, the anterior iris interface, and the anterior lens capsule are identified.
Step S211 may be implemented by a pre-trained neural network recognition model, or by determining a region range selected by the user in the three-dimensional image as the corneal endothelial layer, the anterior iris interface, or the anterior lens capsule, so as to achieve the effect of recognizing the corneal endothelial layer, the anterior iris interface, and the anterior lens capsule.
S212: determining a three-dimensional closed region enclosed by the corneal endothelial layer, the anterior iris interface, and the anterior lens capsule as the anterior chamber region of the eye.
After the corneal endothelial layer, the anterior iris interface, and the anterior lens capsule are identified, the three-dimensional closed region surrounded by these three structures can be further identified through a pre-trained neural network anterior chamber region identification model, so as to obtain the anterior chamber region of the eye.
S213: determining a volume of the anterior chamber region of the eye as the region range value.
The volume of the anterior chamber region can be calculated by the number of pixel points and the size ratio of the three-dimensional image to the actual anterior segment of the eye.
In the three-dimensional image, the anterior chamber region can be identified by the corneal endothelial layer, the anterior iris interface and the anterior lens capsule so as to obtain the volume parameter of the anterior chamber region in the three-dimensional image.
In a possible embodiment, the step of denoising the anterior chamber region to obtain a plurality of gray pixel regions partitioned by the denoised pixels, and determining the gray pixel regions as high-brightness gray-scale spots of the anterior chamber region includes:
s301: and acquiring the average gray value of the anterior chamber region of the eye in the ultrasonic image.
The average gray value of the anterior chamber region refers to the average gray value of all pixel points in the range of the anterior chamber region.
S302: and determining pixel points lower than the average gray value in the anterior chamber region as noise pixel points, and performing gray value reduction processing on the noise pixel points to reduce the gray value of the noise pixel points to 0.
In the ultrasonic image, the gray values of noise pixel points are generally low, below the average gray value. Therefore, pixel points below the average gray value are determined to be noise pixel points, and their gray values are reduced, achieving the denoising effect.
S303: determining pixel points higher than or equal to the average gray value in the anterior chamber region as gray pixel points; and determining the aggregation area of the gray-scale pixel points isolated by the noise pixel points after the noise reduction as the high-brightness gray-scale spots of the anterior chamber area of the eye.
Here, an aggregation area consisting of consecutive gray pixel points represents one high-brightness gray-scale spot.
In this embodiment, reducing each noise pixel point to the preset value of 0 removes its interference with the high-brightness gray-scale spots, so that the high-brightness gray-scale spots can be obtained.
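Steps S301 to S303 can be sketched in pure Python as follows. This is a minimal illustration under two assumptions the patent does not spell out: the whole (already cropped) image is treated as the anterior chamber region, and 4-connectivity is used for the aggregation areas:

```python
from collections import deque

def high_brightness_spots(image):
    """S301: compute the average gray value; S302: zero out pixels below it
    (noise reduction); S303: group the surviving gray pixel points into
    4-connected aggregation areas, each being one high-brightness spot."""
    h, w = len(image), len(image[0])
    flat = [v for row in image for v in row]
    mean = sum(flat) / len(flat)
    # S302: gray values below the mean are treated as noise and set to 0
    denoised = [[v if v >= mean else 0 for v in row] for row in image]
    # S303: BFS over the remaining gray pixels to collect aggregation areas
    seen = [[False] * w for _ in range(h)]
    spots = []
    for y in range(h):
        for x in range(w):
            if denoised[y][x] > 0 and not seen[y][x]:
                queue, spot = deque([(y, x)]), []
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    spot.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and \
                           denoised[ny][nx] > 0 and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                spots.append(spot)
    return denoised, spots
```

For example, an image with two isolated bright pixels and one bright pair yields three spots; a production implementation would more likely use a labeling routine such as `scipy.ndimage.label`.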
In a feasible embodiment, the step of obtaining cell pixel points according to all the gray difference values corresponding to each target pixel point and a preset gray difference threshold includes:
S501: comparing all the gray difference values corresponding to each target pixel point with a preset gray difference threshold, and determining that target pixel point as a cell pixel point if all of its gray difference values are greater than the gray difference threshold.
In this embodiment, because the gray values near a cell pixel point are formed by scattering after the sound wave contacts the cellular material, the gray difference values between a cell pixel point and its neighbors are large, and cell pixel points can therefore be identified by setting a suitable gray difference threshold; specifically, the threshold is set by the user. By comparing all the gray difference values corresponding to each target pixel point with this threshold, the cell pixel points, each representing a cell or the center of a cell, can be determined from among the target pixel points.
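A minimal sketch of the S501 test, under two assumptions the patent does not state explicitly: the gray difference is computed as target minus neighbor, and neighbors outside the image are simply skipped:

```python
def is_cell_pixel(image, y, x, diff_threshold):
    """S501: the target pixel point is a cell pixel point only if its gray
    difference (target minus neighbor) to every adjacent pixel point, 8 of
    them in a two-dimensional image, is strictly greater than the preset
    gray difference threshold. Out-of-image neighbors are skipped here,
    which is an assumption about border handling."""
    h, w = len(image), len(image[0])
    target = image[y][x]
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                if target - image[ny][nx] <= diff_threshold:
                    return False
    return True
```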
In order to prevent a cellular substance that lies exactly between two pixel points from causing its number to be misjudged, in a feasible embodiment, the step of obtaining cell pixel points according to all the gray difference values corresponding to each target pixel point and a preset gray difference threshold includes:
S511: comparing all the gray difference values corresponding to each target pixel point with a preset gray difference threshold.
The gray difference threshold is set by the user; the comparison checks every gray difference value corresponding to each target pixel point against this threshold.
S512: if, among all the gray difference values of the same target pixel point, exactly one is less than or equal to the gray difference threshold, determining that target pixel point as an undetermined pixel point.
For example, in a two-dimensional image, if 7 of the 8 gray difference values corresponding to a target pixel point are greater than the gray difference threshold and the remaining 1 is less than or equal to it, that target pixel point is an undetermined pixel point.
S513: acquiring the positional relationship of all the undetermined pixel points, and determining one pixel point of each pair of adjacent undetermined pixel points as a cell pixel point.
Preferably, in step S513, the pixel point with the higher gray value of two adjacent undetermined pixel points may be determined as the cell pixel point, because the undetermined pixel point with the higher gray value covers a larger share of the area or volume of the cellular substance than its adjacent counterpart.
In this embodiment, steps S511 to S513 improve the accuracy of the cell pixel point statistics and prevent the situation where a cell, or the center of a cell, lies exactly between two pixel points and distorts the cell count.
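Steps S511 to S513 can be sketched as follows. The preference for the brighter of two adjacent undetermined pixel points follows the note above; the deterministic coordinate tiebreak for equal gray values is an added assumption, since the patent leaves ties open:

```python
def classify_and_resolve(image, targets, diff_threshold):
    """Targets whose gray differences are all above the threshold are cell
    pixel points (S501 case); targets with exactly one difference at or
    below it are undetermined (S512). Of two adjacent undetermined pixel
    points (a cell straddling them), only the brighter one is kept (S513)."""
    h, w = len(image), len(image[0])

    def diffs(y, x):
        out = []
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                if (dy, dx) != (0, 0) and 0 <= y + dy < h and 0 <= x + dx < w:
                    out.append(image[y][x] - image[y + dy][x + dx])
        return out

    cells, pending = [], []
    for y, x in targets:
        low = sum(1 for d in diffs(y, x) if d <= diff_threshold)
        if low == 0:
            cells.append((y, x))        # all differences above threshold
        elif low == 1:
            pending.append((y, x))      # undetermined pixel point

    def key(p):                          # gray value first, coordinates as tiebreak
        return (image[p[0]][p[1]], p)

    pending_set, kept = set(pending), set(pending)
    for (y, x) in pending:
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                n = (y + dy, x + dx)
                if (dy, dx) != (0, 0) and n in pending_set:
                    # drop the dimmer member of the adjacent pair
                    dim = n if key((y, x)) > key(n) else (y, x)
                    kept.discard(dim)
    return cells + sorted(kept)
```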
Referring to fig. 4, an embodiment of the present invention further provides an apparatus for acquiring aqueous humor cell concentration, including:
the image acquisition module 1 is used for acquiring an ultrasonic image of a target anterior segment of an eye;
the region range acquisition module 2 is used for acquiring an anterior chamber region of the eye and a corresponding region range value from the ultrasonic image;
the high-brightness gray scale spot acquisition module 3 is configured to perform noise reduction on the anterior chamber region to obtain a plurality of gray scale pixel point regions partitioned by the noise-reduced pixel points, and determine the gray scale pixel point regions as high-brightness gray scale spots of the anterior chamber region; the gray pixel point region refers to a gathering region of a plurality of pixels with the gray values still larger than 0 after noise reduction; the high intensity gray scale spots are formed by the scattering effect of ultrasound waves on cells in the anterior chamber region of the eye;
the gray difference value obtaining module 4 is configured to, for each high-brightness gray scale spot, take a pixel point within the range of each high-brightness gray scale spot as a target pixel point, traverse the target pixel point, and obtain a gray difference value between each target pixel point and all adjacent pixel points;
the cell pixel acquisition module 5 is configured to acquire a cell pixel point according to all the gray level difference values corresponding to the target pixel points and a preset gray level difference threshold;
the cell number acquisition module 6 is used for determining the number of the cell pixel points as the cell number;
and the aqueous humor cell concentration calculating module 7 is used for obtaining the aqueous humor cell concentration of the anterior chamber of the eye according to the ratio of the cell number to the area range value.
The target anterior segment of the eye includes the anterior chamber, the posterior chamber, the zonules of the lens, the anterior chamber angle, part of the lens, the peripheral vitreous body, the retina, the attachment points of the extraocular muscles, and the conjunctiva.
The ultrasound image may be a two-dimensional image or a three-dimensional image.
In the ultrasound image of the target anterior segment, if the aqueous humor in the anterior chamber region is clear and homogeneous, without echogenic information, the anterior chamber region appears as a visibly dark area, indicating that no cellular material has infiltrated the aqueous humor. If cellular material has infiltrated the aqueous humor, the anterior chamber region shows high-brightness gray-scale spots caused by the scattering of the sound waves after they contact the cells. The cellular material is not limited to objects with a cellular structure; it includes, for example, cells, protein particles and fibrinous exudates.
The anterior chamber region and corresponding region range values may be obtained from tissue structures surrounding the anterior chamber region in the ultrasound image, such as the cornea, iris, lens, etc.
The noise reduction operation is to reduce the gray value of the noise pixel point to 0. The noise pixel points may be pixel points with gray values lower than a preset gray threshold.
An adjacent pixel point is a relative concept: it means any other pixel point adjacent to the current target pixel point (or current pixel point), and the adjacent pixel points surround it. An adjacent pixel point need not lie inside a high-brightness gray-scale spot; that is, it may be a pixel point that has undergone noise reduction.
The gray difference value is signed: it is a negative number smaller than 0 when the gray value of the target pixel point is smaller than that of the adjacent pixel point, a positive number greater than 0 when it is larger, and 0 when the two gray values are equal.
When the ultrasonic image is a two-dimensional image, the total number of adjacent pixel points corresponding to a single target pixel point is 8, and when the ultrasonic image is a three-dimensional image, the total number of adjacent pixel points corresponding to a single target pixel point is 26.
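The adjacent-pixel counts of 8 and 26 follow from taking every offset in {-1, 0, 1} per axis except the all-zero offset, i.e. 3^n - 1 neighbors in n dimensions:

```python
from itertools import product

def neighbour_offsets(ndim):
    """Offsets to all adjacent pixel points of a single pixel point:
    3**ndim - 1 of them, i.e. 8 in a two-dimensional image and 26 in a
    three-dimensional image."""
    return [off for off in product((-1, 0, 1), repeat=ndim) if any(off)]

assert len(neighbour_offsets(2)) == 8
assert len(neighbour_offsets(3)) == 26
```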
Since the cell pixels are used to represent a cell material or the center of a cell material, the number of cells is equal to the number of cell pixels. Here, the number of cells refers to the number of cellular substances, and the cellular substances are not necessarily limited to only objects having a cellular structure.
Compared with the prior art, the device for acquiring the aqueous humor cell concentration of the anterior chamber of the eye reduces noise in the anterior chamber region of the ultrasonic image to obtain high-brightness gray-scale spots partitioned by the noise-reduced pixel points. It then takes the pixel points within each high-brightness gray-scale spot as target pixel points, obtains the gray difference value between each target pixel point and all its adjacent pixel points, obtains cell pixel points from those gray difference values and a preset gray difference threshold, and takes the number of cell pixel points as the cell number. Finally, it calculates the aqueous humor cell concentration of the anterior chamber from the cell number and the region range value of the anterior chamber region, thereby achieving the technical effect of improving the measurement precision of the aqueous humor cell concentration of the anterior chamber of the eye.
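The final step of the pipeline is a simple ratio; the sketch below makes the units explicit (function name and unit convention are illustrative assumptions):

```python
def cell_concentration(cell_count: int, region_range_value: float) -> float:
    """Aqueous humor cell concentration: the number of cell pixel points
    divided by the region range value, an area (e.g. mm^2) for a
    two-dimensional image or a volume (e.g. mm^3) for a three-dimensional
    image, giving cells/mm^2 or cells/mm^3 respectively."""
    if region_range_value <= 0:
        raise ValueError("region range value must be positive")
    return cell_count / region_range_value

# e.g. 30 counted cells over a 12 mm^2 anterior chamber cross-section
assert cell_concentration(30, 12.0) == 2.5
```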
In one possible embodiment, the ultrasound image is a two-dimensional image; the two-dimensional image is a section image passing through the pupil center of the anterior segment of the eye; the area range acquisition module 2 includes:
and the first identification module is used for identifying the corneal endothelial layer interface, the anterior iris interface and the anterior lens capsule interface from the two-dimensional image.
A first anterior chamber region acquisition module for determining a two-dimensional closed region enclosed by the corneal endothelial layer interface, the anterior iris interface, and the anterior lens capsule interface as the anterior chamber region of the eye.
A first region range value acquisition module for determining an area of the anterior chamber region of the eye as the region range value.
After the corneal endothelial layer interface, the anterior iris interface and the anterior lens capsule interface are identified, the two-dimensional closed region enclosed by them can be further identified by a pre-trained neural network anterior chamber region identification model, so as to obtain the anterior chamber region of the eye.
The area of the anterior chamber region of the eye can be calculated by the number of pixel points and the size ratio of the two-dimensional image to the actual cross section of the pupil center of the anterior segment of the eye.
In the two-dimensional image, the anterior chamber region can be identified by the corneal endothelial layer interface, the anterior iris interface and the anterior lens capsule interface so as to obtain an area parameter of the anterior chamber region in the two-dimensional image.
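The area calculation from the pixel count and the image-to-anatomy size ratio can be sketched as follows; the function name and pixel-size convention are illustrative assumptions:

```python
def anterior_chamber_area_mm2(num_region_pixels: int, pixel_size_mm: tuple) -> float:
    """Area of the two-dimensional anterior chamber region: the pixel count
    times the physical footprint of one pixel (dx, dy in mm), which encodes
    the size ratio between the image and the actual cross-section through
    the pupil center."""
    dx, dy = pixel_size_mm
    return num_region_pixels * dx * dy

# e.g. 40,000 region pixels at an assumed 0.02 mm pixel pitch
area = anterior_chamber_area_mm2(40_000, (0.02, 0.02))
```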
In one possible embodiment, the ultrasound image is a three-dimensional image; the area range acquisition module 2 includes:
and the second identification module is used for identifying the corneal endothelial layer, the anterior iris interface and the anterior lens capsule from the three-dimensional image.
A second anterior chamber region acquisition module for determining a three-dimensional closed region enclosed by the corneal endothelial layer, the anterior iris interface, and the anterior lens capsule as the anterior chamber region.
A second region range value acquisition module for determining a volume of the anterior chamber region of the eye as the region range value.
The second identification module may be implemented by a pre-trained neural network identification model, or may determine a region range selected by the user in the three-dimensional image as the corneal endothelial layer, the anterior iris interface or the anterior lens capsule, so as to achieve the effect of identifying the corneal endothelial layer, the anterior iris interface or the anterior lens capsule.
The volume of the anterior chamber region can be calculated by the number of pixel points and the size ratio of the three-dimensional image to the actual anterior segment of the eye.
In the three-dimensional image, the anterior chamber region can be identified by the corneal endothelial layer, the anterior iris interface and the anterior lens capsule so as to obtain the volume parameter of the anterior chamber region in the three-dimensional image.
In one possible embodiment, the high brightness gray level blob obtaining module 3 includes:
the average gray value acquisition module is used for acquiring the average gray value of the anterior chamber region of the eye in the ultrasonic image;
the noise reduction module is used for determining pixel points which are lower than the average gray value in the anterior chamber region of the eye as noise pixel points, and performing gray value reduction processing on the noise pixel points to reduce their gray values to the preset value of 0;
the high-brightness gray scale spot determining module is used for determining pixel points which are higher than or equal to the average gray scale value in the anterior chamber region of the eye as gray pixel points; and determining the aggregation area of the gray-scale pixel points isolated by the noise pixel points after the noise reduction as the high-brightness gray-scale spots of the anterior chamber area of the eye.
The average gray value of the anterior chamber region refers to the average gray value of all pixel points in the range of the anterior chamber region.
In the ultrasonic image, noise pixel points generally have low gray values, below the average gray value; determining the pixel points below the average gray value as noise pixel points and reducing their gray values therefore achieves the noise-reduction effect.
Here, an aggregation area consisting of consecutive gray pixel points represents one high-brightness gray-scale spot.
In this embodiment, reducing each noise pixel point to the preset value of 0 removes its interference with the high-brightness gray-scale spots, so that the high-brightness gray-scale spots can be obtained.
In one possible embodiment, the cell pixel acquisition module 5 comprises:
and the first cell pixel point acquisition module is used for comparing all the gray level difference values corresponding to the target pixel points with a preset gray level difference threshold value, and determining the corresponding target pixel points as cell pixel points if all the gray level difference values are greater than the gray level difference threshold value.
In this embodiment, because the gray values near a cell pixel point are formed by scattering after the sound wave contacts the cellular material, the gray difference values between a cell pixel point and its neighbors are large, and cell pixel points can therefore be identified by setting a suitable gray difference threshold; specifically, the threshold is set by the user. By comparing all the gray difference values corresponding to each target pixel point with this threshold, the cell pixel points, each representing a cell or the center of a cell, can be determined from among the target pixel points.
In order to prevent the cell material from being located between two pixels, which may result in misjudging the amount of the cell material, in a practical embodiment, the cell pixel obtaining module 5 includes:
and the gray difference value comparison module is used for comparing all the gray difference values corresponding to the target pixel points with a preset gray difference threshold value.
And the undetermined pixel point acquisition module is used for determining the corresponding target pixel point as the undetermined pixel point if only one gray difference value is less than or equal to the gray difference threshold value in all the gray difference values of the same target pixel point.
And the second cell pixel point acquisition module is used for acquiring the position relation of all the undetermined pixel points and determining one pixel point in the adjacent undetermined pixel points as a cell pixel point.
An embodiment of the present invention also provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the aqueous humor cell concentration obtaining method for the anterior chamber of the eye as described above.
An embodiment of the present invention also provides an electronic device, comprising a memory, a processor and a computer program stored in the memory and executable by the processor, wherein the processor executes the computer program to implement the steps of the method for obtaining the concentration of aqueous humor cells in the anterior chamber of the eye as described above.
The above-described device embodiments are merely illustrative, wherein the components described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement it without inventive effort.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks in them, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A method for obtaining an aqueous humor cell concentration of an anterior chamber, comprising:
acquiring an ultrasonic image of a target anterior ocular segment;
acquiring an anterior chamber region and a corresponding region range value from the ultrasonic image;
denoising the anterior chamber region to obtain a plurality of gray pixel point regions divided by the denoised pixel points, and determining the gray pixel point regions as high-brightness gray scale spots of the anterior chamber region; the gray pixel point region refers to an aggregation region of a plurality of pixels with the gray values still larger than 0 after noise reduction;
for each high-brightness gray scale spot, taking a pixel point in the range of each high-brightness gray scale spot as a target pixel point, traversing the target pixel point, and obtaining a gray difference value between each target pixel point and all adjacent pixel points;
obtaining cell pixel points according to all the gray level difference values corresponding to the target pixel points and a preset gray level difference threshold value;
determining the number of the cell pixel points as the number of cells;
and obtaining the aqueous humor cell concentration of the anterior chamber of the eye according to the ratio of the cell number to the area range value.
2. The method for obtaining an aqueous humor cell concentration of an anterior chamber of an eye according to claim 1, wherein: the step of reducing noise of the anterior chamber region to obtain a plurality of gray pixel point regions divided by the noise-reduced pixel points and determining the gray pixel point regions as high-brightness gray scale spots of the anterior chamber region comprises the following steps:
acquiring an average gray value of the anterior chamber region of the eye in the ultrasonic image;
determining pixel points lower than the average gray value in the anterior chamber region as noise pixel points, and performing gray value reduction processing on the noise pixel points to reduce the gray value of the noise pixel points to 0;
determining pixel points higher than or equal to the average gray value in the anterior chamber region of the eye as gray pixel points; and determining the aggregation area of the gray-scale pixel points isolated by the noise pixel points after the noise reduction as the high-brightness gray-scale spots of the anterior chamber area of the eye.
3. The method for obtaining the concentration of the aqueous humor cells in the anterior chamber of the eye according to claim 1, wherein the step of obtaining the cell pixel points according to all the gray scale difference values corresponding to the respective target pixel points and a preset gray scale difference threshold comprises:
comparing all the gray level difference values corresponding to the target pixel points with a preset gray level difference threshold, and determining the corresponding target pixel points as cell pixel points if all the gray level difference values are greater than the gray level difference threshold.
4. The method for obtaining the concentration of the aqueous humor cells in the anterior chamber of the eye according to claim 1, wherein the step of obtaining the cell pixel points according to all the gray scale difference values corresponding to the respective target pixel points and a preset gray scale difference threshold comprises:
comparing all the gray level difference values corresponding to the target pixel points with a preset gray level difference threshold value;
if only one gray difference value is smaller than or equal to the gray difference threshold value in all the gray difference values of the same target pixel point, determining the corresponding target pixel point as an undetermined pixel point;
and acquiring the position relation of all the undetermined pixel points, and determining one pixel point in the adjacent undetermined pixel points as a cell pixel point.
5. The method for obtaining an aqueous humor cell concentration of an anterior chamber of an eye according to any one of claims 1 to 4, wherein the ultrasonic image is a two-dimensional image; the two-dimensional image is a section image passing through the pupil center of the anterior segment of the eye;
the step of obtaining an anterior chamber region and corresponding region range values from the ultrasound image comprises:
identifying a corneal endothelial layer interface, an anterior iris interface and an anterior lens capsule interface from the two-dimensional image;
determining a two-dimensional closed region enclosed by the corneal endothelial layer interface, the anterior iris interface and the anterior lens capsule interface as the anterior chamber region of the eye;
determining an area of the anterior chamber region of the eye as the region range value.
6. The method for obtaining an aqueous humor cell concentration of an anterior chamber of an eye according to any one of claims 1 to 4, wherein the ultrasonic image is a three-dimensional image;
the step of obtaining an anterior chamber region and corresponding region range values from the ultrasound image comprises:
identifying a corneal endothelial layer, an anterior iris interface and an anterior lens capsule from the three-dimensional image;
determining a three-dimensional closed region enclosed by the corneal endothelial layer, the anterior iris interface, and the anterior lens capsule as the anterior chamber region of the eye;
determining a volume of the anterior chamber region of the eye as the region range value.
7. An aqueous humor cell concentration acquisition apparatus, comprising:
the image acquisition module is used for acquiring an ultrasonic image of the anterior ocular segment of the target;
the region range acquisition module is used for acquiring an anterior chamber region of the eye and a corresponding region range value from the ultrasonic image;
the high-brightness gray scale spot acquisition module is used for reducing noise of the anterior chamber region to obtain a plurality of gray scale pixel point regions divided by the noise-reduced pixel points, and determining the gray scale pixel point regions as the high-brightness gray scale spots of the anterior chamber region; the gray pixel point region refers to an aggregation region of a plurality of pixels with the gray values still larger than 0 after noise reduction;
the gray difference value acquisition module is used for taking the pixel points in the range of each high-brightness gray scale spot as target pixel points for each high-brightness gray scale spot, traversing the target pixel points and acquiring the gray difference value of each target pixel point and all adjacent pixel points;
the cell pixel acquisition module is used for acquiring cell pixel points according to all the gray level difference values corresponding to the target pixel points and a preset gray level difference threshold value;
the cell number acquisition module is used for determining the number of the cell pixel points as the cell number;
and the aqueous humor cell concentration calculating module is used for obtaining the aqueous humor cell concentration of the anterior chamber of the eye according to the ratio of the cell number to the area range value.
8. The apparatus for acquiring aqueous humor cell concentration according to claim 7, wherein the high brightness gray scale spot acquiring module comprises:
the average gray value acquisition module is used for acquiring the average gray value of the anterior chamber region of the eye in the ultrasonic image;
the noise reduction module is used for determining pixel points which are lower than the average gray value in the anterior chamber region of the eye as noise pixel points, and performing gray value reduction processing on the noise pixel points to reduce their gray values to the preset value of 0;
the high-brightness gray scale speckle determining module is used for determining pixel points which are higher than or equal to the average gray scale value in the anterior chamber region as gray pixel points; and determining the aggregation area of the gray pixel points isolated by the noise pixel points after noise reduction as the high-brightness gray-scale spots in the anterior chamber area.
9. A computer-readable storage medium storing a computer program, characterized in that: the computer program when executed by a processor implements the steps of a method for obtaining an aqueous humor cell concentration of an anterior chamber of an eye according to any one of claims 1 to 6.
10. An electronic device, characterized in that: comprising a memory, a processor and a computer program stored in said memory and executable by said processor, said processor when executing said computer program implementing the steps of a method for obtaining an aqueous humor cell concentration of an anterior chamber of an eye according to any one of claims 1 to 6.
CN202210606965.XA 2022-05-31 2022-05-31 Aqueous humor cell concentration acquisition method, apparatus, storage medium and device for anterior chamber of eye Active CN115035057B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210606965.XA CN115035057B (en) 2022-05-31 2022-05-31 Aqueous humor cell concentration acquisition method, apparatus, storage medium and device for anterior chamber of eye


Publications (2)

Publication Number Publication Date
CN115035057A true CN115035057A (en) 2022-09-09
CN115035057B CN115035057B (en) 2023-07-11

Family

ID=83122255

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210606965.XA Active CN115035057B (en) 2022-05-31 2022-05-31 Aqueous humor cell concentration acquisition method, apparatus, storage medium and device for anterior chamber of eye

Country Status (1)

Country Link
CN (1) CN115035057B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104794710A (en) * 2015-04-13 2015-07-22 上海泽煜实验设备有限公司 Image processing method and device
CN104794711A (en) * 2015-04-13 2015-07-22 上海泽煜实验设备有限公司 Image processing method and device
CN107527028A (en) * 2017-08-18 2017-12-29 深圳乐普智能医疗器械有限公司 Target cell recognition methods, device and terminal
CN109461165A (en) * 2018-09-29 2019-03-12 佛山市云米电器科技有限公司 Kitchen fume concentration grading and identification method based on image three-color segmentation
CN111798467A (en) * 2020-06-30 2020-10-20 中国第一汽车股份有限公司 Image segmentation method, device, equipment and storage medium
US20210174489A1 (en) * 2019-12-04 2021-06-10 Beijing Boe Optoelectronics Technology Co., Ltd. Method and apparatus for detecting a screen, and electronic device


Also Published As

Publication number Publication date
CN115035057B (en) 2023-07-11

Similar Documents

Publication Publication Date Title
CN110021009B (en) Method, device and storage medium for evaluating fundus image quality
JP7242906B2 (en) Method, apparatus, electronics and storage medium for localizing macular center from fundus image
CN110555845A (en) Fundus OCT image identification method and equipment
CN111681242B (en) Retinal vessel arteriovenous distinguishing method, device and equipment
US11284792B2 (en) Methods and systems for enhancing microangiography image quality
AU2021202217B2 (en) Methods and systems for ocular imaging, diagnosis and prognosis
CN109785399B (en) Synthetic lesion image generation method, device, equipment and readable storage medium
WO2021190656A1 (en) Method and apparatus for localizing center of macula in fundus image, server, and storage medium
CN114365190A (en) Spleen tumor identification method based on VRDS 4D medical image and related device
CN113658165A (en) Cup-to-tray ratio determining method, device, equipment and storage medium
CN113870270A (en) Eyeground image cup and optic disc segmentation method under unified framework
CN115330663A (en) Method for segmenting boundaries of scleral lens and tear lens in anterior segment OCT (optical coherence tomography) image
CN115035057B (en) Aqueous humor cell concentration acquisition method, apparatus, storage medium and device for anterior chamber of eye
WO2019171986A1 (en) Image processing device, image processing method, and program
CN114170378A (en) Medical equipment, blood vessel and internal plaque three-dimensional reconstruction method and device
Ilyasova et al. Graph-based segmentation for diabetic macular edema selection in OCT images
CN110927181B (en) Method for detecting foreign matters in part hole, terminal device and computer-readable storage medium
CN115908274A (en) Device, equipment and medium for detecting focus
CN116012287A (en) Blood vessel extraction method and device in OCT fundus image
Janpongsri et al. Pseudo‐real‐time retinal layer segmentation for high‐resolution adaptive optics optical coherence tomography
CN114373216A (en) Eye movement tracking method, device, equipment and storage medium for anterior segment OCTA
CN110097502B (en) Measuring method and device for fundus non-perfusion area and image processing method
CN108269258B (en) Method and system for segmenting corneal structures from OCT corneal images
CN110610147A (en) Blood vessel image extraction method, related device and storage equipment
CN112528714A (en) Single light source-based gaze point estimation method, system, processor and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant