CN106108832A - In-vivo information acquisition apparatus - Google Patents

In-vivo information acquisition apparatus

Info

Publication number
CN106108832A
CN106108832A (application CN201610782831.8A)
Authority
CN
China
Prior art keywords
pulse
pulse count
information acquisition
gray
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610782831.8A
Other languages
Chinese (zh)
Other versions
CN106108832B (en)
Inventor
Inventor not disclosed (不公告发明人)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Xuetong Intelligent Technology Co.,Ltd.
Original Assignee
孟玲
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 孟玲
Priority claimed from CN201610782831.8A
Publication of CN106108832A
Application granted
Publication of CN106108832B
Legal status: Active

Classifications

    • A61B 1/041 - Capsule endoscopes for imaging
    • A61B 1/00006 - Operational features of endoscopes characterised by electronic signal processing of control signals
    • A61B 1/00016 - Operational features of endoscopes characterised by signal transmission using wireless means
    • A61B 1/00025 - Operational features of endoscopes characterised by power management
    • A61B 1/045 - Control of endoscopes combined with photographic or television appliances
    • G06F 18/24 - Classification techniques (pattern recognition)
    • A61B 2560/0214 - Operational features of power management of power generation or supply
    • A61B 2576/00 - Medical imaging apparatus involving image processing or analysis
    • G06T 2207/10068 - Endoscopic image (image acquisition modality)
    • G06V 2201/03 - Recognition of patterns in medical or anatomical images

Abstract

An in-vivo information acquisition apparatus comprises a cell recognition module and an information acquisition module. The cell recognition module is used to determine the biological species. The information acquisition module comprises an information acquisition section, a power source, a magnetic sensor section, a pulse-count counting section, a pulse-count judging section and a power cut-off control section. The invention has the advantage that the power supply of an in-vivo information acquisition apparatus introduced into a subject can be reliably switched on and off with a simple structure.

Description

In-vivo information acquisition apparatus
Technical field
The present invention relates to the field of information acquisition, and in particular to an in-vivo information acquisition apparatus.
Background technology
In recent years, in the field of endoscopy, a swallowable capsule endoscope has been proposed in which an imaging section, an illumination section, a transmission section and the like are housed in a capsule-shaped shell. The imaging section captures image information of the interior of the subject, the illumination section illuminates the site being imaged, and the transmission section wirelessly transmits the image information obtained by the imaging section. The capsule endoscope is introduced into the subject by being swallowed through the mouth of the patient serving as the subject. Until it is naturally excreted, the capsule endoscope moves through the body cavity with peristalsis, successively capturing images of the body cavity and wirelessly transmitting the acquired image information to the outside of the body.
Summary of the invention
To solve the above problems, the present invention provides an in-vivo information acquisition apparatus.
The object of the present invention is achieved by the following technical solution:
An in-vivo information acquisition apparatus comprises a cell recognition module and an information acquisition module. The cell recognition module is used to determine the biological species, and the information acquisition module comprises:
an information acquisition section, which acquires information on the interior of the organism;
a power source, which supplies electric power to the information acquisition section;
a magnetic sensor section, which detects a magnetic signal input from outside and outputs a control signal corresponding to the detection state of that magnetic signal;
a pulse-count counting section, which counts the pulses of the pulse signal from the magnetic sensor section;
a pulse-count judging section, which judges whether the number of pulses counted by the pulse-count counting section is equal to or greater than a predetermined number; and
a power cut-off control section, which, when the pulse-count judging section judges that a number of pulses equal to or greater than the predetermined number has been input, switches the supply of electric power from the power source to the information acquisition section from the supply state to the cut-off state.
The invention has the beneficial effect that the power supply of an in-vivo information acquisition apparatus introduced into a subject can be reliably switched on and off with a simple structure.
Accompanying drawing explanation
The invention is further described with reference to the accompanying drawings, but the embodiments in the drawings do not constitute any limitation of the present invention. For those of ordinary skill in the art, other drawings can be obtained from the following drawings without inventive effort.
Fig. 1 is a structural schematic diagram of the information acquisition module of the present invention;
Fig. 2 is a structural schematic diagram of the cell recognition module.
Reference numerals:
Cell recognition module 1, cell image segmentation unit 11, feature extraction unit 12, classification and identification unit 13.
Detailed description of the invention
The invention is further described below with reference to the following application scenarios.
Application scenario 1
Referring to Fig. 1 and Fig. 2, an in-vivo information acquisition apparatus of one embodiment of this application scenario comprises a cell recognition module and an information acquisition module. The cell recognition module is used to determine the biological species, and the information acquisition module comprises:
an information acquisition section, which acquires information on the interior of the organism;
a power source, which supplies electric power to the information acquisition section;
a magnetic sensor section, which detects a magnetic signal input from outside and outputs a control signal corresponding to the detection state of that magnetic signal;
a pulse-count counting section, which counts the pulses of the pulse signal from the magnetic sensor section;
a pulse-count judging section, which judges whether the number of pulses counted by the pulse-count counting section is equal to or greater than a predetermined number; and
a power cut-off control section, which, when the pulse-count judging section judges that a number of pulses equal to or greater than the predetermined number has been input, switches the supply of electric power from the power source to the information acquisition section from the supply state to the cut-off state.
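As an illustration of how the counting, judging and power cut-off sections cooperate, a minimal Python sketch is given below. The class name PowerController, its methods and the example predetermined count are assumptions made purely for illustration and are not part of the patent.

```python
# Minimal sketch (not the patented implementation) of the pulse counting,
# judging and power cut-off flow described above. All names are illustrative.

class PowerController:
    def __init__(self, predetermined_count: int = 3):
        self.predetermined_count = predetermined_count  # threshold used by the judging section
        self.pulse_count = 0                            # state of the counting section
        self.power_on = True                            # supply state of the power source

    def on_magnetic_pulse(self) -> None:
        """Called once for every pulse output by the magnetic sensor section."""
        self.pulse_count += 1
        # Pulse-count judging section: has the predetermined number been reached?
        if self.pulse_count >= self.predetermined_count:
            self.cut_power()

    def cut_power(self) -> None:
        """Power cut-off control section: switch from the supply state to the cut-off state."""
        if self.power_on:
            # In the device this would disconnect the information acquisition
            # section from the power source.
            self.power_on = False


if __name__ == "__main__":
    controller = PowerController(predetermined_count=3)
    for _ in range(3):
        controller.on_magnetic_pulse()
    print(controller.power_on)  # False: power has been cut after three pulses
```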
Preferably, the information acquisition module further comprises an interval detection section, which detects the output interval of the pulse signal from the magnetic sensor section.
The pulse-count counting section updates the count of the pulse signal only when the output interval detected by the interval detection section is not less than a preset reference interval.
This preferred embodiment allows the data to be updated in a timely manner.
Preferably, the interval detection section is implemented by a counter.
This preferred embodiment makes the acquired information more accurate.
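A hedged sketch of the counter-based interval check follows. The class name IntervalGatedCounter, the tick-based interface and the decision to time-stamp every pulse are assumptions; the patent only states that the count is updated when the interval is not less than the reference interval.

```python
# Sketch (illustrative only) of the interval detection section: the pulse
# count is updated only when the interval since the previous pulse, measured
# by a free-running counter, is not less than a preset reference interval.

class IntervalGatedCounter:
    def __init__(self, reference_ticks: int):
        self.reference_ticks = reference_ticks  # preset reference interval (in counter ticks)
        self.last_pulse_tick = None             # counter value at the previous pulse
        self.pulse_count = 0

    def on_pulse(self, current_tick: int) -> int:
        """Register a pulse observed at the given counter value and return the count."""
        if (self.last_pulse_tick is None
                or current_tick - self.last_pulse_tick >= self.reference_ticks):
            self.pulse_count += 1               # interval long enough: update the count
        # The time stamp is refreshed for every pulse -- an assumption; the
        # patent does not say how pulses arriving too early are treated.
        self.last_pulse_tick = current_tick
        return self.pulse_count
```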
Preferably, the cell recognition module 1 comprises a cell image segmentation unit 11, a feature extraction unit 12 and a classification and identification unit 13. The cell image segmentation unit 11 distinguishes the background, the nucleus and the cytoplasm in the cell image collected by a cell image acquisition module; the feature extraction unit 12 extracts the texture features of the cell image; and the classification and identification unit 13 uses a classifier to classify and identify the cell image according to the texture features.
This preferred embodiment sets out the unit structure of the cell recognition module 1.
Preferably, the cell image segmentation unit 11 comprises an image conversion subunit, a noise removal subunit, a coarse segmentation subunit, a nucleus center calibration subunit and a fine segmentation subunit, specifically:
(1) the image conversion subunit converts the collected cell image into a gray-level image;
(2) the noise removal subunit denoises the gray-level image as follows:
For a pixel (x, y), choose its 3 × 3 neighborhood $S_{x,y}$ and its $(2N+1) \times (2N+1)$ neighborhood $L_{x,y}$, where N is an integer greater than or equal to 2.
First, judge whether the pixel is a boundary point: set a threshold T with T ∈ [13, 26], compute the gray difference between pixel (x, y) and each pixel in its neighborhood $S_{x,y}$, and compare the differences with T. If the number of gray differences exceeding T is greater than or equal to 6, pixel (x, y) is a boundary point; otherwise it is a non-boundary point.
If (x, y) is a boundary point, the following noise reduction is applied:
$$h(x,y) = \frac{1}{k} \sum_{q(i,j) \in [q(x,y)-1.5\sigma,\; q(x,y)+1.5\sigma]} q(i,j)$$
where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(x, y) is the gray value of pixel (x, y) before noise reduction, σ is the standard deviation of the gray values in the neighborhood $L_{x,y}$ of pixel (x, y), $q(i,j) \in [q(x,y)-1.5\sigma,\; q(x,y)+1.5\sigma]$ denotes the points of $L_{x,y}$ whose gray values fall within that interval, and k is the number of such points.
If (x, y) is a non-boundary point, the following noise reduction is applied:
$$h(x,y) = \frac{\sum_{(i,j) \in L_{x,y}} w(i,j)\, q(i,j)}{\sum_{(i,j) \in L_{x,y}} w(i,j)}$$
where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(i, j) is the gray value of point (i, j) in the image, and w(i, j) is the Gaussian weight corresponding to point (i, j) in the neighborhood $L_{x,y}$.
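For illustration, the boundary-aware denoising rules above can be sketched with NumPy as follows. The Gaussian kernel width (here sigma = N/2) and the treatment of image borders are assumptions not fixed by the text.

```python
import numpy as np

def denoise(gray: np.ndarray, T: int = 13, N: int = 2) -> np.ndarray:
    """Edge-preserving denoising following the boundary/non-boundary rules above.

    gray: 2-D gray-level image; T in [13, 26]; L is a (2N+1) x (2N+1) window.
    Image borders are left unchanged (an assumption; the text does not say).
    """
    img = gray.astype(np.float64)
    out = img.copy()
    rows, cols = img.shape
    # Gaussian weights for the (2N+1) x (2N+1) window; sigma = N / 2 is an
    # assumption, the patent does not fix the kernel width.
    ax = np.arange(-N, N + 1)
    gx, gy = np.meshgrid(ax, ax)
    gauss = np.exp(-(gx ** 2 + gy ** 2) / (2 * (N / 2.0) ** 2))

    for x in range(N, rows - N):
        for y in range(N, cols - N):
            S = img[x - 1:x + 2, y - 1:y + 2]          # 3 x 3 neighborhood S
            L = img[x - N:x + N + 1, y - N:y + N + 1]  # (2N+1) x (2N+1) neighborhood L
            q = img[x, y]
            # Boundary test: at least 6 pixels of S differ from q by more than T.
            if np.sum(np.abs(S - q) > T) >= 6:
                sigma = L.std()
                mask = np.abs(L - q) <= 1.5 * sigma    # gray values in [q - 1.5s, q + 1.5s]
                out[x, y] = L[mask].mean()             # boundary-point rule
            else:
                out[x, y] = np.sum(gauss * L) / np.sum(gauss)  # Gaussian-weighted mean
    return out
```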
(3) the coarse segmentation subunit roughly partitions the denoised cell image into background, cytoplasm and nucleus, as follows:
Each pixel (x, y) is represented by a four-dimensional feature vector:
$$\vec{u}(x,y) = [\, h(x,y),\ h_{ave}(x,y),\ h_{med}(x,y),\ h_{sta}(x,y) \,]$$
where h(x, y) is the gray value of (x, y), $h_{ave}(x,y)$ is the gray mean of its neighborhood $S_{x,y}$, $h_{med}(x,y)$ is the gray median of $S_{x,y}$, and $h_{sta}(x,y)$ is the gray variance of $S_{x,y}$.
The K-means clustering method is then used to divide the pixels into three classes: background, cytoplasm and nucleus.
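A minimal sketch of this coarse segmentation step is given below, assuming scikit-learn's KMeans as the clustering implementation (the patent only names the K-means method) and leaving the assignment of cluster labels to background, cytoplasm and nucleus to the caller.

```python
import numpy as np
from sklearn.cluster import KMeans

def coarse_segment(gray: np.ndarray) -> np.ndarray:
    """Cluster interior pixels into three classes (background, cytoplasm,
    nucleus) from the 4-D feature [h, h_ave, h_med, h_sta] computed over the
    3 x 3 neighborhood S. Which cluster label corresponds to the nucleus is
    left to the caller (e.g. by comparing mean gray values)."""
    g = gray.astype(np.float64)
    rows, cols = g.shape
    feats = []
    for x in range(1, rows - 1):
        for y in range(1, cols - 1):
            S = g[x - 1:x + 2, y - 1:y + 2]
            feats.append([g[x, y], S.mean(), np.median(S), S.var()])
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(np.asarray(feats))
    return labels.reshape(rows - 2, cols - 2)
```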
(4) the nucleus center calibration subunit calibrates the nucleus center:
The approximate nucleus region is obtained by the coarse segmentation subunit. Suppose the nucleus region contains n points $(x_1, y_1), \dots, (x_n, y_n)$. An intensity-weighted calibration and a geometric-center calibration are both performed on this region, and their average is taken as the nucleus center $(x_z, y_z)$:
$$x_z = \frac{1}{2}\left( \frac{\sum_{i=1}^{n} x_i\, h(x_i, y_i)}{\sum_{i=1}^{n} h(x_i, y_i)} + \frac{\sum_{i=1}^{n} x_i}{n} \right)$$
$$y_z = \frac{1}{2}\left( \frac{\sum_{i=1}^{n} y_i\, h(x_i, y_i)}{\sum_{i=1}^{n} h(x_i, y_i)} + \frac{\sum_{i=1}^{n} y_i}{n} \right)$$
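The two calibrations reduce to a few lines of code. The sketch below assumes the coarse nucleus region is supplied as an (n, 2) integer coordinate array and that x is treated as the row index of the denoised gray image h.

```python
import numpy as np

def nucleus_center(points: np.ndarray, h: np.ndarray) -> tuple:
    """points: (n, 2) integer array of (x, y) pixel coordinates of the coarse
    nucleus region (x treated as the row index); h: denoised gray image.
    Returns the average of the intensity-weighted centroid and the geometric
    centroid, as in the formulas above."""
    xs, ys = points[:, 0], points[:, 1]
    weights = h[xs, ys].astype(np.float64)
    xz = 0.5 * ((xs * weights).sum() / weights.sum() + xs.mean())
    yz = 0.5 * ((ys * weights).sum() / weights.sum() + ys.mean())
    return xz, yz
```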
(5) the fine segmentation subunit performs fine segmentation of the nucleus and the cytoplasm:
A directed line segment is constructed from the nucleus center $(x_z, y_z)$ to a boundary point $(x_p, y_p)$ of the nucleus or cytoplasm, with length $dis_p = \lfloor \sqrt{(x_p - x_z)^2 + (y_p - y_z)^2} \rfloor$, where $\lfloor \cdot \rfloor$ denotes rounding down.
Sampling along the line segment at unit length yields $dis_p$ points $(x_1, y_1), \dots, (x_{dis_p}, y_{dis_p})$; if the coordinates of a sample point are not integers, its gray value is obtained by linear interpolation of the surrounding pixels.
The gray difference at point $(x_i, y_i)$ along the direction of the line segment is:
$$h_d(x_i, y_i) = h(x_{i-1}, y_{i-1}) - h(x_i, y_i)$$
The gray-difference suppression function is defined as:
$$Y(x) = \begin{cases} x, & x \le 0 \\ 0.5\,x, & x > 0 \end{cases}$$
The gradient $gra(x_i, y_i)$ at point $(x_i, y_i)$ along the direction of the line segment is:
$$gra(x_i, y_i) = \frac{|Y(h_d(x_i, y_i))| + |Y(h_d(x_{i+1}, y_{i+1}))|}{2}$$
The points at which the gradient attains its maxima are chosen as the precise edges of the nucleus and the cytoplasm.
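For illustration, the ray-based edge refinement can be sketched as below. The Euclidean segment length and the bilinear interpolation are assumptions consistent with, but not spelled out by, the text; the function returns the index of the maximal-gradient sample along one ray rather than the full contour, and border handling is omitted.

```python
import numpy as np

def edge_along_ray(gray: np.ndarray, center: tuple, boundary: tuple) -> int:
    """Refine one edge point by sampling along the directed segment from the
    nucleus center to a coarse boundary point, applying the gray-difference
    suppression function Y, and returning the index of the sample with the
    largest gradient."""
    (xz, yz), (xp, yp) = center, boundary
    g = gray.astype(np.float64)
    dis = int(np.floor(np.hypot(xp - xz, yp - yz)))   # assumed Euclidean length, rounded down
    ts = np.arange(1, dis + 1) / dis                  # unit-length sample positions
    xs = xz + ts * (xp - xz)
    ys = yz + ts * (yp - yz)

    def sample(x: float, y: float) -> float:          # bilinear interpolation of the gray value
        x0, y0 = int(np.floor(x)), int(np.floor(y))
        dx, dy = x - x0, y - y0
        return ((1 - dx) * (1 - dy) * g[x0, y0] + dx * (1 - dy) * g[x0 + 1, y0]
                + (1 - dx) * dy * g[x0, y0 + 1] + dx * dy * g[x0 + 1, y0 + 1])

    vals = np.array([sample(x, y) for x, y in zip(xs, ys)])
    hd = np.concatenate(([0.0], vals[:-1] - vals[1:]))   # h_d(i) = h(i-1) - h(i)
    Y = np.where(hd <= 0, hd, 0.5 * hd)                  # gray-difference suppression
    gra = (np.abs(Y[:-1]) + np.abs(Y[1:])) / 2           # gradient along the segment
    return int(np.argmax(gra))                           # index of the refined edge sample
```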
This preferred embodiment provides a noise removal subunit that combines the spatial proximity and gray similarity between the center pixel and its neighboring pixels for noise reduction. In flat regions of the image, where neighboring pixel gray values differ little, a Gaussian filter performs weighted filtering of the gray values; in boundary regions with sharp changes, edge-preserving filtering is applied, which helps preserve the image edges. K-means clustering is used to extract the coarse contours of the nucleus and the cytoplasm, effectively removing the interference of noise. The nucleus center calibration subunit facilitates the subsequent precise localization of the nucleus and cytoplasm contours. The fine segmentation subunit makes full use of directional information, overcomes the interference of inflammatory cells with the edge map, and can accurately extract the edges of the nucleus and the cytoplasm.
Preferably, the extraction of the texture features of the cell image comprises:
(1) computing the comprehensive gray-level co-occurrence matrix of the cell image based on an improved gray-level co-occurrence matrix method; the comprehensive gray-level co-occurrence matrix reflects the texture features of the cell in different directions:
Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90° and 135° be h(x, y, d, 0°), h(x, y, d, 45°), h(x, y, d, 90°) and h(x, y, d, 135°), and let the corresponding numbers of matrix elements be $X_1$, $X_2$, $X_3$, $X_4$. The comprehensive gray-level co-occurrence matrix is then computed as:
$$H(x, y, d) = w_1 h(x, y, d, 0^\circ) + w_2 h(x, y, d, 45^\circ) + w_3 h(x, y, d, 90^\circ) + w_4 h(x, y, d, 135^\circ)$$
The number of elements of the comprehensive gray-level co-occurrence matrix is:
$$X = \sum_{i=1}^{4} w_i X_i$$
where d denotes the distance, with value range [2, 4], and $w_i$ (i = 1, 2, 3, 4) are weight coefficients calculated from the contrast parameters of the gray-level co-occurrence matrices in the four directions. Let the contrast parameter of the matrix in direction i be $D_i$ (i = 1, 2, 3, 4) and let their mean be $\bar{D}$; the weight coefficient $w_i$ is then computed as:
$$w_i = \frac{1}{|D_i - \bar{D}| + 1} \Bigg/ \sum_{i=1}^{4} \frac{1}{|D_i - \bar{D}| + 1}$$
(2) obtaining the four required texture feature parameters from the comprehensive gray-level co-occurrence matrix and its element count: contrast, sum of variance, energy and mean;
(3) normalizing the four texture feature parameters to obtain the final normalized texture feature values.
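A hedged sketch of the weighted (comprehensive) GLCM and the four texture parameters follows, using scikit-image's graycomatrix for the four directional matrices. The formulas assumed for contrast, sum of variance, energy and mean, as well as the min-max normalization, are standard choices and are not spelled out in the patent.

```python
import numpy as np
from skimage.feature import graycomatrix

def comprehensive_glcm_features(gray: np.ndarray, d: int = 2, levels: int = 256) -> np.ndarray:
    """gray: uint8 gray-level image. Returns [contrast, sum of variance,
    energy, mean] computed from the weighted combination of the four
    directional GLCMs at distance d."""
    angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]   # 0, 45, 90, 135 degrees
    glcms = graycomatrix(gray, distances=[d], angles=angles,
                         levels=levels, symmetric=True, normed=True)
    i, j = np.indices((levels, levels))
    per_dir = [glcms[:, :, 0, k] for k in range(4)]

    def contrast(p: np.ndarray) -> float:               # contrast of one normalized GLCM
        return float(np.sum(p * (i - j) ** 2))

    D = np.array([contrast(p) for p in per_dir])        # per-direction contrast parameters D_i
    w = 1.0 / (np.abs(D - D.mean()) + 1.0)
    w /= w.sum()                                        # weight coefficients w_i

    H = sum(wk * pk for wk, pk in zip(w, per_dir))      # comprehensive GLCM

    mean = float(np.sum(i * H))                         # assumed definition of the 'mean' feature
    sum_variance = float(np.sum(((i - mean) ** 2 + (j - mean) ** 2) * H))  # assumed form
    return np.array([contrast(H), sum_variance, float(np.sum(H ** 2)), mean])

def normalize_features(feature_matrix: np.ndarray) -> np.ndarray:
    """Per-feature min-max normalization over a batch of feature vectors, one
    plausible reading of step (3); the patent does not specify the method."""
    mn, mx = feature_matrix.min(axis=0), feature_matrix.max(axis=0)
    return (feature_matrix - mn) / (mx - mn + 1e-12)
```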
Based on the improved gray-level co-occurrence matrix method, this preferred embodiment uses weight coefficients to compute the comprehensive gray-level co-occurrence matrix of the cell image and then extracts the cell texture features in the four specified directions. This addresses the problem that, owing to external interference (such as the illumination angle during cell image acquisition or gas flow disturbances), the texture feature parameters of a cell differ considerably in different directions, and it improves the accuracy of cell image texture feature extraction. The selected contrast, sum of variance, energy and mean features eliminate redundant and repeated feature parameters, and normalizing the four texture feature parameters facilitates the subsequent classification and identification of the cell image.
In this application scenario, the threshold is set to T = 13 and d = 2; the image denoising effect is relatively improved by 5%, and the accuracy of cell image feature extraction is improved by 8%.
Application scenario 2
Referring to Fig. 1 and Fig. 2, the in-vivo information acquisition apparatus of one embodiment of this application scenario is identical in structure and processing to that of Application scenario 1, comprising the same cell recognition module (cell image segmentation, feature extraction and classification units) and the same information acquisition module with magnetic-pulse-controlled power cut-off; only the parameter settings differ, as given below.
In this application scenario, the threshold is set to T = 15 and d = 2; the image denoising effect is relatively improved by 6%, and the accuracy of cell image feature extraction is improved by 8%.
Application scenario 3
Referring to Fig. 1 and Fig. 2, the in-vivo information acquisition apparatus of one embodiment of this application scenario is identical in structure and processing to that of Application scenario 1; only the parameter settings differ, as given below.
In this application scenario, the threshold is set to T = 18 and d = 3; the image denoising effect is relatively improved by 7%, and the accuracy of cell image feature extraction is improved by 7%.
Application scenario 4
Referring to Fig. 1 and Fig. 2, the in-vivo information acquisition apparatus of one embodiment of this application scenario is identical in structure and processing to that of Application scenario 1; only the parameter settings differ, as given below.
In this application scenario, the threshold is set to T = 20 and d = 4; the image denoising effect is relatively improved by 8%, and the accuracy of cell image feature extraction is improved by 6%.
Application scenario 5
Seeing Fig. 1, Fig. 2, a kind of in-vivo information acquiring apparatus of an embodiment of this application scene, including cell Identification module and data obtaining module, described cell recognition module is used for determining that biological species, described data obtaining module include:
Information acquiring section, it obtains organism internal information;
Electric power source, it is for providing electric power to above-mentioned information acquiring section;
Magnetic Sensor portion, its detection, from the magnetic signal of outside input, exports control corresponding with the detection state of this magnetic signal Signal processed;
Umber of pulse count section, the umber of pulse of the pulse signal from above-mentioned Magnetic Sensor portion is counted by it;
Umber of pulse judging part, its judge the umber of pulse counted to get by above-mentioned umber of pulse count section be whether predetermined number with On;
Power cut control portion, it is being judged as have input the feelings of the pulse of more than predetermined number by above-mentioned umber of pulse judging part Under condition, above-mentioned electric power source is provided to the electric power that above-mentioned information acquiring section is carried out and switches to dissengaged positions from offer state.
Preferably, described data obtaining module also includes being spaced test section, and this interval test section detection passes from above-mentioned magnetic The output gap of the pulse signal in sensor portion,
Described umber of pulse count section at the output gap detected by above-mentioned interval test section not less than base set in advance In the case of quasi-interval, the output number of above-mentioned pulse signal is updated.
Data can be updated by this preferred embodiment in time.
Preferably, described interval test section is made up of enumerator.
It is more accurate that this preferred embodiment obtains information.
Preferably, described cell recognition module 1 includes that Methods of Segmentation On Cell Images unit 11, feature extraction unit 12, classification are known Other unit 13;Described Methods of Segmentation On Cell Images unit 11 is for distinguishing the back of the body in the cell image gathered by cell image acquisition module Scape, nucleus and Cytoplasm;Described feature extraction unit 12 is for extracting the textural characteristics of cell image;Described classification Recognition unit 13 is for utilizing grader to realize cell image Classification and Identification according to textural characteristics.
This preferred embodiment constructs the unit structure of cell recognition module 1.
Preferably, described Methods of Segmentation On Cell Images unit 11 includes that image changes subelement, noise remove subelement, coarse segmentation Subelement, nuclear centers demarcate subelement, Accurate Segmentation subelement, particularly as follows:
(1) image conversion subelement, for being converted into gray level image by the cell image of collection;
(2) noise remove subelement, for gray level image is carried out denoising, including:
For pixel, (x y), chooses its neighborhood S of 3 × 3x,y(2N+1) the neighborhood L of × (2N+1)x,y, N is for being more than Integer equal to 2;
First whether be that boundary point judges to pixel, set threshold value T, T ∈ [13,26], calculate pixel (x, y) With its neighborhood Sx,yIn the gray scale difference value of each pixel, and compare with threshold value T, if gray scale difference value is more than the number of threshold value T More than or equal to 6, then (x, y) is boundary point to pixel, and otherwise, (x y) is non-boundary point to pixel;
If (x, y) is boundary point, then carry out following noise reduction process:
h ( x , y ) = Σ q ( i , j ) ∈ [ q ( x , y ) - 1.5 σ , q ( x , y ) + 1.5 σ ] q ( i , j ) k
In the formula, h(x, y) is the gray value of pixel (x, y) after noise reduction, q(x, y) is the gray value of pixel (x, y) before noise reduction, σ is the standard deviation of the gray values within the neighborhood L_{x,y} of pixel (x, y), q(i, j) ∈ [q(x, y) − 1.5σ, q(x, y) + 1.5σ] denotes the points of the neighborhood L_{x,y} whose gray values fall within the interval [q(x, y) − 1.5σ, q(x, y) + 1.5σ], and k is the number of such points in L_{x,y};
If (x, y) is a non-boundary point, the following noise reduction is carried out:
h(x, y) = \frac{\sum_{(i,j) \in L_{x,y}} w(i, j)\, q(i, j)}{\sum_{(i,j) \in L_{x,y}} w(i, j)}
In the formula, h(x, y) is the gray value of pixel (x, y) after noise reduction, q(i, j) is the gray value of the image at point (i, j), and w(i, j) is the Gaussian weight corresponding to point (i, j) in the neighborhood L_{x,y};
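For illustration, a minimal Python sketch of the boundary-aware denoising described above follows; the function name, the neighborhood size N = 2 and the Gaussian kernel width are assumptions chosen for the sketch, not values prescribed by this application.

import numpy as np

def denoise(gray, T=20, N=2, sigma_w=1.5):
    """Boundary-aware denoising sketch; gray is a 2-D array of gray values (assumed float or uint8)."""
    gray = gray.astype(np.float64)
    out = gray.copy()
    H, W = gray.shape
    # Gaussian weight mask for the (2N+1) x (2N+1) neighborhood L.
    ys, xs = np.mgrid[-N:N + 1, -N:N + 1]
    gauss = np.exp(-(xs ** 2 + ys ** 2) / (2.0 * sigma_w ** 2))
    for y in range(N, H - N):
        for x in range(N, W - N):
            q = gray[y, x]
            S = gray[y - 1:y + 2, x - 1:x + 2]            # 3 x 3 neighborhood S
            L = gray[y - N:y + N + 1, x - N:x + N + 1]    # (2N+1) x (2N+1) neighborhood L
            if np.sum(np.abs(S - q) > T) >= 6:            # boundary-point test with threshold T
                sigma = L.std()
                mask = np.abs(L - q) <= 1.5 * sigma        # gray values within [q - 1.5*sigma, q + 1.5*sigma]
                out[y, x] = L[mask].mean()                 # average of the k retained gray values
            else:
                out[y, x] = np.sum(gauss * L) / np.sum(gauss)  # Gaussian-weighted average over L
    return out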
(3) the coarse segmentation sub-unit is used to roughly partition the denoised cell image into background, cytoplasm and nucleus, specifically as follows:
Each pixel (x, y) is represented by a four-dimensional feature vector:
\vec{u}(x, y) = [\, h(x, y),\ h_{ave}(x, y),\ h_{med}(x, y),\ h_{sta}(x, y) \,]
In the formula, h(x, y) is the gray value of pixel (x, y), h_{ave}(x, y) is the gray-level mean of its neighborhood S_{x,y}, h_{med}(x, y) is the gray-level median of its neighborhood S_{x,y}, and h_{sta}(x, y) is the gray-level variance of its neighborhood S_{x,y};
K-means clustering is then applied to these feature vectors to divide the pixels into three classes: background, cytoplasm and nucleus;
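A minimal sketch of this coarse segmentation step in Python, assuming SciPy and scikit-learn are available; the library choice, the function name and the K-means settings are illustrative assumptions rather than requirements of this application.

import numpy as np
from scipy.ndimage import uniform_filter, median_filter
from sklearn.cluster import KMeans

def coarse_segment(gray):
    """Build the 4-D per-pixel feature vectors and cluster them into three classes (sketch)."""
    gray = gray.astype(np.float64)
    h_ave = uniform_filter(gray, size=3)                     # 3 x 3 gray-level mean
    h_med = median_filter(gray, size=3)                      # 3 x 3 gray-level median
    h_sta = uniform_filter(gray ** 2, size=3) - h_ave ** 2   # 3 x 3 gray-level variance
    features = np.stack([gray, h_ave, h_med, h_sta], axis=-1).reshape(-1, 4)
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
    return labels.reshape(gray.shape)   # label image; which label is background / cytoplasm / nucleus is decided afterwards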
(4) the nucleus center demarcation sub-unit is used to demarcate the center of the nucleus:
The approximate nucleus region is obtained from the coarse segmentation sub-unit; suppose the nucleus region contains n points (x_1, y_1), ..., (x_n, y_n). An intensity-weighted center and a geometric center are computed for this region, and their mean is taken as the nucleus center (x_z, y_z):
x_z = \frac{1}{2}\left( \frac{\sum_{i=1}^{n} x_i\, h(x_i, y_i)}{\sum_{i=1}^{n} h(x_i, y_i)} + \frac{\sum_{i=1}^{n} x_i}{n} \right)
y_z = \frac{1}{2}\left( \frac{\sum_{i=1}^{n} y_i\, h(x_i, y_i)}{\sum_{i=1}^{n} h(x_i, y_i)} + \frac{\sum_{i=1}^{n} y_i}{n} \right)
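A short Python sketch of this two-center averaging, assuming the inputs are the denoised gray image and a boolean mask of the coarse nucleus region; the function name is an assumption.

import numpy as np

def nucleus_center(gray, nucleus_mask):
    """Average of the intensity-weighted centroid and the geometric centroid of the nucleus region (sketch)."""
    ys, xs = np.nonzero(nucleus_mask)               # the n points (x_i, y_i) of the nucleus region
    h = gray[ys, xs].astype(np.float64)
    x_weighted = np.sum(xs * h) / np.sum(h)         # intensity-weighted centroid
    y_weighted = np.sum(ys * h) / np.sum(h)
    x_geom, y_geom = xs.mean(), ys.mean()           # geometric centroid
    return 0.5 * (x_weighted + x_geom), 0.5 * (y_weighted + y_geom)   # nucleus center (x_z, y_z)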
(5) the accurate segmentation sub-unit is used to accurately segment the nucleus and the cytoplasm:
A directed line segment is constructed from the nucleus center (x_z, y_z) to a boundary point (x_p, y_p) of the nucleus or cytoplasm, with length dis_p = \lfloor \sqrt{(x_p - x_z)^2 + (y_p - y_z)^2} \rfloor, where \lfloor \cdot \rfloor denotes rounding down;
Sampling along the line segment at unit length yields dis_p points (x_1, y_1), ..., (x_{dis_p}, y_{dis_p}); if the coordinates of a sampling point are not integers, its gray value is obtained by linear interpolation from the surrounding pixels;
The gray-level difference at point (x_i, y_i) along the direction of the line segment is:
h_d(x_i, y_i) = h(x_{i-1}, y_{i-1}) - h(x_i, y_i)
Define the gray-level difference suppression function:
Y(x) = \begin{cases} x & \text{if } x \le 0 \\ 0.5x & \text{if } x > 0 \end{cases}
The gradient gra(x_i, y_i) at point (x_i, y_i) along the direction of the line segment is:
gra(x_i, y_i) = \frac{\left| Y\!\left( h_d(x_i, y_i) \right) \right| + \left| Y\!\left( h_d(x_{i+1}, y_{i+1}) \right) \right|}{2}
The point of maximum gradient along each line segment is chosen as the precise edge of the nucleus and the cytoplasm.
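A Python sketch of this radial edge search follows; the bilinear sampling via scipy.ndimage.map_coordinates and the function names are assumptions made for illustration.

import numpy as np
from scipy.ndimage import map_coordinates

def Y(x):
    """Gray-level difference suppression function defined above."""
    return np.where(x <= 0, x, 0.5 * x)

def precise_edge_point(gray, center, rough_boundary_point):
    """Locate the maximum-gradient sample on the ray from the nucleus center to a coarse boundary point (sketch)."""
    (xz, yz), (xp, yp) = center, rough_boundary_point
    dis_p = int(np.floor(np.hypot(xp - xz, yp - yz)))   # segment length, rounded down
    t = np.arange(0, dis_p + 1) / dis_p                 # unit-length sampling positions along the segment
    sx, sy = xz + t * (xp - xz), yz + t * (yp - yz)
    # Gray values at the (possibly non-integer) sample points via bilinear interpolation.
    h = map_coordinates(gray.astype(np.float64), [sy, sx], order=1)
    hd = h[:-1] - h[1:]                                 # h_d at samples 1..dis_p: h(previous) - h(current)
    g = np.abs(Y(hd))
    gra = 0.5 * (g[:-1] + g[1:])                        # gradient at samples 1..dis_p - 1
    i = int(np.argmax(gra)) + 1                         # index of the maximum-gradient sample
    return sx[i], sy[i]                                 # precise edge point on this ray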
This preferred embodiment provides a noise removal sub-unit that, in carrying out noise reduction, effectively combines the spatial proximity and gray-level similarity between the center pixel and its neighborhood pixels: in flat regions of the image, where the gray values of neighboring pixels differ little, the gray values are filtered with Gaussian weighting, while in boundary regions with sharp changes an edge-preserving filter is applied, which is favorable for preserving image edges. K-means clustering is used to extract the coarse contours of the nucleus and cytoplasm, which can effectively remove the interference of noise. The nucleus center demarcation sub-unit facilitates the subsequent accurate positioning of the nucleus and cytoplasm contours. The accurate segmentation sub-unit makes full use of directional information, overcomes the interference of inflammatory cells on the edge map, and can accurately extract the edges of the nucleus and cytoplasm.
Preferably, the extraction of the texture features of the cell image includes:
(1) the comprehensive gray-level co-occurrence matrix of the cell image is computed based on an improved gray-level co-occurrence matrix method; this comprehensive co-occurrence matrix reflects the texture features of the cell in different directions:
Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90° and 135° be h(x, y, d, 0°), h(x, y, d, 45°), h(x, y, d, 90°) and h(x, y, d, 135°) respectively, with corresponding numbers of matrix elements X_1, X_2, X_3 and X_4; then the comprehensive gray-level co-occurrence matrix is computed as:
H(x, y, d) = w_1 h(x, y, d, 0°) + w_2 h(x, y, d, 45°) + w_3 h(x, y, d, 90°) + w_4 h(x, y, d, 135°)
The number of elements of the comprehensive gray-level co-occurrence matrix is:
X = \sum_{i=1}^{4} w_i X_i
In the formula, d denotes the distance, with d taking values in [2, 4], and w_i (i = 1, 2, 3, 4) are weight coefficients computed from the contrast parameters corresponding to the gray-level co-occurrence matrices in the four directions; let the contrast parameter corresponding to the gray-level co-occurrence matrix in each of the four directions be D_i, with mean \bar{D}, i = 1, 2, 3, 4; then the weight coefficient w_i is computed as:
w_i = \frac{ \dfrac{1}{|D_i - \bar{D}| + 1} }{ \sum_{i=1}^{4} \dfrac{1}{|D_i - \bar{D}| + 1} }
(2) the comprehensive gray-level co-occurrence matrix and its number of matrix elements are used to obtain the four required texture feature parameters: contrast, sum of variance, energy and mean;
(3) the four texture feature parameters are normalized, finally yielding normalized texture feature values.
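The following Python sketch illustrates the weighted combination of the four directional co-occurrence matrices described above, assuming scikit-image is available (where the relevant functions are named graycomatrix and graycoprops in recent versions); the library choice, the function name and d = 2 are illustrative assumptions.

import numpy as np
from skimage.feature import graycomatrix, graycoprops

def comprehensive_glcm(gray_u8, d=2, levels=256):
    """Weighted combination of the 0/45/90/135-degree co-occurrence matrices (sketch); gray_u8 is a uint8 image."""
    angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
    glcm = graycomatrix(gray_u8, distances=[d], angles=angles, levels=levels, normed=True)
    D = graycoprops(glcm, 'contrast')[0]            # contrast parameter D_i of each directional matrix
    inv = 1.0 / (np.abs(D - D.mean()) + 1.0)
    w = inv / inv.sum()                             # weight coefficients w_i from the formula above
    H = np.tensordot(glcm[:, :, 0, :], w, axes=([2], [0]))   # comprehensive matrix H(x, y, d)
    return H, w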
This preferred embodiment computes the comprehensive gray-level co-occurrence matrix of the cell image based on an improved gray-level co-occurrence matrix method using weight coefficients, and then extracts the texture features of the cell in the four specified directions. This solves the problem that external interference (such as the influence of the lighting angle during cell image acquisition, airflow disturbance and the like) causes the texture feature parameter values of the cell to differ greatly in different directions, and improves the precision of cell image texture feature extraction. The four texture features of contrast, sum of variance, energy and mean are selected, eliminating redundant and repeated feature parameters; the four texture feature parameters are normalized, which facilitates the subsequent classification and identification of the cell image.
In this application scenario, with the threshold set to T = 26 and d = 2, the image denoising effect is improved by 7.5% relatively, and the extraction accuracy of the cell image features is improved by 8%.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present invention and are not a limitation of its scope of protection. Although the present invention has been explained in detail with reference to preferred embodiments, those of ordinary skill in the art should understand that the technical solution of the present invention may be modified or equivalently replaced without departing from the essence and scope of the technical solution of the present invention.

Claims (3)

1. An in-vivo information acquiring apparatus, characterized in that it includes a cell recognition module and a data acquisition module, the cell recognition module being used to determine the biological species, and the data acquisition module including:
an information acquiring section, which acquires information from inside the organism;
a power source, which supplies electric power to the above-mentioned information acquiring section;
a magnetic sensor section, which detects a magnetic signal input from outside and outputs a control signal corresponding to the detection state of the magnetic signal;
a pulse count section, which counts the number of pulses in the pulse signal from the above-mentioned magnetic sensor section;
a pulse count judging section, which judges whether the pulse count obtained by the above-mentioned pulse count section is equal to or greater than a predetermined number;
a power cut-off control section, which, when the above-mentioned pulse count judging section judges that pulses equal to or greater than the predetermined number have been input, switches the electric power supplied from the above-mentioned power source to the above-mentioned information acquiring section from the supplying state to the cut-off state.
2. The in-vivo information acquiring apparatus according to claim 1, characterized in that the data acquisition module further includes an interval detection section, which detects the output interval of the pulse signal from the above-mentioned magnetic sensor section,
and the above-mentioned pulse count section updates the output count of the pulse signal when the output interval detected by the above-mentioned interval detection section is not shorter than a preset reference interval.
3. The in-vivo information acquiring apparatus according to claim 2, characterized in that the interval detection section is constituted by a counter.
CN201610782831.8A 2016-08-30 2016-08-30 A kind of in-vivo information acquiring apparatus Active CN106108832B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610782831.8A CN106108832B (en) 2016-08-30 2016-08-30 A kind of in-vivo information acquiring apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610782831.8A CN106108832B (en) 2016-08-30 2016-08-30 A kind of in-vivo information acquiring apparatus

Publications (2)

Publication Number Publication Date
CN106108832A true CN106108832A (en) 2016-11-16
CN106108832B CN106108832B (en) 2017-11-10

Family

ID=57273254

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610782831.8A Active CN106108832B (en) 2016-08-30 2016-08-30 A kind of in-vivo information acquiring apparatus

Country Status (1)

Country Link
CN (1) CN106108832B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006064972A1 (en) * 2004-12-17 2006-06-22 Olympus Corporation Position detection system, guidance system, position detection method, medical device, and medical magnetic-induction and position-detection system
US20080154090A1 (en) * 2005-01-04 2008-06-26 Dune Medical Devices Ltd. Endoscopic System for In-Vivo Procedures
US20120022328A1 (en) * 2009-02-05 2012-01-26 Johannes Reinschke Separating an endoscopy capule from a surface of a liquid
CN103732114A (en) * 2011-07-28 2014-04-16 奥林巴斯株式会社 Bioinformation acquisition system and method for driving bioinformation acquisition system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109360187A (en) * 2018-09-10 2019-02-19 天津大学 Cancer cell detector for lymphocyte sections

Also Published As

Publication number Publication date
CN106108832B (en) 2017-11-10

Similar Documents

Publication Publication Date Title
CN104766058B (en) A kind of method and apparatus for obtaining lane line
EP2596746B1 (en) Pupil detection device and pupil detection method
CN107305635A (en) Object identifying method, object recognition equipment and classifier training method
CN110456320B (en) Ultra-wideband radar identity recognition method based on free space gait time sequence characteristics
CN105930852B (en) A kind of bubble image-recognizing method
CN106128121B (en) Vehicle queue length fast algorithm of detecting based on Local Features Analysis
WO2007044508A3 (en) System and method for whole body landmark detection, segmentation and change quantification in digital images
CN111160210B (en) Video-based water flow rate detection method and system
CN107729926B (en) Data amplification method and machine identification system based on high-dimensional space transformation
CN105740779A (en) Method and device for human face in-vivo detection
CN107016353B (en) A kind of integrated method and system of variable resolution target detection and identification
CN105869166A (en) Human body action identification method and system based on binocular vision
CN109976526A (en) A kind of sign Language Recognition Method based on surface myoelectric sensor and nine axle sensors
CN110288623B (en) Data compression method for unmanned aerial vehicle maritime net cage culture inspection image
CN109871829A (en) A kind of detection model training method and device based on deep learning
CN110084830A (en) A kind of detection of video frequency motion target and tracking
CN109271868B (en) Dense connection convolution network hypersphere embedding-based target re-identification method
CN112668510A (en) Method, system, device, processor and storage medium for realizing performance test of three-dimensional face recognition equipment
CN106023153A (en) Method for measuring bubbles in water body
Moon et al. Skin microstructure segmentation and aging classification using CNN-based models
CN106108832B (en) A kind of in-vivo information acquiring apparatus
CN105844614B (en) It is a kind of that northern method is referred to based on the vision for proofreading robot angle
CN117333846A (en) Detection method and system based on sensor fusion and incremental learning in severe weather
CN115620149B (en) Road detection method based on remote sensing image
CN103093481A (en) Moving object detection method under static background based on watershed segmentation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Song Changxin

Inventor after: Ma Ke

Inventor after: Li Anqiang

Inventor before: The inventor has waived the right to be mentioned

CB03 Change of inventor or designer information
TA01 Transfer of patent application right

Effective date of registration: 20171012

Address after: No. 38 Wusi West Road, Xining, Qinghai 810000, China

Applicant after: Song Changxin

Applicant after: Ma Ke

Applicant after: Li Anqiang

Address before: No. 555 Tunnel Road, Zhenhai District, Ningbo City, Zhejiang Province 315200

Applicant before: Meng Ling

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210525

Address after: Room 1601, 238 JIANGCHANG Third Road, Jing'an District, Shanghai 200436

Patentee after: Shanghai Xuetong Intelligent Technology Co.,Ltd.

Address before: Wusi West Road Campus of Qinghai Normal University, No. 38 Wusi West Road, Chengxi District, Xining City, Qinghai Province 810000

Patentee before: Song Changxin

Patentee before: Ma Ke

Patentee before: Li Anqiang

TR01 Transfer of patent right