CN111127461B - Chest image processing method, chest image processing device, storage medium and medical equipment - Google Patents


Info

Publication number
CN111127461B
CN111127461B (application CN201911423166.3A)
Authority
CN
China
Prior art keywords
projection
image
determining
dimensional
thoracic
Prior art date
Legal status
Active
Application number
CN201911423166.3A
Other languages
Chinese (zh)
Other versions
CN111127461A (en)
Inventor
张丛嵘
Current Assignee
Neusoft Medical Systems Co Ltd
Original Assignee
Neusoft Medical Systems Co Ltd
Priority date
Filing date
Publication date
Application filed by Neusoft Medical Systems Co Ltd filed Critical Neusoft Medical Systems Co Ltd
Priority: CN201911423166.3A
Publication of CN111127461A
Application granted
Publication of CN111127461B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/181 Segmentation; Edge detection involving edge growing; involving edge linking
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/003 Reconstruction from projections, e.g. tomography
    • G06T 11/006 Inverse problem, transformation from projection-space into object-space, e.g. transform methods, back-projection, algebraic methods
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10072 Tomographic images
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Optimization (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Mathematical Physics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Algebra (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The application provides a chest image processing method, a chest image processing device, a storage medium and a medical device for reducing missed diagnoses of bone abnormalities such as microfractures and bone cracks. The chest image processing method comprises the following steps: acquiring a three-dimensional chest image obtained by scanning a subject, wherein the three-dimensional chest image comprises a plurality of tomographic images, and identifying the rib region in which the ribs are located in the tomographic images; selecting a preset number of images from the plurality of tomographic images to form a sampling image set and, for any sampled image, determining the thoracic curve corresponding to that image from its rib region; determining the correspondence between projection points on the thoracic curve and projection angles; determining a target projection angle; and, according to the correspondence between the projection points on the thoracic curves and the projection angles, projecting the three-dimensional chest image onto a two-dimensional plane with a cylindrical projection unfolding algorithm to obtain a two-dimensional chest unfolded image.

Description

Chest image processing method, chest image processing device, storage medium and medical equipment
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a chest image processing method, device, storage medium, and medical apparatus.
Background
Computed tomography (CT) and magnetic resonance (MR) imaging are the main methods for diagnosing rib fractures and other bone abnormalities of the chest. However, because of the special tissue structure and morphology of the ribs, some fractures or fracture lesions are difficult to diagnose owing to occlusion and other viewing-angle problems.
At present, one of the more commonly used methods for diagnosing rib fractures and other bone abnormalities is two-dimensional planar projection of the ribs: the rib image is projected onto a two-dimensional plane and unfolded, and a physician diagnoses fractures and bone cracks by observing the gray-level continuity and brightness changes of the two-dimensional rib projection. The projection obtained in this way is usually a two-dimensional chest unfolded image centered on the spine. Owing to noise, however, the images on both sides of the unfolded image are blurred compared with the central area, so subtle abnormalities such as microfractures and bone cracks are hard to observe there and are easily missed.
Disclosure of Invention
In view of the above, the present application provides a chest image processing method and device, a storage medium, and a medical apparatus for reducing missed diagnoses of bone abnormalities such as microfractures and bone cracks.
In a first aspect, an embodiment of the present application provides a chest image processing method, including:
acquiring a three-dimensional chest image obtained by scanning a subject, wherein the three-dimensional chest image comprises a plurality of tomographic images, and identifying a rib region in which a rib is positioned in the tomographic images;
selecting a preset number of images from the plurality of tomographic images to form a sampling image set, and determining a chest curve corresponding to any one of the sampling images according to a rib region in the sampling image;
determining a correspondence between projection points on the thoracic curve and projection angles, wherein the projection angle is the angle between the perpendicular projection of a projection line onto a transverse plane and the transverse (x) axis of a spatial rectangular coordinate system, the transverse plane being parallel to the coordinate plane spanned by the transverse and longitudinal axes;
determining a target projection angle;
and, according to the correspondence between the projection points on the thoracic curves and the projection angles, projecting the three-dimensional chest image onto a two-dimensional plane with a cylindrical projection unfolding algorithm to obtain a two-dimensional chest unfolded image whose center line is the pixel column corresponding to the projection points in the direction of the target projection angle.
In a possible implementation manner, the determining the thoracic curve corresponding to the sampled image according to the rib region in the sampled image includes:
determining a viewpoint of a first transverse plane corresponding to the sampling image, wherein the viewpoint is an intersection point of the first transverse plane and a vertical axis of the space rectangular coordinate system;
the viewpoint is taken as the center, and projection lines are emitted outwards in the first transverse plane according to a first set angle step;
when the projection line intersects with a rib region in the sampling image, determining an intersection point of the projection line and the rib region in the sampling image, and determining a projection point corresponding to the projection line according to the intersection point;
and fitting to obtain a thoracic curve corresponding to the sampling image according to all the projection points in the first transverse plane.
In a possible implementation manner, the determining the viewpoint of the first transverse plane corresponding to the sampled image includes:
determining a center of gravity of the chest of the subject from rib areas in the plurality of tomographic images;
establishing the space rectangular coordinate system by taking the center of gravity as a coordinate origin, taking the direction of a rotation axis of the object as a vertical axis, taking the horizontal direction on a plane parallel to a transverse plane as a horizontal axis and taking the vertical direction on a plane parallel to the transverse plane as a vertical axis;
and determining an intersection point of a first transverse plane corresponding to the sampling image and a vertical axis of the space rectangular coordinate system as a viewpoint of the first transverse plane.
In a possible implementation manner, the determining an intersection point of the projection line and a rib region in the sampled image, and determining a projection point corresponding to the projection line according to the intersection point, includes:
determining a first intersection of the projection line with an inner edge line of a rib region in the sampled image, and determining a second intersection of the projection line with an outer edge line of a rib region in the sampled image;
and taking the midpoint of the connecting line of the first intersection point and the second intersection point as a projection point corresponding to the projection line.
In a possible implementation manner, the determining the target projection angle includes:
receiving an image unfolding instruction aiming at any position to be diagnosed, wherein the image unfolding instruction comprises a projection angle corresponding to the position to be diagnosed, and determining the projection angle corresponding to the position to be diagnosed as a target projection angle.
In one possible implementation manner, the projecting the three-dimensional chest image onto a two-dimensional plane by adopting a cylindrical projection unfolding algorithm according to the corresponding relation between the projection point and the projection angle on the thoracic curve to obtain a two-dimensional thoracic unfolding image includes:
for the thoracic curve corresponding to each sampling image, searching a first projection point corresponding to a first projection angle which is 180 degrees different from the target projection angle according to the corresponding relation between the projection point on the thoracic curve and the projection angle;
starting from the first projection point, selecting sampling points along the thoracic curve according to a second set angle step length;
and starting from the first projection point, projecting sampling points on the same thoracic curve to the same row of pixels on the two-dimensional plane, projecting sampling points on different thoracic curves corresponding to the same projection angle to the same column of pixels on the two-dimensional plane, and obtaining the two-dimensional thoracic unfolding image through interpolation.
In a second aspect, embodiments of the present application further provide a chest image processing device comprising means for performing the chest image processing method of the first aspect or any possible implementation of the first aspect.
In a third aspect, embodiments of the present application also provide a storage medium having stored thereon a computer program which when executed by a processor implements the steps of the chest image processing method of the first aspect or any possible implementation of the first aspect.
In a fourth aspect, embodiments of the present application also provide a medical device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the chest image processing method of the first aspect or any possible implementation of the first aspect when the program is executed.
The technical scheme provided by the embodiment of the application at least has the following beneficial effects:
according to the technical scheme provided by the application, the target projection angle is determined firstly, then the three-dimensional chest image is projected to a two-dimensional plane by adopting a cylindrical projection unfolding algorithm according to the corresponding relation between the projection points on the chest curve and the projection angles, so that a two-dimensional chest unfolding image taking the pixel column corresponding to each projection point in the direction of the target projection angle as a central line is obtained, that is, when the projection angle corresponding to the position to be diagnosed is determined as the target projection angle, the image corresponding to the position to be diagnosed can be projected to the central area of the two-dimensional chest unfolding image, and therefore, the omission of bone diseases such as micro fracture, bone fracture and the like can be reduced.
Drawings
Fig. 1 is a schematic flow chart of a chest image processing method according to an embodiment of the present application;
FIG. 2 is a schematic illustration of a three-dimensional chest image provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of determining a thoracic curve according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an image grid according to an embodiment of the present application;
FIG. 5 is a schematic illustration of a two-dimensional chest expansion image obtained using the method provided by the embodiments of the present application;
fig. 6 is a schematic structural diagram of a chest image processing device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a thoracic curve determining module in the thoracic image processing apparatus according to the embodiment of the present application;
fig. 8 is a schematic structural diagram of an image unfolding module in the chest image processing device according to the embodiment of the application;
fig. 9 is a schematic structural diagram of a medical device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the accompanying claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information and, similarly, second information may also be referred to as first information, without departing from the scope of the application. The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining", depending on the context.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application.
Referring to fig. 1, an embodiment of the present application provides a chest image processing method, which may be used in a CT imaging system or a magnetic resonance imaging system, the method may include the steps of:
s101, acquiring a three-dimensional chest image obtained by scanning a detected body, wherein the three-dimensional chest image comprises a plurality of tomographic images, and identifying a rib region where a rib is located in the tomographic images;
the three-dimensional chest image is a three-dimensional medical image, and may be a CT image, an MR image, or the like, for example.
In the embodiment of the application, the rib region in the three-dimensional chest image (or in the individual tomographic images) can be identified with either a traditional method or a deep-learning method. Traditional methods include, for example, thresholding and region growing; deep-learning methods can use, for example, a neural network built from common VGG-style blocks combined with downsampling and upsampling stages.
When the neural network is used to identify the rib region where the ribs are located in the three-dimensional chest image, the neural network model may be trained in advance. When the model is trained, a large number of marked rib images can be obtained as a sample set, and the neural network model is obtained by continuously adjusting and optimizing the training model according to the sample set.
During recognition, the tomographic image can be input into the trained neural network model, and the tomographic image which is output by the neural network model and marks the rib region is obtained after the neural network model is processed.
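As an illustration of the traditional thresholding-plus-connected-component approach mentioned above, the following Python sketch segments candidate rib regions in a single tomographic slice. The HU threshold, minimum component size, and function name are illustrative assumptions, not values fixed by the application.

```python
import numpy as np
from scipy import ndimage

def segment_ribs(slice_hu, bone_threshold=200, min_voxels=20):
    """Naive rib-region segmentation for one tomographic slice.

    Thresholds the slice at a bone-like HU value, then keeps only
    connected components large enough to be rib cross-sections.
    Threshold and size values are illustrative, not clinical.
    """
    mask = slice_hu >= bone_threshold        # candidate bone pixels
    labeled, n = ndimage.label(mask)         # 4-connected components
    keep = np.zeros_like(mask)
    for lbl in range(1, n + 1):
        component = labeled == lbl
        if component.sum() >= min_voxels:    # drop small noise specks
            keep |= component
    return keep
```

In practice a trained network would replace this heuristic, but the interface (slice in, boolean rib mask out) is the same.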
S102, selecting a preset number of images from the plurality of tomographic images to form a sampling image set, and determining a chest curve corresponding to any one of the sampling images according to a rib region in the sampling image;
In the embodiment of the application, images can be selected from the plurality of tomographic images layer by layer, or at a set layer interval (for example, every 5 layers), along the vertical-axis (z-axis) direction of the spatial rectangular coordinate system, until a preset number of images (for example, 30) has been selected to form the sampling image set.
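A minimal sketch of this slice-selection step; the interval of 5 layers and the count of 30 images are the examples given in the text, and the function name is an assumption.

```python
import numpy as np

def select_sample_indices(num_slices, step=5, max_samples=30):
    """Pick slice indices along the z axis at a fixed layer interval.

    `step` and `max_samples` mirror the '5 layers' / '30 images'
    examples in the text; neither is fixed by the method.
    """
    idx = np.arange(0, num_slices, step)   # every `step`-th slice
    return idx[:max_samples]               # cap at the preset count
```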
In some embodiments, determining a thoracic curve corresponding to the sampled image according to the rib region in the sampled image in step S102 includes:
determining a viewpoint of a first transverse plane corresponding to the sampling image, wherein the viewpoint is an intersection point of the first transverse plane and a vertical axis of the space rectangular coordinate system;
the viewpoint is taken as the center, and projection lines are emitted outwards in the first transverse plane according to a first set angle step;
when the projection line intersects with a rib region in the sampling image, determining an intersection point of the projection line and the rib region in the sampling image, and determining a projection point corresponding to the projection line according to the intersection point;
and fitting to obtain a thoracic curve corresponding to the sampling image according to all the projection points in the first transverse plane.
The first set angle step length is an included angle between two adjacent projection lines in the first transverse plane.
In some embodiments, determining the viewpoint of the first transverse plane corresponding to the sampled image includes:
determining a center of gravity of the chest of the subject from rib areas in the plurality of tomographic images;
establishing the space rectangular coordinate system by taking the center of gravity as a coordinate origin, taking the direction of a rotation axis of the object as a vertical axis, taking the horizontal direction on a plane parallel to a transverse plane as a horizontal axis and taking the vertical direction on a plane parallel to the transverse plane as a vertical axis;
and determining an intersection point of a first transverse plane corresponding to the sampling image and a vertical axis of the space rectangular coordinate system as a viewpoint of the first transverse plane.
In the present embodiment, the center of gravity of the subject's chest may be determined from the positions of the rib points in the rib regions of the plurality of tomographic images; for example, it may be obtained by averaging the three-dimensional coordinates of all rib points in all rib regions of the tomographic images.
For example, as shown in fig. 2, a spatial rectangular coordinate system is established with the center of gravity O of the subject's chest as the coordinate origin, the direction of the subject's rotation axis (i.e., the vertical direction) as the z axis, the horizontal direction in a plane parallel to the transverse plane as the x axis, and the vertical direction in that plane as the y axis. The intersections of the first transverse planes corresponding to the sampled images with the z axis are the viewpoints of those planes, denoted O1, O2, …, On in fig. 2.
Of course, the origin of coordinates of the space rectangular coordinate system may be determined by other methods, for example, an intersection point of the rotation axis of the subject and a transverse plane corresponding to the underlying tomographic image may be used as the origin of coordinates. Other directions may be selected for the directions of the horizontal axis and the vertical axis, which is not limited in the embodiment of the present application.
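The averaging step described above can be sketched as follows; pixel indices stand in for physical x/y coordinates, and all names are illustrative.

```python
import numpy as np

def chest_center_of_gravity(rib_masks, z_positions):
    """Average the 3-D coordinates of every rib point across slices.

    rib_masks: list of 2-D boolean masks, one per tomographic slice.
    z_positions: z coordinate of each slice. Column index stands in
    for x and row index for y in this sketch.
    """
    pts = []
    for mask, z in zip(rib_masks, z_positions):
        ys, xs = np.nonzero(mask)                      # rib pixel indices
        pts.extend((x, y, z) for x, y in zip(xs, ys))  # 3-D rib points
    return np.mean(np.array(pts, dtype=float), axis=0)
```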
In some embodiments, determining an intersection point of the projection line and a rib region in the sampled image, and determining a projection point corresponding to the projection line according to the intersection point includes:
determining a first intersection of the projection line with an inner edge line of a rib region in the sampled image, and determining a second intersection of the projection line with an outer edge line of a rib region in the sampled image;
and taking the midpoint of the connecting line of the first intersection point and the second intersection point as a projection point corresponding to the projection line.
For example, as shown in fig. 3, projection lines may be emitted outward from the direction parallel to the horizontal axis, centered on the viewpoint O, at the first set angle step, namely the included angle a (0° < a < 360°) between two adjacent projection lines. The first intersection of an outward projection line R1 with the inner edge line of the rib region (the highlighted region in fig. 3) in the sampled image is P1, its second intersection with the outer edge line of the rib region is P2, and the midpoint P of the segment P1P2 is taken as the projection point corresponding to R1.
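The ray-casting step, walking outward from the viewpoint until the projection line enters and then leaves the rib region, could be sketched as below. The step size, maximum radius, and nearest-pixel sampling are implementation assumptions.

```python
import numpy as np

def projection_point(mask, viewpoint, angle_deg, max_r=500, dr=0.5):
    """Walk outward from the viewpoint along one projection line.

    Returns the midpoint between the first rib pixel entered (inner
    edge, P1) and the last contiguous rib pixel (outer edge, P2),
    or None if the line misses the rib region.
    """
    cx, cy = viewpoint
    dx = np.cos(np.deg2rad(angle_deg))
    dy = np.sin(np.deg2rad(angle_deg))
    p1 = p2 = None
    for r in np.arange(0.0, max_r, dr):
        x, y = cx + r * dx, cy + r * dy
        i, j = int(round(y)), int(round(x))            # nearest pixel
        if not (0 <= i < mask.shape[0] and 0 <= j < mask.shape[1]):
            break
        if mask[i, j]:
            if p1 is None:
                p1 = (x, y)        # first intersection: inner edge
            p2 = (x, y)            # keeps updating: outer edge
        elif p1 is not None:
            break                  # left the rib cross-section
    if p1 is None:
        return None
    return ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)
```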
After the projection points in the first transverse plane corresponding to the sampled image are determined, a thoracic curve corresponding to the sampled image can be obtained by fitting a B-spline interpolation method according to all the projection points in the first transverse plane, wherein a thoracic curve C is shown as a dotted line in fig. 3.
For each sampled image, the respective thoracic curve is determined using the method described above, whereby a series of thoracic curves C1-Cn can be obtained, n being an integer greater than 1.
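A hedged sketch of the B-spline fit, using SciPy's parametric spline routines as a stand-in for whatever fitting implementation the application intends; closing the curve by repeating the first point is an implementation choice, and the function name is an assumption.

```python
import numpy as np
from scipy.interpolate import splev, splprep

def fit_thoracic_curve(points, n_samples=360):
    """Fit a closed cubic B-spline through one slice's projection points.

    points: (N, 2) array ordered by projection angle. The first point
    is repeated at the end so the periodic fit (per=True) sees a
    closed polygon; s=0 makes the spline interpolate every point.
    """
    pts = np.vstack([points, points[:1]])              # close the loop
    tck, _ = splprep([pts[:, 0], pts[:, 1]], s=0, per=True)
    u = np.linspace(0.0, 1.0, n_samples, endpoint=False)
    cx, cy = splev(u, tck)                             # resample curve
    return np.column_stack([cx, cy])
```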
S103, determining a correspondence between projection points on the thoracic curve and projection angles, wherein the projection angle is the angle between the perpendicular projection of a projection line onto a transverse plane and the transverse (x) axis of the spatial rectangular coordinate system, the transverse plane being parallel to the coordinate plane spanned by the transverse and longitudinal axes;
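Under the coordinate conventions above, mapping a projection point back to its projection angle reduces to a single atan2; this is an illustrative sketch, and the viewpoint default is an assumption.

```python
import numpy as np

def projection_angle(point, viewpoint=(0.0, 0.0)):
    """Projection angle in degrees, in [0, 360), of the line from the
    viewpoint through `point`, measured against the positive x axis
    in the transverse plane."""
    dx = point[0] - viewpoint[0]
    dy = point[1] - viewpoint[1]
    return float(np.degrees(np.arctan2(dy, dx)) % 360.0)
```

Storing this angle alongside each projection point gives the point-to-angle correspondence used in S105 to look up the column for any target angle.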
s104, determining a target projection angle;
In this embodiment, the default target projection angle may be the projection angle corresponding to the spine position. To reduce missed diagnoses of bone abnormalities such as microfractures and bone cracks, a user may project the image of a position to be diagnosed, where such abnormalities may exist, into the central area of the two-dimensional chest unfolded image; in that case the projection angle corresponding to the position to be diagnosed is set as the target projection angle.
In some embodiments, determining the target projection angle in step S104 includes:
receiving an image unfolding instruction aiming at any position to be diagnosed, wherein the image unfolding instruction comprises a projection angle corresponding to the position to be diagnosed, and determining the projection angle corresponding to the position to be diagnosed as a target projection angle.
In this embodiment, the image unfolding instruction may be generated from a projection angle entered by the user in an angle-adjustment control. The control may be designed as an angle input box or as an angle-adjustment scroll bar and may be placed anywhere on the image display page; alternatively, a user interface (UI) may pop up after a preset operation on the position to be diagnosed is received, with the angle-adjustment control displayed on that UI.
S105, according to the corresponding relation between the projection points and the projection angles on the thoracic curve, projecting the three-dimensional thoracic image to a two-dimensional plane by adopting a cylindrical projection unfolding algorithm to obtain a two-dimensional thoracic unfolding image, wherein the two-dimensional thoracic unfolding image takes a pixel column corresponding to each projection point in the direction of the target projection angle as a central line.
In some embodiments, in step S105, according to the correspondence between the projection points and the projection angles on the thoracic curve, a cylindrical projection expansion algorithm is used to project the three-dimensional thoracic image onto a two-dimensional plane, so as to obtain a two-dimensional thoracic expansion image, which includes:
for the thoracic curve corresponding to each sampling image, searching a first projection point corresponding to a first projection angle which is 180 degrees different from the target projection angle according to the corresponding relation between the projection point on the thoracic curve and the projection angle;
starting from the first projection point, selecting sampling points along the thoracic curve according to a second set angle step length;
and starting from the first projection point, projecting sampling points on the same thoracic curve to the same row of pixels on the two-dimensional plane, projecting sampling points on different thoracic curves corresponding to the same projection angle to the same column of pixels on the two-dimensional plane, and obtaining the two-dimensional thoracic unfolding image through interpolation.
For example, let the target projection angle be α (0° ≤ α < 360°) and the second set angle step be β (0° ≤ β < 360°), the included angle between two adjacent projection lines in the first transverse plane corresponding to a sampled image; the smaller β is, the more projection points there are on the thoracic curve. The first projection point on thoracic curve C1 corresponding to the first projection angle α+180° is Q1, the first projection point on C2 is Q2, …, and the first projection point on Cn is Qn. For each thoracic curve, starting from its first projection point, sampling points are selected along the curve at the angle step β. Then, starting from the first projection points, sampling points on the same thoracic curve are projected onto the same row of pixels on the two-dimensional plane; for example, as shown in fig. 4, the sampling points on curve C1 are projected onto the same row of pixels c1 and those on curve C2 onto row c2. Sampling points on different thoracic curves corresponding to the same projection angle are projected onto the same column of pixels; for example, as shown in fig. 4, the sampling points corresponding to α+180° are projected onto column N1 and those corresponding to α+180°+β onto column N2. Three-dimensional coordinates of the remaining points in the spatial rectangular coordinate system are then computed by interpolation, the image information corresponding to those coordinates is looked up, and finally a two-dimensional chest unfolded image is obtained whose center line is the pixel column corresponding to the projection points in the direction of the target projection angle α.
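The column-ordering rule of S105, starting the unfolded grid at α + 180° so that the target angle α lands on the central pixel column, can be sketched with a toy per-curve intensity lookup; the dict-based curve representation and function name are illustrative assumptions.

```python
import numpy as np

def unfold(curves_by_angle, target_angle, beta):
    """Arrange sampled curve points into a 2-D grid whose central
    column corresponds to `target_angle`.

    curves_by_angle: one dict per thoracic curve, mapping a projection
    angle (a multiple of `beta`, in degrees) to an intensity value.
    Columns start at target_angle + 180 deg and advance by beta, so
    the target angle falls exactly in the middle column.
    """
    n_cols = int(round(360.0 / beta))
    start = (target_angle + 180.0) % 360.0       # first column's angle
    grid = np.zeros((len(curves_by_angle), n_cols))
    for row, curve in enumerate(curves_by_angle):  # one row per curve
        for col in range(n_cols):
            ang = (start + col * beta) % 360.0
            grid[row, col] = curve[ang]
    return grid
```

A real implementation would interpolate intensities from the three-dimensional image instead of reading a dict, but the row/column layout is the same.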
When the target projection angle is the default value, i.e. the projection angle corresponding to the position of the spine, the resulting two-dimensional chest expansion image is shown in fig. 5.
According to the technical solution provided by the embodiments of the present application, the three-dimensional chest image is projected onto a two-dimensional plane using a cylindrical projection unfolding algorithm. The resulting two-dimensional chest expansion image can simultaneously display the morphology of related tissues such as the spine, ribs and fat, and the chest can be unfolded from any projection angle by adjusting the target projection angle.
Based on the same inventive concept, referring to fig. 6, an embodiment of the present application further provides a chest image processing device, the device comprising: an image acquisition and recognition module 11, a thoracic curve determination module 12, a correspondence determination module 13, a projection angle determination module 14, and an image expansion module 15.
An image acquisition and recognition module 11 configured to acquire a three-dimensional chest image obtained by scanning a subject, the three-dimensional chest image including a plurality of tomographic images, and recognize a rib region in which a rib is located in the plurality of tomographic images;
a thoracic curve determining module 12 configured to select a preset number of images from the plurality of tomographic images to form a sampling image set, and for any one of the sampling images, determine a thoracic curve corresponding to the sampling image according to a rib region in the sampling image;
a correspondence determining module 13, configured to determine a correspondence between a projection point on the thoracic curve and a projection angle, where the projection angle is an angle between a perpendicular projection of a projection line on a transverse plane and a coordinate axis on a coordinate plane including a transverse axis and a longitudinal axis of a rectangular spatial coordinate system, and the transverse plane is parallel to the coordinate plane including the transverse axis and the longitudinal axis;
a projection angle determination module 14 configured to determine a target projection angle;
the image unfolding module 15 is configured to project, according to the correspondence between projection points on the thoracic curve and projection angles, the three-dimensional chest image onto a two-dimensional plane using a cylindrical projection unfolding algorithm, so as to obtain a two-dimensional chest expansion image whose center line is the pixel column corresponding to the projection point in the direction of the target projection angle.
In one possible implementation, as shown in fig. 7, the thoracic curve determination module 12 includes:
a viewpoint determining sub-module 121 configured to determine a viewpoint of a first transverse plane corresponding to the sampled image, the viewpoint being an intersection point of the first transverse plane and a vertical axis of the space rectangular coordinate system;
a projection line emitting sub-module 122 configured to emit projection lines outwardly in the first transverse plane at a first set angle step with the viewpoint as a center;
a projection point determining submodule 123 configured to determine an intersection point of the projection line and a rib region in the sampled image when the projection line intersects the rib region in the sampled image, and determine a projection point corresponding to the projection line according to the intersection point;
and a thoracic curve acquisition sub-module 124, configured to fit a thoracic curve corresponding to the sampled image according to all the projection points in the first transverse plane.
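A minimal Python sketch of sub-modules 121–123, under assumptions: the rib region is given as a binary mask for one slice, projection lines are cast from the viewpoint at a fixed angle step, and the projection point is taken as the midpoint between the first (inner-edge) and last (outer-edge) mask hits along each ray. The curve fitting of sub-module 124 (e.g. smoothing radius as a function of angle) is omitted, and all names are illustrative, not from the patent.

```python
import numpy as np

def projection_points(mask, viewpoint, step_deg=1.0, n_samples=400):
    """Cast projection lines outward from the viewpoint and return, per
    projection angle, the radius of the projection point: the midpoint
    between the first and last rib-mask hits, approximating the
    intersections with the inner and outer edge lines of the rib region."""
    h, w = mask.shape
    pts = {}
    r_max = float(np.hypot(h, w))
    for theta in np.arange(0.0, 360.0, step_deg):
        d = np.array([np.cos(np.radians(theta)), np.sin(np.radians(theta))])
        hits = []
        for r in np.linspace(0.0, r_max, n_samples):
            y, x = np.round(viewpoint + r * d).astype(int)
            if 0 <= y < h and 0 <= x < w and mask[y, x]:
                hits.append(r)
        if hits:  # first hit ~ inner edge, last hit ~ outer edge
            pts[theta] = 0.5 * (hits[0] + hits[-1])
    return pts

# Toy slice: a circular ring standing in for the rib region.
yy, xx = np.mgrid[0:64, 0:64]
rr = np.hypot(yy - 32, xx - 32)
ring = (rr >= 10) & (rr <= 14)
pts = projection_points(ring, np.array([32.0, 32.0]), step_deg=90.0)
```

For the ring, every projection point radius falls near the ring's mid-radius of 12, as expected for a midpoint between inner and outer edges.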
In one possible implementation, the view determination submodule 121 is configured to:
determining a center of gravity of the chest of the subject from rib areas in the plurality of tomographic images;
establishing the space rectangular coordinate system by taking the center of gravity as the coordinate origin, taking the direction of the rotation axis of the subject as the vertical axis, taking the horizontal direction in a plane parallel to the transverse plane as the transverse axis, and taking the vertical direction in that plane as the longitudinal axis;
and determining an intersection point of a first transverse plane corresponding to the sampling image and a vertical axis of the space rectangular coordinate system as a viewpoint of the first transverse plane.
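The origin construction above can be sketched as follows. This is a hedged sketch: rib regions are given as per-slice binary masks, the centre of gravity is taken as the unweighted mean position of all rib pixels across slices, and the helper names are hypothetical.

```python
import numpy as np

def chest_center_of_gravity(rib_masks, z_positions):
    """Centre of gravity of the rib regions over all tomographic slices.

    rib_masks:   list of 2-D boolean arrays, one per slice.
    z_positions: position of each slice along the rotation axis.
    Returns (x, y, z); this point serves as the coordinate origin of the
    space rectangular coordinate system (illustrative sketch only).
    """
    xs, ys, zs, n = 0.0, 0.0, 0.0, 0
    for mask, z in zip(rib_masks, z_positions):
        yy, xx = np.nonzero(mask)           # rib pixel coordinates
        xs += xx.sum()
        ys += yy.sum()
        zs += z * len(xx)
        n += len(xx)
    return xs / n, ys / n, zs / n

# Two toy slices with a single rib "pixel" each.
m1 = np.zeros((4, 4), dtype=bool); m1[1, 2] = True
m2 = np.zeros((4, 4), dtype=bool); m2[3, 0] = True
cog = chest_center_of_gravity([m1, m2], z_positions=[0.0, 2.0])
```

The viewpoint of a given first transverse plane is then simply (x, y, z_slice), i.e. the intersection of that plane with the vertical axis through the centre of gravity.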
In one possible implementation, the proxel determination submodule 123 is configured to:
determining a first intersection of the projection line with an inner edge line of a rib region in the sampled image, and determining a second intersection of the projection line with an outer edge line of a rib region in the sampled image;
and taking the midpoint of the connecting line of the first intersection point and the second intersection point as a projection point corresponding to the projection line.
In one possible implementation, the projection angle determination module 14 is configured to:
receiving an image unfolding instruction for any position to be diagnosed, where the image unfolding instruction includes the projection angle corresponding to the position to be diagnosed, and determining that projection angle as the target projection angle.
In one possible implementation, as shown in fig. 8, the image unfolding module 15 includes:
the searching sub-module 151 is configured to search, for a thoracic curve corresponding to each of the sampled images, a first projection point corresponding to a first projection angle 180 degrees different from the target projection angle according to a correspondence between projection points on the thoracic curve and projection angles;
a sampling sub-module 152 configured to select sampling points along the thoracic curve at a second set angular step from the first projection point;
the image unfolding sub-module 153 is configured to, from the first projection point, project sampling points on the same thoracic curve into the same row of pixels on the two-dimensional plane, project sampling points on different thoracic curves corresponding to the same projection angle into the same column of pixels on the two-dimensional plane, and obtain the two-dimensional thoracic unfolding image through interpolation.
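Because only a preset number of slices are sampled, the pixel rows of the unfolded image that fall between sampled thoracic curves must be filled in. The patent states only that the remaining points are obtained "through interpolation", so the column-wise linear interpolation below is one plausible reading, with illustrative names.

```python
import numpy as np

def fill_rows(sparse_rows, row_positions, n_rows):
    """Interpolate the unsampled pixel rows of the unfolded image.

    sparse_rows:   2-D array with one row per sampled thoracic curve.
    row_positions: row index of each sampled curve in the final image
                   (must be increasing, as np.interp requires).
    n_rows:        total number of rows in the final unfolded image.
    Fills intermediate rows by linear interpolation, column by column.
    """
    out = np.empty((n_rows, sparse_rows.shape[1]))
    all_rows = np.arange(n_rows)
    for j in range(sparse_rows.shape[1]):
        out[:, j] = np.interp(all_rows, row_positions, sparse_rows[:, j])
    return out

# Two sampled rows (indices 0 and 4) expanded to a five-row image.
dense = fill_rows(np.array([[0.0, 10.0], [4.0, 30.0]]),
                  row_positions=[0, 4], n_rows=5)
```

The middle row (index 2) comes out halfway between the two sampled rows, which is the behaviour one would want when slices are evenly spaced.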
For the implementation of the functions and roles of each unit in the above device, reference is made to the implementation of the corresponding steps in the above method, which will not be repeated here.
Since the device embodiments substantially correspond to the method embodiments, reference may be made to the description of the method embodiments for the relevant points. The device embodiments described above are merely illustrative: the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purposes of the solution of the present application. Those of ordinary skill in the art can understand and implement it without creative effort.
Based on the same inventive concept, the embodiments of the present application also provide a computer-readable storage medium, on which a computer program is stored, which program, when being executed by a processor, implements the steps of the chest image processing method in any of the possible implementations described above.
Alternatively, the storage medium may be a non-transitory computer-readable storage medium, for example, a read-only memory (ROM), a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Based on the same inventive concept, referring to fig. 9, an embodiment of the present application further provides a medical device comprising a memory 61 (e.g. a non-volatile memory), a processor 62, and a computer program stored on the memory 61 and executable on the processor 62, the processor 62 implementing the steps of the chest image processing method in any of the possible implementations described above when executing the program. The medical device may be, for example, a PC belonging to a CT imaging system or a magnetic resonance imaging system.
As shown in fig. 9, the computer device may generally further include: memory 63, network interface 64, and internal bus 65. In addition to these components, other hardware may be included, which is not described in detail.
It should be noted that the chest image processing device may be implemented in software as a logical device, formed by the processor 62 of the computer device in which it is located reading the computer program instructions stored in the non-volatile memory into the memory 63 and running them.
Embodiments of the subject matter and the functional operations described in this specification can be implemented in: digital electronic circuitry, tangibly embodied computer software or firmware, computer hardware including the structures disclosed in this specification and structural equivalents thereof, or a combination of one or more of them. Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions encoded on a tangible, non-transitory program carrier for execution by, or to control the operation of, data processing apparatus. Alternatively or additionally, the program instructions may be encoded on a manually-generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode and transmit information to suitable receiver apparatus for execution by data processing apparatus. The computer storage medium may be a machine-readable storage device, a machine-readable storage substrate, a random or serial access memory device, or a combination of one or more of them.
The processes and logic flows described in this specification can be performed by one or more programmable computers executing one or more computer programs to perform corresponding functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
Computers suitable for executing computer programs include, for example, general purpose and/or special purpose microprocessors, or any other type of central processing unit. Typically, the central processing unit will receive instructions and data from a read only memory and/or a random access memory. The essential elements of a computer include a central processing unit for carrying out or executing instructions and one or more memory devices for storing instructions and data. Typically, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks, etc. However, a computer does not have to have such a device. Furthermore, the computer may be embedded in another device, such as a mobile phone, a Personal Digital Assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device such as a Universal Serial Bus (USB) flash drive, to name a few.
Computer readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices including, for example, semiconductor memory devices (e.g., EPROM, EEPROM, and flash memory devices), magnetic disks (e.g., internal hard disk or removable disks), magneto-optical disks, and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any application or of what may be claimed, but rather as descriptions of features of specific embodiments of particular applications. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. On the other hand, the various features described in the individual embodiments may also be implemented separately in the various embodiments or in any suitable subcombination. Furthermore, although features may be acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In some cases, multitasking and parallel processing may be advantageous. Moreover, the separation of various system modules and components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
Thus, particular embodiments of the subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. Furthermore, the processes depicted in the accompanying drawings are not necessarily required to be in the particular order shown, or sequential order, to achieve desirable results. In some implementations, multitasking and parallel processing may be advantageous.
The foregoing description of the preferred embodiments of the application is not intended to be limiting, but rather to enable any modification, equivalent replacement, improvement or the like to be made within the spirit and principles of the application.

Claims (10)

1. A chest image processing method, the method comprising:
acquiring a three-dimensional chest image obtained by scanning a subject, wherein the three-dimensional chest image comprises a plurality of tomographic images, and identifying a rib region in which a rib is positioned in the tomographic images;
selecting a preset number of images from the plurality of tomographic images to form a sampling image set, and determining a chest curve corresponding to any one of the sampling images according to a rib region in the sampling image;
determining a correspondence between a projection point on the thoracic curve and a projection angle, wherein the projection angle is an included angle between a perpendicular projection of a projection line on a transverse plane and a coordinate axis on a coordinate plane containing a transverse axis and a longitudinal axis of a space rectangular coordinate system, and the transverse plane is parallel to the coordinate plane containing the transverse axis and the longitudinal axis;
receiving an image unfolding instruction aiming at any position to be diagnosed, wherein the image unfolding instruction comprises a projection angle corresponding to the position to be diagnosed, and determining the projection angle corresponding to the position to be diagnosed as a target projection angle;
according to the corresponding relation between the projection points and the projection angles on the thoracic curve, projecting the three-dimensional thoracic image to a two-dimensional plane by adopting a cylindrical projection unfolding algorithm to obtain a two-dimensional thoracic unfolding image, wherein the two-dimensional thoracic unfolding image takes a pixel column corresponding to each projection point in the direction of the target projection angle as a central line;
wherein the projecting, according to the correspondence between projection points on the thoracic curve and projection angles, the three-dimensional chest image onto a two-dimensional plane by using a cylindrical projection unfolding algorithm to obtain a two-dimensional chest expansion image comprises:
for the thoracic curve corresponding to each sampling image, searching a first projection point corresponding to a first projection angle which is 180 degrees different from the target projection angle according to the corresponding relation between the projection point on the thoracic curve and the projection angle;
starting from the first projection point, selecting sampling points along the thoracic curve according to a second set angle step length;
and starting from the first projection point, projecting sampling points on the same thoracic curve to the same row of pixels on the two-dimensional plane, projecting sampling points on different thoracic curves corresponding to the same projection angle to the same column of pixels on the two-dimensional plane, and obtaining the two-dimensional thoracic unfolding image through interpolation.
2. The method of claim 1, wherein the determining a corresponding thoracic curve of the sampled image from a rib region in the sampled image comprises:
determining a viewpoint of a first transverse plane corresponding to the sampling image, wherein the viewpoint is an intersection point of the first transverse plane and a vertical axis of the space rectangular coordinate system;
the viewpoint is taken as the center, and projection lines are emitted outwards in the first transverse plane according to a first set angle step;
when the projection line intersects with a rib region in the sampling image, determining an intersection point of the projection line and the rib region in the sampling image, and determining a projection point corresponding to the projection line according to the intersection point;
and fitting to obtain a thoracic curve corresponding to the sampling image according to all the projection points in the first transverse plane.
3. The method of claim 2, wherein determining the viewpoint of the first transverse plane corresponding to the sampled image comprises:
determining a center of gravity of the chest of the subject from rib areas in the plurality of tomographic images;
establishing the space rectangular coordinate system by taking the center of gravity as the coordinate origin, taking the direction of the rotation axis of the subject as the vertical axis, taking the horizontal direction in a plane parallel to the transverse plane as the transverse axis, and taking the vertical direction in that plane as the longitudinal axis;
and determining an intersection point of a first transverse plane corresponding to the sampling image and a vertical axis of the space rectangular coordinate system as a viewpoint of the first transverse plane.
4. The method according to claim 2, wherein determining an intersection point of the projection line and a rib region in the sampled image, and determining a projection point corresponding to the projection line according to the intersection point, comprises:
determining a first intersection of the projection line with an inner edge line of a rib region in the sampled image, and determining a second intersection of the projection line with an outer edge line of a rib region in the sampled image;
and taking the midpoint of the connecting line of the first intersection point and the second intersection point as a projection point corresponding to the projection line.
5. A chest image processing device, the device comprising:
an image acquisition and identification module configured to acquire a three-dimensional chest image obtained by scanning a subject, the three-dimensional chest image including a plurality of tomographic images, and identify a rib region in which a rib is located in the plurality of tomographic images;
the thoracic curve determining module is configured to select a preset number of images from the plurality of tomographic images to form a sampling image set, and for any sampling image in the sampling image set, determining a thoracic curve corresponding to the sampling image according to a rib region in the sampling image;
the corresponding relation determining module is configured to determine a corresponding relation between a projection point on the thoracic curve and a projection angle, wherein the projection angle is an included angle between a perpendicular projection of a projection line on a transverse plane and a coordinate axis on a coordinate plane containing a transverse axis and a longitudinal axis of a space rectangular coordinate system, and the transverse plane is parallel to the coordinate plane containing the transverse axis and the longitudinal axis;
the projection angle determining module is configured to receive an image unfolding instruction aiming at any position to be diagnosed, wherein the image unfolding instruction comprises a projection angle corresponding to the position to be diagnosed, and the projection angle corresponding to the position to be diagnosed is determined to be a target projection angle;
the image unfolding module is configured to adopt a cylindrical projection unfolding algorithm to project the three-dimensional chest image to a two-dimensional plane according to the corresponding relation between projection points and projection angles on the chest curve to obtain a two-dimensional chest unfolding image, wherein the two-dimensional chest unfolding image takes a pixel column corresponding to each projection point in the direction of the target projection angle as a central line;
the image unfolding module comprises:
the searching sub-module is configured to search a first projection point corresponding to a first projection angle which is 180 degrees different from the target projection angle according to the corresponding relation between the projection point on the thoracic curve and the projection angle for the thoracic curve corresponding to each sampling image;
the sampling submodule is configured to select sampling points along the thoracic curve according to a second set angle step length from the first projection point;
the image unfolding sub-module is configured to start from the first projection point, project sampling points on the same thoracic curve into the same row of pixels on the two-dimensional plane, project sampling points on different thoracic curves corresponding to the same projection angle into the same column of pixels on the two-dimensional plane, and obtain the two-dimensional thoracic unfolding image through interpolation.
6. The apparatus of claim 5, wherein the thoracic curve determination module comprises:
a viewpoint determining submodule configured to determine a viewpoint of a first transverse plane corresponding to the sampling image, the viewpoint being an intersection point of the first transverse plane and a vertical axis of the space rectangular coordinate system;
a projection line transmitting sub-module configured to transmit projection lines outwardly in a first set angle step in the first transverse plane with the viewpoint as a center;
a projection point determining submodule configured to determine an intersection point of the projection line and a rib region in the sampled image when the projection line intersects the rib region in the sampled image, and determine a projection point corresponding to the projection line according to the intersection point;
and the thoracic curve acquisition submodule is configured to fit and obtain a thoracic curve corresponding to the sampling image according to all the projection points in the first transverse plane.
7. The apparatus of claim 6, wherein the viewpoint determination submodule is configured to:
determining a center of gravity of the chest of the subject from rib areas in the plurality of tomographic images;
establishing the space rectangular coordinate system by taking the center of gravity as the coordinate origin, taking the direction of the rotation axis of the subject as the vertical axis, taking the horizontal direction in a plane parallel to the transverse plane as the transverse axis, and taking the vertical direction in that plane as the longitudinal axis;
and determining an intersection point of a first transverse plane corresponding to the sampling image and a vertical axis of the space rectangular coordinate system as a viewpoint of the first transverse plane.
8. The apparatus of claim 6, wherein the proxel determination submodule is configured to:
determining a first intersection of the projection line with an inner edge line of a rib region in the sampled image, and determining a second intersection of the projection line with an outer edge line of a rib region in the sampled image;
and taking the midpoint of the connecting line of the first intersection point and the second intersection point as a projection point corresponding to the projection line.
9. A storage medium having stored thereon a computer program, which when executed by a processor performs the steps of the method according to any of claims 1-4.
10. A medical device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the method of any of claims 1-4 when the program is executed.
CN201911423166.3A 2019-12-31 2019-12-31 Chest image processing method, chest image processing device, storage medium and medical equipment Active CN111127461B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911423166.3A CN111127461B (en) 2019-12-31 2019-12-31 Chest image processing method, chest image processing device, storage medium and medical equipment

Publications (2)

Publication Number Publication Date
CN111127461A CN111127461A (en) 2020-05-08
CN111127461B true CN111127461B (en) 2023-09-26

Family

ID=70507891

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911423166.3A Active CN111127461B (en) 2019-12-31 2019-12-31 Chest image processing method, chest image processing device, storage medium and medical equipment

Country Status (1)

Country Link
CN (1) CN111127461B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111710028B (en) * 2020-05-27 2023-06-30 北京东软医疗设备有限公司 Three-dimensional contrast image generation method and device, storage medium and electronic equipment
CN112767415A (en) * 2021-01-13 2021-05-07 深圳瀚维智能医疗科技有限公司 Chest scanning area automatic determination method, device, equipment and storage medium
CN113643176B (en) * 2021-07-28 2024-05-28 东软医疗***股份有限公司 Rib display method and device
CN114240740B (en) * 2021-12-16 2022-08-16 数坤(北京)网络科技股份有限公司 Bone expansion image acquisition method and device, medical equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107194909A (en) * 2016-03-14 2017-09-22 东芝医疗***株式会社 Medical image-processing apparatus and medical imaging processing routine
CN109830289A (en) * 2019-01-18 2019-05-31 上海皓桦科技股份有限公司 Bone images display device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017214447B4 (en) * 2017-08-18 2021-05-12 Siemens Healthcare Gmbh Planar visualization of anatomical structures


Also Published As

Publication number Publication date
CN111127461A (en) 2020-05-08

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant