KR20170096506A - Imaging device for generating two-dimensional breast image - Google Patents

Imaging device for generating two-dimensional breast image

Info

Publication number
KR20170096506A
KR20170096506A (Application No. KR1020160017972A)
Authority
KR
South Korea
Prior art keywords
dimensional image
image
generating
dimensional
breast
Prior art date
Application number
KR1020160017972A
Other languages
Korean (ko)
Inventor
채승훈
정지욱
이수열
Original Assignee
한국전자통신연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자통신연구원 filed Critical 한국전자통신연구원
Priority to KR1020160017972A priority Critical patent/KR20170096506A/en
Publication of KR20170096506A publication Critical patent/KR20170096506A/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/50 Apparatus or devices for radiation diagnosis specially adapted for specific body parts; specially adapted for specific clinical applications
    • A61B6/502 Apparatus or devices for radiation diagnosis specially adapted for diagnosis of breast, i.e. mammography
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211 Devices using data or image processing involving processing of medical diagnostic data
    • A61B6/5223 Devices using data or image processing involving processing of medical diagnostic data generating planar views from image data, e.g. extracting a coronal view from a 3D image
    • A61B6/5229 Devices using data or image processing involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B6/5235 Devices using data or image processing combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
    • A61B6/5241 Devices using data or image processing combining overlapping images of the same imaging modality, e.g. by stitching
    • A61B6/5258 Devices using data or image processing involving detection or reduction of artifacts or noise

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The present invention relates to an imaging device for generating a two-dimensional breast image. The imaging device comprises: a three-dimensional (3D) image synthesis unit that combines a plurality of two-dimensional (2D) breast images to generate a 3D breast image; a 2D image generation unit that generates a first 2D image and a second 2D image based on the 3D image; and a 2D image analysis unit that increases the contrast of the first 2D image to a reference value or more, extracts boundary lines between a plurality of regions included in the second 2D image, and combines the contrast-enhanced first 2D image with the boundary-line information extracted from the second 2D image to generate a third 2D image. The 2D image generation unit includes: a first generation unit that divides the 3D image into a first plurality of regions, performs a first arithmetic operation on the brightness values of each region to calculate first brightness values, and generates the first 2D image based on the first brightness values calculated from the first plurality of regions; and a second generation unit that divides the 3D image into a second plurality of regions, performs a second arithmetic operation on the brightness values of each region to calculate second brightness values, and generates the second 2D image based on the second brightness values calculated from the second plurality of regions.

Description

TECHNICAL FIELD [0001] The present invention relates to an imaging device for generating a two-dimensional breast image.

The present invention relates to an imaging apparatus for generating a two-dimensional breast image.

Breast cancer can be diagnosed using techniques such as ultrasound imaging and magnetic resonance imaging (MRI). X-ray mammography may also be used. X-ray mammography can capture breast images in less time than ultrasound and magnetic resonance imaging, and shows excellent performance in detecting microcalcifications of the breast. However, mammography photographs the breast in a single direction and produces a two-dimensional breast image.

Since a two-dimensional image is taken in a single direction, normal breast tissues such as the mammary gland and masses may be photographed without being accurately distinguished from each other. Therefore, the accuracy of mass detection in mammograms can be reduced.

Digital Breast Tomosynthesis (DBT) can be used to obtain accurate images of normal breast tissue and masses. DBT is capable of imaging the breast in many directions, and requires a lower X-ray dose than breast ultrasound and magnetic resonance imaging. In addition, when DBT is used, a three-dimensional breast image can be synthesized from the breast images taken in various directions.

An object of the present invention is to provide an imaging device for generating a two-dimensional image from a three-dimensional image.

An imaging apparatus according to an embodiment of the present invention includes a three-dimensional image synthesizing unit for synthesizing a plurality of breast images, each of which is a two-dimensional image, into a three-dimensional image; a two-dimensional image generating unit for generating a first two-dimensional image and a second two-dimensional image based on the three-dimensional image; and a two-dimensional image analyzing unit for increasing a contrast of the first two-dimensional image to a reference value or more, extracting boundary lines between a plurality of regions included in the second two-dimensional image, and synthesizing the first two-dimensional image having the increased contrast with information of the boundary lines extracted from the second two-dimensional image to generate a third two-dimensional image. The two-dimensional image generating unit includes a first generating unit for dividing the three-dimensional image into a first plurality of regions, performing a first arithmetic operation on the brightness values of each of the first plurality of regions to calculate first brightness values, and generating the first two-dimensional image based on the first plurality of brightness values respectively calculated from the first plurality of regions; and a second generating unit for dividing the three-dimensional image into a second plurality of regions, performing a second arithmetic operation on the brightness values of each of the second plurality of regions to calculate second brightness values, and generating the second two-dimensional image based on the second plurality of brightness values respectively calculated from the second plurality of regions.

According to the embodiment of the present invention, the imaging apparatus can provide a two-dimensional image that improves the accuracy of breast image analysis.

FIG. 1 is a block diagram illustrating a mammography system.
FIG. 2 is a block diagram showing an imaging apparatus according to an embodiment of the present invention.
FIG. 3 is a block diagram illustrating a method of generating a two-dimensional image using an average value.
FIG. 4 is a block diagram illustrating a method of generating a two-dimensional image using a maximum value.
FIG. 5 is a flowchart illustrating a method of generating a two-dimensional image according to an exemplary embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. The present invention is capable of various modifications and various forms, and specific embodiments are illustrated in the drawings and described in detail in the text. It should be understood, however, that the invention is not intended to be limited to the particular forms disclosed, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

Like reference numerals are used for like elements in describing each drawing. In the accompanying drawings, the sizes of components are exaggerated for clarity of illustration.

The terms first, second, and so on may be used to describe various elements, but the elements should not be limited by these terms. The terms are used only for the purpose of distinguishing one component from another. For example, without departing from the scope of the present invention, a first component may be referred to as a second component, and similarly, a second component may also be referred to as a first component. Singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, the terms "comprises", "having", and the like are used to specify the presence of features, numbers, steps, operations, elements, components, or combinations thereof described in the specification, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof. Also, where a section such as a layer, a film, an area, or a plate is referred to as being "on" another section, this includes not only the case where it is directly on the other section but also the case where there is another section in between. Conversely, where a section such as a layer, a film, an area, or a plate is referred to as being "under" another section, this includes not only the case where it is directly underneath but also the case where there is another section in between.

FIG. 1 is a block diagram illustrating a mammography system. The mammography system 10 may include an imaging device 100 for imaging the breast and a display device 200 for providing a two-dimensional image and a three-dimensional image of the photographed breast.

The imaging device 100 may be a Digital Breast Tomosynthesis (DBT) device. The imaging apparatus 100 can photograph the breast using an X-ray tube that rotates over a limited angle. Rather than taking a single shot in one direction (illustratively, the vertical direction of the breast) as in mammography, the imaging device 100 of the present invention can photograph the breast at multiple angles. Illustratively, the imaging device 100 can photograph the breast while rotating in steps of about 3 degrees over a range of about ±21 degrees.
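As a rough worked example of this limited-angle geometry (the 3-degree step and ±21-degree sweep come from the illustration above, but the exact acquisition protocol is not fixed by the disclosure), such a sweep yields on the order of fifteen projection images:

```python
import numpy as np

# Illustrative acquisition angles for a +/-21 degree sweep in ~3 degree steps.
step_deg = 3.0
half_range_deg = 21.0
angles = np.arange(-half_range_deg, half_range_deg + step_deg, step_deg)

print(angles)       # [-21. -18. ... 18. 21.]
print(len(angles))  # 15 low-dose projection images in this example
```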

The imaging apparatus 100 can photograph the breast based on the attenuation characteristic of X-rays. Specifically, X-rays pass easily through low-density material and are attenuated by high-density material. Thus, in an X-ray image, a region composed of high-density material has high brightness, and a region composed of low-density material has low brightness. Illustratively, in breast images taken with X-rays, the high-density mammary gland and mass regions have high brightness, and the low-density fat regions have low brightness.
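The attenuation behavior described above is commonly modeled by the Beer-Lambert law; the equation below is a standard formulation added here for reference rather than taken from the original text, where I_0 is the incident intensity, \mu the linear attenuation coefficient (larger for denser tissue), and d the thickness traversed:

```latex
% Beer-Lambert attenuation of a monochromatic X-ray beam
I = I_0 \, e^{-\mu d}
```

Denser regions attenuate more of the beam, which is consistent with the mammary gland and mass regions appearing brighter than fat in the displayed image.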

The imaging apparatus 100 according to the embodiment of the present invention can photograph the breast several times at a low X-ray dose and can generate a plurality of images as a result of the photographing. Each of the plurality of images may include a two-dimensional image of the breast. The imaging apparatus 100 can synthesize the plurality of images to generate a three-dimensional breast image. The imaging device 100 can also extract a two-dimensional breast image from the three-dimensional breast image, using the average value and the maximum value of the brightness of the three-dimensional breast image. The method by which the imaging apparatus 100 extracts the two-dimensional image will be described in detail with reference to FIGS. 3 and 4. The imaging apparatus 100 can transmit the generated three-dimensional image and two-dimensional image to the display device 200. The display device 200 may provide the three-dimensional image and the two-dimensional image to a user of the mammography system 10.

Since the three-dimensional image provides breast images taken in various directions, the analysis accuracy of the breast images can be improved. However, each of the plurality of images photographed for generating the three-dimensional image has low image quality because it is photographed with a small X-ray dose. Therefore, the plurality of images may have lower resolution than two-dimensional images taken by mammography and may be difficult to use as data for analysis. To compensate for this, the imaging apparatus 100 according to the embodiment of the present invention can convert the three-dimensional image into a two-dimensional image having clear image quality. The two-dimensional image can be used as reading data for detecting microcalcifications of the breast.

The imaging device 100 and the display device 200 may each be implemented in one computer. Illustratively, the imaging device 100 and the display device 200 may share the same processor in one computer. Alternatively, the imaging device 100 and the display device 200 may be implemented on different processors in one computer.

The imaging device 100 and the display device 200 may be implemented by one or more computers. Illustratively, the imaging device 100 and the display device 200 may each include at least one of a personal computer, a desktop, a laptop, a tablet computer, and a mobile device. The display device 200 may also be a wearable device such as a smart watch or a smart ring.

The imaging device 100 and the display device 200 may include a processor. In particular, a processor may be included in one or more computers. The one or more computers may include storage, and the storage may store software that includes instruction code for operating the one or more computers. The processor can execute the software, and as the software runs, the functions of the imaging device 100 and the display device 200 described below can operate.

FIG. 2 is a block diagram showing an imaging apparatus according to an embodiment of the present invention. Referring to FIGS. 1 and 2, the imaging apparatus 100 may include a three-dimensional image synthesizing unit 110, a two-dimensional image generating unit 120, and a two-dimensional image analyzing unit 130. The three-dimensional image synthesizing unit 110, the two-dimensional image generating unit 120, and the two-dimensional image analyzing unit 130 may be implemented in hardware, in software, or in a combination of the two.

In hardware form, each of the three-dimensional image synthesizing unit 110, the two-dimensional image generating unit 120, and the two-dimensional image analyzing unit 130 may include one or more digital and/or analog circuits for performing the operations described later. In software form, each of the three-dimensional image synthesizing unit 110, the two-dimensional image generating unit 120, and the two-dimensional image analyzing unit 130 may include one or more instruction codes configured to perform the operations described later. The instruction code may be compiled or interpreted into an instruction set and processed by one or more processors.

Referring to FIGS. 1 and 2, the three-dimensional image synthesizing unit 110 can receive the plurality of images described above. The three-dimensional image synthesizing unit 110 may synthesize the plurality of images to generate a three-dimensional breast image and output the generated three-dimensional image. The three-dimensional image can be provided to the user of the mammography system 10 via the display device 200.

The user can receive and read the three-dimensional image. The three-dimensional image may provide image information that clearly distinguishes the normal tissues and masses of the breast, but it may also include complex image information about the tissues contained in the breast. Therefore, reading the three-dimensional image may take a long time. If a two-dimensional image is provided to the user together with the three-dimensional image, the reading time can be shortened.

For this reason, in order to generate a two-dimensional image, the two-dimensional image generating unit 120 may receive the three-dimensional image from the three-dimensional image synthesizing unit 110. The two-dimensional image generating unit 120 may include a first generating unit 121 and a second generating unit 122 for generating two-dimensional images from the three-dimensional image.

The first generating unit 121 will be described with reference to FIGS. 2 and 3. FIG. 3 is a block diagram illustrating a method of generating a two-dimensional image using an average value. Since the three-dimensional image has a three-dimensional shape, it can be described with respect to a first direction (x), a second direction (y), and a third direction (z). The first direction (x) may be the longitudinal direction of the three-dimensional image, the second direction (y) may be the lateral direction of the three-dimensional image, and the third direction (z) may be the height direction of the three-dimensional image.

In order to generate a two-dimensional image, the three-dimensional image can be divided into a plurality of regions. Each of the plurality of regions may include a plurality of brightness values. Illustratively, a plurality of brightness values may be present in a first region that is a portion of the three-dimensional image. The plurality of brightness values may be brightness values of pixels of a plurality of images, respectively. The first generating unit 121 can calculate an average brightness value of a plurality of brightness values of the first region. The first generating unit 121 may calculate the average brightness values of the remaining areas in the same or similar manner as the average brightness value of the first area. Looking down along the third direction z, the set of average brightness values can form one two-dimensional image. The first generating unit 121 may generate the first two-dimensional image based on the average brightness values.
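A minimal sketch of this average-value projection, assuming the reconstructed three-dimensional image is held as a NumPy array with the third direction (z) as the first axis; the array name, shape, and use of Python/NumPy are illustrative assumptions, not part of the original disclosure:

```python
import numpy as np

# volume: reconstructed 3D breast image with shape (depth_z, height_y, width_x);
# placeholder data stands in for the synthesized tomosynthesis volume.
volume = np.random.rand(40, 512, 512).astype(np.float32)

# First 2D image: average brightness value of each column of voxels along z.
first_2d = volume.mean(axis=0)
print(first_2d.shape)  # (512, 512)
```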

The first two-dimensional image may include a plurality of regions. Illustratively, each of the plurality of regions may be at least one of a mammary gland region, a fat region, and a lesion region (e.g., a mass region). The plurality of regions may be separated based on boundary lines. A boundary line may serve as a criterion for distinguishing regions that have different brightness values on either side of it.

The first two-dimensional image may contain relatively little noise. However, since the first two-dimensional image is generated based on average brightness values, the brightness contrast between the plurality of regions may be weak. Therefore, the boundary lines between the plurality of regions can be blurred. Since the first two-dimensional image has blurred boundary lines, the boundaries between the mammary gland region, the fat region, and the lesion region may be ambiguous. To compensate for this, the second generating unit 122 can generate a two-dimensional image based on maximum brightness values.

The second generating unit 122 will be described with reference to FIGS. 2 and 4. FIG. 4 is a block diagram illustrating a method of generating a two-dimensional image using a maximum value. Referring to FIGS. 3 and 4, the three-dimensional image of FIG. 4 may be the same as or similar to the three-dimensional image of FIG. 3; a detailed description is therefore omitted below.

Referring to FIGS. 2 and 4, the second generating unit 122 may calculate a maximum brightness value among the plurality of brightness values of the first region. The second generating unit 122 may sort the plurality of brightness values of the first region according to brightness. Illustratively, the second generating unit 122 may sort the plurality of brightness values of the first region in order from the highest brightness. The second generating unit 122 may then select the maximum brightness among the sorted brightness values; the maximum brightness value is the brightness value corresponding to the highest brightness. The second generating unit 122 may calculate the maximum brightness values of the remaining regions in the same or a similar manner as the maximum brightness value of the first region. Looking down along the third direction z, the set of maximum brightness values can form one two-dimensional image. In this manner, the second generating unit 122 can generate the second two-dimensional image based on the plurality of maximum brightness values.
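A companion sketch for the maximum-value projection, mirroring the sorting step described above even though taking the maximum directly gives the same result; as before, the array names and the use of NumPy are illustrative assumptions:

```python
import numpy as np

# volume as in the previous sketch: shape (depth_z, height_y, width_x).
volume = np.random.rand(40, 512, 512).astype(np.float32)

# Sort brightness values along z from highest to lowest, then take the top value.
sorted_desc = np.sort(volume, axis=0)[::-1]
second_2d = sorted_desc[0]

# Equivalent direct computation of the maximum-value image.
assert np.allclose(second_2d, volume.max(axis=0))
```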

Since the second two-dimensional image is generated based on a plurality of maximum brightness values, the brightness contrast among the plurality of regions may be strong. Accordingly, the second two-dimensional image may include boundary lines that clearly distinguish the plurality of regions. The second generating unit 122 may thus provide a second two-dimensional image having sharp boundary lines between the mammary gland region, the fat region, and the lesion region. However, the second two-dimensional image generated by the second generating unit 122 may include a large amount of noise. For this reason, the two-dimensional image analyzing unit 130 may synthesize the first and second two-dimensional images to generate a two-dimensional image that compensates for the limitations of each.

The two-dimensional image analyzing unit 130 may include an image quality improving unit 131, a boundary extracting unit 132, and an image combining unit 133. The image quality improving unit 131 may receive the first two-dimensional image from the two-dimensional image generating unit 120. The image quality improving unit 131 can improve the image quality of the first two-dimensional image by increasing its contrast. Contrast is the difference in visual characteristics that makes one object distinguishable from another object or from the background in an image; it can be determined by differences in color or brightness within the image. Thus, as the contrast of the first two-dimensional image increases, the bright and dark areas in the first two-dimensional image become clearly contrasted. The two-dimensional image analyzing unit 130 may improve the image quality of the first two-dimensional image by increasing its contrast above a reference value. The two-dimensional image analyzing unit 130 may output the first two-dimensional image having the improved image quality to the image combining unit 133.
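The disclosure does not name a specific contrast-enhancement operator or reference value, so the sketch below uses a simple percentile-based linear contrast stretch purely as one plausible illustration; the function name and parameters are hypothetical:

```python
import numpy as np

# Placeholder for the averaged (first) 2D image from the earlier sketch.
first_2d = np.random.rand(512, 512).astype(np.float32)

def stretch_contrast(img, low_pct=2.0, high_pct=98.0):
    """Linearly rescale intensities between two percentiles into [0, 1]."""
    lo, hi = np.percentile(img, (low_pct, high_pct))
    return np.clip((img - lo) / max(hi - lo, 1e-8), 0.0, 1.0)

enhanced_first_2d = stretch_contrast(first_2d)
```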

The boundary extracting unit 132 may receive the second two-dimensional image from the two-dimensional image generating unit 120. The boundary extracting unit 132 may extract the boundary lines of the second two-dimensional image and correct them so that they are clearly defined. The boundary extracting unit 132 may extract the boundary lines of the second two-dimensional image using at least one of a Sobel algorithm, a Prewitt algorithm, a Roberts algorithm, and a Canny algorithm.
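A brief sketch of boundary extraction using two of the named operators, here via scikit-image implementations; the choice of library and the sigma parameter are assumptions made for illustration:

```python
import numpy as np
from skimage import feature, filters

# Placeholder for the maximum-value (second) 2D image.
second_2d = np.random.rand(512, 512).astype(np.float32)

# Gradient-magnitude edge strength using the Sobel operator.
sobel_edges = filters.sobel(second_2d)

# Binary edge map using the Canny detector; sigma controls pre-smoothing.
canny_edges = feature.canny(second_2d, sigma=2.0)
```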

The boundary extracting unit 132 may remove the noise included in the boundary lines in order to correct the extracted boundary lines. In some embodiments, the boundary extracting unit 132 may use a wavelet technique to remove noise from the boundary lines. Through the wavelet technique, the boundary extracting unit 132 can remove the noise of the boundary lines and extract sharp boundary lines. The boundary extracting unit 132 may output the information of the boundary lines extracted from the second two-dimensional image to the image combining unit 133.
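The specific wavelet procedure is not described, so the sketch below shows one conventional approach, soft-thresholding the detail coefficients of a 2D wavelet decomposition with PyWavelets; the wavelet family, decomposition level, and threshold are illustrative choices:

```python
import numpy as np
import pywt

# Placeholder for the extracted boundary-line image.
edge_map = np.random.rand(512, 512).astype(np.float32)

# Two-level 2D wavelet decomposition, soft-threshold the detail coefficients,
# then reconstruct the denoised boundary-line image.
coeffs = pywt.wavedec2(edge_map, wavelet="db4", level=2)
threshold = 0.1
denoised_coeffs = [coeffs[0]] + [
    tuple(pywt.threshold(detail, threshold, mode="soft") for detail in level)
    for level in coeffs[1:]
]
denoised_edges = pywt.waverec2(denoised_coeffs, wavelet="db4")
```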

The image combining unit 133 may receive the first two-dimensional image and the boundary-line information from the image quality improving unit 131 and the boundary extracting unit 132, respectively. The image combining unit 133 may combine the first two-dimensional image with the boundary-line information. The image combining unit 133 may project the boundary-line information onto the first two-dimensional image, which has little noise. The image combining unit 133 may also adjust the ratio at which the first two-dimensional image and the boundary-line information are combined.

In some embodiments, by combining the first two-dimensional image at a higher ratio than the boundary-line information, the image combining unit 133 can generate a two-dimensional image with little noise. Conversely, by combining the boundary-line information at a higher ratio than the first two-dimensional image, it can generate a two-dimensional image having clear boundary lines. The image combining unit 133 may output a two-dimensional image obtained by combining the first two-dimensional image and the second two-dimensional image.
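The patent states only that the combining ratio is adjustable, so the linear blend below is an assumed formulation for illustration; the weight value and variable names are hypothetical:

```python
import numpy as np

# Placeholders for the contrast-enhanced image and the denoised boundary-line information.
enhanced_first_2d = np.random.rand(512, 512).astype(np.float32)
denoised_edges = np.random.rand(512, 512).astype(np.float32)

def combine(base_img, edge_img, edge_weight=0.3):
    """Blend the low-noise base image with the boundary-line information.

    A larger edge_weight favors sharp boundary lines; a smaller one favors
    the low-noise base image.
    """
    return (1.0 - edge_weight) * base_img + edge_weight * edge_img

third_2d = combine(enhanced_first_2d, denoised_edges, edge_weight=0.3)
```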

The imaging apparatus 100 according to the embodiment of the present invention can generate a two-dimensional image of the breast without performing mammography. The imaging apparatus 100 provides a two-dimensional image, so that the accuracy of breast mass detection can be increased.

FIG. 5 is a flowchart illustrating a method of generating a two-dimensional image according to an exemplary embodiment of the present invention. Referring to FIGS. 1 to 5, the imaging apparatus 100 photographs the breast several times and synthesizes the plurality of images generated by the photographing into a three-dimensional image (S110). The imaging apparatus 100 generates a first two-dimensional image using the average brightness values of the three-dimensional image (S120). Specifically, the imaging apparatus 100 divides the three-dimensional image into a plurality of regions and calculates the average brightness value of each region. Then, the imaging apparatus 100 may generate the first two-dimensional image based on the calculated average brightness values.

The imaging apparatus 100 generates a second two-dimensional image using the maximum brightness values of the three-dimensional image (S130). Specifically, the imaging apparatus 100 divides the three-dimensional image into a plurality of regions and calculates the maximum brightness value of each region. The imaging apparatus 100 may generate the second two-dimensional image based on the calculated maximum brightness values.

The imaging apparatus 100 improves the image quality of the first two-dimensional image (S140). Then, the imaging apparatus 100 corrects the boundary lines of the second two-dimensional image (S150). The way the imaging apparatus 100 improves image quality and corrects the boundary lines has been described with reference to FIG. 2, so a detailed description is omitted here. The imaging apparatus 100 synthesizes the first and second two-dimensional images (S160).
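For orientation only, the sketch below strings steps S110 to S160 together in one hypothetical function; the simple stacking used for S110 is a placeholder, since an actual tomosynthesis reconstruction is far more involved, and all names and parameters are assumptions:

```python
import numpy as np
from skimage import filters

def generate_two_dimensional_image(projections):
    """Illustrative end-to-end sketch of steps S110 to S160."""
    volume = np.stack(projections, axis=0)       # S110: placeholder "reconstruction"
    first_2d = volume.mean(axis=0)               # S120: average-value image
    second_2d = volume.max(axis=0)               # S130: maximum-value image

    lo, hi = np.percentile(first_2d, (2, 98))    # S140: simple contrast enhancement
    enhanced = np.clip((first_2d - lo) / max(hi - lo, 1e-8), 0.0, 1.0)

    edges = filters.sobel(second_2d)             # S150: boundary-line extraction
    return 0.7 * enhanced + 0.3 * edges          # S160: weighted synthesis

projections = [np.random.rand(256, 256).astype(np.float32) for _ in range(15)]
result = generate_two_dimensional_image(projections)
```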

The three-dimensional image and the two-dimensional image of the breast generated in this manner can be used to read breast masses. A three-dimensional image can provide image information that clearly distinguishes the normal tissues and masses of the breast, but it may take a long time to read. Therefore, if the two-dimensional image is provided to the user together with the three-dimensional image, the reading time can be shortened.

As described above, an optimal embodiment has been disclosed in the drawings and specification. Although specific terms have been employed herein, they are used for purposes of illustration only and are not intended to limit the scope of the invention as defined in the claims. Therefore, those skilled in the art will appreciate that various modifications and equivalent embodiments are possible without departing from the scope of the present invention. Accordingly, the true scope of the present invention should be determined by the technical idea of the appended claims.

100: imaging device
200: display device
110: three-dimensional image synthesis unit
120: two-dimensional image generation unit
121: first generating unit
122: second generating unit
130: two-dimensional image analysis unit
131: image quality improvement unit
132: boundary extracting unit
133: image combining unit

Claims (1)

A three-dimensional image synthesizer for synthesizing a plurality of breast images, each of which is a two-dimensional image, into a three-dimensional image;
A two-dimensional image generation unit for generating a first two-dimensional image and a second two-dimensional image based on the three-dimensional image; And
a two-dimensional image analyzing unit for increasing a contrast of the first two-dimensional image to a reference value or more, extracting boundary lines between a plurality of regions included in the second two-dimensional image, and synthesizing the first two-dimensional image having the increased contrast with information of the boundary lines extracted from the second two-dimensional image to generate a third two-dimensional image,
wherein the two-dimensional image generating unit comprises:
a first generating unit for dividing the three-dimensional image into a first plurality of regions, performing a first arithmetic operation on the brightness values of each of the first plurality of regions to calculate first brightness values, and generating the first two-dimensional image based on the first plurality of brightness values respectively calculated from the first plurality of regions; and
a second generating unit for dividing the three-dimensional image into a second plurality of regions, performing a second arithmetic operation on the brightness values of each of the second plurality of regions to calculate second brightness values, and generating the second two-dimensional image based on the second plurality of brightness values respectively calculated from the second plurality of regions.
KR1020160017972A 2016-02-16 2016-02-16 Imaging device for generating two-dimensional breast image KR20170096506A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160017972A KR20170096506A (en) 2016-02-16 2016-02-16 Imaging device for generating two-dimensional breast image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160017972A KR20170096506A (en) 2016-02-16 2016-02-16 Imaging device for generating two-dimensional breast image

Publications (1)

Publication Number Publication Date
KR20170096506A true KR20170096506A (en) 2017-08-24

Family

ID=59758427

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160017972A KR20170096506A (en) 2016-02-16 2016-02-16 Imaging device for generating two-dimensional breast image

Country Status (1)

Country Link
KR (1) KR20170096506A (en)

Similar Documents

Publication Publication Date Title
US11983799B2 (en) System and method for synthesizing low-dimensional image data from high-dimensional image data using an object grid enhancement
JP6457946B2 (en) System and method for improving workflow efficiency in tomosynthesis medical image data interpretation
US10448911B2 (en) Method and device for displaying medical images
US11551361B2 (en) Method and system of computer-aided detection using multiple images from different views of a region of interest to improve detection accuracy
US9098935B2 (en) Image displaying apparatus, image displaying method, and computer readable medium for displaying an image of a mammary gland structure without overlaps thereof
JP4851298B2 (en) Radiation tomographic image generator
US9865067B2 (en) Method of reconstruction of an object from projection views
US9569864B2 (en) Method and apparatus for projection image generation from tomographic images
JP2016510669A (en) System and method for navigating a tomosynthesis stack including automatic focusing
JP5467958B2 (en) Radiation image processing apparatus and method, and program
JP7084291B2 (en) Tomosynthesis photography support equipment, methods and programs
JP2012035068A (en) Radiation image processor, method, and program
JP6502509B2 (en) Image processing apparatus, radiographic imaging system, image processing method, and image processing program
JP5804340B2 (en) Radiation image region extraction device, radiation image region extraction program, radiation imaging device, and radiation image region extraction method
US11961165B2 (en) Tomographic image generating apparatus, tomographic image generating method, and tomographic image generating program
JP3758894B2 (en) Mammogram image diagnosis support device
CN107564021A (en) Detection method, device and the digital mammographic system of highly attenuating tissue
KR102527017B1 (en) Apparatus and method for generating 2d medical image based on plate interpolation
KR20170095012A (en) Analyzer for detecting nipple location
KR20170096506A (en) Imaging device for generating two-dimensional breast image
JP6584231B2 (en) Image processing apparatus, image processing system, image processing method, and program
JP2008073076A (en) Mammographic image processor
KR101494975B1 (en) Nipple automatic detection system and the method in 3D automated breast ultrasound images
WO2008041228A2 (en) Robust segmentation of a mass candidate in digital mammography images
JPH10108859A (en) Detecting method and device for abnormal shadow candidate