WO2017006595A1 - Ultrasonic diagnosing device and ultrasonic image processing method - Google Patents

Ultrasonic diagnosing device and ultrasonic image processing method

Info

Publication number
WO2017006595A1
WO2017006595A1 (application PCT/JP2016/059805)
Authority
WO
WIPO (PCT)
Prior art keywords
volume data
partial volume
representative
data
unit
Prior art date
Application number
PCT/JP2016/059805
Other languages
French (fr)
Japanese (ja)
Inventor
西浦 朋史
Original Assignee
株式会社日立製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立製作所
Publication of WO2017006595A1

Links

Images

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13: Tomography
    • A61B 8/14: Echo-tomography

Definitions

  • the present invention relates to an ultrasonic diagnostic apparatus, and more particularly to a technique for generating a three-dimensional image of a site of interest.
  • in the medical field, three-dimensional ultrasonic diagnosis is becoming widespread.
  • for example, in obstetrics, ultrasonic waves are transmitted to and received from a three-dimensional space including the fetus in the mother's body, thereby acquiring volume data.
  • a three-dimensional region of interest (3D-ROI) is set for volume data, and a three-dimensional image of the fetus is generated by rendering the data in the three-dimensional region of interest (see Patent Document 1).
  • An object of the present invention is to generate a three-dimensional ultrasonic image with good visibility of a site of interest in an ultrasonic diagnostic apparatus.
  • the ultrasonic diagnostic apparatus includes a partial volume data generation unit that applies, to each of a plurality of volume data sequentially acquired by transmitting and receiving ultrasonic waves to and from a three-dimensional space in a subject, a process of extracting a portion including a region of interest, thereby generating a plurality of partial volume data,
  • and a synthesis unit that spatially synthesizes the plurality of partial volume data so that the region of interest is aligned among them, thereby generating synthesized partial volume data.
  • according to this configuration, the plurality of partial volume data is spatially synthesized so that the site of interest is aligned among them.
  • the synthesized partial volume data is generated with the position and orientation of the target region aligned among the plurality of partial volume data.
  • even when data is partially missing in one partial volume data, the data-missing portion can be supplemented by the other partial volume data while the target region remains aligned among them.
  • as a result, it is possible to prevent or reduce partial omission of the site-of-interest image in the 3D ultrasound image, and to generate a 3D ultrasound image that appropriately represents the site of interest.
  • the attention site is, for example, the face of the fetus.
  • of course, another site may be the site of interest.
  • the missing data part can be compensated by other partial volume data.
  • the synthesis process may be applied after the partial volume data is smoothed.
  • the synthesizing unit adjusts the positions and orientations of the plurality of partial volume data so that a representative position and a representative orientation of the target site are aligned among them, and then synthesizes the data.
  • the representative position is a position that representatively indicates the position of the target site
  • the representative orientation is an orientation that representatively indicates the direction (orientation) of the target site.
  • the synthesis unit includes a normal vector calculator that calculates a normal vector in units of voxels for each partial volume data, thereby obtaining a normal vector group for each partial volume data;
  • a representative position calculator that calculates the representative position based on the normal vector group for each partial volume data;
  • and a representative azimuth calculator that calculates the representative azimuth based on the normal vector group for each partial volume data.
  • the direction of each voxel can be obtained by calculating the normal vector in units of voxels.
  • the normal vector group is a bundle of normal vectors calculated for individual voxels, and the representative position and the representative direction are calculated based on the normal vector group.
  • each vector included in the normal vector group may be virtually extended, and the representative position may be obtained based on the intersection group where the extension lines intersect.
  • the barycentric position, average position, etc. of the intersection group may be used as the representative position.
  • the normal vector group may be averaged, and the averaged vector orientation may be used as the representative orientation.
  • the representative position calculating unit calculates the representative position by fitting an approximate figure to the partial volume data, and the representative azimuth calculating unit calculates the representative azimuth by fitting the approximate figure to the partial volume data.
  • the approximate figure is a figure that approximates the region of interest. For example, the approximate figure is fitted to the surface of the attention site or the vicinity of the surface.
  • the approximate figure is a hollow ellipsoid.
  • an ellipsoid is fitted to a site of interest that can be regarded as an ellipsoid, such as the head of a fetus, and the representative position and representative orientation are thereby calculated.
  • the ultrasonic image processing method applies, to each of a plurality of volume data sequentially acquired by transmitting and receiving ultrasonic waves to and from a three-dimensional space in a subject, a process of extracting a portion including a region of interest, thereby generating a plurality of partial volume data; spatially synthesizes the plurality of partial volume data so that the region of interest is aligned among them, thereby generating synthesized partial volume data; and applies a rendering process to the synthesized partial volume data, thereby generating a three-dimensional ultrasonic image.
  • FIG. 1 is a block diagram showing an ultrasonic diagnostic apparatus according to an embodiment of the present invention.
  • FIG. 2 is a block diagram showing the configuration of the 3D memory and the three-dimensional image generation unit.
  • FIG. 3 is a schematic diagram showing a plurality of volume data acquired along the time axis.
  • FIG. 4 is a schematic diagram showing an example of a cross-sectional image.
  • FIG. 5 is a schematic diagram showing an example of volume data.
  • FIG. 6 is a schematic diagram showing an example of a cross-sectional image.
  • FIG. 7 is a flowchart showing an example of the processing by the adjustment information calculation unit.
  • FIG. 8 is a schematic diagram showing an example of partial volume data.
  • FIG. 9 is a schematic diagram showing an example of partial volume data.
  • FIG. 10 is a schematic diagram showing an example of partial volume data.
  • A further schematic diagram showing an example of partial volume data.
  • FIG. 1 shows an embodiment of an ultrasonic diagnostic apparatus according to the present invention.
  • FIG. 1 is a block diagram showing the overall configuration.
  • This ultrasonic diagnostic apparatus is used in the medical field and has a function of generating a three-dimensional image of a tissue in a living body by transmitting and receiving ultrasonic waves.
  • the tissue to be imaged is a fetus. Of course, other tissues may be imaged.
  • Probe 10 is a transducer that transmits and receives ultrasonic waves.
  • the probe 10 has a 2D array transducer.
  • the 2D array transducer is formed by arranging a plurality of transducer elements two-dimensionally.
  • An ultrasonic beam is formed by the 2D array transducer, and the ultrasonic beam is scanned two-dimensionally.
  • a three-dimensional space 12 is formed as a three-dimensional echo data capturing space.
  • the probe 10 may incorporate a 1D array transducer and a scanning mechanism that mechanically scans the 1D array transducer.
  • the scanning surface may be formed by electronic scanning of the ultrasonic beam by the 1D array transducer, and the scanning surface may be mechanically scanned.
  • the three-dimensional space 12 is also formed by such a method.
  • as electronic scanning methods, electronic sector scanning, electronic linear scanning, and the like are known.
  • the probe 10 is brought into contact with the surface of the abdomen of the mother, and ultrasonic waves are transmitted and received in this state.
  • the transmission / reception unit 14 functions as a transmission beamformer and a reception beamformer. At the time of transmission, the transmission / reception unit 14 supplies a plurality of transmission signals having a fixed delay relationship to the plurality of transducer elements of the probe 10, whereby an ultrasonic transmission beam is formed. At the time of reception, a reflected wave from the living body is received by the probe 10, and a plurality of reception signals are output from the probe 10 to the transmission / reception unit 14.
  • the transmission / reception unit 14 performs phasing addition processing on a plurality of reception signals, and thereby outputs beam data as reception signals after phasing addition. It should be noted that techniques such as transmission aperture synthesis may be used in ultrasonic transmission / reception.
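  • as an illustration of the phasing addition (delay-and-sum) just described, the following is a minimal sketch. It assumes digitized per-element signals and precomputed focusing delays; the names (rx, delays_s, fs) are illustrative and not part of the apparatus description.

```python
import numpy as np

def delay_and_sum(rx, delays_s, fs):
    """Phasing addition (delay-and-sum) over element signals.

    rx       : (n_elements, n_samples) received RF signals
    delays_s : (n_elements,) focusing delays in seconds
    fs       : sampling frequency in Hz
    Returns one beamformed RF line (n_samples,).
    """
    n_el, n_smp = rx.shape
    out = np.zeros(n_smp)
    for e in range(n_el):
        shift = int(round(delays_s[e] * fs))  # delay in samples (sign convention assumed)
        shifted = np.roll(rx[e], -shift)      # align this element's signal
        if shift > 0:
            shifted[-shift:] = 0.0            # zero the wrapped-around samples
        elif shift < 0:
            shifted[:-shift] = 0.0
        out += shifted                        # coherent summation across elements
    return out
```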
  • Signal processing such as detection, logarithmic compression, coordinate conversion, and the like is applied to the beam data by the signal processing unit 16.
  • the beam data after the signal processing is stored in the 3D memory 18.
  • beam data that is not subjected to such processing may be stored in the 3D memory 18.
  • the above processing may be performed when reading the beam data.
  • the 3D memory 18 has a storage space corresponding to a three-dimensional space as a transmission / reception space.
  • the 3D memory 18 stores volume data as an echo data aggregate acquired from the three-dimensional space 12.
  • the volume data is actually data constituted by coordinate conversion and interpolation processing for a plurality of beam data.
  • data that is not subjected to such processing may be stored in the 3D memory 18. The above processing may be performed when reading data.
  • the three-dimensional image generation unit 20 reads the volume data from the 3D memory 18 and executes rendering processing on the partial volume data in the three-dimensional region of interest (3D-ROI) according to the rendering conditions given from the control unit 30. Thereby, a three-dimensional image is generated.
  • the image data is output to the display processing unit 26.
  • Various methods are known as rendering processing, and various methods can be employed. For example, an image processing method such as a volume rendering method is applied.
  • the 3D image generation unit 20 extracts partial volume data from each of a plurality of volume data sequentially acquired along the time axis.
  • the three-dimensional image generation unit 20 spatially synthesizes the plurality of partial volume data so that the target portions are aligned between them, thereby generating synthesized partial volume data.
  • a rendering process is applied to the combined partial volume data, thereby generating a three-dimensional image.
  • the cross-sectional image generation unit 22 has a function of generating a two-dimensional cross-sectional image (B-mode tomographic image).
  • the cross-sectional image generation unit 22 has a function of generating a cross-sectional image in a cross section arbitrarily set by the user. Specifically, when the coordinate information or the like of an arbitrary cross section is given from the control unit 30 to the cross section image generation unit 22, the cross section image generation unit 22 reads data corresponding to the arbitrary cross section from the 3D memory 18. The cross-sectional image generation unit 22 generates a two-dimensional cross-sectional image based on the read data. This image data is output to the display processing unit 26.
  • the cross-sectional image generation unit 22 may generate an arbitrary number of cross-sectional images specified by the user.
  • the cross-sectional image generation unit 22 may generate a B-mode tomographic image based on beam data acquired by transmitting and receiving ultrasonic waves with respect to a two-dimensional scanning plane.
  • the graphic image generation unit 24 generates graphic data to be overlaid on the cross-sectional image or the three-dimensional image according to the graphic creation parameters supplied from the control unit 30. For example, the graphic image generation unit 24 generates data such as graphic data representing a cross section of a three-dimensional region of interest and graphic data representing a cut line. The graphic data generated in this way is output to the display processing unit 26.
  • the display processing unit 26 performs overlay processing of necessary graphic data on an image such as a three-dimensional image or a cross-sectional image, thereby generating display image data.
  • the display image data is output to the display unit 28, and one or more images are displayed according to the selected display mode. For example, an image such as a three-dimensional image or a cross-sectional image is displayed as a moving image in real time.
  • the display unit 28 is configured by a display device such as a liquid crystal display.
  • the control unit 30 has a function of controlling each part of the ultrasonic diagnostic apparatus.
  • the control unit 30 includes a region of interest setting unit 32.
  • the region-of-interest setting unit 32 sets a three-dimensional region of interest including a cut surface for volume data.
  • the cut surface is, for example, a deformable surface set for volume data.
  • the cut surface corresponds to the start surface of the rendering process and has a function of separating the imaging target tissue and the non-target tissue.
  • with reference to the cut surface, the tissue on the near side (the projection viewpoint side) corresponds to non-target tissue for imaging, and the tissue on the far side (the side opposite to the projection viewpoint) corresponds to the tissue to be imaged.
  • the cut surface may be designated by a user's manual operation, or may be automatically set.
  • an image representing a representative cross section of the volume data is displayed on the display unit 28, and a box representing a cross section of the three-dimensional region of interest is displayed on the image.
  • the shape of the upper side of the box can be manipulated by the user.
  • the upper side of the box corresponds to a cut line, and a cut surface is formed based on the cut line.
  • the cut line may be a spline curve formed based on at least three points, or may be a line having an arbitrary shape.
  • the region-of-interest setting unit 32 forms a spline curve as a cut line based on these points.
  • the cut line may be formed by another method.
  • the region-of-interest setting unit 32 generates, for example, a plurality of spline curves that pass through the cut line, thereby forming a cut surface.
  • amniotic fluid data between fetal data and uterine wall data in the volume data is detected, and a cut surface is set in the amniotic fluid data.
  • the cut surface may be set by another method.
  • the 3D image generation unit 20 applies rendering processing to partial volume data in the 3D region of interest.
  • the input unit 34 is connected to the control unit 30.
  • the input unit 34 is configured by, for example, an operation panel, and the operation panel is a device having a keyboard, a trackball, and the like.
  • the user can use the input unit 34 to input information such as numerical values necessary for setting the three-dimensional region of interest and coordinates of an arbitrary cross section.
  • the configuration other than the probe 10 can be realized by using hardware resources such as a processor and an electronic circuit, and a device such as a memory can be used as necessary in the realization.
  • the configuration other than the probe 10 may be realized by a computer, for example. That is, all or part of the configuration other than the probe 10 may be realized by cooperation of hardware resources such as a CPU, a memory, and a hard disk included in the computer and software (a program) that defines the operation of the CPU and the like.
  • the program is stored in a storage device (not shown) via a recording medium such as a CD or DVD, or via a communication path such as a network.
  • configurations other than the probe 10 may be realized by a DSP (Digital Signal Processor), an FPGA (Field Programmable Gate Array), or the like.
  • FIG. 2 shows detailed configurations of the 3D memory 18 and the three-dimensional image generation unit 20.
  • the volume data storage unit 36 stores volume data acquired from the three-dimensional space 12. By the transmission / reception of the ultrasonic beam, a plurality of volume data arranged on the time axis are sequentially acquired, and the plurality of volume data are stored in the volume data storage unit 36.
  • the partial volume data extraction unit 38 acquires a plurality of volume data from the volume data storage unit 36, and applies a process of extracting data in the three-dimensional region of interest to each of the plurality of volume data. Generate partial volume data.
  • the three-dimensional region of interest is a region that includes, for example, a target region.
  • the attention site is, for example, a fetus or its face.
  • a plurality of partial volume data including a fetal image or a fetal face image is generated.
  • the partial volume data extraction unit 38 may acquire, from the volume data storage unit 36, a plurality of volume data acquired within a period designated by the user, or a plurality of volume data acquired within an automatically set period.
  • the adjustment information calculation unit 40 has a function of calculating the representative position (representative coordinates) and the representative azimuth of the target region for each partial volume data.
  • the representative position and the representative direction are used as adjustment information.
  • the adjustment information calculation unit 40 calculates a normal vector for each partial volume data in units of voxels included therein, and thereby obtains a normal vector group including a plurality of normal vectors for each partial volume data. It has a function. That is, in each volume data, the normal vector of each voxel is calculated, and thereby a normal vector group is obtained for each partial volume data.
  • the adjustment information calculation unit 40 calculates a representative position and a representative direction based on the normal vector group for each partial volume data.
  • the partial volume data adjustment unit 42 adjusts the position and orientation of the plurality of partial volume data so that the representative position and the representative orientation for the target region are aligned among the plurality of partial volume data. As a result, a plurality of adjustment partial volume data is generated and stored in the adjustment partial volume data storage unit 44.
  • the partial volume data synthesis unit 46 acquires a plurality of adjustment partial volume data from the adjustment partial volume data storage unit 44, and synthesizes the plurality of adjustment partial volume data. As a result, composite partial volume data is generated.
  • the combined partial volume data is stored in the combined partial volume data storage unit 48.
  • the rendering unit 50 acquires the combined partial volume data from the combined partial volume data storage unit 48, and applies the rendering process to the combined partial volume data. Thereby, a three-dimensional image is generated. The data of the three-dimensional image is output to the display processing unit 26.
  • FIG. 3 shows a volume data string generated by transmission / reception of an ultrasonic beam.
  • the volume data string includes a plurality of volume data (volume data 52a, 52b, ..., 52n, ...) arranged on the time axis. Each volume data is stored in the volume data storage unit 36.
  • FIG. 4 shows an example of a cross-sectional image.
  • the cross-sectional image 54 is an image representing a representative cross-section of the volume data, for example, an image on the scanning surface at the center of the three-dimensional space 12.
  • the cross-sectional image 54 may be an image on another surface.
  • the cross-sectional image 54 includes a fetal image 56, a uterine wall image 58, and an amniotic fluid image 60.
  • the cross-sectional image 54 is an image representing an XY cross section, for example.
  • the section image 54 includes a box as a graphic image (not shown). This box represents the XY cross section of the three-dimensional region of interest. The top side of the box is a cut line.
  • the cross-sectional image 54 is displayed on the display unit 28, and while observing it the user can use the input unit 34 to change parameters of the cut line such as its position, rotation angle, curvature, shape, and length. Further, the user can change the length (width) in the X direction and the length (height) in the Y direction of the box, thereby changing the width and height of the three-dimensional region of interest.
  • the region-of-interest setting unit 32 forms a cut surface based on the parameters of the cut line.
  • the region-of-interest setting unit 32 changes the shape, position, size, and the like of the cut surface according to the change.
  • the region-of-interest setting unit 32 sets a three-dimensional region of interest having the changed cut surface.
  • a cut plane and a three-dimensional region of interest may be automatically set.
  • the region-of-interest setting unit 32 detects a tissue boundary in the living body based on the voxel value (luminance value) of each voxel of the volume data. For example, a fetal image, a uterine wall image, and an amniotic fluid image are discriminated, and a boundary between the fetal image and another tissue image is detected.
  • the region-of-interest setting unit 32 automatically sets a cut surface based on the boundary, and automatically sets a three-dimensional region of interest having the cut surface.
  • FIG. 5 shows an example of volume data.
  • a cut surface 64 is set in the volume data 62, and a three-dimensional region of interest 66 having the cut surface is set.
  • the partial volume data extraction unit 38 extracts data (partial volume data) in the three-dimensional region of interest 66 from the volume data 62.
  • the cut surface 64 corresponds to a rendering start surface; with reference to the cut surface 64, the tissue on the projection viewpoint side (the near side) is not imaged, and the tissue on the opposite side (inside the three-dimensional region of interest 66) is imaged. That is, the rendering process is applied to the partial volume data, thereby generating a three-dimensional image representing the tissue in the three-dimensional region of interest 66.
  • FIG. 6 shows an example of a cross-sectional image representing the tissue in the region of interest.
  • the cross-sectional image 68 includes a fetal image 56 as a tissue image in the region of interest.
  • an image is generated with the uterine wall image 58 removed.
  • partial volume data as data in the region of interest is extracted from the volume data.
  • FIG. 7 shows a flowchart for explaining processing by the adjustment information calculation unit 40.
  • a plurality of volume data arranged on the time axis are sequentially acquired by repeating transmission and reception of ultrasonic waves (S01).
  • the plurality of volume data is stored in the volume data storage unit 36.
  • the partial volume data extraction unit 38 applies a process of extracting data in the three-dimensional region of interest to each of the plurality of volume data, thereby generating a plurality of partial volume data (S02). For example, a plurality of partial volume data including a fetal image or a fetal face image is generated.
  • the adjustment information calculation unit 40 calculates a normal vector for each partial volume data in units of voxels included therein, thereby obtaining a normal vector group including a plurality of normal vectors for each partial volume data.
  • This normal vector can be calculated by applying a known technique.
  • the normal vector of a voxel is calculated from the luminance gradient at that voxel.
  • where the luminance value of the voxel at position (x, y, z) is I(x, y, z) and the normal vector is g(x, y, z), the normal vector is defined by equation (1) as the luminance gradient: g(x, y, z) = ∇I(x, y, z) = (∂I/∂x, ∂I/∂y, ∂I/∂z)^t.
  • the adjustment information calculation unit 40 may extract surface data representing the surface of the fetal face from the partial volume data, and calculate the normal vector of each voxel constituting the surface data. For example, surface data is extracted by applying an edge detection process.
  • normal vectors may be calculated for all voxels included in the partial volume data without extracting the surface data. In this case, the normal vector is also calculated for voxels in the fetal face image.
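  • a minimal sketch of this per-voxel normal computation follows, assuming equation (1) is the luminance gradient as stated above. numpy.gradient's central differences stand in for whatever discrete gradient the implementation uses, and the threshold-based surface selection is only an illustrative stand-in for the edge-detection process mentioned above.

```python
import numpy as np

def voxel_normals(volume):
    """Normal vector per voxel as the luminance gradient of equation (1),
    g(x, y, z) = (dI/dx, dI/dy, dI/dz).

    volume : 3-D array of voxel luminance values I(x, y, z)
    Returns an array of shape volume.shape + (3,) holding g.
    """
    gx, gy, gz = np.gradient(volume.astype(float))   # central differences
    return np.stack([gx, gy, gz], axis=-1)

def surface_voxels(volume, normals, grad_thresh):
    """Keep only voxels with a strong gradient: a simple stand-in for the
    edge-detection-based surface extraction described above."""
    mag = np.linalg.norm(normals, axis=-1)
    idx = np.argwhere(mag > grad_thresh)             # (N, 3) voxel coordinates
    return idx, normals[tuple(idx.T)]                # coordinates and their normals
```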
  • the adjustment information calculation unit 40 calculates the representative position of the site of interest (the fetal face) based on the normal vector group for each partial volume data (S04). For example, the adjustment information calculation unit 40 extends the plurality of vectors for each partial volume data and calculates the representative position based on the positions where the extended vectors intersect. When a plurality of intersections is formed, the barycentric position or the average position of the intersections is used as the representative position. The adjustment information calculation unit 40 may also apply clustering processing to the intersections and exclude some intersection groups when calculating the representative position.
  • for example, intersections included in a region where the distribution of intersections is sparse may be excluded, and the barycentric position or the average position of the intersections included in a region where the distribution is dense may be calculated as the representative position.
  • in this way, intersections that are not appropriate for calculating the representative position are excluded, and a more appropriate representative position is calculated.
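  • the following sketch illustrates one way to realize this intersection-based representative position. Extended 3-D lines rarely intersect exactly, so the midpoint of the closest approach of each sampled line pair is used as an "intersection"; the pairwise sampling strategy is an implementation assumption, not taken from the text above.

```python
import numpy as np

def representative_position(points, normals, n_pairs=2000, rng=None):
    """Centroid of near-intersections of the extended normal lines (S04).

    points  : (N, 3) coordinates of surface voxels
    normals : (N, 3) their normal vectors
    """
    rng = np.random.default_rng(rng)
    normals = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    n = len(points)
    i = rng.integers(0, n, n_pairs)
    j = rng.integers(0, n, n_pairs)
    keep = i != j
    i, j = i[keep], j[keep]

    p1, d1, p2, d2 = points[i], normals[i], points[j], normals[j]
    w0 = p1 - p2
    b = np.einsum('ij,ij->i', d1, d2)        # dot products of unit directions
    d = np.einsum('ij,ij->i', d1, w0)
    e = np.einsum('ij,ij->i', d2, w0)
    denom = 1.0 - b * b                      # a = c = 1 for unit directions
    ok = denom > 1e-6                        # drop near-parallel line pairs
    t = (b[ok] * e[ok] - d[ok]) / denom[ok]
    s = (e[ok] - b[ok] * d[ok]) / denom[ok]
    q1 = p1[ok] + t[:, None] * d1[ok]        # closest point on line 1
    q2 = p2[ok] + s[:, None] * d2[ok]        # closest point on line 2
    x = 0.5 * (q1 + q2)                      # midpoint = 'intersection'
    # a clustering step could drop intersections in sparse regions here (S04 option)
    return x.mean(axis=0)                    # centroid as representative position
```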
  • the adjustment information calculation unit 40 fits the partial volume data with a hollow ellipsoid (three-dimensional ellipsoid) having the representative position as the center point for each partial volume data (S05).
  • the ellipsoid is a concept including a sphere. Since the head of the fetus approximates an ellipsoid, an approximate figure as an ellipsoid is fitted to the partial volume data.
  • an ellipsoid can be fitted to the partial volume data by applying a known technique (for example, the least square method). Hereinafter, this fitting process will be described in detail.
  • the three-dimensional ellipsoid is defined by equation (2): (x - x_0)^2/a^2 + (y - y_0)^2/b^2 + (z - z_0)^2/c^2 = 1, where (x_0, y_0, z_0) are the coordinates of the representative position and a, b, and c are unknowns. Note that rotation of the ellipsoid is not considered.
  • the unknowns a, b, and c are obtained using the least-squares method. That is, for the voxel group (x_i, y_i, z_i) to be fitted, the unknowns a, b, and c that minimize the value J defined by equation (3), the sum of squared residuals J = Σ_i [ (x_i - x_0)^2/a^2 + (y_i - y_0)^2/b^2 + (z_i - z_0)^2/c^2 - 1 ]^2, are obtained.
  • the unknowns a, b, and c are obtained by solving equation (5), the normal equations of this least-squares problem.
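  • a sketch of this axis-aligned ellipsoid fit follows. It assumes equation (3) is the sum of squared algebraic residuals as written above, in which case the substitution u = 1/a^2, v = 1/b^2, w = 1/c^2 makes the problem linear, and a single least-squares solve replaces the explicit normal equations of equation (5).

```python
import numpy as np

def fit_ellipsoid(points, center):
    """Least-squares fit of an axis-aligned ellipsoid centred at the
    representative position (equation (2); rotation is not considered).

    With u = 1/a^2, v = 1/b^2, w = 1/c^2, each voxel gives the linear
    equation u*(x-x0)^2 + v*(y-y0)^2 + w*(z-z0)^2 = 1, and minimising
    the squared residuals is exactly the J of equation (3).
    """
    q = (points - center) ** 2               # columns: (x-x0)^2, (y-y0)^2, (z-z0)^2
    rhs = np.ones(len(points))
    (u, v, w), *_ = np.linalg.lstsq(q, rhs, rcond=None)
    if min(u, v, w) <= 0:
        raise ValueError("degenerate fit: points do not bound an ellipsoid")
    return 1 / np.sqrt([u, v, w])            # semi-axes (a, b, c)
```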
  • the adjustment information calculation unit 40 removes data and normal vectors that deviate from the hollow ellipsoid for each partial volume data (S06). For example, the adjustment information calculation unit 40 calculates the distance from each voxel to the surface of the hollow ellipsoid, and removes the voxel whose distance is equal to or greater than the threshold and the normal vector of the voxel.
  • a voxel closer to the hollow ellipsoid is more likely to be a voxel on the face surface, whereas a voxel farther from the hollow ellipsoid is less likely to be.
  • the adjustment information calculation unit 40 calculates a representative vector (representative orientation) based on the remaining normal vector group for each partial volume data (S07). For example, the average vector of the remaining normal vector group is obtained as the representative vector.
  • a weighting process according to the magnitude of the vectors may be applied; for example, larger normal vectors are multiplied by larger weight coefficients, and the average of the weighted normal vectors is obtained as the representative vector. This representative vector can be said to represent the orientation of the face of the fetus.
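  • a sketch of this representative-vector step follows. Note that averaging raw gradient vectors is implicitly magnitude-weighted, which corresponds to the weighting option just described; normalizing first gives the unweighted variant.

```python
import numpy as np

def representative_vector(normals, weighted=True):
    """Representative orientation from the remaining normal vector group (S07).

    normals  : (N, 3) normal vectors of the voxels that survived step S06
    weighted : if True, larger normals contribute more (raw mean of gradient
               vectors); if False, each normal counts equally.
    Returns a unit vector representing the orientation of the fetal face.
    """
    v = normals if weighted else normals / np.linalg.norm(normals, axis=1, keepdims=True)
    m = v.mean(axis=0)
    return m / np.linalg.norm(m)
```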
  • FIGS. 8 to 10, 12, and 13 schematically show examples of partial volume data.
  • the partial volume data is expressed two-dimensionally for convenience of explanation, but the processing by the adjustment information calculation unit 40 is applied to the three-dimensional partial volume data.
  • FIG. 8 shows partial volume data 70 extracted by the partial volume data extraction unit 38.
  • the partial volume data 70 includes a fetal image 56 as a tissue image in the three-dimensional region of interest. By setting the three-dimensional region of interest, partial volume data with the uterine wall image and the like removed is generated.
  • a plurality of normal vectors 72 are calculated by applying the process of step S03 described above. Surface data representing the surface of the fetal face may be extracted from the partial volume data 70 and the normal vector 72 of each voxel constituting the surface data calculated, or normal vectors may be calculated for all voxels included in the partial volume data without extracting the surface data.
  • in the example shown in FIG. 8, the normal vectors of voxels in the vicinity of the surface are illustrated. When the normal vectors of all voxels are calculated, normal vectors are also calculated for voxels inside the fetal image 56.
  • FIG. 9 shows a plurality of extended normal vectors 74 in the partial volume data 70.
  • the representative position of the face image is calculated based on the positions where the plurality of normal vectors 74 intersect. For example, the barycentric position or the average position of the plurality of intersections is calculated as the representative position.
  • in FIG. 9, the intersections concentrate in the area indicated by reference numeral 76, and a position within this area is calculated as the representative position.
  • intersections distributed outside the area indicated by reference numeral 76 may be removed, and the representative position calculated based only on the intersections included in that area. Thereby, intersections that are not appropriate for calculating the representative position are excluded, and a more appropriate representative position of the face image is calculated.
  • FIG. 10 shows a representative position 78 and an ellipsoid 80.
  • the representative position 78 is a position calculated based on the extended normal vectors 74 as described above.
  • the ellipsoid 80 is a three-dimensional ellipsoid having the representative position 78 as a center point in the above step S05, and is an ellipsoid fitted to the partial volume data 70.
  • the fetal head can be regarded as an ellipsoid, and an ellipsoid 80 that fits the image of the face surface is calculated in the fetal image 56.
  • in step S06, data and normal vectors that deviate from the ellipsoid 80 are removed.
  • FIG. 11 shows details of the processing.
  • the adjustment information calculation unit 40 connects the voxel 82 constituting the partial volume data and the representative position 78 with a straight line, and calculates the distance L from the voxel 82 to the surface of the ellipsoid 80 on the straight line.
  • if the distance L is equal to or greater than the threshold value, the voxel 82 and the normal vector 84 obtained for that voxel are removed; otherwise, the voxel 82 and the normal vector 84 are retained.
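  • a sketch of this removal step follows, assuming the axis-aligned semi-axes (a, b, c) from the fit above; the distance L is measured along the straight line joining each voxel to the representative position, as in FIG. 11.

```python
import numpy as np

def prune_by_ellipsoid(points, normals, center, axes, thresh):
    """Remove voxels (and their normals) far from the fitted ellipsoid (S06).

    points, normals : (N, 3) voxel coordinates and their normal vectors
    center, axes    : representative position and semi-axes (a, b, c)
    thresh          : distance threshold L
    """
    d = points - center                               # centre-to-voxel offsets
    q = np.einsum('ij,ij->i', d / axes**2, d)         # sum(d_k^2 / axis_k^2)
    s = 1.0 / np.sqrt(np.maximum(q, 1e-12))           # line hits surface at center + s*d
    L = np.abs(1.0 - s) * np.linalg.norm(d, axis=1)   # distance voxel -> surface
    keep = L < thresh
    return points[keep], normals[keep]
```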
  • FIG. 12 shows the remaining partial volume data 86 after removal. Voxels and normal vectors that deviate from the ellipsoid 80 are removed, and the remaining partial volume data 86 includes remaining data 88 that has not been removed.
  • the remaining data 88 is constituted by voxel group data near the surface of the ellipsoid 80.
  • a representative vector is calculated based on the normal vector group of the voxel group in the residual data 88, that is, the vector group remaining without being removed.
  • FIG. 13 shows a representative vector 92.
  • the representative vector 92 is a vector calculated so as to pass through the representative position 78 based on the normal vector group of the voxel group in the remaining data 88. For example, a vector that is an average vector of the remaining normal vector groups and passes through the representative position 78 is calculated as the representative vector 92. Thereby, the representative azimuth (direction) of the face image is calculated.
  • the representative position and the representative vector (representative orientation) of the face image as the adjustment information are calculated by the adjustment information calculation unit 40.
  • a representative position and a representative vector are calculated for each partial volume data.
  • the remaining partial volume data, information indicating the representative position, and information indicating the representative vector are output to the partial volume data adjustment unit 42.
  • FIG. 14 shows a flowchart for explaining the processing by the partial volume data adjustment unit 42.
  • the partial volume data adjustment unit 42 performs the following calculation for each voxel for each remaining partial volume data.
  • the partial volume data adjustment unit 42 translates the position of the voxel in the remaining partial volume data based on the difference between the reference position and the representative position (S10). That is, the partial volume data adjustment unit 42 translates the position of each voxel in the remaining partial volume data so that the representative position of the remaining partial volume data matches the reference position.
  • the reference position may be a preset assumed position in the three-dimensional space, or it may be the representative position in one of the plurality of remaining partial volume data.
  • the representative position in the latest remaining partial volume data may be used as the reference position.
  • in that case, the partial volume data adjustment unit 42 translates each voxel in each remaining partial volume data so that the representative position in that data matches the representative position in the latest remaining partial volume data.
  • the partial volume data adjustment unit 42 adjusts the voxel direction (normal vector direction) in the remaining partial volume data based on the angular difference between the reference vector (reference azimuth) and the representative vector (representative azimuth). (S11). That is, the partial volume data adjustment unit 42 adjusts the direction of each voxel (direction of each normal vector) in the remaining partial volume data so that the representative vector of the remaining partial volume data matches the reference vector. For example, the direction of each voxel is adjusted by performing rotation correction around each axis constituting the three-dimensional orthogonal coordinate system.
  • the reference vector may be a vector in a three-dimensional space and set in advance, or may be a representative vector in one remaining partial volume data among a plurality of remaining partial volume data.
  • for example, the representative vector in the latest remaining partial volume data is used as the reference vector.
  • in that case, the partial volume data adjustment unit 42 applies rotation correction to the normal vector of each voxel in each remaining partial volume data so that the representative vector in that data matches the representative vector in the latest remaining partial volume data.
  • the parallel movement process and the rotation correction process are applied to each remaining partial volume data, thereby generating a plurality of adjusted partial volume data. Since these processes are applied so that the representative position and the representative vector are aligned among the plurality of remaining partial volume data, the position and orientation of the region of interest (the fetal face) are aligned among them. The plurality of adjusted partial volume data is stored in the adjustment partial volume data storage unit 44. A sketch of these two steps is given below.
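  • the sketch below summarizes steps S10 and S11, assuming the representative position/vector and a reference position/vector are given. Instead of the axis-wise rotations R_x, R_z, R_y of equations (6) to (11) described later, it uses a single Rodrigues rotation that aligns the representative vector with the reference vector; this is an implementation substitution that achieves the same alignment of position and orientation (up to the in-plane angle θ_0 handled separately below).

```python
import numpy as np

def rotation_between(a, b):
    """Rotation matrix sending unit vector a onto unit vector b (Rodrigues)."""
    v = np.cross(a, b)
    c = float(np.dot(a, b))
    K = np.array([[0, -v[2], v[1]],
                  [v[2], 0, -v[0]],
                  [-v[1], v[0], 0]])
    return np.eye(3) + K + K @ K / (1.0 + c)         # undefined for a == -b

def adjust_partial_volume(points, normals, rep_pos, rep_vec, ref_pos, ref_vec):
    """Steps S10/S11: translate so the representative position matches the
    reference position, then rotate so the representative vector matches
    the reference vector (rotation taken about the reference position)."""
    R = rotation_between(rep_vec / np.linalg.norm(rep_vec),
                         ref_vec / np.linalg.norm(ref_vec))
    shifted = points + (ref_pos - rep_pos)           # S10: parallel movement
    rotated = (shifted - ref_pos) @ R.T + ref_pos    # S11: rotation correction
    return rotated, normals @ R.T                    # normals rotate the same way
```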
  • FIG. 15A shows remaining partial volume data 86.
  • a representative position 78 and a representative vector 92 are obtained.
  • a parallel movement process and a rotation correction process are applied to the remaining partial volume data 86, whereby adjusted partial volume data is generated.
  • FIG. 15B shows adjustment partial volume data 96.
  • the position of the voxel 94 is translated so that the representative position 78 matches the reference position.
  • the direction of the voxel 94 (the direction of the normal vector) is adjusted so that the representative vector 92 matches the reference vector.
  • the representative position 98 is a representative position after translation, that is, a position that matches the reference position.
  • the representative vector 100 is a representative vector after translation and rotation correction.
  • the voxel 102 is the position of the voxel 94 after translation and rotation correction, and the voxel value (luminance value) of the voxel 94 is assigned to the voxel 102.
  • the parallel movement process and the rotation correction process are applied to all the voxels included in the remaining data 88.
  • FIG. 15C shows the adjusted partial volume data 96 after the processing.
  • the adjustment partial volume data 96 is composed of a plurality of voxels 102 to which the parallel movement process and the rotation correction process are applied.
  • FIG. 16A shows remaining partial volume data 86.
  • a representative position 78 and a representative vector 92 are obtained.
  • the parallel movement process and the rotation correction process are performed in two steps. First, as a process related to the first step, a parallel movement process is applied to the remaining partial volume data 86, and a rotation correction process around the X axis and the Z axis is further applied.
  • FIG. 16B shows the remaining partial volume data 104 after these processes are applied. As described above, the position of each voxel is translated so that the representative position 78 matches the reference position.
  • the direction of each voxel (the direction of its normal vector) is adjusted about the X axis and the Z axis so that the representative vector 92 coincides with the reference vector.
  • the representative position 106 is a representative position after translation, that is, a position that matches the reference position.
  • the representative vector 108 is a vector after translation, and is a vector after rotation correction around the X axis and the Z axis.
  • a rotation correction process around the Y axis is applied to the remaining partial volume data 104.
  • adjusted partial volume data is generated.
  • FIG. 16C shows the adjusted partial volume data 110 after the processing is applied.
  • the direction of each voxel (the direction of its normal vector) is adjusted about the Y axis so that the representative vector 108 matches the reference vector.
  • the representative vector 112 is a vector after the rotation correction around the Y axis.
  • the adjusted partial volume data is generated from the remaining partial volume data.
  • let the average of the normal vectors of the plurality of voxels (the representative vector) be (n_x, n_y, n_z)^t, and let the coordinates of the representative position be (x_0, y_0, z_0).
  • a coordinate transformation defined by equation (6) is applied to each voxel (x, y, z) in the remaining partial volume data.
  • R_x(θ) is a rotation matrix about the X axis, and R_z(θ) is a rotation matrix about the Z axis.
  • R_x(θ) and R_z(θ) are defined by equation (7).
  • θ_1 is the rotation angle that rotates the representative vector about the Z axis, in the direction from the X axis toward the Y axis, so that the X component of the representative vector becomes 0 (zero); that is, θ_1 makes the representative vector parallel to the YZ plane.
  • this rotation angle θ_1 is defined by equation (8).
  • θ_2 is the rotation angle that rotates the representative vector about the X axis, in the direction from the Z axis toward the Y axis, so that the Z component of the representative vector becomes 0 (zero); that is, θ_2 makes the representative vector parallel to the XY plane.
  • this rotation angle θ_2 is defined by equation (9).
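  • since equations (7) to (9) are not reproduced in this text, the following sketch reconstructs them from the stated construction; the sign conventions are therefore an assumption. θ_1 zeroes the X component of the representative vector (n_x, n_y, n_z)^t by a rotation about Z, and θ_2 then zeroes the Z component by a rotation about X, leaving the vector along the +Y axis.

```python
import numpy as np

def axis_angles(n):
    """Angles reconstructed from the construction stated for equations (8)-(9)."""
    nx, ny, nz = n
    theta1 = np.arctan2(nx, ny)                 # after Rz(theta1): X component -> 0
    theta2 = np.arctan2(nz, np.hypot(nx, ny))   # magnitude of the X-axis turn
    return theta1, theta2

def Rz(t):
    """Standard right-handed rotation about Z (X toward Y for t > 0)."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def Rx(t):
    """Standard right-handed rotation about X (Y toward Z for t > 0)."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def align_to_y(n):
    """Compose both corrections; the result maps n onto (0, |n|, 0).

    The text rotates 'from Z toward Y', which under the standard
    right-handed Rx above corresponds to the angle -theta2.
    """
    t1, t2 = axis_angles(n)
    return Rx(-t2) @ Rz(t1)
```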
  • R_y(θ) is a rotation matrix about the Y axis, defined by equation (11).
  • θ_0 is the rotation angle about the Y axis.
  • the rotation angle θ_0 is calculated by the following method.
  • the process for calculating the rotation angle θ_0 will be described with reference to FIGS. 17A and 17B.
  • FIG. 17A shows the remaining partial volume data 104 after the processing related to the first step (parallel movement processing and rotation correction processing around the X and Z axes) is applied.
  • the voxel group (x_i, y_i, z_i) included in the remaining partial volume data 104 is projected onto the XZ plane.
  • for example, the voxel 114 at position (x, y, z) is projected onto the XZ plane, forming a projection point 116 at position (x, z) on that plane.
  • when K voxels project onto the same position (x, z), the projection process treats K data as overlapping at that position. The projection onto the XZ plane is performed for each voxel.
  • FIG. 17B shows the distribution of the projection points 116 on the XZ plane.
  • the rotation angle θ_0 is obtained from this distribution by equation (15).
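  • equation (15) is not reproduced in this text, so the sketch below uses the standard principal-axis (second-moment) orientation of the projected point distribution as a stand-in for the exact formula; K-fold duplicate positions simply contribute K times to the moments.

```python
import numpy as np

def theta0_from_projection(points):
    """In-plane angle about the Y axis from the XZ projection of the voxels.

    points : (N, 3) voxel coordinates after the first-step corrections
    Returns the principal-axis orientation of the projected point cloud,
    used here as a stand-in for the patent's equation (15).
    """
    x, z = points[:, 0], points[:, 2]            # project onto the XZ plane
    x = x - x.mean()                             # centre the distribution
    z = z - z.mean()
    mu_xx = np.mean(x * x)                       # second-order moments
    mu_zz = np.mean(z * z)
    mu_xz = np.mean(x * z)
    return 0.5 * np.arctan2(2.0 * mu_xz, mu_xx - mu_zz)
```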
  • the parallel movement process and the rotation correction process described above are executed by the partial volume data adjustment unit 42.
  • a parallel movement process and a rotation correction process are applied to each remaining partial volume data, thereby generating a plurality of adjusted partial volume data.
  • the plurality of adjustment partial volume data is stored in the adjustment partial volume data storage unit 44.
  • FIG. 18 shows a flowchart for explaining the synthesis process.
  • the partial volume data composition unit 46 acquires a plurality of adjustment partial volume data from the adjustment partial volume data storage unit 44 (S20), and executes the following calculation for each voxel (i, j, k).
  • the partial volume data composition unit 46 acquires the voxel value (luminance value) of the voxel (i, j, k) from all the acquired adjusted partial volume data (S21), and the voxel of the voxel (i, j, k). An average value is obtained (S22). At this time, voxels having no voxel value are not included in the average value calculation target voxel group. As another example, a voxel whose voxel value is less than the threshold value may not be included in the average value calculation target voxel group.
  • instead of obtaining the average value, the partial volume data synthesis unit 46 may adopt, as the value of the voxel (i, j, k), the highest voxel value (maximum luminance value) among all the adjusted partial volume data.
  • the median of the voxel values may also be adopted as the value of the voxel (i, j, k).
  • alternatively, whether a voxel value exists at the voxel (i, j, k) may be decided by a majority vote across the adjusted partial volume data.
  • steps S21 and S22 are executed for each voxel (i, j, k), thereby generating synthesized partial volume data.
  • the combined partial volume data is stored in the combined partial volume data storage unit 48 (S23).
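  • a sketch of this voxel-wise synthesis (S20 to S23) follows, assuming the adjusted partial volumes have been resampled onto a common grid with missing voxels marked as NaN; the 'mean', 'max', and 'median' modes correspond to the average-value, maximum-luminance, and median options described above.

```python
import numpy as np

def synthesize(volumes, mode="mean", min_value=None):
    """Voxel-wise synthesis of adjusted partial volume data (steps S21-S22).

    volumes   : list of equally shaped 3-D arrays; missing voxels are NaN
    mode      : 'mean', 'max', or 'median' combination per voxel (i, j, k)
    min_value : if given, voxel values below this threshold are also
                excluded from the combination (the optional variant above)
    Voxels lacking a value in one volume are excluded there, so a
    data-missing portion in one volume is filled in by the others.
    """
    stack = np.stack(volumes).astype(float)      # (n_volumes, ni, nj, nk)
    if min_value is not None:
        stack[stack < min_value] = np.nan        # treat sub-threshold voxels as missing
    if mode == "mean":
        out = np.nanmean(stack, axis=0)
    elif mode == "max":
        out = np.nanmax(stack, axis=0)
    else:
        out = np.nanmedian(stack, axis=0)
    return np.nan_to_num(out)                    # voxels missing everywhere -> 0
```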
  • FIG. 19A shows a plurality of adjustment partial volume data (for example, adjustment partial volume data 122a, 122b, 122c, 122d).
  • the representative positions 124a, 124b, 124c, and 124d and the representative vectors 126a, 126b, 126c, and 126d are obtained for each individual adjusted partial volume data. Further, by the parallel movement process and the rotation correction process performed by the partial volume data adjustment unit 42, the representative positions 124a, 124b, 124c, and 124d are aligned, and the directions of the representative vectors 126a, 126b, 126c, and 126d are aligned.
  • Each adjustment partial volume data includes voxel groups 128a, 128b, 128c, and 128d to which parallel movement processing and rotation correction processing are applied.
  • in the adjusted partial volume data 122a, 122b, 122c, and 122d, there are portions where data is partially missing.
  • the data missing part 130a exists in the voxel group 128a.
  • the data missing part 130b exists in the voxel group 128b.
  • the data missing part 130c exists in the voxel group 128c.
  • a data missing portion 130d exists in the voxel group 128d.
  • these data missing portions 130a, 130b, 130c, and 130d are portions where data is missing due to movement of the fetus, positional displacement of the probe 10, and the like; no voxel values (luminance values) exist in these portions.
  • the partial volume data combining unit 46 combines the adjusted partial volume data 122a, 122b, 122c, and 122d. As a result, composite partial volume data is generated. For example, the combined partial volume data is generated by applying the combining process shown in FIG.
  • FIG. 19B shows combined partial volume data 132 generated by the combining process.
  • the representative position 134 is the same position as the representative position after the parallel movement and rotation correction processes are applied, and the representative vector 136 is the same vector as the representative vector after those processes are applied.
  • the combined partial volume data 132 includes a voxel group 138.
  • the voxel group 138 is obtained by the synthesis process for the voxel groups 128a, 128b, 128c, and 128d. In the voxel group 138, the missing data portion is compensated.
  • the adjusted partial volume data 122a has a data missing part 130a, but this part is supplemented by the other adjusted partial volume data 122b, 122c, and 122d. The same applies to the other data missing portions 130b, 130c, and 130d. A rendering process is then applied to the synthesized partial volume data 132, thereby generating a three-dimensional image.
  • as described above, the plurality of partial volume data (adjusted partial volume data) is spatially synthesized so that the position and orientation of the target region (for example, the fetal face) are aligned among them. As a result, even if a partial volume data has a data missing part, that part can be compensated for by the other partial volume data while the position and orientation of the target region remain aligned, so a three-dimensional image with reduced data loss can be generated.
  • for example, if the rendering process were applied to the adjusted partial volume data 122a alone, the resulting three-dimensional image would include an image missing part corresponding to the data missing part 130a.
  • since the data missing part 130a is supplemented by the other data, such image loss can be prevented or reduced.
  • by the parallel movement process and the rotation correction process, the plurality of partial volume data can be synthesized with the position and orientation of the target region aligned among them, which prevents or reduces positional and orientational deviation due to the synthesis.
  • the processing according to the present embodiment may be applied to a plurality of volume data stored in the 3D memory 18, or may be sequentially applied to volume data acquired in real time.
  • when the processing is applied to a plurality of volume data stored in the 3D memory 18, it is applied, for example, to a plurality of volume data acquired within a period specified by the user, or to a plurality of volume data acquired within an automatically set period.
  • when the processing according to the present embodiment is applied to volume data acquired in real time, it is applied each time new volume data is acquired; as a result, synthesized partial volume data is generated sequentially, and three-dimensional images are generated sequentially.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

A plurality of sets of partial volume data (122a-d) are generated by applying, to each of a plurality of sets of volume data acquired sequentially by transmitting and receiving ultrasonic waves, a process that extracts a portion including a site of interest. The positions and orientations of the sets of partial volume data (122a-d) are adjusted so that the representative positions (124a-d) and representative vectors (126a-d) of the site of interest are aligned among them. Composite partial volume data is generated by combining the adjusted sets of partial volume data, and a three-dimensional image is generated by applying a rendering process to the composite partial volume data.

Description

超音波診断装置及び超音波画像処理方法Ultrasonic diagnostic apparatus and ultrasonic image processing method
 本発明は超音波診断装置に関し、特に、注目部位の三次元画像を生成する技術に関する。 The present invention relates to an ultrasonic diagnostic apparatus, and more particularly to a technique for generating a three-dimensional image of a site of interest.
 医療分野において、三次元超音波診断が普及しつつある。例えば、産科においては、母体内の胎児を包含する三次元空間に対して超音波が送受波され、これにより、ボリュームデータが取得される。一般的に、ボリュームデータに対して三次元関心領域(3D-ROI)が設定され、三次元関心領域内のデータに対するレンダリング処理により、胎児の三次元画像が生成される(特許文献1参照)。 In the medical field, three-dimensional ultrasonic diagnosis is spreading. For example, in obstetrics, ultrasonic waves are transmitted / received to / from a three-dimensional space including a fetus in the mother body, thereby acquiring volume data. In general, a three-dimensional region of interest (3D-ROI) is set for volume data, and a three-dimensional image of the fetus is generated by rendering the data in the three-dimensional region of interest (see Patent Document 1).
特開2008-18224号公報JP 2008-18224 A
 ところで、注目部位の動きや超音波プローブの位置ずれ等に起因して、注目部位を表すデータが部分的に欠落する場合がある。この場合、三次元画像において注目部位の視認性が低下するという問題が生じる。例えば、胎児、母体、超音波プローブ等が動くことによって、胎児の顔を表すデータが部分的に欠落する場合があり、この場合、三次元画像において胎児の顔が適切に表されないという問題が生じる。 By the way, there are cases where data representing the target region is partially lost due to the movement of the target region, the displacement of the ultrasonic probe, or the like. In this case, there arises a problem that the visibility of the site of interest is reduced in the three-dimensional image. For example, when a fetus, a mother, an ultrasonic probe, or the like moves, data representing the fetal face may be partially lost. In this case, there is a problem that the fetal face is not properly represented in the three-dimensional image. .
 本発明の目的は、超音波診断装置において、注目部位の視認性が良好な三次元超音波画像が生成されるようにすることである。 An object of the present invention is to generate a three-dimensional ultrasonic image with good visibility of a site of interest in an ultrasonic diagnostic apparatus.
 本発明に係る超音波診断装置は、被検体内の三次元空間に対する超音波の送受波により順次取得された複数のボリュームデータのそれぞれに対して注目部位を含む部分を抽出する処理を適用し、これにより複数の部分ボリュームデータを生成する部分ボリュームデータ生成部と、前記複数の部分ボリュームデータをそれらの間で前記注目部位が揃うように空間的に合成し、これにより合成部分ボリュームデータを生成する合成部と、前記合成部分ボリュームデータに対してレンダリング処理を適用し、これにより三次元超音波画像を生成する超音波画像生成部と、を含むことを特徴とする。 The ultrasonic diagnostic apparatus according to the present invention applies a process of extracting a portion including a region of interest for each of a plurality of volume data sequentially obtained by transmission and reception of ultrasonic waves for a three-dimensional space in a subject, As a result, a partial volume data generation unit that generates a plurality of partial volume data and the plurality of partial volume data are spatially combined so that the target region is aligned therebetween, thereby generating combined partial volume data It includes a combining unit and an ultrasonic image generating unit that applies a rendering process to the combined partial volume data and thereby generates a three-dimensional ultrasonic image.
 上記の構成によると、複数の部分ボリュームデータの間で注目部位が揃うように、複数の部分ボリュームデータが空間的に合成される。例えば、複数の部分ボリュームデータの間で注目部位の位置と向きが揃えられた状態で合成部分ボリュームデータが生成される。上記の合成処理によると、部分ボリュームデータにおいてデータが部分的に欠落している場合であっても、複数の部分ボリュームデータの間で注目部位を揃えた状態で、そのデータ欠落部分を他の部分ボリュームデータによって補うことが可能となる。これにより、三次元超音波画像において注目部位像の部分的な欠落を防止又は低減することが可能となり、注目部位を適切に表す三次元超音波画像を生成することが可能となる。注目部位は、例えば胎児の顔である。もちろん、他の部位が注目部位であってもよい。例えば、注目部位の動きや超音波プローブの位置ずれ等に起因して、部分ボリュームデータにデータ欠落部分が発生した場合であっても、そのデータ欠落部分を他の部分ボリュームデータによって補うことが可能となり、その結果、注目部位が適切に表された三次元超音波画像を生成することが可能となる。なお、部分ボリュームデータを平滑化してから合成処理を適用してもよい。 According to the above configuration, a plurality of partial volume data is spatially synthesized so that attention sites are aligned among the plurality of partial volume data. For example, the combined partial volume data is generated in a state where the position and orientation of the target region are aligned among the plurality of partial volume data. According to the above synthesis process, even if data is partially missing in the partial volume data, the data missing part is replaced with another part in a state where the target region is aligned between the multiple partial volume data. It can be supplemented by volume data. As a result, it is possible to prevent or reduce partial omission of the site-of-interest image in the 3D ultrasound image, and to generate a 3D ultrasound image that appropriately represents the site of interest. The attention site is, for example, the face of the fetus. Of course, the other part may be the target part. For example, even if a missing data part occurs in the partial volume data due to the movement of the region of interest or the displacement of the ultrasonic probe, the missing data part can be compensated by other partial volume data. As a result, it is possible to generate a three-dimensional ultrasonic image in which the region of interest is appropriately represented. The synthesis process may be applied after the partial volume data is smoothed.
 望ましくは、前記合成部は、前記複数の部分ボリュームデータの間で前記注目部位についての代表位置及び代表方位が揃うように前記複数の部分ボリュームデータの位置及び向きを調整して前記複数の部分ボリュームデータを合成する。 Preferably, the synthesizing unit adjusts positions and orientations of the plurality of partial volume data so that a representative position and a representative orientation for the target site are aligned between the plurality of partial volume data. Synthesize the data.
In the above configuration, the representative position is a position that representatively indicates the position of the site of interest, and the representative orientation is an orientation that representatively indicates the direction of the site of interest. By adjusting the positions and orientations of the plurality of partial volume data so that the representative position and the representative orientation are aligned among them, the position and orientation of the site of interest can be aligned among the plurality of partial volume data. Synthesizing the plurality of partial volume data in this state makes it possible to prevent or reduce positional and orientational shifts of the site of interest caused by the synthesis.
Preferably, the synthesis unit includes: a normal vector calculation unit that calculates a normal vector for each voxel of each set of partial volume data, thereby obtaining a normal vector group for each set of partial volume data; a representative position calculation unit that calculates the representative position for each set of partial volume data based on its normal vector group; and a representative orientation calculation unit that calculates the representative orientation for each set of partial volume data based on its normal vector group.
According to the above configuration, the orientation of each individual voxel is obtained by calculating a normal vector per voxel. The normal vector group is the bundle of normal vectors calculated for the individual voxels, and the representative position and representative orientation are calculated based on that group. For example, each vector in the normal vector group may be virtually extended, and the representative position may be obtained from the group of points where the extended lines intersect; the centroid or average position of that intersection group may be used as the representative position. The normal vector group may also be averaged, and the orientation of the averaged vector used as the representative orientation.
Preferably, the representative position calculation unit calculates the representative position by fitting an approximate figure to the partial volume data, and the representative orientation calculation unit calculates the representative orientation by fitting the approximate figure to the partial volume data. The approximate figure is a figure that approximates the site of interest; for example, it is fitted to the surface of the site of interest or to the vicinity of the surface.
Preferably, the approximate figure is a hollow ellipsoid. For example, an ellipsoid is fitted to a site of interest that can be regarded as ellipsoidal, such as the head of a fetus, and the representative position and representative orientation are thereby calculated.
An ultrasonic image processing method according to the present invention includes: a partial volume data generation step of applying, to each of a plurality of volume data sequentially acquired by transmitting and receiving ultrasonic waves to and from a three-dimensional space in a subject, a process of extracting a portion including a site of interest, thereby generating a plurality of partial volume data; a synthesis step of spatially synthesizing the plurality of partial volume data so that the site of interest is aligned among them, thereby generating synthesized partial volume data; and an ultrasonic image generation step of applying a rendering process to the synthesized partial volume data, thereby generating a three-dimensional ultrasonic image.
According to the present invention, an ultrasonic diagnostic apparatus can generate a three-dimensional ultrasonic image with good visibility of a site of interest.
FIG. 1 is a block diagram showing an ultrasonic diagnostic apparatus according to an embodiment of the present invention.
FIG. 2 is a block diagram showing the configuration of the 3D memory and the three-dimensional image generation unit.
FIG. 3 is a schematic diagram showing a plurality of volume data acquired along the time axis.
FIG. 4 is a schematic diagram showing an example of a cross-sectional image.
FIG. 5 is a schematic diagram showing an example of volume data.
FIG. 6 is a schematic diagram showing an example of a cross-sectional image.
FIG. 7 is a flowchart showing an example of processing by the adjustment information calculation unit.
FIGS. 8 to 10 are schematic diagrams showing examples of partial volume data.
FIG. 11 is a diagram for explaining the positional relationship between a voxel and a hollow ellipsoid.
FIGS. 12 and 13 are schematic diagrams showing examples of partial volume data.
FIG. 14 is a flowchart showing an example of processing by the partial volume data adjustment unit.
FIGS. 15A to 17B are diagrams for explaining examples of processing by the partial volume data adjustment unit.
The remaining drawings are a flowchart showing an example of processing by the partial volume data synthesis unit and diagrams for explaining examples of that processing.
FIG. 1 shows an embodiment of an ultrasonic diagnostic apparatus according to the present invention as a block diagram of its overall configuration. This ultrasonic diagnostic apparatus is used in the medical field and has a function of generating a three-dimensional image of tissue in a living body by transmitting and receiving ultrasonic waves. As an example, the tissue to be imaged is a fetus, although other tissues may of course be imaged.
The probe 10 is a transducer that transmits and receives ultrasonic waves. In the present embodiment, the probe 10 has a 2D array transducer, formed by arranging a plurality of transducer elements two-dimensionally. The 2D array transducer forms an ultrasonic beam, and the beam is scanned two-dimensionally, thereby forming a three-dimensional space 12 as a three-dimensional echo data acquisition space. Alternatively, the probe 10 may incorporate a 1D array transducer and a mechanism that scans it mechanically: a scanning plane is formed by electronic scanning of the ultrasonic beam with the 1D array transducer, and that scanning plane is then scanned mechanically. The three-dimensional space 12 is likewise formed by such a method. Known electronic scanning methods include electronic sector scanning and electronic linear scanning. When performing ultrasonic diagnosis of a fetus, the probe 10 is placed in contact with the abdominal surface of the mother, and ultrasonic waves are transmitted and received in this state.
The transmission/reception unit 14 functions as a transmission beamformer and a reception beamformer. During transmission, the transmission/reception unit 14 supplies a plurality of transmission signals having a fixed delay relationship to the plurality of transducer elements of the probe 10, thereby forming an ultrasonic transmission beam. During reception, reflected waves from within the living body are received by the probe 10, which outputs a plurality of reception signals to the transmission/reception unit 14. The transmission/reception unit 14 applies phasing addition to the plurality of reception signals and outputs beam data as the reception signal after the phasing addition. Techniques such as transmit aperture synthesis may also be used in the transmission and reception of ultrasonic waves.
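The phasing addition described above corresponds to conventional delay-and-sum beamforming. The following is a minimal sketch, not the apparatus's actual implementation; the array shapes, sampling rate, and per-element delays are illustrative assumptions:

```python
import numpy as np

def delay_and_sum(rx, delays_s, fs):
    """Phasing (delay-and-sum) addition of the element signals.

    rx       : (n_elements, n_samples) reception signals, one row per element
    delays_s : (n_elements,) focusing delay of each element, in seconds
    fs       : sampling frequency in Hz
    Returns one line of beam data (n_samples,).
    """
    n_el, n_smp = rx.shape
    beam = np.zeros(n_smp)
    for i in range(n_el):
        shift = int(round(delays_s[i] * fs))  # delay in whole samples
        # Shift each element's signal so echoes from the focus align in time.
        # np.roll wraps around; a real beamformer would zero-pad and apply
        # sub-sample (interpolated) delays.
        beam += np.roll(rx[i], -shift)
    return beam
```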
Signal processing such as detection, logarithmic compression, and coordinate conversion is applied to the beam data by the signal processing unit 16. The beam data after signal processing is stored in the 3D memory 18. Alternatively, beam data to which such processing has not been applied may be stored in the 3D memory 18, and the above processing may be performed when the beam data is read out.
The 3D memory 18 has a storage space corresponding to the three-dimensional space used for transmission and reception. The 3D memory 18 stores volume data as an aggregate of echo data acquired from the three-dimensional space 12. In practice, the volume data is constructed by applying coordinate conversion and interpolation to a plurality of beam data. Alternatively, data to which such processing has not been applied may be stored in the 3D memory 18, and the above processing may be performed when the data is read out.
The three-dimensional image generation unit 20 reads volume data from the 3D memory 18 and, according to rendering conditions given by the control unit 30, executes a rendering process on the partial volume data within a three-dimensional region of interest (3D-ROI), thereby generating a three-dimensional image. The image data is output to the display processing unit 26. Various rendering methods are known and can be employed; for example, an image processing method such as volume rendering is applied. In the present embodiment, the three-dimensional image generation unit 20 extracts partial volume data from each of a plurality of volume data sequentially acquired along the time axis, then spatially synthesizes the plurality of partial volume data so that the site of interest is aligned among them, thereby generating synthesized partial volume data. The rendering process is applied to this synthesized partial volume data to generate a three-dimensional image. The three-dimensional image generation unit 20 is described in detail with reference to FIG. 2 and the subsequent drawings.
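The patent leaves the rendering method open. As a hedged illustration only, the sketch below shows two simple volume rendering choices over a voxel array: a maximum-intensity projection and front-to-back compositing. The opacity mapping is an assumption, not the apparatus's actual transfer function:

```python
import numpy as np

def render_mip(volume, axis=2):
    """Maximum-intensity projection of the (synthesized) partial volume data."""
    return volume.max(axis=axis)

def render_composite(volume, opacity_scale=0.01):
    """Front-to-back compositing along the z axis (simplified ray casting)."""
    nx, ny, nz = volume.shape
    image = np.zeros((nx, ny))
    transmittance = np.ones((nx, ny))
    for z in range(nz):  # march away from the viewpoint, starting at the cut surface
        slab = volume[:, :, z].astype(float)
        alpha = np.clip(slab * opacity_scale, 0.0, 1.0)  # assumed opacity mapping
        image += transmittance * alpha * slab
        transmittance *= 1.0 - alpha
    return image
```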
The cross-sectional image generation unit 22 has a function of generating two-dimensional cross-sectional images (B-mode tomographic images). For example, it generates a cross-sectional image at a cross section arbitrarily set by the user. Specifically, when coordinate information of an arbitrary cross section is given from the control unit 30, the cross-sectional image generation unit 22 reads the data corresponding to that cross section from the 3D memory 18 and generates a two-dimensional cross-sectional image based on the read data. This image data is output to the display processing unit 26. The cross-sectional image generation unit 22 may generate any number of cross-sectional images specified by the user, and may also generate a B-mode tomographic image based on beam data acquired by transmitting and receiving ultrasonic waves over a two-dimensional scanning plane.
The graphic image generation unit 24 generates graphic data to be overlaid on cross-sectional images and three-dimensional images according to graphic creation parameters supplied from the control unit 30. For example, it generates data such as graphic data representing a cross section of the three-dimensional region of interest and graphic data representing a cut line. The graphic data generated in this way is output to the display processing unit 26.
The display processing unit 26 overlays the necessary graphic data on images such as three-dimensional images and cross-sectional images, thereby generating display image data. The display image data is output to the display unit 28, and one or more images are displayed in a manner according to the display mode. For example, images such as three-dimensional images and cross-sectional images are displayed as moving images in real time. The display unit 28 is constituted by a display device such as a liquid crystal display.
The control unit 30 has a function of controlling each unit of the ultrasonic diagnostic apparatus, and includes a region-of-interest setting unit 32.
The region-of-interest setting unit 32 sets, for the volume data, a three-dimensional region of interest that includes a cut surface. The cut surface is, for example, a deformable surface set for the volume data. It corresponds to the start surface of the rendering process and serves to separate the tissue to be imaged from the tissue not to be imaged: relative to the cut surface, tissue on the near side (the projection viewpoint side in the rendering process) is not imaged, and tissue on the far side (opposite the projection viewpoint) is imaged. The cut surface may be designated manually by the user or set automatically. In manual setting, for example, an image representing a representative cross section of the volume data is displayed on the display unit 28, and a box representing a cross section of the three-dimensional region of interest is displayed on that image; the user then manipulates the shape of the upper side of the box. The upper side of the box corresponds to a cut line, and the cut surface is formed based on that cut line. The cut line may be, for example, a spline curve formed from at least three points, or a line of arbitrary shape. For example, when the user designates at least three points on the cross-sectional image using the input unit 34, the region-of-interest setting unit 32 forms a spline curve through those points as the cut line. Of course, the cut line may be formed by another method. The region-of-interest setting unit 32 then generates, for example, a plurality of spline curves passing through the cut line, thereby forming the cut surface. In automatic setting, for example, amniotic fluid data between the fetal data and the uterine wall data in the volume data is detected, and the cut surface is set within the amniotic fluid data. Of course, the cut surface may be set by another method. The three-dimensional image generation unit 20 applies the rendering process to the partial volume data within the three-dimensional region of interest.
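As a hedged sketch of the manual case, a cut line can be interpolated through at least three user-designated points with a cubic spline; the point coordinates and the use of SciPy's CubicSpline here are illustrative assumptions, not the apparatus's actual implementation:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical points designated by the user on the cross-sectional image (x, y).
points = np.array([[10.0, 40.0], [60.0, 25.0], [120.0, 45.0]])

# Fit a spline y(x) through the points and sample it densely as the cut line.
spline = CubicSpline(points[:, 0], points[:, 1])
xs = np.linspace(points[0, 0], points[-1, 0], 200)
cut_line = np.column_stack([xs, spline(xs)])  # polyline approximating the cut line
```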
The input unit 34 is connected to the control unit 30. The input unit 34 is constituted by, for example, an operation panel, which is a device having a keyboard, a trackball, and the like. Using the input unit 34, the user can input information such as the numerical values needed for setting the three-dimensional region of interest and the coordinates of an arbitrary cross section.
In the ultrasonic diagnostic apparatus described above, the components other than the probe 10 can be realized using hardware resources such as processors and electronic circuits, with devices such as memories used as necessary. The components other than the probe 10 may also be realized by, for example, a computer; that is, all or part of them may be realized by the cooperation of hardware resources such as the computer's CPU, memory, and hard disk with software (a program) that defines the operation of the CPU and the like. The program is stored in a storage device (not shown) via a recording medium such as a CD or DVD, or via a communication path such as a network. As another example, the components other than the probe 10 may be realized by a DSP (Digital Signal Processor), an FPGA (Field Programmable Gate Array), or the like.
The configuration of the ultrasonic diagnostic apparatus according to the present embodiment is described in detail below.
FIG. 2 shows the detailed configuration of the 3D memory 18 and the three-dimensional image generation unit 20.
The volume data storage unit 36 stores the volume data acquired from the three-dimensional space 12. By transmitting and receiving ultrasonic beams, a plurality of volume data arranged on the time axis are sequentially acquired and stored in the volume data storage unit 36.
The partial volume data extraction unit 38 acquires the plurality of volume data from the volume data storage unit 36 and applies, to each of them, a process of extracting the data within the three-dimensional region of interest, thereby generating a plurality of partial volume data. The three-dimensional region of interest is, for example, a region including the site of interest; the site of interest is, for example, a fetus or its face. In that case, a plurality of partial volume data including a fetal image or a fetal face image are generated. The partial volume data extraction unit 38 may acquire from the volume data storage unit 36 a plurality of volume data acquired within a period designated by the user, or a plurality of volume data acquired within an automatically set period.
The adjustment information calculation unit 40 has a function of calculating, for each set of partial volume data, the representative position (representative coordinates) and the representative orientation of the site of interest. The representative position and representative orientation are used as adjustment information. For example, the adjustment information calculation unit 40 calculates a normal vector for each voxel contained in each set of partial volume data, thereby obtaining, for each set, a normal vector group consisting of a plurality of normal vectors. That is, the normal vector of each individual voxel is calculated, yielding a normal vector group for each set of partial volume data. The adjustment information calculation unit 40 then calculates the representative position and representative orientation for each set of partial volume data based on its normal vector group.
The partial volume data adjustment unit 42 adjusts the positions and orientations of the plurality of partial volume data so that the representative position and representative orientation of the site of interest are aligned among them. This produces a plurality of adjusted partial volume data, which are stored in the adjusted partial volume data storage unit 44.
The partial volume data synthesis unit 46 acquires the plurality of adjusted partial volume data from the adjusted partial volume data storage unit 44 and synthesizes them, thereby generating synthesized partial volume data. The synthesized partial volume data is stored in the synthesized partial volume data storage unit 48.
The rendering unit 50 acquires the synthesized partial volume data from the synthesized partial volume data storage unit 48 and applies the rendering process to it, thereby generating a three-dimensional image. The data of this three-dimensional image is output to the display processing unit 26.
FIG. 3 shows a volume data sequence generated by transmitting and receiving ultrasonic beams. The volume data sequence includes a plurality of volume data (volume data 52a, 52b, ..., 52n, ...) arranged on the time axis. Each volume data is stored in the volume data storage unit 36.
FIG. 4 shows an example of a cross-sectional image. The cross-sectional image 54 is an image representing a representative cross section of the volume data, for example the image on the central scanning plane of the three-dimensional space 12, although it may of course be an image on another plane. As an example, the cross-sectional image 54 includes a fetal image 56, a uterine wall image 58, and an amniotic fluid image 60.
The cross-sectional image 54 is, for example, an image representing an XY cross section. The cross-sectional image 54 contains a box as a graphic image (not shown) representing the XY cross section of the three-dimensional region of interest; the upper side of the box is the cut line. When setting the region of interest manually, the cross-sectional image 54 is displayed on the display unit 28, and the user, while observing it, can use the input unit 34 to change parameters of the cut line such as its position, rotation angle, curvature, shape, and length. The user can also change the length (width) of the box in the X direction and its length (height) in the Y direction, thereby changing the width and height of the three-dimensional region of interest. The region-of-interest setting unit 32 forms the cut surface based on the parameters of the cut line; when those parameters are changed, it changes the shape, position, size, and so on of the cut surface accordingly and sets a three-dimensional region of interest having the changed cut surface. In this way, the shape, position, size, and so on of the three-dimensional region of interest can be changed by the user's operation.
As another example, the cut surface and the three-dimensional region of interest may be set automatically. For example, the region-of-interest setting unit 32 detects tissue boundaries in the living body based on the voxel values (luminance values) of the volume data: the fetal image, the uterine wall image, and the amniotic fluid image are discriminated, and the boundary between the fetal image and the other tissue images is detected. The region-of-interest setting unit 32 automatically sets the cut surface based on that boundary and automatically sets a three-dimensional region of interest having that cut surface.
FIG. 5 shows an example of volume data. A cut surface 64 is set in the volume data 62, and a three-dimensional region of interest 66 having that cut surface is set. The partial volume data extraction unit 38 extracts the data within the three-dimensional region of interest 66 (the partial volume data) from the volume data 62. The cut surface 64 corresponds to the rendering start surface: relative to the cut surface 64, the tissue on the projection viewpoint side (the near side) is not imaged, and the tissue on the opposite side (within the three-dimensional region of interest 66) is imaged. That is, the rendering process is applied to the partial volume data, thereby generating a three-dimensional image representing the tissue within the three-dimensional region of interest 66.
FIG. 6 shows an example of a cross-sectional image representing the tissue within the region of interest. The cross-sectional image 68 includes a fetal image 56 as the tissue image within the region of interest. By setting the region of interest, the image is generated with the uterine wall image 58 removed. From the volume data, the partial volume data, that is, the data within the region of interest, is extracted.
The processing by the adjustment information calculation unit 40 is described in detail below. FIG. 7 shows a flowchart for explaining this processing.
First, a plurality of volume data arranged on the time axis are sequentially acquired by repeating the transmission and reception of ultrasonic waves (S01), and are stored in the volume data storage unit 36. Next, the partial volume data extraction unit 38 applies, to each of the plurality of volume data, the process of extracting the data within the three-dimensional region of interest, thereby generating a plurality of partial volume data (S02). For example, a plurality of partial volume data including a fetal image or a fetal face image are generated.
Next, the adjustment information calculation unit 40 calculates a normal vector for each voxel contained in each set of partial volume data, thereby obtaining, for each set, a normal vector group consisting of a plurality of normal vectors (S03). The normal vectors can be calculated by applying known techniques.
For example, the normal vector of a voxel is calculated by computing the luminance gradient at that voxel. With the luminance value of the voxel at position (x, y, z) denoted I(x, y, z) and the normal vector denoted g(x, y, z), the normal vector is defined by the following equation (1):

$$g(x, y, z) = \nabla I(x, y, z) = \left( \frac{\partial I}{\partial x}, \; \frac{\partial I}{\partial y}, \; \frac{\partial I}{\partial z} \right)^t \qquad (1)$$
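A minimal sketch of equation (1) over a voxel array, assuming the volume is held as a NumPy array and using central differences for the partial derivatives (the discretization is an assumption; the patent only specifies the gradient definition):

```python
import numpy as np

def normal_vectors(volume):
    """Per-voxel normal vectors as the luminance gradient, equation (1).

    volume : (nx, ny, nz) array of voxel luminance values I(x, y, z)
    Returns an (nx, ny, nz, 3) array g(x, y, z) of central-difference gradients.
    """
    gx, gy, gz = np.gradient(volume.astype(float))
    return np.stack([gx, gy, gz], axis=-1)
```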
The adjustment information calculation unit 40 may extract surface data representing the surface of the fetal face from the partial volume data and calculate the normal vector of each voxel constituting that surface data; the surface data can be extracted, for example, by applying edge detection. Alternatively, normal vectors may be calculated for all voxels contained in the partial volume data without extracting the surface data, in which case normal vectors are also calculated for the voxels inside the fetal face image.
Next, the adjustment information calculation unit 40 calculates, for each set of partial volume data, the representative position of the site of interest (the fetal face) based on its normal vector group (S04). For example, the adjustment information calculation unit 40 extends the vectors and calculates the representative position based on the positions where the extended vectors intersect. When extending the vectors produces a plurality of intersection points, the centroid or average position of those points is used as the representative position. The adjustment information calculation unit 40 may also apply clustering to the intersection points and exclude some of them when calculating the representative position. For example, intersection points in regions where their distribution is sparse may be excluded, and the centroid or average position of the intersection points in regions where the distribution is dense may be calculated as the representative position. This excludes intersection points that are unsuitable for calculating the representative position, so that a more appropriate representative position is obtained.
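The extended normals of a roughly ellipsoidal surface nearly meet around its center. As a hedged sketch of step S04, the code below computes the least-squares point closest to all normal lines, which is one concrete way to realize the intersection-based estimate (the clustering of sparse intersections described above is omitted):

```python
import numpy as np

def representative_position(points, normals):
    """Point closest (in least squares) to all lines p_i + t * n_i.

    points  : (N, 3) voxel positions through which the normals pass
    normals : (N, 3) normal vectors at those voxels
    Solves  sum_i (I - d_i d_i^t) x = sum_i (I - d_i d_i^t) p_i,
    where d_i is the unit direction of each line.
    """
    d = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    eye = np.eye(3)
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p_i, d_i in zip(points, d):
        proj = eye - np.outer(d_i, d_i)  # projector onto the plane normal to d_i
        A += proj
        b += proj @ p_i
    return np.linalg.solve(A, b)
```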
Next, for each set of partial volume data, the adjustment information calculation unit 40 fits a hollow ellipsoid (a three-dimensional ellipsoid) having the representative position as its center point to the partial volume data (S05). Here, ellipsoid is a concept that includes a sphere. Since the head of a fetus approximates an ellipsoid, an ellipsoid is used as the approximate figure fitted to the partial volume data. The ellipsoid can be fitted to the partial volume data by applying a known technique (for example, the least squares method). This fitting process is described in detail below.
A three-dimensional ellipsoid is defined by the following equation (2):

$$\frac{(x - x_0)^2}{a^2} + \frac{(y - y_0)^2}{b^2} + \frac{(z - z_0)^2}{c^2} = 1 \qquad (2)$$
Here, $(x_0, y_0, z_0)$ are the coordinates of the representative position, and a, b, and c are unknowns. Rotation of the ellipsoid is not considered.
In order to fit the ellipsoid defined by equation (2) to the voxel group $(x_i, y_i, z_i)$ whose luminance values are at or above a threshold, the unknowns a, b, and c are obtained using the least squares method. That is, for the voxel group $(x_i, y_i, z_i)$ to be fitted, the unknowns a, b, and c that minimize the value J defined by the following equation (3) are obtained:

$$J = \sum_i \left( \frac{(x_i - x_0)^2}{a^2} + \frac{(y_i - y_0)^2}{b^2} + \frac{(z_i - z_0)^2}{c^2} - 1 \right)^2 \qquad (3)$$
To this end, the unknowns $A = 1/a^2$, $B = 1/b^2$, and $C = 1/c^2$ are introduced, and the following equation (4) is solved:

$$\frac{\partial J}{\partial A} = \frac{\partial J}{\partial B} = \frac{\partial J}{\partial C} = 0, \quad J = \sum_i \left( A (x_i - x_0)^2 + B (y_i - y_0)^2 + C (z_i - z_0)^2 - 1 \right)^2 \qquad (4)$$
That is, the unknowns a, b, and c are obtained by solving the following equation (5). With $u_i = (x_i - x_0)^2$, $v_i = (y_i - y_0)^2$, and $w_i = (z_i - z_0)^2$, the conditions of equation (4) form the linear system

$$\begin{pmatrix} \sum u_i^2 & \sum u_i v_i & \sum u_i w_i \\ \sum u_i v_i & \sum v_i^2 & \sum v_i w_i \\ \sum u_i w_i & \sum v_i w_i & \sum w_i^2 \end{pmatrix} \begin{pmatrix} A \\ B \\ C \end{pmatrix} = \begin{pmatrix} \sum u_i \\ \sum v_i \\ \sum w_i \end{pmatrix}, \qquad a = \frac{1}{\sqrt{A}}, \; b = \frac{1}{\sqrt{B}}, \; c = \frac{1}{\sqrt{C}} \qquad (5)$$
In this way, for each set of partial volume data, a hollow ellipsoid having the representative position as its center point and fitted to the partial volume data is obtained.
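A hedged sketch of this fit, solving the normal equations of equation (5) for A, B, and C (the thresholding of voxels by luminance is assumed to have been done beforehand):

```python
import numpy as np

def fit_ellipsoid(voxels, center):
    """Fit the axis-aligned ellipsoid of eq. (2) centered at the representative position.

    voxels : (N, 3) coordinates of voxels with luminance at or above the threshold
    center : (3,) representative position (x0, y0, z0)
    Returns the semi-axes (a, b, c).
    """
    sq = (voxels - center) ** 2          # rows are (u_i, v_i, w_i)
    M = sq.T @ sq                        # 3x3 matrix of the normal equations
    rhs = sq.sum(axis=0)
    A, B, C = np.linalg.solve(M, rhs)    # unknowns A=1/a^2, B=1/b^2, C=1/c^2
    return 1.0 / np.sqrt(np.array([A, B, C]))
```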
Next, the adjustment information calculation unit 40 removes, for each set of partial volume data, the data and normal vectors that deviate from the hollow ellipsoid (S06). For example, the adjustment information calculation unit 40 calculates the distance from each voxel to the surface of the hollow ellipsoid and removes each voxel whose distance is at or above a threshold, together with that voxel's normal vector. That is, a voxel closer to the hollow ellipsoid is more likely to be a voxel on the face surface, and a voxel farther from it is less likely to be. This processing excludes from the partial volume data images other than the fetal face image (for example, images of a hand or the like).
Next, the adjustment information calculation unit 40 calculates, for each set of partial volume data, a representative vector (representative orientation) based on the remaining normal vector group (S07). For example, the average vector of the remaining normal vectors is taken as the representative vector. Weighting according to vector magnitude may also be applied: for example, a larger normal vector is multiplied by a larger weighting coefficient, and the average of the weighted normal vectors is taken as the representative vector. This representative vector can be said to represent the orientation of the fetal face.
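A hedged sketch combining steps S06 and S07, assuming the distance to the ellipsoid is measured along the straight line joining each voxel to the representative position (the distance L of FIG. 11) and that magnitude weighting of the normals is used:

```python
import numpy as np

def filter_and_orient(voxels, normals, center, axes, dist_thresh):
    """S06/S07 sketch: drop voxels far from the ellipsoid, then average normals.

    voxels      : (N, 3) voxel positions
    normals     : (N, 3) their normal vectors
    center      : (3,) representative position
    axes        : (3,) semi-axes (a, b, c) of the fitted ellipsoid
    dist_thresh : removal threshold on the distance to the ellipsoid surface
    """
    d = voxels - center
    # Radial scale t: center + t*d lies on the ellipsoid, so the distance along
    # the voxel-to-center line (as in FIG. 11) is |1 - t| * |d|.
    t = 1.0 / np.maximum(np.linalg.norm(d / axes, axis=1), 1e-12)
    dist = np.abs(1.0 - t) * np.linalg.norm(d, axis=1)
    keep = dist < dist_thresh

    kept_normals = normals[keep]
    weights = np.linalg.norm(kept_normals, axis=1)  # larger normals weigh more
    rep_vector = (kept_normals * weights[:, None]).sum(axis=0) / weights.sum()
    return voxels[keep], kept_normals, rep_vector
```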
The processing of the adjustment information calculation unit 40 is described in more detail below with reference to FIGS. 8 to 13. FIGS. 8 to 10, 12, and 13 schematically show examples of partial volume data. In these figures, the partial volume data is represented two-dimensionally for convenience of explanation, but the processing by the adjustment information calculation unit 40 is applied to three-dimensional partial volume data.
FIG. 8 shows partial volume data 70 extracted by the partial volume data extraction unit 38. This partial volume data 70 includes a fetal image 56 as the tissue image within the three-dimensional region of interest. By setting the three-dimensional region of interest, partial volume data with the uterine wall image and the like removed is generated. Applying the processing of step S03 above yields a plurality of normal vectors 72. Surface data representing the surface of the fetal face may be extracted from the partial volume data 70 and the normal vector 72 of each voxel constituting that surface data calculated, or normal vectors may be calculated for all voxels in the partial volume data without extracting the surface data. In the example shown in FIG. 8, the normal vectors of the voxels near the surface are illustrated; when the normal vectors of all voxels are calculated, normal vectors are also calculated for the voxels inside the fetal image 56.
FIG. 9 shows a plurality of extended normal vectors 74 in the partial volume data 70. In step S04 above, the representative position of the face image is calculated based on the positions where these extended normal vectors 74 intersect; for example, the centroid or average position of the intersection points is calculated as the representative position. In the example shown in FIG. 9, the distribution density of the intersection points is high within the region indicated by reference numeral 76, so a position within this region is calculated as the representative position. Intersection points in regions where their distribution density is low may also be removed; for example, intersection points distributed outside the region indicated by reference numeral 76 may be removed, and the representative position calculated based on the intersection points within that region. This excludes intersection points that are unsuitable for the calculation, so that a more appropriate representative position of the face image is obtained.
FIG. 10 shows a representative position 78 and an ellipsoid 80. The representative position 78 is the position calculated based on the extended normal vectors 74 as described above. The ellipsoid 80 is the three-dimensional ellipsoid obtained in step S05 above, having the representative position 78 as its center point and fitted to the partial volume data 70. The head of the fetus can be regarded as an ellipsoid, and an ellipsoid 80 that fits the image of the face surface in the fetal image 56 is calculated.
In step S06 above, the data and normal vectors that deviate from the ellipsoid 80 are removed. FIG. 11 shows the details of this processing. For example, the adjustment information calculation unit 40 connects a voxel 82 constituting the partial volume data and the representative position 78 with a straight line, and calculates the distance L from the voxel 82 to the surface of the ellipsoid 80 along that line. If the distance L is at or above a threshold, the voxel 82 and the normal vector 84 obtained for it are removed; if the distance L is below the threshold, the voxel 82 and the normal vector 84 are not removed.
FIG. 12 shows the remaining partial volume data 86 after the removal. The voxels and normal vectors that deviate from the ellipsoid 80 have been removed, and the remaining partial volume data 86 contains the remaining data 88 that was not removed. The remaining data 88 consists of the data of the voxel group near the surface of the ellipsoid 80. The representative vector is calculated based on the normal vector group of the voxels in the remaining data 88, that is, the vectors that remain after the removal.
FIG. 13 shows a representative vector 92. The representative vector 92 is a vector calculated so as to pass through the representative position 78, based on the normal vector group of the voxels in the remaining data 88. For example, the average vector of the remaining normal vectors, passing through the representative position 78, is calculated as the representative vector 92. The representative orientation (direction) of the face image is thereby obtained.
As described above, the representative position and representative vector (representative orientation) of the face image, which serve as the adjustment information, are calculated by the adjustment information calculation unit 40 for each individual set of partial volume data. The remaining partial volume data, the information indicating the representative position, and the information indicating the representative vector are output to the partial volume data adjustment unit 42.
The processing by the partial volume data adjustment unit 42 is described in detail below. FIG. 14 shows a flowchart for explaining this processing. The partial volume data adjustment unit 42 executes the following calculations for each voxel of each set of remaining partial volume data.
First, the partial volume data adjustment unit 42 translates the positions of the voxels in the remaining partial volume data based on the difference between a reference position and the representative position (S10). That is, the partial volume data adjustment unit 42 translates the position of each voxel in the remaining partial volume data so that the representative position of the remaining partial volume data coincides with the reference position. The reference position may be a preset, assumed position in the three-dimensional space, or the representative position in one of the sets of remaining partial volume data; for example, the representative position in the latest remaining partial volume data may be used as the reference position. In that case, the partial volume data adjustment unit 42 translates each voxel in each set of remaining partial volume data so that its representative position coincides with the representative position in the latest remaining partial volume data.
Next, the partial volume data adjustment unit 42 adjusts the orientation of the voxels in the remaining partial volume data (the orientation of the normal vectors) based on the angular difference between a reference vector (reference orientation) and the representative vector (representative orientation) (S11). That is, the partial volume data adjustment unit 42 adjusts the orientation of each voxel (the orientation of each normal vector) in the remaining partial volume data so that the representative vector of the remaining partial volume data coincides with the reference vector, for example by applying rotation corrections around the axes of the three-dimensional orthogonal coordinate system. The reference vector may be a preset vector in the three-dimensional space, or the representative vector in one of the sets of remaining partial volume data; for example, the representative vector in the latest remaining partial volume data is used as the reference vector. In that case, the partial volume data adjustment unit 42 applies a rotation correction to the normal vector of each voxel in each set of remaining partial volume data so that its representative vector coincides with the representative vector in the latest remaining partial volume data.
Translation and rotation correction are applied to each set of remaining partial volume data, thereby generating a plurality of adjusted partial volume data. Since the translation and rotation correction are applied so that the representative positions and representative vectors are aligned among the sets of remaining partial volume data, the position and orientation of the site of interest (the fetal face) become aligned among them. The plurality of adjusted partial volume data are stored in the adjusted partial volume data storage unit 44.
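A hedged sketch of steps S10 and S11; here the orientation adjustment is realized as a single axis-angle rotation taking the representative vector onto the reference vector, which is equivalent in effect to the per-axis rotation corrections detailed below:

```python
import numpy as np

def rotation_between(v_from, v_to):
    """Rotation matrix taking the unit direction of v_from onto that of v_to."""
    a = v_from / np.linalg.norm(v_from)
    b = v_to / np.linalg.norm(v_to)
    v = np.cross(a, b)
    c = np.dot(a, b)
    K = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    return np.eye(3) + K + K @ K / (1.0 + c)  # Rodrigues' formula (c != -1)

def align(voxels, normals, rep_pos, rep_vec, ref_pos, ref_vec):
    """S10/S11: translate so rep_pos -> ref_pos, rotate so rep_vec -> ref_vec."""
    R = rotation_between(rep_vec, ref_vec)
    moved = (voxels - rep_pos) @ R.T + ref_pos  # rotate about the aligned center
    turned = normals @ R.T                      # rotate the normal vectors too
    return moved, turned
```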
The processing by the partial volume data adjustment unit 42 is described in more detail below with reference to FIGS. 15A, 15B, 15C, 16A, 16B, 16C, 17A, and 17B.
First, the outline of the translation and rotation correction is described with reference to FIGS. 15A, 15B, and 15C. FIG. 15A shows the remaining partial volume data 86, for which the representative position 78 and the representative vector 92 have been obtained. Translation and rotation correction are applied to this remaining partial volume data 86, thereby generating adjusted partial volume data. FIG. 15B shows the adjusted partial volume data 96. As described above, the position of a voxel 94 is translated so that the representative position 78 coincides with the reference position, and the orientation of the voxel 94 (the orientation of its normal vector) is adjusted so that the representative vector 92 coincides with the reference vector. The representative position 98 is the representative position after translation, that is, the position coinciding with the reference position, and the representative vector 100 is the representative vector after translation and rotation correction. The voxel 102 is the position of the voxel 94 after translation and rotation correction, and the voxel value (luminance value) of the voxel 94 is assigned to the voxel 102. Translation and rotation correction are applied to all voxels contained in the remaining data 88. FIG. 15C shows the adjusted partial volume data 96 after this processing, which consists of a plurality of voxels 102 to which the translation and rotation correction have been applied.
Next, the translation and rotation correction are described in detail with reference to FIGS. 16A, 16B, and 16C. FIG. 16A shows the remaining partial volume data 86, for which the representative position 78 and the representative vector 92 have been obtained. The translation and rotation correction are carried out in two steps. First, as the processing of the first step, the translation is applied to the remaining partial volume data 86, followed by the rotation corrections around the X axis and the Z axis. FIG. 16B shows the remaining partial volume data 104 after these processes have been applied. As described above, the position of each voxel is translated so that the representative position 78 coincides with the reference position, and the orientation of each voxel (the orientation of its normal vector) is adjusted around the X and Z axes so that the representative vector 92 coincides with the reference vector. The representative position 106 is the representative position after translation, that is, the position coinciding with the reference position. The representative vector 108 is the vector after translation and after the rotation corrections around the X and Z axes. Next, as the processing of the second step, the rotation correction around the Y axis is applied to the remaining partial volume data 104, thereby generating the adjusted partial volume data. FIG. 16C shows the adjusted partial volume data 110 after this processing. The orientation of each voxel (the orientation of its normal vector) is adjusted around the Y axis so that the representative vector 108 coincides with the reference vector; the representative vector 112 is the vector after the rotation correction around the Y axis. In this way, the adjusted partial volume data is generated from the remaining partial volume data.
The processing of the first step (the translation processing and the rotation correction processing around the X and Z axes) and the processing of the second step (the rotation correction processing around the Y axis) will now be described in detail.
First, the processing of the first step will be described.
The average of the normal vectors of the plurality of voxels (the representative vector) is defined as $(n_x, n_y, n_z)^t$, and the coordinates of the representative position are defined as $(x_0, y_0, z_0)$. The coordinate transformation defined by the following equation (6) is applied to each voxel $(x, y, z)$ in the remaining partial volume data.

$$\begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} = R_x(\theta_2)\, R_z(\theta_1) \begin{pmatrix} x - x_0 \\ y - y_0 \\ z - z_0 \end{pmatrix} \qquad (6)$$
Here, $R_x(\theta)$ is the rotation matrix around the X axis, and $R_z(\theta)$ is the rotation matrix around the Z axis. $R_x(\theta)$ and $R_z(\theta)$ are defined by the following equation (7), where a positive angle rotates the Z axis toward the Y axis and the X axis toward the Y axis, respectively.

$$R_x(\theta) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & \sin\theta \\ 0 & -\sin\theta & \cos\theta \end{pmatrix}, \qquad R_z(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix} \qquad (7)$$
$\theta_1$ is the rotation angle for rotating the representative vector around the Z axis, in the direction from the X axis toward the Y axis, so that the X component of the representative vector becomes 0 (zero). In other words, $\theta_1$ is the rotation angle that makes the representative vector parallel to the YZ plane. This rotation angle $\theta_1$ is defined by the following equation (8).

$$\theta_1 = \tan^{-1}\!\left(\frac{n_x}{n_y}\right) \qquad (8)$$
$\theta_2$ is the rotation angle for rotating the representative vector around the X axis, in the direction from the Z axis toward the Y axis, so that the Z component of the representative vector becomes 0 (zero). In other words, $\theta_2$ is the rotation angle that makes the representative vector parallel to the XY plane. This rotation angle $\theta_2$ is defined by the following equation (9).

$$\theta_2 = \tan^{-1}\!\left(\frac{n_z}{\sqrt{n_x^2 + n_y^2}}\right) \qquad (9)$$
Note that when the direction of the reference vector is the positive Y direction, the above processing is applied only when $n_y > 0$; when the direction of the reference vector is the negative Y direction, the above processing is applied only when $n_y < 0$.
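As an illustration, a minimal sketch of the first step in Python with NumPy is given below. The function names (`rotation_z`, `rotation_x`, `first_step_transform`), the choice of the origin as the reference position, and the sign conventions of the rotation matrices are assumptions made for this sketch, chosen so that equations (8) and (9) zero the X and Z components as described; they are not taken from the patent text.

```python
import numpy as np

def rotation_z(theta: float) -> np.ndarray:
    """Rotation about the Z axis; positive theta turns +X toward +Y."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def rotation_x(theta: float) -> np.ndarray:
    """Rotation about the X axis; positive theta turns +Z toward +Y
    (sign convention assumed so that theta2 zeroes the Z component)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,  c,   s],
                     [0.0, -s,   c]])

def first_step_transform(voxels: np.ndarray,
                         rep_pos: np.ndarray,
                         rep_vec: np.ndarray) -> np.ndarray:
    """First step (eqs. 6, 8, 9): translate so the representative position
    lands on the reference position (taken here as the origin), then rotate
    about Z and X so the representative vector points along +Y.

    voxels: (N, 3) array of voxel coordinates, one row per voxel.
    """
    nx, ny, nz = rep_vec
    theta1 = np.arctan2(nx, ny)                # eq. (8): zero the X component
    theta2 = np.arctan2(nz, np.hypot(nx, ny))  # eq. (9): zero the Z component
    R = rotation_x(theta2) @ rotation_z(theta1)
    return (voxels - rep_pos) @ R.T            # eq. (6), row-vector form
```

Using `arctan2` rather than a plain arctangent keeps the correct quadrant; the check on the sign of $n_y$ described in the note above is left to the caller.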
Next, the processing of the second step will be described.
The coordinate transformation defined by the following equation (10) is applied to each voxel $(x, y, z)$ in the remaining partial volume data after the processing of the first step has been applied.

$$\begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} = R_y(\theta_0) \begin{pmatrix} x \\ y \\ z \end{pmatrix} \qquad (10)$$
Here, $R_y(\theta)$ is the rotation matrix around the Y axis, defined by the following equation (11).

$$R_y(\theta) = \begin{pmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{pmatrix} \qquad (11)$$
$\theta_0$ is the rotation angle around the Y axis, and is calculated as follows. The processing for calculating the rotation angle $\theta_0$ will be described below with reference to FIGS. 17A and 17B.
FIG. 17A shows the remaining partial volume data 104 after the processing of the first step (the translation processing and the rotation correction processing around the X and Z axes) has been applied. First, the voxel group $(x_i, y_i, z_i)$ included in the remaining partial volume data 104 is projected onto the XZ plane. For example, the voxel 114 at the position (x, y, z) is projected onto the XZ plane, whereby a projection point 116 is formed at the position (x, z) on the XZ plane. If the voxel value (luminance value) of the voxel 114 is K, the projection processing is applied as if K data points existed at the position (x, z); that is, each projection point is weighted by its luminance. This projection onto the XZ plane is performed for every voxel.
FIG. 17B shows the distribution of the projection points 116 on the XZ plane. A straight line 118 (z = ax + b) is fitted to this distribution, and the inclination of the straight line 118 with respect to the reference line 120, which is parallel to the X axis, is obtained.
For the projection point group $(x_i, z_i)$ after the projection processing, the unknowns a and b that minimize the value J defined by the following equation (12) are obtained.

$$J = \sum_i \bigl( z_i - (a x_i + b) \bigr)^2 \qquad (12)$$
That is, the unknowns a and b are obtained by solving the following equation (13).

$$\frac{\partial J}{\partial a} = 0, \qquad \frac{\partial J}{\partial b} = 0 \qquad (13)$$
In other words, the unknowns a and b are obtained by solving the following equation (14), where N is the number of projection points.

$$\begin{pmatrix} \sum_i x_i^2 & \sum_i x_i \\ \sum_i x_i & N \end{pmatrix} \begin{pmatrix} a \\ b \end{pmatrix} = \begin{pmatrix} \sum_i x_i z_i \\ \sum_i z_i \end{pmatrix} \qquad (14)$$
The rotation angle $\theta_0$ is obtained from the fitted slope by the following equation (15).

$$\theta_0 = \tan^{-1} a \qquad (15)$$
As described above, the translation processing and the rotation correction processing are executed by the partial volume data adjustment unit 42. They are applied to each piece of remaining partial volume data, whereby a plurality of adjusted partial volume data are generated. The plurality of adjusted partial volume data are stored in the adjusted partial volume data storage unit 44.
The combining processing performed by the partial volume data combining unit 46 will now be described in detail. FIG. 18 shows a flowchart for explaining the combining processing.
First, the partial volume data combining unit 46 acquires the plurality of adjusted partial volume data from the adjusted partial volume data storage unit 44 (S20), and executes the following computation for each voxel (i, j, k).
The partial volume data combining unit 46 acquires the voxel value (luminance value) at the voxel (i, j, k) from all of the acquired adjusted partial volume data (S21), and obtains the average of those voxel values (S22). At this time, voxels having no voxel value are excluded from the group of voxels used for the average calculation. As another example, voxels whose voxel value is below a threshold may also be excluded from the group of voxels used for the average calculation. As yet another example, instead of obtaining the average, the partial volume data combining unit 46 may adopt the highest voxel value (maximum luminance value) among all the adjusted partial volume data as the value of the voxel (i, j, k), or may adopt the median of the voxel values. Alternatively, when adjusted partial volume data in which a voxel value exists at the voxel (i, j, k) and adjusted partial volume data in which no voxel value exists at that voxel are mixed, whether a voxel value exists at the voxel (i, j, k) may be decided by a majority vote over the adjusted partial volume data.
The processing of steps S21 and S22 is executed for every voxel (i, j, k), whereby the combined partial volume data is generated. The combined partial volume data is stored in the combined partial volume data storage unit 48 (S23).
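A sketch of this voxel-wise combining step might look as follows. Representing missing voxels as NaN is an implementation choice, not something specified in the text, and the `mode`, `threshold`, and `majority_vote` parameters are hypothetical ways to select among the average, maximum, median, threshold, and majority-vote variants described above.

```python
import numpy as np

def combine_adjusted_volumes(volumes, mode="mean",
                             threshold=None, majority_vote=False):
    """Voxel-wise combination of adjusted partial volumes (steps S20-S23).

    volumes: list of equally shaped 3-D arrays of luminance values on a
    common grid; voxels with no data are NaN. Voxels missing in every
    volume come out as NaN (NumPy emits a RuntimeWarning for them).
    """
    stack = np.stack(volumes, axis=0).astype(float)  # (n_volumes, I, J, K)
    if threshold is not None:
        # Variant: treat voxel values below a threshold as missing.
        stack = np.where(stack < threshold, np.nan, stack)
    if mode == "mean":        # average over voxels that have a value (S22)
        combined = np.nanmean(stack, axis=0)
    elif mode == "max":       # highest-luminance variant
        combined = np.nanmax(stack, axis=0)
    elif mode == "median":    # median variant
        combined = np.nanmedian(stack, axis=0)
    else:
        raise ValueError(f"unknown mode: {mode}")
    if majority_vote:
        # Variant: keep a voxel only if a value exists in a majority
        # of the adjusted partial volumes.
        present = np.sum(~np.isnan(stack), axis=0)
        combined = np.where(present > len(volumes) / 2, combined, np.nan)
    return combined
```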
The combining processing performed by the partial volume data combining unit 46 will be described in more detail below with reference to FIGS. 19A and 19B.
FIG. 19A shows a plurality of adjusted partial volume data (for example, adjusted partial volume data 122a, 122b, 122c, and 122d). For each piece of adjusted partial volume data, a representative position 124a, 124b, 124c, 124d and a representative vector 126a, 126b, 126c, 126d have been obtained. By applying the translation processing and the rotation correction processing in the partial volume data adjustment unit 42, the representative positions 124a, 124b, 124c, and 124d are aligned, and the directions of the representative vectors 126a, 126b, 126c, and 126d are aligned. Each piece of adjusted partial volume data includes a voxel group 128a, 128b, 128c, 128d to which the translation processing and the rotation correction processing have been applied.
The adjusted partial volume data 122a, 122b, 122c, and 122d contain portions where data is partially missing. For example, in the adjusted partial volume data 122a, a data missing portion 130a exists in the voxel group 128a. Similarly, a data missing portion 130b exists in the voxel group 128b of the adjusted partial volume data 122b, a data missing portion 130c exists in the voxel group 128c of the adjusted partial volume data 122c, and a data missing portion 130d exists in the voxel group 128d of the adjusted partial volume data 122d. These data missing portions 130a, 130b, 130c, and 130d are portions where data has been lost due to fetal movement, displacement of the probe 10, or the like, and are portions where no voxel value (luminance value) exists.
The partial volume data combining unit 46 combines the adjusted partial volume data 122a, 122b, 122c, and 122d, whereby combined partial volume data is generated; for example, the combined partial volume data is generated by applying the combining processing shown in FIG. 18.
FIG. 19B shows the combined partial volume data 132 generated by the combining processing. The representative position 134 is the same as the representative position after the translation processing and the rotation correction processing have been applied, and the representative vector 136 is the same as the representative vector after the translation processing and the rotation correction processing have been applied. The combined partial volume data 132 includes a voxel group 138, which is obtained by the combining processing applied to the voxel groups 128a, 128b, 128c, and 128d. In the voxel group 138, the missing data portions are compensated. That is, although the data missing portion 130a exists in the adjusted partial volume data 122a, that portion is compensated by the other adjusted partial volume data 122b, 122c, and 122d; the same applies to the other data missing portions 130b, 130c, and 130d. Rendering processing is then applied to the combined partial volume data 132, whereby a three-dimensional image is generated.
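Putting the pieces of the sketch together, the flow from remaining partial volume data to combined partial volume data might look like the following, assuming the helper functions from the previous sketches (`first_step_transform`, `second_step_transform`, `combine_adjusted_volumes`) are in the same module. The `rasterize` helper is hypothetical: some resampling back onto a common grid is implied by the voxel-wise combination but is not detailed in the text, so a nearest-neighbour version is assumed here.

```python
import numpy as np

def rasterize(coords, lum, grid_shape, origin):
    """Hypothetical nearest-neighbour resampling of transformed voxel
    coordinates onto a common grid; voxels without data stay NaN."""
    vol = np.full(grid_shape, np.nan)
    idx = np.rint(coords - origin).astype(int)
    ok = np.all((idx >= 0) & (idx < np.array(grid_shape)), axis=1)
    vol[tuple(idx[ok].T)] = lum[ok]
    return vol

def build_combined_volume(partials, grid_shape, origin):
    """partials: iterable of (coords, luminances, rep_pos, rep_vec) tuples,
    one per remaining partial volume data."""
    adjusted = []
    for coords, lum, rep_pos, rep_vec in partials:
        c1 = first_step_transform(coords, rep_pos, rep_vec)   # eqs. 6-9
        c2 = second_step_transform(c1, lum)                   # eqs. 10-15
        adjusted.append(rasterize(c2, lum, grid_shape, origin))
    return combine_adjusted_volumes(adjusted, mode="mean")    # steps S20-S23
```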
As described above, according to the present embodiment, the plurality of partial volume data (adjusted partial volume data) are spatially combined so that the position and orientation of the site of interest (for example, the fetal face) are aligned among the plurality of partial volume data. As a result, even when a data missing portion exists in some partial volume data, the missing portion can be compensated by other partial volume data while the position and orientation of the site of interest are kept aligned among the plurality of partial volume data. It is therefore possible to generate a three-dimensional image with reduced data loss. For example, if rendering processing were applied to the adjusted partial volume data 122a alone, the resulting three-dimensional image would be expected to contain an image missing portion corresponding to the data missing portion 130a. According to the present embodiment, by contrast, the data missing portion 130a is compensated by other data, so that missing portions in the image can be prevented or reduced. Furthermore, by applying the translation processing and the rotation correction processing, the plurality of partial volume data can be combined with the position and orientation of the site of interest aligned among them, which prevents or reduces positional and orientational misalignment caused by the combination.
The processing according to the present embodiment may be applied to a plurality of volume data stored in the 3D memory 18, or may be applied sequentially to volume data acquired in real time. When the processing is applied to a plurality of volume data stored in the 3D memory 18, it is applied, for example, to volume data acquired within a period specified by the user or within an automatically set period. When the processing is applied to volume data acquired in real time, it is applied each time volume data is acquired, so that combined partial volume data and three-dimensional images are generated sequentially.
18 3D memory, 20 three-dimensional image generation unit, 32 region-of-interest setting unit, 38 partial volume data extraction unit, 40 adjustment information calculation unit, 42 partial volume data adjustment unit, 46 partial volume data combining unit, 50 rendering unit.

Claims (6)

1.  An ultrasonic diagnostic apparatus comprising:
    a partial volume data generation unit that applies, to each of a plurality of volume data sequentially acquired by transmission and reception of ultrasonic waves with respect to a three-dimensional space in a subject, processing for extracting a portion including a site of interest, thereby generating a plurality of partial volume data;
    a combining unit that spatially combines the plurality of partial volume data so that the site of interest is aligned among them, thereby generating combined partial volume data; and
    an ultrasonic image generation unit that applies rendering processing to the combined partial volume data, thereby generating a three-dimensional ultrasonic image.
2.  The ultrasonic diagnostic apparatus according to claim 1, wherein
    the combining unit combines the plurality of partial volume data while adjusting the positions and orientations of the plurality of partial volume data so that a representative position and a representative orientation of the site of interest are aligned among the plurality of partial volume data.
3.  The ultrasonic diagnostic apparatus according to claim 2, wherein the combining unit includes:
    a normal vector calculation unit that calculates a normal vector on a voxel-by-voxel basis for each piece of partial volume data, thereby obtaining a normal vector group for each piece of partial volume data;
    a representative position calculation unit that calculates the representative position for each piece of partial volume data on the basis of the normal vector group; and
    a representative orientation calculation unit that calculates the representative orientation for each piece of partial volume data on the basis of the normal vector group.
4.  The ultrasonic diagnostic apparatus according to claim 3, wherein
    the representative position calculation unit calculates the representative position by fitting an approximate figure to the partial volume data, and
    the representative orientation calculation unit calculates the representative orientation by fitting the approximate figure to the partial volume data.
5.  The ultrasonic diagnostic apparatus according to claim 4, wherein
    the approximate figure is a hollow ellipsoid.
6.  An ultrasonic image processing method comprising:
    a partial volume data generation step of applying, to each of a plurality of volume data sequentially acquired by transmission and reception of ultrasonic waves with respect to a three-dimensional space in a subject, processing for extracting a portion including a site of interest, thereby generating a plurality of partial volume data;
    a combining step of spatially combining the plurality of partial volume data so that the site of interest is aligned among them, thereby generating combined partial volume data; and
    an ultrasonic image generation step of applying rendering processing to the combined partial volume data, thereby generating a three-dimensional ultrasonic image.
PCT/JP2016/059805 2015-07-03 2016-03-28 Ultrasonic diagnosing device and ultrasonic image processing method WO2017006595A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015134260A JP6063525B1 (en) 2015-07-03 2015-07-03 Ultrasonic diagnostic apparatus and program
JP2015-134260 2015-07-03

Publications (1)

Publication Number Publication Date
WO2017006595A1 true WO2017006595A1 (en) 2017-01-12

Family

ID=57685395

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/059805 WO2017006595A1 (en) 2015-07-03 2016-03-28 Ultrasonic diagnosing device and ultrasonic image processing method

Country Status (2)

Country Link
JP (1) JP6063525B1 (en)
WO (1) WO2017006595A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7341668B2 (en) * 2018-02-16 2023-09-11 キヤノンメディカルシステムズ株式会社 Medical image diagnostic equipment, medical image processing equipment, and image processing programs

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004195028A (en) * 2002-12-19 2004-07-15 Aloka Co Ltd Ultrasonic diagnostic apparatus
JP2007330764A (en) * 2006-01-10 2007-12-27 Toshiba Corp Ultrasonic diagnostic apparatus and ultrasonic image creating method
JP2008073301A (en) * 2006-09-22 2008-04-03 Toshiba Corp Medical imaging diagnostic apparatus and medical image processor
JP2009131420A (en) * 2007-11-29 2009-06-18 Toshiba Corp Ultrasonic image diagnosing device
WO2012140984A1 (en) * 2011-04-14 2012-10-18 株式会社 日立メディコ Ultrasound diagnostic apparatus and ultrasound image-rendering method
JP2012239576A (en) * 2011-05-18 2012-12-10 Hitachi Aloka Medical Ltd Ultrasonic diagnostic apparatus

Also Published As

Publication number Publication date
JP6063525B1 (en) 2017-01-18
JP2017012587A (en) 2017-01-19

Similar Documents

Publication Publication Date Title
JP5400466B2 (en) Diagnostic imaging apparatus and diagnostic imaging method
JP5632680B2 (en) Ultrasonic image processing device
JP6288996B2 (en) Ultrasonic diagnostic apparatus and ultrasonic imaging program
US11160534B2 (en) Utilizing depth from ultrasound volume rendering for 3D printing
JP6342212B2 (en) Ultrasonic diagnostic equipment
JP6242025B2 (en) Ultrasonic imaging apparatus and ultrasonic image display method
CN103251429A (en) Ultrasonic imaging apparatus
EP3139838B1 (en) Imaging systems and methods for positioning a 3d ultrasound volume in a desired orientation
JP2006218210A (en) Ultrasonic diagnostic apparatus, ultrasonic image generating program and ultrasonic image generating method
US20110137168A1 (en) Providing a three-dimensional ultrasound image based on a sub region of interest in an ultrasound system
CN115486877A (en) Ultrasonic equipment and method for displaying three-dimensional ultrasonic image
JP2002102226A (en) Method for computing distance between image frames, three-dimensional image generating system, and method therefor
JP2020531086A (en) An ultrasound system that extracts an image plane from volume data using touch interaction with an image
JP2009291295A5 (en)
JP5566841B2 (en) Image processing apparatus and program
JP6063525B1 (en) Ultrasonic diagnostic apparatus and program
JP6501796B2 (en) Acquisition Orientation Dependent Features for Model-Based Segmentation of Ultrasound Images
JP2007222264A (en) Ultrasonograph
KR102321853B1 (en) Method and system for enhanced visualization of moving structures with cross-plane ultrasound images
JP5841771B2 (en) Ultrasonic data processor
JP7275261B2 (en) 3D ULTRASOUND IMAGE GENERATING APPARATUS, METHOD, AND PROGRAM
JP6545969B2 (en) Ultrasonic diagnostic equipment
US8579817B2 (en) Breast ultrasonic diagnostic apparatus and breast ultrasonic diagnostic method
JP5701362B2 (en) Diagnostic imaging apparatus and diagnostic imaging method
JP5396054B2 (en) Ultrasonic diagnostic equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16821066

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16821066

Country of ref document: EP

Kind code of ref document: A1