CN112102469B - Three-dimensional modeling system, scanning system and control method thereof - Google Patents


Info

Publication number
CN112102469B
CN112102469B (application CN202010796134.4A)
Authority
CN
China
Prior art keywords
dimensional
image
light source
detected
point
Prior art date
Legal status
Active
Application number
CN202010796134.4A
Other languages
Chinese (zh)
Other versions
CN112102469A (en)
Inventor
王明超
Current Assignee
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN202010796134.4A priority Critical patent/CN112102469B/en
Publication of CN112102469A publication Critical patent/CN112102469A/en
Application granted granted Critical
Publication of CN112102469B publication Critical patent/CN112102469B/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P10/00: Technologies related to metal processing
    • Y02P10/25: Process efficiency

Abstract

The application relates to a three-dimensional modeling system, a scanning system and a control method thereof. The control method of the three-dimensional modeling system comprises: controlling a light source to project laser light onto a body to be detected in a direction perpendicular to a first surface; controlling the bed body to move in a direction perpendicular to the end face of the gantry aperture; and recording the accumulated displacement length of the bed body a plurality of times along its moving direction. Each time the accumulated displacement length is recorded, the image acquisition device is controlled to photograph the contour line of the body to be detected illuminated by the laser and to obtain a two-dimensional image of the contour line. A three-dimensional model of the body to be detected is constructed from the plurality of two-dimensional images and the plurality of accumulated displacement lengths. The control method makes full use of the movement of the bed body during the scanning preparation stage: only two-dimensional images of the contour lines need to be acquired and two-dimensional data calculated, which simplifies the calculation of three-dimensional modeling data before scanning.

Description

Three-dimensional modeling system, scanning system and control method thereof
Technical Field
The present disclosure relates to the field of detection technologies, and in particular, to a three-dimensional modeling system, a scanning system, and a control method thereof.
Background
Magnetic resonance imaging (MRI) causes no radiation damage, no wound, no pain and no harm to the human body, and is one of the most advanced non-invasive imaging examination techniques currently available.
In magnetic resonance scanning or CT detection, relevant parameters generally need to be determined according to the external characteristics of the human body. In the prior art, several 3D cameras are typically used to capture multiple positions, and the resulting three-dimensional images are then spatially reconstructed, which requires three-dimensional data to be calculated. Three-dimensional modeling of data before scanning is therefore computationally complex in the prior art.
Disclosure of Invention
Based on this, it is necessary to provide a three-dimensional modeling system, a scanning system and a control method thereof that address the problem of how to simplify the calculation of three-dimensional modeling data before scanning.
A control method of a three-dimensional modeling system is provided. The three-dimensional modeling system comprises a bed body, a light source and an image acquisition device. The bed body includes a first surface on which a body to be detected is placed. The bed body is configured to be disposed toward the entrance of a gantry aperture of a scanning system. The light source is adapted to be fixed to an edge of the entrance. The control method comprises the following steps:
S100, controlling the light source to project laser light onto the body to be detected in a direction perpendicular to the first surface.
S200, controlling the bed body to move in a direction perpendicular to the end face of the gantry aperture.
S300, recording the accumulated displacement length of the bed body a plurality of times along the moving direction of the bed body; each time the accumulated displacement length is recorded, controlling the image acquisition device to photograph the contour line of the body to be detected illuminated by the laser and to obtain a two-dimensional image of the contour line, the plurality of two-dimensional images corresponding one to one to the plurality of accumulated displacement lengths.
S400, constructing a three-dimensional model of the body to be detected from the plurality of two-dimensional images and the plurality of accumulated displacement lengths.
In one embodiment, the contour line includes a plurality of contour points, and S400 includes:
S410, acquiring, from the two-dimensional image, a plurality of image points corresponding one to one to the plurality of contour points.
S420, obtaining the position of the contour point corresponding to each image point according to the position of that image point in the two-dimensional image.
In one embodiment, the image acquisition device comprises a camera sensor and a camera lens arranged at an interval. The focusing center line of the image acquisition device passes through the focal point of the camera lens and is perpendicular to the camera lens. The light source illuminates in a direction perpendicular to the first surface and forms a laser plane. The intersection of the focusing center line and the laser plane is the center point. S420 includes:
S421, acquiring the central image point corresponding to the center point in the two-dimensional image, and acquiring a first relative coordinate of the image point relative to the central image point.
S422, acquiring the included angle between the focusing center line and the laser plane, a first distance from the focal point to the center point, and a second distance from the focal point to the camera sensor, and obtaining a second relative coordinate of the contour point corresponding to the image point from the included angle, the first distance, the second distance, the first relative coordinate and the accumulated displacement length corresponding to the two-dimensional image.
In one embodiment, the light source is a linear light source or a point light source.
A control method of a scanning system includes the control method of the three-dimensional modeling system according to any one of the embodiments described above. After the three-dimensional model of the body to be detected is built, the control method further comprises selecting scanning parameters according to the three-dimensional model and completing the scan of the body to be detected according to the scanning parameters.
In one embodiment, the scan parameters include specific absorption rate (SAR) parameters of a magnetic resonance imaging (MRI) scan and/or the dose of a computed tomography (CT) scan.
A three-dimensional modeling system comprises a bed body, a driving device, a light source, an image acquisition device and a central control device.
The bed body includes a first surface on which the body to be detected is placed. The bed body is arranged toward the entrance of the gantry aperture of the detection device.
The driving device is fixed to the bed body and is used for driving the bed body into the gantry aperture.
The light source is fixed to the entrance edge of the gantry aperture and is used for projecting laser light onto the body to be detected in a direction perpendicular to the first surface, forming a laser plane. The laser plane intersects the surface of the body to be detected and forms a contour line.
The included angle between the focusing center line of the image acquisition device and the laser plane is an acute angle. The image acquisition device is used for photographing the contour line and obtaining a two-dimensional image of the contour line.
The driving device, the light source and the image acquisition device are each connected to the central control device. The central control device is used for controlling the light source to project laser light onto the body to be detected in a direction perpendicular to the first surface, for controlling the driving device to drive the bed body to move in a direction perpendicular to the end face of the gantry aperture, and for recording the accumulated displacement length of the bed body a plurality of times. Each time the accumulated displacement length is recorded, the central control device controls the image acquisition device to photograph the contour line once and obtain a two-dimensional image of the contour line, so that the plurality of two-dimensional images correspond one to one to the plurality of accumulated displacement lengths. The central control device is used for constructing a three-dimensional model of the body to be detected from the plurality of two-dimensional images and the plurality of accumulated displacement lengths.
In one embodiment, the three-dimensional modeling system further comprises a fixing plate. The fixing plate is fixedly arranged at the edge of the entrance of the gantry aperture and is spaced apart from and opposite to the first surface. The light source and the image acquisition device are fixed, at an interval from each other, on the surface of the fixing plate that faces the first surface.
In one embodiment, the image acquisition device is disposed opposite a midpoint of the light source.
A scanning system comprising the three-dimensional modeling system of any of the embodiments described above is also provided. The scanning system further includes a gantry connected to the central control device. The central control device is used for determining scanning parameters according to the three-dimensional model and controlling the gantry to scan the body to be detected according to the scanning parameters.
In the control method of the three-dimensional modeling system provided by the embodiments of the present application, the three-dimensional modeling system comprises a bed body, a light source and an image acquisition device. The bed body includes a first surface on which the body to be detected is placed. The bed body is configured to be disposed toward the entrance of the gantry aperture of a scanning system, and the light source is fixed to the edge of the entrance. The control method includes controlling the light source to project laser light onto the body to be detected in a direction perpendicular to the first surface, controlling the bed body to move in a direction perpendicular to the end face of the gantry aperture, and recording the accumulated displacement length of the bed body a plurality of times along its moving direction. Each time the accumulated displacement length is recorded, the image acquisition device is controlled to photograph the contour line of the body to be detected illuminated by the laser and to obtain a two-dimensional image of the contour line, so that the plurality of two-dimensional images correspond one to one to the plurality of accumulated displacement lengths. A three-dimensional model of the body to be detected is constructed from the plurality of two-dimensional images and the plurality of accumulated displacement lengths.
In this control method, a plurality of accumulated displacement lengths and a corresponding plurality of two-dimensional images are recorded while the bed body enters the gantry aperture, and the contour lines of different parts along the length direction of the body to be detected are reconstructed. The control method makes full use of the movement of the bed body during the scanning preparation stage and only needs to acquire two-dimensional images of the contour lines and calculate two-dimensional data; no multiple three-dimensional images are required for three-dimensional data calculation, which simplifies the calculation of three-dimensional modeling data before scanning.
Drawings
FIG. 1 is a flow chart of a control method of the three-dimensional modeling system provided in one embodiment of the present application;
FIG. 2 is a schematic diagram of the three-dimensional modeling system provided in one embodiment of the present application;
FIG. 3 is a partial schematic view of a human body entering the gantry aperture according to one embodiment of the present application;
FIG. 4 is a cross-sectional view along A-A provided in one embodiment of the present application;
FIG. 5 is a schematic diagram of z-axis mirroring for target point 1 provided in one embodiment of the present application;
FIG. 6 is a top view of the position of target point 1 provided in one embodiment of the present application;
FIG. 7 is a diagram of image point locations on the camera sensor provided in one embodiment of the present application;
FIG. 8 is a schematic diagram of x-axis mirroring provided in one embodiment of the present application;
FIG. 9 is a position diagram of the marker lines at an initial calibration stage provided in one embodiment of the present application;
FIG. 10 is a graph of calibration positions of the marker line when the angle of the image capturing device or the light source is changed according to one embodiment of the present application;
FIG. 11 is a calibration chart of marker lines in a two-dimensional image provided in one embodiment of the present application.
Reference numerals:
10. a three-dimensional modeling system; 100. a scanning system; 20. a bed body; 210. a first surface; 200. a body to be detected; 30. a gantry; 310. a gantry aperture; 311. an entrance; 40. a driving device; 50. a light source; 501. a midpoint; 510. a laser plane; 520. a contour line; 60. an image acquisition device; 610. a camera sensor; 620. a camera lens; 601. a focusing center line; 602. a focal point; 101. a center point; α. an included angle; 70. a central control device; 80. a fixing plate.
Detailed Description
In order to make the above objects, features and advantages of the present application more comprehensible, embodiments are described in detail below with reference to the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The present application may, however, be embodied in many other ways than those described herein, and similar modifications can be made by those skilled in the art without departing from the spirit of the application; the application is therefore not limited to the specific embodiments disclosed below.
The component numbering itself, e.g. "first", "second", etc., is used herein only to distinguish the objects described and does not have any sequential or technical meaning. The terms "coupled" and "connected", as used herein, encompass both direct and indirect coupling (connection), unless otherwise indicated. In the description of the present application, it should be understood that terms such as "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise" and "counterclockwise" indicate orientations or positional relationships based on those shown in the drawings, merely for convenience and simplicity of description; they do not indicate or imply that the devices or elements referred to must have a specific orientation or be constructed and operated in a specific orientation, and therefore should not be construed as limiting the present application.
In this application, unless expressly stated or limited otherwise, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that they are in indirect contact via an intervening medium. Moreover, a first feature being "above", "over" or "on" a second feature may mean that the first feature is directly above or obliquely above the second feature, or simply that the first feature is at a higher level than the second feature. A first feature being "under", "below" or "beneath" a second feature may mean that the first feature is directly below or obliquely below the second feature, or simply that the first feature is at a lower level than the second feature.
Referring to FIGS. 1, 2, 3 and 4, an embodiment of the present application provides a control method of a three-dimensional modeling system 10. The three-dimensional modeling system 10 includes a bed body 20, a light source 50 and an image acquisition device 60. The bed body 20 includes a first surface 210 on which the body to be detected 200 is placed. The bed body 20 is configured to be disposed toward the entrance 311 of the gantry aperture 310 of the scanning system 100. The light source 50 is adapted to be fixed to the edge of the entrance 311. The control method comprises the following steps:
S100, controlling the light source 50 to project laser light onto the body to be detected 200 in a direction perpendicular to the first surface 210. The laser light projected in this direction forms a laser plane 510.
S200, controlling the bed body 20 to move in a direction perpendicular to the end face of the gantry aperture 310.
S300, recording the accumulated displacement length of the bed body 20 a plurality of times along the moving direction of the bed body 20; each time the accumulated displacement length is recorded, controlling the image acquisition device 60 to photograph the contour line 520 of the body to be detected 200 illuminated by the laser and to obtain a two-dimensional image of the contour line 520, the plurality of two-dimensional images corresponding one to one to the plurality of accumulated displacement lengths. The contour line 520 is a portion of the intersection line of the laser plane 510 and the body to be detected 200.
S400, constructing a three-dimensional model of the body to be detected 200 from the plurality of two-dimensional images and the plurality of accumulated displacement lengths.
In the control method of the three-dimensional modeling system 10 provided in this embodiment, a plurality of accumulated displacement lengths and a corresponding plurality of two-dimensional images are recorded while the bed body 20 enters the gantry aperture 310, and the contour lines 520 of different positions along the length direction of the body to be detected 200 are reconstructed. The control method makes full use of the movement of the bed body into the gantry aperture 310 during the scanning preparation stage: only two-dimensional images of the contour line 520 need to be acquired and two-dimensional data calculated. No multiple three-dimensional images are needed for three-dimensional data calculation, which simplifies the calculation of three-dimensional modeling data before scanning.
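The acquisition sequence of steps S100 to S300 can be summarized as a short control loop. The sketch below is only an illustration under assumed interfaces: the bed, light_source and camera objects and their methods (move_step, read_displacement, capture) are hypothetical placeholders and are not part of the disclosure.

```python
# Illustrative sketch of the acquisition loop (steps S100-S300); all device
# interfaces used here are assumed placeholders.

def acquire_profiles(bed, light_source, camera, total_travel, step):
    """Collect (accumulated displacement, 2-D image) pairs while the bed body
    moves into the gantry aperture."""
    light_source.on()                          # S100: project the laser plane onto the body
    samples = []
    travelled = 0.0
    while travelled < total_travel:            # S200: bed body moves toward the aperture
        bed.move_step(step)                    # advance the bed by one increment
        travelled = bed.read_displacement()    # S300: record the accumulated displacement length
        image = camera.capture()               # photograph the illuminated contour line
        samples.append((travelled, image))     # one two-dimensional image per recorded length
    light_source.off()
    return samples                             # input to the three-dimensional reconstruction (S400)
```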
In the prior art, the scanning system 100 generally includes the gantry aperture 310 and the bed body 20. The body to be detected 200 is generally a human body or a specific region of the human body, such as the head, chest or abdomen. The scanning system 100 further comprises a driving device 40 fixed to the bed body 20. When a part of the human body is to be scanned, the human body lies on the first surface 210 of the bed body 20, and the bed body 20 is driven into the gantry aperture 310 by the driving device 40. The control method of the three-dimensional modeling system 10 makes full use of this movement of the bed body 20 during the scanning preparation stage.
In one embodiment, the light source 50 comprises a line laser lamp. The laser lamp has good light-focusing properties and effectively avoids light scattering. The light source 50 may also be formed by sweeping a point light source.
In one embodiment, the image acquisition device 60 is a camera, a video recorder, a video camera, a depth camera or the like.
The image acquisition area of the image acquisition device 60 encloses the projection area of the light source 50. The length of the projection area of the light source 50 in the width direction of the bed 20 is not smaller than the width of the bed 20.
The light source 50 is a linear light source or a point light source. In one embodiment, before S100, the control method further includes:
positioning the body to be detected 200 on the bed body 20 such that the long axis of the body to be detected 200 coincides with the center line (central vertical plane) of the gantry aperture 310.
In one embodiment, the contour line 520 includes a plurality of contour points, and S400 includes:
S410, acquiring, from the two-dimensional image, a plurality of image points (pixel points) corresponding one to one to the plurality of contour points; one possible way of extracting these image points is sketched after these steps.
S420, obtaining the position of the contour point corresponding to each image point according to the position of that image point in the two-dimensional image.
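The description does not specify how the image points of the contour line are extracted from the two-dimensional image in S410. One common approach, shown here purely as an illustrative sketch, is to take the brightest pixel in each sensor column as the image point of the laser line.

```python
import numpy as np

def extract_laser_points(image, threshold=50):
    """Return (x_pixel, y_pixel) image points of the laser contour line.

    Assumed convention: `image` is a 2-D grayscale array whose columns follow the
    sensor x axis; the laser line appears as the brightest response in each column.
    """
    points = []
    for x in range(image.shape[1]):
        column = image[:, x]
        y = int(np.argmax(column))        # brightest pixel in this column
        if column[y] >= threshold:        # skip columns the laser does not reach
            points.append((x, y))
    return points
```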
In one embodiment, the image acquisition device 60 includes a camera sensor 610 and a camera lens 620 arranged at an interval. The focusing center line 601 of the image acquisition device 60 passes through the focal point 602 of the camera lens 620 and is perpendicular to the camera lens 620. The light source 50 illuminates in a direction perpendicular to the first surface 210 and forms a laser plane 510. The intersection of the focusing center line 601 and the laser plane 510 is the center point 101. S420 includes:
S421, acquiring the central image point corresponding to the center point 101 in the two-dimensional image, and acquiring a first relative coordinate of the image point relative to the central image point.
S422, acquiring the included angle between the focusing center line 601 and the laser plane 510, a first distance from the focal point 602 to the center point 101, and a second distance from the focal point 602 to the camera sensor 610, and obtaining a second relative coordinate of the contour point corresponding to the image point from the included angle, the first distance, the second distance, the first relative coordinate and the accumulated displacement length corresponding to the two-dimensional image.
In one embodiment, the first relative coordinate includes a first coordinate value (the offset of the image point along the x-axis direction in the two-dimensional image) and a second coordinate value (the offset of the image point along the y-axis direction in the two-dimensional image). The second relative coordinate includes a third coordinate value, a fourth coordinate value and a fifth coordinate value. S422 includes:
S01, obtaining the third coordinate value (the offset of the target point along the x-axis direction in three-dimensional coordinates) from the first distance, the second distance and the first coordinate value.
S02, obtaining the fourth coordinate value (the offset of the target point along the y-axis direction in three-dimensional coordinates) from the accumulated displacement length.
S03, obtaining the fifth coordinate value (the offset of the target point along the z-axis direction in three-dimensional coordinates) from the included angle, the first distance, the second distance and the second coordinate value.
Referring to FIGS. 5, 6 and 7, in one embodiment the contour line 520 includes a target point 1 whose coordinates in three-dimensional space are (j, 0, h). The corresponding point of target point 1 in the two-dimensional image is (j', h').
When target point 1 corresponds to the light emitted from the midpoint of the light source 50, the image point of target point 1 lies on the vertical line through the central image point.
When target point 1 does not correspond to the light emitted from the midpoint of the light source 50, the change in the height of target point 1 manifests itself as a shift of the image point along the y-axis.
Referring to FIG. 8, in one embodiment, S03 consists of substituting the included angle, the first distance, the second distance and the second coordinate value into a second formula to obtain the fifth coordinate value, the second formula being:
h = u × h' / (v × sin α + h' × cos α)   (2)
where h is the fifth coordinate value, h' is the second coordinate value, u is the first distance, v is the second distance, and α is the included angle.
The image point of target point 1 on the camera sensor is shifted along the y-axis direction by h' because of the height of the detected point. In the same way, the projection of target point 2 onto the camera sensor is shifted along the y-axis direction by h2' because of the height of that point; its height h2 can be calculated by the same method, and its coordinates are recorded as (j, 0, h2). Using the resolution of the camera sensor along the x direction, the three-dimensional coordinates of every point onto which the laser line is projected can be calculated discretely.
The displacement of the bed body between two samples is smaller than the image acquisition range of the image acquisition device along the y-axis direction, and the sampling frequency of the image acquisition device per second is greater than the moving frequency of the bed body per second.
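These two sampling conditions can be checked before acquisition starts. The helper below is only an illustrative sketch; the parameter names are assumptions.

```python
def sampling_conditions_ok(step_per_sample, acquisition_range_y, samples_per_second, bed_moves_per_second):
    """Check the conditions stated above: the bed displacement between two samples must
    stay within the image acquisition range along the y axis, and the image acquisition
    device must sample more often per second than the bed body moves."""
    return step_per_sample < acquisition_range_y and samples_per_second > bed_moves_per_second
```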
In one embodiment, S01 consists of substituting the first distance, the second distance and the first coordinate value into a first formula to obtain the third coordinate value, the first formula being:
j = u / v × j'   (1)
where j is the third coordinate value, j' is the first coordinate value, u is the first distance, and v is the second distance. The offset j of target point 1 along the x direction and the offset j' of its projection on the camera sensor along the x direction relative to the center point are linearly related; their ratio depends on the magnification of the camera lens, the focal point 602 lying on the focusing center line 601.
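Formulas (1) and (2), together with the accumulated displacement length, map an image point directly to a three-dimensional contour point. The sketch below applies the two formulas with the symbols defined above (u, v, α, j', h'); the function name and the choice of units are illustrative assumptions.

```python
import math

def image_point_to_contour_point(j_prime, h_prime, u, v, alpha, displacement):
    """Map an image point (j', h'), given relative to the central image point,
    to the contour point (j, y, h) in three-dimensional coordinates.

    u     : first distance (focal point to center point)
    v     : second distance (focal point to camera sensor)
    alpha : included angle between the focusing center line and the laser plane, in radians
    displacement : accumulated displacement length recorded for this two-dimensional image
    """
    j = (u / v) * j_prime                                                  # formula (1)
    h = u * h_prime / (v * math.sin(alpha) + h_prime * math.cos(alpha))    # formula (2)
    y = displacement                                                       # offset along the moving direction
    return j, y, h
```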
The present embodiment provides a control method of the scanning system 100, which includes the control method of the three-dimensional modeling system 10 according to any one of the embodiments described above. After the three-dimensional model of the body to be detected 200 is built, the control method further includes selecting scanning parameters according to the three-dimensional model and completing the scan of the body to be detected 200 according to the scanning parameters.
The control method of the scanning system 100 according to this embodiment makes full use of the movement of the bed body during the scanning preparation stage. It only needs to acquire two-dimensional images of the contour line 520 and calculate two-dimensional data; no multiple three-dimensional images are required for three-dimensional data calculation, which simplifies the calculation of three-dimensional modeling data before scanning. Because the three-dimensional modeling is performed during the scanning preparation stage, no additional time outside the scanning process is needed, and the overall detection time is saved.
In one embodiment, the scan parameters include specific absorption rate (SAR) parameters of a magnetic resonance imaging (MRI) scan and/or the dose of a computed tomography (CT) scan.
The modeling accuracy of the three-dimensional modeling system 10 is related to the angle between the image acquisition device 60 and the light source 50. After long-term operation, vibration or other factors may change this angle, leading to deviations in the calculation.
Referring to FIG. 9, in one embodiment, before the step of controlling the light source 50 to project laser light onto the body to be detected 200 in a direction perpendicular to the first surface 210, the control method of the three-dimensional modeling system 10 further includes:
and (5) initial calibration. The step of initial calibration includes:
the angle of the image capturing device 60 is adjusted such that the laser line of the first surface 210 captured by the camera sensor 610 is located at the center line of the camera sensor 610.
The height of the image acquisition device 60 relative to the ground is fixed, and the horizontal distance of the light source 50 from the image acquisition device 60 relative to the gantry aperture 310 is fixed. The plane in which the center of the gantry aperture 310 lies is taken as the reference plane. When the first surface 210 of the bed body 20 is at the reference plane, the distance between the image acquisition device 60 and the reference plane is H, and the distance between the light source 50 and the image acquisition device 60 is A.
The included angle between the focusing center line 601 and the laser plane 510 is:
α=arctan(A/H)。
α is one of the parameters used in the height calculation described above.
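As a small illustration of the initial calibration, the included angle follows directly from the two fixed distances; the numeric values in the last line are purely illustrative.

```python
import math

def initial_included_angle(A, H):
    """A is the horizontal distance between the light source and the image acquisition
    device; H is the distance from the image acquisition device to the reference plane
    through the center of the gantry aperture."""
    return math.atan(A / H)    # included angle between the focusing center line and the laser plane

alpha = initial_included_angle(A=0.35, H=1.20)    # example distances in meters (assumed values)
```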
Referring to fig. 10 and 11 together, in one embodiment, the control method of the three-dimensional modeling system 10 further includes:
the first surface 210 of the bed 20 is provided with marking lines. The marking line is perpendicular to the moving direction of the bed 20. The bed 20 is moved so that the marking line coincides with the laser line of the first surface 210.
In one embodiment, the control method of the three-dimensional modeling system 10 further includes:
after the mark line coincides with the laser line of the first surface 210, the absolute value M1 of the horizontal position movement of the bed 20 at this time is recorded.
During image acquisition, the height of the bed body 20 is constant. If the acquired laser line on the first surface 210 deviates from the center line of the camera sensor 610, the angle of the image acquisition device 60 or of the light source 50 has changed.
When the marking line no longer coincides with the laser line on the first surface 210, the laser line in the two-dimensional image acquired by the camera sensor 610 is shifted from the center line along the y-axis direction by h1', and the absolute value M2 of the horizontal movement of the bed body 20 is recorded. M2 ≠ M1, and M2 - M1 = A'.
In one embodiment, the control method of the three-dimensional modeling system 10 further includes:
when the marking line is not coincident with the laser line of the first surface 210, the horizontal displacement a' along the length direction of the bed 20 is adjusted so that the marking line coincides with the laser line of the first surface 210. Corresponds to the height H1 of the image acquisition device 60 or the light source 50.
The relationship between H1 and A' is:
H1/(H1 + H) = A'/A, i.e. H1 = H/(A/A' - 1).
The calculation formula of the adjusted included angle is as follows:
α’=artan(A/(H1+H))。
taking h1=h/(a/a '-1) into α' =artan (a/(h1+h)), yields:
α’=artan(A 2 /H(A’-1))。
h1+h and α' are calibrated parameters.
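The recalibration above can likewise be written as a short sketch. M1 and M2 are the recorded horizontal bed positions at the two coincidence checks, and A and H are the initially calibrated distances; the function simply applies H1 = H/(A/A' - 1) and α' = arctan(A/(H1 + H)) and is an illustration rather than part of the disclosure.

```python
import math

def recalibrated_included_angle(M1, M2, A, H):
    """Recompute the included angle after the marker line no longer coincides with
    the laser line. A' = M2 - M1 is the horizontal bed displacement needed to
    restore coincidence."""
    A_prime = M2 - M1
    if A_prime == 0:
        return math.atan(A / H)        # no drift detected: angle unchanged
    H1 = H / (A / A_prime - 1)         # equivalent height change of the camera or light source
    return math.atan(A / (H1 + H))     # adjusted included angle alpha'
```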
The embodiment of the application provides a three-dimensional modeling system 10, which comprises a bed 20, a driving device 40, a light source 50, an image acquisition device 60 and a central control device 70.
The bed body 20 includes a first surface 210 on which the body to be detected 200 is placed. The bed body 20 is adapted to be disposed toward the entrance 311 of the gantry aperture 310 of the detection device.
The driving device 40 is fixed to the bed body 20 and is used for driving the bed body 20 into the gantry aperture 310.
The light source 50 is fixed to the edge of the entrance 311 of the gantry aperture 310 and is configured to project laser light onto the body to be detected 200 in a direction perpendicular to the first surface 210, forming a laser plane 510. The laser plane 510 intersects the surface of the body to be detected 200 and forms a contour line 520.
The included angle between the focusing center line 601 of the image acquisition device 60 and the laser plane 510 is an acute angle. The image acquisition device 60 is used for capturing the contour line 520 and obtaining a two-dimensional image of the contour line 520.
The driving device 40, the light source 50 and the image acquisition device 60 are each connected to the central control device 70. The central control device 70 is configured to control the light source 50 to project laser light onto the body to be detected 200 in a direction perpendicular to the first surface 210, to control the driving device 40 to drive the bed body 20 to move in a direction perpendicular to the end face of the gantry aperture 310, and to record the accumulated displacement length of the bed body 20 a plurality of times. Each time the accumulated displacement length is recorded, the central control device 70 controls the image acquisition device 60 to photograph the contour line 520 once and obtain a two-dimensional image of the contour line 520, so that the plurality of two-dimensional images correspond one to one to the plurality of accumulated displacement lengths. The central control device 70 is configured to construct a three-dimensional model of the body to be detected 200 from the plurality of two-dimensional images and the plurality of accumulated displacement lengths.
In the three-dimensional modeling system 10 provided in the embodiments of the present application, the light source 50 is fixed to the edge of the entrance 311 of the gantry aperture 310 and projects laser light onto the body to be detected 200 in a direction perpendicular to the first surface 210, while the image acquisition device 60 photographs the contour line 520 and obtains a two-dimensional image of it. The three-dimensional modeling system 10 makes full use of the movement of the bed body 20 during the scanning preparation stage: only two-dimensional images of the contour line 520 need to be acquired and two-dimensional data calculated, so no multiple three-dimensional images are required for three-dimensional data calculation, which simplifies the calculation of three-dimensional modeling data before scanning. Because the three-dimensional modeling is performed during the scanning preparation stage, no additional time outside the scanning process is needed and the overall detection time is saved.
In one embodiment, the three-dimensional modeling system 10 further includes a fixing plate 80. The fixing plate 80 is fixedly arranged at the edge of the entrance 311 of the gantry aperture 310 and is spaced apart from and opposite to the first surface 210. The light source 50 and the image acquisition device 60 are fixed, at an interval from each other, on the surface of the fixing plate 80 that faces the first surface 210.
In one embodiment, the image acquisition device 60 is disposed opposite the midpoint 501 of the light source 50, which simplifies the calculation steps.
Embodiments of the present application provide a scanning system 100 comprising the three-dimensional modeling system 10 described in any of the embodiments above. The scanning system 100 further comprises a gantry 30 connected to the central control device 70. The central control device 70 is configured to determine scanning parameters according to the three-dimensional model and to control the gantry 30 to scan the body to be detected 200 according to the scanning parameters. The scanning system 100 makes use of the movement of the bed body 20 during the scanning preparation stage: only two-dimensional images of the contour line 520 need to be acquired and two-dimensional data calculated, so no multiple three-dimensional images are required for three-dimensional data calculation, which simplifies the calculation of three-dimensional modeling data before scanning. Because the three-dimensional modeling is performed during the scanning preparation stage, no additional time outside the scanning process is needed and the overall detection time is saved.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as the combinations of these technical features are not contradictory, they should be considered to fall within the scope of this description.
The above examples merely represent a few embodiments of the present application and are not to be construed as limiting its scope. It should be noted that various modifications and improvements can be made by those skilled in the art without departing from the concept of the present application, and such modifications and improvements all fall within the scope of protection of the present application. Accordingly, the scope of protection of the present application is defined by the appended claims.

Claims (10)

1. A control method of a three-dimensional modeling system, the three-dimensional modeling system including a bed body, a light source and an image acquisition device, the bed body including a first surface for placing a body to be detected, the bed body being disposed toward an entrance of a gantry aperture of a scanning system, the light source being fixed to an edge of the entrance, the control method comprising:
controlling the light source to project laser light onto the body to be detected in a direction perpendicular to the first surface;
controlling the bed body to move in a direction perpendicular to the end face of the gantry aperture;
recording an accumulated displacement length of the bed body a plurality of times along the moving direction of the bed body, wherein each time the accumulated displacement length is recorded, the image acquisition device is controlled to photograph a contour line of the body to be detected illuminated by the laser and to obtain a two-dimensional image of the contour line, a plurality of the two-dimensional images corresponding one to one to a plurality of the accumulated displacement lengths;
and constructing a three-dimensional model of the body to be detected from the plurality of two-dimensional images and the plurality of accumulated displacement lengths.
2. The control method of the three-dimensional modeling system according to claim 1, wherein the contour line includes a plurality of contour points, and the step of constructing the three-dimensional model of the body to be detected from the plurality of two-dimensional images and the plurality of accumulated displacement lengths includes:
acquiring a plurality of image points corresponding to a plurality of contour points one by one from the two-dimensional image;
and obtaining the position of the contour point corresponding to each image point according to the position of each image point in the two-dimensional image.
3. The method for controlling a three-dimensional modeling system according to claim 2, wherein the image capturing device includes a camera sensor and a camera lens that are disposed at intervals, a focal center line of the image capturing device passes through a focal point of the camera lens, the focal center line is perpendicular to the camera lens, the light source irradiates in a direction perpendicular to the first surface to form a laser plane, an intersection point of the focal center line and the laser plane is a center point, and the step of obtaining a position of the contour point corresponding to the image point according to a position of each image point in the two-dimensional image includes:
acquiring a central image point corresponding to the central point in the two-dimensional image, and acquiring a first relative coordinate of the image point relative to the central image point;
acquiring an included angle between the focusing center line and the laser plane, acquiring a first distance between the focal point and the center point, acquiring a second distance from the focal point to the camera sensor, and obtaining a second relative coordinate of the contour point corresponding to the image point according to the included angle, the first distance, the second distance, the first relative coordinate and the accumulated displacement length corresponding to the two-dimensional image.
4. The method of claim 1, wherein the light source is a linear light source, or a point light source.
5. A control method of a scanning system, characterized by comprising the control method of the three-dimensional modeling system according to any one of claims 1 to 4, the control method further comprising, after the three-dimensional model of the body to be detected is built:
selecting scanning parameters according to the three-dimensional model, and completing the scan of the body to be detected according to the scanning parameters.
6. A method of controlling a scanning system according to claim 5, characterized in that the scanning parameters comprise SAR parameters of an MRI scan and/or a dose of a CT scan.
7. A three-dimensional modeling system, comprising:
a bed body comprising a first surface for placing a body to be detected, the bed body being arranged toward an entrance of a gantry aperture of the detection device;
a driving device fixed to the bed body and used for driving the bed body into the gantry aperture;
a light source fixed to the entrance edge of the gantry aperture and used for projecting laser light onto the body to be detected in a direction perpendicular to the first surface to form a laser plane, the surface of the body to be detected forming a contour line;
an image acquisition device, wherein an included angle between a focusing center line of the image acquisition device and the laser plane is an acute angle, and the image acquisition device is used for photographing the contour line and obtaining a two-dimensional image of the contour line;
and a central control device, wherein the central control device is used for controlling the driving device to drive the bed body to move in a direction perpendicular to the end face of the gantry aperture, the central control device is used for recording an accumulated displacement length of the bed body a plurality of times, each time the accumulated displacement length is recorded the central control device is used for controlling the image acquisition device to photograph the contour line once and obtain a two-dimensional image of the contour line, a plurality of the two-dimensional images correspond one to one to a plurality of the accumulated displacement lengths, and the central control device is used for constructing a three-dimensional model of the body to be detected according to the plurality of two-dimensional images and the plurality of accumulated displacement lengths.
8. The three-dimensional modeling system of claim 7, further comprising:
a fixing plate fixedly arranged at the edge of the entrance of the gantry aperture, the fixing plate being spaced apart from and opposite to the first surface, wherein the light source and the image acquisition device are fixedly arranged, at an interval from each other, on the surface of the fixing plate close to the first surface.
9. The three-dimensional modeling system of claim 8, wherein the image acquisition device is disposed opposite a midpoint of the light source.
10. A scanning system comprising the three-dimensional modeling system according to any one of claims 7 to 9, the scanning system further comprising a gantry, the gantry being connected to the central control device, and the central control device being configured to determine scanning parameters according to the three-dimensional model and to control the gantry to scan the body to be detected according to the scanning parameters.
CN202010796134.4A 2020-08-10 2020-08-10 Three-dimensional modeling system, scanning system and control method thereof Active CN112102469B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010796134.4A CN112102469B (en) 2020-08-10 2020-08-10 Three-dimensional modeling system, scanning system and control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010796134.4A CN112102469B (en) 2020-08-10 2020-08-10 Three-dimensional modeling system, scanning system and control method thereof

Publications (2)

Publication Number Publication Date
CN112102469A CN112102469A (en) 2020-12-18
CN112102469B true CN112102469B (en) 2023-07-25

Family

ID=73752978

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010796134.4A Active CN112102469B (en) 2020-08-10 2020-08-10 Three-dimensional modeling system, scanning system and control method thereof

Country Status (1)

Country Link
CN (1) CN112102469B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002032783A (en) * 2000-07-19 2002-01-31 Minolta Co Ltd Method and device for generating three-dimensional model
CN106959078A (en) * 2017-02-28 2017-07-18 苏州凡目视觉科技有限公司 A kind of contour measuring method for measuring three-dimensional profile
CN108416839A (en) * 2018-03-08 2018-08-17 云南电网有限责任公司电力科学研究院 Several X-ray rotating image contour line three-dimensional rebuilding methods of one kind and its system
CN109219731A (en) * 2016-06-01 2019-01-15 住友橡胶工业株式会社 The foreign matter of green tire adheres to method of discrimination
CN109584368A (en) * 2018-10-18 2019-04-05 中国科学院自动化研究所 The construction method and device of biological sample three-dimensional structure
CN111445516A (en) * 2020-03-03 2020-07-24 深圳市杰普特光电股份有限公司 System and method for calculating depth of two-dimensional code in glass substrate

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10610170B2 (en) * 2017-05-12 2020-04-07 Carestream Health, Inc. Patient position monitoring system based on 3D surface acquisition technique

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002032783A (en) * 2000-07-19 2002-01-31 Minolta Co Ltd Method and device for generating three-dimensional model
CN109219731A (en) * 2016-06-01 2019-01-15 住友橡胶工业株式会社 The foreign matter of green tire adheres to method of discrimination
CN106959078A (en) * 2017-02-28 2017-07-18 苏州凡目视觉科技有限公司 A kind of contour measuring method for measuring three-dimensional profile
CN108416839A (en) * 2018-03-08 2018-08-17 云南电网有限责任公司电力科学研究院 Several X-ray rotating image contour line three-dimensional rebuilding methods of one kind and its system
CN109584368A (en) * 2018-10-18 2019-04-05 中国科学院自动化研究所 The construction method and device of biological sample three-dimensional structure
CN111445516A (en) * 2020-03-03 2020-07-24 深圳市杰普特光电股份有限公司 System and method for calculating depth of two-dimensional code in glass substrate

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Matheus Gadelha et al., "Shape Reconstruction using Differentiable Projections and Deep Priors", International Conference on Computer Vision (ICCV), 2019, full text *

Also Published As

Publication number Publication date
CN112102469A (en) 2020-12-18

Similar Documents

Publication Publication Date Title
CN110392247B (en) Camera monitoring system for monitoring a patient in a bore-based medical system and method for calibrating the same
US10136970B2 (en) System, device, and method for dental intraoral scanning
US9091536B2 (en) Method and device for three-dimensional surface detection with a dynamic reference frame
US20170219498A1 (en) Optical geometry calibration devices, systems, and related methods for three dimensional x-ray imaging
US9046360B2 (en) System and method of acquiring three dimensional coordinates using multiple coordinate measurement devices
US5987349A (en) Method for determining the position and orientation of two moveable objects in three-dimensional space
US6379041B1 (en) X-ray apparatus for producing a 3D image from a set of 2D projections
US6522908B1 (en) Biomagnetic field measuring apparatus
US10772576B2 (en) X-ray imaging apparatus and control method thereof
US7628538B2 (en) Method and apparatus for calibrating an X-ray diagnostic system
US20100198112A1 (en) Patient monitoring at radiation machines
JP2011017700A (en) Method of determining three-dimensional coordinate of object
US20180085084A1 (en) A method of reducing the x-ray dose in an x-ray system
US20080317313A1 (en) System and method for tracking motion for generating motion corrected tomographic images
CN110906880A (en) Object automatic three-dimensional laser scanning system and method
JP2011516849A (en) 3D imaging system
EA031929B1 (en) Apparatus and method for three dimensional surface measurement
EP1565101B1 (en) Method and apparatus for selecting regions of interest in optical imaging
CN109953767A (en) Method and system for calibrating medical imaging devices, for executing registration
CN1791359A (en) Apparatus and method for recording the movement of organs of the body
CN112102469B (en) Three-dimensional modeling system, scanning system and control method thereof
JP4330181B2 (en) Imaging modality for image guided surgery
CN107843863B (en) Magnetic resonance imaging correction method, device and equipment based on 3D topography measurement
JP3906123B2 (en) Camera image output calibration device
WO2023247274A1 (en) Patient monitoring during a scan

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant