CN110111262B - Projector projection distortion correction method and device and projector - Google Patents


Info

Publication number
CN110111262B
CN110111262B (grant) · CN201910249736.5A (application)
Authority
CN
China
Prior art keywords
projection
grid
plane
point
points
Prior art date
Legal status
Active
Application number
CN201910249736.5A
Other languages
Chinese (zh)
Other versions
CN110111262A (en
Inventor
苏劲
蔡志博
Current Assignee
Bird Innovation Beijing Technology Co ltd
Original Assignee
Beijing Xiaoniao Tingting Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaoniao Tingting Technology Co Ltd filed Critical Beijing Xiaoniao Tingting Technology Co Ltd
Priority to CN201910249736.5A
Publication of CN110111262A
Application granted
Publication of CN110111262B

Classifications

    • G06T 5/80 — Image enhancement or restoration; geometric correction
    • G06T 15/04 — 3D [Three Dimensional] image rendering; texture mapping
    • G06T 17/00 — Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 7/80 — Image analysis; analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H04N 9/3141 — Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]; constructional details thereof
    • H04N 9/3179 — Projection devices for colour picture display; video signal processing therefor
    • G06T 2207/10028 — Image acquisition modality; range image, depth image, 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Projection Apparatus (AREA)

Abstract

The invention discloses a projector projection distortion correction method and device and a projector. The method of the invention comprises the following steps: projecting a standard grid image onto a projection surface with the optical machine lens, and shooting the projection surface with a camera to obtain a projection point cloud image; obtaining three-dimensional coordinates of the grid points on the projection surface, based on the optical machine lens coordinate system, from the pixel point correspondence between the standard grid image and the projection point cloud image and from a three-dimensional projection model; obtaining a reference projection plane and the projection points of the grid points onto that plane from the three-dimensional coordinates of the grid points, and obtaining texture sampling coordinates corresponding to each pixel point of the grating pixel surface; and performing texture mapping on the image to be projected using the texture sampling coordinates, and outputting the mapped image to the grating pixel surface for projection. The technical scheme of the invention ensures that an image projected by the projector onto a non-flat projection surface appears distortion-free.

Description

Projector projection distortion correction method and device and projector
Technical Field
The invention relates to a projector projection distortion correction method and device and a projector.
Background
With the maturing of short-focus optical engine technology and the substantial reduction in cost, intelligent projectors are increasingly used in homes. Driven by household users' demand for a high-quality viewing experience, the resolution, brightness and colour image quality of intelligent projectors are improving rapidly. However, in an ordinary household environment, people generally do not use a dedicated projection screen as the projection surface, for cost reasons, but simply project onto an ordinary wall. As a projection surface a wall is far from ideal: it is uneven, has a low reflection coefficient, and reflects non-directionally. The low reflectance and the non-directional reflection can be addressed simply by increasing the illumination of the projector and reducing the ambient light, but the projection distortion caused by an uneven wall is difficult to solve with conventional projector technology. Wall undulations invisible to the naked eye produce distortion in the projected picture that is clearly visible, especially when projecting high-definition or large-size images; such distortion is hard for demanding users to accept, and a new method is needed to solve the problem.
Projectors currently on the market do not provide a feasible solution to picture distortion caused by uneven projection surfaces. One existing scheme is based on projection images collected by a mobile phone camera: the whole correction process is controlled by a mobile phone application, and the pre-correction is completed through interaction with the projector. That scheme suffers from the low quality of the phone-captured images, the uncontrollable shooting conditions of the user, and complex operation, and cannot provide a satisfactory user experience.
Disclosure of Invention
The invention provides a projector projection distortion correction method, a projector projection distortion correction device and a projector, which at least partially solve the problems.
In a first aspect, the present invention provides a method for correcting projection distortion of a projector, the projector having an optical-mechanical lens and a camera, the method comprising: projecting the standard grid image of the grating pixel surface on a projection surface by using the optical machine lens, and shooting the projection surface by using the camera to obtain a projection point cloud image of grid points on the projection surface; acquiring three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system according to the pixel point corresponding relation between the standard grid image and the projection point cloud image and according to a pre-constructed three-dimensional projection model, wherein the three-dimensional projection model is constructed based on the optical machine lens coordinate system and the camera coordinate system of the projector and is used for calculating the three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system; acquiring a reference projection plane by using the three-dimensional coordinates of the grid points on the projection surface, constructing a world coordinate system by using the reference projection plane, and acquiring projection points of the grid points on the projection surface on the reference projection plane based on the conversion relation between the world coordinate system and the optical machine lens coordinate system; obtaining texture sampling coordinates corresponding to each pixel point of the grating pixel surface according to the corresponding relation between the projection point and the grid point on the grating pixel surface of the optical machine lens, wherein the texture sampling coordinates are used for correcting distortion displacement of each pixel point of the grating pixel surface on the projection surface; and performing texture mapping on the image to be projected by using the texture sampling coordinates, and outputting the image subjected to the texture mapping to the grating pixel surface for projection.
In a second aspect, the present invention provides a projection distortion correction apparatus for a projector having an optical machine lens and a camera, the apparatus comprising: the projection point cloud obtaining unit is used for projecting the standard grid image of the grating pixel surface on a projection surface by using the optical machine lens and shooting the projection surface by using the camera to obtain a projection point cloud image of grid points on the projection surface; a projection point cloud coordinate calculation unit, configured to obtain three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system according to a pixel point correspondence between the standard grid image and the projection point cloud image and according to a pre-constructed three-dimensional projection model, where the three-dimensional projection model is constructed based on the optical machine lens coordinate system and the camera coordinate system of the projector, and is configured to calculate three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system; the reference projection plane construction unit is used for obtaining a reference projection plane by utilizing the three-dimensional coordinates of the grid points on the projection surface, constructing a world coordinate system by utilizing the reference projection plane, and obtaining the projection points of the grid points on the projection surface on the reference projection plane based on the conversion relation between the world coordinate system and the optical machine lens coordinate system; the conversion relation calculation unit is used for obtaining texture sampling coordinates corresponding to each pixel point of the grating pixel surface according to the corresponding relation between the projection point and the grid point on the grating pixel surface of the optical machine lens, and the texture sampling coordinates are used for correcting distortion displacement of each pixel point of the grating pixel surface on the projection surface; and the texture mapping unit is used for performing texture mapping on the image to be projected by using the texture sampling coordinates and outputting the image subjected to the texture mapping to the grating pixel surface for projection.
In a third aspect, the present invention provides a projector comprising: an optical-mechanical lens for projecting the standard grid image of the grating pixel surface onto the projection surface; a camera for shooting the projection surface to obtain a projection point cloud image and sending the projection point cloud image to a graphics processor; and a memory storing computer-executable instructions that, when executed, cause the graphics processor to perform the aforementioned projector projection distortion correction method.
In a fourth aspect, the present invention provides a computer-readable storage medium having one or more computer programs stored thereon, which when executed by a graphics processor of a projector, implement the aforementioned projector projection distortion correction method.
The projector is internally provided with the camera, so that the projector has stereoscopic vision capability, active three-dimensional modeling of a projection environment can be realized by utilizing an optical machine lens and the camera of the projector, a three-dimensional projection model is established, the projector can automatically obtain the distribution condition of a projection plane by utilizing a computer vision method based on the three-dimensional projection model without external equipment and instruments, the accurate distortion displacement of each pixel point of a grating pixel plane on the actual projection plane is calculated according to the distribution condition of the projection plane, and the correction of a grating image is realized by utilizing a texture mapping technology so as to eliminate the picture distortion of the projection plane.
Drawings
Fig. 1 is a flowchart illustrating a projection distortion correction method of a projector according to an embodiment of the present invention;
fig. 2 is a schematic view of an optical system of a projector according to an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating pinhole imaging according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a process for generating a projection point cloud image according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a three-dimensional projection model according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating a process of fitting a reference projection plane according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a relationship between a world coordinate system, an optical-mechanical lens coordinate system, a grating pixel plane, and a reference projection plane according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of an orthogonal projection of grid points of a projection surface onto a reference projection plane according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of extended proxels on a reference projection plane according to an embodiment of the present invention;
fig. 10 is a block diagram showing a configuration of a projection distortion correction apparatus of a projector according to an embodiment of the present invention;
fig. 11 is a schematic structural diagram of a projector according to an embodiment of the present invention.
Detailed Description
The embodiment of the invention provides a projector projection distortion correction method, which is characterized in that various computer vision methods are used for obtaining surface distribution parameters of a projection surface, the distribution parameters are used as the basis for calculating the grid texture layout of a pre-corrected target picture, and the GPU is used for realizing the texture mapping correction of an input image, so that a pre-distorted projection image is obtained, and the image finally projected to a non-flat projection surface presents a distortion-free effect. The embodiment of the invention also provides a corresponding device, a projector and a computer readable storage medium, which are respectively described in detail below.
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings. It is to be understood that such description is merely illustrative and not intended to limit the scope of the present invention. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present invention.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. The words "a", "an" and "the" as used herein are intended to include the plural forms as well, unless the context clearly indicates otherwise. Furthermore, the terms "comprises", "comprising", and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
Thus, the techniques of the present invention may be implemented in hardware and/or in software (including firmware, microcode, etc.). Furthermore, the techniques of this disclosure may take the form of a computer program product on a computer-readable storage medium having instructions stored thereon for use by or in connection with an instruction execution system. In the context of the present invention, a computer-readable storage medium may be any medium that can contain, store, communicate, propagate, or transport the instructions. For example, a computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the computer-readable storage medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
The invention provides a projector projection distortion correction method.
Fig. 1 is a flowchart illustrating a projection distortion correction method of a projector according to an embodiment of the present invention, and as shown in fig. 1, the method of the embodiment includes:
s110, projecting the standard grid image of the grating pixel surface on the projection surface by using the optical machine lens, and shooting the projection surface by using the camera to obtain a projection point cloud image of grid points on the projection surface.
The optical machine in this embodiment may be understood as the projection module of a projection device. In general, the optical machine integrates the digital micromirror display core, the light source, the lens optical path and the heat dissipation into a single sealed assembly, protecting it against dust and shock.
And S120, obtaining three-dimensional coordinates of the grid points on the projection surface based on an optical machine lens coordinate system according to the corresponding relation of the pixel points between the standard grid image and the projection point cloud image and according to a pre-constructed three-dimensional projection model, wherein the three-dimensional projection model is constructed based on the optical machine lens coordinate system and the camera coordinate system of the projector and is used for calculating the three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system.
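Concretely, step S120 amounts to stereo triangulation: each grid point defines one ray back-projected through the optical machine lens and one through the camera, and its 3D coordinate is the least-squares intersection of the two rays. A minimal midpoint-triangulation sketch, assuming both rays are already expressed in the optical machine lens coordinate system; the function name and interface are illustrative, not from the patent:

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Return the midpoint of the shortest segment joining two rays.

    o1, o2 -- ray origins (lens and camera optical centres)
    d1, d2 -- direction vectors of the back-projected pixels
    Assumes the rays are not parallel.
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Solve for t1, t2 minimising |(o1 + t1*d1) - (o2 + t2*d2)|^2
    b = o2 - o1
    c = d1 @ d2
    denom = 1.0 - c * c
    t1 = (b @ d1 - (b @ d2) * c) / denom
    t2 = ((b @ d1) * c - b @ d2) / denom
    p1 = o1 + t1 * d1
    p2 = o2 + t2 * d2
    return 0.5 * (p1 + p2)
```

For noise-free rays that actually intersect, the midpoint coincides with the intersection point.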
S130, obtaining a reference projection plane by using the three-dimensional coordinates of the grid points on the projection plane, constructing a world coordinate system by using the reference projection plane, and obtaining the projection points of the grid points on the projection plane on the reference projection plane based on the conversion relation between the world coordinate system and the optical machine lens coordinate system.
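One common way to realise the reference projection plane of step S130 is a least-squares plane fit to the measured grid points, e.g. via SVD. The sketch below is an assumption about the fitting method, since the text here does not fix a particular algorithm:

```python
import numpy as np

def fit_reference_plane(points):
    """Fit a plane to an (N, 3) array of grid points.

    Returns (centroid, unit_normal); the plane passes through the
    centroid, and the normal is the direction of least variance.
    """
    points = np.asarray(points, dtype=float)
    centroid = points.mean(axis=0)
    # Right singular vector with the smallest singular value = normal
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return centroid, normal

def project_to_plane(points, centroid, normal):
    """Orthogonally project points onto the fitted plane."""
    d = (points - centroid) @ normal
    return points - np.outer(d, normal)
```

`project_to_plane` gives the orthogonal projection points used later when building the world coordinate system on the reference plane.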
And S140, acquiring texture sampling coordinates corresponding to each pixel point on the grating pixel surface of the optical machine lens according to the corresponding relation between the projection point and the grid point on the grating pixel surface of the optical machine lens, wherein the texture sampling coordinates are used for correcting the distortion displacement of each pixel point on the grating pixel surface on the projection surface.
And S150, performing texture mapping on the image to be projected by using the texture sampling coordinates, and outputting the image subjected to the texture mapping to a grating pixel surface for projection.
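Step S150's texture mapping is, per raster pixel, a lookup of the source image at that pixel's texture sampling coordinate. A nearest-neighbour CPU sketch; a real projector would do this on the GPU with bilinear filtering, and all names here are illustrative:

```python
import numpy as np

def warp_by_texture_coords(src, tex_u, tex_v):
    """Remap src so that output[y, x] = src[tex_v[y, x], tex_u[y, x]].

    src          -- (H, W) or (H, W, C) source image
    tex_u, tex_v -- per-pixel sampling coordinates (same H x W shape),
                    in source-pixel units
    """
    h, w = src.shape[:2]
    u = np.clip(np.rint(tex_u).astype(int), 0, w - 1)
    v = np.clip(np.rint(tex_v).astype(int), 0, h - 1)
    return src[v, u]
```

With identity texture coordinates the warp is a no-op; pre-distortion corresponds to texture coordinates shifted opposite to the measured distortion displacement.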
The projector has the advantages that the camera is arranged in the projector, so that the projector has stereoscopic vision capability, active three-dimensional modeling of a projection environment can be realized by utilizing an optical machine lens and the camera of the projector, a three-dimensional projection model is established, the projector can automatically obtain the distribution condition of a projection surface by utilizing a computer vision method based on the three-dimensional projection model without external equipment and instruments, the accurate distortion displacement of each pixel point of a grating pixel surface on the actual projection surface is calculated through the distribution condition of the projection surface, and the texture mapping technology is utilized to realize the correction of a grating image, so that the picture distortion of the projection surface is eliminated.
To improve the precision of projector projection distortion correction, this embodiment calibrates the projector in advance to obtain its distortion parameters, which include the internal and external parameters of the optical machine lens and of the camera. The projector calibration process is as follows:
as shown in fig. 2, there is a certain distance between the optical-mechanical lens and the camera of the projector, so that the same world coordinate point has parallax on the optical-mechanical grating pixel plane and the sensor pixel plane, for example, point a in the projection area in fig. 2 corresponds to pixel position a1 on the optical-mechanical grating pixel plane and pixel position a2 on the sensor pixel plane, thereby satisfying the formation condition of binocular stereo vision, since the three-dimensional projection model can be constructed based on the optical-mechanical lens coordinate system and the camera coordinate system of the projector.
The optical-mechanical lens can be regarded as an inverse camera, so a pinhole imaging model similar to the camera's can be established; its calibration principle is therefore similar to that of the camera, and this embodiment takes obtaining the distortion parameters of the camera as an example.
As shown in fig. 3, the formula of the pinhole imaging model is: s·m′ = A·[R|t]·M, where s is a normalized scale factor; A is the internal parameter (intrinsic) matrix of the camera; [R|t] is the external parameter matrix that converts point coordinates from the world coordinate system to the camera coordinate system, R being a rotation matrix and t a translation vector; m′ is the coordinate position in the pixel coordinate system, and M is the coordinate position in the world coordinate system. In the pinhole imaging optical path shown in fig. 3, a point with coordinates (x, y, z) in the coordinate system of the optical centre Fc maps to the uv-plane coordinate (u, v) via u = fx·x′ + cx and v = fy·y′ + cy, with x′ = x/z and y′ = y/z, where fx and fy are the camera focal lengths expressed in pixels and (cx, cy) is the principal point. The coordinate system of Fc shown in fig. 3 corresponds to the camera coordinate system of the present embodiment, and the uv-plane coordinate system corresponds to the pixel plane coordinate system of the camera sensor. Therefore, once the conversion relation between the world coordinate system and the camera coordinate system is obtained, u = fx·x′ + cx and v = fy·y′ + cy give the corresponding relation between the world coordinate system and the sensor pixel plane coordinate system.
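The pinhole projection equations translate directly into code. A sketch for a distortion-free camera; the intrinsic values used in testing are arbitrary examples, not calibration results from the patent:

```python
import numpy as np

def project_pinhole(M, A, R, t):
    """Project world points M (N, 3) to pixel coordinates (N, 2).

    A    -- 3x3 intrinsic matrix [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]
    R, t -- rotation (3x3) and translation (3,) from world to camera
    """
    Xc = M @ R.T + t                                  # world -> camera
    x = Xc[:, 0] / Xc[:, 2]                           # x' = x/z
    y = Xc[:, 1] / Xc[:, 2]                           # y' = y/z
    fx, fy = A[0, 0], A[1, 1]
    cx, cy = A[0, 2], A[1, 2]
    u = fx * x + cx                                   # u = fx*x' + cx
    v = fy * y + cy                                   # v = fy*y' + cy
    return np.stack([u, v], axis=1)
```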
For the camera intrinsic parameter matrix, the intrinsic parameters of the projector camera can be obtained through a calibration plate and structured light projection; they comprise the focal length, the radial distortion parameters, the tangential distortion parameters and the principal point coordinate (i.e. the centre point of the sensor image). With distortion taken into account, the correspondence between the camera coordinate system and the sensor pixel plane coordinate system becomes u = fx·x″ + cx and v = fy·y″ + cy, where, with r² = x′² + y′²:

x″ = x′·(1 + k1·r² + k2·r⁴ + k3·r⁶)/(1 + k4·r² + k5·r⁴ + k6·r⁶) + 2·p1·x′·y′ + p2·(r² + 2·x′²) + s1·r² + s2·r⁴

y″ = y′·(1 + k1·r² + k2·r⁴ + k3·r⁶)/(1 + k4·r² + k5·r⁴ + k6·r⁶) + p1·(r² + 2·y′²) + 2·p2·x′·y′ + s3·r² + s4·r⁴

Here k1, k2, k3, k4, k5, k6 are the radial distortion parameters of the camera, p1 and p2 are its tangential distortion parameters, and s1, s2, s3, s4 are its thin prism distortion parameters.
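The parameter set listed (k1–k6, p1, p2, s1–s4) matches the rational radial + tangential + thin-prism distortion model used by common calibration toolkits such as OpenCV; the sketch below evaluates that standard model on normalized coordinates, on the assumption that the patent uses the same convention:

```python
def distort_normalized(xp, yp, k, p, s):
    """Apply radial (k1..k6), tangential (p1, p2) and thin-prism
    (s1..s4) distortion to normalized coordinates x', y'."""
    k1, k2, k3, k4, k5, k6 = k
    p1, p2 = p
    s1, s2, s3, s4 = s
    r2 = xp * xp + yp * yp
    radial = (1 + k1 * r2 + k2 * r2**2 + k3 * r2**3) / \
             (1 + k4 * r2 + k5 * r2**2 + k6 * r2**3)
    xpp = xp * radial + 2 * p1 * xp * yp + p2 * (r2 + 2 * xp * xp) \
          + s1 * r2 + s2 * r2**2
    ypp = yp * radial + p1 * (r2 + 2 * yp * yp) + 2 * p2 * xp * yp \
          + s3 * r2 + s4 * r2**2
    return xpp, ypp
```

Pixel coordinates then follow as u = fx·x″ + cx, v = fy·y″ + cy.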
In this embodiment, the translation vector and rotation matrix between the optical-mechanical lens and the camera are also required. Specifically, the internal parameter calibration process yields the rotation matrix Rp and translation vector tp from the world coordinate system to the optical machine lens coordinate system, and the rotation matrix Rc and translation vector tc from the world coordinate system to the camera coordinate system. From the pinhole imaging model:

[Xp, Yp, Zp]ᵀ = Rp·[X, Y, Z]ᵀ + tp,  [Xc, Yc, Zc]ᵀ = Rc·[X, Y, Z]ᵀ + tc

where (X, Y, Z) are the three-dimensional point coordinates in the world coordinate system, and (Xp, Yp, Zp) and (Xc, Yc, Zc) are the corresponding three-dimensional point coordinates in the optical-mechanical lens coordinate system and the camera coordinate system respectively. Combining the two formulas gives the relative position relationship between the optical-mechanical lens and the camera:

[Xp, Yp, Zp]ᵀ = Rp·Rcᵀ·[Xc, Yc, Zc]ᵀ + (tp − Rp·Rcᵀ·tc)

i.e. the rotation R = Rp·Rcᵀ and the translation t = tp − Rp·Rcᵀ·tc from the camera coordinate system to the optical machine lens coordinate system.
therefore, the external parameters of the binocular vision of the projector can be obtained, and the coordinate conversion from the camera coordinate system to the optical machine lens coordinate system is realized.
Before the projector leaves the factory, the internal and external parameters of the optical machine lens and the camera are obtained by the projector calibration method above. In some embodiments, the projector raster resolution used for calibration is 1920x1080; the virtual imaging of the grid points on the raster surface is obtained through re-projection, and the calibration residual is calculated. In some embodiments the internal-parameter calibration residual is 0.2137 pixels and the external-parameter calibration residual is 0.4324 pixels. The calibrated internal and external parameters are downloaded into the projector for use in the subsequent distortion correction process.
In use, after the user selects the projection surface distortion correction function and enters its interface, the projector prompts that the distortion correction flow is starting; at this point the projector should be aimed at the projection surface to be corrected and its position kept fixed, after which the automatic correction flow begins. Steps S110 to S150 above are described in detail below with reference to figs. 4 to 9.
First, step S110 is executed, namely, the optical engine lens is used to project the standard grid image of the grating pixel surface on the projection surface, and the camera is used to shoot the projection surface to obtain the projection point cloud image of the grid point on the projection surface.
Generally, the denser the projection-surface point cloud, the higher the measurement accuracy of the projection surface. However, because of circle-spot scattering caused by lens distortion, acquisition noise, background light and other factors, a grid that is too dense slows down the grid search and may even make grid points impossible to find. This embodiment therefore introduces a method of generating and projecting staggered grids, which increases the density of grid points, and hence the point cloud measurement accuracy of the projection surface, without impairing the grid-point search efficiency.
In some embodiments, the projection surface point cloud image is obtained by: projecting a first standard grid image of a grating pixel surface on a projection surface by using an optical machine lens, and shooting the projection surface by using a camera to obtain a first projection point cloud image; projecting a second standard grid image of the grating pixel surface on a projection surface by using an optical machine lens, and shooting the projection surface by using a camera to obtain a second projection point cloud image, wherein circular spots in the first standard grid image and the second standard grid image are staggered; respectively identifying circular spot grids in the first projection point cloud image and the second projection point cloud image to obtain the pixel position of each circular spot in the grating pixel surface in the first projection point cloud image and the second projection point cloud image; and obtaining a projection point cloud image by superposing the first projection point cloud image and the second projection point cloud image, wherein the pixel position of each circular spot in the grating pixel surface in the first projection point cloud image and the second projection point cloud image is the pixel position of the corresponding circular spot on the projection point cloud image.
As shown in fig. 4, the projector application generates two asymmetric circular-spot grid pictures whose spots are staggered from each other by a grid spacing. The two pictures are projected onto the projection surface in sequence by the optical machine lens, and the projection picture is captured by the projector's camera. The grid circular spots are searched in each captured point cloud image and their pixel coordinates recorded; the coordinates of the staggered grid spots in the two point cloud images are then combined, yielding the spot-center coordinates of a projection point cloud image whose grid density is the sum of the spot densities of the two images.
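As a rough sketch of the staggered-grid idea above (all names, sizes, and the half-pitch diagonal offset are illustrative assumptions, not taken from the patent), the two spot grids can be generated with a mutual offset and their detected spot centres merged into one denser point cloud:

```python
import numpy as np

def make_staggered_grids(rows=8, cols=10, pitch=60, origin=(40, 40)):
    """Generate centre coordinates (x, y) for two circular-spot grids whose
    spots are mutually staggered (here: by half a grid pitch, diagonally)."""
    ys, xs = np.mgrid[0:rows, 0:cols]
    base = np.stack([origin[0] + xs * pitch,
                     origin[1] + ys * pitch], axis=-1).reshape(-1, 2).astype(float)
    shifted = base + pitch / 2.0  # second grid offset so spots interleave
    return base, shifted

def merge_point_clouds(centres_a, centres_b):
    """Combine the spot centres detected in the two captured images into one
    point cloud whose density is the sum of the two grid densities."""
    return np.vstack([centres_a, centres_b])
```

Each grid alone stays sparse enough to search reliably; only the merged coordinate list reaches the doubled density.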
After obtaining the projection point cloud image of the grid point on the projection surface, the step S120 is continuously executed, that is, the three-dimensional coordinates of the grid point on the projection surface based on the optical machine lens coordinate system are obtained according to the pixel point corresponding relationship between the standard grid image and the projection point cloud image and according to the pre-constructed three-dimensional projection model, wherein the three-dimensional projection model is constructed based on the optical machine lens coordinate system and the camera coordinate system of the projector, and is used for calculating the three-dimensional coordinates of the grid point on the projection surface based on the optical machine lens coordinate system.
Before the distribution of the projection surface is obtained from its grid points, the images on the grating pixel surface and the sensor pixel surface are corrected with the previously obtained distortion parameters, giving the corrected grid point coordinates (u_c, v_c) in the CCD image and (u_p, v_p) in the optical machine image; the distortion parameters are obtained as described above.
In some embodiments, the three-dimensional projection model is constructed by: firstly, establishing a first linear relation according to the optical center of the camera light path and a first correction point of the optical center on the camera sensor pixel surface; then, establishing a second linear relation according to the optical center of the optical path of the optical mechanical lens and a second correction point of the optical center on the grating pixel surface of the optical mechanical lens; then establishing a third linear relation between the first correction point and the second correction point according to an external parameter matrix between the camera and the optical-mechanical lens; and finally, obtaining the three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system according to the first linear relation, the second linear relation and the third linear relation.
The three-dimensional projection model of this embodiment calculates the three-dimensional coordinates of the grid points on the projection surface by triangulation. As shown in fig. 5, the mapping points of any grid point of the projection plane on the sensor pixel surface and the grating pixel surface can be obtained, and the pixel-point correspondence between the standard grid image and the projection point cloud image can be determined, so that the three-dimensional coordinates of the projection-surface grid points can be reconstructed. That is, since the projection-surface point cloud image is obtained by projecting the standard grid image through the optical machine lens and capturing it with the camera, the three-dimensional coordinates of each grid point of the projection surface can be reconstructed based on the three-dimensional projection model shown in fig. 5.
As an example, the u_c-v_c plane coordinate system in fig. 5 corresponds to the coordinate system of the sensor pixel surface, and the u_p-v_p plane coordinate system corresponds to the coordinate system of the grating pixel surface. Then, from the formulas u = f_x·x″ + c_x and v = f_y·y′ + c_y described above, the coordinate of the optical center O_c of the camera light path in the sensor-pixel-surface coordinate system is obtained as q_c, and the coordinate of the optical center O_p of the optical machine lens light path in the grating-pixel-surface coordinate system as q_p. From the three-dimensional projection model constructed in this embodiment, the three-dimensional coordinates of the corresponding grid point Q_w of the projection plane in the optical machine lens coordinate system can then be calculated.
Let the coordinates of a grid point of the projection plane be (X_p, Y_p, Z_p). Then

s_p·[u_p, v_p, 1]^T = A_p·[X_p, Y_p, Z_p]^T

and

s_c·[u_c, v_c, 1]^T = A_c·[R|t]·[X_p, Y_p, Z_p, 1]^T

where s_c and s_p are the scale factors of the camera and the optical machine lens respectively, (u_c, v_c) and (u_p, v_p) are the two-dimensional coordinates of the projections of the spatial three-dimensional point on the sensor pixel surface and the grating pixel surface, A_c and A_p are the internal parameter matrices of the camera and the optical machine lens, and [R|t] is the external parameter matrix of the projector.
Then a first linear relation is established from the optical path center O_c of the camera and the first pixel point (u_c, v_c) on the camera sensor pixel surface, and a second linear relation is established from the optical path center O_p of the optical machine lens and the second pixel point (u_p, v_p) on the grating pixel surface of the optical machine lens; solving these relations yields (X_p, Y_p, Z_p).
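The two linear relations above can be solved as the intersection of the camera ray and the optical machine lens ray; since the rays may be skew in practice, the midpoint of their closest approach is a common choice. The sketch below assumes identity-free conventions for illustration only: R, t map lens coordinates into camera coordinates, and the lens optical center sits at the origin of the lens frame; the patent's exact parameterisation may differ.

```python
import numpy as np

def triangulate(uc, vc, up, vp, Ac, Ap, R, t):
    """Recover a grid point (X_p, Y_p, Z_p) in the optical machine lens frame
    from its camera pixel (uc, vc) and grating pixel (up, vp)."""
    dp = np.linalg.solve(Ap, np.array([up, vp, 1.0]))        # lens-ray direction
    dc = R.T @ np.linalg.solve(Ac, np.array([uc, vc, 1.0]))  # camera-ray direction, lens frame
    oc = -R.T @ t                                            # camera centre in lens frame
    # Closest points on the two rays: s*dp (from origin) and oc + u*dc.
    A = np.array([[dp @ dp, -dp @ dc],
                  [dp @ dc, -dc @ dc]])
    b = np.array([dp @ oc, dc @ oc])
    s, u = np.linalg.solve(A, b)
    return ((s * dp) + (oc + u * dc)) / 2.0  # midpoint of common perpendicular
```

With noise-free pixels the two rays intersect exactly and the midpoint is the true grid point.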
After obtaining the three-dimensional coordinates of the grid points on the projection surface based on the coordinate system of the optical machine lens, the step S130 is continuously executed, that is, the three-dimensional coordinates of the grid points on the projection surface are used to obtain a reference projection plane, the reference projection plane is used to construct a world coordinate system, and the projection points of the grid points on the projection surface on the reference projection plane are obtained based on the conversion relationship between the world coordinate system and the coordinate system of the optical machine lens.
In some embodiments, spatial filtering is performed on the mesh points on the projection plane according to the three-dimensional coordinates of the mesh points on the projection plane, and invalid mesh points in the mesh points are filtered to obtain valid mesh points; carrying out plane fitting on the effective grid points by using a least square method, and determining a plane obtained by fitting as a reference projection plane; the effective grid points are grid points approximately located on the same plane in the grid points on the projection surface, and the ineffective grid points are grid points which are far away from the plane in the grid points on the projection surface.
As shown in fig. 6, the three-dimensional point cloud formed by all grid points on the projection surface may not lie on a single plane: it can contain spatial noise points, discontinuity points and off-plane points. Before the projection reference plane is reconstructed, these invalid grid points must be filtered out to form a smooth projection-surface point cloud, and a plane is fitted to the filtered point cloud to obtain the reference projection plane.
In some embodiments, the spatial noise points and discontinuity points among the invalid grid points may first be removed by low-pass filtering, after which the off-plane points are filtered out as follows to obtain the valid grid points:
Step A: randomly select three non-collinear grid points and obtain the sub-plane they determine, a'_0·x + a'_1·y + a'_2·z = d, where a'_0, a'_1, a'_2 and d are constants;
and B: calculating the distance d between each grid point on the projection plane and the ith sub-planei=a′0xi+a′1yi+a′2ziRejecting abnormal grid points with the distance from the sub-plane larger than a preset distance value to obtain the number of reference grid points, wherein the reference grid points are grid points with the distance from the sub-plane not larger than the preset distance value; for example, the preset distance value t is 2 σ, and σ is the standard deviation of the distances from all grid points to the current sub-plane when d isiIf the grid point is determined to be an abnormal point and removed when the grid point is larger than 2 sigma, otherwise di≦ 2 σ, it is determined that this grid point is reserved for the reference grid point.
And repeating the step A and the step B, after N times of iteration, determining a reference sub-plane with the maximum number of reference grid points in the N sub-planes obtained after N times of iteration, wherein the reference grid points of the reference sub-plane are the effective grid points.
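Steps A and B iterated N times amount to a RANSAC-style plane search. A minimal sketch (the function name, defaults and 2σ inlier rule as coded here are illustrative):

```python
import numpy as np

def ransac_plane(points, n_iter=100, rng=None):
    """Repeat steps A and B: sample three non-collinear points, form the
    sub-plane they determine, keep points within t = 2*sigma of it, and
    return the inlier mask of the sub-plane with the most reference points."""
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        if np.linalg.norm(n) < 1e-9:      # collinear sample: no plane, skip
            continue
        n = n / np.linalg.norm(n)
        d = points @ n - p0 @ n           # signed distance of every grid point
        sigma = d.std()
        inliers = np.abs(d) <= 2 * sigma  # preset distance value t = 2*sigma
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers
```

The surviving points (the valid grid points) are then handed to the least-squares plane fit described in the text.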
Referring to fig. 6, the left side shows the projection-surface point cloud before filtering, where some points clearly lie off the plane; the right side shows the point cloud used for plane fitting, where those off-plane points have been filtered out and the remaining points lie approximately on the fitted plane.
After the valid grid points are obtained, the plane equation of the reference projection plane may be determined from their three-dimensional coordinates as z = a_0·x + a_1·y + a_2, with unit normal vector

N_bp = [−a_0, −a_1, 1] / Norm([−a_0, −a_1, 1])

where a_0, a_1, a_2 are constants, N_bp is the normal of the reference projection plane, and Norm() is the vector norm operator.
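A least-squares fit of the plane z = a_0·x + a_1·y + a_2 and the corresponding unit normal can be sketched as follows (assuming, as the surrounding formulas suggest, a normal proportional to [−a_0, −a_1, 1]):

```python
import numpy as np

def fit_reference_plane(points):
    """Least-squares fit of z = a0*x + a1*y + a2 to the valid grid points;
    returns (a0, a1, a2) and the unit normal N_bp of the fitted plane."""
    A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    (a0, a1, a2), *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    n = np.array([-a0, -a1, 1.0])
    return (a0, a1, a2), n / np.linalg.norm(n)
```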
After obtaining the reference projection plane and its pose information, the world coordinate system of the present embodiment can be determined.
In some embodiments, the XOY plane of the world coordinate system coincides with the reference projection plane, the Y axis is parallel to the Y axis of the optical machine lens coordinate system, and the Z axis is perpendicular to the reference projection plane. Since the normal vector of the reference projection plane is known, the vector representations of the three world coordinate axes in the optical machine lens coordinate system can be calculated as V_xw = N_bp × V_yc, V_yw = V_yc, V_zw = N_bp, where V_yc = [0, 1, 0] is the vector representation of the Y axis of the optical machine lens coordinate system and V_xw, V_yw, V_zw are the vector representations of the three world coordinate axes. The origin of the world coordinate system is O_w = [X_0 − a_0·t, Y_0 − a_1·t, Z_0 + t], with

[X_0, Y_0, Z_0] = (1/N)·Σ_k (X_ck, Y_ck, Z_ck)

where (X_ck, Y_ck, Z_ck) is the barycentric coordinate of the k-th sub-plane, [X_0, Y_0, Z_0] is the average center of gravity of the N sub-planes, and t is a constant.
As shown in fig. 7, the XOY plane of the world coordinate system coincides with the reference projection plane, the Y-axis vector of the world coordinate system has the same direction as the Y-axis vector of the opto-mechanical lens coordinate system, the Z-axis vector of the world coordinate system has the same direction as the normal vector of the reference projection plane, and the origin of the world coordinate system is the projection point of the average gravity center point of the N sub-planes on the reference projection plane.
After the world coordinate system is constructed, the conversion relation between the world coordinate system and the optical machine lens coordinate system is determined by the translation vector T = O_w and the rotation matrix R = (V_xw, V_yw, V_zw)^T.
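Under the axis definitions above, building R and T can be sketched in a few lines (the explicit normalisation of V_xw is an added safeguard for the case where N_bp is not exactly perpendicular to V_yc; it is not stated in the text):

```python
import numpy as np

def world_frame(n_bp, origin_w):
    """Build the world axes from the reference-plane normal:
    Vxw = Nbp x Vyc, Vyw = Vyc, Vzw = Nbp, with Vyc = [0, 1, 0].
    Returns R = (Vxw, Vyw, Vzw)^T (rows are the world axes) and T = Ow."""
    v_yc = np.array([0.0, 1.0, 0.0])
    v_xw = np.cross(n_bp, v_yc)
    v_xw = v_xw / np.linalg.norm(v_xw)
    R = np.stack([v_xw, v_yc, n_bp])
    return R, np.asarray(origin_w, dtype=float)
```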
In some embodiments, the projection points of the grid points of the projection surface on the reference projection plane are obtained as follows. Using the conversion relation T = O_w, R = (V_xw, V_yw, V_zw)^T between the world coordinate system and the optical machine lens coordinate system, the grid point coordinates on the projection surface are converted from the optical machine lens coordinate system into the world coordinate system:

[X_w, Y_w, Z_w]^T = R·([X_c, Y_c, Z_c]^T − T)

where (X_w, Y_w, Z_w) and (X_c, Y_c, Z_c) are the coordinates of a grid point in the world coordinate system and in the optical machine lens coordinate system respectively. The grid points thus converted into the world coordinate system are orthogonally projected onto the reference projection plane, giving the projection points of the projection-surface grid points on the reference projection plane.
As shown in fig. 8, all three-dimensional grid points mapped into the world coordinate system are orthogonally projected onto its XOY plane, that is, onto the reference projection plane; the two-dimensional coordinates of a grid point projected onto the projection reference plane are simply the X_w and Y_w components of its three-dimensional coordinates. For a viewer facing the projection reference plane, the two-dimensional image obtained by this computation reproduces the distorted image distribution, caused by the unevenness of the projection surface, that the viewer would see, so the projector automatically obtains the distribution of the projection surface.
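The coordinate conversion and orthogonal projection together reduce to a few lines, sketched below under the assumption (from the construction above) that R's rows are the world axes and T is the world origin expressed in lens coordinates:

```python
import numpy as np

def project_to_reference_plane(points_lens, R, T):
    """Map grid points from the optical machine lens frame to the world
    frame, Pw = R @ (Pc - T), then orthogonally project onto the XOY
    plane by keeping only the (Xw, Yw) components."""
    pw = (np.asarray(points_lens, dtype=float) - T) @ R.T
    return pw[:, :2]
```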
After obtaining the projection points of the grid points on the projection plane on the reference projection plane, the step S140 is continuously executed, that is, texture sampling coordinates corresponding to each pixel point on the grating pixel plane are obtained according to the corresponding relationship between the projection points and the grid points on the grating pixel plane of the optical machine lens, and the texture sampling coordinates are used for correcting distortion displacement of each pixel point on the grating pixel plane on the projection plane.
After orthogonal projection from the three-dimensional grid points to the reference projection plane is finished, selecting orthogonal projection points corresponding to the effective projection points, matching the orthogonal projection points with corresponding grid points in the grating pixel plane, and calculating a homography mapping matrix between two plane coordinate systems. Then, using the calculated homography mapping matrix, as shown in fig. 8, the two-dimensional grid point coordinates on the reference projection plane are mapped to the pixel coordinates of the grating pixel plane, so as to obtain the pixel positions where the distortion coordinates of the grid points of the grating pixel plane on the projection plane are mapped to the grating pixel plane.
It can be understood that if the pixel at an original position on the grating pixel surface is replaced by the actual pixel value that the distorted picture has at that position, the pixel value at the corresponding projection position on the reference projection plane becomes consistent with the pixel value at the same position on the grating pixel surface, realizing distortion-free projection.
In some embodiments, the homography mapping matrix is obtained and texture sample coordinates corresponding to each pixel point of the grating pixel plane are obtained by: obtaining a first homography mapping matrix between the grating pixel surface of the optical machine lens and a reference projection plane according to the grating pixel surface grid point cluster matched with the effective projection point cluster on the grating pixel surface of the optical machine lens, wherein the effective projection point is a projection point of an effective grid point on the reference projection plane; and mapping the projection point coordinates to the grating pixel surface coordinates by using the first homography mapping matrix to obtain texture sampling coordinates of the standard grid image which are mapped to the grating pixel surface at the distortion coordinates of the projection surface. The two-dimensional coordinates of projection points of all grid points on the projection plane on the reference projection plane are mapped into back projection grating coordinates on the grating pixel plane through a first homography mapping matrix, and the back projection grating coordinates are texture sampling coordinates.
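A homography between the reference projection plane and the grating pixel surface can be estimated with the standard direct linear transform (DLT). The sketch below is a generic DLT, not necessarily the patent's specific solver; `map_points` then turns projection-point coordinates into texture sampling coordinates:

```python
import numpy as np

def fit_homography(src, dst):
    """DLT estimate of the 3x3 homography H mapping reference-plane
    projection points (src) to grating-pixel coordinates (dst); needs
    at least four correspondences in general position."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 3)  # null vector = H up to scale

def map_points(H, pts):
    """Apply H to 2-D points, i.e. compute texture sampling coordinates."""
    p = np.column_stack([pts, np.ones(len(pts))]) @ H.T
    return p[:, :2] / p[:, 2:3]
```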
Since the projected and searched grid circular spots must remain complete, and since, referring to the projection point cloud image shown in fig. 4, the corner areas of that image contain no grid circular spots (they lie outside the spot coverage), the projection points on the projection datum plane must be further expanded toward the corners after the grating-pixel-surface grid point clusters are obtained, so that the expanded projection points correspond to the corner-area grid points on the grating pixel surface and the distortion correction range covers the whole projection area. As shown in fig. 9, the known grid points of the projection reference plane are expanded vertically or horizontally toward the image edges to form four sets of row and column edge-expanded projection points; the pixel coordinates of each expanded projection point and of its corresponding grid point in the raster pixel map are saved, forming the complete projected-point image containing the corner-expanded projection points shown in fig. 9.
In some embodiments, the extended projection point is obtained by the following method, a homography mapping matrix corresponding to a grating pixel surface grid point cluster located in a corner area is calculated based on the extended projection point, and texture sampling coordinates corresponding to each pixel point in the corner area of the grating pixel surface are obtained according to the homography mapping matrix: firstly, according to the corner area of the grating pixel surface, expanding the projection point on the reference projection plane to the direction of the corner area to obtain four groups of expanded projection points positioned at the edge of the reference projection plane and grid points on the grating pixel surface matched with the expanded projection points; searching for the expansion points according to a set search step length by using a search frame with a preset size, and acquiring a second homography mapping matrix between the grating pixel surface of the optical machine lens and the reference projection plane by using a grating pixel surface grid point cluster which is positioned in the search frame under each search step length and is matched with the effective projection point cluster; and finally, mapping the extended projection point coordinates to the grating pixel surface coordinates by using the second homography mapping matrix to obtain texture sampling coordinates of the standard grid image edge points mapped to the grating pixel surface at the distortion coordinates of the projection surface.
As shown in fig. 9, the size of the search box is 3 × 3 pixels, the search step is a distance of 3 pixels, and the number of projection points located in the search box under each search step is 9, where the projection points located in the search box at the corners include 5 extended projection points and 4 known adjacent projection points; and the projection points in the search box at the edge include 3 extended projection points and 6 known adjacent projection points. As shown in fig. 9, the cross projection points in the search box are known neighboring projection points. Calculating a second homography mapping matrix between the grating pixel surface and the reference projection plane of the adjacent grid points, and recording two-dimensional coordinates of the grating pixel surface corresponding to the extended projection point as texture sampling coordinates of the grid points in the corner area of the grating pixel surface by using the calculated second homography mapping matrix; and forming grid points and mapping coordinate clusters covering the whole grating image area by the texture sampling coordinates of the grid points in the corner area of the grating pixel surface and the texture sampling coordinates of the known grid points on the grating pixel surface, and finishing coordinate mapping.
After texture sampling coordinates corresponding to each pixel point of the grating pixel surface are obtained, the step S150 is continuously executed, that is, texture mapping is performed on the image to be projected by using the texture sampling coordinates, and the image subjected to the texture mapping is output to the grating pixel surface for projection.
And after the coordinate mapping calculation is finished, recording the pixel coordinate of each grid point of the grating pixel surface and the corresponding texture sampling coordinate, and reserving the pixel coordinate and the corresponding texture sampling coordinate in a nonvolatile memory of a projector to finish the distortion correction measurement process of the current projection environment.
The real-time distortion correction of the projection image is accelerated by the GPU, illustratively, a GPU rendering pipeline establishes grids, namely, all grid points in a raster pixel surface are scanned from top to bottom from left to right, three grid points are used as a group of vertexes to form a triangle, all the triangles are sequentially connected in series to cover the complete raster pixel surface, and the attributes of each vertex comprise the self coordinates of the vertex and the calculated source image texture coordinates corresponding to the vertex of the grid. In the operation process, the rendering pipeline calculates texture coordinates of pixels inside the vertex triangle through bilinear interpolation, a sampler of the GPU rendering pipeline extracts pixel values from corresponding coordinate positions in an original picture based on mapping coordinate clusters to serve as output pixel values of the vertex positions, the images subjected to texture mapping are output to a grating pixel surface of a projector, and the whole distortion pre-correction process is completed through projection.
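The bilinear interpolation that the rendering pipeline performs for pixels inside each vertex triangle can be illustrated on the CPU; this numpy sketch stands in for the GPU sampler stage and is not shader code:

```python
import numpy as np

def bilinear_sample(image, coords):
    """Fetch pixel values at fractional texture coordinates (x, y) by
    bilinear interpolation of the four surrounding texels, as the GPU
    sampler does when filling a vertex triangle."""
    h, w = image.shape[:2]
    x, y = coords[:, 0], coords[:, 1]
    x0 = np.clip(np.floor(x).astype(int), 0, w - 2)
    y0 = np.clip(np.floor(y).astype(int), 0, h - 2)
    fx, fy = x - x0, y - y0
    top = image[y0, x0] * (1 - fx) + image[y0, x0 + 1] * fx
    bot = image[y0 + 1, x0] * (1 - fx) + image[y0 + 1, x0 + 1] * fx
    return top * (1 - fy) + bot * fy
```

Applied with the texture sampling coordinates computed earlier, this produces the pre-corrected pixel values written to the grating pixel surface.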
Therefore, in the embodiment, the projector does not need to use external instruments and equipment, the distribution estimation of the projection surface is automatically finished by the projector, and the distortion correction is carried out on the raster image according to the distribution estimation, so that the picture which is actually projected to the non-flat projection surface is not distorted. The method is suitable for any application scene with the requirement of measuring the spatial distribution condition of the smooth surface.
The invention also provides a projector projection distortion correction device.
Fig. 10 is a block diagram showing a configuration of a projection distortion correction apparatus of a projector according to an embodiment of the present invention, and as shown in fig. 10, the apparatus of the present embodiment includes:
the projection point cloud obtaining unit is used for projecting the standard grid image of the grating pixel surface on a projection surface by using the optical machine lens and shooting the projection surface by using the camera to obtain a projection point cloud image of grid points on the projection surface;
a projection point cloud coordinate calculation unit, configured to obtain three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system according to a pixel point correspondence between the standard grid image and the projection point cloud image and according to a pre-constructed three-dimensional projection model, where the three-dimensional projection model is constructed based on the optical machine lens coordinate system and the camera coordinate system of the projector, and is configured to calculate three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system;
the reference projection plane construction unit is used for obtaining a reference projection plane by utilizing the three-dimensional coordinates of the grid points on the projection plane, constructing a world coordinate system by utilizing the reference projection plane, and obtaining the projection points of the grid points on the projection plane on the reference projection plane based on the conversion relation between the world coordinate system and the optical machine lens coordinate system;
the conversion relation calculation unit is used for obtaining texture sampling coordinates corresponding to each pixel point of the grating pixel surface according to the corresponding relation between the projection point and the grid point on the grating pixel surface of the optical machine lens, and the texture sampling coordinates are used for correcting distortion displacement of each pixel point of the grating pixel surface on the projection surface;
and the texture mapping unit is used for performing texture mapping on the image to be projected by using the texture sampling coordinates and outputting the image subjected to the texture mapping to the grating pixel surface for projection.
In some embodiments, the apparatus in fig. 10 further includes a preprocessing unit, which establishes a first linear relationship between the optical center of the camera optical path and a first calibration point of the optical center on the camera sensor pixel surface; establishing a second linear relation according to the optical center of the optical path of the optical machine lens and a second correction point of the optical center on the grating pixel surface; establishing a third linear relation between the first correction point and the second correction point according to an external parameter matrix between the camera and the optical-mechanical lens; and obtaining the three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system according to the first linear relation, the second linear relation and the third linear relation.
In some embodiments, the reference projection plane construction unit includes: the device comprises a fitting module, a filtering module and a mapping module;
the fitting module is used for carrying out spatial filtering on the grid points on the projection surface according to the three-dimensional coordinates of the grid points on the projection surface, filtering invalid grid points in the grid points and obtaining effective grid points; and performing plane fitting on the effective grid points by using a least square method, and determining a plane obtained by fitting as the reference projection plane.
A filtering module, which repeatedly executes the step a and the step B, and after N iterations, determines a reference sub-plane with the largest number of reference grid points in N sub-planes obtained after N iterations, where the reference grid points of the reference sub-plane are the effective grid points; wherein the step A: randomly selecting three non-collinear grid points from the grid points to obtain sub-planes determined by the three non-collinear grid points; and B: and calculating the distance between each grid point on the projection plane and the sub-plane, and eliminating abnormal grid points with the distance between the abnormal grid points and the sub-plane being greater than a preset distance value to obtain the number of reference grid points, wherein the reference grid points are grid points with the distance between the abnormal grid points and the sub-plane being not greater than the preset distance value.
The mapping module is used for converting the grid point coordinates on the projection surface under the optical-mechanical lens coordinate system into grid point coordinates on the projection surface under the world coordinate system according to the conversion relation between the world coordinate system and the optical-mechanical lens coordinate system; and orthogonally projecting the grid points on the projection surface converted into the world coordinate system to the reference projection plane to obtain projection points of the grid points on the projection surface on the reference projection plane.
In some embodiments, the conversion relation calculation unit includes a first calculation module and a second calculation module;
the first calculation module is used for obtaining a first homography mapping matrix between the grating pixel surface of the optical-mechanical lens and the reference projection plane according to the grating pixel surface grid point cluster matched with the effective projection point cluster on the grating pixel surface of the optical-mechanical lens, wherein the effective projection point is a projection point of the effective grid point on the reference projection plane; and mapping the projection point coordinates to the grating pixel surface coordinates by using the first homography mapping matrix to obtain texture sampling coordinates of the standard grid image mapped to the grating pixel surface at the distortion coordinates of the projection surface.
The second calculation module is used for expanding the projection points on the reference projection plane to the direction of the corner areas according to the corner areas of the grating pixel surface to obtain four groups of expanded projection points positioned at the edge of the reference projection plane and grid points on the grating pixel surface matched with the expanded projection points; searching for the expansion points according to a set search step length by using a search frame with a preset size, and acquiring a second homography mapping matrix between the grating pixel surface of the optical machine lens and the reference projection plane by using a grating pixel surface grid point cluster which is positioned in the search frame and matched with the effective projection point cluster under each search step length; and mapping the extended projection point coordinates to the grating pixel surface coordinates by using the second homography mapping matrix to obtain texture sampling coordinates of the standard grid image edge points mapped to the grating pixel surface at the distortion coordinates of the projection surface.
In some embodiments, the projection point cloud obtaining unit projects a first standard grid image of a grating pixel surface on the projection surface by using the optical machine lens, and captures the projection surface by using the camera to obtain a first projection point cloud image; projecting a second standard grid image of a grating pixel surface on the projection surface by using the optical-mechanical lens, and shooting the projection surface by using the camera to obtain a second projection point cloud image, wherein circular spots in the first standard grid image and the second standard grid image are staggered; respectively identifying circular spot grids in the first projection point cloud image and the second projection point cloud image to obtain the pixel position of each circular spot in the first projection point cloud image and the second projection point cloud image on the grating pixel surface; and obtaining the pixel position of each circular spot of the projection point cloud image by combining the pixel positions of each circular spot in the first projection point cloud image and the second projection point cloud image on the grating pixel surface.
Since the device embodiments substantially correspond to the method embodiments, reference may be made to the corresponding description of the method embodiments for relevant details. The device embodiments described above are merely illustrative: units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The invention also provides a projector.
Fig. 11 is a schematic structural diagram of a projector according to an embodiment of the present invention. As shown in fig. 11, at the hardware level the projector includes a graphics processor and, optionally, an internal bus, a network interface, and a memory. The memory may include internal memory, such as random-access memory (RAM), and may further include non-volatile memory, such as at least one disk storage. Of course, the projector may also include hardware required by other services, such as an optical machine lens and a camera, where the optical machine lens projects the standard grid image of the grating pixel surface onto the projection surface, and the camera captures the projection surface to obtain the projection point cloud image.
The graphics processor, the network interface, and the memory may be connected to each other via an internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one double-headed arrow is shown in FIG. 11, but that does not indicate only one bus or one type of bus.
The memory is used for storing programs. In particular, a program may comprise program code that includes computer-executable instructions. The memory may include both internal memory and non-volatile storage, and provides instructions and data to the graphics processor.
The graphics processor reads the corresponding computer program from the non-volatile memory into the internal memory and then runs it, forming the projector projection distortion correction device at the logic level. The graphics processor executes the program stored in the memory to implement the projector projection distortion correction method described above.
The method performed by the projector projection distortion correction apparatus disclosed in the embodiment of fig. 11 in the present specification can be applied to or implemented by a graphics processor. The graphics processor may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the projector projection distortion correction method described above may be performed by integrated logic circuits of hardware in the graphics processor or by instructions in the form of software. The graphics processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The various methods, steps, and logic blocks disclosed in the embodiments of the present specification may be implemented or performed. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present specification may be embodied directly in a hardware decoding processor, or in a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or a register. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the method in combination with its hardware.
The invention also provides a computer readable storage medium.
The computer-readable storage medium stores one or more computer programs comprising instructions which, when executed by a graphics processor of a projector, implement the projector projection distortion correction method described above.
For clarity in describing the technical solutions of the embodiments of the present invention, the words "first", "second", and the like are used to distinguish identical or similar items having substantially the same functions and effects; those skilled in the art will understand that these words do not limit quantity or execution order.
While the foregoing is directed to embodiments of the present invention, those skilled in the art may devise other modifications and variations in light of the above teachings. It should be understood that the foregoing detailed description is intended to better explain the present invention, and that the scope of the present invention is defined by the appended claims.

Claims (12)

1. A projector projection distortion correction method, wherein the projector has an optical machine lens and a camera, the method comprising:
projecting the standard grid image of the grating pixel surface on a projection surface by using the optical machine lens, and shooting the projection surface by using the camera to obtain a projection point cloud image of grid points on the projection surface;
acquiring three-dimensional coordinates of the grid points on the projection surface based on an optical machine lens coordinate system according to the corresponding relation of the pixel points between the standard grid image and the projection point cloud image and according to a pre-constructed three-dimensional projection model, wherein the three-dimensional projection model is constructed based on the optical machine lens coordinate system and a camera coordinate system of the projector;
acquiring a reference projection plane by using the three-dimensional coordinates of the grid points on the projection surface, constructing a world coordinate system by using the reference projection plane, and acquiring projection points of the grid points on the projection surface on the reference projection plane based on the conversion relation between the world coordinate system and the optical machine lens coordinate system;
obtaining texture sampling coordinates corresponding to each pixel point of the grating pixel surface according to the corresponding relation between the projection point and the grid point on the grating pixel surface of the optical machine lens, wherein the texture sampling coordinates are used for correcting distortion displacement of each pixel point of the grating pixel surface on the projection surface;
and performing texture mapping on the image to be projected by using the texture sampling coordinates, and outputting the image subjected to the texture mapping to the grating pixel surface for projection.
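The final texture-mapping step of claim 1 can be roughly illustrated as follows: each raster pixel looks up its precomputed texture sampling coordinate in the image to be projected, so that the image pre-warped in this way lands undistorted on the projection surface. This is a minimal sketch, not the claimed implementation — it uses nearest-neighbour lookup to stay dependency-free (a real device would interpolate, e.g. bilinearly), and the function name and identity-map example are illustrative.

```python
import numpy as np

def apply_texture_sampling(image, sample_x, sample_y):
    """Warp `image` by looking up, for every output (raster) pixel, the
    texture sampling coordinate computed during distortion correction.
    sample_x/sample_y give, per raster pixel, where to sample the source."""
    xs = np.clip(np.rint(sample_x).astype(int), 0, image.shape[1] - 1)
    ys = np.clip(np.rint(sample_y).astype(int), 0, image.shape[0] - 1)
    return image[ys, xs]

# Sanity check: an identity sampling map leaves the image unchanged.
img = np.arange(12.0).reshape(3, 4)
gx, gy = np.meshgrid(np.arange(4.0), np.arange(3.0))
out = apply_texture_sampling(img, gx, gy)
```

In practice the sampling map is dense (one coordinate pair per raster pixel), obtained by interpolating the homography-mapped grid points of the later claims.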
2. The method of claim 1, wherein the three-dimensional projection model is constructed by:
establishing a first linear relation according to the optical center of the camera light path and a first correction point of the optical center on the camera sensor pixel surface;
establishing a second linear relation according to the optical center of the optical path of the optical machine lens and a second correction point of the optical center on the grating pixel surface of the optical machine lens;
establishing a third linear relation between the first correction point and the second correction point according to an external parameter matrix between the camera and the optical-mechanical lens;
and obtaining the three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system according to the first linear relation, the second linear relation and the third linear relation.
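The three linear relations of claim 2 amount to intersecting a camera ray and a projector ray expressed in one common frame: each corrected pixel defines a ray through its optical center, the extrinsic matrix brings both rays into the lens coordinate system, and the grid point's 3D coordinate is their least-squares intersection. A hedged sketch of that triangulation (the extrinsics are assumed already applied; all names are illustrative):

```python
import numpy as np

def triangulate_rays(o1, d1, o2, d2):
    """Midpoint of closest approach between two rays o + t*d: o1/d1 the
    projector (optical machine lens) ray, o2/d2 the camera ray, both in
    the lens coordinate system."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Solve [d1 -d2] [t1 t2]^T = o2 - o1 in the least-squares sense.
    A = np.stack([d1, -d2], axis=1)
    t, *_ = np.linalg.lstsq(A, o2 - o1, rcond=None)
    p1 = o1 + t[0] * d1
    p2 = o2 + t[1] * d2
    return 0.5 * (p1 + p2)

# Two rays that intersect exactly at (1, 1, 2).
p = triangulate_rays(np.zeros(3), np.array([1.0, 1.0, 2.0]),
                     np.array([2.0, 0.0, 0.0]), np.array([-1.0, 1.0, 2.0]))
```

With noisy calibration the two rays are skew, which is why a least-squares (closest-point) formulation rather than an exact intersection is the usual choice.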
3. The method of claim 1, wherein the obtaining a reference projection plane using three-dimensional coordinates of grid points on the projection surface comprises:
performing spatial filtering on the grid points on the projection surface according to the three-dimensional coordinates of the grid points, and filtering out invalid grid points to obtain effective grid points;
and performing plane fitting on the effective grid points by using a least square method, and determining a plane obtained by fitting as the reference projection plane.
4. The method of claim 3, wherein the performing spatial filtering on the grid points on the projection surface according to the three-dimensional coordinates of the grid points, and filtering out invalid grid points to obtain effective grid points, comprises:
step A: randomly selecting three non-collinear grid points from the grid points to obtain sub-planes determined by the three non-collinear grid points;
and B: calculating the distance between each grid point on the projection surface and the sub-plane, eliminating abnormal grid points whose distance from the sub-plane is larger than a preset distance value, and obtaining the number of reference grid points, wherein the reference grid points are grid points whose distance from the sub-plane is not larger than the preset distance value;
and repeating the step A and the step B; after N iterations, determining, among the N sub-planes obtained, the reference sub-plane with the largest number of reference grid points, wherein the reference grid points of that reference sub-plane are the effective grid points.
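Steps A and B, iterated N times, form a RANSAC-style plane search, after which claim 3's least-squares fit refines the plane over the surviving effective grid points. A compact sketch under assumed parameter values (`n_iters`, `dist_thresh`, and the random seed are illustrative, not taken from the patent):

```python
import numpy as np

def ransac_plane(points, n_iters=100, dist_thresh=0.05, seed=0):
    """Step A: pick three non-collinear grid points to define a sub-plane.
    Step B: count reference (inlier) points within dist_thresh of it.
    Keep the sub-plane with the most inliers, then refine it by least
    squares (SVD of the centred inlier cloud)."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iters):
        a, b, c = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(b - a, c - a)
        if np.linalg.norm(n) < 1e-9:      # collinear sample: skip it
            continue
        n = n / np.linalg.norm(n)
        dist = np.abs((points - a) @ n)
        inliers = dist <= dist_thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    pts = points[best_inliers]
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1], best_inliers  # normal = smallest singular vector

# 50 grid points on the plane z = 0, plus one gross outlier.
pts = np.column_stack([np.random.default_rng(1).uniform(-1, 1, (50, 2)),
                       np.zeros(50)])
pts = np.vstack([pts, [0.0, 0.0, 5.0]])
c, n, inl = ransac_plane(pts)
```

The returned centroid and unit normal define the reference projection plane; the boolean mask marks the effective grid points that survive the spatial filtering.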
5. The method of claim 4, wherein an XOY plane of the world coordinate system XOYZ coincides with the reference projection plane, a Y-axis vector of the world coordinate system is in the same direction as a Y-axis vector of the opto-mechanical lens coordinate system, a Z-axis vector of the world coordinate system XOYZ is in the same direction as the reference projection plane normal vector, and the world coordinate system origin O is a projection point of an average center of gravity point of the N sub-planes on the reference projection plane.
6. The method of claim 1, wherein the obtaining the projection point of the grid point on the projection plane on the reference projection plane based on the conversion relationship between the world coordinate system and the optical-mechanical lens coordinate system comprises:
according to the conversion relation between the world coordinate system and the optical-mechanical lens coordinate system, converting the grid point coordinates on the projection surface under the optical-mechanical lens coordinate system into grid point coordinates on the projection surface under the world coordinate system;
and orthogonally projecting the grid points on the projection surface converted into the world coordinate system to the reference projection plane to obtain projection points of the grid points on the projection surface on the reference projection plane.
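The orthogonal projection of claim 6 is a short vector operation once the reference projection plane is known: subtract from each point its signed distance along the plane normal. A minimal sketch, assuming the plane is given by the fitted centroid and unit normal (names illustrative):

```python
import numpy as np

def project_to_plane(points, origin, normal):
    """Orthogonally project world-frame grid points onto the reference
    projection plane defined by a point `origin` and its `normal`."""
    normal = normal / np.linalg.norm(normal)
    d = (points - origin) @ normal          # signed distance to the plane
    return points - np.outer(d, normal)

p = project_to_plane(np.array([[1.0, 2.0, 3.0]]),
                     np.zeros(3), np.array([0.0, 0.0, 1.0]))
```

In the world coordinate system of claim 5, where the XOY plane coincides with the reference projection plane, this projection simply zeroes each point's Z coordinate.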
7. The method of claim 3, wherein the obtaining texture sampling coordinates corresponding to each pixel point of the grating pixel surface according to the correspondence between the projection point and the grid point on the grating pixel surface of the optical mechanical lens comprises:
obtaining a first homography mapping matrix between the grating pixel surface of the optical machine lens and the reference projection plane according to the grating pixel surface grid point cluster matched with the effective projection point cluster on the grating pixel surface of the optical machine lens, wherein the effective projection point is a projection point of the effective grid point on the reference projection plane;
and mapping the projection point coordinates to the grating pixel surface coordinates by using the first homography mapping matrix to obtain texture sampling coordinates of the standard grid image mapped to the grating pixel surface at the distortion coordinates of the projection surface.
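The first homography mapping matrix of claim 7 can be estimated from four or more matched pairs with the standard direct linear transform (DLT). The sketch below illustrates the estimation and the coordinate-mapping step only — not the patent's point clustering — and recovers a pure translation as a sanity check:

```python
import numpy as np

def fit_homography(src, dst):
    """DLT estimate of H with dst ~ H @ src from >= 4 matched points
    (here: reference-plane projection points -> raster grid points).
    H is the right singular vector of the stacked constraint rows."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def map_point(H, p):
    """Apply H to a 2D point in homogeneous coordinates."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

# A pure translation by (1, 2) is recovered exactly from 4 exact pairs.
src = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
dst = src + np.array([1.0, 2.0])
H = fit_homography(src, dst)
```

Mapping every projection point through H yields the texture sampling coordinates on the grating pixel surface; in production code this step is commonly delegated to a library routine such as OpenCV's homography estimation.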
8. The method according to claim 7, wherein the obtaining texture sampling coordinates corresponding to each pixel point of the grating pixel surface according to the correspondence between the projection point and the grid point on the grating pixel surface of the optical mechanical lens further comprises:
expanding the projection points on the reference projection plane toward the corner areas of the grating pixel surface, to obtain four groups of extended projection points located at the edge of the reference projection plane and the grid points on the grating pixel surface matched with the extended projection points;
searching over the extended projection points with a search box of a preset size according to a set search step, and, at each search step, acquiring a second homography mapping matrix between the grating pixel surface of the optical machine lens and the reference projection plane by using the grating pixel surface grid point cluster that is located within the search box and matched with the effective projection point cluster;
and mapping the extended projection point coordinates to grating pixel surface coordinates by using the second homography mapping matrix, to obtain texture sampling coordinates at which the edge points of the standard grid image, at their distorted coordinates on the projection surface, are mapped to the grating pixel surface.
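One plausible reading of the expansion step in claim 8 is linear extrapolation: extend each edge row (or column) of reference-plane projection points outward toward a corner area by repeating its last grid step. This sketch assumes locally uniform grid spacing and is illustrative only; it does not claim to reproduce the patented expansion rule.

```python
import numpy as np

def extrapolate_outward(row_points, n_extra=1):
    """Extend a row of reference-plane projection points past its last
    point by repeating the last grid step, producing extended projection
    points near a corner area."""
    step = row_points[-1] - row_points[-2]
    extra = [row_points[-1] + (k + 1) * step for k in range(n_extra)]
    return np.vstack([row_points] + extra)

# A row with a uniform step of (1.0, 0.1), extended by two points.
row = np.array([[0.0, 0.0], [1.0, 0.1], [2.0, 0.2]])
ext = extrapolate_outward(row, n_extra=2)
```

The extended points then feed the windowed (search-box) homography fits, which localize the mapping near the corners where a single global homography fits worst.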
9. The method of claim 1, wherein the projecting the standard grid image of the grating pixel surface on the projection surface by the optical-mechanical lens and capturing the projection surface by the camera to obtain the projection point cloud image of the grid point on the projection surface comprises:
projecting a first standard grid image of a grating pixel surface on the projection surface by using the optical machine lens, and shooting the projection surface by using the camera to obtain a first projection point cloud image;
projecting a second standard grid image of a grating pixel surface on the projection surface by using the optical-mechanical lens, and shooting the projection surface by using the camera to obtain a second projection point cloud image, wherein circular spots in the first standard grid image and the second standard grid image are staggered;
respectively identifying circular spot grids in the first projection point cloud image and the second projection point cloud image to obtain the pixel position of each circular spot in the first projection point cloud image and the second projection point cloud image on the grating pixel surface;
and obtaining the pixel position of each circular spot of the projection point cloud image by combining the pixel positions of each circular spot in the first projection point cloud image and the second projection point cloud image on the grating pixel surface.
10. A projector projection distortion correction apparatus, wherein the projector has an opto-mechanical lens and a camera, the apparatus comprising:
the projection point cloud obtaining unit is used for projecting the standard grid image of the grating pixel surface on a projection surface by using the optical machine lens and shooting the projection surface by using the camera to obtain a projection point cloud image of grid points on the projection surface;
a projection point cloud coordinate calculation unit, which obtains three-dimensional coordinates of the grid points on the projection surface based on an optical machine lens coordinate system according to the pixel point corresponding relation between the standard grid image and the projection point cloud image and according to a pre-constructed three-dimensional projection model, wherein the three-dimensional projection model is constructed based on the optical machine lens coordinate system and a camera coordinate system of the projector;
the reference projection plane construction unit is used for obtaining a reference projection plane by utilizing the three-dimensional coordinates of the grid points on the projection plane, constructing a world coordinate system by utilizing the reference projection plane, and obtaining the projection points of the grid points on the projection plane on the reference projection plane based on the conversion relation between the world coordinate system and the optical machine lens coordinate system;
the conversion relation calculation unit is used for obtaining texture sampling coordinates corresponding to each pixel point of the grating pixel surface according to the corresponding relation between the projection point and the grid point on the grating pixel surface of the optical machine lens, and the texture sampling coordinates are used for correcting distortion displacement of each pixel point of the grating pixel surface on the projection surface;
and the texture mapping unit is used for performing texture mapping on the image to be projected by using the texture sampling coordinates and outputting the image subjected to the texture mapping to the grating pixel surface for projection.
11. A projector, comprising:
the optical-mechanical lens is used for projecting the standard grid image of the grating pixel surface on the projection surface;
a camera for shooting the projection surface to obtain a projection point cloud image and sending the projection point cloud image to a graphics processor; and
a memory storing computer-executable instructions that, when executed, cause the graphics processor to perform the method of any of claims 1-9.
12. A computer readable storage medium, wherein the computer readable storage medium has stored thereon one or more computer programs which, when executed by a graphics processor of a projector, implement the method of any of claims 1-9.
CN201910249736.5A 2019-03-29 2019-03-29 Projector projection distortion correction method and device and projector Active CN110111262B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910249736.5A CN110111262B (en) 2019-03-29 2019-03-29 Projector projection distortion correction method and device and projector

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910249736.5A CN110111262B (en) 2019-03-29 2019-03-29 Projector projection distortion correction method and device and projector

Publications (2)

Publication Number Publication Date
CN110111262A CN110111262A (en) 2019-08-09
CN110111262B true CN110111262B (en) 2021-06-04

Family

ID=67484743

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910249736.5A Active CN110111262B (en) 2019-03-29 2019-03-29 Projector projection distortion correction method and device and projector

Country Status (1)

Country Link
CN (1) CN110111262B (en)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110784699B (en) * 2019-11-01 2021-06-25 成都极米科技股份有限公司 Projection processing method, projection processing device, projector and readable storage medium
CN110864649B (en) * 2019-11-25 2021-10-08 歌尔光学科技有限公司 Method for determining compensation value and determining flatness of optical module
CN113066158B (en) * 2019-12-16 2023-03-10 杭州海康威视数字技术股份有限公司 Vehicle-mounted all-round looking method and device
CN111385947B (en) * 2020-03-23 2022-04-26 北京经纬恒润科技股份有限公司 Control method and device applied to pixel lamp
CN113643414B (en) * 2020-05-11 2024-02-06 北京达佳互联信息技术有限公司 Three-dimensional image generation method and device, electronic equipment and storage medium
CN111669557B (en) * 2020-06-24 2022-05-13 歌尔光学科技有限公司 Projected image correction method and correction device
CN111935468B (en) * 2020-09-24 2021-01-22 歌尔股份有限公司 Method and device for detecting deviation of projection center and computer readable storage medium
CN112330794B (en) * 2020-10-09 2022-06-14 同济大学 Single-camera image acquisition system based on rotary bipartite prism and three-dimensional reconstruction method
CN112295109B (en) * 2020-10-20 2022-04-01 北京理工大学 Therapeutic light control method and photodynamic therapy device using same
CN112652047A (en) * 2020-10-23 2021-04-13 成都完美时空网络技术有限公司 Warping effect generation method, device, equipment and storage medium
CN112614190B (en) * 2020-12-14 2023-06-06 北京淳中科技股份有限公司 Method and device for projecting mapping
CN112672127B (en) * 2020-12-29 2023-02-14 视田科技(天津)有限公司 Automatic calibration method for projection reflection picture
CN112614075B (en) * 2020-12-29 2024-03-08 凌云光技术股份有限公司 Distortion correction method and equipment for surface structured light 3D system
CN112995625B (en) * 2021-02-23 2022-10-11 峰米(北京)科技有限公司 Trapezoidal correction method and device for projector
CN113099198B (en) * 2021-03-19 2023-01-10 深圳市火乐科技发展有限公司 Projection image adjusting method and device, storage medium and electronic equipment
CN115412719B (en) * 2021-05-26 2024-03-01 致伸科技股份有限公司 Method for aligning camera lens and light source
CN113487500B (en) * 2021-06-28 2022-08-02 北京紫光展锐通信技术有限公司 Image distortion correction method and apparatus, electronic device, and storage medium
CN113838002A (en) * 2021-08-25 2021-12-24 网易(杭州)网络有限公司 Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN115908679A (en) * 2021-08-31 2023-04-04 北京字跳网络技术有限公司 Texture mapping method, device, equipment and storage medium
CN113938661B (en) * 2021-09-29 2024-05-07 漳州万利达科技有限公司 Projector side projection correction method, terminal equipment and storage medium
WO2023087947A1 (en) * 2021-11-16 2023-05-25 海信视像科技股份有限公司 Projection device and correction method
CN115314689A (en) * 2022-08-05 2022-11-08 深圳海翼智新科技有限公司 Projection correction method, projection correction device, projector and computer program product
CN116540872B (en) * 2023-04-28 2024-06-04 中广电广播电影电视设计研究院有限公司 VR data processing method, device, equipment, medium and product
CN117058342B (en) * 2023-10-12 2024-01-26 天津科汇新创科技有限公司 Spine 3D voxel model construction method based on projection image

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7705877B2 (en) * 2004-01-28 2010-04-27 Hewlett-Packard Development Company, L.P. Method and system for display of facial features on nonplanar surfaces
JP3714365B1 (en) * 2004-03-30 2005-11-09 セイコーエプソン株式会社 Keystone correction of projector
CN101335901A (en) * 2007-06-29 2008-12-31 三星电子株式会社 Projected picture correcting method and apparatus
US7874678B2 (en) * 2008-07-02 2011-01-25 Hines Stephen P Projected autostereoscopic lenticular 3-D system
CN101321303A (en) * 2008-07-17 2008-12-10 上海交通大学 Geometric and optical correction method for non-plane multi-projection display
CN102184566A (en) * 2011-04-28 2011-09-14 湘潭大学 Micro projector mobile phone platform-based portable three-dimensional scanning system and method
CN103426149B (en) * 2013-07-24 2016-02-03 玉振明 The correction processing method of wide-angle image distortion
JP6636252B2 (en) * 2015-03-19 2020-01-29 株式会社メガチップス Projection system, projector device, imaging device, and program
CN107454373B (en) * 2016-05-31 2019-06-14 财团法人工业技术研究院 Projection system and non-planar automatic correction method and automatic correction processing device thereof
CN107833253B (en) * 2017-09-22 2020-08-04 北京航空航天大学青岛研究院 RGBD three-dimensional reconstruction texture generation-oriented camera attitude optimization method
CN108377371A (en) * 2018-02-09 2018-08-07 深圳市火乐科技发展有限公司 A kind of method and device of projection image correction
CN108665536B (en) * 2018-05-14 2021-07-09 广州市城市规划勘测设计研究院 Three-dimensional and live-action data visualization method and device and computer readable storage medium

Also Published As

Publication number Publication date
CN110111262A (en) 2019-08-09

Similar Documents

Publication Publication Date Title
CN110111262B (en) Projector projection distortion correction method and device and projector
CN110336987B (en) Projector distortion correction method and device and projector
CN110191326B (en) Projection system resolution expansion method and device and projection system
CN108876926B (en) Navigation method and system in panoramic scene and AR/VR client equipment
CN108257183B (en) Camera lens optical axis calibration method and device
KR101319777B1 (en) Panoramic projection device and method implemented by said device
US5898438A (en) Texture mapping of photographic images to CAD surfaces
CN108769462B (en) Free visual angle scene roaming method and device
KR20170005009A (en) Generation and use of a 3d radon image
JPWO2018235163A1 (en) Calibration apparatus, calibration chart, chart pattern generation apparatus, and calibration method
CN102436639A (en) Image acquiring method for removing image blurring and image acquiring system
JP6674643B2 (en) Image processing apparatus and image processing method
CN110648274B (en) Method and device for generating fisheye image
CN114697623A (en) Projection surface selection and projection image correction method and device, projector and medium
JPH11175762A (en) Light environment measuring instrument and device and method for shading virtual image using same
JP2023546739A (en) Methods, apparatus, and systems for generating three-dimensional models of scenes
CN113643414A (en) Three-dimensional image generation method and device, electronic equipment and storage medium
US20180213215A1 (en) Method and device for displaying a three-dimensional scene on display surface having an arbitrary non-planar shape
US20170289516A1 (en) Depth map based perspective correction in digital photos
Hach et al. Cinematic bokeh rendering for real scenes
CN114283243A (en) Data processing method and device, computer equipment and storage medium
WO2022126430A1 (en) Auxiliary focusing method, apparatus and system
JP2001016621A (en) Multi-eye data input device
JP4554231B2 (en) Distortion parameter generation method, video generation method, distortion parameter generation apparatus, and video generation apparatus
CN116524022B (en) Offset data calculation method, image fusion device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220803

Address after: Room 1903, 19 / F, building D, Zhizhen building, No. 7 Zhichun Road, Haidian District, Beijing 100088

Patentee after: Bird innovation (Beijing) Technology Co.,Ltd.

Address before: 100191 room 1801, block D, Zhizhen building, 7 Zhichun Road, Haidian District, Beijing

Patentee before: BEIJING XIAONIAO TINGTING TECHNOLOGY Co.,Ltd.