CN110336987B - Projector distortion correction method and device and projector - Google Patents

Projector distortion correction method and device and projector

Info

Publication number
CN110336987B
CN110336987B (application CN201910266198.0A)
Authority
CN
China
Prior art keywords
projection
coordinate system
pixel
plane
projector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910266198.0A
Other languages
Chinese (zh)
Other versions
CN110336987A (en)
Inventor
苏劲 (Su Jin)
蔡志博 (Cai Zhibo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bird Innovation Beijing Technology Co ltd
Original Assignee
Beijing Xiaoniao Tingting Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaoniao Tingting Technology Co Ltd filed Critical Beijing Xiaoniao Tingting Technology Co Ltd
Priority to CN201910266198.0A priority Critical patent/CN110336987B/en
Publication of CN110336987A publication Critical patent/CN110336987A/en
Application granted granted Critical
Publication of CN110336987B publication Critical patent/CN110336987B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141 Constructional details thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179 Video signal processing therefor
    • H04N9/3185 Geometric adjustment, e.g. keystone or convergence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Projection Apparatus (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a projector distortion correction method and device, and a projector. The method comprises the following steps: projecting a standard grid image on the grating pixel surface onto a projection surface through the optical machine lens, and shooting the projection surface with a camera to obtain a projection point cloud image of the grid points on the projection surface; obtaining, from a pre-constructed three-dimensional projection model, the three-dimensional coordinates of the grid points on the projection surface in the optical machine lens coordinate system; fitting a reference projection plane to those three-dimensional coordinates, constructing a world coordinate system from the reference projection plane and the projector attitude information acquired by the inertial measurement unit, and obtaining the conversion relation between the world coordinate system and the optical machine lens coordinate system; determining, from that conversion relation, the target area for distortion-free projection on the grating pixel surface; and obtaining the mapping relation between the grating pixel surface and the target area, and texture-mapping the content to be projected with that mapping relation. The invention can eliminate the trapezoidal distortion caused by a non-ideal projection angle.

Description

Projector distortion correction method and device and projector
Technical Field
The invention relates to a projector distortion correction method and device and a projector.
Background
With the maturation of short-throw optical engine technology and the substantial reduction in cost, smart projectors are increasingly used in homes. Driven by home users' demand for a high-quality viewing experience, the resolution, brightness and color quality of smart projectors are improving rapidly. In general, if the projection direction of the projector is not at right angles to the projection screen, the projected picture suffers keystone distortion. In practical applications, a user may tilt the projector vertically to obtain projected pictures of different heights and sizes, which produces vertical keystone distortion. Moreover, under the influence of factors such as an uneven placement surface, the projector may have an axial rotation component, so that its horizontal plane deviates from the absolute horizontal plane, and this also distorts the final projected picture.
At present, most projectors provide a vertical keystone correction function, which makes the final projected picture rectangular by adjusting the strength of the correction in the vertical direction. Some projector manufacturers have further developed a "horizontal keystone correction function", which addresses the horizontal keystone distortion arising when the projector lens cannot be perpendicular to the screen, so that a standard rectangular picture can be projected even from one side of the projection screen.
However, the existing solutions cannot fully correct the distortion caused by axial rotation deviation, so the projector adapts poorly to the projection environment and the user's viewing experience suffers. In addition, most existing keystone correction techniques require the user to complete the correction by eye, manually adjusting correction parameters, so the degree of automation is generally low.
Disclosure of Invention
The invention provides a projector distortion correction method, a projector distortion correction device and a projector, which at least partially solve the problems.
In a first aspect, the present invention provides a distortion correction method for a projector, the projector having an optical machine lens, a camera and an inertial measurement unit, the method comprising: projecting the standard grid image on the grating pixel surface on a projection surface by using the optical machine lens, and shooting the projection surface by using the camera to obtain a projection point cloud image of grid points on the projection surface; acquiring three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system according to the corresponding relation of the pixel points between the standard grid image and the projection point cloud image and according to a pre-constructed three-dimensional projection model, wherein the three-dimensional projection model is constructed based on the optical machine lens coordinate system and the camera coordinate system of the projector; acquiring a reference projection plane by using the three-dimensional coordinates of the grid points on the projection plane, constructing a world coordinate system by using the reference projection plane and the attitude information of the projector acquired by the inertial measurement unit, and acquiring a conversion relation between the world coordinate system and the optical machine lens coordinate system; acquiring a target area of the undistorted projection on the grating pixel surface according to the conversion relation between the world coordinate system and the optical machine lens coordinate system; and obtaining the mapping relation between the grating pixel surface and the target area, and performing texture mapping on the content to be projected by utilizing the mapping relation to realize distortion-free projection.
In a second aspect, the present invention provides a distortion correction apparatus for a projector having an optical machine lens, a camera and an inertial measurement unit, the apparatus comprising: the projection point cloud obtaining unit is used for projecting the standard grid image on the grating pixel surface on a projection surface by using the optical machine lens and shooting the projection surface by using the camera to obtain a projection point cloud image of grid points on the projection surface; a projection point cloud coordinate calculation unit, which obtains the three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system according to the pixel point corresponding relation between the standard grid image and the projection point cloud image and according to a pre-constructed three-dimensional projection model, wherein the three-dimensional projection model is constructed based on the optical machine lens coordinate system and the camera coordinate system of the projector; the conversion relation calculation unit is used for acquiring a reference projection plane by utilizing the three-dimensional coordinates of the grid points on the projection surface, constructing a world coordinate system by utilizing the reference projection plane and the attitude information of the projector acquired by the inertia measurement unit, and acquiring the conversion relation between the world coordinate system and the optical machine lens coordinate system; the target area determining unit is used for acquiring a target area of the undistorted projection on the grating pixel surface according to the conversion relation between the world coordinate system and the optical machine lens coordinate system; and the texture mapping unit is used for obtaining the mapping relation between the grating pixel surface and the target area, and performing texture mapping on the content to be projected by 
utilizing the mapping relation to realize distortion-free projection.
In a third aspect, the present invention provides a projector comprising: an optical machine lens for projecting the standard grid image on the grating pixel surface onto a projection surface; a camera that shoots the projection surface to obtain a projection point cloud image and sends it to the graphics processor; an inertial measurement unit for measuring the attitude information of the projector and sending it to the graphics processor; a memory storing computer-executable instructions; and a graphics processor that, when the instructions are executed, performs the aforementioned projector distortion correction method.
In a fourth aspect, the present invention provides a computer-readable storage medium having one or more computer programs stored thereon, the one or more computer programs, when executed, implementing the aforementioned projector distortion correction method.
The invention builds a camera, an optical machine lens and an inertial measurement unit into the projector. The built-in camera and the optical machine lens give the projector stereoscopic vision, so the projection environment can be actively modeled in three dimensions and a three-dimensional projection model established. Based on this model, the projector can automatically determine the distribution of the projection surface by computer-vision methods, without external equipment or instruments, while the built-in inertial measurement unit supplies the projector's attitude information. Combining the projection-surface distribution with the attitude information, the target area for distortion-free projection on the grating pixel surface is calculated; matching this target area against the vertices of the original grating image yields the mapping relation between the original grating image and the distortion-free pre-corrected image. During real-time projection, the image to be projected can then be corrected by texture mapping, eliminating the trapezoidal distortion caused by a non-ideal projection angle between the projector and the projection surface.
Drawings
Fig. 1 is a flowchart illustrating a projector distortion correction method according to an embodiment of the present invention;
fig. 2 is a schematic view of an optical system of a projector according to an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating pinhole imaging according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a process for generating a projection point cloud image according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a three-dimensional projection model according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating a process of fitting a reference projection plane according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a relationship between a world coordinate system, an optical-mechanical lens coordinate system, a grating pixel plane, and a reference projection plane according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a quadrilateral area and a maximum inscribed rectangle on a reference projection plane according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a target area on a pixel plane of a grating according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of a triangular mesh according to an embodiment of the present invention;
FIG. 11 is a grid diagram of a pre-corrected target frame according to an embodiment of the present invention;
FIG. 12 is a schematic diagram illustrating the effect of distortion correction projection according to an embodiment of the present invention;
fig. 13 is a block diagram showing the configuration of a distortion correcting apparatus of a projector according to an embodiment of the present invention;
fig. 14 is a schematic structural view of a projector according to an embodiment of the present invention.
Detailed Description
The embodiment of the invention provides a projector distortion correction method. It uses several computer-vision methods together with the projector's attitude information to compute the surface-distribution parameters of the projection surface; from these parameters it computes the grid texture layout of a pre-corrected target picture, and the GPU (graphics processing unit) applies the texture-mapping correction to the input image. This produces a pre-distorted projection image, achieving a distortion-free projection effect even when the projection angle between the projector and the projection surface is not ideal. The embodiments of the invention also provide a corresponding device, a projector and a computer-readable storage medium, each described in detail below.
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings. It is to be understood that such description is merely illustrative and not intended to limit the scope of the present invention. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present invention.
The terminology used herein is for describing particular embodiments only and is not intended to limit the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly dictates otherwise. Furthermore, the terms "comprises", "comprising" and the like specify the presence of stated features, steps, operations and/or components, but do not preclude the presence or addition of one or more other features, steps, operations or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
Thus, the techniques of the present invention may be implemented in hardware and/or in software (including firmware, microcode, etc.). Furthermore, the techniques of this disclosure may take the form of a computer program product on a computer-readable storage medium having instructions stored thereon for use by or in connection with an instruction execution system. In the context of the present invention, a computer-readable storage medium may be any medium that can contain, store, communicate, propagate, or transport the instructions. For example, a computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the computer-readable storage medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
The invention provides a distortion correction method for a projector.
Fig. 1 is a flowchart illustrating a distortion correction method for a projector according to an embodiment of the present invention, and as shown in fig. 1, the method of the embodiment includes:
and S110, projecting the standard grid image on the grating pixel surface on a projection surface by using an optical machine lens, and shooting the projection surface by using a camera to obtain a projection point cloud image of grid points on the projection surface.
The optical engine in this embodiment may be understood as the projection module of a projection device. In general, the optical engine integrates the display core of the digital micromirror device (whose active area is also referred to as the grating pixel surface), the light source, the lens optical path and the heat dissipation into one mechanism, forming an integral component that is protected against dust and shock.
And S120, obtaining three-dimensional coordinates of the grid points on the projection surface based on an optical machine lens coordinate system according to the corresponding relation of the pixel points between the standard grid image and the projection point cloud image and according to a pre-constructed three-dimensional projection model, wherein the three-dimensional projection model is constructed based on the optical machine lens coordinate system and the camera coordinate system of the projector and is used for calculating the three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system.
S130, obtaining a reference projection plane by using the three-dimensional coordinates of the grid points on the projection plane, constructing a world coordinate system by using the reference projection plane and the attitude information of the projector obtained by the inertial measurement unit, and obtaining a conversion relation between the world coordinate system and the optical machine lens coordinate system.
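As a concrete illustration of the plane-fitting part of S130, a reference plane can be fitted to the recovered three-dimensional grid points by least squares using an SVD. This is a minimal sketch under assumed names (`fit_reference_plane` is not from the patent), not the patent's exact fitting procedure:

```python
import numpy as np

def fit_reference_plane(points):
    """Least-squares plane fit: returns (unit normal n, centroid c) such
    that n . (p - c) ~ 0 for points p on the plane."""
    pts = np.asarray(points, dtype=float)
    c = pts.mean(axis=0)
    # The right singular vector with the smallest singular value of the
    # centered point set is the direction of least variance: the normal.
    _, _, vt = np.linalg.svd(pts - c)
    n = vt[-1]
    return n / np.linalg.norm(n), c
```

In practice one would reject outlier grid points (e.g. with RANSAC) before the final fit, since spurious blob detections would otherwise tilt the plane.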
In some embodiments, the axial rotation angle measured by the gyroscope identifies the degree of deviation of the projector from the absolute horizontal plane. Here, the absolute horizontal plane can be understood as the XOZ plane of the optical machine lens coordinate system of an ideal projector, i.e. a projector with no deviation in the vertical, horizontal and rotation directions; the horizontal plane of the projector can be understood as the XOZ plane of its actual optical machine lens coordinate system.
In some embodiments, the inertial measurement unit comprises a gyroscope, and the world coordinate system is constructed by using the reference projection plane and the attitude information of the projector obtained by the inertial measurement unit, in particular by using the reference projection plane and the axial rotation angle measured by the gyroscope.
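One way to realize such a world-coordinate-system construction is to take the fitted plane normal as one axis and project a gravity-referenced "up" vector (derived from the IMU attitude) onto the plane for the second axis. This is an illustrative sketch with my own function name, not the patent's exact construction:

```python
import numpy as np

def build_world_frame(plane_normal, up_hint):
    """Orthonormal world frame: Z along the projection-plane normal, Y as
    close to the gravity-referenced 'up' as the plane allows, X completing
    a right-handed frame. Assumes up_hint is not parallel to the normal."""
    z = plane_normal / np.linalg.norm(plane_normal)
    y = up_hint - (up_hint @ z) * z     # project 'up' onto the plane
    y = y / np.linalg.norm(y)
    x = np.cross(y, z)
    return np.stack([x, y, z], axis=1)  # columns are the world axes
```

The returned matrix directly gives the rotation part of the conversion between the world coordinate system and the optical machine lens coordinate system.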
And S140, acquiring a target area of the undistorted projection on the grating pixel surface according to the conversion relation between the world coordinate system and the optical machine lens coordinate system.
S150, obtaining the mapping relation between the grating pixel surface and the target area, and performing texture mapping on the content to be projected by using the mapping relation to realize distortion-free projection.
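To make the mapping in S150 concrete: once the four corners of the target area are known on the grating pixel surface, the raster-to-target correspondence can be modelled as a homography solved by the direct linear transform (DLT). This is an illustrative sketch (the embodiment described later uses a triangular mesh on the GPU); the function names are mine:

```python
import numpy as np

def homography(src, dst):
    """Solve H (3x3) with dst ~ H @ src for four (or more) point pairs
    using the DLT: each pair contributes two rows of a homogeneous
    system whose null vector is H flattened."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, pt):
    """Apply a homography to a 2-D point (homogeneous divide)."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]
```

Texture mapping then amounts to evaluating this mapping (or its triangulated piecewise approximation) for every output pixel.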
In this embodiment, a camera, an optical machine lens and an inertial measurement unit are built into the projector. The built-in camera and the optical machine lens give the projector stereoscopic vision, so the projection environment can be actively modeled in three dimensions and a three-dimensional projection model established. Based on this model, the projector automatically determines the distribution of the projection surface by computer-vision methods, without external equipment or instruments, while the built-in inertial measurement unit supplies the projector's attitude information. Combining the two, the target area for distortion-free projection on the grating pixel surface is calculated; matching this target area against the vertices of the original grating image yields the mapping relation between the original grating image and the distortion-free pre-corrected image. During real-time projection, the image to be projected is corrected by texture mapping, eliminating the trapezoidal distortion caused by a non-ideal projection angle between the projector and the projection surface.
To improve the accuracy of distortion correction, this embodiment calibrates the projector in advance to obtain its distortion parameters, which include the intrinsic and extrinsic parameters of the optical machine lens and the camera.
The calibration process for the projector is as follows:
as shown in fig. 2, there is a certain distance between the optical-mechanical lens and the camera of the projector, so that the same world coordinate point has parallax on the optical-mechanical grating pixel plane and the sensor pixel plane, for example, point a in the projection area in fig. 2 corresponds to pixel position a1 on the optical-mechanical grating pixel plane and pixel position a2 on the sensor pixel plane, thereby satisfying the formation condition of binocular stereo vision, since the three-dimensional projection model can be constructed based on the optical-mechanical lens coordinate system and the camera coordinate system of the projector.
The optical machine lens can be regarded as a reversed camera, and a pinhole imaging model similar to the camera's can be established for it, so its calibration principle is similar to that of the camera; this embodiment therefore describes obtaining the distortion parameters of the camera.
As shown in fig. 3, the formula of the pinhole imaging model is s·m′ = A[R|t]M′, where s is a normalized scale factor; A is the intrinsic parameter matrix of the camera; [R|t] is the extrinsic parameter matrix that converts coordinates from the world coordinate system to the camera coordinate system, with R the rotation matrix and t the translation vector; M′ is the coordinate of a point in the world coordinate system and m′ its coordinate in the image. In the pinhole imaging optical path shown in fig. 3, a point with coordinates (x, y, z) in the coordinate system of the object point Fc maps to the uv-plane coordinates (u, v) as u = f_x·x′ + c_x and v = f_y·y′ + c_y, where x′ = x/z and y′ = y/z are the normalized coordinates, f_x and f_y are the focal lengths in pixels, and (c_x, c_y) are the principal-point coordinates. The coordinate system of the object point Fc in fig. 3 corresponds to the camera coordinate system of this embodiment, and the uv-plane coordinate system corresponds to the sensor pixel-plane coordinate system. Therefore, once the conversion between the world coordinate system and the camera coordinate system is obtained, the correspondence between the world coordinate system and the sensor pixel-plane coordinate system follows from u = f_x·x′ + c_x and v = f_y·y′ + c_y.
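The pinhole relations can be exercised numerically as follows (parameter values are illustrative and `project_point` is my own name):

```python
import numpy as np

def project_point(M_world, R, t, fx, fy, cx, cy):
    """Project a 3-D world point to pixel coordinates with the pinhole
    model s*m' = A [R|t] M'."""
    X = R @ np.asarray(M_world, dtype=float) + t  # world -> camera frame
    x_n, y_n = X[0] / X[2], X[1] / X[2]           # normalize by depth
    return fx * x_n + cx, fy * y_n + cy           # apply intrinsics
```

For example, with identity extrinsics a point on the optical axis lands exactly on the principal point.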
For the camera intrinsic parameter matrix, the intrinsic parameters of the projector camera can be obtained through the calibration board and structured-light projection; they comprise the focal lengths, the radial distortion parameters, the tangential distortion parameters and the principal-point coordinates (i.e., the center point of the sensor image). With distortion taken into account, the correspondence between the camera coordinate system and the sensor pixel-plane coordinate system is u = f_x·x″ + c_x and v = f_y·y″ + c_y, where:
x″ = x′·(1 + k1·r² + k2·r⁴ + k3·r⁶)/(1 + k4·r² + k5·r⁴ + k6·r⁶) + 2·p1·x′·y′ + p2·(r² + 2·x′²) + s1·r² + s2·r⁴
y″ = y′·(1 + k1·r² + k2·r⁴ + k3·r⁶)/(1 + k4·r² + k5·r⁴ + k6·r⁶) + p1·(r² + 2·y′²) + 2·p2·x′·y′ + s3·r² + s4·r⁴
with r² = x′² + y′², where k1, k2, k3, k4, k5, k6 are the radial distortion parameters of the camera, p1, p2 the tangential distortion parameters, and s1, s2, s3, s4 the thin-prism distortion parameters.
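A sketch of this distortion model with the parameters named above (zero-distortion defaults; the function name is mine):

```python
import numpy as np

def distort(xp, yp, k=(0,) * 6, p=(0, 0), s=(0,) * 4):
    """Map ideal normalized coords (x', y') to distorted (x'', y'') using
    the rational radial + tangential + thin-prism model."""
    k1, k2, k3, k4, k5, k6 = k
    p1, p2 = p
    s1, s2, s3, s4 = s
    r2 = xp * xp + yp * yp
    radial = (1 + k1 * r2 + k2 * r2**2 + k3 * r2**3) / \
             (1 + k4 * r2 + k5 * r2**2 + k6 * r2**3)
    xpp = xp * radial + 2 * p1 * xp * yp + p2 * (r2 + 2 * xp * xp) \
          + s1 * r2 + s2 * r2**2
    ypp = yp * radial + p1 * (r2 + 2 * yp * yp) + 2 * p2 * xp * yp \
          + s3 * r2 + s4 * r2**2
    return xpp, ypp
```

With all parameters zero the map is the identity, which gives a quick sanity check of a calibration pipeline.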
In this embodiment, the translation vector and rotation matrix between the optical machine lens and the camera are also required. Specifically, the intrinsic-calibration process also yields the rotation matrix R_p and translation vector t_p from the world coordinate system to the optical machine lens coordinate system, and the rotation matrix R_c and translation vector t_c from the world coordinate system to the camera coordinate system. From the pinhole imaging model:
[X_p, Y_p, Z_p]ᵀ = R_p·[X, Y, Z]ᵀ + t_p
[X_c, Y_c, Z_c]ᵀ = R_c·[X, Y, Z]ᵀ + t_c
where (X, Y, Z) are the three-dimensional coordinates of a point in the world coordinate system, and (X_p, Y_p, Z_p) and (X_c, Y_c, Z_c) are its corresponding coordinates in the optical machine lens coordinate system and the camera coordinate system. Combining the two formulas gives the relative position relationship between the optical machine lens and the camera:
[X_p, Y_p, Z_p]ᵀ = R_p·R_c⁻¹·([X_c, Y_c, Z_c]ᵀ − t_c) + t_p
i.e., the binocular extrinsics are R = R_p·R_c⁻¹ and t = t_p − R_p·R_c⁻¹·t_c.
therefore, the external parameters of the binocular vision of the projector can be obtained, and the coordinate conversion from the camera coordinate system to the optical machine lens coordinate system is realized.
Before the projector leaves the factory, the intrinsic and extrinsic parameters of the optical machine lens and the camera are obtained by the above calibration method. In some embodiments, the raster resolution used for calibration is 1920x1080; the virtual imaging of the grid points on the raster surface is obtained through re-projection, and the calibration residual is calculated. In some embodiments the intrinsic calibration residual is 0.2137 pixels and the extrinsic calibration residual is 0.4324 pixels. The calibrated intrinsic and extrinsic parameters are written into the projector for use in the subsequent distortion correction process.
In use, when the user selects the projection-surface distortion correction function in the projector's interface, the projector prompts that it is entering the distortion correction flow; the projector should then be aimed at the projection surface to be corrected and held in a fixed position while the automatic correction flow runs. Steps S110 to S150 are described in detail below with reference to fig. 4 to 12.
First, step S110 is executed, namely, the optical engine lens is used to project the standard grid image on the grating pixel surface onto the projection surface, and the camera is used to shoot the projection surface to obtain the projection point cloud image of the grid point on the projection surface.
In general, the denser the projection-surface point cloud, the higher the measurement accuracy of the projection surface. However, owing to factors such as blob spreading caused by lens distortion, acquisition noise and background light, an overly dense grid slows the grid search and may even cause grid points to be missed. This embodiment therefore introduces a staggered-grid generation and projection method, which increases grid-point density without hurting the grid-point search efficiency, improving the point-cloud measurement accuracy of the projection surface.
In some embodiments, the projection-surface point cloud image is obtained as follows. A first standard grid image on the grating pixel surface is projected onto the projection surface by the optical machine lens, and the camera shoots the projection surface to obtain a first projection point cloud image. A second standard grid image is then projected and shot in the same way to obtain a second projection point cloud image, the circular spots in the two standard grid images being staggered with respect to each other. The circular-spot grids in the two projection point cloud images are identified separately, giving the pixel position of each grating-pixel-surface circular spot in each image. Finally, the two projection point cloud images are superimposed to obtain the projection point cloud image, the pixel position of each circular spot in either image becoming the pixel position of the corresponding spot in the combined image.
As shown in fig. 4, the projector application generates two staggered circular-spot grid pictures whose circular spots are offset from each other by a grid distance. The two pictures are projected onto the projection plane by the optical machine lens in sequence, and the projected pictures are captured by the projector's camera; the grid circular spots are searched in each captured point cloud image and their pixel coordinates are recorded. The coordinates of the staggered grid spots from the two point cloud images are then combined, yielding the spot center coordinates of a projection point cloud image whose grid density is the sum of the spot densities of the two point cloud images.
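The merging step described above amounts to concatenating the two sets of detected spot centers. A minimal NumPy sketch (the half-pitch offset and the coordinates below are illustrative, not taken from the patent):

```python
import numpy as np

def merge_staggered_grids(centers_a, centers_b):
    """Combine the circle-spot centers detected in the two staggered grid
    pictures into one point set whose density is the sum of the two."""
    return np.vstack([np.asarray(centers_a, float),
                      np.asarray(centers_b, float)])

# Illustrative 2x2 grid with a 10-pixel pitch; the second grid is assumed
# to be staggered by half a pitch.
grid_a = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
grid_b = grid_a + 5.0
merged = merge_staggered_grids(grid_a, grid_b)   # 8 spot centers
```

In practice each input set would come from the circle-spot detection on one captured point cloud image.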
After obtaining the projection point cloud image of the grid point on the projection surface, continuously executing step S120, that is, obtaining the three-dimensional coordinates of the grid point on the projection surface based on the optical machine lens coordinate system according to the pixel point corresponding relationship between the standard grid image and the projection point cloud image and the pre-constructed three-dimensional projection model, wherein the three-dimensional projection model is constructed based on the optical machine lens coordinate system and the camera coordinate system of the projector, and is used for calculating the three-dimensional coordinates of the grid point on the projection surface based on the optical machine lens coordinate system.
Before the distribution of the projection surface is obtained from the projection-surface grid points, the obtained distortion parameters (the internal parameters of the optical machine lens and the camera) are used to correct the images on the grating pixel plane and the sensor pixel plane, giving the grid point coordinates (u_c, v_c) of the corrected camera sensor image and the grid point coordinates (u_p, v_p) of the corrected optical machine image, where the distortion parameters are obtained as described above.
In some embodiments, the three-dimensional projection model is constructed by: firstly, establishing a first linear relation according to the optical center of the camera light path and a first correction point of the optical center on the camera sensor pixel surface; then, establishing a second linear relation according to the optical center of the optical path of the optical mechanical lens and a second correction point of the optical center on the grating pixel surface of the optical mechanical lens; then establishing a third linear relation between the first correction point and the second correction point according to an external parameter matrix between the camera and the optical-mechanical lens; and finally, obtaining the three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system according to the first linear relation, the second linear relation and the third linear relation.
The three-dimensional projection model of the embodiment calculates the three-dimensional coordinates of the grid points on the projection surface by adopting a triangle method. As shown in fig. 5, the mapping points of any grid point on the projection plane on the sensor pixel plane and the grating pixel plane can be obtained, and the corresponding relationship between the pixel points of the standard grid image and the projection point cloud image is determined, so that the three-dimensional coordinates of the grid points of the projection plane can be reconstructed. That is, since the projection plane point cloud image is obtained by projecting the standard grid image by the optical machine lens and collecting the standard grid image by the camera, the three-dimensional coordinates of each grid point of the projection plane can be reconstructed based on the three-dimensional projection model shown in fig. 5.
In one example, the u_c-v_c plane coordinate system in fig. 5 corresponds to the coordinate system of the sensor pixel plane, and the u_p-v_p plane coordinate system corresponds to the coordinate system of the grating pixel plane. According to the formulas u = f_x·x″ + c_x and v = f_y·y″ + c_y described above, the coordinate q_c of the camera optical-path optical center O_c in the sensor pixel plane coordinate system and the coordinate q_p of the optical machine lens optical-path optical center O_p in the grating pixel plane coordinate system can be obtained. From the three-dimensional projection model constructed in this embodiment, the three-dimensional coordinates of the corresponding projection-plane grid point Q_w in the optical machine lens coordinate system can then be calculated.
Let the coordinates of a grid point of the projection plane be (X_p, Y_p, Z_p); then

s_c · [u_c, v_c, 1]^T = A_c · [R | t] · [X_p, Y_p, Z_p, 1]^T

and

s_p · [u_p, v_p, 1]^T = A_p · [X_p, Y_p, Z_p]^T

where s_c and s_p are the scale factors of the camera and the optical machine lens respectively, (u_c, v_c) and (u_p, v_p) are the two-dimensional coordinates of the projections of the spatial three-dimensional point on the sensor pixel plane and the grating pixel plane, A_c and A_p are the internal parameter matrices of the camera and the optical machine lens respectively, and [R | t] is the external parameter matrix between the camera and the optical machine lens.
Then the first linear relation is established from the optical-path optical center O_c of the camera and the first pixel point (u_c, v_c) on the camera sensor pixel plane, the second linear relation is established from the optical-path optical center O_p of the optical machine lens and the second pixel point (u_p, v_p) on the grating pixel plane of the optical machine lens, and (X_p, Y_p, Z_p) is calculated from the two relations.
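Solving the first and second linear relations jointly is a two-ray triangulation. The midpoint least-squares method below is a common stand-in for the patent's triangle method; the ray origins (optical centers) and back-projected ray directions are assumed to be given in the optical machine lens coordinate system:

```python
import numpy as np

def triangulate(o_c, d_c, o_p, d_p):
    """Midpoint of the closest points between the camera ray o_c + s*d_c
    and the optical machine lens ray o_p + t*d_p."""
    d_c = d_c / np.linalg.norm(d_c)
    d_p = d_p / np.linalg.norm(d_p)
    # Solve [d_c | -d_p] @ [s, t]^T = o_p - o_c in the least-squares sense.
    A = np.stack([d_c, -d_p], axis=1)
    s, t = np.linalg.lstsq(A, o_p - o_c, rcond=None)[0]
    return 0.5 * ((o_c + s * d_c) + (o_p + t * d_p))
```

For rays that actually intersect, the midpoint is the intersection itself; for slightly skew rays (noise), it is the point closest to both rays.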
After obtaining the three-dimensional coordinates of the grid points on the projection surface based on the optical-mechanical lens coordinate system, continuing to execute step S130, that is, obtaining a reference projection plane by using the three-dimensional coordinates of the grid points on the projection surface, constructing a world coordinate system by using the reference projection plane and the posture information of the projector obtained by the inertial measurement unit, and obtaining a conversion relationship between the world coordinate system and the optical-mechanical lens coordinate system.
In some embodiments, spatial filtering is performed on the mesh points on the projection plane according to the three-dimensional coordinates of the mesh points on the projection plane, and invalid mesh points in the mesh points are filtered to obtain valid mesh points; carrying out plane fitting on the effective grid points by using a least square method, and determining a plane obtained by fitting as a reference projection plane; the effective grid points are grid points which are approximately located on the same plane in the grid points on the projective plane, and the ineffective grid points are grid points which are far away from the plane in the grid points on the projective plane.
As shown in fig. 6, the three-dimensional point cloud composed of all the grid points on the projection plane may not lie on a single plane: the grid points may include spatial noise points, discontinuous points and out-of-plane points. Before the reference projection plane is reconstructed, these invalid grid points need to be filtered out to form a smooth projection-surface point cloud, and plane fitting is performed on the filtered point cloud to obtain the reference projection plane.
In some embodiments, the spatial noise points and the discontinuous points among the invalid grid points may be filtered out by low-pass filtering, and the out-of-plane points may then be removed by the following method to obtain the valid grid points:
Step A: randomly select three non-collinear grid points from the grid points, and obtain the sub-plane a0′x + a1′y + a2′z = d determined by them, where a0′, a1′, a2′ and d are constants;
Step B: calculate the distance d_i = |a0′x_i + a1′y_i + a2′z_i − d| between each grid point (x_i, y_i, z_i) on the projection plane and the sub-plane (with the normal vector (a0′, a1′, a2′) normalized), and reject abnormal grid points whose distance from the sub-plane is greater than a preset distance value, obtaining the number of reference grid points; a reference grid point is a grid point whose distance from the sub-plane is not greater than the preset distance value. For example, the preset distance value is t = 2σ, where σ is the standard deviation of the distances from all grid points to the current sub-plane: when d_i > 2σ the grid point is determined to be an abnormal point and removed; otherwise, when d_i ≤ 2σ, the grid point is kept as a reference grid point.
Steps A and B are repeated; after N iterations, the reference sub-plane with the maximum number of reference grid points among the N obtained sub-planes is determined, and the reference grid points of that sub-plane are the valid grid points.
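Iterating steps A and B N times is the classic RANSAC plane-fitting scheme. A sketch under the assumptions stated in the comments (the 2σ threshold from the text is the default, but a fixed threshold can be supplied instead):

```python
import numpy as np

def ransac_plane(points, n_iters=100, t=None, rng=None):
    """Return the valid grid points: the inliers of whichever random
    sub-plane, out of n_iters tries, has the most reference grid points."""
    rng = np.random.default_rng(rng)
    points = np.asarray(points, float)
    best_inliers = None
    for _ in range(n_iters):
        # Step A: three random grid points determine a candidate sub-plane.
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        if np.linalg.norm(n) < 1e-9:   # collinear sample, skip it
            continue
        n = n / np.linalg.norm(n)
        # Step B: distances of all grid points to the sub-plane.
        d = np.abs((points - p0) @ n)
        thresh = 2.0 * d.std() if t is None else t
        inliers = d <= thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return points[best_inliers]
```

The surviving points are then passed to the least-squares plane fit described in the text.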
Referring to fig. 6, the left side of fig. 6 is a cloud of projected surface points before filtering, and it can be seen that some points are in an out-of-plane position, and the right side of fig. 6 is a point cloud obtained by fitting a point cloud plane, and it can be seen that those out-of-plane points have been filtered out, and the remaining points are approximately on the fitted plane.
After the valid grid points are obtained, the plane equation of the reference projection plane may be determined from the three-dimensional coordinates of the valid grid points as z = a0·x + a1·y + a2, and the first unit normal vector of the plane is

N_bp = [a0, a1, −1] / norm([a0, a1, −1])

where a0, a1 and a2 are constants, N_bp is the first unit normal vector of the reference projection plane, and norm() is the vector norm operator.
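Assuming the least-squares fit uses the parameterization z = a0·x + a1·y + a2 (a common convention; the patent's exact notation is not fully legible in this translation), the coefficients and the first unit normal vector follow directly:

```python
import numpy as np

def fit_reference_plane(points):
    """Least-squares fit of z = a0*x + a1*y + a2 to the valid grid points;
    returns the coefficients and the unit normal N_bp = [a0, a1, -1]/norm."""
    pts = np.asarray(points, float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    a0, a1, a2 = coeffs
    n = np.array([a0, a1, -1.0])
    return (a0, a1, a2), n / np.linalg.norm(n)
```

Feeding in the RANSAC-filtered grid points gives the reference projection plane and its normal N_bp.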
After the reference projection plane and its attitude information are obtained, the attitude information of the projector can be acquired with the inertial measurement unit; in some embodiments, the inertial measurement unit includes a gyroscope. The gyroscope measures the axial rotation angle, and the world coordinate system is constructed from the axial rotation angle and the attitude information of the reference projection plane so as to eliminate trapezoidal distortion.
In some embodiments, the world coordinate system is constructed by: acquiring a first unit normal vector of the reference projection plane in the optical-mechanical lens coordinate system; obtaining a second unit normal vector of the absolute horizontal plane in the optical-mechanical lens coordinate system according to the Y-axis unit vector of the optical-mechanical lens coordinate system and the attitude information of the projector obtained by using the inertial measurement unit; a vector product of the first unit normal vector and the second unit normal vector is defined as a unit vector of the X axis of the world coordinate system, the second unit normal vector is defined as a unit vector of the Y axis of the world coordinate system, and the first unit normal vector is defined as a unit vector of the Z axis of the world coordinate system. After a world coordinate system is constructed, a translation matrix between the world coordinate system and the optical-mechanical lens coordinate system can be obtained according to the coordinate position of the world coordinate system origin in the optical-mechanical lens coordinate system, and a rotation matrix between the world coordinate system and the optical-mechanical lens coordinate system can be obtained according to unit vectors of the world coordinate system X axis and the world coordinate system Y axis.
As shown in fig. 7, the XOY plane of the world coordinate system constructed according to this embodiment should coincide with the reference projection plane, the X axis of the world coordinate system is parallel to the absolute horizontal plane, and the Z axis of the world coordinate system is perpendicular to the reference projection plane. Based on this, the vector representation of the three coordinate axes of the world coordinate system in the opto-mechanical lens coordinate system can be calculated by the following method:
The unit normal vector of the absolute horizontal plane is calculated first. The deviation of the XOZ plane of the optical machine lens coordinate system from the absolute horizontal plane can be determined from the axial rotation angle measured by the gyroscope of the inertial measurement unit; this axial rotation angle θ is the axial deviation angle shown in fig. 7, that is, the rotation angle of the projector about the Z-axis direction relative to the absolute horizontal plane.
The deviation of the XOZ plane of the optical machine lens coordinate system from the absolute horizontal plane is

R_z(θ) = [[cos θ, −sin θ, 0], [sin θ, cos θ, 0], [0, 0, 1]]

and the second unit normal vector of the absolute horizontal plane is

N_hor = R_z(θ) · [0, 1, 0]^T

where R_z(θ) represents the deviation of the XOZ plane of the optical machine lens coordinate system from the absolute horizontal plane, N_hor denotes the second unit normal vector, and θ denotes the axial rotation angle.
Then the unit vectors of the coordinate axes of the world coordinate system are calculated. Because the X axis of the world coordinate system is perpendicular to both the first unit normal vector N_bp and the second unit normal vector N_hor, the unit vector V_xw of the world coordinate system X axis can be obtained as the vector product of N_bp and N_hor, while N_bp and N_hor serve as the Z-axis unit vector V_zw and the Y-axis unit vector V_yw of the world coordinate system respectively. That is, V_xw = N_bp × N_hor, V_yw = N_hor, V_zw = N_bp.
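The construction of the three world axes from N_bp and the gyroscope angle θ can be sketched as follows (assuming, as in the text, that the lens Y-axis unit vector is [0, 1, 0] and that N_hor = R_z(θ)·[0, 1, 0]^T):

```python
import numpy as np

def world_axes(n_bp, theta):
    """World coordinate axes (V_xw, V_yw, V_zw) expressed in the optical
    machine lens coordinate system. n_bp: unit normal of the reference
    projection plane; theta: axial rotation angle in radians."""
    c, s = np.cos(theta), np.sin(theta)
    r_z = np.array([[c, -s, 0.0],
                    [s,  c, 0.0],
                    [0.0, 0.0, 1.0]])
    n_hor = r_z @ np.array([0.0, 1.0, 0.0])   # second unit normal vector
    v_x = np.cross(n_bp, n_hor)               # V_xw = N_bp x N_hor
    v_x = v_x / np.linalg.norm(v_x)
    return v_x, n_hor, n_bp                   # X, Y, Z unit vectors
```

With θ = 0 and the projection plane facing the lens, the world axes reduce to the lens axes up to the sign of Z.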
In some embodiments, the origin of the world coordinate system may be obtained from the center of gravity of the reference projection plane. In one example, the origin of the world coordinate system is the projection of the average gravity center point of the N sub-planes onto the reference projection plane: O_w = [X_0 − a_0·t, Y_0 − a_1·t, Z_0 + t], where

[X_0, Y_0, Z_0] = (1/N) · Σ_{k=1..N} (X_ck, Y_ck, Z_ck)

(X_ck, Y_ck, Z_ck) is the barycentric coordinate of the k-th sub-plane, [X_0, Y_0, Z_0] is the average gravity center point of the N sub-planes, and t is a constant.
As shown in fig. 7, after the world coordinate system is constructed, the conversion relation between the optical machine lens coordinate system and the world coordinate system can be determined as the translation vector T = O_w and the rotation matrix R = (V_xw, V_yw, V_zw)^T.
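With R and O_w in hand, points can be moved between the two coordinate systems. The sketch below assumes the convention p_world = R·(p_lens − O_w); the patent does not spell out the direction of the transform, so this sign convention is an assumption:

```python
import numpy as np

def lens_to_world(points, v_xw, v_yw, v_zw, o_w):
    """Map points from the optical machine lens coordinate system into the
    world coordinate system with R = (V_xw, V_yw, V_zw)^T and origin O_w."""
    R = np.stack([v_xw, v_yw, v_zw])     # rows are the world axes
    return (np.asarray(points, float) - o_w) @ R.T
```

The inverse mapping is p_lens = R^T · p_world + O_w, since R is orthonormal.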
After the conversion relation between the world coordinate system and the optical machine lens coordinate system is obtained, step S140 is executed, that is, the target area of the undistorted projection on the grating pixel plane is obtained according to the conversion relation between the world coordinate system and the optical machine lens coordinate system, so that the finally projected picture exhibits no trapezoidal distortion.
It is first necessary to obtain the actual projection area of the projector on the projection surface under the current relative pose of the projector and the projection surface. Using the pixel coordinates of the four vertices of the grating pixel plane referenced by the projector optical machine, and in a manner similar to the reconstruction of the three-dimensional coordinates of the projection-surface grid points, the ray directions through the four vertices of the grating pixel plane are calculated with the optical machine optical center as the starting point. The intersection points of the four rays with the reference projection plane are then obtained, the coordinates of the four intersection points in the world coordinate system are calculated, and their two-dimensional coordinates on the XOY plane of the world coordinate system, i.e. the reference projection plane, are obtained, thereby determining the quadrilateral area actually projected by the grating pixel plane on the projection surface.
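Each vertex ray is intersected with the reference projection plane; a minimal ray-plane intersection sketch (the plane is given by a point p0 on it and its unit normal n, both assumed known from the plane fit):

```python
import numpy as np

def ray_plane_intersection(origin, direction, p0, n):
    """Point where the ray origin + t*direction meets the plane through p0
    with normal n (assumes the ray is not parallel to the plane)."""
    t = np.dot(p0 - origin, n) / np.dot(direction, n)
    return origin + t * direction
```

Applying this to the four corner rays yields the four corners of the projected quadrilateral on the reference projection plane.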
In some embodiments, a method of acquiring a target region of an undistorted projection on a grating pixel plane comprises: obtaining reference projection points formed by four corner points on the grating pixel surface on the reference projection plane according to the three-dimensional projection model; acquiring a maximum inscribed rectangle of the quadrilateral region according to the aspect ratio of the grating pixel surface in the quadrilateral region surrounded by the reference projection points, wherein the aspect ratio of the maximum inscribed rectangle is consistent with the aspect ratio of the grating pixel surface, and the width and the height of the maximum inscribed rectangle are respectively parallel to the X axis and the Y axis of the world coordinate system; and obtaining back projection points of the maximum inscribed rectangle on the grating pixel surface according to the conversion relation between the world coordinate system and the optical-mechanical lens coordinate system and the internal parameter matrix of the optical-mechanical lens, wherein a rectangular region surrounded by the back projection points on the grating pixel surface is a target region.
Referring to fig. 8, in obtaining a quadrangular region, a maximum inscribed rectangle of the quadrangular region may be obtained according to the following method:
acquiring four vertexes of the quadrilateral area and a leftmost pixel point and a rightmost pixel point of the quadrilateral area, wherein an edge formed by a lower left vertex and a lower right vertex is a bottom edge, an edge formed by an upper left vertex and an upper right vertex is a top edge, and an edge formed by a lower right vertex and an upper right vertex is a right edge;
acquiring an initial position of a left vertical line according to the leftmost pixel point, wherein the left vertical line is parallel to the Y axis of the world coordinate system, and iteratively executing the following steps a-h on the left vertical line until the left vertical line reaches the rightmost pixel point:
step a, moving the left vertical line pixel by pixel to the right, and then executing step b;
b, forming a candidate upper left vertex of the maximum inscribed rectangle by using the intersection point of the left vertical line and the bottom line for the left vertical line of each pixel position, moving the candidate upper left vertex pixel by pixel upwards, and executing the step c;
step c, forming a candidate lower left vertex of the maximum inscribed rectangle by using the intersection point of the left vertical line and the top edge for the left vertical line of each pixel position, moving the candidate lower left vertex downwards pixel by pixel, and executing step d;
d, respectively acquiring a first straight line and a second straight line which are parallel to the X axis of the world coordinate system according to the candidate upper left vertex and the candidate lower left vertex, determining a candidate lower right vertex and a candidate upper right vertex according to an intersection point which is close to the left side and is formed by two intersection points of the first straight line, the second straight line and the right side, determining a candidate rectangle in a rectangular region surrounded by the four candidate vertices, calculating the area of the candidate rectangle, and executing the step e, wherein the candidate rectangle is the largest rectangle which is consistent with the aspect ratio of the raster pixel surface;
step e, when the area of the candidate rectangle is larger than the area of the maximum inscribed rectangle, updating the area of the maximum inscribed rectangle to the area of the candidate rectangle, updating the coordinates of the four vertexes of the maximum inscribed rectangle to the coordinates of the four vertexes of the candidate rectangle, and executing step f; when the area of the candidate rectangle is not larger than the area of the maximum inscribed rectangle, keeping the coordinates of the area of the maximum inscribed rectangle and the four vertexes of the maximum inscribed rectangle, and executing the step f; the initial value of the area of the maximum inscribed rectangle can be set to 0, and the initial four vertexes of the maximum inscribed rectangle can be set to null values;
step f, when the candidate top left vertex does not reach the intersection point pixel position of the left vertical line and the top edge, executing step b; executing step g when the candidate top left vertex reaches the intersection pixel position of the left vertical line and the top edge;
step g, when the candidate lower left vertex does not reach the intersection pixel position of the left vertical line and the bottom line, executing step c; executing step h when the candidate lower left vertex reaches the intersection pixel position of the left vertical line and the bottom line;
step h, when the left vertical line does not reach the rightmost pixel point, executing step a; when the left vertical line reaches the rightmost pixel point, taking the four recorded vertex coordinates corresponding to the maximum inscribed rectangle area as the four vertex coordinates of the maximum inscribed rectangle.
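For illustration, the iterative search of steps a to h can be replaced by a brute-force scan over candidate top-left corners on a pixel grid; it is slower than the patent's scheme but easy to verify. The quadrilateral is assumed convex with counter-clockwise vertices, and the aspect ratio is width/height:

```python
import numpy as np

def point_in_quad(p, quad):
    """True if p lies inside (or on) the convex CCW quadrilateral quad."""
    for i in range(4):
        a, b = quad[i], quad[(i + 1) % 4]
        cross = (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])
        if cross < -1e-9:
            return False
    return True

def max_inscribed_rect(quad, aspect, step=1.0):
    """Largest axis-aligned rectangle of the given width/height aspect
    ratio whose four corners stay inside quad, found by grid search."""
    xs, ys = quad[:, 0], quad[:, 1]
    best_area, best_rect = 0.0, None
    for x in np.arange(xs.min(), xs.max(), step):
        for y in np.arange(ys.min(), ys.max(), step):
            for w in np.arange(step, xs.max() - x + step, step):
                h = w / aspect
                corners = np.array([[x, y], [x + w, y],
                                    [x + w, y + h], [x, y + h]])
                if all(point_in_quad(c, quad) for c in corners):
                    if w * h > best_area:
                        best_area, best_rect = w * h, corners
                else:
                    break   # corners leave a convex region monotonically
    return best_rect
```

A coarse step can be used for a first pass and refined around the best candidate if pixel accuracy is needed.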
The maximum inscribed rectangle can be obtained by iterating steps a to h. As shown in fig. 8, the outer quadrilateral in fig. 8 is the quadrilateral area on the reference projection plane, which appears as an irregular quadrilateral because of the trapezoidal distortion; the inner rectangle is the largest undistorted rectangle within the obtained projection area, and it can be seen that this rectangle is parallel to the X axis and the Y axis of the world coordinate system.
After the target area of the undistorted projection is obtained, the step S150 is continuously executed, that is, the mapping relationship between the grating pixel surface and the target area is obtained, and the texture mapping is performed on the content to be projected by using the mapping relationship, so as to implement the undistorted projection.
After the target area of the undistorted projection is obtained, the four vertices of the target area are re-projected onto the grating pixel plane according to the internal parameters of the projector, the conversion relation between the world coordinate system and the optical machine lens coordinate system, the pinhole imaging model shown in fig. 3 and the three-dimensional projection model shown in fig. 5, so as to obtain four grating pixel coordinate points corresponding to the four vertices of the undistorted target area; these four points form a quadrilateral. If the original raster image is deformed into this quadrilateral area, the finally projected image corresponds to the maximum inscribed rectangle on the reference projection plane, so the quadrilateral area on the grating pixel plane formed by the four grating pixel coordinate points is the target area for raster image distortion correction. The mapped undistorted target area is shown in fig. 9: the whole picture is the original grid image on the whole grating pixel plane, and the white quadrilateral inside is the calculated target area.
In some embodiments, a homography mapping matrix between a target region and a grating pixel surface is obtained according to a correspondence between four vertices of the target region and four vertices of the grating pixel surface, so as to perform texture mapping on a picture to be projected by using the homography mapping matrix.
The four vertices of the target area and the four vertices of the grating pixel plane form a homography mapping between two-dimensional coordinates. The mapping can be accelerated by the GPU: the target area is divided into a mesh consisting of two triangles, the vertices of each triangle are ordered clockwise, and the vertex attributes include the vertex coordinates and the source-picture texture mapping coordinates.
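The four-point correspondence determines the homography via the direct linear transform; OpenCV's getPerspectiveTransform computes the same 3×3 matrix. A self-contained NumPy sketch (the point values in the test are illustrative):

```python
import numpy as np

def homography_from_points(src, dst):
    """3x3 homography H with dst ~ H @ src, from four point pairs (DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of the 8x9 system.
    _, _, vt = np.linalg.svd(np.asarray(rows, float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, p):
    """Apply H to a 2D point in homogeneous form and dehomogenize."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]
```

On the GPU this matrix is not applied directly; instead the equivalent warp is realized through the textured two-triangle mesh described above.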
As shown in fig. 10, A, B, C and D are the 4 vertices of a 1920×1080 resolution grating pixel plane; the dashed quadrilateral is the calculated target area, the two triangles are drawn with different fill styles, and each triangle vertex carries vertex coordinate and texture coordinate attributes.
During operation, the GPU rendering pipeline computes the texture coordinates of the pixels inside the vertex triangles by bilinear interpolation, and its sampler extracts pixel values from the corresponding coordinate positions in the source picture as the output pixel values of those positions. The texture-mapped image is output to the grating pixel plane of the projector, and the whole distortion pre-correction process is completed by projection; the effect is shown in fig. 11: after trapezoidal distortion correction, the original raster image is deformed into the target area. The final projection effect is shown in fig. 12, where the white quadrilateral contour is the uncorrected projection area and the image inside the quadrilateral is the projection picture obtained by keystone pre-correction; the picture projected after keystone correction appears as an undistorted rectangle whose horizontal axis is parallel to the absolute horizontal plane.
Therefore, this embodiment requires no external instruments or equipment: the projector itself automatically corrects the trapezoidal distortion caused by a non-ideal projection angle between the projector and the projection surface.
The invention also provides a projector distortion correction device.
Fig. 13 is a block diagram showing a configuration of a distortion correction apparatus for a projector according to an embodiment of the present invention, and as shown in fig. 13, the apparatus of the present embodiment includes:
the projection point cloud obtaining unit is used for projecting the standard grid image on the grating pixel surface on a projection surface by using the optical machine lens and shooting the projection surface by using the camera to obtain a projection point cloud image of grid points on the projection surface;
a projection point cloud coordinate calculation unit, which obtains the three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system according to the pixel point corresponding relation between the standard grid image and the projection point cloud image and according to a pre-constructed three-dimensional projection model, wherein the three-dimensional projection model is constructed based on the optical machine lens coordinate system and the camera coordinate system of the projector;
the conversion relation calculation unit is used for acquiring a reference projection plane by utilizing the three-dimensional coordinates of the grid points on the projection surface, constructing a world coordinate system by utilizing the reference projection plane and the attitude information of the projector acquired by the inertia measurement unit, and acquiring the conversion relation between the world coordinate system and the optical machine lens coordinate system;
the target area determining unit is used for acquiring a target area of the undistorted projection on the grating pixel surface according to the conversion relation between the world coordinate system and the optical machine lens coordinate system;
and the texture mapping unit is used for obtaining the mapping relation between the grating pixel surface and the target area, and performing texture mapping on the content to be projected by utilizing the mapping relation to realize distortion-free projection.
In some embodiments, the apparatus in fig. 13 further includes a preprocessing unit, which establishes a first linear relationship between the optical center of the camera optical path and a first calibration point of the optical center on the camera sensor pixel plane; establishing a second linear relation according to the optical center of the optical path of the optical machine lens and a second correction point of the optical center on the grating pixel surface; establishing a third linear relation between the first correction point and the second correction point according to an external parameter matrix between the camera and the optical-mechanical lens; and obtaining the three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system according to the first linear relation, the second linear relation and the third linear relation.
The conversion relation calculation unit comprises a reference projection plane construction module, a world coordinate system construction module and a matrix calculation module;
the reference projection plane construction module is used for carrying out spatial filtering on the grid points on the projection plane according to the three-dimensional coordinates of the grid points on the projection plane, filtering invalid grid points in the grid points and obtaining effective grid points; and performing plane fitting on the effective grid points by using a least square method, and determining a plane obtained by fitting as the reference projection plane. The reference projection plane construction module is further used for repeatedly executing the step A and the step B, after N times of iteration, determining a reference sub-plane with the maximum number of reference grid points in N sub-planes obtained after N times of iteration, wherein the reference grid points of the reference sub-plane are the effective grid points; wherein the step A: randomly selecting three non-collinear grid points from the grid points to obtain sub-planes determined by the three non-collinear grid points; and B: and calculating the distance between each grid point on the projection plane and the sub-plane, and eliminating abnormal grid points with the distance between the abnormal grid points and the sub-plane being greater than a preset distance value to obtain the number of reference grid points, wherein the reference grid points are grid points with the distance between the abnormal grid points and the sub-plane being not greater than the preset distance value.
The world coordinate system construction module is used for acquiring a first unit normal vector of the reference projection plane in the optical machine lens coordinate system; obtaining a second unit normal vector of the absolute horizontal plane in the optical-mechanical lens coordinate system according to the Y-axis unit vector of the optical-mechanical lens coordinate system and the attitude information of the projector obtained by the inertia measurement unit; a vector product of the first unit normal vector and the second unit normal vector is defined as a unit vector of the X axis of the world coordinate system, the second unit normal vector is defined as a unit vector of the Y axis of the world coordinate system, and the first unit normal vector is defined as a unit vector of the Z axis of the world coordinate system.
The matrix calculation module is used for obtaining a translation matrix between the world coordinate system and the optical machine lens coordinate system according to the coordinate position of the world coordinate system origin in the optical machine lens coordinate system, and obtaining a rotation matrix between the world coordinate system and the optical machine lens coordinate system according to the unit vectors of the X, Y and Z axes of the world coordinate system.
In some embodiments, the target area determining unit obtains, according to the three-dimensional projection model, the reference projection points formed on the reference projection plane by the four corner points of the grating pixel surface; acquires, within the quadrilateral region enclosed by the reference projection points, a maximum inscribed rectangle according to the aspect ratio of the grating pixel surface, wherein the aspect ratio of the maximum inscribed rectangle is consistent with that of the grating pixel surface and the width and height of the maximum inscribed rectangle are parallel to the X axis and the Y axis of the world coordinate system, respectively; and obtains back projection points of the maximum inscribed rectangle on the grating pixel surface according to the conversion relation between the world coordinate system and the optical-mechanical lens coordinate system and the internal parameter matrix of the optical-mechanical lens, the rectangular region enclosed by the back projection points on the grating pixel surface being the target region.
With reference to the above embodiment, the target area determining unit includes a calculation module that acquires the four vertices of the quadrilateral area together with its leftmost and rightmost pixel points, wherein the edge formed by the lower left vertex and the lower right vertex is the bottom edge, the edge formed by the upper left vertex and the upper right vertex is the top edge, and the edge formed by the lower right vertex and the upper right vertex is the right edge; acquires an initial position of a left vertical line according to the leftmost pixel point, the left vertical line being parallel to the Y axis of the world coordinate system; and iteratively executes the following steps a-h on the left vertical line until it reaches the rightmost pixel point. Step a: move the left vertical line one pixel to the right, then execute step b. Step b: for the left vertical line at each pixel position, take the intersection of the left vertical line and the bottom edge as the candidate upper left vertex of the maximum inscribed rectangle, move the candidate upper left vertex upwards pixel by pixel, and execute step c. Step c: for the left vertical line at each pixel position, take the intersection of the left vertical line and the top edge as the candidate lower left vertex of the maximum inscribed rectangle, move the candidate lower left vertex downwards pixel by pixel, and execute step d. Step d: according to the candidate upper left vertex and the candidate lower left vertex, respectively acquire a first straight line and a second straight line parallel to the X axis of the world coordinate system; of the two intersections that the first and second straight lines form with the right edge, determine the candidate lower right vertex and the candidate upper right vertex from the intersection closer to the left; determine a candidate rectangle within the rectangular region enclosed by the four candidate vertices, the candidate rectangle being the largest rectangle consistent with the aspect ratio of the grating pixel surface; calculate the area of the candidate rectangle and execute step e. Step e: when the area of the candidate rectangle is larger than that of the current maximum inscribed rectangle, update the area and the four vertex coordinates of the maximum inscribed rectangle to those of the candidate rectangle, then execute step f; when it is not larger, keep the current area and vertex coordinates of the maximum inscribed rectangle and execute step f. Step f: when the candidate upper left vertex has not reached the pixel position of the intersection of the left vertical line and the top edge, execute step b; when it has, execute step g. Step g: when the candidate lower left vertex has not reached the pixel position of the intersection of the left vertical line and the bottom edge, execute step c; when it has, execute step h. Step h: when the left vertical line has not reached the rightmost pixel point, execute step a; when it has, take the four recorded coordinates corresponding to the maximum inscribed rectangle area as the four vertex coordinates of the maximum inscribed rectangle.
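The pixel-by-pixel sweep in steps a-h searches for the largest axis-aligned rectangle with the raster's aspect ratio inside the projected quadrilateral. The same answer can be approximated with a much simpler (and slower) brute-force scan, sketched below in plain Python under the assumption that the quadrilateral is convex and its vertices are given in counter-clockwise order; the function names and step size are illustrative:

```python
def inside_convex_quad(p, quad):
    """True if p lies in (or on) the quad; vertices in counter-clockwise order."""
    for i in range(4):
        ax, ay = quad[i]
        bx, by = quad[(i + 1) % 4]
        # p must be on the left of (or on) every directed edge a -> b
        if (bx - ax) * (p[1] - ay) - (by - ay) * (p[0] - ax) < 0:
            return False
    return True

def max_inscribed_rect(quad, aspect, step=1.0):
    """Scan candidate corners and widths; keep the largest rectangle of
    width/height ratio `aspect` whose four corners all fall inside."""
    xs = [v[0] for v in quad]; ys = [v[1] for v in quad]
    x0, x1, y0, y1 = min(xs), max(xs), min(ys), max(ys)
    best, best_area = None, 0.0
    x = x0
    while x <= x1:
        y = y0
        while y <= y1:
            w = step
            while x + w <= x1 and y + w / aspect <= y1:
                h = w / aspect
                corners = ((x, y), (x + w, y), (x + w, y + h), (x, y + h))
                if (all(inside_convex_quad(c, quad) for c in corners)
                        and w * h > best_area):
                    best, best_area = (x, y, w, h), w * h
                w += step
            y += step
        x += step
    return best
```

The patent's steps a-h achieve the same goal more cleverly, by tracking edge intersections instead of testing every candidate rectangle against the quadrilateral.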
In some embodiments, the texture mapping unit obtains a homography mapping matrix between the target region and the grating pixel surface according to the correspondence between the four vertices of the target region and the four vertices of the grating pixel surface.
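A homography can be recovered from four vertex correspondences by the standard direct linear transform: with H normalized so that its bottom-right entry is 1, each correspondence contributes two linear equations in the eight remaining entries. A self-contained sketch in plain Python; the Gaussian-elimination solver is an illustrative stand-in for any linear solver:

```python
def solve(A, b):
    """Plain Gaussian elimination with partial pivoting (n x n system)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def homography_from_4pts(src, dst):
    """DLT: for each (x, y) -> (u, v), u = (h0 x + h1 y + h2) / (h6 x + h7 y + 1)
    and similarly for v, giving 8 linear equations in the 8 unknowns."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(A, b)
    return [[h[0], h[1], h[2]], [h[3], h[4], h[5]], [h[6], h[7], 1.0]]

def apply_h(H, p):
    """Map a point through the homography (homogeneous divide)."""
    x, y = p
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

Texture mapping then amounts to sampling the content to be projected through this matrix for each pixel of the target region.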
In some embodiments, the projection point cloud obtaining unit projects a first standard grid image of a grating pixel surface on the projection surface by using the optical machine lens, and captures the projection surface by using the camera to obtain a first projection point cloud image; projecting a second standard grid image of a grating pixel surface on the projection surface by using the optical-mechanical lens, and shooting the projection surface by using the camera to obtain a second projection point cloud image, wherein circular spots in the first standard grid image and the second standard grid image are staggered; respectively identifying circular spot grids in the first projection point cloud image and the second projection point cloud image to obtain the pixel position of each circular spot in the first projection point cloud image and the second projection point cloud image on the grating pixel surface; and obtaining the pixel position of each circular spot of the projection point cloud image by combining the pixel positions of each circular spot in the first projection point cloud image and the second projection point cloud image on the grating pixel surface.
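A small sketch of why the staggered pair doubles the sampling density: the second standard grid image is the first offset by half a dot pitch in both axes, so the two detected dot sets interleave and their union forms the combined projection point cloud. The pitch, grid size, and helper names below are illustrative assumptions, not from the patent:

```python
def staggered_grids(cols, rows, pitch):
    """Dot centres (on the grating pixel surface) for the two standard grid
    images; the second grid is offset by half a pitch in both axes."""
    g1 = [(c * pitch, r * pitch) for r in range(rows) for c in range(cols)]
    g2 = [(c * pitch + pitch / 2, r * pitch + pitch / 2)
          for r in range(rows) for c in range(cols)]
    return g1, g2

def combine_point_clouds(g1, g2):
    """Union of the two dot sets; staggering guarantees no overlap."""
    merged = set(g1) | set(g2)
    assert len(merged) == len(g1) + len(g2), "grids unexpectedly overlap"
    return merged
```

In the real pipeline each dot additionally carries the pixel position detected in the camera image, but the merging logic is the same union keyed by the dot's position on the grating pixel surface.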
For the device embodiments, since they substantially correspond to the method embodiments, relevant details can be found in the corresponding parts of the description of the method embodiments. The device embodiments described above are merely illustrative: the units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the solution without inventive effort.
The invention also provides a projector.
Fig. 14 is a schematic structural diagram of a projector according to an embodiment of the present invention. As shown in Fig. 14, at the hardware level the projector includes a graphics processor and, optionally, an internal bus, a network interface, and a memory. The memory may include a volatile memory, such as a random-access memory (RAM), and may further include a non-volatile memory, such as at least one disk memory. Of course, the projector may also include hardware required by other services, such as an optical-mechanical lens, a camera, and an inertial measurement unit: the optical-mechanical lens projects the standard grid image on the grating pixel surface onto the projection surface, the camera captures the projection surface to obtain a projection point cloud image, and the inertial measurement unit, which includes a gyroscope, measures the attitude information of the projector.
The graphics processor, the network interface, and the memory may be connected to each other via an internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one double-headed arrow is shown in FIG. 14, but that does not indicate only one bus or one type of bus.
The memory is used for storing programs. In particular, a program may comprise program code including computer-executable instructions. The memory may include both volatile memory and non-volatile storage, and provides instructions and data to the graphics processor.
The graphics processor reads the corresponding computer program from the non-volatile memory into the volatile memory and runs it, forming the projector distortion correction device at the logical level. The graphics processor executes the program stored in the memory to implement the projector distortion correction method described above.
The method performed by the projector distortion correction apparatus disclosed in the embodiment of Fig. 14 can be applied to, or implemented by, a graphics processor. The graphics processor may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the projector distortion correction method described above may be completed by integrated logic circuits of hardware in the graphics processor or by instructions in the form of software. The graphics processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and can implement or perform the various methods, steps, and logic blocks disclosed in the embodiments of this specification. A general-purpose processor may be a microprocessor, or any conventional processor. The steps of the method disclosed in connection with the embodiments of this specification may be embodied directly in a hardware decoding processor, or in a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or a register. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the method in combination with its hardware.
The invention also provides a computer readable storage medium.
The computer readable storage medium stores one or more computer programs, the one or more computer programs comprising instructions, which when executed by a graphics processor of a projector, are capable of implementing the projector distortion correction method described above.
For the convenience of clearly describing the technical solutions of the embodiments of the present invention, words such as "first" and "second" are used to distinguish identical or similar items with substantially the same functions and effects. Those skilled in the art will understand that such words do not limit quantity or execution order.
While the foregoing is directed to embodiments of the present invention, other modifications and variations of the present invention may be devised by those skilled in the art in light of the above teachings. It should be understood by those skilled in the art that the foregoing detailed description is for the purpose of better explaining the present invention, and the scope of the present invention should be determined by the scope of the appended claims.

Claims (12)

1. A projector distortion correction method, wherein the projector has an optical machine lens, a camera and an inertial measurement unit, the method comprising:
projecting the standard grid image on the grating pixel surface on a projection surface by using the optical machine lens, and shooting the projection surface by using the camera to obtain a projection point cloud image of grid points on the projection surface;
acquiring three-dimensional coordinates of the grid points on the projection surface based on an optical machine lens coordinate system according to the corresponding relation of the pixel points between the standard grid image and the projection point cloud image and according to a pre-constructed three-dimensional projection model, wherein the three-dimensional projection model is constructed based on the optical machine lens coordinate system and a camera coordinate system of the projector;
acquiring a reference projection plane by using the three-dimensional coordinates of the grid points on the projection plane, constructing a world coordinate system by using the reference projection plane and the attitude information of the projector acquired by the inertial measurement unit, and acquiring a conversion relation between the world coordinate system and the optical machine lens coordinate system;
acquiring a target area of the undistorted projection on the grating pixel surface according to the conversion relation between the world coordinate system and the optical machine lens coordinate system;
and obtaining the mapping relation between the grating pixel surface and the target area, and performing texture mapping on the content to be projected by utilizing the mapping relation to realize distortion-free projection.
2. The method of claim 1, wherein the three-dimensional projection model is constructed by:
establishing a first linear relation according to the optical center of the camera light path and a first correction point of the optical center on the camera sensor pixel surface;
establishing a second linear relation according to the optical center of the optical path of the optical machine lens and a second correction point of the optical center on the grating pixel surface;
establishing a third linear relation between the first correction point and the second correction point according to an external parameter matrix between the camera and the optical-mechanical lens;
and obtaining the three-dimensional coordinates of the grid points on the projection surface based on the optical machine lens coordinate system according to the first linear relation, the second linear relation and the third linear relation.
3. The method of claim 1, wherein the obtaining a reference projection plane using three-dimensional coordinates of grid points on the projection surface comprises:
spatial filtering is carried out on the grid points on the projection surface according to the three-dimensional coordinates of the grid points on the projection surface, invalid grid points in the grid points are filtered, and effective grid points are obtained;
and performing plane fitting on the effective grid points by using a least square method, and determining a plane obtained by fitting as the reference projection plane.
4. The method of claim 3, wherein the performing spatial filtering on the grid points on the projection surface according to the three-dimensional coordinates of the grid points, and filtering invalid grid points to obtain effective grid points comprises:
step A: randomly selecting three non-collinear grid points from the grid points to obtain sub-planes determined by the three non-collinear grid points;
and B: calculating the distance between each grid point on the projection plane and the sub-plane, eliminating abnormal grid points with the distance between the abnormal grid points and the sub-plane being larger than a preset distance value, and obtaining the number of reference grid points, wherein the reference grid points are grid points with the distance between the abnormal grid points and the sub-plane being not larger than the preset distance value;
and repeating the step A and the step B, after N times of iteration, determining a reference sub-plane with the maximum number of reference grid points in the N sub-planes obtained after N times of iteration, wherein the reference grid points of the reference sub-plane are the effective grid points.
5. The method of claim 1, wherein the constructing a world coordinate system by using the reference projection plane and the pose information of the projector obtained by the inertial measurement unit, and obtaining a conversion relationship between the world coordinate system and the opto-mechanical lens coordinate system comprises:
acquiring a first unit normal vector of the reference projection plane in the optical-mechanical lens coordinate system;
acquiring a second unit normal vector of the absolute horizontal plane in the optical-mechanical lens coordinate system according to the Y-axis unit vector of the optical-mechanical lens coordinate system and the attitude information;
a vector product of the first unit normal vector and the second unit normal vector is used as a unit vector of an X axis of the world coordinate system, the second unit normal vector is used as a unit vector of a Y axis of the world coordinate system, and the first unit normal vector is used as a unit vector of a Z axis of the world coordinate system;
and obtaining a translation matrix between the world coordinate system and the optical machine lens coordinate system according to the coordinate position of the origin of the world coordinate system in the optical machine lens coordinate system, and obtaining a rotation matrix between the world coordinate system and the optical machine lens coordinate system according to the unit vectors of the X, Y, and Z axes of the world coordinate system.
6. The method of claim 1, wherein the obtaining a target region of the distortion-free projection on the grating pixel plane according to a transformation relationship between the world coordinate system and the opto-mechanical lens coordinate system comprises:
obtaining reference projection points formed by four corner points on the grating pixel surface on the reference projection plane according to the three-dimensional projection model;
acquiring a maximum inscribed rectangle of the quadrilateral region according to the aspect ratio of the grating pixel surface in the quadrilateral region surrounded by the reference projection points, wherein the aspect ratio of the maximum inscribed rectangle is consistent with the aspect ratio of the grating pixel surface, and the width and the height of the maximum inscribed rectangle are respectively parallel to the X axis and the Y axis of the world coordinate system;
and obtaining back projection points of the maximum inscribed rectangle on the grating pixel surface according to the conversion relation between the world coordinate system and the optical-mechanical lens coordinate system and the internal parameter matrix of the optical-mechanical lens, wherein a rectangular region surrounded by the back projection points on the grating pixel surface is the target region.
7. The method of claim 6, wherein the obtaining a maximum inscribed rectangle of the quadrilateral area within the quadrilateral area enclosed by the reference projection points according to the aspect ratio of the grating pixel plane comprises:
acquiring four vertexes of the quadrilateral area and a leftmost pixel point and a rightmost pixel point of the quadrilateral area, wherein an edge formed by a lower left vertex and a lower right vertex is a bottom edge, an edge formed by an upper left vertex and an upper right vertex is a top edge, and an edge formed by a lower right vertex and an upper right vertex is a right edge;
acquiring an initial position of a left vertical line according to the leftmost pixel point, wherein the left vertical line is parallel to the Y axis of the world coordinate system, and iteratively executing the following steps a-h on the left vertical line until the left vertical line reaches the rightmost pixel point:
step a, moving the left vertical line pixel by pixel to the right, and then executing step b;
b, forming a candidate upper left vertex of the maximum inscribed rectangle by using the intersection point of the left vertical line and the bottom line for the left vertical line of each pixel position, moving the candidate upper left vertex pixel by pixel upwards, and executing the step c;
step c, forming a candidate lower left vertex of the maximum inscribed rectangle by using the intersection point of the left vertical line and the top edge for the left vertical line of each pixel position, moving the candidate lower left vertex downwards pixel by pixel, and executing step d;
d, respectively acquiring a first straight line and a second straight line which are parallel to the X axis of the world coordinate system according to the candidate upper left vertex and the candidate lower left vertex, determining a candidate lower right vertex and a candidate upper right vertex according to an intersection point which is close to the left side and is formed by two intersection points of the first straight line, the second straight line and the right side, determining a candidate rectangle in a rectangular region surrounded by the four candidate vertices, calculating the area of the candidate rectangle, and executing the step e, wherein the candidate rectangle is the largest rectangle which is consistent with the aspect ratio of the grating pixel surface;
step e, when the area of the candidate rectangle is larger than the area of the maximum inscribed rectangle, updating the area of the maximum inscribed rectangle to the area of the candidate rectangle, updating the coordinates of the four vertexes of the maximum inscribed rectangle to the coordinates of the four vertexes of the candidate rectangle, and executing step f; when the area of the candidate rectangle is not larger than the area of the maximum inscribed rectangle, keeping the coordinates of the area of the maximum inscribed rectangle and the four vertexes of the maximum inscribed rectangle, and executing the step f;
step f, when the candidate top left vertex does not reach the intersection point pixel position of the left vertical line and the top edge, executing step b; executing step g when the candidate top left vertex reaches the intersection pixel position of the left vertical line and the top edge;
step g, when the candidate lower left vertex does not reach the intersection pixel position of the left vertical line and the bottom line, executing step c; executing step h when the candidate lower left vertex reaches the intersection pixel position of the left vertical line and the bottom line;
step h, when the left vertical line does not reach the rightmost pixel point, executing the step a; and when the left vertical line reaches the rightmost pixel point, obtaining four positioned coordinates corresponding to the area of the maximum inscribed rectangle as four vertex coordinates of the maximum inscribed rectangle.
8. The method of claim 1, wherein the obtaining a mapping relationship of the grating pixel plane to the target area comprises:
and acquiring a homography mapping matrix between the target area and the grating pixel surface according to the corresponding relation between the four vertexes of the target area and the four vertexes of the grating pixel surface.
9. The method of claim 1, wherein the projecting the standard grid image of the grating pixel surface on the projection surface by the optical-mechanical lens and capturing the projection surface by the camera to obtain the projection point cloud image of the grid point on the projection surface comprises:
projecting a first standard grid image of a grating pixel surface on the projection surface by using the optical machine lens, and shooting the projection surface by using the camera to obtain a first projection point cloud image;
projecting a second standard grid image of a grating pixel surface on the projection surface by using the optical-mechanical lens, and shooting the projection surface by using the camera to obtain a second projection point cloud image, wherein circular spots in the first standard grid image and the second standard grid image are staggered;
respectively identifying circular spot grids in the first projection point cloud image and the second projection point cloud image to obtain the pixel position of each circular spot in the first projection point cloud image and the second projection point cloud image on the grating pixel surface;
and acquiring a projection point cloud image by overlapping the first projection point cloud image and the second projection point cloud image, wherein the pixel position of each circular spot in the first projection point cloud image and the second projection point cloud image on the grating pixel surface is the pixel position of the corresponding circular spot on the projection point cloud image.
10. A projector distortion correction apparatus, the projector having an opto-mechanical lens, a camera and an inertial measurement unit, the apparatus comprising:
the projection point cloud obtaining unit is used for projecting the standard grid image on the grating pixel surface on a projection surface by using the optical machine lens and shooting the projection surface by using the camera to obtain a projection point cloud image of grid points on the projection surface;
a projection point cloud coordinate calculation unit, which obtains three-dimensional coordinates of the grid points on the projection surface based on an optical machine lens coordinate system according to the pixel point corresponding relation between the standard grid image and the projection point cloud image and according to a pre-constructed three-dimensional projection model, wherein the three-dimensional projection model is constructed based on the optical machine lens coordinate system and a camera coordinate system of the projector;
the conversion relation calculation unit is used for acquiring a reference projection plane by utilizing the three-dimensional coordinates of the grid points on the projection surface, constructing a world coordinate system by utilizing the reference projection plane and the attitude information of the projector acquired by the inertia measurement unit, and acquiring the conversion relation between the world coordinate system and the optical machine lens coordinate system;
the target area determining unit is used for acquiring a target area of the undistorted projection on the grating pixel surface according to the conversion relation between the world coordinate system and the optical machine lens coordinate system;
and the texture mapping unit is used for obtaining the mapping relation between the grating pixel surface and the target area, and performing texture mapping on the content to be projected by utilizing the mapping relation to realize distortion-free projection.
11. A projector, comprising:
the optical machine lens is used for projecting the standard grid image on the grating pixel surface on a projection surface;
the camera is used for shooting the projection surface to obtain a projection point cloud image and sending the projection point cloud image to the graphics processor;
the inertial measurement unit is used for measuring the attitude information of the projector and sending the attitude information to the graphics processor;
a memory storing computer-executable instructions;
a graphics processor that, when executing the computer-executable instructions, performs the method of any of claims 1-9.
12. A computer readable storage medium, wherein the computer readable storage medium has stored thereon one or more computer programs which, when executed, implement the method of any one of claims 1-9.
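The three linear relations in claim 2 amount to intersecting the camera's sight line with the corresponding opto-mechanical projection ray; because the two rays rarely meet exactly in practice, a common choice is the least-squares intersection, i.e. the midpoint of their common perpendicular. A plain-Python sketch of that triangulation step (an illustrative formulation, not the patent's exact equations):

```python
def triangulate(o1, d1, o2, d2):
    """Midpoint of the common perpendicular between ray o1 + s*d1 (camera
    sight line) and ray o2 + t*d2 (opto-mechanical projection ray): the
    least-squares intersection of the two lines. Fails for parallel rays."""
    def dot(u, v): return sum(a * b for a, b in zip(u, v))
    def sub(u, v): return [a - b for a, b in zip(u, v)]
    def add(u, v): return [a + b for a, b in zip(u, v)]
    def scale(u, k): return [a * k for a in u]
    w = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    den = a * c - b * b              # zero only for parallel rays
    s = (b * e - c * d) / den
    t = (a * e - b * d) / den
    p1 = add(o1, scale(d1, s))       # closest point on the camera ray
    p2 = add(o2, scale(d2, t))       # closest point on the projection ray
    return scale(add(p1, p2), 0.5)
```

Applied to every grid point, this yields the three-dimensional coordinates on the projection surface in the opto-mechanical lens coordinate system, which the subsequent plane fitting consumes.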
CN201910266198.0A 2019-04-03 2019-04-03 Projector distortion correction method and device and projector Active CN110336987B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910266198.0A CN110336987B (en) 2019-04-03 2019-04-03 Projector distortion correction method and device and projector


Publications (2)

Publication Number Publication Date
CN110336987A CN110336987A (en) 2019-10-15
CN110336987B true CN110336987B (en) 2021-10-08

Family

ID=68139264



Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1845002A (en) * 2005-04-06 2006-10-11 Seiko Epson Corporation Distortion correction for projector
CN103517016A (en) * 2012-06-22 2014-01-15 Seiko Epson Corporation Projector, image display system, and method of controlling projector
CN107454373A (en) * 2016-05-31 2017-12-08 Industrial Technology Research Institute Projection system and non-planar automatic correction method and automatic correction processing device thereof
CN108377371A (en) * 2018-02-09 2018-08-07 Shenzhen Huole Technology Development Co., Ltd. Method and device for projection image correction

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101820905B1 (en) * 2016-12-16 2018-01-22 CJ CGV Co., Ltd. Automatic correction method for a projection area based on an image captured by a photographing apparatus, and system therefor

Also Published As

Publication number Publication date
CN110336987A (en) 2019-10-15

Similar Documents

Publication Publication Date Title
CN110336987B (en) Projector distortion correction method and device and projector
CN110111262B (en) Projector projection distortion correction method and device and projector
CN110191326B (en) Projection system resolution expansion method and device and projection system
JP7291244B2 (en) Projector Keystone Correction Method, Apparatus, System and Readable Storage Medium
CN108257183B (en) Camera lens optical axis calibration method and device
JP6515985B2 (en) Three-dimensional image combining method and three-dimensional image combining apparatus
TW201915944A (en) Image processing method, apparatus, and storage medium
KR101319777B1 (en) Panoramic projection device and method implemented by said device
JPWO2018235163A1 (en) Calibration apparatus, calibration chart, chart pattern generation apparatus, and calibration method
CN110458932B (en) Image processing method, device, system, storage medium and image scanning apparatus
CN112734860B (en) Arc-screen prior information-based pixel-by-pixel mapping projection geometric correction method
JP2012088114A (en) Optical information processing device, optical information processing method, optical information processing system and optical information processing program
CN110648274B (en) Method and device for generating fisheye image
JP6683307B2 (en) Optimal spherical image acquisition method using multiple cameras
JP6674643B2 (en) Image processing apparatus and image processing method
US20180213215A1 (en) Method and device for displaying a three-dimensional scene on display surface having an arbitrary non-planar shape
WO2022126430A1 (en) Auxiliary focusing method, apparatus and system
CN116524022B (en) Offset data calculation method, image fusion device and electronic equipment
JP6674644B2 (en) Image processing apparatus and image processing method
JP2016114445A (en) Three-dimensional position calculation device, program for the same, and cg composition apparatus
CN116743973A (en) Automatic correction method for imperceptible projection image
JP6684454B2 (en) Image processing apparatus and image processing method
JP2006338167A (en) Image data creation method
CN114401388A (en) Projection method, projection device, storage medium and projection equipment
KR20150058660A (en) Image processing device, method thereof, and system including the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220726

Address after: Room 1903, 19/F, Building D, Zhizhen Building, No. 7 Zhichun Road, Haidian District, Beijing 100088

Patentee after: Bird Innovation (Beijing) Technology Co., Ltd.

Address before: Room 1801, Block D, Zhizhen Building, No. 7 Zhichun Road, Haidian District, Beijing 100191

Patentee before: Beijing Xiaoniao Tingting Technology Co., Ltd.
