CN108429908B - Camera module testing method, device, equipment and medium - Google Patents


Info

Publication number: CN108429908B
Authority: CN (China)
Prior art keywords: image, point, imaging, target, determining
Legal status: Active (the status is an assumption, not a legal conclusion; Google has not performed a legal analysis)
Application number: CN201810623938.7A
Other languages: Chinese (zh)
Other versions: CN108429908A (en)
Inventors: 王晶, 钟祥赐, 金元斌
Current and Original Assignee: Kunshan Q Technology Co Ltd (the listed assignees may be inaccurate)
Application filed by Kunshan Q Technology Co Ltd
Priority: CN201810623938.7A
Publication of CN108429908A, followed by grant and publication of CN108429908B


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/002Diagnosis, testing or measuring for television systems or their details for television cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a method, an apparatus, a device, and a medium for testing a camera module. The method comprises: capturing a first image of a target image with the camera module, where the plane of the target image is parallel to the sensor surface of the camera module and the first image contains a first imaging point of a target point on the target image; rotating the camera module by a preset angle; capturing a second image of the target image with the rotated camera module, where the second image contains a second imaging point of the same target point; and determining the angle between the optical axis of the camera module's lens and the rotation axis from the position of the first imaging point on the first image and the position of the second imaging point on the second image. The method, apparatus, device, and medium address the prior-art problem that manual alignment cannot accurately judge whether the lens optical axis of a camera module is perpendicular to the sensor surface, which results in low alignment accuracy, and achieve the technical effect of improving alignment accuracy.

Description

Camera module testing method, device, equipment and medium
Technical Field
The present invention relates to the field of camera technologies, and in particular, to a method, an apparatus, a device, and a medium for testing a camera module.
Background
The perpendicularity between the optical axis of the lens in a camera module and the surface of its sensor is an important factor in evaluating module quality. This is especially true for dual-camera or multi-camera modules, where the lens optical axes of the individual cameras must be strictly parallel. Parallelism of the optical axes is usually achieved by ensuring that the optical axis of each single camera is perpendicular to its sensor surface: the closer each camera comes to exact perpendicularity, the better the optical-axis parallelism of the whole module.
At present, perpendicularity between the lens optical axis of a single camera and the sensor surface is pursued mainly by manually aligning the lens during mounting. Manual alignment, however, carries large errors: it is difficult to judge accurately whether the lens optical axis is perpendicular to the sensor surface, so alignment accuracy is low.
In short, the prior art suffers from low alignment accuracy because manual alignment cannot accurately determine whether the lens optical axis of the camera module is perpendicular to the sensor surface.
Disclosure of Invention
In view of the above problems, the present invention provides a method, an apparatus, a device, and a medium for testing a camera module that overcome the above problems or at least partially solve them.
In a first aspect, a method for testing a camera module is provided, which includes:
shooting and acquiring a first image of a target image through the camera module, wherein the plane of the target image is parallel to the surface of a sensor of the camera module, the first image comprises a first imaging point, and the first imaging point is an image of a target point on the target image on the first image;
rotating the camera module by a preset angle by taking a normal line of the surface of the sensor passing through a central point as a rotating central shaft;
capturing a second image of the target image through the rotated camera module, wherein the second image includes a second imaging point, and the second imaging point is the image of the target point on the second image;
and determining an included angle between the optical axis of the lens of the camera module and the rotation central axis according to the position of the first imaging point on the first image and the position of the second imaging point on the second image.
Optionally, the preset angle is 180 degrees.
Optionally, determining an included angle between an optical axis of a lens of the camera module and the rotation central axis according to the position of the first imaging point on the first image and the position of the second imaging point on the second image, includes: determining the position of a virtual point according to the position of the first imaging point on the first image and the position of the second imaging point on the second image; wherein the virtual point is an imaging point of the target point on the first image or the second image, assuming that an included angle between the optical axis and the rotation central axis is 0; and determining an included angle between the optical axis and the rotating central axis according to the position of the virtual point.
Optionally, when the preset angle is 180 degrees, determining the position of the virtual point according to the position of the first imaging point on the first image and the position of the second imaging point on the second image, including: determining the position of a second mirror image point on the first image according to the position of the second imaging point on the second image, wherein the position coordinate of the second imaging point on the second image is the same as the position coordinate of the second mirror image point on the first image; determining a second symmetrical point on the first image according to the second mirror image point; the second symmetrical point and the second mirror image point are symmetrical based on the center of the first image; taking the middle point of the second symmetrical point and the first imaging point as the virtual point; or determining the position of a first mirror image point on the second image according to the position of the first imaging point on the first image, wherein the position coordinate of the first imaging point on the first image is the same as the position coordinate of the first mirror image point on the second image; determining a first symmetrical point on the second image according to the first mirror image point; the first symmetrical point and the first mirror image point are symmetrical based on the center of the second image; and taking the middle point of the first symmetrical point and the second imaging point as the virtual point.
Optionally, when the preset angle is 180 degrees, determining the position of the virtual point according to the position of the first imaging point on the first image and the position of the second imaging point on the second image, including: determining the position of a second mirror image point on the first image according to the position of the second imaging point on the second image, wherein the position coordinate of the second imaging point on the second image is the same as the position coordinate of the second mirror image point on the first image; adding a second vector to a first vector to obtain a third vector, wherein the starting point of the first vector is the midpoint of the first image, the end point of the first vector is the first imaging point, the starting point of the second vector is the second mirror image point, the end point of the second vector is the midpoint of the first image, and the starting point of the third vector is the midpoint of the first image; taking a midpoint between a midpoint of the first image and an end point of the third vector as the virtual point; or determining the position of a first mirror image point on the second image according to the position of the first imaging point on the first image, wherein the position coordinate of the first imaging point on the first image is the same as the position coordinate of the first mirror image point on the second image; adding a fifth vector to a fourth vector to obtain a sixth vector, wherein a starting point of the fourth vector is a midpoint of the second image, an end point of the fourth vector is the second imaging point, a starting point of the fifth vector is the first mirror image point, an end point of the fifth vector is a midpoint of the second image, and a starting point of the sixth vector is a midpoint of the second image; and taking the midpoint between the midpoint of the second image and the end point of the sixth vector as the virtual point.
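The 180-degree midpoint construction above can be sketched in a few lines. This is an illustrative sketch, not code from the patent; the function name and the coordinates in the example are assumed, and all points are (x, y) pixel positions on equally sized images:

```python
# Sketch of the 180-degree virtual-point construction on the first image.
def virtual_point(p1, p2, center):
    """p1: first imaging point (on the first image),
    p2: second imaging point (on the second image),
    center: center point of the equally sized images."""
    m2 = p2  # second mirror point: same coordinates, but read on the first image
    # second symmetric point: centrally symmetric to m2 about the image center
    s2 = (2 * center[0] - m2[0], 2 * center[1] - m2[1])
    # the virtual point is the midpoint of s2 and the first imaging point
    return ((s2[0] + p1[0]) / 2, (s2[1] + p1[1]) / 2)

# Example with assumed coordinates: image center (100, 100), P1 at (120, 110),
# P2 at (100, 100).
v1 = virtual_point((120, 110), (100, 100), (100, 100))
```

Note that when the optical axis coincides with the rotation axis, P2 lands exactly on the 180-degree image of P1, and the virtual point coincides with P1.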
Optionally, when the virtual point is a first virtual point of the target point imaged on the first image, determining an included angle between the optical axis and the rotation central axis according to the position of the virtual point includes: determining an included angle between the optical axis and the rotation central axis according to the position of the first virtual point on the first image and the position of the first imaging point; when the virtual point is a second virtual point imaged by the target point on the second image, determining an included angle between the optical axis and the rotation central axis according to the position of the virtual point, including: and determining an included angle between the optical axis and the rotating central axis according to the position of the second virtual point on the second image and the position of the second imaging point.
Optionally, determining an included angle between the optical axis and the rotation central axis according to the position of the first virtual point on the first image and the position of the first imaging point, including: determining the number of pixels spaced between the first virtual point and the first imaging point according to the position of the first virtual point and the position of the first imaging point on the first image; determining an included angle between the optical axis and the central axis according to the number of the pixels, the pixel size corresponding to the sensor and the effective focal length of the lens; according to the position of the second virtual point on the second image and the position of the second imaging point, determining an included angle between the optical axis and the rotation central axis, including: determining the number of pixels spaced between the second virtual point and the second imaging point according to the position of the second virtual point and the position of the second imaging point on the second image; and determining an included angle between the optical axis and the central axis according to the number of the pixels, the pixel size corresponding to the sensor and the effective focal length of the lens.
Optionally, when the number of pixels includes a number Δx of pixels spaced between the first virtual point and the first imaging point in the X-axis direction and a number Δy of pixels spaced between them in the Y-axis direction, determining the included angle between the optical axis and the central axis according to the number of pixels, the pixel size of the sensor, and the effective focal length of the lens includes: determining the X-axis component θx of the angle between the optical axis and the central axis, projected onto the X axis when the first image is captured, according to the formula θx = arctan(Δx · p / EFL); and determining the Y-axis component θy, projected onto the Y axis when the first image is captured, according to the formula θy = arctan(Δy · p / EFL), where p is the pixel size of the sensor and EFL is the effective focal length of the lens. When the number of pixels includes a number Δx' of pixels spaced between the second virtual point and the second imaging point in the X-axis direction and a number Δy' of pixels spaced between them in the Y-axis direction, the corresponding angle components when the second image is captured are determined by the formulas θx' = arctan(Δx' · p / EFL) and θy' = arctan(Δy' · p / EFL), where p is the pixel size of the sensor and EFL is the effective focal length of the lens.
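The arctangent formulas translate directly into code. In this sketch the parameter values (a 10-pixel offset, 1.0 µm pixel size, 4.0 mm effective focal length) are assumed for illustration and are not taken from the patent:

```python
import math

def tilt_angles(dx_pixels, dy_pixels, pixel_size_mm, efl_mm):
    """X- and Y-axis components (in degrees) of the angle between the lens
    optical axis and the rotation axis, per theta = arctan(delta * p / EFL)."""
    theta_x = math.degrees(math.atan(dx_pixels * pixel_size_mm / efl_mm))
    theta_y = math.degrees(math.atan(dy_pixels * pixel_size_mm / efl_mm))
    return theta_x, theta_y

# Assumed values: 10-pixel offset in X, none in Y, 1.0 um pixels, EFL 4 mm.
tx, ty = tilt_angles(10, 0, 0.001, 4.0)  # tx is roughly 0.14 degrees
```

Because the offsets involved are small, the arctangent is nearly linear here, so each extra pixel of offset adds an almost constant increment of tilt angle.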
In a second aspect, a testing apparatus for a camera module is provided, which includes:
the first capturing module is configured to capture a first image of a target image through the camera module, wherein the plane of the target image is parallel to the sensor surface of the camera module, the first image includes a first imaging point, and the first imaging point is the image of a target point on the target image;
the rotating module is used for rotating the camera module by a preset angle by taking a normal line of the surface of the sensor passing through a central point as a rotating central shaft;
the second capturing module is configured to capture a second image of the target image through the rotated camera module, wherein the second image includes a second imaging point, and the second imaging point is the image of the target point on the second image;
and the determining module is used for determining an included angle between the optical axis of the lens of the camera module and the rotating central axis according to the position of the first imaging point on the first image and the position of the second imaging point on the second image.
In a third aspect, an electronic device is provided, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing any of the aspects of the first aspect when executing the program.
In a fourth aspect, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements any of the aspects of the first aspect.
The technical scheme provided by the embodiment of the invention at least has the following technical effects or advantages:
according to the test method, the test device, the test equipment and the test medium of the camera module, the target image is set to be parallel to the sensor surface of the camera module, the first image of the target image is shot and obtained, the camera module is rotated by a preset angle by taking the central axis of the sensor surface as an axis, the second image of the target image is shot and obtained, the two images both comprise imaging points of the target point on the target image, and the included angle between the lens optical axis of the camera module and the central axis of the sensor surface is accurately determined according to the positions of the imaging points of the target point on the two images before and after rotation, so that the relative position of the lens and the sensor surface can be re-adjusted in a targeted manner according to the determined included angle, the alignment precision is low, and the assembly qualification rate of the camera module is improved.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a flowchart of a method for testing a camera module according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a method for testing a camera module according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating a first image according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating a second image according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of an apparatus according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an electronic device according to an embodiment of the invention;
FIG. 7 is a schematic structural diagram of a storage medium according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of an included angle and an offset direction between an optical axis of a lens and a rotation center axis in a three-dimensional space according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Example one
Referring to fig. 1, fig. 1 is a flowchart of a method for testing a camera module according to an embodiment of the present invention, including:
step S101, shooting and acquiring a first image of a target image through a camera module, wherein the plane of the target image is parallel to the surface of a sensor of the camera module, the first image comprises a first imaging point, and the first imaging point is an image of a target point on the target image on the first image;
step S102, rotating the camera module by a preset angle by taking a normal line of the surface of the sensor passing through a central point as a rotating central axis;
step S103, capturing a second image of the target image through the rotated camera module, wherein the second image includes a second imaging point, and the second imaging point is the image of the target point on the second image;
and step S104, determining an included angle between the optical axis of the lens of the camera module and the rotation central axis according to the position of the first imaging point on the first image and the position of the second imaging point on the second image.
It should be noted that the method provided in this embodiment may be implemented by a preset testing device. As shown in fig. 2, the plane of the preset target image 201 is parallel to the sensor surface 202 of the camera module, the lens 203 of the camera module is located between the target image 201 and the sensor surface 202, and the target point is marked on the target image 201. In a specific implementation, the camera module can be mounted on a rotatable platform, with the base of the camera module set parallel to the target image 201 (to ensure that the target image 201 is parallel to the sensor surface 202) and the normal of the sensor surface 202 through its center point set to coincide with the rotation axis of the platform. Alternatively, the camera module may be mounted on a rotating shaft such that the normal of the sensor surface through the center point coincides with the shaft; this is not limited here.
Before describing the specific implementation steps, the principle of determining the included angle between the optical axis of the lens and the central axis of the sensor surface in the embodiment is described:
Given that the sensor surface is parallel to the target image: if the lens optical axis does not coincide with the rotation axis, there is an offset between the point where the optical axis intersects the target image and the point where the rotation axis intersects it. From this offset and the distance between the target image and the sensor, the angle between the optical axis and the rotation axis can be determined by a trigonometric relation.
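As a numeric illustration of this trigonometric relation (all values below are assumed, not from the patent):

```python
import math

# If the lens optical axis intersects the target image at a point offset by
# d from where the rotation axis intersects it, and L is the distance from
# the lens to the target image, the tilt angle follows from the tangent.
d = 0.5    # offset measured on the target image, in mm (assumed)
L = 200.0  # lens-to-target distance, in mm (assumed)
tilt = math.degrees(math.atan(d / L))  # small angle, on the order of 0.1 degree
```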
The following describes in detail specific implementation steps of the method for testing a camera module according to this embodiment with reference to fig. 1 and fig. 2:
firstly, step S101 is executed, a first image of a target image is captured and acquired by the camera module, a plane where the target image is located is parallel to a sensor surface of the camera module, the first image includes a first imaging point, and the first imaging point is an image of a target point on the target image on the first image.
In the embodiment of the present application, considering that an included angle between an optical axis of the lens and a central axis of the sensor surface is often relatively small, in order to improve accuracy of the determined included angle, an area of the marked target point needs to be as small as possible, so that a size of a first imaging point of the target point imaged on the first image is close to one pixel point.
Then, step S102 is executed to rotate the camera module by a preset angle with the normal of the sensor surface passing through the center point as a rotation center axis.
In the specific implementation process, an accurate rotating device needs to be arranged to ensure that the camera module can rotate by taking the normal line passing through the center of the surface of the sensor as a rotating central shaft.
For a single camera module, given the machining accuracy of the module, the central axis of the module base can be taken as the central axis of the sensor surface. Further, given the variety of base designs, the position on the base corresponding to the center of the sensor surface can be measured in advance when the module base is molded, and a corresponding alignment mark can be placed at that position on the base; in subsequent testing of the camera module, the alignment mark is used as the center of the sensor surface when aligning the rotation axis.
For the multi-camera module, the corresponding position of the center of the sensor surface of each module on the base can be measured in advance, then corresponding alignment marks are correspondingly arranged on the base of each module, and when the camera module is subsequently tested, each module respectively takes the alignment marks on the base as the center of the sensor surface to align the rotating central shaft.
In the embodiment of the present application, the preset angle may be any preset angle such as 30 degrees, 45 degrees, 90 degrees, 180 degrees, 270 degrees, 310 degrees, and the like, and is not limited herein.
Further, when the preset angle is 180 degrees, the component of the target point's position change caused purely by the rotation can easily be obtained from the principle of central symmetry. In the subsequent calculation, the component caused purely by the optical-axis deviation can then be isolated, which makes the angle determination more convenient and greatly reduces the amount of calculation. The preset angle is therefore preferably 180 degrees.
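The central-symmetry property used here is simply that a pure 180-degree rotation about the image center maps every image point to its centrally symmetric counterpart. A minimal sketch, with assumed coordinates:

```python
# A pure 180-degree rotation about the image center (cx, cy) maps a point
# (x, y) to (2*cx - x, 2*cy - y), its centrally symmetric point.
def rotate_180(point, center):
    return (2 * center[0] - point[0], 2 * center[1] - point[1])

# With an assumed image center of (100, 100):
moved = rotate_180((130, 90), (100, 100))
```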
Next, step S103 is executed to capture a second image of the target image by the rotated capture module, where the second image includes a second imaging point of the target point.
In this embodiment, the first imaging point is an image of a target point captured before the rotation of the shooting module, and the second imaging point is an image of a target point captured after the rotation of the shooting module.
For example:
assuming that a target point a is pre-marked on the target image, the camera module obtains the first image shown in fig. 3 through step S101, where a first imaging point P1 on the first image is an imaging point of the target point a on the first image; then, the camera module is rotated according to step S102, and then a second image shown in fig. 4 is captured according to step S103, where a second imaging point P2 on the second image is an imaging point of the target point a on the second image.
And then, executing step S104, and determining an included angle between the optical axis of the lens of the camera module and the central axis according to the position of the first imaging point on the first image and the position of the second imaging point on the second image.
Specifically, either the first image or the second image may be used as the reference image, with the imaging point of the target point on the other image mapped onto it, to calculate the angle between the optical axis and the rotation axis. The two cases are analogous and are described below, starting with the first image as the reference image:
first, the first image is used as a reference image.
Determine the position of a first virtual point on the first image from the position of the first imaging point on the first image and the position of the second imaging point on the second image, where the first virtual point is the point at which the target point would be imaged on the first image if the angle between the optical axis and the rotation axis were 0; then determine the angle between the optical axis and the rotation axis from the positions of the first virtual point and the first imaging point on the first image.
The following describes the determination method of the first virtual point and the specific calculation method of the included angle between the optical axis and the rotation central axis.
In the specific implementation process, if the preset angles of the rotation of the camera module are different, the method for determining the virtual point is also different, and here, the method for determining the virtual point is described in detail by taking the preset angle as 180 degrees as an example:
The first virtual point is determined on the following principle. When the optical axis is offset from the rotation axis, the lens optical axis before the 180-degree rotation and the optical axis after it are symmetric about the rotation axis. Considering only the lens position in three-dimensional space, call the position before rotation the first position and the position after rotation the second position. If the lens could be moved directly from the first position to the second position while the sensor stays fixed, there would be no position-change component due to the 180-degree rotation: the two imaging points of the target point on the sensor before and after the move would be symmetric about the first virtual point, and the position difference between them would be exactly the component caused by the optical-axis deviation. The component caused by the rotation itself, as noted above, can be obtained from the principle of central symmetry for a 180-degree rotation. To combine the two components, let the mirror point of the second imaging point on the first image be the second mirror point, i.e., the point whose relative position on the first image is the same as the relative position of the second imaging point on the second image, and let the second symmetric point be the point on the first image centrally symmetric to the second mirror point. The first imaging point and the second symmetric point are then symmetric about the first virtual point. Therefore, once the position of the second symmetric point is obtained, the midpoint of the line connecting the second symmetric point and the first imaging point is the first virtual point.
The following describes the determination method of the first virtual point in detail with reference to fig. 3 and 4:
The position of the second symmetric point S2 on the first image is determined first. Specifically, the position of the second mirror point M2 on the first image is determined from the position of the second imaging point P2 on the second image, where the position coordinates of P2 on the second image are the same as those of M2 on the first image. Then, the second symmetric point S2 is determined on the first image from the second mirror point M2; M2 and S2 are symmetric about the center of the first image.
Then, as explained in the principle above, since the second symmetric point S2 and the first imaging point P1 are symmetric about the first virtual point V1, the midpoint of the segment connecting S2 and P1 is taken as the first virtual point V1.
Those skilled in the art will understand that, in an embodiment, the order of these steps may be exchanged: the central symmetric point of the second imaging point P2 on the second image may be determined first, and the mirror point of that central symmetric point on the first image is then the second symmetric point S2.
Of course, in a specific implementation, the first virtual point may also be determined in other ways. For example, referring to fig. 3 and 4, the position of the second mirror point M2 on the first image is determined according to the position of the second imaging point P2 on the second image; then a first vector O1P1 is constructed with the center point O1 of the first image as its starting point and the first imaging point P1 as its end point, and a second vector M2O1 is constructed with the second mirror point M2 as its starting point and the center point O1 as its end point; the second vector M2O1 is added to the first vector O1P1 to obtain a third vector O1O2', and the midpoint between the end point O2' of the third vector and the center point O1 of the first image is taken as the first virtual point V1. The manner of determination is not limited to the examples listed here.
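As an illustrative aid only (not part of the patent's formal definition), the two equivalent constructions of the first virtual point V1 described above can be sketched in Python. The coordinate values, image center, and function names are hypothetical:

```python
# Sketch of the two equivalent ways to locate the first virtual point V1
# on the first image, with pixel coordinates as (x, y) tuples.

def mirror_point(pt):
    # The mirror point has the same pixel coordinates on the other image,
    # so numerically this mapping is the identity.
    return pt

def central_symmetric(pt, center):
    # Point symmetric to `pt` about the image center.
    return (2 * center[0] - pt[0], 2 * center[1] - pt[1])

def midpoint(a, b):
    return ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)

def first_virtual_point(p1, p2, center):
    """Midpoint method: map P2 onto the first image as mirror point M2,
    take its central symmetric point S2, then the midpoint of S2 and P1."""
    m2 = mirror_point(p2)
    s2 = central_symmetric(m2, center)
    return midpoint(s2, p1)

def first_virtual_point_vectors(p1, p2, center):
    """Vector method: O1P1 + M2O1 gives the third vector O1O2';
    V1 is the midpoint of O1 and the end point O2'."""
    m2 = mirror_point(p2)
    o1p1 = (p1[0] - center[0], p1[1] - center[1])
    m2o1 = (center[0] - m2[0], center[1] - m2[1])
    o2_end = (center[0] + o1p1[0] + m2o1[0],
              center[1] + o1p1[1] + m2o1[1])
    return midpoint(center, o2_end)
```

Expanding both constructions algebraically gives V1 = (2·O1 + P1 − M2) / 2 in each case, which is why the midpoint method and the vector method always agree.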
The calculation principle for the included angle between the optical axis and the rotation center axis is as follows: according to the position of the first virtual point and the position of the first imaging point, the deviation between the lens optical axis and the rotation center axis on a given plane (such as the plane of the sensor) can be determined, and according to this deviation and the distance between the given plane and the optical center of the lens, the included angle between the optical axis and the rotation center axis can be determined through the tangent relation.
Specifically, assuming that the distance from the given plane to the optical center of the lens equals the distance from the imaging plane to the optical center, the distance between the intersection point of the lens optical axis with the given plane and the intersection point of the rotation center axis with the given plane is equal to the distance between the first virtual point and the first imaging point.
In the embodiment of the application, if the included angle is calculated by taking the sensor surface as a given plane, the deviation value of the optical axis of the lens and the rotation central axis on the sensor surface can be accurately determined according to the pixel size corresponding to the sensor, so that the calculation accuracy is improved. The method of calculating the angle with the sensor surface as a given plane is described below in connection with fig. 2:
First, the number ΔX of pixels spaced between the first virtual point V1 and the first imaging point P1 in the X-axis direction and the number ΔY of pixels spaced between them in the Y-axis direction are determined, where the coordinates of the first virtual point V1 are (x, y), the coordinates of the first imaging point P1 are (x1, y1), ΔX = x1 - x, and ΔY = y1 - y. All coordinates are recorded in units of pixels, and the X axis and Y axis are any two-dimensional coordinate axes in the plane of the sensor surface, used to determine the specific deviation direction of the lens optical axis in three-dimensional space;
then, the X-axis angle value θx of the included angle between the optical axis and the rotation center axis projected onto the plane of the X axis when the first image is captured is determined according to the formula θx = arctan(ΔX × p / EFL), and the Y-axis angle value θy of the included angle projected onto the plane of the Y axis is determined according to the formula θy = arctan(ΔY × p / EFL), where p is the pixel size of the sensor and EFL is the effective focal length of the lens. In a specific implementation, the distance between the optical center of the lens and the sensor surface is set to EFL when the camera module is assembled, that is, EFL is the distance between the optical center of the lens and the sensor surface.
As shown in fig. 8, the calculated θ x and θ y represent the included angle and the offset direction between the optical axis 801 and the rotation center axis 802 in the three-dimensional space.
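The arctangent step can be sketched as follows, with hypothetical values for the pixel size p and effective focal length EFL (both in the same length unit, e.g. millimeters); the function name and sample values are illustrative only:

```python
import math

def tilt_angles(v1, p1, pixel_size, efl):
    """Project the optical-axis tilt onto the X and Y planes.

    v1, p1: (x, y) pixel coordinates of the first virtual point and the
    first imaging point. pixel_size and efl: sensor pixel pitch and lens
    effective focal length, in the same length unit.
    Returns (theta_x, theta_y) in degrees.
    """
    dx = p1[0] - v1[0]   # deltaX, in pixels
    dy = p1[1] - v1[1]   # deltaY, in pixels
    theta_x = math.atan(dx * pixel_size / efl)
    theta_y = math.atan(dy * pixel_size / efl)
    return math.degrees(theta_x), math.degrees(theta_y)
```

For example, with a hypothetical 1 µm pixel pitch, a 4 mm EFL, and a 40-pixel offset along X, the X-axis tilt is arctan(0.04 / 4) ≈ 0.57 degrees; the signs of ΔX and ΔY carry the offset direction, as fig. 8 illustrates.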
Of course, in a specific implementation, the included angle may also be calculated by taking the plane of the target map as the given plane after the first image and the second image are scaled up to the target map. In this case, the deviation between the lens optical axis and the rotation center axis on the given plane can be obtained from preset physical coordinates; preferably, the target map carries physical coordinate marks so that the corresponding positions can be identified directly from the images. The distance between the target map and the lens can then be measured, and the included angle and offset direction of the lens optical axis relative to the rotation center axis in three-dimensional space determined from the tangent relation. This is not limited here, and the alternatives are not listed exhaustively.
Second, the second image is used as a reference image.
Determining the position of a second virtual point on a second image according to the position of a first imaging point on the first image and the position of a second imaging point on the second image, wherein the second virtual point is a point of a target point imaged on the second image when the included angle between the optical axis and the rotation central axis is assumed to be 0; and then, determining an included angle between the optical axis and the central axis according to the position of a second virtual point on the second image and the position of a second imaging point.
Taking the preset angle as 180 degrees as an example, the specific method for determining the second virtual point is as follows: determining the position of the first mirror image point on the second image according to the position of the first imaging point on the first image, wherein the position coordinate of the first imaging point on the first image is the same as the position coordinate of the first mirror image point on the second image; then, according to the first mirror image point, a first symmetric point is determined on the second image; the first symmetrical point and the first mirror image point are symmetrical based on the center of the second image; and taking the middle point of the first symmetrical point and the second imaging point as a second virtual point.
The method for determining the second virtual point is substantially the same as the method for determining the first virtual point in the first method, and the principle and the steps are not repeated here for the sake of brevity of the description.
The specific method for calculating the included angle between the optical axis and the rotation center axis is as follows: according to the position of the second virtual point and the position of the second imaging point on the second image, determine the number of pixels ΔX' spaced between the second virtual point and the second imaging point in the X-axis direction and the number of pixels ΔY' spaced between them in the Y-axis direction; determine the X-axis angle value θx' of the included angle between the optical axis and the rotation center axis projected onto the X axis when the second image is captured according to the formula θx' = arctan(ΔX' × p / EFL); and determine the Y-axis angle value θy' of the included angle projected onto the Y axis according to the formula θy' = arctan(ΔY' × p / EFL), where p is the pixel size of the sensor and EFL is the effective focal length of the lens.
The method for calculating the included angle is basically the same as the method for calculating the included angle in the first method, and the principle and the steps are not repeated here for the sake of brevity of the description.
In a specific implementation, when the rotation angle is other than 180 degrees, the same principle may be adopted: one image is used as the reference image, the imaging point of the target point on the other image is mapped into the reference image, and the virtual point at which the target point would be imaged on the reference image if the included angle between the lens optical axis and the rotation center axis were 0 is determined. From this, the deviation between the intersection point of the lens optical axis with a given plane and the intersection point of the rotation center axis with that plane can be calculated, and the included angle between the optical axis and the rotation center axis determined; this is not limited here.
After describing the specific implementation steps provided by this embodiment, in order to facilitate understanding of the detailed implementation steps of this embodiment as a whole, the following fully describes the method with reference to fig. 3 and 4 by taking the rotation angle as 180 degrees and the second image as a reference image as an example:
first, a first image of fig. 3 having a first imaging point P1 and a second image of fig. 4 having a second imaging point P2 are obtained according to steps S101 to S103;
then, the position of the first mirror point M1 on the second image is determined based on the first imaging point P1, and a first symmetric point S1 is determined on the second image according to the first mirror point M1;
the midpoint of the first symmetrical point S1 and the second imaging point P2 is taken as the second virtual point V2;
then, with the sensor surface as the given plane, the number ΔX' of pixels spaced between the second virtual point V2 and the second imaging point P2 in the X-axis direction and the number ΔY' of pixels spaced between them in the Y-axis direction are determined, where the coordinates of the second virtual point V2 are (x', y'), the coordinates of the second imaging point P2 are (x2, y2), ΔX' = x2 - x', and ΔY' = y2 - y'. All coordinates are recorded in units of pixels, and the X axis and Y axis are any two-dimensional coordinate axes in the plane of the sensor surface;
then, the X-axis angle value θx' of the included angle between the optical axis and the rotation center axis projected onto the plane of the X axis when the second image is captured is determined according to the formula θx' = arctan(ΔX' × p / EFL), and the Y-axis angle value θy' of the included angle projected onto the plane of the Y axis is determined according to the formula θy' = arctan(ΔY' × p / EFL), where p is the pixel size of the sensor and EFL is the effective focal length of the lens.
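The complete 180-degree procedure with the second image as reference, combining the virtual-point construction and the arctangent step, might be sketched as follows; all coordinates, the pixel size, and the focal length are hypothetical:

```python
import math

def optical_axis_tilt(p1, p2, center, pixel_size, efl):
    """End-to-end sketch with the second image as reference:
    1) M1 has the same pixel coordinates on the second image as P1;
    2) S1 is M1 reflected through the image center;
    3) V2 is the midpoint of S1 and P2;
    4) the tilt angles follow from the pixel offsets via arctan.
    Returns (V2, theta_x_deg, theta_y_deg)."""
    m1 = p1                                        # mirror of P1 onto image 2
    s1 = (2 * center[0] - m1[0], 2 * center[1] - m1[1])
    v2 = ((s1[0] + p2[0]) / 2.0, (s1[1] + p2[1]) / 2.0)
    dx = p2[0] - v2[0]                             # deltaX', pixels
    dy = p2[1] - v2[1]                             # deltaY', pixels
    theta_x = math.degrees(math.atan(dx * pixel_size / efl))
    theta_y = math.degrees(math.atan(dy * pixel_size / efl))
    return v2, theta_x, theta_y
```

With P1 = (110, 105), P2 = (95, 98), and image center (100, 100), the sketch yields S1 = (90, 95) and V2 = (92.5, 96.5), after which the two angle values follow directly from the pixel offsets (2.5, 1.5).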
Specifically, in the camera module test method, device, apparatus, and medium according to the embodiments of the present invention, the target image is set parallel to the sensor surface of the camera module and a first image of the target image is captured; the camera module is then rotated by a preset angle around the normal line passing through the center of the sensor surface, and a second image of the target image is captured. Both images contain an imaging point of the target point on the target image, so the included angle between the optical axis of the lens of the camera module and the rotation center axis can be accurately determined from the positions of the imaging points on the two images before and after rotation. The relative position of the lens and the sensor surface can then be re-adjusted in a targeted manner according to the determined included angle, thereby improving the alignment precision and the assembly yield of the camera module.
In a specific implementation, the test method provided by this embodiment can also be applied to a dual-camera or multi-camera module. The included angle between the optical axis and the rotation center axis of each single camera module may be tested and corrected using the method of this embodiment before the single modules are assembled into a dual-camera or multi-camera module; alternatively, the dual-camera or multi-camera module may be assembled first, and the included angle of each single module tested and corrected afterwards. By ensuring that the optical axis of each single camera coincides as closely as possible with the normal through the center of its sensor, the optical axis parallelism of the assembled dual-camera or multi-camera module is improved.
Based on the same inventive concept, the embodiment of the invention also provides a device corresponding to the method in the first embodiment, which is shown in the second embodiment.
Example two
As shown in fig. 5, there is provided a test apparatus for a camera module, comprising:
the first shooting module 501 is configured to shoot a first image of a target image through the camera module, where a plane where the target image is located is parallel to a sensor surface of the camera module, the first image includes a first imaging point, and the first imaging point is an image of a target point on the target image on the first image;
a rotation module 502, configured to rotate the camera module by a preset angle with a normal of the sensor surface passing through a central point as a rotation central axis;
a second shooting module 503, configured to obtain a second image of the target image through shooting by the rotated shooting module, where the second image includes a second imaging point, and the second imaging point is an image of the target point on the second image;
a determining module 504, configured to determine an included angle between an optical axis of a lens of the camera module and the rotation central axis according to a position of the first imaging point on the first image and a position of the second imaging point on the second image.
Since the apparatus described in the second embodiment of the present invention is an apparatus used for implementing the method of the first embodiment of the present invention, based on the method described in the first embodiment of the present invention, a person skilled in the art can understand the specific structure and the deformation of the apparatus, and thus the details are not described herein. All the devices adopted in the method of the first embodiment of the present invention belong to the protection scope of the present invention.
Based on the same inventive concept, the embodiment of the invention also provides electronic equipment corresponding to the method in the first embodiment, which is shown in the third embodiment.
EXAMPLE III
As shown in fig. 6, the embodiment provides an electronic device, which includes a memory 610, a processor 620 and a computer program 611 stored in the memory 610 and capable of running on the processor 620, and the processor 620 implements the following steps when executing the computer program 611:
shooting and acquiring a first image of a target image through the camera module, wherein the plane of the target image is parallel to the surface of a sensor of the camera module, the first image comprises a first imaging point, and the first imaging point is an image of a target point on the target image on the first image;
rotating the camera module by a preset angle by taking a normal line of the surface of the sensor passing through a central point as a rotating central shaft;
shooting and acquiring a second image of the target image through the rotated shooting module, wherein the second image comprises a second imaging point, and the second imaging point is the imaging of the target point on the second image;
and determining an included angle between the optical axis of the lens of the camera module and the rotation central axis according to the position of the first imaging point on the first image and the position of the second imaging point on the second image.
In the embodiment of the present invention, when the processor 620 executes the computer program 611, any one of the first embodiment of the present invention may be implemented.
Since the electronic device described in the third embodiment of the present invention is a device used for implementing the method of the first embodiment of the present invention, a person skilled in the art can understand the specific structure and the deformation of the device based on the method described in the first embodiment of the present invention, and thus the details are not described herein. All the devices adopted by the method of the first embodiment of the invention belong to the protection scope of the invention.
Based on the same inventive concept, the embodiment of the present invention further provides a storage medium corresponding to the method in the first embodiment, which is shown in the fourth embodiment.
Example four
The present embodiment provides a computer-readable storage medium 700, as shown in fig. 7, on which a computer program 711 is stored, wherein the computer program 711, when executed by a processor, implements the following steps:
shooting and acquiring a first image of a target image through the camera module, wherein the plane of the target image is parallel to the surface of a sensor of the camera module, the first image comprises a first imaging point, and the first imaging point is an image of a target point on the target image on the first image;
rotating the camera module by a preset angle by taking a normal line of the surface of the sensor passing through a central point as a rotating central shaft;
shooting and acquiring a second image of the target image through the rotated shooting module, wherein the second image comprises a second imaging point, and the second imaging point is the imaging of the target point on the second image;
and determining an included angle between the optical axis of the lens of the camera module and the rotation central axis according to the position of the first imaging point on the first image and the position of the second imaging point on the second image.
In a specific implementation, the computer program 711 may implement any one of the embodiments of the present invention when executed by a processor.
The technical scheme provided by the embodiment of the invention at least has the following technical effects or advantages:
according to the test method, device, apparatus, and medium for a camera module, the target image is set parallel to the sensor surface of the camera module, a first image of the target image is captured, the camera module is rotated by a preset angle about the central axis of the sensor surface, and a second image of the target image is captured. Both images contain imaging points of the target point on the target image, and the included angle between the lens optical axis of the camera module and the central axis of the sensor surface is accurately determined from the positions of the imaging points on the two images before and after rotation, so that the relative position of the lens and the sensor surface can be re-adjusted in a targeted manner according to the determined included angle, thereby improving the alignment precision and the assembly yield of the camera module.
The algorithms and displays presented herein are not inherently related to any particular computer, virtual machine, or other apparatus. Various general purpose systems may also be used with the teachings herein. The required structure for constructing such a system will be apparent from the description above. Moreover, the present invention is not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functionality of some or all of the components in an apparatus, electronic device, or both, in accordance with embodiments of the present invention. The present invention may also be embodied as apparatus or device programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names.

Claims (7)

1. A test method of a camera module is characterized by comprising the following steps:
shooting and acquiring a first image of a target image through the camera module, wherein the plane of the target image is parallel to the surface of a sensor of the camera module, the first image comprises a first imaging point, and the first imaging point is an image of a target point on the target image on the first image;
rotating the camera module by a preset angle by taking a normal line of the surface of the sensor passing through a central point as a rotating central shaft; the preset angle is 180 degrees;
shooting and acquiring a second image of the target image through the rotated camera module, wherein the second image comprises a second imaging point, and the second imaging point is an image of the target point on the second image;
determining the position of a virtual point according to the position of the first imaging point on the first image and the position of the second imaging point on the second image; the virtual point is an imaging point of the target point on the first image or the second image when an included angle between an optical axis of a lens of the camera module and the rotation central axis is assumed to be 0;
if the virtual point is a first virtual point imaged by the target point on the first image, determining the number of pixels spaced between the first virtual point and the first imaging point according to the position of the first virtual point on the first image and the position of the first imaging point; determining an included angle between the optical axis and the central axis according to the number of the pixels, the pixel size corresponding to the sensor and the effective focal length of the lens;
if the virtual point is a second virtual point imaged by the target point on the second image, determining the number of pixels spaced between the second virtual point and the second imaging point according to the position of the second virtual point on the second image and the position of the second imaging point; and determining an included angle between the optical axis and the central axis according to the number of the pixels, the pixel size corresponding to the sensor and the effective focal length of the lens.
2. The method of claim 1, wherein when the preset angle is 180 degrees, determining the position of the virtual point according to the position of the first imaging point on the first image and the position of the second imaging point on the second image comprises:
determining the position of a second mirror image point on the first image according to the position of the second imaging point on the second image, wherein the position coordinate of the second imaging point on the second image is the same as the position coordinate of the second mirror image point on the first image; determining a second symmetrical point on the first image according to the second mirror image point; the second symmetrical point and the second mirror image point are symmetrical based on the center of the first image; taking the middle point of the second symmetrical point and the first imaging point as the virtual point;
or, alternatively,
determining the position of a first mirror image point on the second image according to the position of the first imaging point on the first image, wherein the position coordinate of the first imaging point on the first image is the same as the position coordinate of the first mirror image point on the second image; determining a first symmetrical point on the second image according to the first mirror image point; the first symmetrical point and the first mirror image point are symmetrical based on the center of the second image; and taking the middle point of the first symmetrical point and the second imaging point as the virtual point.
3. The method of claim 1, wherein when the preset angle is 180 degrees, determining the position of the virtual point according to the position of the first imaging point on the first image and the position of the second imaging point on the second image comprises:
determining the position of a second mirror image point on the first image according to the position of the second imaging point on the second image, wherein the position coordinate of the second imaging point on the second image is the same as the position coordinate of the second mirror image point on the first image; adding a second vector to a first vector to obtain a third vector, wherein the starting point of the first vector is the midpoint of the first image, the end point of the first vector is the first imaging point, the starting point of the second vector is the second mirror image point, the end point of the second vector is the midpoint of the first image, and the starting point of the third vector is the midpoint of the first image; taking a midpoint between a midpoint of the first image and an end point of the third vector as the virtual point;
or, alternatively,
determining the position of a first mirror image point on the second image according to the position of the first imaging point on the first image, wherein the position coordinate of the first imaging point on the first image is the same as the position coordinate of the first mirror image point on the second image; adding a fifth vector to a fourth vector to obtain a sixth vector, wherein a starting point of the fourth vector is a midpoint of the second image, an end point of the fourth vector is the second imaging point, a starting point of the fifth vector is the first mirror image point, an end point of the fifth vector is a midpoint of the second image, and a starting point of the sixth vector is a midpoint of the second image; and taking the midpoint between the midpoint of the second image and the end point of the sixth vector as the virtual point.
4. The method of claim 1, wherein,
when the number of pixels includes a number Δx of pixels spaced between the first virtual point and the first imaging point in the X-axis direction and a number Δy of pixels spaced between the first virtual point and the first imaging point in the Y-axis direction, determining the included angle between the optical axis and the central axis according to the number of pixels, the pixel size corresponding to the sensor and the effective focal length of the lens comprises:
determining an X-axis angle value θx of the included angle between the optical axis and the central axis projected onto the X-axis when the first image is captured according to the formula θx = arctan(Δx·p/EFL), and determining a Y-axis angle value θy of the included angle projected onto the Y-axis when the first image is captured according to the formula θy = arctan(Δy·p/EFL), where p is the pixel size corresponding to the sensor and EFL is the effective focal length of the lens;
when the number of pixels includes a number Δx′ of pixels spaced between the second virtual point and the second imaging point in the X-axis direction and a number Δy′ of pixels spaced between the second virtual point and the second imaging point in the Y-axis direction, determining the included angle between the optical axis and the central axis according to the number of pixels, the pixel size corresponding to the sensor and the effective focal length of the lens comprises:
determining an X-axis angle value θx′ of the included angle between the optical axis and the central axis projected onto the X-axis when the second image is captured according to the formula θx′ = arctan(Δx′·p/EFL), and determining a Y-axis angle value θy′ of the included angle projected onto the Y-axis when the second image is captured according to the formula θy′ = arctan(Δy′·p/EFL), where p is the pixel size corresponding to the sensor and EFL is the effective focal length of the lens.
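The two arctangent formulas convert the pixel offset into a tilt angle. A sketch under the assumption that the pixel size and effective focal length are given in the same length unit (names are illustrative, not from the patent):

```python
import math

def tilt_angles(delta_x, delta_y, pixel_size, efl):
    """Tilt of the optical axis relative to the rotation axis (claim 4).

    delta_x, delta_y: pixel offsets between the virtual point and the
    imaging point along the X and Y axes.
    pixel_size: sensor pixel pitch; efl: lens effective focal length
    (both in the same unit, e.g. millimetres).
    Returns (theta_x, theta_y) in degrees.
    """
    theta_x = math.degrees(math.atan(delta_x * pixel_size / efl))
    theta_y = math.degrees(math.atan(delta_y * pixel_size / efl))
    return theta_x, theta_y
```

For example, with a 1.4 µm pixel, a 4 mm EFL and a 10-pixel offset along X, the projected tilt is arctan(10 × 0.0014/4) ≈ 0.2°.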
5. A testing device for a camera module, characterized in that the device comprises:
the first shooting module is used for shooting and acquiring a first image of a target image through the shooting module, the plane of the target image is parallel to the surface of the sensor of the shooting module, the first image comprises a first imaging point, and the first imaging point is an image of a target point on the target image on the first image;
the rotating module is used for rotating the camera module by a preset angle by taking a normal line of the sensor surface passing through its central point as a rotation central axis; the preset angle is 180 degrees;
the second shooting module is used for shooting and acquiring a second image of the target image through the rotated shooting module, the second image comprises a second imaging point, and the second imaging point is an image of the target point on the second image;
the determining module is used for determining the position of a virtual point according to the position of the first imaging point on the first image and the position of the second imaging point on the second image; the virtual point is an imaging point of the target point on the first image or the second image when an included angle between an optical axis of a lens of the camera module and the rotation central axis is assumed to be 0; if the virtual point is a first virtual point imaged by the target point on the first image, determining the number of pixels spaced between the first virtual point and the first imaging point according to the position of the first virtual point on the first image and the position of the first imaging point; determining an included angle between the optical axis and the central axis according to the number of the pixels, the pixel size corresponding to the sensor and the effective focal length of the lens; if the virtual point is a second virtual point imaged by the target point on the second image, determining the number of pixels spaced between the second virtual point and the second imaging point according to the position of the second virtual point on the second image and the position of the second imaging point; and determining an included angle between the optical axis and the central axis according to the number of the pixels, the pixel size corresponding to the sensor and the effective focal length of the lens.
6. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program performs the steps of:
shooting and acquiring a first image of a target image through a camera module, wherein the plane of the target image is parallel to the surface of a sensor of the camera module, the first image comprises a first imaging point, and the first imaging point is an image of a target point on the target image on the first image;
rotating the camera module by a preset angle by taking a normal line of the sensor surface passing through its central point as a rotation central axis; the preset angle is 180 degrees;
shooting and acquiring a second image of the target image through the rotated camera module, wherein the second image comprises a second imaging point, and the second imaging point is an image of the target point on the second image;
determining the position of a virtual point according to the position of the first imaging point on the first image and the position of the second imaging point on the second image; the virtual point is an imaging point of the target point on the first image or the second image when an included angle between an optical axis of a lens of the camera module and the rotation central axis is assumed to be 0;
if the virtual point is a first virtual point imaged by the target point on the first image, determining the number of pixels spaced between the first virtual point and the first imaging point according to the position of the first virtual point on the first image and the position of the first imaging point; determining an included angle between the optical axis and the central axis according to the number of the pixels, the pixel size corresponding to the sensor and the effective focal length of the lens;
if the virtual point is a second virtual point imaged by the target point on the second image, determining the number of pixels spaced between the second virtual point and the second imaging point according to the position of the second virtual point on the second image and the position of the second imaging point; and determining an included angle between the optical axis and the central axis according to the number of the pixels, the pixel size corresponding to the sensor and the effective focal length of the lens.
7. A computer-readable storage medium, on which a computer program is stored, which program, when executed by a processor, carries out the steps of:
shooting and acquiring a first image of a target image through a camera module, wherein the plane of the target image is parallel to the surface of a sensor of the camera module, the first image comprises a first imaging point, and the first imaging point is an image of a target point on the target image on the first image;
rotating the camera module by a preset angle by taking a normal line of the sensor surface passing through its central point as a rotation central axis; the preset angle is 180 degrees;
shooting and acquiring a second image of the target image through the rotated camera module, wherein the second image comprises a second imaging point, and the second imaging point is an image of the target point on the second image;
determining the position of a virtual point according to the position of the first imaging point on the first image and the position of the second imaging point on the second image; the virtual point is an imaging point of the target point on the first image or the second image when an included angle between an optical axis of a lens of the camera module and the rotation central axis is assumed to be 0;
if the virtual point is a first virtual point imaged by the target point on the first image, determining the number of pixels spaced between the first virtual point and the first imaging point according to the position of the first virtual point on the first image and the position of the first imaging point; determining an included angle between the optical axis and the central axis according to the number of the pixels, the pixel size corresponding to the sensor and the effective focal length of the lens;
if the virtual point is a second virtual point imaged by the target point on the second image, determining the number of pixels spaced between the second virtual point and the second imaging point according to the position of the second virtual point on the second image and the position of the second imaging point; and determining an included angle between the optical axis and the central axis according to the number of the pixels, the pixel size corresponding to the sensor and the effective focal length of the lens.
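Taken together, the steps recited in claims 6 and 7 reduce, once the two imaging-point positions have been measured, to a single computation combining the claim-3 mirror-point construction with the claim-4 formulas. A sketch (all names are hypothetical; image capture and rotation hardware are outside its scope):

```python
import math

def optical_axis_tilt(first_pt, second_pt, midpoint, pixel_size, efl):
    """Given the imaging points of the same target point before and after
    the 180-degree rotation, return (theta_x, theta_y) in degrees.

    second_pt is read from the second image and reused verbatim as the
    mirror point on the first image, per the claim-3 convention.
    """
    # virtual point = image midpoint + ((A - C) + (C - B')) / 2
    #               = C + (A - B') / 2
    vx = midpoint[0] + (first_pt[0] - second_pt[0]) / 2.0
    vy = midpoint[1] + (first_pt[1] - second_pt[1]) / 2.0
    # pixel offsets between the virtual point and the first imaging point
    dx, dy = vx - first_pt[0], vy - first_pt[1]
    # claim-4 formulas: theta = arctan(offset * pixel_size / EFL)
    theta_x = math.degrees(math.atan(dx * pixel_size / efl))
    theta_y = math.degrees(math.atan(dy * pixel_size / efl))
    return theta_x, theta_y
```

If the module is perfectly aligned, the two imaging points are symmetric about the image midpoint and both angles come out as zero.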
CN201810623938.7A 2018-06-15 2018-06-15 Camera module testing method, device, equipment and medium Active CN108429908B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810623938.7A CN108429908B (en) 2018-06-15 2018-06-15 Camera module testing method, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN108429908A CN108429908A (en) 2018-08-21
CN108429908B true CN108429908B (en) 2020-09-22

Family

ID=63164513

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810623938.7A Active CN108429908B (en) 2018-06-15 2018-06-15 Camera module testing method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN108429908B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109186959B (en) * 2018-09-28 2020-02-07 歌尔股份有限公司 Method, device and equipment for detecting field curvature of VR optical module
CN109462752B (en) * 2018-11-19 2024-06-25 信利光电股份有限公司 Method and device for measuring optical center position of camera module
CN110838147B (en) * 2019-10-25 2022-07-05 深圳信息职业技术学院 Camera module detection method and device
CN111220360B (en) * 2020-01-03 2021-08-31 歌尔光学科技有限公司 Method and device for testing resolution of camera module
CN111380459B (en) * 2020-03-19 2022-06-10 昆山丘钛微电子科技股份有限公司 Method for measuring center point of chip and method for measuring offset of center point of chip
CN111425696B (en) * 2020-03-31 2021-12-10 北京博清科技有限公司 Camera positioning system and camera positioning method
CN112254672B (en) * 2020-10-15 2022-02-15 天目爱视(北京)科技有限公司 Height-adjustable's intelligent 3D information acquisition equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN205593526U (en) * 2016-03-31 2016-09-21 京东方科技集团股份有限公司 Camera device , rotating device , range unit and ranging system
CN106500619A (en) * 2016-10-21 2017-03-15 哈尔滨理工大学 The camera internal imageing sensor alignment error separation method of view-based access control model measurement

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2549227B1 (en) * 2011-07-21 2013-09-11 Axis AB Method for determining the tilt of an image sensor
CN103163725A (en) * 2011-12-17 2013-06-19 鸿富锦精密工业(深圳)有限公司 Camera module detection device and detection method
CN105758337B (en) * 2014-12-19 2018-09-04 宁波舜宇光电信息有限公司 A method of obtaining angle between lens plane and image sensor plane
CN104730802B (en) * 2015-03-27 2017-10-17 酷派软件技术(深圳)有限公司 Calibration, focusing method and the system and dual camera equipment of optical axis included angle
CN105627995B (en) * 2016-03-31 2018-03-23 京东方科技集团股份有限公司 Camera device, tumbler, range unit, range-measurement system and distance-finding method
CN106989697A (en) * 2017-05-24 2017-07-28 深圳天珑无线科技有限公司 The acquisition methods and device of a kind of optical axis included angle
CN107742310B (en) * 2017-09-18 2021-06-04 广东美晨通讯有限公司 Method and device for testing included angle of double cameras and storage device
CN107702695B (en) * 2017-09-26 2020-05-12 歌尔股份有限公司 Method for testing relative position of camera module lens and image sensor


Also Published As

Publication number Publication date
CN108429908A (en) 2018-08-21

Similar Documents

Publication Publication Date Title
CN108429908B (en) Camera module testing method, device, equipment and medium
CN107767422B (en) Fisheye lens correction method and device and portable terminal
CN108805934B (en) External parameter calibration method and device for vehicle-mounted camera
US8452568B2 (en) Method for calibrating cameras installed on vehicle
CN111263142B (en) Method, device, equipment and medium for testing optical anti-shake of camera module
CN111896221B (en) Alignment method of rotating optical measurement system for virtual coordinate system auxiliary camera calibration
CN111220130B (en) Focusing measurement method and terminal capable of measuring object at any position in space
WO2015045329A1 (en) Vehicle periphery image display device, and camera adjustment method
JPH11118425A (en) Calibration method and device and calibration data production
JP2012088114A (en) Optical information processing device, optical information processing method, optical information processing system and optical information processing program
US20130113897A1 (en) Process and arrangement for determining the position of a measuring point in geometrical space
CN106815866B (en) Calibration method of fisheye camera, calibration system and target thereof
CN106570907B (en) Camera calibration method and device
JPWO2008053649A1 (en) Wide-angle image acquisition method and wide-angle stereo camera device
TW201616214A (en) Testing chart, camera module testing method and system using same
CN115830103A (en) Monocular color-based transparent object positioning method and device and storage medium
Ding et al. A robust detection method of control points for calibration and measurement with defocused images
CN111383264A (en) Positioning method, positioning device, terminal and computer storage medium
US10196005B2 (en) Method and system of camera focus for advanced driver assistance system (ADAS)
JP3842988B2 (en) Image processing apparatus for measuring three-dimensional information of an object by binocular stereoscopic vision, and a method for recording the same, or a recording medium recording the measurement program
JP3696336B2 (en) How to calibrate the camera
López-Nicolás et al. Unitary torus model for conical mirror based catadioptric system
CN111754584A (en) Remote large-field-of-view camera parameter calibration system and method
CN116743973A (en) Automatic correction method for noninductive projection image
JP2005275789A (en) Three-dimensional structure extraction method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: No.3, Taihong Road, Kunshan high tech Industrial Development Zone, Suzhou, Jiangsu Province, 215300

Patentee after: Kunshan Qiuti Microelectronics Technology Co.,Ltd.

Address before: No.3, Taihong Road, Kunshan high tech Industrial Development Zone, Suzhou, Jiangsu Province, 215300

Patentee before: KUNSHAN Q TECHNOLOGY Co.,Ltd.
